CN105424006B - Unmanned plane hovering accuracy measurement method based on binocular vision - Google Patents

Unmanned plane hovering accuracy measurement method based on binocular vision

Info

Publication number
CN105424006B
CN105424006B
Authority
CN
China
Prior art keywords
camera
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201510736167.9A
Other languages
Chinese (zh)
Other versions
CN105424006A (en)
Inventor
王万国
刘俍
刘越
张方正
董罡
雍军
吴观斌
慕世友
傅孟潮
魏传虎
张飞
李建祥
赵金龙
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
State Grid Intelligent Technology Co Ltd
Original Assignee
Electric Power Research Institute of State Grid Shandong Electric Power Co Ltd
Shandong Luneng Intelligence Technology Co Ltd
State Grid Corp of China SGCC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Electric Power Research Institute of State Grid Shandong Electric Power Co Ltd, Shandong Luneng Intelligence Technology Co Ltd, and State Grid Corp of China SGCC
Priority to CN201510736167.9A
Publication of CN105424006A
Application granted
Publication of CN105424006B
Legal status: Active
Anticipated expiration

Abstract

Translated from Chinese

The invention discloses a method for measuring the hovering accuracy of an unmanned aerial vehicle (UAV) based on binocular vision, comprising the following steps. Calibration stage: the cameras are calibrated with Zhang Zhengyou's checkerboard calibration method to determine the calibration parameters and define the calibration result parameters. Positioning stage: when measuring the UAV's hovering accuracy, a slide rail is placed directly below the UAV's hovering point, and the two cameras of a binocular rig are fixed in parallel on the rail at a set distance; the cameras can move along the rail, the lenses point vertically upward, the imaging planes of the two cameras lie in the same plane, and the optical axes are parallel to each other. The left and right cameras each capture images of the UAV and transmit them to a computer, which computes the UAV's three-dimensional position coordinates from the left-eye and right-eye images together with the calibration result parameters. After hovering ends, the hovering accuracy is calculated from the UAV's three-dimensional trajectory. The beneficial effect of the present invention is to realize detection, tracking, precise matching, and three-dimensional positioning of the UAV target.

Description

Measurement method of UAV hovering accuracy based on binocular vision

Technical field

The invention relates to a method for measuring the hovering accuracy of an unmanned aerial vehicle (UAV) based on binocular vision.

Background art

UAV hovering accuracy is an important indicator of UAV performance, reflecting the stability and precision of the flight control system, the core of the UAV. At present, during inspection and testing of UAV flight functions, hovering accuracy is measured by manual observation, which cannot guarantee safety, objectivity, or standardization.

UAV three-dimensional positioning technologies fall into two main categories: positioning based on airborne equipment and positioning based on ground equipment.

Depending on the type of airborne equipment, there are three main airborne positioning technologies: positioning based on GPS equipment (Fig. 4(a)), on airborne video (Fig. 4(b)), and on inertial navigation devices (Fig. 4(c)). Airborne equipment is integrated into each UAV's flight control system rather than being independent of the UAV, so it is inflexible and unsuitable for positioning tasks across different UAVs.

Positioning based on ground equipment avoids these problems. Depending on the type of ground equipment, it can be divided into three technologies: ultrasonic rangefinders (Fig. 4(d)), laser rangefinders (Fig. 4(e)), and machine-vision-based positioning (Fig. 4(f)). Ultrasonic and laser rangefinders are mostly used to measure the distance to a target, while three-dimensional laser rangefinders scan slowly, are mainly applied to the reconstruction of static 3D scenes, and cannot compute the trajectory of a moving target.

Summary of the invention

The purpose of the present invention is to solve the above problems by providing a binocular-vision-based method for measuring UAV hovering accuracy that computes the three-dimensional flight trajectory of the UAV in real time and automatically calculates the hovering accuracy, improving the accuracy and standardization of the measurement.

To achieve the above object, the present invention adopts the following technical solution:

A method for measuring UAV hovering accuracy based on binocular vision, comprising the following steps:

Step (1): calibration stage: calibrate the cameras with Zhang Zhengyou's checkerboard calibration method, thereby determining the calibration parameters and defining the calibration result parameters;

Step (2): positioning stage: when measuring the UAV's hovering accuracy, place the slide rail directly below the UAV's hovering point and fix the two cameras of the binocular rig in parallel on the slide rail at a set distance; the cameras can move along the rail, the lenses point vertically upward, the imaging planes of the two cameras lie in the same plane, and the optical axes are parallel to each other. The left and right cameras each capture UAV images and transmit them to a computer, which computes the UAV's three-dimensional position coordinates from the captured left-eye and right-eye images together with the calibration result parameters obtained in step (1). After hovering ends, the hovering accuracy is calculated from the UAV's three-dimensional trajectory.

Step (1) comprises:

Step (1-1): fix the two cameras on the same slide rail, define a distance L, and adjust the positions of the two cameras on the rail so that the distance between their centre points is L;

Step (1-2): calibrate the cameras with Zhang Zhengyou's checkerboard calibration method and record the calibration result parameters result = {M_left, D_left, M_right, D_right, R, T}, where result denotes the calibration result parameters, M_left and D_left denote the camera matrix and distortion-coefficient vector of the left camera, M_right and D_right those of the right camera, and R and T the rotation matrix and translation vector between the two cameras. For each camera the camera matrix has the standard pinhole form

M = \begin{bmatrix} f_x & 0 & c_x \\ 0 & f_y & c_y \\ 0 & 0 & 1 \end{bmatrix}

where M is the camera matrix, f_x and f_y are the focal lengths in pixels, and (c_x, c_y) is the principal point.

Step (2) comprises:

Step (2-1): when measuring the UAV's hovering accuracy, place the slide rail directly below the UAV's hovering point and fix the binocular cameras in parallel on the slide rail with the lenses pointing vertically upward; the imaging planes of the two cameras should lie in the same plane with the optical axes parallel to each other. The left camera captures the UAV's left-eye image and the right camera captures the UAV's right-eye image;

Step (2-2): target area positioning: manually select the target area in the left-eye image;

Step (2-3): target tracking: track the target in the left-eye image with the TLD algorithm;

Step (2-4): target matching: in the right-eye image, find the matching area most similar to the target area of the left-eye image;

Step (2-5): same-name point matching: use the centre points of the rectangular target areas in the left-eye and right-eye images as the same-name points;

Step (2-6): three-dimensional coordinate calculation: establish the camera coordinate system and, using the calibration result parameters obtained in step (1), calculate the three-dimensional coordinates of the target point in the camera coordinate system;

Step (2-7): hovering accuracy evaluation: calculate the UAV's hovering accuracy from the three-dimensional coordinate trajectory of the target point.
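The per-frame loop of steps (2-2) to (2-7) can be sketched as follows. This is a minimal, hypothetical sketch: `track_left` and `match_right` stand in for the TLD tracker of step (2-3) and the template matcher of step (2-4), and the triangulation simplifies the parallel-axis model of step (2-6) by assuming the principal point is at the image origin; none of these function names come from the patent.

```python
def triangulate(pL, pR, f, Tx):
    """Parallel-axis stereo with principal point at the origin (an
    assumed simplification): disparity d = xL - xR, depth z = f*Tx/d,
    where f is the focal length in pixels and Tx the camera baseline."""
    (xL, yL), (xR, _) = pL, pR
    d = xL - xR                      # disparity in pixels
    z = f * Tx / d                   # depth along the optical axis
    return (xL * z / f, yL * z / f, z)

def hover_trajectory(frames, f, Tx, track_left, match_right):
    """For each stereo frame pair, track the target in the left image,
    find the matching region in the right image, take the rectangle
    centres as same-name points, and triangulate (steps 2-2 .. 2-6)."""
    traj = []
    for left_img, right_img in frames:
        cL = track_left(left_img)          # centre of left target area
        cR = match_right(right_img, cL)    # centre of right target area
        traj.append(triangulate(cL, cR, f, Tx))
    return traj
```

The resulting trajectory list is what step (2-7) consumes when computing the hovering accuracy.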

Step (2-2) comprises: let t = 0 be the moment at which target positioning starts. First manually frame the target area, a rectangular region with B_L as its upper-left corner, height h, and width w.

Step (2-3) comprises: based on the left-eye target area determined at t = 0, track the target in the left-eye image with the TLD algorithm at t = 1 and every later moment.

Step (2-4) comprises: each time the target area in the left-eye image is obtained, find in the right-eye image the matching area most similar to it; the matching area is a rectangular region with B_R as its upper-left corner, height h, and width w.

Target matching is then expressed as:

(x_R, y_R) = \arg\min \sum_{i=1}^{w} \sum_{j=1}^{h} \left| I_{left}(x_L + i, y_L + j) - I_{right}(x_R + i, y_R + j) \right|   (1)

where I_left denotes the grey value of the left-eye image, I_right that of the right-eye image, (x_L, y_L) the coordinates of point B_L, and (x_R, y_R) the coordinates of point B_R. The search range is x_R ∈ [0, x_L] and y_R ∈ [y_L − s_h/2, y_L + s_h/2], where s_h is the height of the search area. Once the point B_R with coordinates (x_R, y_R) minimising formula (1) is found, the disparity is d = x_L − x_R.

At every moment t ≥ 1, after the left-eye image yields a new target area through the TLD algorithm, the search range in the right-eye image is updated around the new left-eye target position, and the target area in the right-eye image is determined by formula (1); proceeding in this way, the target area in each frame of the left-eye image and the corresponding area of the same target in the right-eye image are computed.
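The block matching of formula (1) can be sketched with a sum of absolute grey-level differences over the h-by-w target rectangle, restricting the epipolar search to x_R ≤ x_L within a small row band. Images are represented as nested lists of grey values; the function names and the 0-based block indexing are illustrative choices, not from the patent.

```python
def sad(left, right, xL, yL, xR, yR, w, h):
    """Sum of absolute differences between the w-by-h block at (xL, yL)
    in the left image and the block at (xR, yR) in the right image."""
    return sum(abs(left[yL + j][xL + i] - right[yR + j][xR + i])
               for j in range(h) for i in range(w))

def match_right(left, right, xL, yL, w, h, sh=0):
    """Search xR in [0, xL] (and yR in a band of half-height sh around
    yL) for the block minimising formula (1); returns (xR, yR, d)."""
    ys = range(max(0, yL - sh), min(len(right) - h, yL + sh) + 1)
    _, xR, yR = min((sad(left, right, xL, yL, xR, yR, w, h), xR, yR)
                    for xR in range(xL + 1) for yR in ys)
    return xR, yR, xL - xR           # disparity d = xL - xR
```

In practice the grey images would come from the two cameras; here a synthetic bright patch suffices to exercise the search.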

Step (2-5) comprises:

Use the centre point of the target area in the left-eye image, (x_L + w/2, y_L + h/2), as the same-name point of the left-eye target, and the centre point of the target area in the right-eye image, (x_R + w/2, y_R + h/2), as the same-name point of the right-eye target.

The same-name points are the pixels in the left-eye and right-eye images that correspond to the same part of the actual target; it must be ensured that the same-name points in consecutive frames and in the left and right images correspond to the same position on the actual target.

Step (2-6) comprises:

The camera coordinate system takes the optical centre O_L of the left camera as its origin, with the X O_L Y plane parallel to the imaging plane and the optical axis as the Z axis. From the calibrated camera parameters result, the reprojection matrix is obtained:

Q = \begin{bmatrix} 1 & 0 & 0 & -c_x \\ 0 & 1 & 0 & -c_y \\ 0 & 0 & 0 & f_l \\ 0 & 0 & -1/T_x & (c_x - c_x')/T_x \end{bmatrix}

where (c_x, c_y) are the principal-point coordinates of the left camera, (c_x', c_y') those of the right camera, T_x is the X-axis component of the translation between the two cameras, and f_l is the focal length of the left camera.

With the left and right optical axes parallel to each other, given the same-name point coordinates P_L(x_L, y_L) in the left-eye image and P_R(x_R, y_R) in the right-eye image, compute the disparity of the target point between the left and right views, d = x_L − x_R, and let

[X, Y, Z, W]^T = Q \cdot [x_L, y_L, d, 1]^T

to obtain the three-dimensional coordinates of the target point in the camera coordinate system:

P_c = (x_c, y_c, z_c) = (X/W, Y/W, Z/W)

where X, Y, Z and W are intermediate result variables, and x_c, y_c and z_c are the X-, Y- and Z-axis coordinates of the target point P_c in the camera coordinate system.
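The reprojection of step (2-6) can be sketched directly, multiplying the homogeneous point [x_L, y_L, d, 1] by the rows of a Q matrix in the form produced by standard stereo rectification and dividing by W. The parameter names and the sign convention (T_x taken as a positive baseline so that depth comes out positive) are assumptions of this sketch, not specified by the patent.

```python
def reproject(xL, yL, d, fl, cx, cy, Tx, cx_r=None):
    """Map a left-image same-name point (xL, yL) with disparity d to
    camera coordinates: [X, Y, Z, W]^T = Q . [xL, yL, d, 1]^T, then
    Pc = (X/W, Y/W, Z/W). cx_r defaults to cx (identical principal
    points), which makes the (cx - cx_r)/Tx term of Q vanish."""
    if cx_r is None:
        cx_r = cx
    X = xL - cx                      # first row of Q
    Y = yL - cy                      # second row of Q
    Z = fl                           # third row of Q
    W = d / Tx + (cx - cx_r) / Tx    # fourth row, sign chosen for z > 0
    return (X / W, Y / W, Z / W)
```

With the focal length in pixels and the baseline in metres, the returned coordinates are in metres.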

Step (2-7) comprises:

While the UAV hovers, compute its three-dimensional position coordinates in real time by the foregoing steps to obtain its flight trajectory;

Let the set of trajectory points be P = [P_1, P_2, ..., P_N], containing N points in total, where

P_n = [x_n, y_n, z_n]^T, n = 1, 2, ..., N;

The centroid of the set of flight trajectory points is

\bar{P} = [\bar{x}, \bar{y}, \bar{z}]^T = \frac{1}{N} \sum_{n=1}^{N} P_n

When carrying out the UAV hovering accuracy test, the prescribed height above ground at which the UAV hovers is denoted H_0;

During hovering accuracy detection, the binocular camera is placed directly below the hovering point; specifically, the left camera is placed at the hovering point with its optical axis perpendicular to the horizontal plane. Hovering accuracy is divided into the horizontal deviation S_H and vertical deviation S_V of the fixed-point hovering accuracy, and the horizontal deviations e_X, e_Y and vertical deviation e_Z of the hovering control accuracy.

Since O_L is the origin of the coordinate system, the calculation formulas take the form:

S_H = \sqrt{\bar{x}^2 + \bar{y}^2}, \quad S_V = |\bar{z} - H_0|

e_X = \max_n x_n - \min_n x_n, \quad e_Y = \max_n y_n - \min_n y_n, \quad e_Z = \max_n z_n - \min_n z_n

where e_X and e_Y are, during the UAV's hover, the ranges of motion along the X and Y axes in the camera coordinate system with the left camera as origin, and z_n denotes the Z-axis coordinate of the flight trajectory point P_n.
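The hover statistics of step (2-7) follow directly from the trajectory. The sketch below computes the centroid offset from the hover point (fixed-point accuracy) and the per-axis motion ranges (control accuracy); since the original formulas are not reproduced in this text, treat the exact metric definitions as reconstructions consistent with the stated descriptions.

```python
from math import hypot

def hover_accuracy(points, H0):
    """points: list of (x, y, z) positions in the camera frame with the
    left camera (the hover point) as origin; H0: prescribed hover
    height. Returns (S_H, S_V, e_X, e_Y, e_Z)."""
    n = len(points)
    mx = sum(p[0] for p in points) / n   # centroid x-bar
    my = sum(p[1] for p in points) / n   # centroid y-bar
    mz = sum(p[2] for p in points) / n   # centroid z-bar
    S_H = hypot(mx, my)                  # horizontal offset from hover point
    S_V = abs(mz - H0)                   # vertical offset from set height
    e_X = max(p[0] for p in points) - min(p[0] for p in points)
    e_Y = max(p[1] for p in points) - min(p[1] for p in points)
    e_Z = max(p[2] for p in points) - min(p[2] for p in points)
    return S_H, S_V, e_X, e_Y, e_Z
```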

Beneficial effects of the present invention:

1. It computes the UAV's three-dimensional flight trajectory in real time and automatically calculates the hovering accuracy, improving the accuracy and standardization of the measurement;

2. No modification of the UAV is required, so the system is highly extensible;

3. The whole measurement process requires almost no human participation and is highly automated;

4. The equipment is simple to use: placing the binocular camera below the preset hovering point is enough to obtain the UAV's flight trajectory and hovering accuracy on the computer;

5. Machine-vision-based three-dimensional positioning integrates target detection, tracking, matching, and three-dimensional positioning: the camera only needs to be placed at the specified position, and all related algorithms are executed and displayed on the computer. For inspection and testing tasks, an effective approach is therefore to use binocular vision, combined with the special circumstances and requirements of UAV inspection and testing, to realize detection, tracking, precise matching, and three-dimensional positioning of the UAV target.

Description of the drawings

Fig. 1 is the overall flow chart of the present invention;

Fig. 2 is the flow chart of the calibration stage of the present invention;

Fig. 3 is the flow chart of the positioning stage of the present invention;

Fig. 4(a) shows positioning based on airborne GPS equipment;

Fig. 4(b) shows positioning based on airborne video equipment;

Fig. 4(c) shows positioning based on airborne inertial navigation equipment;

Fig. 4(d) shows positioning based on a ground ultrasonic rangefinder;

Fig. 4(e) shows positioning based on a ground laser rangefinder;

Fig. 4(f) shows positioning based on ground machine vision;

Fig. 5 shows the binocular-vision-based UAV hovering accuracy measurement system;

Fig. 6 is a schematic diagram of binocular camera calibration;

Fig. 7 illustrates target tracking and matching;

Fig. 8 illustrates same-name point matching across the left and right eyes and consecutive frames;

Fig. 9 is a schematic diagram of the coordinate system of the binocular positioning system;

Fig. 10 is a schematic diagram of the hovering error.

Detailed description

The present invention is further described below in conjunction with the accompanying drawings and embodiments.

The hardware consists of: two industrial cameras, one camera slide rail, one industrial computer with monitor, and one checkerboard calibration board. The overall system structure is shown in Fig. 5.

1, 2: industrial cameras. Lens focal length 5 mm, resolution 1384×1032, frame rate 16 frames per second.

3: camera slide rail. Length 1.2 m, graduated with a scale interval of 1 mm.

4: industrial computer and monitor. ADLINK MXC-6000 industrial computer.

5: checkerboard calibration board. 19×17 squares, each square 20 mm wide.

6: UAV. Unmanned helicopter or multi-rotor UAV.
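For calibration with the board listed above (19×17 squares of 20 mm), the world coordinates of the board's inner corners can be generated as follows; a grid of 19×17 squares has 18×16 inner corners. This is an illustrative sketch of the calibration input, not code from the patent.

```python
def board_object_points(squares_x=19, squares_y=17, square_mm=20.0):
    """World coordinates (in mm, Z = 0 on the board plane) of the inner
    corners of a checkerboard with the given number of squares."""
    nx, ny = squares_x - 1, squares_y - 1   # inner corners per row/column
    return [(i * square_mm, j * square_mm, 0.0)
            for j in range(ny) for i in range(nx)]
```

These object points, paired with the detected corner pixels in each view, are the inputs a checkerboard calibration consumes.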

As shown in Fig. 1, the whole process is divided into two stages: a calibration stage and a positioning stage. In the calibration stage, the cameras are calibrated with Zhang Zhengyou's checkerboard calibration method [1]. In the positioning stage, the three-dimensional position of the UAV target is computed from the calibration parameters.

3.2 Calibration stage:

The theoretical problems of calibration algorithms have essentially been solved. Current camera calibration methods fall into three classes: traditional calibration, active-vision calibration, and self-calibration. Traditional calibration computes the camera parameters from a calibration template with known geometric constraints; its equipment is simple and its precision high, making it the most commonly used method. Active-vision calibration fixes the camera on a mechanical mechanism such as a pan-tilt head and strictly constrains the camera's rotation and translation; its precision is high, but the equipment is complex and calibration takes long. Self-calibration computes the camera parameters only from correspondences between multiple images of the scene; it is flexible and convenient, but it is a nonlinear calibration with low robustness.

The present invention uses the traditional calibration method, extending the method of reference [1]. Define the calibration result result = {M_left, D_left, M_right, D_right, R, T}, where result denotes the calibration result, M_left and D_left denote the camera matrix and distortion-coefficient vector of the left camera, M_right and D_right those of the right camera, and R and T the rotation matrix and translation vector between the two cameras. As shown in Figs. 2 and 6, the calibration steps are:

1. Fix the two cameras on the same slide rail, define a distance L, and adjust the positions of the two cameras on the rail so that the distance between their centre points is L.

2. Calibrate the cameras with Zhang Zhengyou's checkerboard calibration method and record the calibration result parameters in the following form:

result = {M_left, D_left, M_right, D_right, R, T}

where result denotes the calibration result, M_left and D_left denote the camera matrix and distortion-coefficient vector of the left camera, M_right and D_right those of the right camera, and R and T the rotation matrix and translation vector between the two cameras.

3.3 Positioning stage:

As shown in Fig. 5, when measuring the UAV's hovering accuracy, the binocular cameras are fixed in parallel on the slide rail; the imaging planes of the two cameras should lie in the same plane as far as possible, with the optical axes parallel to each other. The slide rail is placed directly below the UAV's hovering point, with the camera lenses pointing vertically upward. The left and right cameras capture UAV images and transmit them to a portable computer over a GigE gigabit network. From the left-eye and right-eye images, the portable computer locates the UAV's three-dimensional position coordinates with the algorithms described below. After hovering ends, the hovering accuracy is calculated from the UAV's three-dimensional trajectory.

As shown in Fig. 3, the three-dimensional positioning algorithm of the UAV consists of the following steps: target area positioning and tracking, left-right target matching, same-name point matching, three-dimensional coordinate calculation, and hovering accuracy evaluation.

Target area positioning and tracking

The target area can be obtained by manual selection or automatic detection. Since the background of the UAV target image is a single static background, salient-object detection can be used to obtain the target area. As shown in Fig. 7, at t = 0, after the target position in the left-eye image (solid-line rectangle) is obtained by manual selection or automatic detection, the TLD (Tracking-Learning-Detection) algorithm [2] is used for tracking. The advantage of the TLD algorithm is that when the target leaves and re-enters the image area, the algorithm can still detect and track it.

Left-right target matching

Suppose the target area of the left-eye image is first determined at t = 0; as shown in Fig. 7, the target area is a rectangular region with B_L as its upper-left corner, height h, and width w. Left-right target matching consists of finding, in the right-eye image, the matching area most similar to the left-eye target area; the matching area is a rectangular region with B_R as its upper-left corner, height h, and width w. Target matching can then be expressed as the problem

(x_R, y_R) = \arg\min \sum_{i=1}^{w} \sum_{j=1}^{h} \left| I_{left}(x_L + i, y_L + j) - I_{right}(x_R + i, y_R + j) \right|   Formula (1)

where I_left and I_right denote the grey values of the left-eye and right-eye images, and (x_L, y_L) and (x_R, y_R) are the coordinates of the points B_L and B_R. The search range is x_R ∈ [0, x_L] and y_R ∈ [y_L − s_h/2, y_L + s_h/2], shown as the grey area of the t = 0 image in Fig. 7. After finding the (x_R, y_R) that minimises formula (1), the disparity is d = x_L − x_R.

At time t + 1, the left-eye image yields a new target area through the TLD algorithm, and the search range in the right-eye image is updated accordingly, as shown by the grey area of the t + 1 image. The target area in the right-eye image is then determined by formula (1). Proceeding by analogy, the target areas of the left-eye and right-eye images are computed for every frame.

Same-name point matching

Same-name points are the pixels in the left-eye and right-eye images that correspond to the same part of the actual target. As shown in Fig. 8, it must be ensured that the same-name points in consecutive frames and in the left and right images correspond to the same position on the actual target. A simple way is to use the centre points of the target areas in the images, (x_L + w/2, y_L + h/2) and (x_R + w/2, y_R + h/2), as the same-name points of the left-eye and right-eye targets.

Three-dimensional coordinate calculation

The camera coordinate system takes the optical centre O_L of the left camera as its origin, with the X O_L Y plane parallel to the imaging plane and the optical axis as the Z axis, as shown in Fig. 9. From the calibrated camera parameters, the reprojection matrix is obtained:

Q = \begin{bmatrix} 1 & 0 & 0 & -c_x \\ 0 & 1 & 0 & -c_y \\ 0 & 0 & 0 & f_l \\ 0 & 0 & -1/T_x & (c_x - c_x')/T_x \end{bmatrix}   Formula (2)

where (c_x, c_y) and (c_x', c_y') are the principal-point coordinates of the left and right cameras (c_y' is not used in the formula); T_x is the X-axis component of the translation matrix between the two cameras; and f_l is the focal length of the left camera. With the left and right optical axes parallel to each other, given the same-name point coordinates P_L(x_L, y_L) and P_R(x_R, y_R) of the left-eye and right-eye images, compute the disparity of the target point between the left and right views, d = x_L − x_R, and let

[X, Y, Z, W]^T = Q \cdot [x_L, y_L, d, 1]^T   Formula (3)

This yields the three-dimensional coordinates of the target point in the camera coordinate system:

P_c = (x_c, y_c, z_c) = (X/W, Y/W, Z/W)   Formula (4)

Hovering accuracy evaluation

While the UAV hovers, the above method computes its three-dimensional position coordinates in real time to obtain its flight trajectory. Let the set of trajectory points be P = [P_1, P_2, ..., P_N], containing N points in total, where P_n = [x_n, y_n, z_n]^T, n = 1, 2, ..., N. The points in Fig. 10 are the projections of the flight trajectory points onto the horizontal plane. The centroid of the set of flight trajectory points is \bar{P} = [\bar{x}, \bar{y}, \bar{z}]^T = (1/N) \sum_{n=1}^{N} P_n. During hovering accuracy detection, the binocular camera is placed directly below the hovering point; specifically, the left camera is placed at the hovering point with its optical axis perpendicular to the horizontal plane. Hovering accuracy is divided into the horizontal deviation S_H and vertical deviation S_V of the fixed-point hovering accuracy, and the horizontal deviations e_X, e_Y and vertical deviation e_Z of the hovering control accuracy. Since O_L is the origin of the coordinate system, the calculation formulas take the form:

S_H = \sqrt{\bar{x}^2 + \bar{y}^2}, \quad S_V = |\bar{z} - H_0|

e_X = \max_n x_n - \min_n x_n, \quad e_Y = \max_n y_n - \min_n y_n, \quad e_Z = \max_n z_n - \min_n z_n

参考文献:references:

[1] Zhang Z. A flexible new technique for camera calibration [J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2000, 22(11): 1330-1334.

[2] Kalal Z, Mikolajczyk K, Matas J. Tracking-learning-detection [J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2012, 34(7): 1409-1422.

Although the specific embodiments of the present invention have been described above in conjunction with the accompanying drawings, they do not limit the protection scope of the present invention. Those skilled in the art should understand that, on the basis of the technical solution of the present invention, various modifications or variations that can be made without creative effort still fall within the protection scope of the present invention.

Claims (8)

1. A binocular-vision-based method for measuring the hovering accuracy of an unmanned aerial vehicle (UAV), characterized by comprising the following steps:

Step (1): Calibration stage: calibrate the cameras using Zhang Zhengyou's checkerboard calibration method, thereby determining the calibration parameters and defining the calibration result parameters.

Step (2): Positioning stage: when measuring UAV hovering accuracy, place the slide rail directly below the UAV hovering point and fix the binocular cameras on the slide rail in parallel at a set distance; the cameras can move along the rail, the lenses point vertically upward, the imaging planes of the two cameras lie in the same plane, and the optical axes are parallel. The left and right cameras each capture images of the UAV and transmit them to a computer; the computer computes the three-dimensional position coordinates of the UAV from the captured left and right images together with the calibration result parameters obtained in step (1); after hovering ends, the hovering accuracy is computed from the UAV's three-dimensional trajectory.

Step (2) comprises:

Step (2-1): when measuring UAV hovering accuracy, place the slide rail directly below the UAV hovering point, fix the binocular cameras on the slide rail in parallel, with the lenses placed vertically upward, the imaging planes in the same plane, and the optical axes parallel; the left camera captures the left image of the UAV and the right camera captures the right image;

Step (2-2): target region detection: obtain the target region in the left image using salient-object detection;

Step (2-3): target tracking: track the target in the left image using the TLD algorithm;

Step (2-4): target matching: in the right image, find the matching region most similar to the target region of the left image;

Step (2-5): corresponding-point matching: use the center points of the rectangular target regions in the left and right images as corresponding points;

Step (2-6): three-dimensional coordinate computation: establish the camera coordinate system and, with the calibration result parameters obtained in step (1), compute the three-dimensional coordinates of the target point in the camera coordinate system;

Step (2-7): hovering accuracy evaluation: compute the hovering accuracy of the UAV from the three-dimensional coordinate trajectory of the target point.

2. The method of claim 1, characterized in that step (1) comprises:

Step (1-1): fix the two cameras on the same slide rail, define a distance L, and adjust the positions of the two cameras on the rail so that the distance between their center points is L;

Step (1-2): calibrate the cameras using Zhang Zhengyou's checkerboard calibration method and record the calibration result parameters result = {M_left, D_left, M_right, D_right, R, T}, where result denotes the calibration result, M_left and D_left are the camera matrix and distortion coefficient vector of the left camera, M_right and D_right are those of the right camera, and R and T are the rotation matrix and translation vector between the two cameras. For each camera,

$$M=\begin{bmatrix} f_x & 0 & c_x \\ 0 & f_y & c_y \\ 0 & 0 & 1 \end{bmatrix};$$

where M is the camera matrix, $f_x$ and $f_y$ are the focal lengths in pixels, and $c_x$ and $c_y$ are the horizontal and vertical coordinates of the camera's principal point.

3. The method of claim 1, characterized in that step (2-2) comprises: let t = 0 be the moment at which target positioning begins; first obtain the target region by salient-object detection, the target region being the rectangle with $B_L$ as its top-left corner, height h, and width w.

4. The method of claim 1, characterized in that step (2-3) comprises: based on the left-image target region determined at t = 0, track the target in the left image at t = 1 and thereafter using the TLD algorithm.

5. The method of claim 1, characterized in that step (2-4) comprises: each time the target region in the left image is obtained, find in the right image the matching region most similar to it, the matching region being the rectangle with $B_R$ as its top-left corner, height h, and width w. Target matching is then expressed as:

$$\min_{x_R,y_R}\sum_{i=0}^{h}\sum_{j=0}^{w}\left|I_{left}[x_L+i][y_L+j]-I_{right}[x_R+i][y_R+j]\right|\qquad(1)$$

where $I_{left}$ and $I_{right}$ are the gray values of the left and right images, $(x_L, y_L)$ are the coordinates of point $B_L$, and $(x_R, y_R)$ are the coordinates of point $B_R$. At this time the search range is $x_R\in[0,x_L]$, where $s_h$ is the height of the search region and $s_w$ its width. After obtaining the coordinates $(x_R, y_R)$ of point $B_R$ that minimize formula (1), the disparity is $d = x_L - x_R$. At time t ≥ 1, after the left image yields a new target region through the TLD algorithm, the search range of the right image is updated and the target region in the right image is determined by formula (1); in this way the target region in each frame of the left image, and the corresponding region of the same target in the right image, are computed.

6. The method of claim 1, characterized in that step (2-5) comprises: use the center point of the target region in the left image as the corresponding point of the left-image target, and the center point of the target region in the right image as the corresponding point of the right-image target.

7. The method of claim 1, characterized in that step (2-6) comprises: the camera coordinate system takes the optical center $O_L$ of the left camera as the origin, with the $XO_LY$ plane parallel to the imaging plane and the optical axis as the Z axis. From the calibrated camera parameters, the reprojection matrix is obtained:

$$Q=\begin{bmatrix} 1 & 0 & 0 & -c_x^l \\ 0 & 1 & 0 & -c_y^l \\ 0 & 0 & 0 & f^l \\ 0 & 0 & -1/T_x & (c_x^l-c_x^r)/T_x \end{bmatrix}\qquad(2)$$

where $(c_x^l, c_y^l)$ are the principal-point coordinates of the left camera and $c_x^r$ is the horizontal principal-point coordinate of the right camera; $T_x$ is the X-axis component of the translation between the two cameras; $f^l$ is the focal length of the left camera.

With the left and right optical axes parallel, given the corresponding-point coordinates $P_L(x_L, y_L)$ in the left image and $P_R(x_R, y_R)$ in the right image, compute the disparity $d = x_L - x_R$ of the target point between the two views, and let

$$\begin{bmatrix}\hat{x}_c\\\hat{y}_c\\\hat{z}_c\\\hat{w}_c\end{bmatrix}=Q\begin{bmatrix}x_L\\y_L\\d\\1\end{bmatrix}=\begin{bmatrix}x_L-c_x^l\\y_L-c_y^l\\f^l\\\dfrac{-d+c_x^l-c_x^r}{T_x}\end{bmatrix}\qquad(3)$$

to obtain the three-dimensional coordinates of the target point in the camera coordinate system:

$$P_c=(x_c,y_c,z_c)=\left(\hat{x}_c/\hat{w}_c,\ \hat{y}_c/\hat{w}_c,\ \hat{z}_c/\hat{w}_c\right)\qquad(4);$$

where $\hat{x}_c$, $\hat{y}_c$, $\hat{z}_c$, and $\hat{w}_c$ are intermediate result variables, and $x_c$, $y_c$, and $z_c$ are the X, Y, and Z coordinates of the target point $P_c$ in the camera coordinate system.

8. The method of claim 1, characterized in that step (2-7) comprises:

While the UAV hovers, compute its three-dimensional position coordinates in real time using step (2-5) to obtain its flight trajectory. Suppose the set of trajectory points is $P=[P_1,P_2,...,P_N]$, containing N points in total, where $P_n=[x_n,y_n,z_n]^T$, n = 1, 2, ..., N; the centroid of the trajectory-point set is $P_m=(x_m,y_m,z_m)$.

When testing UAV hovering accuracy, the specified hover height above the ground is denoted $H_0$. During testing, the binocular camera is placed directly below the hover point; specifically, the left camera is placed at the hovering point with its optical axis perpendicular to the horizontal plane. Hovering accuracy is divided into the horizontal deviation $E_{Hovering}^h$ and vertical deviation $E_{Hovering}^v$ of fixed-point hovering accuracy, and the horizontal deviation $E_{Control}^h$ and vertical deviation $E_{Control}^v$ of hovering control accuracy. Since $O_L$ is the origin of the coordinate system, the formulas are:

$$E_{Hovering}^{h}=\sqrt{x_m^2+y_m^2},$$

$$E_{Hovering}^{v}=\left|z_m-H_0\right|,$$

$$E_{Control}^{h}=\frac{\sqrt{e_X^2+e_Y^2}}{2},$$

$$E_{Control}^{v}=\frac{\max_{n\in[1,N]}(z_n)-\min_{n\in[1,N]}(z_n)}{2}$$

where $e_X$ and $e_Y$ are the UAV's ranges of motion along the X and Y axes, in the camera coordinate system with the left camera as origin, during the hover, and $z_n$ is the Z coordinate of UAV trajectory point $P_n$.
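Formulas (2)–(4) of claim 7 reduce, for a rectified parallel-axis rig, to a direct per-point triangulation. A minimal pure-Python sketch (the function name and the sample intrinsics used below are illustrative, not from the patent):

```python
def triangulate(xl, yl, xr, cxl, cyl, cxr, f_l, t_x):
    """Map a matched point pair to camera coordinates via formulas (3)-(4).

    (xl, yl) and (xr, yl): corresponding points in the left/right images;
    cxl, cyl: left principal point; cxr: right principal point (x);
    f_l: left focal length in pixels; t_x: X component of the translation
    between the two cameras.
    """
    d = xl - xr                           # disparity d = x_L - x_R
    x_hat = xl - cxl                      # rows of Q @ [x_L, y_L, d, 1]^T
    y_hat = yl - cyl
    z_hat = f_l
    w_hat = (-d + cxl - cxr) / t_x
    # Formula (4): divide through by the homogeneous coordinate w_hat
    return (x_hat / w_hat, y_hat / w_hat, z_hat / w_hat)
```

With identical principal points (cxl = cxr) this collapses to the familiar depth formula $z_c = -f^l T_x / d$; for example, with f_l = 800 px, t_x = -0.1 and a 20 px disparity the depth is 4.0.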



