




Technical Field
The invention relates to an inertial/visual odometry integrated navigation and positioning method based on measurement model optimization, and belongs to the technical field of inertial/visual odometry integrated navigation.
Background Art
In recent years, with the development of intelligent mobile platforms such as unmanned aerial vehicles, unmanned ground vehicles and robots, higher requirements have been placed on the performance of navigation systems.
Under the prior art, navigation systems widely used on intelligent mobile platforms are mostly implemented as inertial sensor (IMU: accelerometers and gyroscopes)/satellite integrated navigation systems. However, in environments such as cities, forests and indoor areas where satellite signals are interfered with or unavailable, an IMU/satellite integrated navigation system has difficulty providing accurate navigation information. Visual odometry is a six-degree-of-freedom motion estimation technique based on processing sequences of related images. Because it relies only on a passively measuring camera, it offers good autonomy, good concealment, a simple structure and low cost, and it is bound to become an important information unit in future navigation systems of intelligent mobile platforms.
The error of an inertial navigation system is mainly caused by the measurement errors of the inertial sensors (IMU: accelerometers and gyroscopes). Inertial navigation errors accumulate over time, but the measurement accuracy is not affected by sudden maneuvers. Visual odometry, by contrast, solves for pose recursively and therefore inevitably accumulates error; similar to a photoelectric-encoder-based odometer, this accumulated error is spatially rather than temporally correlated. Moreover, when the vehicle maneuvers too aggressively, the image scene changes so much that matching fails and positioning accuracy degrades. According to how image information is used, visual odometry methods fall mainly into feature-based methods, which track and match feature points and are represented by open-source algorithms such as LIBVISO and PTAM, and direct methods, which match pixel intensities directly and are represented by open-source algorithms such as SVO and LSD-SLAM. These representative algorithms use different motion estimation schemes, and the relationship between vehicle maneuvers and the noise of the output pose is difficult to model accurately. Given the complementarity of the two, integrated navigation based on inertial navigation and visual odometry has received extensive attention, but most existing IMU/visual odometry integrations directly combine attitude and position, ignoring the fact that visual odometry positioning error is cumulative and maneuver-dependent. How to effectively fuse inertial navigation and visual odometry information is therefore of significant research interest.
Summary of the Invention
To overcome the deficiencies of the prior art, the purpose of the present invention is to provide an inertial/visual odometry integrated navigation and positioning method based on measurement model optimization, which effectively utilizes the angular velocity, linear velocity and position information of the visual odometry during the motion of the carrier, achieves optimal fusion with inertial navigation, and improves the accuracy of the integrated navigation system.
To achieve the above objective, the present invention adopts the following technical solution:
An inertial/visual odometry integrated navigation and positioning method based on measurement model optimization, characterized by the following steps:
Step 1) Establish the state equation of the integrated navigation system, extending the inertial sensor errors into system state variables, including the random constant drift of the gyroscope, the first-order Markov process drift of the gyroscope and the first-order Markov process drift of the accelerometer;
Step 2) Using the relative pose output by the visual odometry, derive the transformation from the relative pose to the attitude and position in the navigation frame;
Step 3) Select the measurement variables of the integrated navigation system and construct the measurement equation of the inertial/visual odometry integrated Kalman filter;
Step 4) Discretize the system state equation and measurement equation, and use a Kalman filter to estimate the system state and apply feedback correction.
In the aforementioned inertial/visual odometry integrated navigation and positioning method based on measurement model optimization, the state equation of the system in step 1) is:
Ẋ(t) = F(t)X(t) + G(t)W(t), with the state vector X = [φ_x, φ_y, φ_z, δv_x^n, δv_y^n, δv_z^n, δL, δλ, δh, ε_bx, ε_by, ε_bz, ε_rx, ε_ry, ε_rz, ∇_ax, ∇_ay, ∇_az]^T, where X is the state vector; φ_x, φ_y and φ_z are the three-axis mathematical platform error angles of the strapdown inertial navigation system about the x, y and z axes; δv^n is the velocity error of the carrier, with components δv_x^n, δv_y^n, δv_z^n along the x, y and z axes of the geographic frame; δp^n is the position error of the carrier, with δL, δλ and δh the latitude, longitude and altitude errors, respectively; ε^b is the random constant drift of the gyroscope, with components ε_bx, ε_by, ε_bz about the gyroscope x, y and z axes; ε_r^b is the first-order Markov process drift of the gyroscope, with components ε_rx, ε_ry, ε_rz about the gyroscope x, y and z axes; ∇^b is the first-order Markov process drift of the accelerometer, with components ∇_ax, ∇_ay, ∇_az along the accelerometer x, y and z axes;
The superscript n denotes the geographic frame, the superscript b denotes the body frame, the subscript I denotes values solved by the inertial navigation system, and the subscript V denotes values solved by the visual odometry;
where δω_in^n is the angular rate error of the geographic frame relative to the inertial frame, ω_in^n is the angular rate of the geographic frame relative to the inertial frame, C_b^n denotes the coordinate transformation matrix from the body frame to the geographic frame, f^n is the specific force output of the accelerometer in the geographic frame, δω_ie^n is the angular rate error of the Earth frame relative to the inertial frame, δω_en^n is the angular rate error of the geographic frame relative to the Earth frame, ω_ie^n is the angular rate of the Earth frame relative to the inertial frame, ω_en^n is the angular rate of the geographic frame relative to the Earth frame, δg^n is the gravitational acceleration error, v^n is the velocity of the carrier in the geographic frame, v_x^n is the velocity of the carrier along the x axis of the geographic frame, R_m is the radius of the prime vertical circle, R_n is the radius of the meridian circle, h is the local altitude, L is the local latitude, T_g and T_a are the correlation times of the first-order Markov processes, and w_r and w_a are the corresponding driving white noises.
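For orientation, a commonly used standard form of the strapdown error-state dynamics over the variables defined above is sketched below. This is an assumed reconstruction based on the conventional phi-angle error model (w_g denotes the gyroscope measurement white noise, and only the dominant position-error coupling terms are kept); the exact expressions and sign conventions of the original formulation may differ.

```latex
\begin{aligned}
\dot{\boldsymbol{\phi}} &= \boldsymbol{\phi}\times\boldsymbol{\omega}_{in}^{n}
  + \delta\boldsymbol{\omega}_{in}^{n}
  - C_b^n\,(\boldsymbol{\varepsilon}^{b}+\boldsymbol{\varepsilon}_r^{b}+\boldsymbol{w}_g) \\
\delta\dot{\boldsymbol{v}}^{n} &= \boldsymbol{f}^{n}\times\boldsymbol{\phi}
  + C_b^n\,(\boldsymbol{\nabla}^{b}+\boldsymbol{w}_a)
  - (2\boldsymbol{\omega}_{ie}^{n}+\boldsymbol{\omega}_{en}^{n})\times\delta\boldsymbol{v}^{n}
  - (2\,\delta\boldsymbol{\omega}_{ie}^{n}+\delta\boldsymbol{\omega}_{en}^{n})\times\boldsymbol{v}^{n}
  + \delta\boldsymbol{g}^{n} \\
\delta\dot{L} &\approx \frac{\delta v_y^{n}}{R_n+h}, \qquad
\delta\dot{\lambda} \approx \frac{\delta v_x^{n}}{(R_m+h)\cos L}
  + \frac{v_x^{n}\tan L}{(R_m+h)\cos L}\,\delta L, \qquad
\delta\dot{h} = \delta v_z^{n} \\
\dot{\boldsymbol{\varepsilon}}^{b} &= \boldsymbol{0}, \qquad
\dot{\boldsymbol{\varepsilon}}_r^{b} = -\frac{1}{T_g}\,\boldsymbol{\varepsilon}_r^{b}+\boldsymbol{w}_r, \qquad
\dot{\boldsymbol{\nabla}}^{b} = -\frac{1}{T_a}\,\boldsymbol{\nabla}^{b}+\boldsymbol{w}_a
\end{aligned}
```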
In the aforementioned inertial/visual odometry integrated navigation and positioning method based on measurement model optimization, the transformation relationship in step 2) from the relative pose to the attitude and position in the navigation frame is: C_{b,k+1}^n = C_{b,k}^n · C_c^b · R_{k,k+1} · (C_c^b)^T and p_{k+1}^n = p_k^n + C_{b,k}^n · C_c^b · t_{k,k+1}, where C_{b,k}^n is the coordinate transformation matrix from the body frame to the geographic frame at the instant of image frame k, C_c^b is the coordinate transformation matrix from the camera frame to the body frame, R_{k,k+1} and t_{k,k+1} are the rotation matrix and translation vector of the image frame k+1 instant relative to the image frame k instant measured by the visual odometry, p_k^n is the position of the carrier in the geographic frame at the instant of image frame k, the superscript T denotes the transpose of a matrix, and the subscripts k and k+1 denote the physical quantities at the corresponding frame instants.
In the aforementioned inertial/visual odometry integrated navigation and positioning method based on measurement model optimization, the measurement equation of step 3) is:
Z = [(ω_nb^b)_I − (ω_nb^b)_V; v_I^n − v_V^n; p_I^n − p_V^n] = HX + V, where δω_nb^b is the angular velocity error of the carrier relative to the geographic frame, δv^n is the velocity error of the carrier, δp^n is the position error of the carrier, the subscript I denotes values solved by the inertial navigation system, the subscript V denotes values solved by the visual odometry, H is the measurement coefficient matrix of the system, and V is the measurement noise matrix of the system;
where C_n^b denotes the coordinate transformation matrix from the geographic frame to the body frame, M_v and M_p are coefficient blocks related to the velocity and the position respectively, v_x and v_y are the components of the carrier velocity along the x and y axes, |·| denotes the modulus of the corresponding vector, and I is the identity matrix.
In the aforementioned inertial/visual odometry integrated navigation and positioning method based on measurement model optimization, the specific process of step 4) is:
401) Discretize the system state equation and measurement equation:
X_k = Φ_{k,k-1} X_{k-1} + Γ_{k,k-1} W_{k-1} and Z_k = H_k X_k + V_k, where X_k is the system state at time t_k, X_{k-1} is the system state at time t_{k-1}, Φ_{k,k-1} is the state transition matrix of the system from time t_{k-1} to time t_k, Γ_{k,k-1} is the noise driving matrix of the system from time t_{k-1} to time t_k, W_{k-1} is the system noise matrix at time t_{k-1}, Z_k is the measurement vector of the system at time t_k, H_k is the measurement coefficient matrix at time t_k, and V_k is the noise matrix of the observations at time t_k;
402) Add a control term U_{k-1} to the discretized equation, and correct the inertial navigation system errors with the closed-loop corrected system state equation: X_k = Φ_{k,k-1} X_{k-1} + B_{k,k-1} U_{k-1} + Γ_{k,k-1} W_{k-1}, where B_{k,k-1} is the control coefficient matrix of the system from time t_{k-1} to time t_k;
403) Obtain the linearized Kalman filter equations of the system:
X̂_{k,k-1} = Φ_{k,k-1} X̂_{k-1}
X̂_k = X̂_{k,k-1} + K_k (Z_k − H_k X̂_{k,k-1})
K_k = P_{k,k-1} H_k^T (H_k P_{k,k-1} H_k^T + R_k)^{-1}
P_{k,k-1} = Φ_{k,k-1} P_{k-1} Φ_{k,k-1}^T + Γ_{k-1} Q_{k-1} Γ_{k-1}^T
P_k = (I − K_k H_k) P_{k,k-1}
where X̂_{k,k-1} is the one-step predicted state from time t_{k-1} to time t_k; X̂_{k-1} is the filtered state estimate at time t_{k-1}; X̂_k is the filtered state estimate at time t_k; K_k is the filter gain matrix at time t_k; Z_k is the measurement at time t_k, from which the innovation sequence Z_k − H_k X̂_{k,k-1} is formed; P_{k,k-1} is the one-step prediction covariance matrix from time t_{k-1} to time t_k; P_{k-1} is the filtered state estimation covariance matrix at time t_{k-1}; Φ_{k,k-1}^T is the transpose of Φ_{k,k-1}; Γ_{k-1} is the system noise coefficient matrix at time t_{k-1}; Q_{k-1} is the system noise covariance matrix at time t_{k-1}; Γ_{k-1}^T is the transpose of Γ_{k-1}; H_k^T is the transpose of H_k; R_k is the measurement noise covariance matrix at time t_k; P_k is the filtered state estimation covariance matrix at time t_k; and I is the identity matrix;
404) Estimate the navigation error values of the system, including the attitude, position and velocity errors, from the linearized Kalman filter equations obtained in step 403), and subtract these error values from the navigation parameters computed by the inertial navigation system to obtain the corrected navigation solution of the measurement-model-optimized inertial/visual odometry integrated navigation.
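As an illustration of step 404), the following minimal Python sketch shows the closed-loop feedback correction of the strapdown solution. The 18-state slice layout and the small-angle attitude-correction convention are illustrative assumptions and are not prescribed by the claim.

```python
import numpy as np

def feedback_correct(att_dcm, vel_n, pos_n, x_hat):
    """Subtract the estimated navigation errors from the INS solution.

    att_dcm : 3x3 body-to-geographic DCM computed by the INS
    vel_n   : velocity in the geographic frame (3-vector)
    pos_n   : position [latitude, longitude, height] (3-vector)
    x_hat   : estimated 18-dim error state, assumed layout
              [phi(0:3), dv(3:6), dp(6:9), sensor drifts(9:18)]
    """
    phi, dv, dp = x_hat[0:3], x_hat[3:6], x_hat[6:9]

    # Attitude: remove the platform misalignment phi (first-order small-angle correction).
    skew = np.array([[0.0, -phi[2], phi[1]],
                     [phi[2], 0.0, -phi[0]],
                     [-phi[1], phi[0], 0.0]])
    att_corrected = (np.eye(3) + skew) @ att_dcm

    vel_corrected = vel_n - dv          # velocity feedback
    pos_corrected = pos_n - dp          # position feedback (dL, dlambda, dh)
    return att_corrected, vel_corrected, pos_corrected
```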
Beneficial effects achieved by the present invention: the method of the present invention treats the visual odometry as an angular velocity, linear velocity and position sensor to obtain measurement data and construct the measurement equation; this conforms to the noise characteristics of the visual odometry output, ensures effective use of the visual odometry information, and gives the method wide applicability. The method applies Kalman filtering to achieve optimal estimation of the state of the inertial/visual odometry integrated navigation system and exploits the complementarity of inertial navigation and visual odometry, effectively improving the accuracy of the integrated navigation system and making it suitable for engineering applications.
Brief Description of the Drawings
Fig. 1 is the architecture diagram of the present invention;
Fig. 2 is a schematic diagram of the real motion trajectory of the vehicle in the KITTI dataset used by the present invention;
Figs. 3(a)-3(c) compare the navigation results of the present invention with those obtained using a direct attitude/position combination: Fig. 3(a) is the position error comparison curve, Fig. 3(b) is the attitude error comparison curve, and Fig. 3(c) is the velocity error comparison curve.
Detailed Description of the Embodiments
The present invention is further described below with reference to the accompanying drawings. The following embodiments are only used to illustrate the technical solution of the present invention more clearly and do not limit the protection scope of the present invention.
As shown in Fig. 1, the principle of the inertial/visual odometry integrated navigation method based on measurement model optimization of the present invention is as follows: the visual odometry module performs distortion correction on the raw images output by the stereo camera, then performs feature extraction and stereo matching on the image sequence, and tracks the feature points to estimate the relative pose. The Kalman filter module establishes the Kalman filter state equation and measurement equation, and uses the raw outputs of the gyroscope and accelerometer for the strapdown inertial navigation solution; at each filter period it performs the time update and the measurement update in turn, achieving optimal estimation of the state of the integrated navigation system and improving its performance.
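To make the visual odometry module concrete, the sketch below outlines one possible stereo front end in Python with OpenCV. The choice of ORB features, brute-force stereo matching, pyramidal Lucas-Kanade tracking, PnP-RANSAC motion estimation and the threshold values are assumptions for illustration; any front end that outputs a relative rotation and translation between frames k and k+1 can be substituted.

```python
import cv2
import numpy as np

def stereo_vo_relative_pose(prev_left, prev_right, cur_left, K, baseline):
    """Estimate the relative motion between image frames k and k+1 (illustrative sketch).

    prev_left, prev_right : rectified grayscale stereo pair at frame k
    cur_left              : grayscale left image at frame k+1
    K                     : 3x3 camera intrinsic matrix
    baseline              : stereo baseline in metres
    Returns R, t mapping a point expressed in the camera frame at k
    into the camera frame at k+1 (x_{k+1} = R @ x_k + t).
    """
    orb = cv2.ORB_create(nfeatures=2000)
    kp_l, des_l = orb.detectAndCompute(prev_left, None)
    kp_r, des_r = orb.detectAndCompute(prev_right, None)

    # Stereo matching at frame k: disparity gives sparse depth.
    bf = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    pts3d, pts2d_prev = [], []
    for m in bf.match(des_l, des_r):
        ul, vl = kp_l[m.queryIdx].pt
        ur, _ = kp_r[m.trainIdx].pt
        disparity = ul - ur
        if disparity > 1.0:                              # keep well-conditioned points
            z = K[0, 0] * baseline / disparity
            pts3d.append([(ul - K[0, 2]) * z / K[0, 0],
                          (vl - K[1, 2]) * z / K[1, 1], z])
            pts2d_prev.append([ul, vl])

    # Temporal tracking of the same left-image points into frame k+1.
    p_prev = np.float32(pts2d_prev).reshape(-1, 1, 2)
    p_cur, status, _ = cv2.calcOpticalFlowPyrLK(prev_left, cur_left, p_prev, None)
    ok = status.ravel() == 1
    obj = np.float32(pts3d)[ok]
    img = p_cur.reshape(-1, 2)[ok]

    # 3D-2D motion estimation (PnP with RANSAC) yields the relative pose.
    _, rvec, tvec, _ = cv2.solvePnPRansac(obj, img, K, None)
    R, _ = cv2.Rodrigues(rvec)
    return R, tvec.ravel()
```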
The specific embodiment of the present invention is as follows:
Step 1) Establish the state equation of the inertial/visual odometry integrated navigation algorithm:
First define the coordinate frames: the geographic coordinate frame with East-North-Up axes is taken as the navigation frame (n), and the body (carrier) coordinate frame (b) is taken as Right-Forward-Up. The state vector to be estimated is given by the following formula:
X = [φ_x, φ_y, φ_z, δv_x^n, δv_y^n, δv_z^n, δL, δλ, δh, ε_bx, ε_by, ε_bz, ε_rx, ε_ry, ε_rz, ∇_ax, ∇_ay, ∇_az]^T, where X is the state vector; φ_x, φ_y and φ_z are the three-axis mathematical platform error angles of the strapdown inertial navigation system about the x, y and z axes; δv^n is the velocity error of the carrier, with components δv_x^n, δv_y^n, δv_z^n along the x, y and z axes of the geographic frame; δp^n is the position error of the carrier, with δL, δλ and δh the latitude, longitude and altitude errors, respectively; ε^b is the random constant drift of the gyroscope, with components ε_bx, ε_by, ε_bz about the gyroscope x, y and z axes; ε_r^b is the first-order Markov process drift of the gyroscope, with components ε_rx, ε_ry, ε_rz about the gyroscope x, y and z axes; ∇^b is the first-order Markov process drift of the accelerometer, with components ∇_ax, ∇_ay, ∇_az along the accelerometer x, y and z axes. The superscript n denotes the geographic frame and the superscript b denotes the body frame;
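A minimal sketch of how the 18-dimensional error-state vector defined above can be laid out in code is given below; the index order follows the listing above, and the initial covariance values are purely illustrative.

```python
import numpy as np

# Index layout of the 18-dimensional error state X (order as listed above).
PHI   = slice(0, 3)    # mathematical platform error angles phi_x, phi_y, phi_z
DVEL  = slice(3, 6)    # velocity errors in the geographic frame
DPOS  = slice(6, 9)    # position errors dL, dlambda, dh
EPS_B = slice(9, 12)   # gyro random constant drifts
EPS_R = slice(12, 15)  # gyro first-order Markov process drifts
NABLA = slice(15, 18)  # accelerometer first-order Markov process drifts

X = np.zeros(18)                            # error state, initially zero
P = np.diag(np.concatenate([                # example initial covariance (illustrative)
    np.full(3, np.deg2rad(0.1) ** 2),       # attitude error variance (rad^2)
    np.full(3, 0.1 ** 2),                   # velocity error variance (m/s)^2
    np.full(3, 1e-10),                      # position error variance (rad^2; m^2 for height)
    np.full(3, np.deg2rad(0.01 / 3600) ** 2),   # gyro constant drift variance
    np.full(3, np.deg2rad(0.01 / 3600) ** 2),   # gyro Markov drift variance
    np.full(3, (1e-4 * 9.8) ** 2),          # accelerometer Markov drift variance
]))
```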
The IMU measurements serve as the input variables of the state equation, which is expressed as:
Ẋ(t) = F(t)X(t) + G(t)W(t), where δω_in^n is the angular rate error of the geographic frame relative to the inertial frame, ω_in^n is the angular rate of the geographic frame relative to the inertial frame, C_b^n denotes the coordinate transformation matrix from the body frame to the geographic frame, f^n is the specific force output of the accelerometer in the geographic frame, δω_ie^n is the angular rate error of the Earth frame relative to the inertial frame, δω_en^n is the angular rate error of the geographic frame relative to the Earth frame, ω_ie^n is the angular rate of the Earth frame relative to the inertial frame, ω_en^n is the angular rate of the geographic frame relative to the Earth frame, δg^n is the gravitational acceleration error, v^n is the velocity of the carrier in the geographic frame, v_x^n is the velocity of the carrier along the x axis of the geographic frame, R_m is the radius of the prime vertical circle, R_n is the radius of the meridian circle, h is the local altitude, L is the local latitude, T_g and T_a are the correlation times of the first-order Markov processes, and w_r and w_a are the corresponding driving white noises.
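The Earth curvature radii R_m and R_n and the first-order Markov drift terms that appear in the state equation can be evaluated as in the sketch below (WGS-84 constants; following the notation of the text, R_m is the prime vertical radius and R_n the meridian radius; the sample parameter values are illustrative).

```python
import numpy as np

A_WGS84 = 6378137.0           # semi-major axis (m)
E2_WGS84 = 6.69437999014e-3   # first eccentricity squared

def curvature_radii(lat_rad):
    """Prime vertical radius R_m and meridian radius R_n at latitude L (rad)."""
    s2 = np.sin(lat_rad) ** 2
    r_m = A_WGS84 / np.sqrt(1.0 - E2_WGS84 * s2)                       # prime vertical
    r_n = A_WGS84 * (1.0 - E2_WGS84) / (1.0 - E2_WGS84 * s2) ** 1.5    # meridian
    return r_m, r_n

def markov_drift_step(eps_r, dt, t_corr, sigma, rng):
    """One discrete step of the first-order Markov drift model
    d(eps_r)/dt = -eps_r / t_corr + w_r, discretised exactly over dt."""
    a = np.exp(-dt / t_corr)
    q = sigma ** 2 * (1.0 - a ** 2)            # discrete driving-noise variance
    return a * eps_r + rng.normal(0.0, np.sqrt(q), size=np.shape(eps_r))

# Example: radii at 32 deg latitude and one gyro-drift step at 100 Hz (illustrative values).
r_m, r_n = curvature_radii(np.deg2rad(32.0))
eps = markov_drift_step(np.zeros(3), dt=0.01, t_corr=3600.0,
                        sigma=np.deg2rad(0.01) / 3600.0, rng=np.random.default_rng(0))
```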
Step 2) Establish the measurement equation of the inertial/visual odometry integrated navigation algorithm optimized based on the measurement model
The visual odometry estimates the camera motion between adjacent images by matching and tracking information in the image sequence. Its output is the relative pose between two consecutive sampling instants, represented by a rotation matrix R_{k,k+1} and a translation vector t_{k,k+1}, where k and k+1 denote the image frame indices.
The following formulas express the transformation from the relative pose to the attitude and position in the navigation frame: C_{b,k+1}^n = C_{b,k}^n · C_c^b · R_{k,k+1} · (C_c^b)^T and p_{k+1}^n = p_k^n + C_{b,k}^n · C_c^b · t_{k,k+1}.
In the formulas, C_{b,k}^n is the coordinate transformation matrix from the body frame to the geographic frame at the instant of image frame k, p_k^n is the position of the carrier in the geographic frame at that instant, and C_c^b denotes the coordinate transformation matrix from the camera frame to the body frame; it is generally assumed that C_c^b has been calibrated before the experiment and remains unchanged during one experiment. The subscripts k and k+1 denote the values of the corresponding physical quantities at the instants of the corresponding image frames.
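The attitude and position update written above can be sketched in code as follows. The frame conventions assumed for R_{k,k+1} and t_{k,k+1} (mapping the camera frame at k+1 into the camera frame at k, with the translation expressed in the camera frame at k) and the use of a local-level metric position are illustrative assumptions.

```python
import numpy as np

def vo_to_navigation_frame(C_nb_k, p_n_k, C_bc, R_k_kp1, t_k_kp1):
    """Propagate attitude and position in the navigation frame from the VO relative pose.

    C_nb_k   : 3x3 DCM, body frame -> geographic frame at image frame k
    p_n_k    : carrier position at frame k (local-level metric coordinates, assumed)
    C_bc     : 3x3 DCM, camera frame -> body frame (calibrated, constant)
    R_k_kp1  : VO rotation mapping camera frame k+1 into camera frame k (assumed convention)
    t_k_kp1  : VO translation of frame k+1 expressed in camera frame k (assumed convention)
    """
    # Attitude: chain body -> camera -> (VO rotation) -> camera -> body, expressed in n.
    C_nb_kp1 = C_nb_k @ C_bc @ R_k_kp1 @ C_bc.T
    # Position: rotate the camera-frame translation into the geographic frame and accumulate.
    p_n_kp1 = p_n_k + C_nb_k @ C_bc @ t_k_kp1
    return C_nb_kp1, p_n_kp1
```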
Since visual odometry implementations based on different solution principles and methods have different noise characteristics, without loss of generality the visual odometry is treated as a black box. The noise of the directly observed relative pose does not accumulate and is approximately white, so the noise appears in the attitude angle increments and the translation vector, namely:
Δγ_V = Δγ + w_γ, Δθ_V = Δθ + w_θ, Δφ_V = Δφ + w_φ, t_V = t_{k,k+1} + w_t, where Δγ, Δθ and Δφ denote the changes of the roll angle, pitch angle and heading angle from time k to time k+1; Δγ_V, Δθ_V and Δφ_V denote the corresponding measured values containing errors; and w_γ, w_θ, w_φ and w_t denote the white noises of the roll angle, pitch angle, heading angle and translation vector, respectively.
Based on the visual odometry noise characteristics analyzed above, the differences between the angular velocity, linear velocity and position solved by the inertial navigation system and those solved by the visual odometry are selected as the observations of the integrated system, as shown in the following formula:
Z = [(ω_nb^b)_I − (ω_nb^b)_V; v_I^n − v_V^n; p_I^n − p_V^n] = HX + V, where δω_nb^b denotes the angular velocity error of the carrier relative to the geographic frame, δv^n is the velocity error of the carrier, and δp^n is the position error of the carrier. The subscript I denotes values solved by the inertial navigation system, and the subscript V denotes values solved by the visual odometry. H is the measurement matrix of the system.
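Forming the observation vector from the two solutions can be sketched as follows; the ordering (angular rate difference, then velocity difference, then position difference) follows the text, and the function name is illustrative.

```python
import numpy as np

def build_measurement(omega_nb_ins, omega_nb_vo, v_ins, v_vo, p_ins, p_vo):
    """Observation Z: differences between the INS-computed and VO-computed
    angular velocity, linear velocity and position (9-vector)."""
    return np.concatenate([
        omega_nb_ins - omega_nb_vo,   # angular-rate difference (body frame)
        v_ins - v_vo,                 # velocity difference (geographic frame)
        p_ins - p_vo,                 # position difference
    ])
```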
The value of (ω_nb^b)_I can be calculated from the gyroscope output, as shown in the following formula:
(ω_nb^b)_I = ω_ib^b − C_n^b (ω_ie^n + ω_en^n), where ω_ib^b is the gyroscope output, ω_ie^n is the Earth rotation angular rate, and ω_en^n is the angular velocity caused by the motion of the carrier.
Therefore, ignoring second-order small terms, the angular velocity error δω_nb^b can be expressed as a linear function of the state variables.
The value of (ω_nb^b)_V can be obtained from the attitude change rates according to the Euler angle differential equation, which relates the body angular rate to the roll, pitch and heading rates.
Here ω_nbx^b, ω_nby^b and ω_nbz^b denote the components of ω_nb^b along the three body axes.
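One common form of this Euler-angle relation is sketched below. The rotation sequence assumed here is heading about z, then pitch about x, then roll about y, for an East-North-Up navigation frame and a Right-Forward-Up body frame with right-handed angles; the exact matrix depends on the attitude convention of the original text. In the visual odometry case, the rates can be approximated by the attitude increments Δγ, Δθ, Δφ divided by the image frame interval.

```python
import numpy as np

def euler_rates_to_body_rate(roll, pitch, roll_rate, pitch_rate, yaw_rate):
    """Convert Euler-angle rates to the body angular rate w_nb^b (assumed z-x-y sequence)."""
    sr, cr = np.sin(roll), np.cos(roll)
    sp, cp = np.sin(pitch), np.cos(pitch)
    # Each Euler rate acts about its own intermediate axis; express all of them in body axes.
    wx = pitch_rate * cr - yaw_rate * sr * cp
    wy = roll_rate + yaw_rate * sp
    wz = pitch_rate * sr + yaw_rate * cr * cp
    return np.array([wx, wy, wz])
```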
The position error δp^n is a state quantity of the system, and p_V^n can be computed from the visual position estimate transformed from frame k to frame k+1. According to equations (5) to (7), the measurement matrix H of the system is given as follows:
where the symbol × is a shorthand for the vector cross product (skew-symmetric matrix), M_v and M_p are coefficient blocks related to the velocity and the position respectively, |·| denotes the modulus of the corresponding vector, and I is the identity matrix.

Step 3) Establish the Kalman filter model of the inertial/visual odometry integrated navigation optimized based on the measurement model
31) Discretize the system state equation and measurement equation:
X_k = Φ_{k,k-1} X_{k-1} + Γ_{k,k-1} W_{k-1}   (9)
Z_k = H_k X_k + V_k   (10)
where X_k is the system state at time t_k, X_{k-1} is the system state at time t_{k-1}, Φ_{k,k-1} is the state transition matrix of the system from time t_{k-1} to time t_k, Γ_{k,k-1} is the noise driving matrix of the system from time t_{k-1} to time t_k, W_{k-1} is the system noise matrix at time t_{k-1}, Z_k is the measurement vector of the system at time t_k, H_k is the measurement coefficient matrix at time t_k, and V_k is the noise matrix of the observations at time t_k.
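One common way to obtain the discrete matrices of equations (9) and (10) from the continuous model is a truncated series expansion over the filter period; a minimal sketch, assuming the continuous matrices F and G are available, is given below (Φ_{k,k-1} ≈ I + F·T_s + (F·T_s)²/2!, Γ_{k,k-1} ≈ G·T_s to first order).

```python
import numpy as np

def discretize(F, G, Ts, order=2):
    """Discretize x_dot = F x + G w over the filter period Ts (truncated expansion)."""
    n = F.shape[0]
    Phi, term = np.eye(n), np.eye(n)
    for i in range(1, order + 1):
        term = term @ (F * Ts) / i        # accumulates (F*Ts)^i / i!
        Phi = Phi + term
    Gamma = G * Ts                        # first-order noise driving matrix
    return Phi, Gamma
```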
32) Add a control term U_{k-1} to the discretized equation, and use the closed-loop corrected system state equation to correct the inertial navigation system errors: X_k = Φ_{k,k-1} X_{k-1} + B_{k,k-1} U_{k-1} + Γ_{k,k-1} W_{k-1},
where B_{k,k-1} is the control coefficient matrix of the system from time t_{k-1} to time t_k.
33) Obtain the linearized Kalman filter equations of the system:
X̂_{k,k-1} = Φ_{k,k-1} X̂_{k-1}
X̂_k = X̂_{k,k-1} + K_k (Z_k − H_k X̂_{k,k-1})
K_k = P_{k,k-1} H_k^T (H_k P_{k,k-1} H_k^T + R_k)^{-1}
P_{k,k-1} = Φ_{k,k-1} P_{k-1} Φ_{k,k-1}^T + Γ_{k-1} Q_{k-1} Γ_{k-1}^T
P_k = (I − K_k H_k) P_{k,k-1}
where X̂_{k,k-1} is the one-step predicted state from time t_{k-1} to time t_k; X̂_{k-1} is the filtered state estimate at time t_{k-1}; X̂_k is the filtered state estimate at time t_k; K_k is the filter gain matrix at time t_k; Z_k is the measurement at time t_k, from which the innovation sequence Z_k − H_k X̂_{k,k-1} is formed; P_{k,k-1} is the one-step prediction covariance matrix from time t_{k-1} to time t_k; P_{k-1} is the filtered state estimation covariance matrix at time t_{k-1}; Φ_{k,k-1}^T is the transpose of Φ_{k,k-1}; Γ_{k-1} is the system noise coefficient matrix at time t_{k-1}; Q_{k-1} is the system noise covariance matrix at time t_{k-1}; Γ_{k-1}^T is the transpose of Γ_{k-1}; H_k^T is the transpose of H_k; R_k is the measurement noise covariance matrix at time t_k; P_k is the filtered state estimation covariance matrix at time t_k; and I is the identity matrix.
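The five filter equations above map directly onto code; a minimal NumPy sketch of one filter cycle is given below (the closed-loop control term of step 32) is omitted for brevity).

```python
import numpy as np

def kalman_step(x_prev, P_prev, Phi, Gamma, Q, H, R, z):
    """One time update plus one measurement update of the linearized Kalman filter."""
    # Time update (one-step prediction).
    x_pred = Phi @ x_prev
    P_pred = Phi @ P_prev @ Phi.T + Gamma @ Q @ Gamma.T
    # Measurement update.
    S = H @ P_pred @ H.T + R                    # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)         # filter gain
    x_new = x_pred + K @ (z - H @ x_pred)       # state estimate
    P_new = (np.eye(len(x_prev)) - K @ H) @ P_pred
    return x_new, P_new
```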
To verify the correctness and effectiveness of the inertial/visual odometry integrated navigation method based on measurement model optimization proposed by the invention, a model was built using the method of the present invention and semi-physical simulation was carried out on the KITTI dataset. The real trajectory of the test vehicle is shown in Fig. 2.
Based on the selected dataset, the inertial/visual odometry integrated navigation method based on measurement model optimization of the present invention was verified and compared with a Kalman filter that directly combines attitude and position. The navigation error comparison curves are shown in Figs. 3(a)-3(c).
In Figs. 3(a)-3(c), the solid lines represent the error curves of the present invention, and the dashed lines represent the error curves of the Kalman filter with direct attitude/position combination. It can be seen from the navigation error comparison curves in Figs. 3(a)-3(c) that, with the inertial/visual odometry integrated navigation method based on measurement model optimization proposed by the present invention, the accuracy of the inertial navigation system is significantly improved compared with the Kalman filter that directly combines attitude and position, which has beneficial engineering application value.
The above are only preferred embodiments of the present invention. It should be pointed out that, for those of ordinary skill in the art, several improvements and modifications can be made without departing from the technical principle of the present invention, and these improvements and modifications should also be regarded as falling within the protection scope of the present invention.