CN108731670B - Inertial/visual odometer integrated navigation positioning method based on measurement model optimization - Google Patents


Info

Publication number
CN108731670B
Authority
CN
China
Prior art keywords
time
matrix
inertial
navigation
error
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201810477871.0A
Other languages
Chinese (zh)
Other versions
CN108731670A (en)
Inventor
罗玮
熊智
李一博
于永军
邢丽
万众
许建新
陈明星
熊骏
王钲淳
张苗
段胜青
徐丽敏
景羿铭
孙瑶洁
李文龙
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing University of Aeronautics and Astronautics
Original Assignee
Nanjing University of Aeronautics and Astronautics
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing University of Aeronautics and Astronautics
Priority to CN201810477871.0A
Publication of CN108731670A
Application granted
Publication of CN108731670B
Legal status: Active
Anticipated expiration


Abstract

Translated from Chinese

The invention discloses an inertial/visual odometry integrated navigation and positioning method based on measurement model optimization. First, the state equation of the integrated navigation system is established, and the inertial sensor errors are augmented as system state variables, including the random constant drift of the gyroscope, the first-order Markov process drift of the gyroscope, and the first-order Markov process drift of the accelerometer. Then, the visual odometer is treated as an angular velocity, linear velocity, and position sensor to obtain measurement data and construct the measurement equation. Finally, the navigation errors are corrected by real-time feedback during the motion of the carrier, yielding the error-corrected navigation result of the inertial navigation system. The method can effectively use the angular velocity, linear velocity, and position information of the visual odometer during carrier motion, achieve effective fusion with inertial navigation, and improve the accuracy and reliability of the integrated navigation system, making it suitable for engineering applications.

Description

Translated from Chinese
Inertial/visual odometry integrated navigation and positioning method based on measurement model optimization

Technical Field

The invention relates to an inertial/visual odometry integrated navigation and positioning method based on measurement model optimization, and belongs to the technical field of inertial/visual odometry integrated navigation.

Background Art

In recent years, with the research and development of intelligent moving platforms such as unmanned aerial vehicles, unmanned ground vehicles, and robots, higher requirements have been placed on the performance of navigation systems.

In the prior art, the navigation systems widely used on intelligent moving platforms are mostly implemented as inertial (IMU, i.e. accelerometer and gyroscope)/satellite integrated navigation systems. However, in environments such as cities, jungles, and indoors, where satellite signals are interfered with or unavailable, it is difficult for an IMU/satellite integrated navigation system to provide accurate navigation information. Visual odometry is a six-degree-of-freedom motion estimation technique based on processing sequences of related images. Because it relies only on passively measuring cameras, it offers good autonomy, good concealment, a simple structure, and low cost, and it is bound to become an important information unit of future navigation systems for intelligent moving platforms.

The error of an inertial navigation system is mainly caused by the measurement errors of the inertial sensors (IMU: accelerometer and gyroscope); it accumulates over time, but its measurement accuracy is not affected by sudden maneuvers. A visual odometer, on the other hand, solves for the pose recursively and therefore inevitably accumulates error; this accumulated error, similar to that of an odometer based on a photoelectric encoder, is correlated with distance travelled rather than with time. Moreover, when the moving platform maneuvers too strongly, the image scene changes too much, matching fails, and the positioning accuracy degrades. According to how the image information is used, visual odometry methods are mainly divided into feature-point methods that track and match feature points, represented by open-source algorithms such as LIBVISO and PTAM, and direct methods that match pixel intensity information directly, represented by open-source algorithms such as SVO and LSD-SLAM. The motion estimation approaches of these representative algorithms differ, and the relationship between the maneuvering of the moving platform and the noise of the output pose is difficult to model accurately. Considering the complementarity of the two, integrated navigation based on inertial navigation and visual odometry has received extensive attention, but most existing IMU/visual odometry integrations directly combine attitude and position, ignoring the fact that the visual odometry positioning error accumulates and is maneuver-related. How to achieve effective fusion of inertial navigation and visual odometry navigation information is therefore of significant research importance.

Summary of the Invention

To remedy the deficiencies of the prior art, the purpose of the present invention is to provide an inertial/visual odometry integrated navigation and positioning method based on measurement model optimization, which effectively uses the angular velocity, linear velocity, and position information of the visual odometer during carrier motion, achieves optimal fusion with inertial navigation, and improves the accuracy of the integrated navigation system.

In order to achieve the above object, the present invention adopts the following technical scheme:

An inertial/visual odometry integrated navigation and positioning method based on measurement model optimization, characterized by the following steps:

Step 1) Establish the state equation of the integrated navigation system and augment the inertial sensor errors as system state variables, including the random constant drift of the gyroscope, the first-order Markov process drift of the gyroscope, and the first-order Markov process drift of the accelerometer;

Step 2) Using the relative pose output by the visual odometer, obtain the conversion relationship from the relative pose to the attitude and position in the navigation frame;

Step 3) Select the measurement variables of the integrated navigation system and construct the inertial/visual odometry integrated Kalman filter measurement equation;

Step 4) Discretize the system state equation and the measurement equation, and apply Kalman filtering to perform feedback correction of the system states.

In the aforementioned inertial/visual odometry integrated navigation and positioning method based on measurement model optimization, the state equation of the system in step 1) is a linear error model of the form

X' = F X + G W

[the full component-wise state equation is given as equation images GDA0003002011500000021 and GDA0003002011500000037 in the original]

where X is the state vector; φ = [φ_x, φ_y, φ_z]^T is the mathematical platform error angle of the strapdown inertial navigation system, with φ_x, φ_y, φ_z the platform error angles about the x, y, and z axes; δv^n = [δv_x, δv_y, δv_z]^T is the velocity error of the carrier on the x, y, and z axes in the geographic frame; δp^n is the position error of the carrier, with δL, δλ, δh the latitude, longitude, and altitude errors; ε_b = [ε_bx, ε_by, ε_bz]^T is the random constant drift of the gyroscope about its x, y, and z axes; ε_r = [ε_rx, ε_ry, ε_rz]^T is the first-order Markov process drift of the gyroscope about its x, y, and z axes; ∇_a = [∇_ax, ∇_ay, ∇_az]^T is the first-order Markov process drift of the accelerometer along its x, y, and z axes.

The superscript n denotes the geographic frame, the superscript b denotes the body (carrier) frame, the subscript I denotes a value solved by the inertial navigation system, and the subscript V denotes a value solved by the visual odometer.

In the state equation, δω_in^n is the angular rate error of the geographic frame relative to the inertial frame; ω_in^n is the angular rate of the geographic frame relative to the inertial frame; C_b^n is the coordinate transformation matrix from the body frame to the geographic frame; f^n is the specific force output of the accelerometers in the geographic frame; δω_ie^n is the angular rate error of the Earth frame relative to the inertial frame; δω_en^n is the angular rate error of the geographic frame relative to the Earth frame; ω_ie^n is the angular rate of the Earth frame relative to the inertial frame; ω_en^n is the angular rate of the geographic frame relative to the Earth frame; δg^n is the gravitational acceleration error; v^n is the velocity of the carrier in the geographic frame and v_x^n its component on the x axis; R_m is the radius of the prime vertical circle and R_n the radius of the meridian circle; h is the local altitude and L the local latitude; T_g and T_a are the correlation times of the first-order Markov processes, and w_r and w_a are the corresponding driving white noises, so that the sensor error models are ε_b' = 0, ε_r' = -ε_r/T_g + w_r, and ∇_a' = -∇_a/T_a + w_a.

In the aforementioned inertial/visual odometry integrated navigation and positioning method based on measurement model optimization, the conversion relationship in step 2) from the relative pose to the attitude and position in the navigation frame is:

[equation image GDA0003002011500000041 in the original]

where C_bk^n is the coordinate transformation matrix from the body frame to the geographic frame at the time of image frame k; C_c^b is the coordinate transformation matrix from the camera frame to the body frame; the rotation matrix and translation vector of image frame k+1 relative to image frame k are measured by the visual odometer; p_k^n is the position of the carrier in the geographic frame at the time of image frame k; the superscript T denotes the matrix transpose; and the subscripts k and k+1 denote the quantities at the corresponding frame times.

In the aforementioned inertial/visual odometry integrated navigation and positioning method based on measurement model optimization, the measurement equation of step 3) is

Z = [ω_nb,I^n - ω_nb,V^n;  v_I^n - v_V^n;  p_I^n - p_V^n] = H X + V

where the first block is the difference of the angular velocity of the carrier relative to the geographic frame as solved by the two systems, δv^n is the velocity error of the carrier, δp^n is the position error of the carrier, the subscript I denotes the inertial navigation solution, the subscript V denotes the visual odometry solution, H is the measurement coefficient matrix of the system, and V is the measurement noise matrix of the system;

[the measurement coefficient matrix H is given as equation image GDA0003002011500000049 in the original]

where C_n^b is the coordinate transformation matrix from the geographic frame to the body frame, M_v and M_p are the coefficients related to the velocity and the position respectively, v_x and v_y are the components of the carrier velocity on the x and y axes, the scalar appearing in M_v and M_p is the modulus of the corresponding velocity vector, and I is the identity matrix.

In the aforementioned inertial/visual odometry integrated navigation and positioning method based on measurement model optimization, the specific process of step 4) is:

401) Discretize the system state equation and the measurement equation:

Xk = Φk,k-1 Xk-1 + Γk,k-1 Wk-1,  Zk = Hk Xk + Vk,

where Xk is the system state at time tk, Xk-1 is the system state at time tk-1, Φk,k-1 is the state transition matrix of the system from time tk-1 to time tk, Γk,k-1 is the noise driving matrix of the system from time tk-1 to time tk, Wk-1 is the system noise matrix at time tk-1, Zk is the measurement vector of the system at time tk, Hk is the measurement coefficient matrix at time tk, and Vk is the noise matrix of the observations at time tk;

402) Add a control term Uk-1 to the discretized equation and correct the inertial navigation system errors with the closed-loop corrected state equation:

Xk = Φk,k-1 Xk-1 + Bk,k-1 Uk-1 + Γk,k-1 Wk-1,

where Bk,k-1 is the control term coefficient matrix of the system from time tk-1 to time tk;

403) Obtain the linearized Kalman filter equations of the system:

X̂k,k-1 = Φk,k-1 X̂k-1
X̂k = X̂k,k-1 + Kk (Zk - Hk X̂k,k-1)
Kk = Pk,k-1 Hk^T (Hk Pk,k-1 Hk^T + Rk)^-1
Pk,k-1 = Φk,k-1 Pk-1 Φk,k-1^T + Γk-1 Qk-1 Γk-1^T
Pk = (I - Kk Hk) Pk,k-1

where X̂k,k-1 is the one-step predicted state from time tk-1 to time tk; X̂k-1 is the filtered state estimate at time tk-1; X̂k is the filtered state estimate at time tk; Kk is the filter gain matrix at time tk; Zk is the measurement vector at time tk, whose residual Zk - Hk X̂k,k-1 is the innovation sequence; Pk,k-1 is the one-step prediction covariance matrix from time tk-1 to time tk; Pk-1 is the filtered state estimation covariance matrix at time tk-1; Φk,k-1^T is the transpose of Φk,k-1; Γk-1 is the system noise coefficient matrix at time tk-1; Qk-1 is the system process noise covariance matrix at time tk-1; Γk-1^T is the transpose of Γk-1; Hk^T is the transpose of Hk; Rk is the measurement noise covariance matrix at time tk; Pk is the filtered state estimation covariance matrix at time tk; and I is the identity matrix;

404) Estimate the system navigation error values, including the attitude, position, and velocity errors, from the linearized Kalman filter equations obtained in step 403), and subtract these error values from the navigation parameters computed by the inertial navigation system to obtain the corrected inertial/visual odometry integrated navigation solution based on measurement model optimization.

Beneficial effects achieved by the present invention: the method treats the visual odometer as an angular velocity, linear velocity, and position sensor to obtain measurement data and construct the measurement equation, which matches the noise characteristics of the visual odometer output, ensures effective use of the visual odometer information, and has wide applicability. The method applies Kalman filtering to obtain an optimal estimate of the states of the inertial/visual odometry integrated navigation system and exploits the complementarity of inertial navigation and visual odometry, effectively improving the accuracy of the integrated navigation system and making it suitable for engineering applications.

Description of the Drawings

Figure 1 is the architecture diagram of the present invention;

Figure 2 is a schematic diagram of the true motion trajectory of the vehicle in the KITTI dataset used in the present invention;

Figures 3(a)-(c) compare the navigation results of the present invention with the navigation results of the direct attitude-position combination: Figure 3(a) shows the position error comparison curves, Figure 3(b) the attitude error comparison curves, and Figure 3(c) the velocity error comparison curves.

Detailed Description of the Embodiments

The present invention is further described below with reference to the accompanying drawings. The following embodiments are only used to illustrate the technical solution of the present invention more clearly and are not intended to limit its protection scope.

As shown in Figure 1, the principle of the inertial/visual odometry integrated navigation method based on measurement model optimization of the present invention is as follows: the visual odometry module performs distortion correction on the raw images output by the stereo camera, then performs feature extraction and stereo matching on the image sequence, and tracks the feature points to estimate the relative pose. The Kalman filter module establishes the Kalman filter state equation and measurement equation, and performs the strapdown inertial navigation solution using the raw gyroscope and accelerometer outputs; at every filter period it performs the time update and the measurement update, achieving an optimal estimate of the states of the integrated navigation system and thereby improving its performance.

The specific embodiments of the present invention are as follows:

Step 1) Establish the state equation of the inertial/visual odometry integrated navigation algorithm:

First, the coordinate frames are defined: the geographic coordinate frame "East-North-Up" is selected as the navigation frame (n), and the body (carrier) coordinate frame (b) is selected as "Right-Front-Up". The state vector to be estimated is

X = [φ_x, φ_y, φ_z, δv_x, δv_y, δv_z, δL, δλ, δh, ε_bx, ε_by, ε_bz, ε_rx, ε_ry, ε_rz, ∇_ax, ∇_ay, ∇_az]^T

where X is the state vector; φ = [φ_x, φ_y, φ_z]^T is the mathematical platform error angle of the strapdown inertial navigation system, with φ_x, φ_y, φ_z the platform error angles about the three axes; δv^n = [δv_x, δv_y, δv_z]^T is the velocity error of the carrier on the x, y, and z axes in the geographic frame; δp^n is the position error of the carrier, with δL, δλ, δh the latitude, longitude, and altitude errors; ε_b is the random constant drift of the gyroscope about its x, y, and z axes; ε_r is the first-order Markov process drift of the gyroscope about its x, y, and z axes; ∇_a is the first-order Markov process drift of the accelerometer along its x, y, and z axes. The superscript n denotes the geographic frame and the superscript b denotes the body frame.

The IMU measurements are used as the input variables of the state equation, which is expressed in the linear form

X' = F X + G W

[the full component-wise state equation is given as equation image GDA0003002011500000076 in the original]

where δω_in^n is the angular rate error of the geographic frame relative to the inertial frame; ω_in^n is the angular rate of the geographic frame relative to the inertial frame; C_b^n is the coordinate transformation matrix from the body frame to the geographic frame; f^n is the specific force output of the accelerometers in the geographic frame; δω_ie^n is the angular rate error of the Earth frame relative to the inertial frame; δω_en^n is the angular rate error of the geographic frame relative to the Earth frame; ω_ie^n is the angular rate of the Earth frame relative to the inertial frame; ω_en^n is the angular rate of the geographic frame relative to the Earth frame; δg^n is the gravitational acceleration error; v^n is the velocity of the carrier in the geographic frame and v_x^n its component on the x axis; R_m is the radius of the prime vertical circle and R_n the radius of the meridian circle; h is the local altitude and L the local latitude; T_g and T_a are the correlation times of the first-order Markov processes, and w_r and w_a are the corresponding driving white noises, so that the gyroscope and accelerometer error models are ε_b' = 0, ε_r' = -ε_r/T_g + w_r, and ∇_a' = -∇_a/T_a + w_a.

Step 2) Establish the measurement equation of the inertial/visual odometry integrated navigation algorithm based on measurement model optimization

The visual odometer estimates the motion of the camera between adjacent images by matching and tracking the information in the image sequence. Its output is the relative pose between two consecutive sampling instants, represented by a rotation matrix and a translation vector, where k and k+1 denote the frame indices of the images.

The following relation expresses the conversion from the relative pose to the attitude and position in the navigation frame.

[equation image GDA0003002011500000083 in the original: chaining of the visual odometry relative pose into the navigation-frame attitude and position]

Here C_c^b denotes the coordinate transformation matrix from the camera frame to the body frame; it is generally assumed to be calibrated before the experiment and to remain constant during one experiment. The subscripts k and k+1 denote the values of the corresponding physical quantities at the times of the corresponding image frames.
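A minimal sketch of this chaining is given below, under the assumption that the visual odometer reports the rotation R_rel and translation t_rel of frame k+1 relative to frame k in the camera frame; the exact convention used in the patent's equation image may differ, so the formulas in the code are labelled as assumed.

```python
import numpy as np

def chain_vo_pose(C_bn_k: np.ndarray, p_n_k: np.ndarray,
                  C_cb: np.ndarray, R_rel: np.ndarray, t_rel: np.ndarray):
    """Propagate navigation-frame attitude and position with a VO relative pose.

    C_bn_k : body-to-navigation DCM at frame k (3x3)
    p_n_k  : carrier position in the navigation frame at frame k (3,)
    C_cb   : camera-to-body DCM (3x3), calibrated offline
    R_rel, t_rel : VO relative rotation (3x3) and translation (3,) between frames
    """
    # Rotate the camera-frame relative motion into the body frame, then into n (assumed convention).
    C_bn_k1 = C_bn_k @ (C_cb @ R_rel @ C_cb.T)   # attitude at frame k+1
    p_n_k1 = p_n_k + C_bn_k @ (C_cb @ t_rel)     # position at frame k+1
    return C_bn_k1, p_n_k1

if __name__ == "__main__":
    I3 = np.eye(3)
    C1, p1 = chain_vo_pose(I3, np.zeros(3), I3, I3, np.array([0.0, 1.0, 0.0]))
    print(C1, p1)
```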

Since visual odometers based on different solution principles and methods have different noise characteristics, without loss of generality the visual odometer is treated as a black box. The noise of the directly observed relative pose does not accumulate and is approximately white, so the noise appears on the attitude angle increments and on the translation vector, i.e.

Δγ̃ = Δγ + wγ,  Δθ̃ = Δθ + wθ,  Δφ̃ = Δφ + wφ,  t̃ = t + wt,

where Δγ, Δθ, Δφ are the changes of the roll, pitch, and heading angles from time k to time k+1, Δγ̃, Δθ̃, Δφ̃ are the corresponding measured values containing errors, and wγ, wθ, wφ, wt are the white noises of the roll angle, pitch angle, heading angle, and translation vector, respectively.
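The sketch below illustrates this black-box noise model by mapping per-frame increment noise into a measurement noise covariance R of the kind used by the filter (dividing the increments by the frame interval turns them into angular-rate and velocity quantities). The numerical noise levels and the diagonal structure are assumptions made for illustration.

```python
import numpy as np

def vo_measurement_noise_R(sigma_angle: float, sigma_trans: float,
                           sigma_pos: float, dt: float) -> np.ndarray:
    """9x9 measurement noise covariance for [angular rate (3), velocity (3), position (3)].

    sigma_angle : std of the attitude-increment white noise per frame (rad)
    sigma_trans : std of the translation white noise per frame (m)
    sigma_pos   : std of the accumulated position measurement (m)
    dt          : image frame interval (s)
    """
    var_rate = (sigma_angle / dt) ** 2   # increment noise -> angular-rate noise
    var_vel = (sigma_trans / dt) ** 2    # translation noise -> velocity noise
    return np.diag([var_rate] * 3 + [var_vel] * 3 + [sigma_pos ** 2] * 3)

if __name__ == "__main__":
    R = vo_measurement_noise_R(sigma_angle=1e-3, sigma_trans=0.02, sigma_pos=0.5, dt=0.1)
    print(np.diag(R))
```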

Based on the visual odometer noise characteristics analyzed above, the differences between the angular velocity, linear velocity, and position solved by the inertial navigation system and those solved by the visual odometer are selected as the observations of the integrated system:

Z = [ω_nb,I^n - ω_nb,V^n;  v_I^n - v_V^n;  p_I^n - p_V^n] = H X + V

where ω_nb^n denotes the angular velocity of the carrier relative to the geographic frame, δv^n is the velocity error of the carrier, and δp^n is the position error of the carrier. The subscript I denotes the inertial navigation solution and the subscript V denotes the visual odometry solution. H is the measurement matrix of the system.

The value of ω_nb,I^n can be calculated from the gyroscope output:

[equation image GDA0003002011500000093 in the original]

where the gyroscope output, the Earth rotation angular rate ω_ie, and the angular velocity ω_en caused by the motion of the carrier appear in the relation.

Therefore, ignoring second-order small terms, δω_nb^n can be expressed as:

[equation image GDA0003002011500000098 in the original]

The visual-odometry-derived angular velocity ω_nb,V can be obtained from the attitude change angular rates according to the Euler angle differential equation:

[equation image GDA00030020115000000910 in the original]

where the components of the attitude change rate about the three axes are used.

The corresponding error term is a state quantity of the system, and the visual odometry value can be computed from the visual position estimate propagated from frame k to frame k+1. According to equations (5) to (7), the measurement matrix H of the system is as follows:

[the measurement coefficient matrix H is given as equation image GDA00030020115000000915 in the original]

where the symbol × is a shorthand notation for the vector cross product, M_v and M_p are the coefficients related to the velocity and the position respectively, the scalar appearing in them is the modulus of the corresponding velocity vector, and I is the identity matrix.
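To make the structure of the observation concrete, the following sketch forms the 9-dimensional measurement vector from the inertial solution and the visual-odometry-derived quantities. A small-angle approximation is used here in place of the full Euler angle differential equation for the VO angular rate, and the helper conventions (argument names, frame of t_rel_n) are assumptions for illustration rather than definitions from the patent.

```python
import numpy as np

def build_measurement(omega_ins: np.ndarray, v_ins: np.ndarray, p_ins: np.ndarray,
                      d_angles_vo: np.ndarray, t_rel_n: np.ndarray, p_vo: np.ndarray,
                      dt: float) -> np.ndarray:
    """9-element measurement: INS-minus-VO angular rate, velocity, and position.

    omega_ins, v_ins, p_ins : angular rate, velocity, position from the SINS solution
    d_angles_vo : VO attitude increments [d_roll, d_pitch, d_yaw] over one frame (rad)
    t_rel_n     : VO translation increment resolved in the navigation frame (m)
    p_vo        : VO position obtained by chaining relative poses (m)
    dt          : image frame interval (s)
    """
    omega_vo = d_angles_vo / dt      # small-angle approximation of the VO angular rate
    v_vo = t_rel_n / dt              # VO linear velocity over the frame interval
    return np.concatenate([omega_ins - omega_vo, v_ins - v_vo, p_ins - p_vo])

if __name__ == "__main__":
    z = build_measurement(np.zeros(3), np.array([1.0, 0, 0]), np.zeros(3),
                          np.array([0.001, 0, 0]), np.array([0.1, 0, 0]), np.zeros(3), dt=0.1)
    print(z)
```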

Step 3) Establish the inertial/visual odometry integrated navigation Kalman filter model based on measurement model optimization

31) Discretize the system state equation and the measurement equation:

Xk = Φk,k-1 Xk-1 + Γk,k-1 Wk-1    (9)

Zk = Hk Xk + Vk    (10)

where Xk is the system state at time tk, Xk-1 is the system state at time tk-1, Φk,k-1 is the state transition matrix of the system from time tk-1 to time tk, Γk,k-1 is the noise driving matrix of the system from time tk-1 to time tk, Wk-1 is the system noise matrix at time tk-1, Zk is the measurement vector of the system at time tk, Hk is the measurement coefficient matrix at time tk, and Vk is the noise matrix of the observations at time tk.

32) Add the control term Uk-1 to the discretized equation and correct the inertial navigation system errors with the closed-loop corrected state equation:

Xk = Φk,k-1 Xk-1 + Bk,k-1 Uk-1 + Γk,k-1 Wk-1,

where Bk,k-1 is the control term coefficient matrix of the system from time tk-1 to time tk.

33) Obtain the linearized Kalman filter equations of the system:

X̂k,k-1 = Φk,k-1 X̂k-1
X̂k = X̂k,k-1 + Kk (Zk - Hk X̂k,k-1)
Kk = Pk,k-1 Hk^T (Hk Pk,k-1 Hk^T + Rk)^-1
Pk,k-1 = Φk,k-1 Pk-1 Φk,k-1^T + Γk-1 Qk-1 Γk-1^T
Pk = (I - Kk Hk) Pk,k-1

where X̂k,k-1 is the one-step predicted state from time tk-1 to time tk; X̂k-1 is the filtered state estimate at time tk-1; X̂k is the filtered state estimate at time tk; Kk is the filter gain matrix at time tk; Zk is the measurement vector at time tk, whose residual Zk - Hk X̂k,k-1 is the innovation sequence; Pk,k-1 is the one-step prediction covariance matrix from time tk-1 to time tk; Pk-1 is the filtered state estimation covariance matrix at time tk-1; Φk,k-1^T is the transpose of Φk,k-1; Γk-1 is the system noise coefficient matrix at time tk-1; Qk-1 is the system process noise covariance matrix at time tk-1; Γk-1^T is the transpose of Γk-1; Hk^T is the transpose of Hk; Rk is the measurement noise covariance matrix at time tk; Pk is the filtered state estimation covariance matrix at time tk; and I is the identity matrix.

To verify the correctness and effectiveness of the proposed inertial/visual odometry integrated navigation method based on measurement model optimization, a model was built with the method of the invention and verified by semi-physical simulation using the KITTI dataset. The true trajectory of the test vehicle is shown in Figure 2.

Based on the selected dataset, the inertial/visual odometry integrated navigation method based on measurement model optimization of the present invention was verified and compared with a Kalman filter that directly combines attitude and position; the navigation error comparison curves are shown in Figures 3(a)-3(c).

In Figures 3(a)-3(c), the solid lines represent the error curves of the present invention and the dashed lines represent the error curves of the Kalman filter that directly combines attitude and position. The comparison curves show that, with the proposed inertial/visual odometry integrated navigation method based on measurement model optimization, the accuracy of the inertial navigation system is clearly improved compared with the direct attitude-position combination, which is of beneficial engineering application value.

The above are only preferred embodiments of the present invention. It should be noted that those skilled in the art can make several improvements and modifications without departing from the technical principle of the present invention, and these improvements and modifications should also be regarded as falling within the protection scope of the present invention.

Claims (5)

1. An inertial/visual odometer integrated navigation positioning method based on measurement model optimization is characterized in that,
step 1) establishing a state equation of the integrated navigation system, and expanding errors of the inertial sensor into system state variables, wherein the system state variables comprise random constant drift of a gyroscope, first-order Markov process drift of the gyroscope and first-order Markov process drift of an accelerometer;
step 2) obtaining a conversion relation from the relative pose to the pose and the position under the navigation system by combining the relative pose output by the visual odometer;
step 3) selecting measurement variables of the integrated navigation system, and constructing an inertia/visual odometer integrated Kalman filtering measurement equation;
step 4) discretizing a system state equation and a measurement equation, and performing feedback correction on the system state quantity by adopting Kalman filtering;
the measurement equation of the step 3) is as follows:
Figure FDA0003002011490000011
wherein,
Figure FDA0003002011490000012
error in angular velocity of the carrier with respect to the geographic system, δ vnAs error in velocity of the carrier, δ pnAnd the subscript I represents the calculation value of the inertial navigation system, the subscript V represents the calculation value of the visual mileage, H is a measurement coefficient matrix of the system, and V is a measurement noise matrix of the system.
2. The inertial/visual odometer integrated navigation positioning method based on measurement model optimization according to claim 1, wherein the state equation of the system in step 1) is:

[equation image FDA0003002011490000013 in the original]

wherein X is the state vector; φ is the mathematical platform error angle of the strapdown inertial navigation system, with components the platform error angles about the x, y, and z axes; δv^n is the velocity error of the carrier, with components the velocity errors on the x, y, and z axes in the geographic frame; δp^n is the position error of the carrier, with δL, δλ, and δh the latitude, longitude, and altitude errors respectively; ε_b is the random constant drift of the gyroscope, with components the random constant drifts about the gyroscope x, y, and z axes; ε_r is the first-order Markov process drift of the gyroscope, with components the first-order Markov process drifts about the gyroscope x, y, and z axes; ∇_a is the first-order Markov process drift of the accelerometer, with components the first-order Markov process drifts along the accelerometer x, y, and z axes;

the superscript n denotes the geographic frame, the superscript b denotes the body frame, the subscript I denotes the value solved by the inertial navigation system, and the subscript V denotes the value solved by the visual odometer;

[equation image FDA0003002011490000025 in the original]

wherein δω_in^n is the angular rate error of the geographic frame relative to the inertial frame, ω_in^n is the angular rate of the geographic frame relative to the inertial frame, C_b^n is the coordinate transformation matrix from the body frame to the geographic frame, f^n is the specific force output of the accelerometer in the geographic frame, δω_ie^n is the angular rate error of the Earth frame relative to the inertial frame, δω_en^n is the angular rate error of the geographic frame relative to the Earth frame, ω_ie^n is the angular rate of the Earth frame relative to the inertial frame, ω_en^n is the angular rate of the geographic frame relative to the Earth frame, δg^n is the gravitational acceleration error, v^n is the velocity of the carrier in the geographic frame, v_x^n is the velocity of the carrier on the x axis in the geographic frame, R_m is the radius of the prime vertical circle, R_n is the radius of the meridian circle, h is the local altitude, L is the local latitude, T_g and T_a are the correlation times of the first-order Markov processes, and w_r and w_a are the corresponding driving white noises.
3. The inertial/visual odometer integrated navigation positioning method based on measurement model optimization according to claim 1, wherein the conversion relationship in step 2) from the relative pose to the attitude and position in the navigation frame is:

[equation image FDA00030020114900000214 in the original]

wherein C_bk^n is the coordinate transformation matrix from the body frame to the geographic frame at the time of image frame k, C_c^b is the coordinate transformation matrix from the camera frame to the body frame, the rotation matrix and translation vector of the image frame k+1 time relative to the image frame k time are measured by the visual odometer, p_k^n is the position of the carrier in the geographic frame at the time of image frame k, the superscript T denotes the transpose of a matrix, and the subscripts k and k+1 denote the physical quantities at the corresponding frame times.
4. The inertial/visual odometer integrated navigation positioning method based on measurement model optimization according to claim 2, wherein

[equation image FDA0003002011490000031 in the original]

wherein C_n^b is the coordinate transformation matrix from the geographic frame to the body frame, M_v and M_p are the coefficients related to the velocity and the position respectively, v_x and v_y are the components of the carrier velocity on the x axis and the y axis respectively, the scalar appearing in M_v and M_p is the modulus of the corresponding velocity vector, and I is the identity matrix.
5. The inertial/visual odometer integrated navigation positioning method based on measurement model optimization according to claim 1, wherein the specific process of step 4) is:

401) discretizing the system state equation and the measurement equation:

Xk = Φk,k-1 Xk-1 + Γk,k-1 Wk-1,  Zk = Hk Xk + Vk,

wherein Xk is the system state at time tk, Xk-1 is the system state at time tk-1, Φk,k-1 is the state transition matrix of the system from time tk-1 to time tk, Γk,k-1 is the noise driving matrix of the system from time tk-1 to time tk, Wk-1 is the system noise matrix at time tk-1, Zk is the measurement vector of the system at time tk, Hk is the measurement coefficient matrix at time tk, and Vk is the noise matrix of the observations at time tk;

402) adding a control term Uk-1 to the discretized equation, and correcting the inertial navigation system errors by the closed-loop corrected state equation:

Xk = Φk,k-1 Xk-1 + Bk,k-1 Uk-1 + Γk,k-1 Wk-1,

wherein Bk,k-1 is the control term coefficient matrix of the system from time tk-1 to time tk;

403) obtaining the linearized Kalman filter equations of the system:

X̂k,k-1 = Φk,k-1 X̂k-1
X̂k = X̂k,k-1 + Kk (Zk - Hk X̂k,k-1)
Kk = Pk,k-1 Hk^T (Hk Pk,k-1 Hk^T + Rk)^-1
Pk,k-1 = Φk,k-1 Pk-1 Φk,k-1^T + Γk-1 Qk-1 Γk-1^T
Pk = (I - Kk Hk) Pk,k-1

wherein X̂k,k-1 is the one-step predicted state from time tk-1 to time tk; X̂k-1 is the filtered state estimate at time tk-1; X̂k is the filtered state estimate at time tk; Kk is the filter gain matrix at time tk; Zk is the measurement vector at time tk; Pk,k-1 is the one-step prediction covariance matrix from time tk-1 to time tk; Pk-1 is the filtered state estimation covariance matrix at time tk-1; Φk,k-1^T is the transpose of Φk,k-1; Γk-1 is the system noise coefficient matrix at time tk-1; Qk-1 is the system process noise covariance matrix at time tk-1; Γk-1^T is the transpose of Γk-1; Hk^T is the transpose of Hk; Rk is the measurement noise covariance matrix at time tk; Pk is the filtered state estimation covariance matrix at time tk; I is the identity matrix;

404) estimating the system navigation error values, including the attitude, position, and velocity errors, from the linearized Kalman filter equations obtained in step 403), and subtracting the system navigation error values from the navigation parameters computed by the inertial navigation system to obtain the inertial/visual odometer integrated navigation correction values based on measurement model optimization.
Application CN201810477871.0A, filed 2018-05-18 (priority 2018-05-18): Inertial/visual odometer integrated navigation positioning method based on measurement model optimization. Status: Active. Granted publication: CN108731670B (en).

Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
CN201810477871.0A (CN108731670B) | 2018-05-18 | 2018-05-18 | Inertial/visual odometer integrated navigation positioning method based on measurement model optimization


Publications (2)

Publication Number | Publication Date
CN108731670A (en) | 2018-11-02
CN108731670B | 2021-06-22

Family

ID=63938504




Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
CN102435188A (en)* | 2011-09-15 | 2012-05-02 | 南京航空航天大学 | A Monocular Vision/Inertial Fully Autonomous Navigation Method for Indoor Environment
CN102538781A (en)* | 2011-12-14 | 2012-07-04 | 浙江大学 | Machine vision and inertial navigation fusion-based mobile robot motion attitude estimation method
CN103292804A (en)* | 2013-05-27 | 2013-09-11 | 浙江大学 | Monocular natural vision landmark assisted mobile robot positioning method
CN105371840A (en)* | 2015-10-30 | 2016-03-02 | 北京自动化控制设备研究所 | An Inertial/Visual Odometer/LiDAR Integrated Navigation Method

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Deqiang Tian et al., "A Design of Odometer-aided Visual Inertial Integrated Navigation Algorithm Based on Multiple View Geometry Constraints," 2017 9th International Conference on Intelligent Human-Machine Systems and Cybernetics, 2017-09-21, pp. 161-166.*
Salim et al., "Loosely Coupled Kalman Filtering for Fusion of Visual Odometry and Inertial Navigation," Proceedings of the 16th International Conference on Information Fusion, 2013-10-21, pp. 219-226.*
夏凌楠 et al., "Robot localization based on inertial sensors and visual odometry" (基于惯性传感器和视觉里程计的机器人定位), 仪器仪表学报, vol. 34, no. 1, 2013-01-31, pp. 166-172.*

Also Published As

Publication number | Publication date
CN108731670A (en) | 2018-11-02


Legal Events

Code | Title
PB01 | Publication
SE01 | Entry into force of request for substantive examination
GR01 | Patent grant
