Technical Field
The invention relates to the technical field of computer vision, and more particularly to a laser-inertial odometry method with local geometric information fusion.
Background
With the rapid development of science and technology, driverless technology has gradually become a research hotspot in the automotive industry. Traditional driving suffers from many problems, such as traffic accidents, traffic congestion and energy waste, which have a negative impact on people's lives and on society. The emergence of driverless technology provides new possibilities for solving these problems. Driverless technology uses sensing devices such as lidar, cameras and inertial sensors, combined with artificial intelligence and automatic control algorithms, to give vehicles autonomous perception, decision-making and operation capabilities, so that they can drive safely and efficiently without a human driver.
As application requirements continue to grow and diversify, unmanned systems face the challenge of operating autonomously in complex, random and changing environments, such as indoor, underground, tunnel and jamming environments. In these environments, Global Navigation Satellite System (GNSS) signals may be blocked or shielded, so autonomous sensors such as lidar and inertial sensors have become common means of perception and navigation. Lidar measures the distances from itself to surrounding objects and determines its relative position in the environment; it has the advantages of constant error and high measurement frequency, and is one of the most widely used sensors in Simultaneous Localization and Mapping (SLAM). The paper "LOAM: Lidar odometry and mapping in real-time" (J. Zhang and S. Singh, in Proc. of Robotics: Science and Systems, 2014) proposed the Lidar Odometry and Mapping (LOAM) framework, which extracts feature points lying on sharp edges and planar surface patches from laser scan point clouds according to curvature, and registers them to edge line segments and planar patches respectively. The paper "LeGO-LOAM: Lightweight and ground optimized lidar odometry and mapping on variable terrain" (T. Shan and B. Englot, in IEEE/RSJ International Conference on Intelligent Robots and Systems, 2018) improved on LOAM and proposed the LeGO-LOAM (Lightweight and Ground-Optimized LOAM) framework. Compared with LOAM, LeGO-LOAM adds a point cloud ground-segmentation module: by extracting ground points and segmented points, the point cloud is clustered and feature points are re-selected based on smoothness, which improves robustness.
Lidar SLAM has advantages when dealing with smooth planar features, but it relies on simple scan-matching methods and is not very stable. In addition, laser data lacks color and texture information, environmental features are not distinctive, and the measurements are prone to motion distortion, so lidar alone is not reliable for navigation and usually needs to be fused with an Inertial Measurement Unit (IMU). The paper "LIO-SAM: Tightly-coupled Lidar Inertial Odometry via Smoothing and Mapping" (T. Shan and B. Englot, in IEEE/RSJ International Conference on Intelligent Robots and Systems, 2020) uses the iSAM smoothing optimization toolbox to fuse inertial, lidar and GPS measurements, and replaces the global map-matching strategy of LOAM with a local map-matching strategy; LIO-SAM achieves higher accuracy with shorter computation time. The paper "FAST-LIO2: Fast direct lidar-inertial odometry" (Xu W, Cai Y, He D, et al., IEEE Transactions on Robotics, 2022) matches the raw point cloud obtained by the lidar directly against the map, uses an ikd-tree to guarantee fast map search, and tightly fuses the laser odometry with the IMU through an iterated error Kalman filter. However, current laser-inertial odometry methods still have a major problem: during point cloud registration, fitting is usually performed with only local point cloud map information, whereas in a structured environment many of these local lines and planes actually describe the same plane, which introduces redundant information. Moreover, if the points lie on the same plane or line, the more points that are used to extract that plane or line, the more accurate the fitted result. Therefore, fusing the coarsely registered point-line (or point-plane) pairs can effectively improve the accuracy of the system.
Therefore, how to effectively improve the stability of the odometry algorithm in structured environments, effectively reduce its computation time, and improve its accuracy and robustness is an urgent problem for those skilled in the art.
Summary of the Invention
In view of the above problems, the present invention provides a laser-inertial odometry method with local geometric information fusion, so as to solve at least some of the technical problems mentioned in the background above.
To achieve the above objects, the present invention adopts the following technical solution:
An embodiment of the present invention provides a laser-inertial odometry method with local geometric information fusion, comprising the following steps:
S1. Acquiring point cloud data and inertial measurement data of a target vehicle within a preset time period, and performing distortion correction on the point cloud data using the inertial measurement data;
S2. Performing feature extraction on the distortion-corrected point cloud data to obtain edge points and planar points;
S3. Performing matching processing on the edge points and the planar points respectively;
S4. Obtaining the odometry of the target vehicle based on the matched edge points and planar points, combined with IMU pre-integration constraints.
Further, step S1 specifically comprises:
S11. Acquiring point cloud data of the target vehicle within a preset time period, and synchronously acquiring inertial measurement data of the target vehicle;
S12. Obtaining, by interpolating the inertial measurement data, the point cloud pose at an arbitrary time within the preset time period and the point cloud pose at the final time of the period;
S13. Obtaining the relative motion of the point cloud data from the arbitrary time to the final time according to the point cloud pose at the arbitrary time and the point cloud pose at the final time;
S14. Obtaining the relative transformation of the point cloud data from the arbitrary time to the final time according to this relative motion, combined with the extrinsic parameters of the inertial measurement unit;
S15. Projecting, based on this relative transformation, the point cloud data at the arbitrary time into the coordinate system of the final time to obtain the point cloud data at the final time, thereby achieving distortion correction of the point cloud data.
Further, step S2 specifically comprises: obtaining the curvature of each point of the point cloud data; a point whose curvature is greater than a first preset threshold is an edge point, and a point whose curvature is smaller than the first preset threshold is a planar point.
Further, step S3 specifically comprises:
S31. Performing initial matching on the edge points and the planar points respectively;
S32. Performing fusion matching on the initially matched edge points and planar points respectively.
Further, step S31 specifically comprises:
S311. Using the pose estimated by IMU integration as an initial value, transforming each edge point or planar point from the lidar coordinate system at the current time into the world coordinate system, to obtain the edge point or planar point in the world coordinate system;
S312. Querying several neighboring points of the edge point or planar point in the current local map in the world coordinate system, to obtain a neighboring point set;
S313. Obtaining the covariance matrix of the neighboring point set;
S314. Performing eigenvalue decomposition on the covariance matrix to obtain a plurality of eigenvalues;
S315. Performing initial matching on the edge points and planar points respectively based on the plurality of eigenvalues.
Further, step S315 specifically comprises:
(1) For an edge point: taking the ratio of the largest eigenvalue to the smallest eigenvalue as the condition number of the matrix, and determining whether the condition number is greater than a second preset threshold; if so, obtaining the initial matching line pair $(p_l, v_l)$ of the edge point, where $p_l$ is the mean of all points in the neighboring point set and $v_l$ is the eigenvector corresponding to the largest eigenvalue;
(2) For a planar point: determining whether the smallest eigenvalue is smaller than a third preset threshold; if so, obtaining the initial matching plane pair $(p_p, n_p)$ of the planar point, where $p_p$ is the mean of all points in the neighboring point set and $n_p$ is the eigenvector corresponding to the smallest eigenvalue.
Further, in step S32, performing fusion matching on the initially matched edge points specifically comprises:
obtaining an initial matching line set from the initial matching line pairs; for each line pair in the initial matching line set, selecting its nearest neighboring line pair, denoting the line pair itself as the first line pair and its nearest neighboring line pair as the second line pair;
obtaining the distance vector $r_l$ between the first line pair and the second line pair;
if the 2-norm of the distance vector $r_l$ is smaller than a fourth preset threshold, fusing the first line pair and the second line pair, and obtaining the fused line-pair neighboring point set;
obtaining the line-pair covariance matrix corresponding to the line-pair neighboring point set;
performing eigenvalue decomposition on the line-pair covariance matrix to obtain a plurality of eigenvalues, and obtaining the fused matching line pair $(\tilde{p}_l, \tilde{v}_l)$ based on the eigenvector $\tilde{v}_l$ corresponding to the largest eigenvalue, where $\tilde{p}_l$ is the mean of all points in the line-pair neighboring point set.
Further, in step S32, performing fusion matching on the initially matched planar points specifically comprises:
obtaining an initial matching plane set from the initial matching plane pairs; for each plane pair in the initial matching plane set, selecting its nearest neighboring plane pair, denoting the plane pair itself as the first plane pair and its nearest neighboring plane pair as the second plane pair;
obtaining the distance vector $r_p$ between the first plane pair and the second plane pair;
if the 2-norm of the distance vector $r_p$ is smaller than a fifth preset threshold, fusing the first plane pair and the second plane pair, and obtaining the fused plane-pair neighboring point set;
obtaining the plane-pair covariance matrix corresponding to the plane-pair neighboring point set;
performing eigenvalue decomposition on the plane-pair covariance matrix to obtain a plurality of eigenvalues, and obtaining the fused matching plane pair $(\tilde{p}_p, \tilde{n}_p)$ based on the eigenvector $\tilde{n}_p$ corresponding to the smallest eigenvalue, where $\tilde{p}_p$ is the mean of all points in the plane-pair neighboring point set.
Further, step S4 specifically comprises:
S41. Computing residuals for the fused matched edge points and planar points respectively, to obtain edge-point residuals and planar-point residuals;
S42. Obtaining IMU pre-integration constraints, the IMU pre-integration constraints comprising a velocity pre-integration value, a position pre-integration value and a rotation pre-integration value;
S43. Constructing a least-squares problem from the edge-point residuals and the planar-point residuals, combined with the velocity, position and rotation pre-integration values, to obtain the odometry of the target vehicle.
It can be seen from the above technical solution that, compared with the prior art, the present invention provides a laser-inertial odometry method with local geometric information fusion, which has the following beneficial effects:
The present invention performs feature extraction on the point cloud data to obtain edge points and planar points, which effectively reduces data storage and algorithm processing time, so that the odometry algorithm can run in real time.
Generally, point cloud matching only considers local information of the point cloud, and many of the matched lines and planes are in fact repetitive and redundant; by fusing this redundant information, the present invention can effectively improve the quality of the fitted planes or lines and reduce the time consumed by the algorithm.
The present invention can correct point cloud distortion using high-frequency IMU information, and fusing the IMU and lidar information allows the pose to be optimized jointly, improving the robustness of the algorithm.
Other features and advantages of the present invention will be set forth in the following description, and will in part be apparent from the description or be learned by practicing the invention. The objects and other advantages of the invention can be realized and obtained by the structure particularly pointed out in the written description, the claims and the accompanying drawings.
The technical solution of the present invention is further described in detail below with reference to the accompanying drawings and embodiments.
Brief Description of the Drawings
In order to explain the embodiments of the present invention or the technical solutions in the prior art more clearly, the drawings required for the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description are only embodiments of the present invention; for those of ordinary skill in the art, other drawings can be obtained from the provided drawings without creative effort.
Fig. 1 is a schematic flowchart of the laser-inertial odometry method with local geometric information fusion provided by an embodiment of the present invention.
Fig. 2 is a schematic diagram of laser point cloud distortion correction provided by an embodiment of the present invention.
Fig. 3 is a schematic diagram of matched line or plane fusion provided by an embodiment of the present invention.
Fig. 4 is a diagram of the factor-graph-based tightly coupled laser-inertial fusion framework provided by an embodiment of the present invention.
Detailed Description of the Embodiments
The technical solutions in the embodiments of the present invention are described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some, rather than all, of the embodiments of the present invention. Based on the embodiments of the present invention, all other embodiments obtained by those of ordinary skill in the art without creative effort fall within the scope of protection of the present invention.
Referring to Fig. 1, an embodiment of the present invention discloses a laser-inertial odometry method with local geometric information fusion, comprising the following steps:
S1. Acquiring point cloud data and inertial measurement data of a target vehicle within a preset time period, and performing distortion correction on the point cloud data using the inertial measurement data;
S2. Performing feature extraction on the distortion-corrected point cloud data to obtain edge points and planar points;
S3. Performing matching processing on the edge points and planar points respectively;
S4. Obtaining the odometry of the target vehicle based on the matched edge points and planar points, combined with IMU pre-integration constraints.
Each of the above steps is described in detail below.
In step S1, the point cloud data of the target vehicle within a preset time period is obtained by the lidar, and the inertial measurement data (IMU data) of the target vehicle is obtained synchronously by the IMU. By interpolating the inertial measurement data, the point cloud pose at an arbitrary time within the preset time period and the point cloud pose at the final time of the period are obtained. From these two poses, the relative motion of the point cloud data between the arbitrary time and the final time is obtained, and from it the relative transformation between the two times. Finally, the point cloud data at the arbitrary time is projected into the coordinate system at the final time, giving the point cloud data at the final time and thus achieving distortion correction.
As shown in Fig. 2, data are acquired synchronously by the IMU and the lidar, where the IMU samples arrive at times $t_1$, $t_3$, $t_4$ and $t_6$, and the lidar samples at times $t_2$ and $t_5$. Since the IMU has a higher rate, for the point cloud data at time $t_2$, assuming the carrier moves at constant velocity during one IMU sampling interval $t_1$-$t_3$, the point cloud pose at time $t_2$ can be obtained by interpolating the inertial measurements at $t_1$ and $t_3$:

$$T^{W}_{I,t_2}=\begin{bmatrix} R^{W}_{I,t_1}\,\mathrm{Exp}\!\big((t_2-t_1)\,\hat{\omega}_{t_1}\big) & p^{W}_{I,t_1}+(t_2-t_1)\,\hat{v}_{t_1} \\ \mathbf{0}^{T} & 1 \end{bmatrix}$$

where $T^{W}_{I,t_2}$ is the pose matrix from the IMU coordinate system (denoted the I frame) to the world coordinate system (denoted the W frame) at time $t_2$; $R^{W}_{I,t_1}$ is the rotation matrix at time $t_1$; $p^{W}_{I,t_1}$ is the translation at time $t_1$; $\hat{\omega}_{t_1}$ is the angular velocity measured by the IMU at time $t_1$; $\hat{v}_{t_1}$ is the linear velocity measured by the IMU at time $t_1$; $\mathbf{0}$ is a 3×1 zero matrix; $T$ denotes the transpose; and $\mathrm{Exp}(\cdot)$ maps a rotation vector to a rotation matrix.
Similarly, the point cloud pose $T^{W}_{I,t_5}$ at time $t_5$ can be obtained. From $T^{W}_{I,t_2}$ and $T^{W}_{I,t_5}$, the relative motion of the point cloud data between times $t_2$ and $t_5$ is

$$\Delta T^{I}_{t_5 t_2}=\big(T^{W}_{I,t_5}\big)^{-1}\,T^{W}_{I,t_2}.$$

Using the pre-calibrated lidar-IMU extrinsic parameters $T^{L}_{I}$, which denote the pose matrix from the IMU coordinate system to the lidar coordinate system, the relative transformation of the point cloud data between times $t_2$ and $t_5$ in the lidar coordinate system (denoted the L frame) is obtained as

$$\Delta T^{L}_{t_5 t_2}=T^{L}_{I}\,\Delta T^{I}_{t_5 t_2}\,\big(T^{L}_{I}\big)^{-1}.$$

Finally, the laser point cloud $P^{L_{t_2}}$ captured at time $t_2$ can be projected into the lidar coordinate system at time $t_5$, yielding the laser point cloud at time $t_5$:

$$P^{L_{t_5}}=\Delta T^{L}_{t_5 t_2}\,P^{L_{t_2}}.$$
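For illustration, this correction step can be sketched in a few lines of Python. The sketch below follows the constant-velocity assumption of the embodiment; the function names, the use of 4×4 homogeneous matrices and the rotation-vector extrapolation are choices of this sketch and not part of the disclosure.

```python
import numpy as np
from scipy.spatial.transform import Rotation

def pose_at(t, t1, R1, p1, omega1, v1):
    """Constant-velocity pose extrapolation from the IMU sample at t1 to time t."""
    dt = t - t1
    T = np.eye(4)
    T[:3, :3] = R1 @ Rotation.from_rotvec(omega1 * dt).as_matrix()
    T[:3, 3] = p1 + v1 * dt
    return T

def undistort(points_L, T_W_I_t2, T_W_I_t5, T_L_I):
    """Project points captured at t2 into the lidar frame at t5.

    points_L : (N, 3) raw points in the lidar frame at t2
    T_W_I_*  : 4x4 IMU poses in the world frame at t2 and t5
    T_L_I    : 4x4 extrinsic transform from the IMU frame to the lidar frame
    """
    # relative IMU motion between the two timestamps
    dT_I = np.linalg.inv(T_W_I_t5) @ T_W_I_t2
    # express the same motion in the lidar frame via the extrinsics
    dT_L = T_L_I @ dT_I @ np.linalg.inv(T_L_I)
    pts_h = np.hstack([points_L, np.ones((points_L.shape[0], 1))])
    return (dT_L @ pts_h.T).T[:, :3]
```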
In step S2, a laser point cloud usually contains, for each point, its three-dimensional coordinates relative to the laser origin, the acquisition time of the point, its reflection intensity, and the beam (ring) to which the point belongs. Laser feature extraction therefore mainly considers the geometric information of the point cloud. For a single laser point, the angular resolution in the vertical direction is so coarse that the distance to vertically adjacent points is relatively large, whereas the angular resolution in the horizontal direction is fine, so horizontally adjacent points usually lie on the same object. The horizontal neighbors can therefore be used to find geometric features in the laser point cloud.
Based on this, the embodiment of the present invention uses curvature to extract the edge points and planar points of the point cloud data. Specifically, for each laser point, its horizontal neighbors are considered and its curvature is computed; the point cloud data are then divided into edge points and planar points according to the magnitude of the curvature. The curvature is computed as

$$c_i=\frac{1}{|C|\cdot\|p_i\|}\Big\|\sum_{j\in C,\;j\neq i}\big(p_i-p_j\big)\Big\|$$

where $C$ is the set of horizontal neighbors of the current laser point $p_i$; $c_i$ is the curvature of the $i$-th laser point $p_i$; and $p_j$ is the position of the $j$-th neighbor of $p_i$.
By setting a first preset threshold, the point cloud data can be divided into edge points and planar points: if the curvature of a laser point is greater than the first preset threshold, the point is an edge point; otherwise it is a planar point.
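A minimal sketch of this curvature-based split is given below, assuming each scan ring is an ordered (N, 3) array and using a symmetric window of horizontal neighbors; the window size and the threshold value are placeholders.

```python
import numpy as np

def classify_points(ring, window=5, curvature_threshold=0.1):
    """Split one ordered scan ring into edge points and planar points.

    ring : (N, 3) array of points from a single laser ring, in scan order.
    Returns (edge_idx, plane_idx) index arrays.
    """
    n = len(ring)
    curvature = np.full(n, np.nan)
    for i in range(window, n - window):
        neighbors = np.r_[ring[i - window:i], ring[i + 1:i + 1 + window]]
        diff_sum = np.sum(ring[i] - neighbors, axis=0)        # sum_j (p_i - p_j)
        curvature[i] = np.linalg.norm(diff_sum) / (
            len(neighbors) * np.linalg.norm(ring[i]))         # normalization as in the text
    valid = ~np.isnan(curvature)
    edge_idx = np.where(valid & (curvature > curvature_threshold))[0]
    plane_idx = np.where(valid & (curvature <= curvature_threshold))[0]
    return edge_idx, plane_idx
```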
Step S3 specifically comprises:
S31. Performing initial matching on the edge points and planar points respectively, which specifically comprises:
For each edge point or planar point $p^{L_k}_i$ in the lidar coordinate system at the $k$-th time, the pose $T^{W}_{L_k}$ estimated by IMU integration is used as an initial value, where $T^{W}_{L_k}$ denotes the transformation from the lidar coordinate system $L_k$ at the $k$-th time to the world coordinate system $W$. The edge point or planar point at the current time is transformed into the world coordinate system to obtain the corresponding point $p^{W}_i$:

$$p^{W}_i=T^{W}_{L_k}\,p^{L_k}_i.$$

A KD-tree is then used to query the $K$ nearest neighbors of $p^{W}_i$ in the current local map $\mathrm{Map}^{W}$ in the world coordinate system, giving a neighboring point set $S$, and the covariance matrix $M$ of the set $S$ is computed:

$$M=\frac{1}{K}\sum_{s=1}^{K}\big(p^{W}_s-\bar{p}\big)\big(p^{W}_s-\bar{p}\big)^{T},\qquad \bar{p}=\frac{1}{K}\sum_{s=1}^{K}p^{W}_s$$

where $p^{W}_s=(x_s,y_s,z_s)^{T}$ is the $s$-th neighboring point of the set $S$ in the W frame, and $\bar{p}=(\bar{x},\bar{y},\bar{z})^{T}$ is the mean of all neighboring points in $S$, with $\bar{x}$, $\bar{y}$ and $\bar{z}$ the means of the neighbors along the x, y and z axes respectively.
Eigenvalue decomposition is performed on the covariance matrix $M$ to obtain the eigenvalues, and based on these eigenvalues the edge points and planar points are initially matched as follows:
(1) For an edge point $p^{L_k}_{e}$: the ratio of the largest eigenvalue to the smallest eigenvalue is taken as the condition number of the matrix; if the condition number is greater than the second preset threshold, the initial matching line pair $(p_l, v_l)$ of the edge point is obtained, where $p_l$ is the mean of all points in the neighboring point set and $v_l$ is the eigenvector corresponding to the largest eigenvalue.
For example, suppose the eigenvalue decomposition of $M$ gives three eigenvalues $\lambda_1>\lambda_2>\lambda_3$. For an edge point, it is checked whether the condition number $\lambda_1/\lambda_3$ is greater than the second preset threshold $\delta_l$; if it is, the initial matching line pair $(p_l, v_l)$ of the edge point is obtained, where $p_l=\bar{p}$ and $v_l$ is the eigenvector corresponding to $\lambda_1$.
(2) For a planar point $p^{L_k}_{p}$: it is checked whether the smallest eigenvalue is smaller than the third preset threshold; if so, the initial matching plane pair $(p_p, n_p)$ of the planar point is obtained, where $p_p$ is the mean of all points in the neighboring point set and $n_p$ is the eigenvector corresponding to the smallest eigenvalue.
For example, with eigenvalues $\lambda_1>\lambda_2>\lambda_3$, for a planar point it is checked whether $\lambda_3$ is smaller than the third preset threshold $\delta_p$; if it is, the initial matching plane pair $(p_p, n_p)$ of the planar point is obtained, where $p_p=\bar{p}$ and $n_p$ is the eigenvector corresponding to $\lambda_3$.
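The initial matching test can be sketched as follows, assuming a SciPy cKDTree built over the local map and the ascending eigenvalue ordering of numpy.linalg.eigh; the threshold values are placeholders and the function name is illustrative.

```python
import numpy as np
from scipy.spatial import cKDTree

def initial_match(query_w, map_points, kdtree, k=5,
                  line_cond_thresh=3.0, plane_eig_thresh=0.01, is_edge=True):
    """Try to match one world-frame feature point to a local line or plane.

    query_w    : (3,) feature point already transformed into the world frame
    map_points : (M, 3) local map; kdtree is built over the same array
    Returns (centroid, direction_or_normal) or None if the test fails.
    """
    _, idx = kdtree.query(query_w, k=k)
    S = map_points[idx]                        # neighbor set S
    mean = S.mean(axis=0)
    M = (S - mean).T @ (S - mean) / k          # covariance of the neighbor set
    eigval, eigvec = np.linalg.eigh(M)         # eigenvalues in ascending order
    l_max, l_min = eigval[-1], eigval[0]
    if is_edge:
        # valid line: one dominant direction, i.e. large condition number
        if l_min > 0 and l_max / l_min > line_cond_thresh:
            return mean, eigvec[:, -1]         # (p_l, v_l)
        return None
    # valid plane: smallest eigenvalue near zero, normal = its eigenvector
    if l_min < plane_eig_thresh:
        return mean, eigvec[:, 0]              # (p_p, n_p)
    return None

# usage sketch:
# tree = cKDTree(map_points)
# pair = initial_match(p_world, map_points, tree, is_edge=True)
```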
S32. Performing fusion matching on the initially matched edge points and planar points respectively, as shown in Fig. 3:
(1) Fusion matching of the initially matched edge points:
From the initial matching line pairs $(p_l, v_l)$ of the edge points, an initial matching line set is obtained. For each line pair (denoted, for convenience, the first line pair), a KD-tree is used to find its nearest neighboring line pair (denoted the second line pair), and whether to fuse them is decided by estimating the distance between the first and the second line pair through a distance vector $r_l$, which combines the deviation between the two line directions $v^{m}_{l}$ and $v^{n}_{l}$ and the offset between the mean points $\bar{p}_m$ and $\bar{p}_n$, where $v^{m}_{l}$ and $v^{n}_{l}$ are the direction vectors of the first and second line pairs, and $\bar{p}_m$ and $\bar{p}_n$ are the means of all points in their respective neighboring point sets $E_m$ and $E_n$.
If the 2-norm of the distance vector $r_l$ is smaller than the fourth preset threshold, the first line pair and the second line pair are fused, and the fused neighboring point set, denoted the line-pair neighboring point set $E_{merge}$, is obtained:

$$E_{merge}=E_m+E_n$$

Then the line-pair covariance matrix of $E_{merge}$ is computed and eigenvalue decomposition is performed on it to obtain the eigenvalues; based on the eigenvector $\tilde{v}_l$ corresponding to the largest eigenvalue, a new matching line pair, the fused matching line pair $(\tilde{p}_l, \tilde{v}_l)$, is obtained, where $\tilde{p}_l$ is the mean of all points in the line-pair neighboring point set $E_{merge}$.
(2) Fusion matching of the initially matched planar points:
From the initial matching plane pairs $(p_p, n_p)$ of the planar points, an initial matching plane set is obtained. For each plane pair (denoted the first plane pair), a KD-tree is used to find its nearest neighboring plane pair (denoted the second plane pair), and whether to fuse them is decided by estimating the distance between the first and the second plane pair through a distance vector $r_p$, which combines the deviation between the two plane normals $n^{u}_{p}$ and $n^{v}_{p}$ and the offset between the mean points $\bar{p}_u$ and $\bar{p}_v$, where $n^{u}_{p}$ and $n^{v}_{p}$ are the normal vectors of the first and second plane pairs, and $\bar{p}_u$ and $\bar{p}_v$ are the means of all points in their respective neighboring point sets $P_u$ and $P_v$.
If the 2-norm of the distance vector $r_p$ is smaller than the fifth preset threshold, the first plane pair and the second plane pair are fused, and the fused neighboring point set, denoted the plane-pair neighboring point set $P_{merge}$, is obtained:

$$P_{merge}=P_u+P_v$$

Then the plane-pair covariance matrix of $P_{merge}$ is computed and eigenvalue decomposition is performed on it to obtain the eigenvalues; based on the eigenvector $\tilde{n}_p$ corresponding to the smallest eigenvalue, a new matching plane pair, the fused matching plane pair $(\tilde{p}_p, \tilde{n}_p)$, is obtained, where $\tilde{p}_p$ is the mean of all points in the plane-pair neighboring point set $P_{merge}$.
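The fusion step for both cases can be sketched with a single routine. The version below is written for line pairs; the plane case differs only in taking the eigenvector of the smallest eigenvalue as the fused normal. The proximity test used here (direction mismatch plus the perpendicular offset of the two centroids) is an assumed stand-in for the distance vectors r_l and r_p of the embodiment, and the threshold is a placeholder.

```python
import numpy as np

def try_fuse_line_pairs(E_m, E_n, dist_thresh=0.2):
    """Fuse two coarsely matched line pairs if they describe the same line.

    E_m, E_n : (Nm, 3), (Nn, 3) neighbor sets behind the two line pairs.
    Returns the fused pair (p_l, v_l), or None if the pairs are kept separate.
    """
    def fit_line(E):
        mean = E.mean(axis=0)
        w, v = np.linalg.eigh(np.cov(E.T))
        return mean, v[:, -1]                  # centroid, dominant direction

    (pm, vm), (pn, vn) = fit_line(E_m), fit_line(E_n)
    # assumed proximity test: direction mismatch and perpendicular centroid offset
    r = np.concatenate([np.cross(vm, vn), np.cross(vm, pm - pn)])
    if np.linalg.norm(r) >= dist_thresh:
        return None
    E_merge = np.vstack([E_m, E_n])            # E_merge = E_m + E_n
    return fit_line(E_merge)                   # refit on the merged neighbor set
```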
In step S4, the odometry of the target vehicle is obtained based on the matched edge points and planar points, combined with the IMU pre-integration constraints, which specifically comprises:
S41. Computing the residuals of the fused matched edge points and planar points respectively, which specifically comprises:
In the above, $p^{L_k}_i$ denotes an edge point or planar point in the lidar coordinate system at the $k$-th time. In the following, to distinguish the two, $p^{L_k}_{e,i}$ denotes an edge point and $p^{L_k}_{p,i}$ denotes a planar point.
(1) For each fused matched edge point $p^{L_k}_{e,i}$ and its corresponding fused matching line pair $(\tilde{p}_l, \tilde{v}_l)$, the residual is defined by the point-to-line distance:

$$r_{e,i}=\Big\|\big(\mathbf{T}^{W}_{L_k}\,p^{L_k}_{e,i}-\tilde{p}_l\big)\times\tilde{v}_l\Big\|$$

where $r_{e,i}$ is the residual of the edge point $p^{L_k}_{e,i}$ and $\times$ denotes the cross product.
(2) For each fused matched planar point $p^{L_k}_{p,i}$ and its corresponding fused matching plane pair $(\tilde{p}_p, \tilde{n}_p)$, the residual is defined by the point-to-plane distance:

$$r_{p,i}=\big(\mathbf{T}^{W}_{L_k}\,p^{L_k}_{p,i}-\tilde{p}_p\big)\cdot\tilde{n}_p$$

where $r_{p,i}$ is the residual of the planar point $p^{L_k}_{p,i}$ and $\cdot$ denotes the dot product.
In the above, a non-bold $T$ denotes the transpose operation, while the bold $\mathbf{T}^{W}_{L_k}$ denotes the state variable that transforms from the local coordinate system to the map coordinate system.
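A direct sketch of the two residuals is given below, assuming the fused direction and normal vectors are unit-norm and that the state variable is available as a 4×4 homogeneous matrix mapping lidar-frame points into the map (world) frame; the names are illustrative.

```python
import numpy as np

def edge_residual(p_lidar, T_map_lidar, p_l, v_l):
    """Point-to-line distance between a transformed edge point and its fused line (p_l, v_l)."""
    p_w = (T_map_lidar @ np.append(p_lidar, 1.0))[:3]
    return np.linalg.norm(np.cross(p_w - p_l, v_l))   # |(p - p_l) x v_l|, v_l unit-norm

def plane_residual(p_lidar, T_map_lidar, p_p, n_p):
    """Signed point-to-plane distance between a transformed planar point and its fused plane (p_p, n_p)."""
    p_w = (T_map_lidar @ np.append(p_lidar, 1.0))[:3]
    return np.dot(p_w - p_p, n_p)                      # (p - p_p) . n_p, n_p unit-norm
```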
S42. Obtaining the IMU pre-integration constraints:
(1) The IMU measurements can be modeled as

$$\hat{\omega}_t=\omega_t+b^{g}_t+n^{g}_t,\qquad \hat{a}_t=R^{I_t}_{W}\big(a^{W}_t-g^{W}\big)+b^{a}_t+n^{a}_t$$

where $\hat{\omega}_t$ and $\hat{a}_t$ are the angular velocity and linear acceleration measured by the IMU at time $t$ in the IMU coordinate system, both of which are affected by the biases $b^{g}_t$, $b^{a}_t$ and the white noise terms $n^{g}_t$, $n^{a}_t$; $\omega_t$ and $a^{W}_t$ are the true angular velocity and the true acceleration in the world frame; $R^{I_t}_{W}$ is the rotation matrix from the world coordinate system W to the IMU coordinate system I at time $t$; and $g^{W}$ is the gravity vector in the world coordinate system.
The IMU measurements can now be used to infer the motion of the target vehicle. The velocity, position and rotation of the target vehicle at time $t+\Delta t$ are

$$v^{W}_{t+\Delta t}=v^{W}_{t}+g^{W}\Delta t+R^{W}_{I_t}\big(\hat{a}_t-b^{a}_t-n^{a}_t\big)\Delta t$$
$$p^{W}_{t+\Delta t}=p^{W}_{t}+v^{W}_{t}\Delta t+\tfrac{1}{2}g^{W}\Delta t^{2}+\tfrac{1}{2}R^{W}_{I_t}\big(\hat{a}_t-b^{a}_t-n^{a}_t\big)\Delta t^{2}$$
$$R^{W}_{I_{t+\Delta t}}=R^{W}_{I_t}\,\mathrm{Exp}\big((\hat{\omega}_t-b^{g}_t-n^{g}_t)\Delta t\big)$$

where $v^{W}_{t+\Delta t}$ and $v^{W}_{t}$ are the velocities of the target vehicle in the world coordinate system at times $t+\Delta t$ and $t$; $p^{W}_{t+\Delta t}$ and $p^{W}_{t}$ are the corresponding positions; and $R^{W}_{I_{t+\Delta t}}$ and $R^{W}_{I_t}$ are the rotation matrices from the target vehicle (IMU) frame to the world coordinate system at times $t+\Delta t$ and $t$. The IMU pre-integration method is then applied to obtain the relative motion of the target vehicle between two time steps; the pre-integration values $\Delta v_{\Delta t}$, $\Delta p_{\Delta t}$ and $\Delta R_{\Delta t}$ between times $t$ and $t+\Delta t$ can be computed as

$$\Delta v_{\Delta t}=R^{I_t}_{W}\big(v^{W}_{t+\Delta t}-v^{W}_{t}-g^{W}\Delta t\big)$$
$$\Delta p_{\Delta t}=R^{I_t}_{W}\big(p^{W}_{t+\Delta t}-p^{W}_{t}-v^{W}_{t}\Delta t-\tfrac{1}{2}g^{W}\Delta t^{2}\big)$$
$$\Delta R_{\Delta t}=\big(R^{W}_{I_t}\big)^{T}R^{W}_{I_{t+\Delta t}}$$

where $\Delta v_{\Delta t}$ is the pre-integrated velocity of the target vehicle between times $t$ and $t+\Delta t$, $\Delta p_{\Delta t}$ is the pre-integrated position, and $\Delta R_{\Delta t}$ is the pre-integrated rotation.
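A minimal sketch of the pre-integration over one lidar interval is given below; it assumes the biases are held constant over the interval, ignores the noise terms, and accumulates the increments in the frame of the first IMU pose of the interval. The sample layout is an assumption of this sketch.

```python
import numpy as np
from scipy.spatial.transform import Rotation

def preintegrate(imu_samples, bg, ba):
    """Accumulate (dR, dv, dp) from a list of (dt, gyro, accel) samples.

    imu_samples : iterable of (dt, omega_hat (3,), a_hat (3,)) between two lidar times
    bg, ba      : gyro / accelerometer bias estimates held constant over the interval
    """
    dR = np.eye(3)
    dv = np.zeros(3)
    dp = np.zeros(3)
    for dt, omega_hat, a_hat in imu_samples:
        a = a_hat - ba                                   # bias-corrected specific force
        dp += dv * dt + 0.5 * dR @ a * dt**2             # position increment
        dv += dR @ a * dt                                # velocity increment
        dR = dR @ Rotation.from_rotvec((omega_hat - bg) * dt).as_matrix()
    return dR, dv, dp
```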
S43. Based on the edge-point residuals and planar-point residuals obtained above, combined with the IMU pre-integration constraints (the velocity, position and rotation pre-integration values), a least-squares problem can be constructed whose minimization yields the maximum likelihood estimate of the state variable $\mathbf{T}$, i.e., the odometry estimate:

$$\min_{\mathbf{T}}\Big\{\;\|f_p(\mathbf{T})\|^{2}+\sum_{i}\|r_{e,i}\|^{2}+\sum_{i}\|r_{p,i}\|^{2}+\|r_{\mathcal{I}}\|^{2}\Big\}$$

where $\|f_p(\mathbf{T})\|^{2}$ is the marginalized prior error, and $r_{\mathcal{I}}$ is the IMU pre-integration error, i.e., the deviation between the pre-integrated velocity, position and rotation increments $\Delta v_{\Delta t}$, $\Delta p_{\Delta t}$, $\Delta R_{\Delta t}$ and the corresponding increments predicted from the optimized states.
As shown in Fig. 4, by expressing the problem as a factor graph and optimizing the cost function with the incremental smoothing of the Bayes tree, the optimal solution can be obtained.
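As an illustration only, the joint problem can be written as an ordinary nonlinear least-squares over the 6-DoF pose. The sketch below replaces the marginalized prior and the pre-integration factor with a simple prior on the IMU-predicted pose and omits the incremental Bayes-tree solver, so it is a simplified stand-in for the factor-graph optimization of Fig. 4, not a reproduction of it; the weighting and names are assumptions.

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def pose_residuals(x, edge_terms, plane_terms, prior_pose, prior_weight=1.0):
    """Stack point-to-line, point-to-plane and a simple prior residual.

    x           : 6-vector [rotation vector, translation] of the lidar pose in the map frame
    edge_terms  : list of (point_lidar, p_l, v_l) fused line correspondences
    plane_terms : list of (point_lidar, p_p, n_p) fused plane correspondences
    prior_pose  : 6-vector prior (e.g. the IMU-predicted pose), standing in for the
                  marginalized prior / pre-integration factor of the full system
    """
    R = Rotation.from_rotvec(x[:3]).as_matrix()
    t = x[3:]
    res = []
    for p, p_l, v_l in edge_terms:
        res.append(np.linalg.norm(np.cross(R @ p + t - p_l, v_l)))
    for p, p_p, n_p in plane_terms:
        res.append(np.dot(R @ p + t - p_p, n_p))
    res.extend(prior_weight * (x - prior_pose))          # crude prior term
    return np.asarray(res)

# usage sketch:
# sol = least_squares(pose_residuals, x0, args=(edge_terms, plane_terms, x_imu))
```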
Those skilled in the art should understand that embodiments of the present invention may be provided as a method, a system or a computer program product. Therefore, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product implemented on one or more computer-usable storage media (including, but not limited to, disk storage and optical storage) containing computer-usable program code.
The present invention is described with reference to flowcharts and/or block diagrams of the method, device (system) and computer program product according to embodiments of the present invention. It should be understood that each flow and/or block in the flowcharts and/or block diagrams, and combinations of flows and/or blocks therein, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general-purpose computer, a special-purpose computer, an embedded processor or another programmable data processing device to produce a machine, so that the instructions executed by the processor of the computer or other programmable data processing device produce an apparatus for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be stored in a computer-readable memory capable of directing a computer or other programmable data processing device to operate in a particular manner, so that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be loaded onto a computer or other programmable data processing device, so that a series of operation steps are performed on the computer or other programmable device to produce computer-implemented processing, whereby the instructions executed on the computer or other programmable device provide steps for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
In summary, the present invention provides a laser-inertial odometry method with local geometric information fusion. By fusing the local geometric information of the point cloud map, the stability of the algorithm in structured environments can be effectively improved, its computation time can be effectively reduced, and its accuracy can be improved; in addition, by reasonably fusing the IMU information and combining the strengths of the two sensors, the accuracy and robustness of the odometry algorithm are improved.
The embodiments in this specification are described in a progressive manner; each embodiment focuses on its differences from the other embodiments, and the same or similar parts of the embodiments may be referred to each other. Since the device disclosed in an embodiment corresponds to the method disclosed in that embodiment, its description is relatively brief, and the relevant parts may be found in the description of the method.
The above description of the disclosed embodiments enables those skilled in the art to implement or use the present invention. Various modifications to these embodiments will be apparent to those skilled in the art, and the general principles defined herein may be implemented in other embodiments without departing from the spirit or scope of the present invention. Therefore, the present invention will not be limited to the embodiments shown herein, but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.