CN106918830A - A positioning method and mobile robot based on multiple navigation modules - Google Patents

A positioning method and mobile robot based on multiple navigation modules

Info

Publication number
CN106918830A
CN106918830A (application CN201710180534.0A)
Authority
CN
China
Prior art keywords
mobile robot
pose
moment
time
inertial navigation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201710180534.0A
Other languages
Chinese (zh)
Inventor
梅涛
朱昕毅
陈剑
方健
姜丽丽
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Anke Robot Co ltd
Original Assignee
Anke Robot Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Anke Robot Co Ltd
Priority to CN201710180534.0A
Publication of CN106918830A
Legal status: Pending

Abstract

The invention discloses a positioning method based on multiple navigation modules, and a mobile robot. The mobile robot comprises a GPS navigation module, an inertial navigation module and a lidar navigation module. The method comprises the following steps: obtaining the longitude and latitude coordinates at time t-1 and time t through the GPS navigation module and converting them into coordinate values in an XY coordinate system; obtaining the yaw-angle change of the mobile robot from the signal information of the inertial navigation module at time t-1 and time t; obtaining the scan data of the lidar navigation module at time t-1 and time t; obtaining the inertial-navigation pose and the laser pose of the mobile robot at time t in combination with a pre-established kinematic model of the mobile robot; and fusing the inertial-navigation pose and the laser pose of the mobile robot in combination with environmental landmark information to obtain the current pose of the mobile robot at time t. The invention achieves real-time positioning of the mobile robot body and improves positioning accuracy without laying any other devices on site.

Description

Translated from Chinese
A positioning method and mobile robot based on multiple navigation modules

Technical Field

The invention relates to the technical field of mobile robot navigation and positioning, and in particular to a positioning method based on multiple navigation modules and a mobile robot.

Background Art

Current mobile robot navigation methods mainly include satellite navigation, laser navigation and ultrasonic navigation. Each single navigation system has its own characteristics and limitations; because robot operating environments differ, a robot that relies on a single navigation system accumulates systematic error at every step of its motion. Integrated navigation based on multi-sensor information applies information-fusion algorithms to the various observations from the sensors carried by the robot itself, so as to cancel the errors of any single navigation system and to estimate both the environment map and the robot's trajectory within that map.

The patent with publication number CN101576384 describes a method for positioning a mobile robot based on visual information and odometry. The method predicts the robot position from odometry data and then corrects the predicted position using camera images. Because it fuses two kinds of sensor data, vision and odometry, its stability and positioning accuracy are better than those of a single navigation system. However, the method only performs well in simple indoor environments, produces large errors in more complex environments, and cannot work outdoors under strong light.

The patent with publication number CN106017458A describes an integrated navigation method based on an infrared camera and an inertial navigation system. A pose control board is installed above each control point of the planned robot path; an infrared camera mounted on the robot captures an image containing the control board above the robot, and image colour recognition is used to identify the board in the image, determine the robot's current position and heading offset, and thereby locate the robot's absolute position. The cost of this method is low, but it requires the robot's camera to capture the control-board image accurately; its stability is poor and its immunity to interference is limited.

The patent with publication number CN105737820 describes a positioning method based on an infrared device and special landmarks. Infrared LEDs emit light that is reflected by landmark tags and imaged by a camera; the landmark information is extracted from the image, and the landmark position is taken as the vehicle position. The landmarks are calibrated in advance and their coordinates are known, so the method depends on reliably detecting and reading the landmarks. However, the camera's field of view is limited and landmarks are easily missed, so the real-time performance of the system is poor and the positioning quality is mediocre.

The patent with publication number CN104102222 describes a positioning method based on a laser rangefinder and reflectors. The laser emitted by the rangefinder is returned by the reflectors, and the robot position is computed from the distances between the body and the reflectors in several directions. The method is easy to implement and the algorithm is simple, but a large number of reflectors must be installed on site, which is costly; the laser must hit the reflectors accurately to work effectively, the working environment is demanding, and the positioning accuracy is not high.

Clearly, the integrated navigation approaches above require special preparation of the site and suffer from technical problems such as insufficiently accurate positioning, which hinder navigation.

Summary of the Invention

The main purpose of the present invention is to provide a positioning method based on multiple navigation modules, and a mobile robot, aiming to achieve real-time positioning of the robot body and improve positioning accuracy without deploying any other devices on site.

To achieve the above purpose, the present invention proposes a positioning method based on multiple navigation modules, applied to a mobile robot, wherein the mobile robot comprises a GPS navigation module, an inertial navigation module and a lidar navigation module; the method comprises:

obtaining the longitude and latitude coordinates of the mobile robot at time t-1 and time t through the GPS navigation module and converting them into coordinate values in an XY coordinate system; obtaining the yaw-angle change of the mobile robot from the signal information of the inertial navigation module at time t-1 and time t; and obtaining the scan data of the lidar navigation module at time t-1 and time t;

obtaining the inertial-navigation pose of the mobile robot at time t according to the pre-established kinematic model of the mobile robot, the coordinate values of the mobile robot at time t-1 and time t, and the yaw-angle change;

calculating the laser pose of the mobile robot at time t according to the scan data of the mobile robot at time t-1 and time t;

fusing the inertial-navigation pose and the laser pose of the mobile robot in combination with environmental landmark information to obtain the current pose of the mobile robot at time t.

Optionally, obtaining the inertial-navigation pose of the mobile robot at time t according to the pre-established kinematic model of the mobile robot, the coordinate values of the mobile robot at time t-1 and time t, and the yaw-angle change comprises:

judging whether the difference between the coordinate values of the mobile robot at time t-1 and time t is within a preset reasonable range;

if so, obtaining the inertial-navigation pose of the mobile robot at time t according to the coordinate value of the mobile robot at time t, the pre-established kinematic model of the mobile robot and the yaw-angle change;

if not, obtaining the distance and direction of motion of the mobile robot from the signal information of the inertial navigation module at time t-1 and time t, and obtaining the inertial-navigation pose of the mobile robot at time t according to the coordinate value of the mobile robot at time t-1, the distance and direction of motion of the mobile robot, the pre-established kinematic model of the mobile robot, and the yaw-angle change.

Optionally, calculating the laser pose of the mobile robot at time t according to the coordinate values of the mobile robot at time t-1 and time t, the scan data at time t-1 and time t, and the yaw-angle change comprises:

using the environment-change information of the mobile robot from time t-1 to time t as the initial pose transformation from the scan data at time t-1 to the scan data at time t, executing the PLICP algorithm, iteratively computing the final pose transformation, and then calculating the laser pose of the mobile robot at time t.

Optionally, the mobile robot further comprises a visual navigation module, and the environment-change information of the mobile robot from time t-1 to time t is the matching information of the visual navigation module for the environment at time t.

Optionally, fusing the inertial-navigation pose and the laser pose of the mobile robot in combination with environmental landmark information to obtain the current pose of the mobile robot at time t comprises:

calculating a Kalman gain matrix according to the observations of the environmental landmark information;

using the Kalman gain matrix to compute the inertial-navigation pose and the laser pose of the mobile robot according to the extended Kalman filter algorithm, to obtain the current pose of the mobile robot at time t.

In addition, to achieve the above purpose, the present invention further provides a mobile robot, comprising:

a GPS navigation module, configured to obtain the longitude and latitude coordinates at time t-1 and time t and convert them into coordinate values in an XY coordinate system;

an inertial navigation module, configured to obtain the yaw-angle change of the mobile robot from the signal information at time t-1 and time t;

a lidar navigation module, configured to obtain the scan data at time t-1 and time t;

a first calculation module, configured to obtain the inertial-navigation pose of the mobile robot at time t according to the pre-established kinematic model of the mobile robot, the coordinate values of the mobile robot at time t-1 and time t, and the yaw-angle change;

a second calculation module, configured to calculate the laser pose of the mobile robot at time t according to the scan data of the mobile robot at time t-1 and time t;

a third calculation module, configured to fuse the inertial-navigation pose and the laser pose of the mobile robot in combination with environmental landmark information to obtain the current pose of the mobile robot at time t.

Optionally, the first calculation module is configured to:

judge whether the difference between the coordinate values of the mobile robot at time t-1 and time t is within a preset reasonable range;

if so, obtain the inertial-navigation pose of the mobile robot at time t according to the coordinate value of the mobile robot at time t, the pre-established kinematic model of the mobile robot and the yaw-angle change;

if not, obtain the distance and direction of motion of the mobile robot from the signal information of the inertial navigation module at time t-1 and time t, and obtain the inertial-navigation pose of the mobile robot at time t according to the coordinate value of the mobile robot at time t-1, the distance and direction of motion of the mobile robot, the pre-established kinematic model of the mobile robot, and the yaw-angle change.

Optionally, the second calculation module is configured to:

use the environment-change information of the mobile robot from time t-1 to time t as the initial pose transformation from the scan data at time t-1 to the scan data at time t, execute the PLICP algorithm, iteratively compute the final pose transformation, and then calculate the laser pose of the mobile robot at time t.

Optionally, the mobile robot further comprises a visual navigation module, and the environment-change information of the mobile robot from time t-1 to time t is the matching information of the visual navigation module for the environment at time t.

Optionally, the third calculation module is configured to:

calculate a Kalman gain matrix according to the observations of the environmental landmark information;

use the Kalman gain matrix to compute the inertial-navigation pose and the laser pose of the mobile robot according to the extended Kalman filter algorithm, to obtain the current pose of the mobile robot at time t.

The positioning method based on multiple navigation modules and the mobile robot proposed by the present invention overcome the limitations and systematic errors of single navigation systems in the prior art, as well as the poor interference immunity of existing simple integrated navigation techniques; they improve positioning accuracy and navigation stability and greatly extend the range of environments in which the robot can operate.

Brief Description of the Drawings

Fig. 1 is a schematic flowchart of a positioning method based on multiple navigation modules for a mobile robot according to an embodiment of the present invention;

Fig. 2 is a schematic flowchart of path planning according to an embodiment of the present invention;

Fig. 3 is a schematic structural diagram of a mobile robot according to an embodiment of the present invention;

Fig. 4 is a schematic structural diagram of a mobile robot according to another embodiment of the present invention.

The realization of the purposes, functional features and advantages of the present invention will be further described with reference to the accompanying drawings in conjunction with the embodiments.

Detailed Description

It should be understood that the specific embodiments described here are only intended to explain the present invention, not to limit it.

In the following description, suffixes such as "module", "component" or "unit" used to denote elements are only intended to facilitate the description of the present invention and have no specific meaning in themselves. Therefore, "module" and "component" may be used interchangeably.

As shown in Fig. 1, the present invention provides a positioning method based on multiple navigation modules, applied to a mobile robot, wherein the mobile robot comprises a GPS navigation module, an inertial navigation module and a lidar navigation module; the method comprises the following steps:

S1. Obtain the longitude and latitude coordinates of the mobile robot at time t-1 and time t through the GPS navigation module and convert them into coordinate values in an XY coordinate system; obtain the yaw-angle change of the mobile robot from the signal information of the inertial navigation module at time t-1 and time t; and obtain the scan data of the lidar navigation module at time t-1 and time t.

S2. Obtain the inertial-navigation pose of the mobile robot at time t according to the pre-established kinematic model of the mobile robot, the coordinate values of the mobile robot at time t-1 and time t, and the yaw-angle change.

S3. Calculate the laser pose of the mobile robot at time t according to the scan data of the mobile robot at time t-1 and time t.

S4. Fuse the inertial-navigation pose and the laser pose of the mobile robot in combination with environmental landmark information to obtain the current pose of the mobile robot at time t.

In a specific implementation, this embodiment performs steps S2 and S3 with the following algorithm. In step S2, according to the kinematic model of the mobile robot, the position change from time t-1 to time t, (Δx_t, Δy_t), and the IMU yaw-angle change θ_t are computed, and the robot inertial-navigation pose corresponding to the odometry at time t is obtained as P_odom(t) = [x(t), y(t), θ(t)]^T. In step S3, let q_0 = (Δx_t, Δy_t, θ_t); with the lidar scan data S_t at time t and the reference scan data S_(t-1) at time t-1, q_0 is used as the initial pose transformation from S_(t-1) to S_t, the PLICP algorithm is executed, and the final pose transformation q_k is obtained iteratively. The robot pose corresponding to the scan matching at time t is then P_scan(t) = R(θ_k)·P_scan(t-1) + t_k.
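As a minimal sketch of the step-S2 update just described: given the position increment (Δx_t, Δy_t) and the IMU yaw increment, the odometry pose P_odom(t) can be accumulated as below. The accumulation form is an assumption consistent with the formula above, not code taken from the patent.

```python
import numpy as np

def update_inertial_pose(pose_prev, dx, dy, dtheta):
    """P_odom(t) = [x(t), y(t), theta(t)]^T, obtained by adding the position
    increment (dx, dy) and the IMU yaw increment dtheta to the pose at t-1."""
    x, y, theta = pose_prev
    theta_new = (theta + dtheta + np.pi) % (2.0 * np.pi) - np.pi  # wrap to [-pi, pi)
    return np.array([x + dx, y + dy, theta_new])

pose_t = update_inertial_pose(np.array([1.00, 2.00, 0.10]), dx=0.25, dy=0.05, dtheta=0.02)
```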

In general, before step S1, the kinematic model of the mobile robot can be established and the path can be planned.

When establishing the kinematic model of the mobile robot, the relationship between the robot's linear velocity and angular velocity can be determined from the structure of the robot chassis, so as to analyse the law governing the robot's pose as it moves and to build the robot motion model.
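The patent does not fix the chassis type. Assuming a differential-drive chassis (a common case, not stated in the patent), the pose propagates from the linear velocity v and angular velocity ω as in the sketch below.

```python
import math

def diff_drive_step(x, y, theta, v, omega, dt):
    """Propagate a differential-drive (unicycle-model) pose one step, given the
    linear velocity v (m/s) and angular velocity omega (rad/s)."""
    if abs(omega) < 1e-9:  # straight-line motion
        return x + v * dt * math.cos(theta), y + v * dt * math.sin(theta), theta
    # exact integration along a circular arc of radius v/omega
    radius = v / omega
    x_new = x + radius * (math.sin(theta + omega * dt) - math.sin(theta))
    y_new = y - radius * (math.cos(theta + omega * dt) - math.cos(theta))
    return x_new, y_new, theta + omega * dt

print(diff_drive_step(0.0, 0.0, 0.0, v=0.5, omega=0.2, dt=0.1))
```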

Referring to Fig. 2, path planning may include the following steps (a sketch of the grid-map conversion in step S12 is given after the list):

S11. Perform laser scanning at the environment site and use the lidar to collect a two-dimensional point set of obstacle information.

S12. Convert the collected two-dimensional obstacle point set into a grid map as the reference scan result, and convert the robot position into coordinates in the XY coordinate system.

S13. Obtain the robot's longitude and latitude coordinates in real time in the outdoor environment and convert them into XY coordinates with the same parameters as the laser navigation system.

S14. Plan the robot's motion trajectory in the XY coordinate system.
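Step S12 is stated but not detailed; here is a minimal sketch of rasterising the collected 2-D obstacle point set into a binary grid map. The cell size, origin and grid shape are assumed parameters, not values from the patent.

```python
import numpy as np

def points_to_grid(points_xy, cell_size=0.05, origin=(0.0, 0.0), shape=(400, 400)):
    """Convert a 2-D obstacle point set (N x 2, metres) into a binary occupancy
    grid; cells containing at least one scan return are marked occupied."""
    grid = np.zeros(shape, dtype=np.uint8)
    idx = np.floor((np.asarray(points_xy) - origin) / cell_size).astype(int)
    inside = (idx[:, 0] >= 0) & (idx[:, 0] < shape[1]) & (idx[:, 1] >= 0) & (idx[:, 1] < shape[0])
    grid[idx[inside, 1], idx[inside, 0]] = 1  # row index = y cell, column index = x cell
    return grid

reference_map = points_to_grid(np.array([[1.02, 0.50], [1.03, 0.52], [3.10, 2.75]]))
```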

During travel, the mobile robot can then use the laser scanner it carries to scan the contour of the environment on site, and establish a polar coordinate system and a Cartesian coordinate system from the scan points.

In one embodiment of the present invention, step S2 may include:

judging whether the difference between the coordinate values of the mobile robot at time t-1 and time t is within a preset reasonable range;

if so, the accuracy of the GPS navigation module is high at this time, and the position of the mobile robot can be obtained directly from its coordinate value at time t; the inertial-navigation pose of the mobile robot at time t is then obtained by combining this position with the pre-established kinematic model of the mobile robot and the yaw-angle change;

if not, the GPS signal may be weak at this time, and the distance and direction of motion of the mobile robot can be obtained from the signal information of the inertial navigation module at time t-1 and time t; more specifically, the distance travelled can be computed from the encoder and the direction of motion determined from the electronic compass and gyroscope; the position of the mobile robot is then computed from its coordinate value at time t-1 and its distance and direction of motion, and the inertial-navigation pose at time t is obtained by combining this position with the pre-established kinematic model of the mobile robot and the yaw-angle change.
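A sketch of the branching just described: the GPS fix is accepted when the jump from the previous position is within a preset plausible range, otherwise dead reckoning from the encoder distance and the compass/gyro heading is used. The threshold value is an assumption.

```python
import math

def inertial_position(prev_xy, gps_xy, yaw, encoder_dist, max_step=1.0):
    """Prefer the GPS fix when the position jump is plausible; otherwise
    dead-reckon from the encoder distance along the compass/gyro heading."""
    jump = math.hypot(gps_xy[0] - prev_xy[0], gps_xy[1] - prev_xy[1])
    if jump <= max_step:  # GPS looks reliable
        x, y = gps_xy
    else:                 # weak GPS: encoder distance + heading
        x = prev_xy[0] + encoder_dist * math.cos(yaw)
        y = prev_xy[1] + encoder_dist * math.sin(yaw)
    return x, y

print(inertial_position((5.0, 3.0), (5.3, 3.1), yaw=0.2, encoder_dist=0.32))
```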

In another embodiment of the present invention, step S3 specifically includes: using the environment-change information of the mobile robot from time t-1 to time t as the initial pose transformation from the scan data at time t-1 to the scan data at time t, executing the PLICP algorithm, iteratively computing the final pose transformation, and then calculating the laser pose of the mobile robot at time t.
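PLICP proper minimises point-to-line distances; the sketch below uses the simpler point-to-point ICP update (nearest-neighbour pairing plus a closed-form Kabsch alignment) only to show how the odometry increment q0 seeds the iteration and how the refined transform emerges. It is a simplified stand-in, not the PLICP algorithm itself.

```python
import numpy as np

def icp_refine(src, ref, q0, iters=20):
    """Refine an initial guess q0 = (dx, dy, dtheta) aligning scan `src` (time t)
    to reference scan `ref` (time t-1); both are N x 2 arrays of 2-D points."""
    dx, dy, th = q0
    for _ in range(iters):
        c, s = np.cos(th), np.sin(th)
        moved = src @ np.array([[c, -s], [s, c]]).T + (dx, dy)
        # nearest-neighbour correspondences (brute force, for clarity only)
        nn = ref[np.argmin(((moved[:, None, :] - ref[None, :, :]) ** 2).sum(-1), axis=1)]
        # closed-form 2-D rigid alignment of the matched pairs (Kabsch)
        mu_m, mu_n = moved.mean(0), nn.mean(0)
        U, _, Vt = np.linalg.svd((moved - mu_m).T @ (nn - mu_n))
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:      # guard against a reflection
            Vt[1] *= -1
            R = Vt.T @ U.T
        dth = np.arctan2(R[1, 0], R[0, 0])
        t_step = mu_n - R @ mu_m
        # compose the incremental correction into the running estimate
        th += dth
        dx, dy = R @ np.array([dx, dy]) + t_step
    return dx, dy, th

ref = np.random.rand(200, 2)
src = ref + np.array([0.05, -0.03])           # toy example: a pure translation
print(icp_refine(src, ref, q0=(0.0, 0.0, 0.0)))
```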

More specifically, the mobile robot further includes a visual navigation module, and the environment-change information of the mobile robot from time t-1 to time t is the matching information of the visual navigation module for the environment at time t.

In yet another embodiment of the present invention, after the inertial-navigation pose is obtained, the iterative closest point algorithm can be used to map the scan point set onto the two-dimensional obstacle point set collected during path planning, and the transformation between the two is obtained according to the principle of spatial coordinate transformation in robotics; this transformation is used as the laser pose.
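The "spatial coordinate transformation principle" referred to above can be illustrated with 2-D homogeneous transforms: the ICP result gives the transform from the current scan into the reference map, and composing it with the lidar mounting offset yields the robot's laser pose in the map frame. The numeric values and the mounting offset below are illustrative assumptions only.

```python
import numpy as np

def se2(x, y, theta):
    """Homogeneous 2-D rigid transform (SE(2)) as a 3x3 matrix."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, x], [s, c, y], [0.0, 0.0, 1.0]])

def pose_from_matrix(T):
    return T[0, 2], T[1, 2], np.arctan2(T[1, 0], T[0, 0])

# T_map_scan: transform found by ICP mapping the current scan into the reference
# (map) point set; T_base_lidar: assumed fixed mounting of the lidar on the base.
T_map_scan = se2(2.0, 1.5, 0.20)
T_base_lidar = se2(0.15, 0.0, 0.0)
T_map_base = T_map_scan @ np.linalg.inv(T_base_lidar)  # laser pose of the robot base
print(pose_from_matrix(T_map_base))
```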

In one embodiment of the present invention, step S4 specifically includes:

calculating a Kalman gain matrix according to the observations of the environmental landmark information;

using the Kalman gain matrix to compute the inertial-navigation pose and the laser pose of the mobile robot according to the extended Kalman filter algorithm, to obtain the current pose of the mobile robot at time t.
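The patent names the extended Kalman filter but does not give its matrices. A minimal sketch of one correction step is shown below, with the inertial pose as the prediction and the laser pose as a direct observation (H = I); with landmark observations, H would instead be the Jacobian of the landmark measurement model. All covariance values are assumptions.

```python
import numpy as np

def ekf_fuse(pose_odom, P, pose_laser,
             Q=np.diag([0.05, 0.05, 0.01]),   # assumed process noise
             R=np.diag([0.02, 0.02, 0.005])): # assumed measurement noise
    """One predict/correct step fusing the inertial (odometry) pose with the
    laser pose; returns the fused pose and the updated covariance."""
    P_pred = P + Q                                            # prediction covariance
    H = np.eye(3)                                             # laser pose observes the state directly
    K = P_pred @ H.T @ np.linalg.inv(H @ P_pred @ H.T + R)    # Kalman gain matrix
    innovation = pose_laser - pose_odom
    innovation[2] = (innovation[2] + np.pi) % (2 * np.pi) - np.pi  # wrap the angle residual
    pose_fused = pose_odom + K @ innovation
    P_new = (np.eye(3) - K @ H) @ P_pred
    return pose_fused, P_new

fused, P1 = ekf_fuse(np.array([2.0, 1.5, 0.20]), np.eye(3) * 0.1, np.array([2.05, 1.48, 0.21]))
```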

The positioning method based on multiple navigation modules in the embodiments of the present invention has been described above; the mobile robot in the embodiments of the present invention is described below.

As shown in Fig. 3, the present invention proposes a mobile robot, comprising a GPS navigation module 10, an inertial navigation module 20, a lidar navigation module 30, a first calculation module 71, a second calculation module 72 and a third calculation module 73.

The GPS navigation module 10, the inertial navigation module 20 and the lidar navigation module 30 are connected to an industrial computer 50, and the data they collect are sent to a data acquisition board 60. The first calculation module 71, the second calculation module 72 and the third calculation module 73 are integrated on a control board 70, which can control motor drivers 80 to drive the front and rear wheels; the motor drivers 80 may include a rear-wheel brushless DC motor driver and a front-wheel stepper motor driver.
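To show how the three computation modules might be composed each cycle, here is a hypothetical sketch; the reader functions and lambda implementations are placeholders standing in for the real hardware drivers and for the calculations detailed below, not an interface defined by the patent.

```python
import numpy as np

# Placeholder readers standing in for the GPS, IMU and lidar drivers.
def read_gps_xy():     return np.array([2.00, 1.50])
def read_yaw_change(): return 0.02
def read_scan():       return np.random.rand(180, 2)

def localization_cycle(pose_prev, compute_inertial, compute_laser, fuse):
    """One pipeline cycle: sensor readings -> inertial pose -> laser pose ->
    fused current pose (steps S1-S4 of the method)."""
    xy_t, dyaw, scan_t = read_gps_xy(), read_yaw_change(), read_scan()
    pose_odom = compute_inertial(pose_prev, xy_t, dyaw)   # first calculation module
    pose_laser = compute_laser(pose_prev, scan_t)         # second calculation module
    return fuse(pose_odom, pose_laser)                    # third calculation module

# Trivial placeholder implementations, only to make the sketch executable.
pose = localization_cycle(
    np.array([1.9, 1.4, 0.1]),
    compute_inertial=lambda p, xy, dth: np.array([xy[0], xy[1], p[2] + dth]),
    compute_laser=lambda p, scan: p + np.array([0.05, 0.03, 0.01]),
    fuse=lambda a, b: (a + b) / 2.0,
)
```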

The GPS navigation module 10 is configured to obtain the longitude and latitude coordinates at time t-1 and time t and convert them into coordinate values in an XY coordinate system.
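The patent does not specify the longitude/latitude to XY conversion. A common choice for a site-sized work area is a local tangent-plane (equirectangular) projection about a fixed reference point; the sketch below assumes that approach, and the reference coordinates are illustrative only.

```python
import math

EARTH_RADIUS_M = 6378137.0  # WGS-84 equatorial radius

def latlon_to_xy(lat_deg, lon_deg, ref_lat_deg, ref_lon_deg):
    """Project latitude/longitude onto a local XY plane (metres) around a
    reference point; valid only over the small areas typical of a work site."""
    lat, lon = math.radians(lat_deg), math.radians(lon_deg)
    ref_lat, ref_lon = math.radians(ref_lat_deg), math.radians(ref_lon_deg)
    x = EARTH_RADIUS_M * (lon - ref_lon) * math.cos(ref_lat)  # east
    y = EARTH_RADIUS_M * (lat - ref_lat)                      # north
    return x, y

# Example: fixes at t-1 and t expressed in the same local frame
xy_prev = latlon_to_xy(31.82051, 117.22740, 31.82000, 117.22700)
xy_curr = latlon_to_xy(31.82055, 117.22748, 31.82000, 117.22700)
```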

The inertial navigation module 20 is configured to obtain the yaw-angle change of the mobile robot from the signal information at time t-1 and time t; it includes an electronic compass, an angle sensor, an encoder and the like.
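The yaw-angle change can be obtained in several ways from the parts listed: gyroscope integration, compass differencing, or a blend of the two. A minimal complementary-filter sketch under that assumption (the blend weight is arbitrary):

```python
import math

def yaw_change(gyro_rates, dt, compass_prev, compass_curr, alpha=0.98):
    """Blend the integrated gyro rate (smooth but drifting) with the compass
    heading difference (absolute but noisy) to estimate the yaw change."""
    gyro_delta = sum(r * dt for r in gyro_rates)  # integrate the rate samples
    compass_delta = math.atan2(math.sin(compass_curr - compass_prev),
                               math.cos(compass_curr - compass_prev))  # wrapped difference
    return alpha * gyro_delta + (1.0 - alpha) * compass_delta

dyaw = yaw_change([0.010, 0.012, 0.011], dt=0.02, compass_prev=0.100, compass_curr=0.101)
```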

The lidar navigation module 30 is configured to obtain the scan data at time t-1 and time t.

The first calculation module 71 is configured to obtain the inertial-navigation pose of the mobile robot at time t according to the pre-established kinematic model of the mobile robot, the coordinate values of the mobile robot at time t-1 and time t, and the yaw-angle change. In a specific implementation, according to the kinematic model of the mobile robot, the position change from time t-1 to time t, (Δx_t, Δy_t), and the IMU yaw-angle change θ_t are computed, and the robot inertial-navigation pose corresponding to the odometry at time t is obtained as P_odom(t) = [x(t), y(t), θ(t)]^T.

The second calculation module 72 is configured to calculate the laser pose of the mobile robot at time t according to the scan data of the mobile robot at time t-1 and time t. In a specific implementation, let q_0 = (Δx_t, Δy_t, θ_t); with the lidar scan data S_t at time t and the reference scan data S_(t-1) at time t-1, q_0 is used as the initial pose transformation from S_(t-1) to S_t, the PLICP algorithm is executed, and the final pose transformation q_k is obtained iteratively. The robot pose corresponding to the scan matching at time t is then computed as P_scan(t) = R(θ_k)·P_scan(t-1) + t_k.

The third calculation module 73 is configured to fuse the inertial-navigation pose and the laser pose of the mobile robot in combination with environmental landmark information to obtain the current pose of the mobile robot at time t.

In general, a path planning module may also be included, integrated on the control board. The path planning module is configured to perform laser scanning at the environment site and use the lidar to collect a two-dimensional point set of obstacle information; convert the collected obstacle point set into a grid map as the reference scan result and convert the robot position into coordinates in the XY coordinate system; obtain the robot's longitude and latitude coordinates in real time in the outdoor environment and convert them into XY coordinates with the same parameters as the laser navigation system; and plan the robot's motion trajectory in the XY coordinate system.

During travel, the mobile robot can then use the laser scanner it carries to scan the contour of the environment on site, and establish a polar coordinate system and a Cartesian coordinate system from the scan points.

In one embodiment of the present invention, the first calculation module 71 may be configured to:

judge whether the difference between the coordinate values of the mobile robot at time t-1 and time t is within a preset reasonable range;

if so, the accuracy of the GPS navigation module is high at this time, and the position of the mobile robot can be obtained directly from its coordinate value at time t; the inertial-navigation pose of the mobile robot at time t is then obtained by combining this position with the pre-established kinematic model of the mobile robot and the yaw-angle change;

if not, the GPS signal may be weak at this time, and the distance and direction of motion of the mobile robot can be obtained from the signal information of the inertial navigation module at time t-1 and time t; more specifically, the distance travelled can be computed from the encoder and the direction of motion determined from the electronic compass and gyroscope; the position of the mobile robot is then computed from its coordinate value at time t-1 and its distance and direction of motion, and the inertial-navigation pose at time t is obtained by combining this position with the pre-established kinematic model of the mobile robot and the yaw-angle change.

In one embodiment of the present invention, the second calculation module 72 may be configured to: use the environment-change information of the mobile robot from time t-1 to time t as the initial pose transformation from the scan data at time t-1 to the scan data at time t, execute the PLICP algorithm, iteratively compute the final pose transformation, and then calculate the laser pose of the mobile robot at time t.

More specifically, as shown in Fig. 4, the mobile robot further includes a visual navigation module 40, and the environment-change information of the mobile robot from time t-1 to time t is the matching information of the visual navigation module for the environment at time t.

In one embodiment of the present invention, the third calculation module 73 is configured to:

calculate a Kalman gain matrix according to the observations of the environmental landmark information;

use the Kalman gain matrix to compute the inertial-navigation pose and the laser pose of the mobile robot according to the extended Kalman filter algorithm, to obtain the current pose of the mobile robot at time t.

The positioning method based on multiple navigation modules and the mobile robot proposed by the present invention fuse several mainstream navigation approaches and improve the algorithm, which improves positioning accuracy and navigation stability, greatly extends the range of environments in which the robot can operate, and overcomes the limitations and systematic errors of single navigation systems in the prior art as well as the poor interference immunity of existing simple integrated navigation techniques.

It should be noted that, in this document, the terms "comprise", "include" or any other variants thereof are intended to cover non-exclusive inclusion, so that a process, method, article or device comprising a series of elements includes not only those elements but also other elements not explicitly listed, or elements inherent to such a process, method, article or device. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of additional identical elements in the process, method, article or device that comprises it.

From the description of the above embodiments, those skilled in the art can clearly understand that the methods of the above embodiments can be implemented by means of software plus the necessary general-purpose hardware platform, and of course also by hardware, although in many cases the former is the better implementation. Based on this understanding, the technical solution of the present invention, in essence or in the part contributing to the prior art, can be embodied in the form of a software product stored in a storage medium (such as ROM/RAM, a magnetic disk or an optical disc) that includes several instructions for causing a terminal device (which may be a mobile phone, a computer, a server, an air conditioner, a network device, etc.) to execute the methods described in the embodiments of the present invention.

The above are only preferred embodiments of the present invention and do not therefore limit its patent scope; any equivalent structural or process transformation made using the contents of the description and drawings of the present invention, or any direct or indirect application in other related technical fields, is likewise included within the scope of patent protection of the present invention.

Claims (10)

CN201710180534.0A · priority 2017-03-23 · filed 2017-03-23 · A positioning method and mobile robot based on multiple navigation modules · Pending · CN106918830A (en)

Priority Applications (1) / Applications Claiming Priority (1)

Application Number: CN201710180534.0A (CN106918830A, en) · Priority date: 2017-03-23 · Filing date: 2017-03-23 · Title: A positioning method and mobile robot based on multiple navigation modules

Publications (1)

Publication Number: CN106918830A · Publication Date: 2017-07-04

Family ID: 59462107

Family Applications (1)

Application Number: CN201710180534.0A · Status: Pending · Publication: CN106918830A (en) · Priority date: 2017-03-23 · Filing date: 2017-03-23 · Title: A positioning method and mobile robot based on multiple navigation modules

Country Status (1)

CN: CN106918830A (en)



Patent Citations (5)

* Cited by examiner, † Cited by third party

CN104914865A (en)* · priority 2015-05-29 · published 2015-09-16 · 国网山东省电力公司电力科学研究院 · Transformer station inspection tour robot positioning navigation system and method
CN105865452A (en)* · priority 2016-04-29 · published 2016-08-17 · 浙江国自机器人技术有限公司 · Mobile platform pose estimation method based on indirect Kalman filtering
CN106123890A (en)* · priority 2016-06-14 · published 2016-11-16 · 中国科学院合肥物质科学研究院 · A kind of robot localization method of Fusion
CN106324616A (en)* · priority 2016-09-28 · published 2017-01-11 · 深圳市普渡科技有限公司 · Map construction method based on inertial navigation unit and laser radar
CN106525053A (en)* · priority 2016-12-28 · published 2017-03-22 · 清研华宇智能机器人(天津)有限责任公司 · Indoor positioning method for mobile robot based on multi-sensor fusion



Legal Events

PB01: Publication
SE01: Entry into force of request for substantive examination
RJ01: Rejection of invention patent application after publication (application publication date: 2017-07-04)
