CN114964207B - Robot 3D point cloud map dynamic updating method and device and robot - Google Patents

Robot 3D point cloud map dynamic updating method and device and robot
Download PDF

Info

Publication number
CN114964207B
CN114964207B · CN202210437336.9A
Authority
CN
China
Prior art keywords
map
pose
local
sub
robot
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202210437336.9A
Other languages
Chinese (zh)
Other versions
CN114964207A (en)
Inventor
袁国斌
刘彪
柏林
舒海燕
沈创芸
祝涛剑
王恒华
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangzhou Gosuncn Robot Co Ltd
Original Assignee
Guangzhou Gosuncn Robot Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangzhou Gosuncn Robot Co Ltd
Priority to CN202210437336.9A
Publication of CN114964207A
Application granted
Publication of CN114964207B
Legal status: Active
Anticipated expiration

Links

Classifications

Landscapes

Abstract

The invention provides a method for dynamically updating a robot 3D point cloud map, comprising the following steps: S1, acquiring a first pose of the robot based on 3D point cloud map positioning; S2, maintaining a sub-map over a predetermined time according to a second pose of the robot output by the laser odometry; S3, judging whether the angle difference between the first pose and the second pose remains larger than a first threshold, or the displacement difference remains larger than a second threshold, throughout a continuous second time period; if yes, executing step S4; S4, extracting a local map corresponding to the current position from the existing 3D point cloud map; S5, projecting the sub-map and the local map from the point cloud format into the depth image format; and S6, when the variance difference between the depth map of the local map and the depth map of the sub-map is larger than a third threshold, directly replacing the local map with the sub-map to complete the dynamic update of the map. According to the method and device, the change of the 3D point cloud map is estimated from the variance difference between the local map depth map and the sub-map depth map, so whether the map needs to be updated can be judged accurately.

Description

Robot 3D point cloud map dynamic updating method and device and robot
Technical Field
The invention relates to the field of robots, in particular to a method and a device for dynamically updating a 3D point cloud map of a robot and the robot.
Background
An autonomous mobile robot must be able to find paths and move on its own, which presupposes that the robot knows where it is located. Localization of autonomous mobile robots has been one of the hot research topics in recent years. The positioning modes widely applied to autonomously navigating mobile robots at present include laser positioning, GPS positioning, visual positioning, and the like. The patrol robot is a specific application of the autonomous mobile robot; it usually carries a 3D lidar for positioning and navigation and is mainly used for patrolling, photographing, video recording and other work in places such as parks and buildings.
The patrol robot realizes positioning and navigation based on a 3D laser point cloud map built in advance. However, most real-world scenes change over time: in a park scene, for example, trees grow, new buildings are constructed, and vegetation differs between spring and winter. If these changes are ignored, a map built in advance easily becomes outdated, which degrades the positioning accuracy of the robot and may even cause the robot to lose its position. Therefore, it is necessary to update and maintain the 3D point cloud map automatically.
The existing technical solutions to the problem of an outdated mobile-robot 3D point cloud map mainly rely on maintenance or technical personnel updating the map manually. However, such updating has the following problems: 1. manual intervention not only wastes manpower but also lowers the automation level of the robot;
2. manual maintenance is difficult to perform in real time, and the map tends to be updated only after the robot's positioning has already gone wrong.
The background description provided herein is for the purpose of generally presenting the context of the disclosure. Unless otherwise indicated herein, the materials described in this section are not prior art to the claims in this application and are not admitted to be prior art by inclusion in this section.
Disclosure of Invention
Aiming at the technical problems in the related art, the invention provides a dynamic updating method of a robot 3D point cloud map, which comprises the following steps:
S1, acquiring a first pose pose_map of the robot based on 3D point cloud map positioning;
S2, maintaining a sub-map sub_map over a predetermined time according to a second pose pose_odom of the robot output by the laser odometry;
S3, judging whether the angle difference between the first pose pose_map and the second pose pose_odom remains larger than a first threshold, or the displacement difference remains larger than a second threshold, throughout a continuous second time period; if yes, executing step S4;
S4, extracting a local map local_map corresponding to the current position from the existing 3D point cloud map;
S5, projecting the sub-map and the local map from the point cloud format into the depth image format, recorded respectively as the sub-map depth map sub_map_img and the local map depth map local_map_img;
And S6, when the variance difference between the local map depth map local_map_img and the sub-map depth map sub_map_img is larger than a third threshold, directly replacing the local map local_map with the sub-map sub_map to complete the dynamic update of the map.
Specifically, the predetermined time is 5s.
Specifically, the second period of time is 5s.
Specifically, the first threshold is 3 degrees or the second threshold is 0.3m.
Specifically, the third threshold is 0.5.
In a second aspect, another embodiment of the present invention discloses a robot 3D point cloud map dynamic updating apparatus, which includes:
the first pose acquisition unit, used for acquiring a first pose pose_map of the robot based on 3D point cloud map positioning;
the second pose acquisition unit, used for maintaining a sub-map sub_map over a predetermined time according to a second pose pose_odom of the robot output by the laser odometry;
the pose comparison unit, used for judging whether the angle difference between the first pose pose_map and the second pose pose_odom remains larger than a first threshold, or the displacement difference remains larger than a second threshold, throughout a continuous second time period; if yes, invoking the local map acquisition unit;
the local map acquisition unit, used for extracting a local map local_map corresponding to the current position from the existing 3D point cloud map;
the depth map projection unit, used for projecting the sub-map and the local map from the point cloud format into the depth image format, recorded respectively as the sub-map depth map sub_map_img and the local map depth map local_map_img;
and the local map updating unit, used for directly replacing the local map local_map with the sub-map sub_map when the variance difference between the local map depth map local_map_img and the sub-map depth map sub_map_img is larger than a third threshold, completing the dynamic update of the map.
Specifically, the predetermined time and the second time period are both 5s.
Specifically, the first threshold is 3 degrees or the second threshold is 0.3m.
Specifically, the third threshold is 0.5.
In a third aspect, another embodiment of the present invention provides a robot, which includes a processor and a storage unit, where the storage unit stores instructions that, when executed by the processor, implement the above method for dynamically updating a robot 3D point cloud map.
The invention initially considers that the local map of the current position may need to be updated when, throughout a second time period, the angle difference between the first pose pose_map and the second pose pose_odom remains larger than a first threshold or the displacement difference remains larger than a second threshold; it then confirms that the local map of the position needs to be updated when the variance difference between the local map depth map local_map_img and the sub-map depth map sub_map_img is larger than a third threshold. By using an image algorithm to evaluate the change of the 3D point cloud map from the variance difference between local_map_img and sub_map_img, the degree of change of the 3D point cloud map can be detected efficiently and accurately, so whether the map needs updating can be judged accurately.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings that are needed in the embodiments will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a flowchart of a method for dynamically updating a 3D point cloud map of a robot according to an embodiment of the present invention;
fig. 2 is a schematic diagram of a dynamic update device for a 3D point cloud map of a robot according to an embodiment of the present invention;
fig. 3 is a schematic diagram of a robot according to an embodiment of the present invention.
Detailed Description
The following describes the technical solutions in the embodiments of the present invention clearly and completely with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by a person skilled in the art based on these embodiments without inventive effort fall within the scope of protection of the invention.
Example 1
Referring to fig. 1, the embodiment discloses a dynamic update method for a robot 3D point cloud map, which includes the following steps:
s1, acquiring a first pose pose _map of a robot based on 3D point cloud map positioning;
specifically, the robot starts to realize positioning in the scene based on the 3D point cloud map, and outputs a first pose pose _map of the robot.
Specifically, when the robot patrol in a preset scene, the positioning mode based on the 3D point cloud map is started to realize the positioning of the robot.
The robot is provided with a laser radar, and is positioned according to the laser radar and a pre-stored 3D point cloud map.
S2, according to a robot second pose pose-odom output by the laser odometer, maintaining a sub map for a preset time;
Specifically, the robot in this embodiment is equipped with a laser odometer (LiDAR odometry), which is a method of estimating the propagation position and direction by tracking the laser speckle pattern reflected by surrounding objects.
Specifically, the predetermined time is 5s.
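The sub-map maintenance of step S2 can be sketched as follows. This is a minimal illustration, not the patent's implementation: the class name and the assumption that each scan arrives already transformed into the map frame are mine; the 5-second window is the predetermined time from this embodiment.

```python
from collections import deque

import numpy as np

PREDETERMINED_TIME = 5.0  # seconds, per the embodiment


class SubMapBuffer:
    """Keep only the odometry-registered scans from the last few seconds
    and fuse them into one sub-map point cloud."""

    def __init__(self, window=PREDETERMINED_TIME):
        self.window = window
        self.scans = deque()  # entries: (timestamp, Nx3 point array)

    def add_scan(self, stamp, points):
        self.scans.append((stamp, np.asarray(points, dtype=float)))
        # Drop scans that fall outside the sliding time window.
        while self.scans and stamp - self.scans[0][0] > self.window:
            self.scans.popleft()

    def sub_map(self):
        # Concatenate the retained scans into a single Nx3 cloud.
        if not self.scans:
            return np.empty((0, 3))
        return np.vstack([pts for _, pts in self.scans])
```

A scan older than the window is discarded as soon as a newer one arrives, so the sub-map always reflects roughly the last 5 seconds of observations.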
S3, judging whether the angle difference between the first pose pose_map and the second pose pose_odom remains larger than a first threshold, or the displacement difference remains larger than a second threshold, throughout a continuous second time period; if yes, executing step S4;
Specifically, this embodiment checks that the angle difference between the first pose pose_map and the second pose pose_odom remains larger than the first threshold throughout a continuous second time period; for example, the angle difference between pose_map and pose_odom remains larger than the first threshold for 5s.
Specifically, the first threshold is 3 degrees.
Alternatively, the displacement difference between the first pose pose_map and the second pose pose_odom remains larger than the second threshold throughout a continuous second time period; for example, the displacement difference between pose_map and pose_odom remains larger than the second threshold for 5s.
Specifically, the second threshold is 0.3m.
If, throughout the second time period, the angle difference between the first pose pose_map and the second pose pose_odom remains larger than the first threshold or the displacement difference remains larger than the second threshold, it is initially considered that the local map of this position may need to be updated.
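The check of step S3 might be sketched like this. The function name and the (x, y, yaw) pose representation are assumptions for illustration; the 3-degree and 0.3-metre thresholds are the values given above. Note that the discrepancy must exceed a threshold at every sample in the window, not just once.

```python
import numpy as np

ANGLE_THRESHOLD = 3.0  # degrees (first threshold)
DIST_THRESHOLD = 0.3   # metres  (second threshold)


def needs_check(pose_map_samples, pose_odom_samples):
    """Each sample is (x, y, yaw_deg); both lists cover the second
    time period. Returns True when step S4 should be triggered."""
    # Angle difference stays above the first threshold at every sample
    # (wrap yaw differences into [-180, 180) before comparing).
    angle_exceeds = all(
        abs((m[2] - o[2] + 180.0) % 360.0 - 180.0) > ANGLE_THRESHOLD
        for m, o in zip(pose_map_samples, pose_odom_samples)
    )
    # Displacement difference stays above the second threshold throughout.
    dist_exceeds = all(
        np.hypot(m[0] - o[0], m[1] - o[1]) > DIST_THRESHOLD
        for m, o in zip(pose_map_samples, pose_odom_samples)
    )
    return angle_exceeds or dist_exceeds
```

Requiring the condition to hold over the whole window filters out momentary localization jitter, which would otherwise trigger spurious map checks.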
S4, extracting a local map local_map corresponding to the current position from the existing 3D point cloud map;
Specifically, the robot stores a previously built 3D point cloud map, which may differ from the current state of the scene.
According to the acquired current position of the robot, the embodiment acquires a local map of the current position from the established 3D map.
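The extraction of step S4 can be sketched as follows. The square crop and the 50-metre radius are assumptions for illustration only; the patent does not specify the shape or size of the local region.

```python
import numpy as np


def extract_local_map(global_map, position, radius=50.0):
    """Crop the prior 3D point cloud map to a square region centred on
    the robot's current (x, y) position.

    global_map: Nx3 array of (x, y, z) points; position: (x, y)."""
    dx = np.abs(global_map[:, 0] - position[0])
    dy = np.abs(global_map[:, 1] - position[1])
    return global_map[(dx <= radius) & (dy <= radius)]
```

The local map returned here covers the same area the robot has been observing, so it can be compared directly against the odometry sub-map in the following steps.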
S5, projecting the sub-map and the local map from the point cloud format into the depth image format, recorded respectively as the sub-map depth map sub_map_img and the local map depth map local_map_img;
Specifically, the point cloud map (x, y, z) is converted into a depth image (u, v, pixel).
This embodiment converts the point cloud map (x, y, z) into a depth image (u, v, pixel) at a resolution of 0.1m per pixel according to the following formulas: u = round(x / 0.1), v = round(y / 0.1), pixel = round(z × 100 / 256), where round() denotes the rounding operation.
A depth image, also called a range image, is an image whose pixel values are the distances from the image collector to points in the scene; it directly reflects the geometry of the scene's visible surfaces.
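Under the formulas above, the projection might be sketched as follows. The image size, the centering offset, and keeping the maximum value where several points fall into one cell are my assumptions; the text only specifies the 0.1m resolution and the per-axis formulas.

```python
import numpy as np


def project_to_depth_image(points, resolution=0.1, size=1024):
    """Project an Nx3 point cloud (x, y, z) into a depth image using
    u = round(x / resolution), v = round(y / resolution),
    pixel = round(z * 100 / 256)."""
    img = np.zeros((size, size), dtype=np.float64)
    u = np.round(points[:, 0] / resolution).astype(int) + size // 2
    v = np.round(points[:, 1] / resolution).astype(int) + size // 2
    pixel = np.round(points[:, 2] * 100.0 / 256.0)
    # Discard points that fall outside the image extents.
    inside = (u >= 0) & (u < size) & (v >= 0) & (v < size)
    for ui, vi, pi in zip(u[inside], v[inside], pixel[inside]):
        # Keep the highest value per cell (one common convention).
        img[vi, ui] = max(img[vi, ui], pi)
    return img
```

Both the sub-map and the local map are passed through the same projection, so the two resulting depth images are directly comparable cell by cell.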
S6, when the variance difference between the local map depth map local_map_img and the sub-map depth map sub_map_img is larger than a third threshold, directly replacing the local map local_map with the sub-map sub_map to complete the dynamic update of the map;
Specifically, this embodiment calculates the variance of the local map depth map local_map_img: first the mean Pmean = (1/n)·(P(u,v,1) + P(u,v,2) + … + P(u,v,n)) is computed, and then the variance V = (1/n)·[(P(u,v,1) − Pmean)² + (P(u,v,2) − Pmean)² + … + (P(u,v,n) − Pmean)²]. The variance of the sub-map depth map sub_map_img is computed in the same way.
Specifically, the third threshold is 0.5.
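Following the mean and variance formulas above, the decision of step S6 can be sketched like this. Treating only non-zero cells as observed pixels is an assumption; the formulas themselves are those given in the text.

```python
import numpy as np

THIRD_THRESHOLD = 0.5  # per the embodiment


def depth_map_variance(img):
    """Variance of a depth image: Pmean = (1/n) * sum(P),
    V = (1/n) * sum((P - Pmean)^2)."""
    pixels = img[img > 0]  # assumed: only occupied cells count
    if pixels.size == 0:
        return 0.0
    mean = pixels.mean()
    return float(np.mean((pixels - mean) ** 2))


def should_replace(local_map_img, sub_map_img):
    """True when the variance difference between the two depth maps
    exceeds the third threshold, i.e. the map portion should be
    replaced by the fresh sub-map."""
    diff = abs(depth_map_variance(local_map_img) -
               depth_map_variance(sub_map_img))
    return diff > THIRD_THRESHOLD
```

When `should_replace` returns True, the outdated local region of the stored map is swapped out for the sub-map built from current sensor data.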
This embodiment initially considers that the local map of the position may need to be updated when, throughout the second time period, the angle difference between the first pose pose_map and the second pose pose_odom remains larger than the first threshold or the displacement difference remains larger than the second threshold; it then confirms that the local map needs to be updated when the variance difference between the local map depth map local_map_img and the sub-map depth map sub_map_img is larger than the third threshold. By evaluating the change of the 3D point cloud map with an image algorithm, namely from the variance difference between local_map_img and sub_map_img, the degree of change of the 3D point cloud map can be detected efficiently and accurately, so whether the map needs updating can be judged accurately.
Example two
Referring to fig. 2, the embodiment discloses a robot 3D point cloud map dynamic updating device, which includes the following units:
the first pose acquisition unit, used for acquiring a first pose pose_map of the robot based on 3D point cloud map positioning;
Specifically, the robot performs positioning in the scene based on the 3D point cloud map and outputs the robot's first pose pose_map.
Specifically, when the robot patrols in a preset scene, the positioning mode based on the 3D point cloud map is started to localize the robot.
The robot carries a lidar and is positioned according to the lidar and a pre-stored 3D point cloud map.
the second pose acquisition unit, used for maintaining a sub-map sub_map over a predetermined time according to a second pose pose_odom of the robot output by the laser odometry;
Specifically, the robot in this embodiment runs laser odometry (LiDAR odometry), which estimates the sensor's change in position and orientation by registering successive lidar scans of the surrounding objects against one another.
Specifically, the predetermined time is 5s.
the pose comparison unit, used for judging whether the angle difference between the first pose pose_map and the second pose pose_odom remains larger than a first threshold, or the displacement difference remains larger than a second threshold, throughout a continuous second time period; if yes, invoking the local map acquisition unit;
Specifically, this embodiment checks that the angle difference between the first pose pose_map and the second pose pose_odom remains larger than the first threshold throughout a continuous second time period; for example, the angle difference between pose_map and pose_odom remains larger than the first threshold for 5s.
Specifically, the first threshold is 3 degrees.
Alternatively, the displacement difference between the first pose pose_map and the second pose pose_odom remains larger than the second threshold throughout a continuous second time period; for example, the displacement difference between pose_map and pose_odom remains larger than the second threshold for 5s.
Specifically, the second threshold is 0.3m.
If, throughout the second time period, the angle difference between the first pose pose_map and the second pose pose_odom remains larger than the first threshold or the displacement difference remains larger than the second threshold, it is initially considered that the local map of this position may need to be updated.
The local map acquisition unit is used for extracting a local map local_map corresponding to the current position from the existing 3D point cloud map;
Specifically, the robot stores a previously built 3D point cloud map, which may differ from the current state of the scene.
According to the acquired current position of the robot, the embodiment acquires a local map of the current position from the established 3D map.
the depth map projection unit, used for projecting the sub-map and the local map from the point cloud format into the depth image format, recorded respectively as the sub-map depth map sub_map_img and the local map depth map local_map_img;
Specifically, the point cloud map (x, y, z) is converted into a depth image (u, v, pixel).
This embodiment converts the point cloud map (x, y, z) into a depth image (u, v, pixel) at a resolution of 0.1m per pixel according to the following formulas: u = round(x / 0.1), v = round(y / 0.1), pixel = round(z × 100 / 256), where round() denotes the rounding operation.
A depth image, also called a range image, is an image whose pixel values are the distances from the image collector to points in the scene; it directly reflects the geometry of the scene's visible surfaces.
and the local map updating unit, used for directly replacing the local map local_map with the sub-map sub_map when the variance difference between the local map depth map local_map_img and the sub-map depth map sub_map_img is larger than a third threshold, completing the dynamic update of the map.
Specifically, this embodiment calculates the variance of the local map depth map local_map_img: first the mean Pmean = (1/n)·(P(u,v,1) + P(u,v,2) + … + P(u,v,n)) is computed, and then the variance V = (1/n)·[(P(u,v,1) − Pmean)² + (P(u,v,2) − Pmean)² + … + (P(u,v,n) − Pmean)²]. The variance of the sub-map depth map sub_map_img is computed in the same way.
Specifically, the third threshold is 0.5.
This embodiment initially considers that the local map of the position may need to be updated when, throughout the second time period, the angle difference between the first pose pose_map and the second pose pose_odom remains larger than the first threshold or the displacement difference remains larger than the second threshold; it then confirms that the local map needs to be updated when the variance difference between the local map depth map local_map_img and the sub-map depth map sub_map_img is larger than the third threshold. By evaluating the change of the 3D point cloud map with an image algorithm, namely from the variance difference between local_map_img and sub_map_img, the degree of change of the 3D point cloud map can be detected efficiently and accurately, so whether the map needs updating can be judged accurately.
Example III
Referring to fig. 3, fig. 3 is a schematic structural diagram of the robot of this embodiment. The robot 20 of this embodiment comprises a processor 21, a memory 22, and a computer program stored in the memory 22 and executable on the processor 21. The processor 21 implements the steps of the above method embodiments when executing the computer program; alternatively, the processor 21 performs the functions of the modules/units in the above device embodiments when executing the computer program.
Illustratively, the computer program may be partitioned into one or more modules/units that are stored in the memory 22 and executed by the processor 21 to complete the present invention. The one or more modules/units may be a series of computer program instruction segments capable of performing a specific function for describing the execution of the computer program in the robot 20. For example, the computer program may be divided into modules in the second embodiment, and specific functions of each module refer to the working process of the apparatus described in the foregoing embodiment, which is not described herein.
The robot 20 may include, but is not limited to, a processor 21, a memory 22. It will be appreciated by those skilled in the art that the schematic diagram is merely an example of the robot 20 and is not limiting of the robot 20, and may include more or fewer components than shown, or may combine certain components, or different components, e.g., the robot 20 may also include input and output devices, network access devices, buses, etc.
The processor 21 may be a central processing unit (CPU), or another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. The general-purpose processor may be a microprocessor, or any conventional processor. The processor 21 is the control center of the robot 20 and connects the various parts of the entire robot 20 using various interfaces and lines.
The memory 22 may be used to store the computer program and/or modules, and the processor 21 implements the various functions of the robot 20 by running or executing the computer program and/or modules stored in the memory 22 and invoking data stored in the memory 22. The memory 22 may mainly include a program storage area and a data storage area, where the program storage area may store an operating system and the application programs required for at least one function (such as a sound playing function or an image playing function), and the data storage area may store data created according to the use of the device (such as audio data or a phonebook). In addition, the memory 22 may include high-speed random access memory, and may also include non-volatile memory, such as a hard disk, a plug-in hard disk, a smart media card (SMC), a secure digital (SD) card, a flash card, at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
The integrated modules/units of the robot 20 may be stored in a computer readable storage medium if implemented in the form of software functional units and sold or used as separate products. Based on such understanding, the present invention may implement all or part of the flow of the methods of the above embodiments through a computer program instructing the related hardware; the computer program may be stored in a computer readable storage medium, and when executed by the processor 21 it implements the steps of each of the method embodiments described above. The computer program comprises computer program code, which may be in source code form, object code form, an executable file, some intermediate form, and so on. The computer readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a read-only memory (ROM), a random access memory (RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and so forth. It should be noted that the content contained in the computer readable medium may be added to or removed from as appropriate under the requirements of legislation and patent practice in a given jurisdiction; for example, in some jurisdictions, the computer readable medium does not include electrical carrier signals and telecommunication signals.
It should be noted that the above-described apparatus embodiments are merely illustrative, and the units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed over a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment. In addition, in the drawings of the embodiment of the device provided by the invention, the connection relation between the modules represents that the modules have communication connection, and can be specifically implemented as one or more communication buses or signal lines. Those of ordinary skill in the art will understand and implement the present invention without undue burden.
The foregoing description of the preferred embodiments of the invention is not intended to be limiting, but rather is intended to cover all modifications, equivalents, alternatives, and improvements that fall within the spirit and scope of the invention.

Claims (10)

CN202210437336.9A · 2022-04-21 · Robot 3D point cloud map dynamic updating method and device and robot · Active · CN114964207B (en)

Priority Applications (1)

Application Number · Priority Date · Filing Date · Title
CN202210437336.9A · 2022-04-21 · 2022-04-21 · Robot 3D point cloud map dynamic updating method and device and robot (CN114964207B)

Applications Claiming Priority (1)

Application Number · Priority Date · Filing Date · Title
CN202210437336.9A · 2022-04-21 · 2022-04-21 · Robot 3D point cloud map dynamic updating method and device and robot (CN114964207B)

Publications (2)

Publication Number · Publication Date
CN114964207A (en) · 2022-08-30
CN114964207B (en) · 2024-09-27

Family

ID=82970809

Family Applications (1)

Application Number · Title · Priority Date · Filing Date
CN202210437336.9A · Robot 3D point cloud map dynamic updating method and device and robot · 2022-04-21 · 2022-04-21 · Active · CN114964207B (en)

Country Status (1)

Country · Link
CN · CN114964207B (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number · Priority date · Publication date · Assignee · Title
CN111708047A * · 2020-06-16 · 2020-09-25 · Zhejiang Dahua Technology Co Ltd · Robot positioning evaluation method, robot and computer storage medium
CN113932790A * · 2021-09-01 · 2022-01-14 · Beijing Megvii Technology Co Ltd · Map updating method, device, system, electronic equipment and storage medium

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number · Priority date · Publication date · Assignee · Title
CN113436310A * · 2020-03-23 · 2021-09-24 · Nanjing Ecovacs Robotics Co Ltd · Scene establishing method, system and device and self-moving robot


Also Published As

Publication number · Publication date
CN114964207A (en) · 2022-08-30

Similar Documents

Publication · Title
CN111192331A (en) · External parameter calibration method and device for laser radar and camera
CN115097419A (en) · A method and device for external parameter calibration from lidar to IMU
US20220074743A1 (en) · Aerial survey method, aircraft, and storage medium
CN117974916A (en) · High-precision map generation method and device based on information fusion
CN115326051A (en) · Positioning method and device based on dynamic scene, robot and medium
CN112650250B (en) · Map construction method and robot
CN115267812B (en) · Positioning method, device, medium and robot based on highlight region
CN115717897B (en) · Point cloud map dynamic object filtering method, device, medium and robot
CN109598199B (en) · Lane line generation method and device
CN115902843A (en) · Multi-laser-radar calibration method and device and electronic equipment
CN114964207B (en) · Robot 3D point cloud map dynamic updating method and device and robot
CN113066100B (en) · Target tracking method, device, equipment and storage medium
CN112364693B (en) · Binocular vision-based obstacle recognition method, device, equipment and storage medium
CN115685236A (en) · Robot, robot skid processing method, device and readable storage medium
CN112396051B (en) · Determination method and device for passable area, storage medium and electronic device
CN117876500A (en) · External parameter calibration method, device, electronic equipment and vehicle for sensing component
CN111380529B (en) · Mobile device positioning method, device and system and mobile device
CN114564018B (en) · Control method, control device, terminal equipment and computer readable storage medium
CN114895679B (en) · Human body following method, device and robot with linkage of two-axis cradle head and chassis
CN115407356B (en) · A positioning method, device, robot and medium for integrating lane lines
CN110763232B (en) · Robot and navigation positioning method and device thereof
CN110244710B (en) · Automatic tracing method, device, storage medium and electronic equipment
CN116892925A (en) · 2D grid map dynamic updating method, device and robot
CN116698038A (en) · Positioning loss judging method and device of robot and robot
CN114216451B (en) · Robot map updating method and device

Legal Events

Date · Code · Title
PB01 · Publication
SE01 · Entry into force of request for substantive examination
GR01 · Patent grant
