CN111624997A - Robot control method and system based on TOF camera module and robot - Google Patents

Robot control method and system based on TOF camera module and robot

Info

Publication number
CN111624997A
CN111624997A (application CN202010398887.XA)
Authority
CN
China
Prior art keywords
robot
camera module
tof camera
obstacle
information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010398887.XA
Other languages
Chinese (zh)
Inventor
戴剑锋
何再生
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhuhai Amicro Semiconductor Co Ltd
Original Assignee
Zhuhai Amicro Semiconductor Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhuhai Amicro Semiconductor Co Ltd
Priority to CN202010398887.XA
Publication of CN111624997A
Legal status: Pending

Abstract

The invention discloses a robot control method and system based on a TOF camera module, and a robot. The robot acquires environmental information and establishes a global map; it plans a walking route according to the global map and walks along that route; the TOF camera module acquires depth information of obstacles on the route; and the robot establishes a local map based on the global map and the depth information and re-plans the walking route according to the local map. A planar TOF module is added on top of laser SLAM or visual SLAM: because the TOF camera module can measure the depth of obstacles within its viewing angle, the three-dimensional coordinates of the surroundings can be detected and the obstacle situation in front of the robot can be located, so that when the robot approaches an obstacle it can avoid it in advance and operate without collision.

Description

Robot control method and system based on TOF camera module and robot
Technical Field
The invention relates to the technical field of intelligent robots, in particular to a robot control method and system based on a TOF camera module and a robot.
Background
At present, SLAM robots based on inertial navigation, vision and laser are increasingly popular; the household sweeping robot is a representative example. It locates itself and builds a map of the indoor environment in real time by combining vision, laser, gyroscope, accelerometer and wheel-odometer data, and then navigates according to the built map. However, in a complex obstacle environment the ground often holds movable obstacles such as toys and electric wires; when the robot collides with such an obstacle it either pushes it along or, in the case of wires, becomes entangled in it.
Disclosure of Invention
In order to solve the problems, the invention provides a robot control method and system based on a TOF camera module and a robot. The specific technical scheme of the invention is as follows:
a robot control method based on a TOF camera module specifically comprises the following steps: s1: the robot acquires environmental information and establishes a global map; s2: the robot plans a walking route according to the global map and walks according to the walking route; s3: the TOF camera module acquires depth information of obstacles on a robot walking route; s4: and the robot establishes a local map based on the global map and the depth information and replans a walking route according to the local map.
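The S1-S4 loop can be sketched on a toy occupancy grid. The grid world and the breadth-first planner below are illustrative assumptions, since the patent does not name a planning algorithm.

```python
# A minimal, runnable sketch of steps S1-S4 on a toy occupancy grid.
# The grid world and the breadth-first planner are illustrative
# assumptions; the patent does not specify a planner.
from collections import deque

FREE, OBSTACLE = 0, 1

def plan_route(grid, start, goal):
    """S2/S4: shortest route on a 2-D occupancy grid via breadth-first search."""
    rows, cols = len(grid), len(grid[0])
    prev = {start: None}
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:                      # reconstruct the path
            path = []
            while cell is not None:
                path.append(cell)
                cell = prev[cell]
            return path[::-1]
        r, c = cell
        for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = nxt
            if 0 <= nr < rows and 0 <= nc < cols \
                    and grid[nr][nc] == FREE and nxt not in prev:
                prev[nxt] = cell
                queue.append(nxt)
    return None                               # no route exists

# S1: global map (all free); S2: plan on it.
global_map = [[FREE] * 3 for _ in range(3)]
route = plan_route(global_map, (0, 0), (2, 2))

# S3: the TOF module reports an obstacle at cell (1, 1);
# S4: fuse it into a local map and re-plan around it.
local_map = [row[:] for row in global_map]
local_map[1][1] = OBSTACLE
new_route = plan_route(local_map, (0, 0), (2, 2))
```

The re-planned route reaches the same goal while steering around the newly detected obstacle, which is the essence of step S4.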
In one or more aspects of the present invention, the robot calibrates and optimizes a walking route by acquiring real-time environment information before walking.
In one or more aspects of the invention, when the robot walks according to the walking route, the robot acquires information of the surrounding environment, extracts feature point information of the surrounding environment for matching, detects whether the actual position of the robot is matched with the position on the global map, calibrates the position of the robot on the global map, and optimizes the global map.
In one or more aspects of the present invention, the method for the robot to create the local map in step S4 includes: the TOF camera module obtains the depth of obstacles within its viewing angle, the detected depths being denoted Z0-Zn; the robot establishes a global coordinate system, obtains the position of each obstacle in the robot's global coordinates from the depths detected by the TOF camera module and the module's calibrated parameters, and establishes a local map from this position information and the global map.
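As a hedged illustration of turning the depth readings Z0-Zn into local-map content, the sketch below spreads the readings across the module's field of view and marks each hit point as an obstacle cell of a grid; the evenly-spread bearing model and the 5 cm cell size are assumptions, not the patent's method.

```python
# Hedged sketch: convert TOF depth readings Z0..Zn into obstacle cells
# of a grid-based local map. Spreading the readings evenly across the
# horizontal field of view and the 5 cm cell size are assumptions.
import math

def depths_to_cells(depths, fov_deg, pose, cell=0.05):
    """Mark one obstacle cell per valid depth reading.

    depths: ranges Z0..Zn in metres (None = no return within range)
    pose:   robot pose (x0, y0, heading) in the global frame
    """
    x0, y0, heading = pose
    n = len(depths)
    cells = set()
    for i, z in enumerate(depths):
        if z is None:
            continue
        # bearing of reading i, spread evenly over the field of view
        offset = -fov_deg / 2 + fov_deg * i / (n - 1) if n > 1 else 0.0
        bearing = heading + math.radians(offset)
        x = x0 + z * math.cos(bearing)
        y = y0 + z * math.sin(bearing)
        cells.add((round(x / cell), round(y / cell)))
    return cells
```

Cells marked this way can be overlaid on the global map to form the local map that step S4 re-plans against.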
In one or more aspects of the invention, when the robot acquires the depth information to update the route, the ground trafficability is judged according to the detected information, then the robot makes a behavior decision according to the judgment result, the robot updates the decision result into the local map by combining the depth information received at the same time, and the robot continues to re-plan the route according to the local map.
In one or more aspects of the present invention, the robot makes a behavior decision based on a rank walking method or/and an edge walking method.
In one or more aspects of the invention, when the robot walks in ranks, the obstacle distance and obstacle model obtained by the TOF camera module are used to pre-judge how long the current rank will remain passable, when to decelerate in advance as the robot approaches an obstacle, or/and the type of the obstacle ahead.
In one or more aspects of the invention, when the robot walks along edges, the obstacle distance and obstacle model obtained by the TOF camera module are used to judge whether the obstacle currently being followed is a wall surface, to change the edgewise route according to the obstacle's structure, and to pre-judge the globally best position at which to start edgewise walking or/and the currently best edgewise direction. The robot calibrates and optimizes its route in real time while walking, so that it always stays on the optimal route; when an obstacle interferes, the robot acquires the obstacle's depth information through the TOF camera module, establishes a local map and re-plans the route, avoiding interference with its walking and enhancing its robustness.
A robot control system based on a TOF camera module comprises a controller and, connected to it, a TOF camera module, an information acquisition module, a distance detection module, a gyroscope and an odometer. The TOF camera module acquires the depth information of obstacles ahead; the information acquisition module acquires environmental data to establish or revise the global map; the distance detection module detects the distance between the robot and a wall or obstacle; the gyroscope detects the robot's rotation angle; the odometer measures the robot's travel. By using a TOF camera module to detect small obstacles such as toys and electric wires and take corresponding measures, the control system matches the capability of multi-camera or multi-line laser solutions while greatly reducing manufacturing cost.
The robot control system is provided with a TOF camera module and navigates using the TOF-camera-module-based control method above. While the robot walks, a real-time local map of its surroundings is established; the walking route can then be planned in advance from this local map, obstacles are avoided, and the robot works without collision.
Drawings
FIG. 1 is a flow chart of a robot control method based on a TOF camera module according to the present invention;
FIG. 2 is a schematic view of horizontal and vertical detection angles of TOF camera modules of the robot;
FIG. 3 is a schematic diagram of a robot detecting an obstacle under local coordinates according to the present invention;
FIG. 4 is a schematic flow chart of a robot behavior planning method according to the present invention;
fig. 5 is a schematic structural diagram of a robot according to the present invention.
Detailed Description
Reference will now be made in detail to the embodiments of the present invention, examples of which are illustrated in the accompanying drawings, wherein like reference numerals refer to the same or similar elements or elements having the same or similar functions throughout.
In the description of the present invention, it should be noted that orientation terms such as "central", "lateral", "longitudinal", "length", "width", "thickness", "upper", "lower", "front", "rear", "left", "right", "vertical", "horizontal", "top", "bottom", "inner", "outer", "clockwise" and "counterclockwise" indicate orientations and positional relationships based on those shown in the drawings. They are used only for convenience and simplicity of description; they do not indicate or imply that the device or element referred to must have a specific orientation or be constructed and operated in a specific orientation, and they do not limit the scope of protection of the present invention.
Furthermore, if the terms "first" and "second" are used for descriptive purposes only, they are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features. Thus, a definition of "a first" or "a second" feature may explicitly or implicitly include one or more of the feature, and in the description of the invention, "at least" means one or more unless specifically defined otherwise.
In the present invention, unless otherwise expressly specified or limited, the terms "mounted", "connected" and "coupled" are to be construed broadly: the connection may be fixed, detachable or integral; it may be mechanical; the two elements may be connected directly or through an intermediate medium, or the interiors of the two elements may communicate with each other. Those of ordinary skill in the art can understand the specific meanings of these terms in the present invention according to the specific situation.
In the present invention, unless otherwise specified and limited, a first feature being "above" or "below" a second feature may mean that the two features are in direct contact, or that they are not in direct contact but contact each other through a further feature between them. Moreover, the first feature being "above", "over" or "on top of" the second feature includes the first feature being directly above or obliquely above the second feature, or merely means that the first feature is at a higher level than the second feature; the first feature being "below", "under" or "beneath" the second feature includes the first feature being directly below or obliquely below the second feature, or merely means that the first feature is at a lower level than the second feature.
The technical solution and beneficial effects of the invention are made clearer by the following description of specific embodiments in conjunction with the accompanying drawings. The embodiments described below are exemplary, are intended to illustrate the invention, and are not to be construed as limiting it.
Referring to fig. 1, in a robot control method based on a TOF camera module, TOF stands for Time of Flight: the sensor emits modulated near-infrared light, which is reflected when it meets an object; by calculating the time difference or phase difference between emission and reflection, the sensor converts this into the distance of the photographed scene and generates depth information. Combined with a conventional camera image, the three-dimensional contour of the object can also be rendered as a topographic map in which different colors represent different distances, yielding a 3D model; a TOF camera module is a camera that acquires data using this technology. The method comprises the following steps: S1: the robot acquires environmental information and establishes a global map; S2: the robot plans a walking route according to the global map and walks along it; S3: while the robot walks, the TOF camera module acquires depth information of obstacles on the route; S4: the robot establishes a local map based on the global map and the depth information and modifies the walking route according to the local map. When an obstacle interferes with the robot's walking, the robot acquires its depth information through the TOF camera module, establishes a local map and re-plans the route, avoiding the interference and enhancing the robot's robustness.
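The range relation behind TOF can be written down directly: the sensor measures the round-trip time of the modulated light, so the distance is half of (speed of light times time difference).

```python
# The TOF range equation described above: the sensor measures the
# round-trip time of the modulated light, so the distance to the scene
# is half of (speed of light x time difference).
C = 299_792_458.0  # speed of light in m/s

def tof_distance(round_trip_ns):
    """Distance in metres from a round-trip time given in nanoseconds."""
    return C * (round_trip_ns * 1e-9) / 2.0
```

A round trip of 10 ns corresponds to roughly 1.5 m, the scale of distances a sweeping robot cares about; phase-difference modules recover the same quantity from the phase shift of the modulation rather than timing the pulse directly.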
As one embodiment, the robot calibrates or optimizes the walking route by acquiring real-time environment information and comparing the real-time environment information with a grid map in the robot before walking. When the robot walks according to the walking route, the robot acquires the information of the surrounding environment, extracts the characteristic point information of the surrounding environment for matching, detects whether the actual position of the robot is matched with the position on the global map, and then calibrates the position of the robot on the global map and optimizes the global map.
According to one implementation, the robot establishes a global coordinate system, obtains the position of an obstacle in the robot's global coordinates from the robot's current coordinates, the obstacle depth detected by the TOF camera module and the module's calibrated parameters, and builds a local map from this position information. Fig. 2 is a schematic diagram of the horizontal and vertical detection angles of the TOF camera module. The left diagram shows the horizontal detection angle: Q1 + Q2 is the total horizontal detection angle, with Q1 the detection angle to the left and Q2 the detection angle to the right. The right diagram shows the vertical detection angle: P1 + P2 is the total vertical detection angle, with P1 the upward detection angle and P2 the downward detection angle. As shown in fig. 3, the depth of any object within the module's viewing angle can be measured within the measurement range, and the detected obstacle depths can be denoted Z0-Zn; the obstacle positions in the robot's global coordinates then follow from the robot's current coordinates, the detected depths and the module's calibrated parameters.
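A minimal check of whether a point falls inside the detection cone of fig. 2 might look as follows; the camera-frame axis convention (z forward, x right, y up) is an assumption, since the patent does not fix one.

```python
# Hedged sketch: test whether a point in the camera frame lies inside
# the detection cone of fig. 2, with horizontal half-angles Q1 (left)
# and Q2 (right) and vertical half-angles P1 (up) and P2 (down).
# The axis convention (z forward, x right, y up) is an assumption.
import math

def in_fov(x, y, z, q1, q2, p1, p2):
    """All angles in degrees; returns True if (x, y, z) is detectable."""
    if z <= 0:
        return False                          # behind the sensor
    h = math.degrees(math.atan2(x, z))        # horizontal angle, + right
    v = math.degrees(math.atan2(y, z))        # vertical angle, + up
    return -q1 <= h <= q2 and -p2 <= v <= p1
```

Only points passing this test have meaningful entries among the depth readings Z0-Zn.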
Assuming the current pose of the robot is (X0, Y0, angle0), compute X = (u - u0)/Ku × Z/f and Y = (v - v0)/Kv × Z/f, where (X, Y, Z) are the coordinates of the depth point in the camera coordinate system and (u, v) are the pixel coordinates of its projection onto the imaging plane. Because the depth camera captures a depth image, the gray value of each pixel represents the depth from that position in the environment to the sensor's imaging plane; that is, Z is the value detected by the TOF camera module. 1/Ku and 1/Kv are the width and height of a pixel, f is the focal length of the module, and (u0, v0) are the coordinates of the image center; these five parameters are obtained by internal calibration of the TOF camera module. The formulas above give the coordinates (X, Y) of the obstacle relative to the TOF camera module. Combining the robot's current coordinates (X0, Y0) gives the obstacle's position in global coordinates: X' = X0 + X, Y' = Y0 + Y. The bearing a = atan2(X' - X0, Y' - Y0) of the obstacle relative to the robot then gives its angle in global coordinates, a' = a + angle0, so the final obstacle pose is (X', Y', a'). The robot builds a local map from this obstacle pose combined with the global map and re-plans its route according to the local map. The robot calibrates and optimizes the route in real time while walking, so that it always stays on the optimal walking route.
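The conversion can be implemented directly. Note that the bearing arguments printed in the published text, atan2(X' - X, Y' - Y), reduce to the robot's own position, which looks like a typographical slip; the sketch below uses atan2(X' - X0, Y' - Y0), i.e. the bearing of the obstacle relative to the robot, and should be read as an interpretation rather than the authoritative formula. It also follows the published formulas in translating without rotating (X, Y) by angle0.

```python
# Direct implementation of the pixel-to-global conversion above. The
# published bearing atan2(X' - X, Y' - Y) reduces to atan2(X0, Y0),
# apparently a typographical slip, so atan2(X' - X0, Y' - Y0) is used
# here instead; treat this as an interpretation. Like the published
# formulas, the translation does not rotate (X, Y) by angle0.
import math

def pixel_to_global(u, v, Z, intrinsics, pose):
    """Return the obstacle pose (X', Y', a') in the global frame."""
    u0, v0, Ku, Kv, f = intrinsics        # from internal calibration
    X0, Y0, angle0 = pose                 # current robot pose

    # camera-frame coordinates of the depth point
    X = (u - u0) / Ku * Z / f
    Y = (v - v0) / Kv * Z / f

    # global position of the obstacle
    Xp, Yp = X0 + X, Y0 + Y

    # bearing relative to the robot, then angle in global coordinates
    a = math.atan2(Xp - X0, Yp - Y0)
    return Xp, Yp, a + angle0
```

For example, with image center (320, 240), unit pixel pitch and f = 500, a depth point at pixel (820, 240) with Z = 1 m lands 1 m to the side of a robot at the origin.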
As one implementation, as shown in fig. 4, when the robot acquires depth information to update its route, it first judges ground trafficability from the detected information, then makes a behavior decision according to the result, updates the decision into the local map together with the depth information received at the same time, and re-plans the route according to the local map. The robot makes behavior decisions based on the rank walking method or/and the edgewise walking method: rank walking is the common straight-line, bow-shaped walking of a mobile robot; edgewise walking means the robot follows a wall, taking its edge as the reference. When the robot walks in ranks, it uses the obstacle distance and obstacle model obtained by the TOF camera module together with the rank-walking rules to optimize its decisions: it pre-judges how long the current rank will remain passable from the local map, the walking route and the walking speed; it pre-judges when to decelerate in advance as it approaches an obstacle from the obstacle distance and the walking speed; and it pre-judges the type of the obstacle ahead from the depth information and the route, deciding whether to turn back early along the edge to the next rank or to approach the obstacle, or/and whether the current rank leaves a missed area to the left or right, repairing such gaps in time. When the robot walks along edges, it likewise uses the obstacle distance and obstacle model obtained by the TOF camera module together with the edgewise-walking rules to optimize its decisions.
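The "decelerate in advance" pre-judgement can be illustrated with a constant-deceleration braking model; the deceleration value and safety margin below are assumptions, not figures from the patent.

```python
# Hedged illustration of the "decelerate in advance" pre-judgement:
# with a constant-deceleration braking model, compute how many seconds
# remain before the robot must start slowing in order to stop short of
# the obstacle. The deceleration value and safety margin are assumptions.
def time_until_braking(distance_m, speed_mps, decel_mps2, margin_m=0.05):
    """Seconds before braking must begin; 0.0 means brake immediately."""
    braking_distance = speed_mps ** 2 / (2.0 * decel_mps2)
    slack = distance_m - margin_m - braking_distance
    return max(0.0, slack / speed_mps)
```

For example, at 0.5 m/s with 0.5 m/s² of braking, an obstacle reported 1 m ahead by the TOF camera module leaves 1.4 s before the robot has to start slowing.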
The robot judges whether the obstacle currently being followed is a wall surface from the depth information and the local map, and changes the edgewise route according to the obstacle's structure; it compares the local and global maps to pre-judge the globally best position at which to start edgewise walking or/and the currently best edgewise direction. Planning routes with these two methods keeps the robot's walking route neat and orderly.
Referring to fig. 5, the robot control system based on the TOF camera module comprises a controller and, connected to it, a TOF camera module, an information acquisition module, a distance detection module, a gyroscope and an odometer. The TOF camera module acquires the depth information of obstacles ahead; the information acquisition module acquires environmental data to establish or modify the global map; the distance detection module detects the distance between the robot and a wall or obstacle; the gyroscope detects the robot's rotation angle; the odometer measures the robot's travel. The information acquisition module is a camera module or a laser head; the laser head is a single-line laser head. The distance detection module comprises at least one of an ultrasonic distance sensor, an infrared intensity sensor, an infrared distance sensor, a physical-switch collision sensor, a capacitance-change sensor or a resistance-change sensor. The camera module photographs the robot's surroundings to match feature-point information of the environment in advance and check whether the robot is at the expected position on the planned route, thereby optimizing the robot's position and calibrating an erroneous map; alternatively, a single-line laser rangefinder can scan the environment ahead of time for map correction and repositioning before sweeping, at a production cost far below that of a multi-line laser head.
The machine provided by the invention adds a planar TOF module on top of laser SLAM or visual SLAM and uses it to detect obstacles, achieving collision-free operation.
The robot control system is provided with a TOF camera module and navigates using the TOF-camera-module-based control method above. The TOF camera module 2 is located at the front of the robot main body 1, and the information acquisition module 3 is located in the middle area of the robot. Before walking, the robot establishes a global map and plans a route from it; while walking, it establishes real-time local map information from the global map and the detection data of the TOF camera module, so that the walking route can be planned and modified in advance according to the local map, obstacles are avoided, and the robot does not collide.
In the description of the specification, reference to "one embodiment", "preferably", "an example", "a specific example" or "some examples" means that a particular feature, structure, material or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention; schematic uses of these terms do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. The connection modes described in the specification have evident effects and practical utility.
In view of the above structure and principle, those skilled in the art should understand that the present invention is not limited to the above embodiments; modifications and substitutions based on known technology in the field fall within the scope of the present invention, which is defined by the claims.

Claims (10)

CN202010398887.XA | 2020-05-12 | Robot control method and system based on TOF camera module and robot | Pending | CN111624997A (en)

Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
CN202010398887.XA | 2020-05-12 | 2020-05-12 | Robot control method and system based on TOF camera module and robot — CN111624997A (en)


Publications (1)

Publication Number | Publication Date
CN111624997A (true) | 2020-09-04

Family

ID=72271917

Family Applications (1)

Application Number | Title | Priority Date | Filing Date
CN202010398887.XA (Pending) | CN111624997A (en) | 2020-05-12 | 2020-05-12

Country Status (1)

Country | Link
CN (1) | CN111624997A (en)

Cited By (15)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
CN109571470A (en)* | 2018-12-03 | 2019-04-05 | 江西洪都航空工业集团有限责任公司 | A kind of robot
CN112051853A (en)* | 2020-09-18 | 2020-12-08 | 哈尔滨理工大学 | Intelligent obstacle avoidance system and method based on machine vision
CN112308039A (en)* | 2020-11-25 | 2021-02-02 | 珠海市一微半导体有限公司 | Obstacle segmentation processing method and chip based on TOF camera
CN112347876A (en)* | 2020-10-26 | 2021-02-09 | 珠海市一微半导体有限公司 | Obstacle identification method based on TOF camera and cleaning robot
CN112363513A (en)* | 2020-11-25 | 2021-02-12 | 珠海市一微半导体有限公司 | Obstacle classification and obstacle avoidance control method based on depth information
CN112415998A (en)* | 2020-10-26 | 2021-02-26 | 珠海市一微半导体有限公司 | Obstacle classification and obstacle avoidance control system based on TOF camera
CN112506189A (en)* | 2020-11-19 | 2021-03-16 | 深圳优地科技有限公司 | Method for controlling robot to move
CN112698654A (en)* | 2020-12-25 | 2021-04-23 | 珠海市一微半导体有限公司 | Single-point TOF-based mapping and positioning method, chip and mobile robot
CN112747746A (en)* | 2020-12-25 | 2021-05-04 | 珠海市一微半导体有限公司 | Point cloud data acquisition method based on single-point TOF, chip and mobile robot
CN113029159A (en)* | 2021-04-25 | 2021-06-25 | 上海理工大学 | Indoor mobile robot visual positioning navigation system
CN113787516A (en)* | 2021-08-16 | 2021-12-14 | 深圳优地科技有限公司 | Positioning method and device and robot
CN114167871A (en)* | 2021-12-06 | 2022-03-11 | 北京云迹科技有限公司 | An obstacle detection method, device, electronic device and storage medium
CN114521849A (en)* | 2020-11-20 | 2022-05-24 | 余姚舜宇智能光学技术有限公司 | TOF optical system for sweeping robot and sweeping robot
CN114815809A (en)* | 2022-03-22 | 2022-07-29 | 深圳市杉川机器人有限公司 | Obstacle avoidance method and system for mobile robot, terminal device and storage medium
WO2024212587A1 (en)* | 2023-04-12 | 2024-10-17 | 珠海格力智能装备有限公司 | Method and apparatus for controlling mobile robot, and mobile robot

Citations (9)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
CN104932502A (en)* | 2015-06-04 | 2015-09-23 | 福建天晴数码有限公司 | Short-distance obstacle avoiding method and system based on a three-dimensional depth camera
CN105607635A (en)* | 2016-01-05 | 2016-05-25 | 东莞市松迪智能机器人科技有限公司 | Automatic guided vehicle panoramic optical vision navigation control system and omnidirectional automatic guided vehicle
CN106054900A (en)* | 2016-08-08 | 2016-10-26 | 电子科技大学 | Temporary robot obstacle avoidance method based on a depth camera
CN106227218A (en)* | 2016-09-27 | 2016-12-14 | 深圳乐行天下科技有限公司 | Navigation and obstacle avoidance method and device for an intelligent mobile device
CN107272705A (en)* | 2017-07-31 | 2017-10-20 | 中南大学 | Multiple-neural-network control planning method for robot paths in an intelligent environment
CN107515606A (en)* | 2017-07-20 | 2017-12-26 | 北京格灵深瞳信息技术有限公司 | Robot implementation method, control method, robot and electronic equipment
CN109063575A (en)* | 2018-07-05 | 2018-12-21 | 中国计量大学 | Autonomous orderly mowing method for an intelligent mower based on monocular vision
CN110675457A (en)* | 2019-09-27 | 2020-01-10 | Oppo广东移动通信有限公司 | Positioning method and device, equipment, storage medium
CN110889349A (en)* | 2019-11-18 | 2020-03-17 | 哈尔滨工业大学 | A visual localization method for sparse 3D point cloud images based on VSLAM



Similar Documents

Publication | Title
CN111624997A (en) | Robot control method and system based on TOF camera module and robot
CN112415998B (en) | Obstacle classification and obstacle avoidance control system based on TOF camera
CN112004645B (en) | Intelligent cleaning robot
CN110275538A (en) | Intelligent cruise vehicle navigation method and system
KR100901311B1 (en) | Autonomous platform
CN112346463B (en) | Path planning method for unmanned vehicles based on velocity sampling
CN111693050B (en) | Indoor medium and large robot navigation method based on a building information model
WO2021254367A1 (en) | Robot system and positioning navigation method
CN112518739B (en) | Intelligent autonomous navigation method for reconnaissance by a tracked-chassis robot
CN112183133A (en) | Autonomous charging method for mobile robots based on ArUco code guidance
WO2022111017A1 (en) | TOF-camera-based obstacle classification and obstacle avoidance control method
CN115143964A (en) | Autonomous navigation method for a quadruped robot based on a 2.5D cost map
WO2017149813A1 (en) | Sensor calibration system
JP7133251B2 (en) | Information processing device and mobile robot
RU2740229C1 (en) | Method of localizing and constructing navigation maps of a mobile service robot
US20230071794A1 (en) | Method and system for building a lane-level map by using a 3D point cloud map
CN111602028A (en) | Method for automatically guiding a vehicle along a virtual rail system
CN111258311A (en) | Obstacle avoidance method for an underground mobile robot based on intelligent vision
CN112698654B (en) | Single-point TOF-based map building and positioning method, chip and mobile robot
JP2024103605A (en) | Information processing device, control method, program, and storage medium
CN112308033A (en) | Obstacle collision warning method based on depth data and a visual chip
CN115855068A (en) | Robot path autonomous navigation method and system based on BIM
CN114903374B (en) | Sweeping machine and control method thereof
CN119618188A (en) | Navigation method for an agricultural robot in an orchard
JP7358108B2 (en) | Information processing device, information processing method and program

Legal Events

Code | Title
PB01 | Publication
SE01 | Entry into force of request for substantive examination
CB02 | Change of applicant information

Address after:519000 2706, No. 3000, Huandao East Road, Hengqin new area, Zhuhai, Guangdong

Applicant after:Zhuhai Yiwei Semiconductor Co.,Ltd.

Address before:Room 105-514, No.6 Baohua Road, Hengqin New District, Zhuhai City, Guangdong Province

Applicant before:AMICRO SEMICONDUCTOR Co.,Ltd.

RJ01 | Rejection of invention patent application after publication

Application publication date: 2020-09-04

