CN115191884B - Mobile control method, device, cleaning robot and storage medium - Google Patents

Mobile control method, device, cleaning robot and storage medium

Info

Publication number
CN115191884B
CN115191884B
Authority
CN
China
Prior art keywords
cleaning robot
base station
determining
identification area
scanning
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110384482.5A
Other languages
Chinese (zh)
Other versions
CN115191884A (en)
Inventor
苟建松
李松
周新伟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Midea Robozone Technology Co Ltd
Original Assignee
Midea Robozone Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Midea Robozone Technology Co Ltd
Priority to CN202110384482.5A
Priority to PCT/CN2022/079029 (WO2022213749A1)
Publication of CN115191884A
Application granted
Publication of CN115191884B
Legal status: Active
Anticipated expiration

Abstract

The application discloses a movement control method, a movement control device, a cleaning robot, and a storage medium. The method comprises: obtaining scanning data from a laser radar of a cleaning robot and image data collected by an image collecting unit of the cleaning robot; determining a first position of an identification area on a base station relative to the cleaning robot based on the scanning data, and determining a second position of the identification area relative to the cleaning robot based on the image data, wherein the identification area contains reflective material; and controlling the cleaning robot to move towards the base station based on the first position and the second position, so that the cleaning robot returns to the base station position.

Description

Mobile control method, device, cleaning robot and storage medium
Technical Field
The present application relates to the field of computer technologies, and in particular, to a movement control method, a movement control device, a cleaning robot, and a storage medium.
Background
With the development of technology, some mobile robots, such as cleaning robots (e.g., sweepers and mopping machines), need to return to a base station automatically. Therefore, how to enable the cleaning robot to detect the base station and automatically return to the base station position is a problem to be solved.
Disclosure of Invention
In order to solve the related technical problems, the embodiment of the application provides a mobile control method, a mobile control device, a cleaning robot and a storage medium.
The technical scheme of the embodiment of the application is realized as follows:
the embodiment of the application provides a mobile control method, which comprises the following steps:
acquiring scanning data of a laser radar of a cleaning robot and image data acquired by an image acquisition unit of the cleaning robot;
determining a first position of an identification area on a base station relative to the cleaning robot based on the scan data, and determining a second position of the identification area relative to the cleaning robot based on the image data;
And controlling the cleaning robot to move towards the base station based on the first position and the second position so as to return the cleaning robot to the base station position.
In the above scheme, the scanning data comprises reflective intensity information of a plurality of scanning points, the identification area comprises at least two reflective materials with different reflective intensities, and the determining the first position of the identification area on the base station relative to the cleaning robot based on the scanning data comprises the following steps:
Determining a plurality of first target scanning points from the plurality of scanning points based on the reflective intensity information of the plurality of scanning points, wherein the difference between the reflective intensity of each first target scanning point and the reflective intensity of the next scanning point of the first target scanning point is greater than or equal to a first threshold value;
the first location is determined based on the determined plurality of first target scan points.
In the above scheme, the scanning data further comprises distance information between a plurality of scanning points and the cleaning robot, at least two reflective materials of the identification area are not located on the same plane, and the determining the first position based on the determined first target scanning points comprises the following steps:
Determining a third position of the identification area relative to the cleaning robot based on the plurality of first target scan points;
determining a plurality of second target scanning points from the plurality of scanning points based on distance information of the plurality of scanning points and the cleaning robot, wherein the difference between the distance between each second target scanning point and the cleaning robot and the distance between the next scanning point and the cleaning robot is greater than or equal to a second threshold value and less than or equal to a third threshold value;
determining a fourth position of the identification area relative to the cleaning robot based on the plurality of second target scan points;
the first position is determined using the third position and the fourth position.
In the above aspect, the determining, based on the image data, the second position of the identification area relative to the cleaning robot includes:
and determining the second position based on the image data and combining the preset appearance characteristic data of the identification area.
In the above aspect, the controlling the cleaning robot to move toward the base station based on the first position and the second position includes:
determining a fifth position of the base station relative to the cleaning robot based on the first position and the second position in combination with a preset compensation function;
Controlling the cleaning robot to move toward the base station using the fifth position, wherein,
The compensation function is generated by using the relative position relationship between the geometric center of the identification area and the geometric center of the base station.
In the above solution, when the cleaning robot is controlled to move to the base station, the method further includes:
Periodically acquiring scanning data of the laser radar and image data acquired by the image acquisition unit;
Updating the first location based on the acquired scan data, and updating the second location based on the acquired image data;
and controlling the cleaning robot to move towards the base station based on the updated first position and second position.
In the above scheme, the method further comprises:
after the cleaning robot and the base station are successfully docked, detecting whether the cleaning robot is in a charging state or not;
and determining that the base station has failed when the cleaning robot is not in a charging state, and sending out fault information, wherein the fault information is used to indicate that the base station has failed.
The embodiment of the application also provides a mobile control device, which comprises:
an acquisition unit for acquiring scanning data of a laser radar of a cleaning robot and image data acquired by an image acquisition unit of the cleaning robot;
a first processing unit for determining a first position of an identification area on a base station relative to the cleaning robot based on the scanning data, and determining a second position of the identification area relative to the cleaning robot based on the image data;
and a second processing unit for controlling the cleaning robot to move toward the base station based on the first position and the second position so as to return the cleaning robot to the base station position.
The embodiment of the application also provides a cleaning robot, which comprises a processor and a memory for storing a computer program capable of running on the processor,
Wherein the processor is configured to execute the steps of any of the methods described above when the computer program is run.
The embodiment of the application also provides a storage medium, on which a computer program is stored, which when executed by a processor, implements the steps of any of the methods described above.
The movement control method, device, cleaning robot and storage medium provided by the embodiments of the application acquire scanning data of a laser radar of the cleaning robot and image data acquired by an image acquisition unit of the cleaning robot; determine a first position of an identification area on a base station relative to the cleaning robot based on the scanning data, and a second position of the identification area relative to the cleaning robot based on the image data, wherein the identification area contains reflective material; and control the cleaning robot to move towards the base station based on the first position and the second position, so that the cleaning robot returns to the base station position. In this scheme, the position of the identification area containing the reflective material on the base station relative to the cleaning robot is determined based on the scanning data of the laser radar of the cleaning robot and the image data acquired by the image acquisition unit, and the cleaning robot is controlled to move towards the base station based on the determined position.
Drawings
FIG. 1 is a flow chart of a mobile control method according to an embodiment of the application;
FIG. 2 is a schematic diagram of a structure of an identification area according to an embodiment of the present application;
FIG. 3 is a graph showing the variation of the reflection intensity of the scanning spot according to the embodiment of the application;
FIG. 4 is a schematic diagram of another configuration of an identification area according to an embodiment of the present application;
FIG. 5 is a schematic diagram of a mobile control device according to an embodiment of the present application;
Fig. 6 is a schematic structural view of a cleaning robot according to an embodiment of the present application.
Detailed Description
The present application will be described in further detail with reference to the accompanying drawings and examples.
In order to enable the cleaning robot to detect the base station and automatically return to the base station position, one option is to place a product identifier such as a bar code or a two-dimensional code on the base station; when the cleaning robot needs to return to the base station position, it can determine the position of the base station relative to itself by detecting the product identifier and then move to the base station automatically. However, although such product identifiers have distinctive appearance characteristics, carry rich product information, are usually easy to identify, and have an extremely low false detection rate, this approach is limited by the structure of the cleaning robot and by the resolution and measurement range of the sensors installed on it, so the scheme is difficult to implement.
In practical application, the cleaning robot may also detect the position of the base station relative to itself based on the geometric features of the base station (i.e., the shape of the base station). For example, the cleaning robot may scan its environment with the laser radar and determine the position of the base station relative to itself by detecting geometric features, such as a plane or a circular arc (curved surface), in the obtained scan data. However, since the cleaning robot may be in a relatively complex environment, the accuracy of detecting the position of the base station with this scheme may be low.
In practical application, it is also possible to install infrared sensors on both the cleaning robot and the base station and to guide the cleaning robot to dock with the base station, i.e., to return to the base station, through the exchange of infrared signals between them. However, the consistency and stability of the infrared sensors during production, and their consistency during installation, are difficult to guarantee; during use, the infrared signals are prone to abnormal conditions such as interference and reflection, and the risk of damaging the devices is relatively high.
Based on the above, in various embodiments of the present application, a laser radar and an image acquisition unit are installed in the cleaning robot; the position of an identification area containing reflective material on the base station relative to the cleaning robot is determined based on the scan data of the laser radar and the image data acquired by the image acquisition unit, and the cleaning robot is controlled to move towards the base station based on the determined position. In this way, when the cleaning robot needs to return to the base station position, it can detect the base station and return automatically, which improves user experience. At the same time, the process of automatically returning to the base station position does not require guidance from the base station; in other words, no signal-transmitting sensor needs to be installed on the base station, which reduces the production cost of the base station of the cleaning robot.
An embodiment of the present application provides a movement control method, as shown in fig. 1, including:
Step 101, acquiring scanning data of a laser radar of a cleaning robot and image data acquired by an image acquisition unit of the cleaning robot;
step 102, determining a first position of an identification area on a base station relative to the cleaning robot based on the scanning data, and determining a second position of the identification area relative to the cleaning robot based on the image data;
here, the identification area comprises a retroreflective material;
and 103, controlling the cleaning robot to move towards the base station based on the first position and the second position so as to return the cleaning robot to the base station position.
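For illustration only, the three steps above can be summarised in a minimal Python sketch; the interface names (robot.lidar.scan(), robot.camera.capture(), the position-estimation helpers and robot.move_to()) are assumptions made for this sketch and are not part of the disclosure.

```python
# Hedged sketch of steps 101-103. All interfaces are assumed placeholders.
def return_to_base_station(robot, estimate_position_from_scan,
                           estimate_position_from_image, fuse_positions):
    # Step 101: acquire lidar scan data and image data.
    scan_data = robot.lidar.scan()        # 360-degree scan of the environment
    image_data = robot.camera.capture()   # panoramic image of the environment

    # Step 102: locate the reflective identification area on the base station.
    first_position = estimate_position_from_scan(scan_data)     # lidar-based estimate
    second_position = estimate_position_from_image(image_data)  # vision-based estimate

    # Step 103: combine both estimates and drive towards the base station.
    target = fuse_positions(first_position, second_position)
    robot.move_to(target)
```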
Here, after the cleaning robot returns to the base station position, the base station may perform maintenance on the cleaning robot, and may specifically include at least one of the following:
charging the cleaning robot;
performing fault detection on the cleaning robot;
performing fault repair on the cleaning robot;
cleaning the cleaning robot;
replacing designated parts of the cleaning robot.
Of course, in practical application, the maintenance mode of the cleaning robot may be other modes, which is not limited by the embodiment of the present application.
In practical application, the cleaning robot can be a sweeper, a mopping machine and the like.
In addition, in various embodiments of the present application, the position may be understood as coordinates in a laser radar coordinate system, and may be represented as a two-dimensional space pose (x, y, theta), or may be information that can represent the position, such as coordinates of a map provided in the cleaning robot, coordinates of a map generated by the cleaning robot, or the like.
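For illustration, such a two-dimensional pose (x, y, theta) in the laser radar coordinate system could be held in a small structure like the following sketch; the field names and units are assumptions, not part of the disclosure.

```python
from dataclasses import dataclass
import math

@dataclass
class Pose2D:
    """Pose of a target in the lidar coordinate system of the cleaning robot (assumed layout)."""
    x: float      # forward offset, e.g. in metres
    y: float      # lateral offset, e.g. in metres
    theta: float  # heading angle, in radians

    def range_to(self) -> float:
        """Straight-line distance from the robot to the target."""
        return math.hypot(self.x, self.y)
```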
In step 101, in practical application, the laser radar may be a single-line laser radar, and the image acquisition unit may be a camera. Of course, other types of lidar (e.g., multi-line lidar) and image acquisition units (e.g., three-dimensional (3D) cameras) may be provided as desired, and embodiments of the application are not limited in this regard.
In practical application, the cleaning robot can scan the environment through the laser radar over 360 degrees to obtain the scanning data, and can control the image acquisition unit to acquire a panoramic image of the environment while the cleaning robot rotates in place, so as to obtain the image data.
In practical application, the scanning data may be acquired first, the first position may be determined based on the scanning data, and then the cleaning robot may be controlled to adjust the pose based on the first position, so that the cleaning robot may acquire the image data including the image information of the base station through the image acquisition unit, acquire the image data, and determine the second position based on the image data.
In step 102, in actual application, the structure of the identification area and the type of reflective material it contains may be set according to the appearance and material of the base station, the application scenario of the cleaning robot, the characteristics (i.e., types) of the laser radar and the image acquisition unit of the cleaning robot, and so on. For example, to improve the accuracy of the determined first position: if the housing of the base station contains a first reflective material, a second reflective material whose reflective intensity is higher than that of the first reflective material may be disposed at intervals in the identification area; if the housing contains the second reflective material, the first reflective material may be disposed at intervals in the identification area; and if the housing contains a third reflective material whose reflective intensity is higher than that of the first reflective material and lower than that of the second, the first reflective material and the second reflective material may be disposed at intervals in the identification area. For another example, when the cleaning robot is a floor mopping machine, a waterproof reflective material needs to be used in the identification area.
In practical application, the shape and size of the reflective material can be set according to the size of the base station, the characteristics (i.e. types) of the laser radar and the image acquisition unit of the cleaning robot, and the like. Illustratively, as shown in fig. 2, square-shaped first reflective materials and square-shaped second reflective materials may be disposed at intervals in the identification area, and the widths of the square-shaped reflective materials may be the same or different.
Based on this, in one embodiment, the scan data may include reflective intensity information for a plurality of scan points;
accordingly, the determining, based on the scan data, a first position of an identification area on a base station relative to the cleaning robot may include:
Determining a plurality of first target scanning points from the plurality of scanning points based on the reflective intensity information of the plurality of scanning points, wherein the difference between the reflective intensity of each first target scanning point and the reflective intensity of the next scanning point of the first target scanning point is greater than or equal to a first threshold value;
the first location is determined based on the determined plurality of first target scan points.
Here, the scanning points refer to all points scanned by the laser radar in the environment where the cleaning robot is located; in other words, points on object surfaces in that environment that reflect the laser.
In practical application, the difference between the reflective intensity of each first target scanning point and the reflective intensity of the next scanning point is greater than or equal to a first threshold, which is understood to be that the reflective intensity difference between at least two reflective materials contained in the identification area is greater than or equal to the first threshold. In other words, the first target scan point may be understood as a scan point on the boundary of each retroreflective material in the identified area.
In practical application, the first threshold value can be set according to requirements.
In practical application, the strong-to-weak and weak-to-strong change edges of the reflection intensity across the scanning points in the scan data can be detected by means of a preset pattern matching algorithm or other methods, so as to obtain the plurality of first target scanning points. Illustratively, as shown in fig. 3, where the horizontal axis represents the index of the scanning point and the vertical axis represents its reflection intensity, 4 reflection-intensity change edges, i.e., 4 first target scanning points, can be determined.
In practical application, after the plurality of first target scanning points are determined, the position of each first target scanning point relative to the cleaning robot can be determined, and then the first position can be determined according to the position of each first target scanning point relative to the cleaning robot.
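As a hedged illustration of this intensity-edge detection (the scan-point data layout and the concrete threshold are assumptions), the first target scanning points could be picked out as follows; the returned indices mark the boundaries between reflective materials of different intensity.

```python
def find_first_target_points(scan_points, first_threshold):
    """Return indices of scan points whose reflection intensity differs from that of
    the next scan point by at least first_threshold (intensity change edges).
    Each scan point is assumed to expose an `intensity` attribute."""
    targets = []
    for i in range(len(scan_points) - 1):
        if abs(scan_points[i].intensity - scan_points[i + 1].intensity) >= first_threshold:
            targets.append(i)
    return targets
```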
In practical application, in order to further improve the accuracy of the determined first position, the at least two reflective materials with different reflective intensities contained in the identification area may be disposed on different planes; in other words, the at least two reflective materials are staggered along the horizontal depth direction of the base station. Illustratively, as shown in fig. 4, the first reflective material and the second reflective material may be disposed as squares at intervals in the identification area; viewed from the front of the identification area, the widths of the square reflective materials may be the same or different, and viewed from the top, the first reflective material and the second reflective material are not in the same plane. In this case, in the scan data, the distance between a scan point on the second reflective material and the cleaning robot is greater than the distance between a scan point on the first reflective material and the cleaning robot.
Based on this, in an embodiment, the scan data may further include distance information between a plurality of scan points and the cleaning robot;
The determining the first location based on the determined plurality of first target scan points may include:
Determining a third position of the identification area relative to the cleaning robot based on the plurality of first target scan points;
determining a plurality of second target scanning points from the plurality of scanning points based on distance information of the plurality of scanning points and the cleaning robot, wherein the difference between the distance between each second target scanning point and the cleaning robot and the distance between the next scanning point and the cleaning robot is greater than or equal to a second threshold value and less than or equal to a third threshold value;
determining a fourth position of the identification area relative to the cleaning robot based on the plurality of second target scan points;
the first position is determined using the third position and the fourth position.
In practical application, if it is determined that at least two reflective materials of the identification area are located on the same plane according to the scan data, that is, it is determined that the distance between each first target scan point and the cleaning robot is the same, the first position may be determined directly by using the plurality of first target scan points. If it is determined that at least two reflective materials of the identification area are not located on the same plane according to the scan data, that is, it is determined that the distances between each first target scan point and the cleaning robot are different, accuracy verification needs to be performed on the plurality of first target scan points, that is, a plurality of second target scan points are determined.
Here, the second target scan points can also be understood as scan points on the boundaries of the reflective materials in the identification area; if the calculation is accurate, the plurality of first target scan points coincide with the plurality of second target scan points. If the two sets of points do not coincide, this indicates a calculation error, and therefore the first position needs to be determined by combining the plurality of first target scan points and the plurality of second target scan points.
In practical application, the difference between the distance between each second target scanning point and the cleaning robot and the distance between the next scanning point and the cleaning robot is greater than or equal to a second threshold value and less than or equal to a third threshold value, and it can be understood that the difference between the distance between at least two reflective materials contained in the identification area and the cleaning robot is greater than or equal to the second threshold value and less than or equal to the third threshold value.
In practical application, the second threshold and the third threshold may be set according to requirements.
In practical application, in the process of determining the first position by combining the plurality of first target scanning points and the plurality of second target scanning points, the third position can be determined firstly based on the position of each first target scanning point relative to the cleaning robot, then the fourth position can be determined based on the position of each second target scanning point relative to the cleaning robot, and finally the first position can be determined by utilizing the third position and the fourth position and combining a pre-trained deep learning model, a pre-set filtering algorithm, averaging and other modes.
In practical application, in order to further improve the accuracy of the determined first position, after the third position and the fourth position are determined, geometric features of the identification area, such as width information, can be extracted based on the scan data, then a sixth position of the identification area relative to the cleaning robot is determined based on the geometric features of the identification area, and the first position is determined by using the third position, the fourth position and the sixth position and combining a pre-trained deep learning model, a pre-set filtering algorithm, an averaging method and the like.
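A minimal sketch of the distance-based verification and of a simple fusion step (averaging) follows; the patent also allows a pre-trained deep learning model or a filtering algorithm instead of averaging, and the `distance` attribute and the Pose2D structure from the earlier sketch are assumptions.

```python
def find_second_target_points(scan_points, second_threshold, third_threshold):
    """Return indices where the range jump to the next scan point lies within
    [second_threshold, third_threshold], i.e. the depth steps between reflective
    materials that are not on the same plane."""
    targets = []
    for i in range(len(scan_points) - 1):
        jump = abs(scan_points[i].distance - scan_points[i + 1].distance)
        if second_threshold <= jump <= third_threshold:
            targets.append(i)
    return targets


def fuse_positions(*positions):
    """Fuse the third and fourth (and optionally sixth) position estimates by averaging.
    The naive angle average is adequate only for this sketch."""
    xs = [p.x for p in positions]
    ys = [p.y for p in positions]
    thetas = [p.theta for p in positions]
    return Pose2D(sum(xs) / len(xs), sum(ys) / len(ys), sum(thetas) / len(thetas))
```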
In practical application, the appearance characteristic data of the identification area may be preset in the cleaning robot, and the second position may be determined based on the appearance characteristic data.
Based on this, in an embodiment, the determining the second position of the identification area with respect to the cleaning robot based on the image data may include:
and determining the second position based on the image data and combining the preset appearance characteristic data of the identification area.
In practical application, the image data (which may include data such as shape and color of each reflective material) of the identification region may be used to train a deep learning model, and appearance characteristic data of the identification region may be obtained based on the deep learning model. Of course, the appearance characteristic data of the identification area may be determined in other manners according to the requirement, which is not limited by the embodiment of the present application.
In practical application, the second position can be determined by means of a pre-trained deep learning model, a pre-set feature matching algorithm and the like based on the image data and combining with the pre-set appearance feature data of the identification area.
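As an illustrative sketch only, OpenCV template matching can stand in for the "preset feature matching algorithm"; this simplified version yields only a rough bearing to the identification area rather than a full second position, and the template image, acceptance threshold and field-of-view value are assumptions.

```python
import cv2

def estimate_bearing_from_image(image, template, horizontal_fov_rad):
    """Locate the identification-area template in the image and convert the pixel
    offset of its centre into an approximate bearing relative to the optical axis."""
    result = cv2.matchTemplate(image, template, cv2.TM_CCOEFF_NORMED)
    _, score, _, top_left = cv2.minMaxLoc(result)
    if score < 0.7:                 # assumed acceptance threshold
        return None                 # identification area not found in the image
    centre_x = top_left[0] + template.shape[1] / 2.0
    image_width = image.shape[1]
    return (centre_x / image_width - 0.5) * horizontal_fov_rad
```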
In step 103, in practical applications, the location of the identification area (i.e. the geometric center of the identification area) on the base station may correspond to the geometric center of the base station, in other words, the identification area may be centrally and symmetrically arranged based on a central axis of the housing of the base station. At this time, after the first position and the second position are determined, the first position and the second position can be directly utilized, and a fifth position of the base station (i.e. the geometric center of the base station) relative to the cleaning robot is determined by combining a pre-trained deep learning model, a pre-set filtering algorithm, an averaging method and the like, and the cleaning robot is controlled to move towards the base station by utilizing the fifth position.
In practical applications, the identification area may not be symmetrically arranged on the basis of a central axis of the housing of the base station, but be arranged at other positions which can be scanned by the laser radar and do not correspond to the geometric center of the base station (for example, asymmetrically arranged on the basis of a central axis of the housing of the base station), and at this time, when the cleaning robot is controlled to move towards the base station, the deviation of the geometric center of the identification area relative to the geometric center of the base station needs to be compensated.
Based on this, in an embodiment, the controlling the cleaning robot to move toward the base station based on the first position and the second position may include:
determining a fifth position of the base station relative to the cleaning robot based on the first position and the second position in combination with a preset compensation function;
Controlling the cleaning robot to move toward the base station using the fifth position, wherein,
The compensation function is generated by using the relative position relationship between the geometric center of the identification area and the geometric center of the base station.
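A hedged sketch of such a compensation step is shown below; it assumes the offset between the geometric centre of the identification area and the geometric centre of the base station is a known design constant of the base station, expressed in the base station's own frame, and reuses the Pose2D sketch above.

```python
import math

def compensate(identification_centre, base_heading, offset_forward, offset_lateral):
    """Shift the estimated centre of the identification area by the known offset to
    obtain the geometric centre of the base station (the fifth position).
    offset_forward / offset_lateral are assumed design constants of the base station."""
    dx = offset_forward * math.cos(base_heading) - offset_lateral * math.sin(base_heading)
    dy = offset_forward * math.sin(base_heading) + offset_lateral * math.cos(base_heading)
    return Pose2D(identification_centre.x + dx,
                  identification_centre.y + dy,
                  identification_centre.theta)
```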
In practical application, in order to improve the accuracy of docking between the cleaning robot and the base station, the position of the base station relative to the cleaning robot can be detected in real time while the cleaning robot moves towards the base station, so that this position is continuously updated and converges to a stable result, allowing the cleaning robot to dock with the base station more accurately.
Based on this, in an embodiment, when controlling the cleaning robot to move toward the base station, the method may further include:
Periodically acquiring scanning data of the laser radar and image data acquired by the image acquisition unit;
Updating the first location based on the acquired scan data, and updating the second location based on the acquired image data;
and controlling the cleaning robot to move towards the base station based on the updated first position and second position.
In practical application, the period for acquiring the scan data of the laser radar and the image data acquired by the image acquisition unit may be set according to requirements, for example, 100 milliseconds (ms).
Specifically, in practical application, when the identification area is arranged centrally and symmetrically about a central axis of the housing of the base station, controlling the cleaning robot to move towards the base station based on the updated first position and the updated second position may include: determining an updated fifth position using the updated first and second positions in combination with a pre-trained deep learning model, a preset filtering algorithm, averaging, or the like, and controlling the cleaning robot to move towards the base station using the updated fifth position. When the identification area is not arranged centrally and symmetrically about a central axis of the housing of the base station, it may include: determining an updated fifth position based on the updated first and second positions in combination with the preset compensation function, and controlling the cleaning robot to move towards the base station using the updated fifth position.
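The periodic update described above can be summarised in the following control-loop sketch; the 100 ms period is only the example value mentioned above, and robot.is_docked(), robot.step_towards() and the estimation helpers are assumed interfaces carried over from the earlier sketches.

```python
import time

UPDATE_PERIOD_S = 0.1   # example value: 100 ms

def move_towards_base_station(robot, estimate_position_from_scan,
                              estimate_position_from_image, fuse_positions,
                              compensation=None):
    """Re-estimate the base station position every period until docking succeeds."""
    while not robot.is_docked():
        scan_data = robot.lidar.scan()
        image_data = robot.camera.capture()
        first_position = estimate_position_from_scan(scan_data)
        second_position = estimate_position_from_image(image_data)
        fifth_position = fuse_positions(first_position, second_position)
        if compensation is not None:          # identification area not centred on the housing
            fifth_position = compensation(fifth_position)
        robot.step_towards(fifth_position)    # one path-planning / obstacle-avoidance step
        time.sleep(UPDATE_PERIOD_S)
```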
In practical application, after the fifth position is determined, the cleaning robot can determine a moving path by using a preset path planning algorithm, and move towards the base station based on the determined moving path.
In practical application, an obstacle avoidance algorithm can be preset based on the shape of the base station housing (for example, a base station with a U-shaped housing), and the cleaning robot can be controlled to move to the base station based on this algorithm, so that the cleaning robot can finely adjust its docking direction left and right and avoid colliding with the edge of the base station, which could damage the housing or cause docking to fail.
In practical application, the base station may have a fault, such as a loose power socket, and be unable to maintain the cleaning robot. By detecting its own state, the cleaning robot can determine whether it is actually being maintained by the base station, and thus judge whether the base station has failed.
Based on this, in an embodiment, the method may further include:
after the cleaning robot and the base station are successfully docked, detecting whether the cleaning robot is in a charging state or not;
and determining that the base station has failed when the cleaning robot is not in a charging state, and sending out fault information, wherein the fault information is used to indicate that the base station has failed.
In practical application, the manner of determining that the cleaning robot and the base station are successfully docked may be set according to requirements, for example, by determining that the cleaning robot and the base station are successfully docked by using sensors such as a laser radar and an image acquisition unit, which is not limited in the embodiment of the present application.
In practical application, the manner of detecting whether the cleaning robot is in the charging state may also be set according to requirements, for example, whether the battery of the cleaning robot has an input current, which is not limited in the embodiment of the present application.
In practical application, the fault information can be sent to a user terminal to prompt the user that the base station has a fault, and of course, the fault information can also be sent to an indicator light control unit of the cleaning robot to prompt the user that the base station has a fault in the form of an indicator light. The device specifically receiving the fault information may be set according to requirements, which is not limited in the embodiment of the present application.
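A hedged sketch of this post-docking check follows; robot.is_charging(), robot.notify_user() and robot.set_indicator_light() are assumed interfaces, and the settling delay is an arbitrary example.

```python
import time

def check_base_station_after_docking(robot, settle_s=5.0):
    """After a successful docking, verify that charging has started; if it has not,
    treat this as a base station fault and report it."""
    time.sleep(settle_s)                      # allow the charging contacts to engage
    if not robot.is_charging():               # e.g. no input current at the battery
        message = "Base station fault: docking succeeded but charging did not start."
        robot.notify_user(message)            # push the fault information to the user terminal
        robot.set_indicator_light("fault")    # or signal the fault via an indicator light
```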
In practical application, other signal-emitting sensors, such as infrared sensors, can additionally be installed on the cleaning robot and the base station respectively, and the cleaning robot can be controlled to return to the base station with the help of these sensors. Of course, after the cleaning robot successfully docks with the base station based on the scan data and the image data, if the cleaning robot determines that it did not receive any signal (such as an infrared signal) sent by the base station during the entire process of returning to the base station position, it may determine that the other signal-emitting sensors (such as infrared sensors) on the cleaning robot itself and/or on the base station have failed, and may send out fault information to indicate this.
The movement control method provided by the embodiments of the application acquires scanning data of a laser radar of a cleaning robot and image data collected by an image acquisition unit of the cleaning robot; determines a first position of an identification area on a base station relative to the cleaning robot based on the scanning data, and a second position of the identification area relative to the cleaning robot based on the image data, wherein the identification area contains reflective material; and controls the cleaning robot to move towards the base station based on the first position and the second position, so that the cleaning robot returns to the base station position. According to this scheme, the position of the identification area containing the reflective material on the base station relative to the cleaning robot is determined based on the scanning data of the laser radar of the cleaning robot and the image data acquired by the image acquisition unit, and the cleaning robot is controlled to move towards the base station based on the determined position. When the cleaning robot needs to return to the base station position, it can detect the base station and return automatically, which improves user experience. At the same time, the process of automatically returning to the base station position does not require guidance from the base station; in other words, no signal-transmitting sensor needs to be installed on the base station, and the cleaning robot can return automatically with only a low-cost identification area containing reflective material arranged on the base station, which reduces the production cost of the base station. In addition, because the reflective material is not easily damaged, the damage rate of the identification area is low, which extends the service life of the base station of the cleaning robot and further improves user experience.
In addition, the mobile control method provided by the embodiment of the application can enable the cleaning robot to identify the base stations of different types (different identification areas can be set by the base stations of different types). Meanwhile, the cleaning robot moves to the base station based on a preset path planning algorithm and an obstacle avoidance algorithm, and can accurately, smoothly and collision-free dock with the base station (namely successfully return to the base station position). Further, by judging the state of the cleaning robot, the cleaning robot can detect whether or not the cleaning robot is receiving maintenance of the base station, and can determine whether or not the base station has failed.
In practical application, some cleaning robots, such as sweepers, are already equipped with laser radars and image acquisition units. The mobile control method provided by the embodiments of the application does not require modifying the hardware structure of such a cleaning robot, which saves hardware modification costs; it can be realized through software control alone, so the scheme is simple to implement and highly extensible.
In order to implement the method according to the embodiment of the present application, the embodiment of the present application further provides a movement control device, which is disposed on the cleaning robot, as shown in fig. 5, and the device includes:
An acquisition unit 501, configured to acquire scan data of a laser radar of a cleaning robot and image data acquired by an image acquisition unit of the cleaning robot;
a first processing unit 502 for determining a first position of an identification area on a base station relative to the cleaning robot based on the scan data, and determining a second position of the identification area relative to the cleaning robot based on the image data;
And a second processing unit 503 for controlling the cleaning robot to move toward the base station based on the first position and the second position so as to return the cleaning robot to the base station position.
In one embodiment, the scanning data includes reflective intensity information of a plurality of scanning points, the identification area includes at least two reflective materials with different reflective intensities, and the first processing unit 502 is specifically configured to:
Determining a plurality of first target scanning points from the plurality of scanning points based on the reflective intensity information of the plurality of scanning points, wherein the difference between the reflective intensity of each first target scanning point and the reflective intensity of the next scanning point of the first target scanning point is greater than or equal to a first threshold value;
the first location is determined based on the determined plurality of first target scan points.
In an embodiment, the scan data further includes distance information between a plurality of scan points and the cleaning robot, at least two reflective materials of the marking area are not located on the same plane, and the first processing unit 502 is specifically configured to:
Determining a third position of the identification area relative to the cleaning robot based on the plurality of first target scan points;
determining a plurality of second target scanning points from the plurality of scanning points based on distance information of the plurality of scanning points and the cleaning robot, wherein the difference between the distance between each second target scanning point and the cleaning robot and the distance between the next scanning point and the cleaning robot is greater than or equal to a second threshold value and less than or equal to a third threshold value;
determining a fourth position of the identification area relative to the cleaning robot based on the plurality of second target scan points;
the first position is determined using the third position and the fourth position.
In an embodiment, the first processing unit 502 is specifically configured to determine the second position based on the image data in combination with preset appearance feature data of the identification area.
In an embodiment, the second processing unit 503 is specifically configured to:
determining a fifth position of the base station relative to the cleaning robot based on the first position and the second position in combination with a preset compensation function;
Controlling the cleaning robot to move toward the base station using the fifth position, wherein,
The compensation function is generated by using the relative position relationship between the geometric center of the identification area and the geometric center of the base station.
In an embodiment, when the second processing unit 503 controls the cleaning robot to move toward the base station, the acquiring unit 501 is further configured to periodically acquire the scan data of the laser radar and the image data acquired by the image acquiring unit;
the first processing unit 502 is further configured to update the first location based on the acquired scan data, and update the second location based on the acquired image data;
The second processing unit 503 is further configured to control the cleaning robot to move toward the base station based on the updated first position and the updated second position.
In one embodiment, the apparatus further comprises:
the detection unit is used for detecting whether the cleaning robot is in a charging state or not after the cleaning robot is successfully docked with the base station;
The sending unit is used for determining the base station fault when the detecting unit detects that the cleaning robot is not in a charging state, and sending fault information which is used for prompting the base station to be faulty.
In practical applications, the acquiring unit 501, the first processing unit 502, the second processing unit 503 and the detecting unit may be implemented by a processor in the mobile control device, and the transmitting unit may be implemented by a processor in the mobile control device in combination with a communication interface.
It should be noted that, when the movement control device provided in the above embodiment controls the movement of the cleaning robot, only the division of the program modules is used for illustration, and in practical application, the process allocation may be performed by different program modules according to needs, i.e. the internal structure of the device is divided into different program modules to complete all or part of the processes described above. In addition, the mobile control device and the mobile control method provided in the foregoing embodiments belong to the same concept, and specific implementation processes of the mobile control device and the mobile control method are detailed in the method embodiments and are not repeated herein.
Based on the hardware implementation of the program modules, and in order to implement the method according to the embodiment of the present application, the embodiment of the present application further provides a cleaning robot, as shown in fig. 6, the cleaning robot 600 includes:
The communication interface 601 is capable of performing information interaction with other electronic devices;
A processor 602, connected to the communication interface 601, for implementing information interaction with other electronic devices, and configured to execute the methods provided by one or more of the above technical solutions when running a computer program;
Memory 603 stores a computer program capable of running on the processor 602.
Specifically, the processor 602 is configured to:
Acquiring scanning data of a laser radar of the cleaning robot 600 and image data acquired by an image acquisition unit of the cleaning robot 600;
determining a first position of an identification area on a base station relative to the cleaning robot 600 based on the scan data, and determining a second position of the identification area relative to the cleaning robot 600 based on the image data;
Based on the first position and the second position, the cleaning robot 600 is controlled to move toward the base station to return the cleaning robot 600 to the base station position.
Wherein, in one embodiment, the scan data comprises reflective intensity information of a plurality of scan points, the identification area comprises at least two reflective materials having different reflective intensities, and the processor 602 is further configured to:
Determining a plurality of first target scanning points from the plurality of scanning points based on the reflective intensity information of the plurality of scanning points, wherein the difference between the reflective intensity of each first target scanning point and the reflective intensity of the next scanning point of the first target scanning point is greater than or equal to a first threshold value;
the first location is determined based on the determined plurality of first target scan points.
In an embodiment, the scan data further comprises distance information between a plurality of scan points and the cleaning robot 600, at least two reflective materials of the identified area are not located on the same plane, and the processor 602 is further configured to:
determining a third position of the identification area relative to the cleaning robot 600 based on the plurality of first target scan points;
determining a plurality of second target scanning points from the plurality of scanning points based on distance information of the plurality of scanning points and the cleaning robot 600, wherein a difference between a distance of each second target scanning point from the cleaning robot 600 and a distance of a next scanning point of the second target scanning point from the cleaning robot 600 is greater than or equal to a second threshold value and less than or equal to a third threshold value;
determining a fourth position of the identification area relative to the cleaning robot 600 based on the plurality of second target scan points;
the first position is determined using the third position and the fourth position.
In an embodiment, the processor 602 is further configured to determine the second location based on the image data in combination with preset appearance feature data of the identification area.
In an embodiment, the processor 602 is further configured to:
determining a fifth position of the base station relative to the cleaning robot 600 based on the first position and the second position in combination with a preset compensation function;
Controlling the cleaning robot 600 to move toward the base station using the fifth position, wherein,
The compensation function is generated by using the relative position relationship between the geometric center of the identification area and the geometric center of the base station.
In an embodiment, the processor 602 is further configured to:
Periodically acquiring scanning data of the laser radar and image data acquired by the image acquisition unit;
Updating the first location based on the acquired scan data, and updating the second location based on the acquired image data;
based on the updated first and second positions, the cleaning robot 600 is controlled to move toward the base station.
In an embodiment, the processor 602 is further configured to:
After determining that the cleaning robot 600 is successfully docked with the base station, detecting whether the cleaning robot 600 is in a charged state;
When the cleaning robot 600 is detected not to be in a charging state, determining that the base station fails, and sending out failure information through the communication interface 601, wherein the failure information is used for prompting that the base station fails.
It should be noted that, the detailed process of the processor 602 for executing the above operation is described in the method embodiment, and will not be described herein.
Of course, in practice, the various components in the cleaning robot 600 are coupled together by a bus system 604. It is understood that the bus system 604 is used to enable connected communications between these components. The bus system 604 includes a power bus, a control bus, and a status signal bus in addition to the data bus. But for clarity of illustration, the various buses are labeled as bus system 604 in fig. 6.
The memory 603 in the embodiment of the present application is used to store various types of data to support the operation of the cleaning robot 600. Examples of such data include any computer program for operating on the cleaning robot 600.
The method disclosed in the above embodiments of the present application may be applied to the processor 602 or implemented by the processor 602. The processor 602 may be an integrated circuit chip having signal processing capabilities. In implementation, the steps of the above method may be performed by integrated logic circuitry in hardware or by software instructions in the processor 602. The processor 602 may be a general-purpose processor, a digital signal processor (DSP), another programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. The processor 602 may implement or perform the methods, steps, and logic blocks disclosed in the embodiments of the present application. The general-purpose processor may be a microprocessor or any conventional processor. The steps of the method disclosed in the embodiments of the application may be directly embodied as being executed by a hardware decoding processor, or executed by a combination of hardware and software modules in the decoding processor. The software modules may be located in a storage medium, and the storage medium is located in the memory 603. The processor 602 reads the information in the memory 603 and, in combination with its hardware, performs the steps of the method described above.
In an exemplary embodiment, the cleaning robot 600 may be implemented by one or more application-specific integrated circuits (ASICs), DSPs, programmable logic devices (PLDs), complex programmable logic devices (CPLDs), field-programmable gate arrays (FPGAs), general-purpose processors, controllers, microcontrollers (MCUs), microprocessors, or other electronic components, for performing the foregoing methods.
It will be appreciated that the memory 603 of the embodiments of the application may be volatile memory or non-volatile memory, and may include both volatile and non-volatile memory. The non-volatile memory may be a read-only memory (ROM), a programmable read-only memory (PROM), an erasable programmable read-only memory (EPROM), an electrically erasable programmable read-only memory (EEPROM), a ferromagnetic random access memory (FRAM), a flash memory, a magnetic surface memory, an optical disc, or a compact disc read-only memory (CD-ROM); the magnetic surface memory may be a disk memory or a tape memory. The volatile memory may be a random access memory (RAM), which acts as an external cache. By way of example and not limitation, many forms of RAM are available, such as static random access memory (SRAM), synchronous static random access memory (SSRAM), dynamic random access memory (DRAM), synchronous dynamic random access memory (SDRAM), double data rate synchronous dynamic random access memory (DDR SDRAM), enhanced synchronous dynamic random access memory (ESDRAM), synclink dynamic random access memory (SLDRAM), and direct Rambus random access memory (DRRAM). The memory described in the embodiments of the present application is intended to comprise, without being limited to, these and any other suitable types of memory.
In an exemplary embodiment, the present application also provides a storage medium, i.e. a computer storage medium, in particular a computer readable storage medium, for example comprising a memory 603 storing a computer program executable by the processor 602 of the cleaning robot 600 for performing the steps of the method described above. The computer readable storage medium may be FRAM, ROM, PROM, EPROM, EEPROM, flash Memory, magnetic surface Memory, optical disk, or CD-ROM.
It should be noted that "first," "second," etc. are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order.
In addition, the embodiments of the present application may be combined arbitrarily, provided that no conflict arises.
The foregoing description is only of the preferred embodiments of the present application, and is not intended to limit the scope of the present application.

Claims (8)

The cleaning robot comprises a base station, a first processing unit, a second processing unit, a third processing unit, a fourth processing unit, a fifth processing unit and a compensation function, wherein the first processing unit is used for:
controlling the cleaning robot to move towards the base station based on the first position and the second position;
periodically acquiring scanning data of the laser radar and image data acquired by the image acquisition unit when controlling the cleaning robot to move towards the base station;
updating the first position based on the acquired scanning data, and updating the second position based on the acquired image data;
determining a fifth position of the base station relative to the cleaning robot based on the updated first position and second position in combination with a preset compensation function; and
controlling the cleaning robot to move towards the base station by utilizing the fifth position, so that the cleaning robot returns to the base station position;
wherein the compensation function is generated by utilizing a relative position relationship between the geometric center of the identification area and the geometric center of the base station.
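By way of illustration only, the following minimal Python sketch mirrors the loop recited above: the lidar-derived first position and the camera-derived second position of the reflective identification area are fused, a compensation function shifts the fused estimate by the fixed offset between the geometric center of the identification area and the geometric center of the base station to obtain the fifth position, and the robot moves towards that position until it docks. The names, the simple averaging fusion, and the toy motion model are all assumptions made for this sketch; they are not taken from the patent and are not part of the claimed subject matter.

```python
from dataclasses import dataclass


@dataclass
class Pose2D:
    x: float  # meters, in the robot's own coordinate frame
    y: float


def fuse(first: Pose2D, second: Pose2D) -> Pose2D:
    # Combine the lidar-derived (first) and image-derived (second) estimates
    # of the identification area; a plain average stands in for whatever
    # fusion the patent actually uses.
    return Pose2D((first.x + second.x) / 2.0, (first.y + second.y) / 2.0)


def compensate(marker: Pose2D, offset: Pose2D) -> Pose2D:
    # "Compensation function": shift the identification-area center by the
    # known offset between it and the geometric center of the base station,
    # yielding the fifth position (base station relative to the robot).
    return Pose2D(marker.x + offset.x, marker.y + offset.y)


def step_towards(target: Pose2D, step: float = 0.05) -> Pose2D:
    # Toy motion update: advance a fixed step along the line to the target
    # and return the target's new position in the robot frame.
    dist = (target.x ** 2 + target.y ** 2) ** 0.5
    if dist <= step:
        return Pose2D(0.0, 0.0)  # close enough: treat as docked
    scale = (dist - step) / dist
    return Pose2D(target.x * scale, target.y * scale)


if __name__ == "__main__":
    # Simulated measurements of the reflective identification area.
    first_position = Pose2D(1.02, 0.48)   # from the lidar scanning data
    second_position = Pose2D(0.98, 0.52)  # from the camera image data
    marker_to_base = Pose2D(0.00, -0.05)  # fixed geometric-center offset

    base = compensate(fuse(first_position, second_position), marker_to_base)
    while base.x != 0.0 or base.y != 0.0:
        base = step_towards(base)
    print("docked at base station")
```

In a real controller the two measurements would be re-acquired on every cycle and the two position estimates re-updated before the compensation step, as the claim recites; the static values above merely keep the sketch self-contained.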
CN202110384482.5A | 2021-04-09 | 2021-04-09 | Mobile control method, device, cleaning robot and storage medium | Active | CN115191884B (en)

Priority Applications (2)

Application Number | Priority Date | Filing Date | Title
CN202110384482.5A (CN115191884B) | 2021-04-09 | 2021-04-09 | Mobile control method, device, cleaning robot and storage medium
PCT/CN2022/079029 (WO2022213749A1) | | 2022-03-03 | Method and apparatus for controlling movement, cleaning robot, and storage medium

Applications Claiming Priority (1)

Application Number | Priority Date | Filing Date | Title
CN202110384482.5A (CN115191884B) | 2021-04-09 | 2021-04-09 | Mobile control method, device, cleaning robot and storage medium

Publications (2)

Publication Number | Publication Date
CN115191884A (en) | 2022-10-18
CN115191884B (en) | 2025-04-29

Family

ID=83545138

Family Applications (1)

Application Number | Title | Priority Date | Filing Date
CN202110384482.5A (Active; CN115191884B (en)) | Mobile control method, device, cleaning robot and storage medium | 2021-04-09 | 2021-04-09

Country Status (2)

Country | Link
CN (1) | CN115191884B (en)
WO (1) | WO2022213749A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
CN117675422A (en) * | 2022-08-25 | 2024-03-08 | 无锡小天鹅电器有限公司 | Binding method and binding device for multi-electric appliance equipment and computer readable storage medium
WO2024230617A1 (en) * | 2023-05-06 | 2024-11-14 | 美智纵横科技有限责任公司 | Base station position determination method, mobile robot and computer-readable storage medium

Citations (2)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
CN109674402A (en) * | 2019-01-04 | 2019-04-26 | 云鲸智能科技(东莞)有限公司 | A kind of information processing method and relevant device
CN110946513A (en) * | 2018-09-27 | 2020-04-03 | 广东美的生活电器制造有限公司 | Control method and device of sweeping robot

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
JPH06214640A (en) * | 1993-01-18 | 1994-08-05 | Fujita Corp | Travel controller for mobile object
KR101575597B1 (en) * | 2014-07-30 | 2015-12-08 | 엘지전자 주식회사 | Robot cleaning system and method of controlling robot cleaner
CN104848851B (en) * | 2015-05-29 | 2017-08-18 | 山东鲁能智能技术有限公司 | Substation inspection robot and its method based on multi-sensor data fusion composition
CN106983460B (en) * | 2017-04-07 | 2019-08-27 | 小狗电器互联网科技(北京)股份有限公司 | A kind of image control method of sweeping robot area cleaning
CN108065866A (en) * | 2017-11-07 | 2018-05-25 | 郑州谦贤科技有限公司 | A kind of automatic pointing sweeping machine and method
CN207718228U (en) * | 2018-02-02 | 2018-08-10 | 福建(泉州)哈工大工程技术研究院 | A kind of reliable indoor omni-directional mobile robots recharging system
CN108888188B (en) * | 2018-06-14 | 2020-09-01 | 深圳市无限动力发展有限公司 | Sweeping robot position calibration method and system
US11157016B2 (en) * | 2018-07-10 | 2021-10-26 | Neato Robotics, Inc. | Automatic recognition of multiple floorplans by cleaning robot
CN110967703A (en) * | 2018-09-27 | 2020-04-07 | 广东美的生活电器制造有限公司 | Indoor navigation method and indoor navigation device using laser radar and camera
CN110000784B (en) * | 2019-04-09 | 2020-11-06 | 深圳市远弗科技有限公司 | Robot recharging positioning navigation method, system, equipment and storage medium
CN110221617B (en) * | 2019-07-16 | 2024-03-08 | 昆山市工研院智能制造技术有限公司 | Robot charging seat, automatic robot recharging system and automatic recharging method
CN110353583A (en) * | 2019-08-21 | 2019-10-22 | 追创科技(苏州)有限公司 | Sweeping robot and automatic control method thereof

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
CN110946513A (en) * | 2018-09-27 | 2020-04-03 | 广东美的生活电器制造有限公司 | Control method and device of sweeping robot
CN109674402A (en) * | 2019-01-04 | 2019-04-26 | 云鲸智能科技(东莞)有限公司 | A kind of information processing method and relevant device

Also Published As

Publication number | Publication date
WO2022213749A1 (en) | 2022-10-13
CN115191884A (en) | 2022-10-18

Similar Documents

Publication | Title
CN111258320A (en) | Robot obstacle avoidance method and device, robot and readable storage medium
CN107710094B (en) | Online calibration check during autonomous vehicle operation
US11493930B2 (en) | Determining changes in marker setups for robot localization
CN113033280B (en) | System and method for trailer posture estimation
CN110850859B (en) | Robot and obstacle avoidance method and obstacle avoidance system thereof
US11579632B2 (en) | System and method for assisting collaborative sensor calibration
CN115191884B (en) | Mobile control method, device, cleaning robot and storage medium
CN110849366A (en) | Navigation method and system based on fusion of vision and laser radar
US11875682B2 (en) | System and method for coordinating collaborative sensor calibration
KR20200121756A (en) | Initialization Diagnosis Method and System of a Mobile Robot
CN111436864B (en) | Control method, device and storage medium
CN110986920A (en) | Positioning navigation method, device, equipment and storage medium
EP4242775B1 (en) | Charging station, method for returning to said charging station for a lawnmower robot
EP4033325A1 (en) | Robot movement limiting frame working starting point determining method and movement control method
CN115587603A (en) | Robot and method and system for identifying workstation thereof, storage medium and workstation
CN113900454A (en) | Charging pile method, device, equipment and storage medium
CN112826377A (en) | Recharging alignment method and device of sweeper and sweeper
CN112698643A (en) | Method and system for docking charging device by robot, robot and computer storage medium
CN114777761A (en) | Cleaning machine and map construction method
CN113741447B (en) | Robot charging pile alignment method and device, terminal equipment and storage medium
CN116540712A (en) | Robot recharging method, system, equipment and storage medium
CN117519164A (en) | Navigation method of movable robot and movable robot
CN115542890A (en) | Robot, docking charging seat method thereof, control device and storage medium
CN117234196A (en) | Autonomous mobile device, autonomous mobile device operation method, autonomous mobile device operation system, and storage medium
CN116442830A (en) | Automatic charging method, device, equipment and chargeable equipment

Legal Events

Code | Title
PB01 | Publication
SE01 | Entry into force of request for substantive examination
GR01 | Patent grant
