Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It is also to be understood that the terminology used in the description of the present application herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in the specification of the present application and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It should be further understood that the term "and/or" as used in this specification and the appended claims refers to and includes any and all possible combinations of one or more of the associated listed items.
As used in this specification and the appended claims, the term "if" may be interpreted contextually as "when," "upon," "in response to a determination," or "in response to a detection." Similarly, the phrase "if it is determined" or "if a [described condition or event] is detected" may be interpreted contextually to mean "upon determining," "in response to determining," "upon detecting [the described condition or event]," or "in response to detecting [the described condition or event]."
It should be understood that the sequence numbers of the steps in this embodiment do not imply an execution order; the execution order of each process should be determined by its function and internal logic, and should not constitute any limitation on the implementation process of the embodiments of the present application.
In order to explain the technical solution described in the present application, the following description will be given by way of specific examples.
Referring to Fig. 1, Fig. 1 is a first flowchart of a SLAM closed-loop detection method based on a laser radar (lidar) according to an embodiment of the present application. As shown in Fig. 1, the SLAM closed-loop detection method based on a laser radar includes the following steps:
Step 101, performing closed-loop detection through a visual closed-loop detection module based on camera data acquired by a camera, to obtain a first closed-loop detection result.
The camera data specifically includes image data acquired by the camera capturing images of the external environment; it is obtained by the camera shooting images along the robot's travel path.
The visual closed-loop detection module compares and processes the camera data acquired by the camera and judges whether the robot has previously passed through the current position point, thereby realizing closed-loop detection. Specifically, closed-loop detection can be realized by comparing visual information of the current point with that of previously visited points, as acquired by a monocular camera, a binocular camera, a multi-view camera, or a fisheye camera, and judging whether the robot has already walked through the current position point.
Optionally, the visual closed-loop detection module may be the visual closed-loop detection part of an existing robot's visual SLAM module.
Specifically, as an optional implementation, performing closed-loop detection through the visual closed-loop detection module based on the camera data acquired by the camera to obtain the first closed-loop detection result includes:
matching a first image frame of the current position point against the recorded image frames of historical position points through the visual closed-loop detection module;
if the similarity between a second image frame among the image frames of the historical position points and the first image frame is greater than a threshold, determining that the first closed-loop detection result is a closed loop; and/or
if the similarities between the first image frame and both the previous image frame and the next image frame of the second image frame among the image frames of the historical position points are greater than the threshold, determining that the first closed-loop detection result is a closed loop.
The current position point is the point on the travel path where the robot is currently located.
Whether for a laser SLAM module or a visual closed-loop detection module, closed loops are easily detected incorrectly in visually similar environments, so the current frame must be matched against the key frames in a key frame library to determine whether a closed loop really exists.
During travel, the robot collects images at each travel point along its path: a plurality of images are collected at each travel point, a key image frame is determined from among them, and the key image frame is stored together with its travel point to form a database (i.e., a key frame library) recording the image frames of historical position points. The first image frame of the current position point is then matched against the recorded image frames of the historical position points in this database through the visual closed-loop detection module.
The robot is, for example, a sweeper. In a specific implementation, the historical positions of the robot may be placed in one-to-one correspondence with the frame numbers of the key images captured by the camera, and the current position of the sweeper is recorded each time a key frame is saved, for example as a record of the form (sweeper position, key frame number).
In the above step, the first image frame is the key image frame among the plurality of image frames acquired by the robot at the current position point; the key image frame may be, for example, the image frame with the best sharpness.
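The correspondence between travel points and key frames can be pictured as a simple lookup table. The following is a minimal Python sketch of such a key frame library; the names (`KeyFrameRecord`, `KeyFrameLibrary`) and field layout are illustrative assumptions rather than part of the described embodiment.

```python
from dataclasses import dataclass
from typing import List, Tuple

import numpy as np


@dataclass
class KeyFrameRecord:
    """One library entry: a key frame tied to the travel point where it was captured."""
    frame_id: int                  # key frame number
    position: Tuple[float, float]  # travel point, e.g. the sweeper position (x, y)
    image: np.ndarray              # the key image frame itself


class KeyFrameLibrary:
    """Records the key image frames of historical position points."""

    def __init__(self) -> None:
        self.records: List[KeyFrameRecord] = []

    def add(self, frame_id: int, position: Tuple[float, float], image: np.ndarray) -> None:
        self.records.append(KeyFrameRecord(frame_id, position, image))
```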
In performing the closed-loop detection judgment, any of the following rules may specifically be applied:
and if the similarity between a second image frame and the first image frame in the image frames of the historical position points is greater than a threshold value, determining that the first closed-loop detection result is a closed loop.
And if the similarity between the first image frame and the second image frame in the image frames of the historical position points is greater than a threshold value, determining that the first closed-loop detection result is a closed loop.
And if the similarity between a second image frame and the first image frame in the image frames of the historical position points is greater than a threshold value and the similarities between a previous image frame and a next image frame of the second image frame and the first image frame are greater than the threshold value, determining that the first closed-loop detection result is a closed loop.
That is, the image frame at the current position is compared with the key image frames in the key frame library. When the similarity between the image frame at the current position and a certain key image frame in the database is greater than a threshold, for example 50%, a closed loop is established and a visual closed-loop signal is provided to the laser SLAM module.
Alternatively, when the similarities between the image frame at the current position and the two frames before and after a certain key image frame in the database are greater than the threshold, for example 50%, a closed loop is established and a visual closed-loop signal is provided to the laser SLAM module.
Alternatively, when the similarities between the image frame at the current position and a certain key image frame in the database, as well as the two frames before and after that key image frame, are all greater than the threshold, for example 50%, a closed loop is established and a visual closed-loop signal is provided to the laser SLAM module.
Here, the previous image frame and the next image frame are the image frames corresponding to the position points immediately before and after the position point of the second image frame, i.e., the frames of the two adjacent position points. Comparing across these frames together improves the accuracy of closed-loop detection.
Specifically, when making the similarity determination, image feature points may be extracted from the image frames, and it may be determined whether the displacement of the feature points can be matched with the displacement between the corresponding position points; when they can be matched, the similarity is determined to be greater than the set threshold.
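The three decision rules above condense to a few lines of logic. Below is a minimal Python sketch, assuming a hypothetical `similarity()` scoring function returning a value in [0, 1] (for example, the fraction of matched feature points) and the 50% threshold from the examples; none of these names come from the embodiment itself.

```python
THRESHOLD = 0.5  # e.g. 50%, as in the examples above


def is_closed_loop(similarity, current_frame, key_frames, i):
    """Apply the decision rules to key frame i of the key frame library.

    key_frames is the list of historical key image frames in travel order,
    so key_frames[i - 1] and key_frames[i + 1] correspond to the position
    points immediately before and after that of key_frames[i].
    """
    center = similarity(current_frame, key_frames[i]) > THRESHOLD
    prev_ok = i > 0 and similarity(current_frame, key_frames[i - 1]) > THRESHOLD
    next_ok = i + 1 < len(key_frames) and similarity(current_frame, key_frames[i + 1]) > THRESHOLD

    # Rule 1: the key frame itself passes; rule 2: both neighbors pass;
    # rule 3 (the strictest variant) requires all three to pass.
    return center or (prev_ok and next_ok)
```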
Step 102, on the basis of the first closed-loop detection result, sending closed-loop detection information to a laser simultaneous localization and mapping (SLAM) module through the visual closed-loop detection module.
The closed-loop detection information is generated based on the first closed-loop detection result. Specifically, the closed-loop detection information may include a first closed-loop detection result.
Closed-loop detection is carried out by the visual closed-loop detection module. When a closed loop is detected, a visual closed-loop signal carrying the detection result is sent directly to the laser SLAM module, actively informing it that a closed loop exists on the current path, so that the laser SLAM module can directly use the detection result to perform a graph optimization operation.
Step 103, performing a graph optimization operation when the laser SLAM module acquires the closed-loop detection information.
The laser SLAM module directly uses the closed-loop detection information from the visual closed-loop detection module as its closed-loop detection basis, i.e., as the basis for judging whether the map needs to be updated, and thereby determines whether to perform the map optimization and update operation.
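To make the division of labor concrete, here is a minimal Python sketch of the receiving side of steps 102 and 103. The names (`LaserSlamModule`, `on_closed_loop_info`, `optimize_graph`) are assumptions for illustration; the embodiment does not specify the back-end optimizer.

```python
class LaserSlamModule:
    """Sketch of the laser SLAM side: consume the visual closed-loop signal."""

    def on_closed_loop_info(self, is_closed_loop: bool) -> None:
        # The visual detection result is used directly as the basis for
        # deciding whether the map needs a closed-loop update.
        if is_closed_loop:
            self.optimize_graph()

    def optimize_graph(self) -> None:
        # Placeholder for back-end (pose-graph) optimization, which corrects
        # accumulated drift once a loop closure is confirmed.
        pass
```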
In the embodiment of the present application, the added visual closed-loop detection module assists the laser SLAM module: closed loops can be detected in a timely manner, local closed loops can even be confirmed in advance, accumulated error is reduced, and wrong closed loops are filtered out. The mapping of the laser SLAM module thus becomes more timely and accurate, closed-loop detection for the laser SLAM module is realized, and the accuracy of environment map construction and subsequent path planning is improved.
The embodiments of the present application also provide different implementations of the SLAM closed-loop detection method based on a laser radar.
Referring to Fig. 2, Fig. 2 is a second flowchart of the SLAM closed-loop detection method based on a laser radar according to an embodiment of the present application. As shown in Fig. 2, the SLAM closed-loop detection method based on a laser radar includes the following steps:
Step 201, performing closed-loop detection through the laser SLAM module to obtain a second closed-loop detection result, and sending a verification request to the visual closed-loop detection module.
Wherein the verification request carries the second closed-loop detection result.
This process is performed before the step of sending, on the basis of the first closed-loop detection result, closed-loop detection information to the laser simultaneous localization and mapping (SLAM) module through the visual closed-loop detection module.
Before the visual closed-loop detection module sends closed-loop detection information to the laser SLAM module, the laser SLAM module scans the environment to collect laser point clouds as the robot moves, performs its own closed-loop detection on the collected point clouds to obtain a detection result, and generates a verification request based on that result, which it sends to the visual closed-loop detection module. The visual closed-loop detection module can then assist in verifying whether the second closed-loop detection result is correct, so that erroneous closed-loop detection results of the laser SLAM module are eliminated and the accuracy of map construction is further ensured.
When the laser SLAM module detects a closed-loop signal, the signal is sent to the visual closed-loop detection module for checking, to confirm whether the map information flagged by the closed-loop detection really corresponds to a closed loop. If a closed loop exists, a visual closed-loop signal is sent to the laser SLAM module; if not, the current image frame is returned and the traversal comparison of the images ends.
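The handshake of this second flow can be sketched as follows, again in hedged form: `detect_closed_loop`, `verify`, and `optimize_graph` are hypothetical method names standing in for the verification request, feedback, and graph optimization of steps 201 through 204.

```python
def laser_side_loop_check(laser_slam, visual_module):
    """Second flow (Fig. 2): the laser result is verified by the visual module."""
    second_result = laser_slam.detect_closed_loop()  # from laser point clouds
    if not second_result:
        return
    # The verification request carries the second closed-loop detection result.
    feedback = visual_module.verify(second_result)
    if feedback:
        laser_slam.optimize_graph()  # verification passed: optimize the graph
    # Otherwise the current image frame is returned and the traversal ends.
```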
Step 202, performing closed-loop detection through a visual closed-loop detection module based on camera data acquired by a camera to obtain a first closed-loop detection result.
This step is implemented in the same way as step 101 in the foregoing embodiment and is not described again here. Note that step 201 and step 202 may occur in either order.
Step 203, when the visual closed-loop detection module receives the verification request, sending closed-loop detection information to the laser SLAM module through the visual closed-loop detection module on the basis of the first closed-loop detection result.
Further, as an optional implementation, sending closed-loop detection information to the laser SLAM module through the visual closed-loop detection module on the basis of the first closed-loop detection result includes:
and if the first closed-loop detection result is consistent with the second closed-loop detection result and the first closed-loop detection result and the second closed-loop detection result are both closed loops, outputting closed-loop detection feedback information passing verification to the laser SLAM module through the visual closed-loop detection module.
When the visual closed-loop detection module verifies the closed-loop detection result of the laser SLAM module, it specifically compares the detection result of the laser SLAM module with its own closed-loop detection result to judge whether the two are consistent. If the two results are consistent and both indicate a closed loop, the verification-passed feedback information is output to the laser SLAM module.
Further, the method also includes the following step: if the first closed-loop detection result is inconsistent with the second closed-loop detection result, sending verification information indicating a closed-loop detection error to the laser SLAM module through the visual closed-loop detection module.
When the visual closed-loop detection module verifies the closed-loop detection result of the laser SLAM module, it compares the detection result of the laser SLAM module with its own closed-loop detection result to judge whether the two are consistent. If they are not, the visual closed-loop detection module sends verification information indicating a closed-loop detection error to the laser SLAM module, and the laser SLAM module performs closed-loop detection again until the results of the two modules are consistent and both are closed loops; alternatively, the laser SLAM module may directly take the closed-loop detection result of the visual closed-loop detection module as authoritative and proceed on that basis.
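The comparison in the two paragraphs above reduces to a small truth check. A minimal Python sketch follows, with the hypothetical `Feedback` type standing in for the feedback and error messages of the embodiment.

```python
from dataclasses import dataclass


@dataclass
class Feedback:
    verified: bool  # True: verification passed; False: closed-loop detection error


def verify(first_is_loop: bool, second_is_loop: bool) -> Feedback:
    """Compare the visual (first) and laser (second) detection results."""
    if first_is_loop and second_is_loop:
        # Consistent, and both indicate a closed loop: verification passes.
        return Feedback(verified=True)
    # Otherwise verification does not pass; on inconsistent results the laser
    # module re-detects, or falls back to the visual result as authoritative.
    return Feedback(verified=False)
```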
Step 204, performing a graph optimization operation when the laser SLAM module acquires the closed-loop detection information.
This step is the same as step 103 in the previous embodiment and is not described again here.
Before the laser SLAM module performs the graph optimization operation, the closed-loop detection result must be determined to be a closed loop and must pass the verification of the visual closed-loop detection module. Only when the closed-loop detection results of the laser SLAM module and the visual closed-loop detection module are consistent is the detection result considered reliable enough, the final result determined to be a closed loop, and the subsequent graph optimization operation performed. The visual closed-loop detection module thus assists the laser SLAM module in rejecting false closed-loop detections and improves detection accuracy.
In the embodiment of the present application, the added visual closed-loop detection module assists the laser SLAM module: closed loops can be detected in a timely manner, local closed loops can even be confirmed in advance, accumulated error is reduced, and wrong closed loops are filtered out. The mapping of the laser SLAM module thus becomes more timely and accurate, closed-loop detection for the laser SLAM module is realized, and the accuracy of environment map construction and subsequent path planning is improved.
Referring to Fig. 3, Fig. 3 is a structural diagram of a SLAM closed-loop detection system based on a laser radar according to an embodiment of the present application; for convenience of explanation, only the parts related to the embodiment of the present application are shown.
The SLAM closed loop detection system based on the laser radar comprises:
the visual closed-loop detection module, configured to perform closed-loop detection based on camera data acquired by a camera to obtain a first closed-loop detection result, and, on the basis of the first closed-loop detection result, to send closed-loop detection information to a laser simultaneous localization and mapping (SLAM) module; and
the laser SLAM module, configured to perform a graph optimization operation when the closed-loop detection information is acquired.
Wherein, the laser SLAM module is further configured to:
performing closed-loop detection to obtain a second closed-loop detection result, and sending a verification request to the visual closed-loop detection module, wherein the verification request carries the second closed-loop detection result;
the visual closed-loop detection module is further configured to:
and when the verification request is received, the step of sending closed-loop detection information to a laser synchronous positioning and mapping SLAM module through the visual closed-loop detection module on the basis of the first closed-loop detection result is executed.
Wherein the visual closed-loop detection module is further configured to:
and if the first closed-loop detection result is consistent with the second closed-loop detection result and the first closed-loop detection result and the second closed-loop detection result are both closed loops, outputting closed-loop detection feedback information passing verification to the laser SLAM module.
Wherein the visual closed-loop detection module is further configured to:
and if the first closed loop detection result is inconsistent with the second closed loop detection result, sending verification information of closed loop detection errors to the laser SLAM module.
Wherein the visual closed-loop detection module is further configured to:
match the first image frame of the current position point against the recorded image frames of historical position points;
if the similarity between a second image frame among the image frames of the historical position points and the first image frame is greater than a threshold, determine that the first closed-loop detection result is a closed loop; and/or
if the similarities between the first image frame and both the previous image frame and the next image frame of the second image frame among the image frames of the historical position points are greater than the threshold, determine that the first closed-loop detection result is a closed loop.
In the embodiment of the present application, the added visual closed-loop detection module assists the laser SLAM module: closed loops can be detected in a timely manner, local closed loops can even be confirmed in advance, accumulated error is reduced, and wrong closed loops are filtered out. The mapping of the laser SLAM module thus becomes more timely and accurate, closed-loop detection for the laser SLAM module is realized, and the accuracy of environment map construction and subsequent path planning is improved.
The SLAM closed-loop detection system based on a laser radar provided in the embodiment of the present application can realize each process of the above embodiments of the SLAM closed-loop detection method based on a laser radar and achieve the same technical effects; to avoid repetition, the details are not repeated here.
Fig. 4 is a structural diagram of a terminal according to an embodiment of the present application. As shown in the figure, the terminal 4 of this embodiment includes: a processor 40, a memory 41, and a computer program 42 stored in the memory 41 and executable on the processor 40. The terminal may be a robot, for example a sweeping robot or a warehouse goods-handling robot.
The terminal 4 may include, but is not limited to, the processor 40 and the memory 41. Those skilled in the art will appreciate that Fig. 4 is only an example of the terminal 4 and does not constitute a limitation of the terminal 4, which may include more or fewer components than those shown, combine some components, or have different components; for example, the terminal may also include input/output devices, network access devices, buses, etc.
The processor 40 may be a Central Processing Unit (CPU), another general-purpose processor, a Digital Signal Processor (DSP), an Application-Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, etc. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like.
The memory 41 may be an internal storage unit of the terminal 4, such as a hard disk or memory of the terminal 4. The memory 41 may also be an external storage device of the terminal 4, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, or a flash card (Flash Card) provided on the terminal 4. Further, the memory 41 may include both an internal storage unit and an external storage device of the terminal 4. The memory 41 is used to store the computer program and the other programs and data required by the terminal, and may also be used to temporarily store data that has been output or is to be output.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned function distribution may be performed by different functional units and modules according to needs, that is, the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-mentioned functions. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working processes of the units and modules in the system may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated modules/units, if implemented in the form of software functional units and sold or used as independent products, may be stored in a computer-readable storage medium. Based on this understanding, all or part of the flow in the methods of the embodiments described above can be realized by a computer program, which can be stored in a computer-readable storage medium and which, when executed by a processor, implements the steps of the method embodiments described above. The computer program comprises computer program code, which may be in the form of source code, object code, an executable file, some intermediate form, etc. The computer-readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB flash disk, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a Read-Only Memory (ROM), a Random Access Memory (RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and the like. It should be noted that the content of the computer-readable medium may be appropriately increased or decreased as required by legislation and patent practice in a jurisdiction; for example, in some jurisdictions, computer-readable media do not include electrical carrier signals and telecommunications signals, in accordance with legislation and patent practice.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present application and are intended to be included within the scope of the present application.