Detailed Description
The invention will be described in detail hereinafter with reference to the accompanying drawings in conjunction with embodiments. It should be noted that the embodiments and features of the embodiments in the present application may be combined with each other without conflict.
It should be noted that the terms "first," "second," and the like in the description and claims of the present invention and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order.
Example 1
In this embodiment, a calibration method for sensor parameters is provided. The execution subject of the method may be a target device, where the target device may be an electronic device with an environment sensing capability, such as a mobile robot or an unmanned vehicle, and a preset number of sensors for sensing environment data are disposed on the target device. Fig. 1 is a flowchart of the calibration method for sensor parameters according to an embodiment of the present invention. As shown in fig. 1, the flowchart includes the following steps:
Step S102, acquiring feature data of the target device by a preset number of sensors, where the preset number of sensors are disposed on the target device;
In alternative embodiments of the present application, the types of sensors may include one or more of a camera, a wheel odometer, an ultrasonic radar, an IMU, a Lidar, and other sensors for sensing the surrounding environment. The parameters of the sensors may include, but are not limited to, the following: RGB camera intrinsic parameters, extrinsic parameters between the RGB camera and a depth camera (TOF/structured light/binocular), extrinsic parameters between the RGB camera/depth camera and the Lidar, extrinsic parameters between the RGB camera/depth camera and the wheel odometer, extrinsic parameters between the RGB camera/depth camera and the IMU, and the like. It should be noted that the objects whose data is to be acquired differ between sensors of different types; if the target device is a robot, the vehicle body features corresponding to the different types of sensors also differ.
Step S104, calibrating parameters of a preset number of sensors by using the characteristic data;
the calibration of the parameters of the sensor to which the present application relates may include: at least one of an internal reference calibration of the sensor and an external reference calibration between the sensors. Specifically, the calibration of the parameters of the sensor may include: internal reference calibration of a camera type sensor, external reference calibration between a camera type sensor and any or each of the other types of sensors.
By way of example, suppose a camera, an ultrasonic radar and an IMU are disposed on the target device. The feature data may then be used to perform internal reference calibration of the camera, external reference calibration between the camera and the ultrasonic radar, and external reference calibration between the camera and the IMU.
Through the above steps S102 to S104, feature data of the target device are acquired by the preset number of sensors, and the parameters of the preset number of sensors are calibrated by using the feature data. That is to say, if the target device is a robot or an unmanned vehicle, the vehicle body features of the robot or the unmanned vehicle can be collected through the sensors and used to calibrate the parameters of the sensors. This solves the problem in the related art that calibrating the parameters of the sensors relies on external environment information, which leads to low online calibration timeliness, and thereby improves the online calibration efficiency of the sensors.
It should be noted that, in a specific application scenario, the feature data corresponding to the vehicle body features are acquired according to the specific calibration task requirements. For example, for calibrating the camera internal parameters, only image data containing the vehicle body features within the view of the corresponding camera need to be acquired; for the calibration of external parameters between the camera and a 3D Lidar, image data of the vehicle body features and 3D Lidar data need to be acquired simultaneously.
In addition, if the target device is, for example, a mobile robot or an unmanned vehicle, the preset number of sensors involved in the present application need to be installed at appropriate positions on the robot or the unmanned vehicle to ensure that each of the preset number of sensors can acquire a specified feature of the vehicle body itself, where the number of the specified features may be one or more. For example, the specified features are vehicle body features of the robot or the unmanned vehicle, including but not limited to the following: planar features such as vehicle body planes; shape features such as straight lines and circular arcs in the vehicle body outline; geometric features such as line segment lengths, included angles between straight lines, and circular arc radii in the vehicle body outline; and calibration targets commonly used in sensor calibration, such as pasted checkerboards or two-dimensional codes.
It should be noted that, for some scenes in which the relative positional relationship between the camera and the vehicle body is fixed, more complex vehicle body features need to be constructed in order to complete the calibration task; for example, targets need to be pasted on a plurality of non-coplanar vehicle body planes. That is, in a scene where the relative positional relationship between two sensors is fixed, a plurality of features on the target device can be used as the specified features corresponding to the two sensors, and during the external reference calibration between the two sensors, a single simultaneous acquisition by the two sensors yields one set of specified feature data, where this set of feature data includes the feature data of the specified features corresponding to both sensors.
In order to improve the accuracy of parameter calibration, the characteristics of the used vehicle bodies can be different when different parameters are calibrated; for example, taking calibration of a relative positional relationship (external reference) between an RGB camera and a Lidar as an example for explanation, in order to facilitate quick and accurate calibration of internal reference of the RGB camera and quick and accurate determination of a relative positional relationship between a vehicle body feature and the camera, a calibration target with a known accurate size, similar to AprilTag, may be pasted on a vehicle body in a field of view of the RGB camera; for another example, in calibrating the relative positional relationship (external reference) between the RGB camera and the IMU, other vehicle body features other than the specified features corresponding to the RGB camera and the Lidar may be used.
Based on the above description, in an alternative embodiment of the present application, the manner of processing the feature data through a preset algorithm to calibrate the parameters of the preset number of sensors, referred to in step S104 of the present application, may further include:
Step S104-11, extracting specified feature data corresponding to specified features of the target device from the feature data, where the specified feature data are acquired by any two sensors in the preset number of sensors;
Optionally, in the process of calibrating the external parameters of the two sensors, the feature data of the specified features corresponding to the two sensors are extracted from the feature data to obtain specified feature data, and the extracted specified feature data are processed to calibrate the external parameters between the two sensors.
Step S104-12, determining, according to the specified feature data, description information of the specified features, or of the planes in which the specified features are located, in the coordinate system corresponding to each of the two sensors;
In an alternative embodiment of the present application, the above description information may be a plane equation.
Step S104-13, calibrating the parameters of the two sensors by processing the description information through a preset algorithm.
The preset algorithm corresponds to the two sensors involved in step S104-13; the preset algorithms adopted for external reference calibration between different combinations of sensors may differ, and the correspondence is set by developers.
As can be seen from the foregoing steps S104-11 to S104-13, in the present application, the calibration of parameters of the preset number of sensors can be implemented based on a specified feature of the target device. In the case that the target device is a mobile robot or an unmanned vehicle, the specified feature may be a vehicle body plane or a target pasted on a vehicle body plane; of course, in other scenarios, other vehicle body features may also be used, such as straight line features, circular hole features, and the like. In the following, steps S104-11 to S104-13 are exemplified by taking the two sensors to be an RGB camera and a Lidar and performing external reference calibration between them.
Firstly, in order to facilitate quick and accurate calibration of the RGB camera internal parameters and quick and accurate calculation of the relative positional relationship between the vehicle body features and the camera, calibration targets of known, accurate size, similar to AprilTag, can be pasted in advance on the vehicle body within the camera's field of view, and target feature points are extracted by using an AprilTag recognition algorithm. If the thickness of the targets is ignored, the vehicle body plane is the target plane, and the targets are components of the vehicle body features. From the feature points, the relative positional relationship between the target coordinate system and the camera coordinate system is solved by using a PnP algorithm, and the spatial plane equation of the vehicle body plane in the camera coordinate system can then be obtained:
$A_c x + B_c y + C_c z + D_c = 0$
The spatial plane equation of the vehicle body plane in the Lidar coordinate system is extracted by using a PCL point cloud plane extraction algorithm:

$A_l x + B_l y + C_l z + D_l = 0$

where $A$, $B$, $C$ and $D$ are the coefficients of a plane equation, $x$, $y$ and $z$ are the coordinates of a three-dimensional space point, the subscript $c$ (as in $A_c, B_c, C_c, D_c$) denotes the plane equation expressed in the camera coordinate system, and the subscript $l$ (as in $A_l, B_l, C_l, D_l$) denotes the plane equation expressed in the Lidar coordinate system.
Each vehicle body plane forms a set of three-dimensional plane constraint relations in the camera coordinate system and the Lidar coordinate system respectively. The unit normal vector of the spatial plane and the distance from the coordinate system origin to the plane are:

$n_c = \dfrac{(A_c, B_c, C_c)^T}{\sqrt{A_c^2 + B_c^2 + C_c^2}}, \quad d_c = \dfrac{|D_c|}{\sqrt{A_c^2 + B_c^2 + C_c^2}}, \qquad n_l = \dfrac{(A_l, B_l, C_l)^T}{\sqrt{A_l^2 + B_l^2 + C_l^2}}, \quad d_l = \dfrac{|D_l|}{\sqrt{A_l^2 + B_l^2 + C_l^2}}$

where $n_c$ is the unit normal vector of the target plane in the camera coordinate system, $n_l$ is the unit normal vector of the target plane in the Lidar coordinate system, $d_c$ is the distance from the origin of the camera coordinate system to the plane, and $d_l$ is the distance from the origin of the Lidar coordinate system to the plane.
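By way of illustration only, the two plane extractions above can be sketched as follows in Python, assuming OpenCV for the PnP step and Open3D as a stand-in for the PCL plane extraction named in the text; all function names, thresholds, and inputs are illustrative assumptions rather than the original implementation.

```python
import numpy as np
import cv2
import open3d as o3d

def plane_in_camera_frame(tag_corners_2d, tag_corners_3d, K, dist_coeffs):
    """Solve PnP on the AprilTag corners, then express the tag plane
    (z = 0 in the target frame) in the camera frame as (A_c, B_c, C_c, D_c)."""
    ok, rvec, tvec = cv2.solvePnP(tag_corners_3d, tag_corners_2d, K, dist_coeffs)
    assert ok, "PnP failed"
    R, _ = cv2.Rodrigues(rvec)            # rotation: target frame -> camera frame
    n_c = R[:, 2]                         # target z-axis = plane normal in camera frame
    D_c = -float(n_c @ tvec.ravel())      # the plane passes through tvec
    return np.append(n_c, D_c)

def plane_in_lidar_frame(points_xyz):
    """RANSAC plane fit on the Lidar points that hit the body plane,
    returning (A_l, B_l, C_l, D_l)."""
    pcd = o3d.geometry.PointCloud()
    pcd.points = o3d.utility.Vector3dVector(points_xyz)
    plane, _inliers = pcd.segment_plane(distance_threshold=0.01,
                                        ransac_n=3, num_iterations=500)
    return np.asarray(plane)

def unit_normal_and_distance(plane):
    """Normalize (A, B, C, D) to the unit normal n and origin-to-plane distance d."""
    n, D = plane[:3], plane[3]
    s = np.linalg.norm(n)
    return n / s, abs(D) / s
```

The last helper corresponds to the normalization just given, producing the pairs $(n_c, d_c)$ and $(n_l, d_l)$ used below.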
Before the PnP algorithm is used to solve the relative positional relationship between the target plane and the camera coordinate system, a camera calibration algorithm needs to be used to complete the calibration of the camera internal parameters, yielding the camera intrinsic matrix:

$K = \begin{pmatrix} f_x & 0 & c_x \\ 0 & f_y & c_y \\ 0 & 0 & 1 \end{pmatrix}$

where $f_x$ is the normalized focal length in the horizontal direction of the sensor and $f_y$ is the normalized focal length in the vertical direction of the sensor. $(c_x, c_y)$ are the pixel coordinates of the principal point, the intersection of the camera optical axis and the image plane, in units of pixels.
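For completeness, a hedged sketch of this intrinsic-calibration step, here using OpenCV's standard checkerboard routine; the board geometry, square size, and names are assumptions and not the specific camera calibration algorithm of the original.

```python
import numpy as np
import cv2

def calibrate_intrinsics(images, board=(7, 10), square_m=0.02):
    """Return the intrinsic matrix K (fx, fy, cx, cy) and distortion coefficients."""
    # Planar object points of the checkerboard inner corners, z = 0.
    objp = np.zeros((board[0] * board[1], 3), np.float32)
    objp[:, :2] = np.mgrid[0:board[0], 0:board[1]].T.reshape(-1, 2) * square_m
    obj_pts, img_pts, size = [], [], None
    for img in images:
        gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
        size = gray.shape[::-1]
        found, corners = cv2.findChessboardCorners(gray, board)
        if found:
            obj_pts.append(objp)
            img_pts.append(corners)
    _rms, K, dist, _rvecs, _tvecs = cv2.calibrateCamera(obj_pts, img_pts, size, None, None)
    return K, dist
```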
As described above, the unit normal vector $n_c$ of the target plane in the camera coordinate system and the distance $d_c$ from the origin of the camera coordinate system to the plane, as well as the unit normal vector $n_l$ of the target plane in the Lidar coordinate system and the distance $d_l$ from the origin of the Lidar coordinate system to the plane, have been derived. The relative positional relationship between the RGB camera and the Lidar to be calibrated is represented by a transformation matrix $T$ as follows:

$T = \begin{pmatrix} R_{CL} & t_{CL} \\ 0 & 1 \end{pmatrix}, \quad R_{CL} = \begin{pmatrix} r_{00} & r_{01} & r_{02} \\ r_{10} & r_{11} & r_{12} \\ r_{20} & r_{21} & r_{22} \end{pmatrix}, \quad t_{CL} = \begin{pmatrix} t_0 \\ t_1 \\ t_2 \end{pmatrix}$

where $R_{CL}$ is the rotation matrix of the transformation from the camera coordinate system to the Lidar coordinate system, whose elements $r_{ij}$ ($0 \le i, j \le 2$) form 3 rows and 3 columns; and $t_{CL}$ is the translation vector of the transformation from the camera coordinate system to the Lidar coordinate system, whose elements $t_i$ ($0 \le i \le 2$) form 3 rows and 1 column.
According to the rotation principle of a plane in three-dimensional space, the following correspondence can be obtained:

$n_l = R_{CL} \, n_c$

The plane normal vectors are written as $n_l = (l_1, l_2, l_3)^T$ and $n_c = (c_1, c_2, c_3)^T$, where, in a three-dimensional coordinate system, a plane normal vector is a three-dimensional vector with three parameters; $l_1, l_2, l_3$ are the three parameters of the plane normal vector in the Lidar coordinate system, and $c_1, c_2, c_3$ are the three parameters of the plane normal vector in the camera coordinate system, so that:

$\begin{pmatrix} l_1 \\ l_2 \\ l_3 \end{pmatrix} = \begin{pmatrix} r_{00} & r_{01} & r_{02} \\ r_{10} & r_{11} & r_{12} \\ r_{20} & r_{21} & r_{22} \end{pmatrix} \begin{pmatrix} c_1 \\ c_2 \\ c_3 \end{pmatrix}$

The above describes the constraint relation, in the camera coordinate system and the Lidar coordinate system, of the normal vector of one target plane observed by one set of sensors (camera + Lidar).
When a plurality of vehicle body features are respectively arranged on N (N being an integer greater than 1) non-coplanar vehicle body planes, one constraint relation can be obtained for each vehicle body plane by the method above, giving N sets of constraint relations. Alternatively, in the case that the relative position between the two sensors and the specified feature of the target device is variable, the target device uses the two sensors to acquire feature data of one specified feature N times in succession during movement, obtaining N sets of specified feature data, where each set of specified feature data is acquired by the two sensors at the same time; processing each set of specified feature data by the method above likewise yields N sets of constraint relations. In both cases, N sets of such constraint equations are obtained:

$n_l^{(i)} = R_{CL} \, n_c^{(i)}, \quad i = 1, 2, \ldots, N$

where the unit normal vectors are stacked column-wise into the $3 \times N$ matrices

$N_c = \left( n_c^{(1)}, n_c^{(2)}, \ldots, n_c^{(N)} \right), \quad N_l = \left( n_l^{(1)}, n_l^{(2)}, \ldots, n_l^{(N)} \right)$

Further, the following objective function is obtained:

$R_{CL}^{*} = \arg\min_{R} \left\| N_l - R \, N_c \right\|_F^2$

$R_{CL}$ is an orthogonal matrix and satisfies the following orthogonal matrix properties:

$R^T R = I_3, \quad \det(R) = 1$

From the above orthogonal matrix properties, since $\| R N_c \|_F = \| N_c \|_F$ for orthogonal $R$, minimizing the residual is equivalent to maximizing the trace term, and the equivalent objective function can be obtained as follows:

$R_{CL}^{*} = \arg\max_{R} \, \mathrm{tr}\!\left( R \, N_c N_l^T \right)$
based on the method, the rotation matrix R _ CL can be obtained according to the original Procrustes recipe solving algorithm.
In addition, knowing the plane unit normal vector $n_c$ and the distance $d_c$ before transformation, as well as the transformation $R_{CL}$, $t_{CL}$, the distance from the coordinate system origin to the plane after transformation follows from the correspondence between point-to-plane distances (with the plane normals oriented consistently in the two coordinate systems):

$d_l' = d_c + n_l^T t_{CL}$

Theoretically, the following equation should hold:

$d_l' = d_l$
however, because there is an error in the actual measurement process, the theoretically calculated distance and the actually measured distance are not completely equal, so the following objective optimization function can be constructed:
therefore, the translation vector t _ CL in the transformation matrix can be solved by using the Levenberg-Marquard algorithm.
Further, the manner of extracting the specified feature data corresponding to the specified feature of the target device from the feature data involved in the above step S104-11 includes:
in the method (1), under the condition that the relative position between any two sensors and the designated feature of the target device is variable, multiple groups of designated feature data which are acquired by any two sensors in sequence aiming at one vehicle body feature are acquired, wherein each group of designated feature data is acquired by any two sensors at the same time; further, the plurality of sets of specified characteristic data are used to determine a parameter between the two sensors.
In method (2), in the case that the relative positions between the two sensors and the specified features of the target device are fixed, one set of specified feature data acquired by the two sensors is obtained, where this set of specified feature data includes feature data of a plurality of vehicle body features and the plurality of vehicle body features are not coplanar; this set of specified feature data is then used to determine the external parameters between the two sensors.
In the case where the relative position between any two sensors and the designated feature of the target device is variable, the target device needs to change the position, for example, move or rotate, to acquire multiple sets of designated feature data.
It should be noted that, in a specific application scenario, that is, in the entire calibration algorithm, at least five sets of constraint relations of target planes observed by the sensors (camera + Lidar) in the camera coordinate system and the Lidar coordinate system are required to complete the solution of the rotation matrix $R_{CL}$ and the translation vector. Of course, due to the unavoidable sensor measurement errors in the real world, N should be greater than 5 in order to obtain higher joint calibration accuracy; taking calibration efficiency into account, N is generally set to 20.
In addition, it should be noted that, for a scene in which the relative positional relationship between the preset number of sensors and the vehicle body features can be changed, for example when the preset number of sensors is installed on the handlebar of a scooter, the pose of the sensors can easily be changed by rotating the handlebar. In this scenario, the vehicle body features need only include one target plane. However, for a scene in which the relative positional relationship between the preset number of sensors and the vehicle body features is fixed, the vehicle body features must include at least five target planes, and to meet high precision requirements they must include more target plane features.
Therefore, through the mode of the application, the online calibration of multiple sensors can be completed only by utilizing the self characteristics of target equipment (such as a robot or an unmanned vehicle), the method does not depend on external environment information, the online calibration efficiency of the multiple sensors is improved, and meanwhile, the success rate and the timeliness of the completion of the online calibration task of the multiple sensors are ensured.
In an optional embodiment of the present application, the calibrating of the parameters of the preset number of sensors by using the feature data, referred to in step S104 of the present application, includes:

Step S104-21, extracting specified feature data corresponding to specified features of the target device from the feature data, where the specified feature data are acquired by a sensor of a preset type in the preset number of sensors;

The preset type of sensor to which the present application relates may be a camera type sensor.

Step S104-22, performing internal reference calibration of the sensor of the preset type according to the specified feature data.
It should be noted that when the external parameter calibration between any two sensors is completed for the first time, the calibration result of the parameters is stored; likewise, when the internal parameter calibration of any sensor is completed for the first time, the calibration result of the parameters is stored.
In yet another alternative embodiment of the present application, as shown in fig. 2, the method steps of the present application may further comprise:
Step S106, detecting whether the relative positions among the sensors in the preset number of sensors have changed;
in a specific application scenario, the target device may be impacted, or the relative positions of the sensors in the preset number of sensors may change due to human factors, or the like.
It should be noted that: the target equipment can periodically self-check whether the relative positions among the sensors in the preset number of sensors change, and the detection period can be set by developers or can be customized by users; when the target device receives the calibration check instruction, it may perform detection on whether the relative positions between the sensors in the preset number of sensors change, where the calibration check instruction may be a signal generated when a detection button on the target device is operated, or may also be a calibration check instruction sent by a mobile terminal, such as a mobile phone, a tablet computer, and the like, and is not specifically limited in this application.
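As an illustration only, the two trigger paths just described (a periodic self-check and an externally issued calibration check instruction) could be scheduled as in the following sketch; the `device` interface and the period value are entirely hypothetical.

```python
import time

CHECK_PERIOD_S = 3600.0                   # detection period; developer- or user-set

def self_check_loop(device):
    last = time.monotonic()
    while device.running:
        instructed = device.poll_check_instruction()   # button press or mobile terminal
        due = time.monotonic() - last >= CHECK_PERIOD_S
        if instructed or due:
            last = time.monotonic()
            if device.relative_positions_changed():    # the detection of step S106
                device.recalibrate()                   # re-run steps S102 and S104
        time.sleep(0.1)
```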
Step S108, after detecting that the relative positions of the sensors in the preset number of sensors have changed, triggering the execution of step S102 and step S104 again.
The stored calibration result of the parameters is updated by executing step S102 and step S104 again.
As can be seen from the above steps S106 to S108, after the relative positions of the sensors in the preset number of sensors change, since the previous calibration result is inaccurate, calibration needs to be performed again, and thus the parameter calibration of the sensors is triggered again.
The manner of detecting whether the relative positions between the sensors in the preset number of sensors change, referred to in step S106, may further include:

Step S106-11, acquiring specified feature data of specified features of the target device acquired by any two sensors in the preset number of sensors, and acquiring the parameter calibration results of the two sensors;

Step S106-12, determining a spatial representation of the specified feature, or of the plane in which the specified feature is located, in each of the coordinate systems of the two sensors;

Step S106-13, performing a calibration check on the parameter calibration results of the two sensors according to the spatial representations.
For the above steps S106-11 to S106-13, in a specific application scenario, take the calibration check of the external reference between the RGB camera and the Lidar as an example: after a spatial representation A (the first spatial representation) of the vehicle body feature in the camera coordinate system and a spatial representation B (the second spatial representation) in the Lidar coordinate system are respectively acquired, the spatial representation A in the camera coordinate system is converted into a spatial representation C (the third spatial representation) in the Lidar coordinate system on the basis of the calibration result of the external reference between the RGB camera and the Lidar.
The method for performing calibration check on the parameter calibration results of any two sensors according to the spatial representation in step S106-13 of the present application may further include:
Step S1, determining a distance between the second spatial representation and the third spatial representation;

Step S2, determining that the calibration results of the parameters of the two sensors are correct in the case that the distance is less than or equal to a preset threshold, and reporting a message indicating that the calibration result is correct in this case;

Step S3, determining that the calibration results of the parameters of the two sensors are incorrect in the case that the distance is greater than the preset threshold.
When the calibration result is incorrect, a message indicating that the calibration result is incorrect is reported, and/or recalibration of the parameters of the preset number of sensors is triggered, and/or recalibration of the parameters of the two sensors involved in step S106-13 is triggered. For the manner of recalibrating the parameters, reference may be made to the calibration method for sensor parameters described above, and details are not repeated here.
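To make the check in steps S1 to S3 concrete, the following sketch transforms the camera-frame plane into the Lidar frame with the stored extrinsics and thresholds the origin-to-plane distance, adding a normal-angle criterion as an extra illustrative check; the tolerance values and names are assumptions.

```python
import numpy as np

def extrinsics_still_valid(n_c, d_c, n_l, d_l, R_CL, t_CL,
                           dist_tol=0.02, angle_tol_deg=2.0):
    """Compare the predicted (third) spatial representation with the measured (second) one."""
    n_pred = R_CL @ n_c                   # plane normal mapped into the Lidar frame
    d_pred = d_c + n_pred @ t_CL          # predicted origin-to-plane distance
    angle = np.degrees(np.arccos(np.clip(n_pred @ n_l, -1.0, 1.0)))
    return abs(d_pred - d_l) <= dist_tol and angle <= angle_tol_deg
```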
If the calibration result is reported as success, the calibration result is correct and multi-sensor recalibration is not needed; if the calibration result is reported as failure, the calibration result is wrong, and the navigation task needs to be terminated immediately and self-safety protection carried out, where the self-safety protection includes but is not limited to moving to a pedestrian path, moving to an unmanned area, and the like. That is to say, during the movement of the target device (mobile robot/unmanned vehicle), the calibration check instruction can be responded to quickly to complete the multi-sensor online calibration check task. Once the calibration result of the multiple sensors is found to be inaccurate, the navigation task can be stopped, and a recalibration instruction is issued after the device moves to a safe position. After recalibration, the relative positional relationship of the multiple sensors on the vehicle body is updated in time, giving the mobile target device (robot/unmanned vehicle) the ability to accurately sense the environment in a timely manner, which saves a large amount of manpower, material and financial resources while improving the operational stability and safety of the mobile target device (robot/unmanned vehicle).
Through the above description of the embodiments, those skilled in the art can clearly understand that the method according to the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but the former is a better implementation mode in many cases. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (e.g., ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal device (e.g., a mobile phone, a computer, a server, or a network device) to execute the method according to the embodiments of the present invention.
Example 2
In this embodiment, a calibration apparatus for sensor parameters is further provided. The apparatus is used to implement the foregoing embodiments and preferred implementations, and what has already been described is not repeated. As used below, the term "module" may be a combination of software and/or hardware that implements a predetermined function. Although the apparatus described in the embodiments below is preferably implemented in software, an implementation in hardware, or a combination of software and hardware, is also possible and contemplated.
Fig. 3 is a block diagram of a calibration apparatus for sensor parameters according to an embodiment of the present invention. As shown in fig. 3, the apparatus includes: an acquisition module 302, configured to acquire feature data of a target device through a preset number of sensors, where the preset number of sensors are disposed on the target device; and a calibration module 304, coupled to the acquisition module 302 and configured to perform parameter calibration on the preset number of sensors by using the feature data.
Optionally, the calibration module 304 in the present application may further include: a first extraction unit, configured to extract specified feature data corresponding to specified features of the target device from the feature data, where the specified feature data are acquired by any two sensors in the preset number of sensors; a first determining unit, configured to determine, according to the specified feature data, description information of the specified features, or of the planes in which the specified features are located, in the coordinate system corresponding to each of the two sensors; and a first calibration unit, configured to calibrate the parameters of the two sensors by processing the description information through a preset algorithm.
Optionally, the first extraction unit includes: a first acquisition subunit, configured to acquire, in the case that the relative positions between any two sensors and the specified feature of the target device are variable, multiple sets of specified feature data acquired by the two sensors in succession, where each set of specified feature data is acquired by the two sensors at the same time; and a second acquisition subunit, configured to acquire, in the case that the relative positions between any two sensors and the specified features of the target device are fixed, one set of specified feature data acquired by the two sensors, where the set includes feature data of a plurality of non-coplanar vehicle body features.
Optionally, the calibration module 304 in this application further includes: a second extraction unit, configured to extract specified feature data corresponding to specified features of the target device from the feature data, where the specified feature data are acquired by a sensor of a preset type in the preset number of sensors; and a second calibration unit, configured to perform internal reference calibration on the sensor of the preset type according to the specified feature data.
Fig. 4 is a first structural block diagram of an alternative calibration apparatus for sensor parameters according to an embodiment of the present invention. As shown in fig. 4, the apparatus may further include: a detecting module 402, configured to detect whether the relative positions between the sensors in the preset number of sensors change; and a triggering module 404, coupled to the detecting module 402 and configured to, after detecting that the relative positions of the sensors in the preset number of sensors have changed, trigger again the steps of acquiring feature data of the target device by the preset number of sensors and calibrating the parameters of the preset number of sensors by using the feature data.
Optionally, the detecting module 402 in this application may further include: an acquisition unit, configured to acquire specified feature data of specified features of the target device acquired by any two sensors in the preset number of sensors, and to acquire the parameter calibration results of the two sensors; a second determination unit, configured to determine a spatial representation of the specified feature, or of the plane in which the specified feature is located, in each of the coordinate systems of the two sensors; and a checking unit, configured to perform a calibration check on the parameter calibration results of the two sensors according to the spatial representations.
Example 3
Embodiments of the present invention also provide a storage medium having a computer program stored therein, wherein the computer program is arranged to perform the steps of any of the above method embodiments when executed.
Alternatively, in the present embodiment, the storage medium may be configured to store a computer program for executing the following steps:

S1, acquiring feature data of the target device by a preset number of sensors, where the preset number of sensors are disposed on the target device;

S2, calibrating the parameters of the preset number of sensors by using the feature data.
Optionally, in this embodiment, the storage medium may include, but is not limited to: various media capable of storing computer programs, such as a usb disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic disk, or an optical disk.
Embodiments of the present invention also provide an electronic device comprising a memory having a computer program stored therein and a processor arranged to run the computer program to perform the steps of any of the above method embodiments.
Optionally, the electronic apparatus may further include a transmission device and an input/output device, wherein the transmission device is connected to the processor, and the input/output device is connected to the processor.
Optionally, in this embodiment, the processor may be configured to execute the following steps through a computer program:

S1, acquiring feature data of the target device by a preset number of sensors, where the preset number of sensors are disposed on the target device;

S2, calibrating the parameters of the preset number of sensors by using the feature data.
Optionally, the specific examples in this embodiment may refer to the examples described in the above embodiments and optional implementation manners, and this embodiment is not described herein again.
It will be apparent to those skilled in the art that the modules or steps of the present invention described above may be implemented by a general purpose computing device, and they may be centralized on a single computing device or distributed across a network of multiple computing devices. Alternatively, they may be implemented by program code executable by a computing device, such that they may be stored in a storage device and executed by a computing device; in some cases, the steps shown or described may be performed in an order different from that described herein. They may also be separately fabricated into individual integrated circuit modules, or multiple ones of them may be fabricated into a single integrated circuit module. Thus, the present invention is not limited to any specific combination of hardware and software.
The above description is only a preferred embodiment of the present invention and is not intended to limit the present invention, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, or improvement made within the principle of the present invention should be included in the protection scope of the present invention.