Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. Example embodiments may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of example embodiments to those skilled in the art. The same reference numerals denote the same or similar parts in the drawings, and thus, a repetitive description thereof will be omitted.
The described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to provide a thorough understanding of embodiments of the invention. One skilled in the relevant art will recognize, however, that the invention may be practiced without one or more of the specific details, or with other methods, components, devices, steps, and so forth. In other instances, well-known methods, devices, implementations or operations have not been shown or described in detail to avoid obscuring aspects of the invention.
The drawings are merely schematic illustrations of the present invention, in which the same reference numerals denote the same or similar parts, and thus, a repetitive description thereof will be omitted. Some of the block diagrams shown in the figures do not necessarily correspond to physically or logically separate entities. These functional entities may be implemented in the form of software, or in one or more hardware modules or integrated circuits, or in different networks and/or processor devices and/or microcontroller devices.
The flow charts shown in the drawings are merely illustrative and do not necessarily include all of the contents and steps, nor do they necessarily have to be performed in the order described. For example, some steps may be decomposed, and some steps may be combined or partially combined, so that the actual execution sequence may be changed according to the actual situation.
The following detailed description of exemplary embodiments of the invention refers to the accompanying drawings.
Fig. 1 is a system block diagram illustrating a three-dimensional point cloud registration method and apparatus according to an exemplary embodiment.
The server 105 may be a server providing various services, such as a background management server (by way of example only) providing support for a three-dimensional point cloud registration system operated by a user with the terminal devices 101, 102, and 103. The background management server may analyze and otherwise process received data such as a three-dimensional point cloud registration request, and feed back a processing result (e.g., a transformation matrix) to the terminal device.
The server 105 may, for example, obtain a point cloud to be registered and a reference point cloud. The server 105 may, for example, determine a first projection image of the point cloud to be registered on a two-dimensional plane, where the first projection image includes a plurality of first pixel points and each first pixel point corresponds to at least one point cloud to be registered. The server 105 may, for example, determine a second projection image of the reference point cloud on the two-dimensional plane, where the second projection image includes a plurality of second pixel points and each second pixel point corresponds to at least one reference point cloud. The server 105 may, for example, determine matched pixel point pairs from the first pixel points of the first projection image and the second pixel points of the second projection image, each matched pixel point pair including a first pixel point and a second pixel point. The server 105 may determine a transformation matrix according to the at least one point cloud to be registered corresponding to the first pixel point in a matched pixel point pair and the at least one reference point cloud corresponding to the second pixel point in the pair, for example, so as to perform a point cloud registration operation on the point cloud to be registered according to the transformation matrix.
The server 105 may be a single physical server, or may be composed of a plurality of servers. For example, a part of the server 105 may be used as the three-dimensional point cloud registration task submitting system in the present disclosure, for obtaining a task to be executed carrying a three-dimensional point cloud registration command; and another part of the server 105 may be used, for example, as the three-dimensional point cloud registration system in the present disclosure, for obtaining a point cloud to be registered and a reference point cloud; determining a first projection image of the point cloud to be registered on a two-dimensional plane, wherein the first projection image comprises a plurality of first pixel points, and each first pixel point corresponds to at least one point cloud to be registered; determining a second projection image of the reference point cloud on the two-dimensional plane, wherein the second projection image comprises a plurality of second pixel points, and each second pixel point corresponds to at least one reference point cloud; determining matched pixel point pairs according to the first pixel points of the first projection image and the second pixel points of the second projection image, wherein each matched pixel point pair comprises a first pixel point and a second pixel point; and determining a transformation matrix according to the at least one point cloud to be registered corresponding to the first pixel point in a matched pixel point pair and the at least one reference point cloud corresponding to the second pixel point in the pair, so as to perform a point cloud registration operation on the point cloud to be registered according to the transformation matrix.
Fig. 2 is a flow chart illustrating a three-dimensional point cloud registration method according to an exemplary embodiment. The three-dimensional point cloud registration method provided by the embodiments of the present disclosure may be executed by any electronic device with computing processing capability, for example, the terminal devices 101, 102, and 103 and/or the server 105. In the following embodiments, execution by the server is taken as an example for illustration, but the present disclosure is not limited thereto. The three-dimensional point cloud registration method 20 provided by the embodiment of the present disclosure may include steps S202 to S210.
As shown in fig. 2, in step S202, a point cloud to be registered and a reference point cloud are acquired.
In the embodiment of the present disclosure, the point cloud to be registered and the reference point cloud may be local point cloud data acquired from different viewing angles and/or different periods and/or different devices. For example, a point cloud to be registered and a reference point cloud for a wide range of indoor and outdoor scenes may be acquired by an acquisition device. The point cloud to be registered and the reference point cloud may respectively include coordinate data and reflection intensity information in a three-dimensional coordinate system.
In step S204, a first projection image of the point cloud to be registered on the two-dimensional plane is determined, where the first projection image includes a plurality of first pixel points, and each first pixel point corresponds to at least one point cloud to be registered.
In the embodiment of the disclosure, the point cloud to be registered may have three-dimensional coordinate information, and the point cloud to be registered may be projected onto a two-dimensional plane according to that three-dimensional coordinate information, so as to form a plurality of first pixel points. Each point cloud to be registered can be projected onto one first pixel point, and each first pixel point can correspond to at least one point cloud to be registered. To facilitate implementation of the method, the plane in which two of the dimensions of the three-dimensional coordinate information lie can be determined as the two-dimensional plane. For example, the coordinate information of the point cloud to be registered includes three dimensions X, Y, and Z, where the Z coordinate direction tends to be consistent with the gravity direction, and the X and Y coordinate directions are perpendicular to the gravity direction. The plane in which the X and Y coordinate dimensions lie can then be determined as the two-dimensional plane (hereinafter the XY plane), and the point cloud to be registered is projected onto the XY plane by orthographic projection, thereby reducing the dimensionality of the point cloud to be registered.
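As a non-limiting sketch of the dimension-reduction step described above (assuming, as in the example, that the Z axis is aligned with gravity), an orthographic projection onto the XY plane simply discards the Z coordinate:

```python
import numpy as np

# Illustrative points of a hypothetical point cloud to be registered,
# given as rows of (X, Y, Z) coordinates.
points = np.array([
    [1.0, 2.0, 5.0],
    [3.0, 4.0, 1.0],
])

# Orthographic projection onto the XY plane: keep X and Y, discard Z.
xy_projection = points[:, :2]
```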
In step S206, a second projection image of the reference point cloud on the two-dimensional plane is determined, where the second projection image includes a plurality of second pixel points, and each second pixel point corresponds to at least one reference point cloud.
In the embodiment of the present disclosure, a similar generation manner to the first projection image may be adopted for determining the second projection image according to the reference point cloud, and details are not repeated here.
In step S208, matching pixel point pairs are determined according to a first pixel point of the first projection image and a second pixel point of the second projection image, where each matching pixel point pair includes a first pixel point and a second pixel point.
In the embodiment of the present disclosure, two-dimensional image feature extraction and feature matching are performed on the pixel points (the first pixel points and the second pixel points) in the projection images (the first projection image and the second projection image) to obtain successfully matched pixel point pairs.
In step S210, a transformation matrix is determined according to at least one point cloud to be registered corresponding to a first pixel point in the matching pixel points and at least one reference point cloud corresponding to a second pixel point in the matching pixel points, so as to perform point cloud registration on the point cloud to be registered according to the transformation matrix.
In the embodiment of the disclosure, for the first pixel point and the second pixel point in a matching pixel point pair, a target point cloud to be registered and a target reference point cloud may be determined respectively, and the target point cloud to be registered and the target reference point cloud may be taken as a matching point cloud pair, so as to determine the transformation matrix using the matching point cloud pair. When the target point cloud to be registered is determined according to the first pixel point, the coordinate information of the at least one point cloud to be registered corresponding to the first pixel point can be used as a basis; likewise, when the target reference point cloud is determined according to the second pixel point, the coordinate information of the at least one reference point cloud corresponding to the second pixel point can be used as a basis. At present, a point cloud collection device is usually set up so that the Z coordinate direction tends to be consistent with the gravity direction. Based on this prior, taking the determination of the target point cloud to be registered according to a first pixel point in the XY plane as an example, the X and Y coordinates of the center position of the first pixel point in the XY plane can be determined as the X and Y coordinate values of the lifted target point cloud to be registered, and the Z coordinate value of the lowest point (i.e., the point with the minimum Z coordinate value) among the at least one point cloud to be registered corresponding to the first pixel point is determined as the elevation value of the lifted target point cloud to be registered. The transformation matrix is a rotation-translation matrix between the target point cloud to be registered and the target reference point cloud, and aims to transform the target point cloud to be registered into the same coordinate system as the target reference point cloud. The transformed coordinates may be expressed as the following equation:
pt = R·ps + T (1)

where pt and ps are respectively the target reference point cloud and the target point cloud to be registered in the same matching point cloud pair, and R and T are the rotation matrix and the translation matrix, collectively referred to as the transformation matrix as described above.
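Equation (1) can be illustrated with a minimal sketch; the rotation R (here an example 90-degree rotation about the Z axis) and the translation T are placeholder values, not values produced by the method:

```python
import numpy as np

theta = np.pi / 2  # example: 90-degree rotation about the Z axis
R = np.array([
    [np.cos(theta), -np.sin(theta), 0.0],
    [np.sin(theta),  np.cos(theta), 0.0],
    [0.0,            0.0,           1.0],
])
T = np.array([1.0, 0.0, 0.0])  # example translation

p_s = np.array([1.0, 0.0, 0.0])  # a point of the target point cloud to be registered
p_t = R @ p_s + T                # equation (1): transform into the reference frame
```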
According to the three-dimensional point cloud registration method provided by the embodiment of the disclosure, the point cloud to be registered is projected as first pixel points on a two-dimensional plane, and a first projection image is generated from the first pixel points; the reference point cloud is projected as second pixel points on the two-dimensional plane, and a second projection image is generated from the second pixel points; and matching pixel point pairs are determined using the first projection image and the second projection image. In this way, the three-dimensional point cloud registration problem can be reduced to a two-dimensional image registration problem, which lowers the algorithm complexity and the memory requirement, greatly improves the registration efficiency, allows the registration algorithm to be extended to large-range scenes, and realizes fast registration of point clouds on the order of hundreds of millions of points without initial values.
Fig. 3 is a flow chart illustrating a three-dimensional point cloud registration method according to an exemplary embodiment. The three-dimensional point cloud registration method 30 provided by the embodiment of the present disclosure may include steps S302 to S306 when determining the first projection image of the point cloud to be registered on the two-dimensional plane.
As shown in fig. 3, in step S302, the two-dimensional plane is divided according to the projection position of the point cloud to be registered on the two-dimensional plane, so as to obtain a plurality of first pixel points.
In the embodiment of the present disclosure, fig. 5 may be taken as an example. When determining the first projection image, each grid containing three-dimensional points in fig. 5 (the pixel points indicated in fig. 5) is a first pixel point; likewise, when determining the second projection image, each pixel point containing three-dimensional points in fig. 5 represents a second pixel point. When obtaining the plurality of first pixel points, the point clouds to be registered can be projected onto the XY plane according to the (X, Y) coordinates of each point cloud to be registered, and then the minimum values Xmin, Ymin and the maximum values Xmax, Ymax of the projection points of the point clouds to be registered on the X axis and the Y axis are counted. After these four boundary values are obtained, with (Xmin, Ymin) as a starting point, the XY plane is divided equally at intervals of s within this range, so as to form a plurality of first pixel points. Each first pixel point can be identified by a row-column number (i, j). For example, the pixel point at the lower left corner in fig. 5 is marked as (0, 0), and the geographic range it covers is (Xmin, Ymin) → (Xmin + s, Ymin + s). The pixel point (i, j) (a first pixel point in this embodiment) in which each three-dimensional point (X, Y, Z) (a point cloud to be registered in this embodiment) is located can be calculated in sequence according to the following formula (2), and the correspondence between each spatial three-dimensional point and the pixel point in which it is located is established, so as to obtain an index relationship table between the three-dimensional points and the pixel points:

i = ⌊(X − Xmin)/s⌋, j = ⌊(Y − Ymin)/s⌋ (2)

where ⌊·⌋ represents rounding down.
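The cell index computation of formula (2) can be sketched as follows (a minimal illustration; the function name and the sample values are hypothetical):

```python
import math

def pixel_index(x, y, x_min, y_min, s):
    """Row-column number (i, j) of the grid cell containing a projected
    point, per formula (2): round (coordinate - minimum) / interval down."""
    i = math.floor((x - x_min) / s)
    j = math.floor((y - y_min) / s)
    return i, j
```

For example, with Xmin = Ymin = 0 and s = 1, a point projected at (2.5, 0.3) falls in cell (2, 0).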
In step S304, a gray value of the first pixel point is determined according to the reflection intensity information of the at least one point cloud to be registered corresponding to the first pixel point.
In the embodiment of the present disclosure, the reflection intensity information of the lowest point cloud to be registered among the at least one point cloud to be registered corresponding to each first pixel point may be determined as the gray value of that first pixel point. When the two-dimensional plane is the XY plane, the lowest point cloud to be registered among the at least one point cloud to be registered corresponding to the first pixel point is the point cloud to be registered with the minimum Z coordinate value.
In step S306, a first projection image is generated according to the gray-scale value of the first pixel point.
In the embodiment of the present disclosure, the gray value of each first pixel point may be used as its gray expression in the first projection image, so as to generate the first projection image. An example of the first projection image may be as shown in fig. 6(b).
In an exemplary embodiment, in performing step S304, the following steps S3041-S3044 may be included:
step S3041, obtaining reflection intensity information of the point cloud to be registered.
Step S3042, determining the minimum value in the reflection intensity information of the point cloud to be registered as the global minimum reflection intensity of the point cloud to be registered, and determining the maximum value in the reflection intensity information of the point cloud to be registered as the global maximum reflection intensity of the point cloud to be registered.
Step S3043, determining the reflection sub-intensity of the first pixel point according to the reflection intensity information of the at least one point cloud to be registered corresponding to the first pixel point.
Step S3044, determining a gray value of the first pixel point according to the global minimum reflection intensity of the point cloud to be registered, the global maximum reflection intensity of the point cloud to be registered, and the reflection sub-intensity of the first pixel point.
The reflection intensity information of each first pixel point is obtained, and then normalization processing can be performed on the reflection intensity information of at least one point cloud to be registered corresponding to the first pixel point according to a formula (3).
v(i,j) = 255·(I(i,j) − Imin)/(Imax − Imin) (3)

where I(i,j) is the reflection intensity information of the first pixel point before normalization, Imax and Imin are the maximum and minimum reflection intensity values over all the point clouds to be registered (i.e., the point clouds to be registered in the whole acquired range), and v(i,j) is the normalized reflection intensity information of the first pixel point.
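The normalization of formula (3) maps a reflection intensity into the 0-255 gray range; a minimal sketch (the function name is hypothetical):

```python
def normalize_intensity(intensity, i_min, i_max):
    """Formula (3): v = 255 * (I - Imin) / (Imax - Imin), scaling a
    reflection intensity into the 0-255 gray-value range."""
    return 255.0 * (intensity - i_min) / (i_max - i_min)
```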
The embodiment shown in fig. 3 of the present disclosure provides a manner of generating the first projection image, and the second projection image may be generated by steps similar to steps S302 to S306 or steps S3041 to S3044 of the embodiment of the present disclosure. An example of the second projection image may be as shown in fig. 6(a). For example, when a second projection image of the reference point cloud on the two-dimensional plane is determined, the two-dimensional plane may be divided according to the projection positions of the reference point cloud on the two-dimensional plane to obtain a plurality of second pixel points; the gray value of each second pixel point is determined according to the reflection intensity information of the at least one reference point cloud corresponding to the second pixel point; and the second projection image is generated according to the gray values of the second pixel points.
For another example, when the gray value of the second pixel point is determined according to the reflection intensity information of at least one reference point cloud corresponding to the second pixel point, the reflection intensity information of the reference point cloud can be obtained; determining the minimum value in the reflection intensity information of the reference point cloud as the global minimum reflection intensity of the reference point cloud, and determining the maximum value in the reflection intensity information of the reference point cloud as the global maximum reflection intensity of the reference point cloud; determining the reflection sub-intensity of the second pixel point according to the reflection intensity information of at least one reference point cloud corresponding to the second pixel point; and determining the gray value of the second pixel point according to the global minimum reflection intensity of the reference point cloud, the global maximum reflection intensity of the reference point cloud and the reflection sub-intensity of the second pixel point.
FIG. 8 is a flowchart illustrating a method of three-dimensional point cloud registration, according to an example embodiment. The three-dimensional point cloud registration method 80 provided in the embodiment of the present disclosure may include steps S802 to S808 when determining a matching pixel pair according to a first pixel point of a first projection image and a second pixel point of a second projection image.
In step S802, feature extraction is performed on the first pixel to obtain a first feature vector of the first pixel.
In step S804, feature extraction is performed on the second pixel point to obtain a second feature vector of the second pixel point.
In step S806, the feature distance between the first feature vector of the first pixel and the second feature vector of each second pixel is determined.
In the embodiment of the disclosure, suppose that, for a first pixel point A1, all the second pixel points in the second projection image are A2, B2, and C2 in sequence; then the feature distance S1 between A1 and A2, the feature distance S2 between A1 and B2, and the feature distance S3 between A1 and C2 can be determined in sequence.
In step S808, the first pixel point and the second pixel point with the minimum feature distance are determined as a matching point pair.
In the embodiment of the present disclosure, in connection with the foregoing example, the first pixel point and the second pixel point corresponding to the minimum feature distance among S1, S2, and S3 may be determined as a matching point pair. For example, when S1 is the minimum of S1, S2, and S3, A1 and A2 may be determined as a matching point pair.
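Steps S806 and S808 amount to a nearest-neighbour search in feature space. A minimal sketch (Euclidean distance is assumed here as the feature distance; the function name is hypothetical):

```python
import numpy as np

def match_by_min_distance(query_vec, candidate_vecs):
    """Return the index of the candidate feature vector with the minimum
    feature distance to query_vec, together with all the distances."""
    dists = np.linalg.norm(candidate_vecs - query_vec, axis=1)
    return int(np.argmin(dists)), dists
```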
In the embodiment of the present disclosure, a Speeded Up Robust Features (SURF) algorithm, which maintains invariance to image scaling, rotation, and affine transformation, may be adopted to perform feature point detection and descriptor computation. SURF is an improvement on the Scale-Invariant Feature Transform (SIFT) algorithm, improving efficiency several-fold over SIFT while preserving matching accuracy and robustness. The whole matching process comprises six parts: Hessian matrix construction, scale space construction, feature point localization, feature point main direction calculation, feature descriptor generation, and KD-tree-based feature vector matching. In an exemplary embodiment, a bidirectional matching strategy can further be adopted to obtain more matching point pairs, thereby improving the matching robustness.
In an exemplary embodiment, the ratio of the minimum value to the second-smallest value among the feature distances may be determined, and if this feature distance ratio is greater than a feature distance ratio threshold, the matching point pair is rejected. In connection with the foregoing example, when S1 < S2 < S3, S1 is the minimum value among the feature distances and S2 is the second-smallest value, so the feature distance ratio is S1/S2. When the feature distance ratio S1/S2 is greater than the feature distance ratio threshold, it is considered that, for the first pixel point A1, both the second pixel points A2 and B2 could plausibly form a matching point pair with A1. In this case, the reliability of directly determining the first pixel point A1 and the second pixel point A2 as a matching point pair is low, and therefore the matching point pair of A1 and A2 can be removed to improve the matching robustness.
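The ratio test described above can be sketched as follows (a minimal illustration; the threshold value 0.8 is a placeholder, not a value prescribed by the disclosure):

```python
def passes_ratio_test(feature_distances, ratio_threshold=0.8):
    """Keep a match only if minimum / second-smallest feature distance does
    not exceed the threshold; a larger ratio means the match is ambiguous."""
    d = sorted(feature_distances)
    return (d[0] / d[1]) <= ratio_threshold
```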
In an exemplary embodiment, a homography transformation between the first projection image and the second projection image may also be determined, and matching point pairs whose error after the homography transformation is greater than an error pixel threshold are eliminated. For example, a random sample consensus (RANSAC) algorithm may be used to establish a homography between the first and second projection images, and a matching point pair whose reprojection error under the homography exceeds the error pixel threshold may be identified as an outlier and filtered out. FIG. 7 is an exemplary diagram illustrating an effect of feature matching and error point filtering according to an example. In this embodiment, the culling is performed using the RANSAC algorithm based on a three-dimensional Euclidean transform. Finally, based on all the inlier matching pairs remaining after the erroneous points are eliminated, centimeter-level large-range accurate point cloud registration can be realized through effective feature matching and a multi-level gross error elimination mechanism.
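The reprojection-error check can be sketched as follows (a simplified illustration: the homography H is assumed already estimated, e.g. by a RANSAC estimator; the function name is hypothetical):

```python
import numpy as np

def filter_by_homography(src_pts, dst_pts, H, err_threshold):
    """Boolean mask over matching point pairs: True where the reprojection
    error of src_pts under homography H, against dst_pts, is within threshold."""
    src_h = np.hstack([src_pts, np.ones((src_pts.shape[0], 1))])  # homogeneous
    proj = (H @ src_h.T).T
    proj = proj[:, :2] / proj[:, 2:3]  # back to Euclidean coordinates
    errs = np.linalg.norm(proj - dst_pts, axis=1)
    return errs <= err_threshold
```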
FIG. 9 is a flowchart illustrating a method of three-dimensional point cloud registration, according to an example embodiment. The three-dimensional point cloud registration method 90 provided in the embodiment of the present disclosure may include steps S902 to S906 when determining a transformation matrix according to at least one point cloud to be registered corresponding to a first pixel point in a matching pixel point and at least one reference point cloud corresponding to a second pixel point in the matching pixel point.
In step S902, the point cloud to be registered having the lowest elevation value in the point clouds to be registered corresponding to the first pixel point in the matching pixel points is determined as the first point cloud.
In step S904, the reference point cloud having the lowest elevation value in the reference point clouds corresponding to the second pixel point in the matching pixel points is determined as the second point cloud.
In step S906, a transformation matrix is determined using the first point cloud and the second point cloud as a matching point cloud pair.
In the embodiment, a Levenberg-Marquardt (LM) algorithm may be adopted to iteratively calculate the 6-degree-of-freedom transformation matrix between the two point clouds in each matching point cloud pair, and the calculated transformation matrix is then applied to all three-dimensional points in the point cloud to be registered to realize three-dimensional registration between the point clouds.
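One possible LM-based sketch (assuming SciPy is available; the rotation is parameterized as a rotation vector, one common 6-degree-of-freedom choice; this is an illustration, not the disclosure's exact implementation):

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def estimate_rigid_lm(src, dst):
    """Iteratively estimate a 6-degree-of-freedom rigid transform
    (rotation vector + translation) mapping src onto dst via Levenberg-Marquardt."""
    def residuals(params):
        R = Rotation.from_rotvec(params[:3]).as_matrix()
        return ((src @ R.T + params[3:]) - dst).ravel()
    sol = least_squares(residuals, np.zeros(6), method="lm")
    return Rotation.from_rotvec(sol.x[:3]).as_matrix(), sol.x[3:]
```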
According to the embodiment of the disclosure, the two-dimensional first pixel point and the two-dimensional second pixel point are respectively promoted to the first point cloud and the second point cloud in the three-dimensional space, and the transformation matrix is determined based on the first point cloud and the second point cloud. The method can convert the matching point pairs obtained by two-dimensional image registration into three-dimensional matching point cloud pairs, further realize the conversion of the three-dimensional point cloud registration problem into the two-dimensional image registration problem, and realize quick registration without initial values.
In an exemplary embodiment, when the transformation matrix is determined according to the at least one point cloud to be registered corresponding to the first pixel point in a matching pixel point pair and the at least one reference point cloud corresponding to the second pixel point in the pair, a matching point cloud pair can further be obtained from the matching point pair according to formula (4):

X(i,j) = Xmin + (i + 0.5)·s, Y(i,j) = Ymin + (j + 0.5)·s (4)

where (i, j) are the coordinates of a pixel point (a first pixel point when the first point cloud is determined, a second pixel point when the second point cloud is determined), and (X(i,j), Y(i,j)) are the corresponding X and Y coordinates in three-dimensional space, i.e., the center of the pixel point's grid cell. The three-dimensional coordinate Z(i,j) may take the elevation value (i.e., the Z coordinate value) of the three-dimensional point with the minimum Z coordinate value among the point clouds to be registered or the reference point clouds corresponding to the pixel point shown in fig. 5.
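A minimal sketch of lifting a pixel point back into three-dimensional space, assuming the cell-center convention for X and Y described earlier and the lowest elevation for Z (the function name and sample values are hypothetical):

```python
def lift_pixel(i, j, x_min, y_min, s, z_lowest):
    """Map pixel (i, j) to a 3D point: the grid cell's center in X and Y,
    with the minimum Z value among the cell's points as the elevation."""
    return (x_min + (i + 0.5) * s, y_min + (j + 0.5) * s, z_lowest)
```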
FIG. 10 is a flow chart illustrating a method of three-dimensional point cloud registration in accordance with an exemplary embodiment. The three-dimensional point cloud registration method 100 provided by the embodiment of the disclosure may include three technical links: point cloud projection 1002, feature extraction and matching 1004, and three-dimensional space transformation 1006. Point cloud projection 1002 mainly performs the mapping from the three-dimensional point clouds (the point cloud to be registered and the reference point cloud) to the two-dimensional images (the first projection image and the second projection image), and the grid index is the correspondence between each grid (i.e., a first pixel point or a second pixel point) and the point cloud to be registered or the reference point cloud. Feature extraction and matching 1004 calculates corresponding (same-name) pixel point pairs based on the mapped two-dimensional images, and registration of the two groups of point clouds is finally completed through three-dimensional mapping and transformation of these same-name pixel point pairs. Fig. 4 shows the effect of registration of the point cloud to be registered and the reference point cloud using a specific example: fig. 4(a) is an effect diagram of the reference point cloud, fig. 4(b) is an effect diagram of the point cloud to be registered, and fig. 4(c) is an effect diagram of the point cloud after registration. According to the three-dimensional point cloud registration method provided by the embodiment of the disclosure, aiming at the registration problem of large-range three-dimensional dense point clouds, the complicated three-dimensional point cloud registration problem is converted into a two-dimensional image registration problem, and a three-dimensional point cloud registration method based on orthographic projection and requiring no initial value is provided.
A large number of experiments prove that the method provided by the invention can effectively solve the problem of large-range (kilometer-level) three-dimensional dense point cloud registration. In particular, compared with the prior art, the method provided by the invention has obvious advantages in registration efficiency and registration accuracy. In terms of efficiency, the three-dimensional point cloud registration problem is converted into a two-dimensional image registration problem, which greatly improves registration efficiency and realizes fast, initial-value-free registration of point clouds on the order of hundreds of millions of points. In terms of registration accuracy, the invention designs an effective feature matching and multi-level gross error rejection mechanism on the basis of the RANSAC algorithm, realizing centimeter-level large-range accurate point cloud registration.
It should be clearly understood that this disclosure describes how to make and use particular examples, but the principles of this disclosure are not limited to any details of these examples. Rather, these principles can be applied to many other embodiments based on the teachings of the present disclosure.
Those skilled in the art will appreciate that all or part of the steps for implementing the above embodiments may be implemented as a computer program executed by a central processing unit (CPU). When executed by the CPU, the program performs the above-described functions defined by the methods provided by the present disclosure. The program may be stored in a computer-readable storage medium, which may be a read-only memory, a magnetic disk, an optical disk, or the like.
Furthermore, it should be noted that the above-mentioned figures are only schematic illustrations of the processes involved in the methods according to exemplary embodiments of the present disclosure, and are not intended to be limiting. It will be readily understood that the processes shown in the above figures are not intended to indicate or limit the chronological order of the processes. In addition, it is also readily understood that these processes may be performed synchronously or asynchronously, e.g., in multiple modules.
The following are embodiments of the disclosed apparatus that may be used to perform embodiments of the disclosed methods. For details not disclosed in the embodiments of the apparatus of the present disclosure, refer to the embodiments of the method of the present disclosure.
FIG. 11 is a block diagram illustrating a three-dimensional point cloud registration apparatus according to an exemplary embodiment. Referring to FIG. 11, a three-dimensional point cloud registration apparatus 1100 provided by an embodiment of the present disclosure may include: a point cloud acquisition module 1102, a first image module 1104, a second image module 1106, a pixel point pair module 1108, and a point cloud registration module 1110.
In the three-dimensional point cloud registration apparatus 1100, the point cloud acquisition module 1102 may be configured to obtain a point cloud to be registered and a reference point cloud.
The first image module 1104 may be configured to determine a first projection image of the point cloud to be registered on the two-dimensional plane, the first projection image including a plurality of first pixel points, each corresponding to at least one point cloud to be registered.
The second image module 1106 may be configured to determine a second projection image of the reference point cloud on the two-dimensional plane, the second projection image including a plurality of second pixel points, each corresponding to at least one reference point cloud.
The pixel point pair module 1108 may be configured to determine matched pixel point pairs from a first pixel point of the first projection image and a second pixel point of the second projection image, each matched pixel point pair including a first pixel point and a second pixel point.
The point cloud registration module 1110 may be configured to determine a transformation matrix according to the at least one point cloud to be registered corresponding to the first pixel point in the matched pixel point pair and the at least one reference point cloud corresponding to the second pixel point in the matched pixel point pair, so as to perform a point cloud registration operation on the point cloud to be registered according to the transformation matrix.
According to the three-dimensional point cloud registration device provided by the embodiment of the disclosure, the point cloud to be registered is projected onto a two-dimensional plane as first pixel points, from which a first projection image is generated; the reference point cloud is likewise projected as second pixel points, from which a second projection image is generated; and matched pixel point pairs are determined using the first and second projection images. The three-dimensional point cloud registration problem is thereby reduced to a two-dimensional image registration problem, which lowers the algorithmic complexity and memory requirements, greatly improves registration efficiency, allows the registration algorithm to scale to large-range scenes, and enables fast, initial-value-free registration of point clouds containing hundreds of millions of points.
In an exemplary embodiment, the first image module may include: the plane division submodule can be configured to divide the two-dimensional plane according to the projection position of the point cloud to be registered on the two-dimensional plane to obtain a plurality of first pixel points; the gray value sub-module can be configured to determine the gray value of the first pixel point according to the reflection intensity information of at least one point cloud to be registered corresponding to the first pixel point; and a first image sub-module configurable to generate a first projected image according to the gray value of the first pixel point.
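As an illustrative sketch (not the claimed implementation), the plane-division step can be pictured as bucketing projected points into grid cells, where each occupied cell becomes a first pixel point and the cell-to-points mapping serves as the grid index. The function name, the cell size, and the `(x, y, z, intensity)` tuple layout are assumptions for illustration:

```python
# Hypothetical sketch: project each 3D point onto the XY plane and bucket it
# into a grid cell; the resulting cell -> points mapping is the grid index
# linking each pixel back to its point cloud points.
from collections import defaultdict

def project_to_grid(points, cell_size):
    """Map each (x, y, z, intensity) point to a 2D grid cell.

    Returns a dict keyed by (col, row) cell index; each occupied cell
    corresponds to one pixel point of the projection image.
    """
    grid = defaultdict(list)
    for x, y, z, intensity in points:
        col = int(x // cell_size)
        row = int(y // cell_size)
        grid[(col, row)].append((x, y, z, intensity))
    return grid

points = [
    (0.2, 0.3, 1.0, 10.0),
    (0.4, 0.1, 1.2, 20.0),
    (1.6, 0.2, 0.9, 30.0),
]
grid = project_to_grid(points, cell_size=1.0)
# cells (0, 0) and (1, 0) are occupied; cell (0, 0) holds two points
```

The same bucketing would be applied to the reference point cloud to obtain the second pixel points.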
In an exemplary embodiment, the gray value sub-module may include: a reflection intensity acquisition unit configured to acquire reflection intensity information of the point cloud to be registered; the global reflection intensity unit can be configured to determine the minimum value in the reflection intensity information of the point cloud to be registered as the global minimum reflection intensity of the point cloud to be registered, and determine the maximum value in the reflection intensity information of the point cloud to be registered as the global maximum reflection intensity of the point cloud to be registered; the reflection sub-intensity unit can be configured to determine the reflection sub-intensity of the first pixel point according to the reflection intensity information of at least one point cloud to be registered corresponding to the first pixel point; and the gray value unit can be configured to determine the gray value of the first pixel point according to the global minimum reflection intensity of the point cloud to be registered, the global maximum reflection intensity of the point cloud to be registered and the reflection sub-intensity of the first pixel point.
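A minimal sketch of the gray-value computation described above, assuming (as one plausible choice, not prescribed by the disclosure) that a pixel's reflection sub-intensity is the mean intensity of its points, stretched to the 8-bit range using the global extremes:

```python
def gray_value(cell_intensities, global_min, global_max):
    """Stretch a pixel's reflection sub-intensity (here: the mean of its
    points' reflection intensities, an illustrative choice) to [0, 255]
    using the global minimum and maximum reflection intensities."""
    sub_intensity = sum(cell_intensities) / len(cell_intensities)
    return round(255 * (sub_intensity - global_min) / (global_max - global_min))

all_intensities = [10.0, 20.0, 30.0, 40.0]            # whole point cloud
g_min, g_max = min(all_intensities), max(all_intensities)  # global extremes
```

With this normalization, a pixel holding only the globally darkest point maps to gray value 0, and one holding only the globally brightest point maps to 255.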
In an exemplary embodiment, the pixel point pair module may include: the first feature vector submodule can be configured to extract features of the first pixel point to obtain a first feature vector of the first pixel point; the second feature vector submodule can be configured to extract features of the second pixel point to obtain a second feature vector of the second pixel point; a feature distance submodule configurable to determine a feature distance between the first feature vector of the first pixel point and the second feature vector of each second pixel point; and the matching point pair submodule can be configured to determine the first pixel point and the second pixel point with the minimum feature distance as a matching point pair.
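The matching step above can be sketched as a brute-force nearest-neighbour search over feature vectors. The descriptor extraction itself is omitted, and Euclidean distance is assumed as the feature distance, both being illustrative choices:

```python
import math

def match_features(first_vectors, second_vectors):
    """For each first-image feature vector, find the second-image vector
    with the smallest Euclidean feature distance (brute-force matching).
    Returns (first_index, second_index, distance) triples."""
    pairs = []
    for i, f in enumerate(first_vectors):
        distances = [math.dist(f, s) for s in second_vectors]
        j = min(range(len(distances)), key=distances.__getitem__)
        pairs.append((i, j, distances[j]))
    return pairs

first = [(0.0, 1.0), (5.0, 5.0)]
second = [(4.0, 4.0), (0.1, 1.1)]
pairs = match_features(first, second)
# first[0] matches second[1]; first[1] matches second[0]
```

In practice a k-d tree or approximate nearest-neighbour index would replace the brute-force loop for large pixel counts.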
In an exemplary embodiment, the three-dimensional point cloud registration apparatus may further include: a feature distance ratio module configured to determine a feature distance ratio of the minimum value to the next-smallest value among the feature distances; and a first matching point pair rejection module that may be configured to reject the matching point pair if the feature distance ratio is greater than a feature distance ratio threshold.
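This first gross-error filter can be sketched as a Lowe-style ratio test; the 0.8 threshold is illustrative, as the disclosure does not prescribe a value:

```python
def passes_ratio_test(feature_distances, ratio_threshold=0.8):
    """Reject a match when the ratio of the smallest feature distance to the
    next-smallest exceeds the threshold, i.e. when the best match is not
    clearly better than the runner-up."""
    smallest, next_smallest = sorted(feature_distances)[:2]
    if next_smallest == 0:
        return True  # degenerate case: both best distances are zero
    return smallest / next_smallest <= ratio_threshold

# an unambiguous best match passes; an ambiguous one is rejected
```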
In an exemplary embodiment, the three-dimensional point cloud registration apparatus may further include: a homographic transformation module configurable to determine a homographic transformation of the first projection image and the second projection image; and the second matching point pair elimination module can be configured to eliminate the matching point pairs with the error larger than the error pixel threshold value after homography transformation.
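The second filter can be sketched as follows, assuming the homography H (a 3x3 matrix, here a nested list) has already been estimated, e.g. robustly via RANSAC; the 2-pixel error threshold is illustrative:

```python
import math

def reject_by_homography(pairs, H, pixel_threshold=2.0):
    """Keep only matched pixel point pairs whose first-image point, mapped
    through the homography H, lands within pixel_threshold of its
    second-image partner; the rest are eliminated as gross errors."""
    kept = []
    for (u1, v1), (u2, v2) in pairs:
        # homogeneous mapping [u, v, w]^T = H [u1, v1, 1]^T, then dehomogenize
        w = H[2][0] * u1 + H[2][1] * v1 + H[2][2]
        u = (H[0][0] * u1 + H[0][1] * v1 + H[0][2]) / w
        v = (H[1][0] * u1 + H[1][1] * v1 + H[1][2]) / w
        if math.hypot(u - u2, v - v2) <= pixel_threshold:
            kept.append(((u1, v1), (u2, v2)))
    return kept

identity = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
matches = [((10.0, 10.0), (10.5, 10.2)),   # ~0.54 px error: kept
           ((5.0, 5.0), (40.0, 40.0))]     # gross error: rejected
inliers = reject_by_homography(matches, identity)
```

The identity homography is a toy stand-in for images that are already aligned; a fitted H would come from the surviving matches.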
In an exemplary embodiment, the point cloud registration module may include: the first point cloud submodule can be configured to determine a point cloud to be registered with the lowest elevation value in point clouds to be registered corresponding to the first pixel point in the matching pixel points as a first point cloud; the second point cloud submodule can be configured to determine the reference point cloud with the lowest elevation value in the reference point clouds corresponding to the second pixel points in the matching pixel points as the second point cloud; a transformation matrix sub-module configurable to determine a transformation matrix using the first point cloud and the second point cloud as a matching point cloud pair.
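Selecting the per-pixel representative points can be sketched as follows; points are assumed to be (x, y, z) tuples with z as the elevation, and the resulting first/second point pairs would then feed the transformation-matrix estimation (e.g., a least-squares rigid fit, not shown here):

```python
def lowest_elevation_point(cell_points):
    """Among all 3D points mapped to one pixel, pick the one with the lowest
    elevation (z); paired across the two images, these representatives form
    the 3D correspondences used to estimate the transformation matrix."""
    return min(cell_points, key=lambda point: point[2])

cell = [(1.0, 2.0, 5.0), (1.1, 2.1, 3.0), (0.9, 1.9, 4.0)]
representative = lowest_elevation_point(cell)
```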
An electronic device 1200 according to this embodiment of the invention is described below with reference to FIG. 12. The electronic device 1200 shown in FIG. 12 is only an example and should not impose any limitation on the functions and scope of use of the embodiments of the present invention.
As shown in FIG. 12, the electronic device 1200 is embodied in the form of a general-purpose computing device. The components of the electronic device 1200 may include, but are not limited to: at least one processing unit 1210, at least one memory unit 1220, and a bus 1230 connecting the various system components (including the memory unit 1220 and the processing unit 1210).
The memory unit stores program code executable by the processing unit 1210, such that the processing unit 1210 performs the steps according to various exemplary embodiments of the present invention described in the above "exemplary methods" section of the present specification. For example, the processing unit 1210 may execute step S202 shown in FIG. 1, acquiring a point cloud to be registered and a reference point cloud; step S204, determining a first projection image of the point cloud to be registered on the two-dimensional plane, wherein the first projection image comprises a plurality of first pixel points, and each first pixel point corresponds to at least one point cloud to be registered; step S206, determining a second projection image of the reference point cloud on the two-dimensional plane, wherein the second projection image comprises a plurality of second pixel points, and each second pixel point corresponds to at least one reference point cloud; step S208, determining matched pixel point pairs according to a first pixel point of the first projection image and a second pixel point of the second projection image, wherein each matched pixel point pair comprises a first pixel point and a second pixel point; and step S210, determining a transformation matrix according to at least one point cloud to be registered corresponding to the first pixel point in a matched pixel point pair and at least one reference point cloud corresponding to the second pixel point in the matched pixel point pair, and performing a point cloud registration operation on the point cloud to be registered according to the transformation matrix.
The storage unit 1220 may include a readable medium in the form of a volatile memory unit, such as a random access memory unit (RAM) 12201 and/or a cache memory unit 12202, and may further include a read-only memory unit (ROM) 12203.
Storage unit 1220 may also include a program/utility 12204 having a set (at least one) of program modules 12205, such program modules 12205 including, but not limited to: an operating system, one or more application programs, other program modules, and program data, each of which, or some combination thereof, may comprise an implementation of a network environment.
Bus 1230 may represent one or more of several types of bus structures, including a memory unit bus or memory unit controller, a peripheral bus, an accelerated graphics port, and a processor or local bus using any of a variety of bus architectures.
The electronic device 1200 may also communicate with one or more external devices 1300 (e.g., a keyboard, a pointing device, a Bluetooth device, etc.), with one or more devices that enable a user to interact with the electronic device 1200, and/or with any devices (e.g., a router, a modem, etc.) that enable the electronic device 1200 to communicate with one or more other computing devices. Such communication may occur via input/output (I/O) interfaces 1250. Also, the electronic device 1200 may communicate with one or more networks (e.g., a local area network (LAN), a wide area network (WAN), and/or a public network such as the Internet) via the network adapter 1260. As shown, the network adapter 1260 communicates with the other modules of the electronic device 1200 via the bus 1230. It should be appreciated that although not shown, other hardware and/or software modules may be used in conjunction with the electronic device 1200, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage systems, among others.
Through the above description of the embodiments, those skilled in the art will readily understand that the exemplary embodiments described herein may be implemented by software, or by software in combination with necessary hardware. Therefore, the technical solution according to the embodiments of the present disclosure may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (which may be a CD-ROM, a USB flash disk, a removable hard disk, etc.) or on a network, and includes several instructions to enable a computing device (which may be a personal computer, a server, a terminal device, or a network device, etc.) to execute the method according to the embodiments of the present disclosure.
In an exemplary embodiment of the present disclosure, there is also provided a computer-readable storage medium having stored thereon a program product capable of implementing the above-described method of the present specification. In some possible embodiments, aspects of the invention may also be implemented in the form of a program product comprising program code means for causing a terminal device to carry out the steps according to various exemplary embodiments of the invention described in the above section "exemplary methods" of the present description, when said program product is run on the terminal device.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
A computer readable signal medium may include a propagated data signal with readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A readable signal medium may also be any readable medium that is not a readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object-oriented programming language such as Java or C++, as well as conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server. In the case of a remote computing device, the remote computing device may be connected to the user's computing device through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computing device (e.g., through the Internet using an Internet service provider).
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.