CN109345471A - Method for measuring and drawing high-precision map data based on high-precision track data - Google Patents

Method for measuring and drawing high-precision map data based on high-precision track data

Info

Publication number
CN109345471A
Authority
CN
China
Prior art keywords
image
camera
precision
map datum
precision map
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201811046426.5A
Other languages
Chinese (zh)
Other versions
CN109345471B (en)
Inventor
王涛
鞠伟平
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guizhou Kuanben Zhiyun Technology Co Ltd Beijing Branch
Original Assignee
Guizhou Kuanben Zhiyun Technology Co Ltd Beijing Branch
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guizhou Kuanben Zhiyun Technology Co Ltd Beijing Branch
Priority to CN201811046426.5A
Publication of CN109345471A
Application granted
Publication of CN109345471B
Legal status: Active (current)
Anticipated expiration

Abstract

The invention discloses a method for measuring and drawing high-precision map data based on high-precision track data, comprising the following steps: step S1, acquire image information and GPS information with an image capture device, perform camera calibration on the image information, and obtain the camera parameters and distortion parameters; step S2, according to the GPS information of the image acquisition point and the camera parameters, generate the rotation matrix R of the acquisition-point coordinates in world coordinates; step S3, perform distortion removal on the image information according to the camera parameters and distortion parameters to obtain an undistorted image; step S4, obtain, on the undistorted image, the pixel coordinates corresponding to the geometry mark points; step S5, generate the high-precision map data from the GPS information and the pixel coordinates. The method for measuring and drawing high-precision map data based on high-precision track data does not depend on a laser point cloud: drawing and data extraction can be carried out directly from the high-precision track camera images, without needing a laser point cloud and camera pictures to assist the drawing and data extraction.

Description

Method for measuring and drawing high-precision map data based on high-precision track data
Technical field
The present invention relates to the field of image recognition and processing, and in particular to a method for measuring and drawing high-precision map data based on high-precision track data.
Background art
A laser point cloud is an image generated from data acquired by a lidar. Lidar imaging is affected by factors such as the surrounding environment and the weather, which can introduce large errors, so it cannot satisfy high-accuracy data generation. In addition, most lidars do not detect lane lines and do not identify traffic signs, so important map data such as lane lines and traffic signboards cannot be drawn from a laser point cloud, and laser equipment is expensive. Camera images do not have the problems of laser imaging: a camera image covers all of the three-dimensional map information that the human eye can see, so in manual data production the map can be drawn from the image alone, and the cost of a camera is much lower than that of laser equipment.
Summary of the invention
The purpose of the present invention is to provide a method for measuring and drawing high-precision map data based on high-precision track data, so as to solve the problem that the existing laser point cloud approach is costly.
To achieve the above object, the technical scheme of the present invention is as follows:
A method for measuring and drawing high-precision map data based on high-precision track data, comprising the following steps:
Step S1: acquire image information and the GPS information of the image acquisition point with an image capture device, perform camera calibration on the image information, and obtain the camera parameters and distortion parameters;
Step S2: according to the GPS information of the image acquisition point and the camera parameters, generate the rotation matrix R of the acquisition-point coordinates in world coordinates;
Step S3: perform distortion removal on the image information according to the camera parameters and distortion parameters to obtain an undistorted image;
Step S4: mark the geometry mark points of object contours on the undistorted image, and obtain the pixel coordinates of the geometry mark points on the undistorted image;
Step S5: generate the high-precision map data from the GPS information and the pixel coordinates.
In a further embodiment, the camera calibration is Zhang's calibration method.
In a further embodiment, the method of generating the rotation matrix R of the acquisition-point coordinates in world coordinates in step S2 is to generate, according to the GPS information and the camera parameters, the rotation matrix R of the image-capture-device coordinates in world coordinates (X, Y, Z), with the formula as follows;
wherein X, Y and Z represent the world coordinate axes, and ψ, φ and θ represent the rotation angles about the corresponding axes.
In a further embodiment, the distortion-removal processing is as follows:
A. Convert the pixel coordinate system of the image information to the camera coordinate system through the intrinsic matrix in the camera parameters:
x = (u - u0)/fx;
y = (v - v0)/fy;
B. Apply the distortion model in the camera coordinate system:
r = x^2 + y^2;
x' = x*(1 + k1*r + k2*r^2 + k3*r^3) + 2*p1*x*y + p2*(r + 2*x^2);
y' = y*(1 + k1*r + k2*r^2 + k3*r^3) + 2*p2*x*y + p1*(r + 2*y^2);
C. After the distortion step, transform the camera coordinate system back to the image pixel coordinate system:
x'' = x'*fx + u0;
y'' = y'*fy + v0;
D. Interpolate the pixels of the undistorted image from the pixel values of the image information:
h = x'';
w = y'';
I2(u, v) = ([w+1] - w) * ([h+1] - h) * I1([h], [w])
         + ([w+1] - w) * (h - [h]) * I1([h+1], [w])
         + (w - [w]) * ([h+1] - h) * I1([h], [w+1])
         + (w - [w]) * (h - [h]) * I1([h+1], [w+1])
wherein [·] denotes taking the integer part, I2 is the undistorted image, and I1 is the source image.
In a further embodiment, in step S5 the latitude and longitude coordinates are calculated from the pixel coordinates, the GPS location is matched according to the latitude and longitude coordinates, the height value of the corresponding GPS location is retrieved, and the high-precision map data comprising the latitude and longitude coordinates and the height value is generated.
The present invention has the following advantage:
Compared with existing high-precision map data drawing, the method for measuring and drawing high-precision map data based on high-precision track data does not depend on a laser point cloud: drawing and data extraction can be carried out directly from the high-precision track camera images, without needing a laser point cloud and camera pictures to assist the drawing and data extraction.
Description of the drawings
Fig. 1 is an external structure diagram of the method for measuring and drawing high-precision map data based on high-precision track data according to an embodiment of the present invention.
Detailed description of the embodiments
In order to make the objects, technical solutions, and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention are described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only a part of the embodiments of the present invention, rather than all of them. The components of the embodiments of the present invention, as generally described and illustrated in the accompanying drawings herein, can be arranged and designed in a variety of different configurations. Therefore, the following detailed description of the embodiments of the present invention provided in the accompanying drawings is not intended to limit the scope of the claimed invention, but merely represents selected embodiments of the present invention. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the present invention without creative effort shall fall within the protection scope of the present invention.
The embodiments of the present invention are described in detail below; examples of the embodiments are shown in the accompanying drawings, in which the same or similar reference numerals throughout indicate the same or similar elements or elements having the same or similar functions. The embodiments described below with reference to the accompanying drawings are exemplary, are intended to explain the present invention, and are not to be construed as limiting the present invention.
The present invention is further illustrated below with specific embodiments and with reference to the accompanying drawings.
As shown in Fig. 1, a method for measuring and drawing high-precision map data based on high-precision track data according to an embodiment of the present invention includes the following steps:
Step S1: acquire image information and the GPS information of the image acquisition point with an image capture device, perform camera calibration on the image information, and obtain the camera parameters and distortion parameters;
Step S2: according to the GPS information of the image acquisition point and the camera parameters, generate the rotation matrix R of the acquisition-point coordinates in world coordinates;
Step S3: perform distortion removal on the image information according to the camera parameters and distortion parameters to obtain an undistorted image;
Step S4: mark the geometry mark points of object contours on the undistorted image, and obtain the pixel coordinates of the geometry mark points on the undistorted image;
Step S5: generate the high-precision map data from the GPS information and the pixel coordinates.
The specific principles of the methods in each of the above steps are explained in detail below:
Step S1: acquire image information and the GPS information of the image acquisition point with an image capture device, perform camera calibration on the image information, and obtain the camera parameters and distortion parameters. The image capture device in this embodiment is preferably a monocular camera or a binocular camera, and the camera is an industrial camera. The camera calibration is Zhang's calibration method, which is prior art and is not repeated here.
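For illustration, the calibration of step S1 can be carried out with OpenCV's implementation of Zhang's method; the sketch below is only an assumed setup (the board size, square size, and image folder are hypothetical and are not taken from the patent):

```python
# A minimal sketch of step S1: Zhang's calibration from chessboard images.
# Board geometry and file paths are assumptions for illustration only.
import glob
import cv2
import numpy as np

BOARD = (9, 6)      # inner corners per row/column of the chessboard (assumption)
SQUARE = 0.025      # square edge length in metres (assumption)

# 3D coordinates of the board corners in the board's own plane (Z = 0)
objp = np.zeros((BOARD[0] * BOARD[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:BOARD[0], 0:BOARD[1]].T.reshape(-1, 2) * SQUARE

obj_points, img_points = [], []
for path in glob.glob("calib/*.jpg"):               # hypothetical image folder
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, BOARD)
    if found:
        corners = cv2.cornerSubPix(
            gray, corners, (11, 11), (-1, -1),
            (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
        obj_points.append(objp)
        img_points.append(corners)

# A is the intrinsic (internal reference) matrix, D the distortion coefficients
rms, A, D, rvecs, tvecs = cv2.calibrateCamera(
    obj_points, img_points, gray.shape[::-1], None, None)
print("reprojection RMS:", rms)
print("intrinsic matrix A:\n", A)
print("distortion coefficients D:", D.ravel())
```

Note that OpenCV returns the distortion coefficients in the order (k1, k2, p1, p2, k3), whereas the description below lists them as [k1 k2 k3 p1 p2].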
Step S2: according to the GPS information of the image acquisition point and the camera parameters, generate the rotation matrix R of the acquisition-point coordinates in world coordinates, wherein the method of generating the rotation matrix R is as follows:
According to the GPS information and the camera parameters, generate the rotation matrix R of the image-capture-device coordinates in world coordinates (X, Y, Z), with the formula as follows;
wherein X, Y and Z represent the world coordinate axes, and ψ, φ and θ represent the rotation angles about the corresponding axes.
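The composite expression for R is not reproduced in the text above; as a hedged reconstruction, one common convention (an assumption, since the text does not fix the rotation order) composes elementary rotations about the X, Y and Z axes through the angles ψ, φ and θ:

```latex
% Assumed convention: psi, phi, theta rotate about X, Y, Z, applied in that order.
R = R_z(\theta)\,R_y(\varphi)\,R_x(\psi)
  = \begin{pmatrix} \cos\theta & -\sin\theta & 0 \\ \sin\theta & \cos\theta & 0 \\ 0 & 0 & 1 \end{pmatrix}
    \begin{pmatrix} \cos\varphi & 0 & \sin\varphi \\ 0 & 1 & 0 \\ -\sin\varphi & 0 & \cos\varphi \end{pmatrix}
    \begin{pmatrix} 1 & 0 & 0 \\ 0 & \cos\psi & -\sin\psi \\ 0 & \sin\psi & \cos\psi \end{pmatrix}
```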
Step S3: perform distortion removal on the image information according to the camera parameters and distortion parameters to obtain an undistorted image, wherein the distortion-removal processing is specifically as follows:
The intrinsic (internal reference) matrix and the distortion coefficients of the camera obtained from the camera calibration are, respectively:
Intrinsic matrix A
Distortion coefficients
D = [k1 k2 k3 p1 p2]
  = [-0.3784872335914  0.00411334402276  -0.00079763894420  -0.0018735095178  0]
Then each parameter is as follows:
fx = A(1,1);   % fx and fy are f/dx and f/dy respectively
fy = A(2,2);   %
u0 = A(1,3);   % cx and cy give the optical-centre position, which is related to the image resolution
v0 = A(2,3);   %
k1 = D(1);   % 1st-order radial distortion coefficient
k2 = D(2);   % 2nd-order radial distortion coefficient
k3 = D(3);   % 3rd-order radial distortion coefficient
p1 = D(4);   % 1st-order tangential distortion coefficient
p2 = D(5);   % 2nd-order tangential distortion coefficient
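A minimal sketch of reading these parameters out of the calibration results, assuming A and D are the arrays returned by the cv2.calibrateCamera sketch above (recall that OpenCV orders the coefficients as k1, k2, p1, p2, k3, unlike the [k1 k2 k3 p1 p2] listing above):

```python
import numpy as np

# Unpack intrinsics and distortion coefficients (A, D assumed from the sketch above).
fx, fy = A[0, 0], A[1, 1]                        # focal lengths in pixels (f/dx, f/dy)
u0, v0 = A[0, 2], A[1, 2]                        # optical centre cx, cy
k1, k2, p1, p2, k3 = np.asarray(D).ravel()[:5]   # note the OpenCV ordering
```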
The distortion-removal processing is as follows:
A. Convert the pixel coordinate system of the image information to the camera coordinate system through the intrinsic matrix in the camera parameters:
x = (u - u0)/fx;
y = (v - v0)/fy;
B. Apply the distortion model in the camera coordinate system:
r = x^2 + y^2;
x' = x*(1 + k1*r + k2*r^2 + k3*r^3) + 2*p1*x*y + p2*(r + 2*x^2);
y' = y*(1 + k1*r + k2*r^2 + k3*r^3) + 2*p2*x*y + p1*(r + 2*y^2);
C. After the distortion step, transform the camera coordinate system back to the image pixel coordinate system:
x'' = x'*fx + u0;
y'' = y'*fy + v0;
D. Interpolate the pixels of the undistorted image from the pixel values of the image information:
h = x'';
w = y'';
I2(u, v) = ([w+1] - w) * ([h+1] - h) * I1([h], [w])
         + ([w+1] - w) * (h - [h]) * I1([h+1], [w])
         + (w - [w]) * ([h+1] - h) * I1([h], [w+1])
         + (w - [w]) * (h - [h]) * I1([h+1], [w+1])
wherein [·] denotes taking the integer part, I2 is the undistorted image, and I1 is the source image.
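A minimal sketch of steps A to D above as a per-pixel inverse mapping with bilinear interpolation, assuming a grayscale source image I1 held as a NumPy array and the scalar parameters unpacked earlier; it is a didactic loop that mirrors the formulas rather than an optimised implementation:

```python
import numpy as np

def undistort(I1, fx, fy, u0, v0, k1, k2, k3, p1, p2):
    """Build the undistorted image I2 by sampling the distorted source I1."""
    H, W = I1.shape
    I2 = np.zeros_like(I1, dtype=np.float64)
    for v in range(H):
        for u in range(W):
            # A. pixel coordinates -> normalised camera coordinates
            x = (u - u0) / fx
            y = (v - v0) / fy
            # B. apply the distortion model (here r plays the role of x^2 + y^2)
            r = x * x + y * y
            radial = 1 + k1 * r + k2 * r ** 2 + k3 * r ** 3
            xd = x * radial + 2 * p1 * x * y + p2 * (r + 2 * x * x)
            yd = y * radial + 2 * p2 * x * y + p1 * (r + 2 * y * y)
            # C. back to pixel coordinates in the distorted source image
            us = xd * fx + u0
            vs = yd * fy + v0
            # D. bilinear interpolation of the source pixel value
            j0, i0 = int(np.floor(us)), int(np.floor(vs))
            if 0 <= j0 < W - 1 and 0 <= i0 < H - 1:
                dw, dh = us - j0, vs - i0
                I2[v, u] = ((1 - dw) * (1 - dh) * I1[i0, j0]
                            + (1 - dw) * dh * I1[i0 + 1, j0]
                            + dw * (1 - dh) * I1[i0, j0 + 1]
                            + dw * dh * I1[i0 + 1, j0 + 1])
    return I2
```

In practice the same mapping is usually computed for the whole image at once, for example with cv2.initUndistortRectifyMap followed by cv2.remap, or with cv2.undistort.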
Step S4: mark the geometry mark points of object contours on the undistorted image, and obtain the pixel coordinates of the geometry mark points on the undistorted image. The object contours include indication markings, prohibition markings, warning markings, and text or numeral markings on the ground.
Step S5: generate the high-precision map data from the GPS information and the pixel coordinates. More specifically, in step S5 the latitude and longitude coordinates are calculated from the pixel coordinates, the GPS location is matched according to the latitude and longitude coordinates, the height value of the corresponding GPS location is retrieved, and the high-precision map data comprising the latitude and longitude coordinates and the height value is generated.
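The text does not detail the geometry used in step S5. Purely as one possible realisation (assuming a locally flat road surface, a known camera height above it, and a rotation matrix R that takes camera coordinates into an east-north-up frame, none of which is stated above), a marked pixel can be back-projected to the ground and offset from the acquisition point's GPS position:

```python
# A hedged sketch of one possible pixel -> latitude/longitude computation.
# Flat-ground assumption, camera height h_cam, and ENU alignment of R are
# assumptions of this sketch, not requirements taken from the description.
import math
import numpy as np

def pixel_to_latlon(u, v, A, R, cam_lat, cam_lon, h_cam):
    # Ray through the pixel in camera coordinates, rotated into the ENU frame.
    ray_cam = np.linalg.inv(A) @ np.array([u, v, 1.0])
    ray_enu = R @ ray_cam
    if ray_enu[2] >= 0:                 # ray never reaches the ground plane
        return None
    s = -h_cam / ray_enu[2]             # scale so the ray hits the ground
    east, north = s * ray_enu[0], s * ray_enu[1]
    # Small-offset approximation from metres to degrees around the camera position.
    lat = cam_lat + math.degrees(north / 6378137.0)
    lon = cam_lon + math.degrees(east / (6378137.0 * math.cos(math.radians(cam_lat))))
    return lat, lon
```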
Finally, it should be noted that the above embodiments are merely intended to illustrate the technical solution of the present invention rather than to limit it. Although the present invention has been described in detail with reference to the foregoing embodiments, those skilled in the art should understand that modifications can still be made to the technical solutions described in the foregoing embodiments, or equivalent replacements can be made to some or all of the technical features; and such modifications or replacements do not cause the essence of the corresponding technical solutions to depart from the scope of the technical solutions of the embodiments of the present invention.

Claims (5)

CN201811046426.5A | Priority/Filing date: 2018-09-07 | Method for measuring and drawing high-precision map data based on high-precision track data | Active | Granted as CN109345471B (en)

Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
CN201811046426.5A | 2018-09-07 | 2018-09-07 | Method for measuring and drawing high-precision map data based on high-precision track data


Publications (2)

Publication Number | Publication Date
CN109345471A | 2019-02-15
CN109345471B (en) | 2022-06-24

Family

ID=65305057

Family Applications (1)

Application Number | Status | Priority Date | Filing Date
CN201811046426.5A (granted as CN109345471B (en)) | Active | 2018-09-07 | 2018-09-07

Country Status (1)

Country | Link
CN | CN109345471B (en)


Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
CN102435188A * | 2011-09-15 | 2012-05-02 | 南京航空航天大学 | A monocular vision/inertial fully autonomous navigation method for indoor environments
CN105981074A * | 2014-11-04 | 2016-09-28 | 深圳市大疆创新科技有限公司 | Camera calibration
CN105181109A * | 2015-08-26 | 2015-12-23 | 华北电力大学(保定) | Binocular measurement method for the ice-shedding jump trajectory of transmission lines
CN106500669A * | 2016-09-22 | 2017-03-15 | 浙江工业大学 | An aerial image correction method based on quad-rotor IMU parameters
CN108053386A * | 2017-11-27 | 2018-05-18 | 北京理工大学 | Method and device for image fusion

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
CN110174115A * | 2019-06-05 | 2019-08-27 | 武汉中海庭数据技术有限公司 | Method and device for automatically generating a high-accuracy positioning map based on perception data
CN112862895A * | 2019-11-27 | 2021-05-28 | 杭州海康威视数字技术股份有限公司 | Fisheye camera calibration method, device and system
CN112862895B * | 2019-11-27 | 2023-10-10 | 杭州海康威视数字技术股份有限公司 | Fisheye camera calibration method, device and system
CN113008247A * | 2020-03-24 | 2021-06-22 | 青岛慧拓智能机器有限公司 | High-precision map construction method and device for mining areas
CN113008247B * | 2020-03-24 | 2022-10-28 | 青岛慧拓智能机器有限公司 | High-precision map construction method and device for mining areas
CN112767498A * | 2021-02-03 | 2021-05-07 | 苏州挚途科技有限公司 | Camera calibration method and device and electronic equipment
CN113345032A * | 2021-07-07 | 2021-09-03 | 北京易航远智科技有限公司 | Initial map construction method and system based on large-distortion wide-angle camera images
CN113345032B * | 2021-07-07 | 2023-09-15 | 北京易航远智科技有限公司 | Initial map construction method and system based on large-distortion wide-angle camera images
CN113256540A * | 2021-07-14 | 2021-08-13 | 智道网联科技(北京)有限公司 | Image distortion removal method and apparatus, electronic device, and computer-readable storage medium
CN113256540B * | 2021-07-14 | 2021-11-19 | 智道网联科技(北京)有限公司 | Image distortion removal method and apparatus, electronic device, and computer-readable storage medium
CN113808083A * | 2021-08-24 | 2021-12-17 | 国网安徽省电力有限公司检修分公司 | Method and system for detecting the opening and closing speed of a high-voltage circuit breaker based on image recognition

Also Published As

Publication number | Publication date
CN109345471B (en) | 2022-06-24

Similar Documents

Publication | Title
CN109345471A (en) | Method for measuring and drawing high-precision map data based on high-precision track data
JP6891954B2 (en) | Object detection device, object detection method, and program
CN110033407B (en) | Shield tunnel surface image calibration method, splicing method and splicing system
WO2019105044A1 (en) | Method and system for lens distortion correction and feature extraction
CN109961485A (en) | A method for target localization based on monocular vision
CN112686877A (en) | Binocular camera-based three-dimensional house damage model construction and measurement method and system
CN110969663A (en) | Static calibration method for external parameters of camera
CN106960591B (en) | A high-precision vehicle positioning device and method based on road surface fingerprints
CN111932627B (en) | Marker drawing method and system
CN113313659A (en) | High-precision image splicing method under multi-machine cooperative constraint
CN105118086A (en) | 3D point cloud data registering method and system in 3D-AOI device
CN109741241A (en) | Fisheye image processing method, apparatus, device and storage medium
JP5311465B2 (en) | Stereo matching processing system, stereo matching processing method, and program
CN111191596A (en) | A closed-area mapping method, device and storage medium
CN109146791B (en) | Tunnel spread map generation method based on area array CCD imaging
CN113808269A (en) | Map generation method, positioning method, system and computer-readable storage medium
CN102096920A (en) | Target image-based sub-pixel registration method
CN115063477B (en) | Infrared and visible light dual-channel synchronous imaging real-time registration fusion acquisition method and device
CN111860084B (en) | Image feature matching and positioning method and device and positioning system
CN108335333A (en) | A linear camera calibration method
CN113989428B (en) | A global three-dimensional reconstruction method and device for a metallurgical storage area based on depth vision
CN110197104B (en) | Distance measurement method and device based on vehicle
CN103260008A (en) | Projection conversion method from image position to actual position
CN111998834B (en) | Crack monitoring method and system
CN103925919A (en) | Fisheye camera based planetary rover detection point positioning method

Legal Events

Code | Title
PB01 | Publication
SE01 | Entry into force of request for substantive examination
GR01 | Patent grant
