CN107945104A - A kind of method for panoramic imaging based on space virtual reality camera - Google Patents

A kind of method for panoramic imaging based on space virtual reality camera

Info

Publication number
CN107945104A
Authority
CN
China
Prior art keywords
space
image
fisheye
scanning
effective
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201711143093.3A
Other languages
Chinese (zh)
Inventor
饶鹏
陈忻
韩冰
周小康
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
CHANGZHOU INSTITUTE OF OPTOELECTRONIC TECHNOLOGY
Original Assignee
CHANGZHOU INSTITUTE OF OPTOELECTRONIC TECHNOLOGY
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by CHANGZHOU INSTITUTE OF OPTOELECTRONIC TECHNOLOGY
Priority to CN201711143093.3A
Publication of CN107945104A
Legal status: Pending

Abstract

The invention discloses a panoramic imaging method based on a space virtual reality camera. Step 1: build a space virtual reality camera omnidirectional imaging system, covering a 360-degree viewing angle with four large-field fisheye lenses on a satellite, and acquire large-field space fisheye images. Step 2: extract the effective circular region of the large-field space fisheye image. Step 3: correct the effective circular region of the large-field space fisheye image. Step 4: apply back-mapping distortion correction based on an equidistant-projection geometric correction model to the effective circular region obtained in step 3, obtaining a space image that accords with human visual habits. The invention establishes an effective imaging method for the complex space environment, saves payload resources, and finally realizes high-resolution, wide-range, omnidirectional, immersive panoramic imaging of space.

Description

Panoramic imaging method based on space virtual reality camera
Technical Field
The invention relates to a panoramic imaging method based on a space virtual reality camera.
Background
With the development of image-sensing and digital-processing technology in the aerospace field, space cameras are increasingly used in orbit, and several of them are usually employed to acquire space images. The space background is complex, and compensating for an insufficient field of view by adding more space cameras easily causes a shortage of payload resources. An ordinary fisheye camera acquires a large-field space image with severe barrel or pincushion distortion, which does not accord with human visual habits, and the traditional fisheye-image scanning method must scan strictly line by line, from top to bottom and left to right, so its computation is heavy and its efficiency low. The space environment is complex: traditional space fisheye-image correction places strict requirements on the intrinsic and extrinsic parameters and the distortion model of the space camera, and the distortion models are complicated and computationally expensive. Finally, space cameras generally cannot provide an omnidirectional, immersive panoramic experience; different fields of view must be acquired from multiple angles with multiple cameras.
Disclosure of Invention
The invention aims to provide a panoramic imaging method based on a space virtual reality camera.
The technical scheme for realizing the aim of the invention is a panoramic imaging method based on a space virtual reality camera,
the method comprising the following steps. Step one: build a space virtual reality camera panoramic imaging system, cover a 360-degree viewing angle with four large-field fisheye lenses on a satellite, and acquire a large-field space fisheye image;
Step two: extract the effective circular region of the large-field space fisheye image, performing K = MN - 4R² line scans of the effective imaging area, where M and N are the dimensions of the neighborhood window of the space fisheye image and R is the radius of the effective region of the large-field space fisheye image;
Step three: if, in the effective circular region extracted in step two, the radius R1 calculated from the horizontal tangent points differs from the radius R2 calculated from the vertical tangent points, correct the effective circular region of the large-field space fisheye image;
Step four: apply back-mapping distortion correction based on an equidistant-projection geometric correction model to the effective circular region of the large-field space fisheye image obtained in step three, obtaining a space image that accords with human visual habits. The equidistant-projection geometric correction model is the imaging relation of an object point on the ideal imaging plane and on the fisheye-lens imaging surface.
In step two, the effective imaging area is line-scanned with an improved line scanning method:
where K_top, K_bottom, K_left and K_right are the numbers of scans in the up, down, left and right directions over the whole scanning process, X and Y are the numbers of scans in the X and Y directions, and Y_top, Y_bottom, X_left and X_right are the numbers of scans of the effective imaging area in the up, down, left and right directions.
In step three, the effective circular region of the large-field space fisheye image is corrected as follows:
Set a pixel brightness-difference threshold T, compute the difference I_sub between the maximum brightness I_max and the minimum brightness I_min of the pixels on each scanning line, and find the four tangent lines at the edge of the large-field space fisheye image and the coordinates of their intersection points A_left(x1, y1), A_right(x2, y2), B_top(x3, y3) and B_bottom(x4, y4). The values of R1 and R2 are calculated according to the following formulae:
I=0.59r+0.11g+0.3b
R=max{R1,R2}
where the threshold T satisfies 20 ≤ T ≤ 50, I is the brightness of each pixel on the scanning line, r, g and b are the red, green and blue channel values in the range 0-255, and O(x0, y0) is the geometric center of the circular effective region;
When R1 ≠ R2, the circular region of the fisheye image is corrected by multiplying by M^-1:
where (u, v) is the pixel coordinate of the center point of the fisheye image, and α is the ratio of the total number of row pixels to the total number of column pixels of the fisheye image.
In step three, the coordinates of the four tangent lines at the edge of the large-field space fisheye image and of their intersection points are determined as follows:
Let I_sub denote the brightness difference of the pixels in a given row/column of the large-field space fisheye image. The original fisheye image is scanned by four groups of lines from the top, bottom, left and right; each scanning line advances by one pixel per pass. Over the whole scanning process, K_top, K_bottom, K_right and K_left scans are performed in the four directions. When I_sub > T on a scanning line, the position of that line is recorded as Line0, the brightness difference of the adjacent scanning line is computed, and the comparison with the threshold continues. If the maximum brightness difference I_sub1 of the next adjacent scanning line Line1 in the vertical scanning direction satisfies I_sub1 ≤ T, scanning resumes and I_sub is computed on the remaining scanning lines and compared with the threshold T; if I_sub1 > T, scanning line Line0 is the tangent to the upper edge of the circular effective region of the fisheye image. The edge tangents in the left and right scanning directions are obtained in the same way.
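The scanning rule for one direction can be sketched as follows. The brightness formula and the two-line confirmation are taken from the text; the array layout (an H x W x 3 RGB image with a dark background outside the effective circle) is an assumption:

```python
import numpy as np

def brightness(img):
    # Per-pixel brightness as given in the patent: I = 0.59r + 0.11g + 0.3b
    r, g, b = (img[..., c].astype(float) for c in range(3))
    return 0.59 * r + 0.11 * g + 0.3 * b

def top_tangent_row(img, T=30):
    """Scan rows top-down. A row whose brightness spread
    I_sub = I_max - I_min exceeds T is recorded as Line0; if the next
    row (Line1) also exceeds T, Line0 is the upper edge tangent,
    otherwise scanning continues on the remaining rows."""
    I = brightness(img)
    for y in range(I.shape[0] - 1):
        if I[y].max() - I[y].min() > T:              # candidate Line0
            if I[y + 1].max() - I[y + 1].min() > T:  # Line1 confirms it
                return y
    return None
```

Symmetric scans from the bottom, left and right would give the other three tangents, whose intersections are the points A_left, A_right, B_top, B_bottom of the text.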
In step four, the equidistant-projection geometric correction model is as follows:
the object point P forms an image point P on the ideal imaging plane and the imaging curved surface of the fisheye lens respectively0 And P1 Image height is r0 And r1 Establishing a geometric mapping model of a source space fisheye distortion image and a space fisheye correction image based on an equidistant projection geometric model r = f theta:
wherein: z is a fisheye lens imaging curved surface, h is the corrected target image height, and w is the corrected target imageImage width, O (w/2, h/2) is the coincidence point of geometric center and optical center, Q1 (a)1 ,b1 ) The original distortion point on the distorted image, R is the effective space image radius; according to r0 And r1 The Q (a, b) on the space fish eye correction image is obtained by inverting Q1.
Further, to obtain a better image, the method also comprises a step five: restore the space image from the space fisheye corrected image obtained in step four by cubic interpolation;
Step six: perform SIFT feature matching on the restored space image;
Step seven: stitch the feature-point matching results by an optical-flow method to achieve immersive, omnidirectional panoramic imaging.
The values of M and N in step two are fixed or variable; the value of the pixel brightness-difference threshold T in step three is fixed or variable within the range 20-50; the coefficient relation between r0 and r1 in step four is fixed or variable; and the number of feature points matched by SIFT in step six is fixed or variable.
By adopting this technical scheme, the invention has the following positive effects: (1) The method makes up for the shortcomings of space cameras in imaging range, resolution and field of view. Covering the 360-degree field angle with four space fisheye lenses effectively saves payload resources; the system can provide effective assistance and reference for astronauts in space operations, offers a commercial space-VR experience, and can in future be applied to space maintenance and space security.
(2) The method improves the traditional line scanning method, effectively reducing the repeated-scan rate on the fisheye image and greatly reducing the computation, which improves the efficiency of the imaging system. SIFT is used for feature matching and an optical-flow method for registration and stitching, achieving an omnidirectional, immersive panoramic effect.
(3) The method builds the geometric mapping model between the source space fisheye distorted image and the space fisheye corrected image from the equidistant-projection geometric model, realizing space-image distortion correction with adjustable length and width factors and improving the correction effect by 30%.
Drawings
In order that the manner in which the above-recited features of the present invention can be understood in detail, a more particular description of the invention, briefly summarized above, may be had by reference to embodiments, some of which are illustrated in the appended drawings.
FIG. 1 is a flow chart of the method of the present invention.
FIG. 2 is a geometric topological relation diagram of the large-field fish-eye lens imaging.
Fig. 3 is a diagram for extracting an effective area of a space large-field fisheye image.
FIG. 4 is a geometrical mapping topological relation diagram of a source space fisheye distortion image and a space fisheye correction image.
Detailed Description
(example 1)
See fig. 2: for any point P in space, light passes through the optical center O, and the line OP meets the hemispherical surface at P1. Through P1, a line parallel to the optical axis Z is drawn, intersecting the image plane at P2; P2 is then the image of the space point P. Because of the refraction of the light, a certain angular deviation is projected onto the camera's CCD image-sensor array, which is the cause of the image distortion.
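The construction just described can be checked numerically: under it, a ray at incidence angle θ lands at image height r = R·sin(θ) for a hemisphere of radius R (the unit radius here is a hypothetical normalization, not a value from the patent):

```python
import math

def image_height_hemisphere(theta, R=1.0):
    """Fig. 2 construction as described in the text: the ray OP at
    incidence angle theta meets the hemisphere at
    P1 = (R*sin(theta), R*cos(theta)); projecting P1 parallel to the
    optical axis Z onto the image plane gives P2, so the image height
    is r = R*sin(theta)."""
    return R * math.sin(theta)
```

For small θ this behaves like the linear equidistant model r = fθ used in step four; the growing gap between sin(θ) and θ at large incidence angles is one way to see why the outer field of a fisheye image appears compressed.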
The method flow of the embodiment is shown in fig. 1:
Step one: build a space virtual reality camera panoramic imaging system, cover a 360-degree viewing angle with four large-field fisheye lenses on a satellite, and acquire a large-field space fisheye image;
Step two: extract the effective circular region of the large-field space fisheye image, performing K = MN - 4R² line scans of the effective imaging area, where M and N are the dimensions of the neighborhood window of the space fisheye image and R is the radius of the effective region of the large-field space fisheye image. The effective imaging area is line-scanned with an improved line scanning method:
where K_top, K_bottom, K_left and K_right are the numbers of scans in the up, down, left and right directions over the whole scanning process, X and Y are the numbers of scans in the X and Y directions, and Y_top, Y_bottom, X_left and X_right are the numbers of scans of the effective imaging area in the up, down, left and right directions.
Step three: if the radius R1 calculated through the horizontal tangent point is different from the radius R2 calculated through the vertical tangent point in the effective circle area extracted in the step two, correcting the effective circle area of the fisheye image with the large space visual field;
the method for correcting the effective circular area of the fisheye image with the large space field comprises the following steps:
Set a pixel brightness-difference threshold T, compute the difference I_sub between the maximum brightness I_max and the minimum brightness I_min of the pixels on each scanning line, and find the four tangent lines at the edge of the large-field space fisheye image and the coordinates of their intersection points A_left(x1, y1), A_right(x2, y2), B_top(x3, y3) and B_bottom(x4, y4). Specifically, these tangent lines and their intersection points are determined as follows:
Let I_sub denote the brightness difference of the pixels in a given row/column of the large-field space fisheye image. The original fisheye image is scanned by four groups of lines from the top, bottom, left and right; each scanning line advances by one pixel per pass, so that no pixel is scanned twice. Over the whole scanning process, K_top, K_bottom, K_right and K_left scans are performed in the four directions. When I_sub > T on a scanning line, the position of that line is recorded as Line0, the brightness difference of the adjacent scanning line is computed, and the comparison with the threshold continues. If the maximum brightness difference I_sub1 of the next adjacent scanning line Line1 in the vertical scanning direction satisfies I_sub1 ≤ T, scanning resumes and I_sub is computed on the remaining scanning lines and compared with the threshold T; if I_sub1 > T, scanning line Line0 is the tangent to the upper edge of the circular effective region of the fisheye image. The edge tangents in the left and right scanning directions are obtained in the same way.
The values of R1 and R2 are calculated according to the following formulae:
I=0.59r+0.11g+0.3b
R=max{R1,R2}
where the threshold T satisfies 20 ≤ T ≤ 50, I is the brightness of each pixel on the scanning line, r, g and b are the red, green and blue channel values in the range 0-255, and O(x0, y0) is the geometric center of the circular effective region;
When R1 ≠ R2, the circular region of the fisheye image is corrected by multiplying by M^-1:
where (u, v) is the pixel coordinate of the center point of the fisheye image, and α is the ratio of the total number of row pixels to the total number of column pixels of the fisheye image;
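The excerpt never writes M out explicitly. A minimal instance consistent with the text, a diagonal scaling about the center (u, v) built from the row/column ratio α, is sketched below; the exact form diag(1, α) is an assumption:

```python
import numpy as np

def circularize(points, u, v, alpha):
    """Apply a diagonal scaling M^-1 = diag(1, alpha) about the image
    center (u, v), so that an elliptical effective region (R1 != R2)
    becomes circular. The form diag(1, alpha) is an assumed instance;
    the patent states only that alpha is the ratio of total row pixels
    to total column pixels."""
    m_inv = np.array([[1.0, 0.0],
                      [0.0, alpha]])
    pts = np.asarray(points, dtype=float) - (u, v)   # shift center to origin
    return pts @ m_inv.T + (u, v)                    # scale, shift back
```

Applied to every pixel coordinate of the effective region, this stretches the shorter axis by α so that the radii measured horizontally and vertically agree.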
Step four: apply back-mapping distortion correction based on an equidistant-projection geometric correction model to the effective circular region of the large-field space fisheye image obtained in step three, obtaining a space image that accords with human visual habits. The equidistant-projection geometric correction model is the imaging relation of an object point on the ideal imaging plane and on the fisheye-lens imaging surface; it is as follows:
the object point P forms an image point P on the ideal imaging plane and the fisheye lens imaging curved surface respectively0 And P1 Image height is r0 And r1 Establishing a geometric mapping model of a source space fisheye distortion image and a space fisheye correction image based on an equidistant projection geometric model r = f theta:
wherein: z is the fish-eye lens imaging curved surface, h is the corrected target image height, w is the corrected target image width, O (w/2, h/2) is the coincidence point of the geometric center and the optical center, and Q1 (a)1 ,b1 ) The distortion point is the original distortion point on the distorted image, and R is the effective space image radius; according to r0 And r1 The Q (a, b) on the space fish eye correction image is obtained by inverting Q1.
Step five: restore the space image from the space fisheye corrected image obtained in step four by cubic interpolation;
Step six: perform SIFT feature matching on the restored space image;
Step seven: stitch the feature-point matching results by an optical-flow method to achieve immersive, omnidirectional panoramic imaging.
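The cubic interpolation of step five is not spelled out in the excerpt; a common concrete choice is Keys cubic convolution, sketched here in one dimension (the kernel parameter a = -0.5, the Catmull-Rom choice, is an assumption):

```python
import numpy as np

def keys_kernel(t, a=-0.5):
    """Keys cubic-convolution kernel, a standard basis for cubic image
    interpolation (a = -0.5 is an assumed parameter; the patent does
    not fix the kernel)."""
    t = abs(t)
    if t < 1:
        return (a + 2) * t**3 - (a + 3) * t**2 + 1
    if t < 2:
        return a * t**3 - 5 * a * t**2 + 8 * a * t - 4 * a
    return 0.0

def cubic_interp_1d(samples, x):
    """Interpolate uniformly spaced samples at a fractional position x
    using the four nearest samples (edge samples clamped)."""
    i = int(np.floor(x))
    total = 0.0
    for k in range(i - 1, i + 3):
        s = samples[min(max(k, 0), len(samples) - 1)]
        total += s * keys_kernel(x - k)
    return total
```

In two dimensions, applying this kernel separably in x and y (16 source pixels per target pixel) at the non-integer source coordinates produced by the step-four back-mapping would perform the step-five recovery.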
The embodiments above further describe the objects, technical solutions and advantages of the invention in detail. It should be understood that they are only examples of the invention and do not limit it; any modifications, equivalent substitutions, improvements and the like made within the spirit and principle of the invention fall within its scope of protection.

Claims (7)

Let I_sub denote the brightness difference of the pixels in a given row/column of the large-field space fisheye image. The original fisheye image is scanned by four groups of lines from the top, bottom, left and right; each scanning line advances by one pixel per pass. Over the whole scanning process, K_top, K_bottom, K_right and K_left scans are performed in the four directions. When I_sub > T on a scanning line, the position of that line is recorded as Line0, the brightness difference of the adjacent scanning line is computed, and the comparison with the threshold continues. If the maximum brightness difference I_sub1 of the next adjacent scanning line Line1 in the vertical scanning direction satisfies I_sub1 ≤ T, scanning resumes and I_sub is computed on the remaining scanning lines and compared with the threshold T; if I_sub1 > T, scanning line Line0 is the tangent to the upper edge of the circular effective region of the fisheye image. The edge tangents in the left and right scanning directions are obtained in the same way.
CN201711143093.3A | 2017-11-17 | 2017-11-17 | A kind of method for panoramic imaging based on space virtual reality camera | Pending | CN107945104A (en)

Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
CN201711143093.3A | 2017-11-17 | 2017-11-17 | A kind of method for panoramic imaging based on space virtual reality camera (published as CN107945104A (en))


Publications (1)

Publication Number | Publication Date
CN107945104A (en) | 2018-04-20

Family

ID=61931675

Family Applications (1)

Application Number | Title | Priority Date | Filing Date
CN201711143093.3A | CN107945104A (en), Pending | 2017-11-17 | 2017-11-17

Country Status (1)

Country | Link
CN (1) | CN107945104A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
CN108776976A (*) | 2018-06-07 | 2018-11-09 | 驭势科技(北京)有限公司 | Method, system and storage medium for simultaneous localization and mapping
CN114390262A (*) | 2020-10-21 | 2022-04-22 | 中强光电股份有限公司 | Method and electronic device for stitching three-dimensional spherical panoramic images

Citations (3)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
CN101246590A (*) | 2008-03-03 | 2008-08-20 | 北京航空航天大学 | Geometric correction method for spatial distortion image of spaceborne camera
US20160105649A1 (*) | 2014-10-10 | 2016-04-14 | IEC Infrared Systems LLC | Panoramic View Imaging System With Drone Integration
CN105547256A (*) | 2015-12-02 | 2016-05-04 | 上海宇航系统工程研究所 | Spatial whole-scene sensing satellite, design method and application method thereof


Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
周小康 et al.: "A survey of panoramic imaging technology for space operations", 《红外》 (Infrared) *
周小康 et al.: "Research on fisheye image distortion correction technology", 《工业控制计算机》 (Industrial Control Computer) *
张正鹏 et al.: "A matching method for vehicle-borne panoramic image sequences based on optical-flow feature clustering", 《测绘学报》 (Acta Geodaetica et Cartographica Sinica) *
腾讯科技 (Tencent Tech): "This time VR is really going to space! A VR satellite is expected to launch this August", https://tech.qq.com/a/20170417/004163.htm?winzoom=1 *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
CN108776976A (*) | 2018-06-07 | 2018-11-09 | 驭势科技(北京)有限公司 | Method, system and storage medium for simultaneous localization and mapping
CN108776976B (*) | 2018-06-07 | 2020-11-20 | 驭势科技(北京)有限公司 | Method, system and storage medium for simultaneous localization and mapping
CN114390262A (*) | 2020-10-21 | 2022-04-22 | 中强光电股份有限公司 | Method and electronic device for stitching three-dimensional spherical panoramic images

Similar Documents

Publication | Title
CN110197466B (en) | A wide-angle fisheye image correction method
CN109767474B (en) | Multi-view camera calibration method and device, and storage medium
CN107918927B (en) | A fast image stitching method with fused matching strategies and low error
CN114549666B (en) | AGV-based panoramic image stitching calibration method
CN109903227B (en) | Panoramic image stitching method based on the geometric position relation of the cameras
CN107665483B (en) | Calibration-free convenient monocular fisheye image distortion correction method
CN107016646A (en) | An image stitching method based on an improved approximate projective transformation
CN105488766B (en) | Fisheye image correction method and device
CN106875339A (en) | A fisheye image stitching method based on a strip calibration board
CN108776980A (en) | A calibration method for microlens light-field cameras
CN104778656B (en) | Fisheye image correction method based on spherical perspective projection
CN106815805A (en) | Rapid distortion correction method based on Bayer images
CN106780374B (en) | Fisheye image distortion correction method based on a fisheye imaging model
CN114331826B (en) | A fast fisheye image correction method based on a distortion stretch factor
CN107993263A (en) | Automatic calibration method for a surround-view system, automobile, calibration device and storage medium
CN114612574B (en) | UAV-based calibration, conversion and stitching method for bird's-eye views from vehicle-mounted panoramic cameras
CN106886976B (en) | Image generation method for correcting a fisheye camera based on intrinsic parameters
RU2654127C1 (en) | Method for generating a digital panoramic image
CN116309844A (en) | Three-dimensional measurement method based on a single UAV aerial image
CN109859137B (en) | Global correction method for irregular distortion in wide-angle cameras
WO2018102990A1 (en) | System and method for rectifying a wide-angle image
CN107689033A (en) | A fisheye image distortion correction method based on ellipse segmentation
CN107492080B (en) | Calibration-free convenient monocular image radial distortion correction method
CN118261787A (en) | High-precision sub-pixel interpolation method suitable for image registration of multispectral cameras
CN115049535A (en) | Method for obtaining the effective area of a fisheye lens and finely correcting the image

Legal Events

Code | Title | Description
PB01 | Publication |
SE01 | Entry into force of request for substantive examination |
WD01 | Invention patent application deemed withdrawn after publication | Application publication date: 2018-04-20
