CN118691687B - Polar correction algorithm of binocular underwater camera shooting system - Google Patents

Polar correction algorithm of binocular underwater camera shooting system

Info

Publication number
CN118691687B
CN118691687B (application CN202411155542.6A)
Authority
CN
China
Prior art keywords
camera
fitting
coordinate system
fitted
coordinates
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202411155542.6A
Other languages
Chinese (zh)
Other versions
CN118691687A (en)
Inventor
徐鑫霖
许惠平
孙科林
杨景川
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Institute of Deep Sea Science and Engineering of CAS
Original Assignee
Institute of Deep Sea Science and Engineering of CAS
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Institute of Deep Sea Science and Engineering of CAS
Priority to CN202411155542.6A
Publication of CN118691687A
Application granted
Publication of CN118691687B
Legal status: Active (current)
Anticipated expiration


Abstract


The present invention relates to the field of machine vision and discloses an epipolar rectification algorithm for a binocular underwater camera shooting system. The application introduces the concept of a fitted camera assembly with its own fitted camera coordinate system. For each fitted camera of the assembly, defined with respect to the real camera coordinate system, the fitted camera position, pose and internal parameters are determined in sequence, and the fitted camera coordinate system and a pixel mapping relationship are established from the fitted camera internal parameters. The pixel mapping relationship is then used to determine the projection pixels of the real image corresponding to each fitted camera, yielding two fitted camera images. Finally, the fitted camera images and the fitted camera origin coordinates are corrected with an epipolar rectification algorithm, thereby improving the accuracy of the epipolar rectification.

Description

Polar correction algorithm of binocular underwater camera shooting system
Technical Field
The embodiment of the invention relates to the technical field of machine vision, in particular to an epipolar rectification algorithm of a binocular underwater camera shooting system.
Background
An epipolar rectification method derived from the pinhole camera model cannot guarantee the rectification result for a group of underwater pictures. Specifically, the pinhole camera model does not account for the image distortion caused by refraction as light passes successively through water, glass and air before reaching the camera, so it cannot represent the correspondence from a three-dimensional point to an image point. Epipolar rectification performed on the basis of a pinhole camera model therefore cannot meet the needs of an underwater camera. In fact, in addition to the parameters that the pinhole camera model can calibrate, an underwater camera also needs to determine the relative offset between the center of the glass transparent cover and the camera center. Only epipolar rectification performed on the basis of an underwater camera model can meet the underwater operating requirements of an underwater binocular camera.
Existing methods offer only mediocre calibration accuracy for underwater cameras, and the rectification methods built on them have limited effect on pictures acquired by underwater cameras. In addition, existing methods also require part of the shooting system parameters to be acquired on land, which greatly restricts their applicable scenarios.
Disclosure of Invention
In view of the above problems, the embodiments of the present invention provide an epipolar rectification algorithm for a binocular underwater camera shooting system, which is used for solving the technical problems in the prior art that the epipolar rectification algorithm for the binocular underwater camera shooting system is not sufficient in calibration accuracy and needs to acquire part of system parameters on land in advance.
According to an aspect of an embodiment of the present invention, an epipolar rectification algorithm of a binocular underwater camera shooting system is characterized in that the underwater camera shooting system includes a first camera, a second camera, a first glass transparent cover corresponding to the first camera, and a second glass transparent cover corresponding to the second camera, and the epipolar rectification algorithm includes:
Step S1: establishing a fitted camera coordinate system and a real camera coordinate system, and acquiring calibration parameters of a fitted camera assembly in the fitted camera coordinate system;
Step S2: determining fitted camera origin coordinates of the fitted camera assembly according to the calibration parameters and the underwater camera model;
Step S3: determining the pose relationship between the right fitted camera and the left fitted camera of the fitted camera assembly according to the fitted camera origin coordinates, the fitted camera coordinate system and the real camera coordinate system;
Step S4: determining the external parameters of the right fitted camera or the external parameters of the left fitted camera according to the pose relationship;
Step S5: obtaining calibration coordinates of at least two calibration points in a world coordinate system;
Step S6: determining fitted coordinates in the fitted camera coordinate system corresponding to the calibration coordinates, and inputting the determined fitted coordinates into a pinhole camera projection model to determine the fitted camera internal parameters;
Step S7: determining a right fitted camera image and a left fitted camera image;
Step S8: carrying out epipolar rectification on the right fitted camera image, the left fitted camera image and the fitted camera origin coordinates through an epipolar rectification algorithm.
In an alternative manner, the step of determining the origin coordinates of the fitted camera assembly according to the calibration parameters and the underwater camera model includes:
describing an incident light path of the real camera coordinate system through point coordinates and writing a matrix;
the matrix is solved to determine fitted camera origin coordinates of the fitted camera assembly.
In an alternative manner, the step of describing the incident light path of the real camera coordinate system by the point coordinates and writing a matrix includes:
Each incident light path of the real camera coordinate system is described as formula (1):

$O = P_i + t_i \mathbf{d}_i,\quad i = 1,\dots,n$ (1)

where $O$ is the common point to be solved for (the fitted camera center), $t_i$ is the Euclidean length along the $i$-th incident light path, $P_i = (x_i, y_i, z_i)^T$ are the coordinates of a point through which that light path passes, and $\mathbf{d}_i = (\alpha_i, \beta_i, \gamma_i)^T$ are the corresponding direction cosines; all of this light path information is expressed in the real camera coordinate system.

Expanding the linear system represented by formula (1) into a matrix:

$\begin{bmatrix} I_3 & -\mathbf{d}_1 & & \\ \vdots & & \ddots & \\ I_3 & & & -\mathbf{d}_n \end{bmatrix} \begin{bmatrix} O \\ t_1 \\ \vdots \\ t_n \end{bmatrix} = \begin{bmatrix} P_1 \\ \vdots \\ P_n \end{bmatrix}$ (2)

the linear system is then written as:

$A\,\mathbf{m} = \mathbf{b}$ (3)

where $\mathbf{b}$ stacks the three-dimensional point coordinates $P_i$ through which each light path passes in the real camera coordinate system, and $A$ is of size $3n \times (3+n)$, specifically:

$A = \begin{bmatrix} I_3 & -\mathbf{d}_1 & \cdots & \mathbf{0} \\ \vdots & \vdots & \ddots & \vdots \\ I_3 & \mathbf{0} & \cdots & -\mathbf{d}_n \end{bmatrix}$ (4).
in an alternative manner, the step of solving the matrix to determine fitted camera origin coordinates of the fitted camera assembly includes:
The unknown vector $\mathbf{m}$ of the matrix equation has the specific form:

$\mathbf{m} = [\,x_O,\ y_O,\ z_O,\ t_1,\ t_2,\ \dots,\ t_n\,]^T$ (5)

SVD decomposition is carried out for the matrix $A$:

$A = U \Sigma V^T$ (6)

and the vector $\mathbf{m}$ is solved as:

$\mathbf{m} = V \Sigma^{+} U^T \mathbf{b}$ (7)

As can be seen from formula (5), the first three rows of the vector $\mathbf{m}$ are the origin coordinates of the fitted camera, namely:

$O_{\mathrm{fit}} = [\,m_1,\ m_2,\ m_3\,]^T = [\,x_O,\ y_O,\ z_O\,]^T$ (8).
in an alternative manner, the step of determining the pose relationship between the right and left fitted cameras of the fitted camera assembly according to the fitted camera origin coordinates, the fitted camera coordinate system, and the real camera coordinate system includes:
A fitted camera coordinate system is constructed with the fitted camera assembly center $O_{\mathrm{fit}}^{(i)}$ as its origin, and its axes are parallel to the camera coordinate system of camera $i$, where $i$ numbers the underwater camera. Under camera coordinate system $i$, the pose of the fitted camera assembly is denoted $T_i$ and is specifically:

$T_i = \begin{bmatrix} I_3 & O_{\mathrm{fit}}^{(i)} \\ \mathbf{0}^T & 1 \end{bmatrix}$ (9)

Here the rotation block is the identity because the fitted camera axes are parallel to those of camera coordinate system $i$, and $O_{\mathrm{fit}}^{(i)}$ is the coordinates of the corresponding fitted camera assembly center in that camera coordinate system.

The poses of the pair of fitted camera coordinate systems under their respective camera coordinate systems can therefore be noted as: left fitted camera, $T_l = \begin{bmatrix} I_3 & O_{\mathrm{fit}}^{(l)} \\ \mathbf{0}^T & 1 \end{bmatrix}$; right fitted camera, $T_r = \begin{bmatrix} I_3 & O_{\mathrm{fit}}^{(r)} \\ \mathbf{0}^T & 1 \end{bmatrix}$.
In an alternative manner, the step of determining the external parameters of the right fitting camera or the external parameters of the left fitting camera according to the pose relationship includes:
taking the right fitting camera's external parameters acquisition process as an example:
Taking the left camera coordinate system as the reference and combining the calibrated external parameters $(R_{lr}, t_{lr})$ of the right camera coordinate system (which express right-camera coordinates in the left camera frame), the translation vector of the right fitted camera relative to the left fitted camera in the left camera coordinate system is obtained from the rigid transformation relation:

$t_{\mathrm{fit}} = R_{lr}\, O_{\mathrm{fit}}^{(r)} + t_{lr} - O_{\mathrm{fit}}^{(l)}$ (10)

In addition, since the attitudes of the two fitted cameras relative to their respective camera coordinate systems are both identity matrices, the attitude of the right fitted camera relative to the left fitted camera is $R_{\mathrm{fit}} = R_{lr}$.

The external parameters of the right fitted camera relative to the left fitted camera in the left camera coordinate system are therefore $[\,R_{\mathrm{fit}} \mid t_{\mathrm{fit}}\,]$.
In an optional manner, the step of determining the fit coordinates in the fit camera coordinate system corresponding to the calibration coordinates, and inputting the determined fit coordinates into the pinhole camera projection model to determine the fit camera internal parameters includes:
During the calibration process, the world coordinate system coincides with the plane of the calibration plate, and the pose of the calibrated camera coordinate system is recorded as $(R_w, t_w)$.

The world coordinates $X_w$ of a three-dimensional point are transferred into the camera coordinate system through the rigid transformation:

$X_c = R_w X_w + t_w$ (11)

The coordinates $X_c$ of this point in the camera coordinate system are then transferred into the fitted camera coordinate system as $X_{\mathrm{fit}}$:

$X_{\mathrm{fit}} = X_c - O_{\mathrm{fit}}$ (12)

Having obtained the coordinates $X_{\mathrm{fit}} = (X, Y, Z)^T$ of the three-dimensional point in the fitted camera coordinate system, the fitted camera internal parameters are determined according to the pinhole camera projection model:

$s \begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = K_{\mathrm{fit}} \begin{bmatrix} X \\ Y \\ Z \end{bmatrix}$ (13)

where the point $(u, v)$ is the pixel coordinates of the projection of the three-dimensional point onto the fitted camera image and $K_{\mathrm{fit}}$ is the internal parameter matrix of the fitted camera, specifically:

$K_{\mathrm{fit}} = \begin{bmatrix} f_x & 0 & c_x \\ 0 & f_y & c_y \\ 0 & 0 & 1 \end{bmatrix}$ (14)

Four fitted camera internal parameters appear in formula (14): the focal lengths $f_x, f_y$ and the principal point coordinates $(c_x, c_y)$. Determining these internal parameters requires no fewer than two three-dimensional points and their corresponding fitted image points.
In an alternative manner, two of the three-dimensional points are any two of four corner points of the largest rectangle on the checkerboard calibration plate.
In an alternative manner, the step of determining the right fit camera image and the left fit camera image includes:
Given a pixel $(u, v)$ of the fitted camera image, the corresponding coordinate in the camera coordinate system can be found by combining formulas (11)-(14), as follows:

$X_c = \lambda\, K_{\mathrm{fit}}^{-1} \begin{bmatrix} u \\ v \\ 1 \end{bmatrix} + O_{\mathrm{fit}}$ (15)

where $\lambda$ is a defined constant used to intercept a three-dimensional point on the ray along which the incident light path lies.

The pixel $p_{\mathrm{real}}$ to which the three-dimensional point $X_c$ projects on the real image can be found from the mathematical projection model $\pi_{\mathrm{uw}}(\cdot)$ of the underwater camera.

The coordinates of the fitted camera image are adjusted by the least squares method:

$(u^*, v^*) = \arg\min_{(u,v)} \big\| \pi_{\mathrm{uw}}\big(X_c(u, v)\big) - p_{\mathrm{real}} \big\|^2$ (16)

with the pixel of the fitted camera image taken as the initial value of the least squares.

The adjustment process is repeated for all points of the image area to obtain the image of the fitted camera.
In an optional manner, the step of performing epipolar rectification on the right fitted camera image, the left fitted camera image and the fitted camera origin coordinates through an epipolar rectification algorithm includes:
From the external parameters $[\,R_{\mathrm{fit}} \mid t_{\mathrm{fit}}\,]$ of the right fitted camera relative to the left fitted camera in the left camera coordinate system, the external parameters of the right fitted camera relative to the left fitted camera in the left fitted camera coordinate system are determined (the two frames have parallel axes, so the relative rotation and translation carry over unchanged).

Combining the Bouguet method, the two fitted camera coordinate systems must be adjusted to be parallel. The rotation matrix $R_{\mathrm{fit}}$ is first converted into an angle-axis vector $\mathbf{r}$ using the Rodrigues formula; in this form the modulus of the vector represents the rotation angle and its unit vector is the rotation axis. To make the two fitted camera image planes parallel, the left fitted camera coordinate system needs to rotate by $\mathbf{r}/2$ and the right fitted camera coordinate system needs to rotate by $-\mathbf{r}/2$; the rotation angle and rotation axis are converted back into the corresponding rotation matrices $R_l$ and $R_r$ through the Rodrigues formula.

Under the left fitted camera coordinate system, the right fitted camera translation vector becomes:

$t' = R_l\, t_{\mathrm{fit}}$ (17)

in which the translation vector is written as $t' = (t_x, t_y, t_z)^T$.

A new rotation matrix $R_{\mathrm{rect}}$ is constructed so that the baseline of the fitted cameras is parallel to the image planes of the two fitted cameras; the construction is carried out from the translation vector $t'$:

$\mathbf{e}_1 = \dfrac{t'}{\lVert t' \rVert}$ (18)

Normalizing the translation vector in formula (18) yields the vector $\mathbf{e}_1$, which is the direction vector of the X axis of the rectified coordinate systems of the two fitted cameras.

The Y axis of the fitted camera coordinate system after epipolar rectification must be perpendicular to the plane XOZ. The Y-axis direction vector $\mathbf{e}_2$ is then expressed as:

$\mathbf{e}_2 = \dfrac{1}{\sqrt{t_x^2 + t_y^2}} \begin{bmatrix} -t_y \\ t_x \\ 0 \end{bmatrix}$ (19)

According to the properties of an orthogonal basis, the direction vector $\mathbf{e}_3$ of the Z axis of the adjusted fitted camera coordinate system is obtained as:

$\mathbf{e}_3 = \mathbf{e}_1 \times \mathbf{e}_2$ (20)

The correction matrix that adjusts the two fitted cameras to be coplanar is then set as $R_{\mathrm{rect}} = [\,\mathbf{e}_1\ \ \mathbf{e}_2\ \ \mathbf{e}_3\,]^T$, and the adjusted rotation matrices are $R_{\mathrm{rect}} R_l$ and $R_{\mathrm{rect}} R_r$, achieving epipolar rectification for the two fitted cameras.
According to the application, the concept of a fitted camera assembly with its own fitted camera coordinate system is introduced. For each fitted camera of the assembly, defined with respect to the real camera coordinate system, the fitted camera position, pose and internal parameters are determined in sequence, and the fitted camera coordinate system and a pixel mapping relationship are established from the fitted camera internal parameters. The pixel mapping relationship is then used to determine the projection pixels of the real image corresponding to each fitted camera, yielding two fitted camera images. Finally, the fitted camera images and the fitted camera origin coordinates are corrected according to an epipolar rectification algorithm, thereby improving the accuracy of epipolar rectification.
The foregoing description is only an overview of the technical solutions of the embodiments of the present invention, and may be implemented according to the content of the specification, so that the technical means of the embodiments of the present invention can be more clearly understood, and the following specific embodiments of the present invention are given for clarity and understanding.
Drawings
The drawings are only for purposes of illustrating embodiments and are not to be construed as limiting the invention. Also, like reference numerals are used to designate like parts throughout the figures. In the drawings:
Fig. 1 shows a flow diagram of an epipolar rectification algorithm of a binocular underwater camera shooting system provided by the invention;
fig. 2 is a schematic diagram showing the main structure of an underwater camera shooting system provided by the invention;
fig. 3 is a schematic view showing a light refraction path of an underwater camera shooting system provided by the invention;
FIG. 4 shows a terrestrial binocular camera epipolar geometry schematic;
FIG. 5 shows a polar plane change schematic of a binocular underwater camera shooting system provided by the present invention;
Fig. 6 shows a schematic view of an underwater camera projection model under a world coordinate system of the binocular underwater camera shooting system provided by the invention;
Fig. 7 shows a schematic view of an underwater image polar correction flow of the binocular underwater camera shooting system provided by the invention;
Fig. 8 shows a schematic view of an underwater fitting camera projection process of the binocular underwater camera shooting system provided by the invention;
fig. 9 shows a schematic view of an underwater fitted binocular camera polar plane of the binocular underwater camera shooting system provided by the invention.
Detailed Description
Exemplary embodiments of the present invention will be described in more detail below with reference to the accompanying drawings. While exemplary embodiments of the present invention are shown in the drawings, it should be understood that the present invention may be embodied in various forms and should not be limited to the embodiments set forth herein.
In an exemplary technique, under shallow-water conditions and when the required calibration accuracy is low, the industry roughly treats the underwater camera system as a pinhole camera model, as for a camera in air. The model is calibrated using the Zhang Zhengyou calibration method [A flexible new technique for camera calibration, 2000], and epipolar rectification is carried out with the Fusiello method after the binocular camera system has been calibrated [Quasi-euclidean uncalibrated epipolar rectification, 2008].
Elnashef et al. describe a calibration scheme that enables a rough linear estimation of the initial placement position of the camera and discuss the axial characteristics of the spherical dome model. The operator is required to adjust the position of the camera until it coincides with the center of the spherical cover, so that the picture is free of distortion from light-path refraction [Geometry, calibration, and robust centering procedures for refractive dome-port based imaging systems, 2022].
Furthermore, a binocular camera system consists of two cameras arranged at different angles that photograph the object simultaneously. According to the principle of triangulation, three-dimensional information about the scene surface can be obtained from a group of photographed pictures. In an underwater environment, subject to the effects of undercurrents, very few scenes remain stationary in the water. Triangulation with a binocular camera can meet the needs of machine vision working in such a dynamic underwater environment, combining reliability in application with feasibility of implementation. In addition, binocular camera equipment has the advantages of simple operation and low operating cost.
Epipolar rectification is a technique specific to binocular machine vision. Referring to fig. 4, the epipolar rectification technique mathematically makes the projections of a three-dimensional point onto the two camera images lie on the same line. When triangulating a three-dimensional point, the projections of the point at the two viewing angles must be found. During this search, epipolar rectification reduces the search area, thereby improving the efficiency and accuracy of finding projection-point pairs. When binocular machine vision is operated underwater it must be applied to underwater cameras, and the special structure of an underwater camera poses a new problem for the epipolar rectification technique.
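As a generic illustration of the epipolar constraint described here (not the patent's underwater procedure), the following sketch builds a fundamental matrix from an assumed intrinsic matrix K and relative pose (R, t) and computes the epipolar line in the right image on which the match of a left-image pixel must lie; with the horizontal baseline chosen here, that line is simply the same image row.

```python
import numpy as np

def skew(t):
    """Skew-symmetric matrix such that skew(t) @ x == np.cross(t, x)."""
    return np.array([[0, -t[2], t[1]], [t[2], 0, -t[0]], [-t[1], t[0], 0]])

# Illustrative stereo pair: identical intrinsics K, pure horizontal baseline
K = np.array([[800.0, 0, 320.0], [0, 800.0, 240.0], [0, 0, 1.0]])
R = np.eye(3)
t = np.array([0.12, 0.0, 0.0])

E = skew(t) @ R                                   # essential matrix
F = np.linalg.inv(K).T @ E @ np.linalg.inv(K)     # fundamental matrix

x_left = np.array([400.0, 260.0, 1.0])            # a left-image pixel (homogeneous)
line_right = F @ x_left                           # epipolar line a*u + b*v + c = 0 in the right image
print(line_right / np.linalg.norm(line_right[:2]))  # ~ (0, -1, 260): the row v = 260
```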
In order to protect the camera equipment, a shooting system typically mounts the camera in a pressure housing during underwater operation, and the camera photographs the outside through a glass transparent cover; as shown in fig. 3, light undergoes a water→glass→air→camera propagation process during shooting.
Underwater cameras are commonly fitted with hemispherical glass covers. Such a glass transparent cover has good mechanical properties and gives the camera a larger field of view. When the center of the camera and the center of the transparent cover cannot be made to coincide, light is refracted as it passes through the different media, so the pictures shot by the underwater camera are distorted. On such distorted underwater pictures, the epipolar-line search methods used on land lose accuracy, so the epipolar rectification methods of land binocular systems generally perform poorly.
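As context for the water→glass→air→camera path, the sketch below applies the vector form of Snell's law at a single flat interface. It is a generic NumPy illustration with assumed refractive indices (air 1.0, glass 1.49, water 1.34) and a planar normal, not the patent's hemispherical-dome underwater camera model.

```python
import numpy as np

def refract(d, n, eta1, eta2):
    """Refract unit direction d at a surface with unit normal n (vector Snell's law).

    eta1, eta2 are the refractive indices on the incident and transmitted sides.
    Returns the refracted unit direction, or None in case of total internal reflection.
    """
    d = d / np.linalg.norm(d)
    n = n / np.linalg.norm(n)
    eta = eta1 / eta2
    cos_i = -np.dot(n, d)                  # cosine of the incidence angle
    sin2_t = eta**2 * (1.0 - cos_i**2)     # squared sine of the transmission angle
    if sin2_t > 1.0:                       # total internal reflection
        return None
    cos_t = np.sqrt(1.0 - sin2_t)
    return eta * d + (eta * cos_i - cos_t) * n

# Example: a ray leaving the camera housing, traced air -> glass -> water
d = np.array([0.2, 0.0, 1.0])
n_inner = np.array([0.0, 0.0, -1.0])       # interface normal pointing back toward the camera
d_glass = refract(d, n_inner, 1.0, 1.49)
d_water = refract(d_glass, n_inner, 1.49, 1.34)
print(d_glass, d_water)                    # the direction bends at each interface
```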
The application provides an epipolar rectification algorithm of a binocular underwater camera shooting system, which is used for solving the technical problems that the epipolar rectification algorithm of the binocular underwater camera shooting system in the prior art is insufficient in calibration precision and partial system parameters need to be acquired on land in advance.
In an alternative embodiment, referring to fig. 1, fig. 2 and fig. 6, an epipolar rectification algorithm of a binocular underwater camera shooting system is provided, wherein, referring to fig. 2, the underwater camera shooting system includes a first camera, a second camera, a first glass transparent cover corresponding to the first camera, and a second glass transparent cover corresponding to the second camera, and the algorithm includes:
Step S1: establishing a fitted camera coordinate system and a real camera coordinate system, and acquiring calibration parameters of a fitted camera assembly in the fitted camera coordinate system;
Step S2: determining fitted camera origin coordinates of the fitted camera assembly according to the calibration parameters and the underwater camera model;
Step S3: determining the pose relationship between the right fitted camera and the left fitted camera of the fitted camera assembly according to the fitted camera origin coordinates, the fitted camera coordinate system and the real camera coordinate system;
Step S4: determining the external parameters of the right fitted camera or the external parameters of the left fitted camera according to the pose relationship;
Step S5: obtaining calibration coordinates of at least two calibration points in a world coordinate system;
Step S6: determining fitted coordinates in the fitted camera coordinate system corresponding to the calibration coordinates, and inputting the determined fitted coordinates into a pinhole camera projection model to determine the fitted camera internal parameters;
Step S7: determining a right fitted camera image and a left fitted camera image;
Step S8: carrying out epipolar rectification on the right fitted camera image, the left fitted camera image and the fitted camera origin coordinates through an epipolar rectification algorithm.
According to the application, by introducing the concept of the fitted camera, the fitted camera position, the fitted camera pose and the fitted camera internal parameters are determined in sequence to establish the fitted camera coordinate system and a pixel mapping relationship; the pixel mapping relationship is then used to determine the projection pixels of the real image corresponding to each fitted camera, yielding two fitted camera images; finally, the fitted camera images and the fitted camera origin coordinates are corrected according to an epipolar rectification algorithm, thereby improving the accuracy of epipolar rectification.
Optionally, the step of determining the origin coordinates of the fitted camera assembly according to the calibration parameters and the underwater camera model includes:
describing an incident light path of the real camera coordinate system through point coordinates and writing a matrix;
the matrix is solved to determine fitted camera origin coordinates of the fitted camera assembly.
In an alternative embodiment, referring to fig. 4, the step of describing the incident light path of the real camera coordinate system by the point coordinates and writing a matrix includes:
Each incident light path of the real camera coordinate system is described as formula (1):

$O = P_i + t_i \mathbf{d}_i,\quad i = 1,\dots,n$ (1)

where $O$ is the common point to be solved for (the fitted camera center), $t_i$ is the Euclidean length along the $i$-th incident light path, $P_i = (x_i, y_i, z_i)^T$ are the coordinates of a point through which that light path passes, and $\mathbf{d}_i = (\alpha_i, \beta_i, \gamma_i)^T$ are the corresponding direction cosines; all of this light path information is expressed in the real camera coordinate system.

Expanding the linear system represented by formula (1) into a matrix:

$\begin{bmatrix} I_3 & -\mathbf{d}_1 & & \\ \vdots & & \ddots & \\ I_3 & & & -\mathbf{d}_n \end{bmatrix} \begin{bmatrix} O \\ t_1 \\ \vdots \\ t_n \end{bmatrix} = \begin{bmatrix} P_1 \\ \vdots \\ P_n \end{bmatrix}$ (2)

the linear system is then written as:

$A\,\mathbf{m} = \mathbf{b}$ (3)

where $\mathbf{b}$ stacks the three-dimensional point coordinates $P_i$ through which each light path passes in the real camera coordinate system, and $A$ is of size $3n \times (3+n)$, specifically:

$A = \begin{bmatrix} I_3 & -\mathbf{d}_1 & \cdots & \mathbf{0} \\ \vdots & \vdots & \ddots & \vdots \\ I_3 & \mathbf{0} & \cdots & -\mathbf{d}_n \end{bmatrix}$ (4).
Optionally, the step of solving the matrix to determine fitted camera origin coordinates of the fitted camera assembly comprises:
The unknown vector $\mathbf{m}$ of the matrix equation has the specific form:

$\mathbf{m} = [\,x_O,\ y_O,\ z_O,\ t_1,\ t_2,\ \dots,\ t_n\,]^T$ (5)

SVD decomposition is carried out for the matrix $A$:

$A = U \Sigma V^T$ (6)

and the vector $\mathbf{m}$ is solved as:

$\mathbf{m} = V \Sigma^{+} U^T \mathbf{b}$ (7)

As can be seen from formula (5), the first three rows of the vector $\mathbf{m}$ are the origin coordinates of the fitted camera, namely:

$O_{\mathrm{fit}} = [\,m_1,\ m_2,\ m_3\,]^T = [\,x_O,\ y_O,\ z_O\,]^T$ (8).
At this point, the incident rays outside the glass transparent cover are determined by the given underwater camera model, and the linear system formed by the incident rays is solved by the matrix approach to find the position of the fitted camera center.
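A minimal sketch of the linear system of formulas (1)-(8), assuming each ray is given by a point it passes through and unit direction cosines in the real camera coordinate system; the function name fit_camera_center and the toy data are illustrative, and NumPy's pseudo-inverse plays the role of the SVD solution of formulas (6)-(7).

```python
import numpy as np

def fit_camera_center(points, directions):
    """Least-squares common point of n incident rays (formulas (1)-(8)).

    points     : (n, 3) array, a point P_i on each ray (real camera coordinates)
    directions : (n, 3) array, unit direction cosines d_i of each ray
    Returns the fitted camera centre O and the ray parameters t_i.
    """
    n = len(points)
    A = np.zeros((3 * n, 3 + n))
    b = np.zeros(3 * n)
    for i, (P, d) in enumerate(zip(points, directions)):
        d = d / np.linalg.norm(d)
        A[3*i:3*i+3, 0:3] = np.eye(3)      # unknown centre O
        A[3*i:3*i+3, 3 + i] = -d           # unknown distance t_i along ray i
        b[3*i:3*i+3] = P                   # O - t_i d_i = P_i
    m = np.linalg.pinv(A) @ b              # SVD-based pseudo-inverse, formulas (6)-(7)
    return m[:3], m[3:]

# Toy example: rays through a common point (1, 2, 10)
rng = np.random.default_rng(0)
O_true = np.array([1.0, 2.0, 10.0])
dirs = rng.normal(size=(6, 3))
dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
pts = O_true - rng.uniform(1, 5, size=(6, 1)) * dirs
O_est, t = fit_camera_center(pts, dirs)
print(O_est)   # ~ [1, 2, 10]
```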
Optionally, the step of determining the pose relationship between the right and left fitted cameras of the fitted camera assembly according to the fitted camera origin coordinates, the fitted camera coordinate system and the real camera coordinate system includes:
A fitted camera coordinate system is constructed with the fitted camera assembly center $O_{\mathrm{fit}}^{(i)}$ as its origin, and its axes are parallel to the camera coordinate system of camera $i$, where $i$ numbers the underwater camera. Under camera coordinate system $i$, the pose of the fitted camera assembly is denoted $T_i$ and is specifically:

$T_i = \begin{bmatrix} I_3 & O_{\mathrm{fit}}^{(i)} \\ \mathbf{0}^T & 1 \end{bmatrix}$ (9)

Here the rotation block is the identity because the fitted camera axes are parallel to those of camera coordinate system $i$, and $O_{\mathrm{fit}}^{(i)}$ is the coordinates of the corresponding fitted camera assembly center in that camera coordinate system.

The poses of the pair of fitted camera coordinate systems under their respective camera coordinate systems can therefore be noted as: left fitted camera, $T_l = \begin{bmatrix} I_3 & O_{\mathrm{fit}}^{(l)} \\ \mathbf{0}^T & 1 \end{bmatrix}$; right fitted camera, $T_r = \begin{bmatrix} I_3 & O_{\mathrm{fit}}^{(r)} \\ \mathbf{0}^T & 1 \end{bmatrix}$.
It should be noted that the fitted camera origin coordinates are equal to the fitted camera center coordinates: the term origin coordinates refers to the coordinate system, while the term center coordinates refers to the physical position of the camera. Generally, the origin of the left fitted camera is taken as the origin of the fitted cameras in the world coordinate system.
Optionally, referring to fig. 5 and 7, the step of determining the external parameters of the right fitting camera or the external parameters of the left fitting camera according to the pose relationship includes:
taking the right fitting camera's external parameters acquisition process as an example:
Taking the left camera coordinate system as the reference and combining the calibrated external parameters $(R_{lr}, t_{lr})$ of the right camera coordinate system (which express right-camera coordinates in the left camera frame), the translation vector of the right fitted camera relative to the left fitted camera in the left camera coordinate system is obtained from the rigid transformation relation:

$t_{\mathrm{fit}} = R_{lr}\, O_{\mathrm{fit}}^{(r)} + t_{lr} - O_{\mathrm{fit}}^{(l)}$ (10)

In addition, since the attitudes of the two fitted cameras relative to their respective camera coordinate systems are both identity matrices, the attitude of the right fitted camera relative to the left fitted camera is $R_{\mathrm{fit}} = R_{lr}$.

The external parameters of the right fitted camera relative to the left fitted camera in the left camera coordinate system are therefore $[\,R_{\mathrm{fit}} \mid t_{\mathrm{fit}}\,]$.
In the above process, once the positions of the two fitted cameras under their respective camera coordinate systems have been determined, the pose relationship of the two fitted cameras under the camera coordinate systems is obtained. This pose relationship serves as the external parameters of the fitted binocular camera. The internal parameters of the two fitted cameras are then obtained through the pinhole camera projection model.
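A short sketch of formulas (9)-(10), under the assumption that the calibrated extrinsics (R_lr, t_lr) express right-camera coordinates in the left camera frame; the function name and the numerical values are illustrative only, not calibration results.

```python
import numpy as np

def fitted_pair_extrinsics(R_lr, t_lr, O_fit_l, O_fit_r):
    """Relative pose of the right fitted camera w.r.t. the left fitted camera.

    Convention assumed here: X_left = R_lr @ X_right + t_lr.  Each fitted camera
    shares the axes of its real camera (formula (9)), so the relative rotation is
    unchanged and only the translation is shifted by the two fitted centres
    (formula (10)).
    """
    R_fit = R_lr                                   # attitudes are identities in their own frames
    t_fit = R_lr @ O_fit_r + t_lr - O_fit_l        # formula (10), in the left camera frame
    return R_fit, t_fit

# Illustrative values
R_lr = np.eye(3)
t_lr = np.array([0.12, 0.0, 0.0])                  # ~12 cm baseline
O_fit_l = np.array([0.001, -0.002, 0.004])         # fitted centres from the ray fit
O_fit_r = np.array([-0.001, 0.001, 0.005])
print(fitted_pair_extrinsics(R_lr, t_lr, O_fit_l, O_fit_r))
```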
In an optional embodiment, the step of determining the fit coordinates in the fit camera coordinate system corresponding to the calibration coordinates, and inputting the determined fit coordinates into the pinhole camera projection model to determine the fit camera internal parameters includes:
During the calibration process, the world coordinate system coincides with the plane of the calibration plate, and the pose of the calibrated camera coordinate system is recorded as $(R_w, t_w)$.

The world coordinates $X_w$ of a three-dimensional point are transferred into the camera coordinate system through the rigid transformation:

$X_c = R_w X_w + t_w$ (11)

The coordinates $X_c$ of this point in the camera coordinate system are then transferred into the fitted camera coordinate system as $X_{\mathrm{fit}}$:

$X_{\mathrm{fit}} = X_c - O_{\mathrm{fit}}$ (12)

Having obtained the coordinates $X_{\mathrm{fit}} = (X, Y, Z)^T$ of the three-dimensional point in the fitted camera coordinate system, the fitted camera internal parameters are determined according to the pinhole camera projection model:

$s \begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = K_{\mathrm{fit}} \begin{bmatrix} X \\ Y \\ Z \end{bmatrix}$ (13)

where the point $(u, v)$ is the pixel coordinates of the projection of the three-dimensional point onto the fitted camera image and $K_{\mathrm{fit}}$ is the internal parameter matrix of the fitted camera, specifically:

$K_{\mathrm{fit}} = \begin{bmatrix} f_x & 0 & c_x \\ 0 & f_y & c_y \\ 0 & 0 & 1 \end{bmatrix}$ (14)

Four fitted camera internal parameters appear in formula (14): the focal lengths $f_x, f_y$ and the principal point coordinates $(c_x, c_y)$. At least two three-dimensional points and the corresponding fitted image points are required to determine these internal parameters.
Optionally, the two three-dimensional points are any two of four corner points of the largest rectangle on the checkerboard calibration plate.
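To make the two-point requirement concrete, the sketch below solves the four intrinsics of formula (14) from two hypothetical corner correspondences, using the component form of formula (13), u = f_x·X/Z + c_x and v = f_y·Y/Z + c_y; the point and pixel values are invented for illustration.

```python
import numpy as np

def fit_intrinsics_from_two_points(P1, P2, p1, p2):
    """Solve the four fitted-camera intrinsics of formula (14) from two correspondences.

    P1, P2 : 3D points in the fitted camera coordinate system (X, Y, Z), Z > 0
    p1, p2 : their pixel projections (u, v) on the fitted camera image
    The pinhole model (formula (13)) gives u = fx*X/Z + cx and v = fy*Y/Z + cy,
    i.e. two independent 2x2 linear systems.
    """
    (X1, Y1, Z1), (X2, Y2, Z2) = P1, P2
    (u1, v1), (u2, v2) = p1, p2
    fx, cx = np.linalg.solve([[X1 / Z1, 1.0], [X2 / Z2, 1.0]], [u1, u2])
    fy, cy = np.linalg.solve([[Y1 / Z1, 1.0], [Y2 / Z2, 1.0]], [v1, v2])
    return np.array([[fx, 0, cx], [0, fy, cy], [0, 0, 1]])

# Hypothetical corners of the largest checkerboard rectangle, in fitted-camera coordinates
P1, P2 = (-0.20, -0.15, 1.0), (0.20, 0.15, 1.2)
p1, p2 = (400.0, 300.0), (1100.0, 800.0)
print(fit_intrinsics_from_two_points(P1, P2, p1, p2))
```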
Optionally, the step of determining the right fitted camera image and the left fitted camera image comprises:
Given a pixel $(u, v)$ of the fitted camera image, the corresponding coordinate in the camera coordinate system can be found by combining formulas (11)-(14), as follows:

$X_c = \lambda\, K_{\mathrm{fit}}^{-1} \begin{bmatrix} u \\ v \\ 1 \end{bmatrix} + O_{\mathrm{fit}}$ (15)

where $\lambda$ is a defined constant used to intercept a three-dimensional point on the ray along which the incident light path lies.

The pixel $p_{\mathrm{real}}$ to which the three-dimensional point $X_c$ projects on the real image can be found from the mathematical projection model $\pi_{\mathrm{uw}}(\cdot)$ of the underwater camera.

The coordinates of the fitted camera image are adjusted by the least squares method:

$(u^*, v^*) = \arg\min_{(u,v)} \big\| \pi_{\mathrm{uw}}\big(X_c(u, v)\big) - p_{\mathrm{real}} \big\|^2$ (16)

with the pixel of the fitted camera image taken as the initial value of the least squares.

The adjustment process is repeated for all points of the image area to obtain the image of the fitted camera.
According to the underwater camera projection model and the internal and external parameters of the fitted camera, the positional relationship between pixels on the fitted image and pixels on the real image can be established. The pixel positions of the fitted image are then adjusted by the least squares method according to this relationship. Traversing the pixel points of the whole image with this method yields the fitted camera image.
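A minimal sketch of the pixel mapping of formula (15), assuming a placeholder project_underwater function stands in for the patent's underwater projection model and the K_fit, O_fit values are illustrative: each fitted-image pixel is back-projected with K_fit, projected through the stand-in model, and sampled from the real image by nearest neighbour. The least-squares refinement of formula (16) is only noted in the comments and omitted here.

```python
import numpy as np

def make_fitted_image(real_img, K_fit, O_fit, project_underwater, lam=1.0):
    """Build the fitted-camera image from the real image (formula (15), sketch).

    project_underwater : callable mapping a 3D point in the camera frame to a
                         real-image pixel (u, v); a stand-in for the underwater model.
    Note: the patent additionally refines each fitted pixel position by least
    squares (formula (16)); that refinement is omitted in this sketch.
    """
    H, W = real_img.shape[:2]
    K_inv = np.linalg.inv(K_fit)
    fitted = np.zeros_like(real_img)
    for v in range(H):
        for u in range(W):
            X = lam * (K_inv @ np.array([u, v, 1.0])) + O_fit    # formula (15)
            ur, vr = project_underwater(X)                       # real-image pixel
            ui, vi = int(round(ur)), int(round(vr))
            if 0 <= ui < W and 0 <= vi < H:
                fitted[v, u] = real_img[vi, ui]                  # nearest-neighbour sample
    return fitted

# Toy stand-in for the underwater projection model: pinhole with mild radial distortion
def project_underwater(X, f=200.0, c=(80.0, 60.0), k1=-0.05):
    x, y = X[0] / X[2], X[1] / X[2]
    r2 = x * x + y * y
    x, y = x * (1 + k1 * r2), y * (1 + k1 * r2)
    return np.array([f * x + c[0], f * y + c[1]])

K_fit = np.array([[200.0, 0, 80.0], [0, 200.0, 60.0], [0, 0, 1.0]])
O_fit = np.array([0.002, -0.001, 0.003])
real = (np.random.default_rng(1).random((120, 160)) * 255).astype(np.uint8)
print(make_fitted_image(real, K_fit, O_fit, project_underwater).shape)
```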
Optionally, referring to fig. 8, the step of performing epipolar rectification on the right fitted camera image, the left fitted camera image and the fitted camera origin coordinates through an epipolar rectification algorithm includes:
From the external parameters $[\,R_{\mathrm{fit}} \mid t_{\mathrm{fit}}\,]$ of the right fitted camera relative to the left fitted camera in the left camera coordinate system, the external parameters of the right fitted camera relative to the left fitted camera in the left fitted camera coordinate system are determined (the two frames have parallel axes, so the relative rotation and translation carry over unchanged).

Combining the Bouguet method, the two fitted camera coordinate systems must be adjusted to be parallel. The rotation matrix $R_{\mathrm{fit}}$ is first converted into an angle-axis vector $\mathbf{r}$ using the Rodrigues formula; in this form the modulus of the vector represents the rotation angle and its unit vector is the rotation axis. To make the two fitted camera image planes parallel, the left fitted camera coordinate system needs to rotate by $\mathbf{r}/2$ and the right fitted camera coordinate system needs to rotate by $-\mathbf{r}/2$; the rotation angle and rotation axis are converted back into the corresponding rotation matrices $R_l$ and $R_r$ through the Rodrigues formula.

Under the left fitted camera coordinate system, the right fitted camera translation vector becomes:

$t' = R_l\, t_{\mathrm{fit}}$ (17)

in which the translation vector is written as $t' = (t_x, t_y, t_z)^T$.

A new rotation matrix $R_{\mathrm{rect}}$ is constructed so that the baseline of the fitted cameras is parallel to the image planes of the two fitted cameras; the construction is carried out from the translation vector $t'$:

$\mathbf{e}_1 = \dfrac{t'}{\lVert t' \rVert}$ (18)

Normalizing the translation vector in formula (18) yields the vector $\mathbf{e}_1$, which is the direction vector of the X axis of the rectified coordinate systems of the two fitted cameras.

The Y axis of the fitted camera coordinate system after epipolar rectification must be perpendicular to the plane XOZ. The Y-axis direction vector $\mathbf{e}_2$ is then expressed as:

$\mathbf{e}_2 = \dfrac{1}{\sqrt{t_x^2 + t_y^2}} \begin{bmatrix} -t_y \\ t_x \\ 0 \end{bmatrix}$ (19)

According to the properties of an orthogonal basis, the direction vector $\mathbf{e}_3$ of the Z axis of the adjusted fitted camera coordinate system is obtained as:

$\mathbf{e}_3 = \mathbf{e}_1 \times \mathbf{e}_2$ (20)

The correction matrix that adjusts the two fitted cameras to be coplanar is then set as $R_{\mathrm{rect}} = [\,\mathbf{e}_1\ \ \mathbf{e}_2\ \ \mathbf{e}_3\,]^T$, and the adjusted rotation matrices are $R_{\mathrm{rect}} R_l$ and $R_{\mathrm{rect}} R_r$, achieving epipolar rectification for the two fitted cameras.
In the above process, the invention uses the fitted camera internal parameter matrix $K_{\mathrm{fit}}$ determined in the preceding steps as the common internal parameter matrix of the binocular system. A mapping relationship between the pixels of the fitted camera images and the epipolar-rectified images is constructed with the pinhole camera model from the three-dimensional point coordinates, thereby realizing the epipolar rectification of the fitted camera images. The adjusted fitted camera images and the positional relationship between the centers of the two fitted cameras then conform to the definition of binocular epipolar geometry, and the epipolar rectification process can be completed using the obtained internal and external parameters of the two fitted cameras.
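A sketch of the Bouguet-style adjustment of formulas (17)-(20), assuming the relative pose (R_fit, t_fit) of the right fitted camera with respect to the left fitted camera as derived above; cv2.Rodrigues performs the angle-axis conversions, and the sign convention of the half rotations follows OpenCV's implementation, which may differ in notation from the patent.

```python
import numpy as np
import cv2

def bouguet_rectify(R_fit, t_fit):
    """Rectification rotations for the two fitted cameras (formulas (17)-(20), sketch).

    R_fit, t_fit : pose of the right fitted camera w.r.t. the left fitted camera.
    Returns the rotations to apply to the left and right fitted cameras.
    """
    # Split the relative rotation in half via the angle-axis (Rodrigues) form
    rvec, _ = cv2.Rodrigues(R_fit)
    R_l, _ = cv2.Rodrigues(rvec / 2.0)          # rotate the left system by +r/2
    R_r, _ = cv2.Rodrigues(-rvec / 2.0)         # rotate the right system by -r/2

    # formula (17): translation of the right fitted camera in the half-rotated left frame
    t = (R_l @ t_fit.reshape(3, 1)).ravel()

    # formulas (18)-(20): orthogonal basis with the X axis along the baseline
    e1 = t / np.linalg.norm(t)
    e2 = np.array([-t[1], t[0], 0.0]) / np.hypot(t[0], t[1])
    e3 = np.cross(e1, e2)
    R_rect = np.vstack([e1, e2, e3])

    return R_rect @ R_l, R_rect @ R_r

# Illustrative relative pose: small rotation, mostly horizontal baseline
rvec = np.array([[0.01], [0.02], [0.005]])
R_fit, _ = cv2.Rodrigues(rvec)
t_fit = np.array([0.12, 0.004, 0.002])
R1, R2 = bouguet_rectify(R_fit, t_fit)
print(R1 @ t_fit)   # baseline in the rectified left frame: ~ (|t|, 0, 0)
```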
The present invention uses numerical methods to verify its actual performance: corner points of underwater checkerboard pictures whose epipolar lines have been rectified are triangulated, and the measurement results are compared with the three-dimensional coordinates of the corner points. Using several groups of underwater images for triangulation, the mean measurement error obtained is about 1.3697 mm, and the maximum deviation from the mean does not exceed 1.4 mm.
The invention solves the problem of insufficient accuracy when the commonly used land pinhole camera model is applied to an underwater binocular camera system for epipolar rectification. For an underwater binocular camera shooting system equipped with hemispherical glass transparent covers, the invention realizes the epipolar rectification process of the shooting system in the underwater environment; because the rectification process incorporates the mathematical projection model of the underwater camera, triangulation using the adjusted images is more accurate. In addition, compared with other camera correction processes, the entire correction process of the present method can be carried out underwater, and the invention is suitable for scenarios in which the underwater binocular camera shooting system requires zooming operation. It should also be noted that, once the fitted camera images have been adjusted, the Bouguet-based part of the method can be replaced by other land epipolar rectification methods.
In the scheme of the invention, the key technical points are as follows:
1. Determination of the fitted camera position. The invention obtains the position of the fitted camera by a matrix method. The incident light path outside the hemispherical glass transparent cover corresponding to each pixel on the underwater image is determined from the given mathematical model of the underwater camera. The external incident light path is first expressed by a point and direction cosines according to formula (1); the incident light paths are then expanded into a linear system according to formula (2) and abstracted into the form of formula (3), with the specific parameters of the linear system given by formulas (4) and (5); finally, formula (3) is solved following the process of formulas (6)-(8), yielding the coordinates of the fitted camera center.
2. Determination of the fitted camera pose. The invention sets the fitted camera coordinate system parallel to the camera coordinate system and determines the attitude and position of the fitted camera relative to the real camera through formula (9). In the left camera coordinate system, the pose relationship between the right fitted camera and the left fitted camera is determined by formula (10).
3. Determination of the fitted camera internal parameters. Three-dimensional points in the world coordinate system are transformed into the camera coordinate system by the rigid-body transformation of formula (11) and then transferred into the fitted camera coordinate system by formula (12). Given the fitted camera image points and the corresponding three-dimensional point coordinates, the internal parameters of the fitted camera are obtained from the pinhole camera projection model according to formulas (13) and (14).
4. Acquisition of the fitted camera image. The invention sets the size of the fitted camera image to be consistent with that of the original image. For a given pixel of the fitted image, the coordinates of its corresponding three-dimensional point in the camera coordinate system are found with formula (15); the projection pixel of these coordinates in the real image is then found with the underwater camera projection model, and the fitted camera pixel coordinates are adjusted by the least squares of formula (16). This process is repeated for every point of the image, finally obtaining the fitted camera image.
5. Epipolar rectification of the fitted camera images. Combining the obtained fitted camera images with the Bouguet method, the coordinate systems of the two fitted cameras are adjusted to be parallel through formula (17) and then adjusted to be coplanar through formulas (18)-(20).
For an underwater binocular camera shooting system equipped with hemispherical glass transparent covers, the invention thus realizes the epipolar rectification process of the shooting system in the underwater environment; because the rectification process incorporates the mathematical projection model of the underwater camera, triangulation using the adjusted images is more accurate.
In the description provided herein, numerous specific details are set forth. It will be appreciated, however, that embodiments of the invention may be practiced without such specific details. Similarly, in the above description of exemplary embodiments of the invention, various features of embodiments of the invention are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of one or more of the various inventive aspects. Wherein the claims following the detailed description are hereby expressly incorporated into this detailed description, with each claim standing on its own as a separate embodiment of this invention. Those skilled in the art will appreciate that the modules in the apparatus of the embodiments may be adaptively changed and disposed in one or more apparatuses different from the embodiments. The modules or units or components of the embodiments may be combined into one module or unit or component and, furthermore, they may be divided into a plurality of sub-modules or sub-units or sub-components. Except that at least some of such features and/or processes or elements are mutually exclusive. It should be noted that the above-mentioned embodiments illustrate rather than limit the invention, and that those skilled in the art will be able to design alternative embodiments without departing from the scope of the appended claims. In the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. The word "comprising" does not exclude the presence of elements or steps not listed in a claim. The word "a" or "an" preceding an element does not exclude the presence of a plurality of such elements. The invention may be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer. In the unit claims enumerating several means, several of these means may be embodied by one and the same item of hardware. The use of the words first, second, third, etc. do not denote any order. These words may be interpreted as names. The steps in the above embodiments should not be construed as limiting the order of execution unless specifically stated.

Claims (10)

CN202411155542.6A · Filed 2024-08-22 · Polar correction algorithm of binocular underwater camera shooting system · Active · CN118691687B (en)

Priority Applications (1)

Application Number: CN202411155542.6A · Priority Date: 2024-08-22 · Filing Date: 2024-08-22 · Title: Polar correction algorithm of binocular underwater camera shooting system (CN118691687B)

Applications Claiming Priority (1)

Application Number: CN202411155542.6A · Priority Date: 2024-08-22 · Filing Date: 2024-08-22 · Title: Polar correction algorithm of binocular underwater camera shooting system (CN118691687B)

Publications (2)

Publication Number · Publication Date
CN118691687A (en) · 2024-09-24
CN118691687B (en) · 2024-11-19

Family

ID=92768448

Family Applications (1)

Application Number: CN202411155542.6A · Title: Polar correction algorithm of binocular underwater camera shooting system · Priority Date: 2024-08-22 · Filing Date: 2024-08-22 · Status: Active (CN118691687B)

Country Status (1)

Country · Link
CN (1) · CN118691687B (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number · Priority date · Publication date · Assignee · Title
CN105931261A (en)* · 2016-07-08 · 2016-09-07 · 北京格灵深瞳信息技术有限公司 · Method and device for modifying extrinsic parameters of binocular stereo camera
CN112634372A (en)* · 2020-11-27 · 2021-04-09 · 中山大学 · Real-time binocular camera correction method and device and storage medium

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number · Priority date · Publication date · Assignee · Title
CN112862876B (en)* · 2021-01-29 · 2024-07-23 · 中国科学院深海科学与工程研究所 · Real-time deep sea video image enhancement method for underwater robot
CN115063468B (en)* · 2022-06-17 · 2023-06-27 · 梅卡曼德(北京)机器人科技有限公司 · Binocular stereo matching method, computer storage medium and electronic equipment
CN115830103B (en)* · 2022-11-28 · 2025-04-08 · 北京石油化工学院 · Transparent object positioning method and device based on monocular color and storage medium
CN117495981A (en)* · 2023-11-22 · 2024-02-02 · 南京大学 · A correction method for multi-distortion models of binocular cameras
CN117611684A (en)* · 2023-12-04 · 2024-02-27 · 天津理工大学 · Structural parameter optimization calibration method for biprism virtual binocular vision system

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number · Priority date · Publication date · Assignee · Title
CN105931261A (en)* · 2016-07-08 · 2016-09-07 · 北京格灵深瞳信息技术有限公司 · Method and device for modifying extrinsic parameters of binocular stereo camera
CN112634372A (en)* · 2020-11-27 · 2021-04-09 · 中山大学 · Real-time binocular camera correction method and device and storage medium

Also Published As

Publication number · Publication date
CN118691687A (en) · 2024-09-24

Similar Documents

Publication · Title
CN107248178B (en) · Fisheye camera calibration method based on distortion parameters
CN108257183B (en) · Method and device for calibrating optical axis of camera lens
CN111243033B (en) · Method for optimizing external parameters of binocular camera
CN106408556B (en) · A Calibration Method for Small Object Measurement System Based on General Imaging Model
CN108288294A (en) · Extrinsic parameter calibration method for a 3D camera group
CN111340888B (en) · Light field camera calibration method and system without white image
CN109615661A (en) · Device and method for calibrating internal parameters of light field camera
CN107527336B (en) · Lens relative position calibration method and device
WO2018201677A1 (en) · Bundle adjustment-based calibration method and device for telecentric lens-containing three-dimensional imaging system
CN108269234A (en) · Panoramic camera lens attitude estimation method and panoramic camera
CN110136068B (en) · Sound membrane dome assembly system based on position calibration between bilateral telecentric lens cameras
CN112258581B (en) · On-site calibration method for panoramic camera with multiple fisheye lenses
CN115760560A (en) · Depth information acquisition method and device, equipment and storage medium
CN118691686B (en) · A calibration algorithm for underwater camera shooting system
CN113763480B (en) · Combined calibration method for multi-lens panoramic camera
CN118691687B (en) · Polar correction algorithm of binocular underwater camera shooting system
CN111292380B (en) · Image processing method and device
CN111583117A (en) · Rapid panoramic stitching method and device suitable for space complex environment
CN113496516A (en) · Calibration method and calibration device
CN115457142B (en) · Calibration method and system of MR hybrid photographic camera
TW202131084A (en) · Method of determining assembly quality of camera module
CN115439541A (en) · Glass orientation calibration system and method for refraction imaging system
CN114373019A (en) · Method for calibrating camera without public view field by using optimization method
CN113345024A (en) · Method for judging assembling quality of camera module
CN115187651B (en) · A method for determining the position and posture of Stewart mechanism based on vision

Legal Events

Code · Title
PB01 · Publication
SE01 · Entry into force of request for substantive examination
GR01 · Patent grant
