CN103530880B - Camera calibration method based on a projected Gaussian grid pattern - Google Patents

Camera calibration method based on a projected Gaussian grid pattern

Info

Publication number
CN103530880B
Authority
CN
China
Prior art keywords
camera
gaussian
points
calibration
point
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201310482789.4A
Other languages
Chinese (zh)
Other versions
CN103530880A (en)
Inventor
贾振元
刘巍
李明星
刘阳
杨景豪
张驰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Dalian University of Technology
Original Assignee
Dalian University of Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Dalian University of Technology
Priority to CN201310482789.4A
Publication of CN103530880A
Application granted
Publication of CN103530880B
Status: Active
Anticipated expiration


Abstract

Translated from Chinese

The invention discloses a camera calibration method based on a projected Gaussian grid pattern, belonging to the field of image processing and computer vision inspection, and in particular relates to the on-site calibration of the intrinsic and extrinsic parameters of the cameras in a dimensional measurement system for large forgings. The method exploits the fact that the gray level of each horizontal and vertical light stripe in the Gaussian grid pattern follows a Gaussian distribution across the stripe width: by fitting Gaussian curves, the image coordinates of points on a stripe centerline are obtained with high accuracy, the centerline equations of the horizontal and vertical stripes are then fitted, and the intersections of those centerlines serve as the calibration feature points. From the image coordinates of the feature points in the captured images of the Gaussian grid pattern, the intrinsic and extrinsic camera parameters are obtained step by step. The invention offers good real-time performance, robustness, and high calibration accuracy; the step-by-step calibration yields high-precision camera parameters and avoids the coupling problem that arises when all camera parameters are solved simultaneously, making the method suitable for on-line calibration of cameras at the forging site.

Description

Translated from Chinese
Camera Calibration Method Based on a Projected Gaussian Grid Pattern

Technical Field

The invention belongs to the field of image processing and computer vision inspection, and in particular relates to a method for the on-site calibration of the intrinsic and extrinsic parameters of cameras in a dimensional measurement system for large forgings.

Background

One of the basic tasks of computer vision is to recover the three-dimensional geometric information of an object from two-dimensional image information. To compute the corresponding surface point of a spatial object from an image point, the geometric imaging model of the camera must be determined; the parameters of this model are called the camera parameters. Camera parameters are divided into intrinsic and extrinsic parameters: the intrinsic parameters are related to the geometric and optical characteristics of the camera itself, while the extrinsic parameters describe the three-dimensional position and orientation of the camera relative to a given world coordinate system. The process of determining the intrinsic and extrinsic parameters is called camera calibration, and the accuracy of the calibration method directly affects the accuracy of computer vision measurement. Research on fast, simple, and accurate camera calibration is therefore of great significance.

Traditional camera calibration methods can be divided, according to the calibration reference object, into methods based on 3D targets, methods based on 2D planar targets (represented by the checkerboard target method proposed by Zhang Zhengyou), and methods based on 1D targets. All of these methods require a calibration reference object, and for large-field-of-view cameras, whether the feature points of the reference object can be distributed evenly over the entire field of view directly affects the calibration accuracy. On the one hand, manufacturing a large, high-precision calibration target is expensive and the target is difficult to maintain. On the other hand, such methods are unusable where on-line calibration is required or a calibration reference object cannot be used at all: in the high-temperature environment of a forging workshop, calibration blocks, calibration plates, and pasted target points are all inapplicable. Traditional calibration methods therefore cannot meet the requirements of on-line dimensional measurement of large forgings. Self-calibration methods, by contrast, use no calibration object at all; they rely only on the constraints inherent in the camera's intrinsic parameters and estimate those parameters from the correspondences between image points across images. Such methods are flexible to operate, but their accuracy is limited and their robustness insufficient.

These problems can be solved by projecting the target with a projector. In theory the edges of the projected feature pattern should be step changes, but in practice, because of diffusion, the edges are gradual and shift toward the black background. Take a projected array of circular feature spots as an example: the center of each spot is the feature point used for calibration. Since each spot diffuses into its surroundings to a different degree, it is difficult to obtain an accurate spot center from the binarized image with the centroid method, and it is also not easy to obtain a high-precision center by extracting the circular feature boundary and fitting a circle (or ellipse). Likewise, if a projector projects an ordinary light-stripe pattern and the stripe centers are extracted as feature lines, the accuracy is hard to guarantee.

Summary of the Invention

The technical problem to be solved by the invention is to overcome the deficiencies of the prior art: at the forging site, traditional calibration methods suffer from low accuracy and lack of real-time capability, or cannot be applied at all, while self-calibration methods are not very accurate and insufficiently robust. The invention provides a large-field-of-view camera calibration method based on a projected Gaussian grid pattern. It exploits the fact that the gray level of each horizontal and vertical light stripe in the pattern follows a Gaussian distribution across the stripe width: by fitting Gaussian curves, the image coordinates of points on a stripe centerline are obtained with high accuracy, the centerline equations of the horizontal and vertical stripes are fitted, and the intersections of those centerlines serve as the calibration feature points. From the image coordinates of the feature points in the captured images of the Gaussian grid pattern, the intrinsic and extrinsic camera parameters are obtained step by step.

The technical solution adopted by the invention is a camera calibration method based on a projected Gaussian grid pattern, characterized in that the gray level of each horizontal and vertical light stripe in the pattern follows a Gaussian distribution across the stripe width; by fitting Gaussian curves, the image coordinates of points on a stripe centerline are obtained with high accuracy, the centerline equations of the horizontal and vertical stripes are then fitted, and the intersections of those centerlines are taken as the calibration feature points, from whose image coordinates in the captured images the intrinsic and extrinsic camera parameters are obtained step by step. The specific steps are as follows:

Step 1: Build the camera calibration system. Mount the left four-dimensional electrically controlled platform 2a, the right four-dimensional electrically controlled platform 2b, and the projector 3 on the table of platform 1; fix the left camera 4a on the left platform 2a and the right camera 4b on the right platform 2b.

Step 2: Project the Gaussian grid pattern, capture images, and obtain the intersection coordinates. Using the projector 3, project onto a smooth flat panel or wall 5 inside the workshop a Gaussian grid pattern 6 composed of several parallel horizontal light stripes and several parallel vertical light stripes; the gray level of every stripe follows a Gaussian distribution across the stripe width. The intersection A_{i,j} of a horizontal and a vertical stripe is a calibration feature point, where i is the index of the horizontal stripe, numbered from top to bottom, and j is the index of the vertical stripe, numbered from left to right. Because the light intensities add at the intersections, binarizing the images captured by the left camera 4a and the right camera 4b leaves only the bright spots at the grid intersections, i.e. a set of isolated connected regions. The centroid method gives the centroid coordinates (u0_{i,j}, v0_{i,j}) of each connected region, which serve as the rough position of the feature point A_{i,j}. Take as the search range the circular region centered on this rough position with a radius of Δ pixels. Within [u0_{i,j}-Δ, u0_{i,j}+Δ], sample the horizontal stripe across its width every Δ/n, fit each cross-section with a Gaussian, and take the Gaussian peak as a point on the stripe centerline; this yields 2n+1 centerline points P_{i,j,s}, s = 1, 2, …, 2n+1, from which the line l_{h,i,j} is fitted. Similarly, within [v0_{i,j}-Δ, v0_{i,j}+Δ], sample the vertical stripe across its width every Δ/n and fit Gaussians to obtain 2n+1 centerline points Q_{i,j,t}, t = 1, 2, …, 2n+1, from which the line l_{v,i,j} is fitted. Finally, the intersection of the two fitted lines within the same search range is taken as the calibration feature point A_{i,j}, with coordinates (u_{i,j}, v_{i,j}).
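The per-cross-section Gaussian fit of step 2 can be sketched in a few lines of numpy. This is an illustrative sketch, not the patent's implementation: the function name, the 20% tail threshold, and the log-parabola trick (fitting log intensity with a parabola, whose vertex is the Gaussian peak) are my choices.

```python
import numpy as np

def gaussian_peak_subpixel(profile):
    """Estimate the sub-pixel peak of a 1-D intensity profile whose
    cross-section is approximately Gaussian, as for a projected light
    stripe.  Fitting log(I) with a parabola turns the Gaussian fit into
    a linear least-squares problem; the vertex gives the stripe center."""
    profile = np.asarray(profile, dtype=float)
    x = np.arange(len(profile))
    mask = profile > profile.max() * 0.2        # ignore the dark tails
    a, b, _ = np.polyfit(x[mask], np.log(profile[mask]), 2)
    return -b / (2.0 * a)                       # vertex of the parabola

# Synthetic stripe cross-section centered at 7.3 pixels
x = np.arange(16)
I = np.exp(-(x - 7.3) ** 2 / (2 * 1.5 ** 2))
center = gaussian_peak_subpixel(I)
```

Repeating this at 2n+1 sample positions and fitting a line through the recovered peaks gives the centerline l_{h,i,j} or l_{v,i,j} described above.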

Step 3: Obtain rough coordinates of the principal point. Use the left camera 4a or the right camera 4b to photograph the same projected Gaussian grid pattern 6 at two different focal lengths. Let the image coordinates of feature point A_{i,j} in the two images be (u1_{i,j}, v1_{i,j}) and (u2_{i,j}, v2_{i,j}), and let the principal point be (u_0, v_0). Then:

$$\frac{u2_{i,j}-u_0}{u1_{i,j}-u_0}=\frac{v2_{i,j}-v_0}{v1_{i,j}-v_0} \qquad (2)$$

The rough position (u_0, v_0) of the principal point can be obtained from the above formula.
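Cross-multiplying Eq. (2) gives one linear equation per feature point, u_0·(v2-v1) + v_0·(u1-u2) = u1·v2 - u2·v1, so the rough principal point reduces to a small least-squares problem over all intersections. A minimal numpy sketch with synthetic data (the function name and the 1.6 zoom ratio are illustrative):

```python
import numpy as np

def principal_point_from_zoom(pts1, pts2):
    """Rough principal point from two images of the same pattern taken
    at two focal lengths (Eq. 2): under zooming every feature point
    moves radially away from the principal point, so each pair of
    observations yields one linear equation in (u0, v0)."""
    pts1 = np.asarray(pts1, float)
    pts2 = np.asarray(pts2, float)
    u1, v1 = pts1.T
    u2, v2 = pts2.T
    A = np.column_stack([v2 - v1, u1 - u2])
    b = u1 * v2 - u2 * v1
    (u0, v0), *_ = np.linalg.lstsq(A, b, rcond=None)
    return float(u0), float(v0)

# Synthetic check: points at focal length 1, then scaled radially
# about (320, 240) as they would appear at focal length 2
pts1 = np.array([[100., 80.], [500., 400.], [250., 300.], [400., 120.]])
c = np.array([320., 240.])
pts2 = c + 1.6 * (pts1 - c)
u0, v0 = principal_point_from_zoom(pts1, pts2)
```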

Step 4: Obtain the distortion coefficients and the optimized principal point coordinates. From the distortion model, the relationship between the coordinates p_{i,j} = (u_{i,j}, v_{i,j}, 1)^T of the actually captured intersection A_{i,j} and the ideal intersection coordinates q_{i,j} = (u'_{i,j}, v'_{i,j}, 1)^T is:

$$\begin{bmatrix} u'_{i,j} \\ v'_{i,j} \end{bmatrix} = \begin{bmatrix} u_{i,j} \\ v_{i,j} \end{bmatrix} + \begin{bmatrix} \tilde u_{i,j}(k_1 r_{i,j}^2 + k_2 r_{i,j}^4) + 2p_1\tilde u_{i,j}\tilde v_{i,j} + p_2(r_{i,j}^2 + 2\tilde u_{i,j}^2) \\ \tilde v_{i,j}(k_1 r_{i,j}^2 + k_2 r_{i,j}^4) + p_1(r_{i,j}^2 + 2\tilde v_{i,j}^2) + 2p_2\tilde u_{i,j}\tilde v_{i,j} \end{bmatrix} \qquad (3)$$

where $\tilde u_{i,j} = u_{i,j} - u_0$, $\tilde v_{i,j} = v_{i,j} - v_0$, and $r_{i,j}^2 = \tilde u_{i,j}^2 + \tilde v_{i,j}^2$; k_1 and k_2 are the radial distortion coefficients, and p_1 and p_2 are the tangential distortion coefficients.
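Eq. (3) maps each observed intersection to its ideal position once the coefficients are known. A direct numpy transcription (the function name and the sample coefficient values are illustrative):

```python
import numpy as np

def correct_to_ideal(pts, u0, v0, k1, k2, p1, p2):
    """Correct observed intersection coordinates (u, v) to the ideal
    coordinates (u', v') with the radial + tangential model of Eq. (3)."""
    u, v = np.asarray(pts, float).T
    ut, vt = u - u0, v - v0                     # tilde-u, tilde-v
    r2 = ut ** 2 + vt ** 2                      # r^2
    radial = k1 * r2 + k2 * r2 ** 2             # k1 r^2 + k2 r^4
    du = ut * radial + 2 * p1 * ut * vt + p2 * (r2 + 2 * ut ** 2)
    dv = vt * radial + p1 * (r2 + 2 * vt ** 2) + 2 * p2 * ut * vt
    return np.column_stack([u + du, v + dv])

pts = np.array([[100., 120.], [320., 240.], [600., 450.]])
ideal = correct_to_ideal(pts, 320., 240., -2e-8, 0., 0., 0.)
```

With all coefficients zero the map is the identity, and the principal point itself is always a fixed point of the correction.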

In addition, taking a grid pattern with the same number of horizontal and vertical stripes as an example, if the total number of intersections is num, then each row contains $\sqrt{num}$ intersections. From the fact that straight lines remain straight under projection, i.e. that points on the same stripe are collinear, combined with the necessary and sufficient condition for three points to be collinear, the optimization objective function can be written as:

$$\min \sum_{i=1}^{\sqrt{num}} \left( \sum_{j=\sqrt{num}(i-1)+1}^{\sqrt{num}(i-1)+\sqrt{num}-2} \left| q_{i,j+1}^T\,[q_{i,j}]_\times\, q_{i,j+2} \right| \right) \qquad (4)$$

where $q_{i,j+1}^T [q_{i,j}]_\times q_{i,j+2} = 0$ is the necessary and sufficient condition for the three points A_{i,j}, A_{i,j+1}, A_{i,j+2} to be collinear, and $[q_{i,j}]_\times$ denotes the antisymmetric matrix of q_{i,j}, namely:

$$[q_{i,j}]_\times = \begin{bmatrix} 0 & -1 & v'_{i,j} \\ 1 & 0 & -u'_{i,j} \\ -v'_{i,j} & u'_{i,j} & 0 \end{bmatrix} \qquad (5)$$

Minimizing the objective function of Eq. (4) with the Levenberg-Marquardt nonlinear optimization algorithm yields the distortion coefficients k_1, k_2, p_1, and p_2 together with the optimized principal point coordinates (u'_0, v'_0). All intersection coordinates are then corrected to their ideal values using Eq. (3).
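The residual vector behind Eqs. (4)-(5) can be written compactly: the triple product q_{j+1}·(q_j × q_{j+2}) vanishes exactly when the three points are collinear. The sketch below (numpy only; function name and grid indexing are my choices) builds those residuals; in the patent they are minimized with Levenberg-Marquardt over (u_0, v_0, k_1, k_2, p_1, p_2), e.g. via `scipy.optimize.least_squares`.

```python
import numpy as np

def collinearity_residuals(params, pts, lines):
    """Residuals of the Eq. (4) objective: after correcting the observed
    intersections with the candidate parameters (Eq. 3), every
    consecutive triple of points on one grid line should satisfy the
    collinearity condition q_{j+1}^T [q_j]_x q_{j+2} = 0."""
    u0, v0, k1, k2, p1, p2 = params
    u, v = pts.T
    ut, vt = u - u0, v - v0
    r2 = ut ** 2 + vt ** 2
    rad = k1 * r2 + k2 * r2 ** 2
    q = np.column_stack([
        u + ut * rad + 2 * p1 * ut * vt + p2 * (r2 + 2 * ut ** 2),
        v + vt * rad + p1 * (r2 + 2 * vt ** 2) + 2 * p2 * ut * vt,
        np.ones(len(u))])
    res = []
    for line in lines:                       # indices of points on one stripe
        for a, b, c in zip(line, line[1:], line[2:]):
            # triple product q_b . (q_a x q_c); zero iff the points are collinear
            res.append(q[b] @ np.cross(q[a], q[c]))
    return np.asarray(res)

# A perfect 3x3 grid: with zero distortion every residual vanishes
pts = np.array([[x, y] for y in (100., 200., 300.) for x in (100., 200., 300.)])
lines = [[0, 1, 2], [3, 4, 5], [6, 7, 8],    # rows
         [0, 3, 6], [1, 4, 7], [2, 5, 8]]    # columns
res = collinearity_residuals(np.zeros(6), pts, lines)
```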

Step 5: Obtain the remaining intrinsic parameters. Using the corrected ideal intersections as feature points and an active-vision approach, drive the left camera 4a with the left four-dimensional electrically controlled platform 2a through two sets of orthogonal motions, capturing one image of the projected Gaussian grid pattern 6 at each of the three start/end positions of each set, so that the left camera 4a ultimately captures six images.

Parallel lines intersect the plane at infinity at the same point at infinity, whose image is the vanishing point. One set of orthogonal motions contains two translations; the lines joining corresponding intersections in the two images captured at the start and end of one translation form a set of parallel lines in space, and the two translations are mutually perpendicular. We therefore obtain a pair of orthogonal vanishing points e_{i1}(u_{i1}, v_{i1}) and e_{i2}(u_{i2}, v_{i2}), where i = 1, 2 indexes the orthogonal-motion set, and Oe_{i1} · Oe_{i2} = 0, where O is the principal point of the camera. With two pairs of orthogonal vanishing points, the scale factors α_x and α_y in the intrinsic parameter matrix K of the left camera 4a are obtained by solving the following system of equations:

$$\begin{cases} (u_{11}-u'_0)(u_{12}-u'_0)/\alpha_x^2 + (v_{11}-v'_0)(v_{12}-v'_0)/\alpha_y^2 + 1 = 0 \\ (u_{21}-u'_0)(u_{22}-u'_0)/\alpha_x^2 + (v_{21}-v'_0)(v_{22}-v'_0)/\alpha_y^2 + 1 = 0 \end{cases} \qquad (6)$$

The scale factors α_x and α_y in the intrinsic parameter matrix K of the right camera 4b are obtained in the same way.
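Eq. (6) is linear in x = (1/α_x², 1/α_y²), so the two scale factors come from a 2×2 solve. A sketch with synthetic vanishing points (the function name and the test directions are my choices; the check data are built from α_x = 1000, α_y = 980, principal point (320, 240)):

```python
import numpy as np

def scale_factors(e1, e2, u0, v0):
    """Solve Eq. (6) for alpha_x and alpha_y.  e1 and e2 are the two
    orthogonal vanishing-point pairs ((u_i1, v_i1), (u_i2, v_i2));
    the system is linear in x = (1/alpha_x^2, 1/alpha_y^2)."""
    A = np.array([[(p[0] - u0) * (q[0] - u0), (p[1] - v0) * (q[1] - v0)]
                  for p, q in (e1, e2)])
    x = np.linalg.solve(A, np.array([-1.0, -1.0]))
    return float(1.0 / np.sqrt(x[0])), float(1.0 / np.sqrt(x[1]))

# Vanishing points of orthogonal direction pairs (1, 0.5, 1)/(1, -4, 1)
# and (3, 1, 1)/(1, -4, 1) under the intrinsics stated above
e1 = ((1320., 730.), (1320., -3680.))
e2 = ((3320., 1220.), (1320., -3680.))
ax, ay = scale_factors(e1, e2, 320., 240.)
```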

Step 6: Obtain the extrinsic parameters of the cameras. The left camera 4a and the right camera 4b photograph the same projected Gaussian grid pattern 6. The world coordinate system is established on the camera coordinate system of the left camera 4a, and the fundamental matrix F is computed from the corrected matched points in the images captured by the left and right cameras 4a and 4b. With the intrinsic parameters already obtained, the essential matrix E can be computed from the fundamental matrix up to a scale factor s. Decomposing E then determines the extrinsic parameters (rotation matrix R' and translation vector t') up to that scale factor.
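The patent does not spell out how E is decomposed; the standard SVD factorization yields four (R', t') candidates, of which the one placing the reconstructed points in front of both cameras is kept in practice. A sketch of that factorization (not the patent's code), with t' recovered only up to the scale factor s:

```python
import numpy as np

def decompose_essential(E):
    """Factor an essential matrix into a rotation R' and a unit-norm
    translation t' (one of the four standard solutions)."""
    U, _, Vt = np.linalg.svd(E)
    if np.linalg.det(U) < 0:
        U = -U
    if np.linalg.det(Vt) < 0:
        Vt = -Vt
    W = np.array([[0., -1., 0.],
                  [1.,  0., 0.],
                  [0.,  0., 1.]])
    R = U @ W @ Vt          # the twisted pair uses W.T instead of W
    t = U[:, 2]             # left null vector of E, up to sign
    return R, t

# Synthetic check: build E = [t]_x R from a known pose and refactor it
t_true = np.array([1., 0.2, -0.1]); t_true /= np.linalg.norm(t_true)
c, s = np.cos(0.1), np.sin(0.1)
R_true = np.array([[c, 0., s], [0., 1., 0.], [-s, 0., c]])  # small y-rotation
tx = np.array([[0., -t_true[2], t_true[1]],
               [t_true[2], 0., -t_true[0]],
               [-t_true[1], t_true[0], 0.]])
E = tx @ R_true
R, t = decompose_essential(E)
```

The recovered R is a proper rotation and t is parallel to the true baseline direction, as the cheirality check would confirm.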

Use the projector 3 to project parallel Gaussian light stripes onto a gauge block whose actual length L_0 has been measured accurately. Fit the sub-pixel stripe centerlines using the Gaussian profile of the stripes, take the points of abrupt gray-level change as the boundary points of the gauge block, and reconstruct the gauge-block length L'_0 with the intrinsic and extrinsic parameters obtained above. The scale factor is then s = L_0 / L'_0, so the actual extrinsic parameters of the camera are obtained as the rotation matrix R = R' and the translation vector t = s·t'. This completes the camera calibration process.

The beneficial effect of the invention is that the intersections of the projector-projected Gaussian grid pattern are used as calibration feature points, which avoids calibration blocks, calibration plates, and pasted marker points and makes real-time camera calibration feasible in complex environments such as a forging site. Because the gray level across the stripe width follows a Gaussian distribution, the positions of the calibration feature points can be determined with high accuracy and high robustness; the step-by-step calibration yields high-precision camera parameters while avoiding the coupling problem that arises when all camera parameters are solved simultaneously.

Brief Description of the Drawings

Fig. 1 is a schematic diagram of the calibration system of the invention, in which: 1 is the vibration-isolated platform, 2a the left four-dimensional electrically controlled platform, 2b the right four-dimensional electrically controlled platform, 3 the projector, 4a the left camera, 4b the right camera, 5 the smooth flat panel or wall, and 6 the Gaussian grid pattern.

Fig. 2 is an image of the grid pattern captured by a camera.

Fig. 3 shows the binarized grid-pattern image and the rough positions of the intersections.

Fig. 4 shows the reconstruction of the gauge-block dimension from the Gaussian stripe array to obtain the scale factor.

Detailed Description

Specific embodiments of the invention are described in further detail below with reference to the accompanying drawings and the technical solution.

Camera calibration usually adopts the classic pinhole imaging model, expressed as:

$$\rho_c \begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = K\,[R \mid t] \begin{bmatrix} X_w \\ Y_w \\ Z_w \\ 1 \end{bmatrix}, \qquad K = \begin{bmatrix} \alpha_x & 0 & u_0 \\ 0 & \alpha_y & v_0 \\ 0 & 0 & 1 \end{bmatrix} \qquad (1)$$

where (X_w, Y_w, Z_w, 1)^T are the homogeneous coordinates of a spatial point in the world coordinate system, (u, v, 1)^T are the homogeneous coordinates of the corresponding image point in the pixel coordinate system o_0uv, α_x = f/dx is the scale factor along the u axis and α_y = f/dy the scale factor along the v axis of the o_0uv coordinate system, f is the focal length of the camera lens, dx and dy are the horizontal and vertical physical dimensions of a pixel, (u_0, v_0) are the principal point coordinates, ρ_c is a scale coefficient, K is the intrinsic parameter matrix of the camera, and [R|t] is the extrinsic parameter matrix of the camera, where R is the rotation matrix and t the translation vector.
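The pinhole model above is a single matrix product followed by a perspective division. A minimal numpy sketch (the function name and sample values are illustrative):

```python
import numpy as np

def project(K, R, t, Xw):
    """Pinhole model of Eq. (1): rho_c (u, v, 1)^T = K [R | t] (Xw, Yw, Zw, 1)^T."""
    uvw = K @ (R @ np.asarray(Xw, float) + t)   # rho_c * (u, v, 1)
    return uvw[:2] / uvw[2]                     # divide out rho_c

K = np.array([[1000., 0., 320.],
              [0., 1000., 240.],
              [0., 0., 1.]])
uv = project(K, np.eye(3), np.zeros(3), [0.5, -0.25, 5.0])
```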

The intrinsic parameters of the camera comprise the principal point coordinates (u_0, v_0), the scale factors α_x and α_y, the radial distortion coefficients k_1 and k_2, and the tangential distortion coefficients p_1 and p_2. The extrinsic parameters give the orientation of the camera coordinate system relative to the world coordinate system, comprising the rotation matrix R and the translation vector t.

Step 1: Build the camera calibration system. Mount the left four-dimensional electrically controlled platform 2a, the right four-dimensional electrically controlled platform 2b, and the projector 3 on the table of platform 1; fix the left camera 4a on the left platform 2a and the right camera 4b on the right platform 2b, as shown in Fig. 1.

Step 2: Project the Gaussian grid pattern, capture images, and obtain the intersection coordinates. Using the projector 3, project the Gaussian grid pattern 6 onto a smooth flat panel or wall 5 inside the workshop; the pattern is composed of several parallel horizontal light stripes and several parallel vertical light stripes, and the gray level of every stripe follows a Gaussian distribution across the stripe width. The intersection A_{i,j} of a horizontal and a vertical stripe is a calibration feature point, where i indexes the horizontal stripes from top to bottom and j indexes the vertical stripes from left to right. An image of the projected pattern captured by the left camera 4a or the right camera 4b is shown in Fig. 2. Because the light intensities add at the intersections, binarizing the captured images leaves only the bright spots at the grid intersections, i.e. a set of isolated connected regions, as shown in Fig. 3. The centroid method gives the centroid coordinates (u0_{i,j}, v0_{i,j}) of each connected region as the rough position of the feature point A_{i,j}. Take as the search range the circular region centered on this rough position with a radius of Δ pixels. Within [u0_{i,j}-Δ, u0_{i,j}+Δ], sample the horizontal stripe across its width every Δ/n, fit each cross-section with a Gaussian, and take the Gaussian peak as a point on the stripe centerline, yielding 2n+1 centerline points P_{i,j,s}, s = 1, 2, …, 2n+1, from which the line l_{h,i,j} is fitted. Similarly, within [v0_{i,j}-Δ, v0_{i,j}+Δ], sample the vertical stripe every Δ/n to obtain 2n+1 centerline points Q_{i,j,t}, t = 1, 2, …, 2n+1, from which the line l_{v,i,j} is fitted. Finally, the intersection of the two fitted lines within the same search range is taken as the calibration feature point A_{i,j}, with coordinates (u_{i,j}, v_{i,j}).

Step 3: Obtain rough coordinates of the principal point using the variable-focal-length method. The left camera 4a or the right camera 4b photographs the same projected Gaussian grid pattern 6 at two different focal lengths; the image coordinates of feature point A_{i,j} in the two images are (u1_{i,j}, v1_{i,j}) and (u2_{i,j}, v2_{i,j}), and the principal point is (u_0, v_0). Then:

$$\frac{u2_{i,j}-u_0}{u1_{i,j}-u_0}=\frac{v2_{i,j}-v_0}{v1_{i,j}-v_0} \qquad (2)$$

Treating the zoom center of the lens as the principal point for the time being, the rough position (u_0, v_0) of the principal point can be obtained from the above formula.

Step 4: Obtain the distortion coefficients and the optimized principal point coordinates. From the distortion model, the relationship between the coordinates p_{i,j} = (u_{i,j}, v_{i,j}, 1)^T of the actually captured intersection A_{i,j} and the ideal intersection coordinates q_{i,j} = (u'_{i,j}, v'_{i,j}, 1)^T is:

$$\begin{bmatrix} u'_{i,j} \\ v'_{i,j} \end{bmatrix} = \begin{bmatrix} u_{i,j} \\ v_{i,j} \end{bmatrix} + \begin{bmatrix} \tilde u_{i,j}(k_1 r_{i,j}^2 + k_2 r_{i,j}^4) + 2p_1\tilde u_{i,j}\tilde v_{i,j} + p_2(r_{i,j}^2 + 2\tilde u_{i,j}^2) \\ \tilde v_{i,j}(k_1 r_{i,j}^2 + k_2 r_{i,j}^4) + p_1(r_{i,j}^2 + 2\tilde v_{i,j}^2) + 2p_2\tilde u_{i,j}\tilde v_{i,j} \end{bmatrix} \qquad (3)$$

where $\tilde u_{i,j} = u_{i,j} - u_0$, $\tilde v_{i,j} = v_{i,j} - v_0$, and $r_{i,j}^2 = \tilde u_{i,j}^2 + \tilde v_{i,j}^2$; k_1 and k_2 are the radial distortion coefficients, and p_1 and p_2 are the tangential distortion coefficients.

In addition, taking a grid pattern with the same number of horizontal and vertical stripes as an example, if the total number of intersections is num, then each row contains $\sqrt{num}$ intersections. From the fact that straight lines remain straight under projection, i.e. that points on the same stripe are collinear, combined with the necessary and sufficient condition for three points to be collinear, the optimization objective function can be written as:

$$\min \sum_{i=1}^{\sqrt{num}} \left( \sum_{j=\sqrt{num}(i-1)+1}^{\sqrt{num}(i-1)+\sqrt{num}-2} \left| q_{i,j+1}^T\,[q_{i,j}]_\times\, q_{i,j+2} \right| \right) \qquad (4)$$

where $q_{i,j+1}^T [q_{i,j}]_\times q_{i,j+2} = 0$ is the necessary and sufficient condition for the three points A_{i,j}, A_{i,j+1}, A_{i,j+2} to be collinear, and $[q_{i,j}]_\times$ denotes the antisymmetric matrix of q_{i,j}, namely:

$$[q_{i,j}]_\times = \begin{bmatrix} 0 & -1 & v'_{i,j} \\ 1 & 0 & -u'_{i,j} \\ -v'_{i,j} & u'_{i,j} & 0 \end{bmatrix} \qquad (5)$$

Minimizing the objective function of Eq. (4) with the Levenberg-Marquardt nonlinear optimization algorithm yields the distortion coefficients k_1, k_2, p_1, and p_2 together with the optimized principal point coordinates (u'_0, v'_0). All intersection coordinates are then corrected to their ideal values using Eq. (3).

Step 5: Obtain the remaining intrinsic parameters. Using the corrected ideal intersections as feature points and an active-vision approach, drive the left camera 4a with the left four-dimensional electrically controlled platform 2a through two sets of orthogonal motions, capturing one image of the projected Gaussian grid pattern 6 at each of the three start/end positions of each set, so that the left camera 4a ultimately captures six images. The specific procedure is as follows: (1) adjust the left platform 2a to a suitable position and perform the first set of orthogonal motions, capturing one image of the projected pattern 6 at each of the three start/end positions; (2) use the platform 2a to pitch the left camera 4a downward by a certain angle and perform the second set of orthogonal motions, again capturing one image at each of the three start/end positions.

The image of the point at infinity on a straight line is called the vanishing point of that line; since parallel lines meet the plane at infinity in the same point at infinity, they share one vanishing point. One set of orthogonal motions contains two translations. The lines connecting corresponding intersections in the two images taken at the start and end positions of one translation form a set of parallel lines in space, and the two translations are perpendicular to each other, so one pair of orthogonal vanishing points e<sub>i1</sub>(u<sub>i1</sub>, v<sub>i1</sub>) and e<sub>i2</sub>(u<sub>i2</sub>, v<sub>i2</sub>) is obtained, where the subscript i = 1, 2 denotes the order of the orthogonal motions and Oe<sub>i1</sub>·Oe<sub>i2</sub> = 0, with O the principal point of the camera. Using the two pairs of orthogonal vanishing points, the scale factors α<sub>x</sub> and α<sub>y</sub> in the internal parameter matrix K of the left camera 4a are obtained by solving the following equations:

$$\begin{cases} \dfrac{(u_{11}-u'_0)(u_{12}-u'_0)}{\alpha_x^2} + \dfrac{(v_{11}-v'_0)(v_{12}-v'_0)}{\alpha_y^2} + 1 = 0 \\[6pt] \dfrac{(u_{21}-u'_0)(u_{22}-u'_0)}{\alpha_x^2} + \dfrac{(v_{21}-v'_0)(v_{22}-v'_0)}{\alpha_y^2} + 1 = 0 \end{cases} \tag{6}$$

In the same way, the scale factors α<sub>x</sub> and α<sub>y</sub> in the internal parameter matrix K of the right camera 4b are obtained.
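Equation (6) is linear in 1/α<sub>x</sub>² and 1/α<sub>y</sub>², so the two scale factors follow from a 2×2 linear solve. The sketch below uses synthetic vanishing points generated from a hypothetical camera (α<sub>x</sub> = 1000, α<sub>y</sub> = 980, principal point at the image origin); the function name is ours:

```python
import numpy as np

def focal_scales(vp_pairs, principal):
    """Solve Eq. (6): each orthogonal vanishing-point pair contributes
    one equation, linear in 1/ax^2 and 1/ay^2."""
    u0, v0 = principal
    A, b = [], []
    for (u1, v1), (u2, v2) in vp_pairs:
        A.append([(u1 - u0) * (u2 - u0), (v1 - v0) * (v2 - v0)])
        b.append(-1.0)
    inv_ax2, inv_ay2 = np.linalg.solve(np.array(A), np.array(b))
    return 1.0 / np.sqrt(inv_ax2), 1.0 / np.sqrt(inv_ay2)

# Vanishing points of two orthogonal motion sets for the hypothetical
# camera above (directions (1,1,1)/(1,1,-2) and (2,1,1)/(1,0,-2)).
pairs = [((1000.0, 980.0), (-500.0, -490.0)),
         ((2000.0, 980.0), (-500.0, 0.0))]
ax, ay = focal_scales(pairs, (0.0, 0.0))   # ax ≈ 1000.0, ay ≈ 980.0
```

Each bracketed product is negative when the two vanishing points lie on opposite sides of the principal point, which is what makes the solution for 1/α² positive and the square roots real.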

Step 6: obtain the external parameters. The left camera 4a and the right camera 4b photograph the same projected Gaussian grid pattern 6, and the world coordinate system is placed at the camera coordinate system of the left camera 4a. The fundamental matrix F is computed from the corrected matching points in the images taken by the left and right cameras 4a and 4b. From the internal parameters and the fundamental matrix, the essential matrix E is obtained up to a scale factor s, and decomposing E determines the external parameters (rotation matrix R' and translation vector t'), again up to the scale factor.
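A minimal numeric sketch of this step (our own function names; the patent gives no implementation) forms E from F and the two intrinsic matrices and splits it by SVD into the four (R', t') candidates:

```python
import numpy as np

def essential_from_fundamental(F, K_left, K_right):
    """E = K_right^T F K_left, defined only up to scale."""
    return K_right.T @ F @ K_left

def decompose_essential(E):
    """Return the four (R', t') candidates; t' is a unit vector,
    i.e. the translation up to the scale factor s."""
    U, _, Vt = np.linalg.svd(E)
    W = np.array([[0.0, -1.0, 0.0],
                  [1.0,  0.0, 0.0],
                  [0.0,  0.0, 1.0]])
    R1, R2 = U @ W @ Vt, U @ W.T @ Vt
    if np.linalg.det(R1) < 0:   # force proper rotations
        R1 = -R1
    if np.linalg.det(R2) < 0:
        R2 = -R2
    t = U[:, 2]                 # null direction of E^T
    return [(R1, t), (R1, -t), (R2, t), (R2, -t)]

# Round trip on a synthetic pose: E built from R = I and t = (0, 0, 1).
t = np.array([0.0, 0.0, 1.0])
tx = np.array([[0.0, -t[2], t[1]],
               [t[2], 0.0, -t[0]],
               [-t[1], t[0], 0.0]])   # [t]x, the antisymmetric matrix of t
candidates = decompose_essential(tx @ np.eye(3))
```

In practice the physically valid candidate among the four is selected by checking that triangulated points lie in front of both cameras; the scale of t' is then fixed by the gauge-block measurement.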

As shown in Fig. 4, the projector 3 projects parallel Gaussian light stripes onto a gauge block whose actual length L<sub>0</sub> has been measured accurately. The sub-pixel stripe centerlines are fitted using the Gaussian profile of the stripes, and the points where the gray level changes abruptly are taken as the boundary points of the gauge block. The length L'<sub>0</sub> of the gauge block is then reconstructed from the internal and external parameters obtained above, giving the scale factor s = L<sub>0</sub>/L'<sub>0</sub>. The actual external parameters of the camera therefore follow as rotation matrix R = R' and translation vector t = s·t'. This completes the camera calibration process.

The camera calibration method proposed by the present invention offers good real-time performance, robustness, and high calibration accuracy, and can be used for on-line calibration of large-field-of-view cameras in complex environments such as forging sites.

Claims (1)

1. A camera calibration method based on a projected Gaussian grid pattern is characterized in that the camera calibration method utilizes the characteristic that the gray scales of horizontal and vertical light bars in the Gaussian grid pattern in the width direction are in Gaussian distribution, image coordinates of points on the center line of the light bars can be obtained at high precision by fitting a Gaussian curve, further a center line equation of the horizontal and vertical light bars is fitted, the intersection point of the center lines of the horizontal and vertical light bars is a calibration feature point, and internal and external parameters of a camera are obtained step by step according to image coordinates of the calibration feature point provided in an image of the photographed Gaussian grid pattern; the method comprises the following specific steps:
step 2: projecting the Gaussian grid pattern, shooting, and acquiring the intersection coordinates; a Gaussian grid pattern (6) consisting of several parallel transverse light bars and several parallel longitudinal light bars is projected by a projector (3) onto a smooth flat plate or wall surface (5) in the factory building, the gray levels of all light bars across their width being Gaussian-distributed; the intersection A<sub>i,j</sub> of the transverse and longitudinal light bars is the calibration feature point, where i numbers the transverse light bars from top to bottom and j numbers the longitudinal light bars from left to right; because the light intensities superpose at the intersections of the transverse and longitudinal bars, after binarizing the images taken by the left camera (4a) and the right camera (4b) only the bright spots at the grid intersections, i.e. isolated connected regions, remain; the centroid coordinates (u0<sub>i,j</sub>, v0<sub>i,j</sub>) of each connected region, obtained by the centroid method, serve as the coarse position of the feature point A<sub>i,j</sub>; a circular area centered on this coarse position with a radius of Δ pixels is taken as the search range; within [u0<sub>i,j</sub>-Δ, u0<sub>i,j</sub>+Δ] the transverse light bar is sampled across its width every Δ/n, the samples are fitted according to the Gaussian distribution characteristic, and the Gaussian peak is taken as a point on the centerline of the transverse bar, yielding 2n+1 centerline points P<sub>i,j,s</sub>, s = 1, 2, 3, …, 2n+1, to which a straight line l<sub>h,i,j</sub> is fitted; similarly, within [v0<sub>i,j</sub>-Δ, v0<sub>i,j</sub>+Δ] the longitudinal light bar is sampled across its width every Δ/n, fitted according to the Gaussian distribution characteristic, and the Gaussian peaks give 2n+1 centerline points Q<sub>i,j,t</sub>, t = 1, 2, 3, …, 2n+1, to which a straight line l<sub>v,i,j</sub> is fitted; finally, the intersection of the two fitted lines within the same search range is taken as the calibration feature point A<sub>i,j</sub>, whose coordinates are (u<sub>i,j</sub>, v<sub>i,j</sub>);
the parallel straight lines intersect the plane at infinity in the same point at infinity, i.e. the vanishing point; one set of orthogonal motions comprises two translations; the lines connecting corresponding intersections in the two images taken at the start and end positions of one translation form a set of parallel lines in space, and the two translations are mutually perpendicular, so a pair of orthogonal vanishing points e<sub>i1</sub>(u<sub>i1</sub>, v<sub>i1</sub>) and e<sub>i2</sub>(u<sub>i2</sub>, v<sub>i2</sub>) is obtained, the subscript i = 1, 2 indicating the order of the orthogonal motions, with Oe<sub>i1</sub>·Oe<sub>i2</sub> = 0, where O = (u'<sub>0</sub>, v'<sub>0</sub>) is the principal point of the camera; using the two pairs of orthogonal vanishing points, the scale factors α<sub>x</sub> and α<sub>y</sub> in the internal parameter matrix K of the left camera (4a) are solved from the following equations:
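The sub-pixel center extraction described in the claim, i.e. sampling the gray values across a stripe and taking the peak of a fitted Gaussian as the centerline point, can be sketched as follows (noise-free synthetic profile; the helper names are ours, not the patent's):

```python
import numpy as np
from scipy.optimize import curve_fit

def gaussian(x, amp, mu, sigma, offset):
    """Gaussian intensity profile across the width of a light stripe."""
    return amp * np.exp(-(x - mu) ** 2 / (2.0 * sigma ** 2)) + offset

def stripe_center(positions, intensities):
    """Fit a Gaussian to one cross-section of a light stripe and
    return the sub-pixel position of the peak (the centerline point)."""
    p0 = [intensities.max() - intensities.min(),  # amplitude guess
          positions[np.argmax(intensities)],      # peak position guess
          2.0,                                    # width guess, pixels
          intensities.min()]                      # background guess
    popt, _ = curve_fit(gaussian, positions, intensities, p0=p0)
    return popt[1]

# Synthetic cross-section: true stripe center at a 0.37-pixel offset.
x = np.arange(-8.0, 9.0)
profile = gaussian(x, amp=200.0, mu=0.37, sigma=2.5, offset=12.0)
center = stripe_center(x, profile)   # ≈ 0.37
```

Repeating this fit at 2n+1 sampling positions along a stripe gives the centerline points P<sub>i,j,s</sub> (or Q<sub>i,j,t</sub>) to which the lines l<sub>h,i,j</sub> and l<sub>v,i,j</sub> are fitted.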
Application CN201310482789.4A, filed 2013-10-16 (priority 2013-10-16): Based on the camera marking method of projection Gaussian network pattern — granted as CN103530880B (en), status Active.

Publications (2)

Publication Number | Publication Date
CN103530880A | 2014-01-22
CN103530880B | 2016-04-06
