本申请要求于2024年03月19日提交中国专利局、申请号为2024103165207,发明名称为“LED灯珠标定方法、装置、设备及介质”的中国专利申请的优先权,其全部内容通过引用结合在本申请中。This application claims priority to the Chinese patent application filed with the China Patent Office on March 19, 2024, with application number 2024103165207, and invention name “LED lamp bead calibration method, device, equipment and medium”, the entire contents of which are incorporated by reference into this application.
本申请涉及机器视觉标定技术领域,尤其涉及一种LED灯珠标定方法、装置、设备及介质。The present application relates to the field of machine vision calibration technology, and in particular to a method, device, equipment and medium for calibrating LED lamp beads.
智能穿戴设备(如AR/VR眼镜等)可以通过眼动追踪技术实现对用户视觉中心的定位,即通过眼动相机采集眼睛图像中的LED光源反射点,然后根据LED光源与眼动相机的相对坐标,构建眼球模型,进而根据眼球模型,计算眼睛图像中的视觉中心坐标,从而实现视觉中心的眼动追踪。因此,视觉中心眼动追踪的精确性取决于眼动相机和LED光源之间的相对位置标定准确性。Smart wearable devices (such as AR/VR glasses) can locate the user's visual center through eye tracking technology. This involves using an eye-tracking camera to capture the reflection point of an LED light source in the eye image. Based on the relative coordinates of the LED light source and the eye-tracking camera, an eyeball model is constructed. This eyeball model is then used to calculate the coordinates of the visual center in the eye image, enabling eye tracking of the visual center. Therefore, the accuracy of visual center eye tracking depends on the accuracy of the relative position calibration between the eye-tracking camera and the LED light source.
但是,相关技术中的LED光源标定方法不能直接获得眼动相机和LED光源的相对位置,而是分别对LED光源和眼动相机进行单独标定,获得两者在世界坐标系中的坐标,从而获得两者的相对位置。由此,会导致LED光源的标定结果不准确,从而降低眼动追踪结果的准确性。However, the LED light source calibration method in related art cannot directly determine the relative position of the eye tracking camera and the LED light source. Instead, the LED light source and the eye tracking camera must be calibrated separately to obtain their coordinates in the world coordinate system, thereby determining their relative position. This can lead to inaccurate LED light source calibration results, thereby reducing the accuracy of eye tracking results.
因此,如何提高眼动追踪系统中的LED灯珠标定准确性成为了亟待解决的技术问题。Therefore, how to improve the calibration accuracy of LED lamp beads in eye tracking systems has become a technical problem that needs to be solved urgently.
本申请提供了一种LED灯珠标定方法、装置、设备及存储介质,旨在提高眼动追踪系统中的LED灯珠标定准确性。The present application provides an LED lamp bead calibration method, apparatus, device and storage medium, aiming to improve the accuracy of LED lamp bead calibration in an eye tracking system.
第一方面,本申请提供一种LED灯珠标定方法,包括:In a first aspect, the present application provides a method for calibrating an LED lamp bead, comprising:
获取第一标定相机对应的第一标定参数和眼动相机的第三标定参数;Obtaining a first calibration parameter corresponding to the first calibration camera and a third calibration parameter of the eye-tracking camera;
基于所述第一标定相机和第二标定相机分别采集的目标灯珠的灯珠图像,获得所述第一标定相机对应的第一图像坐标和所述第二标定相机对应的第二图像坐标;Based on the lamp bead images of the target lamp bead captured by the first calibration camera and the second calibration camera respectively, obtaining first image coordinates corresponding to the first calibration camera and second image coordinates corresponding to the second calibration camera;
基于所述第一图像坐标、所述第二图像坐标和所述第一标定参数,计算所述目标灯珠在所述第一标定相机对应的相机坐标系中的第一三维坐标;Calculate the first three-dimensional coordinates of the target lamp bead in the camera coordinate system corresponding to the first calibration camera based on the first image coordinates, the second image coordinates and the first calibration parameters;
基于所述第三标定参数和所述第一标定参数,将所述第一三维坐标转化到眼动相机对应的相机坐标系中,获得所述目标灯珠相对于所述眼动相机的目标三维坐标。Based on the third calibration parameter and the first calibration parameter, the first three-dimensional coordinate is converted into a camera coordinate system corresponding to the eye-tracking camera to obtain the target three-dimensional coordinate of the target lamp bead relative to the eye-tracking camera.
第二方面,本申请还提供一种LED灯珠标定装置,包括:In a second aspect, the present application further provides an LED lamp bead calibration device, comprising:
标定参数获取模块,用于获取第一标定相机对应的第一标定参数和眼动相机的第三标定参数;A calibration parameter acquisition module, configured to acquire a first calibration parameter corresponding to the first calibration camera and a third calibration parameter of the eye-tracking camera;
图像坐标获得模块,用于基于所述第一标定相机和第二标定相机分别采集的目标灯珠的灯珠图像,获得所述第一标定相机对应的第一图像坐标和所述第二标定相机对应的第二图像坐标;An image coordinate acquisition module, configured to obtain first image coordinates corresponding to the first calibration camera and second image coordinates corresponding to the second calibration camera based on the lamp bead images of the target lamp bead captured by the first calibration camera and the second calibration camera respectively;
三维坐标计算模块,用于基于所述第一图像坐标、所述第二图像坐标和所述第一标定参数,计算所述目标灯珠在所述第一标定相机对应的相机坐标系中的第一三维坐标;a three-dimensional coordinate calculation module, configured to calculate the first three-dimensional coordinates of the target lamp bead in the camera coordinate system corresponding to the first calibration camera based on the first image coordinates, the second image coordinates, and the first calibration parameters;
三维坐标转换模块,用于基于所述第三标定参数和所述第一标定参数,将所述第一三维坐标转化到眼动相机对应的相机坐标系中,获得所述目标灯珠相对于所述眼动相机的目标三维坐标。A three-dimensional coordinate conversion module is used to convert the first three-dimensional coordinate into the camera coordinate system corresponding to the eye-tracking camera based on the third calibration parameter and the first calibration parameter, so as to obtain the target three-dimensional coordinate of the target lamp bead relative to the eye-tracking camera.
第三方面,本申请还提供一种计算机设备,所述计算机设备包括处理器、存储器、以及存储在所述存储器上并可被所述处理器执行的计算机程序,其中所述计算机程序被所述处理器执行时,实现如上所述的LED灯珠标定方法的步骤。In a third aspect, the present application also provides a computer device, comprising a processor, a memory, and a computer program stored in the memory and executable by the processor, wherein when the computer program is executed by the processor, the steps of the LED lamp bead calibration method as described above are implemented.
第四方面,本申请还提供一种计算机可读存储介质,所述计算机可读存储介质上存储有计算机程序,其中所述计算机程序被处理器执行时,实现如上所述的LED灯珠标定方法的步骤。In a fourth aspect, the present application further provides a computer-readable storage medium, on which a computer program is stored, wherein when the computer program is executed by a processor, the steps of the LED lamp bead calibration method as described above are implemented.
本申请提供一种LED灯珠标定方法、装置、设备及存储介质,所述方法包括获取第一标定相机对应的第一标定参数和眼动相机的第三标定参数;基于所述第一标定相机和第二标定相机分别采集的目标灯珠的灯珠图像,获得所述第一标定相机对应的第一图像坐标和所述第二标定相机对应的第二图像坐标;基于所述第一图像坐标、所述第二图像坐标和所述第一标定参数,计算所述目标灯珠在所述第一标定相机对应的相机坐标系中的第一三维坐标;基于所述第三标定参数和所述第一标定参数,将所述第一三维坐标转化到眼动相机对应的相机坐标系中,获得所述目标灯珠相对于所述眼动相机的目标三维坐标。通过上述方式,本申请通过第一标定相机和第二标定相机采集目标灯珠的灯珠图像,进而根据目标灯珠在两张灯珠图像中的第一图像坐标和第二图像坐标,以及第一标定相机的第一标定参数,计算目标灯珠在第一标定相机的相机坐标系中的第一三维坐标。第一标定相机和第二标定相机组成的双目立体视觉标定系统,可以提高目标灯珠在相机坐标系中的三维坐标计算精度,从而降低目标灯珠在坐标转换时的坐标误差,使得根据第一标定参数和第二标定参数进行坐标转化得到的目标灯珠在眼动相机对应的相机坐标系中的目标三维坐标误差更小,提高目标灯珠与眼动相机之间相对位置的标定准确性。The present application provides a method, apparatus, device, and storage medium for calibrating LED lamp beads. The method includes obtaining a first calibration parameter corresponding to a first calibration camera and a third calibration parameter of an eye-tracking camera; based on the lamp bead images of the target lamp bead captured by the first calibration camera and the second calibration camera, respectively, obtaining the first image coordinates corresponding to the first calibration camera and the second image coordinates corresponding to the second calibration camera; based on the first image coordinates, the second image coordinates, and the first calibration parameter, calculating the first three-dimensional coordinates of the target lamp bead in the camera coordinate system corresponding to the first calibration camera; based on the third calibration parameter and the first calibration parameter, converting the first three-dimensional coordinates into the camera coordinate system corresponding to the eye-tracking camera to obtain the target three-dimensional coordinates of the target lamp bead relative to the eye-tracking camera. In the above manner, the present application captures the lamp bead images of the target lamp bead through the first calibration camera and the second calibration camera, and then calculates the first three-dimensional coordinates of the target lamp bead in the camera coordinate system of the first calibration camera based on the first image coordinates and the second image coordinates of the target lamp bead in the two lamp bead images, as well as the first calibration parameter of the first calibration camera. The binocular stereo vision calibration system composed of a first calibration camera and a second calibration camera can improve the calculation accuracy of the three-dimensional coordinates of the target lamp beads in the camera coordinate system, thereby reducing the coordinate error of the target lamp beads during coordinate conversion, so that the target three-dimensional coordinate error of the target lamp beads obtained by coordinate conversion according to the first calibration parameters and the second calibration parameters in the camera coordinate system corresponding to the eye-tracking camera is smaller, thereby improving the calibration accuracy of the relative position between the target lamp beads and the eye-tracking camera.
为了更清楚地说明本申请实施例技术方案,下面将对实施例描述中所需要使用的附图作简单地介绍,显而易见地,下面描述中的附图是本申请的一些实施例,对于本领域普通技术人员来讲,在不付出创造性劳动的前提下,还可以根据这些附图获得其他的附图。To explain the technical solutions of the embodiments of the present application more clearly, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings described below show some embodiments of the present application, and those of ordinary skill in the art may derive other drawings from them without creative effort.
图1为本申请实施例提供的一种LED灯珠标定方法第一实施例的流程示意图;FIG1 is a flow chart of a first embodiment of a method for calibrating an LED lamp bead provided in an embodiment of the present application;
图2为本申请实施例提供的一种LED灯珠标定方法第二实施例的流程示意图;FIG2 is a flow chart of a second embodiment of a method for calibrating an LED lamp bead provided in an embodiment of the present application;
图3为本申请实施例提供的一种LED灯珠标定方法第三实施例的流程示意图;FIG3 is a flow chart of a third embodiment of a method for calibrating an LED lamp bead provided in an embodiment of the present application;
图4是本申请提供的一种LED灯珠标定装置第一实施例的结构示意图;FIG4 is a structural diagram of a first embodiment of an LED lamp bead calibration device provided by the present application;
图5是本申请实施例提供的一种计算机设备的结构示意性框图。FIG5 is a schematic block diagram of the structure of a computer device provided in an embodiment of the present application.
本申请目的实现、功能特点及优点将结合实施例,参照附图做进一步说明。The purpose, features and advantages of this application will be further explained in conjunction with the embodiments and with reference to the accompanying drawings.
下面将结合本申请实施例中的附图,对本申请实施例中的技术方案进行清楚、完整地描述,显然,所描述的实施例是本申请一部分实施例,而不是全部的实施例。基于本申请中的实施例,本领域普通技术人员在没有做出创造性劳动前提下所获得的所有其他实施例,都属于本申请保护的范围。The technical solutions in the embodiments of the present application will be described clearly and completely below with reference to the drawings in the embodiments of the present application. Obviously, the described embodiments are only some, rather than all, of the embodiments of the present application. Based on the embodiments of the present application, all other embodiments obtained by those of ordinary skill in the art without creative effort fall within the scope of protection of the present application.
附图中所示的流程图仅是示例说明,不是必须包括所有的内容和操作/步骤,也不是必须按所描述的顺序执行。例如,有的操作/步骤还可以分解、组合或部分合并,因此实际执行的顺序有可能根据实际情况改变。The flowcharts shown in the accompanying drawings are for illustrative purposes only and do not necessarily include all contents and operations/steps, nor must they be executed in the order described. For example, some operations/steps may be decomposed, combined, or partially merged, so the actual execution order may vary depending on the actual situation.
下面结合附图,对本申请的一些实施方式作详细说明。在不冲突的情况下,下述的实施例及实施例中的特征可以相互组合。The following describes some embodiments of the present application in detail with reference to the accompanying drawings. In the absence of conflict, the following embodiments and features therein may be combined with each other.
请参照图1,图1为本申请实施例提供的一种LED灯珠标定方法第一实施例的流程示意图。Please refer to Figure 1, which is a flow chart of a first embodiment of a method for calibrating an LED lamp bead provided in an embodiment of the present application.
如图1所示,该LED灯珠标定方法包括步骤S101至步骤S104。As shown in FIG1 , the LED lamp bead calibration method includes steps S101 to S104 .
S101、获取第一标定相机对应的第一标定参数和眼动相机的第三标定参数;S101, obtaining a first calibration parameter corresponding to a first calibration camera and a third calibration parameter corresponding to an eye-tracking camera;
在一实施例中,双目立体视觉标定系统是由两台相同的相机组成。通过双目立体视觉标定系统对嵌装在智能穿戴设备上的LED灯珠进行标定。智能穿戴设备左右各安装一个眼动相机,若干个LED灯珠分布于眼动相机周围,第一标定相机和第二标定相机的拍摄方向均朝向智能穿戴设备,且智能穿戴设备中的左侧和/或右侧眼动相机以及眼动相机周围分布的若干个LED灯珠需要同时位于第一标定相机的拍摄视场和第二标定相机的拍摄视场内。第一标定相机和第二标定相机同步采集智能穿戴设备上的LED灯珠的灯珠图像,要确保眼动相机与LED灯珠须在第一标定相机和第二标定相机构成的立体视觉系统视场范围内。In one embodiment, a binocular stereo vision calibration system is composed of two identical cameras. The LED lamp beads embedded in the smart wearable device are calibrated by the binocular stereo vision calibration system. An eye-tracking camera is installed on each side of the smart wearable device, and a number of LED lamp beads are distributed around the eye-tracking camera. The shooting directions of the first calibration camera and the second calibration camera are both toward the smart wearable device, and the left and/or right eye-tracking cameras in the smart wearable device and the several LED lamp beads distributed around the eye-tracking cameras need to be located in the shooting field of view of the first calibration camera and the shooting field of view of the second calibration camera at the same time. The first calibration camera and the second calibration camera synchronously capture the lamp bead images of the LED lamp beads on the smart wearable device to ensure that the eye-tracking camera and the LED lamp beads are within the field of view of the stereo vision system formed by the first calibration camera and the second calibration camera.
在一实施例中,相机中有四个坐标系,包括世界坐标系、相机坐标系、图像坐标系和像素坐标系。其中,世界坐标系经过刚体变换(如旋转、平移)后转化为相机坐标系,相机坐标系经过透视投影转换为图像坐标系,图像坐标系经过仿射变换转换为像素坐标系。In one embodiment, the camera has four coordinate systems, including a world coordinate system, a camera coordinate system, an image coordinate system, and a pixel coordinate system. The world coordinate system is converted to the camera coordinate system through a rigid body transformation (e.g., rotation or translation), the camera coordinate system is converted to the image coordinate system through perspective projection, and the image coordinate system is converted to the pixel coordinate system through an affine transformation.
可以理解地是,图像的变换大多可以使用矩阵乘法和矩阵加法来表示变换前后像素的映射关系,因此,在进行图像变换时需要获取相机标定参数,相机标定参数(第一标定参数、第二标定参数和第三标定参数)包括相机内部参数和相机外部参数。It can be understood that most image transformations can use matrix multiplication and matrix addition to represent the mapping relationship between pixels before and after the transformation. Therefore, it is necessary to obtain camera calibration parameters when performing image transformation. The camera calibration parameters (first calibration parameters, second calibration parameters, and third calibration parameters) include camera internal parameters and camera external parameters.
其中,相机内部参数可以包括一个像素的物理尺寸dX和dY,焦距f,图像物理坐标的扭曲因子r,图像原点相对于光心成像点的纵横偏移量u和v(以像素为单位)以及畸变系数;畸变系数可以包括相机的径向畸变系数K(K1,K2,K3)和切向畸变系数P(P1,P2)。相机外部参数可以包括世界坐标系转换到相机坐标系的旋转矩阵R和平移矩阵T。The camera's internal parameters can include the physical dimensions dX and dY of a pixel, the focal length f, the distortion factor r of the image's physical coordinates, the vertical and horizontal offsets u and v (in pixels) of the image origin relative to the optical center imaging point, and the distortion coefficients. The distortion coefficients can include the camera's radial distortion coefficients K (K1, K2, K3) and tangential distortion coefficients P (P1, P2). The camera's external parameters can include the rotation matrix R and translation matrix T that transform the world coordinate system to the camera coordinate system.
其中,相机内部参数包括内部参数矩阵和畸变系数等,相机外部参数包括旋转矩阵和平移矩阵等。Among them, the camera internal parameters include internal parameter matrix and distortion coefficient, etc., and the camera external parameters include rotation matrix and translation matrix, etc.
在一实施例中,内部参数矩阵可以表示为:
In one embodiment, the internal parameter matrix can be expressed as:
$$K=\begin{bmatrix}\dfrac{f}{dX} & -\dfrac{f\cot\theta}{dX} & u_0\\ 0 & \dfrac{f}{dY\sin\theta} & v_0\\ 0 & 0 & 1\end{bmatrix}$$
其中,f为焦距,dX、dY分别表示X、Y方向上的一个像素在相机感光板上的物理长度,即一个像素在感光板上的实际长度尺寸和实际宽度尺寸,u0、v0分别表示相机感光板中心在像素坐标系下的X轴坐标和Y轴坐标,θ表示相机感光板的横边和纵边之间的角度(90°表示无误差)。Where f is the focal length, dX and dY represent the physical lengths of a pixel on the camera plate in the X and Y directions, respectively, that is, the actual length and width of a pixel on the plate,u0 andv0 represent the X-axis and Y-axis coordinates of the center of the camera plate in the pixel coordinate system, and θ represents the angle between the horizontal and vertical edges of the camera plate (90° indicates no error).
在一实施例中,外部参数矩阵可以表示为:
In one embodiment, the external parameter matrix can be expressed as:
$$\begin{bmatrix}R & T\\ 0^{T} & 1\end{bmatrix}$$
其中,R表示旋转矩阵,T表示平移矩阵。Where R represents the rotation matrix and T represents the translation matrix.
在一实施例中,第一标定相机对应的第一标定参数、第二标定相机对应的第二标定参数和眼动相机的第三标定参数可以采用单相机标定方法进行标定。In one embodiment, the first calibration parameter corresponding to the first calibration camera, the second calibration parameter corresponding to the second calibration camera, and the third calibration parameter of the eye-tracking camera can be calibrated using a single-camera calibration method.
进一步地,基于单相机标定方法,对所述第一标定相机、所述第二标定相机和所述眼动相机分别进行参数标定,获得所述第一标定参数、所述第二标定参数和所述第三标定参数。Furthermore, based on a single-camera calibration method, parameter calibration is performed on the first calibration camera, the second calibration camera, and the eye-tracking camera respectively to obtain the first calibration parameters, the second calibration parameters, and the third calibration parameters.
在一实施例中,单相机标定可以获取相机的内部参数矩阵和畸变系数。在单相机标定方法中,通常涉及拍摄已知尺寸的标定板,并使用如张正友标定法等算法计算出这些参数。In one embodiment, single-camera calibration can obtain the camera's intrinsic parameter matrix and distortion coefficients. Single-camera calibration methods typically involve photographing a calibration plate of known size and calculating these parameters using algorithms such as the Zhang Zhengyou calibration method.
其中,标定板可以是棋盘格和圆点格等。棋盘格是由交替的黑白方块组成,适用于焦点检测;圆点格是由一系列的圆点组成,适用于更精确的特征点定位。The calibration plate can be a checkerboard or a dot grid. The checkerboard is composed of alternating black and white squares and is suitable for focus detection; the dot grid is composed of a series of dots and is suitable for more accurate feature point positioning.
在一实施例中,在进行单相机标定时,首先使用待标定相机(第一标定相机或第二标定相机或眼动相机)从不同的角度和位置拍摄标定板,获得若干张标定板图像,然后通过计算机算法或软件(如OpenCV等)识别和提取标定板上的角点或特征点,根据角点或特征点以及标定板中的已知参数,计算相机的内部参数矩阵和畸变系数,内部参数矩阵可以包括焦距和主点坐标等参数信息,而畸变系数描述了镜头的畸变特性。In one embodiment, when performing single-camera calibration, the camera to be calibrated (the first calibration camera, the second calibration camera, or the eye-tracking camera) is first used to photograph a calibration plate from different angles and positions to obtain several images of the calibration plate. Then, a computer algorithm or software (such as OpenCV, etc.) is used to identify and extract corner points or feature points on the calibration plate. Based on the corner points or feature points and known parameters in the calibration plate, the camera's internal parameter matrix and distortion coefficients are calculated. The internal parameter matrix may include parameter information such as focal length and principal point coordinates, while the distortion coefficients describe the distortion characteristics of the lens.
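下面给出一个示意性的Python/OpenCV单相机标定示例(张正友标定法),其中棋盘格规格、方格边长和图像路径均为示例性假设值,并非对具体实现方式的限定。The following is an illustrative Python/OpenCV sketch of the single-camera calibration procedure described above (Zhang Zhengyou's method); the checkerboard pattern size, square size, and image paths are assumed example values and do not limit the implementation.
```python
import cv2
import numpy as np
import glob

# Checkerboard with 9x6 inner corners and 20 mm squares (illustrative values)
pattern_size = (9, 6)
square_size = 20.0  # mm

# 3D corner coordinates on the board plane (Z = 0), shared by all views
objp = np.zeros((pattern_size[0] * pattern_size[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:pattern_size[0], 0:pattern_size[1]].T.reshape(-1, 2) * square_size

obj_points, img_points = [], []
for path in glob.glob("calib_images/*.png"):   # hypothetical image folder
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, pattern_size)
    if not found:
        continue
    corners = cv2.cornerSubPix(
        gray, corners, (11, 11), (-1, -1),
        (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
    obj_points.append(objp)
    img_points.append(corners)

# Zhang's method: recover intrinsic matrix K and distortion coefficients (k1, k2, p1, p2, k3)
rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_points, img_points, gray.shape[::-1], None, None)
print("reprojection RMS:", rms)
```
按此方式得到的内部参数矩阵K和畸变系数,即可作为上述第一标定参数、第二标定参数或第三标定参数中的相机内部参数。The intrinsic matrix K and distortion coefficients obtained in this way can serve as the camera internal parameters in the first, second, or third calibration parameters described above.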
在一实施例中,相机的标定过程实际上就是在4个坐标系转化的过程中求出相机的内参和外参的过程。其中,世界坐标系描述物体真实位置;相机坐标系中,可以描述摄像头镜头中心;图像坐标系中,可以描述图像传感器成像中心,图片中心,影布中心,单位一般是毫米;像素坐标系中,描述像素的位置,单位是多少行,多少列。In one embodiment, the camera calibration process involves determining the camera's intrinsic and extrinsic parameters through the transformation of four coordinate systems. The world coordinate system describes the actual position of an object; the camera coordinate system describes the center of the camera lens; the image coordinate system describes the image sensor imaging center, the image center, and the center of the canvas, typically in millimeters; and the pixel coordinate system describes the position of pixels, measured in rows and columns.
在一实施例中,从世界坐标系转换到相机坐标系,可以求解相机外部参数,即旋转矩阵和平移矩阵;从相机坐标系转换到图像坐标系,可以求解相机内部参数,即内部参数矩阵和畸变系数;从图像坐标系转换到像素坐标系,可以求解像素转化矩阵,即原点从图片中心到左上角,单位从毫米变成行列。In one embodiment, by converting from the world coordinate system to the camera coordinate system, the camera external parameters, i.e., the rotation matrix and the translation matrix, can be solved; by converting from the camera coordinate system to the image coordinate system, the camera internal parameters, i.e., the internal parameter matrix and the distortion coefficient, can be solved; by converting from the image coordinate system to the pixel coordinate system, the pixel conversion matrix can be solved, i.e., the origin is from the center of the picture to the upper left corner, and the unit is changed from millimeters to rows and columns.
S102、基于所述第一标定相机和第二标定相机分别采集的目标灯珠的灯珠图像,获得所述第一标定相机对应的第一图像坐标和所述第二标定相机对应的第二图像坐标;S102: Based on the lamp bead images of the target lamp bead captured by the first calibration camera and the second calibration camera respectively, obtain first image coordinates corresponding to the first calibration camera and second image coordinates corresponding to the second calibration camera;
在一实施例中,目标灯珠需要同时出现在第一标定相机和第二标定相机的采集画面中,通过第一标定相机和第二标定相机分别采集目标灯珠的灯珠图像。然后根据第一标定相机采集的灯珠图像在第一标定相机对应的第一图像坐标系中的坐标位置,获得第一图像坐标;根据第二标定相机采集的灯珠图像在第二标定相机对应的第二图像坐标系中的坐标位置,获得第二图像坐标。In one embodiment, the target lamp bead must appear in the capture images of both the first calibration camera and the second calibration camera. The first calibration camera and the second calibration camera each capture images of the target lamp bead. The first image coordinates are then obtained based on the coordinate position of the lamp bead image captured by the first calibration camera in the first image coordinate system corresponding to the first calibration camera. The second image coordinates are obtained based on the coordinate position of the lamp bead image captured by the second calibration camera in the second image coordinate system corresponding to the second calibration camera.
进一步地,获取第二标定相机对应的第二标定参数;基于所述第一标定参数和所述第二标定参数,分别创建所述第一标定相机对应的第一图像坐标系和所述第二标定相机对应的第二图像坐标系;基于所述第一标定相机采集的所述目标灯珠的第一灯珠图像,确定所述目标灯珠在所述第一图像坐标系中的第一图像坐标;基于所述第二标定相机采集的所述目标灯珠的第二灯珠图像,确定所述目标灯珠在所述第二图像坐标系中的第二图像坐标。Furthermore, second calibration parameters corresponding to the second calibration camera are obtained; based on the first calibration parameters and the second calibration parameters, a first image coordinate system corresponding to the first calibration camera and a second image coordinate system corresponding to the second calibration camera are created respectively; based on the first lamp bead image of the target lamp bead captured by the first calibration camera, the first image coordinates of the target lamp bead in the first image coordinate system are determined; based on the second lamp bead image of the target lamp bead captured by the second calibration camera, the second image coordinates of the target lamp bead in the second image coordinate system are determined.
在一实施例中,以第一标定相机采集图像的横向像素点和纵向像素点分别作为X轴和Y轴,原点为采集图像的左上角第一个像素点位置,建立第一图像坐标系。然后根据第一标定相机采集的灯珠图像,确定灯珠图像中目标灯珠在第一图像坐标系中的图像坐标,即第一图像坐标。In one embodiment, a first image coordinate system is established, with the horizontal and vertical pixels of the image captured by the first calibration camera as the X-axis and Y-axis, respectively, and the origin being the first pixel in the upper left corner of the captured image. The image coordinates of the target lamp bead in the lamp bead image in the first image coordinate system, i.e., the first image coordinates, are then determined based on the lamp bead image captured by the first calibration camera.
同理,以第二标定相机采集图像的横向像素点和纵向像素点分别作为X轴和Y轴,原点为采集图像的左上角第一个像素点位置,建立第二图像坐标系。然后根据第二标定相机采集的灯珠图像,确定灯珠图像中目标灯珠在第二图像坐标系中的图像坐标,即第二图像坐标。Similarly, establish a second image coordinate system using the horizontal and vertical pixels of the image captured by the second calibration camera as the X-axis and Y-axis, respectively, with the origin being the first pixel in the upper left corner of the captured image. Then, based on the lamp bead image captured by the second calibration camera, determine the image coordinates of the target lamp bead in the lamp bead image in the second image coordinate system, i.e., the second image coordinates.
其中,第一图像坐标和第二图像坐标均为像素坐标。The first image coordinates and the second image coordinates are both pixel coordinates.
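作为一种示意性的实现方式,可以通过对灯珠图像进行阈值分割并计算连通域质心来提取目标灯珠的像素坐标;下述Python/OpenCV示例中的阈值和图像文件名均为假设值,实际提取方式不限于此。As an illustrative implementation, the pixel coordinates of the target lamp bead can be extracted by thresholding the lamp bead image and computing the centroid of the resulting bright blob; in the following Python/OpenCV sketch, the threshold value and image file names are assumed placeholders, and the actual extraction method is not limited to this.
```python
import cv2
import numpy as np

def led_center(gray_image, threshold=200):
    """Estimate the pixel coordinates of the bright LED spot (illustrative approach)."""
    _, binary = cv2.threshold(gray_image, threshold, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    spot = max(contours, key=cv2.contourArea)   # assume the target bead is the largest blob
    m = cv2.moments(spot)
    if m["m00"] == 0:
        return None
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])   # sub-pixel centroid (u, v)

# First and second image coordinates of the same target bead (file names are placeholders)
p1 = led_center(cv2.imread("cam1_frame.png", cv2.IMREAD_GRAYSCALE))
p2 = led_center(cv2.imread("cam2_frame.png", cv2.IMREAD_GRAYSCALE))
```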
S103、基于所述第一图像坐标、所述第二图像坐标和所述第一标定参数,计算所述目标灯珠在所述第一标定相机对应的相机坐标系中的第一三维坐标;S103: Calculate the first three-dimensional coordinates of the target lamp bead in the camera coordinate system corresponding to the first calibration camera based on the first image coordinates, the second image coordinates, and the first calibration parameters;
在一实施例中,根据第一图像坐标和第二图像坐标,可以计算目标灯珠在第一标定相机和第二标定相机分别采集的两张灯珠图像中的图像视差,即第一图像坐标和第二图像坐标的X轴坐标差值。In one embodiment, based on the first image coordinates and the second image coordinates, the image parallax of the target lamp bead in the two lamp bead images captured by the first calibration camera and the second calibration camera respectively can be calculated, that is, the X-axis coordinate difference between the first image coordinates and the second image coordinates.
在一实施例中,根据图像视差和相机标定参数,可以进一步计算目标灯珠与第一标定相机之间的深度距离,其中,目标灯珠的深度距离计算公式可以表示为:
Z=f·B/d
In one embodiment, the depth distance between the target lamp bead and the first calibration camera can be further calculated based on the image parallax and the camera calibration parameters. The depth distance calculation formula of the target lamp bead can be expressed as:
Z=f·B/d
其中,Z表示深度距离,f表示第一标定相机的焦距,B表示第一标定相机和第二标定相机之间的基线距离,即第一标定相机和第二标定相机之间的实际直线距离,d表示图像视差。第一标定相机和第二标定相机的焦距相等。Where Z represents the depth distance, f represents the focal length of the first calibration camera, B represents the baseline distance between the first calibration camera and the second calibration camera, that is, the actual straight-line distance between the first calibration camera and the second calibration camera, and d represents the image disparity. The focal lengths of the first calibration camera and the second calibration camera are equal.
S104、基于所述第三标定参数和所述第一标定参数,将所述第一三维坐标转化到眼动相机对应的相机坐标系中,获得所述目标灯珠相对于所述眼动相机的目标三维坐标。S104. Based on the third calibration parameter and the first calibration parameter, convert the first three-dimensional coordinate into a camera coordinate system corresponding to the eye-tracking camera to obtain the target three-dimensional coordinate of the target lamp bead relative to the eye-tracking camera.
在一实施例中,相机标定参数包含相机内部参数和相机外部参数,根据相机内部参数和相机外部参数,可以实现相机相关的各个坐标系的转换,即世界坐标系、相机坐标系、图像坐标系以及像素坐标系之间的相互转化。In one embodiment, the camera calibration parameters include camera intrinsic parameters and camera extrinsic parameters. According to the camera intrinsic parameters and camera extrinsic parameters, the conversion of various camera-related coordinate systems can be realized, that is, the mutual conversion between the world coordinate system, the camera coordinate system, the image coordinate system and the pixel coordinate system.
世界坐标系是指客观三维世界的绝对坐标系。因为第一标定相机、第二标定相机和眼动相机安放在三维空间中,所以需要世界坐标系这个基准坐标系描述上述三个相机的位置,并且用该世界坐标系描述安放在此三维环境中的其他任何物体的位置,用(X,Y,Z)表示其坐标值。The world coordinate system is the absolute coordinate system of the objective three-dimensional world. Because the first calibration camera, the second calibration camera, and the eye-tracking camera are placed in three-dimensional space, a world coordinate system is required to describe the positions of these three cameras. This world coordinate system is also used to describe the position of any other objects placed in this three-dimensional environment, with their coordinate values expressed as (X, Y, Z).
相机坐标系以相机的光心为坐标原点,X轴和Y轴分别平行于图像坐标系的X轴和Y轴,相机的光轴为Z轴,用(Xc,Yc,Zc)表示其坐标值。The camera coordinate system takes the optical center of the camera as its origin. The X-axis and Y-axis are parallel to the X-axis and Y-axis of the image coordinate system respectively. The optical axis of the camera is the Z-axis. Its coordinate values are expressed as (Xc ,Yc ,Zc ).
图像坐标系以相机图像传感器的图像平面中心为坐标原点,X轴和Y轴分别平行于图像平面的两条垂直边,用(x,y)表示其坐标值。图像坐标系是用物理单位(如毫米)表示像素在图像中的位置。The image coordinate system uses the center of the image plane of the camera image sensor as its origin. The X-axis and Y-axis are parallel to the two perpendicular sides of the image plane, and their coordinate values are represented by (x, y). The image coordinate system uses physical units (such as millimeters) to represent the position of pixels in the image.
像素坐标系是以相机图像传感器的图像平面左上角顶点为原点,X轴和Y轴分别平行于图像坐标系的X轴和Y轴,用(u,v)表示其坐标值。像素坐标系以像素为单位(行×列)。The pixel coordinate system is based on the top-left corner of the camera image sensor's image plane as its origin. The x-axis and y-axis are parallel to the x-axis and y-axis of the image coordinate system, respectively. Its coordinate values are represented by (u, v). The pixel coordinate system is measured in pixels (rows × columns).
在一实施例中,像素坐标系与图像坐标系之间的转换关系为:
In one embodiment, the conversion relationship between the pixel coordinate system and the image coordinate system is:
$$u=\dfrac{x}{dx}+u_0,\qquad v=\dfrac{y}{dy}+v_0$$
采用齐次坐标系再用矩阵形式将上式表示为:
Using a homogeneous coordinate system, the above formula can be expressed in matrix form as follows:
$$\begin{bmatrix}u\\ v\\ 1\end{bmatrix}=\begin{bmatrix}\dfrac{1}{dx} & 0 & u_0\\ 0 & \dfrac{1}{dy} & v_0\\ 0 & 0 & 1\end{bmatrix}\begin{bmatrix}x\\ y\\ 1\end{bmatrix}$$
其中,u表示像素坐标系中的X轴坐标,v表示像素坐标系中的Y轴坐标,x表示图像坐标系中的X轴坐标,y表示图像坐标系中的Y轴坐标,dx表示相机每个像素在图像平面X轴方向上的物理尺寸,dy表示相机每个像素在图像平面Y轴方向上的物理尺寸,(u0,v0)表示图像坐标系原点在像素坐标系中的坐标。Where u represents the X-axis coordinate in the pixel coordinate system, v represents the Y-axis coordinate in the pixel coordinate system, x represents the X-axis coordinate in the image coordinate system, y represents the Y-axis coordinate in the image coordinate system, dx represents the physical size of each pixel of the camera in the X-axis direction of the image plane, dy represents the physical size of each pixel of the camera in the Y-axis direction of the image plane, and (u0 , v0 ) represents the coordinates of the origin of the image coordinate system in the pixel coordinate system.
在一实施例中,齐次坐标就是将一个原本是n维的向量用一个n+1维向量来表示,是指一个用于投影几何里的坐标系统。In one embodiment, homogeneous coordinates represent an n-dimensional vector using an n+1-dimensional vector, and refer to a coordinate system used in projective geometry.
在一实施例中,图像坐标系与相机坐标系的转换关系如下:
In one embodiment, the conversion relationship between the image coordinate system and the camera coordinate system is as follows:
$$x=\dfrac{f\,X_c}{Z_c},\qquad y=\dfrac{f\,Y_c}{Z_c}$$
采用齐次坐标系再用矩阵形式将上式表示为:
Using a homogeneous coordinate system, the above formula can be expressed in matrix form as follows:
$$Z_c\begin{bmatrix}x\\ y\\ 1\end{bmatrix}=\begin{bmatrix}f & 0 & 0 & 0\\ 0 & f & 0 & 0\\ 0 & 0 & 1 & 0\end{bmatrix}\begin{bmatrix}X_c\\ Y_c\\ Z_c\\ 1\end{bmatrix}$$
其中,f为相机的焦距,(x,y)表示图像坐标系中的二维点坐标,(Xc,Yc,Zc)表示相机坐标系中的三维点坐标。Where f is the focal length of the camera, (x, y) represents the two-dimensional point coordinates in the image coordinate system, and (Xc ,Yc ,Zc ) represents the three-dimensional point coordinates in the camera coordinate system.
在一实施例中,相机坐标系与世界坐标系之间的转换关系如下:
In one embodiment, the conversion relationship between the camera coordinate system and the world coordinate system is as follows:
$$\begin{bmatrix}X_c\\ Y_c\\ Z_c\\ 1\end{bmatrix}=\begin{bmatrix}R & t\\ 0^{T} & 1\end{bmatrix}\begin{bmatrix}X\\ Y\\ Z\\ 1\end{bmatrix}$$
其中,R为3×3的旋转矩阵,t为三维平移向量,T为平移矩阵,(Xc,Yc,Zc)表示相机坐标系中的三维点坐标,(X,Y,Z)表示世界坐标系中的三维点坐标。Where R is a 3×3 rotation matrix, t is a three-dimensional translation vector, T is a translation matrix, (Xc ,Yc ,Zc ) represents the three-dimensional point coordinates in the camera coordinate system, and (X, Y, Z) represents the three-dimensional point coordinates in the world coordinate system.
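下面用一个简短的numpy示例说明相机坐标系与世界坐标系之间的刚体变换及其逆变换(旋转矩阵为正交矩阵,其逆等于转置);其中R、t的取值仅为示例。The following short numpy sketch illustrates the rigid-body transform between the camera coordinate system and the world coordinate system and its inverse (the rotation matrix is orthonormal, so its inverse equals its transpose); the values of R and t are illustrative only.
```python
import numpy as np

# Illustrative extrinsic parameters: rotation R (3x3) and translation t (3,)
R = np.eye(3)
t = np.array([0.0, 0.0, 100.0])

def world_to_camera(P_world):
    # Same relation as the homogeneous form above: Pc = R * Pw + t
    return R @ P_world + t

def camera_to_world(P_camera):
    # Inverse rigid transform: Pw = R^T * (Pc - t)
    return R.T @ (P_camera - t)

P_world = np.array([10.0, 5.0, 0.0])
P_camera = world_to_camera(P_world)
assert np.allclose(camera_to_world(P_camera), P_world)
```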
在一实施例中,第一三维坐标是第一标定相机或者第二标定相机对应的相机坐标系中的三维点坐标,因此,首先需要根据上述中的相机坐标系与世界坐标系之间的坐标转换公式,将第一三维坐标转换为世界坐标系中的三维点坐标,获得目标灯珠在世界坐标系中的三维点坐标。再根据相机坐标系和世界坐标系之间的坐标转换公式,将目标灯珠在世界坐标系中的三维点坐标逆转换到眼动相机对应的相机坐标系中,获得目标灯珠在眼动相机对应的相机坐标系中的目标三维坐标,从而完成目标灯珠和眼动相机的相对位置关系的标定。In one embodiment, the first three-dimensional coordinates are the three-dimensional coordinates of a point in the camera coordinate system corresponding to the first calibration camera or the second calibration camera. Therefore, the first three-dimensional coordinates must first be converted to three-dimensional coordinates in the world coordinate system according to the coordinate conversion formula between the camera coordinate system and the world coordinate system, thereby obtaining the three-dimensional coordinates of the target lamp bead in the world coordinate system. Then, according to the coordinate conversion formula between the camera coordinate system and the world coordinate system, the three-dimensional coordinates of the target lamp bead in the world coordinate system are reversely converted to the camera coordinate system corresponding to the eye-tracking camera, thereby obtaining the target three-dimensional coordinates of the target lamp bead in the camera coordinate system corresponding to the eye-tracking camera, thereby completing the calibration of the relative position relationship between the target lamp bead and the eye-tracking camera.
其中,眼动相机对应的相机坐标系的坐标原点即为眼动相机所在位置。The origin of the coordinate system of the camera corresponding to the eye-tracking camera is the location of the eye-tracking camera.
本实施例提供了一种LED灯珠标定方法,该方法通过第一标定相机和第二标定相机采集目标灯珠的灯珠图像,进而根据目标灯珠在两张灯珠图像中的第一图像坐标和第二图像坐标,以及两个标定相机的相机标定参数,计算目标灯珠在标定相机的相机坐标系中的第一三维坐标。第一标定相机和第二标定相机组成的双目立体视觉标定系统,可以提高目标灯珠在相机坐标系中的三维坐标计算精度,从而降低目标灯珠在坐标转换时的坐标误差,使得通过坐标转化得到的目标灯珠在眼动相机对应的相机坐标系中的目标三维坐标误差更小,提高目标灯珠与眼动相机之间相对位置的标定准确性。This embodiment provides an LED lamp bead calibration method. The method uses a first calibration camera and a second calibration camera to capture lamp bead images of a target lamp bead. The method then calculates the first three-dimensional coordinates of the target lamp bead in the camera coordinate system of the calibration cameras based on the first and second image coordinates of the target lamp bead in the two lamp bead images, as well as the camera calibration parameters of the two calibration cameras. The binocular stereo vision calibration system composed of the first and second calibration cameras can improve the accuracy of calculating the three-dimensional coordinates of the target lamp bead in the camera coordinate system, thereby reducing the coordinate error of the target lamp bead during coordinate conversion. This results in a smaller error in the target three-dimensional coordinates of the target lamp bead obtained through coordinate conversion in the camera coordinate system corresponding to the eye-tracking camera, thereby improving the calibration accuracy of the relative position between the target lamp bead and the eye-tracking camera.
请参照图2,图2为本申请实施例提供的一种LED灯珠标定方法第二实施例的流程示意图。Please refer to FIG. 2 , which is a flow chart of a second embodiment of a method for calibrating an LED lamp bead provided in an embodiment of the present application.
如图2所示,基于上述图1所示实施例,所述步骤S102具体包括:As shown in FIG2 , based on the embodiment shown in FIG1 , step S102 specifically includes:
S201、基于所述第一图像坐标、所述第二图像坐标和所述第一标定参数,计算所述目标灯珠在所述第一标定相机对应的相机坐标系中的深度距离;S201, calculating the depth distance of the target lamp bead in the camera coordinate system corresponding to the first calibration camera based on the first image coordinates, the second image coordinates and the first calibration parameters;
在一实施例中,根据第一图像坐标和第二图像坐标,可以计算目标灯珠在第一标定相机和第二标定相机采集的两张灯珠图像中的图像视差。进而根据图像视差、第一标定相机和第二标定相机的焦距以及基线距离,即可计算目标灯珠在第一标定相机(或第二标定相机)对应的相机坐标系中的深度距离,也即目标灯珠在第一标定相机(或第二标定相机)对应的相机坐标系中的Z轴坐标。In one embodiment, the image parallax of the target lamp bead in the two lamp bead images captured by the first calibration camera and the second calibration camera can be calculated based on the first image coordinates and the second image coordinates. Furthermore, based on the image parallax, the focal lengths of the first calibration camera and the second calibration camera, and the baseline distance, the depth distance of the target lamp bead in the camera coordinate system corresponding to the first calibration camera (or the second calibration camera) can be calculated, that is, the Z-axis coordinate of the target lamp bead in the camera coordinate system corresponding to the first calibration camera (or the second calibration camera).
进一步地,基于所述第一图像坐标和所述第二图像坐标,计算所述目标灯珠在所述第一灯珠图像和所述第二灯珠图像中的图像视差;获取所述第一标定相机和所述第二标定相机的基线距离;基于所述第一标定参数、所述图像视差和所述基线距离,计算所述深度距离。Furthermore, based on the first image coordinates and the second image coordinates, the image parallax of the target lamp bead in the first lamp bead image and the second lamp bead image is calculated; the baseline distance between the first calibration camera and the second calibration camera is obtained; and based on the first calibration parameter, the image parallax and the baseline distance, the depth distance is calculated.
在一实施例中,通过图像处理技术中的特征点匹配算法,在第一标定相机和第二标定相机采集灯珠图像中识别和匹配目标灯珠的特征点。这些特征点在第一标定相机和第二标定相机采集的两张灯珠图像中的像素坐标位置一般是存在差异的,即图像视差。In one embodiment, a feature point matching algorithm within image processing technology is used to identify and match the feature points of the target lamp bead in the lamp bead images captured by the first calibration camera and the second calibration camera. The pixel coordinates of these feature points in the two lamp bead images captured by the first calibration camera and the second calibration camera generally differ, which is known as image parallax.
其中,图像视差是指同一物体在左右两个相机采集的图像上的水平位置差值。Image parallax refers to the horizontal position difference of the same object in the images captured by the left and right cameras.
示例性的,假设目标灯珠在第一标定相机采集的灯珠图像中的图像坐标为pL,目标灯珠在第二标定相机采集的灯珠图像中的图像坐标为pR。则,图像视差的计算公式可以表示为:
d=pL·x-pR·x
For example, assuming that the image coordinates of the target lamp bead in the lamp bead image captured by the first calibration camera are pL, and the image coordinates of the target lamp bead in the lamp bead image captured by the second calibration camera are pR, the image parallax calculation formula can be expressed as:
d=pL·x-pR·x
其中,pL·x表示目标灯珠在第一标定相机采集的灯珠图像中的X轴坐标,pR·x表示目标灯珠在第二标定相机采集的灯珠图像中的X轴坐标。Wherein, pL ·x represents the X-axis coordinate of the target lamp bead in the lamp bead image captured by the first calibration camera, and pR ·x represents the X-axis coordinate of the target lamp bead in the lamp bead image captured by the second calibration camera.
在一实施例中,在计算得到图像视差之后,即可根据第一标定相机和第二标定相机的相机标定参数计算目标灯珠的深度坐标,即目标灯珠距离第一标定相机或第二标定相机的深度距离。In one embodiment, after the image parallax is calculated, the depth coordinate of the target lamp bead, that is, the depth distance of the target lamp bead from the first calibration camera or the second calibration camera, can be calculated based on the camera calibration parameters of the first calibration camera and the second calibration camera.
其中,目标灯珠的深度距离计算公式可以表示为:
Z=f·B/d
Among them, the calculation formula of the depth distance of the target lamp bead can be expressed as:
Z=f·B/d
其中,Z表示深度距离,f表示第一标定相机或第二标定相机的焦距,B表示第一标定相机和第二标定相机之间的基线距离,即第一标定相机和第二标定相机之间的实际直线距离,d表示图像视差。Where Z represents the depth distance, f represents the focal length of the first calibration camera or the second calibration camera, B represents the baseline distance between the first calibration camera and the second calibration camera, that is, the actual straight-line distance between the first calibration camera and the second calibration camera, and d represents the image disparity.
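基于上述公式,图像视差与深度距离的计算可用如下简短示例表示;其中焦距、基线距离和图像坐标均为示例性数值。Based on the above formulas, the image parallax and the depth distance can be computed as in the following short example; the focal length, baseline distance, and image coordinates are illustrative values only.
```python
# Illustrative disparity-to-depth computation for the two calibration cameras
f = 1200.0        # focal length in pixels (assumed value)
B = 60.0          # baseline distance between the two calibration cameras, in mm (assumed)
pL_x = 812.4      # X coordinate of the bead in the first camera's lamp bead image
pR_x = 776.1      # X coordinate of the bead in the second camera's lamp bead image

d = pL_x - pR_x   # image parallax d = pL.x - pR.x
Z = f * B / d     # depth distance Z = f*B/d, in the same unit as B
print(f"parallax = {d:.2f} px, depth = {Z:.1f} mm")
```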
S202、基于所述第一图像坐标和所述深度距离,确定所述目标灯珠在所述第一标定相机对应的第一相机坐标系中的所述第一三维坐标。S202: Determine the first three-dimensional coordinates of the target lamp bead in a first camera coordinate system corresponding to the first calibration camera based on the first image coordinates and the depth distance.
在一实施例中,根据上述的图像坐标系与相机坐标系的转换关系,可以将目标灯珠在第一标定相机对应的图像坐标系中的第一图像坐标转换到第一标定相机对应的相机坐标系中,从而获得目标灯珠在第一标定相机对应的相机坐标系中的X轴和Y轴坐标,而深度距离为目标灯珠在第一标定相机对应的相机坐标系中的Z轴坐标,由此,可以获得目标灯珠在第一标定相机对应的第一相机坐标系中的第一三维坐标。In one embodiment, according to the above-mentioned conversion relationship between the image coordinate system and the camera coordinate system, the first image coordinate of the target lamp bead in the image coordinate system corresponding to the first calibration camera can be converted to the camera coordinate system corresponding to the first calibration camera, thereby obtaining the X-axis and Y-axis coordinates of the target lamp bead in the camera coordinate system corresponding to the first calibration camera, and the depth distance is the Z-axis coordinate of the target lamp bead in the camera coordinate system corresponding to the first calibration camera. Thus, the first three-dimensional coordinate of the target lamp bead in the first camera coordinate system corresponding to the first calibration camera can be obtained.
进一步地,基于所述第一标定参数,获得所述第一标定相机对应的内部参数矩阵;基于所述内部参数矩阵,将所述第一图像坐标转化到所述第一标定相机对应的归一化坐标系中,获得所述目标灯珠对应的归一化坐标;基于所述归一化坐标和所述深度距离,计算所述目标灯珠在所述第一相机坐标系中的所述第一三维坐标。Furthermore, based on the first calibration parameters, an internal parameter matrix corresponding to the first calibration camera is obtained; based on the internal parameter matrix, the first image coordinates are converted into a normalized coordinate system corresponding to the first calibration camera to obtain the normalized coordinates corresponding to the target lamp bead; based on the normalized coordinates and the depth distance, the first three-dimensional coordinates of the target lamp bead in the first camera coordinate system are calculated.
在一实施例中,第一标定参数包括第一标定相机的相机内部参数和相机外部参数,相机内部参数包括相机内部参数矩阵和畸变系数。因此,可以根据第一标定参数获得第一标定相机对应的内部参数矩阵。In one embodiment, the first calibration parameters include camera intrinsic parameters and camera extrinsic parameters of the first calibration camera, wherein the camera intrinsic parameters include a camera intrinsic parameter matrix and distortion coefficients. Therefore, the intrinsic parameter matrix corresponding to the first calibration camera can be obtained based on the first calibration parameters.
在一实施例中,可以采用去畸变算法以及内部参数矩阵K的逆,将目标灯珠在第一图像坐标系中的第一图像坐标转换为归一化相机坐标系中的归一化坐标。In one embodiment, a de-distortion algorithm and the inverse of the internal parameter matrix K may be used to convert the first image coordinates of the target lamp bead in the first image coordinate system into normalized coordinates in the normalized camera coordinate system.
其中,归一化坐标系是指将相机坐标系中的点投影到归一化平面(深度为1的平面)后所得到的坐标系。The normalized coordinate system refers to the coordinate system obtained by projecting points in the camera coordinate system onto the normalized plane (the plane at unit depth).
示例性的,假设目标灯珠在第一标定相机采集的灯珠图像中的像素坐标为(x1,y1),目标灯珠在第一标定相机的相机坐标系中的深度为Z,第一相机的内参矩阵为K1。For example, assume that the pixel coordinates of the target lamp bead in the lamp bead image captured by the first calibration camera are (x1, y1), the depth of the target lamp bead in the camera coordinate system of the first calibration camera is Z, and the intrinsic parameter matrix of the first camera is K1:
$$K_1=\begin{bmatrix}f_x & 0 & c_x\\ 0 & f_y & c_y\\ 0 & 0 & 1\end{bmatrix}$$
其中,(fx,fy)为使用像素表示的X轴和Y轴方向的焦距长度,(cx,cy)为相机感光板中心在像素坐标系下的X轴坐标和Y轴坐标。Where (fx ,fy ) is the focal length in the X-axis and Y-axis directions expressed in pixels, and (cx ,cy ) is the X-axis coordinate and Y-axis coordinate of the center of the camera plate in the pixel coordinate system.
将像素坐标转换为归一化相机坐标系中的坐标,则有:
Convert the pixel coordinates to the coordinates in the normalized camera coordinate system:
$$x'=\dfrac{x_1-c_x}{f_x},\qquad y'=\dfrac{y_1-c_y}{f_y}$$
然后,利用深度Z,可以计算得到目标灯珠在第一标定相机的相机坐标系中的三维坐标(XB,YB,ZB),其中:
XB=x′×Z
YB=y′×Z
ZB=Z
Then, using the depth Z, the three-dimensional coordinates (XB, YB, ZB) of the target lamp bead in the camera coordinate system of the first calibration camera can be calculated, where:
XB=x′×Z
YB=y′×Z
ZB=Z
在一实施例中,第一图像坐标和第二图像坐标为灯珠图像中的像素坐标。因此,归一化坐标系可以是图像坐标系,目标灯珠的归一化坐标可以是将目标灯珠的像素坐标转换为图像坐标,即以上述像素坐标系与图像坐标系之间的转换公式将目标灯珠的像素坐标转换为图像坐标,获得归一化坐标。而后,根据图像坐标系与相机坐标系的转换公式,将归一化坐标转换到第一相机坐标系中,获得目标灯珠在第一相机坐标系中的X轴坐标和Y轴坐标,再将深度距离作为Z轴坐标,从而获得目标灯珠在第一相机坐标系中的第一三维坐标。In one embodiment, the first image coordinates and the second image coordinates are pixel coordinates in the lamp bead image. Therefore, the normalized coordinate system can be an image coordinate system, and the normalized coordinates of the target lamp bead can be obtained by converting the pixel coordinates of the target lamp bead into image coordinates, i.e., converting the pixel coordinates of the target lamp bead into image coordinates using the conversion formula between the pixel coordinate system and the image coordinate system to obtain the normalized coordinates. Then, according to the conversion formula between the image coordinate system and the camera coordinate system, the normalized coordinates are converted into the first camera coordinate system to obtain the X-axis and Y-axis coordinates of the target lamp bead in the first camera coordinate system. The depth distance is then used as the Z-axis coordinate to obtain the first three-dimensional coordinates of the target lamp bead in the first camera coordinate system.
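下述numpy示例按上述步骤,利用内参矩阵的逆将像素坐标转换为归一化坐标,再结合深度得到第一三维坐标;其中内参矩阵、像素坐标和深度均为示例性假设值。The following numpy sketch follows the above steps, using the inverse of the intrinsic matrix to convert the pixel coordinates into normalized coordinates and then combining them with the depth to obtain the first three-dimensional coordinates; the intrinsic matrix, pixel coordinates, and depth used here are assumed example values.
```python
import numpy as np

# Assumed intrinsic matrix K1 of the first calibration camera (illustrative values)
K1 = np.array([[1200.0,    0.0, 640.0],
               [   0.0, 1200.0, 360.0],
               [   0.0,    0.0,   1.0]])

def pixel_to_camera(x1, y1, Z):
    """Back-project pixel (x1, y1) with depth Z into the first camera coordinate system."""
    # Normalized coordinates: x' = (x1 - cx)/fx, y' = (y1 - cy)/fy
    xn, yn, _ = np.linalg.inv(K1) @ np.array([x1, y1, 1.0])
    # Scale by the depth obtained from the disparity to get (XB, YB, ZB)
    return np.array([xn * Z, yn * Z, Z])

P1 = pixel_to_camera(812.4, 402.7, 1983.0)
```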
在一实施例中,图像畸变是由于透镜制造精度以及组装工艺的偏差会引入畸变,导致原始图像失真。镜头的畸变分为径向畸变和切向畸变两类。从镜头获取的原始图像,都是存在第一部分所述的两种畸变的,为了后续进行更好的图像操作,需要执行去畸变过程。In one embodiment, image distortion is caused by variations in lens manufacturing precision and assembly processes, which can introduce distortion and distort the original image. Lens distortion is categorized into two types: radial distortion and tangential distortion. The original image captured by the lens exhibits both of the two types of distortion described in the first section. To enable better subsequent image manipulation, a dedistortion process is required.
可以理解地是,图像去畸变可以采用现有技术中的去畸变算法,其主要过程包括:将像素坐标转换到相机坐标系中,获得相机坐标,然后根据畸变系数(K1,K2,K3,P1,P2)分别计算X轴和Y轴方向上的畸变量,在根据畸变量对相机坐标进行校正,并将经过校正后的相机坐标转换到像素坐标系中,即可获得校正后的像素坐标。It can be understood that image dedistortion can adopt the dedistortion algorithm in the prior art, and its main process includes: converting the pixel coordinates into the camera coordinate system to obtain the camera coordinates, and then calculating the distortion amounts in the X-axis and Y-axis directions according to the distortion coefficients (K1, K2, K3, P1, P2), correcting the camera coordinates according to the distortion amounts, and converting the corrected camera coordinates into the pixel coordinate system to obtain the corrected pixel coordinates.
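作为一种常见的现有实现,也可以直接调用OpenCV的去畸变接口对像素坐标进行校正;注意OpenCV中畸变系数的排列顺序为(k1,k2,p1,p2,k3),下例中的内参和畸变系数均为示例性数值。As a common existing implementation, OpenCV's de-distortion interface can also be called directly to correct the pixel coordinates; note that the distortion coefficients in OpenCV are ordered (k1, k2, p1, p2, k3), and the intrinsic parameters and distortion coefficients in the example below are illustrative values.
```python
import cv2
import numpy as np

# Illustrative intrinsic matrix and distortion coefficients from single-camera calibration
K = np.array([[1200.0,    0.0, 640.0],
              [   0.0, 1200.0, 360.0],
              [   0.0,    0.0,   1.0]])
dist = np.array([-0.12, 0.03, 0.0005, -0.0004, 0.0])  # k1, k2, p1, p2, k3

# Distorted pixel coordinate of the bead, shaped (N, 1, 2) as OpenCV expects
pts = np.array([[[812.4, 402.7]]], dtype=np.float64)

# With P=K the corrected points are returned as undistorted pixel coordinates;
# omitting P would return them as normalized camera coordinates instead.
undistorted_px = cv2.undistortPoints(pts, K, dist, P=K)
```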
可以理解地是,第一标定相机和第二标定相机组成双目视觉系统,对目标灯珠进行定位,因此,同样可以采用第二标定相机对目标灯珠进行三维坐标计算。It can be understood that the first calibration camera and the second calibration camera form a binocular vision system to locate the target lamp bead. Therefore, the second calibration camera can also be used to calculate the three-dimensional coordinates of the target lamp bead.
进一步地,基于所述第一图像坐标、所述第二图像坐标和所述第二标定参数,计算所述目标灯珠在所述第二标定相机对应的相机坐标系中的所述第一三维坐标;基于所述第三标定参数和所述第二标定参数,将所述第一三维坐标转化到眼动相机对应的相机坐标系中,获得所述目标灯珠相对于所述眼动相机的所述目标三维坐标。Furthermore, based on the first image coordinates, the second image coordinates and the second calibration parameters, the first three-dimensional coordinates of the target lamp bead in the camera coordinate system corresponding to the second calibration camera are calculated; based on the third calibration parameters and the second calibration parameters, the first three-dimensional coordinates are converted into the camera coordinate system corresponding to the eye-tracking camera to obtain the target three-dimensional coordinates of the target lamp bead relative to the eye-tracking camera.
在一实施例中,因为第一标定相机和第二标定相机构成双目立体视觉系统,因此,通过第一图像坐标和深度距离,计算目标灯珠在第一相机坐标系中的第一三维坐标,与通过第二图像坐标和深度距离,计算目标灯珠在第二相机坐标系中的第一三维坐标的过程是相同的,区别仅在于所采用的相机坐标系。In one embodiment, because the first calibration camera and the second calibration camera constitute a binocular stereo vision system, the process of calculating the first three-dimensional coordinates of the target lamp bead in the first camera coordinate system through the first image coordinates and the depth distance is the same as the process of calculating the first three-dimensional coordinates of the target lamp bead in the second camera coordinate system through the second image coordinates and the depth distance. The only difference is the camera coordinate system used.
在一实施例中,根据上述的图像坐标系与相机坐标系的转换关系,可以将目标灯珠在第二标定相机对应的图像坐标系中的第二图像坐标转换到第二标定相机对应的相机坐标系中,从而获得目标灯珠在第二标定相机对应的相机坐标系中的X轴和Y轴坐标,而深度距离为目标灯珠在第二标定相机对应的相机坐标系中的Z轴坐标,由此,可以获得目标灯珠在第二标定相机对应的第二相机坐标系中的第一三维坐标。In one embodiment, according to the above-mentioned conversion relationship between the image coordinate system and the camera coordinate system, the second image coordinates of the target lamp bead in the image coordinate system corresponding to the second calibration camera can be converted to the camera coordinate system corresponding to the second calibration camera, thereby obtaining the X-axis and Y-axis coordinates of the target lamp bead in the camera coordinate system corresponding to the second calibration camera, and the depth distance is the Z-axis coordinate of the target lamp bead in the camera coordinate system corresponding to the second calibration camera. Thus, the first three-dimensional coordinates of the target lamp bead in the second camera coordinate system corresponding to the second calibration camera can be obtained.
本实施例中,根据第一图像坐标和第二图像坐标,计算目标灯珠在第一标定相机和第二标定相机采集的两张灯珠图像中的图像视差,进而根据图像视差、相机标定参数以及两个标定相机之间的基线距离,计算得到目标灯珠与两个标定相机之间的深度距离,通过双目视觉实现对目标灯珠的深度信息计算,从而提高目标灯珠的三维坐标计算的准确性。In this embodiment, based on the first image coordinates and the second image coordinates, the image parallax of the target lamp bead in the two lamp bead images captured by the first calibration camera and the second calibration camera is calculated, and then the depth distance between the target lamp bead and the two calibration cameras is calculated based on the image parallax, the camera calibration parameters and the baseline distance between the two calibration cameras. The depth information of the target lamp bead is calculated through binocular vision, thereby improving the accuracy of the three-dimensional coordinate calculation of the target lamp bead.
请参照图3,图3为本申请实施例提供的一种LED灯珠标定方法第三实施例的流程示意图。Please refer to FIG3 , which is a flow chart of a third embodiment of a method for calibrating an LED lamp bead provided in an embodiment of the present application.
如图3所示,基于上述图2所示实施例,所述步骤S104具体包括:As shown in FIG3 , based on the embodiment shown in FIG2 , step S104 specifically includes:
S301、基于所述第一标定参数和所述第三标定参数,计算所述第一标定相机和所述眼动相机之间的第二旋转矩阵和第二平移向量;S301, calculating a second rotation matrix and a second translation vector between the first calibration camera and the eye-tracking camera based on the first calibration parameter and the third calibration parameter;
在一实施例中,通过旋转和平移变换将三维的坐标转换为相机二维的坐标,其中的旋转矩阵和平移矩阵就被称为相机的外部标定参数,描述的是将世界坐标系转换成相机坐标系的过程。In one embodiment, three-dimensional coordinates are converted into two-dimensional coordinates of the camera through rotation and translation transformations, where the rotation matrix and translation matrix are called the external calibration parameters of the camera, which describe the process of converting the world coordinate system into the camera coordinate system.
在一实施例中,可以通过双目视觉标定的方式,确定两个相机之间的空间关系,即相对位置和相对姿态。In one embodiment, the spatial relationship between the two cameras, ie, the relative position and relative posture, may be determined by binocular vision calibration.
示例性的,以第一标定相机和眼动相机组成双目视觉系统,两个相机从不同角度同步拍摄同一场景(如棋盘格)的多组图片,使用特征提取和匹配算法(如SIFT、SURF或ORB)找到图片间的对应点。基于这些对应点,计算第一标定相机和眼动相机之间的第二旋转矩阵R13和第二平移向量T13。For example, a binocular vision system is composed of a first calibration camera and an eye-tracking camera. The two cameras simultaneously capture multiple images of the same scene (e.g., a checkerboard pattern) from different angles. A feature extraction and matching algorithm (e.g., SIFT, SURF, or ORB) is used to find corresponding points between the images. Based on these corresponding points, a second rotation matrix R13 and a second translation vector T13 are calculated between the first calibration camera and the eye-tracking camera.
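作为一种可能的实现,第二旋转矩阵R13和第二平移向量T13可通过OpenCV的立体标定接口求解;下述片段假设两相机的内参已按前述单相机标定得到,角点数据已按前述方式采集,其中的变量名仅为示例。As one possible implementation, the second rotation matrix R13 and the second translation vector T13 can be solved with OpenCV's stereo calibration interface; the following fragment assumes that the intrinsics of both cameras have already been obtained by the single-camera calibration described above and that the corner data have been collected as described, and the variable names are illustrative only.
```python
import cv2

# obj_points: board corner coordinates per view (as in the calibration sketch above);
# img_points_1 / img_points_3: matching corner pixel coordinates seen by the first
# calibration camera and the eye-tracking camera; K1, dist1, K3, dist3: their intrinsics;
# image_size: (width, height). All of these are assumed to be available already.
criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 100, 1e-5)

rms, K1, dist1, K3, dist3, R13, T13, E, F = cv2.stereoCalibrate(
    obj_points, img_points_1, img_points_3,
    K1, dist1, K3, dist3, image_size,
    flags=cv2.CALIB_FIX_INTRINSIC,   # keep the previously calibrated intrinsics fixed
    criteria=criteria)
# R13 and T13 describe the rigid transform between the two camera coordinate systems
```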
S302、基于所述第二旋转矩阵和所述第二平移向量,将所述第一三维坐标转化到所述眼动相机对应的相机坐标系中,获得所述目标三维坐标。S302: Based on the second rotation matrix and the second translation vector, transform the first three-dimensional coordinates into a camera coordinate system corresponding to the eye-tracking camera to obtain the target three-dimensional coordinates.
在一实施例中,将目标灯珠在第一标定相机对应的相机坐标系中的三维坐标转化到眼动相机对应的相机坐标系中,需要使用第一标定相机与眼动相机之间的第二旋转矩阵R13的逆矩阵和第二平移向量T13:In one embodiment, converting the three-dimensional coordinates of the target lamp bead in the camera coordinate system corresponding to the first calibration camera into the camera coordinate system corresponding to the eye-tracking camera requires the inverse of the second rotation matrix R13 between the first calibration camera and the eye-tracking camera, together with the second translation vector T13:
$$P_3=R_{13}^{-1}\,(P_1-T_{13})$$
其中,P3表示目标灯珠在眼动相机对应的相机坐标系中的目标三维坐标,P1为目标灯珠在第一标定相机的相机坐标系中的第一三维坐标(XB,YB,ZB)。Here, P3 denotes the target three-dimensional coordinates of the target lamp bead in the camera coordinate system corresponding to the eye-tracking camera, and P1 is the first three-dimensional coordinate (XB, YB, ZB) of the target lamp bead in the camera coordinate system of the first calibration camera.
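按照上述约定(即需要使用R13的逆矩阵),该坐标转化可用如下numpy示例表示;其中R13、T13和P1的取值均为示例性假设。Following the above convention (i.e., the inverse of R13 is required), this coordinate conversion can be expressed with the following numpy sketch; the values of R13, T13, and P1 are illustrative assumptions.
```python
import numpy as np

# R13, T13 from stereo calibration between the first calibration camera and the
# eye-tracking camera (illustrative values); P1 is the first 3D coordinate (XB, YB, ZB)
R13 = np.eye(3)
T13 = np.array([35.0, 0.0, 5.0])
P1 = np.array([12.3, -4.1, 1983.0])

# Rotation matrices are orthonormal, so the inverse of R13 equals its transpose
P3 = R13.T @ (P1 - T13)   # target 3D coordinate of the bead relative to the eye camera
```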
在一实施例中,还可以通过世界坐标系进行三维坐标转化,即通过第一标定参数和第三标定参数,确定第一标定相机的相机坐标系和世界坐标系之间的旋转矩阵和平移矩阵,以及眼动相机的相机坐标系和世界坐标系之间的旋转矩阵和平移矩阵,也即第一标定相机的外部标定参数和眼动相机的外部标定参数。In one embodiment, three-dimensional coordinate transformation can also be performed through the world coordinate system, that is, the rotation matrix and translation matrix between the camera coordinate system of the first calibration camera and the world coordinate system, as well as the rotation matrix and translation matrix between the camera coordinate system of the eye-tracking camera and the world coordinate system, are determined through the first calibration parameters and the third calibration parameters, that is, the external calibration parameters of the first calibration camera and the external calibration parameters of the eye-tracking camera.
进一步地,基于所述第三标定参数,获得所述眼动相机对应的相机坐标系与所述世界坐标系之间的坐标转换矩阵;基于所述坐标转换矩阵,将所述全局三维坐标转换到所述眼动相机对应的相机坐标系中,获得所述目标三维坐标。Furthermore, based on the third calibration parameter, a coordinate transformation matrix between the camera coordinate system corresponding to the eye-tracking camera and the world coordinate system is obtained; based on the coordinate transformation matrix, the global three-dimensional coordinates are transformed into the camera coordinate system corresponding to the eye-tracking camera to obtain the target three-dimensional coordinates.
在一实施例中,根据外部标定参数(旋转矩阵和平移矩阵),通过上述的相机坐标系与世界坐标系的转换关系,将第一相机坐标系中的第一三维坐标转换到世界坐标系中,获得目标灯珠在世界坐标系中的全局三维坐标。其中,相机坐标系与世界坐标系的转换关系如下:
In one embodiment, based on the external calibration parameters (rotation matrix and translation matrix), the first three-dimensional coordinates in the first camera coordinate system are converted to the world coordinate system through the above-mentioned conversion relationship between the camera coordinate system and the world coordinate system to obtain the global three-dimensional coordinates of the target lamp bead in the world coordinate system. Among them, the conversion relationship between the camera coordinate system and the world coordinate system is as follows:
$$\begin{bmatrix}X_c\\ Y_c\\ Z_c\\ 1\end{bmatrix}=\begin{bmatrix}R & t\\ 0^{T} & 1\end{bmatrix}\begin{bmatrix}X\\ Y\\ Z\\ 1\end{bmatrix}$$
其中,R为3×3的旋转矩阵,t为三维平移向量,T为平移矩阵,(Xc,Yc,Zc)表示相机坐标系中的三维点坐标,(X,Y,Z)表示世界坐标系中的三维点坐标。Where R is a 3×3 rotation matrix, t is a three-dimensional translation vector, T is a translation matrix, (Xc ,Yc ,Zc ) represents the three-dimensional point coordinates in the camera coordinate system, and (X, Y, Z) represents the three-dimensional point coordinates in the world coordinate system.
在一实施例中,根据第三标定参数,获得眼动相机的外部标定参数,即眼动相机的相机坐标系与世界坐标系之间的旋转矩阵和平移矩阵。进而获得相机坐标系与世界坐标系之间的坐标转换矩阵,也即上述相机坐标系与世界坐标系的转换关系。In one embodiment, the extrinsic calibration parameters of the eye-tracking camera are obtained based on the third calibration parameters, namely, the rotation matrix and translation matrix between the camera coordinate system and the world coordinate system of the eye-tracking camera. Furthermore, the coordinate transformation matrix between the camera coordinate system and the world coordinate system is obtained, namely, the transformation relationship between the camera coordinate system and the world coordinate system.
在一实施例中,获取到目标灯珠在世界坐标系中的全局三维坐标,且获得眼动相机的相机坐标系与世界坐标系之间的坐标转换矩阵,即可根据全局三维坐标和坐标转换矩阵,计算得出目标灯珠在眼动相机的相机坐标系中的目标三维坐标。In one embodiment, the global three-dimensional coordinates of the target lamp bead in the world coordinate system are obtained, and the coordinate transformation matrix between the camera coordinate system of the eye-tracking camera and the world coordinate system is obtained. Then, the target three-dimensional coordinates of the target lamp bead in the camera coordinate system of the eye-tracking camera can be calculated based on the global three-dimensional coordinates and the coordinate transformation matrix.
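若经由世界坐标系进行转换,其计算过程可用如下numpy示例表示;其中第一标定相机和眼动相机的外部参数(世界坐标系到相机坐标系的旋转矩阵和平移向量)均为示例性取值。If the conversion is carried out via the world coordinate system, the computation can be expressed with the following numpy sketch; the external parameters of the first calibration camera and the eye-tracking camera (the rotation matrices and translation vectors from the world coordinate system to the camera coordinate systems) are illustrative values.
```python
import numpy as np

# World -> camera extrinsics of the first calibration camera and of the
# eye-tracking camera (illustrative values only)
R1, t1 = np.eye(3), np.array([0.0, 0.0, 100.0])
R3, t3 = np.eye(3), np.array([35.0, 0.0, 105.0])

P1 = np.array([12.3, -4.1, 1983.0])   # bead in the first calibration camera's frame

# Step 1: first camera coordinate system -> world coordinate system (global 3D coordinate)
P_world = R1.T @ (P1 - t1)
# Step 2: world coordinate system -> eye-tracking camera coordinate system (target 3D coordinate)
P3 = R3 @ P_world + t3
```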
在一实施例中,因为眼动相机的相机坐标系中以眼动相机的光轴中心作为坐标原点,因此,可以将目标三维坐标作为目标灯珠相对于眼动相机的相对位置表示。In one embodiment, because the camera coordinate system of the eye-tracking camera uses the center of the optical axis of the eye-tracking camera as the coordinate origin, the target three-dimensional coordinates can be used to represent the relative position of the target lamp bead relative to the eye-tracking camera.
本实施例中,根据第一标定参数和坐标转换矩阵,将目标灯珠在第一标定坐标系中的三维坐标转换到世界坐标系中,获得全局三维坐标。通过眼动相机的第三标定参数,计算眼动相机的相机坐标系与世界坐标系之间的坐标转换矩阵,继而根据坐标转换矩阵,将目标灯珠在世界坐标系中的全局三维坐标转换到眼动相机的相机坐标系中,从而在眼动相机的相机坐标系中标定目标灯珠与眼动相机的相对位置关系,可以避免求解眼动相机在世界坐标系中的点坐标所产生的误差,从而提高眼动相机和目标灯珠的相对位置的标定准确性。In this embodiment, the three-dimensional coordinates of the target lamp bead in the first calibration coordinate system are converted to the world coordinate system based on the first calibration parameters and the coordinate transformation matrix to obtain global three-dimensional coordinates. The coordinate transformation matrix between the eye-tracking camera's camera coordinate system and the world coordinate system is calculated using the eye-tracking camera's third calibration parameters. Subsequently, based on the coordinate transformation matrix, the global three-dimensional coordinates of the target lamp bead in the world coordinate system are converted to the eye-tracking camera's camera coordinate system. This allows the relative positional relationship between the target lamp bead and the eye-tracking camera to be calibrated in the eye-tracking camera's camera coordinate system. This avoids errors associated with solving the eye-tracking camera's point coordinates in the world coordinate system, thereby improving the calibration accuracy of the relative position between the eye-tracking camera and the target lamp bead.
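Putting the steps of this embodiment together, a self-contained sketch of the full chain (first calibration camera frame → world frame → eye-tracking camera frame) might look as follows; R1, t1 and R_eye, t_eye stand for the extrinsics derived from the first and third calibration parameters and are illustrative names only.

```python
import numpy as np

def led_target_coordinates(p_cam1, R1, t1, R_eye, t_eye):
    """First calibration camera frame -> world frame -> eye-tracking camera frame.

    (R1, t1):       extrinsics of the first calibration camera (world -> first camera).
    (R_eye, t_eye): extrinsics of the eye-tracking camera (world -> eye camera).
    """
    p_cam1 = np.asarray(p_cam1, dtype=float).reshape(3)
    R1 = np.asarray(R1, dtype=float).reshape(3, 3)
    t1 = np.asarray(t1, dtype=float).reshape(3)
    # Global three-dimensional coordinate of the lamp bead in the world frame.
    p_world = R1.T @ (p_cam1 - t1)
    # Target three-dimensional coordinate in the eye-tracking camera frame.
    return np.asarray(R_eye, dtype=float).reshape(3, 3) @ p_world + np.asarray(t_eye, dtype=float).reshape(3)
```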
请参阅图4,图4是本申请提供的一种LED灯珠标定装置第一实施例的结构示意图,该LED灯珠标定装置用于执行前述的LED灯珠标定方法。Please refer to FIG. 4, which is a structural diagram of a first embodiment of an LED lamp bead calibration device provided in the present application. The LED lamp bead calibration device is used to execute the aforementioned LED lamp bead calibration method.
如图4所示,该LED灯珠标定装置400,包括:标定参数获取模块401、图像坐标获得模块402、三维坐标计算模块403和三维坐标转换模块404。As shown in FIG. 4, the LED lamp bead calibration device 400 includes: a calibration parameter acquisition module 401, an image coordinate obtaining module 402, a three-dimensional coordinate calculation module 403 and a three-dimensional coordinate conversion module 404.
标定参数获取模块401,用于获取第一标定相机对应的第一标定参数和眼动相机的第三标定参数;The calibration parameter acquisition module 401 is used to acquire a first calibration parameter corresponding to the first calibration camera and a third calibration parameter of the eye-tracking camera;
图像坐标获得模块402,用于基于所述第一标定相机和第二标定相机分别采集的目标灯珠的灯珠图像,获得所述第一标定相机对应的第一图像坐标和所述第二标定相机对应的第二图像坐标;An image coordinate obtaining module 402 is configured to obtain first image coordinates corresponding to the first calibration camera and second image coordinates corresponding to the second calibration camera based on the lamp bead images of the target lamp bead captured by the first calibration camera and the second calibration camera respectively;
三维坐标计算模块403,用于基于所述第一图像坐标、所述第二图像坐标和所述第一标定参数,计算所述目标灯珠在所述第一标定相机对应的相机坐标系中的第一三维坐标;A three-dimensional coordinate calculation module 403 is used to calculate the first three-dimensional coordinates of the target lamp bead in the camera coordinate system corresponding to the first calibration camera based on the first image coordinates, the second image coordinates and the first calibration parameters;
三维坐标转换模块404,用于基于所述第三标定参数和所述第一标定参数,将所述第一三维坐标转化到眼动相机对应的相机坐标系中,获得所述目标灯珠相对于所述眼动相机的目标三维坐标。The three-dimensional coordinate conversion module 404 is used to convert the first three-dimensional coordinate into the camera coordinate system corresponding to the eye-tracking camera based on the third calibration parameter and the first calibration parameter, and obtain the target three-dimensional coordinate of the target lamp bead relative to the eye-tracking camera.
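As an illustration only — the application defines functional modules rather than a concrete implementation, and every name below is hypothetical — the four modules of device 400 could be organized along the following lines:

```python
class LedCalibrationDevice:
    """Sketch of the four functional modules of the LED lamp bead calibration device."""

    def acquire_calibration_parameters(self, first_camera, eye_camera):
        """Module 401: obtain the first calibration parameters of the first
        calibration camera and the third calibration parameters of the eye camera."""
        raise NotImplementedError

    def obtain_image_coordinates(self, image_first, image_second):
        """Module 402: locate the target lamp bead in the images captured by the
        first and second calibration cameras (first and second image coordinates)."""
        raise NotImplementedError

    def compute_first_3d_coordinates(self, uv_first, uv_second, first_params):
        """Module 403: triangulate the lamp bead in the first calibration
        camera's coordinate system (first three-dimensional coordinates)."""
        raise NotImplementedError

    def convert_to_eye_camera_frame(self, p_first, first_params, third_params):
        """Module 404: transform the first three-dimensional coordinates into the
        eye-tracking camera's coordinate system (target three-dimensional coordinates)."""
        raise NotImplementedError
```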
需要说明的是,所属领域的技术人员可以清楚地了解到,为了描述的方便和简洁,上述描述的装置和各模块的具体工作过程,可以参考前述LED灯珠标定方法实施例中的对应过程,在此不再赘述。It should be noted that, those skilled in the art can clearly understand that, for the convenience and brevity of description, the specific working processes of the above-described devices and modules can refer to the corresponding processes in the aforementioned LED lamp bead calibration method embodiment, and will not be repeated here.
上述实施例提供的装置可以实现为一种计算机程序的形式,该计算机程序可以在如图5所示的计算机设备上运行。The apparatus provided in the above embodiment may be implemented in the form of a computer program, and the computer program may be run on a computer device as shown in FIG5 .
请参阅图5,图5是本申请实施例提供的一种计算机设备的结构示意性框图。该计算机设备可以是服务器。Please refer to Figure 5, which is a schematic block diagram of the structure of a computer device provided in an embodiment of the present application. The computer device may be a server.
参阅图5,该计算机设备包括通过系统总线连接的处理器、存储器和网络接口,其中,存储器可以包括非易失性存储介质和内存储器。Referring to FIG. 5, the computer device includes a processor, a memory, and a network interface connected via a system bus, wherein the memory may include a non-volatile storage medium and an internal memory.
非易失性存储介质可存储操作系统和计算机程序。该计算机程序包括程序指令,该程序指令被执行时,可使得处理器执行任意一种LED灯珠标定方法。The non-volatile storage medium can store an operating system and a computer program. The computer program includes program instructions that, when executed, enable the processor to perform any of the LED lamp bead calibration methods.
处理器用于提供计算和控制能力,支撑整个计算机设备的运行。The processor is used to provide computing and control capabilities and support the operation of the entire computer equipment.
内存储器为非易失性存储介质中的计算机程序的运行提供环境,该计算机程序被处理器执行时,可使得处理器执行任意一种LED灯珠标定方法。The internal memory provides an environment for the operation of the computer program in the non-volatile storage medium. When the computer program is executed by the processor, the processor can execute any LED lamp bead calibration method.
该网络接口用于进行网络通信,如发送分配的任务等。本领域技术人员可以理解,图5中示出的结构,仅仅是与本申请方案相关的部分结构的框图,并不构成对本申请方案所应用于其上的计算机设备的限定,具体的计算机设备可以包括比图中所示更多或更少的部件,或者组合某些部件,或者具有不同的部件布置。The network interface is used for network communication, such as sending assigned tasks, etc. Those skilled in the art will understand that the structure shown in FIG5 is merely a block diagram of a portion of the structure related to the solution of the present application, and does not constitute a limitation on the computer device to which the solution of the present application is applied. A specific computer device may include more or fewer components than shown in the figure, or combine certain components, or have a different arrangement of components.
应当理解的是,处理器可以是中央处理单元(Central Processing Unit,CPU),该处理器还可以是其他通用处理器、数字信号处理器(Digital Signal Processor,DSP)、专用集成电路(Application Specific Integrated Circuit,ASIC)、现场可编程门阵列(Field-Programmable Gate Array,FPGA)或者其他可编程逻辑器件、分立门或者晶体管逻辑器件、分立硬件组件等。其中,通用处理器可以是微处理器或者该处理器也可以是任何常规的处理器等。It should be understood that the processor may be a central processing unit (CPU), or other general-purpose processors, digital signal processors (DSP), application-specific integrated circuits (ASIC), field-programmable gate arrays (FPGA), or other programmable logic devices, discrete gate or transistor logic devices, discrete hardware components, etc. The general-purpose processor may be a microprocessor or any conventional processor, etc.
本申请的实施例中还提供一种计算机可读存储介质,所述计算机可读存储介质存储有计算机程序,所述计算机程序中包括程序指令,所述处理器执行所述程序指令,实现本申请实施例提供的任一种LED灯珠标定方法。A computer-readable storage medium is also provided in an embodiment of the present application. The computer-readable storage medium stores a computer program. The computer program includes program instructions. The processor executes the program instructions to implement any LED lamp bead calibration method provided in the embodiment of the present application.
其中,所述计算机可读存储介质可以是前述实施例所述的计算机设备的内部存储单元,例如所述计算机设备的硬盘或内存。所述计算机可读存储介质也可以是所述计算机设备的外部存储设备,例如所述计算机设备上配备的插接式硬盘,智能存储卡(Smart Media Card,SMC),安全数字(Secure Digital,SD)卡,闪存卡(Flash Card)等。The computer-readable storage medium may be an internal storage unit of the computer device described in the aforementioned embodiment, such as a hard disk or memory of the computer device. The computer-readable storage medium may also be an external storage device of the computer device, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, a Flash Card, etc., equipped on the computer device.
以上所述,仅为本申请的具体实施方式,但本申请的保护范围并不局限于此,任何熟悉本技术领域的技术人员在本申请揭露的技术范围内,可轻易想到各种等效的修改或替换,这些修改或替换都应涵盖在本申请的保护范围之内。因此,本申请的保护范围应以权利要求的保护范围为准。The above description is merely a specific embodiment of the present application, but the scope of protection of the present application is not limited thereto. Any person skilled in the art can easily conceive of various equivalent modifications or substitutions within the technical scope disclosed in the present application, and such modifications or substitutions should be included in the scope of protection of the present application. Therefore, the scope of protection of the present application should be based on the scope of protection of the claims.