Technical Field
The present invention belongs to the technical field of computer vision, and in particular relates to a gaze-trajectory-based method and system for gaze tracking error compensation or parameter calibration.
Background Art
The human eye is the brain's primary source of information. Gaze tracking is a technique that uses eye-movement information to compute the direction of gaze or the point of regard. It is widely used in human-computer interaction, medical assistance, advertising, and driver assistance, and has promising prospects in virtual reality, augmented reality, and related fields. Depending on where the eye camera is mounted, there are two classes of scenarios, near-eye gaze tracking and remote gaze tracking, with both multi-camera and single-camera solutions. In near-eye gaze tracking the camera is usually mounted on the frame of glasses worn by the user, mainly for medical assistance, driver assistance, and similar settings; in remote gaze tracking the camera is usually mounted near the screen, for example at the top of a laptop display, at some distance from the eye, and is used mainly for human-computer interaction, advertising, and so on.
Gaze tracking techniques fall into two broad categories: model-based gaze estimation and feature-learning-based gaze estimation; both research and applications are dominated by single-camera solutions. Feature-learning methods require large amounts of training data, and current machine-learning models yield results that behave as a black box that is difficult to interpret, so their practical value remains limited.
Model-based gaze estimation relies mainly on the physiological structure of the human eye. It constructs a two-dimensional mapping model or a three-dimensional geometric model and computes the gaze direction or point of regard from geometric information such as facial landmarks, the pupil, and the eye corners, and therefore has good practical value. Two-dimensional mapping methods can be subdivided into three types: the corneal reflection method, the cross-ratio method, and the homography normalization method. The corneal reflection method and the homography normalization method both require calibration of key model parameters, such as the kappa angle of the eye and the homography matrix; the cross-ratio method exploits the projective invariance of the cross ratio to estimate gaze and thereby avoids parameter calibration, but it must instead build an error compensation table for the point of regard to compensate for the errors caused by the kappa angle and other factors.
Two-dimensional mapping methods calibrate the eye parameters or build the error compensation table mainly from the pupil or iris center. Three-dimensional geometric model methods more often use statistical analysis or machine learning to compute the model parameters, which requires more base data or data training.
Whichever method is used, single-camera gaze tracking requires a large amount of actual measurement tailored to the characteristics of the gaze tracking model, and both systematic-error modeling and calibration involve a heavy data-collection workload.
Because of the kappa angle between the optical axis and the visual axis of the eye, gaze-tracking model errors, and operational errors during the experiment, the computed gaze point inevitably deviates from the true gaze point, so both error modeling and parameter calibration require a sufficient number of gaze-point measurements and estimates. The common approach is to record the computed gaze positions for several calibration points on the screen, solve for the mapping between the computed positions and the calibration positions, and then perform systematic error compensation or model parameter calibration. At least nine calibration points are typically used, and the calibration procedure requires prolonged fixation on each point, which demands considerable skill from the user; prolonged fixation also causes ocular discomfort.
Through the above analysis, the problems and defects of the prior art are as follows:
in traditional gaze tracking systems, gaze tracking error compensation or parameter calibration involves a heavy workload, demands considerable skill from the user, and provides a poor operating experience.
Summary of the Invention
In view of the problems in the prior art, the present invention provides a gaze-trajectory-based gaze tracking error compensation or parameter calibration method and system.
The present invention is implemented as follows. A gaze-trajectory-based method for gaze tracking parameter calibration or error compensation comprises:
tracking a moving point that follows a given trajectory, selecting eye images at different points of the time series to construct a mapping model according to the chosen gaze estimation method, and computing the corresponding gaze points; and solving the compensation table coefficients or calibrating the model parameters of the system using the shape information of the gaze trajectory together with the least-squares method.
Error compensation is represented here by the cross-ratio method among the two-dimensional mapping methods, and parameter calibration by the corneal reflection method. Three-dimensional geometric model methods and feature-learning-based methods can likewise use the gaze trajectory, either to calibrate model parameters or to build a large training data set.
Here, compensation using the trajectory shape information means deriving the position and shape adjustment parameters of the trajectory from the positions of special points. Solving the compensation table coefficients or calibrating the model parameters by least squares exploits the fact that tracking a gaze trajectory yields a large number of gaze points quickly and conveniently; this gaze-point information is then fed into the least-squares procedure to solve the compensation table coefficients or calibrate the model parameters.
Further, a gaze model is constructed and the gaze point is computed:
the point where the line of sight falls on the screen is computed according to the gaze estimation method in use; for example, with the cross-ratio method, the gaze mapping model is constructed from cross-ratio invariance and the on-screen gaze position is computed.
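As a minimal numerical sketch of the cross-ratio invariance this construction relies on (not the patent's implementation; the coordinates and the projective map `h` are invented for illustration), the cross ratio of four collinear points is preserved by any projective map, so an unknown screen coordinate can be recovered from its image coordinate and three known correspondences:

```python
def cross_ratio(a, b, c, d):
    """Cross ratio (a, b; c, d) of four collinear points (scalar coordinates)."""
    return ((a - c) * (b - d)) / ((b - c) * (a - d))

def recover_screen_point(l1, l2, l3, g1, g2, g3, p0):
    """Recover an unknown screen coordinate S from its image p0, given three
    known screen points l1..l3 and their images g1..g3, using
    cross_ratio(l1, l2, l3, S) == cross_ratio(g1, g2, g3, p0)."""
    k = cross_ratio(g1, g2, g3, p0)
    # k = ((l1 - l3)(l2 - S)) / ((l2 - l3)(l1 - S)) is linear in S:
    A = k * (l2 - l3)
    B = l1 - l3
    return (B * l2 - A * l1) / (B - A)

# A hypothetical fractional-linear map standing in for the projective chain
# from the screen plane to the camera image.
def h(x):
    return (2.0 * x + 1.0) / (0.01 * x + 3.0)

l1, l2, l3, S_true = 0.0, 10.0, 20.0, 7.0        # screen coordinates (invented)
g1, g2, g3, p0 = h(l1), h(l2), h(l3), h(S_true)  # their images
S = recover_screen_point(l1, l2, l3, g1, g2, g3, p0)  # recovers S_true = 7.0
```

The same invariance argument, applied per axis in the planar configuration of FIG. 2, is what lets the cross-ratio method skip parameter calibration.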
Further, gaze-point correction is performed using the shape information of the gaze trajectory, as follows:
the user's eye follows a point moving at constant speed on the calibration screen, whose trajectory is a regular semicircle or circle.
Further, the trajectory of the gaze points on the calibration screen is computed frame by frame. For convenience of processing, points at special angles of the trajectory (such as 0°, 180°, etc.) are selected and compared with the original trajectory of the moving point to obtain the angle θ and the center position (x′0, y′0).
Having the eye follow a uniformly moving point on the calibration screen avoids the fixation-alignment problem of point-by-point calibration. Meanwhile, by recording the timing of the eye video captured by the camera and of the moving point on the screen, the time correspondence allows the gaze mapping model to compute the gaze position of each point quickly, which greatly reduces the point-measurement workload compared with multi-point calibration.
Further, from the positions of special points, for example the two endpoints of a diameter, the correspondence between the original gaze trajectory and the fitted gaze trajectory is computed:
as shown in FIG. 5, the angle between the fitted trajectory and the original trajectory is denoted θ; the radius of the original trajectory is denoted R and its center is (x0, y0); the radius of the fitted trajectory is denoted r and its center is (x′0, y′0).
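The special-point comparison can be sketched numerically as follows (a toy example: the distortion, center, and radius are invented, and the algebraic circle fit is one common choice, not necessarily the patent's). A circle is fitted to the computed gaze points by least squares, and θ is taken from the direction between the samples corresponding to the 0° and 180° positions of the original (horizontal) diameter:

```python
import numpy as np

def fit_circle(x, y):
    """Algebraic (Kasa) least-squares circle fit:
    x^2 + y^2 = 2*a*x + 2*b*y + c, center (a, b), radius sqrt(c + a^2 + b^2)."""
    A = np.column_stack([2 * x, 2 * y, np.ones_like(x)])
    a, b, c = np.linalg.lstsq(A, x**2 + y**2, rcond=None)[0]
    return a, b, np.sqrt(c + a * a + b * b)

# Original semicircular trajectory: center (500, 300), radius 200, 0..180 deg.
t = np.linspace(0.0, np.pi, 181)
xr, yr = 500 + 200 * np.cos(t), 300 + 200 * np.sin(t)

# Hypothetical computed gaze trajectory: a shrunk, rotated, shifted copy.
phi, s = 0.12, 0.8
xe = s * (np.cos(phi) * xr - np.sin(phi) * yr) - 40
ye = s * (np.sin(phi) * xr + np.cos(phi) * yr) + 25

cx, cy, r = fit_circle(xe, ye)        # fitted center (x'0, y'0) and radius r
# Endpoints are the samples at 0 deg and 180 deg of the motion; the original
# diameter is horizontal, so theta is simply this direction angle.
theta = np.arctan2(ye[0] - ye[-1], xe[0] - xe[-1])   # recovers 0.12
```

With noise-free synthetic data the fit recovers the distorted center, radius, and tilt exactly; on real gaze data the fit is only approximate.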
The original trajectory of the moving point on the screen is written as the function:
f(x, y) = (x − x0)² + (y − y0)² − R² = 0 #(1)
The gaze-point trajectory computed from the camera raster images and a given gaze tracking model is a closed curve f_image(x′, y′) that contains trajectory errors. Clearly, f_image(x′, y′) must correspond to the circle of the original trajectory, and may be written as:
f_image(x′, y′, model(k, d1, d2, ...)) = (x′ − x′0)² + (y′ − y′0)² − r² = 0 #(2)
where (x′1, y′1) and (x′2, y′2) are the two endpoints of the computed gaze-point trajectory, corresponding respectively to the start and end of the original semicircular trajectory of the moving point, and (x0, y0) and R are the center and radius of that original semicircular trajectory. model(k, d1, d2, ...) denotes the gaze tracking model, in which k, d1, d2, and so on are its system parameters.
From the positional correspondence of the two endpoints of the semicircle's diameter, the projective transformation between the computed gaze-point trajectory and the original trajectory is obtained, comprising an arc-radius adjustment, a rotation, and a translation, written here as 3×3 matrices acting on homogeneous coordinates (x, y, 1)ᵀ:
First the arc radius is adjusted: the scaling matrix Tp adjusts the radius of the fitted arc to the original arc radius, i.e. the fitted arc is magnified uniformly by the scale factor R/r:
Tp = [[R/r, 0, 0], [0, R/r, 0], [0, 0, 1]]
Any computed position on the fitted gaze trajectory after scaling:
(x″, y″, 1)ᵀ = Tp · (x′, y′, 1)ᵀ
Then the rotation matrix TR brings the diameters of the two semicircular arcs flush, rotating the scaled arc through the angle −θ:
TR = [[cos θ, sin θ, 0], [−sin θ, cos θ, 0], [0, 0, 1]]
The angle θ is obtained from the diameter endpoints:
cos θ = |x′1 − x′2| / √((x′1 − x′2)² + (y′1 − y′2)²)
If y′1 − y′2 > 0, θ is taken as a positive angle; otherwise, θ is taken as a negative angle.
Any position on the fitted gaze trajectory after rotation:
(x‴, y‴, 1)ᵀ = TR · (x″, y″, 1)ᵀ
The translation matrix TT makes the centers of the two arcs coincide:
TT = [[1, 0, Δx], [0, 1, Δy], [0, 0, 1]], where (Δx, Δy) carries the scaled and rotated center of the fitted arc onto (x0, y0).
Any position after the fitted gaze trajectory has been brought into coincidence with the original semicircular trajectory:
(x_c, y_c, 1)ᵀ = TT · TR · Tp · (x′, y′, 1)ᵀ #(8)
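The scale-rotate-translate chain of Tp, TR, and TT can be sketched and self-checked as follows (homogeneous 3×3 matrices and a synthetic distortion invented for illustration; the patent's own matrix layout may differ):

```python
import numpy as np

def T_scale(s):
    return np.array([[s, 0, 0], [0, s, 0], [0, 0, 1.0]])

def T_rot(t):
    c, s = np.cos(t), np.sin(t)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1.0]])

def T_trans(dx, dy):
    return np.array([[1, 0, dx], [0, 1, dy], [0, 0, 1.0]])

# Original semicircle (homogeneous 3xN) and a hypothetical distorted copy.
x0, y0, R = 500.0, 300.0, 200.0
t = np.linspace(0.0, np.pi, 91)
ref = np.vstack([x0 + R * np.cos(t), y0 + R * np.sin(t), np.ones_like(t)])
D = T_trans(-40, 25) @ T_rot(0.12) @ T_scale(0.8)   # unknown distortion
est = D @ ref                                        # computed gaze trajectory

# Parameters from the diameter endpoints (samples at 0 deg and 180 deg).
p1, p2 = est[:2, 0], est[:2, -1]
r = np.linalg.norm(p1 - p2) / 2.0
theta = np.arctan2(p1[1] - p2[1], p1[0] - p2[0])     # tilt of fitted diameter
c_fit = np.append((p1 + p2) / 2.0, 1.0)              # fitted center, homogeneous

Tp = T_scale(R / r)              # radius adjustment
TR = T_rot(-theta)               # align the diameters
cx, cy, _ = TR @ Tp @ c_fit      # fitted center after scaling and rotation
TT = T_trans(x0 - cx, y0 - cy)   # make the centers coincide

mapped = TT @ TR @ Tp @ est      # every point lands back on the original circle
```

Because the distortion here is an exact similarity, the composed transform returns every trajectory point onto the reference circle; with real data the residual that remains after this alignment is what the polynomial error term below must absorb.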
Further, the positions of all points on the gaze-point trajectory f_image(x′, y′, model(k, d1, d2, ...)) must satisfy the circular trajectory equation of formula (1). Therefore, after processing by formula (8), the computed positions of the moving point in all video frames should all satisfy the constraint of formula (1). Owing to gaze-tracking model errors, however, when the trajectory coordinates obtained from formula (8) are substituted into formula (1), a non-zero deviation necessarily remains.
Further, when the cross-ratio method, which needs no parameter calibration, is used and error compensation is performed instead, an error compensation term err is added, for example a quadratic polynomial error
err = (a·x′² + b·x′ + c, d·y′² + e·y′ + f)
or a cubic polynomial error with the analogous cubic terms.
Therefore, based on the least-squares principle, the parameters in err are treated as the unknowns and expression (9) is minimized:
min{ Σ ( f_image(x′, y′, model(k, d1, d2, ...)) · Tp · TR · TT + err − f(x_ref, y_ref) )² } #(9)
Taking the derivative of (9) with respect to each coefficient of the quadratic (or cubic) polynomial and setting it to zero, i.e. ∂/∂a = 0, ∂/∂b = 0, and so on, yields the error compensation coefficients. The coefficients obtained by least squares constitute the polynomial error-model compensation table.
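Solving the compensation coefficients of (9) by least squares can be sketched as follows (synthetic residuals and invented "true" coefficients; only the quadratic x-direction term is shown, the y-direction being identical in form):

```python
import numpy as np

rng = np.random.default_rng(0)
xp = rng.uniform(0.0, 1000.0, 200)        # aligned computed x-coordinates
a, b, c = 1e-4, -0.05, 12.0               # hypothetical true coefficients
x_ref = xp + (a * xp**2 + b * xp + c)     # reference positions actually gazed at

# Fit err_x = a*x'^2 + b*x' + c so that x' + err_x matches x_ref:
A = np.column_stack([xp**2, xp, np.ones_like(xp)])
coef, *_ = np.linalg.lstsq(A, x_ref - xp, rcond=None)
```

With noise-free synthetic data `coef` recovers (a, b, c) to machine precision; with real gaze data the fit is approximate, which is why the singular-point removal described next matters.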
Further, the least-squares method is strongly affected by singular values, which can bias the result considerably, and during eye-image acquisition it can easily happen that the eye inadvertently glances at the region around the target point; to guarantee the accuracy of the final result, such singular points must be removed. Suppose n gaze points are taken in total and the deviation distance of the i-th gaze point is Li; the mean deviation is then
L̄ = (1/n) Σ Li
When the ratio Li/L̄ is greater than 1.5 or less than 0.75, the point is regarded as a singular point, and its data are removed and not entered into the equations.
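The singular-point rule can be sketched as follows (the deviation values are invented for illustration):

```python
import numpy as np

def singular_mask(L, low=0.75, high=1.5):
    """Flag gaze points whose deviation distance L_i, relative to the mean
    deviation, falls outside [low, high]; flagged points are dropped."""
    L = np.asarray(L, dtype=float)
    ratio = L / L.mean()
    return (ratio < low) | (ratio > high)

# Five ordinary deviations plus one inadvertent glance far off target.
L = [3.0, 3.1, 2.9, 3.0, 3.2, 6.0]
mask = singular_mask(L)   # only the last point is flagged
```

Note that, as stated, the rule also discards points whose deviation is unusually small (ratio below 0.75); the thresholds 1.5 and 0.75 are those given above, and a single very large outlier can inflate the mean enough to flag ordinary points, so thresholds may need tuning per setup.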
Based on the least-squares principle, the computed coefficients are obtained from the normal equations; for the x-direction quadratic a·x′² + b·x′ + c these take the form:
a·Σx′⁴ + b·Σx′³ + c·Σx′² = Σ x′²·Δx
a·Σx′³ + b·Σx′² + c·Σx′ = Σ x′·Δx
a·Σx′² + b·Σx′ + c·n = Σ Δx
where Δx denotes the x-direction residual between the reference position and the transformed computed position, and n is the number of points substituted.
Solving the three simultaneous equations in a, b, c yields the values of a, b, c, and solving the three simultaneous equations in d, e, f yields the values of d, e, f.
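The three simultaneous equations in a, b, c (and likewise in d, e, f) can be written out and solved directly; a sketch with synthetic residuals (data and coefficients invented; Δx here is the modelled x-residual) shows that they reproduce the least-squares solution:

```python
import numpy as np

xp = np.linspace(0.0, 10.0, 50)      # computed x-coordinates (invented)
a, b, c = 0.2, -0.8, 5.0             # hypothetical true coefficients
dx = a * xp**2 + b * xp + c          # x-residuals to be modelled

# Normal equations obtained by zeroing the derivatives w.r.t. a, b, c:
M = np.array([
    [np.sum(xp**4), np.sum(xp**3), np.sum(xp**2)],
    [np.sum(xp**3), np.sum(xp**2), np.sum(xp)],
    [np.sum(xp**2), np.sum(xp),    len(xp)],
])
rhs = np.array([np.sum(xp**2 * dx), np.sum(xp * dx), np.sum(dx)])
abc = np.linalg.solve(M, rhs)        # recovers (a, b, c)
```

In practice, solving the Vandermonde system with `np.linalg.lstsq` (as above) is numerically better conditioned than forming the normal equations explicitly; the explicit form is shown only to mirror the three equations in the text.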
Further, when the corneal reflection method or the homography normalization method, which do require parameter calibration, is used, the parameters to be calibrated in the model (k, d1, d2, ...) are subjected to an analogous least-squares treatment and the corresponding minimization problem is solved.
The coefficients obtained by the least-squares method are the calibrated parameter results.
Based on the least-squares principle, the coefficient-solving equations of the error polynomial are established as:
……
Another object of the present invention is to provide a computer device comprising a memory and a processor, the memory storing a computer program which, when executed by the processor, causes the processor to perform the steps of the gaze-trajectory-based gaze tracking error compensation or parameter calibration method.
Another object of the present invention is to provide a computer-readable storage medium storing a computer program which, when executed by a processor, causes the processor to perform the steps of the gaze-trajectory-based gaze tracking error compensation or parameter calibration method.
Another object of the present invention is to provide an information data processing terminal for implementing the gaze-trajectory-based gaze tracking error compensation or parameter calibration system.
In combination with the above technical solutions and the technical problems solved, the advantages and positive effects of the claimed technical solutions are as follows:
First, regarding the technical problems in the prior art and the difficulty of solving them, and in close connection with the claimed technical solutions and the results and data obtained during development, how the technical solutions of the present invention solve those problems, and the inventive technical effects thereby obtained, are described as follows:
Aimed at the heavy workload, high user-skill requirements, and poor operating experience of gaze tracking error compensation or parameter calibration in traditional gaze tracking systems, the present invention proposes a gaze-trajectory-based method for gaze tracking error compensation or parameter calibration. By tracking a moving point along a given trajectory, eye images at different points of the time series are selected and the corresponding gaze points are computed by the chosen gaze estimation method; the shape information of the gaze trajectory, together with the ability of trajectory tracking to obtain a large number of gaze points quickly and conveniently, is then used to solve the compensation table coefficients or calibrate the model parameters by least squares. Comparison of the gaze-point deviations obtained with different processing methods shows that the disclosed method simplifies error compensation or parameter calibration while preserving gaze computation accuracy.
Second, considering the technical solution as a whole or from the product perspective, the technical effects and advantages of the claimed solution are as follows:
building on a gaze tracking system, the present invention performs simple and efficient error compensation or model parameter calibration for the gaze estimation module by tracking a gaze trajectory, reducing the difficulty of the procedure and the user's discomfort while preserving the accuracy of the computed gaze point.
Third, as auxiliary evidence of the inventiveness of the claims, the following important aspect is also noteworthy:
the expected benefit and commercial value of the technical solution after transfer: this patent proposes a new error compensation or parameter calibration method whose procedure is simpler and more efficient than traditional methods, giving it good application prospects.
Brief Description of the Drawings
FIG. 1 is a flowchart of the gaze estimation method based on cross-ratio invariance provided by an embodiment of the present invention;
FIG. 2 is a schematic diagram of the gaze computation model provided by an embodiment of the present invention;
FIG. 3 is a comparison of the error without compensation and the error after compensation by the method of the present invention, provided by an embodiment of the present invention;
FIG. 4 is a comparison of the errors after nine-point compensation and after compensation by the method of the present invention on two different calibration boards, provided by an embodiment of the present invention: (a) nine-point calibration board; (b) gaze-trajectory calibration board;
FIG. 5 is a schematic diagram of the fitted trajectory and the original trajectory provided by an embodiment of the present invention.
Detailed Description
To make the objects, technical solutions, and advantages of the present invention clearer, the present invention is described in further detail below in conjunction with an embodiment using the cross-ratio method. It should be understood that the specific embodiments described here serve only to explain the present invention and are not intended to limit it.
To enable those skilled in the art to understand fully how the present invention is implemented, this section provides explanatory embodiments that expand on the claimed technical solutions.
As shown in FIG. 1, the gaze-trajectory-based gaze tracking error compensation method provided by an embodiment of the present invention includes:
S101, tracking a moving point along a given trajectory, selecting eye images at different points of the time series to construct a mapping model according to cross-ratio invariance, and computing the corresponding gaze points;
S102, performing gaze-point error compensation: the compensation table coefficients are solved using the shape information of the gaze trajectory and the least-squares method, determining the final position of the gaze point.
Constructing the gaze mapping model through cross-ratio invariance applies the projective-invariance principle of the cross ratio from projective geometry: for a straight line in space, the cross ratio of corresponding points on the line is unchanged by a projective transformation.
FIG. 2 is a schematic diagram of the gaze computation model constructed via cross-ratio invariance. It consists mainly of three planes: the screen plane, the camera imaging plane, and the corneal reflection plane; in this model the curved corneal reflection surface is idealized as a plane. The reflection points on the corneal surface and the glint image points on the camera plane can be regarded as mappings of the light-source points on the screen, so the cross-ratio invariance principle can be carried from two dimensions into three-dimensional space, and the position of the gaze point S is computed from the fixed cross ratios between the line segments of the planar figures formed by corresponding points on the planes. Here L1, L2, L3, L4 are four light-source points on the screen plane; G1, G2, G3, G4 are the glints that the light sources form on the cornea; and g1, g2, g3, g4 are the images of those glints on the camera imaging plane. P0 is the pupil center and c is the eyeball center; the line through the pupil center and the eyeball center is the optical axis of the eye. In this model the optical-axis direction is taken as the gaze direction, and the intersection S of the optical axis with the screen is the gaze point given by the model.
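As an idealized numeric sketch of the plane-to-plane correspondences in this model (treating the glint-to-screen correspondence as a single projective transform, which is the simplification the closely related homography normalization variant makes; all coordinates and the camera map are invented), the gaze point can be recovered by estimating the projective transform from the four glint images back to the four screen light sources and applying it to the pupil-center image:

```python
import numpy as np

def apply_h(H, p):
    """Apply a 3x3 projective transform to a 2D point."""
    q = H @ np.array([p[0], p[1], 1.0])
    return q[:2] / q[2]

def dlt_homography(src, dst):
    """Direct linear transform: 3x3 projective matrix H with dst ~ H @ src,
    from four (or more) point correspondences."""
    rows = []
    for (u, v), (x, y) in zip(src, dst):
        rows.append([u, v, 1, 0, 0, 0, -x * u, -x * v, -x])
        rows.append([0, 0, 0, u, v, 1, -y * u, -y * v, -y])
    _, _, Vt = np.linalg.svd(np.array(rows, dtype=float))
    return Vt[-1].reshape(3, 3)

# Screen light sources L1..L4 (screen corners) and an invented projective
# map standing in for the screen -> glint-image chain.
L = [(0.0, 0.0), (1920.0, 0.0), (1920.0, 1080.0), (0.0, 1080.0)]
P = np.array([[0.9,  0.05, 100.0],
              [-0.04, 0.85, 60.0],
              [1e-5,  2e-5,  1.0]])
g = [apply_h(P, p) for p in L]      # glint images g1..g4
S_true = (800.0, 450.0)             # gaze point on the screen
p0 = apply_h(P, S_true)             # pupil-center image

H = dlt_homography(g, L)            # image -> screen transform
S = apply_h(H, p0)                  # recovered gaze point
```

With exact correspondences the recovery is exact; physically the glints are corneal reflections rather than direct projections, so real systems must still compensate the residual error, which is precisely what the trajectory-based procedure below addresses.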
Further, the error sources include the corneal curvature and the angle between the optical axis and the visual axis. The error due to the corneal curvature is as follows:
when constructing the cross-ratio-invariance gaze mapping model, the reflection points and the pupil center are, for ease of computation, idealized as lying in one plane (the pupil section). In reality the cornea is a curved surface, so the reflection points and the pupil center are not coplanar; the cross-ratio computation then substitutes the straight-line distance between two points on the ideal plane for the true curve length between them, and the curve between the same two points is clearly longer than the straight line. This error in the model construction introduces error into the gaze-direction estimate.
The error due to the angle between the optical axis and the visual axis is as follows:
the optical axis is the straight line through the center of the corneal surface and perpendicular to it, while the visual axis is the line joining the external fixation point and the foveal pit of the macula. Between the visual axis and the optical axis there is an angle kappa, generally between 4 and 8 degrees. To reduce model complexity, the cross-ratio method outputs the optical axis directly as the visual axis, so the computed gaze position deviates from the actual gaze position.
The above two errors are systematic errors arising from the simplifications made for computational convenience when the model is constructed. Systematic errors are generally removed by control experiments, blank experiments, or correction experiments; considering the characteristics of the gaze tracking system studied in the present invention, the correction-experiment approach is adopted to correct the error.
Further, gaze-point error compensation includes:
(1) compensating the gaze-point error using the trajectory shape information;
(2) building an error compensation table, solving its coefficients by least squares, and performing error compensation.
Further, compensating the gaze-point error using the trajectory shape information includes the following:
the user's eye follows a uniformly moving point on the calibration screen, whose trajectory is a regular semicircle or circle; the trajectory of the gaze points on the calibration screen is computed frame by frame; and from the positions of special points, for example the two endpoints of a diameter, the correspondence between the original gaze trajectory and the fitted gaze trajectory is computed. The angle between the fitted trajectory and the original trajectory is denoted θ; the radius of the original trajectory is denoted R and its center is (x0, y0); the radius of the fitted trajectory is denoted r and its center is (x′0, y′0), as shown in FIG. 5.
The original trajectory of the moving point on the screen is written as the function:
f(x, y) = (x − x0)² + (y − y0)² − R² = 0 #(1)
The gaze-point trajectory computed from the camera raster images and a given gaze tracking model is a closed curve f_image(x′, y′) that contains trajectory errors. Clearly, f_image(x′, y′) must correspond to the circle of the original trajectory, and may be written as:
f_image(x′, y′, model(k, d1, d2, ...)) = (x′ − x′0)² + (y′ − y′0)² − r² = 0 #(2)
where (x′1, y′1) and (x′2, y′2) are the two endpoints of the computed gaze-point trajectory, corresponding respectively to the start and end of the original semicircular trajectory of the moving point, and (x0, y0) and R are the center and radius of that original semicircular trajectory. model(k, d1, d2, ...) denotes the gaze tracking model, in which k, d1, d2, and so on are its system parameters.
依据半圆直径两个端点的位置对应关系,求取视线落点轨迹与原始轨迹之间的投影变换矩阵,包含圆弧半径调整、旋转变换、平移变换,其中:According to the position correspondence between the two endpoints of the semicircle diameter, the projection transformation matrix between the sight point trajectory and the original trajectory is obtained, including arc radius adjustment, rotation transformation, and translation transformation, where:
First the arc radius is adjusted: the proportional scaling matrix Tp enlarges the fitted arc to the original arc radius, i.e., the shape of the fitted arc is uniformly scaled by the factor R/r:
Tp = [[R/r, 0], [0, R/r]] #(3)
Any computed position on the fitted gaze landing trajectory after proportional scaling is:
(x″, y″)^T = Tp (x′, y′)^T #(4)
Then, the rotation transformation matrix T_R levels the diameter of the semicircular arc:
T_R = [[cos θ, sin θ], [−sin θ, cos θ]] #(5)
with the angle
θ = arctan((y′1 − y′2) / (x′1 − x′2))
If y′1 − y′2 > 0, θ is taken as a positive angle; otherwise, θ is taken as a negative angle.
Any computed landing position after the fitted gaze trajectory is rotated, where (x″, y″) is the position after scaling, is:
(x‴, y‴)^T = T_R (x″, y″)^T #(6)
The translation transformation TT then makes the centers of the two arcs coincide:
TT = (x0 − x‴0, y0 − y‴0)^T #(7)
where (x‴0, y‴0)^T = T_R Tp (x′0, y′0)^T is the fitted center after scaling and rotation.
Any computed landing position after the fitted gaze landing trajectory is made to coincide with the original semicircular trajectory is:
(x*, y*)^T = T_R Tp (x′, y′)^T + TT #(8)
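Under the assumption that the original diameter is horizontal (consistent with the stated sign rule for θ), the scale, rotate, and translate steps of equations (3) through (8) can be sketched as follows; all function and variable names are illustrative, not from the patent:

```python
import numpy as np

def align_trajectory(pts, end1, end2, center_orig, R):
    """Align a fitted (distorted) circular gaze trajectory to the original
    screen trajectory using the two diameter endpoints, via the
    scale -> rotate -> translate pipeline described above.

    pts         : (N, 2) fitted gaze landing points
    end1, end2  : fitted points matching the original diameter endpoints
    center_orig : (x0, y0), center of the original circle
    R           : radius of the original circle
    """
    end1, end2 = np.asarray(end1, float), np.asarray(end2, float)
    center_fit = (end1 + end2) / 2.0          # fitted center (x'0, y'0)
    r = np.linalg.norm(end1 - end2) / 2.0     # fitted radius

    # (3)-(4): uniform enlargement by R/r about the fitted center
    scaled = center_fit + (R / r) * (np.asarray(pts, float) - center_fit)

    # (5)-(6): rotate by -theta so the fitted diameter becomes level;
    # theta's sign follows the y'1 - y'2 rule in the text
    theta = np.arctan2(end1[1] - end2[1], end1[0] - end2[0])
    c, s = np.cos(-theta), np.sin(-theta)
    T_R = np.array([[c, -s], [s, c]])
    rotated = center_fit + (scaled - center_fit) @ T_R.T

    # (7)-(8): translate the fitted center onto the original center
    return rotated + (np.asarray(center_orig, float) - center_fit)
```

Scaling and rotating about the fitted center, as done here, yields the same affine map as equation (8): the linear part is T_R·Tp, and the translation is fixed by making the two centers coincide.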
Further, establishing the error compensation table, solving the compensation table coefficients by the least squares method, and performing error compensation includes:
The positions of all points on the gaze landing trajectory f_image(x′, y′, model(k, d1, d2, ...)) must satisfy the circular trajectory equation (1). Therefore, the computed positions of the moving point in all video frames should, after being processed by formula (8), all satisfy the constraint of formula (1). However, because of gaze tracking model error, substituting the trajectory coordinates obtained from formula (8) into formula (1) necessarily leaves a non-zero deviation.
Further, a quadratic polynomial error compensation term err is added, of the form:
err_x = a x′^2 + b x′ + c,  err_y = d y′^2 + e y′ + f
Therefore, based on the least squares principle, the parameters in model(k, d1, d2, ...) are taken as the unknowns to be solved, and expression (9) is minimized:
min{ Σ ( f_image(x′, y′, model(k, d1, d2, ...)) + err − f(x_ref, y_ref) )^2 } #(9)
Taking the derivative of (9) with respect to each coefficient of the quadratic polynomial and setting it to zero, i.e., setting ∂/∂a = 0, ∂/∂b = 0, and so on, yields the error compensation coefficients; the coefficients obtained by least squares constitute the polynomial error model compensation table.
Further, since the least squares method is affected by singular values, which cause large deviations in the result, and since during eye-image acquisition the eye very easily glances inadvertently at the area around the target point, the singular points must be removed to guarantee the accuracy of the final result. Suppose n fixation points are taken in total and the deviation distance of each fixation point is L_i; the mean deviation is then
L̄ = (1/n) Σ L_i
When the ratio L_i / L̄ is greater than 1.5 or less than 0.75, the point is regarded as a singular point, and its data are removed and not substituted into the equations.
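A minimal sketch of this singular-point test (names assumed; the 1.5 and 0.75 thresholds are those given above):

```python
def remove_singular_points(deviations, low=0.75, high=1.5):
    """Keep only fixation samples whose deviation distance L_i stays
    within [low, high] times the mean deviation; returns the indices
    of the retained points."""
    mean = sum(deviations) / len(deviations)
    return [i for i, L in enumerate(deviations)
            if low <= L / mean <= high]
```

Returning indices rather than values lets the caller drop the matching gaze samples before re-running the least-squares fit.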
Based on the least squares principle, the coefficient terms are computed in the closed form given by the normal equations of the fit, in which n is the number of corresponding points substituted into the equations.
Solving the three simultaneous equations in a, b, c yields the values of a, b, and c; solving the three simultaneous equations in d, e, f yields the values of d, e, and f.
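The two three-equation systems can be solved numerically; the sketch below (names assumed) uses numpy's least-squares solver in place of the explicit closed-form coefficient expressions, to which it is equivalent:

```python
import numpy as np

def fit_compensation(x, y, dev_x, dev_y):
    """Fit the quadratic compensation terms by least squares:
    a*x^2 + b*x + c ~ dev_x and d*y^2 + e*y + f ~ dev_y,
    where dev_x, dev_y are the residual deviations of the transformed
    gaze points from the original trajectory. Solving the normal
    equations (the derivatives with respect to a..f set to zero) is
    equivalent to these lstsq calls."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    Ax = np.c_[x ** 2, x, np.ones_like(x)]   # columns for a, b, c
    Ay = np.c_[y ** 2, y, np.ones_like(y)]   # columns for d, e, f
    abc, *_ = np.linalg.lstsq(Ax, np.asarray(dev_x, float), rcond=None)
    def_, *_ = np.linalg.lstsq(Ay, np.asarray(dev_y, float), rcond=None)
    return abc, def_
```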
An embodiment of the present invention further provides a gaze tracking error compensation or parameter calibration system based on gaze trajectory, including:
a model construction module, configured to track the moving point of the gaze trajectory and select human-eye images of different time series to construct a mapping model according to cross-ratio invariance; and
a gaze error compensation or parameter calibration module, configured to use the shape information of the gaze trajectory together with the least squares method to solve the compensation table coefficients or calibrate the model parameters of the system.
To demonstrate the inventiveness and technical value of the technical solution of the present invention, this section provides application examples of the claimed technical solution in specific products or related technologies.
The gaze tracking error compensation or parameter calibration method based on gaze trajectory provided by an application embodiment of the present invention is applied to a computer device. The computer device includes a memory and a processor; the memory stores a computer program, and when the computer program is executed by the processor, the processor performs the steps of the gaze tracking error compensation or parameter calibration method based on gaze trajectory.
The gaze tracking error compensation or parameter calibration method based on gaze trajectory provided by an application embodiment of the present invention is also applied to an information data processing terminal, the terminal being used to implement the gaze tracking error compensation or parameter calibration system based on gaze trajectory.
The embodiments of the present invention achieved positive effects during development and use, and have substantial advantages over the prior art; the following description draws on data and charts from the test process.
The experiments and analysis provided by the embodiment of the present invention include an experiment preparation process and experimental results with analysis. The experiment preparation process (the cross-ratio method is chosen to illustrate the parameter compensation method) includes:
As can be seen from the schematic diagram of the gaze computation model in FIG. 2, the setup includes a camera, four infrared LED lights, a display screen, and a short video or animation specifying the motion speed and trajectory of a small ball.
The camera is aimed at the eye under test, the infrared LED lights are placed at the corresponding positions on the display according to the model, and the position of the head relative to the display is adjusted. The video of the moving ball is played; the eye gazes at the ball in the video while the eye camera records eye movement behavior (including time, pupil position, and reflection spot positions). After calibration ends, the solution is computed with the gaze model of the present invention.
Further, the experimental results and analysis include:
The present invention mainly analyzes the systematic errors in gaze estimation. The cross-ratio invariance mapping model chosen for this experiment has two main types of systematic error; a cross-experiment design is used for comparison with the traditional multi-point compensation method.
1) Comparison between no compensation and compensation by the method of the present invention: as shown in the figure, the error between the gaze landing point obtained directly from the gaze estimation model and the true fixation point is compared with the error between the gaze landing point obtained after compensation by the method of the present invention and the true fixation point.
2) Error comparison between the method of the present invention and the traditional nine-point method: as shown in the figure, the error between the gaze landing point compensated by the traditional multi-point method and the true fixation point is compared with the error between the gaze landing point compensated by the method of the present invention and the true fixation point.
The gaze landing point error computed after compensation by the method of the present invention is clearly improved over that computed after compensation by the nine-point method, indicating that the method can meet the accuracy requirements of gaze estimation.
The present invention takes the parameter compensation method in a gaze tracking system as its subject. To address the complex procedure, high skill requirements, and poor user experience of compensating such a system with conventional multi-point calibration, a gaze tracking error compensation or parameter calibration method based on gaze trajectory is proposed. Comparing the gaze landing accuracy under different conditions in the cross-ratio example shows that the method can reduce the complexity of compensation and the skill required of the user while maintaining accuracy, improving the user experience.
It should be noted that the embodiments of the present invention can be implemented by hardware, software, or a combination of software and hardware. The hardware part can be implemented with dedicated logic; the software part can be stored in a memory and executed by a suitable instruction execution system, such as a microprocessor or specially designed hardware. A person of ordinary skill in the art will understand that the above devices and methods can be implemented using computer-executable instructions and/or processor control code, such code being provided, for example, on a carrier medium such as a disk, CD, or DVD-ROM, on a programmable memory such as a read-only memory (firmware), or on a data carrier such as an optical or electronic signal carrier. The device of the present invention and its modules can be implemented by hardware circuits such as very-large-scale integrated circuits or gate arrays, semiconductors such as logic chips and transistors, or programmable hardware devices such as field-programmable gate arrays and programmable logic devices; by software executed by various types of processors; or by a combination of the above hardware circuits and software, such as firmware.
The above is only a specific embodiment of the present invention, but the protection scope of the present invention is not limited thereto. Any modification, equivalent substitution, or improvement made within the technical scope disclosed by the present invention by any person skilled in the art, within the spirit and principles of the present invention, shall be covered by the protection scope of the present invention.
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202310447946.1A (CN116594502B) | 2023-04-23 | 2023-04-23 | Gaze tracking error compensation or parameter calibration method and system based on gaze trajectory |
| Publication Number | Publication Date |
|---|---|
| CN116594502A | 2023-08-15 |
| CN116594502B | 2025-03-18 |