Technical Field
The present invention relates to a method for determining at least one parameter of the two eyes of a test person, the method comprising the steps of: optically capturing a first eye of the two eyes by means of a first capture unit; optically capturing a second eye of the two eyes by means of a second capture unit; transmitting a first signal relating to the captured first eye from the first capture unit to an analysis unit and transmitting a second signal relating to the captured second eye from the second capture unit to the analysis unit; and determining at least one parameter of the two eyes in the analysis unit on the basis of the transmitted first and second signals. The invention further relates to an optical measuring device for determining at least one parameter of the two eyes of a test person.
Background Art
The use of head-mounted eye-tracking devices is known from the prior art. US RE39,539 E discloses an apparatus for monitoring the eye movements of a person. The system comprises a frame worn on the person's head, an array of emitters on the frame for directing light toward the person's eyes, and an array of sensors on the frame for detecting light from the array of emitters. The sensors detect light reflected by individual parts of the eye or its eyelid, thereby producing an output signal indicating when the reflecting part of the eye is covered by the eyelid. The apparatus allows the drowsiness level of the person to be monitored.
US 6,163,281 discloses a system and method for communication using movements of the human eye, comprising an emitter for directing light toward the eye, a sensor for detecting light emitted by the emitter, and a processor coupled to the sensor for converting sequential light-intensity signals received from the sensor into a data stream and/or for converting the signals into comprehensible information.
US 2004/0196433 A1 discloses an eye-tracking system for monitoring the movements of a user's eye, comprising an eye camera and a scene camera for supplying interlaced electronic video data representing images of the user's eye and of the scene observed by the user. In addition, the system contains a frame grabber for digitizing the video data and for separating the eye and scene data into two processing channels, as well as a spot-location module for determining, from the video data, the position of a reference spot formed on the user's eye by illuminating the eye with a point light source. The system further comprises a pupil-location module for determining the user's line of gaze.
WO 2010/83853 A1 discloses a gaze-point detection system having one or more infrared signal sources placed in a test scene as reference points, at least one pair of eyeglasses worn by a test subject, and a data-processing and storage unit for calculating the person's gaze point. The eyeglasses comprise an image sensor adapted to detect IR signals from the at least one IR signal source and to generate an IR-source tracking signal, an eye-tracking unit adapted to determine the gaze direction of the test subject and to generate an eye-tracking signal, and a camera unit adapted to acquire pictures of the test scene.
WO 2004/066097 A2 discloses an eye-tracking system for displaying a video screen pointer at the point of a user's gaze. The system comprises a camera focused on the user's eye, a support connected to the camera for fixing the relative position of the camera with respect to the user's pupil, and a computer with a CPU and an eye-tracking interface. By determining the center of the eye, a pointer on the video display screen can be displayed at the point of gaze.
US 2010/0220291 A1 discloses an eye-tracking system having a transparent lens, at least one light source and a plurality of light detectors. The transparent lens is adapted to be placed near an eye. The at least one light source is arranged in the transparent lens and is configured to emit light toward the eye. The at least one light source is transmissive to visible light. The plurality of light detectors are arranged in the transparent lens and are configured to receive light that is emitted by the at least one light source and reflected by the eye. Each light detector is transmissive to visible light and is configured to provide an output signal when it receives light reflected by the eye.
Known head-mounted eye trackers suffer from the disadvantage that large amounts of data have to be processed in order to guarantee reliable eye tracking. The cameras monitoring the test person's eyes each acquire and deliver redundant data relating to characteristics of the eyes. These data have to be transmitted quickly to an appropriate processing unit, so the bandwidth required for the data transmission is considerable. In addition, the large amount of data has to be processed quickly in order to achieve eye tracking in real time, i.e. with only a small time delay. Expensive processing units are therefore required, which additionally suffer from high power consumption. In particular, mobile and small eye-tracking devices may thus become heavy, large and expensive. Besides the head-mounted unit, an external device containing the processing unit may be required. Prior-art devices equipped with lean components may not be able to keep pace with the redundant data streams. Possible consequences are the loss of important data or significant time delays in the eye-tracking function, so that the reliable determination of the parameters characterizing the captured eyes is compromised.
Summary of the Invention
It is an object of the present invention to provide a method and an optical measuring device which allow at least one parameter of the two eyes of a test person to be determined more reliably.
According to the invention, this object is achieved by a method having the features of patent claim 1 and by an optical measuring device having the features of patent claim 11. Advantageous embodiments of the invention are the subject matter of the dependent claims and of the description.
The method according to the invention serves to determine at least one parameter of the two eyes of a test person and comprises the following steps:
- optically capturing a first eye of the two eyes by means of a first capture unit;
- optically capturing a second eye of the two eyes by means of a second capture unit;
- transmitting a first signal relating to the captured first eye from the first capture unit to an analysis unit and transmitting a second signal relating to the captured second eye from the second capture unit to the analysis unit;
- determining at least one parameter of the two eyes in the analysis unit on the basis of the transmitted first and second signals; and
- setting a first data rate for the first signal and a second data rate for the second signal, wherein the first and second data rates differ from each other, and wherein the first signal is transmitted at the first data rate and the second signal is transmitted at the second data rate.
Since the first and second data rates differ from each other, one of the data rates is lower than the other. The overall bandwidth required to transmit the first and second signals is therefore reduced compared with the prior art. With this method it is possible, for example, to capture a first parameter characterizing the first eye and a second parameter, different from the first parameter, characterizing the second eye. The data relating to the first parameter are then transmitted at the first data rate, while the data relating to the second parameter are transmitted at the second data rate. Together, the two parameters then allow one or more parameters of interest characterizing the two eyes to be determined reliably. In particular, the first and second parameters need not be captured by both the first and the second capture unit; the capture and transmission of redundant and superfluous information can be avoided. In this way the data stream can be kept lean. Despite the reduced data volume, the determination of at least one parameter of the two eyes can be carried out more reliably.
Each capture unit may comprise at least one camera. It is possible for the capture of the respective eye by the first and second capture units to take place at the same data rate. The acquired data may then be pre-processed so that the data relating to the first capture unit are transmitted to the analysis unit at the first data rate, while the data relating to the second capture unit are transmitted to the analysis unit at the second, different data rate. Alternatively, it is also possible for the data acquisition itself by the first and second capture units to take place at two different data rates.
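By way of illustration only, the following Python sketch shows one conceivable way of pre-processing two streams that are acquired at the same rate so that they are transmitted at different rates; the function names, the decimation approach and the numeric rates are assumptions made for the example and are not prescribed by the invention.

    def decimate(frames, keep_every):
        """Keep only every keep_every-th frame, reducing the transmitted data rate."""
        return [frame for index, frame in enumerate(frames) if index % keep_every == 0]

    # Both eye cameras acquire at 60 Hz (hypothetical figure).
    acquisition_rate_hz = 60
    first_eye_frames = [f"L{i}" for i in range(acquisition_rate_hz)]   # one second of left-eye frames
    second_eye_frames = [f"R{i}" for i in range(acquisition_rate_hz)]  # one second of right-eye frames

    # Transmit the first signal at the full rate and the second at one tenth of it.
    first_signal = decimate(first_eye_frames, keep_every=1)    # 60 frames per second
    second_signal = decimate(second_eye_frames, keep_every=10) # 6 frames per second

    print(len(first_signal), len(second_signal))  # 60 6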
Advantageously, the method comprises the following steps:
- providing data relating to the respectively captured eye in the first and/or second capture unit in dependence on the set first or second data rate;
- generating the first and/or second signal on the basis of the provided data.
The capture or acquisition data rates of the first and second capture units may thus differ. The first capture unit may optically capture the first eye in accordance with the set first data rate, while the second capture unit captures the second eye at the second data rate. In this way the amount of data is kept small already at the data-acquisition stage. Only those data are acquired which are actually necessary for determining the at least one parameter of the two eyes. Superfluous data are avoided from the outset, and the overall data volume remains small.
Advantageously, in the step of providing the data, a data compression is performed in dependence on the set first or second data rate. In particular, this step may be carried out after the data have been acquired by the first and second capture units but before the respective data are transmitted to the analysis unit. The step may comprise pre-processing the data. State-of-the-art data compression algorithms may be used. The data compression may differ for the data provided by the first and by the second capture unit. For example, the data provided by the second capture unit may be compressed more strongly than the data provided by the first capture unit. Different data compression algorithms may be executed in the first and second capture units.
In one embodiment, the method may comprise the step of setting, in dependence on the set first or second data rate, the temporal capture resolution and/or the spatial capture resolution and/or the capturable image section of the first and/or second capture unit for the optical capture of the respective eye, in particular a dynamic region of interest which follows the eye and whose size and scan rate can be adjusted. In particular, the temporal capture resolution relates to the instantaneous data-acquisition rate of the respective capture unit; for example, the first capture unit may capture twice the amount of signal during the same time period as the second capture unit. The spatial capture resolution may be defined by the pixel resolution. If the spatial resolutions of the first and second capture units are defined by identical pixel arrays with the same pixel size, and if the second capture unit performs two-by-two binning, the spatial capture resolution of the first capture unit can be up to four times that of the second capture unit. The capturable image section of a capture unit may be its field of view, which may be adjusted dynamically to the gaze point of the respectively captured eye. In this way the first and second data-acquisition rates can be adjusted flexibly, and an appropriate first and/or second data rate can be selected in an easy manner depending on the respective situation.
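The following sketch merely illustrates how frame rate, binning and a region of interest jointly determine the raw data rate of a capture unit; the pixel counts, bit depth and frame rates are assumed example values, not values specified by the invention.

    def raw_data_rate(width, height, bits_per_pixel, frame_rate_hz, binning=1):
        """Approximate raw data rate in bits per second for one capture unit."""
        pixels_per_frame = (width // binning) * (height // binning)
        return pixels_per_frame * bits_per_pixel * frame_rate_hz

    # First capture unit: full resolution and full frame rate (assumed 640x480, 8 bit, 60 Hz).
    first_rate = raw_data_rate(640, 480, 8, 60)
    # Second capture unit: a smaller region of interest with two-by-two binning at 30 Hz.
    second_rate = raw_data_rate(320, 240, 8, 30, binning=2)

    print(first_rate / second_rate)  # the first unit delivers many times more data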
In one embodiment, the optical capture of the first and second eyes may take place at the same data rate but with a specific phase shift. While the first capture unit is scanning the first eye, the second capture unit may not be scanning the second eye, and vice versa. The sequence of scan intervals may be determined by the capture rate, which may be the same for both capture units; however, a time delay may be introduced between the scan intervals of the first and second capture units. If the parameter to be determined is effectively the same regardless of whether the first or the second eye is observed, this redundancy together with the phase shift effectively results in an increased capture rate. If, for example, the parameter to be determined is the pupil diameter, this diameter is usually the same for the first and the second eye. The first and second eyes may then each be captured at a capture rate of 30 Hz, with the two capture sequences shifted by half a period. The pupil diameter is then effectively sampled at a rate of 60 Hz. Varying this phase shift is therefore an excellent way of setting a desired data rate.
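Purely as an illustration of this interleaving, the sketch below merges two sample streams captured at 30 Hz with a half-period offset into one effectively 60 Hz stream; the timestamps and the merge-by-time approach are assumptions chosen for the example.

    period_s = 1.0 / 30.0  # each eye camera samples at 30 Hz (assumed)

    # Timestamps of one second of samples: the second stream is shifted by half a period.
    first_eye_times = [i * period_s for i in range(30)]
    second_eye_times = [i * period_s + period_s / 2.0 for i in range(30)]

    # If the measured quantity (e.g. pupil diameter) is the same in both eyes,
    # merging the two streams by time yields an effective 60 Hz sampling of it.
    merged_times = sorted(first_eye_times + second_eye_times)
    effective_rate_hz = len(merged_times) / (merged_times[-1] - merged_times[0] + period_s / 2.0)

    print(round(effective_rate_hz))  # approximately 60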
Advantageously, the method may comprise the following steps:
- optically capturing a field of view by means of a third capture unit, said field of view corresponding at least partially to the field of view that can be captured by the eyes of the test person;
- transmitting a third signal relating to the captured field of view from the third capture unit to the analysis unit; and
- determining, in the analysis unit, a correlation between the captured field of view and the at least one determined parameter on the basis of the first and third signals and/or the second and third signals.
In particular, the third capture unit may be a scene camera. In a preferred embodiment the captured field of view may be switched from landscape to portrait mode, and vice versa, in dependence on the first and third signals and/or the second and third signals. The field of view captured by the third capture unit therefore need not be constant but may be adjusted, for example, in dependence on the point of interest of the two eyes, which can be determined on the basis of the first and third signals and/or the second and third signals. Consequently, only the field of view relevant in the respective situation is captured, and unnecessary data are avoided. In other embodiments the third capture unit may comprise several capture units or sensors, for example a scene camera and an infrared camera.
Advantageously, the method comprises the step of setting a third data rate for the third signal, wherein the third signal is transmitted at the third data rate, and wherein in particular the third data rate differs from the first and/or second data rate. All three data rates can therefore be selected independently of one another and in dependence on the respective capture situation. If, for example, a high first data rate is required, the second and/or third data rate can be adjusted accordingly and respectively lower values can be selected for them.
In one embodiment, the first data rate is set to be greater than the second data rate, and the visual focus and/or viewing direction of the test person is determined on the basis of the first and second signals. An approximate viewing direction and/or an approximate visual focus can be determined in the analysis unit merely by optically capturing the first eye with the first capture unit. The data acquired with the second capture unit may then allow the determined approximate viewing direction and/or visual focus to be fine-tuned. The first and second data rates therefore need not be equal; the second data rate may be chosen to be lower than the first. In this way, despite the reduced overall data rate, the viewing direction and/or visual focus can be determined very precisely.
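As a hedged illustration of this coarse-plus-fine scheme, the sketch below combines a frequently updated estimate derived from the first eye with a less frequently updated correction derived from the second eye; the update rates, the additive correction and the sample-and-hold behavior are assumptions chosen for the example.

    def combine_gaze(coarse_samples, fine_corrections, fine_every):
        """Refine a high-rate coarse gaze estimate with a low-rate correction.

        coarse_samples:   gaze angles from the first eye, one per frame (high rate)
        fine_corrections: correction offsets derived from the second eye (low rate)
        fine_every:       number of coarse frames per fine correction
        """
        refined = []
        correction = 0.0
        for index, coarse in enumerate(coarse_samples):
            if index % fine_every == 0 and index // fine_every < len(fine_corrections):
                correction = fine_corrections[index // fine_every]  # hold until next update
            refined.append(coarse + correction)
        return refined

    # Assumed example: 60 coarse samples per second, corrections only 6 times per second.
    coarse = [10.0 + 0.1 * i for i in range(60)]
    corrections = [-0.5, -0.4, -0.45, -0.5, -0.4, -0.45]
    print(combine_gaze(coarse, corrections, fine_every=10)[:3])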
Advantageously, a parallax correction is performed in the analysis unit on the basis of the first and second signals and/or the first and third signals and/or the second and third signals. One of the group of first, second and third signals can then be used as the basic data source for determining the point of interest, while another signal of the group of first, second and third signals can be used as a corrective signal for performing the parallax correction with respect to the point of interest. Determining the point of interest may require a relatively large amount of data and hence a high data rate, whereas the data required for the parallax correction may be acquired and/or transmitted at a relatively low data rate. The overall data rate remains low.
Advantageously, the at least one captured parameter relates to an orientation and/or a position and/or an eyelid closure and/or a pupil diameter and/or a sclera characteristic and/or an iris characteristic and/or a blood-vessel characteristic and/or a cornea characteristic of at least one eye. In particular, the at least one captured parameter may relate to the cornea radius (anterior, posterior), the eyeball radius, the distance from pupil center to cornea center, the distance from cornea center to eyeball center, the distance from pupil center to the center of the limbus (the junction of cornea and sclera), the corneal keratometric refractive index, the corneal refractive index, the refractive index of the vitreous humor, the distance from the crystalline lens to the eyeball center, to the cornea center and to the corneal apex, the refractive index of the crystalline lens, the direction of the visual axis, the direction of the optical axis, the direction of the pupil axis (achromatic axis), the line of sight, the astigmatism (in diopters) and the orientation angles of the flat and steep axes, the iris diameter, the pupil diameter (major and minor pupil axes), the pupil area, the major and minor axes of the limbus, the eye rotation, the inter-eye distance, the eye vergence, statistics on eye adduction/abduction and statistics on eye protrusion/recession. The optical measuring device can then be used as an eye-tracking device.
An optical measuring device according to the invention serves to determine at least one parameter of the two eyes of a test person. The optical measuring device comprises: a first capture unit configured to optically capture a first eye of the two eyes; a second capture unit configured to optically capture a second eye of the two eyes; an analysis unit configured to receive a first signal relating to the captured first eye and transmitted by the first capture unit, to receive a second signal relating to the captured second eye and transmitted by the second capture unit, and to determine at least one parameter of the two eyes on the basis of the transmitted first and second signals; and an allocation unit configured to set a first data rate for the first signal and a different, second data rate for the second signal, so that the first signal is transmitted to the analysis unit at the first data rate and the second signal is transmitted to the analysis unit at the second data rate.
In particular, the analysis unit and the allocation unit may be contained in a single processing unit and/or computer.
Advantageously, the optical measuring device may comprise a third capture unit which is configured to capture a field of view corresponding at least partially to the field of view that can be captured by the eyes of the test person, and which is configured to transmit a third signal relating to the captured field of view to the analysis unit at a third data rate, the allocation unit being configured to set the third data rate. In particular, the third capture unit may be a camera. While the first and second capture units may be cameras viewing the test person's eyes, the third capture unit may be a scene camera which captures a scene similar to that seen by the test person. In particular, the first and second capture units on the one hand and the third capture unit on the other hand may be directed in opposite directions.
Advantageously, the allocation unit may be configured to set the ratio of the first data rate to the second data rate and/or the ratio of the first data rate to the third data rate and/or the ratio of the second data rate to the third data rate such that it assumes a value in the range from 1:5000 to 5000:1. This wide range allows the individual data rates to be set independently of one another, and the overall data rate can be finely adjusted.
Advantageously, the allocation unit is configured to set the first and/or second and/or third data rate in dependence on one another and/or in dependence on a predeterminable parameter, in particular the amount of data that can be transmitted on a data line, and/or in dependence on a predeterminable measurement purpose of the optical measuring device. The full bandwidth provided by the data line of the optical measuring device can thus be utilized without leaving any bandwidth unused or exceeding the maximum amount of data that can be transmitted. The individual data rates can be adjusted dynamically to the respective situation.
Preferably, the optical measuring device comprises at least one common data line configured to transmit the first and second and/or the first and third and/or the second and third signals. The bandwidth of a common data line is usually limited. By means of the adjustable first, second and third data rates the entire bandwidth provided can be utilized, and data volumes exceeding the allocatable bandwidth can be prevented.
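To illustrate such an allocation over a shared data line, the sketch below distributes a fixed total bandwidth over the three signals in proportion to requested weights and checks that the resulting ratios stay within the range mentioned above; the weights, the total bandwidth and the proportional scheme are assumptions made for the example, not part of the invention.

    def allocate_bandwidth(total_bps, weights):
        """Split a common data line's bandwidth over several signals in proportion to weights."""
        weight_sum = sum(weights.values())
        return {name: total_bps * w / weight_sum for name, w in weights.items()}

    # Assumed example: a 100 Mbit/s common data line shared by the three signals.
    rates = allocate_bandwidth(100_000_000, {"first": 10, "second": 1, "third": 5})

    # The ratio between any two data rates stays within 1:5000 and 5000:1.
    ratio = rates["first"] / rates["second"]
    assert 1 / 5000 <= ratio <= 5000
    print({name: round(bps) for name, bps in rates.items()})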
Advantageously, the first and/or second and/or third capture unit comprises at least one camera.
Further features of the invention result from the claims, the figures and the description of the figures. All features and combinations of features mentioned above in the description, as well as the features and combinations of features mentioned below in the description of the figures and/or shown in the figures alone, can be used not only in the combination indicated in each case but also in other combinations or on their own.
Brief Description of the Drawings
The invention will now be explained in more detail with reference to individual preferred embodiments and with reference to the attached drawings. These show:
Fig. 1A is a front view of a spectacle device according to an embodiment of the invention;
Fig. 1B is a side view of the spectacle device of Fig. 1A;
Fig. 1C is a top view of the spectacle device of Fig. 1A;
Fig. 1D is a perspective view of the spectacle device of Fig. 1A;
Fig. 2 is a rear view of the spectacle device;
Fig. 3 is a schematic rear view of a spectacle device in which the eye cameras use deflection elements to direct their optical paths onto the eyes;
Fig. 4 is a side view of the spectacle device schematically showing the orientation of an eye camera;
Fig. 5 is a schematic diagram of the individual electronic components contained in the spectacle device;
Fig. 6A is a picture with symbols representing the large parallax obtained with an optical measuring device according to the prior art;
Fig. 6B is a picture with symbols representing the absence of parallax obtained with a spectacle device according to an embodiment of the invention;
Fig. 7 is a parallax model;
Fig. 8 is a diagram comparing the parallax of measuring devices according to the prior art and according to an embodiment of the invention;
Fig. 9A is a first field of view acquired by the scene camera;
Fig. 9B is a second field of view acquired by the scene camera;
Fig. 10A is a schematic side view of a spectacle device in which the optical path of the eye camera extends in a straight line from the eye camera to the eye; and
Fig. 10B is a schematic side view of a spectacle device in which the optical path of the eye camera extends from the eye camera via a mirror to the eye.
Detailed Description
In the figures, identical elements or elements having the same function are provided with the same reference signs. Figures 2, 3 and 4 show the same reference frame with a Cartesian coordinate system and mutually perpendicular axes x, y and z.
Figures 1A to 1D show an optical measuring device in the form of a spectacle device 1 or eye-tracking device, respectively. The spectacle device 1 is designed so that a person can wear it on his or her head like a normal pair of spectacles. It comprises a frame 4 with two side bars 5l and 5r which support the spectacle device 1 on the ears of the person wearing it. In addition, the spectacle device 1 is held in position on the head by a nose support 7. The main frame has a specific width w1 and height h; its length l depends on the length of the side bars 5l and 5r. As can be seen in Fig. 1C, the side bars 5l and 5r are hinged to the front part of the frame 4, so that the distance w2 between the side bars 5l and 5r can be enlarged or reduced (see the dashed side-bar configuration for side bar 5l in Fig. 1C).
Alternatively, the optical measuring device may not be designed in the form of a conventional pair of spectacles but may instead be designed to resemble a helmet (forming the frame) with a face shield (forming the frame insert).
Above the nose support 7 in the frame 4, a scene camera 2 is mounted. It may be attached to or integrated into the frame 4. With the scene camera 2 it is possible to capture essentially the same field of view as seen by a test person wearing the spectacle device 1. In its lower part, the frame 4 of the spectacle device 1 contains two eye cameras 3l and 3r. When the spectacle device 1 is worn by a person, the person's eyes can be captured by the eye cameras 3l and 3r, which are integrated into the frame 4 at suitable angles. The eye cameras 3l and 3r are designed to observe the person's left and right eye, respectively, i.e. to capture characteristics of the eyes.
The frame 4 comprises two openings which are filled with spectacle lenses 8l and 8r, thereby forming the frame inserts. The pictures acquired by the scene camera 2 and the eye cameras 3l and 3r result in signals which are processed in one or several pre-processing units 6 integrated into the side bars 5l and 5r.
Figure 2 shows the inner side of the spectacle device 1. Several light-emitting diodes (LEDs) 9 are positioned in a ring arrangement along the frame parts surrounding the rims of the spectacle lenses 8l and 8r. When the spectacle device 1 is worn by a person, these LEDs 9 can illuminate the test person's eyes in a defined way. The LEDs 9 cause reflections on the test person's eyes (corneal reflections) for all possible gaze angles. These reflections can be detected by the eye cameras 3l and 3r and can be used for eye tracking.
The LEDs 9 can be switched on and off individually, in groups or all together, following a specific temporal pattern, strobe characteristic or spatial variation. The switching frequencies of different LEDs 9 or groups of LEDs 9 can differ. Particular groups of LEDs 9 can be switched on exactly when other groups are switched off. Specific spatial and temporal correlation patterns can be realized with respect to the switching and thus with respect to the illumination characteristics. In this way a reflection pattern can be generated on the eye which can easily be recognized by the eye cameras 3.
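The following sketch merely illustrates one conceivable temporal switching scheme in which two groups of LEDs are driven in antiphase; the group assignment, the six LEDs per lens and the step-based timing are assumed example values rather than the scheme actually used by the device.

    import itertools

    # Assumed example: six LEDs around one lens, split into two alternating groups.
    group_a = [0, 2, 4]
    group_b = [1, 3, 5]

    def led_states(step):
        """Return a dict mapping LED index to on/off for the given time step."""
        active = group_a if step % 2 == 0 else group_b   # group A on even steps, B on odd steps
        return {led: (led in active) for led in range(6)}

    # Print the first four steps of the resulting illumination pattern.
    for step in itertools.islice(itertools.count(), 4):
        print(step, led_states(step))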
The overall assembly with the most important electronic components is shown in Fig. 5. The eye cameras 3l and 3r are connected to specific camera electronics 15 via cables 14 of 100 mm length. In particular, the cameras 3l and 3r contain only basic electronic components, while their principal electronics are located in the camera electronics 15. In this way the essential "optical part" of the cameras 3l and 3r can be placed remotely from the essential "electronic part" inside the camera electronics 15, and the two parts can be connected by flexible PCB cables 14. The optical sensors and the basic electronic components inside the cameras 3l and 3r thus form a very small and highly compact unit, while the bulk of the electronic components in the electronics 15 can be placed elsewhere on a larger circuit board. The electronics 15 are connected to a pre-processing unit 16 which can process the signals coming from the eye cameras 3l and 3r. The pre-processing unit 16 may be identical to the pre-processing units 6 located in the side bars 5l and 5r of the spectacle device 1. The pre-processing unit 16 is connected to a USB hub 19. The LEDs 9 mounted in the frame 4 form first and second IR LED chains 21 and 22 arranged in ring configurations around the spectacle lenses 8l and 8r. The IR LED chains 21 and 22 are connected to an IR LED constant-current source 20, which is also connected to the USB hub 19. The USB hub 19 additionally serves as the power supply for the IR LED constant-current source 20. The LEDs 9 of the IR LED chains 21 and 22 can be switched on and off individually. To achieve this, they may be connected to the IR LED constant-current source 20 in a parallel network which implements individual electrical switching of each LED 9.
The USB hub 19 is connected to a pre-processing unit 26 via a serial interface or USB cable 25. The signals pre-processed in the pre-processing unit 26 are finally analyzed in a personal computer 27, which includes a recorder device 28. A further auxiliary/synchronization port 13, forming an interface on the spectacle device 1, can also be connected to the USB hub 19. The auxiliary/synchronization port 13 can be used as an interface for synchronization with other electronic devices or for triggering parallel data acquisition. The electronics 15, the pre-processing unit 16, the USB hub 19 and the IR LED constant-current source 20 are located on a common printed circuit board PCB 23.
In a similar configuration, the scene camera 2 is also connected to electronics 15 via a 100 mm cable 14. In this case the electronics 15 are located on a second printed circuit board PCB 24, which also carries a pre-processing unit 17. The pre-processing unit 17 may be based on electronics built around a DaVinci digital signal processor (DSP). It contains an MPEG encoder 18 for encoding the signals received from the electronics 15. A microphone 12 may also be connected to the pre-processing unit 17. The pre-processing unit 17 located on the PCB 24 is connected to the USB hub 19. In this way the processed signals acquired by the scene camera 2 are likewise finally analyzed in the personal computer 27.
The pre-processing units 6, 16, 17 and 26 may be capable of compressing at least one of the three image streams produced by the two eye cameras 3l and 3r and the scene camera 2. Different alternatives are possible here. A pre-processing unit may compress the image stream of only one camera, each camera having its own pre-processing unit; alternatively, a single pre-processing unit may compress the image streams of all cameras. In addition, the pre-processing units may be configured to manage the bandwidth, via a system interface and corresponding software, by adjusting the resolution, the region of interest, the frame rate and the compression parameters. The pre-processing units may be designed to trigger the image acquisition of the cameras synchronously. They may provide a time stamp for each acquired image, which can be used to synchronize several or all camera data streams offline.
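As a non-authoritative sketch of such offline synchronization, the code below attaches time stamps to frames from two streams and pairs each frame of the slower stream with the nearest frame of the faster one; the frame rates and the nearest-neighbor pairing rule are assumptions made for the example.

    def timestamped_frames(rate_hz, count, label):
        """Simulate time-stamped frames from one camera stream."""
        return [(i / rate_hz, f"{label}{i}") for i in range(count)]

    def pair_by_timestamp(slow, fast):
        """Match each slow-stream frame to the fast-stream frame with the closest time stamp."""
        pairs = []
        for t_slow, frame_slow in slow:
            t_fast, frame_fast = min(fast, key=lambda item: abs(item[0] - t_slow))
            pairs.append((frame_slow, frame_fast, round(abs(t_fast - t_slow), 4)))
        return pairs

    eye_stream = timestamped_frames(60, 60, "eye")      # assumed 60 Hz eye camera
    scene_stream = timestamped_frames(30, 30, "scene")  # assumed 30 Hz scene camera
    print(pair_by_timestamp(scene_stream, eye_stream)[:3])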
The pre-processing units may be located on the circuit board of a camera, on a separate circuit board at or on the head mount (for example in the side bars 5l or 5r of the spectacle device 1), or in a separate housing worn by the test person 31, for example on a belt.
The spectacle device 1 may further comprise an auxiliary interface which allows data to be acquired from external sensors in real time. Such sensors may be biometric sensors (including but not limited to EEG, ECG, etc.) or attitude sensors (including but not limited to accelerometers, magnetometers, gyroscopes, etc.). The data streams of the external sensors can then be synchronized with the data streams acquired from the cameras 2, 3l and 3r. In addition, an external clock or trigger signal can be provided, which external sensors can use to synchronize themselves with the system. The bandwidth of the data acquired via this interface can be reduced or compressed by means of on-board processing resources integrated into the system, in its dedicated recording unit 28.
The eye cameras 3l and 3r may be suitable for visible or near-infrared light. They may be positioned symmetrically with respect to the vertical center line that divides the user's face into two halves. The eye cameras 3l and 3r may be positioned in front of and below the eyes 10l and 10r, respectively, for example at the lower rims of the pair of spectacle lenses 8l and 8r, pointing at the eyes 10l and 10r at an angle of 30° to 50° and mounted in the frame 4 at an angle of 30° to 50°.
In one embodiment, the eye cameras 3l and 3r are sensitive in the near infrared. They have a resolution of 640 × 480 pixels and are read out at a frequency of 60 Hz.
The scene camera 2 may be positioned on the vertical center line that divides the user's face into two halves, at the nose bridge of the frame 4. Alternatively, it may also be positioned at or near the rim of a helmet, cap or headband. The scene camera 2 may have HD (high-definition) and/or adjustable resolution. It may be mounted in landscape or portrait orientation. In addition, it may be mounted so that its orientation can be changed from landscape to portrait orientation (camera roll) and so that the direction in which the camera points can also be changed (camera pan/tilt).
Instead of a single scene camera 2, the spectacle device 1 may also comprise a pair of scene cameras, each of which may be oriented in portrait or landscape mode. Furthermore, each scene camera may be oriented independently of the respective second scene camera. Alternatively, the two scene cameras 2 may have fixed orientations, which may be identical to or different from each other.
In addition, a prism or lens may be mounted in front of the scene camera 2 in order to produce a different positioning of the field of view of the scene camera 2 relative to the glasses, in particular a more downward-oriented field of view for near-range reading applications.
Six LEDs 9 are positioned around each spectacle lens 8. They emit in the infrared wavelength range (typically above 750 nm and below 1000 nm) with a center wavelength of 850 nm. They are driven with a current of 50 mA provided by the IR LED constant-current source 20.
Instead of illuminating the eyes directly with the LEDs 9, embodiments with light guides are conceivable. One or several segments of light guides (e.g. optical fibers) may be used. The illumination of the eye may be implemented with focusing optics (structured illumination). Instead of LEDs 9, suitable diffractive optics or lasers may be used to produce a pattern of coherent light that illuminates the eye. The light source may be used together with an optical element for producing a reflection pattern on the eyes 10l and 10r (e.g. with focusing optics or diffractive optics). The illumination source may emit visible or near-infrared light. It may be positioned in or on the frame 4, in particular in a ring-shaped arrangement around the spectacle lenses 8l and 8r. Alternatively, the illumination source may be positioned on the rim or frame of a head-mounted display. It may specifically be designed to produce a reflection pattern on the eye surfaces of the test person 31.
When the spectacle device 1 shown in Fig. 2 is worn by a test person, the situation shown in Fig. 10A is realized in a simplified way. The eye camera 3 is arranged on the frame 4 such that, with the spectacle device 1 fixed to the test person's head, the optical path M for capturing at least one parameter of the eye 10 extends in a straight line from the eye camera 3 to the eye 10.
Figures 3 and 10B show a different configuration of the spectacle device 1. Here the spectacle device 1 comprises a mirror 11 forming an optical deflection element connected to the frame 4. The mirror 11 and the eye camera 3 are arranged on the frame 4 such that, with the spectacle device 1 fixed to the test person's head, the optical path M for capturing at least one parameter of the eye 10 extends from the eye camera 3 via the mirror 11 to the eye 10. The three-dimensional view of Fig. 3 shows the spectacle device 1 from a rear or inner perspective. In the figure, the reflections of the left and right eyes 10l and 10r are shown in the spectacle lenses 8l and 8r, respectively. The coordinate system is Cartesian, with the z-axis pointing into the projection plane.
The eye cameras 3l and 3r may thus be mounted in front of and above the eyes 10l and 10r, with a light guide or mirror 11 positioned in front of and below the eyes 10l and 10r, for example at the lower rims of the pair of spectacle lenses 8l and 8r, in order to acquire an image of each eye 10l and 10r from a frontal and low perspective and to make that image visible to the eye cameras 3l and 3r. The light guide or mirror 11 may be a (flat) mirror, a spherical mirror, a dome, a custom lens, a holographic image guide, etc. The mirror 11 may reflect only a certain range of wavelengths and be transmissive to others.
The mirror 11 may be a flat mirror or a spherical mirror. A spherical mirror has the advantage that it enlarges the field of view of the eye camera 3 beyond that obtainable with a flat mirror. The configuration of Fig. 3 additionally allows the optical system to be placed very close to the eye 10 (set direction), thereby improving ergonomics and aesthetics. The test person's own field of view is hardly obstructed. The mirror 11 may be a so-called hot mirror, i.e. a mirror that is transmissive in the visible wavelength range and has a high reflectivity in the infrared wavelength range. It may be very thin and hollow (a so-called dome), thereby minimizing distortions caused by refraction. It may be made of a material exhibiting a very low index of refraction (IOR).
In both cases (Figs. 10A and 10B), the eye camera 3 is arranged such that the optical path M for capturing at least one parameter of the eye 10 excludes the frame insert, i.e. the spectacle lens 8. In addition, the spectacle lens 8 is arranged such that the optical axis K of the eye 10 and the optical path M have the eye 10 as the only optical element used in common. Furthermore, the optical path M extends entirely within the space Sp, which extends on the side of the spectacle lens 8 facing the eye 10.
The embodiments shown in Figs. 2 and 3 and in Figs. 10A and 10B, respectively, each reduce the occlusion of the eye caused by the upper eyelid.
Figures 6A to 8 show that the parallax of the spectacle device 1 is reduced compared with the prior art. As can be seen from Fig. 6A, the position of the object 29 on which the test person actually focuses his eyes and the point of interest 32 determined by the spectacle device 1 usually do not coincide very well when a spectacle device 1 as known from the prior art is used. This effect generally becomes more pronounced the closer the test person is positioned to the object 29 to be focused on. With a spectacle device 1 according to an embodiment of the invention, however, the coincidence between the determined point of interest 32 and the actual object 29 is very good, even for measurement distances as small as 0.5 m (see Fig. 6B). This is achieved by minimizing the distance between the eyeball center and the camera focal point.
The situation is shown again in Fig. 7. Since the eye 10 and the scene camera 2 are positioned at slightly different locations, the difference between their respective viewing angles toward the focused object 29 becomes more pronounced the closer the object 29 is positioned to the eye 10 and the scene camera 2, respectively (i.e. the distortion is larger for smaller z values). The spectacle device 1 can be calibrated in the situation shown in Fig. 6B. The object 29 then lies in the calibration plane P, and by calibrating the spectacle device 1 it can be ensured that the determined point of interest 32 actually falls onto the real object 29. The calibration is usually performed on a plane at a certain distance from the test person. It relates the measured gaze direction (angles) to a pixel in the scene video frame. This calculation gives valid results only for points lying in the calibration plane; for points not lying in that plane, a systematic error (parallax) is introduced. When the distance between the spectacle device and the object 29 is changed, the difference between the distance to the calibration plane P and the actual distance to the object 29 leads to a noticeable deviation. For the spectacle device 1 according to an embodiment of the invention, these deviations or parallax errors (represented by the circular symbols S2 in Fig. 8) are, for all distances d, much smaller than those of a device according to the prior art (rectangular symbols S1). The fine cross-hairs are associated with the group of symbols S2, while the bold cross-hairs are associated with the group of symbols S1. The cross-hairs correspond to the points of interest 32 used for calibration purposes.
The parallax is modeled mathematically as a function of the position of the scene camera 2 relative to the position of the eye. The gaze-estimation error caused by parallax is minimized by placing the scene camera 2 as close as possible to the eye 10, in accordance with the results of the mathematical simulation. The parallax can be corrected further by estimating the distance to the point of interest from the vergence of the two tracked eyes and by estimating the position of the eyes relative to the eye-tracking device.
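A minimal sketch of the geometric idea follows, assuming a pinhole-style model in which the scene camera is displaced from the eye by a known offset and the distance to the point of interest is estimated from the vergence of the two eyes; the specific geometry, the angle conventions and the numeric values are illustrative assumptions, not the patent's actual model.

    import math

    def gaze_distance_from_vergence(inter_eye_distance_m, left_angle_rad, right_angle_rad):
        """Estimate the distance to the point of interest from the convergence of both gaze rays.

        The angles are the horizontal gaze angles of the left and right eye measured
        from straight ahead; their difference is the vergence angle.
        """
        vergence = left_angle_rad - right_angle_rad
        return inter_eye_distance_m / (2.0 * math.tan(vergence / 2.0))

    def parallax_angle(camera_offset_m, distance_m):
        """Angular error caused by the scene camera being offset from the eye."""
        return math.degrees(math.atan2(camera_offset_m, distance_m))

    distance = gaze_distance_from_vergence(0.065, math.radians(3.0), math.radians(-3.0))
    # With the distance known, the parallax caused by a 0.03 m camera offset can be corrected.
    print(round(distance, 2), round(parallax_angle(0.03, distance), 2))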
In order to achieve even better results, the field of view of the scene camera 2 can be optimized. A scene camera 2 with standard optics has a field of view that does not cover the full physiological gaze range (horizontal field of view of standard optics: 40° to 50°; typical physiological gaze range: 60°). In embodiments, the field of view of the scene camera 2 can therefore be optimized for the respective application. One such field-of-view optimization is shown in Figs. 9A and 9B. A user wearing the spectacle device 1 looks both at the background B and at his mobile phone 30. According to Fig. 9A, the field of view FOV1 mainly covers the background B. When the test person 31 looks down at his mobile phone 30, the change in gaze direction is automatically determined by the eye cameras 3l and 3r, and the field of view of the scene camera 2 is automatically adjusted by switching from landscape to portrait orientation (field of view FOV2). This can be achieved by a mechanical 90° roll of the scene camera 2 about its z-axis or by using an optical prism in front of the scene camera 2. It is also possible to use two scene cameras with different tilt or roll angles. Alternatively, an additional optical beam splitter may be used in front of the scene camera 2.
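The following sketch only illustrates the decision logic for such an orientation switch driven by the vertical gaze angle; the threshold, the hysteresis band and the orientation labels are assumptions introduced for the example.

    def choose_orientation(vertical_gaze_deg, current, down_threshold_deg=-20.0, hysteresis_deg=5.0):
        """Switch the scene camera between landscape and portrait based on the vertical gaze angle.

        Negative angles mean the test person is looking downwards. A small hysteresis band
        prevents rapid toggling when the gaze hovers around the threshold.
        """
        if current == "landscape" and vertical_gaze_deg < down_threshold_deg:
            return "portrait"
        if current == "portrait" and vertical_gaze_deg > down_threshold_deg + hysteresis_deg:
            return "landscape"
        return current

    orientation = "landscape"
    for angle in [0.0, -10.0, -25.0, -22.0, -12.0, 0.0]:   # simulated vertical gaze angles
        orientation = choose_orientation(angle, orientation)
        print(angle, orientation)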
In summary, the spectacle device 1 forms a head-mounted eye-tracking system consisting of three cameras: the two eye cameras 3l and 3r and at least one scene camera 2. The three cameras 3l, 3r and 2 may have manageable bandwidths, for example by means of adjustable frame rates or resolutions. One or several pre-processing units 6, 16, 17 and 26 may be present, which perform a variable compression of the video streams received from the cameras 2, 3l and 3r. The compression level may be the same for the eye cameras 3l and 3r and the scene camera 2, or the video streams may be compressed individually for the eye cameras 3l and 3r and the scene camera 2. The frame rate of the eye camera 3l may correspond to full-speed acquisition, the frame rate of the eye camera 3r to acquisition at one tenth of that speed, and the frame rate of the scene camera 2 to acquisition at half speed. Instead of adjusting the frame rates of the different cameras, the acquisition rates may alternatively be chosen to be the same while the data processing is performed differently for each camera. Even if two cameras acquire the same amount of data, the data provided by one camera may be compressed more strongly than the data provided by the other. Different compression rates may also be combined with different acquisition rates. It is also possible, when transmitting the data, to omit for example every other acquired image, thereby reducing the amount of data sent to the CPU by half. The signals of the cameras 2, 3l and 3r can be transmitted to the CPU in the PC 27 via a wireless or wired interface (see Fig. 5). Auxiliary interfaces for other data sources and methods for synchronizing with these data sources may be implemented in the spectacle device 1.
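As a rough illustration of how such per-camera settings combine, the sketch below estimates the transmitted data rate of each stream from an assumed raw rate, an acquisition-speed factor and a compression factor; all numeric values are example assumptions, not values prescribed by the description above.

    def transmitted_rate(raw_bps, speed_factor, compression_factor):
        """Estimate the transmitted data rate of one camera stream.

        speed_factor:       fraction of full-speed acquisition (1.0 = full speed)
        compression_factor: size ratio of compressed to uncompressed data
        """
        return raw_bps * speed_factor * compression_factor

    raw_eye_bps = 640 * 480 * 8 * 60        # assumed raw rate of one eye camera
    raw_scene_bps = 1280 * 720 * 8 * 30     # assumed raw rate of the scene camera

    streams = {
        "eye camera 3l": transmitted_rate(raw_eye_bps, 1.0, 0.2),     # full speed
        "eye camera 3r": transmitted_rate(raw_eye_bps, 0.1, 0.2),     # one tenth speed
        "scene camera 2": transmitted_rate(raw_scene_bps, 0.5, 0.1),  # half speed
    }
    total_bps = sum(streams.values())
    print({name: round(bps) for name, bps in streams.items()}, round(total_bps))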
The spectacle device 1 can be regarded as a system comprising several exchangeable parts. The spectacle device 1 can have an exchangeable set of nose pieces or nose pads 7 for faces with small or large noses. In this way, the spectacle device 1 can also be worn over vision-correction glasses without causing problems. In addition, the spectacle device 1 has a holding mechanism for exchangeable glasses, which can have different light transmission levels for specific wavelength ranges (for example clear glasses or sunglasses). Additionally or alternatively, the exchangeable glasses can have near-infrared optical filters that match the wavelength of the illumination source and block all or some of the light outside the same and similar wavelengths from reaching the eye surface, in order to improve the signal-to-noise ratio on the eye surface. The spectacle device 1 has rims and a nose bridge, which serve as mounts or housings for the eye cameras 3l and 3r and for the scene camera 2. The eye cameras 3l and 3r are mounted such that their fields of view extend behind the exchangeable glasses 8l and 8r.
With the spectacle device 1, eye tracking, oculometrics, biometric measurements and position and motion measurements can be performed in order to measure and classify human behaviour as comprehensively as possible in a setup allowing free-range movement. A head-mounted eye tracking device is realized that is calibration-free and provides an astigmatism estimation. The eye tracking functionality has zero start-up time; no adjustment is needed. The test person 31 can simply put the spectacle device 1 on and start using it. The device has a very large gaze tracking range (80° horizontally, 60° vertically) that covers the physiological range of human eye movements. It is very robust and achieves high accuracy in gaze mapping: astigmatism is compensated, parallax is minimized, the optical axis offset is compensated, and the device requires no calibration or can be calibrated using a one-point calibration feature. In addition, it is designed to work regardless of ethnic group (Caucasian, Asian, African, etc.), gender and age. The field of view of the scene camera 2 is optimized. A head tracking function can be implemented by means of optical, inertial or magnetic sensors. The spectacle device additionally provides biometric features, such as measurement of the pupil diameter, and offers interfacing and synchronization options with EEG, ECG, etc. Finally, integration with a head-mounted display is possible: a virtual image, for example of a portable computer screen, can be projected into the user's eyes. In addition, the possibility is provided of interacting with "objects" in the virtual image by means of eye movements (gaze, blinks).
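The one-point calibration feature mentioned above can be understood, in its simplest reading, as estimating a constant angular offset from a single fixation on a known target. The sketch below is only an assumption about such a correction; the function names and the constant-offset model are illustrative and not taken from the patent.

```python
# Hypothetical sketch of a one-point calibration: the user fixates a single
# known target, the constant offset between measured and true gaze angles is
# stored, and subsequent gaze samples are corrected by that offset.

def one_point_calibration(measured_gaze, target_gaze):
    """Return the (horizontal, vertical) offset in degrees for one fixation."""
    return (target_gaze[0] - measured_gaze[0], target_gaze[1] - measured_gaze[1])

def apply_calibration(measured_gaze, offset):
    return (measured_gaze[0] + offset[0], measured_gaze[1] + offset[1])

# Example: gaze reported as (2.0, -1.5) deg while fixating a target at (0, 0) deg.
offset = one_point_calibration((2.0, -1.5), (0.0, 0.0))
print(apply_calibration((10.0, 5.0), offset))  # corrected gaze sample
```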
The head tracking function can be realized by means of a three-axis gyroscope, a three-axis accelerometer and/or a three-axis magnetometer, with optional sensor fusion for six-dimensional head tracking.
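As an illustration of the sensor fusion mentioned here, the sketch below blends an integrated gyroscope rate with the gravity direction derived from the accelerometer into a head pitch estimate using a complementary filter. The filter constant and function names are assumptions; a full six-dimensional tracker would additionally incorporate the magnetometer and a position estimate.

```python
# Minimal complementary-filter sketch for fusing gyroscope and accelerometer
# readings into a head pitch estimate. Constants are illustrative only.

import math

ALPHA = 0.98  # weight given to the integrated gyroscope estimate

def fuse_pitch(prev_pitch_deg, gyro_pitch_rate_dps, accel, dt):
    """Blend the integrated gyro rate with the gravity direction from the accelerometer."""
    ax, ay, az = accel
    accel_pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))  # pitch from gravity
    gyro_pitch = prev_pitch_deg + gyro_pitch_rate_dps * dt           # pitch from gyro integration
    return ALPHA * gyro_pitch + (1.0 - ALPHA) * accel_pitch
```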
In summary, the spectacle device 1 provides a very specific optical and electronic architecture. Regarding the electronic architecture, three or more high-resolution cameras with allocatable bandwidth are integrated into the device 1. Separate processing channels for the eye cameras 3l and 3r and for the scene camera 2 are conceivable. The optical architecture is characterized by exchangeable glasses with various properties. The optical paths of the eye cameras 3l and 3r extend behind the glasses or spectacle lenses 8l and 8r, respectively. In addition, a set of LEDs 9 allows a highly variable illumination of the eyes 10l and 10r: for example, the illumination geometry around the eye can be controlled, and specific LED subgroups can be controlled with respect to strobe effect and sequence. Finally, the eye illumination can be realized by point-shaped, line-shaped or two-dimensional light sources.
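To illustrate how LED subgroups could be controlled with respect to strobe effect and sequence, the following sketch cycles through predefined subsets of the LEDs 9, enabling one subset per captured frame so that the illumination geometry changes from one exposure to the next. The subgroup indices and the driver interface are hypothetical.

```python
# Illustrative sketch of sequencing LED subgroups around the eye. The
# constant-current driver and frame objects are hypothetical placeholders.

import itertools

SUBGROUPS = [(0, 1, 2), (3, 4, 5), (0, 2, 4), (1, 3, 5)]  # assumed LED indices

def strobe(led_driver, frames):
    """Enable one predefined LED subgroup per captured frame, cycling in order."""
    for frame, group in zip(frames, itertools.cycle(SUBGROUPS)):
        led_driver.enable_only(group)   # hypothetical driver call: light only this subgroup
        frame.expose()                  # capture the frame under this illumination geometry
```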
Reference marks
1 spectacle device
2 scene camera
3, 3l, 3r eye camera
4 frame
5l, 5r side bar
6 pre-processing unit
7 nose pad
8, 8l, 8r spectacle lens
9 LED
10, 10l, 10r eye
11 mirror
12 microphone
13 auxiliary/synchronization port
14 cable
15 electronics
16 pre-processing unit
17 pre-processing unit
18 MPEG encoder
19 USB hub
20 IR LED constant current source
21, 22 IR LED chain
23, 24 PCB
25 USB 2.0 cable
26 pre-processing unit
27 PC
28 recorder
29 object
30 mobile phone
31 test person
32 point of interest
w1, w2 width
h height
l length
a inclination angle
K optical axis
M optical path
O system reference origin
P calibration plane
Sp space
d distance
S1, S2 symbols
B background
FOV1, FOV2 field of view
x, y, z axes
| Publication number | Publication date |
|---|---|
| JP2014509533A (en) | 2014-04-21 |
| CN103458770B (en) | 2017-04-05 |
| EP2499964A1 (en) | 2012-09-19 |
| WO2012126810A1 (en) | 2012-09-27 |
| US20140085452A1 (en) | 2014-03-27 |
| US20140078283A1 (en) | 2014-03-20 |
| EP2499960A1 (en) | 2012-09-19 |
| CN103458770A (en) | 2013-12-18 |
| JP2014509534A (en) | 2014-04-21 |
| US20220061660A1 (en) | 2022-03-03 |
| EP2499962A1 (en) | 2012-09-19 |
| CN103442629A (en) | 2013-12-11 |
| EP2923638B1 (en) | 2019-02-20 |
| EP2499961A1 (en) | 2012-09-19 |
| WO2012126808A1 (en) | 2012-09-27 |
| JP6030075B2 (en) | 2016-11-24 |
| EP2499962B1 (en) | 2015-09-09 |
| JP2014508016A (en) | 2014-04-03 |
| JP2014512596A (en) | 2014-05-22 |
| EP2923638A1 (en) | 2015-09-30 |
| US20140055746A1 (en) | 2014-02-27 |
| EP2499960B1 (en) | 2015-04-22 |
| CN103429139B (en) | 2017-02-15 |
| CN103429143A (en) | 2013-12-04 |
| US9107622B2 (en) | 2015-08-18 |
| WO2012126809A1 (en) | 2012-09-27 |
| WO2012126811A1 (en) | 2012-09-27 |
| EP2499961B1 (en) | 2016-09-21 |
| JP6159264B2 (en) | 2017-07-05 |
| JP6159263B2 (en) | 2017-07-05 |
| JP6026444B2 (en) | 2016-11-16 |
| CN103429139A (en) | 2013-12-04 |
| US20170035293A1 (en) | 2017-02-09 |
| EP2499964B1 (en) | 2015-04-15 |