US20220397958A1 - Slippage resistant gaze tracking user interfaces - Google Patents

Slippage resistant gaze tracking user interfaces

Info

Publication number
US20220397958A1
US20220397958A1
Authority
US
United States
Prior art keywords
gaze
user
trajectory
offset
display
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US17/756,441
Inventor
Eric Aboussouan
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Google LLC
Original Assignee
Google LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Google LLC
Assigned to GOOGLE LLC. Assignment of assignors interest (see document for details). Assignors: ABOUSSOUAN, ERIC
Publication of US20220397958A1 (en)
Current legal status: Abandoned


Abstract

A slippage resistant user interface (UI) may account for movement of a head mounted display (HMD) device, due to slippage, remounting, and the like, when processing user eye gaze inputs. The slippage resistant UI may include a number of UI elements. Each UI element may have a unique pattern of movement defining a display trajectory for that UI element, providing for differentiation amongst the UI elements. A user gaze trajectory may be matched with one of the display trajectories of the UI elements, and the corresponding UI element may be identified as the target UI element intended for selection by the user. An eye-to-screen recalibration of the HMD may be accomplished using translational, scaling, and rotational offsets between the gaze trajectory and the display trajectory of the target UI element.
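The trajectory-matching step described in the abstract can be sketched with a correlation-based approach, in the style of the smooth-pursuit interaction work cited in this application (Vidal et al., 2013; WO2014192001A2). This is an illustrative sketch under assumed inputs, not the claimed implementation; the names `match_gaze_to_targets`, `gaze`, and `targets` are hypothetical.

```python
import numpy as np

def match_gaze_to_targets(gaze, targets):
    """Pick the UI element whose display trajectory best matches the gaze.

    gaze    -- (N, 2) array of gaze samples (x, y) over a time window
    targets -- dict mapping element name -> (N, 2) array of that element's
               on-screen positions over the same window
    Returns (best_name, best_score), where the score is the smaller of the
    per-axis Pearson correlations between gaze and target motion.
    """
    best_name, best_score = None, -np.inf
    for name, traj in targets.items():
        # Correlation is invariant to translation and scaling, so a match
        # survives the very slippage offsets that recalibration later removes.
        rx = np.corrcoef(gaze[:, 0], traj[:, 0])[0, 1]
        ry = np.corrcoef(gaze[:, 1], traj[:, 1])[0, 1]
        score = min(rx, ry)
        if score > best_score:
            best_name, best_score = name, score
    return best_name, best_score
```

Because each UI element follows a unique movement pattern (for instance, one circling while another sweeps linearly), the per-element scores are well separated, and a threshold on the winning score can confirm the selection.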

Description

Claims (18)

10. An electronic device, including:
a display;
a sensing system;
at least one processor; and
a memory storing instructions that, when executed by the at least one processor, cause the electronic device to:
display, by the display, a virtual user interface (UI), the virtual UI including a plurality of UI elements;
detect a user gaze directed at the virtual UI;
detect a gaze trajectory corresponding to the detected user gaze;
match the detected gaze trajectory to a display trajectory associated with a UI element of the plurality of UI elements;
identify the UI element as a target UI element;
determine an offset between the gaze trajectory and the display trajectory associated with the target UI element, the offset including at least one of a translational offset, a scaling offset, or a rotational offset; and
recalibrate a user gaze interaction mode based on the determined offset.
18. A non-transitory, computer-readable medium having instructions stored thereon that, when executed by a computing device, cause the computing device to:
display, by a display device of the computing device, a virtual user interface (UI), the virtual UI including a plurality of UI elements;
detect a user gaze directed at the virtual UI;
detect a gaze trajectory corresponding to the detected user gaze;
match the detected gaze trajectory to a display trajectory associated with a UI element of the plurality of UI elements;
identify the UI element as a target UI element;
determine an offset between the gaze trajectory and the display trajectory associated with the target UI element, the offset including at least one of a translational offset, a scaling offset, or a rotational offset; and
recalibrate a user gaze interaction mode based on the determined offset.
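The offset determination recited in the claims, a translational, scaling, and rotational offset between the gaze trajectory and the target's display trajectory, amounts to fitting a 2-D similarity transform between the two matched point sequences. One standard way to recover such a transform is a least-squares Procrustes/Umeyama-style fit; the sketch below is an assumed realization, not the filed embodiment, and `estimate_offsets` is a hypothetical name.

```python
import numpy as np

def estimate_offsets(gaze, target):
    """Least-squares similarity transform mapping target points onto gaze
    points: gaze_i ~ scale * R @ target_i + translation (Umeyama-style fit).

    gaze, target -- (N, 2) arrays of matched trajectory samples
    Returns (scale, rotation_angle_radians, translation).
    """
    n = len(gaze)
    mu_g, mu_t = gaze.mean(axis=0), target.mean(axis=0)
    # Cross-covariance between the centered trajectories
    H = (gaze - mu_g).T @ (target - mu_t) / n
    U, S, Vt = np.linalg.svd(H)
    # Reflection guard keeps the fitted matrix a proper rotation
    d = np.sign(np.linalg.det(U) * np.linalg.det(Vt))
    D = np.array([1.0, d])
    R = U @ np.diag(D) @ Vt
    var_t = ((target - mu_t) ** 2).sum() / n
    scale = (S * D).sum() / var_t
    translation = mu_g - scale * (R @ mu_t)
    angle = np.arctan2(R[1, 0], R[0, 0])
    return scale, angle, translation
```

Recalibrating the gaze interaction mode would then apply the inverse of this transform to subsequent raw gaze estimates, cancelling the offsets introduced by headset slippage or remounting.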
US17/756,441 | 2020-08-10 | 2020-08-10 | Slippage resistant gaze tracking user interfaces | Abandoned | US20220397958A1 (en)

Applications Claiming Priority (1)

Application Number | Priority Date | Filing Date | Title
PCT/US2020/070388 (WO2022035458A1 (en)) | 2020-08-10 | 2020-08-10 | Slippage resistant gaze tracking user interfaces

Publications (1)

Publication Number | Publication Date
US20220397958A1 (en) | 2022-12-15

Family

ID=72148259

Family Applications (1)

Application Number | Title | Priority Date | Filing Date
US17/756,441 | Slippage resistant gaze tracking user interfaces (US20220397958A1, Abandoned) | 2020-08-10 | 2020-08-10

Country Status (6)

Country | Link
US (1) | US20220397958A1 (en)
EP (1) | EP4097566B1 (en)
JP (1) | JP2023525196A (en)
KR (1) | KR20220100051A (en)
CN (1) | CN114868099A (en)
WO (1) | WO2022035458A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
US20220252884A1 (en)* | 2021-02-10 | 2022-08-11 | Canon Kabushiki Kaisha | Imaging system, display device, imaging device, and control method for imaging system

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
WO2025183394A1 (en)* | 2024-02-29 | 2025-09-04 | Samsung Electronics Co., Ltd. | Wearable device, method, and non-transitory computer-readable recording medium for gaze tracking on basis of personal characteristics

Citations (4)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
US8235529B1 (en)* | 2011-11-30 | 2012-08-07 | Google Inc. | Unlocking a screen using eye tracking information
US20150331485A1 (en)* | 2014-05-19 | 2015-11-19 | Weerapan Wilairat | Gaze detection calibration
US20190320962A1 (en)* | 2016-06-30 | 2019-10-24 | Cornell University | Optokinesys
US20190384385A1 (en)* | 2018-06-19 | 2019-12-19 | Igt | Interacting with game elements using eye movement tracking

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
JP2000005130A (en)* | 1998-06-19 | 2000-01-11 | Nippon Hoso Kyokai <NHK> | Eye movement measurement method and apparatus
US20150084864A1 (en)* | 2012-01-09 | 2015-03-26 | Google Inc. | Input Method
WO2014192001A2 (en)* | 2013-05-30 | 2014-12-04 | Umoove Services Ltd. | Smooth pursuit gaze tracking


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Vidal et al., "Pursuits: Spontaneous Interaction with Displays based on Smooth Pursuit Eye Movement and Moving Targets," Session: Novel Interfaces, UbiComp '13, September 8-12, 2013, Zurich, Switzerland, pages 439-448. (Year: 2013)*


Also Published As

Publication number | Publication date
WO2022035458A1 (en) | 2022-02-17
CN114868099A (en) | 2022-08-05
JP2023525196A (en) | 2023-06-15
EP4097566A1 (en) | 2022-12-07
KR20220100051A (en) | 2022-07-14
EP4097566B1 (en) | 2024-07-24

Similar Documents

Publication | Title
US11181986B2 (en) | Context-sensitive hand interaction
US10642344B2 (en) | Manipulating virtual objects with six degree-of-freedom controllers in an augmented and/or virtual reality environment
US10083544B2 (en) | System for tracking a handheld device in virtual reality
EP3314371B1 (en) | System for tracking a handheld device in an augmented and/or virtual reality environment
US20230110964A1 (en) | Object selection based on eye tracking in wearable device
US10353478B2 (en) | Hover touch input compensation in augmented and/or virtual reality
EP3458934B1 (en) | Object tracking in a head mounted reference frame for an augmented or virtual reality environment
EP4097566B1 (en) | Slippage resistant gaze tracking user interfaces
US11354011B2 (en) | Snapping range for augmented reality
US20230259199A1 (en) | Selection of real-world objects using a wearable device
US11868583B2 (en) | Tangible six-degree-of-freedom interfaces for augmented reality

Legal Events

AS: Assignment

Owner name: GOOGLE LLC, CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:ABOUSSOUAN, ERIC;REEL/FRAME:060033/0683

Effective date: 20200812

STPP: Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP: Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP: Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP: Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STPP: Information on status: patent application and granting procedure in general

Free format text: RESPONSE AFTER FINAL ACTION FORWARDED TO EXAMINER

STPP: Information on status: patent application and granting procedure in general

Free format text: ADVISORY ACTION MAILED

STPP: Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP: Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STCB: Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION

