US8786953B2 - User interface - Google Patents

User interface

Info

Publication number
US8786953B2
US8786953B2
Authority
US
United States
Prior art keywords
display
hmd
images
axis
gaze
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
US14/071,974
Other versions
US20140055846A1 (en)
Inventor
Aaron Joseph Wheeler
Luis Ricardo Prada Gomez
Hayes Solos Raffle
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Google LLC
Original Assignee
Google LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Google LLC
Priority to US14/071,974
Assigned to GOOGLE INC. Assignment of assignors interest (see document for details). Assignors: GOMEZ, LUIS RICARDO PRADA; RAFFLE, HAYES SOLOS; WHEELER, AARON JOSEPH
Publication of US20140055846A1
Application granted
Publication of US8786953B2
Assigned to GOOGLE LLC. Change of name (see document for details). Assignors: GOOGLE INC.
Legal status: Active
Anticipated expiration


Abstract

A head-mounted display (HMD) may include an eye-tracking system, an HMD-tracking system and a display configured to display virtual images. The virtual images may present an augmented reality to a wearer of the HMD and the virtual images may adjust dynamically based on HMD-tracking data. However, position and orientation sensor errors may introduce drift into the displayed virtual images. By incorporating eye-tracking data, the drift of virtual images may be reduced. In one embodiment, the eye-tracking data could be used to determine a gaze axis and a target object in the displayed virtual images. The HMD may then move the target object towards a central axis. The HMD may also record data based on the gaze axis, central axis and target object to determine a user interface preference. The user interface preference could be used to adjust similar interactions with the HMD.

Description

CROSS REFERENCE TO RELATED APPLICATION
The present application is a continuation of U.S. patent application Ser. No. 13/302,916, filed on Nov. 22, 2011, now U.S. Pat. No. 8,611,015, which is herein incorporated by reference as if fully set forth in this description.
BACKGROUND
Wearable systems can integrate various elements, such as miniaturized computers, input devices, sensors, detectors, image displays, wireless communication devices as well as image and audio processors, into a device that can be worn by a user. Such devices provide a mobile and lightweight solution to communicating, computing and interacting with one's environment. With the advance of technologies associated with wearable systems and miniaturized optical elements, it has become possible to consider wearable compact optical displays that augment the wearer's experience of the real world.
By placing an image display element close to the wearer's eye(s), an artificial image can be made to overlay the wearer's view of the real world. Such image display elements are incorporated into systems also referred to as “near-eye displays”, “head-mounted displays” (HMDs) or “heads-up displays” (HUDs). Depending upon the size of the display element and the distance to the wearer's eye, the artificial image may fill or nearly fill the wearer's field of view.
SUMMARY
In a first aspect, a method is provided. The method includes displaying images on a display, the display having a central axis, determining a gaze axis with respect to the central axis, and determining a target object in the displayed images based on the gaze axis. The method further includes adjusting the displayed images on the display to move the target object towards the central axis.
In a second aspect, a method is provided. The method includes displaying images on a display, the display having a central axis. The method further includes determining a gaze axis with respect to the central axis, and determining a target object in the displayed images based on the gaze axis. The method further includes recording data based on the central axis, the gaze axis, the target object, and the displayed images. The method further includes adjusting the displayed images on the display based on the recorded data.
In a third aspect, a head-mounted display (HMD) is provided. The HMD includes a head-mounted support and an optical system attached to the head-mounted support. The optical system includes a display having a central axis and the display is configured to display images that are viewable from a viewing location. The HMD further includes an infrared light source configured to illuminate the viewing location with infrared light such that infrared light is reflected from the viewing location as reflected infrared light. The HMD further includes a camera configured to image the viewing location by collecting the reflected infrared light and a sensor configured to generate sensor data that relates to the motion of the HMD. The HMD further includes a computer configured to determine a gaze axis based on one or more images of the viewing location obtained by the camera, control the display to display images based on the sensor data, determine a target object in the displayed images based on the gaze axis, and control the display to move the target object towards the central axis.
In a fourth aspect, a non-transitory computer readable medium having stored instructions is provided. The instructions are executable by a computing device to cause the computing device to perform functions. The functions include: (i) controlling a display to display images, the display having a central axis; (ii) determining a gaze axis with respect to the central axis; (iii) determining a target object in the displayed images based on the gaze axis; and (iv) controlling the display to adjust the displayed images so as to move the target object towards the central axis.
In a fifth aspect, a method is provided. The method includes displaying images on a display of a head-mounted display (HMD). The displayed images are viewable at a viewing location and the display includes a central axis. The method further includes acquiring sensor data related to the motion of the HMD and controlling the display to display images based on the sensor data. The method also includes determining a gaze axis based on one or more images of the viewing location obtained by a camera and determining a target object in the displayed images based on the gaze axis. The method additionally includes controlling the display to move the target object towards the central axis.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is a schematic diagram of a wearable computing device, in accordance with an example embodiment.
FIG. 2 is a top view of an optical system, in accordance with an example embodiment.
FIG. 3A is a front view of a head-mounted display, in accordance with an example embodiment.
FIG. 3B is a top view of the head-mounted display of FIG. 3A, in accordance with an example embodiment.
FIG. 3C is a side view of the head-mounted display of FIG. 3A and FIG. 3B, in accordance with an example embodiment.
FIG. 4A is a side view of a head-mounted display with a forward gaze axis, in accordance with an example embodiment.
FIG. 4B is a side view of the head-mounted display of FIG. 4A with an upward gaze axis, in accordance with an example embodiment.
FIG. 5 is a flowchart of a method, in accordance with an example embodiment.
FIG. 6 is a flowchart of a method, in accordance with an example embodiment.
FIG. 7A shows a field of view of an HMD, in accordance with an example embodiment.
FIG. 7B shows two fields of view of an HMD, in accordance with an example embodiment.
FIG. 8 is a flowchart of a method, in accordance with an example embodiment.
DETAILED DESCRIPTION
In the following detailed description, reference is made to the accompanying figures, which form a part thereof. In the figures, similar symbols typically identify similar components, unless context dictates otherwise. The illustrative embodiments described in the detailed description and figures are not meant to be limiting. Other embodiments may be utilized, and other changes may be made, without departing from the spirit or scope of the subject matter presented herein. It will be readily understood that the aspects of the present disclosure, as generally described herein, and illustrated in the figures, can be arranged, substituted, combined, separated, and designed in a wide variety of different configurations, all of which are contemplated herein.
1. Overview
A head-mounted display (“HMD”) may enable its wearer to observe the wearer's real-world surroundings and also view a displayed image, such as a computer-generated image or virtual image. In some cases, the displayed image may overlay a portion of the wearer's field of view of the real world. Thus, while the wearer of the HMD is going about his or her daily activities, such as walking, driving, exercising, etc., the wearer may be able to see a displayed image generated by the HMD at the same time that the wearer is looking out at his or her real-world surroundings.
The displayed image might include, for example, graphics, text, and/or video. The content of the displayed image could relate to any number of contexts, including but not limited to the wearer's current environment, an activity in which the wearer is currently engaged, the biometric status of the wearer, and any audio, video, or textual communications that have been directed to the wearer. The images displayed by the HMD may also be part of an interactive user interface. For example, the HMD could be part of a wearable computing device. Thus, the images displayed by the HMD could include menus, selection boxes, navigation icons, or other user interface features that enable the wearer to invoke functions of the wearable computing device or otherwise interact with the wearable computing device.
The images displayed by the HMD could appear anywhere in the wearer's field of view. For example, the displayed image might occur at or near the center of the wearer's field of view, or the displayed image might be confined to the top, bottom, or a corner of the wearer's field of view. Alternatively, the displayed image might be at the periphery of or entirely outside of the wearer's normal field of view. For example, the displayed image might be positioned such that it is not visible when the wearer looks straight ahead but is visible when the wearer looks in a specific direction, such as up, down, or to one side. In addition, the displayed image might overlay only a small portion of the wearer's field of view, or the displayed image might fill most or all of the wearer's field of view. The displayed image could be displayed continuously or only at certain times (e.g., only when the wearer is engaged in certain activities).
The virtual images could be displayed based on the position and orientation of the HMD. For example, the HMD may include position and orientation sensors so that when the user moves his or her head, data regarding the position and orientation of the HMD can be received by a processor. The HMD may additionally include a display controllable by the processor, so when the user moves his or her head, the processor may adjust the displayed image on the display. In particular, the displayed image may move in the opposite direction of head movement to create the sensation of looking around a world with superimposed virtual images. In other words, the virtual images could be displayed to create the illusion that the graphical images are part of the real world.
However, HMD position and orientation sensors cannot process information infinitely fast and are limited at least by the speed of the sensor hardware. Thus, when the HMD is in motion, some sensor data may be lost. Additionally, because of imperfections in the hardware and other factors, drifts, offsets, and other errors may be introduced into the sensor data. As a result, the processor may receive erroneous sensor data. For instance, when the sensor is at rest, the data it returns may cause the sensor to appear as if it is in motion, at least slightly. Additional errors can occur when the sensor is in motion.
Sensor drift can be corrected, for example, by software algorithms that round or average data to reduce sensor error. These techniques can help resolve drift when the sensor is at rest. However, they may not correct the problem when the sensor is in motion.
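As a rough illustration of this kind of smoothing, the sketch below averages recent orientation samples to suppress at-rest jitter. It is a minimal sketch, not the patent's method; the window size and the (yaw, pitch, roll) representation are assumptions.

```python
from collections import deque

class OrientationSmoother:
    """Average recent orientation samples to reduce at-rest sensor drift."""

    def __init__(self, window_size=10):
        # Keep only the most recent (yaw, pitch, roll) samples, in degrees.
        self.samples = deque(maxlen=window_size)

    def update(self, yaw, pitch, roll):
        """Add a raw sensor sample and return the smoothed orientation."""
        self.samples.append((yaw, pitch, roll))
        n = len(self.samples)
        return tuple(sum(axis) / n for axis in zip(*self.samples))

# A stationary HMD with noisy readings smooths toward (0, 0, 0).
smoother = OrientationSmoother(window_size=5)
for raw in [(0.2, -0.1, 0.0), (-0.3, 0.2, 0.1), (0.1, -0.2, -0.1)]:
    smoothed = smoother.update(*raw)
```

A filter like this flattens noise while the sensor is at rest, but it cannot distinguish slow real motion from drift, which is the failure mode the next paragraph illustrates.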
For instance, consider a HMD wearer traveling by train who is seated facing a side wall of the train. The HMD wearer may experience both a lateral shift while the train is in motion and also a gradual rotation change while the train rounds a corner. The position and orientation sensors of the HMD may measure these movements and may cause images to be displayed inappropriately. Drift errors may not be properly controlled in this situation by rounding or data averaging.
The methods, non-transitory computer readable media, and apparatus described below may serve to reduce the effect of drift errors and unintentional HMD movement by utilizing eye-tracking information. The eye-tracking information could allow the determination of a gaze axis, which could be related to a target object in the images displayed by the HMD. Subsequently, the target object could be moved towards a central axis of the HMD.
In other words, the HMD may generally use HMD motion data to adjust virtual images on the display. For instance, as the HMD user walks around the real-world environment, virtual images could be presented to the wearer based upon the orientation of the HMD. However, when the system detects a target object selection from the eye-tracking system (e.g., when the HMD wearer gazes at a specific object for a sufficient period of time), the HMD may act to move the target object to the central axis. In this manner, the determination of a gaze axis could act to override the HMD motion data and move or keep the target object near the central axis. Thus, errors due to imperfect sensor data and/or unintentional HMD movement could be reduced or eliminated. A sketch of this priority rule follows.
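The paragraph above amounts to a priority rule between two input sources. The following is a hypothetical sketch of that rule; `pan_images`, `move_toward_central_axis`, and the other names are invented for illustration, not taken from the patent.

```python
def update_frame(hmd, eye_tracker, dt):
    """Per-frame display update: gaze selection overrides HMD motion data."""
    target = eye_tracker.selected_target()  # None unless a gaze dwell occurred

    if target is not None:
        # Eye-tracking override: pull the target object toward the central
        # axis, ignoring (possibly drifting) motion-sensor input.
        hmd.move_toward_central_axis(target, dt)
    else:
        # Normal behavior: counter-move virtual images against head motion
        # so they appear anchored to the real world.
        delta = hmd.motion_sensors.orientation_delta()
        hmd.pan_images(-delta.yaw, -delta.pitch, dt)
```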
Certain illustrative examples of a system and method for correcting sensor drift based on current and recorded eye gaze information are described below. It is to be understood, however, that other embodiments are possible and are implicitly considered within the context of the following example embodiments.
2. A Head-Mounted Display Apparatus with Eye Tracking Functionality
FIG. 1 is a schematic diagram of a wearable computing device or a head-mounted display (HMD) 100 that may include several different components and subsystems. Components of the HMD 100 may include an eye-tracking system 102, an HMD-tracking system 104, an optical system 106, peripherals 108, a power supply 110, a processor 112, a memory 114, and a user interface 115. The eye-tracking system 102 may include hardware such as an infrared camera 116 and at least one infrared light source 118. The HMD-tracking system 104 may include a gyroscope 120, a global positioning system (GPS) 122, and an accelerometer 124. The optical system 106 may include, in one embodiment, a display panel 126, a display light source 128, and optics 130. The peripherals 108 may include, for example, a wireless communication interface 134, a touchpad 136, a microphone 138, a camera 140, and a speaker 142.
In an example embodiment, HMD 100 includes a see-through display. Thus, the wearer of HMD 100 may observe a portion of the real-world environment, i.e., in a particular field of view provided by the optical system 106. In the example embodiment, HMD 100 is operable to display virtual images that are superimposed on the field of view, for example, to provide an “augmented reality” experience. Some of the virtual images displayed by HMD 100 may be superimposed over particular objects in the field of view. HMD 100 may also display images that appear to hover within the field of view instead of being associated with particular objects in the field of view.
Components of the HMD 100 may be configured to work in an interconnected fashion with other components within or outside their respective systems. For instance, in an example embodiment, the infrared camera 116 may image one or both of the HMD wearer's eyes. The infrared camera 116 may deliver image information to the processor 112, which may access the memory 114 and make a determination regarding the direction of the HMD wearer's gaze, also termed a gaze axis. The processor 112 may further accept input from the GPS unit 122, the gyroscope 120, and/or the accelerometer 124 to determine the location and orientation of the HMD 100. Subsequently, the processor 112 may control the user interface 115 and the display panel 126 to display virtual images to the HMD wearer that may include context-specific information based on the HMD location and orientation as well as the HMD wearer's gaze axis.
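Read as pseudocode, the interconnection just described is a single sensing-to-display loop. The sketch below shows one hypothetical frame of that loop; every method and attribute name is an assumption made for illustration.

```python
def render_frame(hmd):
    """One illustrative pass through the FIG. 1 data flow."""
    # Eye-tracking system 102: image the eye, estimate the gaze axis.
    eye_image = hmd.infrared_camera.capture()
    gaze_axis = hmd.processor.estimate_gaze(eye_image, hmd.memory)

    # HMD-tracking system 104: fuse GPS, gyroscope, and accelerometer input.
    pose = hmd.processor.fuse_pose(
        hmd.gps.location(),
        hmd.gyroscope.orientation(),
        hmd.accelerometer.motion(),
    )

    # Optical system 106: display context-specific virtual images.
    images = hmd.processor.compose_images(pose, gaze_axis)
    hmd.display_panel.show(images)
```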
HMD 100 could be configured as, for example, eyeglasses, goggles, a helmet, a hat, a visor, a headband, or in some other form that can be supported on or from the wearer's head. Further, HMD 100 may be configured to display images to both of the wearer's eyes, for example, using two see-through displays. Alternatively, HMD 100 may include only a single see-through display and may display images to only one of the wearer's eyes, either the left eye or the right eye. The HMD 100 may also represent an opaque display configured to display images to one or both of the wearer's eyes without a view of the real-world environment. Further, the HMD 100 could provide an opaque display for a first eye of the wearer as well as provide a view of the real-world environment for a second eye of the wearer.
A power supply 110 may provide power to various HMD components and could represent, for example, a rechargeable lithium-ion battery. Various other power supply materials and types known in the art are possible.
The function of the HMD 100 may be controlled by a processor 112 that executes instructions stored in a non-transitory computer readable medium, such as the memory 114. Thus, the processor 112 in combination with instructions stored in the memory 114 may function as a controller of HMD 100. As such, the processor 112 may control the user interface 115 to adjust the images displayed by HMD 100. The processor 112 may also control the wireless communication interface 134 and various other components of the HMD 100. The processor 112 may additionally represent a plurality of computing devices that may serve to control individual components or subsystems of the HMD 100 in a distributed fashion.
In addition to instructions that may be executed by the processor 112, the memory 114 may store data that may include a set of calibrated wearer eye pupil positions and a collection of past eye pupil positions. Thus, the memory 114 may function as a database of information related to gaze direction. Such information may be used by HMD 100 to anticipate where the user will look and determine what images are to be displayed to the wearer. Calibrated wearer eye pupil positions may include, for instance, information regarding the extents or range of the wearer's eye pupil movement (right/left and upwards/downwards) as well as wearer eye pupil positions that may relate to various reference axes.
Reference axes could represent, for example, an axis extending from a viewing location and through a target object or the apparent center of a field of view (i.e. a central axis that may project through a center point of the apparent display panel of the HMD). Other possibilities for reference axes exist. Thus, a reference axis may further represent a basis for determining dynamic gaze direction.
In addition, information may be stored in the memory 114 regarding possible control instructions that may be enacted using eye movements. For instance, two consecutive wearer eye blinks may represent a control instruction directing the HMD 100 to capture an image with a peripheral camera 140. Another possible embodiment may include a configuration in which specific eye movements represent a control instruction. For example, an HMD wearer may unlock the user interface 115 with a series of predetermined eye movements.
Control instructions could be based on dwell-based selection of a target object. For instance, if a wearer fixates visually upon a particular virtual image or real-world object for longer than a predetermined time period, a control instruction may be generated to select the virtual image or real-world object as a target object. Many other control instructions are possible.
In addition to the aforementioned features, memory 114 could store various recorded data from previous HMD/user interactions. For instance, multiple images of an HMD wearer's eye(s) could be averaged to obtain an averaged eye gaze axis. This could lessen the effect of saccadic eye movements, or saccades, in which the eye moves in a rapid and somewhat random manner around an eye gaze axis. These saccades help humans build up a mental image of a field of view with better resolution than if the eye remained static, and by averaging a number of eye images within a particular time period, an average gaze axis could be determined with less saccadic ‘noise’.
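For instance, if gaze axes are represented as unit vectors (an assumed representation; the patent does not specify one), the averaging described above could look like the following sketch.

```python
import numpy as np

def averaged_gaze_axis(recent_axes):
    """Average recent gaze-axis unit vectors to reduce saccadic 'noise'.

    recent_axes: iterable of (x, y, z) unit vectors estimated from eye
    images captured within a particular time window.
    """
    mean = np.asarray(recent_axes, dtype=float).mean(axis=0)
    return mean / np.linalg.norm(mean)  # renormalize to a unit direction

# Jittery samples around a forward gaze average out to roughly +z.
axis = averaged_gaze_axis([(0.02, 0.01, 1.0), (-0.03, 0.0, 1.0), (0.01, -0.02, 1.0)])
```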
Additionally, memory 114 could store recorded data regarding recent eye gaze axes for various application-based functions. For instance, the recent variance of the eye gaze axis could be coupled to scrolling images generated by the HMD 100. In this embodiment, if recent eye gaze axis variance is high, the images (e.g., text or other images) could scroll faster. If the eye gaze axis variance is low, the images may scroll slower or stop altogether. In this context, a lower variance in eye gaze axis could indicate that the HMD wearer is concentrating on one particular gaze location, whereas a higher eye gaze axis variance means the opposite: the HMD wearer may be quickly scanning a document and desire a faster scrolling speed.
The variance may also differ depending on the axis along which it is measured and on the content presented on the HMD display. For example, the horizontal variance of an HMD wearer's eye gaze may be high while the vertical variance may be relatively low. This could indicate to the HMD 100 that the wearer is reading text. Accordingly, text scrolling/tracking could be adjusted in a different or more controlled fashion compared to ‘non-reading’ scrolling/panning/pagination situations.
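One plausible reading of these two paragraphs, as a sketch: the scroll rate scales with recent gaze variance, with a separate 'reading' regime when horizontal variance dominates. All thresholds and constants here are invented for illustration.

```python
import numpy as np

def scroll_speed(gaze_points, base_speed=1.0):
    """Scale scroll speed with recent gaze variance; slow down for reading.

    gaze_points: array-like of shape (N, 2), recent (horizontal, vertical)
    gaze coordinates on the display.
    """
    var_h, var_v = np.var(np.asarray(gaze_points, dtype=float), axis=0)

    if var_h > 4 * var_v:            # wide horizontal sweeps, little vertical
        return base_speed * 0.5      # likely reading text: scroll conservatively
    total_var = var_h + var_v
    if total_var < 0.01:             # steady fixation: hold the image still
        return 0.0
    return base_speed * min(total_var * 10.0, 3.0)  # scan faster -> scroll faster
```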
The HMD 100 may include a user interface 115 for providing information to the wearer or receiving input from the wearer. The user interface 115 could be associated with, for example, the displayed virtual images, a touchpad, a keypad, buttons, a microphone, and/or other peripheral input devices. The processor 112 may control the functioning of the HMD 100 based on inputs received through the user interface 115. For example, the processor 112 may utilize user input from the user interface 115 to control how the HMD 100 displays images within a field of view or to determine what images the HMD 100 displays.
An eye-tracking system 102 may be included in the HMD 100. In an example embodiment, an eye-tracking system 102 may deliver information to the processor 112 regarding the eye position of a wearer of the HMD 100. The eye-tracking data could be used, for instance, to correct for sensor drift errors introduced by gaps in sensor data and/or sensor noise. In particular, the processor 112 could determine a target object in the displayed images based on information from the eye-tracking system 102. The processor 112 may then control the user interface 115 and the display panel 126 to adjust the target object and/or other displayed images in various ways. For instance, the target object could be held static on the display panel 126. Alternatively, the target object could be moved towards a central axis of the display panel 126.
An infrared camera 116 may be utilized by the eye-tracking system 102 to capture images of a viewing location associated with the HMD 100. Thus, the infrared camera 116 may image the eye of an HMD wearer that may be located at the viewing location. The viewing location may be illuminated by an infrared light source 118. The images could be either video images or still images. The images obtained by the infrared camera 116 regarding the HMD wearer's eye may help determine where the wearer is looking within the HMD field of view, for instance by allowing the processor 112 to ascertain the location of the HMD wearer's eye pupil. Analysis of the images obtained by the infrared camera 116 could be performed by the processor 112 in conjunction with the memory 114.
The imaging of the viewing location could occur continuously or at discrete times depending upon, for instance, user interactions with the user interface 115. The infrared camera 116 could be integrated into the optical system 106 or mounted on the HMD 100. Alternatively, the infrared camera could be positioned apart from the HMD 100 altogether. Furthermore, the infrared camera 116 could additionally represent a conventional visible light camera with sensing capabilities in the infrared wavelengths.
The infrared light source 118 could represent one or more infrared light-emitting diodes (LEDs) or infrared laser diodes that may illuminate a viewing location. One or both eyes of a wearer of the HMD 100 may be illuminated by the infrared light source 118. The infrared light source 118 may be positioned along an optical axis common to the infrared camera 116, and/or the infrared light source 118 may be positioned elsewhere. The infrared light source 118 may illuminate the viewing location continuously or may be turned on at discrete times. Additionally, when illuminated, the infrared light source 118 may be modulated at a particular frequency. Other types of modulation of the infrared light source 118 are possible.
The HMD-tracking system 104 could be configured to provide an HMD position and an HMD orientation to the processor 112. This position and orientation data may help determine a central axis to which a gaze axis is compared. For instance, the central axis may correspond to the orientation of the HMD.
The gyroscope 120 could be a microelectromechanical system (MEMS) gyroscope, a fiber optic gyroscope, or another type of gyroscope known in the art. The gyroscope 120 may be configured to provide orientation information to the processor 112. The GPS unit 122 could be a receiver that obtains clock and other signals from GPS satellites and may be configured to provide real-time location information to the processor 112. The HMD-tracking system 104 could further include an accelerometer 124 configured to provide motion input data to the processor 112.
The optical system 106 could represent components configured to provide virtual images to a viewing location. An example of optical system 106 is described in detail below with respect to FIG. 2.
Various peripheral devices 108 may be included in the HMD 100 and may serve to provide information to and from a wearer of the HMD 100. In one example, the HMD 100 may include a wireless communication interface 134 for wirelessly communicating with one or more devices directly or via a communication network. For example, wireless communication interface 134 could use 3G cellular communication, such as CDMA, EVDO, GSM/GPRS, or 4G cellular communication, such as WiMAX or LTE. Alternatively, wireless communication interface 134 could communicate with a wireless local area network (WLAN), for example, using WiFi. In some embodiments, wireless communication interface 134 could communicate directly with a device, for example, using an infrared link, Bluetooth, or ZigBee.
Although FIG. 1 shows various components of the HMD 100 (i.e., wireless communication interface 134, processor 112, memory 114, infrared camera 116, display panel 126, GPS 122, and user interface 115) as being integrated into HMD 100, one or more of these components could be physically separate from HMD 100. For example, the infrared camera 116 could be mounted on the wearer separate from HMD 100. Thus, the HMD 100 could be part of a wearable computing device in the form of separate devices that can be worn on or carried by the wearer. The separate components that make up the wearable computing device could be communicatively coupled together in either a wired or wireless fashion.
FIG. 2 illustrates a top view of an optical system 200 that is configured to display a virtual image superimposed upon a real-world scene viewable along a viewing axis 204. For clarity, a distal portion 232 and a proximal portion 234 represent optically-coupled portions of the optical system 200 that may or may not be physically separated. An example embodiment includes a display panel 206 that may be illuminated by a light source 208. Light emitted from the visible light source 208 is incident upon the distal beam splitter 210. The visible light source 208 may include one or more light-emitting diodes (LEDs) and/or laser diodes. The visible light source 208 may further include a linear polarizer that acts to pass one particular polarization to the rest of the optical system.
In an example embodiment, the distal beam splitter 210 is a polarizing beam splitter that reflects light depending upon the polarization of light incident upon the beam splitter. To illustrate, s-polarized light from the visible light source 208 may be preferentially reflected by a distal beam-splitting interface 212 towards the display panel 206. The display panel 206 in the example embodiment is a liquid crystal-on-silicon (LCOS) display. In an alternate embodiment in which the beam splitter coating at interface 212 is not a polarizing beam splitter, the display could be a digital light projector (DLP) micro-mirror display, or other type of reflective display panel. In either embodiment, the display panel 206 acts to spatially modulate the incident light to generate a light pattern at an object plane in the display. Alternatively, the display panel 206 may be an emissive-type display such as an organic light-emitting diode (OLED) display, and in such a case, the beam splitter cube 210 is not needed.
In the example in which the display panel 206 is an LCOS display panel, the display panel 206 generates a light pattern with a polarization perpendicular to the polarization of light initially incident upon the panel. In this example embodiment, the display panel 206 converts incident s-polarized light into a light pattern with p-polarization. The generated light pattern from the display panel 206 is directed towards the distal beam splitter 210. The p-polarized light pattern passes through the distal beam splitter 210 and is directed along an optical axis 214 towards the proximal region of the optical system 200. In an example embodiment, the proximal beam splitter 216 is also a polarizing beam splitter. The light pattern is at least partially transmitted through the proximal beam splitter 216 to the image former 218. In an example embodiment, image former 218 includes a concave mirror 230 and a proximal quarter-wave plate 228. The light pattern passes through the proximal quarter-wave plate 228 and is reflected by the concave mirror 230.
The reflected light pattern passes back through the proximal quarter-wave plate 228. Through the interactions with the proximal quarter-wave plate 228 and the concave mirror 230, the light patterns are converted to the s-polarization and are formed into a viewable image. This viewable image is incident upon the proximal beam splitter 216 and the viewable image is reflected from the proximal beam splitting interface 220 towards a viewing location 222 along a viewing axis 204. A real-world scene is viewable through a viewing window 224. The viewing window 224 may include a linear polarizer in order to reduce stray light within the optical system. Light from the viewing window 224 is at least partially transmitted through the proximal beam splitter 216. Thus, both a virtual image and a real-world image are viewable to the viewing location 222 through the proximal beam splitter 216.
Although FIG. 2 depicts the distal portion 232 of the optical system housing as being to the left of the proximal portion 234 of the optical system housing when viewed from above, it is understood that other physical arrangements of the optical system 200 are possible, including the distal portion 232 being positioned to the right of, below, or above the proximal portion 234. Further, although an example embodiment describes the image former 218 as comprising a concave mirror 230, it is understood by those skilled in the art that the image former 218 may comprise a different optical element, such as an optical lens or a diffractive optic element.
In one embodiment, the proximal beam splitter 216, the distal beam splitter 210, and other components of optical system 200 are made of glass. Alternatively, some or all of such optical components may be partially or entirely plastic, which can also serve to reduce the weight of optical system 200. A suitable plastic material is Zeonex® E48R cyclo-olefin optical-grade polymer, which is available from Zeon Chemicals L.P., Louisville, Ky. Another suitable plastic material is polymethyl methacrylate (PMMA).
An example embodiment may include an infrared light source 226 that is configured to illuminate the viewing location 222. Although FIG. 2 depicts the infrared light source 226 as adjacent to the viewing window 224, those skilled in the art will understand that the infrared light source 226 could be located elsewhere, such as on the side of the proximal beam splitter 216 that is adjacent to the viewing location 222 or in the distal portion 232 of the optical system 200. The infrared light source 226 may represent, for example, one or more infrared light-emitting diodes (LEDs). Infrared LEDs with a small size may be implemented, such as the Vishay Technology TSML 1000 product.
Those skilled in the art will understand that, for best eye-tracking accuracy, it may be advantageous to obtain infrared images of the eye pupil using light sources that illuminate the eye from positions off-axis and/or on-axis with respect to the viewing axis 204. Therefore, the infrared light source 226 may include one or more LEDs located at different locations in, and/or separate from, the optical system 200.
Infrared light generated from the infrared light source 226 is configured to be incident upon the viewing location 222. Thus, the wearer's eye pupil may be illuminated with the infrared light. The infrared light may be reflected from the wearer's eye back along the viewing axis 204 towards the proximal beam splitter 216. A portion of the reflected infrared light may be reflected from the beam splitting interface 220 towards the image former 218.
In order to transmit infrared light to an infrared camera 202, the image former 218 may include a dichroic thin film configured to selectively reflect or transmit incident light depending upon the wavelength of the incident light. For instance, the dichroic thin film may be configured to pass infrared light while reflecting visible light. In an example embodiment, the visible light pattern generated by the display panel 206 may be reflected by the concave mirror 230 and the visible light pattern may be formed into a viewable image. The infrared light may thus be preferentially transmitted through the concave mirror 230 to the infrared camera 202. Dichroic thin film coatings are available commercially from companies such as JML Optical Industries and Precision Glass & Optics (PG&O), and comprise multiple layers of dielectric and/or metal films. These dichroic coatings are also called ‘cold mirrors’.
In an example embodiment, a small aperture may be introduced into the image former 218, which may be realized by a pinhole in the center of the concave mirror 230. In this example embodiment, most of the visible and infrared light is reflected off of and is formed by the image former 218 into an image viewable by the HMD wearer. Some of the visible and infrared light passes through the aperture and is incident upon the infrared camera 202. The infrared camera 202 may selectively filter and detect the infrared light from the combination of visible and infrared light to obtain information regarding the wearer's eye pupil location. Alternatively and/or additionally, the infrared light source 226 may be modulated with respect to a clock signal of a lock-in amplifier or phase-locked loop in order that the infrared light signal is transduced efficiently. Also, the visible light source 208 may be modulated and infrared light detection could be performed when the visible light source 208 is off, for example. Reflected infrared light may also be collected from off-axis angles, and thus the infrared camera may also be located elsewhere on optical system 200 or located separately from optical system 200. Those with skill in the art will understand that there are other variations of transducing an infrared light signal mixed with a visible light signal with an infrared camera and that those variations are included implicitly in this specification.
FIG. 3A presents a front view of a head-mounted display (HMD) 300 in an example embodiment that includes a head-mounted support 309. FIGS. 3B and 3C present the top and side views, respectively, of the HMD in FIG. 3A. Although this example embodiment is provided in an eyeglasses format, it will be understood that wearable systems and HMDs may take other forms, such as hats, goggles, masks, headbands and helmets. The head-mounted support 309 includes lens frames 314 and 316, a center frame support 318, lens elements 310 and 312, and extending side-arms 320 and 322. The center frame support 318 and side-arms 320 and 322 are configured to secure the head-mounted support 309 to the wearer's head via the wearer's nose and ears, respectively. Each of the frame elements 314, 316, and 318 and the extending side-arms 320 and 322 may be formed of a solid structure of plastic or metal, or may be formed of a hollow structure of similar material so as to allow wiring and component interconnects to be internally routed through the head-mounted support 309. Alternatively or additionally, head-mounted support 309 may support external wiring. Lens elements 310 and 312 are at least partially transparent so as to allow the wearer to look through them. In particular, the wearer's left eye 308 may look through left lens 312 and the wearer's right eye 306 may look through right lens 310. Optical systems 302 and 304, which may be configured as shown in FIG. 2, may be positioned in front of lenses 310 and 312, respectively, as shown in FIGS. 3A, 3B, and 3C. Optical systems 302 and 304 may be attached to the head-mounted support 309 using support mounts 324 and 326, respectively. Furthermore, optical systems 302 and 304 may be integrated partially or completely into lens elements 310 and 312, respectively.
Although this example includes an optical system for each of the wearer's eyes, it is to be understood that an HMD might include an optical system for only one of the wearer's eyes (either left eye 308 or right eye 306). As described in FIG. 2, the HMD wearer may simultaneously observe from optical systems 302 and 304 a real-world image with an overlaid virtual image. The HMD 300 may include various elements such as a processor 340, a touchpad 342, a microphone 344, and a button 346. The computer 340 may use data from, among other sources, various sensors and cameras to determine the virtual image that should be displayed to the user. In an example embodiment, as described earlier, an infrared light source or sources may illuminate the viewing position(s) 308 and 306, i.e., the wearer's eye(s), and the reflected infrared light may be collected with an infrared camera.
Those skilled in the art would understand that other user input devices, user output devices, wireless communication devices, sensors, and cameras may be reasonably included in such a wearable computing system.
FIGS. 4A and 4B depict side and front views of an eye as well as schematic drawings of pupil location information under different conditions. One way to determine a gaze axis of an eye is to ascertain the position of the eye pupil with respect to a reference point. In an example embodiment that tracks eye pupil position, infrared light is reflected off of a person's eye. The reflected light may be imaged with an infrared camera. Upon imaging of the eye, image processing can be conducted with a processor 112 in order to determine, for instance, the extents and centroid location of the person's pupil. Other known means and methods of eye-tracking, including the use of visible light illumination and/or other imaging techniques, are possible.
In an embodiment 400, a person may be looking directly forward as depicted in FIG. 4A. The eye 412 is open and the pupil 418 is located along a central axis 410. After image processing, which may include edge detection, the position of the pupil may be determined to be at pupil location 422. In this example, the processor 112 may subsequently determine that the gaze axis, based on the pupil location 422, coincides with the central axis 410. Virtual image display position and movement may be adjusted due to the determined pupil location 422. For instance, the processor 112 may adjust a tracking rate to zero when the gaze axis and the central axis are equivalent or nearly equivalent. This may allow a user to slowly read critical text or closely examine a virtual image, for example.
In an example embodiment 424, as illustrated in FIG. 4B, a person may be looking upwards with respect to a central axis 428. The eye 434 is open and the pupil location is generally higher than a reference point 440. In this situation, imaging the person's pupil 438 with infrared light may result in a determined pupil position 442. The processor 112 may determine that the gaze axis 430 is inclined above the central axis 428. The angle difference 432 may represent the absolute difference in angle between the central axis 428 and the gaze axis 430. The processor 112 may calculate the angle difference 432 and, based on the angle difference 432, adjust a tracking rate. For instance, a large angle difference 432 could correspond to a higher tracking rate, for instance to scroll a virtual image across the field of view at a faster rate. In other embodiments, the processor 112 may calculate the angle difference 432 and move a target object in the displayed images in order to minimize the angle difference 432. In other words, the HMD display may be controlled such that the target object is moved toward the center of the HMD display.
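A minimal sketch of the tracking-rate adjustment just described, assuming the axes are represented as unit vectors; the gain and deadband values are illustrative, not from the patent.

```python
import numpy as np

def tracking_rate(gaze_axis, central_axis, gain=2.0, deadband_deg=1.0):
    """Map the gaze/central-axis angle difference to an image tracking rate.

    Both axes are unit vectors. Within the deadband the rate is zero, so a
    wearer gazing near the center can read or examine an image undisturbed.
    """
    cos_angle = np.clip(np.dot(gaze_axis, central_axis), -1.0, 1.0)
    angle_deg = np.degrees(np.arccos(cos_angle))
    if angle_deg < deadband_deg:
        return 0.0               # gaze ~= central axis: hold the image still
    return gain * angle_deg      # larger angle difference: faster tracking
```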
Other embodiments could include the use of different eye gaze determination techniques. For instance, instead of using the eye pupil to determine the gaze axis, it is possible to track eye motions using the boundary between the sclera and iris (labelled 416 and 436 in FIGS. 4A and 4B, respectively). For the purposes of determining an eye gaze axis, finding the centroid of the sclera/iris boundary (also called the limbus) may be equivalent to finding the centroid of the pupil.
Other possibilities for eye-tracking exist that may determine different reference points on the eye and may be implemented within the context of this invention. For instance, instead of ascertaining the pupil centroid to determine the gaze axis, position data from multiple glint reflections on the eye may be used in addition to or in lieu of information about the pupil position to determine the gaze axis.
3. Method for Adjusting Virtual Images on a Display Based on a Gaze Axis and a Central Axis of the Display.
A method 500 is provided for adjusting displayed images on a display to move a target object closer to the central axis. Method 500 could be performed using an HMD configured as shown in any of FIGS. 1 through 3C, or configured in some other way. FIG. 5 illustrates the steps in an example method; however, it is understood that in other embodiments the steps may appear in a different order and steps may be added or subtracted.
Step 502 includes displaying images on a display. The display may have a central axis that could be considered a line passing through the center of the display, normal to the display itself. In alternate embodiments, the central axis may pass through other locations on the display that may not be at the center of the display, for instance if the display is not symmetric.
As discussed above, the display may present a wide variety of virtual images to a user including text, shapes, pictures, etc. These virtual images may allow the HMD wearer to experience an augmented reality.
Step 504 includes determining a gaze axis with respect to the central axis. The determination of a gaze axis could be performed through eye-tracking. For example, an eye-tracking system 102 could be used to image the eye of an HMD wearer to determine the position of his or her eye pupil and therefore determine a gaze axis, as illustrated in FIGS. 4A and 4B.
Step 506 includes determining a target object in the displayed images based on the gaze axis. After determining a gaze axis, the processor 112 could determine that the wearer is gazing at a specific target object among the set of displayed images. This target object could be any virtual image from the images being displayed on the display. For example, a wearer may be viewing lines of virtual text and may gaze fixedly at a particular word or phrase. In turn, the word or phrase could be determined to be a target object.
Step 508 includes adjusting the displayed images on the display to move the target object towards the central axis. This step could include moving a whole set of displayed images (including the target object) toward the central axis. For instance, following the previous example in which a specific word or phrase is the target object, the entire formatted body of text may be moved towards the central axis.
Alternatively, only the target object may move towards the central axis. For instance, if a virtual image representing a specific icon from a set of icons is determined to be the target object, the processor 112 may move only the target object icon toward the central axis of the display. Other movements are possible as well, including dynamic reformatting of text and other virtual images while moving a target object towards the central axis.
The rate at which the target object is moved, or the tracking rate, may be based on the angle between the gaze axis and the central axis, among other factors. For instance, if the HMD wearer is gazing at a target object near the edge of the field of view of the HMD (so that the angle difference is large), the target object may be moved at a tracking rate that is relatively fast. Conversely, if the target object is only slightly separated from the central axis, the tracking rate may be slow. A sketch of one such recentering update follows.
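This sketch assumes display coordinates in pixels (an assumption for illustration): the translation computed for the target is applied to the whole set of displayed images, matching the rigid movement of a formatted body of text described in step 508.

```python
def step_toward_center(target_pos, center, rate, dt):
    """Compute one frame's translation moving a target object toward center.

    target_pos, center: (x, y) in display coordinates. rate: pixels/second,
    e.g., scaled by the gaze/central-axis angle difference. Returns the
    translation to apply rigidly to the whole set of displayed images.
    """
    dx, dy = center[0] - target_pos[0], center[1] - target_pos[1]
    dist = (dx * dx + dy * dy) ** 0.5
    if dist == 0.0:
        return (0.0, 0.0)              # already on the central axis
    step = min(rate * dt, dist)        # do not overshoot the central axis
    return (dx / dist * step, dy / dist * step)
```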
An example of method 500 is illustrated in FIGS. 7A and 7B. FIG. 7A shows an HMD field of view 700 in which virtual text 704 is being presented to the HMD wearer. An eye-tracking system 102 could determine that an eye gaze axis may pass through the display 702 at a central eye gaze point 706. The target object could be determined to be the word ‘of.’ Since the target object is near the center of the display, the gaze axis and central axis are aligned, or nearly so. Thus, the processor 112 may determine that the wearer is gazing at the center of the screen, and no virtual image movement may occur.
In FIG. 7B, a related field of view 708 may be presented to the HMD wearer. The HMD wearer is determined to have a gaze point 710 within a set of virtual text 704. In this example, the eye-tracking system 102 and processor 112 may determine that the wearer is gazing at the word ‘hope’ and determine that word to be the target object. Thus, the virtual images may be adjusted so that the word ‘hope’ is moved toward the central axis. Another field of view 712 may be presented to the HMD wearer in which the set of virtual text 704 has been moved to the right and up such that the target object (in this example, the word ‘hope’) is located along the central axis of the display.
4. Method for Recording Data Based on a Gaze Axis, a Target Object and a Central Axis of a Display and Adjusting the Displayed Images on the Display Based on the Recorded Data
A method 600 is provided for recording data based on the central axis, gaze axis, and target object, and adjusting displayed images on a display to move the target object closer to the central axis. Method 600 could be performed using an HMD configured as shown in any of FIGS. 1 through 3C, or configured in some other way. FIG. 6 illustrates the steps in an example method; however, it is understood that in other embodiments the steps may appear in a different order and steps may be added or subtracted.
In a first step 602, images are displayed on a display. The display may have a central axis that could be represented by a line that passes through the display at normal incidence. The central axis may pass through other locations on the display that may not be at the center of the display, for instance if the display is not symmetric.
In a second step 604, a gaze axis may be determined with respect to the central axis. For example, a means for eye-tracking, such as eye-tracking system 102, may be used to acquire images of an HMD wearer's eye. The images could be used by processor 112 to determine information regarding the wearer's eye position, for instance a centroid of the wearer's pupil, and from this information further determine a gaze axis.
In a third step 606, a target object may be determined based on the gaze axis. For example, a processor 112 may determine that a particular gaze axis may be associated with a target object from among the displayed virtual images on the display. In one embodiment, the gaze axis may pass through a displayed virtual image that may be determined to be the target object. In other embodiments, target objects may be determined from the gaze axis even if the gaze axis does not intersect a corresponding virtual image. For instance, an HMD wearer may gaze fixedly at a particular virtual image on the display. Correspondingly, the processor 112 may determine a target object not spatially related to the virtual image of interest.
In a fourth step 608, data may be recorded based on the central axis, the gaze axis, the target object, and the displayed images. For instance, data regarding the central axis, the gaze axis of an HMD wearer, the target object, and the displayed images may be stored in memory 114. The data could be recorded in a continuous fashion, or only when specific tasks are being performed. Alternatively or additionally, data could be recorded upon user request through the user interface 115 or during other HMD operations.
In a fifth step 610, the displayed images may be adjusted on the display based on the recorded data. In one example embodiment, data may be recorded during normal wearer interaction with the HMD. The data may include information, for instance regarding the HMD wearer's average reading speed, that could thereafter be used to adjust the virtual images displayed on the screen when the HMD wearer is reading text. For example, the rate of automated text scrolling could be controlled by prerecorded data regarding the wearer's average reading speed, as sketched below. The HMD could obtain data to determine user preferences that may encompass a broad range of various HMD user interactions and functions.
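As one hypothetical example of step 610, recorded gaze data could yield an average reading speed that paces automated scrolling. The class below is a sketch under that assumption; all names and defaults are invented for illustration.

```python
class ReadingSpeedModel:
    """Record per-wearer reading speed and use it to pace text scrolling."""

    def __init__(self, default_wpm=200.0):
        self.samples = []                # recorded words-per-minute observations
        self.default_wpm = default_wpm   # fallback before any data is recorded

    def record(self, words_read, seconds):
        """Store one observation, e.g., derived from gaze traversal of text."""
        self.samples.append(words_read / (seconds / 60.0))

    def scroll_rate(self, words_per_line):
        """Lines per minute to auto-scroll, from the recorded average speed."""
        if self.samples:
            wpm = sum(self.samples) / len(self.samples)
        else:
            wpm = self.default_wpm
        return wpm / words_per_line
```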
Other examples of how such recorded data could be used are possible. For example, the recorded data could be used for HMD calibration. Due to variations in interocular spacing and facial features between users, information from the eye-tracking system may vary slightly from one HMD wearer to the next, even when the users are performing the same actions. Thus, eye gaze determinations may also vary between individuals. A processor 112 could analyze the data stored in the memory 114 to match a user eye gaze input to an intended user input. Displayed images on the display could depend at least on the results of the recorded data and corresponding user customizations. By recording data regarding eye-tracking interactions with the HMD, the need for calibration and customization for individual users may be reduced.
5. Non-Transitory Computer Readable Medium for Sensor Drift Correction
Some or all of the functions described above and illustrated in FIGS. 5, 6, 7A, and 7B may be performed by a computing device in response to the execution of instructions stored in a non-transitory computer readable medium. The non-transitory computer readable medium could be, for example, a random access memory (RAM), a read-only memory (ROM), a flash memory, a cache memory, one or more magnetically encoded discs, one or more optically encoded discs, or any other form of non-transitory data storage. The non-transitory computer readable medium could also be distributed among multiple data storage elements, which could be remotely located from each other. The computing device that executes the stored instructions could be a wearable computing device, such as the wearable computing device 100 illustrated in FIG. 1. Alternatively, the computing device that executes the stored instructions could be another computing device, such as a server in a server network.
With reference to FIG. 1, the non-transitory computer readable medium, which may correspond to memory 114, may store instructions executable by the processor 112 to perform various functions. For instance, upon receiving gaze axis information from the eye-tracking system 102, the processor 112 may be instructed to control the display panel 126 to adjust the displayed images based on, for instance, the gaze axis and the central axis. In another embodiment, data could be recorded based on the gaze axis, central axis, and target object in order to adjust the target object on the display or determine a user interface preference. The user interface preference could be used to adjust similar interactions with the HMD in the future. Those skilled in the art will understand that other sub-functions or functions may be reasonably included to instruct a processor to adjust a virtual image on a display based upon eye-tracking and other sensor data.
6. Method for Displaying Images in a HMD Based on the Movement of the HMD and Eye-Tracking Data
A method 800 is provided for displaying images on a display of a head-mounted display (HMD) based on the movement of the HMD and eye-tracking data of the HMD wearer. Method 800 could be performed using an HMD configured as shown in any of FIGS. 1 through 3C, or configured in some other way. FIG. 8 illustrates the steps in an example method; however, it is understood that in other embodiments the steps may appear in a different order and steps may be added or subtracted.
Method step 802 includes displaying images on a display of an HMD. In this embodiment, the displayed images are viewable at a viewing location and the HMD display has a central axis. The viewing location could be, for instance, the location of an eye (or both eyes) of the HMD wearer. Similar to the aforementioned embodiments, the central axis could be an axis protruding normally from the center of the HMD display.
Method step 804 includes acquiring sensor data related to the motion of the HMD. The sensor data could be generated by, for instance, the accelerometer 124 and/or gyroscope 120. The sensor data could include information such as the angle and azimuth of the HMD, which may correspond to the angle and azimuth of the central axis. Further, the sensor data could include information regarding the position of the HMD, such as may be acquired by the GPS 122. Other types of sensor data, such as HMD velocity and acceleration information, are possible and may be reasonably applied using the example embodiment.
Method step 806 includes controlling the display to display images based on the sensor data. In the example embodiment, the HMD could provide a graphical user interface in response to a movement of the HMD. The graphical user interface could be configured to be displayed in response to any HMD movement or, alternately, in response to a predetermined HMD movement. Thus, processor 112 could be configured to acquire sensor data. In response to an appropriate sensor data input, the processor 112 could control the optical system 106 to display images associated with a graphical user interface. For instance, if the HMD user moves his or her head upwards, the graphical user interface could be provided.
Further, while the graphical user interface is provided by the HMD, the user may view virtual images that could appear overlaid upon the real-world environment. These virtual images could be substantially anchored to real-world objects and/or reference points. Alternatively or additionally, some or all of the virtual images could stay substantially fixed with respect to, for example, the azimuth (the rotation angle within the horizontal plane) of the HMD. Thus, the HMD wearer could view different aspects of the graphical user interface by rotating his or her head and/or body.
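The sketch below illustrates, under simplifying assumptions (a linear angle-to-pixel mapping and a hypothetical 30-degree horizontal field of view), how an element anchored at a fixed world azimuth could be repositioned on the display as the HMD azimuth changes, so that the element appears to stay put while the wearer's head turns.

    def azimuth_locked_x(anchor_az_deg, hmd_az_deg, width_px=1280, fov_deg=30.0):
        # Wrap the angular difference into [-180, 180) so the element behaves
        # correctly when the wearer turns through the 0/360 degree boundary.
        delta = (anchor_az_deg - hmd_az_deg + 180.0) % 360.0 - 180.0
        # Linear mapping from angle to pixels; real optics would use a
        # projection model rather than this approximation.
        return width_px / 2.0 + delta * (width_px / fov_deg)

    print(azimuth_locked_x(10.0, 0.0))   # ~1066.7: right of center
    print(azimuth_locked_x(10.0, 10.0))  # 640.0: centered once the head turns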
Method step 808 includes determining a gaze axis based on one or more images of the viewing location using a camera. In this step, as described in the above embodiments, an infrared camera could work together with one or more infrared light sources to image the HMD wearer's eye(s). The images may be captured in an effort to ascertain a gaze direction or gaze axis of the HMD wearer, that is, the direction in which the user is looking.
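As a rough sketch of this step, and assuming the pupil center has already been located in the infrared camera image, a displacement from a calibrated straight-ahead position could be mapped to a gaze axis as follows. The linear 0.1-degree-per-pixel mapping is an assumption for the example; practical glint-based trackers are considerably more involved.

    def gaze_axis_from_pupil(pupil_px, straight_ahead_px, deg_per_px=0.1):
        # pupil_px and straight_ahead_px are (x, y) pixel coordinates in the
        # eye-facing camera image; image y grows downward, gaze elevation up.
        dx = pupil_px[0] - straight_ahead_px[0]
        dy = pupil_px[1] - straight_ahead_px[1]
        return (dx * deg_per_px, -dy * deg_per_px)  # (azimuth, elevation) in degrees

    # Pupil 30 px right of and 10 px above the calibrated center:
    print(gaze_axis_from_pupil((330, 230), (300, 240)))  # (3.0, 1.0)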
In method step 810, a target object may be determined within the displayed images based on the gaze axis. Thus, using information obtained from method step 808, the processor 112 could determine a particular target object from the set of currently displayed images. For example, a target object may be determined based on the point of intersection of the gaze direction and the display panel 126.
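A simple hit test of that kind is sketched below; the element list and rectangle representation are hypothetical, and gaze_point_px stands for the point at which the gaze axis intersects the display panel 126.

    def pick_target(gaze_point_px, elements):
        # elements: list of (element_id, (x, y, width, height)) rectangles in
        # display coordinates; returns the first element containing the point.
        gx, gy = gaze_point_px
        for element_id, (x, y, w, h) in elements:
            if x <= gx <= x + w and y <= gy <= y + h:
                return element_id
        return None

    ui = [("menu_bar", (0, 0, 640, 40)), ("text_passage", (100, 120, 440, 300))]
    print(pick_target((300, 250), ui))  # "text_passage"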
Method step 812 includes controlling the display to move the target object towards the central axis. Within the context of this method, the user may be interacting with an established graphical user interface that could overlay his or her view of the real-world environment. By gazing at a particular element within the graphical user interface, the wearer could select it from the set of currently displayed images; the particular element could thus be selected to become the target object. Target object selection could be performed by gazing at the element for a predetermined period of time. The target object may then be moved towards the central axis of the HMD. This movement of the target object could allow the HMD wearer to, for instance, read a passage of text more easily or concentrate on a particular area of interest in an image. Additionally, because the centering action in method step 812 could be independent of HMD movement, the method could be useful for reducing the effect of noise-related drift in the sensor data, as well as of unintentional movement of the HMD. In effect, commands to move the displayed images based on the HMD motion data can be ‘overridden’ when eye-tracking data indicates a target object. In this manner, the display of images could be essentially decoupled from HMD motion, which may reduce or eliminate the effect of unwanted motion sensor drift and HMD movement.
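A minimal sketch of such a centering step, in the spirit of claims 4 and 5 below, follows. The base rate and gain are illustrative tuning parameters; the tracking rate grows as the target sits farther from the central axis, and each step is clamped so the target never overshoots the center.

    import math

    def step_toward_center(offset_px, dt, base_rate=200.0, gain=0.5):
        # offset_px is the target's position relative to the display center.
        ox, oy = offset_px
        dist = math.hypot(ox, oy)
        if dist == 0.0:
            return (0.0, 0.0)
        rate = base_rate + gain * dist   # tracking rate scales with the offset
        step = min(rate * dt, dist)      # never overshoot the central axis
        return (ox - ox / dist * step, oy - oy / dist * step)

    offset = (120.0, -40.0)
    for _ in range(30):                  # about half a second at 60 frames/s
        offset = step_toward_center(offset, dt=1 / 60)
    print(offset)                        # well on its way toward (0.0, 0.0)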
Those skilled in the art will understand that there are other possible ways of using HMD movement data and eye-tracking data to center target objects on a display of a HMD, and the aforementioned example embodiment is not meant to preclude any other such examples.
CONCLUSION
The above detailed description describes various features and functions of the disclosed systems, devices, and methods with reference to the accompanying figures. While various aspects and embodiments have been disclosed herein, other aspects and embodiments will be apparent to those skilled in the art. The various aspects and embodiments disclosed herein are for purposes of illustration and are not intended to be limiting, with the true scope and spirit being indicated by the following claims.

Claims (14)

What is claimed is:
1. A method comprising:
displaying images on a display, the display having a central axis;
determining a gaze axis with respect to the central axis;
determining a target object in the displayed images based on the gaze axis; and
adjusting the displayed images on the display to move the target object towards the central axis.
2. The method of claim 1, wherein determining the gaze axis comprises:
obtaining an eye pupil image; and
determining the gaze axis from the eye pupil image.
3. The method of claim 1, wherein determining the gaze axis comprises:
obtaining a plurality of eye pupil images; and
determining the gaze axis from the plurality of eye pupil images.
4. The method of claim 1, wherein adjusting the displayed images on the display comprises:
moving the target object at a tracking rate.
5. The method of claim 4, wherein adjusting the displayed images on the display further comprises:
adjusting the tracking rate based on a difference between the gaze axis and the central axis.
6. A method comprising:
displaying images on a display, the display having a central axis;
determining a gaze axis with respect to the central axis;
determining a target object in the displayed images based on the gaze axis;
recording data based on the central axis, the gaze axis, the target object, and the displayed images; and
adjusting the displayed images on the display based on the recorded data, wherein adjusting the displayed images on the display based on the recorded data comprises controlling movement of the displayed images on the display.
7. The method of claim 6, further comprising:
determining a user interface preference based on the recorded data; and
adjusting the displayed images on the display based on the user interface preference.
8. The method of claim 6, wherein controlling movement of the displayed images on the display comprises controlling a rate of text scrolling.
9. A non-transitory computer readable medium having stored therein instructions executable by a computing device to cause the computing device to perform functions, the functions comprising:
controlling a display to display images, the display having a central axis;
determining a gaze axis with respect to the central axis;
determining a target object in the displayed images based on the gaze axis; and
controlling the display to adjust the displayed images so as to move the target object towards the central axis.
10. The non-transitory computer readable medium of claim 9, wherein determining a gaze axis comprises:
obtaining an eye pupil image; and
determining the gaze axis from the eye pupil image.
11. The non-transitory computer readable medium of claim 9, wherein determining the gaze axis comprises:
obtaining a plurality of eye pupil images; and
determining a gaze axis from the plurality of eye pupil images.
12. The non-transitory computer readable medium of claim 9, wherein controlling the display to adjust the displayed images comprises moving the target object at a tracking rate.
13. The non-transitory computer readable medium of claim 12, wherein controlling the display to adjust the displayed images further comprises adjusting the tracking rate based on a difference between the gaze axis and the central axis.
14. The non-transitory computer readable medium of claim 9, wherein the functions further comprise:
recording data based on the central axis, the gaze axis, the target object, and the displayed images, and adjusting the displayed images on the display based on the recorded data.
US14/071,974 | 2011-11-22 | 2013-11-05 | User interface | Active | US8786953B2 (en)

Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
US14/071,974 | US8786953B2 (en) | 2011-11-22 | 2013-11-05 | User interface

Applications Claiming Priority (2)

Application Number | Priority Date | Filing Date | Title
US13/302,916 | US8611015B2 (en) | 2011-11-22 | 2011-11-22 | User interface
US14/071,974 | US8786953B2 (en) | 2011-11-22 | 2013-11-05 | User interface

Related Parent Applications (1)

Application Number | Title | Priority Date | Filing Date
US13/302,916 | Continuation | US8611015B2 (en) | 2011-11-22 | 2011-11-22 | User interface

Publications (2)

Publication Number | Publication Date
US20140055846A1 (en) | 2014-02-27
US8786953B2 (en) | 2014-07-22

Family

ID=48426619

Family Applications (2)

Application Number | Title | Priority Date | Filing Date
US13/302,916 | Active, anticipated expiration 2032-04-09 | US8611015B2 (en) | 2011-11-22 | 2011-11-22 | User interface
US14/071,974 | Active | US8786953B2 (en) | 2011-11-22 | 2013-11-05 | User interface

Family Applications Before (1)

Application Number | Title | Priority Date | Filing Date
US13/302,916 | Active, anticipated expiration 2032-04-09 | US8611015B2 (en) | 2011-11-22 | 2011-11-22 | User interface

Country Status (4)

Country | Link
US (2) | US8611015B2 (en)
EP (1) | EP2783252B1 (en)
CN (1) | CN104067160B (en)
WO (1) | WO2013077978A2 (en)

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
US9619020B2 (en) | 2013-03-01 | 2017-04-11 | Tobii Ab | Delay warp gaze interaction
US9864498B2 (en) | 2013-03-13 | 2018-01-09 | Tobii Ab | Automatic scrolling based on gaze detection
US9952883B2 (en) | 2014-08-05 | 2018-04-24 | Tobii Ab | Dynamic determination of hardware
US10168772B2 (en) | 2015-10-12 | 2019-01-01 | Samsung Electronics Co., Ltd | Head mounted electronic device
US10317995B2 (en) | 2013-11-18 | 2019-06-11 | Tobii Ab | Component determination and gaze provoked interaction
US10558262B2 (en) | 2013-11-18 | 2020-02-11 | Tobii Ab | Component determination and gaze provoked interaction
EP3941077A4 (en)* | 2020-05-22 | 2022-06-29 | Beijing Baidu Netcom Science and Technology Co., Ltd. | Method and apparatus for controlling video playing, and electronic device and storage medium
US11972046B1 (en)* | 2022-11-03 | 2024-04-30 | Vincent Jiang | Human-machine interaction method and system based on eye movement tracking

Families Citing this family (311)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
IL166799A (en)2005-02-102014-09-30Lumus LtdSubstrate-guided optical device utilizing beam splitters
US10073264B2 (en)2007-08-032018-09-11Lumus Ltd.Substrate-guide optical device
US9158116B1 (en)2014-04-252015-10-13Osterhout Group, Inc.Temple and ear horn assembly for headworn computer
US9366867B2 (en)2014-07-082016-06-14Osterhout Group, Inc.Optical systems for see-through displays
US9952664B2 (en)2014-01-212018-04-24Osterhout Group, Inc.Eye imaging in head worn computing
US9715112B2 (en)2014-01-212017-07-25Osterhout Group, Inc.Suppression of stray light in head worn computing
US9298007B2 (en)2014-01-212016-03-29Osterhout Group, Inc.Eye imaging in head worn computing
US9400390B2 (en)2014-01-242016-07-26Osterhout Group, Inc.Peripheral lighting for head worn computing
US9965681B2 (en)2008-12-162018-05-08Osterhout Group, Inc.Eye imaging in head worn computing
US9229233B2 (en)2014-02-112016-01-05Osterhout Group, Inc.Micro Doppler presentations in head worn computing
CN106484115B (en)2011-10-282019-04-19奇跃公司 Systems and methods for augmented and virtual reality
US8970452B2 (en)2011-11-022015-03-03Google Inc.Imaging method
US8611015B2 (en)2011-11-222013-12-17Google Inc.User interface
US8235529B1 (en)*2011-11-302012-08-07Google Inc.Unlocking a screen using eye tracking information
US10108316B2 (en)*2011-12-302018-10-23Intel CorporationCognitive load assessment for digital documents
US9001030B2 (en)*2012-02-152015-04-07Google Inc.Heads up display
US9239415B2 (en)2012-03-082016-01-19Google Inc.Near-to-eye display with an integrated out-looking camera
US9691241B1 (en)2012-03-142017-06-27Google Inc.Orientation of video based on the orientation of a display
WO2013168173A1 (en)*2012-05-112013-11-14Umoove Services Ltd.Gaze-based automatic scrolling
IL219907A (en)*2012-05-212017-08-31Lumus LtdHead-mounted display eyeball tracker integrated system
US9146397B2 (en)*2012-05-302015-09-29Microsoft Technology Licensing, LlcCustomized see-through, electronic display device
JP6020577B2 (en)*2012-09-192016-11-02株式会社ニコン Measuring system, measuring method, spectacle lens design method, spectacle lens selection method, and spectacle lens manufacturing method
US10180715B2 (en)*2012-10-052019-01-15Elwha LlcCorrelating user reaction with at least an aspect associated with an augmentation of an augmented view
US10269179B2 (en)2012-10-052019-04-23Elwha LlcDisplaying second augmentations that are based on registered first augmentations
US10713846B2 (en)2012-10-052020-07-14Elwha LlcSystems and methods for sharing augmentation data
US8860634B2 (en)*2012-10-112014-10-14Sony Computer Entertainment Europe LimitedHead mountable display
US9007301B1 (en)*2012-10-112015-04-14Google Inc.User interface
KR101978214B1 (en)*2012-11-192019-05-14엘지전자 주식회사Display device for displaying video and method thereof
KR20140090552A (en)2013-01-092014-07-17엘지전자 주식회사Head Mounted Display and controlling method for eye-gaze calibration
US20140191927A1 (en)*2013-01-092014-07-10Lg Electronics Inc.Head mount display device providing eye gaze calibration and control method thereof
US9619021B2 (en)2013-01-092017-04-11Lg Electronics Inc.Head mounted display providing eye gaze calibration and control method thereof
JP6375591B2 (en)*2013-01-152018-08-22セイコーエプソン株式会社 Head-mounted display device, head-mounted display device control method, and image display system
US9639964B2 (en)2013-03-152017-05-02Elwha LlcDynamically preserving scene elements in augmented reality systems
US10109075B2 (en)2013-03-152018-10-23Elwha LlcTemporal element restoration in augmented reality systems
US10025486B2 (en)2013-03-152018-07-17Elwha LlcCross-reality select, drag, and drop for augmented reality systems
WO2014156146A1 (en)*2013-03-292014-10-02パナソニック株式会社Electronic mirror device
US10137361B2 (en)*2013-06-072018-11-27Sony Interactive Entertainment America LlcSystems and methods for using reduced hops to generate an augmented virtual reality scene within a head mounted system
GB201310376D0 (en)*2013-06-112013-07-24Sony Comp Entertainment EuropeHead-mountable apparatus and systems
GB201310364D0 (en)*2013-06-112013-07-24Sony Comp Entertainment EuropeHead-mountable apparatus and systems
US10175483B2 (en)*2013-06-182019-01-08Microsoft Technology Licensing, LlcHybrid world/body locked HUD on an HMD
US20140375540A1 (en)*2013-06-242014-12-25Nathan AckermanSystem for optimal eye fit of headset display device
US10228242B2 (en)*2013-07-122019-03-12Magic Leap, Inc.Method and system for determining user input based on gesture
KR20150026336A (en)2013-09-022015-03-11엘지전자 주식회사Wearable display device and method of outputting content thereof
KR20150032019A (en)*2013-09-172015-03-25한국전자통신연구원Method and apparatus for providing user interface by using eye tracking
WO2015041642A1 (en)*2013-09-182015-03-26Intel CorporationA method, apparatus, and system for displaying a graphical user interface
KR102065417B1 (en)*2013-09-232020-02-11엘지전자 주식회사Wearable mobile terminal and method for controlling the same
KR20150037254A (en)*2013-09-302015-04-08엘지전자 주식회사Wearable display device and method of controlling layer
US20150169048A1 (en)*2013-12-182015-06-18Lenovo (Singapore) Pte. Ltd.Systems and methods to present information on device based on eye tracking
CN104731467B (en)*2013-12-192018-02-27联想(北京)有限公司A kind of information processing method and a kind of electronic equipment
US9633252B2 (en)2013-12-202017-04-25Lenovo (Singapore) Pte. Ltd.Real-time detection of user intention based on kinematics analysis of movement-oriented biometric data
US10180716B2 (en)2013-12-202019-01-15Lenovo (Singapore) Pte LtdProviding last known browsing location cue using movement-oriented biometric data
US9639152B2 (en)*2013-12-302017-05-02Lenovo (Singapore) Pte. Ltd.Display alignment based on eye tracking
US9524580B2 (en)*2014-01-062016-12-20Oculus Vr, LlcCalibration of virtual reality systems
US9299194B2 (en)2014-02-142016-03-29Osterhout Group, Inc.Secure sharing in head worn computing
US9746686B2 (en)2014-05-192017-08-29Osterhout Group, Inc.Content position calibration in head worn computing
US9810906B2 (en)2014-06-172017-11-07Osterhout Group, Inc.External user interface for head worn computing
US11103122B2 (en)2014-07-152021-08-31Mentor Acquisition One, LlcContent presentation in head worn computing
US10254856B2 (en)2014-01-172019-04-09Osterhout Group, Inc.External user interface for head worn computing
US20150277118A1 (en)2014-03-282015-10-01Osterhout Group, Inc.Sensor dependent content position in head worn computing
US9829707B2 (en)2014-08-122017-11-28Osterhout Group, Inc.Measuring content brightness in head worn computing
US9529195B2 (en)2014-01-212016-12-27Osterhout Group, Inc.See-through computer display systems
US9671613B2 (en)2014-09-262017-06-06Osterhout Group, Inc.See-through computer display systems
US10684687B2 (en)2014-12-032020-06-16Mentor Acquisition One, LlcSee-through computer display systems
US10649220B2 (en)2014-06-092020-05-12Mentor Acquisition One, LlcContent presentation in head worn computing
US9575321B2 (en)2014-06-092017-02-21Osterhout Group, Inc.Content presentation in head worn computing
US11227294B2 (en)2014-04-032022-01-18Mentor Acquisition One, LlcSight information collection in head worn computing
US20150228119A1 (en)2014-02-112015-08-13Osterhout Group, Inc.Spatial location presentation in head worn computing
US10191279B2 (en)2014-03-172019-01-29Osterhout Group, Inc.Eye imaging in head worn computing
US9841599B2 (en)2014-06-052017-12-12Osterhout Group, Inc.Optical configurations for head-worn see-through displays
US20160019715A1 (en)2014-07-152016-01-21Osterhout Group, Inc.Content presentation in head worn computing
US9939934B2 (en)2014-01-172018-04-10Osterhout Group, Inc.External user interface for head worn computing
US9594246B2 (en)2014-01-212017-03-14Osterhout Group, Inc.See-through computer display systems
US12093453B2 (en)2014-01-212024-09-17Mentor Acquisition One, LlcEye glint imaging in see-through computer display systems
US9740280B2 (en)2014-01-212017-08-22Osterhout Group, Inc.Eye imaging in head worn computing
US9753288B2 (en)2014-01-212017-09-05Osterhout Group, Inc.See-through computer display systems
US9615742B2 (en)2014-01-212017-04-11Osterhout Group, Inc.Eye imaging in head worn computing
US12105281B2 (en)2014-01-212024-10-01Mentor Acquisition One, LlcSee-through computer display systems
US9651784B2 (en)2014-01-212017-05-16Osterhout Group, Inc.See-through computer display systems
US9494800B2 (en)2014-01-212016-11-15Osterhout Group, Inc.See-through computer display systems
US20150205135A1 (en)2014-01-212015-07-23Osterhout Group, Inc.See-through computer display systems
US9836122B2 (en)2014-01-212017-12-05Osterhout Group, Inc.Eye glint imaging in see-through computer display systems
US11737666B2 (en)2014-01-212023-08-29Mentor Acquisition One, LlcEye imaging in head worn computing
US9811152B2 (en)2014-01-212017-11-07Osterhout Group, Inc.Eye imaging in head worn computing
US11892644B2 (en)2014-01-212024-02-06Mentor Acquisition One, LlcSee-through computer display systems
US11487110B2 (en)2014-01-212022-11-01Mentor Acquisition One, LlcEye imaging in head worn computing
US9766463B2 (en)2014-01-212017-09-19Osterhout Group, Inc.See-through computer display systems
US9651788B2 (en)2014-01-212017-05-16Osterhout Group, Inc.See-through computer display systems
US11669163B2 (en)2014-01-212023-06-06Mentor Acquisition One, LlcEye glint imaging in see-through computer display systems
US9311718B2 (en)2014-01-232016-04-12Microsoft Technology Licensing, LlcAutomated content scrolling
US20160018651A1 (en)2014-01-242016-01-21Osterhout Group, Inc.See-through computer display systems
US9846308B2 (en)2014-01-242017-12-19Osterhout Group, Inc.Haptic systems for head-worn computers
US9401540B2 (en)2014-02-112016-07-26Osterhout Group, Inc.Spatial location presentation in head worn computing
US12112089B2 (en)2014-02-112024-10-08Mentor Acquisition One, LlcSpatial location presentation in head worn computing
KR102182161B1 (en)2014-02-202020-11-24엘지전자 주식회사Head mounted display and method for controlling the same
US20160187651A1 (en)2014-03-282016-06-30Osterhout Group, Inc.Safety for a vehicle operator with an hmd
US20150302422A1 (en)*2014-04-162015-10-222020 Ip LlcSystems and methods for multi-user behavioral research
US9614724B2 (en)2014-04-212017-04-04Microsoft Technology Licensing, LlcSession-based device configuration
US9672210B2 (en)2014-04-252017-06-06Osterhout Group, Inc.Language translation with head-worn computing
US10853589B2 (en)2014-04-252020-12-01Mentor Acquisition One, LlcLanguage translation with head-worn computing
US9651787B2 (en)2014-04-252017-05-16Osterhout Group, Inc.Speaker assembly for headworn computer
US10424103B2 (en)*2014-04-292019-09-24Microsoft Technology Licensing, LlcDisplay device viewer gaze attraction
US20160137312A1 (en)2014-05-062016-05-19Osterhout Group, Inc.Unmanned aerial vehicle launch system
US9430667B2 (en)2014-05-122016-08-30Microsoft Technology Licensing, LlcManaged wireless distribution network
US9384335B2 (en)2014-05-122016-07-05Microsoft Technology Licensing, LlcContent delivery prioritization in managed wireless distribution networks
US10111099B2 (en)2014-05-122018-10-23Microsoft Technology Licensing, LlcDistributing content in managed wireless distribution networks
US9384334B2 (en)2014-05-122016-07-05Microsoft Technology Licensing, LlcContent discovery in managed wireless distribution networks
US9874914B2 (en)2014-05-192018-01-23Microsoft Technology Licensing, LlcPower management contracts for accessory devices
US10037202B2 (en)2014-06-032018-07-31Microsoft Technology Licensing, LlcTechniques to isolating a portion of an online computing service
US10663740B2 (en)2014-06-092020-05-26Mentor Acquisition One, LlcContent presentation in head worn computing
US9367490B2 (en)2014-06-132016-06-14Microsoft Technology Licensing, LlcReversible connector for accessory devices
US10074003B2 (en)*2014-07-112018-09-11Intel CorporationDynamic control for data capture
US10204658B2 (en)2014-07-142019-02-12Sony Interactive Entertainment Inc.System and method for use in playing back panorama video content
US10451875B2 (en)2014-07-252019-10-22Microsoft Technology Licensing, LlcSmart transparency for virtual objects
US9766460B2 (en)2014-07-252017-09-19Microsoft Technology Licensing, LlcGround plane adjustment in a virtual reality environment
US9858720B2 (en)2014-07-252018-01-02Microsoft Technology Licensing, LlcThree-dimensional mixed-reality viewport
US9904055B2 (en)2014-07-252018-02-27Microsoft Technology Licensing, LlcSmart placement of virtual objects to stay in the field of view of a head mounted display
US10311638B2 (en)2014-07-252019-06-04Microsoft Technology Licensing, LlcAnti-trip when immersed in a virtual reality environment
US9865089B2 (en)2014-07-252018-01-09Microsoft Technology Licensing, LlcVirtual reality environment with real world objects
US10416760B2 (en)2014-07-252019-09-17Microsoft Technology Licensing, LlcGaze-based object placement within a virtual reality environment
CA2956795C (en)2014-08-032020-06-30PogoTec, Inc.Wearable camera systems and apparatus and method for attaching camera systems or other electronic devices to wearable articles
US9779633B2 (en)2014-08-082017-10-03Greg Van CurenVirtual reality system enabling compatibility of sense of immersion in virtual space and movement in real space, and battle training system using same
US9599821B2 (en)*2014-08-082017-03-21Greg Van CurenVirtual reality system allowing immersion in virtual space to consist with actual movement in actual space
US9489739B2 (en)*2014-08-132016-11-08Empire Technology Development LlcScene analysis for improved eye tracking
EP3182881B1 (en)*2014-08-202023-11-29California Baptist UniversitySystems for monitoring eye health
US10261579B2 (en)2014-09-012019-04-16Samsung Electronics Co., Ltd.Head-mounted display apparatus
US9798383B2 (en)*2014-09-192017-10-24Intel CorporationFacilitating dynamic eye torsion-based eye tracking on computing devices
KR102194787B1 (en)*2014-09-242020-12-24삼성전자주식회사Apparatus and method for user based sensor data acquiring
CN105527825B (en)*2014-09-282018-02-27联想(北京)有限公司Electronic equipment and display methods
US9984505B2 (en)*2014-09-302018-05-29Sony Interactive Entertainment Inc.Display of text information on a head-mounted display
CN104391567B (en)*2014-09-302017-10-31深圳市魔眼科技有限公司A kind of 3D hologram dummy object display control method based on tracing of human eye
US10943395B1 (en)2014-10-032021-03-09Virtex Apps, LlcDynamic integration of a virtual environment with a physical environment
IL235073A (en)*2014-10-072016-02-29Elbit Systems LtdHead-mounted displaying of magnified images locked on an object of interest
US20160109943A1 (en)*2014-10-212016-04-21Honeywell International Inc.System and method for controlling visibility of a proximity display
US11144119B2 (en)2015-05-012021-10-12Irisvision, Inc.Methods and systems for generating a magnification region in output video images
US11546527B2 (en)2018-07-052023-01-03Irisvision, Inc.Methods and apparatuses for compensating for retinitis pigmentosa
CA2971163A1 (en)2014-11-102016-05-19Visionize Corp.Methods and apparatus for vision enhancement
US11372479B2 (en)2014-11-102022-06-28Irisvision, Inc.Multi-modal vision enhancement system
WO2016075782A1 (en)*2014-11-122016-05-19富士通株式会社Wearable device, display control method, and display control program
US9535497B2 (en)2014-11-202017-01-03Lenovo (Singapore) Pte. Ltd.Presentation of data on an at least partially transparent display based on user focus
CN104717483B (en)*2014-12-022017-02-01上海理鑫光学科技有限公司Virtual reality home decoration experience system
US9684172B2 (en)2014-12-032017-06-20Osterhout Group, Inc.Head worn computer display systems
US9997199B2 (en)2014-12-052018-06-12Warner Bros. Entertainment Inc.Immersive virtual reality production and playback for storytelling content
CN104581126A (en)*2014-12-162015-04-29青岛歌尔声学科技有限公司Image display processing method and processing device for head-mounted display device
US20160180798A1 (en)*2014-12-222016-06-23Elwha LlcSystems, methods, and devices for controlling content update rates
US20160180762A1 (en)*2014-12-222016-06-23Elwha LlcSystems, methods, and devices for controlling screen refresh rates
CA2972064A1 (en)2014-12-232016-06-30PogoTec, Inc.Wireless camera system and methods
USD751552S1 (en)2014-12-312016-03-15Osterhout Group, Inc.Computer glasses
USD753114S1 (en)2015-01-052016-04-05Osterhout Group, Inc.Air mouse
CN104570366A (en)*2015-01-162015-04-29中国科学院上海光学精密机械研究所Holographic helmet display with gesture recognition function
US10740971B2 (en)*2015-01-202020-08-11Microsoft Technology Licensing, LlcAugmented reality field of view object follower
US9846968B2 (en)2015-01-202017-12-19Microsoft Technology Licensing, LlcHolographic bird's eye view camera
KR101685105B1 (en)*2015-01-272016-12-20네이버 주식회사Cartoon displaying method and cartoon displaying device
WO2016126672A1 (en)*2015-02-022016-08-11Brian MullinsHead mounted display calibration
US20160239985A1 (en)2015-02-172016-08-18Osterhout Group, Inc.See-through computer display systems
US10878775B2 (en)2015-02-172020-12-29Mentor Acquisition One, LlcSee-through computer display systems
CN107249497B (en)*2015-02-202021-03-16柯惠Lp公司 Operating Room and Surgical Site Awareness
CN104777912B (en)*2015-04-292018-02-09北京奇艺世纪科技有限公司A kind of barrage methods of exhibiting, apparatus and system
US10254544B1 (en)*2015-05-132019-04-09Rockwell Collins, Inc.Head tracking accuracy and reducing latency in dynamic environments
IL295437B2 (en)2015-05-192024-11-01Magic Leap Inc Dual integrated light field device
US10078219B2 (en)*2015-05-282018-09-18Thalmic Labs Inc.Wearable heads-up display with integrated eye tracker and different optical power holograms
US10481417B2 (en)2015-06-102019-11-19PogoTec, Inc.Magnetic attachment mechanism for electronic wearable device
WO2016199731A1 (en)*2015-06-102016-12-15株式会社ソニー・インタラクティブエンタテインメントHead-mounted display, display control method, and program
EP3308216B1 (en)2015-06-102021-04-21Pogotec, Inc.Eyewear with magnetic track for electronic wearable device
EP3923229A1 (en)2015-06-242021-12-15Magic Leap, Inc.Augmented reality devices, systems and methods for purchasing
WO2017003719A2 (en)2015-06-302017-01-053M Innovative Properties CompanyIlluminator
US10635189B2 (en)2015-07-062020-04-28RideOn Ltd.Head mounted display curser maneuvering
US11003246B2 (en)2015-07-222021-05-11Mentor Acquisition One, LlcExternal user interface for head worn computing
US10139966B2 (en)2015-07-222018-11-27Osterhout Group, Inc.External user interface for head worn computing
US9870049B2 (en)*2015-07-312018-01-16Google LlcReflective lenses to auto-calibrate a wearable system
US10127725B2 (en)2015-09-022018-11-13Microsoft Technology Licensing, LlcAugmented-reality imaging
US10757399B2 (en)2015-09-102020-08-25Google LlcStereo rendering system
CN108141559B (en)*2015-09-182020-11-06Fove股份有限公司 Image system, image generation method, and computer-readable medium
US10962780B2 (en)*2015-10-262021-03-30Microsoft Technology Licensing, LlcRemote rendering for virtual images
US10466780B1 (en)*2015-10-262019-11-05PillantasSystems and methods for eye tracking calibration, eye vergence gestures for interface control, and visual aids therefor
US10338677B2 (en)2015-10-282019-07-02Microsoft Technology Licensing, LlcAdjusting image frames based on tracking motion of eyes
TW201729610A (en)2015-10-292017-08-16帕戈技術股份有限公司Hearing aid adapted for wireless power reception
JP2019506017A (en)*2015-11-062019-02-28フェイスブック・テクノロジーズ・リミテッド・ライアビリティ・カンパニーFacebook Technologies, Llc Eye tracking using optical flow
US9805512B1 (en)2015-11-132017-10-31Oculus Vr, LlcStereo-based calibration apparatus
WO2017091477A1 (en)*2015-11-252017-06-01Google Inc.Prism-based eye tracking
CN105869214A (en)*2015-11-262016-08-17乐视致新电子科技(天津)有限公司Virtual reality device based view frustum cutting method and apparatus
CN105511618A (en)*2015-12-082016-04-20北京小鸟看看科技有限公司3D input device, head-mounted device and 3D input method
CN105301778A (en)*2015-12-082016-02-03北京小鸟看看科技有限公司Three-dimensional control device, head-mounted device and three-dimensional control method
US10229324B2 (en)2015-12-242019-03-12Intel CorporationVideo summarization using semantic information
IL243400A (en)2015-12-292017-06-29Elbit Systems LtdHead mounted display symbology concepts and implementations, associated with a reference vector
BR112018016726B1 (en)*2016-02-182023-03-14Apple Inc IMAGE PROCESSING METHOD FOR MIXED REALITY AND HEAD WEAR DEVICE
US10591728B2 (en)2016-03-022020-03-17Mentor Acquisition One, LlcOptical systems for head-worn computers
US10850116B2 (en)2016-12-302020-12-01Mentor Acquisition One, LlcHead-worn therapy device
US10667981B2 (en)2016-02-292020-06-02Mentor Acquisition One, LlcReading assistance system for visually impaired
US9880441B1 (en)2016-09-082018-01-30Osterhout Group, Inc.Electrochromic systems for head-worn computer systems
US9826299B1 (en)2016-08-222017-11-21Osterhout Group, Inc.Speaker systems for head-worn computer systems
US11558538B2 (en)2016-03-182023-01-17Opkix, Inc.Portable camera system
CN107229122B (en)*2016-03-232019-08-16宏达国际电子股份有限公司Head-mounted display device
US20170277259A1 (en)*2016-03-242017-09-28Daqri, LlcEye tracking via transparent near eye lens
US9910284B1 (en)2016-09-082018-03-06Osterhout Group, Inc.Optical systems for head-worn computers
US10684478B2 (en)2016-05-092020-06-16Mentor Acquisition One, LlcUser interface systems for head-worn computers
US10466491B2 (en)2016-06-012019-11-05Mentor Acquisition One, LlcModular systems for head-worn computers
WO2017223042A1 (en)*2016-06-202017-12-28PogoTec, Inc.Image alignment systems and methods
US10444509B2 (en)*2016-06-272019-10-15Daqri, LlcNear eye diffractive holographic projection method
CN106131539A (en)*2016-06-302016-11-16乐视控股(北京)有限公司A kind of Virtual Reality equipment and video broadcasting method thereof
US20180007422A1 (en)2016-06-302018-01-04Sony Interactive Entertainment Inc.Apparatus and method for providing and displaying content
CN106265006B (en)*2016-07-292019-05-17维沃移动通信有限公司A kind of control method and mobile terminal of the apparatus for correcting of dominant eye
KR102558473B1 (en)2016-09-302023-07-25삼성전자주식회사Method for displaying an image and an electronic device thereof
EP3539285A4 (en)2016-11-082020-09-02Pogotec, Inc.A smart case for electronic wearable device
EP3322186A1 (en)*2016-11-142018-05-16Thomson LicensingMethod and device for transmitting data representative of an image
US10122990B2 (en)*2016-12-012018-11-06Varjo Technologies OyImaging system and method of producing context and focus images
EP3553692A1 (en)*2016-12-092019-10-16Shenzhen Royole Technologies Co., Ltd.Adjustment method and adjustment system for user interface, and head-mounted display device
CN108604015B (en)*2016-12-262020-07-14华为技术有限公司 Image display method and head mounted display device
CN110431467A (en)2017-01-282019-11-08鲁姆斯有限公司Augmented reality imaging system
TWI633336B (en)*2017-02-242018-08-21宏碁股份有限公司Helmet mounted display, visual field calibration method thereof, and mixed reality display system
AU2018243565B2 (en)2017-03-302023-03-16Magic Leap, Inc.Non-blocking dual driver earphones
US10977858B2 (en)2017-03-302021-04-13Magic Leap, Inc.Centralized rendering
IL269861B2 (en)2017-04-142023-11-01Magic Leap Inc Multimodal eye tracking
US10319266B1 (en)*2017-04-242019-06-11Facebook Technologies, LlcDisplay panel with non-visible light detection
CN110574099B (en)*2017-05-012022-07-12安波福技术有限公司Head tracking based field sequential saccadic separation reduction
US11079522B1 (en)2017-05-312021-08-03Magic Leap, Inc.Fiducial design
US10620710B2 (en)*2017-06-152020-04-14Microsoft Technology Licensing, LlcDisplacement oriented interaction in computer-mediated reality
EP3422147A1 (en)*2017-06-282019-01-02Koninklijke Philips N.V.Display apparatus for computer-mediated reality
EP4215980A1 (en)2017-07-192023-07-26Lumus Ltd.Lcos illumination via loe
US10422995B2 (en)2017-07-242019-09-24Mentor Acquisition One, LlcSee-through computer display systems with stray light management
US11409105B2 (en)2017-07-242022-08-09Mentor Acquisition One, LlcSee-through computer display systems
US10578869B2 (en)2017-07-242020-03-03Mentor Acquisition One, LlcSee-through computer display systems with adjustable zoom cameras
GB2566924B (en)*2017-07-272022-08-03Mo Sys Engineering LtdPositioning system
US10969584B2 (en)2017-08-042021-04-06Mentor Acquisition One, LlcImage expansion optic for head-worn computer
US10394034B2 (en)*2017-08-152019-08-27Microsoft Technology Licensing, LlcEye-tracking with MEMS scanning and optical relay
CN109426342B (en)*2017-08-292022-04-01深圳市掌网科技股份有限公司Document reading method and device based on augmented reality
US11157073B2 (en)*2017-10-042021-10-26Tectus CorporationGaze calibration for eye-mounted displays
IL307592A (en)2017-10-172023-12-01Magic Leap Inc Spatial audio for mixed reality
CN111213184B (en)*2017-11-302024-04-09惠普发展公司,有限责任合伙企业Virtual dashboard implementation based on augmented reality
US10506220B2 (en)2018-01-022019-12-10Lumus Ltd.Augmented reality displays with active alignment and corresponding methods
US20190212482A1 (en)*2018-01-102019-07-11Oculus Vr, LlcAngle selective filter for near eye displays
IL275824B2 (en)2018-01-172024-08-01Magic Leap Inc Display systems and methods for determining registration between a display and a user's eyes
AU2019209930B2 (en)2018-01-172023-08-03Magic Leap, Inc.Eye center of rotation determination, depth plane selection, and render camera positioning in display systems
JP2019133504A (en)*2018-02-012019-08-08トヨタ自動車株式会社Vehicle dispatch service cooperation search support system
CN112534467A (en)2018-02-132021-03-19弗兰克.沃布林Method and apparatus for contrast sensitivity compensation
CN111713090B (en)2018-02-152023-02-17奇跃公司Mixed reality musical instrument
WO2019161314A1 (en)2018-02-152019-08-22Magic Leap, Inc.Dual listener positions for mixed reality
IL305799B2 (en)2018-02-152024-10-01Magic Leap Inc Virtual reverberation in mixed reality
JP6582205B2 (en)*2018-02-282019-10-02株式会社コナミデジタルエンタテインメント Information processing apparatus, information processing apparatus program, head mounted display, and information processing system
KR20190118040A (en)*2018-04-092019-10-17삼성전자주식회사Wearable display apparatus and method of displaying 3-dimensional images thereof
CN112602005A (en)2018-04-242021-04-02曼特收购第一有限责任公司See-through computer display system with vision correction and increased content density
US10642049B2 (en)2018-04-252020-05-05Apple Inc.Head-mounted device with active optical foveation
CN108592865A (en)*2018-04-282018-09-28京东方科技集团股份有限公司Geometric measurement method and its device, AR equipment based on AR equipment
IL259518B2 (en)2018-05-222023-04-01Lumus LtdOptical system and method for improvement of light field uniformity
US10779082B2 (en)2018-05-302020-09-15Magic Leap, Inc.Index scheming for filter parameters
US10860113B2 (en)*2018-05-302020-12-08Atheer, Inc.Augmented reality head gesture recognition systems
CN110547759B (en)*2018-05-312024-08-16托比股份公司Robust convergence signal
US10509467B1 (en)*2018-06-012019-12-17Facebook Technologies, LlcDetermining fixation of a user's eyes from images of portions of the user's face enclosed by a head mounted display
EP3803541B1 (en)2018-06-112024-12-25Brainlab AGVisualization of medical data depending on viewing-characteristics
US10667072B2 (en)2018-06-122020-05-26Magic Leap, Inc.Efficient rendering of virtual soundfields
WO2019241760A1 (en)2018-06-142019-12-19Magic Leap, Inc.Methods and systems for audio signal filtering
JP7478100B2 (en)2018-06-142024-05-02マジック リープ, インコーポレイテッド Reverberation Gain Normalization
WO2019246164A1 (en)2018-06-182019-12-26Magic Leap, Inc.Spatial audio for interactive audio environments
WO2019246562A1 (en)2018-06-212019-12-26Magic Leap, Inc.Wearable system speech processing
IL279705B2 (en)2018-06-272025-04-01Sentiar IncGaze based interface for augmented reality environment
JP7499749B2 (en)2018-07-242024-06-14マジック リープ, インコーポレイテッド Application Sharing
US11567336B2 (en)2018-07-242023-01-31Magic Leap, Inc.Display systems and methods for determining registration between display and eyes of user
US11347056B2 (en)*2018-08-222022-05-31Microsoft Technology Licensing, LlcFoveated color correction to improve color uniformity of head-mounted displays
CN110603513B (en)*2018-08-272023-08-29深圳市汇顶科技股份有限公司Eye tracking device and method for tracking eyes by utilizing optical imaging
WO2020042843A1 (en)2018-08-272020-03-05Shenzhen GOODIX Technology Co., Ltd.Eye tracking based on imaging eye features and assistance of structured illumination probe light
JP7316360B2 (en)2018-09-252023-07-27マジック リープ, インコーポレイテッド Systems and methods for augmented reality
EP3861763A4 (en)2018-10-052021-12-01Magic Leap, Inc. Highlighting audio spatialization
CN113170272B (en)2018-10-052023-04-04奇跃公司Near-field audio rendering
US11315325B2 (en)2018-10-092022-04-26Magic Leap, Inc.Systems and methods for artificial intelligence-based virtual and augmented reality
CN113227935B (en)2018-10-242024-09-13奇跃公司Asynchronous ASIC
CN109507686B (en)*2018-11-082021-03-30歌尔光学科技有限公司Control method, head-mounted display device, electronic device and storage medium
US11300857B2 (en)2018-11-132022-04-12Opkix, Inc.Wearable mounts for portable camera
WO2020140078A1 (en)2018-12-272020-07-02Magic Leap, Inc.Systems and methods for virtual and augmented reality
US11200656B2 (en)*2019-01-112021-12-14Universal City Studios LlcDrop detection systems and methods
CN109901709B (en)*2019-01-142022-06-14北京七鑫易维信息技术有限公司Method and device for adjusting display picture and VR equipment
CN111506188A (en)*2019-01-302020-08-07托比股份公司Method and HMD for dynamically adjusting HUD
US11587563B2 (en)2019-03-012023-02-21Magic Leap, Inc.Determining input for speech processing engine
TWI800657B (en)2019-03-122023-05-01以色列商魯姆斯有限公司Image projector
WO2020185219A1 (en)2019-03-132020-09-17Hewlett-Packard Development Company, L.P.Detecting eye tracking calibration errors
US12245097B2 (en)2019-03-252025-03-04Magic Leap, Inc.Systems and methods for virtual and augmented reality
CN216434536U (en)2019-04-042022-05-03鲁姆斯有限公司Near-eye display
EP4510125A1 (en)2019-04-192025-02-19Magic Leap, Inc.Identifying input for speech recognition engine
JP7369212B2 (en)2019-06-062023-10-25マジック リープ, インコーポレイテッド Photorealistic character construction for spatial computing
CN114041101A (en)*2019-07-112022-02-11惠普发展公司,有限责任合伙企业 Eye Tracking for Displays
US11328740B2 (en)2019-08-072022-05-10Magic Leap, Inc.Voice onset detection
US11704874B2 (en)2019-08-072023-07-18Magic Leap, Inc.Spatial instructions and guides in mixed reality
WO2021077024A1 (en)2019-10-182021-04-22Magic Leap, Inc.Gravity estimation and bundle adjustment for visual-inertial odometry
CN114586382B (en)2019-10-252025-09-23奇跃公司 A method, system and medium for determining and processing audio information
US11488365B2 (en)2019-10-252022-11-01Magic Leap, Inc.Non-uniform stereo rendering
KR102593103B1 (en)*2019-10-282023-10-24주식회사 케이티Apparatus, method and computer program for aligning of camera
US11959997B2 (en)2019-11-222024-04-16Magic Leap, Inc.System and method for tracking a wearable device
CN115698847A (en)2019-12-042023-02-03奇跃公司Variable pitch color emissive display
WO2021113781A1 (en)2019-12-062021-06-10Magic Leap, Inc.Environment acoustics persistence
JP7676400B2 (en)2019-12-092025-05-14マジック リープ, インコーポレイテッド SYSTEM AND METHOD FOR OPERATING A HEAD MOUNTED DISPLAY SYSTEM BASED ON USER IDENTIFICATION - Patent application
US11337023B2 (en)2019-12-202022-05-17Magic Leap, Inc.Physics-based audio and haptic synthesis
EP3851939A1 (en)*2020-01-142021-07-21Apple Inc.Positioning a user-controlled spatial selector based on extremity tracking information and eye tracking information
JP2021120693A (en)*2020-01-302021-08-19シチズンファインデバイス株式会社 Reflective liquid crystal display device
WO2021158159A1 (en)*2020-02-032021-08-12BAE Systems Hägglunds AktiebolagEmbedded target tracking training
WO2021163224A1 (en)2020-02-102021-08-19Magic Leap, Inc.Dynamic colocation of virtual content
JP2023514573A (en)2020-02-142023-04-06マジック リープ, インコーポレイテッド tool bridge
CN115698818B (en)2020-02-142024-01-23奇跃公司Session manager
US11778410B2 (en)2020-02-142023-10-03Magic Leap, Inc.Delayed audio following
EP4104456A4 (en)2020-02-142023-07-19Magic Leap, Inc.Multi-application audio rendering
CN115398316B (en)2020-02-142025-08-26奇跃公司 3D object annotation
CN116325808B (en)2020-03-022023-12-22奇跃公司Immersive audio platform
US11917384B2 (en)2020-03-272024-02-27Magic Leap, Inc.Method of waking a device using spoken voice commands
US11636843B2 (en)2020-05-292023-04-25Magic Leap, Inc.Surface appropriate collisions
WO2021243103A1 (en)2020-05-292021-12-02Magic Leap, Inc.Determining angular acceleration
WO2022072752A1 (en)2020-09-302022-04-07Magic Leap, Inc.Voice user interface using non-linguistic input
US12306413B2 (en)2021-03-122025-05-20Magic Leap , Inc.Athermalization concepts for polymer eyepieces used in augmented reality or mixed reality devices
CN113093502A (en)*2021-04-302021-07-09荆门市探梦科技有限公司Wearing tracking type geometric holographic display system
CN114332420A (en)*2021-12-282022-04-12歌尔光学科技有限公司Display method and device of AR glasses, AR glasses and storage medium
US11806078B1 (en)2022-05-012023-11-07Globe Biomedical, Inc.Tear meniscus detection and evaluation system
USD1058631S1 (en)2023-04-072025-01-21Globe Biomedical, Inc.Smart spectacles
US12186019B2 (en)2023-04-072025-01-07Globe Biomedical, IncMechanical integration of components of wearable devices and ocular health monitoring system
USD1057791S1 (en)2023-04-072025-01-14Globe Biomedical, Inc.Smart spectacles
CN116719165A (en)*2023-06-082023-09-08业桓科技(成都)有限公司 Head-mounted display and near-eye display method

Citations (10)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
US6307526B1 (en) | 1998-02-02 | 2001-10-23 | W. Steve G. Mann | Wearable camera system with viewfinder means
US20020180799A1 (en) | 2001-05-29 | 2002-12-05 | Peck Charles C. | Eye gaze control of dynamic information presentation
US20030020755A1 (en) | 1997-04-30 | 2003-01-30 | Lemelson Jerome H. | System and methods for controlling automatic scrolling of information on a display or screen
US20030098954A1 (en) | 2001-04-27 | 2003-05-29 | International Business Machines Corporation | Calibration-free eye gaze tracking
KR20040027764A (en) | 2004-03-03 | 2004-04-01 | 학교법인 한국정보통신학원 | A method for manipulating a terminal using user's glint, and an apparatus
US20040174496A1 (en) | 2003-03-06 | 2004-09-09 | Qiang Ji | Calibration-free gaze tracking under natural head movement
US20060082542A1 (en) | 2004-10-01 | 2006-04-20 | Morita Mark M | Method and apparatus for surgical operating room information display gaze detection and user prioritization for control
US20060110008A1 (en)* | 2003-11-14 | 2006-05-25 | Roel Vertegaal | Method and apparatus for calibration-free eye tracking
KR20100006652A (en) | 2008-07-10 | 2010-01-21 | 성균관대학교산학협력단 | Full browsing method using gaze detection and handheld terminal performing the method
US20130128364A1 (en)* | 2011-11-22 | 2013-05-23 | Google Inc. | Method of Using Eye-Tracking to Center Image Content in a Display

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
US5742264A (en)* | 1995-01-24 | 1998-04-21 | Matsushita Electric Industrial Co., Ltd. | Head-mounted display
US7542210B2 (en)* | 2006-06-29 | 2009-06-02 | Chirieleison Sr Anthony | Eye tracking head mounted display
CN101943982B (en)* | 2009-07-10 | 2012-12-12 | 北京大学 | Method for manipulating image based on tracked eye movements

Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
US20030020755A1 (en) | 1997-04-30 | 2003-01-30 | Lemelson Jerome H. | System and methods for controlling automatic scrolling of information on a display or screen
US6307526B1 (en) | 1998-02-02 | 2001-10-23 | W. Steve G. Mann | Wearable camera system with viewfinder means
US20030098954A1 (en) | 2001-04-27 | 2003-05-29 | International Business Machines Corporation | Calibration-free eye gaze tracking
US20020180799A1 (en) | 2001-05-29 | 2002-12-05 | Peck Charles C. | Eye gaze control of dynamic information presentation
US20040174496A1 (en) | 2003-03-06 | 2004-09-09 | Qiang Ji | Calibration-free gaze tracking under natural head movement
US20060110008A1 (en)* | 2003-11-14 | 2006-05-25 | Roel Vertegaal | Method and apparatus for calibration-free eye tracking
KR20040027764A (en) | 2004-03-03 | 2004-04-01 | 학교법인 한국정보통신학원 | A method for manipulating a terminal using user's glint, and an apparatus
US20060082542A1 (en) | 2004-10-01 | 2006-04-20 | Morita Mark M | Method and apparatus for surgical operating room information display gaze detection and user prioritization for control
KR20100006652A (en) | 2008-07-10 | 2010-01-21 | 성균관대학교산학협력단 | Full browsing method using gaze detection and handheld terminal performing the method
US20130128364A1 (en)* | 2011-11-22 | 2013-05-23 | Google Inc. | Method of Using Eye-Tracking to Center Image Content in a Display
US8611015B2 (en) | 2011-11-22 | 2013-12-17 | Google Inc. | User interface

Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
Arrington Research, "ViewPoint EyeTracker, Software User Guide," May 28, 2009, Scottsdale, AZ, pp. 1-234.
Borah, Joshua, "Technology and Application of Gaze Based Control," RTO Lecture Series on Alternative Control Techniques: Human Factors Issues, Bretigny, France, Oct. 7, 1998, pp. 3-1 to 3-10.
Gilson et al., "An automated calibration method for non-see-through head mounted displays," Journal of Neuroscience Methods, 2011, vol. 199, Issue 2, pp. 328-335, May 19, 2011.
International Search Report and Written Opinion, International Application No. PCT/US2012/062933 dated May 15, 2013, 12 pages.
Lastra, et al., "Course Notes Programming Virtual Worlds," SIGGRAPH 97, Aug. 8, 1997, Los Angeles, CA, 277 pages.

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
US9619020B2 (en) | 2013-03-01 | 2017-04-11 | Tobii Ab | Delay warp gaze interaction
US10545574B2 (en) | 2013-03-01 | 2020-01-28 | Tobii Ab | Determining gaze target based on facial features
US9864498B2 (en) | 2013-03-13 | 2018-01-09 | Tobii Ab | Automatic scrolling based on gaze detection
US10534526B2 (en) | 2013-03-13 | 2020-01-14 | Tobii Ab | Automatic scrolling based on gaze detection
US10317995B2 (en) | 2013-11-18 | 2019-06-11 | Tobii Ab | Component determination and gaze provoked interaction
US10558262B2 (en) | 2013-11-18 | 2020-02-11 | Tobii Ab | Component determination and gaze provoked interaction
US9952883B2 (en) | 2014-08-05 | 2018-04-24 | Tobii Ab | Dynamic determination of hardware
US10168772B2 (en) | 2015-10-12 | 2019-01-01 | Samsung Electronics Co., Ltd | Head mounted electronic device
EP3941077A4 (en)* | 2020-05-22 | 2022-06-29 | Beijing Baidu Netcom Science and Technology Co., Ltd. | Method and apparatus for controlling video playing, and electronic device and storage medium
US12170822B2 (en) | 2020-05-22 | 2024-12-17 | Beijing Baidu Netcom Science And Technology Co., Ltd. | Method and apparatus for controlling video playing, electronic device and storage medium
US11972046B1 (en)* | 2022-11-03 | 2024-04-30 | Vincent Jiang | Human-machine interaction method and system based on eye movement tracking

Also Published As

Publication number | Publication date
US20130128364A1 (en) | 2013-05-23
EP2783252A2 (en) | 2014-10-01
WO2013077978A3 (en) | 2013-07-18
EP2783252B1 (en) | 2018-07-25
EP2783252A4 (en) | 2014-12-10
CN104067160A (en) | 2014-09-24
WO2013077978A2 (en) | 2013-05-30
US20140055846A1 (en) | 2014-02-27
CN104067160B (en) | 2017-09-12
US8611015B2 (en) | 2013-12-17

Similar Documents

Publication | Publication Date | Title
US8786953B2 (en) | User interface
US8970452B2 (en) | Imaging method
US8767306B1 (en) | Display system
US10055642B2 (en) | Staredown to produce changes in information density and type
US8971570B1 (en) | Dual LED usage for glint detection
US9213185B1 (en) | Display scaling based on movement of a head-mounted display
US8506080B2 (en) | Unlocking a screen using eye tracking information
CN106164744B (en) | Head Mounted Display Relative Motion Compensation
EP3097461B1 (en) | Automated content scrolling
US8955973B2 (en) | Method and system for input detection using structured light projection
US9285872B1 (en) | Using head gesture and eye position to wake a head mounted device
US9552060B2 (en) | Radial selection by vestibulo-ocular reflex fixation
US8982471B1 (en) | HMD image source as dual-purpose projector/near-eye display
US20130088413A1 (en) | Method to Autofocus on Near-Eye Display
US20150084864A1 (en) | Input Method
US20140247286A1 (en) | Active Stabilization for Heads-Up Displays
US9261959B1 (en) | Input detection
US20130241805A1 (en) | Using Convergence Angle to Select Among Different UI Elements
US20150003819A1 (en) | Camera auto-focus based on eye gaze
US20150153572A1 (en) | Adjustment of Location of Superimposed Image
CA2889563A1 (en) | Direct hologram manipulation using imu
CN108604015B (en) | Image display method and head mounted display device
US20250291409A1 (en) | Utilizing blind spot locations to project system images

Legal Events

Date | Code | Title | Description
AS | Assignment

Owner name: GOOGLE INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNORS: WHEELER, AARON JOSEPH; GOMEZ, LUIS RICARDO PRADA; RAFFLE, HAYES SOLOS; REEL/FRAME: 031545/0422

Effective date: 20111122

STCF | Information on status: patent grant

Free format text: PATENTED CASE

AS | Assignment

Owner name: GOOGLE LLC, CALIFORNIA

Free format text: CHANGE OF NAME; ASSIGNOR: GOOGLE INC.; REEL/FRAME: 044277/0001

Effective date: 20170929

MAFP | Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551)

Year of fee payment: 4

MAFP | Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 8

