USRE42471E1 - System and method for monitoring eye movement - Google Patents

System and method for monitoring eye movement

Info

Publication number
USRE42471E1
Authority
US
United States
Prior art keywords
eye
person
emitters
frame
array
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Lifetime
Application number
US12/199,693
Inventor
William C. Torch
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Eyefluence Inc
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US08/699,670 (U.S. Pat. No. 5,748,113)
Priority claimed from US08/978,100 (U.S. Pat. No. 6,246,344)
Priority claimed from US09/104,258 (U.S. Pat. No. 6,163,281)
Priority claimed from US09/740,738 (U.S. Pat. No. 6,542,081)
Application filed by Individual
Priority to US12/199,693 (USRE42471E1)
Application granted
Publication of USRE42471E1
Assigned to EYEFLUENCE, INC. Assignment of assignors interest (see document for details). Assignors: EYE-COM CORPORATION
Assigned to EYE-COM CORPORATION. Assignment of assignors interest (see document for details). Assignors: TORCH, WILLIAM C., DR.
Anticipated expiration
Legal status: Expired - Lifetime (current)

Abstract

Apparatus for monitoring movement of a person's eye, e.g., to monitor drowsiness. The system includes a frame that is worn on a person's head, an array of emitters on the frame for directing light towards the person's eye, and an array of sensors on the frame for detecting light from the array of emitters. The sensors detect light that is reflected off of respective portions of the eye or its eyelid, thereby producing output signals indicating when the respective portions of the eye are covered by the eyelid. The emitters project a reference frame towards the eye, and a camera on the frame monitors movement of the eye relative to the reference frame. This movement may be correlated with the signals from the array of sensors and/or with signals from other sensors on the frame to monitor the person's level of drowsiness.

Description

Notice: More than one reissue application has been filed for the reissue of U.S. Pat. No. 6,163,281. The reissue applications are application Ser. No. 11/097,942, issued as Re 39,539, Ser. No. 11/732,828, which is a continuation reissue of Re 39,539, and Ser. No. 12/199,693 (the present application), which is a divisional reissue of Ser. No. 11/732,828.
This application is a reissue of U.S. Pat. No. 6,542,081 and is a divisional of application Ser. No. 11/732,828, which is a continuation of application Ser. No. 11/097,942, issued as Re 39,539, which are both reissues of application Ser. No. 09/740,738, U.S. Pat. No. 6,542,081, which is a continuation-in-part of application Ser. No. 09/104,258, filed Jun. 24, 1998, issuing as U.S. Pat. No. 6,163,281 on Dec. 19, 2000, which is a continuation-in-part of application Ser. No. 08/978,100, filed Nov. 25, 1997, now U.S. Pat. No. 6,246,344 issued Jun. 12, 2001, which is a continuation-in-part of application Ser. No. 08/699,670, filed Aug. 19, 1996, now U.S. Pat. No. 5,748,113 issued May 5, 1998, the disclosures of which are expressly incorporated herein by reference.
FIELD OF THE INVENTION
The present invention relates generally to systems and methods for monitoring movement of a human eye, and more particularly to systems and methods for monitoring fatigue and other states of mind in real time, for purposeful communication, and/or for controlling devices based upon movement of a person's eye, eyelid, and/or other eye components.
BACKGROUND
There have been attempts to use movement of the human eye to monitor involuntary conditions, specifically a person's wakefulness or drowsiness. For example, U.S. Pat. No. 3,863,243 discloses a device that sounds an alarm to warn a person using the device that they are beginning to fall asleep. The device includes a frame similar to a set of eyeglasses onto which is mounted a fiber optic bundle and a photocell that are directed towards the user's eye when the frame is worn. The fiber optic bundle is coupled to a source of light and a pulse generator to emit light towards the user's eye.
The photocell detects the intensity of light reflected off of the user's eye, i.e., either by the eyelid when the eye is closed or the eye surface when the eye is open. Circuitry receives a signal from the photocell and uses a timer to distinguish between regular blinks and an extended time period during which the eye is closed, i.e., a time period that may indicate that the person is falling asleep. When a threshold time elapses, an alarm is sounded to notify and/or wake the user. This device, however, requires running wires and fiber optic bundles from the frame to external components, e.g., the pulse generator and the required circuitry, and for this reason the device may be awkward or inconvenient to use.
Other devices, such as those disclosed in U.S. Pat. Nos. 5,469,143 and 4,359,724, directly engage the eyelid or eyebrow of a user to detect movement of the eye and activate an alarm when a drowsiness condition is detected. These mechanical devices may be mounted directly onto the skin to detect muscle movement or may involve placing a mechanical arm against the eyelid, and consequently may be uncomfortable to wear and use.
In addition, some devices may detect eye movement, but may not be able to distinguish when the eye is opened or closed. For example, it may be desirable to measure the percentage of total time that the eyelids are closed as a function of time or the area of the palpebral fissure that is covered by the eyelid as the eye is opened or closed, commonly known as “PERCLOS,” for example during medical research or when monitoring driver alertness. Devices that merely detect eye muscle movement or eyelash movement may not be able to distinguish when the eye is open or closed, and consequently may not be able to measure PERCLOS. Similarly, such devices may not measure other parameters, such as velocity of eyelid closing or opening, acceleration or deceleration characteristics, duration of open or closed eye states, intervals between eye blinks and/or partial versus full eye blinks or eye closures.
Further, infrared cameras or other devices may be used to monitor a driver's awareness, which are typically mounted on the dashboard, roof or other fixed mounting within the user's vehicle. Such devices, however, require that the user maintain constant eye contact with the camera. In addition, they do not monitor eyelid movement if the user looks sideways or downwards, turns around, exits the vehicle or compartment in which he or she is being monitored, or if the camera moves relative to the individual. Further, such cameras may have problems seeing through eyeglasses, sunglasses, or even contact lenses, and may not operate effectively in sunlight.
Accordingly, it is believed that a more effective system and method for monitoring eye and/or eyelid movement would be considered useful.
SUMMARY OF THE INVENTION
The present invention is directed to systems and methods for monitoring eye movement. Generally, humans blink at least about 5-30 times per minute, or about 7,000-43,000 times per day. Each involuntary-reflexive blink lasts about 200-300 milliseconds, generally averaging about 250 milliseconds, amounting to about 1,750-10,800 seconds per day of eye closure due to involuntary blinking. As tiredness or sleepiness occurs, the eye blink becomes longer and slower until the eyes begin to close for short-term “microsleeps,” i.e., sleep conditions that last for about 3-5 seconds or longer, or for prolonged sleep. The present invention provides systems and methods for monitoring, measuring, and/or responding to eye movement, e.g., nonpurposeful reflexive eyeblinks.
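For orientation, the daily totals quoted above follow from the per-minute blink rate and the roughly 250 millisecond average blink; a minimal sketch of that arithmetic (the rates are the ranges given in the text, rounded):

```python
# Rough arithmetic behind the figures quoted above (the rates and the 250 ms
# average are the ranges given in the text, not measurements).
MINUTES_PER_DAY = 24 * 60

for blinks_per_minute in (5, 30):
    blinks_per_day = blinks_per_minute * MINUTES_PER_DAY
    closure_s = blinks_per_day * 0.250  # ~250 ms per blink
    print(f"{blinks_per_minute:2d} blinks/min -> {blinks_per_day:,} blinks/day, "
          f"~{closure_s:,.0f} s/day of eye closure")
# 5 blinks/min -> 7,200 blinks/day, ~1,800 s/day
# 30 blinks/min -> 43,200 blinks/day, ~10,800 s/day
```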
In a preferred embodiment, the system includes an emitter and a sensor in a predetermined relationship with an eye such that the emitter emits light and the sensor detects light from the emitter, the sensor producing a light intensity signal indicating when the eye is open or closed. More preferably, the emitter is directed or aimed at the eyelid and eye, while the sensor detects eyelid-reflected light, since, unlike the eyelid, the eye ball (except the retina, which may cause a “red reflex” under white light conditions or “white pupil” under infrared light) does not reflect substantial light back to the sensor. Circuitry is coupled to the sensor for converting sequential light intensity signals corresponding to eyelid movement received from the sensor into a stream of data, and a processor converts the stream of data into an understandable message.
The circuitry for converting sequential light intensity signals may compare the sequential light intensity signals with a predetermined time threshold to detect voluntary-intentional or unintentional-involuntary sequences of eyelid movements, corresponding, for example, to a predetermined binary code. Memory circuitry may be coupled to the processor for storing the stream of data and/or a communication device, such as a video monitor or synthesized voice module, may be coupled to the processor for communicating the understandable message. In addition, a control system may be coupled to the processor, and the understandable message may include a command for controlling equipment, including electrical or electronic equipment, machinery, or a computer or computer accessory devices coupled to the control system.
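One way to picture the threshold comparison described above is as a classifier over measured eyelid-closure durations that yields a binary code; the sketch below is illustrative only, and the 250 millisecond boundary is an assumption rather than a value prescribed by the text:

```python
# Purely illustrative: compare each measured eyelid-closure duration (in
# seconds) against a time threshold and emit a binary code. The 250 ms
# boundary between reflexive and deliberate closures is an assumption.
BLINK_THRESHOLD_S = 0.25

def closures_to_bits(closure_durations_s):
    """Short (reflexive-length) closure -> '0', longer (deliberate) closure -> '1'."""
    return "".join("1" if d > BLINK_THRESHOLD_S else "0" for d in closure_durations_s)

print(closures_to_bits([0.18, 0.22, 0.60, 0.21, 0.75, 0.70]))  # -> 001011
```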
The system preferably also includes a transmitter, preferably a radio frequency transmitter, for wireless transmission of the stream of data to a remote location. Alternatively, other forms of wireless transmission, e.g. infrared, as well as hard-wire connections may be used. The processor, as well as the memory circuitry, communication device, and/or control system, may be located at the remote location, and a receiver may be coupled to the processor for receiving the stream of data from the transmitter.
In a preferred form, the system includes a detection device having a frame adapted to be worn on a person's head, e.g., with the frame resting on the bridge of the user's nose and/or ears. The frame has the emitter and sensor thereon such that the emitter and sensor are oriented towards the person's eye when the frame is worn on the person's head. Preferably, the emitter and sensor are a single solid state device, such as a biosensor device, that emits light within a predetermined frequency range, for example infrared light, towards the eye and detects the emitted light reflected off of the eyelid, respectively.
In another preferred embodiment, a system for monitoring a blinking cycle of a person from a remote location is provided that includes an emitter for directing light towards an eye, and a sensor in a predetermined relationship with the emitter for detecting the emitted light reflected off of the eye, the sensor producing an output signal indicating when the eye is open or closed. Depending upon the relative position of the emitter and sensor with respect to the moving eyelid, the emitter light may be reflected off of the eyelid back to the sensor, or diffused by the surface of the eyeball.
A transmitter is coupled to the sensor for wireless transmission of the output signal, and a processor is provided for comparing the output signal to a predetermined threshold to detect when the eyelid is closed for a minimum predetermined duration. A warning indicator may be coupled to the processor, the warning indicator being activated when the processor detects that the eyelid is closed for the minimum predetermined duration. For example, the warning indicator may be an audible buzzer, a visible warning light, a vibrating device, an electrical shock device, a gustatory smell device, or other device that may act as a stimulus to any sensory modality.
Similar to the previous embodiment, a receiver may be provided at the remote location coupled to the processor for receiving the wireless transmission from the transmitter. Memory circuitry may be provided for storing the output signal and/or a processor may be provided for converting the output signal into an understandable message. A communication device may be coupled to the processor for communicating the understandable message.
In another preferred embodiment, a self-contained device for detecting movement of a person's eyelid is provided that includes a frame adapted to be worn on the person's head, an emitter on the frame for directing light towards an eye of the person when the frame is worn, and a sensor on the frame for detecting light from the emitter. The sensor produces an output signal indicating when the eye is open or closed, and a transmitter on the frame is coupled to the sensor for wireless transmission of the output signal to a remote location. The frame may also include a processor for comparing the output signal to a predetermined threshold to detect drowsiness-induced eyelid movement. Similar to the previous embodiments, the emitter and sensor are preferably a solid state biosensor device for emitting and detecting infrared light, or alternatively an array of emitters and/or sensors in a predetermined configuration on the frame, e.g., in a vertical, horizontal, diagonal, or other linear or other geometric array of more than one emitter and/or sensor oriented towards one or both eyes. In particular, an array of emitters and/or sensors allows measurement of eyelid velocity, acceleration and deceleration, and calculation of “PERCLOS.”
The emitter and/or sensors may be affixed to any number of points on the frame, e.g., around the lens and preferably in the nose bridge, or alternatively anywhere along the frame, including near or on the nasal portion of the frame, the attachment of a temple piece of the frame, and/or surface mounted on the lens of an eyeglass. Alternatively, the emitter and/or sensor may be embedded in the lens of an eyeglass, or otherwise such that they operate through the lens. Thus, the emitter(s) and/or sensor(s) are fixed on an eye-frame such that they move with the wearer's head movements, and continuously focus on the user's eyes, whether the user is in a vehicle, outdoors or in any other environment.
Thus, a system in accordance with the present invention may detect eyelid movement of the user, distinguish normal blinks from other voluntary or involuntary eyelid movement, and produce a stream of data. The stream of data may be converted into an understandable message, such as a binary code, a command for controlling a piece of equipment, or an indicator of the user's physical, mental or emotional state. Thus, the system may provide a convenient and/or effective method for voluntary or involuntary communication based simply upon movement of the user's eye.
In accordance with another aspect of the present invention, a system is provided for monitoring movement of a person's eye. The system includes a device configured to be worn on a person's head and an array of emitters on the device for directing light towards an eye of the person when the device is worn. The array of emitters is configured for projecting a reference frame towards the eye. A camera is oriented towards the eye for monitoring movement of the eye relative to the reference frame. The camera may be provided on the device or may be provided remote from the device, but in relatively close proximity to the user.
Preferably, the array of emitters includes a plurality of emitters disposed in a substantially vertical arrangement on the device, and a plurality of emitters disposed in a substantially horizontal arrangement on the device. Thus, the array of emitters may project a focused set of crossed bands towards the eye for dividing a region including the eye into four quadrants.
In addition, the system preferably includes one or more scanning or nonscanning sensors on the device for detecting light from the array of emitters. The one or more sensors produce an output signal indicating when the eye is open or closed, similar to the embodiments described above. More preferably, the sensors include an array of focused sensors in a predetermined relationship with the array of focused emitters for detecting light from the array of emitters that is reflected off of respective portions of the eye or its eyelid. The emitters, because of their fixed position, produce a fixed reflection off of the surface of the eye and eyelid, appearing as a “glint,” i.e., a spot or band of light. Each sensor produces an output signal indicating when the respective portion of the eye is covered or not covered by the eyelid.
The system may also include a processor for correlating the output signal from the one or more sensors with a video signal from the camera for determining the person's level of alertness. The system may also include a warning indicator on the device, the warning indicator being activated when the processor determines a predetermined level of drowsiness has occurred.
Light from the array of emitters may be emitted towards the eye of a user wearing the device to project a reference frame onto the eye. The camera is capable of imaging light produced by the emitters, e.g., in the infrared light range, thereby detecting the projected light as a spot of light, band of light, or other “glint.” Movement of the eye relative to the reference frame may be monitored with the camera, and a graphical output of that movement may be produced. For example, infrared light from the emitters may be reflected off of the retina as a “red reflex” under white light, as a “white pupil” under infrared light, or as a dark pupil under subtraction, using methods known to those skilled in the art. Using these methods, the processor may measure movement of the eye's pupil relative to the reference frame. This movement may be graphically displayed, showing the movement of the eye's pupil relative to the reference frame.
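For illustration only, once the camera has located the pupil and the projected reference frame (for example, the crossing point of the two emitter bands mentioned earlier), tracking reduces to reporting the pupil's offset from that origin; a minimal sketch with hypothetical pixel coordinates:

```python
# Illustrative sketch only: report the pupil's offset from a reference
# origin defined by the projected light (for example, the crossing point of
# the two emitter bands). Coordinates are camera pixels with y increasing
# upward; all names and values are hypothetical.
def pupil_offset(pupil_xy, reference_origin_xy):
    dx = pupil_xy[0] - reference_origin_xy[0]
    dy = pupil_xy[1] - reference_origin_xy[1]
    quadrant = {
        (True, True): "upper-right",
        (False, True): "upper-left",
        (True, False): "lower-right",
        (False, False): "lower-left",
    }[(dx >= 0, dy >= 0)]
    return dx, dy, quadrant

print(pupil_offset((332, 262), reference_origin_xy=(320, 240)))  # (12, 22, 'upper-right')
```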
In addition, the output signal from the one or more sensors may be correlated with video signals produced by the camera monitoring movement of the eye relative to the reference frame, thereby determining the person's level of drowsiness.
Other objects and features of the present invention will become apparent from consideration of the following description taken in conjunction with the accompanying drawings.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is a perspective view of a patient in a hospital wearing a system for communication using eyelid movement in accordance with the present invention.
FIG. 2 is an enlarged perspective view of a preferred embodiment of the system for communication using eyelid movement, shown in FIG. 1, including a detection device and a processing box.
FIG. 3 is a schematic drawing of a preferred embodiment of circuitry for transmitting an output signal corresponding to a sequence of eyelid movements.
FIG. 4 is a schematic drawing of a preferred embodiment of circuitry for controlling equipment in response to an output signal corresponding to a sequence of eyelid movements.
FIG. 5 is a schematic drawing of a preferred embodiment of circuitry for detecting eyelid movement.
FIGS. 6A-6C are sectional and front views of alternate embodiments of a device for emitting light towards and detecting light reflected from a surface of an open eye.
FIGS. 7A-7C are sectional and front views of the devices of FIGS. 6A-6C, respectively, emitting light towards and detecting light reflected from a closed eyelid.
FIG. 8 is a perspective view and block diagram of another preferred embodiment of a system for communication using eyelid movement.
FIG. 9 is a block diagram of the components of a system for communication in accordance with the present invention.
FIG. 10A is a perspective view of still another preferred embodiment of a system for communication using eyelid movement.
FIG. 10B is a schematic detail of a portion of the system of FIG. 10A.
FIG. 10C is a detail of a preferred embodiment of an array of emitters and sensors that may be provided on a nose bridge of an eye frame, such as that of FIG. 10A.
FIG. 10D is a sectional view of the array of emitters and sensors of FIG. 10C emitting light and detecting light reflected from an eye.
FIG. 11A is a schematic view of a system for selectively controlling a number of devices from a remote location based upon eyelid movement.
FIG. 11B is a schematic view of additional devices that may be controlled by the system of FIG. 11A.
FIG. 12A is a table showing the relationship between the activation of an array of sensors, such as that shown in FIGS. 10A-10D, and an eye being monitored by the array, as the eye progresses between open and closed conditions.
FIG. 12B is a graph showing a stream of data provided by an array of sensors, such as that shown in FIGS. 10A-10D, indicating the percentage of eye coverage as a function of time (“PERCLOS”).
FIG. 12C is a graphical display of a number of physiological parameters, including PERCLOS, of a person being monitored by a system including a device such as that shown in FIGS. 10A-10D.
FIG. 12D is a table showing the relationship between the activation of two-dimensional arrays of sensors and an eye being monitored, as the eye progresses between open and closed conditions.
FIG. 13 is a perspective view of another system for monitoring eye movement, in accordance with the present invention.
FIG. 14 is a detail of a camera on the frame of FIG. 13.
FIGS. 15A-15I are graphical displays of several parameters that may be monitored with the system of FIG. 13.
FIG. 16 is a detail of video output from a camera on the frame of FIG. 13.
FIG. 17 is a schematic showing circuitry for processing signals from a five-element sensor array, in accordance with an exemplary embodiment of the present invention.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
Turning now to the drawings, FIG. 1 shows a patient 10 in a bed 12 wearing a detection device 30 for detecting eyelid movement of the patient 10 to provide voluntary-purposeful and/or involuntary-nonpurposeful communication. The detection device 30 is coupled to a processing box 130 which converts the detected eyelid movement into a stream of data, an understandable message, and/or into information, which may be communicated, for example, using a video display 50, to a medical care provider 40. The detection device 30 and processing box 130 together provide a system for communication 14 in accordance with one aspect of the present invention.
Turning to FIGS. 2, 6A and 7A, a preferred embodiment of a system for communication 14 is shown that includes an aimable and focusable detection device 30 that is attachable to a conventional pair of eyeglasses 20. The eyeglasses 20 include a pair of lenses 21 attached to a frame 22, which includes bridgework 24 extending between the lenses 21, and side members or temple pieces 25 carrying ear pieces 26, all of which are conventional. Alternatively, because the lenses 21 are not necessary to the present invention, the frame 22 may also be provided without the lenses 21.
The detection device 30 includes a clamp 27 for attaching to one of the side members 25 and an adjustable arm 31 onto which is mounted an emitter 32 and a sensor 33. Preferably, the emitter 32 and sensor 33 are mounted in a predetermined relationship such that the emitter 32 may emit a signal towards an eye 300 of a person wearing the eyeglasses 20 and the sensor 33 may detect the signal reflected from the surface of the eye 300 and eyelid 302. As shown in FIGS. 6A and 7A, the emitter 32 and sensor 33 may be mounted adjacent one another.
Alternatively, as shown in FIGS. 6B and 7B, the emitter 32′ and sensor 33′ may be mounted on the frame separately away from one another, preferably such that the emitter 32′ and sensor 33′ are disposed substantially laterally with respect to each other. In a further alternative, shown in FIGS. 6C and 7C, the emitter 32″ and sensor 33″ may be mounted across the eye 300 in axial alignment with one another. As the eyelid 302 closes, it may break the beam 340 being detected by the sensor 33″.
In a preferred form, the emitter 32 and sensor 33 produce and detect continuous or pulsed light, respectively, preferably within the infrared range to minimize distraction or interference with the wearer's normal vision. Preferably, the emitter 32 emits light in pulses at a predetermined frequency and the sensor 33 is configured to detect light pulses at the predetermined frequency. This pulsed operation may reduce energy consumption by the emitter 32 and/or may minimize interference with other light sources. Alternatively, other predetermined frequency ranges of light beyond or within the visible spectrum, such as ultraviolet light, or other forms of energy, such as radio waves, sonic waves and the like, may be used.
The processing box 130 is coupled to the detection device 30 by a cable 34 including one or more wires therein (not shown). As shown in FIG. 9, the processing box 130 preferably includes a central processing unit (CPU) 140 and/or other circuitry, such as the exemplary circuitry shown in FIGS. 3-5, for receiving and/or processing an output signal 142, such as a light intensity signal, from the sensor 33. The processing box 130 may also include control circuitry 141 for controlling the emitter 32 and/or the sensor 33, or the CPU 140 may include internal control circuitry.
For example, in a preferred form, the control circuitry 141 controls the emitter 32 to produce a flickering infrared signal pulsed at a predetermined frequency, as high as thousands of pulses per second to as little as about 4-5 pulses per second, and preferably at least about 5-20 pulses per second, thereby facilitating detection of nonpurposeful or purposeful eye-blinks as short as about 200 milliseconds per blink. The sensor 33 may be controlled to detect light pulses only at the predetermined frequency specific to the flicker frequency of the emitter 32. Thus, by synchronizing the emitter 32 and the sensor 33 to the predetermined frequency, the system 14 may be used under a variety of ambient conditions without the output signal 142 being substantially affected by, for example, bright sunlight, total darkness, ambient infrared light backgrounds, or other emitters operating at different flicker frequencies. The flicker frequency may be adjusted to maximize the efficient measurement of the number of eye blinks per unit time (e.g., about ten to about twenty eye blinks per minute), the duration of each eye blink (e.g., about 200 milliseconds to about 300 milliseconds), and/or PERCLOS (i.e., the percentage of time that the eyelid is completely or partially closed), or to maximize efficiency of the system, while keeping power consumption to a minimum.
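By way of illustration only, the synchronization described above can be pictured as demodulating the sensor signal at the emitter's flicker frequency, so that steady ambient light contributes little to the recovered reflection level. The Python sketch below assumes a sample rate, flicker frequency, and signal model that are not specified in the text; it is not the control circuitry 141 itself:

```python
import math

# Rough sketch of synchronized (lock-in style) detection: estimate the
# amplitude of the sensor signal at the emitter's flicker frequency so that
# steady ambient light (sunlight, darkness, IR backgrounds) contributes
# little. Sample rate, flicker frequency, and signal model are assumptions.
SAMPLE_RATE_HZ = 1000.0
FLICKER_HZ = 20.0  # within the ~5-20 pulses per second range mentioned above

def reflection_level(samples):
    """Phase-insensitive amplitude of the FLICKER_HZ component of `samples`."""
    i = q = 0.0
    for n, s in enumerate(samples):
        angle = 2 * math.pi * FLICKER_HZ * n / SAMPLE_RATE_HZ
        i += s * math.cos(angle)
        q += s * math.sin(angle)
    return 2 * math.hypot(i, q) / len(samples)

# A constant ambient offset averages out over whole cycles, while the pulsed
# reflection off the eyelid (modulated at FLICKER_HZ) survives demodulation.
```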
The control circuitry 141 and/or processing box 130 may include manual controls (not shown) for adjusting the frequency, focus, or intensity of the light emitted by the emitter 32, to turn the emitter 32 off and on, to adjust the threshold sensitivity of the sensor 33, and/or to allow for self-focusing with maximal infrared reflection off of a closed eyelid, as will be appreciated by those skilled in the art.
In addition, the processing box 130 also preferably includes a power source 160 for providing power to the emitter 32, the sensor 33, the CPU 140, and/or other components in the processing box 130. The processing box 130 may be powered by a conventional DC battery, e.g., a nine volt battery or a lithium battery. Alternatively, an adapter (not shown) may be connected to the processing box 130, such as a conventional AC adapter or a twelve volt automobile lighter adapter.
Preferably, the CPU 140 includes timer circuitry 146 for comparing the length of individual elements of the output signal 142 to a predetermined threshold to distinguish between normal blinks and other eyelid movement. The timer circuitry 146 may be separate discrete components or may be provided internally within the CPU 140, as will be appreciated by those skilled in the art. The CPU 140 converts the output signal 142 into a stream of data 144 which may be used to communicate to other persons or equipment. For example, the stream of data 144 produced by the CPU 140 may be a binary signal, such as Morse code or ASCII code. Alternatively, the CPU 140 may be capable of producing a synthesized voice signal, a control signal for a piece of equipment, or even a pictorial representation.
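As a purely illustrative sketch of the data-stream step described above, the timer threshold can be used to filter out reflexive blinks before the remaining deliberate closures are mapped to Morse-style symbols; the thresholds and the tiny Morse table below are assumptions, not the actual circuitry of the CPU 140:

```python
# Hypothetical sketch of the data-stream step described above: the timer
# threshold filters out reflexive blinks, and the remaining deliberate
# closures are mapped to Morse-style symbols. The thresholds and the tiny
# Morse table are illustrative assumptions, not the circuitry of CPU 140.
NORMAL_BLINK_MAX_S = 0.25  # closures at or below this are treated as reflexive blinks
DOT_MAX_S = 0.50           # deliberate closures up to this long count as a "dot"
MORSE_TO_CHAR = {"...": "S", "---": "O", ".-": "A"}

def to_symbols(closure_durations_s):
    deliberate = [d for d in closure_durations_s if d > NORMAL_BLINK_MAX_S]
    return "".join("." if d <= DOT_MAX_S else "-" for d in deliberate)

stream = to_symbols([0.20, 0.40, 0.18, 0.40, 0.45])   # reflexive blinks dropped
print(stream, "->", MORSE_TO_CHAR.get(stream, "?"))   # ... -> S
```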
To facilitate communication, the processing box 130 may include a variety of output devices for using the stream of data 144. For example, an internal speaker 150 may be provided, which may produce an alarm sound or a synthesized voice. An output port 148 may be provided to which a variety of equipment, such as the video display 50 shown in FIG. 1, may be directly coupled by hard-wire connections.
The processing box 130 may also include a transmitter 152 coupled to the CPU 140 for wireless communication of the stream of data 144 to a remote location. For example, as shown in FIG. 9, the system for communication 14 may also include a receiving and processing unit 154, such as a computer or other control or display system. The transmitter 152 is preferably a radio frequency transmitter capable of producing a short range signal, for example, reaching as far as about one hundred feet or more, and preferably about forty-five to fifty feet, even through walls or obstacles, although alternatively an infrared transmitter may also be effective.
The transmitter 152 may also be coupled to an amplifier (not shown) to allow the stream of data to be transmitted thousands of feet or more. For example, the amplifier and transmitter 152 may communicate via telephone communication lines, satellites and the like, to transmit the stream of data to a remote location miles away from the system. The system may include, or may be coupled to, a global positioning system (GPS) for monitoring the location, movement, and state of wakefulness and safety of an individual wearing the detection device 30.
The receiving and processing unit 154 includes a receiver 156, preferably a radio frequency receiver, for receiving a signal 153, including the stream of data, transmitted by the transmitter 152. A processor 158 is coupled to the receiver 156 for translating, storing and/or using the information in the stream of data, the processor 158 being coupled to memory circuitry 160, a communication device 162, and/or a control system 164. For example, the receiving and processing unit 154 may include the memory circuitry 160 therein into which the processor 158 may simply store the stream of data for subsequent retrieval and analysis.
The processor 158 may interpret the stream of data, for example, by converting a binary code in the stream of data into an understandable message, i.e., a series of letters, words and/or commands, and/or may use augmentative communication devices or software (such as KE:NX or Words Plus) to facilitate communication. The resulting message may be displayed on the communication device 162, which may include a video display for displaying text, pictures and/or symbols, a synthesized voice module for providing electronic speech, and the like.
Alternatively, the stream of data may be displayed graphically on a computer or video screen or other electronic display device as a “real time” message signal or numerically (e.g., displaying blink rate, blink duration, PERCLOS, etc.), or displayed graphically similar to an EKG or EEG tracing. In addition, as shown in FIG. 12C, the stream of data may be displayed along with other physiological data (e.g., heart rate, respiratory rate, other sleep polysomnographic (PSG) or electroencephalographic (EEG) variables). Alternatively, the stream of data may be integrated with controllers which monitor automobile or mechanical functions (e.g., vehicle speed, acceleration, braking functions, torque, sway or tilt, engine or motor speed, etc.) to make intelligent decisions regarding slowing down or speeding up the vehicle depending upon road and/or vehicle conditions, as well as the state of consciousness, wakefulness or attentiveness of the driver or machine operator.
In addition, the message may be interpreted by the processor 158 for directing the control system 164 to control one or more pieces of machinery or equipment. For example, the stream of data may include a command to direct the control system 164 to control relay switches or other devices to turn off and on an electrical device, such as an appliance, electrical wheelchair, engine, light, alarm, telephone, television, computer, a tactile vibrating seat, and the like, or to operate an eye-activated computer mouse or other controller.
Alternatively, the processor 158 may use the stream of data to control PC, IBM, Macintosh and other computers and compatible computer software and/or hardware, e.g., to interact with a computer similar to a mouse, a “return” key or a “joystick.” For example, the stream of data may include commands to activate a series of menus from which sub-menus or individual items may be selected, as are used in commercially available special communications software, such as WORDS-PLUS or Ke:NX. The processor 158 may then control, scroll or select items from computer software programs, operate a printer or other peripheral device (e.g., selecting a font, paragraph, tab or other symbol operator, selecting commands, such as “edit,” “find,” “format,” “insert,” “help,” or controlling CD-ROM or disc drive operations, and/or other Windows and non-Windows functions).
Alternatively, the receiver 156 may be coupled directly to a variety of devices (not shown), such as radio or television controls, lamps, fans, heaters, motors, remote control vehicles, vehicle monitoring or controlling devices, computers, printers, telephones, lifeline units, electronic toys, or augmentative communication systems, to provide a direct interface between the user and the devices.
During use, the detection device 30 is placed on a user's head, i.e., by putting the eyeglasses 20 on as shown in FIG. 1. The adjustable arm 31 and/or the clamp 27 may be adjusted to optimally orient the emitter 32 and sensor 33 towards the user's eye 300 (shown in FIGS. 6A-6C and 7A-7C). The emitter 32 is activated and a beam of light 340 is directed from the emitter 32 towards the eye 300. The intensity and/or frequency of the emitter 32 and/or the threshold sensitivity of the sensor 33 or other focus may then be adjusted (e.g., manually or automatically using self-adjusting features).
Because of the difference in the reflective characteristics of the surface of the eye 300 itself and the eyelid 302, the intensity of the light reflected off of the eye 300 depends upon whether the eye 300 is open or closed. For example, FIGS. 6A and 6B illustrate an open eye condition, in which a ray of light 340 produced by the emitter 32 strikes the surface of the eye 300 itself and consequently is scattered, as shown by the rays 350. Thus, the resulting light intensity detected by the sensor 33 is relatively low, i.e., the sensor 33 may not receive any substantial return signal.
In FIGS. 7A and 7B, the eye 300 is shown with the eyelid 302 closed as may occur during normal blinks, moments of drowsiness, intentional blinks, or other eyelid movement. Because the light 340 strikes the eyelid 302, it is substantially reflected back to the sensor 33, as shown by the ray 360, resulting in a relatively high light intensity being detected by the sensor 33. Alternatively, as shown in FIG. 7C, the beam of light 340 may be broken or cut by the eyelid 302 when the eye 300 is closed.
The sensor 33 consequently produces a light intensity signal that indicates when the eye 300 is open or closed, i.e., corresponding to the time during which reflected light is not detected or detected, respectively, by the sensor 33. Generally, the intensity of the infrared light reflected from the surface of the eyelid is not substantially affected by skin pigmentation. If it is desired to adjust the intensity of light reflected from the eyelid, foil, glitter, reflective moisturizer creams and the like may be applied to increase reflectivity, or black eye liner, absorptive or deflective creams and the like may be applied to reduce reflectivity.
Returning to FIG. 9, the light intensity detected by the sensor 33 results in an output signal 142 including a series of time-dependent light intensity signals (as shown, for example, in FIG. 12B). The output signal 142 is received by the CPU 140 coupled to the sensor 33, which compares the length of time of each light intensity signal 142, for example, corresponding to a closed eye condition, with a predetermined threshold. The timer circuitry 146 may provide a threshold time to the CPU 140 for distinguishing normal blinks from intentional and/or other unintentional eyelid movement, which the CPU 140 may then filter out of the output signal 142. The CPU 140 then produces a stream of data 144 which may be used for voluntary and/or involuntary communication.
In one useful application, the detection device 30 may be used to detect impending drowsiness or “micro-sleeps” (i.e., sleep intrusions into wakefulness lasting a few seconds) of a user, with the processing box 130 triggering a warning to alert the user, others in his or her presence, or monitoring equipment of the onset of drowsiness. The threshold of the timer circuitry 146 may be adjusted such that the CPU 140 detects relatively long periods of eye closure, as may occur when a person is falling asleep.
For example, because normal blinks are relatively short, the threshold may be set at a time ranging from close to zero seconds up to several seconds, preferably from about 200 milliseconds to about 300 milliseconds, and most preferably about 250 milliseconds, to distinguish normal blinks from drowsiness-induced eyelid movement. When the CPU 140 detects a drowsiness condition, i.e., detects a high light intensity signal exceeding the predetermined threshold time, it may activate a warning device. The warning device may be included within the processing box 130, such as the speaker 150, or alternatively on the frame, for example, by mounting a warning light (not shown) or an alarm speaker (not shown in FIG. 9, see FIG. 10C) on the frame.
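A minimal sketch of this threshold check, using the roughly 250 millisecond value quoted above; the polling structure and the warning callback are assumptions rather than part of the described circuitry:

```python
# Minimal sketch of the drowsiness check described above: flag any single
# eye closure that lasts longer than the threshold. The ~250 ms value comes
# from the text; the callback structure is an assumption.
DROWSY_THRESHOLD_S = 0.25

def check_closures(closure_durations_s, warn):
    for d in closure_durations_s:
        if d > DROWSY_THRESHOLD_S:
            warn(f"eye closed for {d * 1000:.0f} ms - possible drowsiness")

check_closures([0.18, 0.22, 1.40], warn=print)  # warns only on the 1400 ms closure
```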
Alternatively, the detection device 30 may be used to unobtrusively record or monitor drowsiness-induced eyelid movement, with the CPU 140 producing a stream of data 144 which the transmitter 152 may transmit to the receiving and processing unit 154 (FIG. 9). For example, the device 30 may be used in conjunction with a vehicle safety system to monitor a driver's level of awareness or attentiveness. The stream of data 144 may be transmitted to a receiving and processing unit 154 mounted in a vehicle, which may store data on the driver's drowsiness and/or may use the data to make decisions and control the vehicle, e.g., adjust the vehicle's speed or even turn the vehicle's engine off. Thus, the detection device 30 may be used to monitor truck drivers, taxi drivers, ship or airline pilots, train conductors or engineers, radar or airport control tower operators, operators of heavy equipment or factory machinery, scuba divers, students, astronauts, entertainment participants or observers, and the like. The signals may be stored and analyzed in real time for trend changes measured over time to predict drowsiness effects of individuals using the device.
The detection device 30 and system 14 may also be used in a medical diagnostic, therapeutic, research or professional setting to monitor the wakefulness, sleep patterns and/or the effects of drugs, which may affect blink rate, blink velocity, blink duration, or PERCLOS of a patient or vehicle operator. Similar to the method just described, the CPU 140 produces a stream of data 144, which the transmitter may send to a remote receiving and processing unit 154, which may store the stream of data 144 in the memory circuitry 160 for later retrieval and analysis by researchers, medical professionals, or safety personnel (e.g., similar to the way in which flight recorder data may be stored in an aircraft's “black box” recorder). The receiving and processing unit 154 may also display the stream of data 144, for example at a nurse's station, as an additional parameter to continually monitor a patient's physical, mental, or emotional condition. The unit 154 may store and/or produce a signal, e.g., by a series of algorithms, that must be responded to within a predetermined time (e.g., performance vigilance monitoring) to prevent false positives and negatives.
A number of medical conditions may be monitored by the detection device 30 and system 14, such as petit mal epilepsy, in which the eyes flutter at a rate of about three cycles per second, grand mal or psychomotor seizures, where the eyes may stare or close repetitively in a jerky manner, myoclonic seizures, in which the lids may open and close in a jerky manner, or tics, or other eye movements, such as encountered by people with Tourette's syndrome. The system may be used to monitor g-lock of pilots caused by g-force effects, hypoxemia of passengers or crew in aircraft due to losses in cabin pressure, nitrogen narcosis or “the bends” in divers, or the effects of gases, chemicals, or biological agents on military personnel or other individuals.
The system may also be used to monitor psychological situations, for example, to detect when a person lies (e.g., by closing their eyes when lying), during hypnosis, to monitor attentiveness, the effects of medications, e.g., L-dopa and other anti-Parkinsonian medications or anti-convulsants, drugs, alcohol, toxins, or the effects of hypoxia or ventilation, and the like. Neurological conditions may also be monitored where the innervation or mechanical function of the eyelid may be affected, such as in Parkinson's disease, muscle diseases, e.g., myotonia, myotonic muscular dystrophy, blepharospasm, photophobia or light sensitivity, encephalopathy, seizures, Bell's palsy, or where the condition may produce eyelid drooping or ptosis, such as third cranial nerve palsy or paresis, brainstem lesions or stroke, tumors, infection, metabolic diseases, trauma, degenerative conditions, e.g., multiple sclerosis, amyotrophic lateral sclerosis, polyneuropathy, myasthenia gravis, botulism, tetanus, tetany, tardive dyskinesia, brainstem encephalitis, and other primary eyelid conditions, such as exophthalmos, thyrotoxicosis or other thyroid conditions.
Similarly, the detector device 30 may be used in biofeedback applications, for example, in biofeedback, hypnosis or psychological therapies of certain conditions (e.g., tic disorders). The detector device 30 may produce a stimulus, e.g., activating a light or speaker, and monitor the user's eyelid movement in anticipation of receiving a response, e.g., a specific sequence of blinks, acknowledging the stimulus within a predetermined time. If the user fails to respond, the processor may store the response, e.g., including response time, and/or may automatically transmit a signal, such as an alarm signal.
In addition, the detection device 30 may be used to monitor individuals in non-medical settings, such as during normal activity in a user's home or elsewhere. For example, individuals with involuntary medical conditions, such as epilepsy or narcolepsy, may be monitored, or other individuals, such as infants and children, prison inmates, demented patients (e.g., with Alzheimer's disease), law enforcement personnel, military personnel, bank tellers, cashiers, casino workers, students, swing or graveyard shift workers, and the like, may be monitored. Similar application may be applied in a sleep laboratory for monitoring sleep patients to measure parameters, such as onset of sleep, sleep latency, time of eyelid closing or opening, time of awakening during the night, etc., or to animal research where eye blinking may be a factor to be studied. Similarly, the performance and vigilance abilities of the user may be tested and analyzed as a direct function of, or in relationship to, PERCLOS.
When the CPU 140 detects the presence of particular eyelid movement, such as an extensive period of eye closure which may occur, for example, during an epileptic seizure, a syncopal episode, a narcoleptic episode, or when dozing off while driving or working, the CPU 140 may produce an output signal which activates an alarm. Alternatively, the transmitter 152 may send an output signal to shut off equipment being used, to notify medical personnel, such as by automatically activating a telephone to dial emergency services, or to signal remote sites, such as police stations, ambulances, vehicle control centers, guardians, and the like.
The system for communication 14 may also find useful application for voluntary communication. A user wearing the detection device 30 may intentionally blink in a predetermined pattern, for example, in Morse code or other blinked code, to communicate an understandable message to people or equipment (e.g., to announce an emergency). The CPU 140 may convert a light intensity signal 142 received from the sensor 33 and corresponding to the blinked code into a stream of data 144, or possibly directly into an understandable message including letters, words and/or commands.
The stream of data 144 may then be displayed on a video display 50 (see FIG. 1) coupled to the output port 148, or emitted as synthesized speech on the internal speaker 150. The stream of data 144 may be transmitted by the transmitter 152 via the signal 153 to the receiving and processing unit 154 for displaying messages, or for controlling equipment, such as household devices, connected to the control system 164. In addition to residential settings, the system 14 may be used by individuals in hospitalized or nursing care, for example by intubated, ventilated, restrained, paralyzed or weakened patients, to communicate to attending medical staff and/or to consciously signal a nurse's station. These include all patients who have no physical ability to communicate verbally, but who retain the ability to communicate using eye blinking of one or both eyes (e.g., patients with amyotrophic lateral sclerosis, transverse myelitis, locked-in syndrome, cerebrovascular strokes, terminal muscular dystrophy and those intubated on ventilation).
The device may be used in any environment or domain, e.g., through water or other substantially transparent fluids. Further, the device 30 may also be used as an emergency notification and/or discreet security tool. A person who may be capable of normal speech may wear the device 30 in the event of circumstances under which normal communication, i.e., speech, is not a viable option. For example, a bank or retail employee who is being robbed or is otherwise present during the commission of a crime may be able to discreetly blink out a preprogrammed warning to notify security or to call law enforcement. Alternatively, a person with certain medical conditions may wear the device in the event that they are physically incapacitated, i.e., are unable to move to call for emergency medical care, but are still able to voluntarily move their eyes. In such cases, a pre-recorded message or identifying data (e.g., name of the user, their location, the nature of the emergency, etc.) may be transmitted to a remote location by a specific set of eyeblink codes or preprogrammed message. In this manner, the detection device 30 may be used to monitor patients in an ICU setting, patients on ventilators, prisoners, elderly or disabled persons, heavy equipment operators, truck drivers, motorists, ship and aircraft pilots, train engineers, radar or airport control tower operators, or as a nonverbal or subliminal tool for communication by military guards, police, bank tellers, cashiers, taxi drivers, and the like. The detection device 30 may also be used as a recreational device, for example, as a children's toy similar to a walkie-talkie or to operate a remote control toy vehicle.
In addition, it may be desirable to have the CPU 140 perform an additional threshold comparison to ensure continued use of the detection device 30. For example, additional timer circuitry may be coupled to the CPU 140 such that the CPU 140 may compare the light intensity signals received from the sensor 33 to a second predetermined threshold provided by the timer circuitry. Preferably, the second predetermined threshold corresponds to a time period during which a person would normally blink. If the CPU 140 fails to detect a normal blink within this time period, or if the user fails to respond to a predetermined stimulus (e.g., a blinking light or sound), the CPU 140 may produce a signal, activating the speaker 150 or transmitting a warning using the transmitter 152.
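The second-threshold check described above can be sketched as a simple watchdog over blink timestamps; the 10 second window and the event format below are assumed values chosen for illustration, not taken from the text:

```python
# Illustrative watchdog for the second threshold described above: if no
# normal blink is detected within an expected interval, raise an alert
# (device removed, non-compliance, or a medical event). The 10 second
# window is an assumed value, not one taken from the text.
NO_BLINK_ALERT_S = 10.0

def check_blink_gaps(blink_timestamps_s, alert):
    previous = None
    for t in blink_timestamps_s:
        if previous is not None and t - previous > NO_BLINK_ALERT_S:
            alert(f"no blink detected for {t - previous:.1f} s")
        previous = t

check_blink_gaps([0.0, 3.1, 6.4, 21.9], alert=print)  # flags the 15.5 s gap
```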
This may be useful if, for example, the detection device 30 is removed by a perpetrator during commission of a crime or falls off because of the onset of a medical episode, as well as to prevent “false alarms,” or to measure the “state of attentiveness” of the user. Alternatively, performance vigilance tasks may be required of the user to determine whether the signal transmitted is a purposeful or “false alarm” signal, and also for measuring attention or drowsiness levels for purposes of biofeedback, and also to measure compliance of the user wearing the device.
Alternatively, the polarity of the output signal 142 may be reversed such that a stream of data is produced only when the eye is opened, for example, when monitoring patients in a sleep lab to measure onset of sleep, sleep latency, time of eyelid closure, etc., or to monitor sleeping prison inmates. For such uses, the CPU 140 may activate an alarm only when an open eye condition is detected, as will be appreciated by those skilled in the art.
Turning to FIG. 8, another preferred embodiment of the detection device 30 in accordance with the present invention is shown. In this embodiment, the emitter and sensor are a single solid state light emission and detecting biosensor device 132, which is mounted directly onto the eyeglasses 20. The biosensor device 132, which preferably produces and detects infrared light, may be as small as 2 mm × 4 mm and weigh only a few grams, thereby enhancing the convenience, comfort and/or discretion of the detection device 30. Because of the small size, the biosensor device 132 may be mounted directly in the lens 21, as shown in FIG. 8, on an outside or inside surface of the lens 21, in the bridgework 24, or at another location on the frame 22 that may facilitate detection of eye movement. The biosensor device 132 may measure less than about five millimeters by five millimeters in surface area, and may weigh as little as about one ounce, thereby providing an emitter/sensor combination that may be unobtrusive to vision, portable, and conveniently incorporated into a lightweight eye frame. Because the entire system may be self-contained on the frame, it moves with the user no matter which direction he or she looks and may operate in a variety of environments or domains, day or night, underwater, etc.
Hamamatsu manufactures a variety of infrared emitter and detector devices which may be used for the biosensor device 132, such as Model Nos. L1909, L1915-01, L2791-02, L2792-02, L2959, and 5482-11, or alternatively, a Radio Shack infrared emitter, Model No. 274-142, may be used. Multiple element arrays, e.g., linear optical scanning sensor arrays, appropriate for use with the present invention may be available from Texas Advanced Optoelectronic Solutions, Inc. (TAOS) of Plano, Tex., such as Model Nos. TSL 201 (64 pixels × 1 pixel), TSL 202 (128 × 1), TSL 208 (512 × 1), and TSL 2301 (102 × 1). These sensors may be used in combination with lens arrays to facilitate focusing of the detected light, such as the Selfoc lens array for line scanning applications made by NSG America, Inc. of Irvine, Calif.
In addition, multiple biosensor devices 132 may be provided on the eyeglasses 20, for example, a pair of biosensor devices 132 may be provided, as shown in FIG. 8, for detecting eyelid movement of each eye of the user (not shown). A cable 134 extends from each biosensor device 132 to a processing box 130, similar to the processing box 130 described above. The CPU 140 of the processing box 130 (not shown in FIG. 8) may receive and compare the output signal from each biosensor device 132 to further augment distinguishing normal blinks from other eyelid movement.
The pair of biosensor devices 132 may allow use of more sophisticated codes by the user, e.g., blinking each eye individually or together, for communicating more effectively or conveniently, as will be appreciated by those skilled in the art. In one form, a blink of one eye could correspond to a “dot,” and the other eye to a “dash,” to facilitate use of Morse code. The output signals from each eye could then be interpreted by the CPU 140 and converted into an understandable message.
In another form, a right eye blink (or series of blinks) may cause an electric wheelchair to move to the right, a left eye blink (or series of blinks) may cause it to move to the left, two simultaneous right and left eye blinks may cause the wheelchair to move forward, and/or four simultaneous right and left eye blinks may cause the wheelchair to move backward. Similar combinations or sequences of eye blinks may be used to control the on/off function, or volume or channel control, of a television, AM/FM radio, VCR, tape recorder or other electronic or electromechanical device, any augmentative communications or controlling device, or any device operable by simple “on/off” switches (e.g., wireless television remote controls, single switch television control units, universal remote controllers, single switch multi-appliance units with AC plug/wall outlet or wall switch modules, computer input adapters, lighted signaling buzzer or vibrating signal boxes, switch modules of all types, video game entertainment controller switch modules, and switch-controlled electronic toys).
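For illustration, the blink-combination examples above can be pictured as a lookup table from detected blink events to commands; the event encoding below is an assumption, not part of the described system:

```python
# Hypothetical lookup table from blink patterns to wheelchair commands,
# following the example combinations above; the event encoding ("R" = right
# eye blink, "L" = left, "RL" = both eyes simultaneously) is an assumption.
COMMANDS = {
    ("R",): "turn right",
    ("L",): "turn left",
    ("RL", "RL"): "move forward",
    ("RL", "RL", "RL", "RL"): "move backward",
}

def interpret(blink_events):
    return COMMANDS.get(tuple(blink_events), "no action")

print(interpret(["RL", "RL"]))  # -> move forward
```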
In additional alternatives, one or more lenses or filters may be provided for controlling the light emitted and/or detected by the biosensor device, an individual emitter, and/or a detector. For example, the angle of the light emitted may be changed with a prism or other lens, or the light may be collimated or focused through a slit to create a predetermined shaped beam of light directed at the eye or to receive the reflected light by the sensor. An array of lenses may be provided that are adjustable to control the shape, e.g., the width, etc., of the beam of light emitted or to adjust the sensitivity of the sensor. The lenses may be encased along with the emitter in plastic and the like, or provided as a separate attachment, as will be appreciated by those skilled in the art.
Turning now to FIG. 10A, another preferred embodiment of a system for communication 414 is shown that includes a frame 422 including a biosensor device 432 with associated processor and transmitter circuitry 430 provided directly on the frame 422, for example, to enhance the convenience and discretion of the system for communication 414. The frame 422 may include a bridge piece 424 onto which the biosensor device 432 may be slidably and/or adjustably mounted, and a pair of ear supports 423, 425.
One of the supports 423 may have a larger size compared to the other support 425, for example, to receive the processor and transmitter circuitry 430 embedded or otherwise mounted thereon. A processor 440, similar to the CPU 140 in the processing box 130 previously described, may be provided on the frame 422, and a power source, such as a lithium battery 460, may be inserted or affixed to the support 423. A radio frequency or other transmitter 452 is provided on the support 423, including an antenna 453, which may be embedded or otherwise fastened along the ear support 423, in the temple piece, or elsewhere in the frame 422.
The system 414 may also include manual controls (not shown) on the ear support 423 or elsewhere on the frame 422, for example, to turn the power off and on, or to adjust the intensity and/or threshold of the biosensor device 432. Thus, a system for communication 414 may be provided that is substantially self-contained on the frame 422, which may or may not include lenses (not shown) similar to eyeglasses. External cables or wires may be eliminated, thereby providing a more convenient and comfortable system for communication.
In another alternative, shown in FIGS. 10B, 10C, and 10D, a linear array 530 of emitters 532 and sensors 533 may be provided, preferably in a vertical arrangement mounted on a nose bridge 524 of an eye frame 522. A CPU 540, battery 460, transmitter antenna 543, and warning indicator 550 may also be provided on the frame 522, preferably in the temple piece 525, similar to the previously described embodiment. An LED 542 or similar stimulus device may also be provided at a predetermined location on the eye frame 522 to allow routine biofeedback responses from the user. In addition, a receiver 544 may be provided for receiving the stream of data created by the CPU 540 and transmitted by the transmitter 543.
As shown particularly in FIG. 10C, each of the sensors 533 and the emitter 532 is coupled to the CPU 540 or other control circuitry for controlling the emitter 532 and for processing the light intensity signals produced by the sensors 533. Thus, the CPU 540 may cycle through the sensors 533 in the array 530 and sequentially process the signal from each of the sensors 533, similar to the processors previously described. More preferably, as shown in FIG. 10D, the emitter 532 includes a lens 534 to focus a beam of light (indicated by individual rays 360a, 360b) onto the eye 300, preferably towards the pupil 301. The sensors 533 are embedded within the nose bridge 524, and a slit 535 is provided for each, the slits 535 having a predetermined size to control the reflected light detected by each sensor 533. Thus, each sensor 533 may detect movement of the eyelid 302 past a particular portion of the eye 300, e.g., to measure PERCLOS, as shown in FIG. 12A. The sensors or emitters may have lenses or collimating devices to focus emitted or reflected light.
The linear array 530 may facilitate measurement of additional parameters related to eyelid movement, in addition to mere eye closure. For example, to measure the velocity of the eyelid opening or closing, i.e., the rate of eye closure, the CPU 540 may compare the time delay between the activation of successive sensors 533. In addition, the output signals from the sensors 533 may be processed to measure the percentage of pupil coverage by the eyelid 302, for example, due to partial eye closure, as a function of time, e.g., to monitor when the eye is partially, but not completely, closed, and/or to monitor the percentage of time that the eye is closed (PERCLOS), as shown in FIGS. 12A-12C, e.g., compared to the user's baseline of maximal eye opening.
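A minimal sketch of these two computations follows, assuming each sensor reports when the eyelid first covers it and assuming hypothetical values for sensor spacing, sample rate, and the PERCLOS closure threshold (none of which are specified here):

```python
# Illustrative sketch only: estimating eyelid closure velocity and PERCLOS
# from a vertical linear array of sensors. Sensor spacing, sample encoding,
# and thresholds are assumptions, not taken from the patent.

def closure_velocity(activation_times, sensor_spacing_mm=2.0):
    """activation_times: times (s) at which successive sensors, from top to
    bottom of the array, first detect eyelid coverage during one closure."""
    if len(activation_times) < 2:
        return 0.0
    elapsed = activation_times[-1] - activation_times[0]
    distance = sensor_spacing_mm * (len(activation_times) - 1)
    return distance / elapsed if elapsed > 0 else float("inf")  # mm per second

def perclos(covered_fraction_per_sample, threshold=0.75):
    """Fraction of samples in which the eyelid covers at least `threshold`
    of the monitored portion of the eye (a common PERCLOS-style definition)."""
    closed = sum(1 for f in covered_fraction_per_sample if f >= threshold)
    return closed / len(covered_fraction_per_sample)

print(closure_velocity([0.00, 0.02, 0.04, 0.06, 0.08]))   # ~100 mm/s
print(perclos([0.0, 0.2, 0.8, 1.0, 1.0, 0.3, 0.9, 1.0]))  # 0.625
```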
Turning to FIG. 12D, in a further alternative, a two-dimensional array of sensors, such as a 5×5 array 633 or a 9×11 array 733, may be provided. Other arrays including any number of elements in the array may be provided, and the invention should not be limited to the exemplary embodiments described herein. The sensors 633, 733 may then be used to measure surface area reflectivity of light from the emitter 632, i.e., the processor (not shown) may process the signals from each sensor in the array 633, 733 to create a stream of data indicating the percentage of surface area of the eye 300 covered by the eyelid 302.
The sensors in the array 633, 733 may be sufficiently sensitive or have sufficient resolution such that they may detect “red reflex” or the equivalent infrared “bright pupil” reflection due to the reflection of light off of the retina through the pupil 301. Thus, the sensors may produce a light intensity signal that includes a substantially zero value, indicating no red reflex or bright pupil; a low output, indicating red reflex or white pupil reflex; and a high output, indicating reflection off of a closed eyelid 302. The red reflex may appear as a bright white light pupil (resulting from infrared light from the emitter(s) reflecting off of the retina when the eyelid is open), or as a dark or “black pupil” if the processor uses subtraction algorithms, as is known in the art. The processor may thereby process the light intensity signals to detect when the pupil 301 is covered by the eyelid 302, i.e., at which point the user cannot see, even though the eye 300 may not be entirely covered by the eyelid 302, generally at a PERCLOS value of about 50-75 percent in primary gaze. Alternatively, as the eyelid, eye, and pupil descend, the sensor(s) may detect a red reflex or bright pupil even though the PERCLOS measurement may be as great as 75-80 percent or more, e.g., where the eye may still see through a narrow slit-like palpebral fissure opening in downward gaze.
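A minimal sketch of this three-level classification and of the surface-coverage computation from a small two-dimensional array is shown below; the intensity thresholds and array size are hypothetical and are not taken from the patent.

```python
# Illustrative sketch only: classifying each sensor in a small 2-D array and
# estimating the fraction of the eye surface covered by the eyelid.
# The intensity thresholds and array size are hypothetical.

ZERO_MAX = 0.1   # at or below: no red reflex / bright pupil detected
LOW_MAX = 0.5    # between ZERO_MAX and LOW_MAX: red reflex / bright pupil
                 # above LOW_MAX: reflection off of a closed eyelid

def classify(intensity):
    if intensity <= ZERO_MAX:
        return "no_reflex"
    if intensity <= LOW_MAX:
        return "bright_pupil"
    return "eyelid"

def eyelid_coverage(frame):
    """frame: 2-D list of sensor intensities (e.g., 5x5). Returns the fraction
    of sensor sites whose reflection indicates a covering (closed) eyelid."""
    cells = [classify(v) for row in frame for v in row]
    return cells.count("eyelid") / len(cells)

sample = [[0.9, 0.9, 0.9, 0.9, 0.9],
          [0.9, 0.9, 0.9, 0.9, 0.9],
          [0.1, 0.3, 0.3, 0.1, 0.1],
          [0.0, 0.0, 0.0, 0.0, 0.0],
          [0.0, 0.0, 0.0, 0.0, 0.0]]
print(eyelid_coverage(sample))  # 0.4 -> upper 40% of the sensed area is covered
```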
In another alternative, the processor and/or transmitter circuitry (such as the CPU 140 in the processing box 130 of FIG. 2, or the CPUs 440, 540 of FIGS. 10A and 10B) may include identification circuitry (not shown), either as a discrete memory chip or other circuit element, or within the CPU itself. The identification circuitry may be preprogrammed with a fixed identification code, or may be programmable, for example, to include selected identification information, such as the identity of the user, the user's location, an identification code for the individual detection device, and the like.
The CPU may selectively add the identification information to the transmitted stream of data 553, or the identification information may be automatically or periodically included in the stream of data 553, thereby allowing the stream of data 553 to be associated with a particular detection device, individual user, and/or a specific location. The identification information may be used by the processor, for example, at a remote location, to distinguish between streams of data received from a number of detection devices, which may then be stored, displayed, etc. as previously described. Thus, the detection device may not require users to consciously communicate certain identification or other standard information when the system is used.
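Purely as an illustrative sketch (not the patent's packet format), the following shows identification information being prepended to each transmitted record; the field names, identifier values, and JSON encoding are assumptions.

```python
# Illustrative sketch only: prepending identification information to each
# transmitted data record. The field layout and ID values are hypothetical.
import json
import time

DEVICE_ID = "EYE-DEV-0042"   # hypothetical fixed identification code
USER_ID = "driver-17"        # hypothetical programmable user identity

def build_packet(perclos_value, blink_rate):
    """Wrap sensor-derived values with identification and a timestamp so a
    remote receiver can attribute the stream to a device, user, and time."""
    return json.dumps({
        "device_id": DEVICE_ID,
        "user_id": USER_ID,
        "timestamp": time.time(),
        "perclos": perclos_value,
        "blink_rate_per_min": blink_rate,
    })

print(build_packet(0.31, 14))
```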
As shown in FIG. 11A, the receiver 544 may allow the user to control one or more devices coupled to the receiver 544 through a single switch multi-appliance control unit 550. The control unit 550 includes its own transmitter adapted to transmit on/off or other control signals that may be received by individual control modules 552a-552e. The user 10 may blink to create a transmitted stream of data 553 that includes commands to turn off and on, or otherwise control, selected appliances using the control unit 550 and control modules 552a-552e, such as a radio 554, a television 556, a light 558a, a light 562 controlled by a wall switch 560, a fan 566 plugged into a wall socket 564, and the like.
Alternatively, as shown in FIG. 11B, the receiver 544 may be coupled to other systems, such as a computer 570 and printer 572, a vehicle integration system 574, a lifeline unit 576, a GPS or other satellite transmitter 578, and the like. The transmitted stream of data 553 may be processed alone or along with additional data, such as other vehicle sensor information 573, to further enhance monitoring a user, such as a long-distance truck driver.
Turning to FIG. 13, yet another embodiment of a system 810 for monitoring eye movement is shown. Generally, the system 810 includes a frame 812 that may include a bridge piece 814 and a pair of ear supports 816. The frame 812 may include a pair of lenses (not shown), such as prescription, shaded, or protective lenses, although they are not necessary for operation of the invention. Alternatively, the system may be provided on other devices that may be worn on a user's head, such as a pilot's oxygen mask, protective eye gear, a patient's ventilator, a scuba or swimming mask, a helmet, a hat, a head band, a head visor, and the like (not shown). The components of the system may be provided at a variety of locations on the device that generally minimize interference with the user's vision and/or normal use of the device.
An array of emitters 820 is provided on the frame 812, preferably in a vertical array 820a and a horizontal array 820b. In a preferred embodiment, the emitters 820 are infrared emitters configured to emit pulses at a predetermined frequency, similar to the embodiments described above. The emitters 820 are arranged on the frame such that they project a reference frame 850 onto the region of the user's eye 300. In a preferred embodiment, the reference frame includes a pair of crossed bands 850a, 850b dividing the region into four quadrants. The intersection of the crossed bands is preferably disposed at a location corresponding substantially to the eye's pupil during primary gaze, i.e., when the user is looking generally straight forward along axis 310 extending directly ahead of the user's eye 300. Alternatively, other reference frames may be provided, generally including a vertical component and a horizontal component.
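As a minimal sketch of how a processor might report which quadrant of such a crossed-band reference frame contains the pupil (the coordinate convention, with the band intersection as origin and y increasing upward, is an assumption for illustration):

```python
# Illustrative sketch only: deciding which quadrant of the projected
# crossed-band reference frame contains the pupil. Coordinates are in
# arbitrary image units with the band intersection as the origin and
# y increasing upward; these conventions are assumptions.

def pupil_quadrant(pupil_x, pupil_y, center_x=0.0, center_y=0.0):
    """Returns 'upper-right', 'upper-left', 'lower-left', or 'lower-right'."""
    dx = pupil_x - center_x
    dy = pupil_y - center_y
    if dx >= 0 and dy >= 0:
        return "upper-right"
    if dx < 0 and dy >= 0:
        return "upper-left"
    if dx < 0 and dy < 0:
        return "lower-left"
    return "lower-right"

print(pupil_quadrant(1.5, -0.4))  # lower-right: gaze below and right of primary gaze
```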
An array of sensors 822 is also provided on the frame 812 for detecting light from the emitters 820 that is reflected off of the user's eyelid. The sensors 822 preferably generate output signals having an intensity identifying whether the eyelid is closed or open, similar to the embodiments described above. Preferably, the sensors 822 are disposed adjacent to respective emitters 820 for detecting light reflected off of respective portions of the eyelid. Alternatively, sensors 822 may only be provided in a vertical array, e.g., along the bridge piece 814, for monitoring the amount of eyelid closure, similar to the embodiments described above. In a further alternative, the emitters 820 and sensors 822 may be solid state biosensors (not shown) that provide both the emitting and sensing functions in a single device.
Circuitry may be provided for measuring PERCLOS or other parameters using the signals generated by the array of sensors. For example, FIG. 17 shows an exemplary schematic that may be used for processing signals from a five element array, e.g., to obtain PERCLOS measurements or other alertness parameters.
Returning to FIG. 13, the system 810 also includes a camera 830 provided on the frame 812. Preferably, the camera 830 is mounted on or adjacent the bridge piece 814, offset from the axis 310, such that the camera 830 is oriented towards the region surrounding one of the user's eyes 300 while minimizing interference with the user's vision. The camera 830 preferably includes a bundle of fiberoptic cables 832 that terminate in a lens 834, as shown in FIG. 14, on a first end mounted adjacent the bridge piece 814, and a second end 837 that is connected to a detector 838, e.g., a CCD or CMOS sensor, such as those used in endoscopes, that may convert an image into a digital video signal. The camera 830 is configured to detect the frequency of light emitted by the emitters 820, e.g., infrared light. The camera 830 may rely on the light projected by the emitters 820, or the fiberoptic cables 832 may include emitters 836 for projecting light, e.g., infrared light, onto the user's eyes and/or face. In addition, the system 810 may include a second camera 840 oriented away from the user's head, e.g., to monitor the user's surroundings.
One of the ear supports 816 may include a panel 818 for mounting a controller or other processor 842, a transmitter 844, an antenna 845, and a battery 846. Preferably, the processor 842 is coupled to the emitters 820, the sensors 822, and/or the camera 830 for controlling their operation. The transmitter 844 may be coupled to the processor 842 for receiving the output signals from the sensors 822 and/or the video signals from the camera 830, e.g., to transmit the signals to a remote location, as described below. Alternatively, the transmitter 844 may be coupled directly to output leads from the sensors 822 and the camera 830. The frame 812 may also include manual controls (not shown), e.g., on the ear support 816, for example, to turn the power off and on, or to adjust the intensity and/or threshold of the emitters 820, the sensors 822, and/or the camera 830.
If desired, the system 810 may also include one or more additional sensors on the frame 812. The sensors may be coupled to the processor 842 and/or to the transmitter 844 so that the signals from the sensors may be monitored, recorded, and/or transmitted to a remote location. For example, one or more position sensors 852a, 852b may be provided, e.g., for determining the spatial orientation of the frame 812, and consequently the user's head. For example, actigraphic sensors may be provided to measure tilt or movement of the head, e.g., to monitor whether the user's head is drooping forward or tilting to the side. Acoustic sensors, e.g., a microphone 854, may be provided for detecting environmental noise or sounds produced by the user.
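A minimal sketch of how a forward head droop might be flagged from such a tilt signal follows; the pitch convention, threshold, sample rate, and duration are assumptions, not values from the patent.

```python
# Illustrative sketch only: flagging a head droop ("nod") from a stream of
# head-pitch readings. The pitch convention (degrees forward of upright),
# threshold, sample rate, and duration are assumptions.

def detect_head_droop(pitch_samples, sample_rate_hz=10, threshold_deg=30, min_seconds=2.0):
    """Returns True if the head pitches forward past `threshold_deg` for at
    least `min_seconds` of consecutive samples."""
    needed = int(min_seconds * sample_rate_hz)
    run = 0
    for pitch in pitch_samples:
        run = run + 1 if pitch > threshold_deg else 0
        if run >= needed:
            return True
    return False

# 1 s upright, then 2.5 s drooped forward at 40 degrees (10 Hz samples):
print(detect_head_droop([5] * 10 + [40] * 25))  # True
```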
In addition or alternatively, the frame 812 may include one or more sensors for measuring one or more physical characteristics of the user. For example, EEG electrodes 856 may be provided on the ear support 816, above or below the nasion, and/or at other regions that may contact the patient's skin to measure brain activity, e.g., waking, drowsy, or other sleep-related brain activity. An EKG electrode (not shown) may be provided that is capable of measuring cardiac activity through a skin contact site. A pulse sensor (not shown) may be used to measure cardiovascular pulsations, or an oximetry sensor 858 may be used to measure oxygen saturation levels. A thermistor or other sensor may measure respiratory air flow, e.g., through the user's nose. A thermistor, thermocouple, or other temperature sensor (not shown) may be provided for measuring the user's skin temperature. A sweat detector (not shown) may be provided for measuring moisture on the user's skin.
In addition, the system 810 may include one or more feedback devices on the frame 812. These devices may provide feedback to the user, e.g., to alert and/or wake the user, when a predetermined condition is detected, e.g., a state of drowsiness or lack of consciousness. The feedback devices may be coupled to the processor 842, which may control their activation. For example, a mechanical vibrator device 860 may be provided at a location that may contact the user, e.g., on the ear support 816, that may provide tactile vibrating stimuli through skin contact. An electrode (not shown) may be provided that may produce relatively low power electrical stimuli. A light emitter, such as one or more LEDs, may be provided at desired locations, e.g., above the bridge piece 814. Alternatively, audio devices 862, such as a buzzer or other alarm, may be provided, similar to the previous embodiments. In a further alternative, aroma-emitters may be provided on the frame 812, e.g., on or adjacent to the bridge piece 814.
Alternatively, the feedback devices may be provided separate from the frame, but located in a manner capable of providing a feedback response to the user. For example, audio, visual, tactile (e.g., vibrating seat), or olfactory emitters may be provided in the proximity of the user, such as any of the devices described above. In a further alternative, heat or cold generating devices may be provided that are capable of producing thermal stimuli to the user, e.g., a remotely controlled fan or air conditioning unit.
The system 810 may also include components that are remote from the frame 812, similar to the embodiments described above. For example, the system 810 may include a receiver, a processor, and/or a display (not shown) at a remote location from the frame 812, e.g., in the same room, at a nearby monitoring station, or at a more distant location. The receiver may receive signals transmitted by the transmitter 844, including output signals from the sensors 822 or any of the other sensors provided on the frame 812, and/or the video signals from the camera 830.
A processor may be coupled to the receiver for analyzing signals from the components on the frame 812, e.g., to prepare the signals for graphical display. For example, the processor may prepare the video signals from the camera 830 for display on a monitor, thereby allowing personal monitoring of the user. Simultaneously, other parameters may be displayed, either on a single monitor or on separate displays. For example, FIGS. 15A-15I show signals indicating the output of various sensors that may be on the frame 812, which may be displayed along a common time axis or otherwise correlated, e.g., to movement of the user's eye and/or level of drowsiness. The processor may superimpose or otherwise simultaneously display the video signal in conjunction with the other sensed parameters to allow a physician or other individual to monitor and personally correlate these parameters to the user's behavior.
In a further alternative, the processor may automatically process the signals to monitor or study the user's behavior. For example, the processor may use the output signals to monitor various parameters related to eye movement, such as eye blink duration (EBD), eye blink frequency, eye blink velocity, eye blink acceleration, interblink duration (IBD), PERCLOS, PEROP (percentage eyelid is open), and the like.
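A minimal sketch of deriving several of these blink metrics from a binary open/closed record follows; the sample rate and the 1 = closed / 0 = open encoding are assumptions for illustration.

```python
# Illustrative sketch only: deriving blink metrics from a binary eyelid signal
# (1 = closed, 0 = open) sampled at a fixed rate. The sample rate and signal
# encoding are assumptions, not taken from the patent.

def blink_metrics(closed, sample_rate_hz=100):
    """Returns eye blink durations (EBD), interblink durations (IBD),
    blink count, and PERCLOS for the whole record."""
    dt = 1.0 / sample_rate_hz
    blink_durations, interblink_durations = [], []
    run, prev = 0, 0
    for s in closed + [None]:            # sentinel to flush a final closed run
        if s == prev:
            run += 1
            continue
        if prev == 1:
            blink_durations.append(run * dt)            # a closed run = one blink
        elif s is not None and blink_durations:
            interblink_durations.append(run * dt)       # open gap bounded by blinks
        prev, run = s, 1
    return {
        "EBD_s": blink_durations,
        "IBD_s": interblink_durations,
        "blink_count": len(blink_durations),
        "PERCLOS": sum(closed) / len(closed),
    }

signal = [0] * 50 + [1] * 10 + [0] * 100 + [1] * 15 + [0] * 25
print(blink_metrics(signal))  # two blinks (0.10 s, 0.15 s), one 1.0 s gap, PERCLOS 0.125
```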
The video signals from the camera 830 may be processed to monitor various eye parameters, such as pupillary size, location, e.g., within the four quadrants defined by the crossed bands 850, eye tracking movement, eye gaze distance, and the like. For example, because the camera 830 is capable of detecting the light emitted by the emitters 820, the camera 830 may detect a reference frame projected onto the region of the user's eye by the emitters. FIG. 16 shows an exemplary video output from a camera included in a system having twenty emitters disposed in a vertical arrangement. The camera may detect twenty discrete regions of light arranged as a vertical band. The camera may also detect a “glint” point, G, and/or a moving bright pupil, P. Thus, the movement of the pupil may be monitored in relation to the glint point, G, and/or in relation to the vertical band 1-20.
Because the emitters 820 are fixed to the frame 812, the reference frame 850 remains substantially stationary. Thus, the processor may determine the location of the pupil in terms of orthogonal coordinates (e.g., x-y or angle-radius) relative to the reference frame 850. Alternatively, if the reference frame is eliminated, the location of the pupil may be determined relative to any stationary “glint” point on the user's eye. For example, the camera 830 itself may project a point of light onto the eye that may be reflected and detected by the camera. This “glint” point remains substantially stationary since the camera 830 is fixed to the frame 812.
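As a minimal sketch of expressing pupil position relative to such a stationary glint point, in both Cartesian and angle-radius form (the pixel coordinates and units are assumptions for illustration):

```python
# Illustrative sketch only: expressing pupil position relative to a stationary
# glint point in both Cartesian (x-y) and polar (angle-radius) form.
# Pixel coordinates and units are assumptions.
import math

def pupil_relative_to_glint(pupil_xy, glint_xy):
    dx = pupil_xy[0] - glint_xy[0]
    dy = pupil_xy[1] - glint_xy[1]
    radius = math.hypot(dx, dy)
    angle_deg = math.degrees(math.atan2(dy, dx))
    return {"x": dx, "y": dy, "radius": radius, "angle_deg": angle_deg}

# Pupil offset 12 px in x and -5 px in y from the glint in this frame:
print(pupil_relative_to_glint((112, 95), (100, 100)))
```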
In addition, the video signals from a remote camera that may view the user's face from a distance may be used to monitor various facial measures, such as facial expression, yawning frequency, and the like, in addition to, or instead of, the projected light reference frame from the emitters. In addition or alternatively, the parameters from other sensors may be processed and correlated, such as head orientation, tilt, body movement, physiological parameters, and the like. Preferably, the processor may correlate these parameters to generate a composite fatigue index (CFI) that is a function of two or more of these parameters. When a predetermined CFI is detected, the system 810 may activate an alarm or other notice to the user and/or to another party at a remote location. Thus, the system 810 may provide a more effective way to monitor the user's fatigue, drowsiness, alertness, mental state, and the like. In a further alternative, the system 810 may be used to generate predetermined outputs, e.g., to activate or deactivate equipment, such as a vehicle being operated by the user, when a predetermined condition, e.g., a CFI value, is determined by the system 810.
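A minimal sketch of one way such a composite fatigue index could be formed as a weighted combination of monitored parameters, with an alarm threshold, is shown below; the chosen inputs, scaling, weights, and threshold are hypothetical, not values disclosed by the patent.

```python
# Illustrative sketch only: combining several monitored parameters into a
# composite fatigue index (CFI) and comparing it to an alarm threshold.
# The inputs, scaling, weights, and threshold are hypothetical.

WEIGHTS = {"perclos": 0.5, "blink_duration_s": 0.2, "head_droop": 0.2, "yawns_per_min": 0.1}

def composite_fatigue_index(perclos, blink_duration_s, head_droop, yawns_per_min):
    """Each input is first scaled to roughly 0-1, then combined as a weighted sum."""
    scaled = {
        "perclos": min(perclos / 0.5, 1.0),              # 50% PERCLOS treated as maximal
        "blink_duration_s": min(blink_duration_s / 0.5, 1.0),
        "head_droop": 1.0 if head_droop else 0.0,
        "yawns_per_min": min(yawns_per_min / 3.0, 1.0),
    }
    return sum(WEIGHTS[k] * scaled[k] for k in WEIGHTS)

cfi = composite_fatigue_index(perclos=0.35, blink_duration_s=0.4, head_droop=True, yawns_per_min=2)
if cfi > 0.6:                      # hypothetical alarm threshold
    print(f"CFI={cfi:.2f}: trigger alarm / notify remote monitor")
else:
    print(f"CFI={cfi:.2f}: within normal range")
```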
Alternatively, the processor may be provided on the frame 812, e.g., as part of processor 842, for monitoring the parameters for the occurrence of a predetermined event, such as a predetermined CFI value. Although only a single lens and set of emitters, sensors, and cameras are shown, it will be appreciated that another set may be provided for the other eye of the user of the system 810. In a further alternative, the eye tracking parameters described above may be monitored by a remote camera, e.g., in a fixed position in front of the user, such as on the dashboard of a vehicle and the like. The remote camera may be coupled to the processor, either directly or via its own transmitter, as will be appreciated by those skilled in the art.
Thus, a system in accordance with the present invention may monitor or detect one or more parameters, such as those listed below in Table 1.
TABLE 1
Potential Biometric Measures
EYELID MEASURES:
Percentage of time (t) and the amount the palpebral fissure is opened (PEROP-t, -d, -dt) or closed (PERCLOS-t, -d, -dt), lid droop
Eye Blink Duration (EBD)
Eye Blink Frequency (EBF)
Eye Blink Velocity (EBV)
Eye Blink Acceleration (EBAc) and Deceleration (EBDc)
Interblink Duration (IBD)
Eye blink flurries
PUPIL MEASURES:
Pupillary Appearance or Disappearance (with eyelid movement)
Pupillary Size Measurement (PSM)
Presence and quality of Pupillary Dilation or Constriction (including Hippus)
EYE GAZE MEASURES:
Eye Tracking Movements (ETM), including Directional Nystagmus
Eye Gaze Distance (EGD) and Direction
Eye Movement Distance
Eye Movement Velocity (EMV)
Eye Movement Acceleration (EMA) and Deceleration (EMD)
Eye Movement Frequency (EMF)
Phoria/Eye Drift Measures (PDM)
HEAD ORIENTATION MEASURES:
Head Direction or Orientation (HDir)
HEAD MOVEMENT MEASURES:
Head Nodding Frequency (HNF)
Head Tilt (HT)
OTHER NON-VIDEO SENSOR METRICS:
EEG, EKG, pulse, oxygen saturation, respiration rate, body temperature, skin conductance, actigraphic movements, head tilt sensors
While the invention is susceptible to various modifications and alternative forms, specific examples thereof have been shown in the drawings and are herein described in detail. It should be understood, however, that the invention is not to be limited to the particular forms or methods disclosed, but, to the contrary, the invention is to cover all modifications, equivalents, and alternatives falling within the spirit and scope of the appended claims.

Claims (52)

20. A system for monitoring movement of a person's eye, comprising:
a frame configured to be worn on a person's head;
an array of emitters on the frame for directing light towards an eye of the person when the frame is worn, the array of emitters configured to project a reference frame towards the eye;
an array of sensors on the frame in a predetermined relationship with the array of emitters for detecting light from the array of emitters that is reflected off of respective portions of the eye or its eyelid, each sensor producing an output signal indicating when the respective portion of the eye is covered or not covered by the eyelid;
a camera on the frame for monitoring movement of the eye relative to the reference frame, the camera configured for producing a video signal of a region of the eye and the reference frame; and
a transmitter coupled to the sensor for wireless transmission of the output signal and the video signal to a remote location.
23. A method for monitoring movement of a person's eye using a detection device including an array of emitters that are directed towards an eye of the person when the detection device is worn, and a camera oriented away from the person when the detection device is worn, the method comprising:
emitting light from the array of emitters towards the eye to project a reference frame onto the eye;
monitoring the person's surroundings with the camera; and
generating a graphical output of the movement monitored by the camera relative to the reference frame;
wherein the detection device further comprises one or more sensors, and wherein the method further comprises detecting light from the array of emitters reflected off of the eye with the one or more sensors, the one or more sensors producing a light intensity signal indicating when the eye is open or closed.
29. A system for monitoring movement of a person's eye, comprising:
a device configured to be worn on a person's head such that the device does not interfere substantially with the person's vision along an axis extending directly ahead of a first eye of the person;
one or more emitters on the device for directing light towards the first eye when the device is worn;
an array of sensors on the device directed towards the first eye when the device is worn, the sensors configured for converting images of the first eye into output signals; and
a processor coupled to the sensors for interpreting the output signals to control one or more devices,
wherein the one or more emitters and the sensors are provided on the device at locations offset from the axis to generally minimize interference with the person's vision and such that the array of sensors are oriented directly towards the region surrounding the first eye.
39. A method for monitoring movement of a person's eye using a detection device including one or more emitters and an array of sensors that are directed towards a first eye of the person when the detection device is worn, the method comprising:
placing the detection device on a person's head such that the detection device does not interfere substantially with the person's vision along an axis extending directly ahead of the first eye and the array of sensors are offset from the axis and oriented directly towards a region surrounding the first eye;
emitting light from one or more emitters towards the first eye;
detecting light from the one or more emitters reflected off of the first eye with the array of sensors, the array of sensors producing light intensity signals indicating when the first eye is open or closed; and
interpreting the light intensity signals to control one or more devices.
48. A system for controlling a computing device, comprising:
a frame configured to be worn on a person's head such that the device does not interfere substantially with the person's vision;
a sensor on the frame comprising a lens directed towards the eye of the person when the frame is worn, the sensor generating output signals representing video images of the eye; and
a processor coupled to the sensor for processing the output signals to monitor movement of the eye relative to a reference frame, the processor communicating with an electronic device remote from the frame and interpreting the output signals to control the electronic device,
wherein the processor is configured for monitoring the output signals to detect video images indicating that the person wearing the device has blinked in a predetermined sequence, the processor configured for executing the command on the electronic device based upon the predetermined sequence.
49. A system for monitoring movement of a person's eye, comprising:
an eyeglass frame comprising a nose bridge configured to be placed on the person's nose, and a pair of ear supports configured to be placed over the person's ears such that the frame, when worn on the person's head, does not interfere substantially with the person's vision along an axis extending directly ahead of a first eye of the person;
one or more emitters on the frame for directing light towards the first eye when the frame is worn, the one or more emitters provided on the frame at one or more locations that generally minimize interference with the person's vision along the axis;
a sensor on or adjacent the nose bridge such that the sensor is offset from the axis to generally minimize interference with the person's vision along the axis and oriented directly towards the region surrounding the first eye, the sensor configured for converting images of the first eye into output signals; and
a processor coupled to the sensor for interpreting the output signals to control one or more devices.


Cited By (52)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
US20130131908A1 (en) * | 2006-03-16 | 2013-05-23 | Gray & Company, Inc. | Navigation and control system for autonomous vehicles
US20120314045A1 (en) * | 2009-08-26 | 2012-12-13 | Ecole Polytechnique Federale De Lausanne (Epfl) | Wearable systems for audio, visual and gaze monitoring
US8831732B2 (en) | 2010-04-29 | 2014-09-09 | Cyberonics, Inc. | Method, apparatus and system for validating and quantifying cardiac beat data quality
US9700256B2 (en) | 2010-04-29 | 2017-07-11 | Cyberonics, Inc. | Algorithm for detecting a seizure from cardiac data
US9241647B2 (en) | 2010-04-29 | 2016-01-26 | Cyberonics, Inc. | Algorithm for detecting a seizure from cardiac data
US8562536B2 (en) | 2010-04-29 | 2013-10-22 | Flint Hills Scientific, Llc | Algorithm for detecting a seizure from cardiac data
US8649871B2 (en) | 2010-04-29 | 2014-02-11 | Cyberonics, Inc. | Validity test adaptive constraint modification for cardiac data used for detection of state changes
US9220910B2 (en) | 2010-07-30 | 2015-12-29 | Cyberonics, Inc. | Seizure detection using coordinate data
US8641646B2 (en) | 2010-07-30 | 2014-02-04 | Cyberonics, Inc. | Seizure detection using coordinate data
US8948855B2 (en) | 2010-09-16 | 2015-02-03 | Flint Hills Scientific, Llc | Detecting and validating a detection of a state change from a template of heart rate derivative shape or heart beat wave complex
US8571643B2 (en) | 2010-09-16 | 2013-10-29 | Flint Hills Scientific, Llc | Detecting or validating a detection of a state change from a template of heart rate derivative shape or heart beat wave complex
US9020582B2 (en) | 2010-09-16 | 2015-04-28 | Flint Hills Scientific, Llc | Detecting or validating a detection of a state change from a template of heart rate derivative shape or heart beat wave complex
US8452387B2 (en) | 2010-09-16 | 2013-05-28 | Flint Hills Scientific, Llc | Detecting or validating a detection of a state change from a template of heart rate derivative shape or heart beat wave complex
US8684921B2 (en) | 2010-10-01 | 2014-04-01 | Flint Hills Scientific Llc | Detecting, assessing and managing epilepsy using a multi-variate, metric-based classification analysis
US8337404B2 (en) | 2010-10-01 | 2012-12-25 | Flint Hills Scientific, Llc | Detecting, quantifying, and/or classifying seizures using multimodal data
US8852100B2 (en) | 2010-10-01 | 2014-10-07 | Flint Hills Scientific, Llc | Detecting, quantifying, and/or classifying seizures using multimodal data
US8888702B2 (en) | 2010-10-01 | 2014-11-18 | Flint Hills Scientific, Llc | Detecting, quantifying, and/or classifying seizures using multimodal data
US8382667B2 (en) | 2010-10-01 | 2013-02-26 | Flint Hills Scientific, Llc | Detecting, quantifying, and/or classifying seizures using multimodal data
US8945006B2 (en) | 2010-10-01 | 2015-02-03 | Flunt Hills Scientific, LLC | Detecting, assessing and managing epilepsy using a multi-variate, metric-based classification analysis
US9504390B2 (en) | 2011-03-04 | 2016-11-29 | Globalfoundries Inc. | Detecting, assessing and managing a risk of death in epilepsy
US12383190B2 (en) | 2011-03-04 | 2025-08-12 | Flint Hills Scientific, Llc | Detecting, assessing and managing extreme seizure events
US8725239B2 (en) | 2011-04-25 | 2014-05-13 | Cyberonics, Inc. | Identifying seizures using heart rate decrease
US9402550B2 (en) | 2011-04-29 | 2016-08-02 | Cybertronics, Inc. | Dynamic heart rate threshold for neurological event detection
US9177202B2 (en) * | 2011-07-11 | 2015-11-03 | Toyota Jidosha Kabushiki Kaisha | Red-eye detection device
US20140147019A1 (en) * | 2011-07-11 | 2014-05-29 | Toyota Jidosha Kabushiki Kaisha | Red-eye detection device
US10206591B2 (en) | 2011-10-14 | 2019-02-19 | Flint Hills Scientific, Llc | Seizure detection methods, apparatus, and systems using an autoregression algorithm
US9681836B2 (en) | 2012-04-23 | 2017-06-20 | Cyberonics, Inc. | Methods, systems and apparatuses for detecting seizure and non-seizure states
US10448839B2 (en) | 2012-04-23 | 2019-10-22 | Livanova Usa, Inc. | Methods, systems and apparatuses for detecting increased risk of sudden death
US11596314B2 (en) | 2012-04-23 | 2023-03-07 | Livanova Usa, Inc. | Methods, systems and apparatuses for detecting increased risk of sudden death
US20150305686A1 (en) * | 2012-11-10 | 2015-10-29 | The Regents Of The University Of California | Systems and methods for evaluation of neuropathologies
US10258291B2 (en) * | 2012-11-10 | 2019-04-16 | The Regents Of The University Of California | Systems and methods for evaluation of neuropathologies
US9265458B2 (en) | 2012-12-04 | 2016-02-23 | Sync-Think, Inc. | Application of smooth pursuit cognitive testing paradigms to clinical drug development
US12144992B2 (en) | 2013-01-22 | 2024-11-19 | Livanova Usa, Inc. | Methods and systems to diagnose depression
US11103707B2 (en) | 2013-01-22 | 2021-08-31 | Livanova Usa, Inc. | Methods and systems to diagnose depression
US10220211B2 (en) | 2013-01-22 | 2019-03-05 | Livanova Usa, Inc. | Methods and systems to diagnose depression
US9380976B2 (en) | 2013-03-11 | 2016-07-05 | Sync-Think, Inc. | Optical neuroinformatics
US9418617B1 (en) | 2013-03-13 | 2016-08-16 | Google Inc. | Methods and systems for receiving input controls
US9489817B2 (en) * | 2015-01-29 | 2016-11-08 | Vigo Technologies, Inc. | Infrared sensing of eye and eyelid movements to detect drowsiness
US11559243B2 (en) | 2016-05-05 | 2023-01-24 | Mansour Zarreii | System and method for evaluating neurological conditions
US9883814B1 (en) | 2016-05-05 | 2018-02-06 | Mansour Zarreii | System and method for evaluating neurological conditions
US11086473B2 (en) | 2016-07-28 | 2021-08-10 | Tata Consultancy Services Limited | System and method for aiding communication
WO2018020334A1 (en) * | 2016-07-28 | 2018-02-01 | Tata Consultancy Services Limited | System and method for aiding communication
TWI645366B (en) | 2016-12-13 | 2018-12-21 | 國立勤益科技大學 | Image semantic conversion system and method applied to home care
US11707197B2 (en) | 2017-12-22 | 2023-07-25 | Resmed Sensor Technologies Limited | Apparatus, system, and method for physiological sensing in vehicles
US12033485B2 (en) | 2017-12-22 | 2024-07-09 | Resmed Sensor Technologies Limited | Apparatus, system, and method for motion sensing
US11615688B2 (en) | 2017-12-22 | 2023-03-28 | Resmed Sensor Technologies Limited | Apparatus, system, and method for motion sensing
US12207904B2 (en) | 2017-12-22 | 2025-01-28 | Resmed Sensor Technologies Limited | Apparatus, system, and method for physiological sensing in vehicles
US12303287B2 (en) | 2017-12-22 | 2025-05-20 | Resmed Sensor Technologies Limited | Apparatus, system, and method for health and medical sensing
US11813081B2 (en) | 2020-06-15 | 2023-11-14 | Beijing Xiaomi Mobile Software Co., Ltd. | Intelligent glasses and glasses box
US11867915B2 (en) | 2021-03-31 | 2024-01-09 | Microsoft Technology Licensing, Llc | Head mounted display with obscured light emitting diodes
WO2022212029A1 (en) * | 2021-03-31 | 2022-10-06 | Microsoft Technology Licensing, Llc | Head mounted display with obscured light emitting diodes
US12226162B2 (en) | 2022-05-09 | 2025-02-18 | Kure, Llc | Smart eye mask

Also Published As

Publication number | Publication date
USRE39539E1 (en) | 2007-04-03
USRE41376E1 (en) | 2010-06-15

Similar Documents

Publication | Publication Date | Title
USRE42471E1 (en) | System and method for monitoring eye movement
US6542081B2 (en) | System and method for monitoring eye movement
CA2967756C (en) | Biosensors, communicators, and controllers monitoring eye movement and methods for using them
US6163281A (en) | System and method for communication using eye movement
US10039445B1 (en) | Biosensors, communicators, and controllers monitoring eye movement and methods for using them
US20110077548A1 (en) | Biosensors, communicators, and controllers monitoring eye movement and methods for using them
US8647268B2 (en) | Patient monitoring apparatus
WO2008127316A1 (en) | Security and monitoring apparatus
CA2980062A1 (en) | Method and apparatus for biological evaluation
CN105144199A (en) | Imaging device based occupant monitoring system supporting multiple functions
Mabry et al. | Commercial motor vehicle operator fatigue detection technology catalog and review

Legal Events

Date | Code | Title | Description

AS | Assignment

Owner name: EYE-COM CORPORATION, NEVADA
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:TORCH, WILLIAM C., DR.;REEL/FRAME:030964/0128
Effective date: 20130328

Owner name: EYEFLUENCE, INC., NEVADA
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:EYE-COM CORPORATION;REEL/FRAME:030964/0386
Effective date: 20130806

FPAY | Fee payment

Year of fee payment: 12

