US7312699B2 - Ear associated machine-human interface

Info

Publication number
US7312699B2
Authority
US (United States)
Prior art keywords
user, ear, transmitting apparatus, set forth, electronic module
Legal status
Expired - Lifetime
Application number
US10/816,508
Other versions
US20050238194A1 (en)
Inventor
T. Eric Chornenky
Current Assignee
Individual
Original Assignee
Individual
Application filed by Individual
Priority to US10/816,508 (US7312699B2)
Priority to PCT/US2004/038974 (WO2005104618A2)
Priority to AU2004318969A (AU2004318969A1)
Priority to EP04811662A (EP1736032A2)
Publication of US20050238194A1
Publication of US7312699B2 (application granted)

Abstract

A human-machine interface can detect when a user's ear is pulled back to initiate any of a plurality of procedures. Such procedures include turning on a TV using a laser attached to the user, starting an additional procedure by speaking a command, communicating with other users in environments with high ambient noise, and interacting with the internet. Head position sensors detect the position of the user's head and either initiate a procedure if a characteristic of the head position or positions meets certain criteria, or pass the head position information to another device.

Description

This application claims the benefit of U.S. Provisional Application No. 60/459,289 filed Apr. 1, 2003.
FIELD OF THE INVENTION
The present invention generally relates to a human-machine interface structure and method.
BACKGROUND OF THE INVENTION
There are many human activities which can be made possible, or made easier, by a human-machine interface with which a human can select certain options, such as turning a TV on or off, without having to use his or her hands, or can communicate with a computer using only his or her voice. Also, information about the condition of a person, such as heart rate, can be monitored without restricting the movements of the person.
Human-machine interface structures are known in the art. For example U.S. Pat. No. 6,696,973 to Ritter et al., and the references cited therein, teach communications systems which are mobile and carried by a user. U.S. Pat. No. 6,694,180 to Boesen describes biopotential sensing and medical monitoring which uses wireless communication to transmit the information from the sensors.
However, a human-machine interface that is convenient to use and is relatively inexpensive to manufacture is still highly desirable.
SUMMARY OF THE INVENTION
Shown in a preferred embodiment of the present invention is a transmitting apparatus having a sensor for detecting an ear pull of a user and a laser worn by the user. An electronic module is coupled to both the ear pull sensor and the laser and generates a laser beam upon detection of the ear pull.
Also shown in a preferred embodiment of the present invention is a transmitting apparatus for a user which has a plurality of sensors for detecting a head position of the user, a RF transmitter and an electronic module coupled to the plurality of sensors and to the RF transmitter. The electronic module generates an encoded RF signal containing information about the head position of the user.
Further shown in a preferred embodiment of the invention is a communication apparatus including a portable computer worn by a user together with a microphone and speaker worn by the user and an electronic module. The electronic module is coupled to the microphone, the speaker and the portable computer and receives a voice message from the microphone and sends the voice message to the portable computing device, wherein the portable computing device, in response to the voice message, sends an answering audio communication to the electronic module which, in turn, transfers the audio communication to the speaker.
Still further shown in a preferred embodiment of the present invention is a method for transmitting commands including sensing when an ear of a user is pulled back and turning on a laser mounted on the user when the sensing occurs.
OBJECTS OF THE INVENTION
It is, therefore, an object of the present invention to provide a human-machine interface that is convenient to use and is relatively inexpensive to manufacture.
Another object is to provide a head worn communications device which communicates when a user pulls back one of his or her ears.
A further object is to provide a human-machine interface that will communicate with a plurality of devices.
A still further object of the present invention is to provide a method for communicating the head position of a user to other devices.
An additional object of the present invention is to provide hands-free communication between a user and the internet.
In addition to the above-described objects and advantages of the present invention, various other objects and advantages will become more readily apparent to those persons who are skilled in the same and related arts from the following more detailed description of the invention, particularly when such description is taken in conjunction with the attached drawing figures and appended claims.
DESCRIPTION OF THE DRAWING
FIG. 1A is a block diagram of one embodiment of the human-machine interface of the present invention;
FIG. 1B is FIG. 1A with several elements removed to show one minimal configuration of the present invention;
FIG. 1C shows an alternative embodiment in which a modulated retroflector is worn on each side of the head of a user 14;
FIG. 2 is FIG. 1A modified to show other types of devices that can be used with the human-machine interface of the present invention;
FIG. 3 shows two sides of a user's head; and
FIG. 4 is the user of FIG. 1A wearing a helmet with a laser detector mounted on the helmet.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENT
Prior to proceeding to a much more detailed description of the present invention, it should be noted that identical components which have identical functions have been identified with identical reference numerals throughout the several views illustrated in the drawing figures for the sake of clarity and understanding of the invention.
Turning now to the drawings, FIG. 1A shows several biometric devices inside a dashed line box 10 proximate to an ear 12 of a user 14. The user 14 also has a pair of glasses 16. Mounted on the temple piece 18 of the glasses 16 is a laser 20 and a camera 22. Also shown in FIG. 1A is a portable computing device which, in the preferred embodiment of the invention, is a personal data assistant (PDA) 24 with a location sensing device which, in the preferred embodiment of the invention, is a local positioning system (LPS) module or a global positioning system (GPS) module 26 attached thereto, a computer 28 connected by a cable 30 to the internet 32, and a TV set 34.
The biometric devices inside the dashed line box 10 include muscle actuation detectors which, in FIG. 1A, are a strain gauge 36 attached to the skin of the user 14, a second strain gauge 38 attached to or embedded in the temple piece 18, a third strain gauge 40 attached to the user's skin and positioned at least partially behind the ear 12 of the user 14, a fourth strain gauge 41 placed on the bridge of the glasses 16, capacitance plates 42 (attached to the back of the ear 12) and 44 (attached to the head behind the ear 12), an ear lobe clip 46 and a combination microphone and ambient noise reducing speaker 48 placed inside the ear 12. Also shown is an RFID chip 47 placed underneath the skin of the user 14 behind the ear 12. The RFID chip could also be attached less intrusively by placing it in an ear ring or in the ear clip 46, or by attaching it to the ear 12 with two magnets acting as a clamp. The capacitance plates 42 and 44, the strain gauges 36, 38 and 40 and the ear lobe clip 46 are connected by wires to an electronic module 50. The electronic module 50 contains a battery 51 to power the electronic module 50, two tilt meters 52, and a magnetic sensor 54. The two tilt meters 52 measure the tilt from horizontal in a direction from the back to the front of the user's head, and in a direction from one ear to the other ear. The magnetic sensor 54 senses the direction of the earth's magnetic field. The two tilt meters 52 and the magnetic sensor 54 are used to determine the position of the user's head.
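As an illustration only, the head-position computation from the two tilt meters 52 and the magnetic sensor 54 described above might be sketched as follows. This is a minimal sketch, not part of the disclosed apparatus: the function name, the use of radians, and the simplifying assumption that the magnetometer supplies horizontal field components are all hypothetical.

```python
import math

def head_orientation(tilt_front_back, tilt_ear_to_ear, mag_x, mag_y):
    """Combine the two tilt readings (radians from horizontal) with the
    horizontal components of the earth's magnetic field to estimate the
    head's pitch, roll and compass heading."""
    pitch = tilt_front_back   # back-to-front tilt meter: nod up/down
    roll = tilt_ear_to_ear    # ear-to-ear tilt meter: tilt toward a shoulder
    # Heading from the horizontal field components; atan2 yields 0 rad
    # when the user faces magnetic north.
    heading = math.atan2(mag_y, mag_x) % (2 * math.pi)
    return pitch, roll, heading
```

A level head facing magnetic north would report `(0.0, 0.0, 0.0)`; a real implementation would additionally tilt-compensate the magnetometer reading.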
The TV 34 has a laser light sensor 56 which responds in a predetermined manner upon detecting a laser light modulated with a predetermined code.
The system shown in FIG. 1A can operate in a number of different ways. In a relatively simple application, the user 14 aims the laser 20 at the sensor 56 and wiggles or pulls back the ear 12. Only one of the ear movement sensors 36, 38, 40 and the combination of the plates 42 and 44 is needed, for example the strain gauge 38. Other ear movement detectors could also be used, such as detectors that detect the change in capacitance between capacitor plates 44 and 45 or between plates 45 and 49, the capacitance between the body of the user 14 and capacitance plate 44, or between the frames of the glasses 16 and the capacitance plate 44. Also, the ear 12 movement can be detected by detecting a change in the magnitude of an RF field or a magnetic field using a detector in the electronic module 50. The RF generator or magnet could be located in the ear clip 46. Also, the resistance of the user's skin proximate to the ear 12 would change sufficiently to detect an ear 12 movement. The strain gauge 38, together with the electronic module 50, detects the change of the strain in the temple piece 18 when the ear 12 is pulled back. When the ear movement is detected, the electronic module 50, connected to the laser generator 20 by wires hidden behind or inside the temple piece 18 of the glasses 16, causes the laser 20 to send the predetermined code which activates the sensor 56 to turn on or turn off the TV set 34. This simple application uses components that are relatively inexpensive to manufacture.
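The detect-then-transmit sequence above can be sketched as a short illustration. Everything here is an assumption for illustration only: the normalized strain threshold, the example code bits, the pulse duration and all function names are hypothetical, not part of the disclosure.

```python
# Hypothetical sketch of the ear-pull -> laser-code path: a strain-gauge
# sample stream is thresholded to detect an ear pull, which triggers
# transmission of a fixed on/off code on the laser.
PULL_THRESHOLD = 0.8                  # normalized strain level (assumption)
TV_POWER_CODE = [1, 0, 1, 1, 0, 1]    # illustrative modulation pattern

def detect_ear_pull(samples, threshold=PULL_THRESHOLD):
    """Return True once any strain sample crosses the pull threshold."""
    return any(s >= threshold for s in samples)

def laser_pulses(code):
    """Translate code bits into (laser_on, duration_ms) pulse pairs."""
    return [(bit == 1, 10) for bit in code]

def on_ear_pull(samples):
    """Emit the TV power code only when an ear pull was detected."""
    if detect_ear_pull(samples):
        return laser_pulses(TV_POWER_CODE)
    return []
```

The sensor 56 side would simply demodulate the same pulse pattern and compare it against its stored code before toggling the set.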
The laser 20 could have a beam which is narrow or which diverges to cover a larger area than a narrow beam. The laser 20 could have a variable divergence that the user could adjust. The laser 20 could also be replaced with other types of light sources such as an LED, an LCD or a flashlight. Still other types of signaling means could be used, such as an ultrasonic generator or a high frequency (e.g., 60 GHz) transmitter which would generate a narrow RF signal.
Other types of strain gauges could be used, such as the flexible strain gauge shown in U.S. Pat. No. 6,360,615 to Smela, which could be applied to the back of the ear 12.
Detecting the movement of the ear 12 using a capacitance detector can also be accomplished by attaching or embedding two capacitor plates in the temple piece 18 of the glasses 16, thereby eliminating the need to attach the capacitor plates to the skin of the user 14. The movement of the ear 12 can be detected by the change of capacitance between the two plates.
FIG. 1B shows a minimal configuration of the human-machine interface of the present invention which uses only the laser 20, strain gauge 40 and electronic module 50 to control the TV set 34. An ear bracket 63 is used to hold the human-machine interface components behind the ear 12 of the user 14.
FIG. 1C shows an alternative embodiment where a modulated retroflector is worn on each side of the head of a user 14. The modulated retroflector shown in FIG. 1C is worn as a stud ear ring 65 or a dangle ear ring 67. The modulated retroflector 65, 67 could also be partially covered by placing the modulated retroflector 65, 67 in the hair of the user 14. In operation the TV set 34 would emit either a light signal or an RF signal from a combination transmitter and receiver 69. The signal from the combination transmitter and receiver 69 would be received by both of the modulated retroflectors 65, 67 on each side of the head of the user 14 when the user 14 is looking at the TV set 34, and at least one of the modulated retroflectors 65, 67 will not receive the signal if the user 14 is looking in another direction.
Each of the modulated retroflectors 65, 67 will, upon receipt of a signal from the combination transmitter and receiver 69, emit a light or RF signal which will be received by the combination transmitter and receiver 69. The combination transmitter and receiver 69 will be able to detect whether both modulated retroflectors 65, 67 on the user 14 are responding by detecting differences in the signals sent by each modulated retroflector. Such differences could be different frequencies or codes sent by each modulated retroflector 65, 67. When the user 14 pulls back ear 12, the modulated retroflectors 65, 67 will change signals, which the combination transmitter and receiver 69 will detect. If the combination transmitter and receiver 69 detects the change in signal from both modulated retroflectors 65, 67, the electronics in the TV set 34 will perform a predetermined procedure such as turning on the TV set 34.
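The decision logic on the TV side of this retroflector scheme can be sketched briefly. This is a minimal sketch under stated assumptions: the "left"/"right" labels standing in for the per-retroflector codes, and both function names, are hypothetical.

```python
def user_facing_tv(responses):
    """The transmitter/receiver 69 treats the user as facing the set only
    when BOTH retroflectors (told apart by their codes) answer a query."""
    return {"left", "right"} <= set(responses)

def should_toggle_tv(responses, signal_changed):
    """Trigger the predetermined procedure only when the user faces the
    set AND the ear pull changed the signal from both retroflectors."""
    return user_facing_tv(responses) and all(
        signal_changed.get(side, False) for side in ("left", "right"))
```

Requiring both retroflectors to respond is what distinguishes "looking at the set" from merely being in the room; requiring both signals to change distinguishes a deliberate ear pull from a reflection glitch on one side.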
The TV set 34 could have additional sensors 58 for controlling other TV functions such as volume control while the ear 12 is pulled back. The volume increases using one of the sensors 58 and decreases using another of the sensors 58. Two other of the sensors 58 could be used to select the TV channel in the same manner.
The electronic module 50 can communicate with the PDA 24 and the computer 28 by wireless communication such as the Bluetooth protocol. The computer 28 can, in turn, communicate with the internet 32. Using the combination microphone and speaker 48, the user 14 can send audio information to the electronic module 50, which can then digitize the audio signal and send it to the PDA 24 for voice recognition. If the audio is too complex for the PDA 24, the audio can be sent to the computer 28 for voice recognition. The computer 28 can access the internet 32 for help in the voice recognition if necessary. Finally, if none of the equipment in FIG. 1A can recognize the audio, the PDA, communicating through the electronic module 50 and the combination microphone and speaker 48, can tell the user 14 to repeat the statement or can ask specific questions of the user 14, which the user 14 can answer by pulling back the ear 12 either once or twice to answer a yes or no question.
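The escalation chain above (module, then PDA, then computer, then internet, and finally asking the user) amounts to trying recognizers in order of increasing power. A minimal sketch, assuming a hypothetical interface in which each recognizer returns text on success or None on failure:

```python
def recognize_with_fallback(audio, recognizers):
    """Try each (name, recognize) pair in order; return the first hit.

    `recognizers` is ordered from least to most capable, mirroring the
    module 50 -> PDA 24 -> computer 28 -> internet 32 chain.  A (None,
    None) result means every stage failed and the system should fall
    back to yes/no questions answered by ear pulls."""
    for name, recognize in recognizers:
        text = recognize(audio)
        if text is not None:
            return name, text
    return None, None
```

For example, a chain of two stub recognizers where only the second understands the utterance would return the second stage's name together with its transcription.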
There could also be a set of predetermined voice commands to which the user 14 is restricted. The voice recognition software to recognize the limited list of commands is less complex and more accurate than the software needed to recognize all words. A voice command such as "channel 59", spoken when the ear 12 is pulled back, would be decoded either directly by the electronic module 50 or by the PDA 24, encoded and sent back to the electronic module 50, which would, in turn, modulate the laser beam from the laser 20 with the correct code; the sensor 56 would decode the beam and the TV set 34 would change the channel to channel 59. The laser beam would therefore have to be aimed at the sensor 56 to transmit the encoded laser beam signal to the TV set 34. The same sequence could be used to set a thermostat, a VCR, etc.
There are some operations which do not require the use of the laser 20. For example, a user 14 could say "time" while pulling back the ear 12 and the time in an audio format would be sent to the speaker in the combination microphone and speaker 48. Also, a telephone number could be spoken and a telephone call would be made, and the call could be terminated when the user 14 says "hang up".
In this manner, commands and communication of increasing complexity can be achieved, ranging from simply recording an audio message to interacting with other applications, such as viewing and taking a picture of a home appliance that needs repair and having the PDA 24, the computer 28 and the internet recognize the appliance and provide the information needed to repair it.
The laser 20 can be used to send commands to or query many products, such as notifying a traffic light that the user wants to cross the street along with the amount of time the user needs to cross the street. The laser could also be used by emergency personnel to cause traffic lights to turn green for them when they are going to an emergency.
Pulling the ear 12 back can simply be a single pull or can be a more complex action such as pulling back and holding the ear 12 back until an object, such as a TV, reaches a desired set point, such as the wanted channel. Another action can be to pull back the ear 12 twice within 2 seconds, etc. Even more complex movements can be used, such as movements which may resemble Morse code signals or be actual Morse code. It is believed that some individuals can, with training, eventually control the movement of either ear separately and independently, thus providing a user interface capable of even more selectivity, complexity and discrimination.
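The single-pull versus double-pull-within-2-seconds distinction above is a small gesture classifier over pull timestamps. A minimal sketch (the function name, the timestamp representation and the treatment of extra pulls are illustrative assumptions):

```python
def classify_pulls(timestamps, window=2.0):
    """Classify a burst of ear-pull timestamps (in seconds).

    Returns 'double' when the first two pulls fall within `window`
    seconds of each other, 'single' otherwise, and None for no pulls.
    Hold-type and Morse-like gestures would need richer logic."""
    if not timestamps:
        return None
    if len(timestamps) >= 2 and timestamps[1] - timestamps[0] <= window:
        return "double"
    return "single"
```

With this scheme a pull at t=0 s followed by one at t=1.5 s reads as a double pull, while pulls 3 s apart read as two singles, which is why the 2-second window matters.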
Also, for a novice user the ear can be pushed back by hand until the user develops the ability to pull back his or her ear without using a hand.
The ear clip 46 can be used to monitor the user's physical condition, such as pulse rate and pulse oximetry. Other sensors can be attached to the user and wired to the electronic module 50, such as an accelerometer, for monitoring other body parameters such as whether the user 14 has a fever or not and whether the person is awake, has fallen, etc.
A simple driving drowsiness detector can be made by having the electronic module 50 issue sporadic random tones to the user 14 using the combination microphone and speaker 48 and requiring the user 14 to respond with an ear wiggle movement at that time. The response delay would indicate the user's reflex time and degree of sleepiness. A prolonged delay would result in a much louder tone to wake up the user 14.
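The tone-and-response test above reduces to grading the delay between a tone and the ear-wiggle reply. A minimal sketch; the function name, the second-based limits and the string labels are illustrative assumptions rather than values from the disclosure:

```python
def assess_response(tone_time, wiggle_time, alert_limit=1.0, wake_limit=3.0):
    """Grade the reflex delay between a test tone and the ear wiggle.

    tone_time / wiggle_time are timestamps in seconds; wiggle_time is
    None when the user never responded.  A missing or very late reply
    triggers the much louder wake-up tone."""
    if wiggle_time is None or wiggle_time - tone_time > wake_limit:
        return "sound_loud_alarm"
    delay = wiggle_time - tone_time
    return "alert" if delay <= alert_limit else "drowsy"
```

The module 50 would call this after each sporadic tone and escalate only on the "sound_loud_alarm" result.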
Using a camera, either the camera 22 or another camera, the user 14 could pull back the ear 12 and say "camera mode" to tell the electronic module 50 to cause the camera to take a picture when the ear 12 is pulled back. Other camera mode activation means could be used, such as a sequence of ear pulls. If the camera is a stand-alone camera and the orientation of the camera can be remotely controlled, the tilt sensors 52 and magnetic sensor 54 would be used to detect what area the user 14 is looking at, and the camera would also point at the same area. Thus the user 14 at a sporting event could aim the camera and command the camera to take a picture simply by looking in the desired direction and pulling the ear 12 back.
The combination microphone and speaker 48 could also contain an actuator which would provide tactile signaling for situations such as when the ambient noise is too high for reliable communication using the combination microphone and speaker 48 alone. The tactile signaling could be a single touch or a pattern of touches.
The electronic module 50 and the combination microphone and speaker 48 could be used as a cell phone with the proper electronics inside the module 50.
FIG. 2 shows the biometric system of FIG. 1A, but is more generalized as to the devices that the laser beam can be used on. The target 60 can be, for example, a stereo sound system with detectors which enable selecting a particular station or the type of music the user wants to hear, an appliance which needs repair as discussed above, a VCR, a lamp, a thermostat or a burglar alarm system. The target 60 could be a refrigerator or a drawer having a laser detection device which, when queried, would provide audio or digital feedback as to the contents of the refrigerator or drawer. The target 60 could be a door lock which would open when a correctly encoded laser signal is beamed to its detector. Of course, the predetermined signal could be sent via an RF signal rather than by the laser 20. In FIG. 2 the laser 20 of FIG. 1A could be modified to detect bar code labels. The reading of bar codes and the connections to the internet could provide information about a product which cannot be obtained by observing the product alone.
The target 60 could have a sensor 61 which would receive light or RF signals from the user 14. In this embodiment the user 14 would compose a message and enter the message as an audio signal which would be stored in the PDA 24, the electronic module 50 or a storage device shown as element 38 for this embodiment. When the user 14 approaches the target 60 and pulls back ear 12, the stored message is sent as an audio message or a binary message to the sensor 61 and the target 60 will either immediately respond to the message or will store the message for later retrieval.
The target 60 could be a luminescent screen which could be written on with the laser 20 when it emits a blue light.
FIG. 3 shows the microphone 64 of the combination microphone and speaker 48 of FIG. 1A placed in one ear and the speaker 66 placed in the other ear. The speaker 66 is connected to the electronic module 50 by a wire 68. The use of the microphone 64 in one ear and the speaker 66 in the other ear attenuates the feedback from the speaker to the microphone in the combination microphone and speaker 48 of FIG. 1A.
FIG. 4 shows the biometric devices and system of FIG. 1A with the addition of a helmet 70 which soldiers or firemen might use. The helmet 70 has a laser light detector 72 on the back of the helmet and a wire 74 from the helmet 70 to the electronic module 50. The laser light detector 72 allows another person with essentially the same equipment to communicate with the user 14 by aiming the other person's laser light at the laser light detector 72. The apparatus of FIG. 4 allows for secure communication from one person to another, and allows communication when there is a high degree of ambient noise, since the combination microphone and speaker 48 is in the ear canal, which allows the words of the sender to be detected without much ambient noise and the receiver to receive the communication directly into his ear. The ear 12 can still receive normal voice communication.
The identity of a user 14 can be verified using the RFID chip 47. The electronic module 50 would query the RFID chip 47 to verify the identity of the user.
Although the invention has been described in part by making detailed reference to a certain specific embodiment, such detail is intended to be, and will be understood to be, instructional rather than restrictive. It will be appreciated by those skilled in the art that many variations may be made on the structure and mode of operation without departing from the spirit and scope of the invention as disclosed in the teachings contained herein.

Claims (16)


Cited By (41)

* Cited by examiner, † Cited by third party
Publication numberPriority datePublication dateAssigneeTitle
US20050248719A1 (en)*2003-10-092005-11-10Howell Thomas AEvent eyeglasses
US20050248717A1 (en)*2003-10-092005-11-10Howell Thomas AEyeglasses with hearing enhanced and other audio signal-generating capabilities
US20050264752A1 (en)*2003-10-092005-12-01Howell Thomas AEyewear supporting after-market electrical components
US20060023158A1 (en)*2003-10-092006-02-02Howell Thomas AEyeglasses with electrical components
US20060236121A1 (en)*2005-04-142006-10-19Ibm CorporationMethod and apparatus for highly secure communication
US20070046887A1 (en)*2003-10-092007-03-01Howell Thomas AEyewear supporting after-market electrical components
US20070186330A1 (en)*2004-04-152007-08-16Howell Thomas AHat with a radiation sensor
US7438410B1 (en)2003-10-092008-10-21Ip Venture, Inc.Tethered electrical components for eyeglasses
US20080278678A1 (en)*2003-10-092008-11-13Howell Thomas AEyeglasses with user monitoring
US7500746B1 (en)2004-04-152009-03-10Ip Venture, Inc.Eyewear with radiation detection system
US7543934B2 (en)2006-09-202009-06-09Ipventures, Inc.Eyeglasses with activity monitoring and acoustic dampening
US7677723B2 (en)2003-10-092010-03-16Ipventure, Inc.Eyeglasses with a heart rate monitor
US7792552B2 (en)2003-04-152010-09-07Ipventure, Inc.Eyeglasses for wireless communications
US7806525B2 (en)2003-10-092010-10-05Ipventure, Inc.Eyeglasses having a camera
US8109629B2 (en)2003-10-092012-02-07Ipventure, Inc.Eyewear supporting electrical components and apparatus therefor
US8337013B2 (en)2004-07-282012-12-25Ipventure, Inc.Eyeglasses with RFID tags or with a strap
US8465151B2 (en)2003-04-152013-06-18Ipventure, Inc.Eyewear with multi-part temple for supporting one or more electrical components
US9405135B2 (en)2011-09-152016-08-02Ipventure, Inc.Shutter eyewear
US9451068B2 (en)2001-06-212016-09-20Oakley, Inc.Eyeglasses with electronic components
US9494807B2 (en)2006-12-142016-11-15Oakley, Inc.Wearable high resolution audio visual interface
US9619201B2 (en)2000-06-022017-04-11Oakley, Inc.Eyewear with detachable adjustable electronics module
US9720258B2 (en)2013-03-152017-08-01Oakley, Inc.Electronic ornamentation for eyewear
US9720260B2 (en)2013-06-122017-08-01Oakley, Inc.Modular heads-up display system
US9864211B2 (en)2012-02-172018-01-09Oakley, Inc.Systems and methods for removably coupling an electronic device to eyewear
US10042186B2 (en)2013-03-152018-08-07Ipventure, Inc.Electronic eyewear and display
US10222617B2 (en)2004-12-222019-03-05Oakley, Inc.Wearable electronically enabled interface system
US10310296B2 (en)2003-10-092019-06-04Ingeniospec, LlcEyewear with printed circuit board
US10344960B2 (en)*2017-09-192019-07-09Bragi GmbHWireless earpiece controlled medical headlight
US10345625B2 (en)2003-10-092019-07-09Ingeniospec, LlcEyewear with touch-sensitive input surface
US10567564B2 (en)2012-06-152020-02-18Muzik, Inc.Interactive networked apparatus
US10624790B2 (en)2011-09-152020-04-21Ipventure, Inc.Electronic eyewear therapy
US10777048B2 (en)2018-04-122020-09-15Ipventure, Inc.Methods and apparatus regarding electronic eyewear applicable for seniors
US11513371B2 (en)2003-10-092022-11-29Ingeniospec, LlcEyewear with printed circuit board supporting messages
US11630331B2 (en)2003-10-092023-04-18Ingeniospec, LlcEyewear with touch-sensitive input surface
US11644693B2 (en)2004-07-282023-05-09Ingeniospec, LlcWearable audio system supporting enhanced hearing support
US11733549B2 (en)2005-10-112023-08-22Ingeniospec, LlcEyewear having removable temples that support electrical components
US11829518B1 (en)2004-07-282023-11-28Ingeniospec, LlcHead-worn device with connection region
US11852901B2 (en)2004-10-122023-12-26Ingeniospec, LlcWireless headset supporting messages and hearing enhancement
US12044901B2 (en)2005-10-112024-07-23Ingeniospec, LlcSystem for charging embedded battery in wireless head-worn personal electronic apparatus
US12183341B2 (en)2008-09-222024-12-31St Casestech, LlcPersonalized sound management and method
US12249326B2 (en)2007-04-132025-03-11St Case1Tech, LlcMethod and device for voice operated control

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication numberPriority datePublication dateAssigneeTitle
US20060153409A1 (en)*2005-01-102006-07-13Ming-Hsiang YehStructure of a pair of glasses
DE102006012451A1 (en)*2006-03-172007-09-20Albert-Ludwigs-Universität Freiburg Imaging device
JP2010510602A (en)*2006-11-222010-04-02チョルネンキー、ティー・イー Security and monitoring device
JP4935545B2 (en)*2007-07-092012-05-23ソニー株式会社 Operation system
US8655004B2 (en)*2007-10-162014-02-18Apple Inc.Sports monitoring system for headphones, earbuds and/or headsets
US20100289912A1 (en)*2009-05-142010-11-18Sony Ericsson Mobile Communications AbCamera arrangement with image modification
US20100308999A1 (en)*2009-06-052010-12-09Chornenky Todd ESecurity and monitoring apparatus
WO2011022409A1 (en)*2009-08-172011-02-24Verto Medical Solutions, LLCEar sizing system and method
US9050029B2 (en)2010-01-062015-06-09Harman International Industries, Inc.Image capture and earpiece sizing system and method
KR20120046937A (en)*2010-11-032012-05-11삼성전자주식회사Method and apparatus for providing 3d effect in video device
EP2469743B1 (en)2010-12-232019-02-20Nagravision S.A.A system to identify a user of television services by using biometrics
WO2016174659A1 (en)2015-04-272016-11-03Snapaid Ltd.Estimating and using relative head pose and camera field-of-view

Citations (7)

* Cited by examiner, † Cited by third party
Publication numberPriority datePublication dateAssigneeTitle
US5091926A (en)*1990-03-261992-02-25Horton Jerry LHead activated fluoroscopic control
US5677834A (en)*1995-01-261997-10-14Mooneyham; MartinMethod and apparatus for computer assisted sorting of parcels
US6091832A (en)*1996-08-122000-07-18Interval Research CorporationWearable personal audio loop apparatus
US6184863B1 (en)*1998-10-132001-02-06The George Washington UniversityDirect pointing apparatus and method therefor
US6345111B1 (en)*1997-02-282002-02-05Kabushiki Kaisha ToshibaMulti-modal interface apparatus and method
US6424410B1 (en)*1999-08-272002-07-23Maui Innovative Peripherals, Inc.3D navigation system using complementary head-mounted and stationary infrared beam detection units
US6806847B2 (en)*1999-02-122004-10-19Fisher-Rosemount Systems Inc.Portable computer in a process control environment

Cited By (89)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
US9619201B2 (en) | 2000-06-02 | 2017-04-11 | Oakley, Inc. | Eyewear with detachable adjustable electronics module
US9451068B2 (en) | 2001-06-21 | 2016-09-20 | Oakley, Inc. | Eyeglasses with electronic components
US12078870B2 (en) | 2003-04-15 | 2024-09-03 | Ingeniospec, Llc | Eyewear housing for charging embedded battery in eyewear frame
US9690121B2 (en) | 2003-04-15 | 2017-06-27 | Ingeniospec, Llc | Eyewear supporting one or more electrical components
US8465151B2 (en) | 2003-04-15 | 2013-06-18 | Ipventure, Inc. | Eyewear with multi-part temple for supporting one or more electrical components
US7792552B2 (en) | 2003-04-15 | 2010-09-07 | Ipventure, Inc. | Eyeglasses for wireless communications
US7500747B2 (en) | 2003-10-09 | 2009-03-10 | Ipventure, Inc. | Eyeglasses with electrical components
US9033493B2 (en) | 2003-10-09 | 2015-05-19 | Ingeniospec, Llc | Eyewear supporting electrical components and apparatus therefor
US20080278678A1 (en)* | 2003-10-09 | 2008-11-13 | Howell Thomas A | Eyeglasses with user monitoring
US7481531B2 (en) | 2003-10-09 | 2009-01-27 | Ipventure, Inc. | Eyeglasses with user monitoring
US10330956B2 (en) | 2003-10-09 | 2019-06-25 | Ingeniospec, Llc | Eyewear supporting electrical components and apparatus therefor
US10345625B2 (en) | 2003-10-09 | 2019-07-09 | Ingeniospec, Llc | Eyewear with touch-sensitive input surface
US10310296B2 (en) | 2003-10-09 | 2019-06-04 | Ingeniospec, Llc | Eyewear with printed circuit board
US7581833B2 (en) | 2003-10-09 | 2009-09-01 | Ipventure, Inc. | Eyewear supporting after-market electrical components
US7621634B2 (en)* | 2003-10-09 | 2009-11-24 | Ipventure, Inc. | Tethered electrical components for eyeglasses
US7677723B2 (en) | 2003-10-09 | 2010-03-16 | Ipventure, Inc. | Eyeglasses with a heart rate monitor
US7760898B2 (en) | 2003-10-09 | 2010-07-20 | Ip Venture, Inc. | Eyeglasses with hearing enhanced and other audio signal-generating capabilities
US7771046B2 (en) | 2003-10-09 | 2010-08-10 | I p Venture, Inc. | Eyewear with monitoring capability
US20070046887A1 (en)* | 2003-10-09 | 2007-03-01 | Howell Thomas A | Eyewear supporting after-market electrical components
US7806525B2 (en) | 2003-10-09 | 2010-10-05 | Ipventure, Inc. | Eyeglasses having a camera
US7922321B2 (en) | 2003-10-09 | 2011-04-12 | Ipventure, Inc. | Eyewear supporting after-market electrical components
US8109629B2 (en) | 2003-10-09 | 2012-02-07 | Ipventure, Inc. | Eyewear supporting electrical components and apparatus therefor
US11803069B2 (en) | 2003-10-09 | 2023-10-31 | Ingeniospec, Llc | Eyewear with connection region
US8430507B2 (en) | 2003-10-09 | 2013-04-30 | Thomas A. Howell | Eyewear with touch-sensitive input surface
US8434863B2 (en) | 2003-10-09 | 2013-05-07 | Thomas A. Howell | Eyeglasses with a printed circuit board
US12164180B2 (en) | 2003-10-09 | 2024-12-10 | Ingeniospec, Llc | Eyewear supporting distributed and embedded electronic components
US8500271B2 (en) | 2003-10-09 | 2013-08-06 | Ipventure, Inc. | Eyewear supporting after-market electrical components
US11086147B2 (en) | 2003-10-09 | 2021-08-10 | Ingeniospec, Llc | Eyewear supporting electrical components and apparatus therefor
US8905542B2 (en) | 2003-10-09 | 2014-12-09 | Ingeniospec, Llc | Eyewear supporting bone conducting speaker
US7438410B1 (en) | 2003-10-09 | 2008-10-21 | Ip Venture, Inc. | Tethered electrical components for eyeglasses
US11762224B2 (en) | 2003-10-09 | 2023-09-19 | Ingeniospec, Llc | Eyewear having extended endpieces to support electrical components
US20060023158A1 (en)* | 2003-10-09 | 2006-02-02 | Howell Thomas A | Eyeglasses with electrical components
US11204512B2 (en) | 2003-10-09 | 2021-12-21 | Ingeniospec, Llc | Eyewear supporting embedded and tethered electronic components
US20050248719A1 (en)* | 2003-10-09 | 2005-11-10 | Howell Thomas A | Event eyeglasses
US9547184B2 (en) | 2003-10-09 | 2017-01-17 | Ingeniospec, Llc | Eyewear supporting embedded electronic components
US20050264752A1 (en)* | 2003-10-09 | 2005-12-01 | Howell Thomas A | Eyewear supporting after-market electrical components
US20050248717A1 (en)* | 2003-10-09 | 2005-11-10 | Howell Thomas A | Eyeglasses with hearing enhanced and other audio signal-generating capabilities
US11243416B2 (en) | 2003-10-09 | 2022-02-08 | Ingeniospec, Llc | Eyewear supporting embedded electronic components
US11630331B2 (en) | 2003-10-09 | 2023-04-18 | Ingeniospec, Llc | Eyewear with touch-sensitive input surface
US11536988B2 (en) | 2003-10-09 | 2022-12-27 | Ingeniospec, Llc | Eyewear supporting embedded electronic components for audio support
US11513371B2 (en) | 2003-10-09 | 2022-11-29 | Ingeniospec, Llc | Eyewear with printed circuit board supporting messages
US10061144B2 (en) | 2003-10-09 | 2018-08-28 | Ingeniospec, Llc | Eyewear supporting embedded electronic components
US10060790B2 (en) | 2004-04-12 | 2018-08-28 | Ingeniospec, Llc | Eyewear with radiation detection system
US9488520B2 (en) | 2004-04-12 | 2016-11-08 | Ingeniospec, Llc | Eyewear with radiation detection system
US11326941B2 (en) | 2004-04-15 | 2022-05-10 | Ingeniospec, Llc | Eyewear with detection system
US11644361B2 (en) | 2004-04-15 | 2023-05-09 | Ingeniospec, Llc | Eyewear with detection system
US8770742B2 (en) | 2004-04-15 | 2014-07-08 | Ingeniospec, Llc | Eyewear with radiation detection system
US10539459B2 (en) | 2004-04-15 | 2020-01-21 | Ingeniospec, Llc | Eyewear with detection system
US20070186330A1 (en)* | 2004-04-15 | 2007-08-16 | Howell Thomas A | Hat with a radiation sensor
US7500746B1 (en) | 2004-04-15 | 2009-03-10 | Ip Venture, Inc. | Eyewear with radiation detection system
US10359311B2 (en) | 2004-04-15 | 2019-07-23 | Ingeniospec, Llc | Eyewear with radiation detection system
US11644693B2 (en) | 2004-07-28 | 2023-05-09 | Ingeniospec, Llc | Wearable audio system supporting enhanced hearing support
US12140819B1 (en) | 2004-07-28 | 2024-11-12 | Ingeniospec, Llc | Head-worn personal audio apparatus supporting enhanced audio output
US12238494B1 (en) | 2004-07-28 | 2025-02-25 | Ingeniospec, Llc | Head-worn device with connection region
US12025855B2 (en) | 2004-07-28 | 2024-07-02 | Ingeniospec, Llc | Wearable audio system supporting enhanced hearing support
US12001599B2 (en) | 2004-07-28 | 2024-06-04 | Ingeniospec, Llc | Head-worn device with connection region
US11921355B2 (en) | 2004-07-28 | 2024-03-05 | Ingeniospec, Llc | Head-worn personal audio apparatus supporting enhanced hearing support
US11829518B1 (en) | 2004-07-28 | 2023-11-28 | Ingeniospec, Llc | Head-worn device with connection region
US8337013B2 (en) | 2004-07-28 | 2012-12-25 | Ipventure, Inc. | Eyeglasses with RFID tags or with a strap
US11852901B2 (en) | 2004-10-12 | 2023-12-26 | Ingeniospec, Llc | Wireless headset supporting messages and hearing enhancement
US12242138B1 (en) | 2004-10-12 | 2025-03-04 | Ingeniospec, Llc | Wireless headset supporting messages and hearing enhancement
US10222617B2 (en) | 2004-12-22 | 2019-03-05 | Oakley, Inc. | Wearable electronically enabled interface system
US10120646B2 (en) | 2005-02-11 | 2018-11-06 | Oakley, Inc. | Eyewear with detachable adjustable electronics module
US20060236121A1 (en)* | 2005-04-14 | 2006-10-19 | Ibm Corporation | Method and apparatus for highly secure communication
US12345955B2 (en) | 2005-10-11 | 2025-07-01 | Ingeniospec, Llc | Head-worn eyewear structure with internal fan
US12313913B1 (en) | 2005-10-11 | 2025-05-27 | Ingeniospec, Llc | System for powering head-worn personal electronic apparatus
US12248198B2 (en) | 2005-10-11 | 2025-03-11 | Ingeniospec, Llc | Eyewear having flexible printed circuit substrate supporting electrical components
US11733549B2 (en) | 2005-10-11 | 2023-08-22 | Ingeniospec, Llc | Eyewear having removable temples that support electrical components
US12044901B2 (en) | 2005-10-11 | 2024-07-23 | Ingeniospec, Llc | System for charging embedded battery in wireless head-worn personal electronic apparatus
US7543934B2 (en) | 2006-09-20 | 2009-06-09 | Ipventures, Inc. | Eyeglasses with activity monitoring and acoustic dampening
US9494807B2 (en) | 2006-12-14 | 2016-11-15 | Oakley, Inc. | Wearable high resolution audio visual interface
US9720240B2 (en) | 2006-12-14 | 2017-08-01 | Oakley, Inc. | Wearable high resolution audio visual interface
US10288886B2 (en) | 2006-12-14 | 2019-05-14 | Oakley, Inc. | Wearable high resolution audio visual interface
US12249326B2 (en) | 2007-04-13 | 2025-03-11 | St Case1Tech, Llc | Method and device for voice operated control
US12183341B2 (en) | 2008-09-22 | 2024-12-31 | St Casestech, Llc | Personalized sound management and method
US12374332B2 (en) | 2008-09-22 | 2025-07-29 | ST Fam Tech, LLC | Personalized sound management and method
US10624790B2 (en) | 2011-09-15 | 2020-04-21 | Ipventure, Inc. | Electronic eyewear therapy
US9405135B2 (en) | 2011-09-15 | 2016-08-02 | Ipventure, Inc. | Shutter eyewear
US9864211B2 (en) | 2012-02-17 | 2018-01-09 | Oakley, Inc. | Systems and methods for removably coupling an electronic device to eyewear
US10567564B2 (en) | 2012-06-15 | 2020-02-18 | Muzik, Inc. | Interactive networked apparatus
US11924364B2 (en) | 2012-06-15 | 2024-03-05 | Muzik Inc. | Interactive networked apparatus
US9720258B2 (en) | 2013-03-15 | 2017-08-01 | Oakley, Inc. | Electronic ornamentation for eyewear
US11042045B2 (en) | 2013-03-15 | 2021-06-22 | Ingeniospec, Llc | Electronic eyewear and display
US10042186B2 (en) | 2013-03-15 | 2018-08-07 | Ipventure, Inc. | Electronic eyewear and display
US9720260B2 (en) | 2013-06-12 | 2017-08-01 | Oakley, Inc. | Modular heads-up display system
US10288908B2 (en) | 2013-06-12 | 2019-05-14 | Oakley, Inc. | Modular heads-up display system
US10344960B2 (en)* | 2017-09-19 | 2019-07-09 | Bragi GmbH | Wireless earpiece controlled medical headlight
US11721183B2 (en) | 2018-04-12 | 2023-08-08 | Ingeniospec, Llc | Methods and apparatus regarding electronic eyewear applicable for seniors
US10777048B2 (en) | 2018-04-12 | 2020-09-15 | Ipventure, Inc. | Methods and apparatus regarding electronic eyewear applicable for seniors

Also Published As

Publication number | Publication date
AU2004318969A1 (en) | 2005-11-03
EP1736032A2 (en) | 2006-12-27
US20050238194A1 (en) | 2005-10-27
WO2005104618A2 (en) | 2005-11-03
WO2005104618A3 (en) | 2006-06-08

Similar Documents

Publication | Title
US7312699B2 (en) | Ear associated machine-human interface
US20210287522A1 (en) | Systems and methods for managing an emergency situation
US10817251B2 (en) | Dynamic capability demonstration in wearable audio device
ES2971050T3 (en) | Notification method and device
US20100308999A1 (en) | Security and monitoring apparatus
CN104799641B (en) | Safe and intelligent pillow
US20070060118A1 (en) | Centralized voice recognition unit for wireless control of personal mobile electronic devices
US20100035648A1 (en) | Earplug with alarm and electronic device system with same
US20040155770A1 (en) | Audible alarm relay system
CN106686187B (en) | Playing mode switching method of wearable device and wearable device
WO2008127316A1 (en) | Security and monitoring apparatus
CN102625219A (en) | Listening system comprising an alerting device and a listening device
CN110599747A (en) | User reminding method and device and intelligent doorbell system
CN106714105A (en) | Playing mode control method of wearable device and wearable device
CN108763978A (en) | Information prompting method, device, terminal, earphone and readable storage medium
CN107801154A (en) | Mobile device prompting system, management system and object management method
KR101328865B1 (en) | Wrist watch for the deaf and its control method
EP1889464B1 (en) | Monitoring system with speech recognition
US10117604B2 (en) | 3D sound positioning with distributed sensors
US20190120871A1 (en) | Sensor elements to detect object movement relative to a surface
KR20090094572A (en) | Alarm system for a hearing-impaired person
US20230292064A1 (en) | Audio processing using ear-wearable device and wearable vision device
KR20200004181A (en) | Speaker based service system
CN111558141A (en) | Neck massager and control method thereof
CN204698241U (en) | Safe and intelligent pillow

Legal Events

Code | Title | Description

STCF | Information on status: patent grant | Free format text: PATENTED CASE

FPAY | Fee payment | Year of fee payment: 4

FPAY | Fee payment | Year of fee payment: 8

FEPP | Fee payment procedure | Free format text: MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY

FEPP | Fee payment procedure | Free format text: 11.5 YR SURCHARGE- LATE PMT W/IN 6 MO, SMALL ENTITY (ORIGINAL EVENT CODE: M2556); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY

MAFP | Maintenance fee payment | Free format text: PAYMENT OF MAINTENANCE FEE, 12TH YR, SMALL ENTITY (ORIGINAL EVENT CODE: M2553); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY | Year of fee payment: 12

