US8952796B1 - Enactive perception device - Google Patents

Enactive perception device

Info

Publication number
US8952796B1 (US 8,952,796 B1; application US13/535,206 / US201213535206A)
Authority
US
United States
Prior art keywords
data
sensory
sensory input
input
user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active, expires 2033-02-28
Application number
US13/535,206
Inventor
Warren L. Wolf
Manu Rehani
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Lingo IP Holdings, LLC
Original Assignee
DW ASSOCIATES LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.): 2011-06-28
Filing date: 2012-06-27
Publication date: 2015-02-10
Application filed by DW ASSOCIATES LLC
Priority to US13/535,206
Assigned to DW ASSOCIATES, LLC. Assignment of assignors interest (see document for details). Assignors: REHANI, MANU; WOLF, WARREN L.
Application granted
Publication of US8952796B1
Assigned to WOLF, WARREN L. and REHANI, MANU. Assignment of assignors interest (see document for details). Assignor: DW ASSOCIATES, LLC
Assigned to LINGO IP HOLDINGS, LLC. Assignment of assignors interest (see document for details). Assignors: REHANI, MANU; WOLF, WARREN L.
Legal status: Active (current)
Adjusted expiration


Abstract

An enactive perception device includes the ability to receive information. The information received can be either sensory information or other data. The received information can then be converted into sensory information, which can then be provided to the user.

Description

RELATED APPLICATION DATA
This application claims the benefit of U.S. Provisional Patent Application Ser. No. 61/501,953, titled “ENACTIVE PERCEPTION DEVICE”, filed Jun. 28, 2011, which is herein incorporated by reference for all purposes.
FIELD
This invention pertains to information processing, and more particularly to making incoming information easier for a user to process.
BACKGROUND
The need to externally process (in some way) information and convey that information to a user has been around for a long time. For example, blind persons can use canes or Seeing Eye dogs to navigate: these aids translate information that would normally be processed visually into another sense that the blind person can perceive, in this case the sense of touch.
Military applications of data processing also exist. In the early days of air warfare, the pilot (usually the sole person in the plane, and therefore also responsible for firing the weapons) had to locate enemy planes using the senses of sight and sound. With the advent of detection apparatuses (for example, radar) that extend beyond the pilot's range of vision, the pilot has access to additional information. This information is provided to the pilot using a heads-up display.
But the information provided can often be overwhelming. Even in the early days of air travel, before the development of technologies such as radar, a pilot had a great deal of data to process visually: checking the skies for their current condition; checking instruments for the current elevation, roll, pitch, and yaw of the aircraft, the speed of the aircraft, current fuel reserves, and so on. Specialized pilots might also have to keep track of additional information: for example, a military pilot needs to know how much weaponry he is still carrying.
The use of heads-up displays can reduce the number of different places a pilot has to look to gather the information he needs. Instead of looking for a particular instrument in the cockpit, the pilot just looks at a particular location on the heads-up display. But as more and more information is conveyed to the pilot, he has to look at more places in the heads-up display to find everything he needs to know, and he has to process all the visual information to understand what the data represents.
A need remains for a way to address these and other problems associated with the prior art.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 shows a system for performing data processing as an enactive perception device, according to an embodiment of the invention.
FIG. 2 shows data flow in the enactive perception device of FIG. 1.
FIG. 3 shows a user receiving data using the enactive perception device of FIG. 1.
FIG. 4 shows feedback of the enactive perception device of FIG. 1 in an environment.
FIG. 5 shows a flowchart of a procedure for processing data using the enactive perception device of FIG. 1, according to an embodiment of the invention.
DETAILED DESCRIPTION
Consider a soldier in a combat environment. The soldier needs situational awareness. He needs to keep track of the totality of his environment: not simply what he can see in front of him with his eyes. He needs to know where the rest of his unit is located, relative to his location. He needs to know how much ammunition he is currently carrying. He needs to know about dangers in his environment: for example, if there are trace chemicals indicating an explosive device is in the vicinity. And so on.
The soldier can survey his environment and keep aware of his teammates' locations by constantly moving his head around, looking in all directions. But this is both tiring and dangerous, as the motion might be noticed by enemies. The soldier can keep track of his ammunition in his head, but this information is easily forgotten in the heat of the battle. And unless his senses are particularly sensitive, he is unlikely to detect dangers in the environment, such as trace chemicals that can be smelled.
An enactive perception device, according to embodiments of the invention, permits a user, such as the soldier, to process information more efficiently. An enactive perception device can use sensory substitution to provide a user with information about his environment. For example, movement behind the soldier can be detected using cameras and infrared sensors; when detected, the soldier can be alerted to the movement by stimulation of a muscle, such as the soldier's trapezius muscle. Chemical sensors can be used to detect chemical traces, which can be translated into tongue stimulation to alert the soldier to the chemical traces. Ammunition status can be determined by tracking how many rounds of ammunition have been fired relative to the soldier's original load, and status can be provided by muscle stimulation of the abdomen. And so on.
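By way of illustration only, the routing of detected events to distinct stimulation sites might be sketched as follows. The channel names, the Stimulator class, and the intensity values in this Python sketch are hypothetical assumptions introduced for explanation; they are not part of the claimed device.

```python
# Hypothetical sketch: each data channel is routed to a distinct stimulation
# site, so the user can tell signals apart purely by where he feels them.
from dataclasses import dataclass


@dataclass
class Stimulator:
    site: str  # e.g., "trapezius", "tongue", "abdomen"

    def pulse(self, intensity: float) -> None:
        # A real device would drive an electrode or vibrotactile element here.
        print(f"stimulate {self.site} at intensity {intensity:.2f}")


# Assumed routing table: data channel -> stimulation site.
ROUTES = {
    "rear_motion": Stimulator("trapezius"),   # movement detected behind the soldier
    "chemical_trace": Stimulator("tongue"),   # trace chemicals detected nearby
    "ammunition_low": Stimulator("abdomen"),  # fired-round count nears the original load
}


def on_event(channel: str, magnitude: float) -> None:
    """Translate a detected event into stimulation of the mapped body site."""
    stimulator = ROUTES.get(channel)
    if stimulator is not None:
        stimulator.pulse(min(1.0, max(0.0, magnitude)))


if __name__ == "__main__":
    on_event("rear_motion", 0.8)     # cameras/infrared sensors report movement behind
    on_event("ammunition_low", 0.4)  # ammunition status falls below a comfortable level
```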
The advantage of using an enactive perception device is that the soldier can be provided with information using different senses. Rather than the soldier having to specifically search for information (e.g., searching the arena for teammates, or checking his remaining ammunition), the information can be determined automatically and provided to the soldier in a manner he can process.
Further, by providing different sensations in different ways, the user automatically knows what information is being conveyed, without having to process the information mentally. To use a different example, fighter pilots get their information via various instruments (e.g., altimeter, fuel gauge, compass, etc.), sometimes in combination with heads-up displays. The fighter pilot then has to mentally process the information he receives visually to identify a particular piece of information. For example, to determine the airplane's current ammunition status in the prior art, the pilot has to visually find the information (either from an instrument in the instrument panel or from a section of the heads-up display). But using an enactive perception device according to embodiments of the invention, the fighter pilot can know when he is running low on ammunition when he feels the muscle stimulation of a particular part of the body (e.g., the abdomen). This sensation is unique: a stimulation of the abdomen would not be confused with a stimulation of another muscle of the body. Thus, the fighter pilot can more quickly process information, via its translation to a different sensation, which avoids the need for the fighter pilot to mentally process the visual information to discern the pertinent data.
From the above discussion, it can be understood that an enactive perception device allows a user to integrate perception and action. An enactive perception device enhances a user's ability to perceive his environment and act on it to change his environment and to perceive the change in faster, richer, and more differentiated ways. An enactive perception device provides contextually relevant information in addition to sensory information, and can utilize all sensory mechanisms available to the user.
An advantage of the enactive perception device is that the enactive perception device can interface with underutilized sensory mechanisms to provide relevant stimulus to the user. An enactive perception device can also provide information that is both local and remote to the user. For example, in the soldier example discussed above, the soldier's ammunition load can be local information, but information about where supporting units (e.g., artillery) might be located can be remote information: for example, provided from headquarters.
An enactive perception device according to embodiments of the invention can include components that are worn by the user as part of armor or a uniform, as discussed above. But it is equally viable for the enactive perception device to be something that the user contacts, but is not worn. For example, an enactive perception device can be fitted into the steering wheel and car seat of an automobile, providing the user with information only when the user is touching the steering wheel or the car seat. Such an enactive perception device can provide the user with information about traffic conditions, and suggest a direction to travel. For example, when the time is right for the user to make a turn, the enactive perception device can stimulate a muscle on the appropriate side of the body or the steering wheel, indicating the direction to turn. Such an enactive perception device can provide complementary information to a navigation device.
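Purely as an illustrative sketch, such a turn cue might look like the following; the actuator naming and the two-sided steering-wheel layout are assumptions made for the example, not the patented design.

```python
# Hypothetical turn cue for the steering-wheel embodiment.
def cue_turn(direction: str) -> str:
    """Return the stimulation command for the steering-wheel side matching the turn."""
    if direction not in ("left", "right"):
        raise ValueError("direction must be 'left' or 'right'")
    return f"activate {direction}-side steering-wheel actuator"


# The cue complements, rather than replaces, the navigation device's own prompt.
print(cue_turn("left"))
```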
An enactive perception device according to embodiments of the invention can also provide information that is dependent on other factors. For example, an enactive perception device can include a wrist band or other sensor device. The location of the wrist band or sensor device can indicate what information should be provided. If the wrist band or sensor device is in front of the user's body, the enactive perception device can provide prospective information; if the wrist band or sensor device is behind the user's body, the enactive perception device can provide retrospective information. The relative location of the wrist band or sensor device can be determined using any desired technology: for example, Bluetooth® technology can be used to determine the relative location of the wrist band or sensor device to the user's body. (Bluetooth is a registered trademark of Bluetooth SIG, Inc.)
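As a further illustration, the position-dependent selection of prospective or retrospective information might be sketched as below. The signed forward-offset convention and the function name are assumptions; in practice the relative location might be estimated from Bluetooth signal measurements, as noted above.

```python
# Illustrative selection of prospective vs. retrospective information based on
# where a hypothetical wrist-band sensor sits relative to the user's body.
def select_information(forward_offset_m: float,
                       prospective: str,
                       retrospective: str) -> str:
    """A positive offset means the wrist band is in front of the body."""
    return prospective if forward_offset_m >= 0.0 else retrospective


# The offset would in practice be estimated (for example, from Bluetooth signal
# measurements); here it is supplied directly for the sake of the example.
print(select_information(+0.3, "upcoming waypoints", "route travelled so far"))
print(select_information(-0.2, "upcoming waypoints", "route travelled so far"))
```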
FIG. 1 shows a system for performing data processing as an enactive perception device, according to an embodiment of the invention. FIG. 1 shows machine 105, which can be, for example, a server or a user's personal computer. In FIG. 1, machine 105 is shown as including computer 110, monitor 115, keyboard 120, and mouse 125. A person skilled in the art will recognize that other components can be included with machine 105: for example, other input/output devices, such as a printer. In addition, machine 105 can include conventional internal components (not shown): for example, a central processing unit, a memory, storage, etc. Although not shown in FIG. 1, a person skilled in the art will recognize that machine 105 can interact with other machines, either directly or over a network (not shown) of any type.
Machine 105 includes data receiver 130, converter 135, and sensory input devices 140 and 145. Data receiver 130 can receive data from a data source. Converter 135 can convert the data into a sensory input form, which can then be provided to sensory input devices 140 and 145. A user wearing sensory input devices 140 and 145 can then receive the data as sensory input.
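By way of illustration only, the division of labour among data receiver 130, converter 135, and sensory input devices 140 and 145 can be sketched as a simple pipeline. The class names and the trivial conversion rule below are assumptions introduced for explanation, not a definitive implementation.

```python
# Minimal sketch of the FIG. 1 pipeline: receive data, convert it into a sensory
# input form, and hand it to wearable sensory input devices.
from typing import Dict, Iterable, List


class DataReceiver:                       # stands in for data receiver 130
    def receive(self, source: Iterable[Dict]) -> List[Dict]:
        return list(source)


class Converter:                          # stands in for converter 135
    def convert(self, datum: Dict) -> Dict:
        # Assumed rule: map a named measurement to a stimulation command.
        return {"site": datum["route_to"], "intensity": datum["value"]}


class SensoryInputDevice:                 # stands in for devices 140 and 145
    def __init__(self, site: str) -> None:
        self.site = site

    def present(self, sensory_input: Dict) -> None:
        if sensory_input["site"] == self.site:
            print(f"{self.site}: intensity {sensory_input['intensity']}")


receiver = DataReceiver()
converter = Converter()
devices = [SensoryInputDevice("trapezius"), SensoryInputDevice("abdomen")]

for datum in receiver.receive([{"route_to": "trapezius", "value": 0.7},
                               {"route_to": "abdomen", "value": 0.2}]):
    sensory_input = converter.convert(datum)
    for device in devices:
        device.present(sensory_input)
```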
The reader will understand that FIG. 1 is a very simple representation of the enactive perception device, and that the enactive perception device can take other forms. For example, the drawings of U.S. Provisional Patent Application Ser. No. 61/501,953, titled “ENACTIVE PERCEPTION DEVICE”, filed Jun. 28, 2011, which is herein incorporated by reference for all purposes, show various forms an enactive perception device can take for a soldier, a fighter pilot, a race car driver, a knowledge worker, and a regular driver. But distilled to its most basic form, the enactive perception device permits the translation of information and sensory data from one form to another, so that the information and sensory data can be conveyed to the user, as described above.
Although FIG. 1 shows machine 105 as a conventional desktop computer, a person skilled in the art will recognize that machine 105 can be any type of machine or computing device capable of providing the services attributed herein to machine 105, including, for example, a laptop computer, a personal digital assistant (PDA), or a cellular telephone. In addition, machine 105 can be worn by the user, using a form factor appropriate to the user's needs. For example, if the user is a soldier, machine 105 can take the form of a special purpose computer that fits into a compartment in the user's gear.
While FIG. 1 shows the various components as being grouped together, a person of ordinary skill in the art will recognize that embodiments of the invention can separate the components. For example, data receiver 130 and converter 135 can be included with machine 105, but sensory input devices 140 and 145 can be located elsewhere. For example, machine 105 might be located remotely from the user, who can wear sensory input devices 140 and 145. Machine 105 might then also include any necessary components to achieve communication between the separated components.
FIG. 2 shows data flow in the enactive perception device of FIG. 1. In FIG. 2, data 205 and 210 are received by data receiver 130. Data 205 and 210 are then passed to converter 135, which converts the data into sensory inputs 215 and 220. These sensory inputs can then be provided to sensory input devices 140 and 145, so that the sensory inputs can be provided to the user (as shown in FIG. 3).
FIG. 4 shows feedback of the enactive perception device of FIG. 1 in an environment. In FIG. 4, user 305 has contact with local environment 405 and remote environment 410. The enactive perception device includes local environment sensor layer 415, which can provide information about local environment 405, and remote environment sensor layer 420, which can provide information about remote environment 410.
Information from both local environment sensor layer 415 and remote environment sensor layer 420 is forwarded to sensory processor 425, and then to sensory translator interface 430. (The combination of sensory processor 425 and sensory translator interface 430 can form converter 135 as described with reference to FIG. 1 above.) Support infrastructure 435 can then be used to provide sensory feedback information to user 305. Sensory feedback information can also be provided to user 305 from sensory translator interface 430, without going through support infrastructure 435.
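By way of illustration only, the layered flow of FIG. 4 might be sketched as follows, with a local and a remote sensor layer merged by a sensory processor before translation into feedback cues. All function names and data shapes are assumptions made for the example.

```python
# Illustrative flow for FIG. 4: a local and a remote environment sensor layer feed
# a sensory processor, whose output is translated into feedback cues for the user.
def local_environment_layer():
    return {"rear_motion": 0.6}                    # e.g., camera and infrared sensors


def remote_environment_layer():
    return {"support_artillery_bearing_deg": 270}  # e.g., information from headquarters


def sensory_processor(local, remote):
    return {**local, **remote}                     # merge local and remote information


def sensory_translator_interface(processed):
    # Assumed translation: one feedback cue per item of processed information.
    return ["cue: {} = {}".format(name, value) for name, value in processed.items()]


def support_infrastructure(cues):
    for cue in cues:                               # delivers the feedback to the user
        print(cue)


support_infrastructure(
    sensory_translator_interface(
        sensory_processor(local_environment_layer(), remote_environment_layer())))
```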
FIG. 5 shows a flowchart of a procedure for processing data using the enactive perception device of FIG. 1, according to an embodiment of the invention. In FIG. 5, at block 505, data are received by the system. At block 510, the data are converted into sensory inputs. At block 515, the sensory inputs are then provided to the user.
The following discussion is intended to provide a brief, general description of a suitable machine in which certain aspects of the invention may be implemented. Typically, the machine includes a system bus to which are attached processors, memory, e.g., random access memory (RAM), read-only memory (ROM), or other state preserving medium, storage devices, a video interface, and input/output interface ports. The machine may be controlled, at least in part, by input from conventional input devices, such as keyboards, mice, etc., as well as by directives received from another machine, interaction with a virtual reality (VR) environment, biometric feedback, or other input signal. As used herein, the term “machine” is intended to broadly encompass a single machine, or a system of communicatively coupled machines or devices operating together. Exemplary machines include computing devices such as personal computers, workstations, servers, portable computers, handheld devices, telephones, tablets, etc., as well as transportation devices, such as private or public transportation, e.g., automobiles, trains, cabs, etc.
The machine may include embedded controllers, such as programmable or non-programmable logic devices or arrays, Application Specific Integrated Circuits, embedded computers, smart cards, and the like. The machine may utilize one or more connections to one or more remote machines, such as through a network interface, modem, or other communicative coupling. Machines may be interconnected by way of a physical and/or logical network, such as an intranet, the Internet, local area networks, wide area networks, etc. One skilled in the art will appreciate that network communication may utilize various wired and/or wireless short range or long range carriers and protocols, including radio frequency (RF), satellite, microwave, Institute of Electrical and Electronics Engineers (IEEE) 802.11, Bluetooth, optical, infrared, cable, laser, etc.
The invention may be described by reference to or in conjunction with associated data including functions, procedures, data structures, application programs, etc. which when accessed by a machine results in the machine performing tasks or defining abstract data types or low-level hardware contexts. Associated data may be stored in, for example, the volatile and/or non-volatile memory, e.g., RAM, ROM, etc., or in other non-transitory storage devices and their associated storage media, including hard-drives, floppy-disks, optical storage, tapes, flash memory, memory sticks, digital video disks, biological storage, etc. Associated data may be delivered over transmission environments, including the physical and/or logical network, in the form of packets, serial data, parallel data, propagated signals, etc., and may be used in a compressed or encrypted format. Associated data may be used in a distributed environment, and stored locally and/or remotely for machine access.
Having described and illustrated the principles of the invention with reference to illustrated embodiments, it will be recognized that the illustrated embodiments may be modified in arrangement and detail without departing from such principles. And, though the foregoing discussion has focused on particular embodiments, other configurations are contemplated. In particular, even though expressions such as “in one embodiment” or the like are used herein, these phrases are meant to generally reference embodiment possibilities, and are not intended to limit the invention to particular embodiment configurations. As used herein, these terms may reference the same or different embodiments that are combinable into other embodiments.
Consequently, in view of the wide variety of permutations to the embodiments described herein, this detailed description and accompanying material is intended to be illustrative only, and should not be taken as limiting the scope of the invention. What is claimed as the invention, therefore, is all such modifications as may come within the scope and spirit of the following claims and equivalents thereto.

Claims (17)

The invention claimed is:
1. A system, comprising:
a data receiver to receive a first data as a first sensory data and a second data as a second sensory data;
a converter to convert said first data into a first sensory input and said second data into a second sensory input;
a first sensory input device to provide a user with said first sensory input; and
a second sensory input device to provide said user with said second sensory input,
where said first sensory input and said second sensory input can be perceived by said user without said user having to distinguish between said first sensory input and said second sensory input and where said first sensory input and said second sensory input use different human senses.
2. A system according to claim 1, wherein the data receiver includes a second data receiver to receive said second data.
3. A system according to claim 1, wherein said second sensory input is different from said first sensory input.
4. A system according to claim 1, wherein the converter is operative to convert said first sensory data into a first sensory input and said second sensory data into a second sensory input, where said first sensory input is perceived by said user using a different sense than said first sensory data and said second sensory input perceived by said user using a different sense than said second sensory data.
5. A system according to claim 1, wherein the data receiver is operative to receive said first data and said second data dependent on a relative location of a sensor device.
6. A system according to claim 1, wherein the first sensory input device is designed to be worn by said user.
7. A method comprising:
receiving a first data from a first data input;
receiving a second data from a second data input, the second data input different from the first data input;
converting the first data into a first sensory input;
converting the second data into a second sensory input; and
providing a user with the first sensory input and the second sensory input via at least one sensory input device designed to be worn by the user,
where the first sensory input and the second sensory input can be perceived by the user without the user having to distinguish between the first sensory input and the second sensory input and where said first sensory input and said second sensory input use different human senses.
8. A method according to claim 7, wherein converting the second data into a second sensory input includes converting the second data into the second sensory input, the second sensory input different from the first sensory input.
9. A method according to claim 7, wherein receiving a first data and a second data includes receiving a first sensory data and a second sensory data.
10. A method according to claim 9, wherein:
converting the first data into a first sensory input includes converting the first sensory data into a first sensory input, the first sensory input perceived by the user using a different sense than the first sensory data; and
converting the second data into a second sensory input includes converting the second sensory data into a second sensory input, the second sensory input perceived by the user using a different sense than the second sensory data.
11. A method according to claim 7, wherein receiving a first data and a second data includes receiving the first data and the second data, the first data and the second data dependent on a relative location of a sensor device.
12. An article, comprising a non-transitory storage medium, said non-transitory storage medium having stored thereon instructions that, when executed by a machine, result in:
receiving a first data from a first data input;
receiving a second data from a second data input, the second data input different from the first data input;
converting the first data into a first sensory input;
converting the second data into a second sensory input; and
providing a user with the first sensory input and the second sensory input,
where the first sensory input and the second sensory input can be perceived by the user without the user having to distinguish between the first sensory input and the second sensory input and where said first sensory input and said second sensory input use different human senses.
13. An article according to claim 12, wherein converting the second data into a second sensory input includes converting the second data into the second sensory input, the second sensory input different from the first sensory input.
14. An article according to claim 12, wherein receiving a first data and a second data includes receiving a first sensory data and a second sensory data.
15. An article according to claim 14, wherein:
converting the first data into a first sensory input includes converting the first sensory data into a first sensory input, the first sensory input perceived by the user using a different sense than the first sensory data; and
converting the second data into a second sensory input includes converting the second sensory data into a second sensory input, the second sensory input perceived by the user using a different sense than the second sensory data.
16. An article according to claim 12, wherein receiving a first data and a second data includes receiving the first data and the second data, the first data and the second data dependent on a relative location of a sensor device.
17. An article according to claim 12, wherein providing a user with the first sensory input and the second sensory input includes providing the user with the first sensory input via a sensory input device designed to be worn by the user.
US13/535,206 | Priority date: 2011-06-28 | Filing date: 2012-06-27 | Enactive perception device | Active, expires 2033-02-28 | US8952796B1 (en)

Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
US13/535,206 (US8952796B1) | 2011-06-28 | 2012-06-27 | Enactive perception device

Applications Claiming Priority (2)

Application Number | Priority Date | Filing Date | Title
US201161501953P | 2011-06-28 | 2011-06-28 |
US13/535,206 (US8952796B1) | 2011-06-28 | 2012-06-27 | Enactive perception device

Publications (1)

Publication Number | Publication Date
US8952796B1 (en) | 2015-02-10

Family

ID=52443609

Family Applications (1)

Application Number | Title | Priority Date | Filing Date
US13/535,206 (US8952796B1, Active, expires 2033-02-28) | Enactive perception device | 2011-06-28 | 2012-06-27

Country Status (1)

Country | Link
US (1) | US8952796B1 (en)

Patent Citations (102)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
US5128865A (en)1989-03-101992-07-07Bso/Buro Voor Systeemontwikkeling B.V.Method for determining the semantic relatedness of lexical items in a text
US5533181A (en)1990-12-241996-07-02Loral CorporationImage animation for visual training in a simulator
US5249967A (en)1991-07-121993-10-05George P. O'LearySports technique video training device
US5857855A (en)1993-08-101999-01-12Midori KatayamaMethod for teaching body motions
US5454722A (en)1993-11-121995-10-03Project Orbis International, Inc.Interactive multimedia eye surgery training apparatus and method
US5887120A (en)1995-05-311999-03-23Oracle CorporationMethod and apparatus for determining theme for discourse
US5781879A (en)1996-01-261998-07-14Qpl LlcSemantic analysis and modification methodology
US5961333A (en)1996-09-051999-10-05Harrison; Robert G.Educational and training devices and methods
US5797123A (en)1996-10-011998-08-18Lucent Technologies Inc.Method of key-phase detection and verification for flexible speech understanding
US6138085A (en)1997-07-312000-10-24Microsoft CorporationInferring semantic relations
US6556964B2 (en)1997-09-302003-04-29Ihc Health ServicesProbabilistic system for natural language processing
US7711672B2 (en)1998-05-282010-05-04Lawrence AuSemantic network methods to disambiguate natural language meaning
US6173261B1 (en)1998-09-302001-01-09At&T CorpGrammar fragment acquisition using syntactic and semantic clustering
US6504990B1 (en)1998-11-122003-01-07Max AbecassisRandomly and continuously playing fragments of a video segment
US6523026B1 (en)1999-02-082003-02-18Huntsman International LlcMethod for retrieving semantically distant analogies
US6126449A (en)1999-03-252000-10-03Swing LabInteractive motion training device and method
US6385620B1 (en)1999-08-162002-05-07Psisearch,LlcSystem and method for the management of candidate recruiting information
US6453315B1 (en)1999-09-222002-09-17Applied Semantics, Inc.Meaning-based information organization and retrieval
US20040030556A1 (en)1999-11-122004-02-12Bennett Ian M.Speech based learning/training system using semantic decoding
US20080052283A1 (en)2000-02-252008-02-28Novell, Inc.Construction, manipulation, and comparison of a multi-dimensional semantic space
US7565403B2 (en)2000-03-162009-07-21Microsoft CorporationUse of a bulk-email filter within a system for classifying messages for urgency or importance
US20020099730A1 (en)2000-05-122002-07-25Applied Psychology Research LimitedAutomatic text classification system
US6684202B1 (en)2000-05-312004-01-27Lexis NexisComputer-based system and method for finding rules of law in text
US20020059376A1 (en)2000-06-022002-05-16Darren SchwartzMethod and system for interactive communication skill training
US20030093322A1 (en)2000-10-102003-05-15Intragroup, Inc.Automated system and method for managing a process for the shopping and selection of human entities
JP2002149675A (en)2000-11-152002-05-24Toshiba Corp Text data analysis apparatus and method, program therefor and recording medium recording the same
US7607083B2 (en)2000-12-122009-10-20Nec CorporationTest summarization using relevance measures and latent semantic analysis
US20030028564A1 (en)2000-12-192003-02-06Lingomotors, Inc.Natural language method and system for matching and ranking documents in terms of semantic relatedness
US20030167266A1 (en)2001-01-082003-09-04Alexander SaldanhaCreation of structured data from plain text
US20020106622A1 (en)2001-02-072002-08-08Osborne Patrick J.Interactive employee training system and method
US20050272517A1 (en)2001-06-112005-12-08Recognition Insight, LlcSwing position recognition and reinforcement
US20030027121A1 (en)2001-08-012003-02-06Paul GrudnitskiMethod and system for interactive case and video-based teacher training
US7792685B2 (en)2001-11-302010-09-07United Negro College Fund, Inc.Selection of individuals from a pool of candidates in a competition system
US7644144B1 (en)2001-12-212010-01-05Microsoft CorporationMethods, tools, and interfaces for the dynamic assignment of people to groups to enable enhanced communication and collaboration
US20030182310A1 (en)2002-02-042003-09-25Elizabeth CharnockMethod and apparatus for sociological data mining
US7870203B2 (en)2002-03-082011-01-11Mcafee, Inc.Methods and systems for exposing messaging reputation to an end user
US7403890B2 (en)2002-05-132008-07-22Roushar Joseph CMulti-dimensional method and apparatus for automated language interpretation
US20070265089A1 (en)2002-05-132007-11-15Consolidated Global Fun UnlimitedSimulated phenomena interaction game
US7539697B1 (en)2002-08-082009-05-26Spoke SoftwareCreation and maintenance of social relationship network graphs
US20040053203A1 (en)2002-09-162004-03-18Alyssa WaltersSystem and method for evaluating applicants
US20040117234A1 (en)2002-10-112004-06-17Xerox CorporationSystem and method for content management assessment
JP2004157931A (en)2002-11-082004-06-03Ntt Advanced Technology Corp Intention sentence type classification extraction method
US7711573B1 (en)2003-04-182010-05-04Algomod Technologies CorporationResume management and recruitment workflow system and method
US7487094B1 (en)2003-06-202009-02-03Utopy, Inc.System and method of call classification with context modeling based on composite words
US20070213126A1 (en)2003-07-142007-09-13Fusion Sport International Pty LtdSports Training And Testing Methods, Appartaus And System
US20050055209A1 (en)2003-09-052005-03-10Epstein Mark E.Semantic language modeling and confidence measurement
US7555441B2 (en)2003-10-102009-06-30Kronos Talent Management Inc.Conceptualization of job candidate information
US7720675B2 (en)2003-10-272010-05-18Educational Testing ServiceMethod and system for determining text coherence
US20050204337A1 (en)2003-12-312005-09-15Automatic E-Learning LlcSystem for developing an electronic presentation
US20090035736A1 (en)2004-01-162009-02-05Harold WolpertReal-time training simulation system and method
US20050202871A1 (en)2004-01-262005-09-15Lippincott Louis A.Multiple player game system, methods and apparatus
US20050165600A1 (en)2004-01-272005-07-28Kas KasraviSystem and method for comparative analysis of textual documents
US20050192949A1 (en)2004-02-272005-09-01Yuichi KojimaDocument group analyzing apparatus, a document group analyzing method, a document group analyzing system, a program, and a recording medium
US20050197890A1 (en)2004-03-052005-09-08Angel LuSystem, method and computer-readable medium for resume management
US20050262428A1 (en)2004-05-212005-11-24Little Chad MSystem and method for contextual correlation of web document content
US20050282141A1 (en)2004-06-172005-12-22Falash Mark DScenario workflow based assessment system and method
US7813917B2 (en)2004-06-222010-10-12Gary Stephen ShusterCandidate matching using algorithmic analysis of candidate-authored narrative information
US7917587B2 (en)2004-07-302011-03-29Microsoft CorporationMethod and system for prioritizing communications based on interpersonal relationships
US7567895B2 (en)2004-08-312009-07-28Microsoft CorporationMethod and system for prioritizing communications based on sentence classifications
US20060235843A1 (en)2005-01-312006-10-19Textdigger, Inc.Method and system for semantic search and retrieval of electronic documents
US20060206332A1 (en)2005-03-082006-09-14Microsoft CorporationEasy generation and automatic training of spoken dialog systems using text-to-speech
US20060230102A1 (en)2005-03-252006-10-12Murray HidaryAutomated training program generation and distribution system
US20080191864A1 (en)2005-03-312008-08-14Ronen WolfsonInteractive Surface and Display System
US20060246973A1 (en)2005-04-132006-11-02Thomas Jeffrey JSystems and methods for simulating a particular user in an interactive computer system
US20070112710A1 (en)2005-05-242007-05-17Drane Associates, L.P.Method and system for interactive learning and training
US20070061179A1 (en)2005-09-092007-03-15International Business Machines CorporationMethod for managing human resources
US20110258049A1 (en)2005-09-142011-10-20Jorey RamerIntegrated Advertising System
US20070135225A1 (en)2005-12-122007-06-14Nieminen Heikki VSport movement analyzer and training device
US8090725B1 (en)2006-01-132012-01-03CareerBuilder, LLCMethod and system for matching data sets of non-standard formats
US20080120029A1 (en)*2006-02-162008-05-22Zelek John SWearable tactile navigation system
US20070196798A1 (en)2006-02-172007-08-23Innertalent CorporationSelf-improvement system and method
US20070203991A1 (en)2006-02-282007-08-30Microsoft CorporationOrdering personal information using social metadata
US20070260421A1 (en)2006-05-032007-11-08Nike, Inc.Athletic or other performance sensing systems
US20070259324A1 (en)2006-05-082007-11-08Interwise Ltd.Computerized medium for exchanging contents of interest
US20080281620A1 (en)2007-05-112008-11-13Atx Group, Inc.Multi-Modal Automation for Human Interactive Skill Assessment
US7966265B2 (en)2007-05-112011-06-21Atx Group, Inc.Multi-modal automation for human interactive skill assessment
US20080300930A1 (en)2007-05-302008-12-04Compitello Michael JDeveloping and structuring business ecosystems
US20090006164A1 (en)2007-06-292009-01-01Caterpillar Inc.System and method for optimizing workforce engagement
US20090024554A1 (en)2007-07-162009-01-22Vanessa MurdockMethod For Matching Electronic Advertisements To Surrounding Context Based On Their Advertisement Content
US20090024747A1 (en)2007-07-202009-01-22International Business Machines CorporationSystem and method for visual representation of a social network connection quality
US20090153350A1 (en)*2007-12-122009-06-18Immersion Corp.Method and Apparatus for Distributing Haptic Synchronous Signals
US20090198488A1 (en)2008-02-052009-08-06Eric Arno VigenSystem and method for analyzing communications using multi-placement hierarchical structures
US20090248399A1 (en)2008-03-212009-10-01Lawrence AuSystem and method for analyzing text using emotional intelligence factors
US20110055098A1 (en)2008-04-302011-03-03Stewart Jeffrey AAutomated employment information exchange and method for employment compatibility verification
US20090282104A1 (en)2008-05-092009-11-12O'sullivan Patrick JosephSystem and method for indicating availability
US20090287672A1 (en)2008-05-132009-11-19Deepayan ChakrabartiMethod and Apparatus for Better Web Ad Matching by Combining Relevance with Consumer Click Feedback
US20090292541A1 (en)2008-05-252009-11-26Nice Systems Ltd.Methods and apparatus for enhancing speech analytics
US20100328051A1 (en)*2008-06-102010-12-30Hale Kelly SMethod And System For the Presentation Of Information Via The Tactile Sense
US20090319508A1 (en)2008-06-242009-12-24Microsoft CorporationConsistent phrase relevance measures
US20100098289A1 (en)2008-07-092010-04-22Florida Atlantic UniversitySystem and method for analysis of spatio-temporal data
US20100023377A1 (en)2008-07-232010-01-28Hr Solutions, Inc.Systems and methods for managing human resource activities
US20110208511A1 (en)2008-11-042011-08-25Saplo AbMethod and system for analyzing text
US20100145678A1 (en)2008-11-062010-06-10University Of North TexasMethod, System and Apparatus for Automatic Keyword Extraction
US20100228733A1 (en)2008-11-122010-09-09Collective Media, Inc.Method and System For Semantic Distance Measurement
US20100179916A1 (en)2009-01-152010-07-15Johns TammyCareer management system
US20100271298A1 (en)*2009-04-242010-10-28Anthrotronix, Inc.Haptic automated communication system
US20100306251A1 (en)2009-05-292010-12-02Peter SnellSystem and Related Method for Digital Attitude Mapping
US20110040837A1 (en)2009-08-142011-02-17Tal EdenMethods and apparatus to classify text communications
US20110184939A1 (en)2010-01-282011-07-28Elliott Edward SMethod of transforming resume and job order data into evaluation of qualified, available candidates
US20110268300A1 (en)*2010-04-302011-11-03Honeywell International Inc.Tactile-based guidance system
US20110295759A1 (en)2010-05-262011-12-01Forte Hcm Inc.Method and system for multi-source talent information acquisition, evaluation and cluster representation of candidates
WO2012000013A1 (en)2010-06-292012-01-05Springsense Pty LtdMethod and system for determining word senses by latent semantic distance

Non-Patent Citations (17)

* Cited by examiner, † Cited by third party
Title
Aiolli, Fabio; Sebastiani, Fabrizio; Sperduti, Alessandro, Preference Learning for Category-Ranking Based Interactive Text Categorization, Proceedings of International Joint Conference on Neural Networks, IJCNN 2007, Orlando, FL, Aug. 12-17, 2007, pp. 2034-2039.
Keh, Huan-Chao, The Chinese Text Categorization System with Category Priorities, Journal of Software, Oct. 2010, vol. 5, No. 10, pp. 1137-1143.
Lingway Vertical Search Solutions, Lingway HR Suite, "Lingway e-Recruitment Applications: a Semantic Solution for Recruitment", retrieved from http://www.lingway.com/images/pdf/fichelhrslea07anglaisweb.pdf on Jun. 17, 2012 (2 pages).
Loftin, R.B. et al., Training the Hubble Space Telescope Flight Team, IEEE Computer Graphics and Applications, 1995, pp. 31-37.
Mohammad, "Measuring Semantic Distance Using Distributional Profiles of Concepts", a thesis submitted in conformity with the requirements for the degree of Graduate Department of Computer Science University of Toronto, 2008, pp. 1-167.
Mohammad, et al., "Measuring Semantic Distance Using Distributional Profiles of Concepts", Association for Computational Linguistics; retrieved at http://www.umiacs.umd.edu/~saif/WebDocs/Measuring-Semantic-Distance.pdf, 2006, pp. 1-34.
Mood Indicator Based on History of Electronic Communication Thread, IPCOM, Disclosure No. IPCOM000198194D, Jul. 29, 2010, 3 pages, retrieved from http://ip.com/IPCOM/000198194.
Office Action dated Jun. 16, 2008, U.S. Appl. No. 11/419,317, filed May 19, 2006 entitled "Method for Interactive Training and Learning."
Office Action dated Jun. 16, 2009 U.S. Appl. No. 11/419,317, filed May 19, 2006 entitled "Method for Interactive Training and Learning."
Office Action dated May 5, 2009, U.S. Appl. No. 11/419,324, filed May 19, 2006 entitled "System and Method for Authoring and Learning".
Office Action dated Nov. 17, 2009 U.S. Appl. No. 11/419,317, filed May 19, 2006 entitled "Method for Interactive Training and Learning."
Office Action dated Oct. 31, 2008, U.S. Appl. No. 11/419,317, filed May 19, 2006 entitled "Method for Interactive Training and Learning."
R. Hawkins and M. Russell, Document Categorization Using Lexical Analysis and Fuzzy Sets, IBM Technical Disclosure Bulletin, Jun. 1992, vol. 35, No. 1A, 1 pg.
Thomas, P.G. et al., AESOP-An Electronic Student Observatory Project, Frontiers in Education, 1998, 5 pages.
Tseng, "Semantic Classification of Chinese unknown words", ACL '03 Proceedings of the 41st Annual Meeting on Association for Computational Linguistics-vol. 2 Association for Computational Linguistics Stroudsburg, PA, USA © 2003.
Van Rijk, R et al., Using CrisisKit and MOPED to Improve Emergency Management Team Training, Proceedings ISCRAM 2004, Brussels, May 3-4, 2004. pp. 161-166.

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
US20080188310A1 (en)*2000-05-122008-08-07Murdock Wilbert QInternet sports computer cellular device aka mega machine
US9802129B2 (en)*2000-05-122017-10-31Wilbert Q. MurdockInternet sports computer cellular device
US11117033B2 (en)2010-04-262021-09-14Wilbert Quinc MurdockSmart system for display of dynamic movement parameters in sports and training
US20140132388A1 (en)*2012-11-142014-05-15Ishraq ALALAWISystem, method and computer program product to assist the visually impaired in navigation
US9384679B2 (en)*2012-11-142016-07-05Ishraq ALALAWISystem, method and computer program product to assist the visually impaired in navigation

Similar Documents

Publication | Publication Date | Title
US9428034B2 (en)Integrated vehicle cabin with driver or passengers' prior conditions and activities
Steelman et al.Modeling the control of attention in visual workspaces
Elliott et al.Development of tactile and haptic systems for US infantry navigation and communication
US9229535B2 (en)Haptic automated communication system
US7986961B2 (en)Mobile computer communication interface
US10634914B1 (en)Information handling system head-mounted display critical situation context through tactile modalities
US8952796B1 (en)Enactive perception device
US20140267388A1 (en)Crew shared video display system and method
Hancock et al.Tactile cuing to augment multisensory human-machine interaction
US20220067153A1 (en)Artificial Intelligence Embedded and Secured Augmented Reality
US20160014548A1 (en)Method And System For Integrating Wearable Glasses To Vehicle
Godfroy-Cooper et al.Isomorphic spatial visual-auditory displays for operations in DVE for obstacle avoidance
KimThe origin of the see-through graphical interface: World War II aircraft gunsights and the status of the material in early computer interface design
Gomes et al.A First Exploration on the Use of Head-Mounted Augmented Reality in the context of the Portuguese Military
Qaurooni et al.The “enhanced” warrior: drone warfare and the problematics of separation
Li et al.Evaluating pilot’s perceived workload on interacting with augmented reality device in flight operations
Deveans et al.Overcoming information overload in the cockpit
White et al.Tactile displays in army operational environments
US9135793B1 (en)Force feedback to identify critical events
US20250265785A1 (en)Augmented Reality Threat Indicator Overlay
BurchamA Comprehensive Literature Review of Autonomous Surveillance Technologies Relating to Dismounted Soldiers
Elliott et al.Utilizing glove-based gestures and a tactile vest display for covert communications and robot control
Ibrahim Asif et al.Iterative interface design for robot integration with tactical teams
Skinner et al.Development of Tactile and Gestural Displays for Navigation, Communication, and Robotic Control
Admile et al.The Future of Warfare: A Smart AR/VR AI-Driven Helmet for Real-Time Data Analysis and Tactical Advantage

Legal Events

Date | Code | Title | Description
AS: Assignment

Owner name:DW ASSOCIATES, LLC, TEXAS

Free format text:ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:WOLF, WARREN L.;REHANI, MANU;REEL/FRAME:034009/0276

Effective date:20141022

STCF: Information on status: patent grant

Free format text:PATENTED CASE

AS: Assignment

Owner name:REHANI, MANU, OREGON

Free format text:ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:DW ASSOCIATES, LLC;REEL/FRAME:035424/0898

Effective date:20150120

Owner name:WOLF, WARREN L., TEXAS

Free format text:ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:DW ASSOCIATES, LLC;REEL/FRAME:035424/0898

Effective date:20150120

AS: Assignment

Owner name:LINGO IP HOLDINGS, LLC, TEXAS

Free format text:ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:REHANI, MANU;WOLF, WARREN L;REEL/FRAME:046391/0077

Effective date:20180705

MAFP: Maintenance fee payment

Free format text:PAYMENT OF MAINTENANCE FEE, 4TH YR, SMALL ENTITY (ORIGINAL EVENT CODE: M2551)

Year of fee payment:4

MAFP: Maintenance fee payment

Free format text:PAYMENT OF MAINTENANCE FEE, 8TH YR, SMALL ENTITY (ORIGINAL EVENT CODE: M2552); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY

Year of fee payment:8

