USRE40368E1 - Data input device - Google Patents

Data input device

Info

Publication number
USRE40368E1
USRE40368 E1
Authority
US
United States
Prior art keywords
image
data input
input device
light beam
light
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related, expires
Application number
US10/815,195
Inventor
Boaz Arnon
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Lumio Home Services LLC
Original Assignee
VKB Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from IL136432A0 (external priority)
Priority claimed from US09/687,141 (external priority, granted as US6650318B1)
Application filed by VKB Inc
Priority to US10/815,195 (USRE40368E1)
Application granted
Publication of USRE40368E1
Adjusted expiration
Legal status: Expired - Fee Related (current)


Abstract

A data input device including an optically generated image of a data input device, the image including at least one input zone actuable by an action performed thereon by a user, a sensor operative to sense the action performed on the at least one input zone, and to generate signals in response to the action, and a processor in communication with the sensor operative to process the signals for performing an operation associated with the at least one input zone.

Description

CROSS REFERENCE TO RELATED APPLICATIONS
This application is based upon, and claims the benefit of priority from, Israeli Patent Application 136432, filed May 29, 2000, the contents of which are incorporated herein in their entirety by reference.
FIELD OF THE INVENTION
The present invention relates generally to data input devices, such as keyboards, and particularly to optically generated images of data input devices.
BACKGROUND OF THE INVENTION
Data input devices, such as keyboards, touch pads, calculator pads, telephone keypads, and the like, are well known devices with alphanumeric keys. Other data input devices, such as joysticks, mice, trackballs and the like, generally do not have keys. Whatever the kind of input device, a user must generally press one or more keys or buttons in order to input data.
Data input devices are generally in wired communication with a computer terminal and the like, for controlling cursor movement, displaying commands, etc. Wireless cursor control systems have also been proposed, such as the system described in U.S. Pat. No. 5,181,181, the disclosure of which is incorporated herein by reference. This system includes a three-dimensional computer apparatus input device that uses three sets of accelerometers and angular rate sensors to determine acceleration, velocity, relative position and attitude of the device.
However, all of the known input devices have several drawbacks. Although tremendous technological advances have been made in computer and telecommunication hardware, the data input device still remains a device with a relatively large number of moving parts and electronics. In addition, mobile communication devices that use input devices such as keyboards have a particular problem of balancing logistics and space. If a small keyboard is used, then the keys sometimes must be pressed several times just to indicate one character, making the device cumbersome to use. If a larger keyboard is used, then the device becomes too large to carry conveniently.
SUMMARY OF THE INVENTION
The present invention seeks to provide a novel and improved data input device. In the present invention, there is no physical input device; rather, an optical image of a data input device is generated. A light beam emanating from a light source (e.g., laser source) is preferably moved by means of a mirror array or scanner, for example, at high speed to form a two-dimensional or three-dimensional image of an input device, such as a keyboard with all of the keys, in which case the user presses the “virtual” keys of the “virtual” optically generated keyboard. Another example of an optically generated input device is a “virtual” mouse, wherein pressing or touching an outlined area performs a “click”. Other examples include “virtual” musical instruments, such as an organ, a “virtual” switch, a “virtual” telephone touch pad, and the like.
Preferably optical, acoustic, position or movement sensors sense the “pressing” or “striking” of the virtual keys, and the sensed movement is sent to a processor which processes and interprets the “pressing” into the desired characters, instructions, information and data, etc. The input may then be transmitted to a computer, mobile telephone, musical instrument, and the like. The laser and beam-moving apparatus are preferably housed in a unit approximately the same size as a cell phone, or even smaller. The laser and beam-moving apparatus may be provided separately from a cell phone, or may be a built-in unit manufactured integrally with the phone.
The present invention is particularly advantageous for mobile communication devices. A user can carry any conveniently small size cell phone, for example, plus the equivalently-sized laser unit of the invention. If the user wishes to type messages to be sent to the Internet via the cell phone, for example, the user simply generates a large size keyboard with the laser unit and comfortably types the commands and message, without having to grapple with multiple presses of keys or with too small keys, or with lugging a clumsy, large keyboard. The present invention thus enables user-friendly use of cell phones for communication on the Internet. The same holds true for palm-sized computer/calculators or PDAs (personal digital assistants).
The present invention also provides a multilingual keyboard heretofore impossible to achieve in the prior art. Current keyboards generally have at most two languages indicated on the keys, e.g., the local language and English. In the present invention, since the keys are “virtual”, any language can be optically formed on the keys of the keyboard, and a suitable linguistic processor can interpret between the keyed-in language and any other language in which it is desired to transmit a message. This enables users of different languages from all over the world to communicate with each other with great ease.
In another aspect of the invention, the user can modify the arrangement, size and shape of the virtual keys. In still another aspect of the invention, a holographic image of all or part of the virtual keyboard can be employed.
The image of the virtual keyboard can be constructed by means of a monochromatic laser, or a blend of differently colored laser beams, either by using multiple laser sources having different colors and wavelengths, or by using a single laser source and using color and wavelength splitters. Differently polarized light beams can also be used. The keyboard of the present invention can not only be used as the sole data input device, but can also be integrated with other conventional or non-conventional data input devices.
There is thus provided in accordance with a preferred embodiment of the present invention a data input device including an optically generated image of a data input device, the image including at least one input zone actuable by an action performed thereon by a user, a sensor operative to sense the action performed on the at least one input zone, and to generate signals in response to the action, and a processor in communication with the sensor operative to process the signals for performing an operation associated with the at least one input zone.
In accordance with a preferred embodiment of the present invention a light source is provided which generates a light beam, and beam-moving apparatus is provided which moves the light beam to generate the optically generated image of the data input device.
Further in accordance with a preferred embodiment of the present invention the beam-moving apparatus includes a mirror arranged to reflect the light beam, and an actuator operatively connected to the mirror, wherein the actuator moves the mirror to reflect the light beam to form at least a two-dimensional image of the data input device.
Still further in accordance with a preferred embodiment of the present invention the beam-moving apparatus includes a scanner arranged to scan the light beam, and an actuator operatively connected to the scanner, wherein the actuator moves the scanner to scan the light beam to form at least a two-dimensional image of the data input device.
In accordance with a preferred embodiment of the present invention the data input device includes a key of a keyboard, a keyboard, a mouse with at least one input button or a key of a touch pad.
Further in accordance with a preferred embodiment of the present invention the sensor includes an optical sensor (such as a CCD or PSD), an acoustic sensor or a movement sensor.
Still further in accordance with a preferred embodiment of the present invention the processor is in communication with an output device, such as a computer, a mobile telephone, a switch or a palm-held computer/calculator.
There is also provided in accordance with a preferred embodiment of the present invention a method for data input including generating an optical image of a data input device, the image including at least one input zone actuable by an action performed thereon by a user, performing an action on the at least one input zone, sensing the action performed on the at least one input zone, generating signals in response to the action, and processing the signals for performing an operation associated with the at least one input zone.
In accordance with a preferred embodiment of the present invention the step of generating the optical image includes generating an image of a keyboard and the step of performing an action includes pressing keys of the image of the keyboard.
Further in accordance with a preferred embodiment of the present invention the step of processing the signals causes typing alphanumeric characters on a computer, cell phone, palm-sized computer/calculator or PDA.
In accordance with a preferred embodiment of the present invention the method further includes modifying the image of the keyboard so as to modify a configuration of keys of the keyboard.
Additionally in accordance with a preferred embodiment of the present invention the method further includes optically generating an image of characters of a first language on keys of the keyboard, selecting a second language different from the first language, and optically generating an image of characters of the second language on keys of the keyboard.
Further in accordance with a preferred embodiment of the present invention the optical image of the data input device is a holographic image.
Still further in accordance with a preferred embodiment of the present invention the optical image of the data input device is generated by means of a monochromatic laser.
Additionally in accordance with a preferred embodiment of the present invention the optical image of the data input device is generated by means of multiple laser sources having different colors and wavelengths.
In accordance with a preferred embodiment of the present invention the optical image of the data input device is generated by means of a single laser source and using color and wavelength splitters to split light from the single laser source.
Further in accordance with a preferred embodiment of the present invention the optical image of the data input device is generated by means of differently polarized light beams.
In accordance with a preferred embodiment of the present invention the step of sensing includes detecting light reflected from an object within a silhouette of the image, and analyzing a reflection of the light to determine a spatial position of the object.
Further in accordance with a preferred embodiment of the present invention the step of sensing includes providing a light beam emanating from a light source, detecting light reflected from an object within a silhouette of the image, corresponding to the light beam, and analyzing an angle of the light beam and a time for the beam to be reflected back from the object to a reference to determine a spatial position of the object.
Still further in accordance with a preferred embodiment of the present invention the reference includes an optically readable reference.
Additionally in accordance with a preferred embodiment of the present invention the optically readable reference includes a tangible bar code strip or an optically generated bar code strip.
In accordance with a preferred embodiment of the present invention the optical image of a data input device is generated by the same light beam whose reflection is used to determine the spatial position of the object.
Further in accordance with a preferred embodiment of the present invention the step of sensing includes providing a non-visible-light beam emanating from a non-visible-light source, detecting an image of the non-visible light impinging upon an object within a silhouette of the image of the data input device, and analyzing the image of the non-visible light to determine a spatial position of the object.
Still further in accordance with a preferred embodiment of the present invention the non-visible-light beam includes an infrared beam and the image of the non-visible light includes an infrared image of the object.
In accordance with a preferred embodiment of the present invention the object includes a finger and the step of analyzing includes analyzing a difference in the infrared images of the finger before and after the finger presses.
Further in accordance with a preferred embodiment of the present invention the method includes detecting light reflected from an object within a silhouette of the image and preventing the image from impinging upon the object.
BRIEF DESCRIPTION OF THE DRAWINGS
The present invention will be understood and appreciated more fully from the following detailed description, taken in conjunction with the drawings in which:
FIG. 1 is a simplified pictorial illustration of a data input device constructed and operative in accordance with a preferred embodiment of the present invention;
FIG. 2 is a simplified block diagram of the data input device of FIG. 1;
FIGS. 3A-3E are simplified pictorial illustrations of optically generated images of data input devices, constructed and operative in accordance with different preferred embodiments of the present invention;
FIG. 4A is a simplified pictorial illustration of beam-moving apparatus constructed and operative in accordance with a preferred embodiment of the present invention, including a mirror array with actuators for moving the array;
FIG. 4B is a simplified pictorial illustration of beam-moving apparatus constructed and operative in accordance with another preferred embodiment of the present invention, including a crystal beam modifier;
FIG. 4C is a simplified pictorial illustration of beam-moving apparatus constructed and operative in accordance with yet another preferred embodiment of the present invention, including a scanner;
FIG. 5 is a simplified pictorial illustration of a data input device constructed and operative in accordance with another preferred embodiment of the present invention, including a light unit that projects an optical image of a data input device by projecting light from underneath a transparent or translucent substrate;
FIG. 6 is a simplified illustration of a multilingual keyboard, constructed and operative in accordance with a preferred embodiment of the present invention;
FIG. 7 is a simplified illustration of a non-standard layout of keys on an optically generated image of a keyboard, wherein a user can modify the arrangement, size and shape of the “virtual” keys, in accordance with a preferred embodiment of the present invention;
FIG. 8 is a simplified illustration of an optical sensor system for sensing input of data in any of the data input devices of the invention, constructed and operative in accordance with a preferred embodiment of the present invention, which uses two light beams to determine the position of the data input;
FIG. 9A is a simplified illustration of a light beam passing over the light-generated data input device of FIG. 8, with no object placed on the input zones;
FIG. 9B is a simplified illustration of a light beam passing over the light-generated data input device of FIG. 8, with an object placed on one of the input zones;
FIG. 10 is a simplified illustration of an optical sensor system for sensing input of data in any of the data input devices of the invention, constructed and operative in accordance with another preferred embodiment of the present invention, which uses one light beam to determine the position of the data input;
FIG. 11 is a simplified illustration of an optical sensor system for sensing input of data in any of the data input devices of the invention, constructed and operative in accordance with yet another preferred embodiment of the present invention, wherein a bar code reference is used to determine the position of the data input;
FIG. 12 is a simplified illustration of a sensor system for sensing input of data in any of the data input devices of the invention, constructed and operative in accordance with another preferred embodiment of the present invention, wherein a non-visible-light beam is used to determine the position of the data input;
FIGS. 13 and 14 are simplified illustrations of two typical infrared images of fingers placed upon a “virtual” keyboard constructed in accordance with a preferred embodiment of the present invention;
FIG. 15 is a simplified flow chart of a method for preventing displaying an image of a data input device on selected locations, in accordance with another preferred embodiment of the present invention;
FIGS. 16 and 17 are simplified illustrations of generating images of data input devices in accordance with two preferred embodiments of the present invention, wherein in FIG. 16, a web page is light-generated, and wherein in FIG. 17, a game object is light-generated; and
FIG. 18 is a simplified illustration of a mirror with one or more darkened portions for generating images of data input devices in accordance with another preferred embodiment of the present invention.
DETAILED DESCRIPTION OF A PREFERRED EMBODIMENT
Reference is now made to FIGS. 1 and 2, which illustrate a data input device 10 constructed and operative in accordance with a preferred embodiment of the present invention.
Data input device 10 preferably includes a light source 12 which generates a light beam 14. In accordance with one preferred embodiment of the present invention, light source 12 is a single laser source, such as a monochromatic laser. Color and wavelength splitters 15 may be provided to split light from the single laser source. Alternatively, multiple laser sources 12 having different colors and wavelengths may be employed. Additionally or alternatively, light source 12 may generate differently polarized light beams.
Beam-moving apparatus 16, described in more detail hereinbelow, is preferably arranged with respect to light source 12 such that it moves light beam 14 to generate an optically generated image 18 of a data input device. Image 18 of the data input device preferably includes one or more input zones 19 actuable by an action performed thereon by a user, as will be readily understood from the examples of images 18 shown in FIGS. 3A-3E. In FIG. 3A, an image of a keyboard 20 with keys 22 is generated. Keys 22 are the input zones, and a user “presses” keys 22 to input data. The manner in which the pressing is detected is described hereinbelow. Image 18 may include not only the silhouette of keys 22 but also alphanumeric characters 23 formed in the outline of each key 22.
FIG. 3B illustrates another example of an optically generated input device, that of a mouse 24, wherein pressing or touching an outlined area of a button 26 performs a “click”. Alternatively, moving a user's finger in the outlined area can also perform a function. Another example, shown in FIG. 3C, includes an optically generated image of a musical instrument 28, such as an organ with keys 30, wherein “pressing” keys 30 can generate musical notes.
In FIG. 3D, an optically generated image of a touch pad 32, such as for a telephone, is provided with pad keys 34, wherein “pressing” one of keys 34 can generate alphanumeric characters. In FIG. 3E, an optically generated image of a palm-held computer/calculator (or any other kind of PDA) 36 is provided with keys or buttons 38, wherein “pressing” one of keys or buttons 38 can generate mathematical functions or alphanumeric characters. The pad keys 34 or keys 38 are also examples of “virtual” PDA switches that can be optically generated. Of course, any kind of switch can be optically generated, such as single-pole and multi-pole switches, for example.
A sensor is preferably provided to sense the above described actions performed on the input zone 19. Many kinds of sensors can be employed to detect pressing any of the “virtual” keys of the embodiments shown in FIGS. 3A-3E. For example, as seen in FIG. 1, the sensor may be an optical sensor 40, such as an electronic camera, CCD or position sensing device (PSD), whose field of view encompasses the “virtual” keyboard or touch pad, etc. Other examples of suitable sensors include an acoustic sensor 42 and a position or movement sensor 44. Three acoustic sensors 42 should preferably be used for sensing the action by means of triangulation. Any number of position or movement sensors can be used, and more than one kind of sensor can be employed in carrying out the invention. Other examples of suitable sensors are described hereinbelow with reference to FIGS. 8-10.
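The three-sensor triangulation mentioned above can be illustrated with a short sketch. The patent does not disclose an algorithm, so the following is only one plausible formulation: assuming each acoustic sensor's arrival time has already been converted to a distance, the planar position follows from subtracting pairs of circle equations, which yields a linear system. The sensor coordinates below are hypothetical.

```python
# Hypothetical sensor positions in metres; the real arrangement is
# not specified in the patent.
SENSORS = [(0.0, 0.0), (0.3, 0.0), (0.15, 0.25)]

def trilaterate(d):
    """Recover (x, y) from the distances d[0..2] to the three sensors.

    Subtracting the circle equation of sensor 1 from sensors 2 and 3
    cancels the quadratic terms, leaving two linear equations in x, y.
    """
    (x1, y1), (x2, y2), (x3, y3) = SENSORS
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = d[0] ** 2 - d[1] ** 2 - x1 ** 2 + x2 ** 2 - y1 ** 2 + y2 ** 2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = d[0] ** 2 - d[2] ** 2 - x1 ** 2 + x3 ** 2 - y1 ** 2 + y3 ** 2
    det = a1 * b2 - a2 * b1  # nonzero when sensors are not collinear
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)
```

In practice the raw measurement would be time differences of arrival rather than clean distances, and noise would call for a least-squares fit over more than three sensors; this sketch only shows the geometric core.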
The sensors, upon sensing the “pressing” or “striking” of the “virtual” keys, preferably generate electrical signals based upon the sensed information and transmit them to a processor 50 which processes and interprets the signals into the desired characters, instructions, information and data input by the user. Processor 50 is preferably in electrical communication with an output device, such as a computer 52, mobile telephone 54, musical instrument 56, palm-held computer/calculator 58, and the like, which visually or audibly outputs the desired characters, instructions, information and data.
In accordance with a preferred embodiment of the present invention, as shown in FIG. 4A, beam-moving apparatus 16 includes a mirror array 60 (one or more mirrors) arranged to reflect light beam 14, and an actuator, such as a servomotor 62, operatively connected to mirror array 60. Servomotor 62 preferably rapidly moves mirror array 60 to reflect light beam 14 to form a two-dimensional or three-dimensional image of data input device 10. Another example is shown in FIG. 4B, wherein beam-moving apparatus 16 includes a crystal beam modifier 64. FIG. 4C illustrates yet another example of beam-moving apparatus 16, that of a scanner 66. In all cases, light beam 14 is rapidly moved to form a two-dimensional or three-dimensional image of data input device 10. Alternatively, a holographic image of data input device 10 can be produced by holographic equipment 65 (FIG. 2). As another alternative, an image of data input device 10 can be produced by a grating 67 (FIG. 2).
Light source 12 and beam-moving apparatus 16 are preferably housed in a laser unit 68 (FIG. 1) approximately the same size as a cell phone. This makes the present invention particularly advantageous for mobile communication devices. For example, a user can carry any conveniently small size cell phone plus the equivalently-sized laser unit 68. If the user wishes to type messages to be sent to the Internet via the cell phone, for example, the user simply generates a large size keyboard with laser unit 68 and comfortably types the commands and message, without having to grapple with multiple presses of keys or with too small keys, or with lugging a clumsy, large keyboard. The present invention thus enables user-friendly use of cell phones for communication on the Internet. The same holds true for palm-sized computer/calculators and other small data input devices. It is noted that the data input devices 10 of the present invention can not only be used as the sole data input device, but can also be integrated with other conventional or non-conventional data input devices.
Although the above described laser unit 68 is considered the most preferred embodiment, other light units can nevertheless be used to generate the optical image of the data input device. For example, as shown in FIG. 5, a light unit 70 may project an optical image 72 of a data input device 74, such as a keyboard, by projecting light from underneath a transparent or translucent substrate 76. A reticle 71 may be provided with a template of the keyboard for producing the image, for example. The sensing of “pressing” the keys of the keyboard and the processing of signals generated by the sensor are preferably as described hereinabove.
Reference is now made to FIG. 6 which illustrates a multilingual keyboard 80, constructed and operative in accordance with a preferred embodiment of the present invention. Keyboard 80 is preferably formed by laser unit 68, described hereinabove. Laser unit 68 preferably forms a silhouette of keys 82 with alphanumeric characters 84 formed in the outline of each key 82. In the embodiment of FIG. 6, a linguistic processor 86 is in electrical communication with laser unit 68. Linguistic processor 86 is operative to form an optical image of letters of any alphabet, as chosen by the user.
The user can choose the particular language in a number of ways. For example, as shown in FIG. 6, laser unit 68 can first display a standard “qwertyuiop” layout of keys 82 in English. The user can then type in English the desired language, other than English, and laser unit 68 promptly generates a different set of keys 88 configured to the chosen language. Additionally or alternatively, switches 90 may be provided for switching between languages. It is important to note that the different set of keys 88 does not necessarily have the same number or layout of keys as the standard “qwertyuiop” layout of keys 82 in English. Linguistic processor 86 is operative to interpret between the keyed-in language and any other language in which it is desired to transmit a message. For example, a Japanese user interested in a website of a Hungarian company can command laser unit 68 to generate an optical image of a Japanese keyboard, and type a message in Japanese. Linguistic processor 86 then translates the Japanese message into Hungarian, and directs the translated message to the website.
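Because the keys are only painted light, switching layouts reduces to swapping the set of labels the laser unit draws. A minimal sketch of that idea follows; the layout table and function names are hypothetical, and each entry shows only the first six keys of the top row for illustration.

```python
# Hypothetical per-language key labels (first six top-row keys only).
LAYOUTS = {
    "english": ["q", "w", "e", "r", "t", "y"],
    "hebrew": ["/", "'", "ק", "ר", "א", "ט"],
    "greek": [";", "ς", "ε", "ρ", "τ", "υ"],
}

def labels_for(language):
    """Return the labels the laser unit would paint onto the virtual
    keys for the requested language."""
    try:
        return LAYOUTS[language]
    except KeyError:
        raise ValueError(f"no optical layout defined for {language!r}")
```

A full implementation would also carry per-layout key counts and geometry, since, as noted above, a chosen language need not share the English layout's number or arrangement of keys.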
It is noted that linguistic processor 86 may be locally connected to data input device 10, and may be part of its hardware. Alternatively, linguistic processor 86 can be provided on a remote server, such as on the Internet, and remotely accessed. The latter feature enables having an international linguistic interface for global communication.
Reference is now made to FIG. 7 which illustrates that laser unit 68 can display a non-standard layout of keys 92. In accordance with a preferred embodiment of the present invention, the user can modify the arrangement, size and shape of keys 92, such as by typing in commands which are interpreted and processed by processor 50 to generate the desired arrangement. Additionally or alternatively, switches 94 or other hardware may be provided for selecting an arrangement of keys 92.
Reference is now made to FIG. 8 which illustrates an optical sensor system 100 for sensing input of data in any of the data input devices of the present invention, constructed and operative in accordance with a preferred embodiment of the present invention. Optical sensing system 100 preferably includes two light beams 102 and 104, different from light beam 14, to determine the position of the data input. Light beams 102 and 104 may emanate from light source 12 or from one or more additional light sources 106. Light beams 102 and 104 preferably cover the entire area of image 18, either by means of scanning or by having sufficient beam width to cover the entire area.
A pair of light detectors 108 and 110 are preferably provided for detecting any light reflected from objects within the silhouette of image 18, corresponding to light beams 102 and 104, respectively. For example, as seen in FIG. 9A, if no object is in the silhouette of image 18, then light beam 102 has one type of reflection which is detected by light detector 108. However, as seen in FIG. 9B, if a finger or other object is placed on one of input zones 19 of image 18, then light beam 102 has a new and different reflection detected by light detector 108. The same holds true for light beam 104. By analyzing the reflection of one of the light beams (102 or 104), such as with processor 50, the system knows the angle relative to the light source at which the object lies. By analyzing both of the reflections of light beams 102 and 104 and their intersection, the system knows the spatial position of the object. Finally, when the finger moves to press the virtual input zone 19, the movement of the finger causes yet another different set of reflections of light beams 102 and 104. The new reflections are analyzed to sense which input zone 19 was “pressed”.
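The geometric step described above, where each reflection yields only an angle and the two angles are intersected to locate the object, can be sketched as a two-ray intersection. This is an illustrative formulation, not the patent's implementation; the source positions and function names are assumptions.

```python
import math

def _cross(a, b):
    """2-D cross product (scalar)."""
    return a[0] * b[1] - a[1] * b[0]

def intersect_rays(p1, theta1, p2, theta2):
    """Return the point where a ray from p1 at angle theta1 meets a ray
    from p2 at angle theta2 (angles in radians).

    Each detector's reflection analysis yields only the angle at which
    the object lies relative to its light source; crossing the two rays
    recovers the object's planar position.
    """
    d1 = (math.cos(theta1), math.sin(theta1))
    d2 = (math.cos(theta2), math.sin(theta2))
    denom = _cross(d1, d2)
    if abs(denom) < 1e-12:
        raise ValueError("rays are parallel; no unique intersection")
    # Solve p1 + t*d1 = p2 + s*d2 for t.
    t = _cross((p2[0] - p1[0], p2[1] - p1[1]), d2) / denom
    return (p1[0] + t * d1[0], p1[1] + t * d1[1])
```

The press itself would then be detected as a change in the reflections over time, with the intersection telling the system which input zone the change occurred in.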
Reference is now made to FIG. 10 which illustrates an optical sensor system 120 for sensing input of data in any of the data input devices of the present invention, constructed and operative in accordance with another preferred embodiment of the present invention. Optical sensing system 120 differs from optical sensing system 100 in that optical sensing system 120 preferably includes one light beam 122 to determine the position of the data input. Light beam 122 may emanate from light source 12 or additional light source 106. Light beam 122 preferably covers the entire area of image 18, either by means of scanning or by having sufficient beam width to cover the entire area.
As seen in FIG. 10, light source 12 or 106 is preferably located at a fixed, known distance x from a “virtual” keyboard 124. For a given angle, such as angle β, there are a plurality of “virtual” keys 126 in the path of light beam 122. The time for light beam 122 to impinge on a finger or other object placed on one of keys 126 and be reflected back to a light detector 128 is a function of the distance of the key 126 from light source 12 or 106. For example, the time for light beam 122 to be reflected from key 126A may be 60 picoseconds whereas the time for light beam 122 to be reflected from key 126B may be 100 picoseconds. Processor 50 preferably analyzes the angle and time data for light beam 122 and derives the spatial position of the finger. Finally, when the finger moves to press the particular key 126, the movement of the finger causes a different reflection of light beam 122. The new reflection is analyzed to sense which key 126 was “pressed”.
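The time-of-flight reasoning above reduces to two small calculations: halving the round-trip time to get a one-way distance, then placing the object along the beam at the known emission angle. The sketch below only illustrates that arithmetic; the function names and the planar coordinate convention are assumptions.

```python
import math

C = 299_792_458.0  # speed of light in vacuum, m/s

def one_way_distance(round_trip_seconds):
    """Convert the measured round-trip time of the reflected beam into
    the one-way distance from the light source to the object."""
    return C * round_trip_seconds / 2.0

def object_position(source, beta, round_trip_seconds):
    """Place the reflecting object along the beam emitted at angle beta
    (radians) from the source position (x, y) in metres."""
    r = one_way_distance(round_trip_seconds)
    return (source[0] + r * math.cos(beta), source[1] + r * math.sin(beta))
```

Note how the patent's example numbers behave under this arithmetic: a 60 ps round trip corresponds to roughly 9 mm of one-way distance, so distinguishing adjacent keys this way demands picosecond-scale timing resolution.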
Reference is now made to FIG. 11 which illustrates an optical sensor system 130 for sensing input of data in any of the data input devices of the present invention, constructed and operative in accordance with yet another preferred embodiment of the present invention. Optical sensing system 130 is preferably similar to the previously described optical sensing system 120, with like elements being designated by like numerals.
In optical sensing system 120, light source 12 or 106 is preferably located at a fixed, known distance from keyboard 124 in order to determine the distance to the particular finger or object. Optical sensing system 130 differs from optical sensing system 120 in that sensing system 130 preferably uses an optically readable reference 132, such as a bar code, as a reference for determining the distance to the particular finger or object. Optically readable reference 132 may be a tangible bar code strip placed on a working surface by the user. Alternatively, optically readable reference 132 may be optically generated just like keyboard 124.
For a given angle, such as angle β, light beam 122 not only crosses over a plurality of keys 126, but also impinges upon a particular region of optically readable reference 132. The particular place of impingement on optically readable reference 132 uniquely determines the angle of light beam 122. Processor 50 can proceed to analyze the angle and time data for light beams 122 and derive the spatial position of the finger, as described hereinabove with reference to FIG. 10.
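The recovery of the beam angle from the reference strip may be sketched, again purely for illustration, under an assumed flat-strip geometry: each region of optically readable reference 132 encodes its own position along the strip, so decoding the illuminated region yields the impingement coordinate, from which the angle follows. The function and parameter names are assumptions.

```python
# Illustrative sketch (assumed geometry, not from the patent): the
# decoded impingement position on reference strip 132 determines the
# angle of light beam 122 relative to the strip normal.
import math

def beam_angle_deg(impinge_x, source_x, source_to_strip):
    """Beam angle, in degrees, from the impingement coordinate.

    impinge_x:       decoded position along the strip where the beam hits (m)
    source_x:        light source position along the same axis (m)
    source_to_strip: perpendicular distance from source to strip (m)
    """
    return math.degrees(math.atan2(impinge_x - source_x, source_to_strip))
```

With the angle recovered this way, the time data can then be combined with it exactly as in the fixed-distance arrangement of FIG. 10.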
The embodiments of FIGS. 8-11 have been described such that the light beams 102, 104 and 122 used to sense the input of data are different from the light beam 14 used to create the virtual keyboard. Alternatively, with appropriate circuitry or software, light beam 14 itself can be used as the light beam used to sense the input of data.
Reference is now made to FIG. 12, which illustrates a sensor system 140 for sensing input of data in any of the data input devices of the present invention, constructed and operative in accordance with yet another preferred embodiment of the present invention. Sensing system 140 is preferably similar to the previously described optical sensing systems 120 and 130, with like elements being designated by like numerals. Sensing system 140 differs from the previous optical sensing systems 100, 120 and 130 in that sensing system 140 preferably includes a non-visible-light beam 142 emanating from a non-visible-light source 143 to determine the position of the data input. Non-visible-light beam 142 is any beam of electromagnetic wave radiation whose wavelength is outside the range of visible light. Alternatively, non-visible-light beam 142 can be an acoustic beam. Most preferably, beam 142 is an infrared beam. Beam 142 preferably covers the entire area of image 18, either by means of scanning or by having sufficient beam width to cover the entire area.
Reference is now made to FIGS. 13 and 14, which illustrate two typical infrared images of fingers placed upon the virtual keyboard 124. FIG. 13 shows an infrared image before one of the fingers presses a key 126. FIG. 14 shows an infrared image after pressing a key 126. It is seen that the act of pressing changes the blood flow to and from the tips of the fingers, and thus causes a different infrared image, such as seen at reference number 146. The difference in the infrared images between FIGS. 13 and 14 is preferably detected by an infrared detector 144 in electrical communication with processor 50. Processor 50 preferably analyzes the differences in the images and determines which key 126 was pressed.
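The image-differencing step may be sketched as follows, for illustration only. The detector frames are modeled here as plain grids of intensities, and the threshold and mapping are assumptions; the patent does not specify the detector's data format.

```python
# Illustrative sketch (assumed data model): a keypress alters fingertip
# blood flow, so pixels of the infrared frame from detector 144 change
# intensity; the virtual key whose region contains the most changed
# pixels is reported as pressed.

def changed_pixels(before, after, threshold):
    """(row, col) coordinates whose infrared intensity changed by more
    than threshold between the two frames."""
    return [(r, c)
            for r, (row_b, row_a) in enumerate(zip(before, after))
            for c, (b, a) in enumerate(zip(row_b, row_a))
            if abs(a - b) > threshold]

def pressed_key(before, after, key_map, threshold=10):
    """Virtual key containing the most changed pixels, or None.

    key_map: assumed mapping from pixel coordinates to key labels.
    """
    counts = {}
    for rc in changed_pixels(before, after, threshold):
        key = key_map.get(rc)
        if key is not None:
            counts[key] = counts.get(key, 0) + 1
    return max(counts, key=counts.get) if counts else None
```

A real detector pipeline would also need registration between the infrared frame and the projected keyboard layout, which this sketch takes as given via key_map.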
When creating and projecting images of any of the data input devices of the present invention, it is possible that portions of the image may fall upon the fingers of the user. Although this does not affect the operation of the invention, some users may nevertheless prefer that no portion of the image fall on their fingers. Reference is now made to FIG. 15, which illustrates a method for preventing display of an image of a data input device on selected locations, in accordance with another preferred embodiment of the present invention.
As described hereinabove, beam-moving apparatus 16 is arranged with respect to light source 12 such that it moves light beam 14 to generate optically generated image 18 of the data input device. Any of the above-described sensor systems 100, 120, 130 or 140 scans image 18 to detect data input as described hereinabove. The sensor system also detects the presence of an object, e.g., a hand or finger, in the outline of image 18. Since processor 50 knows the exact position of the hand or finger, as well as the position of light beam 14, processor 50 can instruct beam-moving apparatus 16 and light source 12 to cause light beam 14 to generate image 18 only in those regions not covered by the fingers.
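Reduced to its essentials, this masking step is a set subtraction over the regions to be illuminated. The rasterized pixel-set model below is an assumption introduced for illustration; the actual apparatus steers a beam rather than addressing pixels.

```python
# Illustrative sketch (assumed rasterized model): given the finger
# region reported by the sensor system, processor 50 withholds light
# beam 14 over those locations so that image 18 is generated only on
# regions not covered by the user's fingers.

def illuminated_pixels(image_pixels, finger_pixels):
    """Locations of image 18 to actually illuminate: the full image
    minus any location currently covered by a hand or finger."""
    return set(image_pixels) - set(finger_pixels)
```

In the beam-steering setting, the same subtraction would be realized by blanking the light source while the scan passes through the excluded region.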
It is noted that any of the above-described sensor systems 100, 120, 130 or 140 can be used to detect data input and the like even without being used in conjunction with the generation of image 18. For example, any of the sensor systems of the invention can be used to detect finger movement on a "regular", tangible keyboard.
Reference is now made to FIGS. 16 and 17, which illustrate other examples of applications generating images of data input devices in accordance with preferred embodiments of the present invention. In FIG. 16, a light-generated web page is generated with any of the above-described apparatus for generating images of data input devices. A user can input data by "clicking" on a click zone 148, the click being detected as described hereinabove.
In FIG. 17, light-generated game objects 150, such as a chess piece 152 and a chess board 154, are generated with any of the above-described apparatus for generating images of data input devices. A user can input data related to the game, such as "moving" the chess piece 152, with the input being detected as described hereinabove.
As mentioned hereinabove, laser unit 68 is considered the most preferred embodiment, but other light units can be used to generate the optical image of the data input device. Another example is shown in FIG. 18: mirror array 60 (described hereinabove with reference to FIG. 4A) may include a mirror 160 with a darkened portion 162 that does not reflect light, and clear portions 164 which do reflect light. The clear portions 164 are shaped like characters, numerals, letters or any other shape of which it is desired to form a light-generated image 166.
It will be appreciated by persons skilled in the art that the present invention is not limited by what has been particularly shown and described hereinabove. Rather, the scope of the present invention includes both combinations and subcombinations of the features described hereinabove, as well as modifications and variations thereof which would occur to a person of skill in the art upon reading the foregoing description and which are not in the prior art.

Claims (46)

US10/815,195 | 2000-05-29 | 2004-04-01 | Data input device | Expired - Fee Related | USRE40368E1 (en)

Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
US10/815,195 (USRE40368E1 (en)) | 2000-05-29 | 2004-04-01 | Data input device

Applications Claiming Priority (3)

Application Number | Priority Date | Filing Date | Title
IL13643200A (IL136432A0 (en)) | 2000-05-29 | 2000-05-29 | Data input device
US09/687,141 (US6650318B1 (en)) | 2000-10-13 | 2000-10-13 | Data input device
US10/815,195 (USRE40368E1 (en)) | 2000-05-29 | 2004-04-01 | Data input device

Related Parent Applications (1)

Application Number | Title | Priority Date | Filing Date
US09/687,141 (Reissue; US6650318B1 (en)) | Data input device | 2000-05-29 | 2000-10-13

Publications (1)

Publication Number | Publication Date
USRE40368E1 (en) | 2008-06-10

Family

ID=39484576

Family Applications (1)

Application Number | Title | Priority Date | Filing Date
US10/815,195 (USRE40368E1 (en), Expired - Fee Related) | Data input device | 2000-05-29 | 2004-04-01

Country Status (1)

Country | Link
US (1) | USRE40368E1 (en)



Legal Events

Date | Code | Title | Description
FEPP | Fee payment procedure

Free format text:PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

FPAY | Fee payment

Year of fee payment: 8

FEPP | Fee payment procedure

Free format text:PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Free format text:PAYER NUMBER DE-ASSIGNED (ORIGINAL EVENT CODE: RMPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

REMI | Maintenance fee reminder mailed
LAPS | Lapse for failure to pay maintenance fees
