REFERENCE TO RELATED APPLICATIONS

The present application is related to a U.S. patent application filed on even date herewith entitled “METHOD OF INTERACTING WITH A COMPUTER USING A PROXIMITY SENSOR IN A COMPUTER INPUT DEVICE”, identified by attorney docket number M61.12-00. The present invention is also related to a U.S. patent application filed on even date herewith entitled “A TECHNIQUE FOR IMPLEMENTING A TWO-HANDED DESKTOP USER INTERFACE FOR A COMPUTER”. The present application is also related to a U.S. patent application filed on even date herewith entitled “INPUT DEVICE WITH FORWARD/BACKWARD CONTROL”, identified by attorney docket number M61.12-0083.
BACKGROUND OF THE INVENTION

The present invention relates to computerized systems. In particular, the present invention relates to input devices for computerized systems.
Computerized systems receive input signals from input devices such as keyboards, mice, joysticks, game pads, touch pads, track balls, and headsets. These input devices create input signals using touch sensors, transducers, or switches. Switches are typically found in the buttons of mice, joysticks, and game pads, as well as in the keys of keyboards. Transducers are found in mice and track balls and create electrical signals based on the movement of balls in those devices. Transducers are also found in headsets, where they convert speech signals into electrical signals. Touch sensors are found in touch pads and provide an electrical signal when the user contacts the touch pad; this signal includes the location within the touch pad where contact was made.
Although it is desirable to increase the amount of information that an input device can provide to the computer, the number of transducers and switches that can be added to an input device is limited by the user's ability to remember all of the functions that a particular transducer or switch performs. In addition, the number of transducers and switches that can be added to an input device is limited by the average user's dexterity and physical ability to manipulate the added controls.
SUMMARY OF THE INVENTION

An input device for a computer system includes an exterior surface and a touch sensor located on the exterior surface. The touch sensor is adapted to generate an electrical signal when a user touches the touch sensor. The electrical signal contains touch information that is the same each time the user touches the touch sensor regardless of where the user's touch occurs on the touch sensor. The input device also includes an input generator capable of generating input information sent to the computer system. The input information includes at least a depressible key's state, a depressible button's state, sound information, or movement information.
In the various embodiments, the input device can include a mouse, a keyboard, a joystick, a game pad, a headset, a remote control for an Internet set-top system, and a remote control for a television.
In various other embodiments, multiple touch sensors are located on an input device in a variety of locations. In some embodiments, the multiple touch sensors provide sufficient information to indicate how the user is holding the input device and with which hand the user is holding the input device.
BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 is a block diagram of a computer system of the present invention.
FIG. 2 is a more detailed block diagram of one embodiment of an input device of the present invention.
FIG. 3 is a perspective view of a headset of the present invention.
FIG. 4A is a perspective view of a mouse of the present invention.
FIG. 4B is a bottom view of the mouse of FIG. 4A.
FIG. 4C is a perspective view of a circuit board of the mouse of FIG. 4A.
FIG. 5 is a top view of another embodiment of a mouse of the present invention.
FIG. 6A is a left side view of another embodiment of a mouse of the present invention.
FIG. 6B is a left side view of another embodiment of a mouse of the present invention.
FIG. 6C is a right side view of another embodiment of a mouse of the present invention.
FIG. 6D is a right side view of another embodiment of a mouse of the present invention.
FIGS. 7A and 7B are a left side view and a top view, respectively, of another embodiment of a mouse of the present invention.
FIGS. 8A and 8B are a left side view and a top view, respectively, of another embodiment of a mouse of the present invention.
FIGS. 9A, 9B, and 9C are a left side view, a top view, and a right side view, respectively, of another embodiment of a mouse of the present invention.
FIGS. 10A, 10B, 10C, 10D, 10E, 10F, 10G, and 10H are top views of different embodiments for a mouse button under the present invention.
FIG. 11A is a top view of another embodiment of a mouse under the present invention.
FIG. 11B is a top view of another embodiment of a mouse under the present invention.
FIGS. 12A and 12B are right side views of different embodiments of mice under the present invention.
FIGS. 13A, 13B, 13C, and 13D are left side views of different embodiments of mice under the present invention.
FIGS. 14A, 14B, 14C, and 14D are top views of different embodiments of mice under the present invention showing touch sensors proximate a wheel on a mouse.
FIG. 15 is a perspective view of a track ball of the present invention.
FIG. 16 is a perspective view of a joystick of the present invention.
FIG. 17 is a perspective view of a game pad of the present invention.
FIG. 18 is a perspective view of a keyboard of the present invention.
FIG. 19 is a more detailed block diagram of the computer of FIG. 1.
FIG. 20 is a screen display as it appears before an input device of the present invention is touched.
FIG. 21 is an image of a screen display after an input device of the present invention has been touched.
FIG. 22 is an image of a screen display showing a pull-down menu activated through the present invention.
FIG. 23 is an image of a screen display showing a second pull-down menu opened through the present invention.
FIG. 24 is an image of a screen display showing an item selected in a pull-down menu through the process of the present invention.
FIG. 25 is an image of a screen display showing a radial menu.
FIGS. 26A, 26B, and 26C show animation around a cursor in response to an input device of the present invention being touched.
FIG. 27 is an image of a screen saver.
FIG. 28 is an image of a screen display showing ink trails of different widths produced by the input device of the present invention.
FIG. 29 is an image of a screen display showing a cursor in a hypertext link.
FIG. 30 is an image of a screen display showing a web browser that includes a current page.
FIG. 31 is an image of a screen display showing a web browser that includes a past page.
FIG. 32 is an image of a screen display showing a web browser that includes a next page.
FIG. 33 is a top view of an Internet set-top remote control.
DETAILED DESCRIPTION OF ILLUSTRATIVE EMBODIMENTS

FIG. 1 and the related discussion are intended to provide a brief, general description of a suitable computing environment in which the invention may be implemented. Although not required, the invention will be described, at least in part, in the general context of computer-executable instructions, such as program modules, being executed by a personal computer. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. Moreover, those skilled in the art will appreciate that the invention may be practiced with other computer system configurations, including hand-held devices, multiprocessor systems, microprocessor-based or programmable consumer electronics, network PCs, minicomputers, mainframe computers, and the like. The invention may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote memory storage devices.
With reference to FIG. 1, an exemplary system for implementing the invention includes a general purpose computing device in the form of a conventional personal computer 20, including a processing unit (CPU) 21, a system memory 22, and a system bus 23 that couples various system components including the system memory 22 to the processing unit 21. The system bus 23 may be any of several types of bus structures including a memory bus or memory controller, a peripheral bus, and a local bus using any of a variety of bus architectures. The system memory 22 includes read only memory (ROM) 24 and random access memory (RAM) 25. A basic input/output system (BIOS) 26, containing the basic routines that help to transfer information between elements within the personal computer 20, such as during start-up, is stored in ROM 24. The personal computer 20 further includes a hard disk drive 27 for reading from and writing to a hard disk (not shown), a magnetic disk drive 28 for reading from or writing to a removable magnetic disk 29, and an optical disk drive 30 for reading from or writing to a removable optical disk 31 such as a CD ROM or other optical media. The hard disk drive 27, magnetic disk drive 28, and optical disk drive 30 are connected to the system bus 23 by a hard disk drive interface 32, a magnetic disk drive interface 33, and an optical drive interface 34, respectively. The drives and the associated computer-readable media provide nonvolatile storage of computer readable instructions, data structures, program modules and other data for the personal computer 20.
Although the exemplary environment described herein employs the hard disk, the removable magnetic disk 29 and the removable optical disk 31, it should be appreciated by those skilled in the art that other types of computer readable media which can store data that is accessible by a computer, such as magnetic cassettes, flash memory cards, digital video disks, Bernoulli cartridges, random access memories (RAMs), read only memory (ROM), and the like, may also be used in the exemplary operating environment.
A number of program modules may be stored on the hard disk, magnetic disk 29, optical disk 31, ROM 24 or RAM 25, including an operating system 35, one or more application programs 36, other program modules 37, program data 38, and device driver 60. The device driver 60 processes commands and information entered by a user through an input device 43, which can include a keyboard, a pointing device, a microphone, a headset, a track ball, a joystick, a game pad, or the like. Under the present invention, at least one of the input devices includes both a touch sensor 40 and a movement transducer 42. Touch sensor 40 is capable of generating a signal that indicates when the user is touching the input device. Movement transducer 42 is capable of generating a signal that indicates when a user causes part of the input device to move. The signals generated by touch sensor 40 and movement transducer 42 are passed along a conductor connected to the processing unit 21 through a serial port interface 46 that is coupled to the system bus 23, but may be connected by other interfaces, such as a sound card, a parallel port, a game port or a universal serial bus (USB).
A monitor 47 or other type of display device is also connected to the system bus 23 via an interface, such as a video adapter 48. In addition to the monitor 47, personal computers typically include other peripheral output devices, such as a speaker 45 and printers (not shown).
The personal computer 20 may operate in a networked environment using logical connections to one or more remote computers, such as a remote computer 49. The remote computer 49 may be another personal computer, a hand-held device, a server, a router, a network PC, a peer device or other network node, and typically includes many or all of the elements described above relative to the personal computer 20, although only a memory storage device 50 has been illustrated in FIG. 1. The logical connections depicted in FIG. 1 include a local area network (LAN) 51 and a wide area network (WAN) 52. Such networking environments are commonplace in offices, enterprise-wide computer networks, intranets, and the Internet.
When used in a LAN networking environment, the personal computer 20 is connected to the local area network 51 through a network interface or adapter 53. When used in a WAN networking environment, the personal computer 20 typically includes a modem 54 or other means for establishing communications over the wide area network 52, such as the Internet. The modem 54, which may be internal or external, is connected to the system bus 23 via the serial port interface 46. In a networked environment, program modules depicted relative to the personal computer 20, or portions thereof, may be stored in the remote memory storage devices. It will be appreciated that the network connections shown are exemplary and other means of establishing a communications link between the computers may be used. For example, a wireless communication link may be established between one or more portions of the network.
FIG. 2 is an expanded block diagram of a portion of one embodiment of an input device 43 of FIG. 1. Input device 43 includes an array of four touch sensors 100, 102, 104, and 106. Each of the sensors produces an electrical signal along a respective conductor 108, 110, 112, and 114, which are connected to an analog-to-digital converter and multiplexer 116. Touch sensors 100, 102, 104, and 106 generate their electrical signals based on actual contact between the user and a portion of the sensor or based on extreme proximity between the user and the sensor. Those touch sensors that rely on contact are referred to as contact sensors, and those that rely on proximity are referred to as proximity sensors. In the context of this application, a touch sensor is touched when it is contacted in the case of contact sensors or when the user is sufficiently proximate the sensor in the case of proximity sensors.
In some contact sensor embodiments, a touch sensor includes a conductive film available from ChemTronics that has a capacitance that changes when it is touched. This sensor also includes a capacitive measuring circuit that generates an electrical signal based on the change in capacitance of the conductive film. Those skilled in the art will recognize that other contact sensor technologies are available such as photodiodes, piezoelectric materials, and capacitive pressure sensors. Any of these sensors may be used within the context of the present invention. In one proximity sensor embodiment, the touch sensor uses reflected light from an LED to detect when the user is proximate the sensor. A chip used to drive the LED and sense the reflected light under this embodiment is produced by Hamamatsu Corporation of Bridgewater, N.J. Other proximity sensor embodiments use changes in electric or magnetic fields near the input device to determine when the user is proximate the device.
In embodiments of the present invention, the touch sensors provide the same information regardless of where on the touch sensor the user touches the input device or which portion of the sensor the user is proximate. Thus, a given touch sensor does not provide location information that would indicate where within that sensor the user made contact or where within that sensor the user came closest to it. In this way, the touch sensors of the present invention decouple touch data from position data.
This distinguishes the present invention from touch pads, touch screens and touch tablets of the prior art. In all of these prior devices, one cannot specify positional data without touching the device, nor can one touch the device without specifying a position. Hence, touch sensing and position sensing are tightly coupled in these prior devices.
Analog-to-digital converter and multiplexer 116 converts the analog electrical signals found on conductors 108, 110, 112, and 114 into digital values carried on a line 118. Line 118 is connected to microcontroller 120, which controls multiplexer 116 to selectively monitor the state of the four touch sensors. Microcontroller 120 also receives inputs from various other sensors on the input device. For simplicity, these inputs are shown collectively as input 122. Those skilled in the art will recognize that different input devices provide different input signals depending on the types of motion sensors in the input device. Examples of motion sensors include switches, which provide signals indicative of the motion needed to close a switch; microphones, which provide signals indicative of air movement created by an audio signal; encoder wheels, which provide signals indicative of the motion of a mouse ball, track ball, or mouse wheel; and resistance wipers, which provide electrical signals indicative of the movements of a joystick. Each of these motion sensors acts as an input generator that is capable of generating input information to be sent to the computer system. Based on the particular input generator, this input information can include a depressible key's state, a depressible button's state, sound information, or movement information.
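For illustration, the multiplexed polling that microcontroller 120 performs can be sketched in C. This is a minimal sketch under assumed hardware: the helper routines (select_adc_channel, read_adc, report_touch_state) and the threshold value are hypothetical, not part of the described device.

    #include <stdint.h>
    #include <stdbool.h>

    #define NUM_TOUCH_SENSORS 4
    #define TOUCH_THRESHOLD   512u  /* assumed threshold for a 10-bit converter */

    /* Hypothetical hardware-access helpers. */
    void select_adc_channel(uint8_t channel);              /* drive the mux select lines */
    uint16_t read_adc(void);                               /* read one converted sample  */
    void report_touch_state(uint8_t sensor, bool touched); /* queue result for output    */

    /* Poll each touch sensor through the shared converter, as in FIG. 2. */
    void poll_touch_sensors(void)
    {
        for (uint8_t ch = 0; ch < NUM_TOUCH_SENSORS; ch++) {
            select_adc_channel(ch);
            uint16_t level = read_adc();
            report_touch_state(ch, level > TOUCH_THRESHOLD);
        }
    }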
Those skilled in the art will also recognize that the number of input lines tied to microcontroller 120 depends on the number of sensors on the input device and the configuration of the input device. For example, for a keyboard, the microcontroller uses input lines to determine if any of the keys have been depressed. The microcontroller accomplishes this using a multiplexer (not shown) to sequentially test the state of each key on the keyboard. The techniques used to detect these switch closures are well known in the keyboard art.
In a mouse or track ball, input lines 122 include lines for detecting the closure of switches and lines for detecting the rotation of encoder wheels. The switches are located beneath buttons on the mouse or track ball. The encoder wheels track the movement of the mouse ball or track ball. Typically, one encoder wheel tracks movement in the X direction and another encoder wheel tracks movement in the Y direction. In most embodiments, each encoder wheel has its own associated input line into microcontroller 120. In some mice, an additional encoder wheel tracks the rotation of a wheel located on top of the mouse.
In some mice, the X and Y movement of the mouse is tracked by a separate optics microcontroller that is connected to microcontroller 120 through lines 122. The optics microcontroller uses optical data to determine movement of the mouse, and converts this optical data into movement values that are transmitted to microcontroller 120 along input lines 122.
In a game pad, input lines 122 include lines for detecting the closure of multiple switches on the game pad as well as lines for detecting the rotation of wheels on the game pad. In joysticks, input lines 122 can include lines connected to resistance wipers on the joystick as well as switches on the joystick. In headsets, lines 122 include multiple lines that carry multi-bit digital values indicative of the magnitude of the analog electrical signal generated by the microphone. These digital values are typically produced by an analog-to-digital converter. To reduce the weight of the headset, the analog-to-digital converter and microcontroller 120 are often found on a sound board located within the computer. To further reduce the weight of the headset, multiplexer and A-to-D converter 116 of FIG. 2 can also be implemented on the sound board.
Microcontroller 120 produces an output 124, which is provided to serial port interface 46 of FIG. 1. Typically, output 124 is a serial, digital value that indicates which motion sensor or touch sensor has been activated. For keyboards, the digital values include scan codes that uniquely identify the key or touch sensor on the keyboard that has been activated. For mice, the digital values include a mouse packet that describes the current state of each switch and each touch sensor on the mouse as well as the distances that the mouse wheel and mouse ball have moved since the last mouse packet was sent.
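The mouse packet is not given a wire format in this description; the following C struct is one plausible layout, offered purely as a sketch (the field names and bit assignments are assumptions).

    #include <stdint.h>
    #include <stdbool.h>

    typedef struct {
        uint8_t button_states; /* one bit per switch: left, right, wheel */
        uint8_t touch_states;  /* one bit per touch sensor on the mouse  */
        int8_t  delta_x;       /* ball movement since the last packet    */
        int8_t  delta_y;
        int8_t  delta_wheel;   /* wheel rotation since the last packet   */
    } MousePacket;

    #define TOUCH_PALM_REST 0x01u  /* assumed bit assignment for the palm rest */

    static inline bool palm_is_touched(const MousePacket *p)
    {
        return (p->touch_states & TOUCH_PALM_REST) != 0;
    }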
FIG. 3 is a perspective view of a headset 150 of the present invention. Headset 150 includes a microphone 152, a support piece 154, a touch sensor 156, and an output line 158. Support piece 154 is designed to loop around a user's ear to support the headset such that microphone 152 is positioned in front of the user's mouth.
Output line 158 carries signals from microphone 152 and from touch sensor 156. In some embodiments, headset 150 is connected to a computer system that includes a speech recognition system. In these embodiments, the speech recognition system is inactive unless touch sensor 156 indicates that headset 150 is being touched by a user. The activation of the speech recognition system can include loading the speech recognition system into random access memory when the user first touches headset 150. It can also include prompting a speech recognition system that resides in random access memory so that it can process input speech signals. In either case, by only activating the speech recognition system when headset 150 indicates that the user is touching the headset, the present invention reduces the likelihood that extraneous speech will be processed by the speech recognition system.
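The activation rule just described amounts to a simple gate on the audio path. A sketch, with a hypothetical recognizer entry point:

    /* Pass audio to the recognizer only while the headset reports touch. */
    void recognize_speech(const short *samples, int count);  /* hypothetical */

    void on_audio_block(const short *samples, int count, int headset_touched)
    {
        if (headset_touched)
            recognize_speech(samples, count);
        /* otherwise the block is dropped, so extraneous speech is ignored */
    }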
FIG. 4A is a perspective view of one embodiment of a mouse 170 of the present invention. Mouse 170 includes a palm-rest 172, a left button 174, a right button 176, a wheel 178, a side 180, and an output line 182. Palm-rest 172, left button 174, and two side areas 184 and 186 of side 180 are coated with separate conductive films. Each of the conductive films is connected to and forms part of a separate sensor such as sensors 100, 102, 104, and 106 of FIG. 2.
FIG. 4B shows a bottom view of mouse 170. Mouse 170 includes a track ball 190 located in a track ball nest 192. When mouse 170 is moved across a surface through force applied at palm-rest 172, side 180, left button 174, or right button 176, track ball 190 rolls within nest 192. This rolling is detected by a pair of encoder wheels 194 and 196 that are shown in FIG. 4C.
FIG. 4C is a perspective view of some of the internal electronics 189 of mouse 170. In FIG. 4C, track ball 190 has been omitted for clarity. Internal electronics 189 include encoder wheels 194 and 196 that detect movements of track ball 190 along two perpendicular directions. The encoder wheels produce electrical signals that are provided to microcontroller 200, which also receives inputs from switches 202 and 204 located under left button 174 and right button 176, respectively. Switches 202 and 204 indicate when left button 174 and right button 176, respectively, have been depressed by the user. Microcontroller 200 also receives signals from switch 201, which indicates when wheel 178 has been depressed, and an encoder wheel 203, which indicates rotational movement of wheel 178. Microcontroller 200 also receives electrical signals from the four sensors attached to the conductive films on palm-rest 172, left button 174, and side areas 184 and 186 of FIG. 4A. These four sensors are grouped together in FIG. 4C as sensor array 206.
Thus, the mouse of the present invention is able to detect when certain areas of the mouse are being touched and when portions of the mouse or the entire mouse are being moved. Specifically, the conductive films at palm-rest 172, left button 174, and side areas 184 and 186 indicate when these areas are being touched by the user. Note that even if the user does not move the mouse or press a button, the sensors associated with the conductive films of FIG. 4A will generate an electrical signal when the user touches the mouse. Encoder wheels 194 and 196 generate a separate electrical signal when the user moves the mouse, and switches 202, 204, and 201 generate separate electrical signals when the user depresses buttons 174 and 176 and wheel 178, respectively. Thus, the mouse of the present invention adds functionality without increasing the dexterity needed to manipulate the controls of the mouse.
In alternative embodiments of the present invention, track ball 190 and encoder wheels 194 and 196 are replaced by a solid-state position-tracking device that collects images of the surface that the mouse travels over to determine changes in the position of the mouse. Under these embodiments, the mouse typically includes a light source used to illuminate the surface, an optics system used to collect images of the surface, and a processor used to compare the various images to determine if the mouse has moved, and if so, in what direction. Since the solid-state position-tracking device converts movement into an electrical signal, it can be considered to be a sophisticated transducer or motion sensor.
FIGS. 5, 6A, 6B, 6C, 6D, 7A, 7B, 8A, 8B, 9A, 9B, 9C, 10A, 10B, 10C, 10D, 10E, 10F, 10G, 10H, 11A, 11B, 12A, 12B, 13A, 13B, 13C, 13D, 14A, 14B, 14C, and 14D show alternative configurations for a mouse under the present invention. FIG. 5 is a top view of a mouse that only has a touch sensor on its palm rest 600. FIGS. 6A and 6B show separate mice embodiments that each have a sensor at the palm rest and along the left side of the mouse. In FIG. 6A, which is a side view, a single sensor 602 covers both the palm rest and the left side of the mouse. In FIG. 6B, also a side view, one sensor covers a palm rest 604 and a separate sensor covers a left side 606.
FIGS. 6C and 6D show separate mice embodiments of the present invention that each have a sensor at the palm rest and along the right side of the mouse. In FIG. 6C, which is a right side view, a single sensor 603 covers both the right side and the palm rest. In FIG. 6D, also a right side view, one sensor 605 covers the palm rest and a separate sensor 607 covers the right side.
FIGS. 7A and 7B show a side view and a top view, respectively, of a mouse embodiment having a single sensor 608 across a palm rest and a left side of the mouse, and a separate sensor 610 on the left button of the mouse. FIGS. 8A and 8B show a side view and a top view, respectively, of a mouse embodiment having a single touch sensor 612 across the palm rest and left side of the mouse, a touch sensor 614 on the left button of the mouse, and a touch sensor 616 on the right button of the mouse.
FIGS. 9A, 9B, and 9C show a left side view, a top view, and a right side view, respectively, of a mouse 690 of the present invention. Mouse 690 includes a left side sensor 692, a palm sensor 694, a right side sensor 696, and a button sensor 698. In mouse 690, right side sensor 696 and left side sensor 692 are separate from palm sensor 694. In another embodiment of the present invention, these three sensors are formed as a single sensor.
FIGS. 10A, 10B, 10C, 10D, 10E, 10F, 10G, and 10H show top views of different mice embodiments showing possible configurations for touch sensors on the left button of a mouse of the present invention. These button configurations may appear alone on the mouse or in combination with other sensors on other parts of the mouse. FIG. 10A shows a single solid sensor 618 across the entire left button. FIG. 10B shows a set of six sensor strips 620 that each produce a separate electrical signal when they are touched. FIG. 10C shows two regions 624 and 626 separated by a ridge 628. Both regions 624 and 626 end at a front end 627 of button 622. FIG. 10D shows two regions 634 and 637 separated by a ridge 636, where regions 634 and 637 both end at a side end 631 of button 630. The configurations of buttons 622 and 630 are particularly useful in paging through documents as discussed below. FIG. 10E shows a button configuration for a button 640 having four separate sensor areas formed as squares 641, 642, 643, and 644. In some embodiments, the lines that separate the four sensor areas are formed as ridges that have a different topography from the sensor areas. FIG. 10F also shows four separate sensors on a button 646. In FIG. 10F, three of the sensor areas 650, 651, and 652 are found at a front end of button 646, and the remaining sensor 648 covers the remainder of button 646. FIG. 10G shows a button 660 with nine sensor regions arranged in a layout similar to a keypad. FIG. 10H shows a button 670 with an outer circle of eight sensors 672 that surrounds a central sensor 674. The configuration of button 670 is especially useful for manipulating radial menus.
FIGS. 11A and 11B show mice embodiments that include separate sensors on both buttons of the mouse. In FIG. 11A, buttons 700 and 702 have sensors but palm rest 704 does not have a sensor. In FIG. 11B, buttons 706 and 708 and palm rest 710 each have separate sensors.
FIGS. 12A and 12B show mice embodiments with multiple sensors along the right side of the mouse. In FIG. 12A, which is a right side view, there are two sensors 720 and 722 along the right side. In FIG. 12B, there are three sensors 724, 726, and 728 along the right side.
FIGS. 13A, 13B, 13C, and 13D show side views of mice embodiments with multiple sensors along the left side of the mouse. The mouse of FIG. 13A has two sensors 734 and 736 along the left side. In FIG. 13B, the mouse has three touch sensors 738, 740, and 742, each separated by a space. The mouse of FIG. 13C also has three touch sensors along the left side. However, in FIG. 13C, middle touch sensor 744, which is located between sensors 746 and 748, has a raised surface and is formed as a ridge between sensors 746 and 748. The raised surface of sensor 744 provides tactile feedback that allows the user to determine the position of their thumb without looking at the mouse. FIG. 13D shows a mouse embodiment with a plurality of strips 752 running along the left side of the mouse.
Note that all of the embodiments of FIGS. 12A, 12B, 13A, 13B, 13C, and 13D can be practiced under the present invention along with a sensor located on the palm rest and/or a sensor located on the left button and/or a sensor located on the right button.
FIGS. 14A, 14B, 14C, and 14D are top views of mice embodiments with touch sensors proximate a wheel on a mouse. In FIG. 14A, the touch sensor is located directly on a wheel 760. In FIG. 14B, one touch sensor 762 is located forward of a wheel 764, and one touch sensor 766 is located in back of wheel 764; in this embodiment, wheel 764 itself does not have a touch sensor. In FIG. 14C, one touch sensor 770 is located in front of a wheel 768 and one touch sensor 772 is located in back of wheel 768. In addition, wheel 768 includes a touch sensor. In the embodiment of FIG. 14D, touch sensors are located on a wheel 774, a front area 776, which is in front of wheel 774, a back area 778, which is in back of wheel 774, and a palm rest 780.
Although various embodiments have been described with particularity with respect to touch sensor location in FIGS. 5, 6A, 6B, 6C, 6D, 7A, 7B, 8A, 8B, 9A, 9B, 9C, 10A, 10B, 10C, 10D, 10E, 10F, 10G, 10H, 11A, 11B, 12A, 12B, 13A, 13B, 13C, 13D, 14A, 14B, 14C, and 14D, it should be noted that sensors may also be included in other locations. For example, it is possible to combine some or all of the touch sensors illustrated in one embodiment with some or all of the touch sensors illustrated in another embodiment.
FIG. 15 is a perspective view of a track ball 220 of the present invention. Track ball 220 includes a base 222, buttons 224 and 226, and a ball 228. In one embodiment of the present invention, ball 228 is coated with a conductive film that is contacted by three rotating metal wheels (not shown) in base 222. One of the metal wheels is contacted by a conductive sheet that sits behind the wheel and is pressed into the wheel by a spring force. The conductive sheet is further connected to a touch sensor that produces an electrical signal when ball 228 is touched by a user. The other two wheels in base 222 form two orthogonal motion sensors (not shown) capable of tracking the rotary motion of ball 228 in base 222. Beneath buttons 224 and 226, base 222 includes two switches that are capable of generating electrical signals when a user depresses buttons 224 and 226. Thus, track ball 220 is able to provide one electrical signal based on the user simply touching ball 228 and separate electrical signals based on the user moving ball 228 or depressing buttons 224 or 226.
FIG. 16 is a perspective view of a joystick 240 of the present invention that includes a base 242, a handle 244, a trigger 246, and buttons 248, 250, and 252. In one embodiment of the present invention, trigger 246 is coated with a conductive film that is connected to a touch sensor within base 242. In further embodiments, button 248 is also coated with a conductive film connected to a separate touch sensor in base 242. Trigger 246 and buttons 248, 250, and 252 are further connected to switches that provide respective electrical signals when the user depresses the respective buttons or trigger. Handle 244 is connected to a set of transducers that track the motion of handle 244 relative to base 242. Thus, joystick 240 provides a set of electrical signals when the user is touching trigger 246 or button 248 and a separate set of electrical signals when the user moves handle 244 or moves trigger 246 or buttons 248, 250, or 252.
FIG. 17 is a perspective view of a game pad 260 of the present invention having side buttons 262 and 264, left hand buttons 266, 268, 270, 272, 274, and 276, and right hand buttons 278, 280, 282, 284, 286, and 288. In addition, game pad 260 has a start button 290 and a select button 292. In some embodiments of the present invention, side buttons 262 and 264 are each coated with a conductive film that is connected to a respective touch sensor within game pad 260. Game pad 260 also includes a plurality of switches, one switch for each button on the game pad. Thus, in some embodiments, game pad 260 is able to provide one set of signals indicative of when the user is touching side buttons 262 and 264 and a second set of electrical signals indicative of when the user has depressed a button on game pad 260.
FIG. 18 depicts a keyboard 300 of one embodiment of the present invention that has a typical QWERTY layout 302 on the left side of the keyboard and a numeric keypad 304 on the right side. Numeric keypad 304 includes the numbers 0-9, with the numbers 1-9 appearing in a 3×3 box. In some embodiments, all nine of these keys are covered with a conductive film. In other embodiments, other keys on the keyboard are covered by the conductive film. The conductive film on each key is connected to and forms part of a separate touch sensor in keyboard 300. The application of such touch sensors in the present invention is discussed further below. The fact that each key has a conductive film means that the keys are each able to provide two signals. One signal is provided when the user touches but does not depress the key, and a second signal is provided when the user depresses the key.
Additional touch sensors are located on keyboard casing 301 at portions 306 and 307 below space bar 308, at portion 309 below arrow keys 310, and at a portion 311 below key pad 304. Arrow keys 310 are typically used by the user to move a cursor across the display. Note that although keyboard 300 is shown with touch sensors on the keys and touch sensors on portions 306, 307, 309, and 311, other embodiments of the invention have touch sensors only on the keys or only on one of the portions 306, 307, 309, and 311. In other embodiments, different combinations of these touch sensors are found on the keyboard. In addition, some or all of the touch sensors on portions 306, 307, 309, and 311 are proximity sensors in some embodiments. The proximity sensors can detect the user's hand when it is near the sensor without requiring the hand to actually contact the sensor.
FIG. 19 is a more detailed block diagram of computer 20 useful in describing a message routing system of one embodiment of the present invention. In FIG. 19, input device 43 provides a serial binary signal to serial interface 46. Input device 43 can include any of the input devices described above that have touch sensors.
Serial interface 46 converts the serial binary signal from input device 43 into parallel multi-bit values that are passed to device driver 60. In many embodiments of the present invention, device driver 60 is implemented as a software routine that is executed by CPU 21 of FIG. 1. In these embodiments, device driver 60 is input device specific and is designed to interact with a particular input device based on a designated protocol. Thus, if input device 43 is a mouse, device driver 60 is a mouse driver that is designed to receive mouse packets generated by the mouse using a mouse packet protocol. If input device 43 is a keyboard, device driver 60 is a keyboard driver designed to receive keyboard scan codes indicative of a key being depressed or a touch sensor being touched.
Based on the designated protocol, device driver 60 converts the multi-bit values into device messages that are passed to operating system 35. These device messages indicate what events have taken place on the input device. For example, if a touch sensor on a mouse has been touched, the message indicates that the particular sensor is being touched. When the touch sensor is released, a separate message is generated by device driver 60 to indicate that the touch sensor has been released.
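One way to generate the touch and release messages is edge detection between successive packets. A C sketch, where post_device_message and the message codes are hypothetical:

    #include <stdint.h>

    enum { MSG_TOUCH_DOWN, MSG_TOUCH_UP };

    void post_device_message(int msg, uint8_t sensor_id);  /* hypothetical */

    /* Compare the previous and current touch bitmaps; emit one message
       per sensor whose state changed. */
    void translate_touch_states(uint8_t prev, uint8_t curr)
    {
        uint8_t changed = (uint8_t)(prev ^ curr);
        for (uint8_t id = 0; id < 8; id++) {
            uint8_t mask = (uint8_t)(1u << id);
            if (changed & mask)
                post_device_message((curr & mask) ? MSG_TOUCH_DOWN : MSG_TOUCH_UP, id);
        }
    }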
The messages generated by device driver 60 are provided to operating system 35, which controls the routing of these messages. In most embodiments, the device messages are sent to a focus application 812. The focus application is typically the application that has the top-most window on the display.
In some embodiments of operating system 35, the operating system maintains a list of message hook procedures that have been registered with the operating system. In these embodiments, operating system 35 sequentially passes the device message to each message hook procedure on the list before sending the message to focus application 812. Such message hook procedures are shown generally as message hook procedures 810 of FIG. 19. Most message hook procedures simply evaluate the device message to determine if some action should be taken. After evaluating the device message, the message hook procedure returns a value to operating system 35 indicating that the operating system should pass the device message to the next procedure in the list. Some message hook procedures have the ability to “eat” a device message by returning a value to operating system 35 that indicates that the operating system should not pass the device message to any other message hook procedures or to the focus application.
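The routing just described, in which each hook sees the message in turn and any hook may “eat” it, can be sketched as follows. The types and the dispatch loop are illustrative, not an actual operating-system API:

    #include <stddef.h>

    typedef struct { int type; int data; } DeviceMessage;
    typedef enum { HOOK_PASS, HOOK_EAT } HookResult;
    typedef HookResult (*MessageHook)(const DeviceMessage *msg);

    void deliver_to_focus_application(const DeviceMessage *msg);  /* hypothetical */

    void route_message(const DeviceMessage *msg, MessageHook hooks[], size_t num_hooks)
    {
        for (size_t i = 0; i < num_hooks; i++)
            if (hooks[i](msg) == HOOK_EAT)
                return;  /* eaten: neither later hooks nor the focus app see it */
        deliver_to_focus_application(msg);
    }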
The message hook procedures and the focus application use the device messages, especially those indicating that a touch sensor has been touched, to initiate a variety of functions that are discussed below.
For example, FIGS. 20 and 21 depict images of screens displayed by various applications of the present invention that utilize device messages generated based on signals from an input device of the present invention such as mouse 170 and track ball 220 of FIGS. 4A and 15, respectively. FIG. 20 depicts an image of a screen 320 that shows a virtual desktop 322. Virtual desktop 322 includes images of icons 324 and 326 as well as an open window 328. Open window 328 is associated with a word processing application known as Microsoft Word, offered by Microsoft Corporation of Redmond, Wash.
In window 328, a caret 330 is positioned within a sentence of an open document. Caret 330 may be positioned by moving mouse 170 or ball 228 of track ball 220. In FIG. 20, caret 330 appears as a vertical line that extends between two smaller horizontal lines. Those skilled in the art will recognize that caret 330 can have many different shapes, and typically appears as an arrow on desktop 322.
The position of caret 330 within the sentence of window 328 causes a tool tip 332 to appear. Tool tip 332 indicates who entered the word that caret 330 is positioned over.
Window 328 also includes a tool bar 334 that includes drawing tools that can be used to draw pictures in the document of window 328.
Under embodiments of the present invention, caret 330, tool tip 332, and tool bar 334 only appear in window 328 while the user is touching a portion of the input device. If the user is not touching the input device, caret 330, tool tip 332, and tool bar 334 disappear. FIG. 21 shows an image of display 320 when the user is not touching a portion of the input device. By eliminating tool bar 334, caret 330, and tool tip 332 when the user is not touching the input device, the present invention reduces the clutter found in window 328 and makes it easier for the user to read the document shown in window 328.
Those skilled in the art will recognize that the disappearance of caret 330, tool tip 332, and tool bar 334 when the user is not touching the input device can be controlled independently. Thus, the user may customize window 328 such that tool tip 332 and tool bar 334 disappear when the user releases the input device, but caret 330 remains visible. In addition, the rate at which items disappear and reappear can be controlled. Thus, it is possible to fade images off the display and to fade them back onto the display as the user releases and then touches the input device. In some embodiments, the fade-out period is 2.0 seconds to minimize distraction, and the fade-in period is 0.0 seconds for the caret, which appears instantly, and 0.3 seconds for tool bars.
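The fade timing can be expressed as a small opacity function. A sketch, assuming a linear ramp (the description fixes the periods but not the ramp shape):

    /* Opacity in [0,1] for a UI element, given whether the device is
       touched and the seconds elapsed since the last touch or release. */
    float ui_opacity(int touching, float elapsed, float fade_in, float fade_out)
    {
        if (touching) {
            if (fade_in <= 0.0f || elapsed >= fade_in)
                return 1.0f;                  /* fully faded in */
            return elapsed / fade_in;         /* still fading in */
        }
        if (elapsed >= fade_out)
            return 0.0f;                      /* fully faded out */
        return 1.0f - elapsed / fade_out;     /* still fading out */
    }

    /* Per the text: the caret uses fade_in = 0.0 s (instant), tool bars
       use fade_in = 0.3 s, and both use fade_out = 2.0 s. */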
FIGS. 22, 23, and 24 show a series of display screens that include pull-down menus that are displayed as a result of keyboard messages from keyboard 300 of FIG. 18. In particular, in screen image 350 of FIG. 22, an application generates an active window 352 on virtual desktop 354 that includes an image of a pull-down menu 356. Pull-down menu 356 is associated with a menu heading entitled “Tools” found in a menu bar 358. Pull-down menu 356 is displayed in response to a keyboard message that indicates that the user is touching but not depressing one of the keys of numeric keypad 304 of keyboard 300.
In other embodiments, the user may move left and right across menu bar 358 by using the keys representing the numbers “4” and “6” on numeric keypad 304. As the user moves across menu bar 358, a different pull-down menu is displayed for each respective menu heading. Specifically, by touching the key representing the number “4”, the user causes a keyboard message to be sent to the application, which changes the display so that the pull-down menu for the menu heading to the left of the current menu heading in menu bar 358 is displayed. Thus, if the pull-down menu for the menu heading “Tools” is currently displayed in window 352, touching the key representing the number “4” causes a pull-down menu associated with the menu heading “Insert” to be displayed. Similarly, the user can cause a pull-down menu to appear for a menu heading to the right of the current menu heading by touching the key representing the number “6” on numeric keypad 304. Thus, if the current pull-down menu is associated with the menu heading “Tools”, and the user touches the key representing the number “6”, the pull-down menu associated with the menu heading “Format” in menu bar 358 will be displayed. This is shown in FIG. 23, where pull-down menu 360 for the menu heading “Format” in menu bar 358 is displayed.
By touching the keys representing the numbers “2” and “8” on numeric keypad 304, the user can also move up and down within a pull-down menu such as pull-down menu 360. As the user moves through a pull-down menu, different items within the pull-down menu become highlighted. An example of a highlighted entry is entry 362 of FIG. 23, which highlights the entry “Tabs” of pull-down menu 360 as the current entry. If the user touches the key representing the number “8” while entry 362 is the current entry, the application that receives the associated keyboard message highlights entry 364 located above entry 362 as the current entry. If the user touches the key representing the number “2” while entry 362 is the current entry, entry 366 below entry 362 is highlighted as the current entry.
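Taken together, the keypad-touch navigation amounts to a four-way dispatch. A sketch with hypothetical menu-manipulation helpers:

    /* Hypothetical helpers that move the menu highlight. */
    void menu_move_left(void);
    void menu_move_right(void);
    void menu_highlight_up(void);
    void menu_highlight_down(void);

    /* Called when a keyboard message reports a keypad key touched
       (not depressed). */
    void on_keypad_touch(char key)
    {
        switch (key) {
        case '4': menu_move_left();      break;  /* heading to the left   */
        case '6': menu_move_right();     break;  /* heading to the right  */
        case '8': menu_highlight_up();   break;  /* entry above current   */
        case '2': menu_highlight_down(); break;  /* entry below current   */
        default:  break;                         /* other keys do nothing */
        }
    }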
FIG. 23 can also be used to describe another embodiment of the present invention. In particular, pull-down menu 360 may also be activated by positioning the caret over the menu heading “Format” and depressing a select button on a pointing device such as mouse 170 or track ball 220 of FIGS. 4A and 15, respectively. The user may select an entry in pull-down menu 360 by moving the pointing device downward through the list of entries. As the user moves the input device, individual entries in the list are highlighted.
In the prior art, pull-down menu 360 will continue to be displayed even if the caret is positioned outside of the pull-down menu itself. The only way to make the pull-down menu disappear is to click on an area outside of the menu itself. However, under an embodiment of the present invention, the application that produces the pull-down menu removes the pull-down menu as soon as it receives a mouse message indicating that the user has released the pointing device. This improves user efficiency by reducing the movements the user must make to close the pull-down menus associated with menu bar 358.
FIG. 25 is an image of a display screen that includes a radial menu 370 that is displayed under an alternative embodiment of the present invention. Radial menu 370 includes eight entries arranged in a circle 371 around a cancel button 372. Radial menu 370 may be manipulated either by using keyboard 300 of FIG. 18 or by using the touch sensors on button 670 of the mouse of FIG. 10H.
Using keyboard 300, a focus application displays radial menu 370 when it receives a keyboard message indicating that a user touched one of the keys in keypad 304. To highlight a specific entry, the user touches a key in keypad 304 that is spatially related to the entry. For example, to highlight entry 373 of radial menu 370, the user touches the key representing the number “8”, which is located directly above a center key representing the number “5”, because the spatial positioning of the “8” key relative to the “5” key is the same as the spatial relationship between entry 373 and cancel button 372. To select an entry, the user depresses the key that causes the entry to be highlighted. To dismiss the radial menu, the user depresses the “5” key.
To manipulate the radial menu using the touch sensors of button 670 on the mouse of FIG. 10H, the user simply touches the touch sensor that corresponds to an entry on the radial menu. Simply touching the corresponding touch sensor causes the entry to be highlighted. Depressing button 670 while touching the corresponding touch sensor causes the entry to be selected. The application determines that both events have occurred based on two separate mouse messages. A first mouse message indicates which touch sensor is currently being touched. A second mouse message indicates that the left button has been depressed.
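The spatial mapping between keypad digits and radial entries can be captured in a lookup. A sketch; the index convention (0 at the top, increasing clockwise) is an assumption:

    /* Map a touched keypad digit to a radial-menu entry index, mirroring
       the digit's position around the central "5" key. Returns -1 for
       "5" (cancel) or any non-keypad key. */
    int radial_entry_for_key(char key)
    {
        switch (key) {
        case '8': return 0;  /* top         */
        case '9': return 1;  /* upper right */
        case '6': return 2;  /* right       */
        case '3': return 3;  /* lower right */
        case '2': return 4;  /* bottom      */
        case '1': return 5;  /* lower left  */
        case '4': return 6;  /* left        */
        case '7': return 7;  /* upper left  */
        default:  return -1;
        }
    }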
FIGS. 26A, 26B, and 26C show images of screens displayed by a program application of the present invention that depict an animation created by the application. In particular, these figures show the animation of a caret “sonar” that is formed by sequentially placing rings around the caret. This animated sonar is initiated under the present invention when the user initially touches an input device such as mouse 170 of FIG. 4A.
The animation can be seen in FIGS. 26A, 26B, and 26C by viewing the respective displays 400, 402, and 404 as a sequence of displays that are presented to the user in that order. In display 400 of FIG. 26A, caret 406, which appears as an arrow, is shown without any surrounding graphics. In display 402, caret 406 is surrounded by a circle 408. In display 404, caret 406 is surrounded by two circles 408 and 410. Under one embodiment, the animation of FIGS. 26A, 26B, and 26C only lasts for 0.3 seconds after the user initially touches the input device.
FIG. 26A can also be used to describe another embodiment of the present invention. Specifically, under this embodiment of the present invention, caret 406 of FIG. 26A will not move unless the input device is being touched by the user while it is being moved. Thus, if mouse 170 moves because the user accidentally kicks the cord of the mouse, caret 406 will not move under the present invention since the user was not touching the mouse directly. Under prior art systems, applications moved the caret upon receiving a mouse message that indicated that the mouse had been moved. Under the present invention, the application only moves the caret if it receives a message that the mouse is being touched and a message that the mouse has moved. This helps to prevent unwanted movement of the caret.
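In code, this rule is a guard on the motion handler: motion only moves the caret while the touch state says the device is held. A sketch with illustrative names:

    static int device_is_touched;  /* maintained from touch/release messages */

    void move_caret(int dx, int dy);  /* hypothetical */

    void on_touch_message(int touched) { device_is_touched = touched; }

    void on_motion_message(int dx, int dy)
    {
        if (device_is_touched)
            move_caret(dx, dy);  /* a kicked cord generates motion but no touch */
    }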
In prior art computer systems, if the user has not moved the input device or has not entered text over a period of time, the computer system will initiate a screen saver program. Such a program provides a mostly black display to help reduce the wear on the screen. An example of a screen saver is shown in FIG. 27. Under the present invention, the screen saver application will be stopped when the user touches an input device of the present invention. Thus, the user does not have to move the input device as in the prior art, but only has to touch the input device in order to stop the screen saver program and to redisplay the virtual desktop. Thus, when the user touches the input device, screen saver display 430 is replaced with a desktop display such as display 400 of FIG. 26A.
In some embodiments of the present invention, the input device includes enough touch sensors that it is possible for the present invention to identify how the user is gripping the input device. For example, mouse 690 of FIGS. 9A, 9B, and 9C, which is referred to by the inventors as a “pinch” mouse, includes two side touch sensors 692 and 696 and a palm rest touch sensor 694. Thus, it is possible for the applications of the present invention to identify which touch sensors the user is touching based on a collection of device messages and thus how the user is gripping mouse 690.
This information can be used to control how the caret moves on the display. For example, under one embodiment of the present invention, if the user is gripping mouse 690 so that the user's thumb is touching left side sensor 692 and their palm is touching palm rest touch sensor 694, the caret moves relatively large distances across the display for fixed movements of mouse 690. If the user is gripping mouse 690 such that the user is touching left side sensor 692 and right side sensor 696 but not palm rest touch sensor 694, the caret moves small distances for the same fixed movement of mouse 690. This provides more flexibility in the control of the caret and is useful in programs where the caret is used to draw on the screen, to place the cursor on the screen, and to move objects.
In an alternative embodiment, the manner in which the user grips the input device can be used to control the width of an ink trail produced behind the caret as the user moves the input device. FIG. 28 is an image of a display 436 showing two ink trails 438 and 440 of different widths. Under this embodiment of the invention, these ink trails are produced when the user grips the input device in two different ways. For example, narrow-width ink trail 438 is produced when the user touches both left side sensor 692 and right side sensor 696 of mouse 690. On the other hand, thick-width ink trail 440 is produced when the user touches left side sensor 692 and palm rest touch sensor 694 but not right side sensor 696.
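Both behaviors, the caret gain and the ink width, reduce to classifying the grip from the three sensors of mouse 690. A sketch; the bit assignments and the returned gain values are assumptions:

    #include <stdint.h>

    #define SENSE_LEFT_SIDE  0x01u  /* sensor 692 */
    #define SENSE_RIGHT_SIDE 0x02u  /* sensor 696 */
    #define SENSE_PALM_REST  0x04u  /* sensor 694 */

    /* Return a caret-movement gain; the same test could instead select
       an ink-trail width. */
    float gain_for_grip(uint8_t touch_states)
    {
        int left  = (touch_states & SENSE_LEFT_SIDE)  != 0;
        int right = (touch_states & SENSE_RIGHT_SIDE) != 0;
        int palm  = (touch_states & SENSE_PALM_REST)  != 0;

        if (left && palm && !right)
            return 4.0f;  /* full grip: large caret movements / thick ink */
        if (left && right && !palm)
            return 1.0f;  /* pinch grip: fine movements / narrow ink */
        return 2.0f;      /* any other grip: an assumed default */
    }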
In further embodiments of the present invention, ink trails, such as ink trails 438 and 440 of FIG. 28, can be produced by touching a button on an input device such as button 174 of FIG. 4A. In the prior art, such ink trails are usually only produced if the button is depressed. Under the present invention, the user does not have to strain to maintain pressure on the button while producing the ink trail. Instead, the user only needs to keep their finger in contact with the button. Similarly, in some embodiments of the present invention, the user may open boxes, drag objects, and initiate commands by simply touching the top of the button instead of having to depress the button. The movement of the object, box, or ink trail is then controlled by the movement of the input device by the user while the user maintains contact with the button.
The user may also place a cursor within a hypertext link, such as link 457 of FIG. 29, by touching a button on the input device while a displayed caret 458 is positioned over the link. The user activates the link by depressing the button. Such embodiments make it easier to place a cursor within a link without activating the link.
In one embodiment of the present invention, multiple touch areas on an input device can be used to page backward and forward through web pages provided by an Internet browser. Examples of input devices having multiple touch-sensitive areas useful in paging are the mice of FIGS. 10C, 10D, 12A, 12B, 13A, 13B, and 13C. In FIG. 10C, touching region 624 and then region 626 initiates a page backward function, and touching region 626 and then region 624 initiates a page forward function. In FIG. 10D, touching region 637 and then region 634 initiates a page backward function, and touching region 634 and then region 637 initiates a page forward function. In FIGS. 12A and 12B, touching regions 722 and 724, respectively, and then regions 720 and 728, respectively, initiates page forward functions, and touching regions 720 and 728, respectively, and then regions 722 and 724, respectively, initiates page backward functions. In FIGS. 13A, 13B, and 13C, touching regions 734, 738, and 746, respectively, and then touching regions 736, 742, and 748, respectively, initiates page forward functions, and touching regions 736, 742, and 748, respectively, and then touching regions 734, 738, and 746, respectively, initiates page backward functions.
Note that a mouse of the present invention can be configured so that paging functions are initiated simply by touching one touch sensor instead of touching a sequence of two touch sensors. Thus, in FIG. 10C, touching region 624 can initiate a page forward function and touching region 626 can initiate a page backward function. Similarly, touching region 734 of FIG. 13A can initiate a page forward function and touching region 736 of FIG. 13A can initiate a page backward function. In this context, the touch sensors of the present invention provide the functionality of the side switches found in a patent application filed on even date herewith entitled “INPUT DEVICE WITH FORWARD/BACKWARD CONTROL”, identified by attorney docket number M61.12-0083, the inventors of which were under a duty to assign the application to the assignee of the present application.
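The two-sensor paging gesture is a tiny state machine over the most recently touched region. A sketch; the region names and which sequence maps to which direction are illustrative, since the figures above assign them per embodiment:

    enum { REGION_NONE, REGION_FRONT, REGION_REAR };

    void page_forward(void);   /* hypothetical browser commands */
    void page_backward(void);

    static int last_region = REGION_NONE;

    void on_region_touched(int region)
    {
        if (last_region == REGION_REAR && region == REGION_FRONT)
            page_forward();
        else if (last_region == REGION_FRONT && region == REGION_REAR)
            page_backward();
        last_region = region;
    }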
The paging functions performed using these touch areas are shown in FIGS. 30, 31, and 32. In FIG. 30, display 460 shows an Internet browser window 462 that depicts a current page 464. A user can page backward to the Internet page that was displayed before current page 464 to display a past page 470 of FIG. 31, which is shown in Internet browser window 472. The user can move forward to a next page 476, shown in browser window 478 of display 480 in FIG. 32, using the touch sensor combinations described above. In order to be able to move forward to next page 476, the user must at some point have moved backward from next page 476 to current page 464.
Input devices of the present invention also allow for scrolling through pages of documents on a line-by-line basis. In particular, the mice of FIGS. 10B and 13D allow for scrolling using a series of touch sensor strips on the left button and on the left side of the mouse, respectively. When the user strokes the strips by moving their thumb or finger toward their hand, the document is scrolled downward. When the user strokes the strips in the opposite direction, the document is scrolled upward. In some embodiments, the speed at which the strips are stroked determines the scroll rate.
Scrolling under the present invention is also accomplished using the mice embodiments of FIGS. 14A, 14B, 14C, and 14D. In these embodiments, when the user rolls the wheel of the mouse toward their hand, the document scrolls down. When the user rolls the wheel away from their hand, the document scrolls up. In addition, if the user's finger remains in contact with a touch sensor on the wheel or on a touch sensor behind the wheel after rotating the wheel backward, the document will continue to scroll down until the user releases the touch sensor. Similarly, if the user's finger remains in contact with a touch sensor on the wheel or a touch sensor in front of the wheel after the user has rolled the wheel forward, the document will continue to scroll up until the user releases the touch sensor. The sensor in front of the wheel can also be tapped by rapidly touching and releasing the touch sensor in order to page down through the document. Similarly, the sensor behind the wheel can be tapped to page up through the document.
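A sketch of these wheel rules, with sign conventions and the tap threshold chosen for illustration:

    void scroll_lines(int n);  /* hypothetical: positive n scrolls down */
    void page_down(void);
    void page_up(void);

    /* Called on each timer tick. last_roll is +1 if the last wheel motion
       was toward the hand (down), -1 if away from it (up), 0 if none. */
    void continue_scrolling(int last_roll, int rear_touched, int front_touched)
    {
        if (last_roll > 0 && rear_touched)
            scroll_lines(+1);   /* finger held behind the wheel: keep scrolling down */
        else if (last_roll < 0 && front_touched)
            scroll_lines(-1);   /* finger held in front: keep scrolling up */
    }

    /* A rapid touch-and-release on a sensor pages instead of scrolling. */
    void on_sensor_tap(int front_sensor)
    {
        if (front_sensor)
            page_down();
        else
            page_up();
    }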
In addition to controlling the output images provided to the user, applications of the present invention also control audio signals presented to the user based on touch-indicative signals provided by an input device of the present invention. In some embodiments of the present invention, some audio signals are suppressed if the user is touching the input device. In other embodiments, audio signals are suppressed if the user is not touching the input device. The audio signals can include notification signals such as mail chimes and hourly clock bells.
Under some embodiments of the present invention, computer-executable instructions determine at least one characteristic of how a user touches an input device based on a touching signal from the input device. Other instructions record profile information about the user based on this characteristic. One simple characteristic is whether the user is touching the input device. Under an embodiment of the present invention, whether the user is touching the input device is recorded and transmitted over a network to other users to indicate that the user is present at their station.
Additionally, the amount of time that the user spends touching the input device can be recorded. This information can be refined to reflect the amount of time that the user is touching the input device while a certain page from the network is displayed as the top-most page on their computer screen. This is useful in determining the amount of time that the user spends looking at a page from the network, for instance a page from the Internet. Being able to track the amount of time a user spends looking at particular pages on the Internet makes it possible to track user interest in pages and to make more accurate determinations of whether a user was likely to have viewed an advertisement on an Internet page.
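The dwell-time measurement can be sketched as an accumulator that runs only while both conditions hold: the device is touched and the page is top-most. Names are illustrative:

    #include <time.h>

    static double page_touch_seconds;  /* running total for the current page */
    static time_t interval_start;
    static int touching, page_on_top;

    void update_dwell(int now_touching, int now_on_top)
    {
        int was_counting = touching && page_on_top;
        int counting_now = now_touching && now_on_top;
        time_t now = time(NULL);

        if (!was_counting && counting_now)
            interval_start = now;                                /* open an interval   */
        else if (was_counting && !counting_now)
            page_touch_seconds += difftime(now, interval_start); /* close the interval */

        touching = now_touching;
        page_on_top = now_on_top;
    }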
The mice embodiments and the keyboard embodiment of the present invention described above are particularly useful for collecting this type of information. For the keyboard of FIG. 18, signals from touch sensors 306, 307, 309 and 311 are used to collect this type of information.
In other embodiments of the present invention, computer-executable instructions determine which hand the user uses to grip the input device. For example, since mouse 170 has side areas 184 and 186, the computer system can determine if the user is gripping the mouse with their right hand or their left hand. If the user grips mouse 170 with their right hand, side area 186 will be covered by the user's thumb. If the user grips mouse 170 with their left hand, side area 186 will not be covered by the user's thumb. By identifying which hand the user uses to grip the mouse, the computer system can identify the user's dominant hand and can allocate functions to the input device's buttons based on the user's dominant hand. Thus, if the left button on the mouse is used for click and drag functions for right-handed users, the right button on the mouse is used for click and drag functions for left-handed users. This allows both left-handed and right-handed users to use the same fingers to activate the same functions.
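A sketch of the handedness test and the resulting button remapping; the remapping helper is hypothetical:

    enum { LEFT_BUTTON, RIGHT_BUTTON };

    void assign_click_and_drag_to(int button);  /* hypothetical remapping call */

    /* Per the text: a right-handed grip of mouse 170 covers side
       area 186 with the thumb; a left-handed grip does not. */
    void configure_for_grip(int area_186_touched)
    {
        if (area_186_touched)
            assign_click_and_drag_to(LEFT_BUTTON);   /* right-handed user */
        else
            assign_click_and_drag_to(RIGHT_BUTTON);  /* left-handed user  */
    }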
In one embodiment of the present invention, a computer system has computer-executable instructions for determining if the user is touching the input device and for initiating the spinning of a disk drive when it is determined that the user is touching the input device. Thus, the disk drive would remain inactive until it is determined that the user is touching the input device, which would be an indication that the computer system may need to access the disk drive.
In another embodiment of the present invention, a computer system determines if the user is touching a headset that is capable of converting a user's speech into an electrical signal. If the system determines that the user is touching the headset, it activates a speech recognition program so that the speech recognition program processes the electrical signals produced by the headset. In other embodiments, the system only activates the speech recognition program if the user is touching a mouse. In still other embodiments, the user must touch both the headset and the mouse to activate the speech recognition program. By only activating the speech recognition system when an input device is touched, the embodiment of the invention reduces unwanted processing of speech that was not directed toward the speech recognition system.
In yet another embodiment of the present invention, a television or an Internet set-top system utilizes a remote control that includes at least one touch sensor. Such Internet set-top systems provide access to the Internet using a television as a display unit. Some Internet set-tops can also integrate television programs with Internet based information.
FIG. 33 shows one embodiment of a remote control 500 for an Internet set-top system or television system under the present invention. Remote control 500 includes a touch sensor 502, which includes a conductive film. In one embodiment of remote control 500, the remote control enters an inactive state when the user is not touching touch sensor 502. In the inactive state, remote control 500 uses less power than in its active state and thus conserves the power of the batteries in the remote control. In another embodiment of remote control 500, a speech recognition program is activated when the user contacts touch sensor 502.
In further embodiments of the present invention, a computer system suppresses processor-intensive computer-executable instructions if it determines that the user is not touching an input device. Specifically, the invention suppresses instructions that produce images on the display or that produce audio signals. The reason for suppressing these instructions is that they may be wasted, since it is likely that the user is not viewing the display if they are not touching an input device. By suppressing these processor-intensive instructions, the present invention increases the execution speed of many applications.
Although the present invention has been described with reference to particular embodiments, workers skilled in the art will recognize that changes may be made in form and detail without departing from the spirit and scope of the invention.