CROSS REFERENCE TO RELATED APPLICATIONS This application claims priority benefit of U.S. Provisional Patent Application No. 60/755,656, filed Dec. 30, 2005, entitled “TOUCH PAD WITH FEEDBACK,” which is hereby incorporated herein by reference.
This application is related to the following applications, all of which are herein incorporated by reference:
U.S. patent application Ser. No. 10/188,182, titled “TOUCH PAD FOR HANDHELD DEVICE”, filed Feb. 25, 2002;
U.S. patent application Ser. No. 10/722,948, titled “TOUCH PAD FOR HANDHELD DEVICE”, filed Nov. 25, 2003;
U.S. patent application Ser. No. 10/643,256, titled “MOVABLE TOUCH PAD WITH ADDED FUNCTIONALITY”, filed Aug. 18, 2003;
U.S. patent application Ser. No. 10/840,862, titled “MULTIPOINT TOUCHSCREEN”, filed May 6, 2004; and
U.S. patent application Ser. No. 11/115,539, titled “HAND HELD ELECTRONIC DEVICE WITH MULTIPLE TOUCH SENSING DEVICES”, filed Apr. 26, 2005.
BACKGROUND OF THE INVENTION 1. Field of the Invention
The present invention relates generally to touch pads that provide visual feedback. More particularly, the present invention relates to illuminated touch pads that use light to provide feedback.
2. Description of the Related Art
There exist today many styles of input devices for performing operations in a computer system. The operations generally correspond to moving a cursor and/or making selections on a display screen. By way of example, the input devices may include buttons or keys, mice, trackballs, touch pads, joy sticks, touch screens and the like.
Touch pads, in particular, are becoming increasingly popular because of their ease and versatility of operation as well as their declining price. Touch pads allow a user to make selections and move a cursor by simply touching an input surface via a finger or stylus. In general, the touch pad recognizes the touch and the position of the touch on the input surface, and the computer system interprets the touch and thereafter performs an action based on the touch event.
Touch pads typically include an opaque touch panel, a controller and a software driver. The touch panel registers touch events and sends these signals to the controller. The controller processes these signals and sends the data to the computer system. The software driver translates the touch events into computer events.
Although touch pads work well, improvements to their form, feel and functionality are desired. By way of example, it may be desirable to provide visual stimuli at the touch pad so that a user can better operate the touch pad. For example, the visual stimuli may be used (among others) to alert a user when the touch pad is registering a touch, alert a user where the touch is occurring on the touch pad, provide feedback related to the touch event, indicate the state of the touch pad, and/or the like.
SUMMARY OF THE INVENTION The invention relates, in one embodiment, to an illuminated input device. The illuminated input device includes an object sensing mechanism capable of sensing a user input over an input surface. The illuminated input device also includes a visual feedback system configured to illuminate the input surface in association with a user input.
The invention relates, in another embodiment, to a method of operating an input device. The method includes sensing an object over an input surface. The method also includes illuminating at least a portion of the input surface when an object is sensed.
The invention relates, in another embodiment, to a method of operating an input device. The method includes illuminating at least a portion of an input surface when an object is detected over the input surface. The method also includes adjusting the illumination when the object is moved over the input surface.
The invention relates, in another embodiment, to a method of operating an input device. The method includes detecting a user input over an input surface. The method also includes determining an input state of the input device based on the user input. The method additionally includes illuminating the input surface based on the input state of the input device. Each input state has a different illumination profile.
BRIEF DESCRIPTION OF THE DRAWINGS The invention may best be understood by reference to the following description taken in conjunction with the accompanying drawings in which:
FIG. 1 is a simplified block diagram of an input device, in accordance with one embodiment of the present invention.
FIG. 2 is a method of operating an input device, in accordance with one embodiment of the present invention.
FIG. 3 is a method of operating an input device, in accordance with one embodiment of the present invention.
FIG. 4 is a simplified diagram of an illuminated touch pad, in accordance with one embodiment of the present invention.
FIG. 5 is a simplified diagram of an illuminated touch pad, in accordance with one embodiment of the present invention.
FIG. 6A is a side view of an illuminated touch pad, in accordance with one embodiment of the present invention.
FIG. 6B is an exploded perspective view of the illuminated touch pad of FIG. 6A, in accordance with one embodiment of the present invention.
FIG. 7A is a side view of an illuminated touch pad, in accordance with one embodiment of the present invention.
FIG. 7B is a top view, in part, of the touch pad of FIG. 7A, in accordance with another embodiment of the present invention.
FIG. 8A is a side view of an illuminated touch pad, in accordance with one embodiment of the present invention.
FIG. 8B is a top view, in part, of the touch pad of FIG. 8A, in accordance with another embodiment of the present invention.
FIG. 9A is a side view of an illuminated touch pad, in accordance with one embodiment of the present invention.
FIG. 9B is a top view, in part, of the touch pad of FIG. 9A, in accordance with another embodiment of the present invention.
FIG. 10 is a diagram of an illuminated touch pad, in accordance with one embodiment of the present invention.
FIG. 11 is a diagram of a light panel that can be used in an illuminated touch pad, in accordance with another embodiment of the present invention.
FIG. 12 is a method of operating an illuminated touch pad, in accordance with one embodiment of the present invention.
FIG. 13A illustrates one implementation where an angular segment is illuminated when the user places their finger over the angular segment, in accordance with one embodiment of the present invention.
FIG. 13B illustrates one implementation where two angular segments are illuminated at the same time when two fingers are distinctly placed over the two angular segments, in accordance with one embodiment of the present invention.
FIG. 14A illustrates one implementation where illumination points adjacent and surrounding the location of the finger are illuminated when the user places their finger over the input surface, in accordance with one embodiment of the present invention.
FIG. 14B illustrates one implementation where the areas around two fingers are illuminated at the same time when the two fingers are placed over the input surface at the same time, in accordance with one embodiment of the present invention.
FIG. 15 is a method of operating an illuminated touch pad, in accordance with one embodiment of the present invention.
FIGS. 16A-16D illustrate one implementation where the illuminated portion follows the motion of the finger as it is moved across the surface of the touch pad, in accordance with one embodiment of the present invention.
FIGS. 17A-17D illustrate one implementation where the illuminated portion follows the motion of the finger as it is moved across the surface of the touch pad, in accordance with one embodiment of the present invention.
FIG. 18 is a method of operating an illuminated touch pad, in accordance with one embodiment of the present invention.
FIG. 19A is a method of determining the state of the touch pad, in accordance with one embodiment of the present invention.
FIG. 19B is a method of illuminating a touch pad, in accordance with one embodiment of the present invention.
FIGS. 20A and 20B illustrate illuminating the touch pad with a first color or intensity when the touch pad is in a first state, and illuminating the touch pad with a second color or intensity when the touch pad is in a second state, in accordance with one embodiment of the present invention.
FIG. 21 is a method of operating an illuminated touch pad, in accordance with one embodiment of the present invention.
FIGS. 22A and 22B illustrate increasing the intensity of the illumination when an object is closer or exerts increased pressure relative to the touch surface, and decreasing the intensity of the illumination when an object is further away or exerts decreased pressure relative to the touch surface, in accordance with one embodiment of the present invention.
FIG. 23 is a method of operating an illuminated touch pad, in accordance with one embodiment of the present invention.
FIG. 24 is a method of operating an illuminated touch pad, in accordance with one embodiment of the present invention.
FIG. 25 illustrates providing low intensity illumination when a touch is first detected, providing medium intensity illumination when the object is slowly moved around the input surface (e.g., low acceleration), and providing high intensity illumination when the object is quickly moved around the input surface (e.g., high acceleration), in accordance with one embodiment of the present invention.
DETAILED DESCRIPTION OF THE INVENTION The present invention pertains to improved feedback mechanisms for touch pads. One aspect relates to devices capable of illuminating the touch sensitive surface of the touch pad, not only as backlighting so that the user knows where the touch pad is located in low light conditions, but also to give other feedback related to how the touch pad is being used. Another aspect relates to methods for providing feedback at the touch pad, for example, changing intensity or color based on motion characteristics and/or pressure, providing an illumination point that follows a finger as it is moved about the touch sensitive surface, showing different states with varying levels of brightness or color, etc.
Embodiments of the invention are discussed below with reference to FIGS. 1-25. However, those skilled in the art will readily appreciate that the detailed description given herein with respect to these figures is for explanatory purposes as the invention extends beyond these limited embodiments.
FIG. 1 is a simplified block diagram of an input device 10, in accordance with one embodiment of the present invention. The input device 10 may be a standalone peripheral device that connects to a host device through wired or wireless connections or it may be integrated into a host device (e.g., hard wired). In either case, the input device 10 is configured to provide inputs to the host device. Examples of host devices include any consumer related electronic device such as computers, PDAs, media players, telephones, etc.
In order to generate inputs, as for example initiating commands, making selections or tracking, the input device 10 includes an object sensing mechanism 12 configured to detect one or more objects in close proximity to and/or in contact with an input surface 13. The object sensing mechanism 12 may be based on proximity sensing and/or touch sensing.
In the case of proximity sensing, the input surface 13 may be the surface directly underneath a proximity sensing field. The object sensing mechanism 12 generates input signals when an object such as a finger (or stylus) is moved above the input surface and within the sensing field (e.g., x and y plane), when an object holds a particular position above the surface and within the sensing field, and/or when an object moves through or in and out of the sensing field (e.g., z direction). Proximity detection may be based on technologies including but not limited to capacitive, electric field, inductive, Hall effect, reed, eddy current, magneto resistive, optical shadow, optical visual light, optical IR, optical color recognition, ultrasonic, acoustic emission, radar, heat, sonar, conductive or resistive and the like.
In the case of touch sensing, the input surface 13 may be a touch surface that is sensitive to direct physical touch. The object sensing mechanism 12 generates input signals when an object in contact with the input surface 13 is moved across the input surface (e.g., x and y plane), when an object holds a particular position on the input surface, and/or when an object taps on the input surface. Touch sensing may be based on technologies including but not limited to resistive, capacitive, infrared and surface acoustic wave. Examples of touch sensing devices that utilize these technologies include touch pads, touch screens, and the like.
To elaborate, the sensing region, i.e., the input surface or the sensing field above the input surface, is typically divided into several independent and spatially distinct sensing points, nodes or regions. The sensing points, which are typically hidden from view, are dispersed about the sensing region with each sensing point representing a different position in the sensing region. The sensing points may be positioned in a grid or a pixel array where each pixelated sensing point is capable of generating a signal. In the simplest case, a signal is produced each time an object is positioned over a sensing point. When an object is placed over multiple sensing points or when the object is moved between or over multiple sensing points, multiple signals can be generated. The sensing points generally map the sensing region into a coordinate system such as a Cartesian coordinate system, a Polar coordinate system or some other coordinate system. Furthermore, the touch sensing means may be based on single point sensing or multipoint sensing. Single point sensing is capable of distinguishing only a single object at any given time, while multipoint sensing is capable of distinguishing multiple objects at the same time.
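As a rough, hypothetical illustration of this kind of pixelated multipoint sensing, the following C sketch scans a grid of sensing points and reports every node whose signal exceeds a threshold. The grid dimensions, the threshold value, and the read_node() stub are illustrative assumptions, not details taken from this specification.

```c
/* Minimal sketch of a pixelated, multipoint sensing scan.
 * Grid size, threshold, and read_node() are assumptions. */
#include <stdio.h>

#define ROWS 8
#define COLS 8
#define TOUCH_THRESHOLD 50   /* raw counts above which a node counts as touched */

/* read_node() would query the sensor hardware; stubbed here. */
static int read_node(int row, int col) { (void)row; (void)col; return 0; }

int main(void) {
    for (int row = 0; row < ROWS; row++) {
        for (int col = 0; col < COLS; col++) {
            if (read_node(row, col) > TOUCH_THRESHOLD) {
                /* Each sensing point maps to a distinct (x, y) coordinate,
                 * so several nodes can report touches simultaneously. */
                printf("touch at node (%d, %d)\n", col, row);
            }
        }
    }
    return 0;
}
```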
The input device 10 also includes a visual feedback system 14 configured to output visual effects at the input surface 13 in association with the object sensing system 12. The visual feedback system 14 is dedicated to enhancing the operation of the input device 10 by providing visual feedback to the user when making touch or proximity inputs via the object sensing system 12. For example, the visual effects may be used to indicate the location of the input surface 13 thereby making inputting easier for the user (e.g., backlighting).
Alternatively or additionally, the visual effects may be used during and after the input event to dynamically indicate characteristics associated with the input events. The characteristics may for example include when and where and the number of inputs being made relative to the input surface 13. This type of feedback also improves inputting by providing visual cues to the user about the input device 10 as the input device 10 is used.
Alternatively or additionally, the visual effects may be used before an input event to invoke the user to perform a particular input event at the input surface 13. This type of feedback also improves inputting by helping the user make appropriate inputs or helping them learn input gestures (e.g., timing, location and movements).
In most cases, the visual effects (outputs) are linked or tied to and associated with the input events being performed. Although capable of doing so, they typically do not provide external outputs associated with events occurring outside the input device 10. That is, the visual effects typically do not operate separately as an output for any device other than the input device 10 (e.g., the visual feedback system should not be considered a separate display).
The visual feedback system 14 includes one or more visual changing elements 15. The visual changing elements 15 may be separate from or integral with the sensing elements of the object sensing system 12. In some cases, one or more of the visual changing elements 15 may be mapped, associated with or tied to one or more of the sensing nodes of the object sensing system. The number of visual changing elements tied to a particular sensing node may be less than, equal to, or more than the number of sensing nodes.
The resolution of the visual changing elements 15 can be widely varied. In one embodiment, the resolution of the visual changing elements 15 is greater than that of the sensing nodes (e.g., the number of visual changing elements is greater than the number of sensing nodes). In another embodiment, the resolution of the visual changing elements 15 is substantially equal to the resolution of the sensing nodes (e.g., the number of visual changing elements is substantially equal to the number of sensing nodes). In yet another embodiment, the resolution of the visual changing elements 15 is less than the resolution of the sensing nodes (e.g., the number of visual changing elements is less than the number of sensing nodes). The resolution generally depends on the needs of the particular input device. In some cases, high resolution is needed to create dynamic visual effects such as graphical effects. In other cases, only low resolution is required, as for example to visually change a region as large as a finger. One way to associate the two resolutions is sketched below.
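Where the two resolutions differ, a sensing node can be associated with its nearest visual changing element by simply scaling the node coordinates. The following C sketch is a minimal, hypothetical example of such a mapping; the node counts and the function name sense_to_light() are assumptions made for illustration.

```c
/* Hypothetical mapping from a higher-resolution sensing grid to a
 * lower-resolution light panel via integer coordinate scaling. */
#include <stdio.h>

#define SENSE_COLS 16
#define SENSE_ROWS 16
#define LIGHT_COLS 8     /* assumed lower-resolution light panel */
#define LIGHT_ROWS 8

static void sense_to_light(int sx, int sy, int *lx, int *ly) {
    *lx = sx * LIGHT_COLS / SENSE_COLS;   /* integer scaling of columns */
    *ly = sy * LIGHT_ROWS / SENSE_ROWS;   /* integer scaling of rows */
}

int main(void) {
    int lx, ly;
    sense_to_light(13, 5, &lx, &ly);
    printf("sensing node (13, 5) -> light element (%d, %d)\n", lx, ly);
    return 0;
}
```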
The visual feedback system 14 may be widely varied. In one embodiment, the visual feedback system 14 is a light based system that illuminates the input surface 13. In this embodiment, the visual changing elements 15 are embodied as light emitting devices. The light emitting devices may include one or more light sources, and a light distribution system for distributing the light at the input surface 13. In some cases, the light from the light sources may be diffused so that the input surface 13 emits a characteristic glow (not a precise point of light, but rather a blurred glowing effect or phosphorous glow). That is, the input surface can generate glowing special effects that may for example provide backlighting to the input surface 13 and/or provide an outline, trace or shadow of the sensed object on the input surface 13. The glowing special effects may even indicate a state of the input device, as for example when the input device is in a tracking state or gesture state.
Alternatively, the visual feedback system may be a graphically based system that generates graphics at the input surface. LCDs, OLEDs and electronic inks are examples of graphically based systems. These devices, however, can be cost prohibitive and more complex to implement when compared to light based systems.
The input device 10 also includes a controller 16 that is operatively coupled to the object sensing device 12 and the visual feedback system 14. The controller 16 monitors signals generated at the input surface 13 and sends corresponding control signals associated therewith to the host device, which interprets the signals in accordance with its programming (e.g., as input events). The controller 16 also generates visual effect commands for controlling the visual effects outputted by the visual feedback system 14. Single or multiple commands can be generated to change one, some or all of the visual changing elements at the same time. Further, the commands may be based on the signals generated via the object sensing device 12.
In one embodiment, the controller 16 may instruct the visual changing elements to change in a non-trivial manner in the region of the detected object in order to indicate a location of the object relative to the input surface 13. In another embodiment, the commands may be based on instructions from the host device. For example, the host device may instruct the input device 10 to backlight the input surface 13 or alternatively to alter the input surface in such a way as to prompt the user to perform a particular event relative to the input surface (e.g., the host device may use the input surface to teach the user how to perform a particular gesture).
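The controller's dual role described above (reporting input events to the host while commanding the visual feedback system) might be organized along the lines of the following hypothetical C sketch. Every function here is an illustrative stub rather than an interface defined by this specification.

```c
/* Hypothetical sketch of the controller loop: report the touch to the
 * host, and light the region of the detected object. All names assumed. */
#include <stdbool.h>
#include <stdio.h>

typedef struct { int x, y; bool present; } touch_t;

static touch_t poll_sensor(void) { touch_t t = {3, 4, true}; return t; }
static void report_to_host(touch_t t) { printf("host <- (%d,%d)\n", t.x, t.y); }
static void set_element(int x, int y, int intensity) {
    printf("light element (%d,%d) = %d\n", x, y, intensity);
}

int main(void) {
    touch_t t = poll_sensor();
    if (t.present) {
        report_to_host(t);            /* control signal for the host device */
        set_element(t.x, t.y, 255);   /* visual effect command: highlight the touch */
    }
    return 0;
}
```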
FIG. 2 is a method 50 of operating an input device, in accordance with one embodiment of the present invention. The input device may for example be the input device shown and described in FIG. 1. The method begins at block 52 where one or more objects are detected at a sensing region. This may for example be accomplished with the object sensing device described above. Following block 52, the method proceeds to blocks 54 and 56. In block 54, visual effects are displayed at the sensing region based on the detected objects. This may for example be accomplished with the visual feedback system described above.
In one embodiment, the visual effects are performed in the region of the detected object. For example, in the case of a light based system, the area under and/or around the detected object may be illuminated. As should be appreciated, the visual effects can be made to follow the object as it is moved around the sensing region. In fact, in some cases, the visual effects may include a leading edge, a body, and/or a trailing edge. The leading edge indicates where the object is directed, the body indicates the current location of the object, and the trailing edge indicates where the object has been.
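A leading edge, body and trailing edge could, for example, be derived from the last two reported positions, extrapolating the direction of motion for the leading edge. The sketch below is one speculative way to do this; the coordinates, intensity values and set_element() call are all assumed for illustration.

```c
/* Hypothetical trail effect: full intensity at the body, dimmed at the
 * trailing edge, and a mid-intensity leading edge extrapolated from the
 * last two positions. */
#include <stdio.h>

static void set_element(int x, int y, int intensity) {
    printf("element (%d,%d) = %d\n", x, y, intensity);
}

static void draw_trail(int px, int py, int cx, int cy) {
    int lead_x = cx + (cx - px);      /* extrapolate the direction of motion */
    int lead_y = cy + (cy - py);
    set_element(px, py, 64);          /* trailing edge: where the object was */
    set_element(cx, cy, 255);         /* body: current location */
    set_element(lead_x, lead_y, 128); /* leading edge: where it is headed */
}

int main(void) {
    draw_trail(2, 2, 3, 2);           /* e.g., a finger moving to the right */
    return 0;
}
```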
In another embodiment, the visual effects are performed to indicate the state of the object sensing event. For example, if one object is detected, a first visual effect may be performed, and if multiple objects are detected, a second visual effect may be performed. This may be beneficial in cases where single objects are used for tracking and multiple objects are used for gesturing. See for example U.S. patent application Ser. No. 10/903,964, which is herein incorporated by reference.
In block 56, control signals are generated based on the detected objects. This may for example be accomplished with the controller described above. The signals are reported to a host device as an input event, and the host device interprets the signals in accordance with its programming.
FIG. 3 is a method 70 of operating an input device, in accordance with one embodiment of the present invention. The input device may for example be the input device shown and described in FIG. 1. The method begins at block 72 where visual effects are displayed at the sensing region. This may for example be accomplished with the visual feedback system described above.
In one embodiment, the visual effects are based on a control signal from the host device. For example, the host device may instruct the visual feedback system to output visual effects to encourage a user to place an object at a particular location at the input surface or to perform a gesture during a particular operation in the host device (e.g., training sequence).
Following block 72, the method proceeds to blocks 74 and 76. In block 74, the sensing region is monitored. This may for example be accomplished with the object sensing device described above. In block 76, control signals are generated when objects are detected in the sensing region. This may for example be accomplished with the controller described above. The signals may be used by the host device as an input event.
FIG. 4 is a simplified diagram of a touch pad 100, in accordance with one embodiment of the present invention. In this embodiment, the touch pad 100 includes an optically transmissive touch sensing device 102 disposed over a light panel 104. Both the touch sensing device 102 and the light panel 104 communicate with a controller 106 that monitors touch inputs on the touch sensitive surface 108 of the touch sensing device 102 and that directs the light panel 104 to emit light in the direction of the touch sensing device in a controlled manner.
The touch sensing device may be widely varied. The touch sensing device may for example be selected from any of those used for touch screens. An example of a touch screen that may be used can be found in U.S. patent application Ser. No. 10/840,862, which is herein incorporated by reference.
The light panel may also be widely varied. In one embodiment, the light panel is a pixelated light device that includes a plurality of light sources that are distributed over an extended area such as the touch sensitive surface of the touch sensing device. The light panel may include a plurality of light emitting diodes (LEDs) that are laid out in a matrix such as rows and columns. Any number of LEDs may be used. The number generally depends on the desired resolution of the light panel. In the simplest case, LEDs are placed next to or adjacent one another in rows and columns on a PCB that is sized similarly to the touch sensing device (e.g., covers the same area).
FIG. 5 is a simplified diagram of a touch pad 120, in accordance with one embodiment of the present invention. In this embodiment, the touch pad 120 includes an opaque or alternatively an optically transmissive touch sensing device 122 disposed below a light panel 124. Both the touch sensing device 122 and the light panel 124 communicate with a controller 126 that monitors touch inputs on the touch sensitive surface 128 of the touch sensing device 122 and that directs the light panel 124 to emit light in a controlled manner.
The touch sensing device may be widely varied. The touch sensing device may for example be selected from any of those used for touch pads or touch screens. Examples of touch pads that may be used can be found in U.S. patent application Ser. Nos. 10/188,182, 10/722,948 and 10/643,256, all of which are herein incorporated by reference.
The light panel may also be widely varied. Unlike the light panel discussed in FIG. 4, this light panel needs to allow touch sensing to occur therethrough. The light panel may further need to be somewhat diffused to hide the electronics located underneath the light panel. In one embodiment, the light panel includes one or more light sources and a light distribution mechanism for distributing the light from the light source over an extended area such as the touch sensitive surface of the touch sensing device. The light distribution mechanism may include for example light pipes or light guides that allow the light sources to be placed away from the sensing field. In one embodiment, the light distribution mechanism is formed from a dielectric material so that touch sensing can occur therethrough without impediments (e.g., capacitance). By way of example, transparent or semi-transparent plastic materials may be used.
FIGS. 6A and 6B are diagrams of an illuminable touch pad 150, in accordance with one embodiment of the present invention. The touch pad 150 includes a translucent or semi-translucent touch screen 152 and a pixelated light panel 154 disposed below the touch screen 152. The touch screen 152 is divided into several independent and spatially distinct sensing points, nodes or regions. The sensing points, which are hidden from view (transparent), are dispersed about the touch screen 152 with each sensing point representing a different position on the surface of the touch screen (or touch screen plane). The sensing points may be positioned in a grid or a pixel array where each pixelated sensing point is capable of generating a signal. In the simplest case, a signal is produced each time an object is positioned over a sensing point. When an object is placed over multiple sensing points or when the object is moved between or over multiple sensing points, multiple signals can be generated.
In one embodiment, the touch screen 152 includes a plurality of capacitance sensing nodes. The capacitive sensing nodes may be widely varied. For example, the capacitive sensing nodes may be based on self-capacitance or mutual capacitance. In self-capacitance, the "self" capacitance of a single electrode is measured, as for example relative to ground. In mutual capacitance, the mutual capacitance between at least first and second electrodes is measured. In either case, each of the nodes works independently of the other nodes so as to produce simultaneously occurring signals representative of different points on the touch screen 152.
In order to produce a transparent touch screen 152, the capacitance sensing nodes may be formed with a transparent conductive medium such as indium tin oxide (ITO).
In self-capacitance sensing arrangements, the transparent conductive medium is patterned into spatially separated electrodes and traces. Each of the electrodes represents a different coordinate and the traces connect the electrodes to a capacitive sensing circuit. The coordinates may be associated with a Cartesian coordinate system (x and y), a Polar coordinate system (r, θ) or some other coordinate system. During operation, the capacitive sensing circuit monitors changes in capacitance that occur at each of the electrodes. The positions where changes occur and the magnitude of those changes are used to help recognize the touch events. A change in capacitance typically occurs at an electrode when a user places an object such as a finger in close proximity to the electrode, i.e., the object steals charge thereby affecting the capacitance.
In mutual capacitance, the transparent conductive medium is patterned into a group of spatially separated lines formed on two different layers. Driving lines are formed on a first layer and sensing lines are formed on a second layer. Although separated by being on different layers, the sensing lines traverse, intersect or cut across the driving lines thereby forming a capacitive coupling node. The manner in which the sensing lines cut across the driving lines generally depends on the coordinate system used. For example, in a Cartesian coordinate system, the sensing lines are perpendicular to the driving lines thereby forming nodes with distinct x and y coordinates. Alternatively, in a polar coordinate system, the sensing lines may be concentric circles and the driving lines may be radially extending lines (or vice versa). The driving lines are connected to a voltage source and the sensing lines are connected to a capacitive sensing circuit. During operation, a current is driven through one driving line at a time, and because of capacitive coupling, the current is carried through to the sensing lines at each of the nodes (e.g., intersection points). Furthermore, the sensing circuit monitors changes in capacitance that occur at each of the nodes. The positions where changes occur and the magnitude of those changes are used to help recognize the multiple touch events. A change in capacitance typically occurs at a capacitive coupling node when a user places an object such as a finger in close proximity to the capacitive coupling node, i.e., the object steals charge thereby affecting the capacitance.
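The mutual-capacitance scan just described can be summarized in C: drive one line at a time, read each sense line at the coupled node, and compare the reading against an untouched baseline. The sketch below is a hypothetical outline with stubbed hardware access; the line counts, the delta threshold, and the drive()/read_sense() functions are assumptions, not details from this specification.

```c
/* Hypothetical mutual-capacitance scan loop with stubbed hardware. */
#include <stdio.h>

#define DRIVE_LINES 8
#define SENSE_LINES 8
#define TOUCH_DELTA 30   /* assumed drop in coupled signal that indicates a finger */

static int baseline[DRIVE_LINES][SENSE_LINES];

static void drive(int line)      { (void)line; /* apply stimulus to one driving line */ }
static int  read_sense(int line) { (void)line; return 100; /* raw coupled counts */ }

int main(void) {
    /* Capture an untouched baseline at startup. */
    for (int d = 0; d < DRIVE_LINES; d++) {
        drive(d);
        for (int s = 0; s < SENSE_LINES; s++)
            baseline[d][s] = read_sense(s);
    }
    /* Scan: a finger steals charge, so the coupled signal drops at its node. */
    for (int d = 0; d < DRIVE_LINES; d++) {
        drive(d);
        for (int s = 0; s < SENSE_LINES; s++) {
            int raw = read_sense(s);
            if (baseline[d][s] - raw > TOUCH_DELTA)
                printf("touch at node (drive %d, sense %d)\n", d, s);
        }
    }
    return 0;
}
```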
Referring now to the light panel 154, the light panel 154 includes a light emitting surface that is typically divided into several independent and spatially distinct illumination points, nodes or regions 156. The illumination points 156 are dispersed about the light emitting surface with each illumination point 156 representing a different position in the light emitting surface. The illumination points 156 may be positioned in a grid or a pixel array where each pixelated illumination point is capable of emitting light. The illumination points 156 generally map the illumination region into a coordinate system such as a Cartesian coordinate system, a Polar coordinate system or some other coordinate system. In some cases, the illumination points may be laid out in a pattern similar to the sensing points of the touch screen 152 (e.g., same coordinate system, same number of points). In other cases, the illumination points 156 may be laid out in a pattern that is different than the sensing points of the touch screen 152 (e.g., different coordinate system, different number of points).
The light panel 154 may be widely varied. In the illustrated embodiment, the illumination points 156 are embodied as individual light emitting diodes that are placed in a grid like manner thereby forming a pixelated illumination area, i.e., each of the light emitting diodes forms an illumination node. The grid may be oriented in rows and columns (x and y) or angular/radial segments (as shown). Furthermore, the LEDs are attached to the printed circuit board 160 and operatively coupled to the controller 158 located on the backside of the printed circuit board 160.
The touch screen 152 is also operatively coupled to the controller 158, as for example using a flex circuit attached to the printed circuit board 160. During operation, the controller 158 monitors the changes in capacitance and generates control signals based on these changes. The controller 158 also separately adjusts the intensity of each of the LEDs to illuminate portions or all of the touch screen 152 in a controlled manner. That is, the light panel 154 can produce any number of various light effects by selectively controlling the intensities of the LEDs via the controller 158. Because the touch screen 152 is translucent, the light can be seen through the touch screen 152.
In some cases, the touch pad 150 may further include a light diffuser 162. The light diffuser 162 is configured to diffuse the light being emitted by the light panel 154. This may be done to normalize the light intensity of the LEDs, to produce a characteristic glow at the input surface, and/or to hide the physical parts of the touch pad located underneath the light diffuser.
Although the light diffuser 162 can include color components, in most cases the light diffuser appears as a white or semi transparent white material. When embodied with white elements, the light diffuser 162 takes on the color of light emitted by the LEDs. Generally speaking, the light diffuser 162 is positioned somewhere between the LEDs and the input surface. More particularly, the light diffuser 162 can be placed above, within or underneath the touch screen. For example, a light diffuser 162 can be placed on the upper surface, lower surface, or in the layers of the touch screen. Alternatively or additionally, the light diffuser 162 may be integrated with or attached to the light panel or even be a separate component disposed between the light panel 154 and the touch screen 152 (as shown).
The light diffuser 162 may be embodied in many different forms including for example surface treatments on one or more layers of the touch screen, additives in one or more layers of the touch screen, an additional layer in the touch screen, rigid plastic inserts disposed above or below the touch screen, flexible labels disposed above or below the touch screen, and the like. The light diffuser 162 may even be the ITO coating used to form the sensing components of the touch screen (e.g., the greater the density of the ITO coating, the greater the amount of light that is diffused).
In the illustrated embodiment, the light diffuser 162 is a plastic insert that includes light scattering additives. Furthermore, the light diffuser 162 is disposed between the light panel 154 and the touch screen 152.
It should be pointed out that LEDs offer many advantages over other light sources. For example, LEDs are relatively small devices that are energy efficient and long lasting. LEDs also run relatively cool and are low in cost. Furthermore, LEDs come in various colors such as white, blue, green, red and the like. The pixelated LEDs may be configured to emit the same color of light or different colors of light.
Furthermore, although shown as single LEDs, it should be noted that the LEDs may be embodied as an integrated array of LEDs that are grouped together, as for example an array of red, blue, green and/or white LEDs that cooperate to produce a resultant color (via color mixing). The resultant color may be a wide range of colors, as for example a majority of the colors from the color spectrum. During operation, the controller can produce almost any color by adjusting the intensity of each of the colored LEDs. By way of example, in order to produce the highest shade of red, the intensities of the green and blue are reduced to zero intensity and the intensity of the red is increased to its peak intensity. The highest shades of green and blue can be implemented in a similar manner. In addition, in order to produce a shade of red and green, the intensities of the green and red are increased to levels above zero intensity while the intensity of blue is reduced to zero intensity. Shades of green and blue, and of blue and red, can be implemented in a similar manner. Furthermore, in order to produce shades of white, the intensities of the red, green and blue are increased to the same levels above zero intensity, or alternatively the red, green and blue LEDs are turned off and a white LED is turned on.
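The color-mixing scheme described above amounts to setting a per-channel drive intensity (e.g., a PWM duty) for each colored LED in the array. The following hypothetical sketch reproduces the examples from the text; the set_channel() function and the 0-255 intensity range are illustrative assumptions.

```c
/* Hypothetical RGB color mixing: each channel's intensity sets its
 * contribution to the resultant color. */
#include <stdio.h>

static void set_channel(const char *color, int intensity /* 0..255 */) {
    printf("%s channel -> %d\n", color, intensity);
}

static void mix_color(int r, int g, int b) {
    set_channel("red", r);
    set_channel("green", g);
    set_channel("blue", b);
}

int main(void) {
    mix_color(255, 0, 0);     /* highest shade of red: green and blue at zero */
    mix_color(128, 128, 0);   /* shade of red and green: blue at zero */
    mix_color(128, 128, 128); /* shade of white via equal intensities */
    return 0;
}
```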
Although the integrated LED array is described as using the three primary colors, it should be noted that this is not a limitation and that other combinations may be used. For example, the integrated LED array may be configured to include only two of the primary colors or it may only include LEDs of a single color.
When the LEDs are capable of generating any color, unique input surfaces can be produced. By way of example, the touch pad can produce an input surface with rainbow stripes, different colored spots, different colored quadrants or sections, and the like. The touch pad can also produce an input surface that has a dynamically changing pattern. This is typically accomplished by activating distinct LEDs at different times or by adjusting the intensities of distinct LEDs at different times.
FIGS. 7A and 7B are diagrams of a touch pad 200, in accordance with another embodiment of the present invention. The touch pad 200 includes various layers including a light panel 202, an electrode layer 204 and a printed circuit board 206 (PCB). The electrode layer 204 is positioned on the PCB 206 and the light panel 202 is placed above the electrode layer 204.
The electrode layer 204 includes a plurality of spatially separated electrodes 205 configured to detect changes in capacitance at an upper surface 208 of the light panel 202. Each of the electrodes 205 is operatively coupled to a controller 210 located on the backside of the printed circuit board 206. During operation, the controller 210 monitors the changes in capacitance and generates control signals based on these changes.
The light panel 202 includes a light distribution panel 212 disposed over the electrode layer 204 and one or more side mounted light emitting diodes 214 disposed around the periphery of the light distribution panel 212. The side mounted light emitting diodes 214 are each configured to direct light into a different portion of the light distribution panel 212. Alternatively, a light pipe may be used to direct light from an LED located away from the light distribution panel 212. The light distribution panel 212 is configured to redirect the light made incident thereon via the light emitting diodes 214 to an upper surface of the light distribution panel 212 thereby illuminating the touch pad surface 201. The light distribution panel 212 is also configured to serve as a dielectric layer that covers the electrode layer 204 in order to help form the capacitance sensing circuit of the touch pad 200.
As shown, the LEDs 214 are attached to the printed circuit board 206 and operatively coupled to the controller 210 located on the backside of the printed circuit board 206. During operation, the controller 210 selectively adjusts the intensity of each of the LEDs 214 to illuminate portions of or all of the light distribution panel 212 in a controlled manner.
Although shown as single LEDs, the LEDs may be embodied as an array of LEDs, as for example an array of red, blue and green LEDs. Arrayed LEDs such as these may be capable of generating most colors in the color spectrum.
The light distribution panel 212 can be widely varied. In one embodiment, the light distribution panel 212 is a separate component disposed within the housing 211 of the touch pad 200. For example, the light distribution panel 212 is inserted within an opening in the housing 211 (as shown). In this arrangement, it may be preferable to place the upper surface of the light distribution panel 212 flush with or recessed below the outer surface of the housing 211. Furthermore, in order to provide a tight fit that limits dust and particles from entering the touch pad 200, the light distribution panel 212 may include edges that extend over the outer surface of the housing 211.
In another embodiment, the light distribution panel 212 is an integral part of the housing 211. For example, the housing 211 is formed from a transparent or semi-transparent material. This particular embodiment provides a continuous surface without gaps or breaks, which can be aesthetically pleasing to the user.
In either embodiment, the light distribution panel 212 typically includes a portion 213 that extends below the inner surface of the housing 211. This portion 213 provides a light receiving area at the sides of the light distribution panel 212 for receiving light emitted by the side mounted LEDs 214.
The light distribution panel 212, which can be formed from a single or multiple layers, is typically formed from translucent or semi-translucent dielectric materials including for example plastic materials such as polycarbonate, acrylic or ABS plastic. It should be appreciated, however, that these materials are not a limitation and that any optically transmittable dielectric material may be used.
In most cases, the light distribution panel 212 or some other component of the touch pad 200 includes light diffusing elements to diffuse the light made incident thereon in order to normalize the light intensity of the LEDs, to produce a characteristic glow at the input surface, and/or to hide the physical parts of the touch pad located underneath the input surface. The light diffusing elements may be provided on an inner surface, an outer surface, or they may be embedded inside the light distribution panel 212. Additionally or alternatively, the light diffusing elements can also be applied to a separate optical component disposed above the light distribution panel 212.
In one embodiment, the light diffusing element is an additive disposed inside the light distribution panel 212. For example, the light distribution panel 212 may include a plurality of light scattering particles dispersed between the top and bottom surfaces of the light distribution panel. When the light is made incident on the inner surface, it is transmitted through the light distribution panel 212 until it intersects a light scattering particle disposed inside the panel. After intersecting the light scattering particle, the light is scattered outwards in a plurality of directions, i.e., the light is reflected off the surface of and/or refracted through the light scattering particle thereby creating the characteristic glow. By way of example, the light scattering particles may be formed from small glass particles or white pigments. Furthermore, by changing the amount of light scattering particles disposed in the panel, the characteristics of the glow can be altered, i.e., the greater the number of particles, the greater the light scattering.
In another embodiment, the light diffusing element is a layer, coating and/or texture that is applied to the inner, side or outer surfaces of the panel 212. For example, the panel 212 may include a light scattering coating or a light scattering texture disposed on the side or outer surface of the panel. By way of example, the light scattering coating may be a paint, film or spray coating. In addition, the light scattering texture may be a molded surface of the wall or a sandblasted surface of the panel. When light is made incident on the inner or outer surface, it intersects the light scattering coating or texture applied on the surface. After intersecting the light scattering coating or the light scattering texture, the light is scattered outwards in a plurality of directions, i.e., the light is reflected off the surface and/or refracted through the light scattering particle thereby creating a characteristic glow.
In the illustrated embodiment, the light diffusing element is embodied as a light diffusing label 216. The light diffusing label 216 is at least adhered to the top surface of the light distribution panel 212. In some cases, the label 216 may even extend over and be adhered to a top edge of the housing wall 211. In cases such as this, the light diffusing label 216 may even be placed in a pocket formed by recesses 217 at the top edge of the housing wall 211 in order to make the top surface of the light diffusing label 216 flush with the external surface of the housing wall 211. The label 216 can have a graphic printed thereon, can have multiple colors and can have varying thickness to assist in controlling the intensity and color of the illumination. The label 216 may be formed from transparent or semitransparent dielectric materials such as Mylar or polycarbonate or any other dielectric material that is thin, optically transmittable and includes some sort of light diffusing means.
Further, the light distribution panel 212 may be configured as a single node, or it may be broken up into a plurality of distinct nodes 218, each of which includes its own dedicated light emitting diode for individual illumination thereof. During operation, when light is released by a light emitting diode 214, the light is made incident on the side of the light distribution panel 212 at the node 218. The node 218 redirects and transmits the light from its side to an upper surface of the node 218. In order to prevent light bleeding between adjacent nodes 218, each node 218 may be optically separated by a reflecting or masking region disposed therebetween.
Each of the nodes 218 may be formed from a solid piece of material or it may be formed from a combination of elements. In one embodiment, each of the nodes 218 is formed from a translucent or semi-translucent plastic insert that when combined with the other inserts forms the light distribution panel 212. In another embodiment, each of the nodes 218 is formed from a bundle of fiber optic strands.
The configuration of the nodes 218 including layout, shape and size may be widely varied. With regards to layout, the nodes 218 may be based on a Polar or Cartesian coordinate system (or some other coordinate system). With regards to shape, any shape including for example standard shapes such as circles, squares, rectangles and triangles may be used. With regards to size, the nodes 218 may be larger than a finger or stylus, about the same size as a finger or stylus, or smaller than a finger or stylus. In one embodiment, the nodes 218 are set up similarly to the electrodes 205 of the electrode layer 204, i.e., the nodes 218 have generally the same layout, number, size and shape as the electrodes 205. In another embodiment, the nodes are set up differently. For example, the nodes 218 may have a different layout, different number, different shape and/or different size when compared to the electrodes 205.
In the illustrated embodiment, the touch pad 200 is circular and the nodes 218 are embodied as distinct angular segments (e.g., pie shaped). Any number of angular segments may be used. The number generally depends on the desired resolution of the illuminating surface. In this particular embodiment, the resolution of the light panel 202 is low and therefore each of the angular segments covers a plurality of sensing electrodes 205.
In one embodiment, all the LEDs 214 are powered at the same time to produce a fully illuminated touch pad 200. This may be analogous to backlighting. In another embodiment, the LEDs 214 are powered in accordance with the capacitance changes measured by each of the electrodes 205. For example, the segments above the detected area may be illuminated while the segments above the undetected areas are turned off. This provides an indication to the user as to their exact location on the touch surface. In yet another embodiment, selected segments may be illuminated to encourage a user to place their finger in a particular area of the touch pad. A sketch of the second approach appears below.
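Because each angular segment covers several sensing electrodes in this embodiment, illuminating the segment above the detected area reduces to dividing a touched electrode's index down to the segment that sits above it, as in the hypothetical C sketch below. The electrode and segment counts and the set_segment() function are assumptions made for illustration.

```c
/* Hypothetical per-segment feedback: light only the angular segment
 * that covers the touched electrode. */
#include <stdio.h>

#define NUM_ELECTRODES 16
#define NUM_SEGMENTS   8   /* assumed low-resolution light panel */

static void set_segment(int seg, int on) {
    printf("segment %d -> %s\n", seg, on ? "lit" : "off");
}

int main(void) {
    int touched_electrode = 11;   /* index from the capacitance measurement */
    int electrodes_per_segment = NUM_ELECTRODES / NUM_SEGMENTS;
    for (int seg = 0; seg < NUM_SEGMENTS; seg++)
        set_segment(seg, seg == touched_electrode / electrodes_per_segment);
    return 0;
}
```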
Although only a single light panel 202 is shown, it should be appreciated that this is not a limitation and that additional light panels may be used. For example, one or more light panels may be further positioned underneath the first light panel described above. In one embodiment, each light panel in a group of light panels is configured to distribute a different color. For example, three light panels including a red, a green and a blue light panel may be used. Using this arrangement, different colored segments may be produced. By controlling their intensity, almost any color can be produced (mixed) at the touch surface. In another embodiment, each light panel in the group of light panels may have a different orientation. For example, the angularly segmented nodes of the light distribution panel may be rotated relative to the other light panels so that they are placed at different positions about an axis (e.g., partially overlapping and angularly offset). Using this arrangement, leading and trailing illumination can be produced.
FIGS. 8A and 8B are diagrams of an illuminated touch pad 250, in accordance with one embodiment of the present invention. The touch pad 250 is similar to the touch pad 200 shown in FIGS. 7A and 7B in that it includes a light panel 252, an electrode layer 254 and a PCB 256. It differs from the touch pad of FIGS. 7A and 7B in that the light panel 252 additionally includes inner side mounted LEDs 258 to go along with the outer side mounted LEDs 260. It also differs from the touch pad of FIGS. 7A and 7B in that the light distribution panel 262 of the light panel 252 breaks up each of the angularly segmented nodes of FIGS. 7A and 7B into a pair of radially positioned nodes including inner nodes 264 and outer nodes 266 that cooperate with the respective LEDs 258 and 260. As a result, each of the nodes 264 and 266 represents both an angular and a radial position in the plane of the touch pad 250. This works particularly well in touch pads with a circular shape. In addition, unlike the touch pad of FIGS. 7A and 7B, the touch pad 250 also includes a mechanical button 268 at the center of the touch pad 250. The mechanical button 268 may be illuminated with one or more center LEDs 270.
In this embodiment, both the light distribution panel 262 and the electrode layer 254 have an annular shape that creates a void at the center of the touch pad 250. The void provides a space for placement of the extra light emitting diodes 258 and 270 as well as the mechanical button 268. As shown, the inner LEDs 258 are disposed along the inner periphery of the light distribution panel 262 next to distinct inner nodes 264 of the light distribution panel 262. Furthermore, the outer LEDs 260 are disposed along the outer periphery of the light distribution panel 262 next to distinct outer nodes 266 of the light distribution panel 262.
Furthermore, the center LED 270 is disposed near the center of the touch pad 250 underneath a translucent button cap 272 of the mechanical button 268. The button cap 272 is movably trapped between a diffusing label layer 274 and a spring loaded switch 276 that is also located near the center of the touch pad 250. When the button cap is pressed, it moves against the actuator of the spring loaded switch thereby generating a button event.
In the illustrated embodiment, the electrode layer 254, the LEDs 258, 260 and 270 and the mechanical switch 276 are all attached to the printed circuit board 256, and operatively coupled to a controller 280 located on the backside of the PCB 256. During operation, the controller 280 monitors the signals generated at the electrode layer 254 and the switch 276, and provides commands for controlling the LEDs 258, 260 and 270.
FIGS. 9A and 9B are diagrams of an illuminated touch pad 300, in accordance with one embodiment of the present invention. The touch pad 300 is similar to the touch pad shown in FIGS. 8A and 8B in that it includes a light panel 302, an electrode layer 304 and a PCB 306. It differs from the touch pad of FIGS. 8A and 8B in that the light panel 302 includes a second set of inner LEDs 310 to go along with a first set of inner LEDs 308 and a second set of outer LEDs 314 to go along with a first set of outer LEDs 312. The first sets are located above the second sets, and may be masked from one another to prevent bleed through.
It also differs from the touch pad of FIGS. 8A and 8B in that the light distribution panel 316 of the light panel 302 further breaks up each of the angularly segmented nodes into four radially positioned nodes including inner nodes 318, inner/middle nodes 320, outer/middle nodes 322, and outer nodes 324 that optically cooperate with their respective LEDs. Particularly, the first set of inner LEDs 308 are positioned to illuminate the inner nodes 318, the second set of inner LEDs 310 are positioned to illuminate the inner/middle nodes 320, the first set of outer LEDs 312 are positioned to illuminate the outer nodes 324, and the second set of outer LEDs 314 are positioned to illuminate the outer/middle nodes 322.
In order to transmit light from the second set of inner LEDs 310 to the inner/middle nodes 320, the inner/middle nodes 320 may include a light transmitting portion 326 that extends underneath the inner nodes 318. In most cases, the light transmitting portions 326 are optically separated from the inner nodes 318 so that the light does not bleed into the inner nodes 318 when the light is passing through the light transmitting portions 326.
In order to transmit light from the second set of outer LEDs 314 to the outer/middle nodes 322, the outer/middle nodes 322 may include a light transmitting portion 328 that extends underneath the outer nodes 324. In most cases, the light transmitting portions 328 are optically separated from the outer nodes 324 so that the light does not bleed into the outer nodes 324 when the light is passing through the light transmitting portions 328. Alternatively, a light pipe may be used.
The light distribution panel 316 may be embodied in a variety of ways. In one embodiment, the light distribution panel 316 includes an outer translucent ring disposed over and around a stepped outer/middle translucent ring, and an inner translucent ring disposed over and around a stepped inner/middle translucent ring that is adjacent the outer/middle ring. A masking layer may be placed between the various rings to prevent bleed through.
Although the touch pad has been described with only four radial segments, it should be noted that any number of radial segments may be used, as well as any number of angular segments, to obtain the desired resolution.
FIG. 10 is a diagram of an illuminated touch pad 350, in accordance with one embodiment of the present invention. The touch pad 350 includes various layers including a diffuser layer 352, an opaque electrode layer 354, a PCB 356 and a light panel 358. The light panel 358 may be embodied as a light panel including pixelated light sources such as the one described in FIG. 6 or a light panel including a light distribution panel and side mounted light sources such as the one described in FIG. 7.
To elaborate, the light diffuser layer 352 is disposed over the electrode layer 354, the opaque electrode layer 354 is disposed on the PCB 356, and the PCB 356 is disposed over the light panel 358. In order to illuminate the light diffuser layer 352, the electrode layer 354 includes one or more openings 360 through which light may pass when emitted from the light panel 358. The openings 360 may be the gaps that would normally be formed between the spatially separated electrodes or they may be predetermined gaps around which the spatially separated electrodes are positioned when printed on the PCB 356. Furthermore, in order to allow light to pass through the PCB 356, the PCB 356 either is formed from a translucent material or it also includes one or more openings 362 that correspond with the openings 360 of the electrode layer 354. Moreover, the light panel 358 is typically laid out similarly to the openings 360/362 so that each of the openings 360/362 includes an individual light source 364 of the light panel 358. During operation, the light emitted from each of the light sources 364 travels through the PCB 356 and through the electrode layer 354 where it illuminates either the diffuser layer 352 or a button cap 366.
FIG. 11 is a diagram of a light panel 400 that can be used in an illuminated touch pad, in accordance with another embodiment of the present invention. The light panel may, for example, correspond to any of those described above. The light panel 400 includes a diffuser panel 402, one or more light multiplexers 404 and a controller 406. The diffuser panel 402 is configured to diffuse light as described above. The light multiplexer 404, which is operatively coupled to the controller 406 and in optical communication with the diffuser panel 402, is configured to illuminate the diffuser panel 402 in a controlled and pixelated manner via commands from the controller 406.
The light multiplexer 404 includes a single light emitting device 408 and a plurality of light pipes 410 with corresponding light switches 412. Each light pipe 410 may for example include one or more optical fibers, and the light emitting device 408 may be embodied in many different forms including for example one or more individual LEDs or one or more LED arrays.
The first end of each light pipe 410 is optically connected to a different point, node or region of the diffuser panel 402. The light pipes 410 can therefore form a pixilated pattern of illumination points, nodes or regions across the light diffuser panel 402. By way of example, the position of the light pipes 410 may be based on Cartesian coordinates, polar coordinates, or some other coordinate system. The second and opposite end of each of the light pipes 410 is optically connected to a distinct light switch 412. The light switches 412 are therefore dedicated to a particular illumination point, node or region of the diffuser panel 402. Further, the light switches 412 are all in optical communication with the light emitting device 408. In some cases, the light emitting device 408 extends across the light switches 412. In other cases, the light emitted by the light emitting device 408 is focused onto the various light switches 412 via a lens or light guide 414.
Furthermore, the light switches 412 and light emitting device 408 are operatively coupled to the controller 406. During operation, the controller 406 selectively controls the light emitted by the light emitting device 408 (e.g., color and intensity), and at the same time selectively controls the opening and closing of the light switches 412. As such, the illumination provided at the diffuser panel 402 can be controlled in a pixilated manner using a single light emitting device 408. Any number of switches can be opened or closed at any particular point in time to provide the desired illumination pattern (by opening and closing different light switches, various patterns can be created). When the light is turned on and a light switch is opened, light is allowed to pass through the light switch into the associated light pipe, which carries the light from the light switch to a distinct illumination point, node or region of the diffuser panel. When the light is turned on and a light switch is closed, light is blocked from entering the light pipe and therefore no illumination is provided at the corresponding point, node or region of the diffuser panel.
It should be noted that the multiplexer can include any number of switches, and the light panel can include any number of multiplexers to arrive at the desired resolution of the light panel.
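By way of illustration only, the following is a minimal sketch of the multiplexer control model described above. The class and method names (LightMultiplexer, set_pattern, and so on) are assumptions made for illustration and are not part of this disclosure; the sketch shows how a single shared light emitting device can still produce arbitrary pixilated patterns through the switches.

```python
# A minimal sketch of the multiplexer control described above. All names
# are illustrative assumptions, not part of the disclosure.

class LightMultiplexer:
    """One shared light emitting device feeding N light switches/pipes."""

    def __init__(self, num_switches):
        self.num_switches = num_switches
        self.switch_open = [False] * num_switches  # open = light passes
        self.color = (0, 0, 0)                     # RGB of the shared source
        self.intensity = 0.0                       # 0.0 .. 1.0

    def set_source(self, color, intensity):
        # The controller sets the color/intensity of the single source;
        # every open switch then emits this same light at its diffuser node.
        self.color = color
        self.intensity = intensity

    def set_pattern(self, open_indices):
        # Open only the listed switches; all other diffuser nodes stay dark.
        wanted = set(open_indices)
        self.switch_open = [i in wanted for i in range(self.num_switches)]

    def node_output(self, i):
        # Light reaching diffuser node i: the source light if switch i is
        # open, otherwise none.
        if self.switch_open[i]:
            return (self.color, self.intensity)
        return ((0, 0, 0), 0.0)

# Example: illuminate nodes 3, 4 and 5 in dim orange.
mux = LightMultiplexer(num_switches=16)
mux.set_source(color=(255, 128, 0), intensity=0.4)
mux.set_pattern([3, 4, 5])
print(mux.node_output(4))  # ((255, 128, 0), 0.4)
```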
FIG. 12 is a method 450 of operating an illuminated touch pad, in accordance with one embodiment of the present invention. The method includes at least blocks 452 and 454. In block 452, an object is detected over an input surface of the touch pad. This may, for example, be accomplished with a capacitance sensing device. In block 454, at least a portion of the input surface proximate the location of the detected object is illuminated. This may be accomplished with a light panel disposed above or below the touch pad. As a result, the user will be informed where the object is located within the sensing plane at all times.
In one embodiment, the input surface is broken up into illumination regions, and whichever region is closest to the detected object is illuminated. By way of example, and referring to FIG. 13A, if the user places their finger over a single angular segment of the distribution panel, that particular angular segment is illuminated. If the user simultaneously places their finger over multiple segments, one of two things may occur. In one implementation, both segments are illuminated. In another implementation, only one of the segments is illuminated. In the latter case, a decision may be made as to which segment is the intended segment.
In another embodiment, the input surface is broken up into illumination nodes or points (pixilated), and those points contained within and/or surrounding the detected object area are illuminated. In one implementation, at least the area adjacent the object is illuminated. By way of example, and referring to FIG. 14A, if the user places their finger over the input surface, illumination points adjacent and surrounding the location of the finger are illuminated. In some cases, the illumination points are only those points next to the finger (e.g., a halo). In other cases, the illuminated points extend away from the finger, as for example in a star-like configuration.
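By way of illustration only, the following sketch shows both mappings described above: selecting the nearest angular segment of a circular pad (the region-based embodiment) and collecting the illumination points within a radius of the touch (the pixilated, halo embodiment). The names, segment count, and geometry are assumptions made for illustration.

```python
import math

NUM_SEGMENTS = 8  # assumed number of angular segments on a circular pad

def segment_for_touch(x, y):
    """Return the index of the angular segment under a touch at (x, y),
    measured from the pad center at (0, 0)."""
    angle = math.atan2(y, x) % (2 * math.pi)
    return int(angle / (2 * math.pi / NUM_SEGMENTS))

def halo_points(x, y, points, radius):
    """Return the illumination points within `radius` of the touch,
    i.e., the halo surrounding the finger."""
    return [p for p in points if math.hypot(p[0] - x, p[1] - y) <= radius]

# Segment mode: a finger at 45 degrees lights segment 1 (of 8).
print(segment_for_touch(1.0, 1.0))

# Pixilated mode: light the nodes within 1.5 units of the finger.
grid = [(i, j) for i in range(-3, 4) for j in range(-3, 4)]
print(halo_points(0.0, 0.0, grid, radius=1.5))
```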
The method may additionally include blocks 456 and 458. In block 456, a second object is detected over the input surface at the same time as the first object. This may, for example, be accomplished with a multipoint capacitance sensing device. In block 458, at least a portion of the input surface proximate the location of the second detected object is illuminated. As a result, the user will be informed where distinct multiple objects are located within the sensing plane at all times.
In one embodiment, the input surface is broken up into illumination regions, and the regions closest to the detected objects are illuminated. By way of example, and referring to FIG. 13B, when two fingers are placed over the input surface, two illumination segments in the locations of the fingers are illuminated at the same time.
In another embodiment, the input surface is broken up into illumination nodes or points (pixilated), and those points contained within and/or surrounding the detected objects are illuminated. By way of example, and referring to FIG. 14B, when two fingers are placed over the input surface, the areas around both fingers are illuminated at the same time.
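By way of illustration only, the multipoint case extends the halo mapping above to several simultaneous objects; overlapping halos simply merge. The helper names below are the same illustrative assumptions as in the previous sketch.

```python
import math

def halo_points(x, y, points, radius):
    return [p for p in points if math.hypot(p[0] - x, p[1] - y) <= radius]

def illuminate_touches(touches, points, radius):
    """Light the points around every simultaneously detected object;
    overlapping halos merge via the set union."""
    lit = set()
    for (x, y) in touches:
        lit.update(halo_points(x, y, points, radius))
    return sorted(lit)

# Two fingers on opposite sides of the pad each get their own halo.
grid = [(i, j) for i in range(-4, 5) for j in range(-4, 5)]
print(illuminate_touches([(-2, 0), (2, 0)], grid, radius=1.0))
```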
FIG. 15 is a method 500 of operating an illuminated touch pad, in accordance with one embodiment of the present invention. The method 500 generally begins at block 502 where object sensing is performed. This may, for example, be accomplished with a capacitive touch sensing device. In block 504, at least a portion of the input surface is illuminated when an object is detected. In most cases, the portion of the input surface that is illuminated is a localized area disposed near, around, and/or underneath the location of the detected object. The illuminated portion may, for example, be one or more illumination points, nodes or regions. In most cases, the portion is sized similarly to the size of the object. In the case of a finger, for example, the illuminated portion may cover an area similar to the detected area of the finger.
In block 506, a determination is made as to whether or not the object is moving. If the object is not moving, the method proceeds to block 508 where a determination is made as to whether or not the object is still detected. If the object is still detected, the method proceeds back to block 504 where the same portion of the input surface is illuminated. If the object is no longer detected, the method proceeds to block 510 where the illumination is stopped. This may occur immediately after determining that an object is no longer detected, or it may occur after a period of time (e.g., a time out). Furthermore, the illumination may be stopped using an illumination effect such as fading out. Thereafter, the method proceeds back to block 502.
Referring back to block 506, if the object is moving across the input surface, the method proceeds to block 512 where motion characteristics of the object are determined. The motion characteristics may, for example, include acceleration, direction, and the like. Thereafter, in block 514, the characteristics of the illumination are adjusted based on one or more of the motion characteristics. Following block 514, the method proceeds back to block 506.
In one embodiment, block 514 includes moving the illumination area in accordance with the location of the moving object. That is, the illuminated portion follows the finger as the finger is moved about the input surface (i.e., the illumination tracks object movement). As a result, the user always knows where the object is located relative to the input surface. In some cases, block 514 may further include providing directional indicators around the illuminated portion in order to indicate previous and/or future locations of the object based on the motion characteristics of the moving object (e.g., acceleration, direction, etc.).
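By way of illustration only, the control flow of FIG. 15 (blocks 502 through 514) might be realized as a simple polling loop such as the sketch below. The sensing and lighting interfaces (read_touch, light_panel.illuminate, light_panel.fade_out) and the timeout value are assumptions made for illustration, not disclosed APIs.

```python
import time

FADE_OUT_TIMEOUT = 0.5  # assumed delay before illumination stops (block 510)

def run(read_touch, light_panel):
    """Sense, illuminate, track motion, and fade out on release."""
    last_pos = None
    last_seen = None
    while True:
        pos = read_touch()          # block 502: None or an (x, y) tuple
        now = time.monotonic()
        if pos is not None:
            if last_pos is not None and pos != last_pos:
                # Blocks 512/514: object is moving; derive simple motion
                # characteristics and move the lit area with the finger.
                dx, dy = pos[0] - last_pos[0], pos[1] - last_pos[1]
                light_panel.illuminate(pos, direction=(dx, dy))
            else:
                # Block 504: a stationary object keeps the same area lit.
                light_panel.illuminate(pos, direction=None)
            last_pos, last_seen = pos, now
        elif last_seen is not None and now - last_seen > FADE_OUT_TIMEOUT:
            light_panel.fade_out()  # block 510: stop after a time out
            last_pos = last_seen = None
        time.sleep(0.01)            # polling interval
```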
FIGS. 16A-16D illustrate one implementation where the illuminated portion follows the motion of the finger as it is moved across the surface. In this illustration, illuminated segments are configured to follow the motion of the finger as it is moved across the surface. FIG. 16A illustrates the state when no objects are detected. FIG. 16B illustrates the state when an object is detected, and the segment underneath the object is illuminated. FIG. 16C illustrates the state where the illuminated segment follows the moving finger. FIG. 16D illustrates one implementation where the illuminated segment further includes a leading edge, body and trailing edge. The leading edge indicates the direction of the motion, the body indicates the current location of the finger, and the trailing edge indicates where the finger has been.
The leading edge, body and trailing edge may have different illumination profiles. For example, the leading edge may have a high intensity level, the body may have a medium intensity level and the trailing edge may have a low intensity level. Alternatively, the leading edge may have a low intensity level, the body may have a high intensity level, and the trailing edge may have a low intensity level. Alternatively or additionally, the colors of these components may differ. For example, the leading edge may be red, the body may be orange and the trailing edge may be yellow. Furthermore, the trailing edge may include an illumination tail. For example, the trailing edge may be segmented into regions that go from higher intensity to lower intensity levels (e.g., fades outwardly from the body).
FIGS. 17A-17D illustrate another implementation where the illuminated portion follows the motion of the finger as it is moved across the surface. In this illustration, the area around the finger is illuminated and configured to follow the motion of the finger as it is moved across the surface. FIG. 17A illustrates the state when no objects are detected. FIG. 17B illustrates the state when an object is detected, and the area around the object is illuminated (e.g., a halo). FIG. 17C illustrates the state where the illuminated area follows the moving finger. FIG. 17D illustrates one implementation where the illuminated area includes a body and a tail (e.g., a comet). The body surrounds the finger with illumination, and the tail tapers away from the body to a point. The tail trails the body as the body moves around the input surface. The tail therefore indicates the previous location of the object. The tail typically has a lower intensity level than the body. The intensity of the tail may even vary from higher to lower intensity levels, as for example over its length or from its core to its edge.
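By way of illustration only, the leading-edge/body/trailing-edge profile of FIG. 16D and the fading comet tail of FIG. 17D might be computed as per-segment intensities, as in the sketch below. The segment count and the specific intensity values are assumptions made for illustration.

```python
NUM_SEGMENTS = 16  # assumed number of segments on a ring-shaped pad

def comet_profile(body_segment, direction, tail_length=4):
    """Return an intensity per segment: a bright body under the finger,
    a dimmer leading edge ahead of it, and a tail fading behind it."""
    intensity = [0.0] * NUM_SEGMENTS
    intensity[body_segment % NUM_SEGMENTS] = 1.0                 # body
    intensity[(body_segment + direction) % NUM_SEGMENTS] = 0.6   # leading edge
    for k in range(1, tail_length + 1):
        seg = (body_segment - k * direction) % NUM_SEGMENTS
        intensity[seg] = max(intensity[seg], 0.5 / k)            # fading tail
    return intensity

# Finger at segment 5, moving clockwise (direction = +1):
print(comet_profile(5, direction=+1))
```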
FIG. 18 is a method 550 of operating an illuminated touch pad, in accordance with one embodiment of the present invention. The method 550 generally begins at block 552 where object sensing is performed. In block 554, the state of the touch pad is determined. The states may, for example, be selected from a selection state, a tracking state or a gesture state. In a selection state, the touch pad is set up for receiving selection inputs from the user (e.g., it acts like a button). In a tracking state, the touch pad is set up to track a finger as it is moved about the input surface. In a gesture state, the touch pad is set up to receive various gesture inputs from the user. An example of determining states of a touch surface may be found in U.S. patent application Ser. No. 10/903,964, which is herein incorporated by reference.
In block 556, the input surface is illuminated based on the state of the touch pad. As a result, the user is alerted to the current state of the touch pad, and therefore the type of inputs that can be made. By way of example, each state may include a different illumination profile. An illumination profile defines the illumination characteristics of the illumination to be provided. The illumination characteristics include, for example, intensity and/or color and/or illumination effects (e.g., fading, blinking, rastering, etc.). In one example, a first state includes a first illumination profile (e.g., a first color), a second state includes a second illumination profile (e.g., a second color), and a third state includes a third illumination profile (e.g., a third color).
Referring to FIG. 19A, one example of determining state will be described. In blocks 570 and 574, one or more objects are detected. In block 572, if a single object is detected, the touch pad is placed in a tracking state such that object motion is used to perform tracking operations. In block 576, if multiple objects are detected, the touch pad is placed in a gesturing state such that object motion is used to perform gesturing operations. By way of example, and referring to FIGS. 20A and 20B, when a single finger is detected, the touch pad is placed in a first state, and when multiple fingers are detected, the touch pad is placed in a second state.
Referring to FIG. 19B, one example of illuminating based on states will be described. In blocks 580 and 582, a determination is made as to whether the touch pad is in a tracking state or a gesturing state. In block 584, if the touch pad is in a tracking state, the touch pad is illuminated with a first illumination profile. In block 586, if the touch pad is in a gesturing state, the touch pad is illuminated with a second illumination profile that is different than the first illumination profile. The illumination profiles contain illumination information such as color, intensity and effects (e.g., blinking, fading, etc.). By way of example, and referring to FIGS. 20A and 20B, when the touch pad is in a first state, the touch pad is illuminated with a first color or intensity, and when the touch pad is in a second state, the touch pad is illuminated with a second color or intensity.
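By way of illustration only, the sketch below combines the two examples above: the state is chosen from the number of detected objects (blocks 570-576), and each state maps to its own illumination profile (blocks 580-586). The profile values are assumptions made for illustration.

```python
# Illustrative state-to-profile mapping; values are assumed, not disclosed.
TRACKING_PROFILE  = {"color": (0, 255, 0), "intensity": 0.5, "effect": None}
GESTURING_PROFILE = {"color": (0, 0, 255), "intensity": 0.8, "effect": "blink"}

def determine_state(num_objects):
    # Blocks 570-576: one object -> tracking state; several -> gesturing state.
    if num_objects == 0:
        return "idle"
    return "tracking" if num_objects == 1 else "gesturing"

def profile_for_state(state):
    # Blocks 580-586: each state is illuminated with a distinct profile.
    return {"tracking": TRACKING_PROFILE,
            "gesturing": GESTURING_PROFILE}.get(state)

print(profile_for_state(determine_state(2)))  # the gesturing profile
```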
In one embodiment, the method of changing illumination based on states of the touch pad may be further developed. For example, the method may include capturing a first touch image; determining the touch mode based on the first touch image; and illuminating the touch surface based on the first touch mode. The method may also include capturing a second touch image; determining the touch mode based on the second touch image; determining if the touch mode changed between the first and second touch images; if the touch mode stayed the same, comparing the first and second touch images and performing a control function based on the comparison; and if the touch mode changed, illuminating the touch surface based on the second touch mode. The method may additionally include capturing a third touch image; determining the touch mode based on the third touch image; determining if the touch mode changed between the second and third touch images; if the touch mode stayed the same, comparing the second and third touch images and performing a control function based on the comparison; and if the touch mode changed, illuminating the touch surface based on the third touch mode.
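By way of illustration only, that per-image sequence generalizes to the loop sketched below. The sensing and lighting interfaces (determine_mode, illuminate, control) are passed in as assumed callables rather than real APIs.

```python
def process_touch_images(images, determine_mode, illuminate, control):
    """Re-illuminate whenever the touch mode changes; otherwise compare
    consecutive touch images and perform a control function."""
    prev_image, prev_mode = None, None
    for image in images:
        mode = determine_mode(image)
        if mode != prev_mode:
            # Mode changed between images: update the illumination.
            illuminate(mode)
        elif prev_image is not None:
            # Mode unchanged: interpret the motion between images.
            control(prev_image, image)
        prev_image, prev_mode = image, mode
```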
FIG. 21 is a method 600 of operating an illuminated touch pad, in accordance with one embodiment of the present invention. The method generally begins at block 602 where an object is detected. In block 604, at least a portion of the input surface is illuminated when the object is detected. In block 606, z characteristics of the object are determined. The z characteristics may include the pressure being exerted on the input surface by the object, and/or the location of the object in the z direction relative to the x-y input surface (e.g., how close the object is to the x-y plane). In block 608, the illumination characteristics are adjusted based on the z characteristics. For example, the color and/or intensity of the illumination may be adjusted based on the z height or pressure.
By way of example, and referring to FIGS. 22A and 22B, the entire touch pad may be illuminated when the object is detected, and the intensity of the illumination may be increased when an object is closer to, or exerts increased pressure on, the touch surface, and decreased when an object is further away from, or exerts decreased pressure on, the touch surface. Alternatively, only a portion of the touch pad may be illuminated (as for example a segment or the area directly adjacent the finger), with the intensity adjusted in the same manner.
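By way of illustration only, the z-to-intensity mapping of block 608 might be the simple linear functions sketched below. The sensing range, maximum pressure, and the linear form of the mapping are assumptions made for illustration.

```python
MAX_Z = 10.0  # assumed sensing range above the surface, in millimeters

def intensity_from_height(z_mm):
    """Closer objects produce brighter illumination (block 608)."""
    z = min(max(z_mm, 0.0), MAX_Z)
    return 1.0 - z / MAX_Z   # 1.0 at the surface, 0.0 at the range limit

def intensity_from_pressure(pressure, max_pressure):
    """Harder presses produce brighter illumination."""
    return min(max(pressure / max_pressure, 0.0), 1.0)

print(intensity_from_height(2.5))        # 0.75: near the surface, bright
print(intensity_from_pressure(30, 100))  # 0.3: a light press, dim
```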
FIG. 23 is a method 700 of operating an illuminated touch pad, in accordance with one embodiment of the present invention. The method generally begins at block 702 where object sensing is performed. In block 704, at least a portion of the input surface is illuminated in response to the sensed object. For example, a segment or the area around a finger may be illuminated. Thereafter, in block 706, a rotational user input is detected over the input surface. For example, in the case of a circular touch pad, the rotational user input may be the user swirling their finger around the circular touch pad. In some cases, this may include determining an acceleration value pertaining to the rotational user input. In block 708, the input surface is illuminated in accordance with the rotational user input. For example, the region of the touch pad underneath the finger is illuminated as the user rotates their finger around the circular touch pad. In some cases, this may include moving through illumination segments, nodes or points based on at least the acceleration value, whereby the acceleration value specifies the rate at which the illumination moves through the segments, nodes or points.
Rotational user inputs are further described in U.S. patent application Ser. Nos. 10/256,716 and 10/259,159, which are herein incorporated by reference.
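By way of illustration only, the sketch below advances the lit segment of a circular pad as the finger swirls around it, scaling the advance by an acceleration value as described for block 708. The segment count, the scaling rule, and all names are assumptions made for illustration.

```python
import math

NUM_SEGMENTS = 12  # assumed number of angular segments on the circular pad

def angular_step(prev_angle, angle):
    """Smallest signed angular change, handling wrap-around at 2*pi."""
    return (angle - prev_angle + math.pi) % (2 * math.pi) - math.pi

def next_lit_segment(current_segment, prev_angle, angle, acceleration):
    """Advance the lit segment by the angular motion, scaled by the
    acceleration value: faster swirls skip through more segments."""
    step = angular_step(prev_angle, angle)
    segments_moved = step / (2 * math.pi / NUM_SEGMENTS) * max(acceleration, 1.0)
    return (current_segment + round(segments_moved)) % NUM_SEGMENTS

# A quarter-turn swirl at unit acceleration moves the light 3 segments.
print(next_lit_segment(0, 0.0, math.pi / 2, acceleration=1.0))
```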
FIG. 24 is a method 800 of operating an illuminated touch pad, in accordance with one embodiment of the present invention. The method generally begins at block 802 where at least a portion of the input surface is illuminated with a first illumination profile when an object is detected proximate the input surface. Following block 802, the method proceeds to block 804 where the illumination of the illuminated portion of the input surface changes when the object is moved. For example, the intensity of the illumination may be varied based on the acceleration of the moving object: the intensity may be increased with increased acceleration and decreased with decreased acceleration. In another embodiment, thresholds are used. For example, a first intensity level may be used for high acceleration, a second intensity level may be used for low acceleration, and a third intensity level may be used for no acceleration (stationary).
By way of example, and referring to FIG. 25, low intensity illumination is provided when a touch is first detected, medium intensity illumination is provided when the object is slowly moved around the input surface (e.g., low acceleration), and high intensity illumination is provided when the object is quickly moved around the input surface (e.g., high acceleration). Alternatively, the intensity may vary continuously according to the acceleration of the object.
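By way of illustration only, both the threshold scheme and the continuous alternative might look like the sketch below. The threshold values, intensity levels, and units are assumptions made for illustration.

```python
HIGH_ACCEL = 5.0   # assumed thresholds, in arbitrary acceleration units
LOW_ACCEL  = 0.5

def intensity_for_acceleration(accel):
    """Threshold scheme: three discrete intensity levels."""
    if accel >= HIGH_ACCEL:
        return 1.0    # quick movement: high intensity
    if accel >= LOW_ACCEL:
        return 0.6    # slow movement: medium intensity
    return 0.3        # stationary touch: low intensity

def continuous_intensity(accel, max_accel=10.0):
    """Continuous alternative: intensity scales with acceleration."""
    return 0.3 + 0.7 * min(max(accel, 0.0), max_accel) / max_accel

print(intensity_for_acceleration(2.0))  # 0.6: slow movement
print(continuous_intensity(2.0))        # 0.44: same input, continuous scale
```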
While this invention has been described in terms of several preferred embodiments, there are alterations, permutations, and equivalents, which fall within the scope of this invention.
For example, although the invention was primarily directed at touch pads, it should be pointed out that this is not a limitation and that the invention may be applied to other touch sensing devices, as for example touch sensitive housings and touch sensing palm rests. An example of a touch sensitive housing may be found in U.S. patent application Ser. No. 11/115,539, which is herein incorporated by reference.
It should also be noted that there are many alternative ways of implementing the methods and apparatuses of the present invention.
For example, different areas of the touch pad may be illuminated with different illumination profiles at the same time. By way of example, the touch pad may be segmented into illuminated quadrants that correspond to button functionality of the touch pad such as menu, play/pause, forward and reverse. See for example U.S. patent application Ser. No. 10/643,256, which is herein incorporated by reference.
Furthermore, the touch pad may be used as an indicator for a handheld computing device such as a media player. For example, the touch pad may be configured to ebb in and out when the device is in sleep mode or vary from high to low intensity based on the battery level. Other examples of controlling light may be found in U.S. patent application Ser. Nos. 10/889,933, 10/075,964 and 10/075,520, all of which are herein incorporated by reference.
Moreover, the touch pad may be used as a timer or clock. In the case of a clock, the touch pad may include segments corresponding to the positions of a clock face, and the segments can be illuminated in a controlled manner to indicate the current time. For example, to indicate 12:30, a 12 o'clock segment may be illuminated with a first illumination profile and a 6 o'clock segment may be illuminated with a second illumination profile. In the case of a timer, the touch pad may be used to show how much time is left in a playing media item such as a song. For example, the entire touch pad may be illuminated when the song starts and consecutive segments may be turned off as the song plays. When the song is over, the touch pad is no longer illuminated. Alternatively, consecutive segments may be turned on as the song plays until the song is over and the touch pad is fully illuminated. This may be useful in a media player such as a music player.
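By way of illustration only, the timer behavior might map song progress to lit segments as in the sketch below, turning consecutive segments off as the song plays. The segment count and the function names are assumptions made for illustration.

```python
NUM_SEGMENTS = 12  # assumed number of segments around the pad

def lit_segments_for_progress(elapsed_s, duration_s):
    """All segments are lit at the start of the song; consecutive segments
    turn off as it plays, and none remain lit when it ends."""
    remaining = 1.0 - min(max(elapsed_s / duration_s, 0.0), 1.0)
    count = round(NUM_SEGMENTS * remaining)
    return list(range(count))

print(lit_segments_for_progress(90, 180))  # half the song left: 6 segments lit
```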
In addition, the illumination of the touch pad may be further controlled by a sensor such as a light sensor. The light sensor measures the ambient light level, and the intensity of the illumination is adjusted based on the ambient light level. Examples of light arrangements that utilize ambient light sensors may be found in U.S. patent application Ser. No. 10/402,311, which is herein incorporated by reference.
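By way of illustration only, the ambient-light adjustment might scale the illumination intensity with the measured ambient level, as in the sketch below. The linear mapping, its bounds, and the sensor units are assumptions made for illustration.

```python
def adjust_for_ambient(base_intensity, ambient_level, max_ambient=1000.0):
    """Brighter surroundings call for brighter illumination so the pad
    remains visible; dark surroundings dim it to avoid glare."""
    scale = 0.3 + 0.7 * min(max(ambient_level, 0.0), max_ambient) / max_ambient
    return min(base_intensity * scale, 1.0)

print(adjust_for_ambient(0.8, ambient_level=250.0))  # 0.8 * 0.475 = 0.38
```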
It is therefore intended that the following appended claims be interpreted as including all such alterations, permutations, and equivalents as fall within the true spirit and scope of the present invention.