CROSS-REFERENCE TO RELATED APPLICATIONS

This application claims the benefit of U.S. Provisional Application No. 60/972,261, filed Sep. 14, 2007. This application is also a continuation-in-part of U.S. patent application Ser. No. 11/977,348, filed on Oct. 24, 2007, which claims the benefit of U.S. Provisional Application No. 60/946,804, filed Jun. 28, 2007. The disclosures of the above applications are incorporated herein by reference.
FIELD

The present disclosure relates to a remote user interaction device and, more specifically, to a direction- and holding-style-invariant, symmetric, touch- and button-based remote user interaction device.
BACKGROUND

The statements in this section merely provide background information related to the present disclosure and may not constitute prior art.
Practically all consumer electronic products in use today come with a remote control. In most cases, the remote control has many buttons, each dedicated to the control of one or more specific features of the consumer electronics product. As these products increase in complexity, so does the number of buttons required. At some point, the increased number of buttons renders the remote control mostly useless for a large number of users.
SUMMARY

A remote control unit is disclosed that selectively transmits a control signal for remotely controlling an electronic device. The remote control unit defines an imaginary cut plane that substantially bisects the remote control unit. The remote control unit includes a plurality of input features collectively disposed in a substantially symmetric manner with respect to the imaginary cut plane. The input features include at least a first input feature and a second input feature. The first and second input features are disposed on opposite sides of the imaginary cut plane. Furthermore, the unit includes a sensor that detects at least a first holding position and a second holding position of the remote control unit. The first holding position and the second holding position are substantially opposite to each other. Moreover, the unit includes a controller that associates the control signal with the first input feature when the sensor detects the first holding position, and the controller associates the control signal with the second input feature when the sensor detects the second holding position.
A remote control system is also disclosed that includes an electronic device and a remote control unit that selectively transmits a control signal to remotely control the electronic device. The remote control unit defines an imaginary cut plane that substantially bisects the remote control unit. The remote control unit also includes a plurality of input features collectively disposed in a substantially symmetric manner with respect to the imaginary cut plane. The input features include at least a first input feature and a second input feature. The first and second input features are disposed on opposite sides of the imaginary cut plane. The remote control unit also includes a sensor that detects at least a first holding position and a second holding position of the remote control unit. The first holding position and the second holding position are substantially opposite to each other. The system also includes a controller that associates the control signal with the first input feature when the sensor detects the first holding position, and the controller associates the control signal with the second input feature when the sensor detects the second holding position. The system additionally includes a display that indicates which of the first and second input features is associated with the control signal.
Moreover, a method of operating a remote control system is disclosed. This system includes a remote control unit that defines an imaginary cut plane that substantially bisects the remote control unit. The remote control unit also includes a plurality of input features collectively disposed in a substantially symmetric manner with respect to the imaginary cut plane. The input features include at least a first input feature and a second input feature. The first and second input features are disposed on opposite sides of the imaginary cut plane. The method includes detecting one of at least a first holding position and a second holding position of the remote control unit. The first holding position and the second holding position are substantially opposite to each other. Also, the method includes associating the control signal with the first input feature when the sensor detects the first holding position. Additionally, the method includes associating the control signal with the second input feature when the sensor detects the second holding position.
Further areas of applicability will become apparent from the description provided herein. It should be understood that the description and specific examples are intended for purposes of illustration only and are not intended to limit the scope of the present disclosure.
DRAWINGS

The drawings described herein are for illustration purposes only and are not intended to limit the scope of the present disclosure in any way.
FIG. 1A is a perspective view of the remote control unit;
FIG. 1B is a plan view of the remote control unit;
FIG. 1C is a view of the remote control unit in a portrait orientation;
FIG. 1D is a view of the remote control unit in a landscape orientation;
FIG. 2 is a system block diagram illustrating the remote control system in operation by a user to control a piece of consumer electronic equipment;
FIG. 3 is a block diagram illustrating an exemplary embodiment of the remote control system, including components associated with the control circuit coupled to the consumer electronic equipment and associated with the remote control unit;
FIG. 4A is a top view of a remote control unit according to the teachings of the present disclosure;
FIG. 4B is a perspective view of the remote control unit of FIG. 4A;
FIG. 5 is a schematic view of a remote control system that includes the remote control unit of FIG. 4A held by the user in a holding position;
FIG. 6 is a schematic view of the remote control system of FIG. 5 with the remote control unit in another holding position;
FIG. 7 is a schematic view of the remote control system of FIG. 5 with the remote control unit in another holding position;
FIG. 8 is a schematic view of the remote control system of FIG. 5 with the remote control unit in still another holding position;
FIG. 9 is a schematic view of the remote control system of FIG. 5 with the remote control unit in another holding position; and
FIG. 10 is a schematic view of the remote control system of FIG. 5 with the remote control unit in still another holding position.
DETAILED DESCRIPTION

The following description is merely exemplary in nature and is not intended to limit the present disclosure, application, or uses.
Referring first to FIGS. 1A and 1B, the remote control unit 20 of the remote control system is illustrated. This remote control unit interacts with a control circuit that is coupled to the consumer electronic equipment. The control circuit and consumer electronic equipment are not shown in FIGS. 1A-1D but are shown in subsequent FIGS. 2 and 3.
The remote control unit 20 has a touchpad 22 that may include predefined clickable regions, such as the up-down-left-right-okay region 24, the channel up-down region 26, the volume up-down region 28 and the mute region 30. It will be understood that these predefined clickable regions are merely exemplary of the basic concept that the touchpad can have regions that respond to pressure as a way of signifying that the user has “selected” a particular function. While the basic design of the remote control unit strives to eliminate physical push buttons to a large extent, the remote control unit may still have physical push buttons if desired. Thus, for illustration purposes, four push buttons are shown at 32, 33, 34 and 35. It is also contemplated that the touchpad may be split into two distinct zones with or without a physical divider interposed between the two zones.
The pre-defined clickable regions may be visually designated on the touchpad surface either by silk-screening the region graphics onto the surface of the touchpad, or by using a see-through graphic with backlighting. As will be more fully discussed below, the backlighting can be triggered by the appropriate combination of sensory inputs as recognized by the pattern recognizer, also discussed below. It is contemplated that the touchpad surface may not include any pre-defined clickable regions.
The case of the remote control unit is preferably provided with a series of capacitive sensors, such as sensors 36 around the horizontal side walls of the case perimeter. Capacitive sensors can also be placed at other locations, such as on the underside of the case. These sensors detect how the user is holding the remote control. In this regard, different users may grip the remote control in different ways, and the capacitive sensors are arranged to be able to discriminate these different ways of holding the remote control. Although there may be subtle differences in how one user holds the remote control as compared with another, the pattern recognition system, discussed below, can use this information to recognize these subtle differences. Moreover, the sensors in cooperation with the pattern recognition system enable a user to operate the remote independently of how the remote is being held.
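By way of illustration only, the reduction of perimeter capacitive readings to a holding-style classification might be sketched as follows. The side names, threshold value, and classification labels are hypothetical and are not taken from the disclosure; an actual implementation would use the trained pattern recognition system described herein.

```python
def classify_grip(sensor_readings, threshold=0.5):
    """Classify how the unit is held from perimeter capacitive readings.

    sensor_readings maps a side name ("left", "right", "top", "bottom")
    to a normalized capacitance value in [0, 1]. The names, threshold,
    and labels are illustrative placeholders.
    """
    # Collect the sides whose reading indicates contact with the hand.
    touched = {side for side, value in sensor_readings.items()
               if value >= threshold}
    if not touched:
        return "not held"            # e.g., resting on the coffee table
    if {"left", "right"} <= touched:
        return "two-handed landscape"
    if "left" in touched or "right" in touched:
        return "one-handed"
    return "indeterminate"
```

In practice the recognizer would weigh these readings together with accelerometer data rather than applying a fixed threshold per side.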
Referring now to FIG. 2, an overview of the pattern recognition system will be presented. FIG. 2 illustrates the remote control unit 20 being manipulated by a user 40 to operate a consumer electronic equipment component 48 having a display screen 50. The consumer electronic equipment 48 conventionally has its own electronics that are used to provide the equipment with its normal functionality. In the case of the illustrated component 48, such functionality includes displaying audiovisual material on the display screen. This material may include, for example, television programs, pre-recorded content, internet content and the like. For illustration purposes, the associated electronics of the consumer electronic equipment 48 have been illustrated separately at 52. Embedded within the electronics package 52 is a control circuit, shown diagrammatically at 60, that defines part of the remote control system. Control circuit 60 is coupled to the consumer electronic equipment and responds to commands sent from the remote control unit 20 to control the operation of the consumer electronic equipment.
The remote control system is made up of the remote control unit 20 and the control circuit 60. Together, these two components implement a sophisticated sensory input detecting and pattern recognizing system that allows the user 40 to control operations of the consumer electronic equipment 48 using a rich variety of finger, hand, wrist, arm and body movements. The system may be viewed as effecting a dialogue between the remote control unit 20 and the control circuit 60, where that dialogue is expressed using a vocabulary and grammar associated with a diverse variety of different sensory inputs (e.g., from the touchpad, accelerometer, case perimeter sensors, pressure sensors, RF signal sensors and the like). The control system also includes a feedback loop through the user 40. The user 40 has his or her own set of user sensory inputs (sight, sound, touch), and the user manipulates the remote control unit 20 based, in part, on audible and visual information obtained from the consumer electronic equipment, and on visual, audible and tactile information from the remote control unit. Thus, the remote control system supports a dialogue between the remote control unit 20 and the control circuit 60, with a concurrent dialogue between the user 40, the control system and the consumer electronic equipment.
FIG. 2 thus illustrates that the user 40 may receive visual, audible or tactile feedback from the remote control unit 20, and this may occur concurrently while viewing the display screen 50. For illustration purposes, the information acquired by the user 40 is depicted diagrammatically as user sensory inputs 62. Likewise, the sensory inputs acquired by the control system (from a diverse array of different types of sensors) are diagrammatically illustrated at 64.
The relationship between the control system sensory inputs 64 and the user sensory inputs 62 is a non-trivial one. The user will manipulate the remote control unit 20, in part, based on what the user is trying to accomplish and also, in part, based on what the user sees on the display 50 and what the user senses audibly, visually or tactilely from the remote control unit and/or consumer electronic equipment. To illustrate this point, imagine that the consumer electronic equipment is a television set that has been programmed to block certain channels from being viewed by young children. In order to bypass the parental blocking feature, the user 40 must manipulate the remote control unit in a predefined way. To prevent the child from simply watching the parent and learning the manipulation technique, the unlocking gesture for the parental blocking feature can be changed each time it is used. The adult user must watch what is shown on the display screen in order to learn how to manipulate the control unit to unlock the parental blocking feature. The instructions on the display are presented in a form, such as textual instructions, that a young child is not able to read. Thus, the control of the parental blocking feature relies on a particular manipulation (e.g., flick the wrist three times) that is context-based. A later unlocking operation would be treated as a different context and would potentially have a different gestural command to effect unlocking. Although this is but one example, it illustrates that the behavior of the remote control system is context-dependent and that the user's sensory perception (e.g., reading the screen, feeling tactile vibrations, hearing particular sounds) will affect how the user's manipulations of the remote control unit are interpreted.
The control system is able to make sense of a rich and diverse collection of sensory inputs using a pattern recognizer 70 and associated control logic 72. As the user manipulates the remote control unit, sensory inputs are collected as a temporal sequence from the various sensors within the remote control unit. As previously noted, the sensors may include at least one touchpad responsive to manipulation by a user's fingers and at least one additional sensor such as, for example, an acceleration sensor responsive to movement of the remote control unit, case perimeter sensors such as capacitive sensors that discriminate which parts of the case are in contact with the user's body, pressure sensors responsive to pressing forces upon a predetermined region of the touchpad, and RF signal sensors responsive to radio frequency signals transmitted from the control circuit 60.
The temporal sequence of sensory inputs is fed to the pattern recognizer 70. The pattern recognizer is configured to classify the received sensory input message according to a predetermined recognition scheme to generate message meaning data that are then sent to the control logic 72. The control logic 72 decodes the message meaning data and generates a device control signal. The device control signal may be supplied to the remote control unit itself, to effect control over the behavior of the remote control unit (e.g., putting the unit to sleep or waking the unit up), or the device control signal may be sent to and/or used by the control circuit 60, where it is passed on to the consumer electronic equipment as a command to control the operation of the consumer electronic equipment. The pattern recognizer 70 and the control logic 72 may be implemented separately or together and may be deployed in the control circuit 60, in the remote control unit 20, or distributed across both.
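The two-stage flow above (sensor events to message meaning data, then meaning to device control signal) can be sketched as follows. The event names, rule table, and command dictionary are illustrative placeholders; the disclosure contemplates a trained statistical model rather than a fixed lookup table.

```python
def pattern_recognizer(event_sequence):
    """Map a temporal sequence of sensor events to message meaning data.

    The rule table stands in for the trained recognition model; the
    event and meaning names are hypothetical.
    """
    rules = {
        ("touch_down", "slide_up"): "channel_scan_forward",
        ("touch_down", "click"): "channel_up",
    }
    return rules.get(tuple(event_sequence), "unknown")

def control_logic(meaning):
    """Decode message meaning data into a device control signal,
    routed either to the equipment or back to the remote itself."""
    commands = {
        "channel_scan_forward": {"target": "equipment", "command": "SCAN_FWD"},
        "channel_up": {"target": "equipment", "command": "CH_UP"},
        "unknown": {"target": "remote", "command": "NO_OP"},
    }
    return commands[meaning]

# An upward slide on the touchpad yields a scan-forward command.
signal = control_logic(pattern_recognizer(["touch_down", "slide_up"]))
```

Separating recognition from command generation mirrors the deployment flexibility described above: either stage can live in the remote control unit, in the control circuit, or be distributed across both.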
In one embodiment, the pattern recognizer 70 employs a trained model that may be adaptively altered or customized to more closely fit each user's style of using the remote control unit. In such a trained model embodiment, the pattern recognizer 70 is preferably provided with an initial set of models that classify certain operations as being mapped onto certain commands or control functions. For example, with reference to FIG. 1B, an upward sliding motion of the fingertip on the channel up-down region 26 might launch a forward channel scanning mode, whereas a single click or finger press upon the upward arrow of the region 26 would simply increment the channel by one. This behavior might be classified differently, however, if the remote control unit is positioned in landscape orientation as illustrated in FIG. 1D. For example, when in landscape orientation and held by two hands (as determined by the capacitive sensors), the channel up-down region 26 might perform a function entirely unrelated to channel selection.
To adapt the model for a particular user, the preferred embodiment includes a sensory input mechanism that allows the user to inject a meta command, letting the system know that the user wishes to alter the pattern recognition models either for himself or herself, or for all users. For example, a rapid back and forth wrist motion (analogous to shaking one's head in a “no” gesture) might be used to inform the recognition system that the most recent pattern recognition conclusion was wrong and that a different behavior is desired. For example, assume that the user has rested the remote control unit on a coffee table and then manipulates the channel up-down region 26, causing the television to begin a channel-scanning mode. Perhaps the user would prefer that the channel scanning mode not be initiated when the remote control unit is resting on the coffee table (i.e., not being held). To change this behavior, the user would pick up the remote control unit and shake it back and forth in a “no” gesture. This would cause an on-screen prompt to appear on the television display 50, instructing the user how the most recent temporal sequence of sensory inputs can be modified in this context to result in a different device control signal outcome.
Because the pattern recognizer 70 can respond to a rich variety of different types of sensory inputs, the control system is able to interpret the meaning of user manipulations and gestures that can be quite complex, thereby allowing the user to interact in an intuitive or natural way that can be customized from user to user. In this regard, there may be instances where two or more gestural commands are very similar and yet have different meanings, and thus require different commands to be sent to the consumer electronic equipment. To handle this, the pattern recognizer 70 may be based on a statistical model where the control system sensory inputs generate probability scores associated with a plurality of different meanings. The pattern recognizer would (a) select the meaning with the highest score, if that score is above a predetermined probability threshold value and/or above the next-highest score by a predetermined margin, or (b) engage the user in an on-screen dialogue to resolve which meaning was intended, if the preceding threshold conditions are not met. The results of such user interaction may then be used to fine-tune or adapt the model so that the system learns what behavior is expected for subsequent use.
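The threshold-and-margin selection rule in (a) and (b) above can be expressed compactly. The threshold values and meaning names below are hypothetical; the disclosure leaves the particular thresholds as design choices.

```python
def resolve_meaning(scores, min_prob=0.6, min_margin=0.2):
    """Pick a gesture meaning from probability scores, or defer to an
    on-screen dialogue when confidence is insufficient.

    scores maps candidate meanings to probabilities; min_prob and
    min_margin are illustrative threshold values.
    """
    # Rank candidates from highest to lowest probability.
    ranked = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
    best_meaning, best_score = ranked[0]
    runner_up = ranked[1][1] if len(ranked) > 1 else 0.0
    # Accept only if the winner is both probable enough and clearly ahead.
    if best_score >= min_prob and best_score - runner_up >= min_margin:
        return best_meaning
    return "ask_user"   # engage the on-screen disambiguation dialogue
```

The "ask_user" outcome corresponds to branch (b): the resulting on-screen dialogue both resolves the immediate ambiguity and supplies a labeled example for adapting the model.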
With the above overview in mind, refer now to FIG. 3, where the remote control unit and control circuit hardware are illustrated in detail. In FIG. 3, the components associated with the control circuit are shown generally at 60 and the components associated with the remote control unit are shown generally at 20. The consumer electronic equipment is shown at 48.
Beginning with the control circuit 60, a first processor or CPU 80 is attached to a bus 82, to which random access memory 84 and programmable nonvolatile random access memory 86 are attached. The first processor includes an input/output (I/O) module 88 that provides an I/O bus 90, to which an RF communication module 92 and consumer electronic product interface 94 are attached. The consumer electronic product interface 94, in turn, couples to the remaining circuitry of the consumer electronic equipment 48. The radio frequency communication module 92 includes an antenna and is designed to communicate with a corresponding communication module associated with the remote control unit 20.
The remote control unit 20 has a second processor 96 with an associated bus 98, random access memory 99 and nonvolatile programmable random access memory 100. The processor 96 also has an I/O module 102 that supports an I/O bus 104, to which a variety of sensors and other devices may be attached. Attached to the I/O bus 104 is the RF communication module 106 that communicates with its counterpart module 92 of the control circuit 60. The display illumination device 108 is also coupled to the I/O bus 104 so that the backlighting can be switched on and off to render any backlit graphical elements on the touchpad visible or invisible. A tactile feedback annunciator/speaker 110 is coupled to the I/O bus. The annunciator/speaker may be activated to produce tactile feedback (vibrations) as well as audible tones.
As previously discussed, the remote control unit includes an assortment of different sensors. These include the touchpad or touchpads 22, a button pad membrane switch assembly 112, an accelerometer 114 and capacitive sensors 36. The button pad membrane switch assembly may be physically disposed beneath the touchpads so that pressure upon the touchpad will effect a switch state change from off to on. If desired, the button pad membrane switch assembly 112 may employ pressure-sensitive switches that can register a range of pressures, as opposed to a simple on/off binary state.
Because the remote control unit is designed to sit on the coffee table when not in use, a battery power supply is preferred. Thus, the power supply 200 includes a removable battery 202 as well as a power management circuit 204. The power management circuit supplies power to the second processor 96 and to all of the modules within the remote control unit requiring power. Such modules include all of the sensors, display illumination, and speaker/annunciator components attached to the I/O bus 104. If desired, an RFID tag 206 may be included in the remote control unit circuitry. The RFID tag can be used to help locate the remote control unit from the control circuit 60 in the event the remote control unit is lost.
Further Implementation Details of Preferred Embodiments
The Touchpad Sensor
The touchpad sensor can be segmented to provide several different intuitive zones of interaction. The touchpad is also clickable by virtue of the button pad membrane switch assembly located beneath or embedded within it. The clickable touchpad can register pressure information and react to pressure (both mechanically and electrically) by sending a specific signal while providing sufficient haptic feedback to the user, such as through vibrations and sounds via the annunciator/speaker 110. The touchpad allows for the use of at least two contact points simultaneously (e.g., two-finger input), such as one contact point per side of the pad. The touchpad can be viewed as divided in two along a medial line (e.g., separating the right and left sides of the touchpad when held in a landscape orientation). The touchpad can thus be constructed using two single-position registering touchpads mounted side by side, or one multi-touch touchpad with the ability to register two points of contact at the same time with equal precision.
Physical Buttons
Although not required in all embodiments, the remote control unit may have a complement of physical buttons. In this regard, four buttons 32-35 have been illustrated in FIGS. 1A and 1B. These physical buttons may be implemented using the same button pad membrane switch assembly 112 (FIG. 3) embedded beneath the touchpad. The physical buttons, like the context-dependent virtual buttons on the touchpad surface, can be backlit to reveal button function names.
Redefining Regions of Interaction
To allow for natural operation, the remote control unit uses its pattern recognition system to interpret the sensory data. Included in the sensory data are inputs from the accelerometer or accelerometers and the capacitive sensors placed around the periphery and the bottom of the case. The user will naturally turn the remote control unit in his or her hands to best accommodate what he or she is trying to accomplish. The pattern recognition system interprets how the user is holding the remote control unit and redefines these zones of interaction so that they will appear to be at the same place, no matter how the remote is oriented. For instance, the remote control unit can be used with one or two hands, and in both landscape and portrait orientation. The pattern recognition system can discriminate the difference and will automatically redefine the zones of interaction so that the user can perform the most probable operations in the easiest manner for that user. The zones of interaction include, for example, different zones within the touchpad. Different regions of the touchpad may be dedicated to different functions or different user manipulation styles. In addition, the remote control unit itself can be manipulated into different virtual “zones of interaction” by employing different gestures with the remote in mid-air, such as a quick flick of the wrist to change channels.
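The zone-redefinition idea can be illustrated with a simple orientation-aware remapping. The quadrant names, zone names, and the single 180-degree flip below are hypothetical simplifications; the disclosure contemplates the pattern recognition system handling arbitrary grips and orientations.

```python
def remap_zone(zone, orientation):
    """Return the physical touchpad quadrant that implements a logical
    zone of interaction for a given holding orientation, so zones appear
    in the same place regardless of how the remote is held.

    Quadrant labels (NE/NW/SE/SW) and the "rotated_180" case are
    illustrative placeholders.
    """
    # Logical zone -> physical quadrant in the reference orientation.
    quadrants = {"volume": "NE", "channel": "NW", "mute": "SE", "ok": "SW"}
    # A 180-degree rotation swaps diagonally opposite quadrants.
    flipped = {"NE": "SW", "NW": "SE", "SE": "NW", "SW": "NE"}
    physical = quadrants[zone]
    if orientation == "rotated_180":
        physical = flipped[physical]
    return physical
```

With this kind of indirection, the firmware resolves touches against logical zones rather than fixed coordinates, which is what makes the unit direction- and holding-style invariant.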
Power Management
The presently preferred embodiment is designed for very low power consumption. For example, the remote control unit may run on a single AA or AAA battery, or batteries, for approximately one year. With currently available technology, the wireless circuitry associated with the RF modules consumes more power than the touch sensors, and the accelerometers and actuators consume less power than the touch sensors. For this reason, the power management circuitry 204 places the wireless circuitry in a sleep mode (or turns it off altogether) a short period of time after the remote control unit is no longer being used (e.g., 30 seconds). The touch sensors are then placed in sleep mode (or turned off) after a somewhat longer period of time (e.g., 2 minutes). This allows the wireless circuitry to be turned on again in case the user touches the surface of the touchpad or picks up the unit within two minutes. The accelerometers are put into a low power mode where the circuitry checks the accelerometer status at a much lower rate than the normal accelerometer refresh rate. In this regard, the normal refresh rate might be on the order of 50 Hz, whereas in the low power mode the refresh rate might be on the order of 1 Hz, or even 0.1 Hz. The power management circuitry 204 would implement a turn-on sequence that is essentially the reverse of the turn-off sequence, with the accelerometer refresh rate being increased to full rate first, followed by reactivation of the touch sensors and finally by activation of the wireless circuitry. In sleep mode, the RF modules may periodically be awakened to check whether there are any pending messages from the control circuit 60.
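The staged shutdown described above (wireless first, then touch sensors, with the accelerometer dropping to a slow polling rate) might be modeled as follows. The 30-second and 2-minute timeouts are the example values from the description; the subsystem names are illustrative.

```python
def power_state(idle_seconds, wireless_timeout=30, touch_timeout=120):
    """Return which subsystems remain fully active after a given idle
    time, per the staged shutdown: wireless sleeps first, then the
    touch sensors, at which point the accelerometer also drops from
    its normal 50 Hz refresh rate to a low-power polling rate."""
    state = {"wireless": True, "touch": True, "accel_rate_hz": 50}
    if idle_seconds >= wireless_timeout:
        state["wireless"] = False          # RF circuitry sleeps first
    if idle_seconds >= touch_timeout:
        state["touch"] = False             # touch sensors sleep next
        state["accel_rate_hz"] = 1         # slow accelerometer polling
    return state
```

Ordering the shutdown this way parks the most power-hungry subsystem (the RF circuitry) first while the cheapest sensor, the accelerometer, stays awake longest to detect pickup.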
In the presently preferred embodiment, the remote control unit does not have a dedicated power-on button, as this might be a potential source of user confusion as to whether such button powers on the remote control unit or the television. Thus, the pattern recognition system is used to handle power-on in an efficient manner. The remote control unit turns on when the user first picks it up. For this reason, the system first checks the lower resolution acceleration data to determine if the remote has been moved. If so, the capacitive sensors are next energized to determine if the remote is actually being held (as opposed to simply being inadvertently pushed or moved when resting on the coffee table). If the pattern recognition system determines that the remote control unit is being held, then next the touchpads and finally the wireless circuitry are activated.
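The power-on cascade just described (low-rate accelerometer data first, then the capacitive sensors, and only then the touchpads and wireless circuitry) can be sketched as follows. This is a simplified model of the described behavior, not actual firmware; the subsystem names are illustrative.

```python
def wake_sequence(moved, held):
    """Staged power-on check. `moved` reflects the low-resolution
    accelerometer data; `held` reflects the capacitive sensors once
    energized. Returns the subsystems activated, in order."""
    activated = []
    if not moved:
        return activated                 # no motion: stay asleep
    activated.append("capacitive_sensors")
    if not held:
        return activated                 # inadvertent bump: back to sleep
    activated.extend(["touchpads", "wireless"])
    return activated
```

The two-step check is what distinguishes the unit being picked up from it merely being nudged on the coffee table: the costlier subsystems are only energized once both conditions hold.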
Alternatively, power-on can be triggered by a specific gesture, such as shaking the remote control unit. More complex power-on operation can also be utilized, for example, to enforce parental control as discussed above in connection with parental blocking features.
The pattern recognition system will likewise detect when it is time to turn the remote control unit off, either by detecting inactivity or by detecting that the television has been turned off. This latter event would be detectable, for example, by information communicated via the RF modules.
Remote Finder
The control circuit 60, associated with the consumer electronic equipment, may include a button that sends a remote location message to the remote control unit. The user would push this button if the remote control unit has been misplaced. The control circuit would then periodically send a tell-me-where-you-are signal to the remote via RF. When the remote control unit's RF module next wakes up and finds this signal, it will activate the haptic feedback system (e.g., speaker/annunciator 110), causing the unit to make sound and/or vibrate, and may optionally use the display illumination circuitry 108 to turn the backlighting on. In addition, if desired, the remote control unit and the control circuitry can use RF ranging functionality to measure the distance between the remote control unit and the control circuit. This information can be used to display the distance on the display 50, or even to present a picture of the room with highlighted areas identifying where the remote control unit could be. Alternatively, the RFID tag 206 may be used, allowing the precise location of the remote control unit to be displayed on the display screen 50.
Tight Coupling Between Remote Control System and On-Screen User Interface
As illustrated by the previously discussed example regarding parental control, the remote control system is able to capitalize on its tight coupling with the on-screen information. The on-screen information, such as instructions on how to deactivate the parental blocking feature, may be stored in the programmable random access memory 86 of the control circuit (FIG. 3) and may then be projected onto the display 50 as an overlay upon the presently viewed program. By displaying information to the user on the display screen, the user does not need to look at the remote control unit in order to operate it. If the user needs to enter input, such as a spelled word, an overlay image of a keyboard may be presented, and the user can navigate to the desired keys by simply manipulating the touchpad while watching a cursor or cursors (one for each finger) on the displayed overlay keyboard. If desired, the remote control system circuitry can also obtain program guide information, and the display overlay can then allow the user to select which programs to view or record by simply manipulating the touchpad.
One can better understand the effectiveness of the remote control system by considering where the functionality of the system has been placed. Through tight integration with the display screen, the remote control system can use the display screen, with its high resolution graphics capability, to provide a virtually unlimited amount of visual information to the user, which would be impossible to provide through a set of dedicated buttons as conventional controllers do. The rich collection of diverse sensory inputs allows the user to adopt many different, and even redundant, ways of communicating the user's desires to the system. Interpretation of the diverse collection of sensory inputs by the pattern recognizer handles much of the complexity of converting the user's gestural and touch commands into message meaning data that correlate to functions that the consumer electronic equipment can perform. The resulting division of labor produces a control system that pairs highly visual, engaging information content regarding the user's control choices with an equally rich collection of gestural and touch commands that the user can employ to get his or her message across to the control system. Compare this to the conventional push button remote control that requires one button, or a sequence of buttons, to be pressed for each desired function, with the added inconvenience that the user must look at the remote control in order to find the desired button to push.
Referring now to FIGS. 4A through 10, other aspects of the present disclosure will be further discussed. Specifically, another embodiment of the remote control unit is illustrated and is indicated generally at 310. The remote control unit 310 is shown in detail in FIGS. 4A and 4B. The remote control unit 310 can be incorporated in a remote control system 312 illustrated in FIGS. 5-10 and discussed in greater detail below.
Referring initially to FIGS. 4A and 4B, the remote control unit 310 will be discussed in greater detail. The remote control unit 310 generally includes a casing 314. The casing 314 in some embodiments is generally elongate, rectangular, and box-like so as to be held comfortably in one or two hands. The casing 314 defines a first end 316, a second end 318 opposite the first end 316, a first side 320, and a second side 322 opposite the first side 320. The first and second sides 320, 322 are generally perpendicular to the first and second ends 316, 318. Furthermore, the casing 314 generally defines a top face 325. It will be appreciated that the remote control unit 310 can have any suitable shape without departing from the scope of the present disclosure.
The casing 314 also defines at least one imaginary cut plane that substantially bisects the remote control unit 310. In the embodiments represented in FIG. 4, the casing 314 defines a first imaginary cut plane X1 and a second imaginary cut plane X2. (The imaginary cut planes X1, X2 are each represented in FIG. 4 by broken lines.) The first imaginary cut plane X1 intersects the first and second sides 320, 322 midway between the first and second ends 316, 318 and also intersects the top face 325. The second imaginary cut plane X2 is substantially perpendicular to the first cut plane X1 and intersects the first and second ends 316, 318 midway between the first and second sides 320, 322. Also, the second imaginary cut plane X2 intersects the top face 325 of the remote control unit 310. As shown in the embodiments represented in FIG. 4A, the casing 314 is substantially symmetric about each of the first and second imaginary cut planes X1, X2. It will be appreciated that the casing 314 could be symmetric about only one of the imaginary cut planes X1, X2 without departing from the scope of the present disclosure. It will also be appreciated that one or more of the imaginary cut planes X1, X2 could bisect the remote control unit 310 at any suitable location.
The remote control unit 310 further includes a transmitter, schematically illustrated at 326. The transmitter 326 is operable for transmitting one or more control signals for controlling an electronic device, such as a television, audio equipment, air conditioning equipment, ceiling fans, or any other suitable device. It will be appreciated that the remote control unit 310 can control any suitable electronic device remotely, as will be discussed. Furthermore, the transmitter 326 can be of any suitable type. In some embodiments, the transmitter 326 transmits radio frequency (RF) signals; however, it will be appreciated that the transmitter 326 can be any suitable multi-directional transmitter. It will also be appreciated that the transmitter 326 can be any suitable directional transmitter, such as an infrared (IR) transmitter, without departing from the scope of the present disclosure.
The remote control unit 310 also includes a plurality of input features, generally indicated at 328. The input features 328 can be of any suitable type, such as movable buttons, touchpads, dials, joysticks, and the like. As will be described, a user manipulates one or more of the input features 328 to cause the transmitter 326 to transmit the control signal for controlling the associated electronic device. For instance, in the embodiments represented in FIG. 5, the remote control unit 310 is used to control a television 330 having a receiver 332. When the input features 328 are manipulated, the transmitter 326 transmits one or more control signals currently associated with the input features 328 that the user manipulates. Once the receiver 332 receives the transmitted control signal(s), the television 330 operates accordingly. It will be appreciated that the remote control unit 310 can be used for any suitable control of the television 330, such as channel control, volume control, power on/off, and the like.
The remote control system 312 can also include a display 346. In the embodiment of FIG. 5, the display 346 is included on the television 330; however, it will be appreciated that the display 346 can be separate from the electronic device controlled by the remote control unit 310. It will be appreciated that the display 346 can also be included on the remote control unit 310 itself. The display 346 displays a virtual representation of the remote control unit 310 (i.e., a virtual remote control unit 348 with virtual input features 328). In some embodiments, when the user picks up or otherwise contacts the remote control unit 310, the display 346 automatically displays the virtual remote control unit 348. In the embodiment shown, the virtual remote control unit 348 is substantially similar in appearance to the actual remote control unit 310. Also, the display 346 displays a plurality of icons 350. The icons 350 are displayed so as to indicate the functions associated with each input feature 328. Also, the display 346 displays a cursor 352 corresponding to the location of the user's finger or stylus on the remote control unit 310. The user moves the cursor 352 by moving a finger over the remote control unit 310, as will be discussed. In some embodiments, the cursor 352 is in the shape of a thumb.
In the embodiment shown, the input features 328 of the remote control unit 310 include a first touch-sensitive area 334a and a second touch-sensitive area 334b. The touch-sensitive areas 334a, 334b are distinct from each other and separated at a distance so as to define a first touchpad 336a and a second touchpad 336b. The first and second touchpads 336a, 336b can be of any suitable type and can recognize when and where the user touches the touchpad 336a, 336b. The touchpads 336a, 336b can also trace movement of the user's finger(s) thereon for movement of the cursor 352. Furthermore, in some embodiments, each of the touchpads 336a, 336b can detect when the user touches with two fingers simultaneously. Moreover, in some embodiments, each touchpad 336a, 336b can recognize contact with the user's skin and/or when the user contacts the touchpad 336a, 336b with a stylus or other indicating device. Also, the touchpads 336a, 336b can be configured to be movable (i.e., clickable) for providing further user input.
Moreover, in the embodiment shown, the remote control unit 310 includes a plurality of movable buttons disposed generally between the first and second touchpads 336a, 336b. More specifically, in the embodiment shown, the remote control unit 310 includes a central button 338a, a first end button 338b, a second end button 338c, a first rocker button 338d, a second rocker button 338e, a third rocker button 338f, and a fourth rocker button 338g. The central button 338a is located generally in a central location on the top face 325. The first and second end buttons 338b, 338c are located on opposite sides of the central button 338a. The first and second rocker buttons 338d, 338e are located on a side of the central button 338a opposite that of the third and fourth rocker buttons 338f, 338g. It will be appreciated that the remote control unit 310 can include any number and any style of buttons without departing from the scope of the present disclosure. Furthermore, it will be appreciated that the remote control unit 310 can include any style of input features 328, including those other than touch-sensitive areas and buttons.
Manipulation of the input features 328 (e.g., pressing the buttons 338a-338g and touching the touchpads 336a, 336b) selectively causes the transmitter 326 to transmit an associated control signal. This will be described in greater detail below.
As shown in FIG. 4, the input features 328 (i.e., the touchpads 336a, 336b and the buttons 338a-338g) are collectively disposed in a substantially symmetric manner with respect to the first and second imaginary cut planes X1, X2. In other words, the position and shape of the input features 328 are substantially symmetric with respect to the first and second cut planes X1, X2. Specifically, in the embodiment shown, the first and second touchpads 336a, 336b are located on opposite sides of, and are disposed at substantially equal distances from, the first cut plane X1. Moreover, the first and second touchpads 336a, 336b are shaped substantially the same. Moreover, each of the first and second touchpads 336a, 336b is substantially bisected by the second cut plane X2. Furthermore, the array of buttons 338a-338g is substantially bisected by each of the first and second cut planes X1, X2. It will be appreciated, however, that the input features 328 could be symmetric about only one of the cut planes X1, X2 without departing from the scope of the present disclosure. It will also be appreciated that the input features 328 could be symmetric about more than two cut planes.
As will be described in greater detail, the symmetrical layout of the input features 328 allows for various advantages. For instance, the array of input features 328 appears the same in multiple orientations and holding positions. As such, the remote control unit 310 can be operated in a very intuitive manner, as will be described.
The remote control unit 310 can also include at least one sensor 340 for detecting the way the user is holding the remote control unit 310. In other words, the sensor 340 detects one of a plurality of holding positions of the remote control unit 310. The sensor 340 can be of any suitable type, such as an acceleration sensor, a contact sensor, a capacitive sensor, a pressure sensor, and the like. For instance, in some embodiments, the sensor 340 detects areas of contact between the user's hand and the remote control unit 310 to detect the holding position of the remote control unit 310. Furthermore, in some embodiments, the sensor 340 is an accelerometer that detects movement of the remote control unit 310, for instance, detecting that the remote control unit 310 has been inverted or otherwise rotated. Pattern recognition methods and features described above can be used to detect the holding position of the remote control unit 310. In some embodiments, the sensor 340 detects and distinguishes between a first holding position and a second holding position. The first holding position and the second holding position are substantially opposite each other. For instance, in some embodiments, the first holding position is inverted with respect to the second holding position, as will be described in greater detail. Furthermore, in some embodiments, the user holds the remote control unit 310 in a right hand in the first holding position, and the user holds the remote control unit 310 in a left hand in the second holding position, as will be described.
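The holding-position detection described above can be illustrated with a brief sketch. The function name, the single-axis convention, and the threshold below are assumptions for illustration only; the disclosure does not prescribe a particular algorithm.

```python
def detect_holding_position(accel_axis: float) -> str:
    """Classify the unit's orientation from one accelerometer axis reading.

    Hypothetical convention: a positive reading means the first end (316)
    points away from the user (first holding position); a negative reading
    means the unit has been inverted (second holding position).
    """
    return "first" if accel_axis >= 0.0 else "second"
```

A contact-sensor variant would classify the grip from which touch regions are covered instead of from acceleration, but the controller-facing result, a discrete holding-position label, would be the same.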
Moreover, as shown in FIG. 4A, the remote control unit 310 includes a controller 342. The controller 342 can include any suitable hardware and/or software. Also, the controller 342 can be housed within the casing 314 and/or can be disposed outside the casing 314 of the remote control unit 310. The controller 342 includes a functional map, which associates a plurality of functions 344 with corresponding ones of the input features 328 of the remote control unit 310.
For instance, in the embodiment of FIG. 5, the remote control unit 310 controls the television 330. The television 330 includes various functions 344, such as power on/off, volume control, channel control, switching the input source, mute, and entry of alphanumeric symbols. Each of these functions of the television 330 can be controlled by manipulating one or more of the input features 328 of the remote control unit 310. The map of the controller 342 associates each of the functions 344 with one or more of the input features 328. For instance, the power on/off function can be associated with the central button 338a in the map of the controller 342. As such, when the user presses the central button 338a, the television 330 turns on or off. In some embodiments, the most commonly used functions of the television 330 are associated in the map with the buttons 338a-338g for simple control of the television 330. Also, in some embodiments, other less common functions of the television 330 are associated with the touchpads 336a, 336b of the remote control unit 310.
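The functional map can be pictured as a simple lookup table from input features to device functions. This is a minimal sketch; apart from the power on/off example above and the mute example given later, the feature-to-function pairings and all names are illustrative assumptions.

```python
# Hypothetical functional map: input features -> television functions.
FUNCTION_MAP = {
    "central_button": "power_on_off",   # example given in the disclosure
    "first_end_button": "input_source",  # assumed pairing
    "second_end_button": "mute",
    "first_rocker": "channel_up",
    "second_rocker": "channel_down",
    "third_rocker": "volume_up",
    "fourth_rocker": "volume_down",
}


def function_for(feature: str) -> str:
    # Return the function currently associated with a manipulated feature;
    # the transmitter would then emit the control signal for that function.
    return FUNCTION_MAP[feature]
```

Keeping the map as data rather than hard-wiring button meanings is what makes the later remapping by holding position straightforward.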
As will be described, the controller 342 changes the association of the functions 344 and the input features 328 depending on the holding position detected by the sensor 340 of the remote control unit 310. As such, the remote control unit 310 can operate substantially the same in multiple holding positions. Also, as will be described, the mapping of the functions 344 to the input features 328 can be changed depending on the detected holding position such that the functions 344 are associated with input features 328 in more convenient locations on the remote control unit 310. As such, the remote control unit 310 can be operated in a more ergonomic and intuitive manner.
Referring now to FIGS. 5 and 6, a comparison will be made of the operation of the remote control unit 310 in multiple holding positions. More specifically, in FIG. 5, the remote control unit 310 is held such that the first end 316 is oriented outward relative to the user, the second end 318 is oriented inward relative to the user, and so on. In contrast, in FIG. 6, the remote control unit 310 is held with the second end 318 oriented outward relative to the user, the first end 316 oriented inward relative to the user, and so on. In other words, the remote control unit 310 is inverted in FIG. 6 as compared to the holding position shown in FIG. 5. Because of the symmetrical layout of the input features 328, the remote control unit 310 appears substantially the same to the user in both holding positions. Also, when the sensor 340 detects the holding position of FIG. 5, the controller 342 maps (i.e., associates) the functions 344 with corresponding input features 328; however, when the sensor 340 detects the holding position of FIG. 6, the controller 342 remaps the functions 344 to those input features 328 on the opposite side of the first cut plane X1.
More specifically, in the holding position of FIG. 5, the numeric input functions 344 (i.e., represented by icons 0 through 9) are mapped to the first touchpad 336a, but in the holding position of FIG. 6, the numeric input functions 344 are mapped to the second touchpad 336b. Similarly, the icons 350 representing the numeric input functions 344 are displayed on the first touchpad 336a in the holding position of FIG. 5, but the icons 350 are displayed on the second touchpad 336b in the holding position of FIG. 6. The orientation of the icons 350 displayed in FIG. 5 is inverted across the first cut plane X1 with respect to the orientation displayed in FIG. 6 such that the icons appear right side up.
Likewise, the controller 342 remaps the functions 344 associated with the movable buttons 338a-338g when the holding position is changed from the holding position of FIG. 5 to the holding position of FIG. 6. For instance, in one embodiment, in the holding position of FIG. 5, the mute function 344 is associated with the second end button 338c, but in the inverted holding position of FIG. 6, the mute function 344 is associated with the first end button 338b.
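The inversion remap can be sketched as a mirror table applied to the functional map: each function moves to the input feature on the opposite side of the first cut plane X1. The mirror pairs below cover only the pairings stated in the text (end buttons and touchpads); features lying on X1 itself, such as the central button, keep their functions. All names are illustrative assumptions.

```python
# Hypothetical mirror pairs across the first cut plane X1.
MIRROR_X1 = {
    "first_end_button": "second_end_button",
    "second_end_button": "first_end_button",
    "first_touchpad": "second_touchpad",
    "second_touchpad": "first_touchpad",
}


def remap_for_inversion(function_map: dict) -> dict:
    # Move each function to the feature mirrored across X1; features not
    # listed in MIRROR_X1 (e.g. the central button) are left unchanged.
    return {MIRROR_X1.get(f, f): fn for f, fn in function_map.items()}
```

With mute on the second end button in the FIG. 5 position, applying the remap for the inverted FIG. 6 position moves mute to the first end button, matching the example above.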
Accordingly, the remote control unit 310 can be picked up in either of the inverted positions without looking at it, and the user can immediately begin using it. As such, the remote control unit 310 can be used in a highly intuitive and convenient fashion. Furthermore, because the icons 350 are remapped by the controller 342 and displayed on the display 346, the remote control unit 310 can effectuate a wide variety of functions 344 without the user having to look at the remote control unit 310.
Referring now to FIGS. 7 and 8, mapping of the functions 344 is further illustrated with respect to additional opposite holding positions. For instance, in the embodiment of FIG. 7, the remote control unit 310 is held in the right hand of the user, but in the embodiment of FIG. 8, the remote control unit 310 is held in the left hand of the user. When the user holds the remote control unit 310 in the right hand (FIG. 7), the functions 344 are associated with certain corresponding input features 328; however, when the user holds the remote control unit 310 in the left hand (FIG. 8), the controller 342 remaps the functions 344 to the input features 328 on the opposite side of the second imaginary cut plane X2.
For instance, in one embodiment, the channel control functions 344 are associated with the first and second rocker buttons 338d, 338e, and the volume control functions 344 are associated with the third and fourth rocker buttons 338f, 338g when the remote control unit 310 is held in the right hand (FIG. 7). However, when the remote control unit 310 is held in the left hand, the channel control functions 344 are associated with the third and fourth rocker buttons 338f, 338g, and the volume control functions 344 are associated with the first and second rocker buttons 338d, 338e. As such, the channel control functions 344 can be located closer to the thumb of the user for easier access in both holding positions.
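The left-hand/right-hand remap across the second cut plane X2 can be sketched the same way, with a mirror table limited to the rocker-pair swap described in the channel/volume example. Treating the right-hand grip as the baseline mapping is an assumption, as are the names.

```python
# Hypothetical mirror pairs across the second cut plane X2
# (first/second rockers swap with third/fourth rockers).
MIRROR_X2 = {
    "first_rocker": "third_rocker",
    "second_rocker": "fourth_rocker",
    "third_rocker": "first_rocker",
    "fourth_rocker": "second_rocker",
}


def remap_for_hand(function_map: dict, hand: str) -> dict:
    # Right hand is taken as the baseline; for the left hand, each
    # function moves to the rocker mirrored across X2, keeping channel
    # control under the thumb in either grip.
    if hand == "right":
        return dict(function_map)
    return {MIRROR_X2.get(f, f): fn for f, fn in function_map.items()}
```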
Also, the icons 350 shown on the display 346 are relocated to correspond to the mapping performed by the controller 342. Furthermore, it will be appreciated that any one of the functions 344 and associated icons 350 can be remapped and re-associated as described above, including the functions 344 and icons 350 associated with the touchpads 336a, 336b.
Moreover, the cursor 352 can change depending on the holding position detected by the sensor 340. In the embodiment shown, for instance, when the remote control unit 310 is held in the right hand, a right thumb is displayed as the cursor 352, but when the remote control unit 310 is held in the left hand, a left thumb is displayed as the cursor 352. As such, operation of the remote control unit 310 is less likely to confuse the user.
Referring now to FIGS. 9 and 10, operation of the remote control unit 310 is discussed further. In the embodiment shown, when the remote control unit 310 is turned to a substantially horizontal position (i.e., a landscape orientation), the sensor 340 detects the change in orientation. As a result, the controller 342 automatically causes the system 312 to enter a text entry mode. More specifically, the display 346 displays a keyboard arranged in any suitable fashion. In the embodiment shown, the display 346 displays a QWERTY keyboard. Also, the display 346 displays text suggestions 360, which suggest complete words that the user can select based on previously inputted text. Also, in the embodiment shown, the remote control unit 310 can be operated using two hands, with one thumb on one of the first and second touchpads 336a, 336b and the other thumb on the other touchpad 336a, 336b. The display 346 also displays a corresponding right and left thumb as the cursors 352. Furthermore, the display 346 highlights the individual keys that the cursor 352 overlaps for easier text input.
In comparing FIGS. 9 and 10, it is shown that the controller 342 remaps the input features 328 such that the input features 328 can be manipulated in the same manner regardless of whether the first side 320 or the second side 322 is held outward from the user. More specifically, if the first side 320 is held outward from the user (FIG. 9), the first touchpad 336a can be operated with the left thumb and the second touchpad 336b can be operated with the right thumb. In contrast, if the second side 322 is held outward from the user (FIG. 10), the second touchpad 336b can be operated with the left thumb, and the first touchpad 336a can be operated with the right thumb. As such, the user can use the remote control unit 310 in the same fashion regardless of the horizontal (i.e., landscape) holding position. The controller 342 remaps the text entry functions 344 as described above such that the user can operate the remote control unit 310 in the same manner in both orientations shown in FIGS. 9 and 10.
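The landscape text-entry remap reduces to assigning keyboard halves to the touchpads based on which long side faces outward. A minimal sketch follows; the side and touchpad names, and the association of the first side with the FIG. 9 orientation, are assumptions drawn from the comparison above.

```python
def assign_keyboard_halves(outward_side: str) -> dict:
    """Assign QWERTY keyboard halves to the touchpads in landscape mode.

    Whichever touchpad falls under the left thumb receives the left half
    of the keyboard, so the unit operates identically regardless of
    which long side faces outward from the user.
    """
    if outward_side == "first_side":  # FIG. 9 orientation (assumed)
        return {"left_half": "first_touchpad", "right_half": "second_touchpad"}
    # Second side outward (FIG. 10 orientation): touchpad roles swap.
    return {"left_half": "second_touchpad", "right_half": "first_touchpad"}
```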
In summary, the symmetric design and remapping operation of the controller 342 allow for substantially intuitive user interaction with the remote control unit 310. As such, the remote control unit 310 can be operated more easily and conveniently. Furthermore, the heads-up operation enabled by the display 346 allows the remote control unit 310 to be operated in the dark, without having to look at the remote control unit 310. The remote control unit 310 can simply be picked up, and the user can begin operating the remote control unit 310 almost immediately.
Moreover, the foregoing discussion discloses and describes merely exemplary embodiments of the present disclosure. One skilled in the art will readily recognize from such discussion, and from the accompanying drawings and claims, that various changes, modifications and variations may be made therein without departing from the spirit and scope of the disclosure as defined in the following claims.