CROSS REFERENCE TO RELATED APPLICATIONS This patent application is a divisional application of commonly assigned and co-pending U.S. application Ser. No. 10/233,212 filed on 30 Aug. 2002, the contents of which are incorporated herein by reference and from which benefits under 35 U.S.C. 120 are claimed.
FIELD OF THE DISCLOSURE The present inventions relate generally to mobile wireless communications devices, and more particularly to user-enriching events in wireless communications devices, for example in cellular communications handsets, and methods therefor.
BACKGROUND As consumers in the competitive wireless cellular communications handset market become more sophisticated, the successful marketing of cellular handsets depends upon the ability of manufacturers and network providers to offer more than basic features. Cellular handsets are now viewed by many consumers as apparel items integrated as a part of the individual being. Consumers also increasingly desire the ability to customize and personalize their handsets as a form of self-expression to reflect changes in mood or psychological disposition, to differentiate from others, to associate with peers, etc.
It is known to generate audio sounds upon the occurrence of specified events on cellular telephone handsets. The Motorola Timeport 280, for example, produces a sound when a charger cable is connected thereto. However, the user has no control over this audible signal. The Motorola V60 cellular handset enables the association of different user specified audio alerts with different incoming communications including calls and e-mail.
The various aspects, features and advantages of the present disclosure will become more fully apparent to those having ordinary skill in the art upon careful consideration of the following Detailed Description with the accompanying drawings described below.
BRIEF DESCRIPTION OF THE DRAWINGS FIG. 1 is an exemplary mobile cellular communications handset having a pivoting panel.
FIG. 2 is an exemplary cellular handset housing configuration detection switch.
FIG. 3 is a schematic electrical block diagram for an exemplary cellular communications handset.
FIG. 4 is a process flow diagram for one exemplary cellular handset mode of operation.
FIG. 5 is an exemplary process flow diagram for associating a sensory output with an event occurring on a wireless communications handset.
FIG. 6 is a process flow diagram for another exemplary cellular handset mode of operation.
FIG. 7 is a process flow diagram for yet another exemplary cellular handset mode of operation.
DETAILED DESCRIPTION In FIG. 1, an exemplary cellular handset 100 includes a housing having a cover portion, or flip, 110 pivotally coupled to a housing portion 120. A user interface is exposed upon opening the flip 110. The exemplary user interface includes a display 112 and an audio output 114 on the cover portion, and an input keypad 122 including alpha/numeric keys and other controls on the housing portion 120.
In FIG. 1, the housing includes a switch for sensing whether the pivotal cover portion 110 is opened or closed relative to the housing. FIG. 2 is an enlarged view of a housing portion 220 including a cover position-detecting switch 222 disposed near the cover hinge. The switch is actuated upon pivoting the cover 210, which includes a protruding member 212 for engaging and actuating the switch. The switch and its location are exemplary and not intended to limit the disclosure, as many other switches and configurations are suitable for detecting the position of the pivoting cover.
In other embodiments, the housing may have a portion that rotates, for example, a blade that rotates to cover and expose a user interface. The blade position may be detected by a switch or by a rotary encoder, or by some other position detecting device. Other handset housings include sliding housing covers or portions, the position of which may also be detected by a sensor or switch.
In FIG. 3, an exemplary schematic block diagram of a mobile wireless communications device 300 includes a processor 310 coupled to memory 320, a display 330, and a radio frequency (RF) transceiver 340. In one embodiment, the transceiver is for communicating within service provider network infrastructures. In other embodiments, the wireless device receives and transmits over small area networks, for example, Bluetooth and IEEE 802.11b.
In FIG. 3, user inputs 350, for example, a microphone, keypad, scrolling input device, joystick, data input jack, infrared signal input, accessory connectors, etc., are also coupled to the processor 310. The processor is coupled to outputs 360, for example, a speaker, audio output jack, etc. The exemplary configuration is not intended to limit the disclosure, as other architectures may be implemented.
In FIG. 3, a housing actuation input 370 is also coupled to the processor for indicating the position of a mechanically actuatable portion of the mobile wireless communications device, for example, a user interface cover or some other actuating portion. The housing actuation input 370 of FIG. 3 corresponds, for example, to the position-detecting switch 222 of FIG. 2, or to any other mechanically actuatable housing portion. The switch is not required in all embodiments of the invention; for example, some embodiments thereof do not include an actuatable user interface cover.
In the process flow diagram of FIG. 4, at block 410, a mechanical portion of the wireless device is actuated. This actuation may be the translating, pivoting, or rotating action of a housing cover portion, or of some other mechanically actuatable portion thereof. The actuation of the mechanical portion may also be the depression of one or more input keys, the actuation of a switch, the extension of a retractable antenna, or the connection of an accessory, for example, a plug-in charger, a camera, earphones, etc.
In FIG. 4, at block 420, a user-configurable sensory output of the mobile wireless communication device is produced upon actuating a mechanical portion of the mobile wireless communication device.
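By way of illustration only, the flow of blocks 410 and 420 may be sketched as a simple dispatcher that produces the user's configured output when a mechanical actuation is reported. The class, method, and event names below are hypothetical and form no part of the disclosure.

```python
# Illustrative sketch of FIG. 4 (blocks 410-420): a mechanical actuation
# event triggers whatever sensory output the user has configured for it.
# All identifiers here are assumptions chosen for this example.

class SensoryOutputDispatcher:
    def __init__(self):
        # Maps a mechanical event name to the user's configured output.
        self._outputs = {}

    def configure(self, event, output):
        """Associate a user-selected sensory output with a mechanical event."""
        self._outputs[event] = output

    def on_actuation(self, event):
        """Called by the hardware layer upon actuation; returns the output produced."""
        output = self._outputs.get(event)
        if output is not None:
            # A real handset would drive the speaker, vibrator, or lights here.
            return f"producing {output} for {event}"
        return None  # no output configured for this actuation


dispatcher = SensoryOutputDispatcher()
dispatcher.configure("flip_open", "creak.wav")
print(dispatcher.on_actuation("flip_open"))   # producing creak.wav for flip_open
print(dispatcher.on_actuation("flip_close"))  # None
```

The dispatcher deliberately leaves unconfigured actuations silent, consistent with the output being user-configurable rather than fixed by the manufacturer.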
In the process flow diagram 500 of FIG. 5, at block 510, the user selects a sensory output from a plurality of sensory outputs, for example, at a user configuration menu. At block 520, the selected sensory output is associated with a particular event on the wireless communication device.
The event selected at block 520 may be the mechanical actuation of a portion of the device, examples of which are discussed above, including the rotation or translation of a cover portion, the depression of one or more input keys, the extension or retraction of a whip antenna, the opening or removal of a compartment cover, for example, a battery compartment cover or a faceplate, or the actuation of some other mechanical portion of the device. In another embodiment, the user may select, or re-map, one or more sensory outputs associated with the depression of each input key.
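The per-key re-mapping embodiment may be sketched as a map from keys to outputs that falls back to a default tone for keys the user has not re-mapped. The names and the default tone below are hypothetical illustrations only.

```python
# Illustrative sketch of re-mapping the sensory output of each input key
# (FIG. 5, blocks 510-520). Identifiers and the default tone are assumptions.

DEFAULT_KEY_TONE = "dtmf"  # keys the user has not re-mapped keep a stock tone


class KeypadOutputMap:
    def __init__(self):
        self._map = {}

    def remap(self, key, output):
        """Block 520: associate a user-selected output with one input key."""
        self._map[key] = output

    def output_for(self, key):
        """Look up the output to produce when the given key is depressed."""
        return self._map.get(key, DEFAULT_KEY_TONE)


keys = KeypadOutputMap()
keys.remap("1", "chime.wav")
print(keys.output_for("1"))  # chime.wav
print(keys.output_for("2"))  # dtmf
```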
In one embodiment, the user-configurable sensory output is an audio output, for example, a melodic sound, or an audio message, or some other sound clip. In some embodiments, the sound produced is related to the action performed, for example, a “Creeeeeak” sound may be produced as the cover pivots open, or a “Zzzzzzzip” sound may be produced as an antenna whip is withdrawn or retracted.
In other embodiments, the user-configurable sensory output is a tactile sensation, which may be in the form of a buzz or it may be a more melodic or rhythmic tactile sensation. In some embodiments, the tactile output is produced in concert with some other sensory output, for example, in synchronization with a melodic audio output.
The user-configurable sensory output may also be the production of some visual stimulation, for example, an image on the display. The visual image may be a still image or a dynamic video image, like a short video clip.
In FIG. 1, the wireless device 100 includes a vanity light 130 disposed along a side thereof, or on some other portion of the device, for emitting light upon the occurrence of a user-specified event. In one embodiment, the visual sensory output is the illumination of one or more vanity lights upon the occurrence of the event specified at block 520 in FIG. 5. The sensory output may also be the illumination of the display, alone or in addition to the illumination of the vanity lights. The lights may be configured to flash or to provide steady brightness depending on the user's preference. The lighting may also be synchronized with other sensory outputs, for example, with audio and tactile outputs.
In other embodiments, the user-configurable sensory output may be a thermal output, for example, a change in temperature of the wireless device or a portion thereof, or an olfactory sensory output. Generally, one or more of the user-configurable sensory outputs may be produced in combination, either serially or in parallel, in response to actuating the mechanical portion of the wireless device.
In one embodiment, at block 510 of FIG. 5, the user may also configure properties of the sensory output selected, for example, the audio volume, or the fade-in and fade-out of the sensory output, among other properties.
In FIG. 4, at block 430, in some embodiments, the sensory output terminates after a specified time period. In one embodiment, the user may specify that the sensory output fade out slowly; for example, audio outputs may fade out to an inaudible volume level.
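The timed fade-out of block 430 may be sketched as a linear volume ramp that reaches zero at the end of the user-specified period. The function and parameter names below are assumptions for illustration.

```python
# Illustrative sketch of block 430: an audio output fades linearly to an
# inaudible level over a user-specified period, then terminates.
# Function and parameter names are assumptions, not from the disclosure.

def fade_out_levels(start_volume, duration_s, step_s=1.0):
    """Return the volume level at each step until the output terminates."""
    levels = []
    t = 0.0
    while t < duration_s:
        remaining = 1.0 - t / duration_s  # fraction of the fade remaining
        levels.append(round(start_volume * remaining, 3))
        t += step_s
    levels.append(0.0)  # output terminates after the specified time period
    return levels


print(fade_out_levels(8.0, 4.0))  # [8.0, 6.0, 4.0, 2.0, 0.0]
```

A handset would feed each level to its audio hardware at the corresponding instant; the linear ramp is only one choice, and an exponential or stepped fade could be substituted.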
In another embodiment, the event specified at block 520 in FIG. 5 is the transitioning of the wireless device between a reduced power consumption mode and a relatively higher power consumption mode, for example, between sleep and active modes. Wireless handsets generally transition from active mode to sleep mode after some period of inactivity to conserve power. The handset transitions to the active mode in response to some user input, for example, upon depressing an input key or upon actuating some other mechanical portion thereof. The user may specify whether the sensory output occurs when the device assumes the active mode or the sleep mode, or both. Also, different sensory outputs may be associated with the transition depending upon the direction of the change in state.
In the process flow diagram 600 of FIG. 6, at block 610, the mobile wireless communication device transitions between a reduced power consumption mode and a relatively higher power consumption mode. Many events prompt the wireless device to transition between modes. The wireless device may transition between a sleep mode and an active mode upon actuating a mechanical portion of the mobile wireless communications device, for example, by actuating a cover portion, or by depressing an input key or other button or switch.
In FIG. 6, at block 620, a user-configurable sensory output of the mobile wireless communication device is produced upon transitioning the mobile wireless communication device between modes.
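The direction-dependent behavior described above, in which a different output may be associated with each direction of the mode change, may be sketched as a table keyed on the (from, to) mode pair. All names below are hypothetical.

```python
# Illustrative sketch of FIG. 6 (blocks 610-620): a different user-configured
# output may be produced for each direction of the sleep/active transition.
# Class, mode, and output names are assumptions for this example.

class PowerModeNotifier:
    def __init__(self):
        self.mode = "active"
        self._on_transition = {}  # (from_mode, to_mode) -> configured output

    def configure(self, from_mode, to_mode, output):
        """Associate an output with one direction of the mode change."""
        self._on_transition[(from_mode, to_mode)] = output

    def transition(self, new_mode):
        """Block 610: change modes; block 620: return the output produced."""
        key = (self.mode, new_mode)
        self.mode = new_mode
        return self._on_transition.get(key)


notifier = PowerModeNotifier()
notifier.configure("active", "sleep", "soft_chime")
notifier.configure("sleep", "active", "wake_melody")
print(notifier.transition("sleep"))   # soft_chime
print(notifier.transition("active"))  # wake_melody
```

Keying on the ordered pair rather than the destination mode alone is what lets the user hear one output on entering sleep and a different one on waking.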
In another embodiment, the event selected at block 520 in FIG. 5 is the transitioning between power-on and power-off modes of operation of the mobile wireless communication device. The user may specify whether the sensory output occurs when the device is turned on and/or when it is turned off, and may associate different sensory outputs depending upon the direction of the transition. At block 520, one or more user-specified sensory outputs are associated with the transitioning between off and on modes. Thereafter, upon applying or removing power, the associated sensory output is produced, according to the user's selection. In some embodiments, the user-configurable sensory output terminates after a specified time period.
In another embodiment, the mobile wireless communication device receives information from a communications service provider associated with an event that occurs on the mobile wireless communication device, whereby the occurrence of the event initiates the production of the sensory output on the wireless device. The temporary sensory output thus communicates information received from the communications service provider upon the occurrence of the event. In this embodiment, the service provider selects the sensory output and associates it with an event, for example, when the mobile wireless communication device transitions between power-off and power-on modes of operation, or some other event.
In one embodiment, the sensory output that communicates information received from the communications network is the displaying of visual information, for example, a still image or a short video clip. In some embodiments, corresponding audio and/or tactile information, also received from the service provider, is produced in concert with the visual information. According to this embodiment, the sensory output is controlled by the network service provider upon the occurrence of the specified event, for example, to communicate important service-related information to the user from the service provider or from third parties. The service provider may update the information by transmitting new information to the wireless device, for example, in a broadcast message or in a point-to-point message.
In another mode of operation, illustrated in the process flow diagram 700 of FIG. 7, at block 710, the mobile wireless communication device undergoes a change in reception of a radio signal from a source other than the communications service provider, for example, a Bluetooth signal, an IEEE 802.11b signal, an infrared signal, or some other signal.
At block 720, a user-configurable sensory output of the mobile wireless communication device is produced upon undergoing the change in reception of the radio signal from the source other than the communications service provider. The sensory output may be, for example, an audio signal alerting the user that the wireless device is receiving the signal or is no longer receiving the signal. At block 730, the user-configurable sensory output is terminated after a specified time period.
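The flow of blocks 710 through 730 may be sketched as a monitor that alerts only when reception actually changes, with each alert lasting a fixed period. The class name, strings, and duration below are assumptions for illustration.

```python
# Illustrative sketch of FIG. 7 (blocks 710-730): alert the user when
# reception of a short-range signal (e.g., Bluetooth) is acquired or lost,
# with the alert terminating after a specified period. Names and the
# duration are assumptions, not from the disclosure.

class ShortRangeSignalMonitor:
    def __init__(self, alert_duration_s=2.0):
        self.in_range = False
        self.alert_duration_s = alert_duration_s

    def update(self, signal_detected):
        """Block 710: report reception state; returns an alert on change, else None."""
        if signal_detected == self.in_range:
            return None  # no change in reception, so no output (block 720 skipped)
        self.in_range = signal_detected
        kind = "acquired" if signal_detected else "lost"
        # Block 730: a real handset would stop the alert after this duration.
        return f"signal {kind}: alert for {self.alert_duration_s}s"


monitor = ShortRangeSignalMonitor()
print(monitor.update(True))   # signal acquired: alert for 2.0s
print(monitor.update(True))   # None
print(monitor.update(False))  # signal lost: alert for 2.0s
```

Edge-triggered alerting (on the change, not the state) avoids repeating the output on every poll while the device remains in or out of range.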
While the present inventions and what are considered presently to be the best modes thereof have been described in a manner that establishes possession thereof by the inventors and that enables those of ordinary skill in the art to make and use the inventions, it will be understood and appreciated that there are many equivalents to the exemplary embodiments disclosed herein and that myriad modifications and variations may be made thereto without departing from the scope and spirit of the inventions, which are to be limited not by the exemplary embodiments but by the appended claims.