TECHNICAL FIELD
This disclosure relates to user interfaces incorporating a visual display and/or a touch-sensitive control.
BACKGROUND
Part of enjoying the playing of an audio/visual program (e.g., a piece of music, a recorded lecture, a recorded live performance, a movie, a slideshow, family pictures, an episode of a television program, etc.) is the task of selecting the desired audio/visual program to be played. Unfortunately, the increasing variety of choices of sources of audio/visual programs and the increasing variety of mechanisms by which audio/visual programs are able to be stored and played have greatly complicated what was once the relatively simple act of watching or listening to the playing of an audio/visual program to enjoy it.
For example, those wishing to “tune in” an audio/visual program being broadcast must now select a channel on which to view an audio/visual program from as many as 500 channels available through typical cable and/or satellite connections for television and/or radio. Further, it has become commonplace to employ audio/visual devices that are able to be programmed to autonomously tune in and record an audio/visual program for playing at a later time. Still further, it is now becoming increasingly commonplace to obtain audio/visual programs from websites accessible through the Internet, either by receiving those audio/visual programs as streaming data while they are played, or downloading those audio/visual programs as a storable digital file on an audio/visual device for playing at a later time. Yet further, some of these possible sources of audio/visual programs require paid subscriptions for which key cards and/or decryption keys are required to gain access to at least some audio/visual programs.
Those seeking to avail themselves of even a modest subset of such a wide array of options for playing an audio/visual program have often found themselves having to obtain multiple audio/visual devices (e.g., tuners, descramblers, disc media players, video recorders, web access devices, digital file players, televisions, visual displays without tuners, etc.). Each such audio/visual device often has a unique user interface, and more often than not, is accompanied by a separate handheld wireless remote control by which it is operated. Attempts have been made to grapple with the resulting plethora of remote controls that often accompany a multitude of audio/visual devices by providing so-called “universal remotes” enabling multiple audio/visual devices to be operated using a single remote control. However, a universal remote tends to go only so far in satisfying the desire of many users to simplify the coordination required in the operation of multiple audio/visual devices to perform the task of playing an audio/visual program.
Efforts have recently been made through cooperation among multiple purveyors of audio/visual devices to further ease the coordinated operation of multiple audio/visual devices through the adoption of standardized command codes and various approaches to coupling multiple audio/visual devices to enable the exchange of those standardized command codes among multiple audio/visual devices. An example of this effort is the CEC standardized command set created as part of the HDMI interface specification promulgated by HDMI Licensing, LLC of Sunnyvale, Calif. However, these efforts, even in conjunction with a universal remote, still only go so far in making the playing of an audio/visual program into a truly simple undertaking.
SUMMARY
A user interface for an audio/visual device incorporates one or both of a touch sensor having a touch surface on which a racetrack surface having a ring shape is defined, and a display element on which a racetrack menu also having a ring shape is displayed. Where the user interface incorporates both, the ring shapes of the racetrack surface and the racetrack menu are structured to generally correspond, such that the position of a marker on the racetrack menu is caused to correspond to the position at which a digit of a user's hand touches the racetrack surface.
In one aspect, an apparatus includes a display element capable of visually displaying a visual portion of an audio/visual program and a racetrack menu having a ring shape; a processing device; and a storage accessible to the processing device and storing a sequence of instructions. When the sequence of instructions is executed by the processing device, the processing device is caused to: cause the racetrack menu to be visually displayed on the display element such that the racetrack menu surrounds a first display area in which the visual portion of the audio/visual program may be visually displayed; cause a plurality of menu items to be visually displayed in the racetrack menu; cause a first marker to be visually displayed in the racetrack menu; receive an indication that a first manually-operable control is being operated to move the first marker; in response to the indication of the first manually-operable control being operated to move the first marker, move the first marker about the racetrack menu and constrain movement of the first marker to remain within the racetrack menu; receive an indication of the first manually-operable control being operated to select a menu item of the plurality of menu items that is in the vicinity of the first marker at a time subsequent to the first manually-operable control being operated to move the first marker about the racetrack; and in response to the indication of the first manually-operable control being operated to select the menu item that is in the vicinity of the first marker, cause the menu item to be selected, wherein causing the menu item to be selected comprises taking an action to cause the audio/visual program to be selected for playing.
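The control flow recited above can be illustrated with a minimal Python sketch; the class and method names (RacetrackMenuController, on_move, on_select) and the wrap-around indexing are illustrative assumptions rather than part of the disclosure.

# Minimal sketch of the recited control flow; all names are hypothetical.
class RacetrackMenuController:
    def __init__(self, menu_items):
        self.menu_items = menu_items      # ordered around the ring-shaped menu
        self.marker_index = 0             # first marker, constrained to the ring

    def on_move(self, steps):
        # Move the marker about the racetrack menu while constraining it to
        # remain within the menu by wrapping around the ring.
        self.marker_index = (self.marker_index + steps) % len(self.menu_items)

    def on_select(self):
        # Select the menu item in the vicinity of the marker; here that
        # triggers the action that causes an audio/visual program to be
        # selected for playing.
        self.menu_items[self.marker_index].action()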
Implementations may include, and are not limited to, one or more of the following features. The apparatus may further include a source interface operable to select a source from which to receive an audio/visual program, wherein the action taken to cause the audio/visual program to be selected for playing is selected from the group of actions consisting of: selecting the source interface to enable receipt of the audio/visual program from the source through the source interface; transmitting a command through the source interface to the source to select the audio/visual program from among a plurality of audio/visual programs available from the source; and transmitting a command through the source interface to the source to cause the source to provide the audio/visual program to the apparatus as part of playing the audio/visual program. Transmitting a command through the source interface to the source to select the audio/visual program from among the plurality of audio/visual programs available from the source may include causing the source to operate a radio frequency tuner to receive the audio/visual program, and/or causing the source to begin playing the audio/visual program from a storage medium accessible to the source.
The first manually-operable control may be a touch sensor having a touch-sensitive surface that is manually operable with a digit of a hand. Execution of the sequence of instructions by the processing device may further cause the processing device to cause the racetrack menu to be visually displayed in response to an indication of the digit touching the touch-sensitive surface followed by an indication of the digit moving about the touch-sensitive surface in a wiping motion at a time when the racetrack menu is not being visually displayed, and cause a command concerning playing the audio/visual program to be transmitted to a source of the audio/visual program in response to an indication of the digit touching the touch-sensitive surface followed by an indication of the digit ceasing to touch the touch-sensitive surface at a time when the racetrack menu is not being visually displayed. Execution of the sequence of instructions by the processing device may further cause the processing device to cause the racetrack menu to be visually displayed in response to an indication of the digit touching the touch-sensitive surface followed by an indication of the digit remaining in contact with the touch-sensitive surface for at least a predetermined period of time at a time when the racetrack menu is not being visually displayed, and cause a command concerning playing the audio/visual program to be transmitted to a source of the audio/visual program in response to an indication of the digit touching the touch-sensitive surface followed by an indication of the digit ceasing to touch the touch-sensitive surface at a time when the racetrack menu is not being visually displayed. Execution of the sequence of instructions by the processing device may further cause the processing device to cause a second marker to be visually displayed in the vicinity of the first marker in response to an indication of the digit touching the touch-sensitive surface, and move the second marker to a position relative to the first marker such that the position of the second marker relative to the first marker more precisely indicates the position where the digit is touching the touch-sensitive surface than does the position of the first marker within the racetrack menu in response to an indication of the digit moving about the touch-sensitive surface in a wiping motion.
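A sketch of how the three touch behaviors described above might be distinguished is given below; the wipe-distance and dwell-time thresholds, and all function names, are assumptions made only for illustration.

import time

WIPE_DISTANCE = 0.05    # assumed fraction of the ring circumference
DWELL_SECONDS = 0.75    # assumed hold time before the menu is displayed

class TouchGestureClassifier:
    # Applies while the racetrack menu is not being displayed: a wiping
    # motion or a long dwell summons the menu, while a brief touch followed
    # by release causes a playback command to be transmitted instead.
    def __init__(self, show_menu, send_command):
        self.show_menu = show_menu
        self.send_command = send_command
        self.start_pos = None
        self.start_time = None

    def on_touch_down(self, pos):
        self.start_pos, self.start_time = pos, time.monotonic()

    def on_touch_move(self, pos):
        if self.start_pos is not None and abs(pos - self.start_pos) >= WIPE_DISTANCE:
            self.show_menu()              # wiping motion detected

    def on_tick(self):
        if self.start_time is not None and \
                time.monotonic() - self.start_time >= DWELL_SECONDS:
            self.show_menu()              # digit has dwelled on the surface

    def on_touch_up(self, pos):
        if self.start_pos is not None and abs(pos - self.start_pos) < WIPE_DISTANCE:
            self.send_command()           # quick touch-and-release
        self.start_pos = self.start_time = None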
Execution of the sequence of instructions by the processing device may further cause the processing device to move the first marker about the racetrack menu in a manner in which the first marker snaps between being in the vicinity of a first menu item of the plurality of menu items and being in the vicinity of a second menu item of the plurality of menu items, and the processing device may be further caused to operate an acoustic driver to acoustically output a sound at each instance of the first marker snapping between the vicinities of the first and second menu items. Execution of the sequence of instructions by the processing device may further cause the processing device to alter a dimension of the size of the first marker as the processing device is caused to move the first marker about the racetrack menu between the vicinity of a first menu item of the plurality of menu items and the vicinity of a second menu item of the plurality of menu items. The first marker may have the form of an arrow pointer pointing at a menu item of the plurality of menu items, a box surrounding a menu item of the plurality of menu items, or an alteration to the appearance of a menu item of the plurality of menu items that is distinct from the appearance of other menu items of the plurality of menu items. The racetrack menu may have a rectangular ring shape having four sides; and execution of the sequence of instructions by the processing device may further cause the processing device to receive an indication that the first manually-operable control is being operated, and cause a second marker to be visually displayed in the racetrack menu in the vicinity of one of the four sides of the rectangular ring shape of the racetrack menu to visually indicate which one of the four sides the first marker is currently located within.
Execution of the sequence of instructions by the processing device may further cause the processing device to cause the racetrack menu to be visually displayed on the display element in response to receiving any indication of any manual operation of the first manually-operable control, and cause the racetrack menu to cease to be visually displayed on the display element in response to a predetermined period of time having elapsed without any receipt of any indication of any manual operation of the first manually-operable control. Execution of the sequence of instructions by the processing device may further cause the processing device to: cause both the first display area and a second display area to be displayed on the display element in a manner in which both the first and second display areas are surrounded by the racetrack menu; cause a first menu item that is associated with a visual portion of an audio/visual program being played in the first display area to be located in a first portion of the racetrack menu located closer to the first display area than a second portion of the racetrack menu; and cause a second menu item that is associated with a visual portion of an audio/visual program being played in the second display area to be located in the second portion of the racetrack menu. The first display area may be positioned to overlie a portion of the second display area. The first display area and the second display area may be positioned adjacent to each other in a manner in which neither of the first and second display areas overlie the other.
In one aspect, a method includes visually displaying a racetrack menu having a ring shape on a display element that is capable of visually displaying both the racetrack menu and a visual portion of an audio/visual program such that the racetrack menu surrounds a first display area in which the visual portion of the audio/visual program may be visually displayed; visually displaying a plurality of menu items in the racetrack menu; visually displaying a first marker in the racetrack menu; receiving an indication that a first manually-operable control is being operated to move the first marker; in response to the indication of the first manually-operable control being operated to move the first marker, moving the first marker about the racetrack menu and constraining movement of the first marker to remain within the racetrack menu; receiving an indication of the first manually-operable control being operated to select a menu item of the plurality of menu items that is in the vicinity of the first marker at a time subsequent to the first manually-operable control being operated to move the first marker about the racetrack; and in response to the indication of the first manually-operable control being operated to select the menu item that is in the vicinity of the first marker, selecting the menu item, wherein selecting the menu item comprises taking an action to cause the audio/visual program to be selected for playing.
Implementations may include, and are not limited to, one or more of the following features. The method may further include visually displaying the racetrack menu in response to receiving an indication of a digit of a hand touching a touch-sensitive surface of the first manually-operable control followed by an indication of the digit moving about the touch-sensitive surface in a wiping motion at a time when the racetrack menu is not being visually displayed; and transmitting a command concerning playing the audio/visual program to a source of the audio/visual program in response to an indication of the digit touching the touch-sensitive surface followed by an indication of the digit ceasing to touch the touch-sensitive surface at a time when the racetrack menu is not being visually displayed. The method may further include visually displaying the racetrack menu in response to receiving an indication of a digit of a hand touching a touch-sensitive surface of the first manually-operable control followed by an indication of the digit remaining in contact with the touch-sensitive surface for at least a predetermined period of time at a time when the racetrack menu is not being visually displayed; and transmitting a command concerning playing the audio/visual program to a source of the audio/visual program in response to an indication of the digit touching the touch-sensitive surface followed by an indication of the digit ceasing to touch the touch-sensitive surface at a time when the racetrack menu is not being visually displayed. The method may further include visually displaying the racetrack menu on the display element in response to receiving any indication of any manual operation of the first manually-operable control; and ceasing to visually display the racetrack menu on the display element in response to a predetermined period of time having elapsed without any receipt of any indication of any manual operation of the first manually-operable control.
The action taken to cause the audio/visual program to be selected for playing may be selected from the group of actions consisting of: selecting a source interface to enable receipt of the audio/visual program from a source of the audio/visual program through the source interface; transmitting a command through the source interface to the source to select the audio/visual program from among a plurality of audio/visual programs available from the source; and transmitting a command through the source interface to the source to cause the source to provide the audio/visual program to the apparatus as part of playing the audio/visual program. Transmitting a command through the source interface to the source to select the audio/visual program from among the plurality of audio/visual programs available from the source may include causing the source to operate a radio frequency tuner to receive the audio/visual program and/or causing the source to begin playing the audio/visual program from a storage medium accessible to the source.
The method may further include moving the first marker about the racetrack menu in a manner in which the first marker snaps between being in the vicinity of a first menu item of the plurality of menu items and being in the vicinity of a second menu item of the plurality of menu items, and may further include operating an acoustic driver to acoustically output a sound at each instance of the first marker snapping between the vicinities of the first and second menu items. The racetrack menu may have a rectangular ring shape having four sides; and the method may further include receiving an indication that the first manually-operable control is being operated, and causing a second marker to be visually displayed in the racetrack menu in the vicinity of one side of the four sides of the rectangular ring shape of the racetrack menu to visually indicate which one of the four sides the first marker is currently located within.
The method may further include displaying both the first display area and a second display area on the display element in a manner in which both the first and second display areas are surrounded by the racetrack menu; where a first menu item is associated with a visual portion of an audio/visual program being played in the first display area, displaying the first menu item in a first portion of the racetrack menu located closer to the first display area than a second portion of the racetrack menu; and where a second menu item is associated with a visual portion of an audio/visual program being played in the second display area, displaying the second menu item in the second portion of the racetrack menu. The method may further include positioning the first display area to overlie a portion of the second display area, or positioning the first display area and the second display area adjacent to each other in a manner in which neither of the first and second display areas overlie the other.
Other features and advantages of the invention will be apparent from the description and claims that follow.
DESCRIPTION OF THE DRAWINGS
FIG. 1 is a perspective view of an embodiment of a user interface.
FIG. 2 depicts correlations between movement of a digit on a racetrack sensor of the user interface of FIG. 1 and movement of a marker on a racetrack menu of the user interface of FIG. 1.
FIGS. 3a, 3b, 3c and 3d, together, depict possible variants of the user interface of FIG. 1 incorporating different forms and combinations of markers.
FIG. 4 is a block diagram of a possible architecture of the user interface of FIG. 1.
FIG. 5 is a perspective view of another embodiment of the user interface of FIG. 1 combining more of the features of the user interface into a single device.
FIG. 6 depicts a possibility of switching between displaying and not displaying the racetrack menu of the user interface of FIG. 1.
FIGS. 7a and 7b, together, depict additional possible details of the user interface of FIG. 1.
FIG. 8 is a perspective view of the embodiment of the user interface of FIG. 5, additionally incorporating the possible details of FIGS. 7a and 7b.
FIG. 9 is a block diagram of the controller of the architecture of FIG. 4.
FIGS. 10a and 10b, together, depict possible variants of the touch sensor employed in the user interface of FIG. 1.
FIGS. 11a and 11b, together, depict possible variants of the user interface of FIG. 1 incorporating more than one display area.
FIG. 12 depicts another embodiment of the user interface of FIG. 1 in which the racetrack menu and the display area surrounded by the racetrack menu do not occupy substantially all of a display element.
DETAILED DESCRIPTION
What is disclosed and what is claimed herein is intended to be applicable to a wide variety of audio/visual devices, i.e., devices that are structured to be employed by a user to play an audio/visual program. It should be noted that although various specific embodiments of audio/visual devices (e.g., televisions, set-top boxes and hand-held remotes) are presented with some degree of detail, such presentations of specific embodiments are intended to facilitate understanding through the use of examples, and should not be taken as limiting either the scope of disclosure or the scope of claim coverage.
It is intended that what is disclosed and what is claimed herein is applicable to audio/visual devices that employ a tuner and/or a network interface to receive an audio/visual program. It is intended that what is disclosed and what is claimed herein is applicable to audio/visual devices structured to cooperate with other devices to play an audio/visual program and/or to cause an audio/visual program to be played. It is intended that what is disclosed and what is claimed herein is applicable to audio/visual devices that are wirelessly connected to other devices, that are connected to other devices through electrically and/or optically conductive cabling, or that are not connected to any other device, at all. It is intended that what is disclosed and what is claimed herein is applicable to audio/visual devices having physical configurations structured to be either portable or not. Still other configurations of audio/visual devices to which what is disclosed and what is claimed herein are applicable will be apparent to those skilled in the art.
FIG. 1 depicts a user interface 1000 enabling a user's hand-eye coordination to be employed to more intuitively operate at least one audio/visual device to select and play an audio/visual program. The user interface 1000 incorporates a displayed “racetrack” menu 150 and a corresponding “racetrack” surface 250. As depicted, the user interface 1000 is implemented by an interoperable set of devices that include at least an audio/visual device 100 and a handheld remote control 200, and as will be explained in greater detail, may further include another audio/visual device 900. However, as will also be explained in greater detail, the user interface 1000 may be substantially fully implemented by a single audio/visual device, such as the audio/visual device 100.
The racetrack menu 150 is visually displayed on a display element 120 disposed on a casing 110 of the audio/visual device 100, and as depicted, the audio/visual device 100 is a flat panel display device such as a television, employing a flat panel form of the display element 120 such as a liquid crystal display (LCD) element or a plasma display element. Further, the audio/visual device 100 may also incorporate acoustic drivers 130 to acoustically output sound. However, as those skilled in the art will readily recognize, the racetrack menu 150 may be displayed by any of a variety of types, configurations and sizes of audio/visual device, whether portable or stationary, including and not limited to, a projector or a handheld device.
The racetrack surface 250 is defined on a touch-sensitive surface 225 of a touch sensor 220 disposed on a casing 210 of the handheld remote control 200, and as depicted, the touch-sensitive surface 225 has a rectangular ring shape that physically defines the shape and position of the racetrack surface 250 such that the racetrack surface 250 encompasses substantially all of the touch-sensitive surface of the touch sensor 220. However, as those skilled in the art will readily recognize, the touch sensor 220 may be incorporated into any of a wide variety of devices, whether portable or stationary, including and not limited to, a wall-mounted control panel or a keyboard. Further, it is also envisioned that the touch sensor 220 may have a variant of the touch-sensitive surface 225 (see FIG. 2) that is of a shape other than a ring shape, with the racetrack surface 250 defined on that variant of the touch-sensitive surface 225 in another way such that the racetrack surface 250 encompasses only a subset of that variant of the touch-sensitive surface 225 of the touch sensor 220. Further, the touch sensor 220 may be based on any of a variety of technologies.
As depicted, both the racetrack menu 150 and the racetrack surface 250 have a ring shape that is a generally rectangular ring shape with corresponding sets of four sides. More specifically, the four sides 150a, 150b, 150c and 150d of the racetrack menu 150 are arranged to correspond to the four sides 250a, 250b, 250c and 250d of the racetrack surface 250. This four-sided nature of both the racetrack menu 150 and the racetrack surface 250 is meant to accommodate the rectilinear nature of the vast majority of display elements currently found in audio/visual devices and the rectilinear nature of the visual portion of the vast majority of currently existing audio/visual programs that have a visual portion. However, it is important to note that although the racetrack menu 150 and the racetrack surface 250 are depicted and discussed herein as having a rectangular ring shape, other embodiments are possible in which the ring shape adopted by the racetrack surface 250 has a circular ring shape, an oval ring shape, a hexagonal ring shape or still other geometric variants of a ring shape. Further, where the racetrack menu 150 and/or the racetrack surface 250 have a ring shape that is other than a rectangular ring shape, one or both of the display element 120 and the touch sensor 220 may have a shape other than the rectangular shapes depicted herein.
As will be explained in greater detail, the four sides 150a-d of the racetrack menu 150 surround or overlie the edges of a display area 950 in which the visual portion of an audio/visual program selected via the user interface 1000 may be played. It is this positioning of the racetrack menu 150 about the periphery of the display element 120 and the display area 950 (whether surrounding or overlying the periphery of the display area 950) that supplies the impetus for both the racetrack menu 150 and the racetrack surface 250 having a ring shape that is generally a rectangular ring shape, rather than a ring shape of some other geometry. Where a selected audio/visual program does not have a visual portion (e.g., the audio/visual program is an audio recording having only an audio portion), the display area 950 may remain blank (e.g., display only a black or blue background color) or display status information concerning the playing of the selected audio/visual program as the selected audio/visual program is played, perhaps with the audio portion being acoustically output by the acoustic drivers 130. As depicted, the four sides 150a-d of the racetrack menu 150 are displayed by the display element 120 at the edges of the display element 120. However, it is also envisioned that the four sides 150a-d of the racetrack menu 150 may be positioned about the edges of a “window” of a graphical user interface of the type commonly employed in the operation of typical computer systems, perhaps where the audio/visual device 100 is a computer system on which audio/visual programs are selected and played through the user interface 1000.
As shown in FIG. 2, at various positions along one or more of the four sides 150a-d of the racetrack menu 150 are menu items 155 that may be selected by a user of the user interface 1000. The menu items 155 may include alphanumeric characters (such as those depicted as positioned along the side 150a) that may be selected to specify a channel or a website from which to select and/or receive an audio/visual program, symbols (such as those depicted as positioned along the side 150b) representing commands to control the operation of an audio/visual device capable of playing an audio/visual program (e.g., “play” and “stop” commands for a video cassette recorder, a disc media player, or solid state digital file player, etc.), and indicators of inputs (such as those depicted as positioned along the side 150c) to an audio/visual device that may be selected and through which an audio/visual program may be selected and/or received. Although the various menu items 155 positioned along the racetrack menu 150 could conceivably serve any of a wide variety of purposes, it is envisioned that much of the functionality of the menu items 155 will be related to enabling a user to select an audio/visual program for playing, and/or to actually play an audio/visual program.
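One way to picture the arrangement just described is as a simple data model in which each menu item 155 carries a label, the side 150a-d on which it is drawn, and the action it triggers; the Python names below are hypothetical and used only for illustration.

from dataclasses import dataclass
from typing import Callable

@dataclass
class MenuItem:
    label: str                    # what is drawn in the racetrack menu 150
    side: str                     # '150a', '150b', '150c' or '150d'
    action: Callable[[], None]    # what selecting the item causes to happen

# Channel digits along side 150a, transport commands along side 150b, and
# source-input selectors along side 150c, mirroring the examples in the text.
menu_items = [
    MenuItem('7', '150a', lambda: print('append digit 7 to the channel number')),
    MenuItem('play', '150b', lambda: print('transmit a play command to the source')),
    MenuItem('II', '150c', lambda: print('select source input II')),
]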
To operate the user interface 1000, a user places the tip of a digit of one of their hands (i.e., the tip of a thumb or finger) on a portion of the racetrack surface 250 defined on the touch-sensitive surface 225 of the touch sensor 220, and a marker 160 is displayed on a portion of the racetrack menu 150 that has a position on the racetrack menu 150 that corresponds to the position 260 on the racetrack surface 250 at which the tip of their digit is in contact with the touch-sensitive surface 225 of the touch sensor 220. FIG. 2 also depicts how the marker 160 moves about and is constrained to moving about the racetrack menu 150 to maintain a correspondence between its location on the racetrack menu 150 and the position 260 of the digit on the racetrack surface 250 as the user moves that digit about the racetrack surface 250. In some embodiments, the marker 160 may move about the racetrack menu 150 in a manner in which the marker 160 “snaps” from being centered about one menu item 155 to an adjacent menu item 155 as the marker 160 is moved about a portion of the racetrack menu 150 having adjacent ones of the menu items 155. Further, such “snapping” of the marker 160 between adjacent ones of the menu items 155 may be accompanied by the concurrent acoustic output of some form of sound (e.g., a “click” or “beep” sound that accompanies each “snap” of the marker 160) to provide further feedback to a user of the marker 160 moving from one such menu item 155 to another.
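The correspondence between the position 260 on the racetrack surface 250 and the snapped position of the marker 160, together with the per-snap click sound, might be implemented along the lines of the following sketch; the normalization of positions to fractions of the ring and the play_click hook are assumptions.

def snap_marker(touch_fraction, item_fractions, current_index, play_click):
    # touch_fraction: position 260 expressed as a fraction of the way around
    # the racetrack surface 250; item_fractions: the center of each menu item
    # 155 expressed the same way.
    nearest = min(range(len(item_fractions)),
                  key=lambda i: abs(item_fractions[i] - touch_fraction))
    if nearest != current_index:
        play_click()              # acoustic feedback on each snap between items
    return nearest                # new index of the menu item under the marker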
When the marker 160 is positioned over a menu item 155 that the user wishes to select, the user selects that menu item 155 by pressing whichever one of their digits is already in contact with the racetrack surface 250 with greater pressure than was used in simply placing that digit in contact with the racetrack surface 250. In some embodiments, the touch sensor 220, itself, is capable of distinguishing different degrees of pressure with which the digit is put into contact with the touch-sensitive surface 225 of the touch sensor 220 on which the racetrack surface 250 is defined in order to distinguish an instance in which the user is pressing harder with that digit to select one of the menu items 155. In other embodiments, the touch sensor 220 is able to function in a manner not unlike a mechanically depressible button in which the additional pressure applied through that digit by the user causes the touch sensor 220 to be pressed inward towards the casing 210 as part of selecting a menu item. This may be accomplished by overlying one or more buttons disposed within the casing 210 with the touch sensor 220 so that such buttons are depressed by the touch sensor 220 as the touch sensor 220 is itself depressed towards the casing 210. Where the touch sensor 220 is able to be pressed inward towards the casing 210, such inward movement may be accompanied by a “click” sound that may be heard by the user and/or a tactile “snap” sensation that can be sensed by the user through their digit to give the user some degree of positive feedback that they've successfully selected one of the menu items 155. Regardless of whether the touch sensor 220 is able to be pressed inward towards the casing 210, or not, a “click” or other sound accompanying the user's use of increased pressure on the racetrack surface 250 to select one of the menu items 155 may be acoustically output through an acoustic driver (not shown) incorporated into the remote control 200 and/or through the acoustic drivers 130 of the audio/visual device 100.
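Where the touch sensor itself reports pressure, the distinction between repositioning the marker and selecting a menu item could be made with a simple threshold, as in the sketch below; the threshold value and function names are assumptions for illustration only.

SELECT_PRESSURE = 0.6             # normalized 0.0-1.0; hypothetical threshold

def on_touch_report(position, pressure, move_marker, select_item):
    if pressure >= SELECT_PRESSURE:
        select_item()             # firmer press selects the item under the marker
    else:
        move_marker(position)     # lighter contact only repositions the marker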
FIGS. 3a, 3b and 3c depict other variations of forms of marker and combinations of markers. As will be made clear, different forms of marker and combinations of multiple markers may be used to enhance the rapidity with which the eyes of a user of the user interface 1000 are drawn to a specific location on the racetrack menu 150, and to aid the hand-eye coordination of that user.
Although the marker 160 was depicted in FIG. 2 as taking the form of a box-shaped graphical element sized to surround one of the menu items 155 at a time when positioned in the vicinity of one or more of the menu items 155, FIG. 3a depicts another variant of the marker 160 having the form of a triangular pointer. Still other possible graphical representations of the marker 160 will occur to those skilled in the art, such as forms of the marker 160 having other geometric shapes (e.g., a dot, a circle, an arrow, etc.) or other ways of being positioned in the vicinity of a given one of the menu items 155 (e.g., overlying, surrounding, pointing to, touching, etc., one of the menu items 155). Still further, instead of the marker being a graphical element that is separate and distinct from any of the menu items 155, the marker 160 may instead be a modified form of a given one of the menu items 155, such as a change in a color of a menu item, an enlargement of a menu item in comparison to others, or some form of recurring animation or movement imparted to a menu item. In other words, the position of the marker 160 (and by extension, the position 260 of the tip of a digit on the racetrack surface 250) may be indicated by one of the menu items 155 changing color, changing font, becoming larger, becoming brighter, or being visually altered in comparison to the others of the menu items 155 in any of a number of ways to draw a user's eyes to it.
FIG. 3a also depicts an optional additional marker 165 that follows the location of the marker 160 and provides a visual “highlight” of which one of the four sides 150a-d the marker 160 is currently positioned within, as a visual aid to enable a user's eyes to be more quickly directed to that one of the four sides 150a-d when looking at the racetrack menu 150. Though not specifically depicted, in other embodiments, the additional marker 165 may be implemented as a highlighting, change in color, change in background color, change in font, enlargement or other visual alteration made to all of the menu items 155 that are positioned in that one of the four sides 150a-d.
FIG. 3b depicts the manner in which the marker 160 may be dynamically resized as it is moved about the racetrack menu 150, especially in embodiments where the marker 160 is of a form that in some way overlaps or surrounds one of the menu items 155 at a time, in order to take into account the different sizes of different ones of the menu items 155. More specifically, and as depicted in FIG. 3b, the numeral “3” has visibly smaller dimensions (i.e., occupies less space in the racetrack menu 150) than does the numeral “III” that is also present on the same racetrack menu 150. Thus, when the depicted form of the marker 160 (i.e., the “box” form of the marker 160 that has been discussed at length) is positioned on one or the other of these two particular ones of the menu items 155, the marker 160 is resized to be larger or smaller as needed to take into account the different sizes of these two particular ones of the menu items 155.
FIG. 3c also depicts an optional additional marker 162 that follows the location of the marker 160 and provides a more precise visual indication than does the marker 160 of the position 260 of the tip of a user's finger along a corresponding portion of the racetrack surface 250. As depicted, the marker 162 takes the form of what might be called a “dash” positioned along one of the edges of the box form of the marker 160. However, it should be noted that the marker 162 may take any of a variety of forms (e.g., a dot, a circle, an arrow, etc.). The provision of the marker 162 may be deemed desirable in embodiments where the marker 160 moves in the manner previously described in which the marker 160 “snaps” between adjacent ones of the menu items 155 such that the marker 160 does not, itself, provide as precise an indication of the position 260 of the tip of the user's digit. More specifically, FIG. 3c depicts a succession of views of a portion of the racetrack menu 150 on which menu items 155 taking the form of the numerals “1” through “5” are positioned. As can be seen in this depicted succession, the marker 162 provides a more precise indication of the movement of the position 260 of the tip of the user's digit along a portion of the racetrack surface 250 from left to right than does the marker 160, which remains on the one of the menu items 155 having the form of the numeral “2” on this portion of the racetrack menu 150. Such a higher precision indication of the position 260 of the tip of the user's digit may aid the user in improving their hand-eye coordination in operating the user interface 1000. Such a higher precision indication of the position 260 may also provide a user with some degree of reassurance that the user interface 1000 is responding to their actions (or more specifically, that whatever processing device is incorporated into the user interface 1000 is responding to their actions) by seeing that the exact position 260 of the tip of their digit is being successfully detected.
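The relationship between the snapped marker 160 and the finer-grained marker 162 can be sketched as follows, with positions again treated as fractions of the way around the ring; the clamping behavior is an assumption.

def precise_marker_offset(touch_fraction, item_start, item_end):
    # item_start and item_end bound the span of the menu item on which the
    # marker 160 is currently snapped; the returned value (0.0-1.0) is where
    # the "dash" of the marker 162 is drawn along that item's edge.
    relative = (touch_fraction - item_start) / (item_end - item_start)
    return min(max(relative, 0.0), 1.0)    # clamp to the snapped item's edge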
FIG. 3d depicts yet another alternate variation of the marker 160 in a variant of the user interface 1000 in which the racetrack menu 150 is divided into multiple segments, with each such segment serving as a background to one of the menu items 155. As depicted, the marker 160 is implemented as both a change in the color and/or brightness of one of those segments of the racetrack menu 150 and an enlarging of the graphical element representing the one of the menu items 155 (specifically, the numeral “3”) positioned within that segment. As so depicted, the marker 160 might be said to have a form that is a variant of the earlier-depicted box, but a box that is made visible by having a color and/or brightness that differs from the rest of the racetrack menu 150, rather than a box that is made visible by a border or outline. FIG. 3d also depicts this alternate variation of the marker 160 being used in combination with the earlier-described additional marker 162 that provides a more precise indication of the position 260 of the tip of a user's digit along a portion of the racetrack surface 250.
FIG. 3d also depicts how this variant of the marker 160 is resized to accommodate the different sizes of the different ones of the menu items 155, although this resizing now corresponds to the differing dimensions of different ones of the segments into which the racetrack menu 150 is divided. In some variants, each of the segments may be individually sized to fit the visual size and shape of its corresponding one of the menu items 155, as depicted in FIG. 3d. Thus, since the numeral “3” of one of the menu items 155 is smaller in at least one dimension than the numeral “III” of another one of the menu items 155 (even with the numeral “3” being enlarged in font size), the segment of the racetrack menu 150 in which the numeral “3” is positioned is smaller than the segment in which the numeral “III” is positioned. However, in other variants, the segments filling at least one of the four sides 150a-d may all be sized based on the quantity of the menu items 155 positioned in that one of the four sides so as to divide that one of the four sides 150a-d into equal-sized segments. Where the ones of the menu items 155 along that one of the four sides 150a-d may change in response to a selection of an input or for other reasons, the size of the segments in that one of the four sides 150a-d may change in response to a change in the quantity of the menu items 155 positioned in that one of the four sides 150a-d. Thus, for example, a reduction in the quantity of menu items 155 in that one of the four sides 150a-d results in each of its segments becoming larger in at least one dimension, and an increase in the quantity of menu items 155 in that one of the four sides 150a-d results in each of its segments becoming smaller.
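The variant in which one of the four sides is divided into equal-sized segments based on how many menu items it holds amounts to a simple division of that side's length, as in the sketch below (the pixel-based interface is an assumption).

def segment_bounds(side_length_px, item_count):
    # Divide one side of the racetrack menu 150 into equal-sized segments;
    # fewer items yields larger segments, more items yields smaller ones.
    size = side_length_px / item_count
    return [(round(i * size), round((i + 1) * size)) for i in range(item_count)]

# e.g. segment_bounds(900, 3) -> [(0, 300), (300, 600), (600, 900)]
#      segment_bounds(900, 5) -> five segments of 180 pixels each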
FIG. 4 is a block diagram of a possible architecture of the user interface 1000 by which a controller 500 receives input through a user's use of at least the racetrack surface 250 defined on at least a portion of a touch-sensitive surface 225 of the touch sensor 220 to which the controller 500 is coupled, and provides at least the racetrack menu 150 as a visual output to the user through at least the display element 120 to which the controller 500 is also coupled. In various possible embodiments, the controller 500 may be incorporated directly into the audio/visual device 100, or into another audio/visual device 900 coupled to the audio/visual device 100 and shown in dotted lines in FIG. 1. As also depicted in FIG. 1, the remote control 200 communicates wirelessly through the emission of radio frequency, infrared or other wireless emissions to whichever one of the audio/visual devices 100 and 900 incorporates the controller 500. However, as those skilled in the art will readily recognize, the remote control 200 may communicate through an electrically and/or optically conductive cable (not shown) in other possible embodiments. Alternatively and/or additionally, the remote control 200 may communicate through a combination of wireless and cable-based (optical or electrical) connections forming a network between the remote control 200 and the controller 500.
Still other embodiments may incorporate the touch sensor 220 directly on a user-accessible portion of one or both of the audio/visual devices 100 and 900, either in addition to or as an alternative to providing the touch sensor 220 on the remote control 200. Indeed, FIG. 5 depicts an alternate variant of the audio/visual device 100 having more of a portable configuration, incorporating both the display element 120 displaying the racetrack menu 150 and the touch sensor 220 with a touch-sensitive surface 225 on which the racetrack surface 250 is defined. This alternative variant of the audio/visual device 100 may also incorporate the controller 500, such that much (if not substantially all) of the user interface 1000 is implemented solely by the audio/visual device 100.
Returning to FIG. 4, regardless of which audio/visual device incorporates the controller 500, the controller 500 incorporates multiple interfaces in the form of one or more connectors and/or one or more wireless transceivers by which the controller 500 is able to be coupled to one or more sources 901, 902, 903 and/or 904. Any such connectors may be disposed on the casing of whatever audio/visual device the controller 500 is incorporated into (e.g., the casing 110 of the audio/visual device 100 or a casing of the audio/visual device 900). In being so coupled, the controller 500 is able to transmit commands to one or more of the sources 901-904 to access and select audio/visual programs, and is able to receive audio/visual programs therefrom. Each of the sources 901-904 may be any of a variety of types of audio/visual device, including and not limited to, RF tuners (e.g., cable television or satellite dish tuners), disc media recorders and/or players, tape media recorders and/or players, solid-state or disk-based digital file players (e.g., an MP3 file player), Internet access devices to access streaming data of audio/visual programs, or docking cradles for portable audio/visual devices (e.g., a digital camera). Further, in some embodiments, one or more of the sources 901-904 may be incorporated into the same audio/visual device into which the controller 500 is incorporated (e.g., a built-in disc media player or built-in radio frequency tuner).
In embodiments where one of the sources 901-904 is not incorporated into the same audio/visual device as the controller 500, and where that one of the sources 901-904 is coupled to the controller 500 via an interface of the controller 500 employing a connector, any of a variety of types of electrical and/or optical signaling conveyed via electrically and/or optically conductive cabling may be employed. Preferably, a single cable is employed both in relaying commands from the controller 500 to that one of the sources 901-904 and in relaying audio/visual programs to the controller 500. However, combinations of cabling in which different cables separately perform these functions are also possible. Some of the possible forms of cabling able to relay both commands and audio/visual programs may conform to one or more industry standards, including and not limited to, Syndicat des Constructeurs d'Appareils Radiorecepteurs et Televiseurs (SCART) promulgated in the U.S. by the Electronic Industries Alliance (EIA) of Arlington, Va.; Ethernet (IEEE-802.3) or IEEE-1394 promulgated by the Institute of Electrical and Electronics Engineers (IEEE) of Washington, D.C.; Universal Serial Bus (USB) promulgated by the USB Implementers Forum, Inc. of Portland, Oreg.; Digital Visual Interface (DVI) promulgated by the Digital Display Working Group (DDWG) of Vancouver, Wash.; High-Definition Multimedia Interface (HDMI) promulgated by HDMI Licensing, LLC of Sunnyvale, Calif.; or DisplayPort promulgated by the Video Electronics Standards Association (VESA) of Milpitas, Calif. Other possible forms of cabling able to relay only one or the other of commands and audio/visual programs may conform to one or more industry standards, including and not limited to, RS-422 or RS-232-C promulgated by the EIA; Video Graphics Array (VGA) maintained by VESA; RC-5720C (more commonly called “Toslink”) maintained by the Japan Electronics and Information Technology Industries Association (JEITA) of Tokyo, Japan; the widely known and used Separate Video (S-Video); or S-Link maintained by Sony Corporation of Tokyo, Japan.
In other embodiments where one of the sources 901-904 is not incorporated into the same audio/visual device as the controller 500, and where that one of the sources 901-904 is coupled to the controller 500 via a wireless transceiver, any of a variety of types of infrared, radio frequency or other wireless signaling may be employed. Preferably, a single wireless point-to-point coupling is employed both in relaying commands from the controller 500 to that one of the sources 901-904 and in relaying audio/visual programs to the controller 500. However, combinations of separate wireless couplings in which these functions are separately performed are also possible. Some of the possible forms of wireless signaling able to relay both commands and audio/visual programs may conform to one or more industry standards, including and not limited to, IEEE 802.11a, 802.11b or 802.11g promulgated by the IEEE; Bluetooth promulgated by the Bluetooth Special Interest Group of Bellevue, Wash.; or ZigBee promulgated by the ZigBee Alliance of San Ramon, Calif.
In still other embodiments where one of the sources 901-904 is not incorporated into the same audio/visual device as the controller 500, a combination of cabling-based and wireless couplings may be used. An example of such a combination may be the use of a cabling-based coupling to enable the controller 500 to receive an audio/visual program from that one of the sources 901-904, while an infrared transmitter coupled to the controller 500 may be positioned at or near the one of the sources 901-904 to wirelessly transmit commands via infrared to that one of the sources 901-904. Still further, although FIG. 4 depicts each of the sources 901-904 as being directly coupled to the controller 500 in a point-to-point manner, those skilled in the art will readily recognize that one or more of the sources 901-904 may be coupled to the controller 500 indirectly through one or more of the others of the sources 901-904, or through a network formed among the sources 901-904 (and possibly incorporating routers, bridges and other relaying devices that will be familiar to those skilled in the art) with multiple cabling-based and/or wireless couplings.
Some of the above-listed industry standards include specifications of commands that may be transmitted between audio/visual devices to control access to and/or control the playing of audio/visual programs, including most notably, SCART, IEEE-1394, USB, HDMI, and Bluetooth. Where such an industry standard for coupling the controller 500 to one or more of the sources 901-904 is employed, the controller 500 may limit the commands transmitted to one or more of the sources 901-904 to the commands specified by that industry standard and map one or more of those commands to corresponding ones of the menu items 155 such that a user is able to cause the controller 500 to send those commands to one or more of the sources 901-904 by selecting those corresponding ones of the menu items 155. However, where the benefit of such a standardized command set is unavailable, the controller 500 may employ any of a wide variety of approaches to identify one or more of the sources 901-904 to an extent necessary to “learn” what commands are appropriate to transmit and the manner in which they must be transmitted.
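Limiting transmitted commands to a standardized set and mapping menu items to those commands could be organized as in the sketch below; the command names are placeholders rather than actual codes from any of the standards listed above.

# Placeholder command names; an actual implementation would use the codes
# defined by whichever standardized command set the coupling supports.
STANDARD_COMMANDS = {'play', 'stop', 'pause', 'power_on'}

MENU_ITEM_TO_COMMAND = {
    'play': 'play',               # menu item 155 label -> standardized command
    'stop': 'stop',
}

def send_menu_command(menu_label, transmit):
    command = MENU_ITEM_TO_COMMAND.get(menu_label)
    if command in STANDARD_COMMANDS:    # limit to the standard's command set
        transmit(command)               # relay to the source over its interface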
A user of the user interface 1000 may select one of the sources 901-904 as part of selecting an audio/visual program for being played by employing the racetrack surface 250 and the marker 160 to select one or more of the menu items 155 shown on the racetrack menu 150, such as the “I” through “IV” menu items 155 depicted as displayed by the controller 500 on the side 150c of the racetrack menu 150. Those menu items 155 depicted on the side 150c correspond to the sources 901 through 904, which are depicted as bearing the labels “source I” through “source IV” in FIG. 4. The controller 500 receives input from the touch sensor 220 indicating the contact of the user's digit with a portion of the racetrack surface 250, indicating movement of the position 260 of contact of the digit about the racetrack surface 250, and indicating the application of greater pressure by the user through that digit against the touch sensor 220 at the position 260 (wherever the position 260 is at that moment) when selecting one of the menu items 155. The selection of one of the sources 901-904 by the user causes the controller 500 to switch to receiving audio/visual programs from that one of the sources 901-904, and to be ready to display any visual portion in the display area 950 and acoustically output any audio portion through the acoustic drivers 130 (or whatever other acoustic drivers may be present and employed for playing audio/visual programs).
The selection of one of the sources 901-904 may further cause the controller 500 to alter the quantity and types of menu items 155 displayed on one or more of the sides 150a-d of the racetrack menu 150 such that the displayed menu items 155 more closely correspond to the functions supported by whichever one of the sources 901-904 has been selected. This changing display of at least a subset of the menu items 155 enables the user to operate at least some functions of a selected one of the sources 901-904 by selecting one or more of the menu items 155 to thereby cause the controller 500 to transmit one or more commands corresponding to those menu items to the selected one of the sources 901-904. By way of example, where the one of the sources 901-904 with the ability to record an audio/visual program was previously selected, the racetrack menu 150 may include one or more menu items 155 that could be selected to cause the controller 500 to transmit a command to that previously selected one of the sources 901-904 to cause it to start recording an audio/visual program. However, if the user then selects another one of the sources 901-904 that does not have the ability to record an audio/visual program, then the controller 500 would alter the menu items 155 displayed on the racetrack menu 150 to remove one or more menu items associated with recording an audio/visual program. In this way, at least a subset of the menu items 155 displayed on the racetrack menu 150 is “modal” in nature, insofar as at least that subset changes with the selection of different ones of the sources 901-904.
The coupling and/or uncoupling of one or more of the sources 901-904 to and/or from whatever audio/visual device into which the controller 500 is incorporated may also cause the controller 500 to alter the quantity and/or types of menu items 155 that are displayed, in another example of at least a subset of the menu items 155 being modal in nature. By way of example, the uncoupling of one of the sources 901-904 where that one of the sources 901-904 had been coupled through cabling may cause the controller 500 to remove the one of the menu items 155 by which that now uncoupled one of the sources 901-904 could be selected. Alternatively and/or additionally, where that uncoupled one of the sources 901-904 was already selected at the time of such uncoupling such that a subset of the menu items 155 is displayed that is meant to correspond to the functions able to be performed by that now uncoupled one of the sources 901-904, the controller 500 may respond to such an uncoupling by autonomously selecting one of the others of the sources 901-904 and altering the subset of the menu items 155 to correspond to the functions able to be performed by that newly selected one of the sources 901-904. In contrast, and by way of another example, the uncoupling of one of the sources 901-904 where that one of the sources 901-904 had been wirelessly coupled may or may not cause the controller 500 to remove the one of the menu items 155 by which that now uncoupled one of the sources 901-904 could be selected. If there is a mechanism provided in the chosen form of wireless communications used in the coupling that indicates that the uncoupling is due simply to that one of the sources 901-904 entering into a low-power or “sleep” mode, then it may be that no change is made by the controller 500 to the menu items 155 that are displayed, especially if the form of wireless communications used allows the controller 500 to signal that one of the sources 901-904 to “wake up” in response to the user selecting one of the menu items 155 that is associated with it. However, if no such mechanism to indicate the circumstances of an uncoupling is available, then the uncoupling may well result in an alteration or removal of at least some of the menu items 155 displayed on the racetrack menu 150. Where a previously uncoupled one of the sources 901-904 is subsequently coupled once again, regardless of the type of coupling, the controller 500 may be caused to automatically select that now coupled one of the sources 901-904. This may be done based on an assumption that the user has coupled that source to whatever audio/visual device into which the controller 500 is incorporated with the intention of immediately playing an audio/visual program from it.
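The “modal” rebuilding of the menu items described in the last two paragraphs can be sketched as a function of the selected source's capabilities and of coupling events; the capability names and source labels below are hypothetical.

SOURCE_CAPABILITIES = {
    'source I': {'tune', 'record'},     # hypothetical capability sets
    'source II': {'play', 'stop'},
}

def modal_menu_items(selected_source, always_shown):
    # Non-modal items stay; modal items follow the selected source's functions.
    modal = sorted(SOURCE_CAPABILITIES.get(selected_source, set()))
    return list(always_shown) + modal

def on_source_uncoupled(uncoupled, selected_source, coupled_sources):
    coupled_sources.discard(uncoupled)
    if uncoupled == selected_source and coupled_sources:
        selected_source = next(iter(coupled_sources))  # autonomously pick another
    return selected_source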
While at least some of the menu items 155 may be modal in nature such that they are apt to change depending on the selection and/or condition of one or more of the sources 901-904, others of the menu items 155 may not be modal in nature such that they are always displayed whenever the racetrack menu 150 is displayed. More specifically, where one or more of the sources 901-904 are incorporated into the same audio/visual device as the controller 500, the ones of the menu items 155 associated with those sources may remain displayed in the racetrack menu 150, regardless of the occurrences of many possible events that may cause other menu items 155 having a modal nature to be displayed, to not be displayed, or to be displayed in some altered form. By way of example, where a radio frequency tuner is incorporated into the same audio/visual device into which the controller 500 is incorporated, then a subset of the menu items 155 associated with selecting a radio frequency channel (e.g., the decimal point and numerals “0” through “9” depicted as displayed within the side 150a) may be a subset of the menu items 155 that is always displayed in the racetrack menu 150. It may be that the selection of any menu item of such a subset of the menu items 155 may cause the controller 500 to automatically switch the selection of a source of audio/visual programs to the source associated with those menu items 155. Thus, in the example where an audio/visual device incorporates a radio frequency tuner and menu items 155 associated with selecting a radio frequency channel are always displayed, the selection of any one of those menu items would cause the controller 500 to automatically switch to that radio frequency tuner as the source from which to receive an audio/visual program if that tuner were not already selected as the source. By way of another example, one or more of the menu items 155 associated with selecting a source of audio/visual programs (e.g., the roman numerals “I” through “IV” depicted as displayed within the side 150c) may be menu items that are always displayed in the racetrack menu 150.
Regardless of what source is selected or how the source is selected, if an audio/visual program received by the controller 500 from that source has a visual portion, then the controller 500 causes that visual portion to be displayed in the display area 950. As has so far been depicted and described, the racetrack menu 150 has a rectilinear configuration with the four sides 150a-d that are configured to surround or overlie edges of the display area 950. However, in some embodiments, it may be that the racetrack menu 150 is not always displayed, such that what is shown on the display element 120 of the audio/visual device 100 could be either the display area 950 surrounded by the racetrack menu 150, or the display area 950 expanded to fill the area otherwise occupied by the racetrack menu 150.
As depicted in FIG. 6, what is shown on the display element 120 could toggle between these two possibilities, and this toggling could occur in response to observed activity and/or a lack of observed activity in the operation of at least the racetrack surface 250. More specifically, on occasions where no indication of contact by a user's digit on the racetrack surface 250 has been received by the controller 500 for at least a predetermined period of time, the controller 500 may provide the display element 120 with an image that includes substantially nothing else but the display area 950, such that a visual portion of an audio/visual program is substantially the only thing shown on the display element 120. However, once the controller 500 has received an indication of activity, such as the tip of a digit making contact with the racetrack surface 250, the controller 500 then provides the display element 120 with an image that includes a combination of the display area 950 and the racetrack menu 150.
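As a minimal sketch of this toggling behavior, assuming a hypothetical timeout value (the predetermined period is not specified above) and illustrative class and method names, a controller's menu-visibility state might be tracked as follows in Python:

    # Hypothetical sketch of the display toggle described above: the racetrack
    # menu is shown on touch activity and hidden after a period of inactivity.
    import time

    MENU_TIMEOUT_S = 5.0   # assumed predetermined period; not specified in the text

    class MenuVisibility:
        def __init__(self, timeout_s: float = MENU_TIMEOUT_S):
            self.timeout_s = timeout_s
            self.last_touch = None
            self.menu_shown = False

        def on_touch(self, now: float) -> None:
            self.last_touch = now
            self.menu_shown = True          # show display area 950 plus racetrack menu 150

        def on_tick(self, now: float) -> None:
            if self.menu_shown and self.last_touch is not None:
                if now - self.last_touch >= self.timeout_s:
                    self.menu_shown = False  # show substantially nothing but display area 950

    vis = MenuVisibility()
    vis.on_touch(time.monotonic())
    vis.on_tick(time.monotonic() + 6.0)
    print(vis.menu_shown)   # False once the inactivity period has elapsed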
In some embodiments, at a time when both the display area 950 and the racetrack menu 150 are displayed, the controller 500 reduces the size of the display area 950 to make room around the edges of the display area 950 for the display of the racetrack menu 150 on the display element 120, and in so doing, may rescale the visual portion (if there is one) of whatever audio/visual program may be playing at that time. In other embodiments, the display area 950 is not resized, and instead, the racetrack menu 150 is displayed in a manner in which the racetrack menu 150 overlies edge portions of the display area 950 such that edge portions of any visual portion of an audio/visual program are no longer visible. However, in those embodiments in which the racetrack menu 150 overlies edge portions of the display area 950, the racetrack menu 150 may be displayed in a manner in which at least some portions of the racetrack menu have a somewhat "transparent" quality in which the overlain edge portions of any visual portion of an audio/visual program can still be seen by the user "looking through" the racetrack menu 150. As will be familiar to those skilled in the art, this "transparent" quality may be achieved through any of a number of possible approaches to combining the pixels of the image of the racetrack menu 150 with pixels of the overlain portion of any visual portion of an audio/visual program (e.g., by averaging pixel color values, alternately interspersing pixels, or bit-wise binary combining of pixels with a pixel mask).
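As a minimal sketch of the first of these approaches (averaging pixel color values), assuming 8-bit RGB values, the combination of a menu pixel with the overlain video pixel might be expressed as follows in Python; the weighting parameter is an illustrative generalization of plain averaging and is not drawn from the disclosure:

    # Hypothetical sketch of one pixel-combining approach mentioned above:
    # weighted averaging of racetrack menu pixels with the overlain video pixels.
    def blend_pixel(menu_rgb, video_rgb, menu_weight=0.5):
        # menu_weight = 0.5 reproduces plain averaging; other weights give other
        # degrees of "transparency". All values are assumed to be 0-255 integers.
        return tuple(
            int(menu_weight * m + (1.0 - menu_weight) * v)
            for m, v in zip(menu_rgb, video_rgb)
        )

    print(blend_pixel((255, 255, 255), (0, 0, 128)))   # (127, 127, 191)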
Along with combining the visual display of the display area 950 and the racetrack menu 150, the controller 500 may also combine audio associated with operation of the user interface 1000 with an audio portion (if present) of an audio/visual program being played. More specifically, "click" sounds associated with the user pressing the racetrack surface 250 defined on a surface of the touch sensor 220 with greater pressure and/or with the "snapping" of the marker 160 between adjacent ones of the menu items 155 may be combined with whatever audio portion is acoustically output as part of the playing of an audio/visual program.
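A minimal sketch of combining such "click" sounds with an audio portion, assuming signed 16-bit integer samples and a simple additive mix (one of many possible approaches, and not necessarily the one used by the controller 500), might look like this:

    # Hypothetical sketch: mixing a short "click" sound with the audio portion
    # of a program by summing samples and clamping to the signed 16-bit range.
    def mix_samples(program, click, click_gain=0.5):
        out = []
        for i, p in enumerate(program):
            c = click[i] if i < len(click) else 0
            s = int(p + click_gain * c)
            out.append(max(-32768, min(32767, s)))   # clamp to avoid overflow
        return out

    print(mix_samples([1000, -2000, 30000], [8000, 8000]))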
In some embodiments, at a time when the racetrack menu 150 is not displayed (e.g., at a time when only the display area 950 is displayed), the controller 500 may do more than simply cause the racetrack menu 150 to be displayed in response to a user touching a portion of the racetrack surface 250. More specifically, in addition to causing the racetrack menu 150 to be displayed, the controller 500 may take particular actions in response to particular ones of the sides 250a-d of the racetrack surface 250 being touched by a user at a time when the racetrack menu 150 is not being displayed. By way of example, at a time when the racetrack menu 150 is not being displayed, the detection of a touch to the side 250d may cause a command to be sent to one of the sources 901-904 to provide an on-screen guide concerning audio/visual programs able to be provided by that source, where such a guide would be displayed in the display area 950, with edges of the display area 950 being either surrounded or overlain by the racetrack menu 150 as has been previously described.
In a variation of such embodiments, it may be that causing the racetrack menu 150 to be displayed requires both a touch and some minimum degree of movement of the tip of a user's digit on the racetrack surface 250 (i.e., a kind of "touch-and-drag" or "wiping" motion across a portion of the racetrack surface 250), while other particular actions are taken where there is only a touch of a tip of a user's digit on particular ones of the sides 250a-d of the racetrack surface 250. By way of example, while the racetrack menu 150 is not displayed, touching the side 250a may cause a command to be sent to a source to turn that source on or off, and touching the side 250b may cause an audio portion of an audio/visual program to be muted, while both touching and moving a digit across a portion of the racetrack surface 250 in a "wiping" motion is required to enable the display and use of the racetrack menu 150.
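One way such a distinction between a simple touch and a "wiping" motion might be made is sketched below in Python; the movement threshold and the function name classify_gesture are hypothetical, chosen only for illustration:

    # Hypothetical sketch: discriminating a simple touch of one of the sides 250a-d
    # from the touch-and-drag (wiping) motion that enables the racetrack menu.
    import math

    DRAG_THRESHOLD = 10.0   # assumed minimum movement, in touch-sensor units

    def classify_gesture(points):
        """points: list of (x, y) samples from touch-down to lift-off."""
        if not points:
            return "none"
        x0, y0 = points[0]
        max_travel = max(math.hypot(x - x0, y - y0) for x, y in points)
        return "wipe" if max_travel >= DRAG_THRESHOLD else "tap"

    print(classify_gesture([(0, 0), (1, 0), (2, 1)]))    # tap -> side-specific action
    print(classify_gesture([(0, 0), (6, 0), (14, 2)]))   # wipe -> show racetrack menu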
FIGS. 7a and 7b, taken together, depict additional features that may be incorporated into the user interface 1000. Where a selected one of the sources 901-904 displays its own on-screen menu 170 (e.g., a guide concerning audio/visual programs available from that source), either in place of a visual portion of an audio/visual program or overlying a visual portion of an audio/visual program, some embodiments of the user interface 1000 may be augmented to support at least partly integrating the manner in which a user would navigate such an on-screen menu 170 into the user interface 1000. In such embodiments, the touch sensor 220, with its ring shape (whether that ring shape is a rectangular ring shape, or a ring shape of a different geometry), may be configured to surround a set of controls for use in navigating the on-screen menu 170, just as the racetrack menu 150 surrounds the on-screen menu 170, itself.
In particular, FIG. 7b depicts the manner in which the touch sensor 220 disposed on the casing 210 of the remote control 200 of FIG. 1 may surround navigation buttons 270a, 270b, 270c and 270d, as well as a selection button 280, that are also disposed on the casing 210. In alternate variants, other forms of one or more manually-operable controls may be surrounded by the touch sensor 220, in addition to or in place of the navigation buttons 270a-d and the selection button 280, including and not limited to, a joystick, or a four-way rocker switch that may either surround a selection button (such as the selection button 280) or be usable as a selection button by being pressed in the middle. As a result of the ring shape of the touch sensor 220 being employed to surround the navigation buttons 270a-d and the selection button 280, a nested arrangement of concentrically located manually operable controls is created. FIG. 7a depicts a form of possible on-screen menu that will be familiar to those skilled in the art, including various menu items 175 that may be selected via the selection button 280, and a marker 180 that may be moved by a user among the menu items 175 via the navigation buttons 270a-d. The concentrically nested arrangement of manually operable controls surrounded by the racetrack surface 250 defined on the touch-sensitive surface 225 of the touch sensor 220 that is disposed on the casing 210 of the remote control 200 corresponds to the similarly nested arrangement of the on-screen menu 170 surrounded by the racetrack menu 150 that is displayed on the display element 120.
FIG. 7b also depicts additional controls 222, 225, 226 and 228 that may be employed to perform particular functions where it may be deemed desirable to provide at least some degree of functionality in a manner that does not require the selection of menu items to operate. In one possible variant, the controls 222, 225, 226 and 228 are operable as a power button, a mute button, a volume rocker switch and a channel increment/decrement rocker switch, respectively. FIG. 8 depicts a variant of the handheld form of the audio/visual device 100 depicted in FIG. 5 in which the touch sensor 220 is positioned so as to surround the navigation buttons 270a-d and the selection button 280, and in which this variant of the handheld form of the audio/visual device 100 may further incorporate the controls 222, 225, 226 and 228.
FIG. 9 is a block diagram of a possible architecture of the controller 500 in which the controller 500 incorporates an output interface 510, a sensor interface 520, a storage 540, a processing device 550 and a source interface 590. The processing device 550 is coupled to each of the output interface 510, the sensor interface 520, the storage 540 and the source interface 590 to at least coordinate the operation of each to perform at least the above-described functions of the controller 500.
The processing device 550 may be any of a variety of types of processing device based on any of a variety of technologies, including and not limited to, a general purpose central processing unit (CPU), a digital signal processor (DSP), a microcontroller, or a sequencer. The storage 540 may be based on any of a variety of data storage technologies, including and not limited to, any of a wide variety of types of volatile and nonvolatile solid-state memory, magnetic media storage, and/or optical media storage. It should be noted that although the storage 540 is depicted in a manner that is suggestive of it being a single storage device, the storage 540 may be made up of multiple storage devices, each of which may be based on different technologies.
Each of the output interface 510, the sensor interface 520 and the source interface 590 may employ any of a variety of technologies to enable the controller 500 to communicate with other devices and/or other components of whatever audio/visual device into which the controller 500 is incorporated. More specifically, where the controller 500 is incorporated into an audio/visual device that also incorporates one or both of a display element (such as the display element 120) and at least one acoustic driver (such as the acoustic drivers 130), the output interface 510 may be of a type able to directly drive a display element with signals causing the display of the racetrack menu 150 and the display area 950 to display visual portions of audio/visual programs, and/or able to directly drive one or more acoustic drivers to acoustically output audio portions of audio/visual programs. Alternatively, where one or both of a display element and acoustic drivers are not incorporated into the same audio/visual device into which the controller 500 is incorporated, the output interface 510 may be of a type employing cabling-based and/or wireless signaling (perhaps signaling conforming to one of the previously listed industry standards) to transmit a signal to another audio/visual device into which a display element and/or acoustic drivers are incorporated (e.g., the audio/visual device 100).
Similarly, where the controller 500 is incorporated into an audio/visual device into which the touch sensor 220 is also incorporated, the sensor interface 520 may be of a type able to directly receive electrical signals emanating from the touch sensor 220. With such a more direct coupling, the sensor interface 520 may directly monitor a two-dimensional array of touch-sensitive points of the touch-sensitive surface 225 of the touch sensor 220 for indications of which touch-sensitive points are being touched by a tip of a user's digit, and thereby enable the processing device 550 to employ those indications to directly determine where the touch-sensitive surface 225 is being touched. Thus, a determination of whether or not the tip of the digit is touching a portion of the racetrack surface 250 and/or the position 260 by the processing device 550 may be enabled. However, where the controller 500 is incorporated into a device into which the touch sensor 220 is not also incorporated (e.g., the controller 500 is incorporated into the audio/visual device 100 and the touch sensor is incorporated into the remote control 200), the sensor interface 520 may be of a type able to receive cabling-based and/or wireless signaling transmitted by that other device (e.g., infrared signals emitted by the remote control 200). With such a more remote coupling, circuitry (not shown) that is co-located with the touch sensor 220 may perform the task of directly monitoring a two-dimensional array of touch-sensitive points of the touch-sensitive surface 225, and then transmit indications of which touch-sensitive points are being touched by the tip of a user's digit to the sensor interface 520.
Although it is possible that the audio/visual device into which the controller 500 is incorporated may not incorporate any sources (such as the sources 901-904) from which the controller 500 receives audio/visual programs, it is deemed more likely that the audio/visual device into which the controller 500 is incorporated will incorporate one or more of such sources in addition to being capable of receiving audio/visual programs from sources not incorporated into the same audio/visual device. By way of example, it is envisioned that the controller 500 may be incorporated into an audio/visual device into which a radio frequency tuner and/or an Internet access device is also incorporated to enable access to audio/visual programs for selection and playing without the attachment of another audio/visual device, while also having the capability of being coupled to another audio/visual device to receive still other audio/visual programs. In other words, it is envisioned that the controller 500 may well be incorporated into an audio/visual device that is at least akin to a television, whether portable (e.g., as depicted in FIG. 5) or stationary (e.g., as depicted in FIG. 1). Therefore, although the source interface 590 may have any of a number of configurations to couple the controller 500 to any of a number of possible sources, it is envisioned that the source interface 590 will be configured to enable the controller 500 to be coupled to at least one source that is also incorporated into the same audio/visual device into which the controller 500 is incorporated, and to also enable the controller 500 to be coupled to at least one source that is not incorporated into the same audio/visual device.
Thus, the source interface 590 incorporates one or more of an electrical interface 595, an optical interface 596, a radio frequency transceiver 598 and/or an infrared receiver 599. The electrical interface 595 (if present) enables the source interface 590 to couple the controller 500 to at least one source, whether incorporated into the same audio/visual device as the controller 500, or not, to receive electrical signals (e.g., Ethernet, S-Video, USB, HDMI, etc.) conveying an audio/visual program to the controller 500. The optical interface 596 (if present) enables the source interface 590 to couple the controller 500 to at least one source to receive optical signals (e.g., Toslink) conveying an audio/visual program to the controller 500. The radio frequency transceiver 598 (if present) enables the source interface 590 to wirelessly couple the controller 500 to at least one other audio/visual device functioning as a source to receive radio frequency signals (e.g., Bluetooth, a variant of IEEE 802.11, ZigBee, etc.) conveying an audio/visual program to the controller 500 from that other audio/visual device. The infrared receiver 599 (if present) enables the source interface 590 to wirelessly couple the controller 500 to at least one other audio/visual device functioning as a source to receive infrared signals conveying an audio/visual program to the controller 500 from that other source. It should be noted that although the output interface 510 and the sensor interface 520 are depicted as separate from the source interface 590, it may be deemed advantageous, depending on the nature of the signaling supported, to combine one or both of the output interface 510 and the sensor interface 520 with the source interface 590.
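As a purely illustrative sketch of this architecture, and not as the actual implementation of the controller 500, the following Python skeleton groups the components named in FIG. 9, with the source interface 590 holding whichever of its optional sub-interfaces are present; all class names, attribute names and type choices are hypothetical:

    # Hypothetical skeleton mirroring the block diagram of FIG. 9.
    from dataclasses import dataclass, field
    from typing import Callable, Dict, Optional

    @dataclass
    class SourceInterface590:
        # Each sub-interface is optional, mirroring the "(if present)" language above.
        electrical_595: Optional[Callable[[bytes], None]] = None
        optical_596: Optional[Callable[[bytes], None]] = None
        rf_transceiver_598: Optional[Callable[[bytes], None]] = None
        ir_receiver_599: Optional[Callable[[], bytes]] = None

    @dataclass
    class Controller500:
        output_interface_510: object      # drives a display element and/or acoustic drivers
        sensor_interface_520: object      # receives touch indications from the touch sensor 220
        storage_540: Dict[str, object] = field(default_factory=dict)   # routine 450 and data 492-498
        source_interface_590: SourceInterface590 = field(default_factory=SourceInterface590)

    controller = Controller500(output_interface_510=None, sensor_interface_520=None)
    print(controller.source_interface_590)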
Stored within the storage 540 are one or more of a control routine 450, a protocols data 492, a commands data 493, an audio/visual data 495, a rescaled audio/visual data 496, and menu data 498. Upon being executed by the processing device 550, a sequence of instructions of the control routine 450 causes the processing device 550 to coordinate the monitoring of the touch sensor 220 for user input, the output of the racetrack menu 150 to a display element (e.g., the display element 120), the selection of a source of an audio/visual program to be played, and one or both of the display of a visual portion of an audio/visual program on a display element on which the racetrack menu 150 is also displayed and the acoustic output of an audio portion of the audio/visual program via one or more acoustic drivers (e.g., the acoustic drivers 130).
Upon execution, the control routine 450 causes the processing device 550 to operate the sensor interface 520 to await indications of a user placing a tip of a digit in contact with a portion of the racetrack surface 250 defined on a surface of the touch sensor 220, moving that digit about the racetrack surface 250 and/or applying greater pressure at the position 260 on the racetrack surface 250 to make a selection. Upon receiving an indication of activity by the user involving the racetrack surface 250, the processing device 550 may be caused to operate the output interface to display the racetrack menu 150 with one or more of the menu items 155 positioned thereon and surrounding the display area 950 via a display element, if the racetrack menu 150 is not already being displayed. The processing device 550 is further caused to display and position at least the marker 160 on the racetrack menu 150 in a manner that corresponds to the position 260 of the user's digit on the racetrack surface 250. Further, in response to the passage of a predetermined period of time without receiving indications of activity by the user involving the racetrack surface 250, the processing device 550 may be caused to operate the output interface 510 to cease displaying the racetrack menu 150, and to display substantially little else on a display element than the display area 950.
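As one possible illustration of how the position 260 on the ring-shaped racetrack surface 250 might be mapped to a corresponding position for the marker 160 on the racetrack menu 150, and "snapped" to the nearest menu item 155, the following Python sketch parameterizes the perimeter of a rectangular ring as a value from 0 to 1. The function names, the clockwise convention and the evenly spaced item positions are assumptions made only for this example:

    # Hypothetical sketch: mapping a touch on the rectangular racetrack surface 250
    # to a 0..1 perimeter parameter, then snapping to the nearest menu-item position.
    def ring_parameter(x, y, w, h):
        """Project a point on the rectangular ring onto a 0..1 perimeter parameter,
        measured from the top-left corner (illustrative convention)."""
        per = 2.0 * (w + h)
        d = {"top": y, "right": w - x, "bottom": h - y, "left": x}
        side = min(d, key=d.get)            # which edge the touch lies on
        if side == "top":
            s = x
        elif side == "right":
            s = w + y
        elif side == "bottom":
            s = w + h + (w - x)
        else:
            s = 2 * w + h + (h - y)
        return s / per

    def nearest_item(param, item_params):
        """Snap a ring parameter to the closest menu-item parameter (both 0..1)."""
        return min(item_params,
                   key=lambda p: min(abs(p - param), 1.0 - abs(p - param)))

    items = [i / 8.0 for i in range(8)]     # eight evenly spaced menu items 155
    print(nearest_item(ring_parameter(50, 0, 100, 60), items))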
Upon execution, the control routine 450 causes the processing device 550 to operate the sensor interface 520 to await an indication of a selection of a menu item 155 that corresponds to selecting a source from which the user may wish an audio/visual program to be provided for playing, and may operate the source interface 590 to at least enable receipt of an audio/visual program from that selected source. Where an audio/visual program is received, the processing device 550 may be further caused to buffer audio and/or visual portions of the audio/visual program in the storage 540 as the audio/visual data 495. In embodiments in which a visual portion of an audio/visual program is rescaled to be displayed in the display area 950 at a time when the display area 950 is surrounded by the racetrack menu 150, the processing device 550 may be further caused to buffer the rescaled form of the visual portion in the storage as the rescaled audio/visual data 496.
Upon execution, the control routine 450 causes the processing device 550 to operate the sensor interface 520 to await an indication of a selection of a menu item 155 corresponding to the selection of a command (e.g., "play" or "record" commands, numerals or other symbols specifying a radio frequency channel to tune, etc.) to be transmitted to an audio/visual device serving as a source, and may operate the source interface 590 to transmit a command to that audio/visual device (e.g., one of the sources 901-904) that corresponds to a menu item 155 that has been selected. In transmitting that command, the processing device 550 may be further caused to refer to the protocols data 492 for data concerning sequences of signals that must be transmitted by the source interface 590 as part of a communications protocol in preparation for transmitting the command, and/or the processing device 550 may be further caused to refer to the commands data 493 for data concerning the sequence of signals that must be transmitted by the source interface 590 as part of transmitting the command. As will be familiar to those skilled in the art, some of the earlier listed forms of coupling make use of various protocols to organize various aspects of commands and/or data that are conveyed, including and not limited to, Ethernet, Bluetooth, IEEE-1394, USB, etc. In support of the processing device 550 responding to the selection of various ones of the menu items 155, the processing device 550 is further caused to store data correlating at least some of the various menu items with actions to be taken by the processing device 550 in response to their selection by the user in the storage 540 as the menu data 498.
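By way of a hypothetical sketch only, the roles described above for the protocols data 492 and the commands data 493 might be modeled as lookup tables consulted before a command is transmitted through the source interface 590. The protocol name and byte values below are placeholders chosen for illustration and are not drawn from any actual command set:

    # Hypothetical sketch: looking up protocol preparation bytes (role of 492) and
    # command bytes (role of 493) before transmitting a command to a selected source.
    PROTOCOLS_DATA = {
        "link_a": {"preamble": b"\x10\x00"},      # placeholder handshake bytes
    }
    COMMANDS_DATA = {
        ("link_a", "play"):   b"\x01",            # placeholder command bytes
        ("link_a", "record"): b"\x02",
    }

    def transmit_command(send, protocol: str, command: str) -> None:
        """send: a callable that writes raw bytes out through the source interface 590."""
        send(PROTOCOLS_DATA[protocol]["preamble"])    # protocol preparation
        send(COMMANDS_DATA[(protocol, command)])      # the command sequence itself

    sent = []
    transmit_command(sent.append, "link_a", "play")
    print(sent)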
Amidst operating the source interface 590 to enable receipt of an audio/visual program from a source selected by the user, the processing device 550 may be caused to operate the output interface 510 to alter the quantity and/or type of menu items 155 that are displayed at various positions on the racetrack menu 150. In so doing, the processing device 550 may be further caused to store information concerning the size, shape, color and other characteristics of the racetrack menu 150, at least some of the graphical representations of the menu items 155, and/or at least one graphical representation of the marker 160 in the storage 540 as part of the menu data 498.
FIGS. 10a and 10b, taken together, depict and contrast two variants of the touch sensor 220. Both variants are depicted in perspective as distinct touch-sensitive devices that are typically mounted within a recess of a casing of a device, including either the casing 110 of any variant of the audio/visual device 100 or the casing 210 of any variant of the remote control 200. However, as those skilled in the art will readily recognize, other touch-sensitive device technologies may yield variants of the touch-sensitive device 220 that are film-like overlays that may be positioned to overlie a portion of a casing or of a circuit board of a device. The discussion that follows is centered more on the shape and utilization of the touch-sensitive surface 225 of the touch sensor 220, and not on the touch-sensitive technology employed.
FIG. 10a depicts the variant of the touch sensor 220 having the ring shape that has been discussed above at length, which permits other manually-operable controls (e.g., the navigation buttons 270a-d and the selection button 280) to be positioned in a manner in which they are surrounded by the ring shape of the touch sensor 220. As has already been discussed, the ring shape of this variant of the touch sensor 220 provides a form of the touch-sensitive surface 225 that is bounded by the ring shape of the touch sensor 220, and this in turn defines the ring shape of the racetrack surface 250 (where the racetrack surface 250 is defined on the touch-sensitive surface 225 to encompass substantially all of the touch-sensitive surface 225). Once again, although this variant of the touch sensor 220 is depicted as having a rectangular ring shape having four sides, other embodiments are possible in which the touch sensor 220 has a ring shape of a different geometry, such as a circular ring shape, an oval ring shape, a hexagonal ring shape, etc.
FIG. 10b depicts an alternate variant of the touch sensor 220 having a rectangular shape that provides a continuous form of the touch-sensitive surface 225 that is bounded by this rectangular shape (i.e., there is no "hole" or opening formed through the touch-sensitive surface 225). This rectangular shape more easily enables more than the ring shape of the racetrack surface 250 to be defined on the touch-sensitive surface 225 in a manner in which the racetrack surface 250 encompasses only a portion of the touch-sensitive surface 225 and leaves open the possibility of one or more other surfaces that serve other functions also being defined thereon. In this alternate variant, the ring shape of the racetrack surface 250 may be defined by a processing device executing a sequence of instructions of a routine, such as the processing device 550 executing the control routine 450 in FIG. 9. In other words, the location of the racetrack surface 250 may be defined by a processing device first being provided with indications of which touch-sensitive points of an array of touch-sensitive points making up the touch-sensitive surface 225 are being touched by a tip of a user's digit, and second treating some of those touch-sensitive points as belonging to the racetrack surface 250 and others of those touch-sensitive points as belonging to other surfaces that are defined on the touch-sensitive surface 225 (and which serve other functions).
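The treatment of some touch-sensitive points as belonging to the racetrack surface 250 and others as belonging to other defined surfaces might be sketched as follows in Python; the width of the peripheral band and the function name classify_point are illustrative assumptions only:

    # Hypothetical sketch: on the continuous rectangular variant of the touch-sensitive
    # surface 225, points within a peripheral band are treated as the racetrack surface
    # 250, and points inside that band as other defined surfaces (e.g., navigation).
    BAND = 8   # assumed width of the peripheral racetrack band, in touch points

    def classify_point(x, y, width, height, band=BAND):
        on_ring = x < band or y < band or x >= width - band or y >= height - band
        return "racetrack_surface_250" if on_ring else "inner_surface"

    print(classify_point(3, 20, 100, 60))    # racetrack_surface_250
    print(classify_point(50, 30, 100, 60))   # inner_surface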
Alternatively and/or additionally, one or more ridges 227 and/or grooves (not shown) may be formed in the touch-sensitive surface 225 to at least provide a tactile guide as to where the racetrack surface 250 is defined on the touch-sensitive surface 225. Such ridges 227 may be formed integrally with the touch-sensitive surface 225, may be formed as part of a casing on which the touch sensor 220 is disposed, or may be adhered to the touch-sensitive surface 225. Further, such ridges 227 and/or grooves (not shown) may coincide with locations on the touch-sensitive surface 225 at which the touch sensor 220 is incapable of detecting the touch of a tip of a digit (i.e., the touch-sensitive surface 225 may be made up of multiple separate touch-sensitive portions, of which one is a portion having a ring shape where the racetrack surface 250 is defined).
More specifically, and as depicted in dotted lines in FIG. 10b, the racetrack surface 250 is defined on the touch-sensitive surface 225 so as to be positioned about the periphery of the touch-sensitive surface 225 such that the ring shape of the racetrack surface 250 surrounds the remainder of the touch-sensitive surface 225. As also depicted, at least a portion of the touch-sensitive surface 225 that is surrounded by the racetrack surface 250 may be employed to provide the equivalent function of other manually-operable controls, such as the navigation buttons 270a-d and the selection button 280. In other words, the navigation buttons 270a-d and the selection button 280 may be implemented as navigation surfaces and a selection surface, respectively, defined on the touch-sensitive surface of the touch sensor 220 (perhaps by a processing device executing a sequence of instructions), along with the racetrack surface 250.
It should be noted that although both of the variants of the touch sensor 220 have been depicted in FIGS. 10a and 10b as having rectangular shapes with right angle corners, either variant may alternatively have rounded corners. Indeed, where such a variant of the touch sensor 220 has one or more of the ridges 227 and/or grooves (not shown), such ones of the ridges 227 and/or grooves may also have rounded corners, despite being depicted as having right angle corners in FIGS. 10a and 10b.
FIGS. 11a and 11b, taken together, depict two variants of the user interface 1000 in which more than one display area is defined within the portion of the display element 120 that is surrounded by the racetrack menu 150. These variants enable more than one visual portion of one or more selected audio/visual programs to be played on the display element 120 in a manner that enables a user to view them simultaneously. Also depicted is the manner in which various ones of the menu items 155 associated with only one of the display areas may be positioned along the racetrack menu 150 to provide a visual indication of their association with that one of the display areas.
More specifically, FIG. 11a depicts a configuration that is commonly referred to as "picture-in-picture" in which a display area 970 having smaller dimensions than the display area 950 is positioned within and overlies a portion of the display area 950. As also depicted, ones of the menu items 155 that are associated with the visual portion displayed in the display area 970 are positioned along portions of the racetrack menu 150 that are located closer to the display area 970 (specifically, portions of the sides 150b and 150d) to provide a visual indication to the user of that one association. Further, ones of the menu items 155 that are associated with the visual portion displayed in the display area 950 are positioned along portions of the racetrack menu 150 that are further from the display area 970 (specifically, the sides 150a and 150c) to provide a visual indication to the user of that other association. As suggested in the depiction of FIG. 11a, the ones of the menu items 155 that are associated with the display area 950 correspond to commands to play or to stop playing an audio/visual program, selection of an input, and radio frequency channel tuning. The ones of the menu items 155 that are associated with the display area 970 correspond to commands to play or to stop playing an audio/visual program, and selection of an input.
Also more specifically, FIG. 11b depicts a configuration that is commonly referred to as "picture-by-picture" in which the display areas 950 and 970 are positioned adjacent each other (as opposed to one overlapping the other) within the portion of the display element surrounded by the racetrack menu 150. Again as depicted, ones of the menu items 155 that are associated with the visual portion displayed in the display area 950 are positioned along portions of the racetrack menu 150 that are located closer to the display area 950 (specifically, the side 150c and portions of the sides 150a and 150b) to provide a visual indication to the user of that one association. Further, ones of the menu items 155 that are associated with the visual portion displayed in the display area 970 are positioned along portions of the racetrack menu 150 that are located closer to the display area 970 (specifically, the side 150d and portions of the sides 150a and 150b) to provide a visual indication to the user of that other association. As suggested in the depiction of FIG. 11b, each of the display areas 950 and 970 is associated with separate ones of the menu items 155 that correspond to commands to play or to stop playing an audio/visual program, selection of an input, and radio frequency channel tuning.
Although FIGS. 11a and 11b depict embodiments having only two display areas (i.e., the display areas 950 and 970) within the portion of the display element 120 surrounded by the racetrack menu 150, those skilled in the art will readily recognize that other embodiments incorporating more than two such display areas are possible, and that in such embodiments, each of the menu items 155 may be positioned along the racetrack menu 150 in a manner providing a visual indication of its association with one of those display areas. Indeed, it is envisioned that variants of the user interface 1000 are possible having 2-by-2 or larger arrays of display areas to accommodate the simultaneous display of multiple visual portions, possibly in security applications.
Although FIGS. 11a and 11b depict separate sets of the menu items 155 corresponding to commands to play and to stop playing an audio/visual program that are separately associated with each of the display areas 950 and 970, and although this suggests that the visual portions played in each of the display areas 950 and 970 must be from different audio/visual programs, it should be noted that the simultaneously displayed visual portions in the display areas 950 and 970 may be of the same audio/visual program. As those skilled in the art will readily recognize, an audio/visual program may have more than one visual portion. An example of this may be an audio/visual program including video of an event taken from more than one angle, such as an audio/visual program of a sports event where an athlete is shown in action from more than one camera angle. In such instances, there may be only one set of the menu items 155 corresponding to commands to play, fast-forward, rewind, pause and/or to stop playing the single audio/visual program, instead of the separate sets of menu items depicted in FIGS. 11a and 11b.
With the simultaneous display of multiple visual portions, there may be multiple audio portions that each correspond to a different one of the visual portions. While viewing multiple visual portions simultaneously may be relatively easy for a user insofar as the user is able to choose any visual portion to watch with their eyes, listening to multiple audio portions simultaneously may easily become overwhelming. To address this, some embodiments may select one of the audio portions to be acoustically output to the user based on the position 260 of a tip of a digit along the racetrack surface 250 (referring back to FIG. 2). Where the position 260 at which the user places a tip of a digit on the racetrack surface 250 corresponds to a portion of the racetrack menu 150 that is closer to the display area 950, then an audio portion of the audio/visual program of the visual portion being displayed in the display area 950 is acoustically output to the user. If the user then moves that tip of a digit along the racetrack surface 250 such that the position 260 is moved to a portion of the racetrack surface 250 that corresponds to a portion of the racetrack menu 150 that is closer to the display area 970, then an audio portion of the audio/visual program of the visual portion being displayed in the display area 970 is acoustically output to the user. As the user moves the tip of a digit about the racetrack surface 250 and the selection of the audio portion that is acoustically output changes, the corresponding position of the marker 160 along the racetrack menu 150 may serve as a visual indication to the user of which visual portion the currently selected audio portion corresponds to.
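As a purely illustrative sketch, the selection of which audio portion to acoustically output might reuse the 0-to-1 ring parameter of the earlier sketch and compare it against parameter ranges assigned to the portions of the racetrack menu 150 nearer each display area. The specific ranges, dictionary name and function name below are assumptions made only for this example:

    # Hypothetical sketch: choosing the audio portion to output based on whether the
    # ring parameter of position 260 falls on a portion of the racetrack that is
    # nearer display area 950 or display area 970.
    AREA_RANGES = {
        "display_area_950": [(0.00, 0.50)],   # e.g., the sides nearer area 950
        "display_area_970": [(0.50, 1.00)],   # e.g., the sides nearer area 970
    }

    def audio_source_for(param: float) -> str:
        for area, ranges in AREA_RANGES.items():
            if any(lo <= param < hi for lo, hi in ranges):
                return area
        return "display_area_950"             # default selection

    print(audio_source_for(0.25))   # display_area_950
    print(audio_source_for(0.75))   # display_area_970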
FIG. 12 depicts an alternate variant of the user interface 1000 in which the combined display of the racetrack menu 150 and the display area 950 surrounded by the racetrack menu 150 does not fill substantially all of the display element 120. Such an embodiment may be implemented on a more complex variant of the audio/visual device 100 capable of simultaneously performing numerous functions, some of which are entirely unrelated to the selection and playing of an audio/visual program. As depicted, this leaves a display area 920, which is outside the racetrack menu 150 and is overlain by the combination of the racetrack menu 150 and the display area 950, available for such unrelated functions. Such a more complex variant of the audio/visual device 100 may be a general purpose computer system, perhaps one employed as a "media center system" or "whole house entertainment system." In such an embodiment, the combination of the racetrack menu 150 and the display area 950 may be displayed in a window defined by an operating system having a windowing graphical user interface, where the window occupies substantially less than all of the display element 120.
As also depicted in FIG. 12, in such an embodiment, the user may select and control the playing of an audio/visual program through the use of a variant of the touch sensor 220 having a touch-sensitive surface 225 that has a continuous rectangular shape (such as the variant of the touch sensor 220 of FIG. 10b), as opposed to having a ring shape (such as the variant of the touch sensor 220 of FIG. 10a). The racetrack surface 250 is defined on the touch-sensitive surface 225 in a manner that occupies the periphery of the touch-sensitive surface 225 and that surrounds a remaining portion of the touch-sensitive surface 225 that enables conventional operation of other functions of the audio/visual device 100 that may be unrelated to the selection and playing of an audio/visual program. In essence, this remaining portion of the touch-sensitive surface 225 may be employed in a conventional manner that will be familiar to those skilled in the art of graphical user interfaces, in which a user moves about a graphical cursor using a tip of a digit placed on this remaining portion. Thus, the user may choose to engage in selecting audio/visual programs and controlling the playing of those audio/visual programs through the racetrack surface 250, and may choose to engage in performing other tasks unrelated to the selection and playing of audio/visual programs through the remaining portion of the touch-sensitive surface 225.
To provide tactile guidance to the user as to the location of the racetrack surface 250, one or more ridges 227 and/or grooves (not shown) may be formed in the touch-sensitive surface 225. In this way, the user may be aided in unerringly placing a tip of a digit on whichever one of the racetrack surface 250 or the remaining portion of the touch-sensitive surface 225 that they wish to place that tip upon, without errantly placing that tip on both, and without having to glance at the touch-sensitive surface 225 of the touch sensor 220.
Other implementations are within the scope of the following claims and other claims to which the applicant may be entitled.