RELATED APPLICATION This application is related to, and claims priority from, U.S. Provisional Patent Application Ser. No. 60/823,870 filed on Aug. 29, 2006, entitled “Graphical User Interface”, the disclosure of which is incorporated here by reference.
BACKGROUND Technologies associated with the communication of information have evolved rapidly over the last several decades. Television, cellular telephony, the Internet and optical communication techniques (to name just a few things) combine to inundate consumers with available information and entertainment options. Taking television as an example, the last three decades have seen the introduction of cable television service, satellite television service, pay-per-view movies and video-on-demand. Whereas television viewers of the 1960s could typically receive perhaps four or five over-the-air TV channels on their television sets, today's TV watchers have the opportunity to select from hundreds, thousands, and potentially millions of channels of shows and information. Video-on-demand technology, currently used primarily in hotels and the like, provides the potential for in-home entertainment selection from among thousands of movie titles.
The technological ability to provide so much information and content to end users provides both opportunities and challenges to system designers and service providers. One challenge is that while end users typically prefer having more choices rather than fewer, this preference is counterweighted by their desire that the selection process be both fast and simple. Unfortunately, the development of the systems and interfaces by which end users access media items has resulted in selection processes which are neither fast nor simple. Consider again the example of television programs. When television was in its infancy, determining which program to watch was a relatively simple process primarily due to the small number of choices. One would consult a printed guide which was formatted, for example, as a series of columns and rows which showed the correspondence between (1) nearby television channels, (2) programs being transmitted on those channels and (3) date and time. The television was tuned to the desired channel by adjusting a tuner knob and the viewer watched the selected program. Later, remote control devices were introduced that permitted viewers to tune the television from a distance. This addition to the user-television interface created the phenomenon known as “channel surfing” whereby a viewer could rapidly view short segments being broadcast on a number of channels to quickly learn what programs were available at any given time.
Despite the fact that the number of channels and amount of viewable content has dramatically increased, the generally available user interface, control device options and frameworks for televisions have not changed much over the last 30 years. Printed guides are still the most prevalent mechanism for conveying programming information. The multiple button remote control with up and down arrows is still the most prevalent channel/content selection mechanism. The reaction of those who design and implement the TV user interface to the increase in available media content has been a straightforward extension of the existing selection procedures and interface objects. Thus, the number of rows in the printed guides has been increased to accommodate more channels. The number of buttons on the remote control devices has been increased to support additional functionality and content handling, e.g., as shown in FIG. 1. However, this approach has significantly increased both the time required for a viewer to review the available information and the complexity of actions required to implement a selection. Arguably, the cumbersome nature of the existing interface has hampered commercial implementation of some services, e.g., video-on-demand, since consumers are resistant to new services that will add complexity to an interface that they view as already too slow and complex.
In addition to increases in bandwidth and content, the user interface bottleneck problem is being exacerbated by the aggregation of technologies. Consumers are reacting positively to having the option of buying integrated systems rather than a number of segregable components. An example of this trend is the combination television/VCR/DVD in which three previously independent components are frequently sold today as an integrated unit. This trend is likely to continue, potentially with an end result that most if not all of the communication devices currently found in the household will be packaged together as an integrated unit, e.g., a television/VCR/DVD/internet access/radio/stereo unit. Even those who continue to buy separate components will likely desire seamless control of, and interworking between, the separate components. With this increased aggregation comes the potential for more complexity in the user interface. For example, when so-called “universal” remote units were introduced, e.g., to combine the functionality of TV remote units and VCR remote units, the number of buttons on these universal remote units was typically more than the number of buttons on either the TV remote unit or VCR remote unit individually. This added number of buttons and functionality makes it very difficult to control anything but the simplest aspects of a TV or VCR without hunting for exactly the right button on the remote. Many times, these universal remotes do not provide enough buttons to access many levels of control or features unique to certain TVs. In these cases, the original device remote unit is still needed, and the original hassle of handling multiple remotes remains due to user interface issues arising from the complexity of aggregation. Some remote units have addressed this problem by adding “soft” buttons that can be programmed with the expert commands. These soft buttons sometimes have accompanying LCD displays to indicate their action. 
These too have the flaw that they are difficult to use without looking away from the TV to the remote control. Yet another flaw in these remote units is the use of modes in an attempt to reduce the number of buttons. In these “moded” universal remote units, a special button exists to select whether the remote should communicate with the TV, DVD player, cable set-top box, VCR, etc. This causes many usability issues, including sending commands to the wrong device, forcing the user to look at the remote to make sure that it is in the right mode, and failing to provide any simplification of the integration of multiple devices. The most advanced of these universal remote units provide some integration by allowing the user to program sequences of commands to multiple devices into the remote. This is such a difficult task that many users hire professional installers to program their universal remote units.
Some attempts have also been made to modernize the screen interface between end users and media systems. However, these attempts typically suffer from, among other drawbacks, an inability to easily scale between large collections of media items and small collections of media items. For example, interfaces which rely on lists of items may work well for small collections of media items, but are tedious to browse for large collections of media items. Interfaces which rely on hierarchical navigation (e.g., tree structures) may be speedier to traverse than list interfaces for large collections of media items, but are not readily adaptable to small collections of media items. Additionally, users tend to lose interest in selection processes wherein the user has to move through three or more layers in a tree structure. For all of these cases, current remote units make this selection process even more tedious by forcing the user to repeatedly depress the up and down buttons to navigate the list or hierarchies. When selection skipping controls are available such as page up and page down, the user usually has to look at the remote to find these special buttons or be trained to know that they even exist. Accordingly, organizing frameworks, techniques and systems which simplify the control and screen interface between users and media systems as well as accelerate the selection process, while at the same time permitting service providers to take advantage of the increases in available bandwidth to end user equipment by facilitating the supply of a large number of media items and new services to the user have been proposed in U.S. patent application Ser. No. 10/768,432, filed on Jan. 30, 2004, entitled “A Control Framework with a Zoomable Graphical User Interface for Organizing, Selecting and Launching Media Items”, the disclosure of which is incorporated here by reference.
In addition to being able to locate media items, such as movies, music, photos and personal videos, another feature which has become increasingly popular with consumers is the capability to create personalized media playlists. These playlists, when launched, provide an ordered way to present stored media items. Playlists can be stored and shared between users. Given the large volume of media items which can be navigated using the afore-described frameworks, new tools for playlist creation would be desirable.
SUMMARY According to one exemplary embodiment, a playlist generation system includes a television, a storage device for storing media items and selectively replaying the media items, a user interface for creating a playlist displayed on the television using images which correspond to the stored media items, and a 3D pointing device for providing pointing and selection inputs to the user interface to add said images corresponding to the media items to the displayed playlist.
According to another exemplary embodiment, a playlist generation method includes the steps of displaying a user interface for creating a playlist using images which correspond to stored media items, providing input to the user interface by pointing at the images displayed on the user interface using a 3D pointing device, and selectively adding media items to the playlist based on the input.
According to yet another exemplary embodiment, a computer-readable medium contains instructions which, when executed by a processor, perform the steps of: displaying a user interface for creating a playlist using images which correspond to stored media items, receiving input to said user interface associated with pointing at the images displayed on the user interface using a 3D pointing device, and selectively adding media items to the playlist based on the input.
BRIEF DESCRIPTION OF THE DRAWINGS The accompanying drawings illustrate exemplary embodiments of the present invention, wherein:
FIG. 1 depicts a conventional remote control unit for an entertainment system;
FIG. 2 depicts an exemplary media system in which exemplary embodiments of the present invention can be implemented;
FIG. 3(a) shows a 3D pointing device according to an exemplary embodiment of the present invention;
FIG. 3(b) illustrates a user employing a 3D pointing device to provide input to a user interface on a television according to an exemplary embodiment of the present invention;
FIG. 3(c) illustrates a home UI view of a user interface according to an exemplary embodiment of the present invention; and
FIGS. 4(a)-4(g) illustrate user interface screens associated with playlist functionality according to an exemplary embodiment of the present invention.
DETAILED DESCRIPTION The following detailed description of the invention refers to the accompanying drawings. The same reference numbers in different drawings identify the same or similar elements. Also, the following detailed description does not limit the invention. Instead, the scope of the invention is defined by the appended claims.
In order to provide some context for this discussion, an exemplary aggregated media system 200 in which the present invention can be implemented will first be described with respect to FIG. 2. Those skilled in the art will appreciate, however, that the present invention is not restricted to implementation in this type of media system and that more or fewer components can be included therein. Therein, an input/output (I/O) bus 210 connects the system components in the media system 200 together. The I/O bus 210 represents any of a number of different mechanisms and techniques for routing signals between the media system components. For example, the I/O bus 210 may include an appropriate number of independent audio “patch” cables that route audio signals, coaxial cables that route video signals, two-wire serial lines or infrared or radio frequency transceivers that route control signals, optical fiber or any other routing mechanisms that route other types of signals.
In this exemplary embodiment, the media system 200 includes a television/monitor 212, a video cassette recorder (VCR) 214, digital video disk (DVD) recorder/playback device 216, audio/video tuner 218 and compact disk player 220 coupled to the I/O bus 210. The VCR 214, DVD 216 and compact disk player 220 may be single disk or single cassette devices, or alternatively may be multiple disk or multiple cassette devices. They may be independent units or integrated together. In addition, the media system 200 includes a microphone/speaker system 222, video camera 224 and a wireless I/O control device 226. According to exemplary embodiments of the present invention, the wireless I/O control device 226 is a 3D pointing device. The wireless I/O control device 226 can communicate with the entertainment system 200 using, e.g., an IR or RF transmitter or transceiver. Alternatively, the I/O control device can be connected to the entertainment system 200 via a wire. One or more hard drives (or disks) 280 can be provided for storage of recorded video, music or other media.
The entertainment system 200 also includes a system controller 228. According to one exemplary embodiment of the present invention, the system controller 228 operates to store and display entertainment system data available from a plurality of entertainment system data sources and to control a wide variety of features associated with each of the system components. As shown in FIG. 2, system controller 228 is coupled, either directly or indirectly, to each of the system components, as necessary, through I/O bus 210. In one exemplary embodiment, in addition to or in place of I/O bus 210, system controller 228 is configured with a wireless communication transmitter (or transceiver), which is capable of communicating with the system components via IR signals or RF signals. Regardless of the control medium, the system controller 228 is configured to control the media components of the media system 200 via a graphical user interface described below.
As further illustrated in FIG. 2, media system 200 may be configured to receive media items from various media sources and service providers. In this exemplary embodiment, media system 200 receives media input from and, optionally, sends information to, any or all of the following sources: cable broadcast 230, satellite broadcast 232 (e.g., via a satellite dish), very high frequency (VHF) or ultra high frequency (UHF) radio frequency communication of the broadcast television networks 234 (e.g., via an aerial antenna), telephone network 236 and cable modem 238 (or another source of Internet content). Those skilled in the art will appreciate that the media components and media sources illustrated and described with respect to FIG. 2 are purely exemplary and that media system 200 may include more or fewer of both. For example, other types of inputs to the system include AM/FM radio and satellite radio.
More details regarding this exemplary entertainment system and frameworks associated therewith can be found in the above-incorporated by reference U.S. Patent Application entitled “A Control Framework with a Zoomable Graphical User Interface for Organizing, Selecting and Launching Media Items”. Additionally, the interested reader is also referred to U.S. patent application Ser. No. 11/437,215, entitled “Global Navigation Objects in User Interfaces”, filed on May 19, 2006, the disclosure of which is incorporated here by reference. Alternatively, remote devices and interaction techniques between remote devices and user interfaces in accordance with the present invention can be used in conjunction with other types of systems, for example computer systems including, e.g., a display, a processor and a memory system or with various other systems and applications.
As mentioned in the Background section, remote devices which operate as 3D pointers are of particular interest for the present specification, although the present invention is not limited to systems including 3D pointers. Such devices enable the translation of movement of the device, e.g., linear movement, rotational movement, acceleration or any combination thereof, into commands to a user interface. An exemplary loop-shaped 3D pointing device 300 is depicted in FIG. 3(a); however, the present invention is not limited to loop-shaped devices. In this exemplary embodiment, the 3D pointing device 300 includes two buttons 302 and 304 as well as a scroll wheel 306 (scroll wheel 306 can also act as a button by depressing the scroll wheel 306), although other exemplary embodiments will include other physical configurations. User movement of the 3D pointing device 300 can be defined, for example, in terms of rotation about one or more of an x-axis attitude (roll), a y-axis elevation (pitch) or a z-axis heading (yaw). In addition, some exemplary embodiments of the present invention can additionally (or alternatively) measure linear movement of the 3D pointing device 300 along the x, y, and/or z axes to generate cursor movement or other user interface commands. An example is provided below. A number of permutations and variations relating to 3D pointing devices can be implemented in systems according to exemplary embodiments of the present invention. The interested reader is referred to U.S. patent application Ser. No. 11/119,663, entitled (as amended) “3D Pointing Devices and Methods”, filed on May 2, 2005, U.S. patent application Ser. No. 11/119,719, entitled (as amended) “3D Pointing Devices with Tilt Compensation and Improved Usability”, also filed on May 2, 2005, U.S. patent application Ser. No. 11/119,987, entitled (as amended) “Methods and Devices for Removing Unintentional Movement in 3D Pointing Devices”, also filed on May 2, 2005, U.S. patent application Ser. No.
11/119,688, entitled “Methods and Devices for Identifying Users Based on Tremor”, also filed on May 2, 2005, and U.S. patent application Ser. No. 11/480,662, entitled “3D Pointing Devices”, filed on Jul. 3, 2006, the disclosures of which are incorporated here by reference, for more details regarding exemplary 3D pointing devices which can be used in conjunction with exemplary embodiments of the present invention.
According to exemplary embodiments of the present invention, it is anticipated that 3D pointing devices 300 will be held by a user in front of a display 308 and that motion of the 3D pointing device 300 will be translated by the 3D pointing device into output which is usable to interact with the information displayed on display 308, e.g., to move the cursor 310 on the display 308. For example, such 3D pointing devices and their associated user interfaces can be used to make media selections on a television as shown in FIG. 3(b), which will be described in more detail below. Aspects of exemplary embodiments of the present invention can be optimized to enhance the user's experience of the so-called “10-foot” interface, i.e., a typical distance between a user and his or her television in a living room. For example, interactions between pointing, scrolling, zooming and panning, e.g., using a 3D pointing device and associated user interface, can be optimized for this environment as will be described below, although the present invention is not limited thereto.
Referring again to FIG. 3(a), an exemplary relationship between movement of the 3D pointing device 300 and corresponding cursor movement on a user interface will now be described. Rotation of the 3D pointing device 300 about the y-axis can be sensed by the 3D pointing device 300 and translated into an output usable by the system to move cursor 310 along the y2 axis of the display 308. Likewise, rotation of the 3D pointing device 300 about the z-axis can be sensed by the 3D pointing device 300 and translated into an output usable by the system to move cursor 310 along the x2 axis of the display 308. It will be appreciated that the output of 3D pointing device 300 can be used to interact with the display 308 in a number of ways other than (or in addition to) cursor movement, for example it can control cursor fading, volume or media transport (play, pause, fast-forward and rewind). Additionally, the system can be programmed to recognize gestures, e.g., predetermined movement patterns, to convey commands in addition to cursor movement. Moreover, other input commands, e.g., a zoom-in or zoom-out on a particular region of a display (e.g., actuated by pressing button 302 to zoom in or button 304 to zoom out), may also be available to the user.
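The rotation-to-cursor mapping described above can be sketched as follows. This is purely an illustrative example, not the specification's implementation: the function name, the gain value, the time step and the clamping to screen bounds are all assumptions introduced for the sketch.

```python
# Illustrative sketch (hypothetical, not from the specification): translating
# sensed angular rates of a 3D pointing device into 2D cursor motion.
# Yaw (rotation about the z-axis) drives horizontal (x2-axis) movement;
# pitch (rotation about the y-axis) drives vertical (y2-axis) movement.

def rotation_to_cursor(yaw_rate, pitch_rate, dt, gain=400.0,
                       cursor=(0, 0), screen=(1920, 1080)):
    """Map angular rates (rad/s) over a time step dt (s) to a new
    cursor position, clamped to the display bounds."""
    x, y = cursor
    x += gain * yaw_rate * dt      # heading (yaw) -> x2-axis motion
    y += gain * pitch_rate * dt    # elevation (pitch) -> y2-axis motion
    # Keep the cursor on-screen.
    x = max(0, min(screen[0] - 1, x))
    y = max(0, min(screen[1] - 1, y))
    return (x, y)
```

In a real device the angular rates would come from inertial sensors and would typically pass through tilt compensation and tremor filtering (as described in the incorporated applications) before reaching a mapping of this kind.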
According to exemplary embodiments of the present invention, user interfaces may use, at least in part, zooming techniques for moving between user interface views. The zooming transition effect can be performed by progressive scaling and displaying of at least some of the UI objects displayed on the current UI view to provide a visual impression of movement of those UI objects away from an observer. In another functional aspect of the present invention, user interfaces may zoom-in in response to user interaction with the user interface which will, likewise, result in the progressive scaling and display of UI objects that provide the visual impression of movement toward an observer. More information relating to zoomable user interfaces can be found in U.S. patent application Ser. No. 10/768,432, filed on Jan. 30, 2004, entitled “A Control Framework with a Zoomable Graphical User Interface for Organizing, Selecting and Launching Media Items”, and U.S. patent application Ser. No. 09/829,263, filed on Apr. 9, 2001, entitled “Interactive Content Guide for Television Programming”, the disclosures of which are incorporated here by reference.
Movement within the user interface between different user interface views is not limited to zooming. Other non-zooming techniques can be used, in addition to zooming or as an alternative thereto, to transition between user interface views. For example, panning can be performed by progressive translation and display of at least some of the user interface objects which are currently displayed in a user interface view. This provides the visual impression of lateral movement of those user interface objects to an observer.
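The progressive scaling and translation described above can be sketched as a simple per-frame interpolation. This is a hypothetical illustration only: the function name, the linear interpolation and the fixed frame count are assumptions, not details taken from the specification.

```python
# Hypothetical sketch of a zoom/pan transition effect: each UI object's
# position and scale are progressively interpolated from the current view
# toward a target view. Changing scale gives the visual impression of
# movement toward or away from the observer (zoom); changing position
# gives the impression of lateral movement (pan).

def transition_frames(obj, target, steps=10):
    """Yield intermediate (x, y, scale) tuples moving obj toward target."""
    x0, y0, s0 = obj
    x1, y1, s1 = target
    for i in range(1, steps + 1):
        t = i / steps  # linear interpolation parameter, 0 < t <= 1
        yield (x0 + t * (x1 - x0),
               y0 + t * (y1 - y0),
               s0 + t * (s1 - s0))
```

A renderer would draw each yielded frame in turn; an eased (non-linear) interpolation curve could be substituted for a smoother visual effect.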
Returning now to the application illustrated in FIG. 3(b), the GUI screen (also referred to herein as a “UI view”, which terms refer to a currently displayed set of UI objects) seen on television 320 is a home view. In this particular exemplary embodiment, the home view displays a plurality of applications 322, e.g., “Photos”, “Music”, “Recorded”, “Guide”, “Live TV”, “On Demand”, and “Settings”, which are selectable by the user by way of interaction with the user interface via the 3D pointing device 300. Such user interactions can include, for example, pointing, scrolling, clicking or various combinations thereof. For more details regarding exemplary pointing, scrolling and clicking interactions which can be used in conjunction with exemplary embodiments of the present invention, the interested reader is directed to U.S. patent application Ser. No. 11/417,764, entitled “METHODS AND SYSTEMS FOR SCROLLING AND POINTING IN USER INTERFACE”, to Frank J. Wroblewski, filed on May 4, 2006, the disclosure of which is incorporated here by reference.
Playlist Construction
As mentioned above, the provision of playlist creation and management tools and techniques which interact with the afore-described systems and user interfaces is a desirable feature. These exemplary embodiments harness the power of pointing, e.g., on a user interface displayed on a television using a 3D pointer, in conjunction with other user interface visualizations to provide a powerful playlist creation and management tool for users, as will now be described with respect to FIGS. 3(c)-4(g). Starting with FIG. 3(c), another version of the “home” UI view of FIG. 3(b) is shown. Therein, the circle of initial selection elements in FIG. 3(c) is expanded to include a selection element for “Internet” and “Games” as compared to the version shown in FIG. 3(b), any of which can be selected by, for example, pointing to the appropriate selection element and pressing a button on the 3D pointing device 300. Of particular interest for this discussion is the “Music” selection element 390 illustrated in FIG. 3(c). Actuation of this element provides one navigation path in these exemplary embodiments to the various media selection screens which include the playlist creation and management capabilities described below, although it will be appreciated that other paths may be followed through the user interface to reach these UI views. Additionally, playlists are not limited to lists of albums, songs or music videos, which are used to illustrate these exemplary embodiments below, but are, instead, capable of being used with any stored media items.
For example, actuating selection element 390 via 3D pointer 300 could result in the display of the UI view shown in FIG. 4(a), e.g., on a television. Therein, a number of user selection items, e.g., CD album cover art images associated with various musical recordings, are accessible by pointing and clicking. Global navigation icons 324, described in the above-incorporated by reference patent application, are also available for quick and easy navigation away from the music selection functionality. In this example, six “bookshelves” of music albums are presented, each of which contains twenty images, each image associated with an album of musical recordings. It will be appreciated, however, that these exemplary embodiments are not limited to bookshelves containing twenty image items and may contain more or fewer images. However, the provision of images as selectable media items provides for visual browsing which enhances a user's experience and makes it easier to quickly navigate a large number of items. Also shown in FIG. 4(a) is a playlist icon 400. By positioning the cursor over the playlist icon and providing an input command, e.g., by moving the 3D pointer 300 such that a cursor is displayed over top of the icon 400 and pressing one of the buttons associated therewith, a user can toggle between a mode where the playlist is hidden and a mode, as shown in FIG. 4(b), wherein the playlist 402 is displayed as an overlay on top of the selectable image items.
The playlist creation tool 402 in this exemplary embodiment includes a region 404 in which items which are selected to be in the playlist are represented. In the example of FIG. 4(b), region 404 is currently unpopulated as no items have yet been selected. In this exemplary embodiment, the playlist tool 402 also includes four user-selectable buttons 406, 408, 410 and 412. By pointing to any one of these buttons and actuating a button on a 3D pointing device 300, the user interface displayable buttons can be actuated. For example, if the user positions a cursor (not shown) over the “play” button 406 and actuates a button on the 3D pointing device 300, then the user interface will begin to play the items in the playlist. For example, in the context of music, the selected music in the playlist would be played in order, e.g., via output through the speakers 222. Button 408, when actuated, commands the user interface to shuffle the listed items in the playlist so as to provide, for example, presentation of the media items in the playlist in a random order. Button 410 clears the region 404 of selected items. Button 412, when actuated, closes the playlist creation tool 402 and returns to the view of FIG. 4(a), where the playlist option is represented by icon 400.
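The behavior of the four playlist controls described above can be sketched as a small data structure. This is an illustrative assumption-laden sketch: the class and method names are hypothetical, introduced only to show how the play, shuffle, clear and add operations relate to the ordered list of selected items.

```python
import random

# Hypothetical sketch of the playlist tool's operations: the selection
# region holds an ordered list of media items; "play" presents them in
# order, "shuffle" randomizes the presentation order, and "clear"
# empties the region.

class Playlist:
    def __init__(self):
        self.items = []            # ordered media items in the region

    def add(self, item):
        """Add a media item (e.g., an album or track) to the playlist."""
        self.items.append(item)

    def play(self):
        """Return the items in their current listed order for playback."""
        return list(self.items)

    def shuffle(self):
        """Randomize the presentation order of the listed items."""
        random.shuffle(self.items)

    def clear(self):
        """Remove all selected items from the region."""
        self.items.clear()
```

In the interface described above, each entry would carry the image corresponding to the media item, so the region displays cover art rather than text.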
As discussed in the above-identified '263 patent application, playlist creation tools in accordance with these exemplary embodiments can be provided in conjunction with a zoomable user interface as described therein. For example, a user can move from the view illustrated in FIG. 4(a) to the view illustrated in FIG. 4(c) by positioning a cursor over any of the images in the “jazz vocal” bookshelf and pressing a zoom-in button. This will result in, for example, a transition effect as described above wherein the user zooms into the interface and is provided with an enlarged view of the images on this bookshelf as shown in FIG. 4(c). As shown in FIG. 4(d), the playlist creation tool can also be superimposed over the UI view of the selectable media items at this second, zoomed-in level. FIG. 4(d) also illustrates a partially populated selection region 404 which, according to this exemplary embodiment, includes the images of the selected music items. This again provides a very visual experience for the user, who is manipulating images rather than text in this exemplary embodiment.
If the user, for example, selects the Louis Armstrong album “C'est si Bon” within the “Jazz Vocal” bookshelf, and actuates a further zoom-in command to the user interface, an exemplary UI view such as that illustrated in FIG. 4(e) may be displayed. Again, a transition effect may accompany the shift from the UI view of FIG. 4(d) to the UI view of FIG. 4(e). For example, the image associated with the cover art of the Louis Armstrong CD can be zoomed into, magnified and translated to a new location on the screen, thereby providing the user with an anchor element and a sense of position within the user interface. Within the context of playlist creation tools according to these exemplary embodiments, the detailed view of FIG. 4(e) may also include additional user interface elements for populating the playlist. For example, according to this exemplary embodiment, a user may add all of the songs from the “C'est si Bon” album to the playlist by pointing at the button 420 and providing an input via 3D pointer 300. Alternatively, a user may add songs individually by pressing buttons associated with each individual track on the album, for example, button 422. If the album includes enough tracks that it is not convenient to display them all in the detailed view of FIG. 4(e), then scrolling can be provided by, for example, scroll arrows 428 and 430, whereby pointing to the scroll arrows and actuating a button on the pointing device 300 will scroll the list of music tracks up or down, respectively. The exemplary detailed view of FIG. 4(e) also provides for tabbed presentations. The default tab, which is initially shown upon a transition from the view in FIG. 4(d) to the view in FIG. 4(e), is provided by the “Album” tab 424. However, a user may also view music selections, other media selections, or even product items which are related to Louis Armstrong's “C'est si Bon” album by pointing and clicking on the tab 426 entitled “Related”.
Although the foregoing examples regarding playlist creation and management are provided in the context of audio tracks, these tools are likewise applicable to any type of media which is amenable to presentation via a playlist. For example, as seen in FIGS. 4(f) and 4(g), music videos can be stored on the hard drive 280 and organized in a playlist for presentation in the same manner as described above with respect to albums.
Systems and methods for processing data according to exemplary embodiments of the present invention can be performed by one or more processors executing sequences of instructions contained in a memory device. Such instructions may be read into the memory device from other computer-readable media such as secondary data storage device(s). Execution of the sequences of instructions contained in the memory device causes the processor to operate, for example, as described above. In alternative embodiments, hard-wired circuitry may be used in place of or in combination with software instructions to implement the present invention.
Numerous variations of the afore-described exemplary embodiments are contemplated. The above-described exemplary embodiments are intended to be illustrative in all respects, rather than restrictive, of the present invention. Thus the present invention is capable of many variations in detailed implementation that can be derived from the description contained herein by a person skilled in the art. All such variations and modifications are considered to be within the scope and spirit of the present invention as defined by the following claims. No element, act, or instruction used in the description of the present application should be construed as critical or essential to the invention unless explicitly described as such. Also, as used herein, the article “a” is intended to include one or more items.