BACKGROUND OF THE INVENTION

A user can control the playback of media using a media playback device in different ways. In particular, the manner in which the user controls the media playback can be set by the particular input interfaces available to the device. If several buttons are available (e.g., buttons associated with dome switches), the user can control media playback by selecting buttons associated with different playback operations. Alternatively, if a display having displayed playback options is available, the user can direct an input interface to select the displayed playback options. For example, a user can select a displayed play/pause option, fast-forward option, rewind option, and volume option.
These input approaches, however, can require physical buttons on which to provide the inputs. This in turn can increase the overall size of the device, or require moving components extending from an exterior surface of the device. These input approaches can also require the display of selectable options, which can require additional power or prevent the user from controlling the media playback without looking at the display (e.g., when the device faces away from the user, for example during a workout).
SUMMARY OF THE INVENTION

This is directed to systems, methods, and computer-readable media for controlling media playback based on specific combinations of touch inputs. In particular, this is directed to detecting different combinations of taps and performing playback operations associated with each of the tap combinations.
In some embodiments, a user can control the playback of media on an electronic device by providing inputs to an input interface, such as a physical button. For example, the electronic device can include several buttons, each of which is associated with a different playback operation. The buttons can be incorporated in the electronic device or remotely coupled, for example wirelessly or using a wire (e.g., buttons in a headphone cable). Alternatively, the electronic device can instead or in addition display selectable playback options, which the user can select using an input interface. The selectable playback options can include, for example, play/pause options, fast forward and rewind options, next and last options, and volume control options. Upon receiving a selection of an option, the electronic device can perform the corresponding media playback operation.
An electronic device, however, may not have dedicated playback control buttons or interfaces. In addition, a user may wish to control media playback operations without needing to first look at a display to select a specific displayed option. To allow a user to control media playback using a touch sensing device without requiring the selection of displayed options, the electronic device can include a mode or configuration for which the touch sensing device can sense touch events, but not display any content on a display. For example, an electronic device with a touch screen can have a mode in which no content is displayed on the touch screen (e.g., the touch screen remains dark), but the touch screen is operative to detect touch events of the user.
The electronic device can associate any suitable combination of touch events with different media playback inputs. In some embodiments, the electronic device can associate tap events with different media playback operations. In particular, the electronic device can associate tap events that correspond to or mimic button press events from a remote input interface (e.g., an in-line button) with the corresponding media playback operations. In one implementation, the electronic device can associate single button presses and single touch events with the same playback operation, and combinations of button presses (e.g., short and long presses) and corresponding touch events (e.g., short and long taps) with the same playback operations.
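As a purely illustrative sketch (not part of the claimed subject matter), the correspondence between tap patterns and button-press events could be expressed in software as a simple lookup, so that one playback handler serves both the touch-based and button-based input schemes. The Python names below are hypothetical placeholders.

# Illustrative sketch only: normalize tap patterns from a touch-sensing device
# into the same event names produced by an in-line remote button, so a single
# playback handler can serve both input schemes. All names are hypothetical.

TAP_TO_BUTTON_EVENT = {
    ("short",): "single_click",            # single tap ~ single button press
    ("short", "short"): "double_click",    # double tap ~ double press
    ("short", "long"): "click_and_hold",   # short tap then long tap ~ press and hold
}

def normalize_touch_input(tap_pattern):
    """Return the button-style event matching a detected tap pattern, if any."""
    return TAP_TO_BUTTON_EVENT.get(tuple(tap_pattern))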
In some embodiments, the user can control volume operations or other playback operations by providing touch events that do not correspond to button press events provided by a remote input interface. For example, the user can provide a circular touch event to control the volume of played back media. The electronic device can also allow other mechanisms for providing playback controls, such as selectable options selectively displayed on a display (e.g., displayed in response to a specific request from the user).
BRIEF DESCRIPTION OF THE DRAWINGS

The above and other features of the present invention, its nature, and various advantages will be more apparent upon consideration of the following detailed description, taken in conjunction with the accompanying drawings in which:
FIG. 1 is a block diagram of a computer system in accordance with one embodiment of this invention.
FIG. 2 is a block diagram of another computer system in accordance with one embodiment of this invention.
FIG. 3 is a multipoint processing method in accordance with one embodiment of this invention.
FIGS. 4A and 4B are schematic views of a detected touch image in accordance with one embodiment of this invention.
FIG. 5 illustrates a group of features in accordance with one embodiment of this invention.
FIG. 6 is a parameter calculation method in accordance with one embodiment of this invention.
FIG. 7A is a schematic view of a now-playing display in accordance with one embodiment of the invention;
FIG. 7B is a schematic view of a selectable volume overlay on the now-playing display in accordance with one embodiment of the invention;
FIG. 7C is a schematic view of a playback control overlay on the now-playing display in accordance with one embodiment of the invention;
FIG. 7D is a schematic view of a playlist control overlay on the now-playing display in accordance with one embodiment of the invention;
FIG. 8A is a schematic view of an illustrative display having visual feedback for a play touch instruction in accordance with one embodiment of the invention;
FIG. 8B is a schematic view of an illustrative display having visual feedback for a fast-forward touch instruction in accordance with one embodiment of the invention;
FIG. 9A is a schematic view of a first volume control overlay in accordance with one embodiment of the invention;
FIG. 9B is a schematic view of a second volume control overlay in accordance with one embodiment of the invention;
FIG. 10 is an illustrative table of touch input and device operation associations in accordance with one embodiment of the invention; and
FIG. 11 is a flowchart of an illustrative process for controlling media playback based on detected touch events in accordance with one embodiment of the invention.
DETAILED DESCRIPTION

This is directed to providing instructions for media playback operations by detecting tap touches using a touch sensing device.
In the following description of preferred embodiments, reference is made to the accompanying drawings which form a part hereof, and in which it is shown by way of illustration specific embodiments in which the invention can be practiced. It is to be understood that other embodiments can be utilized and structural changes can be made without departing from the scope of the preferred embodiments of the present invention.
FIG. 1 is a block diagram of an exemplary computer system 50 in accordance with one embodiment of the present invention. Computer system 50 can correspond to a personal computer system, such as desktops, laptops, tablets or handheld computers. Computer system 50 can also correspond to other computing devices, such as an iPod® available from Apple Inc. of Cupertino, Calif., a cellular telephone, a personal e-mail or messaging device (e.g., a Blackberry® or a Sidekick®), an iPhone® available from Apple Inc., pocket-sized personal computers, personal digital assistants (PDAs), a music recorder, a video recorder, a gaming device, a camera, radios, or any other suitable consumer electronic device.
The exemplary computer system 50 shown in FIG. 1 can include a processor 56 configured to execute instructions and to carry out operations associated with computer system 50. For example, using instructions retrieved for example from memory, processor 56 can control the reception and manipulation of input and output data between components of computing system 50. Processor 56 can be implemented on a single chip, multiple chips or multiple electrical components. For example, various architectures can be used for processor 56, including dedicated or embedded processor, single purpose processor, controller, ASIC, and so forth.
In most cases, processor 56 together with an operating system can operate to execute computer code and produce and use data. Operating systems are generally well known and will not be described in greater detail. By way of example, the operating system can correspond to OS/2, DOS, Unix, Linux, Palm OS, and the like. The operating system can also be a special purpose operating system, such as can be used for limited purpose appliance-type computing devices. The operating system, other computer code and data can reside within memory block 58 that is operatively coupled to processor 56. Memory block 58 generally provides a place to store computer code and data that are used by computer system 50. By way of example, memory block 58 can include Read-Only Memory (ROM), Random-Access Memory (RAM), a hard disk drive and/or the like. The information could also reside on a removable storage medium and be loaded or installed onto computer system 50 when needed. Removable storage media include, for example, CD-ROM, PC-CARD, memory card, floppy disk, magnetic tape, and a network component.
Computer system 50 can also include display device 68 that is operatively coupled to processor 56. Display device 68 can be a liquid crystal display (LCD) (e.g., active matrix, passive matrix and the like). Alternatively, display device 68 can be a monitor such as a monochrome display, color graphics adapter (CGA) display, enhanced graphics adapter (EGA) display, variable-graphics-array (VGA) display, super VGA display, cathode ray tube (CRT), and the like. Display device 68 can also correspond to a plasma display or a display implemented with electronic inks.
Display device 68 can generally be configured to display graphical user interface (GUI) 69 that can provide an easy-to-use interface between a user of the computer system and the operating system or application running thereon. Generally speaking, GUI 69 can represent programs, files and operational options with graphical images, objects, or vector representations. The graphical images can include windows, fields, dialog boxes, menus, icons, buttons, cursors, scroll bars, etc. Such images can be arranged in predefined layouts, or can be created dynamically to serve the specific actions being taken by a user. During operation, the user can select and/or activate various graphical images in order to initiate functions and tasks associated therewith. By way of example, a user can select a button that opens, closes, minimizes, or maximizes a window, or an icon that launches a particular program. GUI 69 can additionally or alternatively display information, such as non-interactive text and graphics, for the user on display device 68.
Computer system 50 can also include input device 70 that is operatively coupled to processor 56. Input device 70 can be configured to transfer data from the outside world into computer system 50. Input device 70 can, for example, be used to perform tracking and to make selections with respect to GUI 69 on display 68. Input device 70 can also be used to issue commands in computer system 50. Input device 70 can include a touch-sensing device or interface configured to receive input from a user's touch and to send this information to processor 56. By way of example, the touch-sensing device can correspond to a touchpad or a touch screen. In many cases, the touch-sensing device can recognize touches, as well as the position and magnitude of touches on a touch-sensitive surface. The touch-sensing device can detect and report the touches to processor 56, and processor 56 can interpret the touches in accordance with its programming. For example, processor 56 can initiate a task in accordance with a particular touch. A dedicated processor can be used to process touches locally and reduce demand for the main processor of the computer system. In some embodiments, input device 70 can be a touch screen that can be positioned over or in front of display 68, integrated with display device 68, or can be a separate component, such as a touch pad.
The touch-sensing device can be based on sensing technologies including but not limited to capacitive sensing, resistive sensing, surface acoustic wave sensing, pressure sensing, optical sensing, and/or the like. Furthermore, the touch-sensing means can be based on single point sensing or multipoint sensing. Single point sensing is capable of only distinguishing a single touch, while multipoint sensing is capable of distinguishing multiple touches that occur at the same time. The touch sensing can include actual contact of the touch-sensing device, near-touch of the touch-sensing device (e.g., detecting hovering), or remote detection of the user by the touch-sensing device.
Computer system 50 can also include capabilities for coupling to one or more I/O devices 80. By way of example, I/O devices 80 can correspond to keyboards, printers, scanners, cameras, microphones, speakers, and/or the like. I/O devices 80 can be integrated with computer system 50 or they can be separate components (e.g., peripheral devices). In some cases, I/O devices 80 can be connected to computer system 50 through wired connections (e.g., cables/ports). In other cases, I/O devices 80 can be connected to computer system 50 through wireless connections. By way of example, the data link can correspond to PS/2, USB, IR, Firewire, RF, Bluetooth or the like.
In accordance with one embodiment of the present invention, computer system 50 can be designed to recognize gestures 85 applied to input device 70 and to control aspects of computer system 50 based on the gestures 85. In some cases, a gesture can be defined as a stylized interaction with an input device that can be mapped to one or more specific computing operations. Gestures 85 can be made through various hand, and more particularly finger, motions. Alternatively or additionally, the gestures can be made with a stylus. In all of these cases, input device 70 can receive gestures 85 and processor 56 can execute instructions to carry out operations associated with the gestures 85. In addition, memory block 58 can include gesture operational program 88, which can be part of the operating system or a separate application. Gesture operational program 88 can generally include a set of instructions that can recognize the occurrence of gestures 85 and can inform one or more software agents of the gestures 85 and/or what action(s) to take in response to the gestures 85. Additional details regarding the various gestures that can be used as input commands are discussed further below.
In accordance with the preferred embodiment, upon a user performing one or more gestures, input device 70 can relay gesture information to processor 56. Using instructions from memory 58, and more particularly, gesture operational program 88, processor 56 can interpret the gestures 85 and control different components of computer system 50, such as memory 58, display 68 and I/O devices 80, based on the gestures 85. Gestures 85 can be identified as commands for performing actions in applications stored in memory 58, modifying image objects shown on display 68, modifying data stored in memory 58, and/or for performing actions in I/O devices 80.
Again, although FIG. 1 illustrates input device 70 and display 68 as two separate boxes for illustration purposes, the two boxes can be realized on one device.
FIG. 2 illustrates an exemplary computing system 10 that uses multi-touch panel 24 as an input device for gestures, though multi-touch panel 24 can at the same time be a display panel. Computing system 10 can include one or more multi-touch panel processors 12 dedicated to multi-touch subsystem 27. Alternatively, multi-touch panel processor functionality can be implemented by dedicated logic, such as a state machine. Peripherals 11 can include, but are not limited to, random access memory (RAM) or other types of memory or storage, watchdog timers and the like. Multi-touch subsystem 27 can include, but is not limited to, one or more analog channels 17, channel scan logic 18 and driver logic 19. Channel scan logic 18 can access RAM 16, autonomously read data from analog channels 17 and provide control for analog channels 17. This control can include multiplexing columns of multi-touch panel 24 to analog channels 17. In addition, channel scan logic 18 can control driver logic 19 and stimulation signals being selectively applied to rows of multi-touch panel 24. In some embodiments, multi-touch subsystem 27, multi-touch panel processor 12 and peripherals 11 can be integrated into a single application specific integrated circuit (ASIC).
Driver logic 19 can provide multiple multi-touch subsystem outputs 20 and can present a proprietary interface that drives a high voltage driver, which can include a decoder 21 and subsequent level shifter and driver stage 22, although the level-shifting functions could be performed before the decoder functions. Level shifter and driver stage 22 can provide level shifting from a low voltage level (e.g., CMOS levels) to a higher voltage level, providing a better signal-to-noise (S/N) ratio for noise reduction purposes. Decoder 21 can decode the drive interface signals to one out of N outputs, where N is the maximum number of rows in the panel. Decoder 21 can be used to reduce the number of drive lines needed between the high voltage driver and multi-touch panel 24. Each multi-touch panel row input 23 can drive one or more rows in multi-touch panel 24. It should be noted that driver 22 and decoder 21 can also be integrated into a single ASIC, be integrated into driver logic 19, or in some instances be unnecessary.
Multi-touch panel 24 can include a capacitive sensing medium having a plurality of row traces or driving lines and a plurality of column traces or sensing lines, although other sensing media can also be used. The row and column traces can be formed from a transparent conductive medium, such as Indium Tin Oxide (ITO) or Antimony Tin Oxide (ATO), although other transparent and non-transparent materials, such as copper, can also be used. In some embodiments, the row and column traces can be formed on opposite sides of a dielectric material, and can be perpendicular to each other, although in other embodiments other non-orthogonal orientations are possible. In a polar coordinate system, for example, the sensing lines can be concentric circles and the driving lines can be radially extending lines (or vice versa). It should be understood, therefore, that the terms “row” and “column,” “first dimension” and “second dimension,” or “first axis” and “second axis” as used herein are intended to encompass not only orthogonal grids, but the intersecting traces of other geometric configurations having first and second dimensions (e.g., the concentric and radial lines of a polar-coordinate arrangement). The rows and columns can be formed on a single side of a substrate, or can be formed on two separate substrates separated by a dielectric material. In some instances, an additional dielectric cover layer can be placed over the row or column traces to strengthen the structure and protect the entire assembly from damage.
At the “intersections” of the traces of multi-touch panel 24, where the traces pass above and below (cross) each other (but do not make direct electrical contact with each other), the traces can essentially form two electrodes (although more than two traces could intersect as well). Each intersection of row and column traces can represent a capacitive sensing node and can be viewed as picture element (pixel) 26, which can be particularly useful when multi-touch panel 24 is viewed as capturing an “image” of touch. In other words, after multi-touch subsystem 27 has determined whether a touch event has been detected at each touch sensor in the multi-touch panel, the pattern of touch sensors in the multi-touch panel at which a touch event occurred can be viewed as an “image” of touch (e.g., a pattern of fingers touching the panel). The capacitance between row and column electrodes can appear as a stray capacitance on all columns when the given row is held at DC and as a mutual capacitance Csig when the given row is stimulated with an AC signal. The presence of a finger or other object near or on the multi-touch panel can be detected by measuring changes to Csig. The columns of multi-touch panel 24 can drive one or more analog channels 17 (also referred to herein as event detection and demodulation circuits) in multi-touch subsystem 27. In some implementations, each column can be coupled to one dedicated analog channel 17. However, in other implementations, the columns can be couplable via an analog switch to a fewer number of analog channels 17.
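A rough, software-level sketch of such a scan might stimulate each row in turn and compare the capacitance measured on every column against a no-touch baseline; the driver functions passed in below are hypothetical, and real panels perform this work in analog circuitry and dedicated logic.

# Illustrative sketch only: build a touch "image" by stimulating one row at a
# time and measuring the drop in signal capacitance (Csig) on each column
# relative to a no-touch baseline. stimulate_row and read_column_csig are
# hypothetical callables supplied by the (assumed) panel driver.

def scan_touch_image(num_rows, num_cols, baseline, stimulate_row, read_column_csig,
                     threshold=1.0):
    image = [[0.0] * num_cols for _ in range(num_rows)]
    for row in range(num_rows):
        stimulate_row(row)                       # apply the AC stimulus to one row
        for col in range(num_cols):
            drop = baseline[row][col] - read_column_csig(col)   # a finger reduces Csig
            image[row][col] = drop if drop > threshold else 0.0
    return image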
Computing system 10 can also include host processor 14 for receiving outputs from multi-touch panel processor 12 and performing actions based on the outputs that can include, but are not limited to, moving an object such as a cursor or pointer, scrolling or panning, adjusting control settings, opening a file or document, viewing a menu, making a selection, executing instructions, operating a peripheral device connected to the host device, etc. Host processor 14, which can be a personal computer CPU, can also perform additional functions that may not be related to multi-touch panel processing, and can be coupled to program storage 15 and display device 13, such as an LCD display, for providing a user interface (UI) to a user of the device.
It should be noted that, while FIG. 2 illustrates a dedicated multi-touch panel processor 12, the multi-touch subsystem can be controlled directly by the host processor 14. Additionally, it should also be noted that multi-touch panel 24 and display device 13 can be integrated into one single touch-screen display device. Further details of multi-touch sensor detection, including proximity detection by a touch panel, are described in commonly assigned co-pending applications, including application Ser. No. 10/840,862 titled “Multipoint Touchscreen,” which was published on May 11, 2006 as U.S. Publication No. US2006/0097991; application Ser. No. 11/428,522 titled “Identifying Contacts On A Touch Surface,” which was published on Oct. 26, 2006 as U.S. Publication No. 2006/0238522; and U.S. application Ser. No. 11/649,998 entitled “Proximity and Multi-Touch Sensor Detection and Demodulation,” filed on Jan. 3, 2007, the entirety of each of which is hereby incorporated herein by reference.
FIG. 3 illustrates a multipoint processing method 300 in accordance with one embodiment of the present invention. Multipoint processing method 300 can, for example, be performed with the system shown in FIG. 1 or FIG. 2. Multipoint processing method 300 generally begins at step 302 where images are read from a multipoint input device, and more particularly a multipoint touch screen. Although the term “image” is used, it should be noted that the data can come in other forms. In most cases, the image read from the touch screen provides magnitude (Z) as a function of position (x and y) for each sensing point or pixel of the touch screen. The magnitude can, for example, reflect the capacitance measured at each point.
Following step 302, multipoint processing method 300 proceeds to step 304 where the image can be converted into a collection or list of features. Each feature can represent a distinct input such as a touch. In most cases, each feature can include its own unique identifier (ID), x coordinate, y coordinate, Z magnitude, angle θ, area A, and the like. By way of example, FIGS. 4A and 4B illustrate a particular image 420 in time. In image 420, there are two features 422 based on two distinct touches. The touches can, for example, be formed from a pair of fingers touching the touch screen. As shown, each feature 422 can include a unique identifier (ID), x coordinate, y coordinate, Z magnitude, angle θ, and area A. More particularly, the first feature 422A is represented by ID1, X1, Y1, Z1, θ1, A1, and the second feature 422B is represented by ID2, X2, Y2, Z2, θ2, A2. This data can be outputted, for example, using a multi-touch protocol.
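Purely as a hypothetical illustration, each such feature could be carried as a small record holding the attributes listed above:

# Illustrative sketch: one record per distinct touch, carrying the attributes
# described above (unique ID, position, magnitude, angle, and area).
from dataclasses import dataclass

@dataclass
class TouchFeature:
    feature_id: int   # unique identifier (ID)
    x: float          # x coordinate
    y: float          # y coordinate
    z: float          # Z magnitude (e.g., measured capacitance)
    theta: float      # angle θ
    area: float       # area A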
The conversion from data or images to features can be accomplished using methods described in co-pending U.S. application Ser. No. 10/840,862 titled “Multipoint Touchscreen,” which is hereby again incorporated herein by reference. As disclosed therein, the raw data can be received in a digitized form, and can include values for each node of the touch screen. The values can be between 0 and 256 where 0 equates to no touch pressure and 256 equates to full touch pressure. Thereafter, the raw data can be filtered to reduce noise. Once filtered, gradient data, which indicates the topology of each group of connected points, can be generated. Thereafter, the boundaries for touch regions can be calculated based on the gradient data (i.e., a determination can be made as to which points are grouped together to form each touch region). By way of example, a watershed algorithm can be used. Once the boundaries are determined, the data for each of the touch regions can be calculated (e.g., X, Y, Z, θ, A).
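The following is a loose sketch of such a pipeline, and is not the implementation of the incorporated application: it uses simple thresholding and connected-component labeling in place of the filtering, gradient, and watershed steps described above, and omits the angle calculation for brevity.

# Illustrative sketch only: convert a raw touch image (values 0-256) into a
# list of features using a threshold plus connected-component labeling.
import numpy as np
from scipy import ndimage

def extract_features(raw_image, noise_threshold=8):
    img = np.asarray(raw_image, dtype=float)
    img[img < noise_threshold] = 0.0              # crude noise filtering
    labels, count = ndimage.label(img > 0)        # group connected touched points
    features = []
    for region in range(1, count + 1):
        mask = labels == region
        z = img[mask]
        ys, xs = np.nonzero(mask)
        total = z.sum()
        features.append({
            "id": region,
            "x": float((xs * z).sum() / total),   # magnitude-weighted centroid
            "y": float((ys * z).sum() / total),
            "z": float(z.max()),                  # peak magnitude for the region
            "area": int(mask.sum()),              # number of touched pixels
        })
    return features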
Returning to FIG. 3, following step 304, multipoint processing method 300 proceeds to step 306 where feature classification and groupings can be performed. During classification, the identity of each of the features can be determined. For example, the features can be classified as a particular finger, thumb, palm or other object. Once classified, the features can be grouped. The manner in which the groups are formed can widely vary. In most cases, the features can be grouped based on some criteria (e.g., they carry a similar attribute). For example, the two features shown in FIG. 4A and FIG. 4B can be grouped together because each of these features is located in proximity to the other or because they are from the same hand. The grouping can include some level of filtering to filter out features that are not part of the touch event. In filtering, one or more features can be rejected because they either meet some predefined criteria or because they do not meet some predefined criteria. By way of example, one of the features can be classified as a thumb located at the edge of a tablet PC. Because the thumb is being used to hold the device rather than being used to perform a task, the feature generated therefrom can be rejected (i.e., is not considered part of the touch event being processed).
Following step 306, multipoint processing method 300 proceeds to step 308 where key parameters for the feature groups can be calculated. The key parameters can include distance between features, x/y centroid of all features, feature rotation, total pressure of the group (e.g., pressure at centroid), and the like. As shown in FIG. 5, the calculation can include finding the centroid C, drawing a virtual line 530 to each feature from the centroid C, defining the distance D for each virtual line (D1 and D2), and then averaging the distances D1 and D2. Once the parameters are calculated, the parameter values can be reported. The parameter values can typically be reported with a group identifier (GID) and the number of features within each group (in this case three). In most cases, both initial and current parameter values can be reported. The initial parameter values can be based on set down, for example when the user sets their fingers on the touch screen, and the current values can be based on any point within a stroke occurring after set down.
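A minimal sketch of the centroid-and-distance calculation described above might read as follows; the function and field names are illustrative only.

# Illustrative sketch: compute the x/y centroid of a feature group and the
# average distance from that centroid to each feature.
import math

def group_parameters(features):
    """features: list of (x, y) positions belonging to one group."""
    n = len(features)
    cx = sum(x for x, _ in features) / n
    cy = sum(y for _, y in features) / n
    distances = [math.hypot(x - cx, y - cy) for x, y in features]
    return {
        "centroid": (cx, cy),
        "avg_distance": sum(distances) / n,
        "num_features": n,
    }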
As should be appreciated, steps 302-308 of process 300 can be repetitively performed during a user stroke, thereby generating a plurality of sequentially configured signals. The initial and current parameters can be compared in later steps to perform actions in the system.
Following step 308, the process flow moves to step 310 where the group can be associated with a user interface (UI) element. UI elements can be buttons, boxes, lists, sliders, wheels, knobs, etc. Each UI element can represent a component or control of the user interface. The application behind the UI element(s) can have access to the parameter data calculated in step 308. In one implementation, the application can rank the relevance of the touch data to the UI element corresponding thereto. The ranking can be based on some predetermined criteria. The ranking can include producing a figure of merit and, whichever UI element has the highest figure of merit, giving it sole access to the group. There can even be some degree of hysteresis as well (e.g., once one of the UI elements claims control of that group, the group sticks with the UI element until another UI element has a much higher ranking). By way of example, the ranking can include determining proximity of the centroid (or features) to the image object associated with the UI element.
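One hedged way to sketch the ranking-with-hysteresis idea is shown below; the figure of merit here is simply the inverse of the distance from the group centroid to each element's center, and the hysteresis factor is an arbitrary assumption.

# Illustrative sketch: assign the touch group to the UI element with the highest
# figure of merit, and keep it with its current element unless another element
# scores substantially higher (hysteresis).
import math

def figure_of_merit(element_center, group_centroid):
    return 1.0 / (1.0 + math.dist(element_center, group_centroid))

def claim_group(elements, group_centroid, current_owner=None, hysteresis=1.5):
    """elements: dict mapping element name -> (x, y) center. Returns the owner."""
    scores = {name: figure_of_merit(center, group_centroid)
              for name, center in elements.items()}
    best = max(scores, key=scores.get)
    if current_owner in scores and scores[best] < hysteresis * scores[current_owner]:
        return current_owner        # the group sticks with its current element
    return best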
Following step 310, multipoint process 300 proceeds to steps 312 and 314. Steps 312 and 314 can be performed approximately at the same time. From the user perspective, in one embodiment, steps 312 and 314 appear to be performed concurrently. In step 312, one or more actions can be performed based on differences between initial and current parameter values, and can also be based on a UI element to which they are associated, if any. In step 314, user feedback pertaining to the one or more actions being performed can be provided. By way of example, user feedback can include display, audio, tactile feedback and/or the like.
FIG. 6 illustrates a parameter calculation method 600 in accordance with one embodiment of the present invention. Parameter calculation method 600 can, for example, correspond to block 308 shown in FIG. 3. The parameter calculation method 600 generally begins at step 601. At step 602, a group of features can be received. Following step 602, the parameter calculation method 600 moves to step 604 where a determination can be made as to whether or not the number of features in the group of features has changed. For example, the number of features can have changed due to the user picking up or placing an additional finger. Different fingers can be needed to perform different controls (e.g., tracking, gesturing). If the number of features has changed, the parameter calculation method 600 proceeds to step 606 where the initial parameter values can be calculated. If the number stays the same, the parameter calculation method 600 proceeds to step 608 where the current parameter values can be calculated. Thereafter, the parameter calculation method 600 proceeds to step 610 where the initial and current parameter values can be reported. By way of example, the initial parameter values can contain the average initial distance between points (or Distance (AVG) initial) and the current parameter values can contain the average current distance between points (or Distance (AVG) current). These can be compared in subsequent steps in order to control various aspects of a computer system.
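A hedged sketch of that bookkeeping, resetting the initial value whenever a finger is added or lifted and otherwise updating the current value, could look like the following; the average-distance parameter mirrors the Distance (AVG) values mentioned above.

# Illustrative sketch: track initial and current average distance between points,
# recomputing the initial value whenever the number of features changes.
import math

def average_distance(points):
    pairs = [(a, b) for i, a in enumerate(points) for b in points[i + 1:]]
    if not pairs:
        return 0.0
    return sum(math.dist(a, b) for a, b in pairs) / len(pairs)

class ParameterTracker:
    def __init__(self):
        self.initial = None
        self.num_features = 0

    def update(self, points):
        """points: list of (x, y) feature positions. Returns (initial, current)."""
        current = average_distance(points)
        if len(points) != self.num_features:    # a finger was picked up or placed
            self.num_features = len(points)
            self.initial = current               # recalculate the initial values
        return self.initial, current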
The above methods and techniques can be used to implement any number of GUI interface objects and actions. For example, gestures can be created to detect and effect a user command to resize a window, scroll a display, rotate an object, zoom in or out of a displayed view, delete or insert text or other objects, etc.
In some embodiments, the electronic device can include several power modes. For example, in a first power mode, both a touch-sensing device and a display can be powered, such that the user can provide inputs and see content displayed by the device. In a second power mode, the touch-sensing device can be powered, but the display may not be powered, such that the electronic device can detect touch events provided by the user without displaying content. For example, in a touch screen embodiment, the electronic device can enable the touch screen to detect touch events without displaying content or selectable options.
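A minimal sketch of such a two-mode arrangement is given below, assuming hypothetical display and touch-sensor driver objects that expose a set_powered() call.

# Illustrative sketch: in the first power mode both the display and the touch
# sensor are powered; in the second mode only the touch sensor remains powered,
# so touch events can still be detected while the screen stays dark.
from enum import Enum

class PowerMode(Enum):
    FULL = "display and touch sensing powered"
    TOUCH_ONLY = "touch sensing powered, display off"

def apply_power_mode(mode, display, touch_sensor):
    touch_sensor.set_powered(True)                  # touch sensing active in both modes
    display.set_powered(mode is PowerMode.FULL)     # display powered only in FULL mode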
The user can control playback operations using any suitable approach. In some embodiments, the electronic device can display selectable options on a display interface. FIG. 7A is a schematic view of a now-playing display in accordance with one embodiment of the invention. FIG. 7B is a schematic view of a selectable volume overlay on the now-playing display in accordance with one embodiment of the invention. FIG. 7C is a schematic view of a playback control overlay on the now-playing display in accordance with one embodiment of the invention. FIG. 7D is a schematic view of a playlist control overlay on the now-playing display in accordance with one embodiment of the invention. Now-playing display 700 can be provided at any suitable time. For example, display 700 can be provided in response to a user request to view information regarding the media item being played back. Display 700 can include art 702 describing the played back media item. For example, art 702 can include album cover art associated with music items.
To control playback operations, the user can direct the electronic device to overlay selectable options. The electronic device can overlay any suitable number or type of options in response to a user instruction. For example, the electronic device can display, in sequence, volume control, playback control, and playlist control options. Alternatively, the options can be combined or split up in one or more overlays, options for other controls can be displayed, or the order of displayed options can change. Display 720 can include volume overlay 730 displayed over art 722, which can be the same as art 702. The electronic device can also display media identifying information 724 (e.g., song title, artist and album name) and playback indicator 726 (e.g., a progress bar with timing information). Volume overlay 730 can include any suitable option or information. For example, volume overlay 730 can include bar 732 representing the current volume level. The user can change the volume level by selecting decrease option 734 and increase option 736.
Display 750 can include playback control overlay 760 displayed over art 752, which can be the same as art 702. The electronic device can also display media identifying information 754 (e.g., song title, artist and album name) and playback indicator 756 (e.g., a progress bar with timing information). Playback control overlay 760 can include any suitable option or information relating to playback control. For example, playback control overlay 760 can include play/pause option 762. The user can fast-forward and rewind played back media by selecting and holding (e.g., tap and hold, or a long tap) next option 764 and back option 766, respectively. The user can also skip to the next or previous media item available for playback (e.g., in a playlist) by selecting, without holding (e.g., a single tap), options 764 and 766, respectively.
Display 780 can include playlist control overlay 790 displayed over art 782, which can be the same as art 702. The electronic device can also display media identifying information 784 (e.g., song title, artist and album name) and playback indicator 786 (e.g., a progress bar with timing information). Playlist control overlay 790 can include any suitable option or information relating to playlist control. For example, playlist control overlay 790 can include genius playlist option 792 (e.g., for generating a new playlist of media items related to the currently played back media item). The user can toggle playlist shuffling and repeat options by selecting shuffle option 794 and repeat option 796, respectively. In some embodiments, the user can toggle between more than two options in response to selections of displayed options (e.g., toggle between repeat all, repeat one, and repeat none).
While the user can select the options displayed in each ofFIGS. 7B-7D, the user may also wish to provide playback instructions and volume control instructions without selecting displayed on-screen options. In one embodiment, the user can provide media playback instructions to the device by providing specific touch gestures each associated with particular operations. In one implementation, the touch gestures can include combinations of taps that match the combinations of button presses associated with a button-based input interface of the device, so that the user can provide inputs using a single input scheme. This approach can be implemented in a reduced power mode, for example when a touch interface is enabled but a display is not.
In some embodiments, in response to detecting a particular input associated with a playback instruction, the electronic device can provide a visual confirmation of the instruction on the display. FIG. 8A is a schematic view of an illustrative display having visual feedback for a play touch instruction in accordance with one embodiment of the invention. FIG. 8B is a schematic view of an illustrative display having visual feedback for a fast-forward touch instruction in accordance with one embodiment of the invention. Display 800 can include any suitable background 802, including for example the background displayed at the time the playback touch instruction was received. For example, the background can include several selectable options (e.g., as part of a menu). To indicate to the user that a play instruction was detected, the electronic device can overlay play icon 810 on the display. Play icon 810 can remain displayed for any suitable duration, including for example until the device detects an instruction to hide the icon, until a particular duration lapses (e.g., a 2 second duration), or until any other suitable criterion is satisfied. In some embodiments, the electronic device can provide other forms of feedback instead of or in addition to displaying icon 810. For example, the electronic device can provide an audio cue (e.g., a series of tones matching the detected touch pattern, or a voice-over describing the identified operation), other forms of visual feedback, vibrations (e.g., a vibration pattern matching the input), or any other suitable feedback.
Display 850 can include any suitable background 852, including for example the background displayed at the time the playback touch instruction was received (e.g., as described in connection with background 802). To indicate to the user that a fast-forward instruction was detected, the electronic device can overlay fast-forward icon 860 on the display. Fast-forward icon 860 can remain displayed for any suitable duration, for example as described above in connection with play icon 810. In some embodiments, the electronic device can provide other forms of feedback instead of or in addition to displaying icon 860. For example, the electronic device can provide an audio cue (e.g., a series of tones matching the detected touch pattern, or a voice-over describing the identified operation), other forms of visual feedback, vibrations (e.g., a vibration pattern matching the input), or any other suitable feedback. The electronic device can similarly provide other operation icon overlays for other media playback operations for which touch inputs are detected, including for example pause, rewind, next track and previous track operations.
In some embodiments, a user can control the volume of played back media by providing particular touch inputs associated with volume control. FIG. 9A is a schematic view of a first volume control overlay in accordance with one embodiment of the invention. FIG. 9B is a schematic view of a second volume control overlay in accordance with one embodiment of the invention. Display 900 can include any suitable background 902, including for example the background displayed at the time the playback touch instruction was received. For example, the background can include album art associated with a “Now Playing” display. To indicate to the user that a volume control instruction was detected, the electronic device can overlay volume animation 910 on the display. Volume animation 910 can include a circular region having distinct continuous portions 912 and 914 graphically depicting the current volume level of the device. For example, portion 912 can be substantially filled and opaque, while portion 914 can be more transparent, where the relative amount of the circular region taken by each of portions 912 and 914 can depict the relative volume level. Volume animation 910 can include number 916 provided within the circular region to numerically quantify the current volume level. As a user provides a volume-related touch input (e.g., a circular motion), the size of respective portions 912 and 914, and the displayed number 916, can graphically change to match changes in the volume level. The electronic device can relate volume adjustments and the user's gesture using any suitable approach. For example, volume can be adjusted based on the distance traced around the circle by the user, the angular velocity of the user's gesture, the position of the user's finger relative to an origin, or any other suitable approach. Volume animation 910 can remain displayed for any suitable duration, for example as described above in connection with play icon 810. In some embodiments, the electronic device can provide other forms of feedback instead of or in addition to displaying volume animation 910, including for example audio feedback, tactile feedback (e.g., vibrations), or other forms of feedback.
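As one hedged example of relating a circular gesture to a volume change, the device could accumulate the angle swept by the finger around the center of the circular region, with clockwise motion raising the volume; the step scaling below is an arbitrary assumption.

# Illustrative sketch: adjust volume in proportion to the angle swept by the
# finger around a center point; clockwise motion raises the level.
import math

def swept_angle(center, prev_pos, cur_pos):
    """Signed angle (radians) swept between two finger positions around center."""
    a0 = math.atan2(prev_pos[1] - center[1], prev_pos[0] - center[0])
    a1 = math.atan2(cur_pos[1] - center[1], cur_pos[0] - center[0])
    delta = a1 - a0
    while delta > math.pi:
        delta -= 2 * math.pi
    while delta < -math.pi:
        delta += 2 * math.pi
    return delta

def adjust_volume(volume, center, prev_pos, cur_pos, steps_per_turn=16):
    delta = swept_angle(center, prev_pos, cur_pos)
    # In screen coordinates (y grows downward), clockwise motion yields a
    # positive delta, so clockwise raises the volume level.
    volume += delta * steps_per_turn / (2 * math.pi)
    return max(0.0, min(100.0, volume))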
Display 950 is an alternate animation depicting volume level. Background 952 can include any suitable background, including for example backgrounds discussed in connection with background 902. Volume bar 960 can be overlaid on background 952 to provide a graphical representation of the current volume level. For example, the size of the opaque portion of volume bar 960 relative to the size of the transparent portion of volume bar 960 can provide a graphical depiction of the volume level. As the device detects a touch input changing the volume level, the electronic device can adjust volume bar 960 to graphically depict the changed volume.
The electronic device can associate any suitable touch input with corresponding media playback operations. FIG. 10 is an illustrative table of touch event or gesture and device operation associations in accordance with one embodiment of the invention. Table 1000 can include several columns, including for example touch input column 1002 and device operation column 1010. Each row of the table can include a particular touch input and its associated electronic device operation. For example, single tap touch events can be associated with play/pause instructions. Double tap touch events can be associated with next item instructions. Triple tap touch events can be associated with previous item instructions. A double tap and hold (e.g., a short tap followed by a long tap) can be associated with fast forward instructions. A triple tap and hold (e.g., two short taps followed by a long tap) can be associated with rewind instructions. Clockwise circle touch events can be associated with volume up instructions, and counterclockwise circle touch events can be associated with volume down instructions. The touch event and device operation associations of table 1000 will be understood to be merely illustrative, as any other suitable combination of touch events and device operations can be used instead of or in addition to those shown in table 1000 to control device operations. In some embodiments, some of the touch gesture and device operation associations can mimic the associations defined for device operations and inputs provided by a button, such as an in-line button on a wired headphone (e.g., touch gesture taps correspond to button clicks).
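Expressed as a hypothetical lookup table, the associations of table 1000 might be represented as follows (gesture names and operation identifiers are placeholders):

# Illustrative sketch of the associations shown in table 1000.
GESTURE_TO_OPERATION = {
    "single_tap": "play_pause",
    "double_tap": "next_item",
    "triple_tap": "previous_item",
    "double_tap_and_hold": "fast_forward",      # short tap followed by a long tap
    "triple_tap_and_hold": "rewind",            # two short taps followed by a long tap
    "clockwise_circle": "volume_up",
    "counterclockwise_circle": "volume_down",
}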
The following flowchart describes processes used by the electronic device to control media playback. FIG. 11 is a flowchart of an illustrative process for controlling media playback based on detected touch events in accordance with one embodiment of the invention. Process 1100 can begin at step 1102. At step 1104, the electronic device can detect a touch event. For example, the electronic device can receive an indication from a touch-sensing interface that a touch gesture was detected. The touch-sensing interface can identify the particular touch gesture, and provide identifying information for the gesture to the electronic device control circuitry. For example, the touch-sensing interface can indicate that the detected touch gesture was a particular combination of tapping and holding a finger on a touch-sensitive surface, or a circular motion on the touch-sensitive surface. At step 1106, the electronic device can determine whether the detected touch event is associated with a playback operation. For example, the electronic device can determine whether the touch event matches one of the events in table 1000 (FIG. 10). If the electronic device determines that the touch event is not associated with a playback operation, process 1100 can return to step 1104 and continue to detect touch events. Alternatively, process 1100 can end.
If, at step 1106, the electronic device instead determines that the detected touch event is associated with a playback operation, process 1100 can move to step 1108. At step 1108, the electronic device can identify the particular playback operation associated with the detected touch event. For example, the electronic device can refer to a table or other data structure associating particular playback operations with different touch gestures. At step 1110, the electronic device can perform the identified playback operation. For example, the electronic device can play, pause, fast-forward, or rewind a media item. As another example, the electronic device can skip to a previous or next media item. As still another example, the electronic device can change the volume of the played back media. Process 1100 can then end at step 1112.
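A compact sketch of process 1100, using the hypothetical gesture table above and assumed touch-interface and player objects, might look like this:

# Illustrative sketch of process 1100: detect touch events, look up any
# associated playback operation, and perform it. touch_interface and player
# are hypothetical objects; player is assumed to expose one method per
# operation name in the gesture table.

def run_playback_control(touch_interface, player, gesture_table):
    while True:
        gesture = touch_interface.next_gesture()   # step 1104: detect a touch event
        if gesture is None:                        # no further events; end the process
            break
        operation = gesture_table.get(gesture)     # steps 1106/1108: find the operation
        if operation is None:
            continue                               # not a playback gesture; keep listening
        getattr(player, operation)()               # step 1110: perform the operation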
Although many of the embodiments of the present invention are described herein with respect to personal computing devices, it should be understood that the present invention is not limited to personal computing applications, but is generally applicable to other applications.
The invention is preferably implemented by software, but can also be implemented in hardware or a combination of hardware and software. The invention can also be embodied as computer readable code on a computer readable medium. The computer readable medium is any data storage device that can store data which can thereafter be read by a computer system. Examples of the computer readable medium include read-only memory, random-access memory, CD-ROMs, DVDs, magnetic tape, and optical data storage devices. The computer readable medium can also be distributed over network-coupled computer systems so that the computer readable code is stored and executed in a distributed fashion.
Insubstantial changes from the claimed subject matter as viewed by a person with ordinary skill in the art, now known or later devised, are expressly contemplated as being equivalently within the scope of the claims. Therefore, obvious substitutions now or later known to one with ordinary skill in the art are defined to be within the scope of the defined elements.
The above described embodiments of the invention are presented for purposes of illustration and not of limitation, and the present invention is limited only by the claims which follow.