CROSS-REFERENCE TO RELATED APPLICATIONS/INCORPORATION BY REFERENCE
This patent application is a continuation application of non-provisional application Ser. No. 12/774,380, filed May 5, 2010, entitled “SYSTEM AND METHOD IN A TELEVISION FOR PROVIDING USER-SELECTION OF OBJECTS IN A TELEVISION PROGRAM” which is related to and claims priority from provisional patent application Ser. No. 61/242,234 filed Sep. 14, 2009, and titled “TELEVISION SYSTEM,” the contents of which are hereby incorporated herein by reference in their entirety. This patent application is also related to U.S. patent application Ser. No. 12/850,832, filed Aug. 5, 2010, titled “SYSTEM AND METHOD IN A DISTRIBUTED SYSTEM FOR PROVIDING USER-SELECTION OF OBJECTS IN A TELEVISION PROGRAM”; U.S. patent application Ser. No. 12/850,866, filed Aug. 5, 2010, titled “SYSTEM AND METHOD IN A TELEVISION RECEIVER FOR PROVIDING USER-SELECTION OF OBJECTS IN A TELEVISION PROGRAM”; U.S. patent application Ser. No. 12/850,911, filed Aug. 5, 2010, titled “SYSTEM AND METHOD IN A TELEVISION CONTROLLER FOR PROVIDING USER-SELECTION OF OBJECTS IN A TELEVISION PROGRAM”; U.S. patent application Ser. No. 12/850,945, filed Aug. 5, 2010, titled “SYSTEM AND METHOD IN A TELEVISION CONTROLLER FOR PROVIDING USER-SELECTION OF OBJECTS IN A TELEVISION PROGRAM”; U.S. patent application Ser. No. 12/851,036, filed Aug. 5, 2010, titled “SYSTEM AND METHOD IN A TELEVISION SYSTEM FOR PROVIDING USER-SELECTION OF OBJECTS IN A TELEVISION PROGRAM”; U.S. patent application Ser. No. 12/851,075 filed Aug. 5, 2010, titled “SYSTEM AND METHOD IN A PARALLEL TELEVISION SYSTEM FOR PROVIDING USER-SELECTION OF OBJECTS IN A TELEVISION PROGRAM”. The contents of each of the above-mentioned applications are hereby incorporated herein by reference in their entirety.
FEDERALLY SPONSORED RESEARCH OR DEVELOPMENT
[Not Applicable]
SEQUENCE LISTING
[Not Applicable]
MICROFICHE/COPYRIGHT REFERENCE
[Not Applicable]
BACKGROUND OF THE INVENTION
Present television systems are incapable of providing for and/or conveniently providing for user-selection of objects in a television program. Further limitations and disadvantages of conventional and traditional approaches will become apparent to one of skill in the art, through comparison of such systems with the present invention as set forth in the remainder of the present application with reference to the drawings.
BRIEF SUMMARY OF THE INVENTION
Various aspects of the present invention provide a system and method in a television for providing for user selection of objects in a television program, substantially as shown in and/or described in connection with at least one of the figures, as set forth more completely in the claims. These and other advantages, aspects and novel features of the present invention, as well as details of illustrative aspects thereof, will be more fully understood from the following description and drawings.
BRIEF DESCRIPTION OF SEVERAL VIEWS OF THE DRAWINGS
FIG. 1 is a diagram illustrating an exemplary television system, in accordance with various aspects of the present invention.
FIG. 2 is a flow diagram illustrating an exemplary method for providing user-selection of objects in television programming, in accordance with various aspects of the present invention.
FIG. 3 is a flow diagram illustrating an exemplary method for providing user-selection of objects in television programming, in accordance with various aspects of the present invention.
FIG. 4 is a diagram illustrating an exemplary television, in accordance with various aspects of the present invention.
FIG. 5 is a diagram illustrating exemplary modules and/or sub-modules for a television, in accordance with various aspects of the present invention.
DETAILED DESCRIPTION OF VARIOUS ASPECTS OF THE INVENTION
The following discussion will refer to various communication modules, components or circuits. Such modules, components or circuits may generally comprise hardware and/or a combination of hardware and software (e.g., including firmware). Such modules may also, for example, comprise a computer readable medium (e.g., a non-transitory medium) comprising instructions (e.g., software instructions) that, when executed by a processor, cause the processor to perform various functional aspects of the present invention. Accordingly, the scope of various aspects of the present invention should not be limited by characteristics of particular hardware and/or software implementations of a module, component or circuit unless explicitly claimed as such. For example and without limitation, various aspects of the present invention may be implemented by one or more processors (e.g., a microprocessor, digital signal processor, baseband processor, microcontroller, etc.) executing software instructions (e.g., stored in volatile and/or non-volatile memory). Also for example, various aspects of the present invention may be implemented by an application-specific integrated circuit (“ASIC”) and/or other hardware components.
Additionally, the following discussion will refer to various television system modules (e.g., television modules). It should be noted that the following discussion of such various modules is segmented into such modules for the sake of illustrative clarity. However, in actual implementation, the boundaries between various modules may be blurred. For example, any or all of the functional modules discussed herein may share various hardware and/or software components. For example, any or all of the functional modules discussed herein may be implemented wholly or in-part by a shared processor executing software instructions. Additionally, various software sub-modules that may be executed by one or more processors may be shared between various software modules. Accordingly, the scope of various aspects of the present invention should not be limited by arbitrary boundaries between various hardware and/or software components, unless explicitly claimed.
The following discussion may also refer to communication networks and various aspects thereof. For the following discussion, a communication network is generally the communication infrastructure through which a communication device (e.g., a portable communication device, television, television control device, television provider, television programming provider, television receiver, video recording device, etc.) may communicate with other systems. For example and without limitation, a communication network may comprise a cable and/or satellite television communication network, a cellular communication network, a wireless metropolitan area network (WMAN), a wireless local area network (WLAN), a wireless personal area network (WPAN), any home or premises communication network, etc. A particular communication network may, for example, generally have a corresponding communication protocol according to which a communication device may communicate with the communication network. Unless so claimed, the scope of various aspects of the present invention should not be limited by characteristics of a particular type of communication network.
The following discussion will at times refer to an on-screen pointing location. Such a pointing location refers to a location on the television screen to which a user (either directly or with a pointing device) is pointing. Such a pointing location is to be distinguished from other types of on-screen location identification, such as, for example, using arrow keys and/or a mouse to move a cursor or to traverse blocks (e.g., on an on-screen program guide) without pointing. Various aspects of the present invention, while referring to on-screen pointing location, are also readily extensible to such other forms of on-screen location identification.
Additionally, the following discussion will at times refer to television programming. Such television programming generally includes various types of television programming (e.g., television programs, news programs, sports programs, music television, movies, television series programs and/or associated advertisements, educational programs, live or recorded television programming, broadcast/multicast/unicast television programming, etc.). Such television programming may, for example, comprise real-time television broadcast programming (or multicast or unicast television programming) and/or user-stored television programming that is stored in a user device (e.g., a VCR, PVR, etc.). Such television programming video content is to be distinguished from other non-programming video content that may be displayed on a television screen (e.g., an electronic program guide, user interface menu, a television set-up menu, a typical web page, a document, a graphical video game, etc.). Various aspects of the present invention may, for example in a television, comprise receiving television programming, presenting such received television programming to a user, determining an on-screen pointing location pointed to by the user and identifying a user-selected object in the presented television programming.
Also, the following discussion will at times refer to user-selectable objects in television programming. Such user-selectable objects include both animate (i.e., living) and inanimate (i.e., non-living) objects, both still and moving. Such objects may, for example, comprise characteristics of any of a variety of objects present in television programming. Such objects may, for example and without limitation, comprise inanimate objects, such as consumer good objects (e.g., clothing, automobiles, shoes, jewelry, furniture, food, beverages, appliances, electronics, toys, artwork, cosmetics, recreational vehicles, sports equipment, safety equipment, computer equipment, communication devices, books, etc.), premises objects (e.g., business locations, stores, hotels, signs, doors, buildings, landmarks, historical sites, entertainment venues, hospitals, government buildings, etc.), objects related to services (e.g., objects related to transportation, objects related to emergency services, objects related to general government services, objects related to entertainment services, objects related to food and/or drink services, etc.), objects related to location (e.g., parks, landmarks, streets, signs, road signs, etc.), etc. Such objects may, for example, comprise animate objects, such as people (e.g., actors/actresses, athletes, musicians, salespeople, commentators, reporters, analysts, hosts/hostesses, entertainers, etc.), animals (e.g., pets, zoo animals, wild animals, etc.) and plants (e.g., flowers, trees, shrubs, fruits, vegetables, cacti, etc.).
Turning first to FIG. 1, such figure is a diagram illustrating a non-limiting exemplary television system 100 in accordance with various aspects of the present invention. The exemplary system 100 includes a television provider 110. The television provider 110 may, for example, comprise a television network company, a cable company, a movie-providing company, a news company, an educational institution, etc. The television provider 110 may, for example, be an original source of television programming (or related information). Also for example, the television provider 110 may be a communication company that provides programming distribution services (e.g., a cable television company, a satellite television company, a telecommunication company, a data network provider, etc.). The television provider 110 may, for example, provide television programming and non-programming information and/or video content. The television provider 110 may, for example, provide information related to a television program (e.g., information describing or otherwise related to selectable objects in programming, etc.).
The exemplary television system 100 may also include a third party program information provider 120. Such a provider may, for example, provide information related to a television program. Such information may, for example, comprise information describing selectable objects in programming, program guide information, etc.
The exemplary television system 100 may include one or more communication networks (e.g., the communication network(s) 130). The exemplary communication network 130 may comprise characteristics of any of a variety of types of communication networks over which television programming and/or information related to television programming may be communicated. For example and without limitation, the communication network 130 may comprise characteristics of any one or more of: a cable television network, a satellite television network, a telecommunication network, the Internet, a local area network (LAN), a personal area network (PAN), a metropolitan area network (MAN), any of a variety of different types of home networks, etc.
The exemplary television system 100 may include a first television 140. Such a first television 140 may, for example, comprise networking capability enabling such television 140 to communicate directly with the communication network 130. For example, the first television 140 may comprise one or more embedded television receivers or transceivers (e.g., a cable television receiver, satellite television transceiver, Internet modem, etc.). Also for example, the first television 140 may comprise one or more recording devices (e.g., for recording and/or playing back video content, television programming, etc.). The first television 140 may, for example, operate to (which includes “operate when enabled to”) perform any or all of the functionality discussed herein.
The exemplary television system 100 may include a first television controller 160. Such a first television controller 160 may, for example, operate to (e.g., which may include “operate when enabled to”) control operation of the first television 140. The first television controller 160 may comprise characteristics of any of a variety of television controlling devices. For example and without limitation, the first television controller 160 may comprise characteristics of a dedicated television control device, a universal remote control, a cellular telephone or personal computing device with television control capability, etc.
The first television controller 160 (or television control device) may, for example, transmit signals directly to the first television 140 to control operation of the first television 140. The first television controller 160 may also, for example, operate to transmit signals (e.g., via the communication network 130) to the television provider 110 to control television programming (or related information) being provided to the first television 140, or to conduct other transactions (e.g., business transactions, etc.).
As will be discussed in more detail later, the first television controller 160 may operate to communicate screen pointing information with the first television 140 and/or other devices. Also, as will be discussed in more detail later, various aspects of the present invention include a user pointing to a location on a television screen (e.g., pointing to an animate or inanimate object presented in television programming). In such a scenario, the user may perform such pointing in any of a variety of manners. One of such exemplary manners includes pointing with a television control device. The first television controller 160 provides a non-limiting example of a device that a user may utilize to point to an on-screen location.
The exemplary television system 100 may also include a television receiver 151. The television receiver 151 may, for example, operate to (e.g., which may include “operate when enabled to”) provide a communication link between a television and/or television controller and a communication network and/or information provider. For example, the television receiver 151 may operate to provide a communication link between the second television 141 and the communication network 130, or between the second television 141 and the television provider 110 (and/or third party program information provider 120) via the communication network 130.
The television receiver 151 may comprise characteristics of any of a variety of types of television receivers. For example and without limitation, the television receiver 151 may comprise characteristics of a cable television receiver, a satellite television receiver, etc. Also for example, the television receiver 151 may comprise a data communication network modem for data network communications (e.g., with the Internet, a LAN, PAN, MAN, telecommunication network, etc.). The television receiver 151 may also, for example, comprise recording capability (e.g., programming recording and playback, etc.).
The exemplary television system 100 may include a second television controller 161. Such a second television controller 161 may, for example, operate to (e.g., which may include “operate when enabled to”) control operation of the second television 141 and the television receiver 151. The second television controller 161 may comprise characteristics of any of a variety of television controlling devices. For example and without limitation, the second television controller 161 may comprise characteristics of a dedicated television control device, a dedicated television receiver control device, a universal remote control, a cellular telephone or personal computing device with television control capability, etc.
The second television controller 161 may, for example, operate to transmit signals directly to the second television 141 to control operation of the second television 141. The second television controller 161 may, for example, operate to transmit signals directly to the television receiver 151 to control operation of the television receiver 151. The second television controller 161 may additionally, for example, operate to transmit signals (e.g., via the television receiver 151 and the communication network 130) to the television provider 110 to control television programming (or related information) being provided to the television receiver 151, or to conduct other transactions (e.g., business transactions, etc.).
As will be discussed in more detail later, various aspects of the present invention include a user pointing to a location on a television screen (e.g., pointing to an animate or inanimate object presented in television programming). In such a scenario, the user may perform such pointing in any of a variety of manners. One of such exemplary manners includes pointing with a television control device. The second television controller 161 provides one non-limiting example of a device that a user may utilize to point to an on-screen location.
The exemplary television system 100 was presented to provide a non-limiting illustrative foundation for discussion of various aspects of the present invention. Thus, the scope of various aspects of the present invention should not be limited by any characteristics of the exemplary television system 100 unless explicitly claimed.
FIG. 2 is a flow diagram illustrating an exemplary method 200 for providing user-selection of objects in television programming, in accordance with various aspects of the present invention. Any or all aspects of the exemplary method 200 may, for example, be implemented in a television (e.g., the first television 140 and/or second television 141 shown in FIG. 1 and discussed previously).
The exemplary method 200 may, for example, begin executing at step 205. The exemplary method 200 may begin executing in response to any of a variety of causes and/or conditions. For example, the exemplary method 200 may begin executing in response to a user command to begin, in response to user selection of a television program that includes user-selectable objects, upon television reset and/or power-up, in response to a user input indicating a desire to provide object selection capability to the user, in response to identification of a user and/or user equipment for which object selection capability is to be provided, in response to user payment of a fee, etc.
The exemplary method 200 may, for example at step 210, comprise receiving television programming. Many non-limiting examples of such television programming were provided above. Step 210 may comprise receiving the television programming from any of a variety of sources. For example and without limitation, step 210 may comprise receiving the television programming from a television broadcasting company, from a movie streaming company, from a user (or consumer) video recording device (e.g., internal and/or external to the television), from an Internet television programming provider, etc.
Step 210 may also comprise receiving the television programming via any of a variety of types of communication networks. Such networks may, for example, comprise a wireless television network (e.g., terrestrial and/or satellite) and/or cable television network. Such networks may, for example, comprise any of a variety of data communication networks (e.g., the Internet, a local area network, a personal area network, a metropolitan area network, etc.).
In general, step 210 may comprise receiving television programming. The scope of various aspects of the present invention should not be limited by characteristics of any particular television programming, television programming source, television programming network or manner of receiving television programming unless explicitly claimed.
The exemplary method 200 may, at step 220, comprise presenting television programming to a user. Step 220 may, for example, comprise presenting the television programming received at step 210 to a user in any of a variety of manners. For example, step 220 may comprise presenting the television programming on a screen of the television. Also for example, step 220 may comprise communicating the television programming to another video presentation device external to the television.
The presented television programming may, for example, comprise user-selectable objects in the television programming. Many non-limiting examples of such user-selectable objects were presented above. In general, such user-selectable objects may, for example, comprise animate and/or inanimate objects in television programming that a user may select (e.g., using a pointing device or other user interface by which a user may specify a screen location).
The exemplary method 200 may, at step 230, comprise determining an on-screen pointing location pointed to by a user of the television. Step 230 may comprise determining an on-screen pointing location in any of a variety of manners, non-limiting examples of which will now be provided. Various non-limiting examples of on-screen pointing location determining are provided in U.S. Provisional Application No. 61/242,234, which is hereby incorporated herein by reference in its entirety. An on-screen pointing location may, for example, be expressed in a screen-centric coordinate system (e.g., x-y pixel coordinates), a screen-independent coordinate system (e.g., based on location within a moving image, where such location is generic to all television screens), a world coordinate and/or universal coordinate system, a video frame-based coordinate system, etc.
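As a non-limiting illustration of the coordinate-system distinction above, the following Python sketch (function names are illustrative only, not part of the described system) converts between a screen-centric pixel coordinate and a normalized screen-independent coordinate:

```python
def to_screen_independent(x_px, y_px, screen_w, screen_h):
    """Convert a screen-centric pixel coordinate into a normalized
    screen-independent coordinate in [0.0, 1.0] x [0.0, 1.0].

    The normalized form is generic to all television screens, so the
    same object-location metadata can be matched against displays of
    any resolution.
    """
    return (x_px / screen_w, y_px / screen_h)


def to_screen_centric(x_norm, y_norm, screen_w, screen_h):
    """Map a normalized screen-independent coordinate back onto a
    particular screen's pixel grid."""
    return (round(x_norm * screen_w), round(y_norm * screen_h))
```

For example, the pixel location (960, 540) on a 1920x1080 screen normalizes to (0.5, 0.5), which maps to (640, 360) on a 1280x720 screen.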
Step 230 may, for example, comprise the television analyzing sensor information (e.g., associated with sensors on-board and/or off-board the television) to determine user on-screen pointing location. Step 230 may also, for example, comprise the television receiving information describing the on-screen pointing location from a device external to the television (e.g., a television receiver, a television controller, a television network device, a user pointing device, etc.).
Step 230 may, for example, comprise identifying a timestamp temporally identifying the instance of a determined on-screen pointing location. Such a timestamp may, for example, be obtained from a clock, a timestamp embedded in a video stream, a timestamp embedded in a stream including object information, a timestamp associated with a signal transmitted from a user pointing device, etc. Determination of such a timestamp may, for example, be based on a user command (e.g., a user indicating that a selection has occurred) or performed automatically without a direct indication from the user that a selection has occurred (e.g., the system determining that the user has pointed to an object for at least a particular amount of time), etc. Such a timestamp may be utilized, for example, for determining selection of a moving, changing and/or temporally transient object in the presented television programming.
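The automatic case above (deeming a selection to have occurred once the user has pointed at roughly the same location for a minimum duration) can be sketched as follows. This is an illustrative Python fragment, not an implementation from the source; the sample format and parameter names are assumptions:

```python
def detect_dwell_selection(samples, dwell_time, tolerance):
    """Scan chronological (timestamp, x, y) pointing samples and return
    the timestamp at which a selection event is deemed to occur: the
    first moment the pointing location has stayed within `tolerance`
    pixels of its starting position for at least `dwell_time` seconds.
    Returns None if no such dwell occurs."""
    start = 0
    for i, (t, x, y) in enumerate(samples):
        t0, x0, y0 = samples[start]
        # Restart the dwell window whenever the pointer strays too far.
        if abs(x - x0) > tolerance or abs(y - y0) > tolerance:
            start = i
            continue
        if t - t0 >= dwell_time:
            return t  # the selection timestamp
    return None
```

The returned timestamp can then serve as the temporal reference for matching the pointing location against moving or transient objects.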
In general, step 230 may comprise determining an on-screen pointing location pointed to by a user of the television. Accordingly, the scope of various aspects of the present invention should not be limited by characteristics of any particular type of on-screen pointing location or any particular manner of determining such an on-screen pointing location unless explicitly claimed.
The exemplary method 200 may, at step 240, comprise identifying a user-selectable object in the presented television programming at which the user is pointing based, at least in part, on the determined on-screen pointing location (e.g., as determined at step 230). Step 240 may comprise performing such identifying in any of a variety of manners, non-limiting examples of which will now be presented.
For example, step 240 may comprise determining the on-screen location and/or dimensions of one or more user-selectable objects (e.g., or associated selection region) in the presented television programming, and identifying a user-selected object by analyzing the respective on-screen locations of the one or more user-selectable objects and the determined on-screen pointing location (e.g., at a particular time instance and/or particular timeframe) to determine the television programming object selected by the user. Such an on-screen location may, for example, comprise the on-screen location of one or more points, areas and/or volumes associated with respective locations of user-selectable objects.
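As a non-limiting illustration of such analysis, the following Python sketch performs a simple hit test of the determined pointing location against object selection regions. Axis-aligned rectangular regions and the dict field names are simplifying assumptions chosen for illustration; regions could be arbitrary shapes:

```python
def identify_selected_object(pointing_loc, objects):
    """Return the id of the first user-selectable object whose
    selection region contains the on-screen pointing location,
    or None if the user pointed at no object.

    `objects` is a list of dicts of the assumed form
        {"id": ..., "region": (x_min, y_min, x_max, y_max)}
    describing each object's selection region for the current
    video frame.
    """
    px, py = pointing_loc
    for obj in objects:
        x_min, y_min, x_max, y_max = obj["region"]
        # Point-in-rectangle containment test.
        if x_min <= px <= x_max and y_min <= py <= y_max:
            return obj["id"]
    return None
```

In practice this test would be repeated per frame (or per selection event) against the object locations valid at the relevant time instance.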
In such an exemplary scenario, or any scenario involving determining the on-screen location of one or more objects in the presented television programming, step 240 may, for example, comprise determining such on-screen object location in any of a variety of manners. For example, step 240 may comprise receiving information identifying and/or describing the user-selectable objects in the television program.
For example, step 240 may comprise receiving information identifying and/or describing such user-selectable objects from the same source as the received television programming. For example, step 240 may comprise receiving such information embedded in a same data stream as a stream communicating the presented television programming (e.g., embedded in the received television program data). For example, a television stream protocol may comprise specialized elements (and/or the utilization of unassigned elements) that include information about selectable objects (e.g., object identity, shape, location, size, coloration, movement characteristics, timing, appearance time window, etc.).
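One way to picture such a stream element is as a per-object record. The Python sketch below is purely illustrative; the field names are assumptions and are not taken from any actual broadcast standard:

```python
from dataclasses import dataclass


@dataclass
class SelectableObjectRecord:
    """One selectable-object element as it might be carried in a
    television stream (hypothetical field layout)."""
    object_id: str      # object identity
    shape: str          # e.g. "rect", "circle", "polygon"
    location: tuple     # shape parameters, e.g. (x, y, w, h)
    appear_start: float  # appearance time window start, seconds
    appear_end: float    # appearance time window end, seconds

    def visible_at(self, t):
        """True if the object is on screen at program time t."""
        return self.appear_start <= t <= self.appear_end
```

A receiver could parse such records out of the stream and consult `visible_at` when matching a timestamped pointing event against candidate objects.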
Also for example, step 240 may comprise receiving the information identifying and/or describing such user-selectable objects in a data stream communicated in parallel with a stream communicating the presented television programming. In such a scenario, the television program stream and object information stream may be received from a same source over a same television programming communication channel. Additionally for example, step 240 may comprise receiving such information from a same source but over a communication channel different from the channel over which the television programming is received and/or over a type of communication channel different from a television programming communication channel.
Further for example, step 240 may comprise receiving the information identifying and/or describing such user-selectable objects from a source (e.g., a third party information provider, a television network source, etc.) different from the source from which the television programming is received. In such an exemplary scenario, step 240 may comprise receiving such information via different respective communication networks or via one or more same communication networks. Also for example, step 240 may comprise receiving such information over a different communication medium than that over which the television programming is received.
Step 240 may, for example, comprise receiving the information identifying and/or describing such user-selectable objects in a data stream, where such information is always transmitted in the data stream (e.g., whether or not requested by a user and/or other system). Alternatively for example, step 240 may comprise receiving such information, where such information is communicated (e.g., to the television) only when requested (e.g., only when requested by the television, by a television controller, by a television receiver, by a user electronic device, by the user, etc.).
Step 240 may, for example, comprise receiving the information identifying and/or describing such user-selectable objects in real-time (i.e., as the television programming is received). Also for example, step 240 may comprise receiving such information from a source of user-stored television programming. For example, such information may be stored with stored television programming in a user storage device (e.g., in a same data file, in separate but related files, etc.). In such an exemplary implementation, such information may be received from the user's television programming storage device in time synchronization with the television programming.
As mentioned above, the information identifying and/or describing user selectable objects in television programming may comprise timing information associated with such selectable objects. For example, movement of a selectable object may be expressed as a function of time. Also for example, appearance of a selectable object in television programming may be associated with a time window during which such object appears. As will be discussed in more detail below, timing associated with a user on-screen pointing (or object selection) event may be synchronized to the timing of selectable object location in a presented program to determine whether a user pointed to (or selected) a particular object at a particular location at a particular time.
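The expression of object movement as a function of time can be illustrated with a simple keyframe interpolation, a non-limiting Python sketch (the keyframe format is an assumption for illustration):

```python
def object_position_at(keyframes, t):
    """Given (time, x, y) keyframes describing a selectable object's
    movement as a function of time, linearly interpolate the object's
    on-screen position at program time t.

    Returns None when t falls outside the object's appearance window
    (i.e., before the first or after the last keyframe), which also
    handles temporally transient objects.
    """
    if not keyframes or t < keyframes[0][0] or t > keyframes[-1][0]:
        return None
    for (t0, x0, y0), (t1, x1, y1) in zip(keyframes, keyframes[1:]):
        if t0 <= t <= t1:
            # Fraction of the way from this keyframe to the next.
            f = 0.0 if t1 == t0 else (t - t0) / (t1 - t0)
            return (x0 + f * (x1 - x0), y0 + f * (y1 - y0))
    # t equals the final keyframe time.
    return (keyframes[-1][1], keyframes[-1][2])
```

A timestamped pointing event can then be compared against the position the object occupied at that same instant, synchronizing user selection timing with object location timing.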
The information identifying and/or describing user selectable objects in television programming may comprise information defining respective regions of the presented television programming that are associated with respective user-selectable objects in the presented television programming. For example, such information may comprise information describing respective geometric shapes (e.g., 2-D and/or 3-D geometric constructs) associated with respective user-selectable objects. For example, a circle, oval, square, rectangle, pentagon or any polygon may be associated with a user-selectable object. User-selection of one of such geometric shapes (e.g., a determined on-screen pointing location within the boundaries of such geometric shape(s) at the relevant point in time) may indicate user selection of the respective object.
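Containment tests for such geometric selection regions are standard. As a non-limiting illustration in Python, a circular region reduces to a distance check and a polygonal region to the classic ray-casting test:

```python
import math


def point_in_circle(px, py, cx, cy, r):
    """True if (px, py) falls within a circular selection region of
    center (cx, cy) and radius r."""
    return math.hypot(px - cx, py - cy) <= r


def point_in_polygon(px, py, vertices):
    """Ray-casting test: True if (px, py) lies inside the polygon
    given as a list of (x, y) vertices (any simple polygon)."""
    inside = False
    n = len(vertices)
    for i in range(n):
        x1, y1 = vertices[i]
        x2, y2 = vertices[(i + 1) % n]
        # Does a horizontal ray from (px, py) cross this edge?
        if (y1 > py) != (y2 > py):
            x_cross = x1 + (py - y1) * (x2 - x1) / (y2 - y1)
            if px < x_cross:
                inside = not inside
    return inside
```

A determined on-screen pointing location falling inside such a shape (at the relevant point in time) would indicate user selection of the associated object.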
As discussed above, the object description (whether geometric or not) may comprise information (e.g., temporal information) describing movement, appearance, size changing, shape changing, etc. associated with the user-selectable object. For example, in an exemplary scenario where a plurality of geometric shapes are utilized to describe a user-selectable object (and/or a user-selectable area associated with such object), the description of the user-selectable object may comprise information describing the manner in which each of the respective objects move in the programming (e.g., as a function of time, as a function of video frame number, etc.) and/or information describing the manner in which dimensions of each of the respective objects change.
Once on-screen pointing location and object location (e.g., at a particular point in time) are known, such information may be processed to determine a user-selectable object to which a user is pointing. As mentioned above, such processing may comprise determining a respective region of the television screen and/or a television programming frame associated with a respective user-selectable object, where the respective region correlates to an on-screen pointing location pointed to by the user (e.g., at a particular point in time or during a particular timeframe).
In performing such processing, step 240 may comprise low-pass filtering the determined on-screen pointing location (e.g., as determined at step 230) to compensate for unintended movement of the pointing location (e.g., due to unsteady or unstable user pointing), thus increasing the reliability of object-selection determination. For example, successful user-selection of a television programming object may require a user to point to an on-screen object for a particular amount of time (or for a particular number of frames).
Note that such low-pass filtering may also be performed at step 230. Such filtering may, for example, comprise filtering over a particular period of time, over a particular number of on-screen pointing direction determinations, over a particular number of television programming frames, etc. Such filtering may comprise averaging a plurality of on-screen pointing direction determinations, utilizing a finite impulse response filtering scheme, etc.
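As a non-limiting illustration of one such scheme, the following Python sketch averages pointing samples over a sliding window (a simple finite-impulse-response low-pass filter) and requires the pointed-to object to persist for a number of consecutive frames before registering a selection; the window length and dwell threshold are arbitrary example values.

```python
from collections import deque

class PointingFilter:
    """Sliding-window average of recent on-screen pointing samples
    (one simple low-pass filtering scheme among those mentioned above)."""
    def __init__(self, window: int = 8):
        self.samples = deque(maxlen=window)

    def update(self, x: float, y: float):
        """Add a raw sample; return the filtered pointing location."""
        self.samples.append((x, y))
        n = len(self.samples)
        return (sum(s[0] for s in self.samples) / n,
                sum(s[1] for s in self.samples) / n)

def dwell_select(hits, required: int = 15):
    """hits: per-frame sequence of pointed-to object ids (None = no object).
    Return an object id only once it has been pointed at for `required`
    consecutive frames, else None."""
    run, last = 0, None
    for obj in hits:
        if obj is not None and obj == last:
            run += 1
        else:
            run = 1 if obj is not None else 0
        last = obj
        if run >= required:
            return obj
    return None
```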
Step 240 may also, for example, comprise identifying a user-selected object by identifying a most likely object selected by the user. For example, step 240 may comprise identifying a closest user-selectable object to the on-screen pointing location determined at step 230. Also for example, step 240 may comprise determining a most likely user-selected object based, at least in part, on respective popularity of user-selectable objects (e.g., favoring the most often selected inanimate and/or animate objects). Additionally for example, step 240 may comprise identifying a most likely user-selected object based, at least in part, on monetary considerations (e.g., placing a higher likelihood on user-selectable objects associated with relatively higher paying advertisers, placing a higher likelihood on user-selectable objects associated with a per-selection based advertising fee, etc.). Further for example, step 240 may comprise identifying a most likely user-selected programming object based on a history of selection from a particular user (e.g., favoring types of objects most often selected by a particular user). Also for example, step 240 may comprise identifying a most likely user-selected programming object based on object newness (e.g., a new object is likely to garner more interest than an object that has been shown for a relatively long period of time). Additionally, for example, step 240 may comprise identifying a most likely user-selected television programming object based on object size.
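Purely as a non-limiting illustration, the most-likely-object factors above might be combined into a single weighted score as in the following Python sketch; the candidate field names, the 0..1 normalization, and the weights are hypothetical examples, not values prescribed by this description.

```python
import math

def most_likely_object(px, py, candidates,
                       weights=(1.0, 0.3, 0.3, 0.2, 0.2, 0.1)):
    """Rank candidate objects by a weighted heuristic score and return the
    most likely user-selected one. Each candidate is a dict with
    illustrative keys: cx, cy (region center), and popularity, ad_tier,
    user_affinity, newness, size, each pre-normalized to 0..1."""
    w_dist, w_pop, w_ad, w_hist, w_new, w_size = weights

    def score(c):
        dist = math.hypot(c["cx"] - px, c["cy"] - py)
        closeness = 1.0 / (1.0 + dist)  # nearer objects score higher
        return (w_dist * closeness + w_pop * c["popularity"] +
                w_ad * c["ad_tier"] + w_hist * c["user_affinity"] +
                w_new * c["newness"] + w_size * c["size"])

    return max(candidates, key=score) if candidates else None
```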
A selected object may be identified with a unique identifier (e.g., unique to the object, unique to a group of related objects, etc.). Step 240 may thus comprise determining the identifier associated with the determined user-selected object. Such identifier may then be utilized in the performance of any of a variety of further functions, non-limiting examples of which will be presented below. For example, the information identifying and/or describing a user-selectable object may comprise information describing functions associated with such object (e.g., information presentation functionality, communication functionality, business transaction functionality, user interaction functionality, etc.).
The steps of the exemplary method 200 (or aspects thereof) may, for example, be performed (e.g., by a television) in real-time. In such manner, the user may have relatively expeditious access to functionality associated with the user-selected object. Alternatively for example, the exemplary method 200 (or aspects thereof) may be performed off-line in a manner in which functionality associated with the user-selected object is provided to the user at a later time (e.g., after presentation of the television program, upon the user pausing presentation of the television program, upon the user logging into the user's computer system, upon the user accessing email, etc.).
As mentioned above, any or all of the steps of the exemplary method 200 may be performed for user selection of an object in television programming as the programming is broadcast in real-time and/or may be performed for user selection of an object in television programming that has been recorded on a user (or home) television programming recorder (e.g., a personal video recorder (PVR), video cassette recorder (VCR), etc.) and is currently being presented to the user (e.g., at step 220) in a time-shifted manner. For example, a user may record a broadcast television program on a PVR for later viewing, view such recorded programming at a later time, and while viewing such time-shifted television programming at a later time, select user-selectable objects in such programming.
Similarly, any or all of the steps of the exemplary method 200 may be performed for user selection of an object in television programming that has been provided to the user (or stored by the user) on a physical storage medium (e.g., on a digital versatile disc (DVD), video cassette recorder tape, non-volatile memory device, etc.). For example, a user may purchase a set of DVDs including all episodes of a season of a television series, view each of such episodes at the convenience of the user, and while viewing such episodes, select user-selectable objects in such programming.
In an exemplary scenario, where on-screen pointing location at a particular point in time is utilized to determine object selection, any of a variety of time references may be utilized. For example, synchronization of on-screen pointing location and user-selectable object location (e.g., on-screen and/or in-frame object location) may be based on a presentation timestamp (PTS) and/or a decoding timestamp (DTS), or the like, which may be encoded in a broadcast and/or recorded program or determined as such program is being displayed to a user. In such a scenario, so long as the object location and pointing determination are based on a common and/or synchronized time reference, the identification of a pointed-to object may be performed accurately.
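As a non-limiting illustration of such timestamp-based synchronization, the following Python sketch looks up the object regions active at the presentation timestamp of a pointing sample; the assumption of a 90 kHz PTS clock (so that a tolerance of 1500 ticks is roughly one frame at 60 Hz), and the layout of the object tracks, are illustrative only.

```python
import bisect

def object_regions_at(pts, tracks, tolerance=1500):
    """Given a pointing sample stamped on the program's PTS clock, return
    the object regions active at (or shortly before) that instant.
    tracks: dict mapping object id -> sorted list of (pts, region)."""
    active = {}
    for obj_id, entries in tracks.items():
        stamps = [p for p, _ in entries]
        # Find the last region update at or before the pointing sample.
        i = bisect.bisect_right(stamps, pts) - 1
        if i >= 0 and pts - stamps[i] <= tolerance:
            active[obj_id] = entries[i][1]
    return active
```

So long as the pointing samples and the object tracks share this common time reference, the hit test operates on regions that correspond to the frame the user actually saw.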
As mentioned previously, object information identifying and/or describing user-selectable objects may be received encoded in a video program stream or may be received in a separate stream (and/or channel). In a scenario where television programming information is stored (either short term or long term), the object information may also be stored (e.g., with the stored programming information in a same data file, in a separate but related data file, etc.). In such a manner, when the user determines to view a time-shifted program, the object information is accessible to the television.
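Purely as a non-limiting illustration, storing object information in "a separate but related data file" might resemble the following Python sketch, which writes the object metadata as a sidecar file next to the recorded program so that it remains available for time-shifted viewing; the file naming convention and the JSON encoding are illustrative assumptions.

```python
import json
import os

def store_with_recording(recording_path, object_info):
    """Write object metadata as a sidecar file alongside a recorded
    program (e.g., show.ts -> show.objects.json). Returns the path."""
    sidecar = os.path.splitext(recording_path)[0] + ".objects.json"
    with open(sidecar, "w") as f:
        json.dump(object_info, f)
    return sidecar

def load_with_recording(recording_path):
    """Load the object metadata associated with a recorded program."""
    sidecar = os.path.splitext(recording_path)[0] + ".objects.json"
    with open(sidecar) as f:
        return json.load(f)
```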
Alternatively, such information identifying and/or describing user-selectable objects in programming may be requested from a third party when such information is needed. In yet another scenario, for example, where video information may be decoded separately from the television (e.g., in a set top box (STB), cable and/or satellite television receiver, PVR, etc.) and provided to the television for presentation, such object information may also be received by such separate device and provided to the television (e.g., in an information channel separate from a video driver signal).
Note that although a portion of the previous discussion concerned analyzing on-screen pointing location and on-screen object location to identify a user-selected object, such analysis may also be similarly performed by analyzing on-frame pointing location and on-frame object location. In other words, such analysis may comprise performing any of a variety of coordinate transformations to perform such analysis in any of a variety of different respective coordinate domains.
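As a non-limiting illustration of such a coordinate transformation, the following Python sketch maps between the screen and frame coordinate domains under the simplifying assumption that the frame is scaled to fill the entire screen (no letterboxing or overscan):

```python
def screen_to_frame(sx, sy, screen_w, screen_h, frame_w, frame_h):
    """Map an on-screen pointing location to frame coordinates,
    assuming the frame is scaled to fill the screen."""
    return (sx * frame_w / screen_w, sy * frame_h / screen_h)

def frame_to_screen(fx, fy, screen_w, screen_h, frame_w, frame_h):
    """Inverse mapping: frame coordinates back to screen coordinates."""
    return (fx * screen_w / frame_w, fy * screen_h / frame_h)
```

With such transforms, the hit test may be carried out in either domain, e.g., by converting the pointing location into frame coordinates and comparing it against in-frame object regions.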
In general, step 240 may comprise identifying a user-selectable object in the presented television programming at which the user is pointing (e.g., based, at least in part, on the determined on-screen pointing location (e.g., as determined at step 230)). Accordingly, the scope of various aspects of the present invention should not arbitrarily be limited by any particular manner of performing such identifying unless explicitly claimed.
The exemplary method 200 may, for example at step 295, comprise performing continued operations. Step 295 may comprise performing any of a variety of continued operations; non-limiting examples of such continued operation(s) will be presented below. For example, step 295 may comprise returning execution flow to any of the previously discussed method steps. For example, step 295 may comprise returning execution flow of the exemplary method 200 to step 230 for determining additional on-screen pointing locations and corresponding user-selected objects in the television programming.
Also for example, step 295 may comprise generating a user output indicating the identified user-selectable object (e.g., as identified at step 240). For example, step 295 may comprise overlaying a graphical feature coinciding with the identified user-selectable object on the presented television programming. For example, as discussed above, a user-selectable object (and/or the user-selectable portion of a user-selectable object) may be defined by one or more geometric shapes. In such an exemplary scenario, step 295 may comprise highlighting such geometric shapes (or the borders thereof) when step 240 determines that the user has selected a user-selectable object associated with such geometric shapes. Also for example, step 295 may comprise presenting an outline of the identified object on the television screen, temporarily brightening or altering the color of the identified object, temporarily displaying a message on the screen as an indication of the identified object, etc. Step 295 may also, for example, comprise outputting an audio indication that a user-selected object has been identified.
Additionally for example, step 295 may comprise communicating information indicating the identified user-selectable object to a device external to the television (e.g., a user device at the same premises as the television and/or a device communicatively coupled to the television via a communication network). For example, step 295 may comprise communicating such information to a television remote control device (e.g., in a scenario where the television remote control device may provide the user an indication of the identified user-selectable object). In such an exemplary scenario, the television remote control device may comprise a video screen on which the television program may be displayed, and the identified user-selectable object may then be graphically indicated on such video screen (e.g., instead of being identified on a screen of the television and/or in addition to being identified on the screen of the television).
Further for example, step 295 may comprise processing information of an identified user-selected object (e.g., as determined at step 240) to determine an action to perform with regard to such selected object. Various non-limiting examples of such actions are provided in U.S. Provisional Application No. 61/242,234, which is hereby incorporated herein by reference in its entirety.
In general, step 295 may comprise performing continued operations (e.g., performing additional operations corresponding to a user-selected television programming object, repeating various method steps for additional user-selected objects, etc.). Accordingly, the scope of various aspects of the present invention should not be limited by characteristics of any particular type of continued processing unless explicitly claimed.
Turning next to FIG. 3, such figure is a flow diagram illustrating an exemplary method 300 for providing user-selection of objects in television programming, in accordance with various aspects of the present invention. The exemplary method 300 may, for example, share any or all characteristics with the exemplary method 200 illustrated in FIG. 2 and discussed previously. Any or all aspects of the exemplary method 300 may, for example, be implemented in a television (e.g., the first television 140 and/or second television 141).
The exemplary method 300 may, for example, begin executing at step 305. The exemplary method 300 may begin executing in response to any of a variety of causes or conditions. Step 305 may, for example, share any or all characteristics with step 205 of the exemplary method 200 illustrated in FIG. 2 and discussed previously.
The exemplary method 300 may, for example at step 310, comprise receiving television programming. Step 310 may, for example, share any or all characteristics with step 210 of the exemplary method 200 illustrated in FIG. 2 and discussed previously.
For example, step 310 may comprise, for example at sub-step 312, receiving a television program broadcast as such program is broadcast in real-time. Alternatively for example, step 310 may comprise, for example at sub-step 314, receiving a previously broadcast program from a user recording device (e.g., a PVR, VCR, etc.) in a time-shifted manner.
The exemplary method 300 may, for example at step 320, comprise presenting television programming (e.g., as received at step 310) to a user. Step 320 may, for example, share any or all characteristics with step 220 of the exemplary method 200 illustrated in FIG. 2 and discussed previously.
For example, step 320 may comprise, for example at sub-step 322, presenting the received television programming on a screen of a television (e.g., of a television implementing the exemplary method 300 or a portion thereof). Alternatively for example, step 320 may comprise, for example at sub-step 324, communicating received television programming to another user device for presentation to the user (e.g., to a display device different from the television, to a television remote control device with a display, to a user's handheld computer, etc.).
The exemplary method 300 may, for example at step 330, comprise determining on-screen pointing location pointed to by a user of the television. Step 330 may, for example, share any or all characteristics with step 230 of the exemplary method 200 illustrated in FIG. 2 and discussed previously.
For example, step 330 may comprise, for example at sub-step 332, analyzing sensor information (e.g., associated with sensors on-board and/or off-board the television) to determine user on-screen pointing location. Alternatively for example, step 330 may comprise, for example at sub-step 334, the television receiving information describing the on-screen pointing location from a device external to the television (e.g., a television receiver, a television controller, a television network device, etc.).
The exemplary method 300 may, for example at step 340, comprise identifying a user-selectable object in the presented television programming at which the user is pointing based, at least in part, on the determined on-screen pointing location (e.g., as determined at step 330). Step 340 may, for example, share any or all characteristics with step 240 of the exemplary method 200 illustrated in FIG. 2 and discussed previously.
For example, step 340 may comprise, for example at sub-step 342, determining the on-screen location of one or more user-selectable objects in the presented television programming. Also for example, step 340 may comprise, for example at sub-step 344, identifying a user-selected object by analyzing the respective on-screen locations of the one or more user-selectable objects and the determined on-screen pointing location (e.g., at a particular time instance and/or particular timeframe) to determine the object selected by the user.
The exemplary method 300 may, for example at step 395, comprise performing continued operations. Step 395 may, for example, share any or all characteristics with step 295 of the exemplary method 200 illustrated in FIG. 2 and discussed previously.
Turning next to FIG. 4, such figure is a diagram illustrating an exemplary television 400, in accordance with various aspects of the present invention. The exemplary television 400 may, for example, share any or all characteristics with the exemplary televisions 140, 141 illustrated in FIG. 1 and discussed previously. Also, the exemplary television 400 (e.g., various modules thereof) may operate to perform any or all of the functionality discussed previously with regard to the exemplary methods 200 and 300 illustrated in FIGS. 2-3 and discussed previously.
The exemplary television 400 includes a first communication interface module 410. The first communication interface module 410 may, for example, operate to communicate over any of a variety of communication media and utilizing any of a variety of communication protocols. For example, though the first communication interface module 410 is illustrated coupled to a wireless RF antenna via a wireless port 412, the wireless medium is merely illustrative and non-limiting. The first communication interface module 410 may, for example, operate to communicate with one or more communication networks (e.g., cable television networks, satellite television networks, telecommunication networks, the Internet, local area networks, personal area networks, metropolitan area networks, etc.) via which television video content (e.g., television programming) and/or other data is communicated. Also for example, the first communication interface module 410 may operate to communicate with local sources of television video content (e.g., video recorders, receivers, gaming devices, etc.). Additionally, for example, the first communication interface module 410 may operate to communicate with a television controller (e.g., directly or via one or more intermediate communication networks).
The exemplary television 400 includes a second communication interface module 420. The second communication interface module 420 may, for example, operate to communicate over any of a variety of communication media and utilizing any of a variety of communication protocols. For example, the second communication interface module 420 may communicate via a wireless RF communication port 422 and antenna, or may communicate via a non-tethered optical communication port 424 (e.g., utilizing laser diodes, photodiodes, etc.). Also for example, the second communication interface module 420 may communicate via a tethered optical communication port 426 (e.g., utilizing a fiber optic cable), or may communicate via a wired communication port 428 (e.g., utilizing coaxial cable, twisted pair, HDMI cable, Ethernet cable, any of a variety of wired component and/or composite video connections, etc.). The second communication interface module 420 may, for example, operate to communicate with one or more communication networks (e.g., cable television networks, satellite television networks, telecommunication networks, the Internet, local area networks, personal area networks, metropolitan area networks, etc.) via which television video content (e.g., television programming) and/or other data is communicated. Also for example, the second communication interface module 420 may operate to communicate with local sources of television video content (e.g., video recorders, receivers, gaming devices, etc.). Additionally, for example, the second communication interface module 420 may operate to communicate with a television controller (e.g., directly or via one or more intervening communication networks).
The exemplary television 400 may also comprise additional communication interface modules, which are not illustrated. Such additional communication interface modules may, for example, share any or all aspects with the first 410 and second 420 communication interface modules discussed above.
The exemplary television 400 may also comprise a communication module 430. The communication module 430 may, for example, operate to control and/or coordinate operation of the first communication interface module 410 and the second communication interface module 420 (and/or additional communication interface modules as needed). The communication module 430 may, for example, provide a convenient communication interface by which other components of the television 400 may utilize the first 410 and second 420 communication interface modules. Additionally, for example, in an exemplary scenario where a plurality of communication interface modules are sharing a medium and/or network, the communication module 430 may coordinate communications to reduce collisions and/or other interference between the communication interface modules.
The exemplary television 400 may additionally comprise one or more user interface modules 440. The user interface module 440 may generally operate to provide user interface functionality to a user of the television 400. For example, and without limitation, the user interface module 440 may operate to provide for user control of any or all standard television commands (e.g., channel control, volume control, on/off, screen settings, input selection, etc.). The user interface module 440 may, for example, operate and/or respond to user commands utilizing user interface features disposed on the television (e.g., buttons, etc.) and may also utilize the communication module 430 (and/or first 410 and second 420 communication interface modules) to communicate with a television controller (e.g., a dedicated television remote control, a universal remote control, a cellular telephone, personal computing device, gaming controller, etc.).
The user interface module 440 may also comprise one or more sensor modules that operate to interface with and/or control operation of any of a variety of sensors that may be utilized to ascertain an on-screen pointing location. For example and without limitation, the user interface module 440 (or sensor module(s) thereof) may operate to receive signals associated with respective sensors (e.g., raw or processed signals directly from the sensors, through intermediate devices, via the communication interface modules 410, 420, etc.). Also for example, in scenarios in which such sensors are active sensors (as opposed to purely passive sensors), the user interface module 440 (or sensor module(s) thereof) may operate to control the transmission of signals (e.g., RF signals, optical signals, acoustic signals, etc.) from such sensors. Additionally, the user interface module 440 may perform any of a variety of video output functions (e.g., presenting television programming to a user, providing visual feedback to a user regarding an identified user-selected object in the presented television programming, etc.).
The exemplary television 400 may comprise one or more processors 450. The processor 450 may, for example, comprise a general purpose processor, digital signal processor, application-specific processor, microcontroller, microprocessor, etc. For example, the processor 450 may operate in accordance with software (or firmware) instructions. As mentioned previously, any or all functionality discussed herein may be performed by a processor executing instructions. For example, though various modules are illustrated as separate blocks or modules in FIG. 4, such illustrative modules, or a portion thereof, may be implemented by the processor 450.
The exemplary television 400 may comprise one or more memories 460. As discussed above, various aspects may be performed by one or more processors executing instructions. Such instructions may, for example, be stored in the one or more memories 460. Such memory 460 may, for example, comprise characteristics of any of a variety of types of memory. For example and without limitation, such memory 460 may comprise one or more memory chips (e.g., ROM, RAM, EPROM, EEPROM, flash memory, one-time-programmable OTP memory, etc.), hard drive memory, CD memory, DVD memory, etc.
The exemplary television 400 may comprise one or more modules 452 that operate to perform and/or manage the receipt and/or presentation of television programming. For example, such one or more modules 452 may operate to utilize the communication module 430 (e.g., and at least one of the communication interface modules 410, 420) to receive television programming. For example, such one or more modules 452 may operate to perform step 210 of the exemplary method 200 discussed previously and/or step 310 of the exemplary method 300 discussed previously.
Also for example, such one or more modules 452 may operate to utilize the user interface module(s) 440 to present television programming to the user (e.g., via the video display 470 of the television). Additionally for example, such one or more modules 452 may operate to utilize the communication module 430 (e.g., and at least one of the communication interface modules 410, 420) to communicate television programming video output information to one or more devices communicatively coupled to the television 400 (e.g., via one or more of the communication interface modules 410, 420). For example, such one or more modules 452 may operate to perform step 220 of the exemplary method 200 discussed previously and/or step 320 of the exemplary method 300 discussed previously.
The exemplary television 400 may comprise one or more on-screen pointing location determination module(s) 454. Such on-screen pointing location determination module(s) 454 may, for example, operate to determine an on-screen pointing location pointed to by a user of the television. Such module(s) 454 may, for example, operate to perform step 230 of the exemplary method 200 and/or step 330 of the exemplary method 300 discussed previously. For example, the module(s) 454 may operate to analyze sensor information to determine an on-screen pointing location. Also for example, the module(s) 454 may operate to receive on-screen pointing location information from a device external to the television 400 (e.g., utilizing the communication module 430).
The exemplary television 400 may comprise one or more user-selected object identification modules 456. Such module(s) 456 may, for example, operate to identify a user-selectable object in presented television programming at which a user of the television 400 is pointing. For example, such module(s) 456 may operate to identify such user-selected object based, at least in part, on on-screen pointing location determined by the on-screen pointing location determination module(s) 454. Such module(s) 456 may, for example, operate to perform step 240 of the exemplary method 200 and/or step 340 of the exemplary method 300 discussed previously. For example, the module(s) 456 may operate to determine the on-screen location of one or more user-selectable objects in the presented television programming, and identify a user-selected object by analyzing the respective on-screen locations of the one or more user-selectable objects and the determined on-screen pointing location (e.g., at a particular time instance and/or particular timeframe) to determine the object selected by the user.
Though not illustrated, the exemplary television 400 may, for example, comprise one or more modules that operate to perform any or all of the continued processing discussed previously with regard to step 295 of the exemplary method 200 and step 395 of the exemplary method 300. The functionality of such modules (e.g., as with the one or more modules 452, 454 and 456) may be performed by the processor(s) 450 executing instructions stored in the memory 460.
Turning next to FIG. 5, such figure is a diagram illustrating exemplary modules and/or sub-modules for a television 500, in accordance with various aspects of the present invention. The exemplary television 500 may share any or all aspects with any of the televisions 140, 141 and 400 discussed herein and illustrated in FIGS. 1 and 4. For example, the exemplary television 500 (or various modules thereof) may operate to perform any or all functionality discussed herein with regard to the exemplary method 200 illustrated in FIG. 2 and the exemplary method 300 illustrated in FIG. 3. As with the exemplary television 400, the components of the exemplary television 500 may be disposed in a single television device (e.g., a console television, flat panel television, portable/mobile television device, etc.).
For example, the television 500 comprises a processor 530. Such a processor 530 may, for example, share any or all characteristics with the processor 450 discussed with regard to FIG. 4. Also for example, the television 500 comprises a memory 540. Such memory 540 may, for example, share any or all characteristics with the memory 460 discussed with regard to FIG. 4.
Also for example, the television 500 may comprise any of a variety of user interface module(s) 550. Such user interface module(s) 550 may, for example, share any or all characteristics with the user interface module(s) 440 discussed previously with regard to FIG. 4. For example and without limitation, the user interface module(s) 550 may comprise: a display device, a camera (for still or moving picture acquisition), a speaker, an earphone (e.g., wired or wireless), a microphone, a video screen (e.g., a touch screen), a vibrating mechanism, a keypad, and/or any of a variety of other user interface devices (e.g., a mouse, a trackball, a touch pad, touch screen, light pen, game controlling device, etc.).
The exemplary television 500 may also, for example, comprise any of a variety of communication modules (505, 506, and 510). Such communication module(s) may, for example, share any or all characteristics with the communication interface module(s) 410, 420 discussed previously with regard to FIG. 4. For example and without limitation, the communication interface module(s) 510 may comprise: a Bluetooth interface module; an IEEE 802.11, 802.15, 802.16 and/or 802.20 module; any of a variety of cellular telecommunication interface modules (e.g., GSM/GPRS/EDGE, CDMA/CDMA2000/1x-EV-DO, WCDMA/HSDPA/HSUPA, TDMA/PDC, WiMAX, etc.); any of a variety of position-related communication interface modules (e.g., GPS, A-GPS, etc.); any of a variety of wired/tethered communication interface modules (e.g., USB, FireWire, RS-232, HDMI, Ethernet, wireline and/or cable modem, etc.); any of a variety of communication interface modules related to communicating with external memory devices; etc. The exemplary television 500 is also illustrated as comprising various wired 506 and/or wireless 505 front-end modules that may, for example, be included in the communication interface modules and/or utilized thereby.
The exemplary television 500 may also comprise any of a variety of signal processing module(s) 590. Such signal processing module(s) 590 may share any or all characteristics with modules of the exemplary television 400 that perform signal processing. Such signal processing module(s) 590 may, for example, be utilized to assist in processing various types of information discussed previously (e.g., with regard to sensor processing, position determination, video processing, image processing, audio processing, general user interface information data processing, etc.). For example and without limitation, the signal processing module(s) 590 may comprise: video/graphics processing modules (e.g., MPEG-2, MPEG-4, H.263, H.264, JPEG, TIFF, 3-D, 2-D, MDDI, etc.); audio processing modules (e.g., MP3, AAC, MIDI, QCELP, AMR, CMX, etc.); and/or tactile processing modules (e.g., keypad I/O, touch screen processing, motor control, etc.).
In summary, various aspects of the present invention provide a system and method in a television for providing user-selection of objects in a television program. While the invention has been described with reference to certain aspects and embodiments, it will be understood by those skilled in the art that various changes may be made and equivalents may be substituted without departing from the scope of the invention. In addition, many modifications may be made to adapt a particular situation or material to the teachings of the invention without departing from its scope. Therefore, it is intended that the invention not be limited to the particular embodiment disclosed, but that the invention will include all embodiments falling within the scope of the appended claims.