CROSS-REFERENCE TO RELATED APPLICATIONS/INCORPORATION BY REFERENCE

This patent application is related to and claims priority from provisional patent application Ser. No. 61/242,234, filed Sep. 14, 2009, and titled “TELEVISION SYSTEM,” the contents of which are hereby incorporated herein by reference in their entirety. This patent application is also related to U.S. patent application Ser. No. ______, filed concurrently herewith, titled “SYSTEM AND METHOD FOR GENERATING SCREEN POINTING INFORMATION IN A TELEVISION”, Attorney Docket No. 21034US02; and U.S. patent application Ser. No. ______, filed concurrently herewith, titled “SYSTEM AND METHOD FOR GENERATING SCREEN POINTING INFORMATION IN A TELEVISION CONTROL DEVICE”, Attorney Docket No. 21036US02. The contents of each of the above-mentioned applications are hereby incorporated herein by reference in their entirety.
FEDERALLY SPONSORED RESEARCH OR DEVELOPMENT

[Not Applicable]

SEQUENCE LISTING

[Not Applicable]

MICROFICHE/COPYRIGHT REFERENCE

[Not Applicable]
BACKGROUND OF THE INVENTION

Present television receivers are incapable of providing pointing information to television program viewers. Further limitations and disadvantages of conventional and traditional approaches will become apparent to one of skill in the art, through comparison of such systems with the present invention as set forth in the remainder of the present application with reference to the drawings.
BRIEF SUMMARY OF THE INVENTION

Various aspects of the present invention provide a system and method, in a television receiver, for generating screen pointing information, substantially as shown in and/or described in connection with at least one of the figures, as set forth more completely in the claims. These and other advantages, aspects and novel features of the present invention, as well as details of illustrative aspects thereof, will be more fully understood from the following description and drawings.
BRIEF DESCRIPTION OF SEVERAL VIEWS OF THE DRAWINGS

FIG. 1 is a diagram illustrating an exemplary television system in accordance with various aspects of the present invention.
FIG. 2 is a diagram illustrating an exemplary television receiver in accordance with various aspects of the present invention.
FIG. 3 is a diagram illustrating an exemplary television system with on-screen television sensors in accordance with various aspects of the present invention.
FIG. 4 is a diagram illustrating an exemplary television system with off-screen television sensors in accordance with various aspects of the present invention.
FIG. 5 is a diagram illustrating an exemplary television system with off-television sensors in accordance with various aspects of the present invention.
FIG. 6 is a diagram illustrating an exemplary television system with television receiver sensors in accordance with various aspects of the present invention.
FIG. 7 is a diagram illustrating an exemplary television system with television controller sensors in accordance with various aspects of the present invention.
FIG. 8 is a diagram illustrating an exemplary television receiver in accordance with various aspects of the present invention.
FIG. 9 is a flow diagram illustrating the generation of on-screen pointing information in accordance with various aspects of the present invention.
FIG. 10 is a flow diagram illustrating the generation of on-screen pointing information in accordance with various aspects of the present invention.
DETAILED DESCRIPTION OF VARIOUS ASPECTS OF THE INVENTION

The following discussion will refer to various communication modules, components or circuits. Such modules, components or circuits may generally comprise hardware and/or a combination of hardware and software (e.g., including firmware). Such modules may also, for example, comprise a computer readable medium (e.g., a non-transitory medium) comprising instructions (e.g., software instructions) that, when executed by a processor, cause the processor to perform various functional aspects of the present invention. Accordingly, the scope of various aspects of the present invention should not be limited by characteristics of particular hardware and/or software implementations of a module, component or circuit unless explicitly claimed as such. For example and without limitation, various aspects of the present invention may be implemented by one or more processors (e.g., a microprocessor, digital signal processor, baseband processor, microcontroller, etc.) executing software instructions (e.g., stored in volatile and/or non-volatile memory). Also for example, various aspects of the present invention may be implemented by an application-specific integrated circuit (“ASIC”) and/or other hardware components.
Additionally, the following discussion will refer to various television system modules (e.g., television receiver modules). It should be noted that the following discussion of such various modules is segmented into such modules for the sake of illustrative clarity. However, in actual implementation, the boundaries between various modules may be blurred. For example, any or all of the functional modules discussed herein may share various hardware and/or software components. For example, any or all of the functional modules discussed herein may be implemented wholly or in-part by a shared processor executing software instructions. Additionally, various software sub-modules that may be executed by one or more processors may be shared between various software modules. Accordingly, the scope of various aspects of the present invention should not be limited by arbitrary boundaries between various hardware and/or software components, unless explicitly claimed.
The following discussion may also refer to communication networks and various aspects thereof. For the following discussion, a communication network is generally the communication infrastructure through which a communication device (e.g., a portable communication device, television, television controller, television provider, television programming provider, television receiver, video recording device, etc.) may communicate with other systems. For example and without limitation, a communication network may comprise a cable and/or satellite television communication network, a cellular communication network, a wireless metropolitan area network (WMAN), a wireless local area network (WLAN), a wireless personal area network (WPAN), any home or premises communication network, etc. A particular communication network may, for example, generally have a corresponding communication protocol according to which a communication device may communicate with the communication network. Unless so claimed, the scope of various aspects of the present invention should not be limited by characteristics of a particular type of communication network.
The following discussion will at times refer to an on-screen pointing location. Such a pointing location refers to a location on the television screen to which a user (either directly or with a pointing device) is pointing. Such a pointing location is to be distinguished from other types of on-screen location identification, such as, for example, using arrow keys and/or a mouse to move a cursor or to traverse blocks (e.g., on an on-screen program guide) without pointing.
Additionally, the following discussion will at times refer to television programming. Such television programming generally includes various types of television programming (e.g., television programs, news programs, sports programs, music television, movies, television series programs and/or associated advertisements, educational programs, live or recorded, broadcast/multicast/unicast, etc.). Such television programming video content is to be distinguished from other non-programming video content that may be displayed on a television screen (e.g., an electronic program guide, user interface menu, a television set-up menu, a typical web page, a document, a graphical video game, etc.). Various aspects of the present invention may, for example, comprise determining an on-screen pointing location during the presentation of television programming on the screen of the television.
Turning first to FIG. 1, such figure is a diagram illustrating a non-limiting exemplary television system 100 in accordance with various aspects of the present invention. The exemplary system 100 includes a television provider 110. The television provider 110 may, for example, comprise a television network company, a cable company, a movie-providing company, a news company, an educational institution, etc. The television provider 110 may, for example, be an original source of television programming (or related information). Also for example, the television provider 110 may be a communication company that provides programming distribution services (e.g., a cable television company, a satellite television company, a telecommunication company, a data network provider, etc.). The television provider 110 may, for example, provide programming and non-programming information and/or video content. The television provider 110 may, for example, provide information related to a television program (e.g., information describing or otherwise related to selectable objects in programming, etc.).
The exemplary television system 100 may also include a third party program information provider 120. Such a provider may, for example, provide information related to a television program. Such information may, for example, comprise information describing selectable objects in programming, program guide information, etc.
The exemplary television system 100 may include one or more communication networks (e.g., the communication network(s) 130). The exemplary communication network 130 may comprise characteristics of any of a variety of types of communication networks over which video content and/or information related to video content may be communicated. For example and without limitation, the communication network 130 may comprise characteristics of a cable television network, a satellite television network, a telecommunication network, the Internet, a local area network (LAN), a personal area network (PAN), a metropolitan area network (MAN), any of a variety of different types of home networks, etc.
The exemplary television system 100 may include a first television 140. Such a first television 140 may, for example, comprise networking capability enabling such television 140 to communicate directly with the communication network 130. For example, the first television 140 may comprise one or more embedded television receivers or transceivers (e.g., a cable television receiver, satellite television transceiver, Internet modem, etc.). Also for example, the first television 140 may comprise one or more recording devices (e.g., for recording and/or playing back video content, television programming, etc.).
The exemplary television system 100 may include a first television controller 160. Such a first television controller 160 may, for example, operate to control operation of the first television 140. The first television controller 160 may comprise characteristics of any of a variety of television controlling devices. For example and without limitation, the first television controller 160 may comprise characteristics of a dedicated television control device, a universal remote control, a cellular telephone or personal computing device with television control capability, etc.
The first television controller 160 may, for example, transmit signals directly to the first television 140 to control operation of the first television 140. The first television controller 160 may also, for example, operate to transmit signals (e.g., via the communication network 130) to the television provider 110 to control video content being provided to the first television 140, or to conduct other transactions (e.g., business transactions, etc.).
As will be discussed in more detail later, various aspects of the present invention include a user pointing to a location on a television screen (e.g., pointing to an object or person presented in television programming). In such a scenario, the user may perform such pointing in any of a variety of manners. One of such exemplary manners includes pointing with a television control device. The first television controller 160 provides a non-limiting example of a device that a user may utilize to point to an on-screen location.
The exemplary television system 100 may also include a television receiver 150. The television receiver may, for example, operate to provide a communication link between a television and/or television controller and a communication network and/or information provider. For example, the television receiver 150 may operate to provide a communication link between the second television 141 and the communication network 130, or between the second television 141 and the television provider 110 (and/or third party program information provider 120) via the communication network 130.
The television receiver 150 may comprise characteristics of any of a variety of types of television receivers. For example and without limitation, the television receiver 150 may comprise characteristics of a cable television receiver, a satellite television receiver, etc. Also for example, the television receiver 150 may comprise a data communication network modem for data network communications (e.g., with the Internet, a LAN, PAN, MAN, telecommunication network, etc.). The television receiver 150 may also, for example, comprise recording capability (e.g., programming recording and playback, etc.). The following discussion of FIGS. 2-10 will present various non-limiting illustrative aspects of such a television receiver 150.
The exemplary television system 100 may include a second television controller 161. Such a second television controller 161 may, for example, operate to control operation of the second television 141 and the television receiver 150. The second television controller 161 may comprise characteristics of any of a variety of television controlling devices. For example and without limitation, the second television controller 161 may comprise characteristics of a dedicated television control device, a dedicated television receiver control device, a universal remote control, a cellular telephone or personal computing device with television control capability, etc.
The second television controller 161 may, for example, transmit signals directly to the second television 141 to control operation of the second television 141. The second television controller 161 may, for example, transmit signals directly to the television receiver 150 to control operation of the television receiver 150. The second television controller 161 may additionally, for example, operate to transmit signals (e.g., via the television receiver 150 and the communication network 130) to the television provider 110 to control video content being provided to the television receiver 150, or to conduct other transactions (e.g., business transactions, etc.).
As will be discussed in more detail later, various aspects of the present invention include a user pointing to a location on a television screen (e.g., pointing to an object or person presented in television programming). In such a scenario, the user may perform such pointing in any of a variety of manners. One of such exemplary manners includes pointing with a television control device. The second television controller 161 provides one non-limiting example of a device that a user may utilize to point to an on-screen location.
The exemplary television system 100 was provided as a non-limiting illustrative foundation for the discussion of various aspects of the present invention. Thus, the scope of various aspects of the present invention should not be limited by any characteristics of the exemplary television system 100 unless explicitly claimed.
Turning next to FIG. 2, such figure is a diagram illustrating an exemplary television receiver 200 in accordance with various aspects of the present invention. The exemplary television receiver 200 may, for example, share any or all characteristics with the exemplary television receiver 150 illustrated in FIG. 1 and discussed previously and/or with any of the exemplary television receivers discussed herein.
The exemplary television receiver 200 includes a first communication interface module 210. The first communication interface module 210 may, for example, operate to communicate over any of a variety of communication media and utilizing any of a variety of communication protocols. For example, though the first communication interface module 210 is illustrated coupled to a wireless RF antenna via a wireless port 212, the wireless medium is merely illustrative and non-limiting. The first communication interface module 210 may, for example, operate to communicate with one or more communication networks (e.g., cable television networks, satellite television networks, telecommunication networks, the Internet, local area networks, personal area networks, metropolitan area networks, etc.) via which television video content (e.g., television programming) and/or other data is communicated. Also for example, the first communication module 210 may operate to communicate with local sources of television video content (e.g., video recorders, receivers, gaming devices, etc.). Additionally, for example, the first communication module 210 may operate to communicate with a television controller (e.g., directly or via one or more intermediate communication networks). Further for example, the first communication module 210 may operate to communicate with a television utilizing any of a variety of television communication connections and/or protocols (e.g., composite video, component video, HDMI, etc.).
The exemplary television receiver 200 includes a second communication interface module 220. The second communication interface module 220 may, for example, operate to communicate over any of a variety of communication media and utilizing any of a variety of communication protocols. For example, the second communication interface module 220 may communicate via a wireless RF communication port 222 and antenna, or may communicate via a non-tethered optical communication port 224 (e.g., utilizing laser diodes, photodiodes, etc.). Also for example, the second communication interface module 220 may communicate via a tethered optical communication port 226 (e.g., utilizing a fiber optic cable), or may communicate via a wired communication port 228 (e.g., utilizing coaxial cable, twisted pair, HDMI cable, Ethernet cable, any of a variety of wired component and/or composite video connections, etc.). The second communication interface module 220 may, for example, operate to communicate with one or more communication networks (e.g., cable television networks, satellite television networks, telecommunication networks, the Internet, local area networks, personal area networks, metropolitan area networks, etc.) via which television video content and/or other data is communicated. Also for example, the second communication module 220 may operate to communicate with local sources of television video content (e.g., video recorders, other receivers, gaming devices, etc.). Additionally, for example, the second communication module 220 may operate to communicate with a television controller (e.g., directly or via one or more intervening communication networks). Further for example, the second communication module 220 may operate to communicate with a television utilizing any of a variety of television communication connections and/or protocols (e.g., composite video, component video, HDMI, etc.).
The exemplary television receiver 200 may also comprise additional communication interface modules, which are not illustrated. Such additional communication interface modules may, for example, share any or all aspects with the first 210 and second 220 communication interface modules discussed above.
The exemplary television receiver 200 may also comprise a communication module 230. The communication module 230 may, for example, operate to control and/or coordinate operation of the first communication interface module 210 and the second communication interface module 220 (and/or additional communication interface modules as needed). The communication module 230 may, for example, provide a convenient communication interface by which other components of the television receiver 200 may utilize the first 210 and second 220 communication interface modules. Additionally, for example, in an exemplary scenario where a plurality of communication interface modules are sharing a medium and/or network, the communication module 230 may coordinate communications to reduce collisions and/or other interference between the communication interface modules 210, 220.
The exemplary television receiver 200 may comprise one or more television interface modules 235. The television interface module 235 may, for example, operate to manage communications between the television receiver 200 and one or more televisions that are communicatively coupled thereto (e.g., via the first 210 and/or second 220 communication interface modules). For example, the television interface module 235 may operate to communicate general television programming video information to a television (e.g., while the television receiver 200 is operating to determine an on-screen pointing location).
Also, for example, as will be discussed in more detail later, the television interface module 235 may output a signal to the television or television controller or other device with a display, where such signal comprises characteristics adapted to cause the television (or other device) to output a visual indication of on-screen pointing location. Such an indication may, for example, be communicated with (e.g., as a part of) television programming being communicated to the television (or other device), or such an indication may be communicated to the television (or other device) independent of television programming.
The exemplary television receiver 200 may additionally comprise one or more user interface modules 240. The user interface module 240 may generally operate to provide user interface functionality to a user of the television receiver 200. For example, and without limitation, the user interface module 240 may operate to provide for user control of any or all standard television receiver commands (e.g., channel control, on/off, television output settings, input selection, etc.). The user interface module 240 may, for example, operate and/or respond to user commands utilizing user interface features disposed on the television receiver (e.g., buttons, touch screen, microphone, etc.) and may also utilize the communication module 230 (and/or first 210 and second 220 communication interface modules) to communicate with a television controller (e.g., a dedicated television remote control, a universal remote control, a cellular telephone, personal computing device, gaming controller, etc.) or a television. For example, various user interface features of the television receiver 200 may comprise utilization of the television (e.g., utilizing the television screen for menu-driven or other GUI associated with television receiver operation).
The user interface module 240 may also operate to interface with and/or control operation of any of a variety of sensors that may be utilized to ascertain an on-screen pointing location. Non-limiting examples of such sensors will be provided later (e.g., in the discussion of FIGS. 3-7 and elsewhere herein). For example and without limitation, the user interface module 240 may operate to receive signals associated with respective sensors (e.g., raw or processed signals directly from the sensors, through intermediate devices (e.g., a television, television controller, surround sound system, etc.), via the communication interface modules 210, 220, etc.). Also for example, in scenarios in which such sensors are active sensors (as opposed to purely passive sensors), the user interface module 240 may operate to control the transmission of signals (e.g., RF signals, optical signals, acoustic signals, etc.) from such sensors.
The exemplary television receiver 200 may comprise one or more processors 250. The processor 250 may, for example, comprise a general purpose processor, digital signal processor, application-specific processor, microcontroller, microprocessor, etc. For example, the processor 250 may operate in accordance with software (or firmware) instructions. As mentioned previously, any or all functionality discussed herein may be performed by a processor executing instructions. For example, though various modules are illustrated as separate blocks or modules in FIG. 2 for illustrative clarity, such illustrative modules, or a portion thereof, may be implemented by the processor 250.
The exemplary television receiver 200 may comprise one or more memories 260. As discussed above, various aspects may be performed by one or more processors executing instructions. Such instructions may, for example, be stored in the one or more memories 260. Such memory 260 may, for example, comprise characteristics of any of a variety of types of memory. For example and without limitation, such memory 260 may comprise one or more memory chips (e.g., ROM, RAM, EPROM, EEPROM, flash memory, one-time-programmable (OTP) memory, etc.), hard drive memory, CD memory, DVD memory, etc.
The exemplary television receiver 200 may also comprise one or more calibration modules 251 that operate to perform various calibration activities. Examples of such calibration activities will be provided later in this discussion. Briefly, such calibration activities may, for example, comprise interacting with a user and/or user pointing device to determine sensor signals under known circumstances (e.g., determine sensor signals in response to known screen pointing circumstances), and processing such sensor signals to develop algorithms (e.g., transformation matrices, static positional equations, etc.) to determine screen pointing location based on sensor signals received during normal operation. As will also be discussed later, such calibration may also be utilized to establish signal gain (or energy) patterns utilized in determining pointing location.
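For purposes of illustration only, the development of a transformation matrix from sensor signals observed at known screen pointing locations may be sketched as a least-squares fit. The following Python sketch is non-limiting; the two-value sensor reading format, the affine model, and the function names are assumptions for illustration, not features of any claimed implementation.

```python
import numpy as np

def fit_calibration(sensor_readings, known_points):
    """Fit an affine transformation mapping 2-D sensor readings to known
    on-screen calibration points via least squares (illustrative only)."""
    readings = np.asarray(sensor_readings, dtype=float)
    targets = np.asarray(known_points, dtype=float)
    # Augment readings with a constant column so the fit includes an offset.
    design = np.hstack([readings, np.ones((len(readings), 1))])
    transform, *_ = np.linalg.lstsq(design, targets, rcond=None)
    return transform  # shape (3, 2): x-weight, y-weight, offset rows

def apply_calibration(transform, reading):
    """Map a raw sensor reading to an estimated screen coordinate."""
    x, y = reading
    return np.array([x, y, 1.0]) @ transform
```

During normal operation, `apply_calibration` would be invoked with each new sensor reading to produce an estimated pointing coordinate; a deployed system could use richer sensor vectors or nonlinear models.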
The exemplary television receiver 200 may comprise one or more location-determining modules 252. For example, as will be discussed later, various on-screen pointing location determinations may comprise processing location information. As a non-limiting example, knowing the location of a user (e.g., including the location of a pointing device being utilized by the user) may simplify the solution of various pointing direction determinations. For example, knowing exactly where a pointing device is located (e.g., in three-dimensional space) or where a pointing device is located along a line (e.g., knowing device location in two-dimensional space or land surface coordinates) relative to the television screen (and/or relative to the television receiver) may remove a number of unknown variables from applicable positional equations. Note that such positional information may, in various exemplary scenarios, also comprise orientation information for a pointing device (e.g., yaw, pitch and/or roll). Such orientation information may be determined in various manners (e.g., through gyroscopic means, sensor alignment with known references, etc.).
The location-determining module 252 may operate to determine user (or pointing device) location in any of a variety of manners. For example and without limitation, the location-determining module 252 may operate to receive location information from the pointing device (e.g., via one of the communication interface modules 210, 220). For example, such a pointing device may comprise positioning system capability (e.g., global positioning system, assisted GPS, cellular or other triangulation systems, etc.) and communicate information describing the position of the pointing device to the television receiver 200.
Also for example, the location-determining module 252 may (e.g., via the user interface modules 240) utilize sensor signals to determine the position (which may include orientation) of the pointing device (or user thereof). For example, a signal from a pointing device may arrive at different sensors at different times (or at different phases). Such temporal or phase differences may be processed to determine the location of the pointing device relative to the known location of such sensors. Further for example, the location-determining module 252 may operate to communicate pointing device location information with an external system that operates to determine the location of the pointing device. Such an external system may, for example, comprise a cellular telephony triangulation system, a home or premises-based triangulation system, a global positioning system, an assisted global positioning system, etc.
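The temporal-difference processing mentioned above may be illustrated, as a non-limiting sketch, by a linearized least-squares trilateration that converts signal arrival times at sensors of known position into an estimated two-dimensional device position. The sensor geometry, the known emission time, and the propagation speed used here are illustrative assumptions; a time-difference-only formulation (with unknown emission time) would proceed similarly with one additional unknown.

```python
import numpy as np

SPEED_OF_LIGHT = 3.0e8  # m/s; an acoustic variant would use the speed of sound

def locate_pointer(sensor_positions, arrival_times, emit_time, c=SPEED_OF_LIGHT):
    """Estimate a 2-D emitter position from signal arrival times at
    sensors with known positions (illustrative linearized least squares).
    A deployed system would also handle noise and clock offsets."""
    p = np.asarray(sensor_positions, dtype=float)
    # Convert arrival times to propagation distances.
    d = c * (np.asarray(arrival_times, dtype=float) - emit_time)
    # Subtract the first range equation from the rest to eliminate the
    # quadratic term in the unknown position, leaving a linear system.
    a = 2.0 * (p[1:] - p[0])
    b = (np.sum(p[1:] ** 2, axis=1) - np.sum(p[0] ** 2)) - (d[1:] ** 2 - d[0] ** 2)
    pos, *_ = np.linalg.lstsq(a, b, rcond=None)
    return pos
```

With three or more non-collinear sensors, the linear system is (over)determined and the least-squares solution recovers the emitter position exactly for noise-free measurements.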
The exemplary television receiver 200 may also comprise one or more sensor processing module(s) 253. As will be explained below, the sensor processing module 253 may operate to receive sensor information (e.g., from the user interface module(s) 240, from the television interface module 235, etc.) and process such received sensor information to determine a location on the television screen to which a user is pointing. Various examples of such processing will be provided below. Briefly, such processing may, for example, comprise selecting a sensor with the strongest signal, interpolating between a plurality of sensors, interpolating between a plurality of sensors having strongest signals, determining gain (or energy) pattern intersections, etc. Various aspects of the present invention comprise, for example, determining on-screen pointing location during presentation of television programming (e.g., programming received from a television broadcaster, video recording device, etc.).
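The strongest-signal selection and interpolation strategies mentioned above may be sketched, purely for illustration, as follows. The sensor-position array and the strength-weighted centroid interpolation are assumptions for this sketch rather than a claimed implementation.

```python
import numpy as np

def strongest_sensor(positions, strengths):
    """Return the screen position of the sensor reporting the strongest
    signal (the simplest pointing-location estimate)."""
    positions = np.asarray(positions, dtype=float)
    return positions[int(np.argmax(strengths))]

def interpolated_location(positions, strengths, k=4):
    """Estimate the pointing location as the strength-weighted centroid
    of the k sensors with the strongest signals (illustrative sketch)."""
    positions = np.asarray(positions, dtype=float)
    strengths = np.asarray(strengths, dtype=float)
    top = np.argsort(strengths)[-k:]               # indices of the k strongest
    weights = strengths[top] / strengths[top].sum()  # normalized weights
    return weights @ positions[top]
```

Interpolating over several sensors can yield an estimate finer than the sensor spacing, which may matter when sensors are associated with blocks of pixels rather than individual pixels.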
Various aspects of the present invention will now be illustrated by way of non-limiting example. Throughout the following discussion, reference will continue to be made to the various modules of the television receiver 200 illustrated in FIG. 2. It should be noted that the following non-limiting examples provide specific examples of various aspects, and as such, the scope of various aspects of the present invention should not be limited by characteristics of any of the specific examples, unless specifically claimed.
FIG. 3 is a diagram illustrating an exemplary television system 300 with on-screen television sensors in accordance with various aspects of the present invention. The television system 300 includes a television 301 comprising a television screen 303. The television system 300 also includes a television controller 320 (or other pointing device) pointing to an on-screen pointing location 330 along a line 325 between the television controller 320 and the on-screen pointing location 330.
The television system 300 also comprises a television receiver 350 that is communicatively coupled to the television 301 via a communication link 351 (e.g., a two-way communication link providing video information to the television 301 and/or receiving sensor information from the television 301). The television receiver 350 may share any or all aspects with the exemplary receivers 150, 200 discussed previously and all other receivers discussed herein. Accordingly, various aspects of the television receiver 350 will be explained herein with reference to various components of the exemplary television receiver 200 illustrated in FIG. 2. The exemplary television receiver 350 is also communicatively coupled to the television controller 320 via a communication link 352.
The exemplary television screen 303 comprises an array of sensors integrated into the television screen 303. One of such sensors is labeled sensor 310. Any of a variety of sensor types may be utilized, non-limiting examples of which include light sensors or photo detectors (e.g., photo diodes) and RF sensors (e.g., antenna elements or loops).
The array of sensors may be integrated in the television screen 303 in any of a variety of manners, non-limiting examples of which will now be provided. For example, the television screen 303 may comprise an array of liquid crystal display (LCD) pixels for presenting video media to a user. An array of photo diodes and/or antenna elements may be integrated between or behind LCD pixels. For example, every LCD pixel may be associated with a corresponding photo diode and/or antenna element, or every N×M block of LCD pixels may be associated with a corresponding photo diode or antenna element.
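The association of every N×M block of LCD pixels with a corresponding sensor may be illustrated by the following non-limiting index arithmetic. The row-major sensor numbering and the particular block dimensions are assumptions made for this sketch.

```python
def sensor_index_for_pixel(px, py, block_w, block_h, sensors_per_row):
    """Map a screen pixel (px, py) to the index of the sensor associated
    with its block_w x block_h pixel block (row-major numbering assumed)."""
    return (py // block_h) * sensors_per_row + (px // block_w)

def sensor_center_pixel(index, block_w, block_h, sensors_per_row):
    """Return the approximate screen pixel at the center of the pixel
    block associated with a given sensor index (inverse mapping)."""
    row, col = divmod(index, sensors_per_row)
    return (col * block_w + block_w // 2, row * block_h + block_h // 2)
```

For example, with hypothetical 8×8 pixel blocks on a 1920-pixel-wide screen (240 sensors per row), a detection at a given sensor index resolves the pointing location to the center of that sensor's pixel block.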
As a non-limiting example, an array of photo diodes and/or RF antenna elements may be formed into a substrate beneath or behind transparent LCD substrates. As another example, a photo diode array and/or antenna element array may be interposed between or behind an array of LCD thin film transistors. Also for example, an array of photo diodes and/or RF antenna elements (or other sensors) may be incorporated into a transparent screen overlay. Note that in such an implementation, such a transparent screen overlay may be installed after-market. For example, a user that has a television receiver 350 with the capability to determine on-screen pointing location may install the transparent screen overlay. In such an exemplary scenario, there may be one or more communication links established between the television receiver 350 and the sensors in the overlay, where such communication links may be independent of a communication link over which non-sensor information (e.g., video and/or control information) is communicated between the television 301 and the television receiver 350. Such communication link may, for example, be adapted to communicate information from each sensor to the television receiver 350 serially (e.g., in a time-multiplexed manner) and/or in parallel.
In a photo detector implementation, passive photo detectors may receive varying amounts of respective light energy depending on the pointing direction of a light source aimed at the screen. Also for example, received signals (e.g., pulsed signals) may arrive at different sensors at different respective times/phases (e.g., being indicative of relative position and/or pointing direction, which may also be utilized in a pointing determination). In such a photo detector implementation (e.g., utilizing photo diodes), photo detectors may, for example, be tuned to react to particular light frequencies to reduce interference from output pixel light and/or associated reflections, ambient light, etc. As a non-limiting example, photo diodes may be tuned to detect light that is not visible to the human eye, visible light frequencies that are relatively rare, light patterns that are unlikely to occur in a television program (e.g., particular pulse codes), etc.
In an antenna element implementation, an array of antenna elements may be formed on a substrate and placed behind light producing and/or filtering elements in the LCD screen (e.g., so as to avoid interfering with emitted light) or may be formed on a transparent substrate within or in front of the lighted region of the LCD display (e.g., utilizing microscopic antenna elements that are too small to significantly interfere with light emitted from the display). As discussed above, such an implementation may be integrated with the television screen 303, but may also be added as an overlay (e.g., as a production option or an after-market user or technician installation).
In an RF antenna implementation, passive antennas (or elements of an overall antenna matrix) may receive varying respective amounts of RF energy depending on the pointing direction of a directional RF source aimed at the screen. Also for example, received signals (e.g., pulsed signals) may arrive at different antennas at different respective times/phases (e.g., being indicative of relative position and/or pointing direction, which may also be utilized in a pointing determination).
In an exemplary scenario, a user may point a pointing device (e.g., a remote controller, a laser pointer, directional RF transmitter, specifically designed eyewear, a mobile computing device, a mobile communication device, a gesture tracking device or glove, etc.) at the television screen 303, where the pointing device directs transmitted energy (e.g., light energy, RF energy, acoustic energy, etc.) at a particular location on the television screen 303 to which the pointing device is being pointed. Note that such transmitted energy will likely be transmitted directionally and be associated with an intensity or gain pattern with the highest intensity likely at the center of the pattern (i.e., along the pointing line 325) and decreasing as a function of angle from the center of the pattern (or distance on the screen from the on-screen pointing location).
In such an exemplary scenario, each sensor of the array of sensors integrated into the screen 303 will likely receive some respective amount of energy. For example, the sensor nearest the screen pointing location 330 (i.e., along the pointing line 325) will likely receive the highest amount of energy, sensors adjacent to the screen pointing location 330 will likely receive a next highest range of energy, and sensors farther from the pointing location 330 will likely receive progressively smaller amounts of energy from the pointing device 320, as a function of distance from the pointing location 330, until such energy is lost in the noise floor.
In such an exemplary scenario, the television receiver 350 (e.g., the user interface module 240 of the television receiver 200 illustrated in FIG. 2) may receive signals indicative of the energy received by the sensors of the sensor array. The television receiver 350 may receive such signals in various manners, depending on the degree of integration of such sensors into the television 301. For example, in an exemplary scenario where the sensors are fully integrated into the television screen 303 and operationally integrated into the television 301, the television receiver 350 may receive such signals via a communication interface between the television receiver 350 and the television 301. Also for example, in another exemplary scenario where the sensors are overlaid on the television screen 303, and where operation of such sensors is independent of the television 301, the television receiver 350 may receive such signals via a communication link directly between the television receiver 350 and the sensors, where such a communication link may be independent of other communication links between the television receiver 350 and the television 301. Such communication link may, for example, be adapted to communicate information from each sensor to the television receiver 350 serially (e.g., in a time-multiplexed manner) and/or in parallel.
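By way of non-limiting illustration only, such serial (time-multiplexed) communication of sensor information may be sketched as follows. The `SensorLink` class, its `read_value` call, and the energy values are hypothetical stand-ins for whatever physical link is actually utilized; no particular interface is implied by the description above.

```python
# Hypothetical sketch of a time-multiplexed sensor link: the receiver visits
# each sensor in turn, one time slot per sensor. The interface and values
# below are illustrative assumptions, not an actual television interface.

class SensorLink:
    """Simulated time-multiplexed link to an array of screen sensors."""
    def __init__(self, energies):
        self._energies = energies  # energy currently observed by each sensor

    def read_value(self, sensor_id):
        # One time slot per sensor: each read returns that sensor's energy.
        return self._energies[sensor_id]

def poll_all(link, num_sensors):
    # Serial access: read the sensors one after another.
    return [link.read_value(i) for i in range(num_sensors)]

link = SensorLink([0.1, 0.9, 0.4])
print(poll_all(link, 3))  # [0.1, 0.9, 0.4]
```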
The user interface module 240 may then, for example, provide information of such received sensor signals to the sensor processing module 253 for processing. The sensor processing module 253 may then, for example, operate to process such information to determine the screen pointing location. The sensor processing module 253 may perform such processing in any of a variety of manners, non-limiting examples of which will be provided below.
For example, the sensor processing module 253 may operate to select the sensor with the highest received energy and determine that the location of such selected sensor is the on-screen pointing location. For example, in an exemplary scenario where the spatial resolution of screen-integrated sensors is relatively fine, such operation may reliably yield a desired level of accuracy without undue processing overhead.
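By way of non-limiting illustration only, this highest-energy selection may be sketched as follows; the sensor coordinates, energies, and function name are hypothetical:

```python
# Sketch of the simplest estimate described above: pick the sensor with the
# highest received energy and report its screen coordinates. The layout and
# energy values are illustrative assumptions.

def pointing_location_argmax(sensors):
    """sensors: list of ((x, y), energy) pairs; returns (x, y) of the max-energy sensor."""
    best_pos, _ = max(sensors, key=lambda s: s[1])
    return best_pos

sensors = [((100, 50), 0.2), ((110, 50), 0.8), ((120, 50), 0.5)]
print(pointing_location_argmax(sensors))  # (110, 50)
```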
In another example, the sensor processing module 253 may operate to select the sensor with the highest received energy and a plurality of sensors adjacent to such sensor. Then, for example, the sensor processing module 253 may interpolate between the locations of such sensors (e.g., based, at least in part, on weighting). For example, in a first dimension in which a sensor to the right of the highest energy sensor has a higher received energy than a sensor to the left of the highest energy sensor, the sensor processing module 253 may determine that the pointing location is to the right of the highest energy sensor. How much distance to the right may, for example, be determined as a function of the ratio between respective energies received by the right and left sensors. Such calculation may, for example, be a linear or non-linear calculation. Such calculation may also, for example, consider the expected energy pattern of a transmitting pointing device (e.g., in a scenario where energy fall-off is logarithmic as opposed to linear).
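By way of non-limiting illustration only, such neighbor interpolation may be sketched as follows. A parabolic-peak formula is used here as one possible non-linear function of the neighbor energies; the description above leaves the exact function (linear, logarithmic, or otherwise) open, so this particular choice is an assumption for illustration:

```python
# Sketch: estimate a sub-sensor offset from the highest-energy sensor toward
# the stronger of its two neighbors, using a parabolic fit through the three
# energy samples. Pitch is the spacing between adjacent sensors.

def interpolate_peak(x_center, pitch, e_left, e_center, e_right):
    """Estimate peak location from three sensor energies (parabolic fit)."""
    denom = 2.0 * e_center - e_left - e_right
    if denom == 0:
        return x_center  # flat neighborhood: no offset
    offset = 0.5 * (e_right - e_left) / denom
    return x_center + offset * pitch

# Right neighbor stronger than left: the estimate lands right of the center sensor.
x = interpolate_peak(x_center=110.0, pitch=10.0, e_left=0.2, e_center=0.8, e_right=0.5)
print(x > 110.0)  # True
```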
In an additional example, the sensor processing module 253 may operate to select all sensors receiving a threshold amount of energy (e.g., an absolute threshold level, a threshold level relative to the highest energy sensor, etc.). Then, for example, the sensor processing module 253 may interpolate between the locations of such sensors (e.g., based, at least in part, on respective energy weighting). For example, the sensor processing module 253 may perform non-linear splining between sensors in a horizontal direction with sensor location on a first axis and sensor energy on a second axis. The sensor processing module 253 may then operate to select the point on the sensor location axis corresponding to the peak sensor energy on the vertical axis. Such splining and selecting may then be repeated in the vertical direction. Alternatively for example, the sensor processing module 253 may operate to perform multi-dimensional splining to create a surface based on sensor energy and select the highest point on such surface and the corresponding screen coordinates of such surface.
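By way of non-limiting illustration only, the threshold-and-interpolate approach along one axis may be sketched as follows. A quadratic fit stands in for the non-linear splining described above, and the sensor positions, energies, and threshold are hypothetical:

```python
# Sketch: keep only sensors above an energy threshold, fit a curve through
# (position, energy), and take the position of the fitted peak. A quadratic
# fit is used as a simple stand-in for a spline.
import numpy as np

def peak_from_threshold(positions, energies, threshold):
    pos = np.asarray(positions, dtype=float)
    en = np.asarray(energies, dtype=float)
    mask = en >= threshold            # sensors receiving a threshold amount of energy
    a, b, c = np.polyfit(pos[mask], en[mask], 2)
    if a >= 0:
        # Degenerate fit (no downward-opening peak): fall back to the max sensor.
        return float(pos[mask][np.argmax(en[mask])])
    return float(-b / (2.0 * a))      # vertex of the fitted parabola

x = peak_from_threshold([0, 10, 20, 30, 40], [0.1, 0.5, 0.9, 0.6, 0.2], 0.4)
# The peak estimate falls between the strongest sensor (20) and its stronger
# neighbor (30), since the right neighbor received more energy than the left.
```
The same call could be repeated for the vertical direction, mirroring the axis-by-axis splining the text describes.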
In a further example, the sensor processing module 253 may operate to select a first sensor (e.g., the sensor with the highest received energy). Then, for example, the sensor processing module 253 may utilize information of the relative distance between the selected sensor and the pointing device, information of the gain pattern for the signal transmitted from the pointing device to the selected sensor, and calibration information to determine where the pointing device may be pointed in order for the sensor to receive such energy. For example, this may result in a first closed figure (e.g., a circle, cloverleaf, etc.) drawn around the sensor on the screen plane. Then the sensor processing module 253 may repeat the procedure for a second sensor (e.g., a sensor with the second highest received energy), resulting in a second closed figure. The sensor processing module 253 may then, for example, determine the point(s) of intersection between the first and second figures. If only one point of intersection lies within the border of the screen, then such point of intersection might be utilized as an estimate of the pointing location. If, however, there are two potentially significant points of intersection (or more depending on the figures), then the sensor processing module 253 may repeat the procedure for a third sensor (e.g., the sensor with the third highest energy, a sensor generally along the line perpendicular to a line segment between the first and second sensors, etc.) and determine a point nearest the intersection of the first, second and third closed figures. Such a point of intersection may then be utilized as an estimate of the pointing location.
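By way of non-limiting illustration only, the intersection of two such closed figures may be sketched for the simplest case, circles. The mapping from received energy to circle radius is assumed to have been performed already (the text derives it from distance, gain pattern, and calibration information), so the radii below are hypothetical inputs:

```python
# Sketch: intersect two circles, each centered on a sensor, whose radii
# represent "where the pointing device may be pointed for this sensor to
# receive this much energy". Radii here are assumed precomputed.
import math

def circle_intersections(c0, r0, c1, r1):
    """Return the 0, 1, or 2 intersection points of two circles, sorted."""
    (x0, y0), (x1, y1) = c0, c1
    d = math.hypot(x1 - x0, y1 - y0)
    if d > r0 + r1 or d < abs(r0 - r1) or d == 0:
        return []  # disjoint, contained, or concentric: no usable intersection
    a = (r0 * r0 - r1 * r1 + d * d) / (2.0 * d)   # distance from c0 to chord midpoint
    h = math.sqrt(max(r0 * r0 - a * a, 0.0))      # half-length of the chord
    xm = x0 + a * (x1 - x0) / d
    ym = y0 + a * (y1 - y0) / d
    pts = {(xm + h * (y1 - y0) / d, ym - h * (x1 - x0) / d),
           (xm - h * (y1 - y0) / d, ym + h * (x1 - x0) / d)}
    return sorted(pts)

pts = circle_intersections((0.0, 0.0), 5.0, (8.0, 0.0), 5.0)
print(pts)  # [(4.0, -3.0), (4.0, 3.0)]
```
With two candidate points, a caller would next discard any point outside the screen border, or bring in a third sensor's circle to disambiguate, as described above.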
The above-mentioned examples of screen-integrated sensors and related pointing location determinations were presented as exemplary illustrations. Though the above-mentioned examples generally discuss light and/or RF energy sensors, other types of sensors may also be integrated into a television screen or overlaid thereon. For example and without limitation, the sensors may comprise acoustic sensors that operate to sense acoustic energy (e.g., directed acoustic energy directed to a pointing location on the screen). For example, such directed acoustic energy may be formed at frequencies beyond the range of human hearing (e.g., and at frequencies beyond the range of pet hearing as well).
Also note that various energy radiation patterns may be used, and/or a plurality of energy radiation patterns may be used. For example, though (e.g., for illustrative clarity) the discussion herein generally discusses a single energy emission from the pointing device, a plurality of energy emissions may be utilized. For example and without limitation, a pointing device may transmit a plurality of different directed energy emissions (e.g., light, RF, etc.) toward the pointing direction. Also for example, a pointing device may transmit one or more energy emissions that move relative to the pointing direction (e.g., in a raster pattern or any other pattern).
After determining on-screen pointing location, the television receiver 350 may communicate information of such determined location in various manners. For example and without limitation, the sensor processing module 253 of the television receiver 200 may utilize the television interface module 235 to communicate information of such on-screen pointing location to the television 301 for presentation to the user. Also for example, the sensor processing module 253 may utilize the user interface module 240 to communicate information of such on-screen pointing location to the television controller 320 for presentation to the user. Such communication will also be addressed in the discussions of FIGS. 9-10.
In addition to various television configurations in which sensors are integrated into the television screen, sensors may be incorporated into the television off-screen. Such sensors may, for example, be incorporated in a border around the screen (or overlaid thereon). For example and without limitation, FIG. 4 is a diagram illustrating an exemplary television system 400 with off-screen television sensors in accordance with various aspects of the present invention. The television system 400 includes a television 401 comprising a television screen 403. The television system 400 also includes a television controller 420 (or other pointing device) pointing to an on-screen pointing location 430 along a pointing line 425 between the television controller 420 and the on-screen pointing location 430.
The television system 400 also comprises a television receiver 450 that is communicatively coupled to the television 401 via a communication link 451 (e.g., a two-way communication link providing video information to the television 401 and/or receiving sensor information from the television 401). The television receiver 450 may share any or all aspects with the exemplary receivers (150, 200 and 350) discussed previously and all other receivers discussed herein. Accordingly, various aspects of the television receiver 450 will be explained herein with reference to various components of the exemplary television receiver 200 illustrated in FIG. 2. The exemplary television receiver 450 is also communicatively coupled to the television controller 420 via a communication link 452.
The exemplary television 401 comprises an array of sensors integrated into the television 401 around the border of the screen 403. Three of such sensors are labeled 410, 411 and 412. As discussed above, any of a variety of sensor types may be utilized, non-limiting examples of which include light sensors or photo detectors (e.g., photo diodes), RF sensors (e.g., antenna elements), acoustic sensors (e.g., microphones), etc.
The array of sensors may be integrated around the television screen 403 in any of a variety of manners. For example, such sensors may be integrated in a border of the television screen 403 that is not used for outputting video content. Such a configuration may, for example, avoid sensor interference with video content being displayed on the screen. Also for example, as illustrated in FIG. 4, such sensors may be mounted to a border material of the television 401.
For example, an array of photo detectors (e.g., photo diodes) and/or antenna elements (e.g., individual antennas or elements of an antenna array, for example, a phased array) may be incorporated into a border of the television 401 around the screen 403. For example, every screen pixel row and/or column may be associated with a pair of corresponding photo diodes and/or antenna elements, or every N×M block of screen pixels may be associated with one or more corresponding photo diodes or antenna elements (e.g., a row and column sensor, two row and two column elements, etc.).
In a photo detector implementation, passive photo detectors may receive varying amounts of respective light energy depending on the pointing direction of a light source pointed at the screen. Also for example, received signals (e.g., pulsed signals) may arrive at different sensors at different respective times/phases (e.g., being indicative of relative position and/or pointing direction, which may also be utilized in a pointing determination). In such a photo detector implementation (e.g., utilizing photo diodes), photo detectors may, for example, be tuned to react to particular light frequencies to reduce interference from output pixel light and/or associated reflections. As a non-limiting example, photo diodes may be tuned to detect light that is not visible to the human eye, visible light frequencies that are relatively rare, light patterns that are unlikely to occur in a television program (e.g., particular pulse codes), etc. In one example, the photo detectors integrated with the television body off-screen may comprise photo diodes that operate to detect energy from a laser pointer or directed infrared energy from a television controller or other pointing device. Note that analogously to the on-screen sensors discussed previously, various aspects may comprise mounting (e.g., adhering) sensors to the television body off-screen. Such sensor installation may, for example, occur at the factory or after-market by a technician or user.
In an antenna element implementation, an array of antenna elements may be positioned around the border of the screen 403. In an RF antenna implementation, passive antennas (or elements of an overall antenna matrix) may receive varying amounts of respective RF energy depending on the pointing direction of a directional RF source aimed at the screen. Also for example, received signals (e.g., pulsed signals) may arrive at different antennas at different respective times/phases (e.g., being indicative of relative position and/or pointing direction, which may also be utilized in a pointing determination). Note that analogously to the on-screen sensors discussed previously, various aspects may comprise mounting (e.g., adhering) sensors to the television body off-screen. Such sensor installation may, for example, occur at the factory or after-market by a technician or user.
In an exemplary scenario, a user may point a pointing device (e.g., a remote controller, a laser pointer, directional RF transmitter, specifically designed eyewear, a mobile computing device, a mobile communication device, a gesture tracking device or glove, etc.) at the television screen 403, where the pointing device directs transmitted energy (e.g., light and/or RF energy and/or acoustic energy) at a particular location on the television screen 403 to which the device is being pointed. Note that such transmitted energy will likely be transmitted directionally and be associated with an intensity or gain pattern with the highest intensity likely at the center of the pattern (i.e., along the pointing line 425) and decreasing as a function of angle from the center of the pattern. Such a gain pattern is generally represented in FIG. 4 by the concentric circles around the on-screen pointing location 430. Note, however, that in practice such a gain pattern is likely to be more complex than the illustrated pattern (e.g., including lobes with respective peaks and nulls).
In such an exemplary scenario, each sensor of the sensors integrated into the television around the border of the screen 403 will likely receive some respective amount of energy. For example, along a particular axis the sensor nearest the screen pointing location 430 (i.e., along the pointing line 425) will likely receive the highest amount of energy, sensors along the particular axis adjacent to the screen pointing location 430 will likely receive a next highest range of energy, and sensors farther from the pointing location 430 will likely receive progressively smaller amounts of energy from the pointing device 420, as a function of distance from the pointing location 430, until such energy is lost in the noise floor.
For example, along the horizontal axis, sensor 410 is closest to the pointing location 430 and will likely receive the highest energy, with sensors adjacent to the left and right of sensor 410 receiving the next highest amounts of energy, and so on. Also, along the vertical axis, sensors 411 and 412 will likely receive close to the highest amount of energy, with sensors above and below such sensors 411, 412 receiving the next highest amounts of energy, and so on.
In such an exemplary scenario, the television receiver 450 (e.g., the user interface module 240 of the television receiver 200 illustrated in FIG. 2) may receive signals indicative of the energy received by the sensors of the television. The television receiver 450 may receive such signals in various manners, depending on the degree of integration of such sensors into the television 401. For example, in an exemplary scenario where the sensors are fully integrated into the television 401 (e.g., into a border around the screen 403) and operationally integrated into the television 401, the television receiver 450 may receive such signals via a communication interface between the television receiver 450 and the television 401. Also for example, in another exemplary scenario where the sensors are overlaid on (e.g., adhered to) the television 401, and where operation of such sensors is independent of the television 401, the television receiver 450 may receive such signals via a communication link directly between the television receiver 450 and the sensors, where such a communication link may be independent of other communication links between the television receiver 450 and the television 401. Such communication link may, for example, be adapted to communicate information from each sensor to the television receiver 450 serially (e.g., in a time-multiplexed manner) and/or in parallel.
The user interface module 240 may then, for example, provide information of such received sensor signals to the sensor processing module 253 for processing. The sensor processing module 253 may then, for example, operate to process such information to determine the screen pointing location. The sensor processing module 253 may perform such processing in any of a variety of manners, non-limiting examples of which will be provided below.
For example, the sensor processing module 253 may operate to select the sensor with the highest received energy along each of the horizontal and vertical axes and determine that the respective locations of such selected sensors correspond to the horizontal and vertical coordinates of the on-screen pointing location. For example, in an exemplary scenario where the spatial resolution of screen border sensors is relatively fine, such operation may reliably yield a desired level of accuracy without undue processing overhead. For example, the sensor processing module 253 may determine that sensors 410 and 411 have the highest received energy for the horizontal and vertical axes, respectively, and thus determine that the on-screen pointing location is represented in the horizontal axis by the horizontal location of the sensor 410 and represented in the vertical axis by the vertical location of the sensor 411. Note that in scenarios where two sensors have relatively similar energy levels (e.g., as might occur at sensors 411 and 412), the sensor processing module 253 may select a vertical midpoint between such sensors.
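By way of non-limiting illustration only, this per-axis selection, including the vertical-midpoint behavior for two near-equal sensors, may be sketched as follows; the sensor coordinates, energies, and tolerance are hypothetical:

```python
# Sketch of the border-sensor variant: take the strongest sensor(s) along each
# axis independently and combine the resulting coordinates. Sensors whose
# energy is within `tol` of the axis maximum are averaged, so two near-equal
# sensors yield a midpoint. Layout, energies, and tol are illustrative.

def axis_estimate(sensors, tol=0.05):
    """sensors: list of (coordinate, energy) along one axis; returns a coordinate."""
    peak = max(e for _, e in sensors)
    near = [c for c, e in sensors if peak - e <= tol]  # near-peak sensors
    return sum(near) / len(near)

def pointing_from_borders(horizontal, vertical):
    return (axis_estimate(horizontal), axis_estimate(vertical))

h = [(100, 0.3), (200, 0.9), (300, 0.4)]   # sensors along a horizontal border
v = [(100, 0.82), (200, 0.80), (300, 0.3)]  # two near-equal vertical sensors
print(pointing_from_borders(h, v))  # (200.0, 150.0) -- vertical midpoint of 100 and 200
```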
In another example, the sensor processing module 253 may operate to select, for each screen axis, the sensor with the highest received energy and a plurality of sensors adjacent to such sensor. Then, for example, the sensor processing module 253 may interpolate between the locations of such sensors (e.g., based, at least in part, on weighting). For example, in the horizontal dimension in which a sensor to the right of the highest energy sensor 410 has a higher received energy than a sensor to the left of the highest energy sensor 410, the sensor processing module 253 may determine that the pointing location along the horizontal axis is to the right of the highest energy sensor 410. How much distance to the right may, for example, be determined as a function of the ratio between respective energies received by the right and left sensors. Such calculation may, for example, be a linear or non-linear calculation. Such calculation may also, for example, consider the expected energy pattern of a transmitting pointing device (e.g., in a scenario where energy fall-off is logarithmic as opposed to linear). The sensor processing module 253 may then, for example, repeat such operation in the vertical direction.
In another example, the sensor processing module 253 may operate to select all sensors in each of the axes receiving a threshold amount of energy (e.g., an absolute threshold level, a threshold level relative to the highest energy sensor, etc.). Then, for example, the sensor processing module 253 may interpolate between the locations of such sensors (e.g., based, at least in part, on respective energy weighting). For example, the sensor processing module 253 may perform non-linear splining between sensors in a horizontal direction with sensor location on a first axis and sensor energy on a second axis. The sensor processing module 253 may then operate to select the point on the sensor location axis corresponding to the peak sensor energy on the vertical axis. Such splining and selecting may then be repeated in the vertical screen direction. Alternatively for example, the sensor processing module 253 may operate to perform multi-dimensional splining to create a surface based on sensor energy and select the highest point on such surface and the corresponding screen coordinates of such surface.
After determining on-screen pointing location, the television receiver 450 may communicate information of such determined location in various manners. For example and without limitation, the sensor processing module 253 of the television receiver 200 may utilize the television interface module 235 to communicate information of such on-screen pointing location to the television 401 for presentation to the user. Also for example, the sensor processing module 253 may utilize the user interface module 240 to communicate information of such on-screen pointing location to the television controller 420 for presentation to the user. Such communication will also be addressed in the discussions of FIGS. 9-10.
In addition to various television configurations in which sensors are integrated into the television off-screen or off the video presentation portion of the screen, sensors may be incorporated into the television system off-television. Such sensors may, for example, be incorporated in other components of a television system besides the television. For example and without limitation, FIG. 5 is a diagram illustrating an exemplary television system 500 with off-television sensors in accordance with various aspects of the present invention. The television system 500 includes a television 501 comprising a television screen 503. The television system 500 also includes a television controller 520 (or other pointing device) pointing to an on-screen pointing location 530 along a pointing line 525 between the television controller 520 and the on-screen pointing location 530.
The television system 500 also comprises a television receiver 550 that is communicatively coupled to the television 501 via a communication link 561 (e.g., a two-way communication link providing video information to the television 501 and/or receiving sensor information from the television 501). The television receiver 550 is also illustrated with one or more communication links 563 to the various sensors 551-556 independent of the communication link 561. Note that in various exemplary scenarios, the television receiver 550 (e.g., a user interface module 240) may receive sensor information via the television communication link 561 and/or via the independent communication link(s) 563. The exemplary television receiver 550 is also communicatively coupled to the television controller 520.
The television receiver 550 may share any or all aspects with the exemplary receivers (150, 200, 350 and 450) discussed previously and all other receivers discussed herein. Accordingly, various aspects of the television receiver 550 will be explained herein with reference to various components of the exemplary television receiver 200 illustrated in FIG. 2.
The exemplary television system 500 comprises an array of sensors integrated into audio speaker components (e.g., surround sound speakers) positioned around the television 501. For example, the television system 500 comprises a left speaker 531 comprising a top sensor 552 and a bottom sensor 551. Also for example, the television system 500 comprises a right speaker 533 comprising a top sensor 556 and a bottom sensor 555. Additionally for example, the television system comprises a center speaker 532 comprising a left sensor 553 and a right sensor 554. As discussed above, any of a variety of sensor types may be utilized, non-limiting examples of which include light sensors or photo detectors (e.g., photo diodes), RF sensors (e.g., antenna elements), acoustic sensors (e.g., microphones), etc. Note that the audio speaker component example discussed herein is merely illustrative and that such sensors may be installed in any of a variety of locations (e.g., dedicated sensor boxes, attached to furniture, etc.).
The array of sensors may be positioned around the television 501 in any of a variety of manners. For example, such sensors may be positioned around the television 501 generally in the same plane as the television screen 503. In such an exemplary scenario, on-screen pointing location may be determined in a manner similar to the interpolation and/or gain pattern intersection discussed above with regard to off-screen and/or on-screen sensors. Note that since the locations of the sensors are likely to be inconsistent between various television system configurations, a calibration procedure may be implemented (e.g., by the calibration module 251). Such calibration will be discussed in more detail below.
In an exemplary configuration, one or more photo detectors (e.g., photo diodes) and/or antenna elements (e.g., individual antennas or elements of an antenna array) may be incorporated into a plurality of respective surround sound speakers positioned around the television 501.
For example, in a photo detector implementation, passive photo detectors may receive varying amounts of respective light energy depending on the pointing direction of a light source aimed at the screen. As discussed previously, directed energy (e.g., light, RF, acoustic, etc.) may be transmitted in a pattern (or envelope), so even if a pointing device is pointed to a location on the television screen 503 along the pointing line 525, sensors off-screen (or even off-television) may still receive energy from the transmission (albeit likely not with the same intensity at which energy is delivered along the pointing line 525). Also for example, received signals (e.g., pulsed signals) may arrive at different sensors at different respective times/phases (e.g., being indicative of relative position and/or pointing direction, which may also be utilized in a pointing determination).
In a photo detector implementation (e.g., utilizing photo diodes), photo diodes may, for example, be tuned to react to particular light frequencies to reduce interference from output pixel light and/or associated reflections, ambient light, room lighting, etc. As a non-limiting example, photo diodes may be tuned to detect light that is not visible to the human eye, visible light frequencies that are relatively rare, light patterns that are unlikely to occur in a television program (e.g., particular pulse codes), etc. In one example, the photo detectors integrated with off-television components may comprise photo diodes that operate to detect energy from a laser pointer or directed infrared energy from a television controller (or other pointing device). Note that analogously to the on-screen sensors discussed previously, various aspects may comprise mounting (e.g., adhering) sensors to various off-television components. Such sensor installation may, for example, occur at the factory or after-market by a technician or user.
In an antenna element implementation, an array of antenna elements may be positioned around off-television components (e.g., in surround sound components). In an RF antenna implementation, passive antennas (or elements of an overall antenna matrix) may receive varying amounts of respective RF energy depending on the pointing direction of a directional RF source pointed at a location on the screen. Also for example, received signals (e.g., pulsed signals) may arrive at different antennas at different respective times/phases (e.g., being indicative of relative position and/or pointing direction, which may also be utilized in a pointing determination). Note that analogously to the on-screen sensors discussed previously, various aspects may comprise mounting (e.g., adhering) sensors to the off-television components. Such sensor installation may, for example, occur at the factory or after-market by a technician or user.
In an exemplary scenario, a user may point a pointing device (e.g., a remote controller, a laser pointer, directional RF transmitter, specifically designed eyewear, a mobile computing device, a mobile communication device, a gesture tracking device or glove, etc.) at the television screen 503, where the pointing device directs transmitted energy (e.g., light and/or RF energy and/or acoustic energy) at a particular location on the television screen 503 to which the user is pointing with the pointing device. Note that such transmitted energy will likely be transmitted directionally and be associated with an intensity or gain pattern with the highest intensity at the center of the pattern (i.e., along the pointing line 525) and decreasing as a function of angle from the center of the pattern. Such a gain pattern was discussed previously in the discussion of FIG. 4.
In such an exemplary scenario, each sensor of the sensors integrated into the television system 500 off-television will likely receive some respective amount of energy. For example, along a particular axis, the sensor nearest to the screen pointing location 530 (i.e., along the pointing line 525) will likely receive the highest amount of energy, a sensor next nearest to the screen pointing location 530 will likely receive the next highest amount of energy, and sensors away from the pointing location 530 will likely receive progressively smaller amounts of energy from the pointing device 520, as a function of distance from the pointing location 530 and/or angle off the pointing line 525 (e.g., until such energy is lost in the noise floor). For example, sensor 553 is nearest to the pointing location 530 and will likely receive the highest energy, sensor 552 is next nearest to the pointing location 530 and will likely receive the next highest energy, and so on.
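The angular falloff described above can be sketched numerically. The following is a rough illustration only (not part of the specification): the Gaussian gain pattern, the `beamwidth` parameter, and the function name `received_energy` are all assumptions made for the sketch.

```python
import math

def received_energy(sensor_pos, pointing_loc, source_pos, peak=1.0, beamwidth=0.5):
    """Relative energy at a sensor under an assumed Gaussian gain pattern.

    The angle between the pointing line (source -> pointing location) and the
    line from the source to the sensor drives the falloff; a sensor on the
    pointing line receives the peak energy. Positions are (x, y) tuples.
    """
    def angle_between(a, b):
        dot = a[0] * b[0] + a[1] * b[1]
        na, nb = math.hypot(*a), math.hypot(*b)
        return math.acos(max(-1.0, min(1.0, dot / (na * nb))))

    point_vec = (pointing_loc[0] - source_pos[0], pointing_loc[1] - source_pos[1])
    sensor_vec = (sensor_pos[0] - source_pos[0], sensor_pos[1] - source_pos[1])
    theta = angle_between(point_vec, sensor_vec)
    return peak * math.exp(-(theta / beamwidth) ** 2)
```

Under this model, a sensor near the pointing location reports more energy than one farther off the pointing line, mirroring the ordering described for sensors 553, 552, etc.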
Note that in the implementation illustrated in FIG. 5, in particular since there are a relatively low number of sensors, signals from a same sensor may be utilized in determining multiple axes of pointing location. As mentioned previously, a calibration procedure may be performed when the system 500 is configured, to assist in such a pointing determination.
In an exemplary scenario, the television receiver 550 (e.g., the user interface module 240 of the television receiver 200 illustrated in FIG. 2) may receive signals indicative of the energy received by the sensors of the television system 500. The television receiver 550 may receive such signals in various manners, depending on the degree of integration of such sensors into the television 501. For example, in an exemplary scenario where the sensors are fully integrated into the television system 500 components (e.g., surround sound speaker components 531-533) and operationally integrated into such components, the television receiver 550 may receive such signals via a communication interface between the television receiver 550 and the respective off-television components (e.g., via a communication link 563 between the television receiver 550 and the surround sound speaker components 531-533). Also for example, in another exemplary scenario where the sensors are overlaid on (e.g., adhered to) the off-television components, and where operation of such sensors is independent of the television 501, the television receiver 550 may receive such signals via a communication link directly between the television receiver 550 and the individual sensors, where such a communication link may be independent of other communication links between the television receiver 550 and the television 501 and/or independent of other communication links between the television receiver 550 and other television system 500 components (e.g., surround sound speaker components 531-533).
The user interface module 240 may then, for example, provide information of such received sensor signals to the sensor processing module 253 for processing. The sensor processing module 253 may then, for example, operate to process such information to determine the screen pointing location. The sensor processing module 253 may perform such processing in any of a variety of manners, non-limiting examples of which will be provided below.
In an exemplary scenario, the sensor processing module 253 may operate to estimate a position between sensor positions based on relative sensor energy. For example, in the horizontal dimension, sensor 552 may correspond to a relatively high amount of energy, and sensor 556 may correspond to a relatively low amount of received energy. The sensor processing module 253 may, for example, estimate a horizontal position relatively closer to sensor 552 by an amount proportional to the relative difference between respective amounts of energy. The sensor processing module 253 may perform a similar estimation utilizing sensors 551 and 555. Various horizontal position estimations may then be averaged. Alternatively for example, respective energies for the left speaker 531 sensors may be averaged, respective energies for the right speaker 533 sensors may be averaged, and such left and right speaker average energies may then be utilized to determine a horizontal pointing location. The sensor processing module 253 may then, for example, perform a similar pointing direction estimate in the vertical direction.
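The energy-weighted interpolation and averaging described above can be sketched as follows. This is an illustrative sketch only; the pairing of sensors and the function name `estimate_axis` are assumptions, not the specification's method.

```python
def estimate_axis(pairs):
    """Estimate a coordinate along one axis from sensor pairs.

    `pairs` is a list of ((pos_a, energy_a), (pos_b, energy_b)) tuples, one
    per sensor pair along the axis. Each pair yields an energy-weighted
    position (pulled toward the higher-energy sensor, proportionally to the
    energy difference); the per-pair estimates are then averaged.
    """
    estimates = [
        (pos_a * e_a + pos_b * e_b) / (e_a + e_b)
        for (pos_a, e_a), (pos_b, e_b) in pairs
    ]
    return sum(estimates) / len(estimates)
```

For example, a pair with energies 3.0 and 1.0 at positions 0 and 10 yields an estimate three-quarters of the way toward the stronger sensor; the same computation repeated in the vertical dimension gives the second coordinate.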
In another exemplary scenario, a calibration procedure may be performed to determine an expected sensor energy level (e.g., absolute or relative) when the user is pointing at the sensor. In such a scenario, combined with a gain pattern and user (or pointing device) location relative to the television 501, a first line (e.g., a circle or arc) may be drawn around a first sensor 552. Similarly, a second line (e.g., a circle or arc) may be drawn around a second sensor 553, and the intersection of the first and second lines utilized as an estimate of pointing location. Additional lines associated with other sensors may also be utilized. Such additional lines may, for example, be utilized when selecting between multiple line intersections and/or for greater accuracy or resolution. Note that such a line-intersection solution may be applied to any of the previously discussed scenarios (e.g., as illustrated in FIGS. 3-4). A non-limiting example of this was presented in the discussion of FIG. 3, and another example will be provided in the following discussion of FIG. 7.
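When each sensor's expected energy is converted to a radius around that sensor, the intersection step above reduces to standard circle-circle intersection geometry. The sketch below is a generic implementation of that geometry, not the specification's algorithm; a third circle would be used to select among the (up to two) returned points.

```python
import math

def circle_intersections(c1, r1, c2, r2):
    """Return the intersection points of two circles as a list of (x, y).

    c1, c2 are center tuples; r1, r2 are radii. Returns an empty list when
    the circles are separate, contained, or concentric.
    """
    (x1, y1), (x2, y2) = c1, c2
    d = math.hypot(x2 - x1, y2 - y1)
    if d == 0 or d > r1 + r2 or d < abs(r1 - r2):
        return []
    # Distance from c1 to the chord joining the intersection points.
    a = (r1 ** 2 - r2 ** 2 + d ** 2) / (2 * d)
    h = math.sqrt(max(0.0, r1 ** 2 - a ** 2))
    # Midpoint of the chord, then offset perpendicular to the center line.
    mx = x1 + a * (x2 - x1) / d
    my = y1 + a * (y2 - y1) / d
    ox = h * (y2 - y1) / d
    oy = h * (x2 - x1) / d
    return [(mx + ox, my - oy), (mx - ox, my + oy)]
```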
After determining on-screen pointing location, the television receiver 550 may communicate information of such determined location in various manners. For example and without limitation, the sensor processing module 253 of the television receiver 200 may utilize the television interface module 235 to communicate information of such on-screen pointing location to the television 501 for presentation to the user. Also for example, the sensor processing module 253 may utilize the user interface module 240 to communicate information of such on-screen pointing location to the television controller 520 for presentation to the user. Such communication will also be addressed in the discussions of FIGS. 9-10.
As discussed above, pointing sensors may be incorporated into the television system off-television (i.e., placed separately in stand-alone housings, integrated with other apparatus, attached to other apparatus, etc.). Another example of such off-television sensor placement is presented in FIG. 6. In particular, the screen pointing sensors may be integrated into the television receiver. FIG. 6 is a diagram illustrating an exemplary television system 600 with television receiver sensors in accordance with various aspects of the present invention.
The television system 600 includes a television 601 comprising a television screen 603. The television system 600 also includes a television controller 620 (or other pointing device) pointing to an on-screen pointing location 630 along a pointing line 625 between the television controller 620 and the on-screen pointing location 630.
The television system 600 also comprises a television receiver 650 that is communicatively coupled to the television 601 via a communication link 651 (e.g., a two-way communication link providing video information to the television 601 and/or communicating sensor information and/or screen pointing information with the television 601). The television receiver 650 comprises an array of screen pointing sensors. A portion of the sensors are labeled (661-665) for discussion purposes. Note that such sensors may be arranged in any of a variety of configurations (e.g., matrix configuration, border configuration, placed only at the front corners, etc.). The pointing sensors may, for example, be integrated into the television receiver 650 and/or attached to the television receiver 650 in any of a variety of manners (e.g., in any manner similar to those discussed previously with regard to the televisions and/or television system components discussed previously).
Note that in various exemplary scenarios, the television receiver 650 (e.g., a user interface module 240) may receive additional sensor information from other sensors via the television communication link 651 and/or other communication links. The exemplary television receiver 650 is also communicatively coupled to the television controller 620 via a communication link 652.
The television receiver 650 may share any or all aspects with the exemplary receivers (150, 200, 350, 450 and 550) discussed previously and all other receivers discussed herein. Accordingly, various aspects of the television receiver 650 will be explained herein with reference to various components of the exemplary television receiver 200 illustrated in FIG. 2.
The exemplary television receiver 650 comprises an array of sensors integrated into the television receiver 650. For example, the television receiver 650 comprises a lower left sensor 661, upper left sensor 662, upper right sensor 663, lower right sensor 664 and center sensor 665. As discussed above, any of a variety of sensor types may be utilized, non-limiting examples of which include light sensors or photo detectors (e.g., photo diodes), RF sensors (e.g., antenna elements), acoustic sensors (e.g., microphones), etc.
The exemplary television receiver 650 may be positioned around the television 601 in any of a variety of manners. For example, the television receiver 650 (and thus the sensors) may be positioned around the television 601 in an orientation such that the front face of the television receiver 650 (and thus the sensors) is generally in the same plane as the television screen 603. Such placement is not necessary, but may be advantageous from an accuracy perspective. In such an exemplary scenario, on-screen pointing location may be determined in a manner similar to the interpolation and/or gain pattern intersection discussed above with regard to off-screen and/or on-screen sensors. Note that since the locations of the sensors are likely to be inconsistent between various television system configurations (i.e., it is unlikely that every user will place/position the television receiver 650 in the same manner), a calibration procedure may be implemented (e.g., by the calibration module 251). Such calibration was discussed previously and will also be revisited below.
In an exemplary configuration, one or more photo detectors (e.g., photo diodes) and/or antenna elements (e.g., individual antennas or elements of an antenna array) may be incorporated into the faceplate of the television receiver 650. Note that additional sensors positioned away from the television receiver 650 may also be utilized (e.g., any of the previously discussed sensor placements).
For example, in a photo detector implementation, passive photo detectors may receive varying amounts of respective light energy depending on the pointing direction of a light source aimed at the screen. As discussed previously, directed energy (e.g., light, RF, acoustic, etc.) may be transmitted in a pattern (or envelope), so even if a pointing device is pointed to a location 630 on the television screen along pointing line 625, sensors off-screen (e.g., sensors integrated into the television receiver 650) may still receive energy from the transmission (albeit likely not with the same intensity at which energy is delivered along the pointing line 625). Also for example, received signals (e.g., pulsed signals) may arrive at different sensors at different respective times/phases (e.g., being indicative of relative position and/or pointing direction, which may also be utilized in a pointing determination).
In a photo detector implementation (e.g., utilizing photo diodes), photo diodes may, for example, be tuned to react to particular light frequencies to reduce interference from output pixel light and/or associated reflections, ambient light, room lighting, etc. As a non-limiting example, photo diodes may be tuned to detect light that is not visible to the human eye, visible light frequencies that are relatively rare, light patterns that are unlikely to occur in a television program (e.g., particular pulse codes), etc. In one example, the photo detectors integrated with the television receiver 650 may comprise photo diodes that operate to detect energy from a laser pointer or directed infrared energy from a controller (or other pointing device). Note that analogously to the on-screen sensors discussed previously, various aspects may comprise mounting (e.g., adhering) sensors to various television receiver 650 locations and/or to various off-receiver components. Such sensor installation may, for example, occur at the factory or after-market by a technician or user.
In an antenna element implementation, an array of antenna elements may be positioned at locations on the television receiver 650 (e.g., only on the television receiver 650 and/or at locations around the television receiver 650). In an RF antenna implementation, passive antennas (or elements of an overall antenna matrix) may receive varying amounts of respective RF energy depending on the pointing direction of a directional RF source pointed at a location on the screen. Also for example, received signals (e.g., pulsed signals) may arrive at different antennas at different respective times/phases (e.g., being indicative of relative position and/or pointing direction, which may also be utilized in a pointing determination). Note that analogously to the on-screen sensors discussed previously, various aspects may comprise mounting (e.g., adhering) sensors to the television receiver 650. Such sensor installation may, for example, occur at the factory or after-market by a technician or user.
In an exemplary scenario, a user may point a pointing device (e.g., a remote controller, a laser pointer, directional RF transmitter, specifically designed eyewear, a mobile computing device, a mobile communication device, a gesture tracking device or glove, etc.) at the television screen 603, where the pointing device directs transmitted energy (e.g., light and/or RF energy and/or acoustic energy) at a particular location on the television screen 603 to which the user is pointing with the pointing device. Note that such transmitted energy will likely be transmitted directionally and be associated with an intensity or gain (or energy) pattern with the highest intensity at the center of the pattern (i.e., along the pointing line 625) and decreasing as a function of angle from the center of the pattern. Such a gain pattern was discussed previously in the discussion of FIG. 4.
In such an exemplary scenario, each sensor of the sensors integrated into the television receiver 650 off-television will likely receive some respective amount of energy. For example, along a particular axis, the sensor nearest to the screen pointing location 630 (i.e., along the pointing line 625) will likely receive the highest amount of energy, a sensor next nearest to the screen pointing location 630 will likely receive the next highest amount of energy, and sensors away from the pointing location 630 will likely receive progressively smaller amounts of energy from the pointing device 620, as a function of distance from the pointing location 630 and/or angle off the pointing line 625 (e.g., until such energy is lost in the noise floor). For example, sensor 662 is nearest to the pointing location 630 and will likely receive the highest energy, sensors 661 and 663 are further from the pointing location 630 and will likely receive less, and so on.
Note that in the implementation illustrated in FIG. 6, in particular since there are a relatively low number of sensors, signals from a same sensor may be utilized in determining multiple axes of pointing location. As mentioned previously, a calibration procedure may be performed when the system 600 is configured, to assist in such a pointing determination.
In an exemplary scenario, the television receiver 650 (e.g., the user interface module 240 of the television receiver 200 illustrated in FIG. 2) may receive signals indicative of the energy received by the sensors of the television receiver 650. The television receiver 650 may receive such signals in various manners, depending on the degree of integration of such sensors into the television receiver 650 and/or various components of the television system 600. For example, in an exemplary scenario where the sensors are fully integrated into the television receiver 650, the television receiver 650 may receive such signals via a direct internal link with such sensors. Also for example, in a scenario where various sensors are off the television receiver 650, the television receiver 650 may receive information from such sensors via a direct communication link or via a communication link with the various components with which such sensors are integrated.
The user interface module 240 may then, for example, provide information of such received sensor signals to the sensor processing module 253 for processing. The sensor processing module 253 may then, for example, operate to process such information to determine the screen pointing location. The sensor processing module 253 may perform such processing in any of a variety of manners, non-limiting examples of which will be provided below.
In an exemplary scenario, the sensor processing module 253 may operate to estimate a position between sensor positions based on relative sensor energy. For example, in the horizontal dimension, sensor 662 may correspond to a relatively high amount of energy, and sensor 663 may correspond to a relatively low amount of received energy. The sensor processing module 253 may, for example, estimate a horizontal position relatively closer to sensor 662 by an amount proportional to the relative difference between respective amounts of energy. The sensor processing module 253 may perform a similar estimation utilizing sensors 661 and 664. Various horizontal position estimations may then be averaged. Alternatively for example, respective energies for the left side sensors 661, 662 may be averaged, respective energies for the right side sensors 663, 664 may be averaged, and such left and right side average energies may then be utilized (e.g., in conjunction with energy pattern characteristics) to determine a horizontal pointing location. The sensor processing module 253 may then, for example, perform a similar pointing direction estimate in the vertical direction.
In another exemplary scenario, a calibration procedure may be performed to determine an expected sensor energy level (e.g., absolute or relative) when the user is pointing at the sensor (and/or other known locations). In such a scenario, combined with a gain pattern and user (or pointing device) location relative to the television 601, a first line (e.g., a circle or arc) may be drawn around a first sensor 662. Similarly, a second line (e.g., a circle or arc) may be drawn around a second sensor 663, and the intersection of the first and second lines utilized as an estimate of pointing location. Additional lines associated with other sensors may also be utilized. Such additional lines may, for example, be utilized when selecting between multiple line intersections or to increase accuracy and/or resolution of the pointing determination. Note that such a line-intersection solution may be applied to any of the previously discussed scenarios (e.g., as illustrated in FIGS. 3-5) or other scenarios discussed herein. A non-limiting example of this was presented in the discussion of FIG. 3, and another example will be provided in the following discussion of FIG. 7.
After determining on-screen pointing location, the television receiver 650 may communicate information of such determined location in various manners. For example and without limitation, the sensor processing module 253 of the television receiver 200 may utilize the television interface module 235 to communicate information of such on-screen pointing location to the television 601 for presentation to the user on the television screen 603. Also for example, the sensor processing module 253 may utilize the user interface module 240 to communicate information of such on-screen pointing location to the television controller 620 for presentation to the user (e.g., on a display of the television controller 620). Such communication will also be addressed in the discussions of FIGS. 9-10.
Various aspects of the present invention may also, for example, include one or more sensors incorporated into the pointing device. FIG. 7 is a diagram illustrating an exemplary television system 700 utilizing pointing device sensors in accordance with various aspects of the present invention.
The exemplary television system 700 includes a television 701 having a television screen 703. The television system 700 also comprises a television receiver 750 that is communicatively coupled to the television 701 via a communication link 751 (e.g., a two-way communication link providing video information to the television 701 and/or receiving sensor information from the television 701). The exemplary television receiver 750 is also communicatively coupled to the television controller 720 via a communication link 752.
The television receiver 750 may share any or all aspects with the exemplary receivers (150, 200, 350, 450, 550 and 650) discussed previously and all other receivers discussed herein. Accordingly, various aspects of the television receiver 750 will be explained herein with reference to various components of the exemplary television receiver 200 illustrated in FIG. 2.
The television system 700 includes a television controller 720 (e.g., a pointing device) that comprises one or more sensors (e.g., a plurality of antenna array elements, a plurality of photo detectors, etc.). In such a configuration, sensor information may be communicated to the television receiver 750 (e.g., to the user interface module 240 via the first 210 or second 220 communication interface modules). Such sensor information may, for example, be communicated to the television receiver 750 directly (e.g., via communication link 752) or indirectly (e.g., via the television 701 and communication link 751). Such information may then be communicated to the sensor processing module 253 for the determination of an on-screen pointing location.
In the exemplary configuration, the television 701 includes eight emitters (e.g., light emitters, RF transmitters, etc.) located around the border of the television screen 703. Note that such emitters may be positioned anywhere proximate the television system 700. For example, the television 701 includes a first emitter 711, second emitter 712, third emitter 713, fourth emitter 714, fifth emitter 715, sixth emitter 716, seventh emitter 717 and eighth emitter 718. Such emitters may each emit a signal that may be received at sensors on-board the controller 720. Such sensors may, for example, make up a directional receiver. In such a configuration, the controller 720 (or other pointing device) may be pointed to a location 730 on the screen 703 along a pointing line 725. With such an orientation and a directional signal reception pattern, the sensors on-board the controller 720 will receive the emitted signals at respective signal levels. Such sensor signals may then be processed in a manner similar to the manners discussed above to determine the on-screen pointing direction for the pointing device 720.
For example, through a calibration procedure, it may be known that the pointing device at a particular location should receive a particular amount of energy from each of the emitters 711-718 when pointed directly at such emitters (or at some other known location). In such a scenario, the pointing device may measure respective signal energies received from each of the emitters (e.g., each distinguishable by frequency, coding, timing and/or timeslotting, etc.) and communicate such information to the television receiver 750. The pointing device 720 may also, for example, communicate pointing device position (and/or orientation) information to the television receiver 750. The television receiver 750 may receive such sensor and/or position information via at least one of the communication interface modules 210, 220 and/or the user interface module 240 and process such sensor information with the sensor processing module 253.
The sensor processing module 253 may, for example, select a first emitter 712 (e.g., the emitter corresponding to the highest energy received at the pointing device). The sensor processing module 253 may then process the location of the pointing device, the receive gain pattern for the pointing device, and the energy received from the first emitter 712 to determine a first figure (e.g., an arc 752) along which the pointing device, if pointed, would be expected to receive the measured energy. Similarly, the sensor processing module 253 may perform such a procedure for a second emitter 711 resulting in a second figure (e.g., an arc 751). The intersection of such arcs may be utilized as an estimate of on-screen pointing location. Additionally, for accuracy or for selecting between multiple intersection points, should they occur, the sensor processing module 253 may perform such a procedure for a third emitter 714 resulting in a third figure (e.g., an arc 754), and so on. The intersection of the three arcs 752, 751, 754 may then be utilized as an estimate of on-screen pointing location.
Alternatively, the solution need not be based on a known position (location) of the pointing device, nor on absolute received energy levels. In such a scenario, differences in received energy from the various emitters may be processed with or without position information of the pointing device. For example, the pointing device 720 may have six degrees of freedom (e.g., three positional degrees of freedom and three orientational degrees of freedom). In such a scenario, if the position and orientation of the television 701 are known, the unknown six degrees of freedom for the pointing device 720 may be ascertained by processing six known values related to such six degrees of freedom (e.g., related by a known signal energy pattern). In such a scenario, measurements associated with six emitters on the television (and potentially more) may be utilized to solve for the six degrees of freedom of the pointing device 720.
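The pose-from-measurements idea can be illustrated, in reduced two-dimensional form (position plus heading rather than six degrees of freedom), as a least-squares fit of predicted emitter energies to measured ones. Everything here is assumed for illustration: the Gaussian receive pattern, the inverse-square distance falloff, and the brute-force candidate search standing in for a real solver.

```python
import math

def predict_energies(pose, emitters, beamwidth=0.6):
    """Predicted relative energy from each emitter for a device pose.

    `pose` is (x, y, heading); each emitter is an (x, y) tuple. Assumes a
    Gaussian receive-gain pattern over the angle off the device heading and
    inverse-square falloff with distance (both illustrative models).
    """
    x, y, heading = pose
    out = []
    for ex, ey in emitters:
        dx, dy = ex - x, ey - y
        dist = math.hypot(dx, dy)
        off = abs(math.atan2(dy, dx) - heading)
        off = min(off, 2 * math.pi - off)  # wrap to [0, pi]
        out.append(math.exp(-(off / beamwidth) ** 2) / dist ** 2)
    return out

def solve_pose(measured, emitters, candidates):
    """Pick the candidate pose whose predicted energies best match the
    measurements in the least-squares sense (stand-in for a real solver)."""
    best, best_err = None, float("inf")
    for pose in candidates:
        pred = predict_energies(pose, emitters)
        err = sum((p - m) ** 2 for p, m in zip(pred, measured))
        if err < best_err:
            best, best_err = pose, err
    return best
```

With more unknowns (the full six degrees of freedom), correspondingly more emitter measurements are needed, matching the specification's observation that six (or more) emitters may be utilized.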
The above-mentioned exemplary scenarios were presented to illustrate numerous manners in which the television receiver 750 (e.g., sensor processing module 253) may operate to determine on-screen pointing location. Such examples are merely exemplary and thus the scope of various aspects of the present invention should not be limited by any particular characteristics of such examples unless explicitly claimed.
As discussed above, the calibration module 251 of the television receiver 200 may operate to perform calibration operations. Such calibrating may be performed in any of a variety of manners. For example and without limitation, calibration may be utilized to determine expected received energy when transmitters and receivers are located and oriented in a particular manner. For example, a non-limiting example of a calibration procedure may comprise presenting an on-screen target at various locations and measuring respective sensor signals received when the pointing device is being pointed at such targets. Also for example, a calibration procedure may comprise directing a user (e.g., using the user interface module 240) to point to each of a plurality of sensors to determine an expected amount of received energy when the user is pointing directly at such sensors.
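A calibration table of the kind described (record sensor readings while the user points at known on-screen targets, then match later readings against the table) might be sketched as follows. The function names and the nearest-match lookup are illustrative assumptions, not the specification's procedure; `measure` stands in for the actual sensor read-out.

```python
def calibrate(targets, measure):
    """Build a calibration table mapping each on-screen target location to
    the sensor energy vector observed while the user points at it."""
    return {loc: measure(loc) for loc in targets}

def locate(reading, table):
    """Return the calibrated target whose recorded energy vector is closest
    (least squares) to the current sensor reading."""
    return min(
        table,
        key=lambda loc: sum((a - b) ** 2 for a, b in zip(table[loc], reading)),
    )
```

A real implementation would interpolate between calibrated targets rather than snapping to the nearest one, but the table-then-match structure is the same.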
As mentioned previously, signal energy (or gain) pattern may be utilized in various on-screen pointing determinations. Such an energy (or gain) pattern may be predefined for a particular pointing device (e.g., at the factory), but may also be measured by the television receiver 200. In a non-limiting example, the calibration module 251 may direct the user to utilize a pointing device to point to a location on the screen and process information received from multiple sensors (e.g., embedded in the screen, embedded in the television around the border of the screen, located in off-television devices, located on the television receiver 750, located in the pointing device, etc.) to develop a custom gain pattern for the particular pointing device. For example, such calibration may determine the shape of the gain pattern, the signal energy falloff characteristics, etc.
Various aspects discussed above included the processing of position information. In such exemplary cases, the television receiver 200 may comprise one or more location modules 252 that operate to determine relevant position information. The location module 252 may operate to perform such location determining (e.g., of the user or pointing device and/or the television) in any of a variety of manners. For example, the location module 252 may utilize a communication interface module 210, 220 to receive position information (e.g., of the television receiver 200 or of the pointing device) from an external source of such information (e.g., global positioning system, cellular triangulation system, home triangulation system, etc.).
Also for example, the location module 252 may receive position information directly from the pointing device (e.g., where such pointing device has position-determining capability). For example, in a non-limiting exemplary scenario, where the pointing device is a handheld computer, such computer may comprise GPS (or A-GPS) capability to determine its position. In such a scenario, the pointing device may wirelessly communicate information of its position to the television receiver 200, and ultimately to the location module 252 via a communication interface module 210, 220.
Additionally for example, the location module 252 may operate to process sensor information to determine the location of the pointing device (e.g., location in relation to the television screen). For example, as mentioned previously, a signal (e.g., a pulse) transmitted from a pointing device to the television will arrive at different sensors at different points in time depending on the respective distance from the pointing device to each sensor. The location module 252 may process such time-of-arrival information at various sensors to determine the position of the pointing device relative to the television. Similarly, in a scenario including signal emitters associated with the television and sensors on the pointing device, simultaneously transmitted signals from different emitters will arrive at the pointing device at different respective times depending on the position of the pointing device relative to such emitters. Alternatively, the location module 252 may also operate to process phase difference information (in addition to timing information or instead of such information) to determine pointing device location.
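The time-of-arrival processing described above can be illustrated with a standard trilateration sketch: arrival times become distances (given a propagation speed and a known emission time), and the distances fix a position. The two-dimensional simplification, the known emission time, and the function names are all assumptions made for this illustration.

```python
def toa_to_distances(arrival_times, emission_time, speed):
    """Convert per-sensor arrival times of a single pulse into distances."""
    return [(t - emission_time) * speed for t in arrival_times]

def trilaterate(p1, r1, p2, r2, p3, r3):
    """2-D position from three sensor positions and distances.

    Subtracting the circle equations pairwise cancels the quadratic terms,
    leaving two linear equations in (x, y) that are solved directly.
    """
    ax = 2 * (p2[0] - p1[0]); ay = 2 * (p2[1] - p1[1])
    bx = 2 * (p3[0] - p2[0]); by = 2 * (p3[1] - p2[1])
    c1 = r1**2 - r2**2 + p2[0]**2 - p1[0]**2 + p2[1]**2 - p1[1]**2
    c2 = r2**2 - r3**2 + p3[0]**2 - p2[0]**2 + p3[1]**2 - p2[1]**2
    det = ax * by - ay * bx  # zero when the sensors are collinear
    x = (c1 * by - c2 * ay) / det
    y = (ax * c2 - bx * c1) / det
    return (x, y)
```

A practical system would not know the emission time and would instead work from arrival-time differences (as the phase-difference alternative in the text suggests), but the geometry is analogous.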
Once the television receiver 200 (e.g., the sensor processing module 253) determines an on-screen pointing location, the television receiver 200 may utilize such information in any of a variety of manners. For example and without limitation, the sensor processing module 253 may operate to generate information of the determined on-screen pointing location, and one or more modules of the television receiver 200 may operate to communicate a signal (e.g., to a television, television controller, other display device, etc.) that comprises characteristics that cause presentation of a visual indication (e.g., on the television screen, controller screen, other display, etc.) to indicate to the user the on-screen location to which the television receiver 200 has determined the user is pointing. Such a visual indication may, for example, comprise characteristics of a cursor or other graphical construct, bright spot, highlighting, color variation, brightness variation, etc. For example, the television receiver 200 may operate to overlay such indication on video content (e.g., television programming) being presented to the user (e.g., presented on the television screen, presented on a screen of the television controller, etc.).
Additionally for example, the sensor processing module 253 may provide information of the determined on-screen pointing location to one or more other modules of the television receiver 200 (e.g., the processing module 250 and/or other modules thereof) to identify an object in video content (e.g., television programming) to which a user is pointing. In such an exemplary scenario, one or more modules of the television receiver 200 may operate to communicate signals (e.g., to a television, television controller having a screen, other display device, etc.) that cause highlighting of an object to which the user is pointing and/or provide information regarding such object.
Further for example, various modules of the television receiver 200 (e.g., the processor module 250) may operate to communicate on-screen pointing location information to television system components separate from the television (e.g., to a different television receiver, video recorder, remote programming source, communication network infrastructure, advertising company, provider of goods and/or services, etc.). Also for example, various modules of the television receiver 200 may operate to communicate information of the determined on-screen pointing location to the pointing device of the user (e.g., for providing pointing feedback to the user at a remote controller, etc.).
FIG. 2 provided a diagram illustrating an exemplary television receiver 200 in accordance with various aspects of the present invention. FIG. 8 provides another diagram illustrating an exemplary television receiver 800 in accordance with various aspects of the present invention. The exemplary television receiver 800 may share any or all aspects with any of the television receivers discussed herein and illustrated in FIGS. 1-7. For example, the exemplary television receiver 800 (or various modules thereof) may operate to perform any or all functionality discussed herein. As with the exemplary television receiver 200, the components of the exemplary television receiver 800 may be co-located in a single housing.
For example, the television receiver 800 comprises a processor 830. Such a processor 830 may, for example, share any or all characteristics with the processor 250 discussed with regard to FIG. 2. Also for example, the television receiver 800 comprises a memory 840. Such memory 840 may, for example, share any or all characteristics with the memory 260 discussed with regard to FIG. 2.
Also for example, the television receiver 800 may comprise any of a variety of user interface module(s) 850. Such user interface module(s) 850 may, for example, share any or all characteristics with the user interface module(s) 240 discussed previously with regard to FIG. 2. For example and without limitation, the user interface module(s) 850 may comprise: a display device, a camera (for still or moving picture acquisition), a speaker, an earphone (e.g., wired or wireless), a microphone, a video screen (e.g., a touch screen display), a vibrating mechanism, a keypad, a remote control interface, and/or any of a variety of other user interface devices (e.g., a mouse, a trackball, a touch pad, touch screen, light pen, game controlling device, etc.).
The exemplary television receiver 800 may also, for example, comprise any of a variety of communication modules (805, 806, and 810). Such communication module(s) may, for example, share any or all characteristics with the communication interface module(s) 210, 220 discussed previously with regard to FIG. 2. For example and without limitation, the communication interface module(s) 810 may comprise: a Bluetooth interface module; an IEEE 802.11, 802.15, 802.16 and/or 802.20 module; any of a variety of cellular telecommunication interface modules (e.g., GSM/GPRS/EDGE, CDMA/CDMA2000/1x-EV-DO, WCDMA/HSDPA/HSUPA, TDMA/PDC, WiMAX, etc.); any of a variety of position-related communication interface modules (e.g., GPS, A-GPS, etc.); any of a variety of wired/tethered communication interface modules (e.g., USB, FireWire, RS-232, HDMI, component and/or composite video, Ethernet, wireline and/or cable modem, etc.); any of a variety of communication interface modules related to communicating with external memory devices; etc. The exemplary television receiver 800 is also illustrated as comprising various wired 806 and/or wireless 805 front-end modules that may, for example, be included in the communication interface modules and/or utilized thereby.
The exemplary television receiver 800 may also comprise any of a variety of signal processing module(s) 890. Such signal processing module(s) 890 may, for example, be utilized to assist in processing various types of information discussed previously (e.g., with regard to sensor processing, position determination, video processing, image processing, audio processing, general user interface information data processing, etc.). For example and without limitation, the signal processing module(s) 890 may comprise: video/graphics processing modules (e.g., MPEG-2, MPEG-4, H.263, H.264, JPEG, TIFF, 3-D, 2-D, MDDI, etc.); audio processing modules (e.g., MP3, AAC, MIDI, QCELP, AMR, CMX, etc.); and/or tactile processing modules (e.g., keypad I/O, touch screen processing, motor control, etc.).
Various aspects of the present invention were previously exemplified by non-limiting illustrations and described in terms of operations performed by various modules of the television receiver. Various aspects of the present invention will now be illustrated in the form of method flow diagrams.
FIG. 9 is a flow diagram 900 illustrating the generation of on-screen pointing information (e.g., in a television receiver) in accordance with various aspects of the present invention. The exemplary method 900 may, for example, share any or all characteristics with the television receiver operation discussed previously.
The exemplary method 900 may begin executing at step 905. The exemplary method 900 may begin executing in response to any of a variety of causes and/or conditions. For example and without limitation, the method 900 may begin executing in response to a user command to begin, detected user interaction with a pointing device (e.g., a television controller), detected user presence in the vicinity, detected user interaction with a television receiver implementing the method 900, etc. Also for example, the method 900 may begin executing in response to a television presenting programming or other video content for which on-screen pointing is enabled and/or relevant.
The exemplary method 900 may, for example at step 910, comprise receiving pointing sensor information. For example and without limitation, step 910 may comprise any or all sensor information receiving characteristics described previously with regard to the various modules of the exemplary television receivers illustrated in FIGS. 1-8 and discussed previously. For example, step 910 may share any or all sensor information receiving characteristics discussed previously with regard to at least the user interface module 240, television interface module 235, processor module 250, communication interface modules 210, 220, sensor processing module 253, location module 252 and calibration module 251.
Step 910 may, for example, comprise receiving sensor information from (or associated with) sensors integrated in the television receiver. Also for example, step 910 may comprise receiving sensor information from (or associated with) off-receiver sensors (e.g., sensors integrated with or attached to a television, off-television sensors, sensors integrated with a pointing device (e.g., a television controller), etc.). As discussed previously, such sensors may comprise any of a variety of characteristics, including without limitation, characteristics of light sensors, RF sensors, acoustic sensors, active and/or passive sensors, etc.
In general, step 910 may comprise receiving pointing sensor information. Accordingly, the scope of various aspects of the present invention should not be limited by characteristics of any particular manner of receiving pointing sensor information unless explicitly claimed.
The exemplary method 900 may, at step 920, comprise processing received sensor information (e.g., as received at step 910) to determine a location on a screen of the television to which a user is pointing (e.g., pointing with a pointing device). For example and without limitation, step 920 may comprise any or all pointing location processing characteristics described previously with regard to the various modules of the exemplary television receivers illustrated in FIGS. 1-8 and discussed previously. For example, step 920 may share any or all pointing location determining characteristics discussed previously with regard to at least the processor module 250, sensor processing module 253, location module 252 and calibration module 251.
Step 920 may, for example, comprise determining on-screen pointing location in any of a variety of manners. For example, step 920 may comprise determining on-screen pointing location based on a location of a selected sensor, based on interpolation between sensor locations (e.g., linear and/or non-linear interpolation), based on determining energy pattern intersection(s), etc. Many examples of such determining were provided previously.
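As a non-limiting sketch of the interpolation-based determining mentioned above, the following illustrates linear interpolation between sensor locations using an energy-weighted centroid. The function name, one-dimensional sensor geometry, and energy weighting are illustrative assumptions only, not the claimed processing.

```python
def interpolate_pointing(sensor_x, energies):
    """Energy-weighted linear interpolation between sensor locations.

    Each sensor sits at a known screen coordinate and reports a
    received-energy reading; the estimated pointing coordinate is the
    energy-weighted centroid of the sensor positions.
    """
    total = sum(energies)
    if total == 0:
        return None  # no signal detected at any sensor
    return sum(x * e for x, e in zip(sensor_x, energies)) / total

# Three hypothetical sensors across the top screen edge, with the
# strongest reading at the middle sensor.
x_est = interpolate_pointing([0.0, 0.5, 1.0], [1.0, 2.0, 1.0])
```

A non-linear variant could, for example, weight by a power of the energy or fit a beam-pattern model instead of a centroid.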
In general, step 920 may comprise processing received sensor information (e.g., independently and/or in conjunction with other information) to determine a location on a screen of the television to which a user is pointing. Accordingly, the scope of various aspects of the present invention should not be limited by characteristics of any particular manner of performing such processing unless explicitly claimed.
The exemplary method 900 may, at step 930, comprise generating information indicative of a determined on-screen pointing location (e.g., as determined at step 920). For example and without limitation, step 930 may comprise any or all pointing location information generation characteristics described previously with regard to the various modules of the exemplary television receivers illustrated in FIGS. 1-8 and discussed previously. For example, step 930 may share any or all information generation characteristics discussed previously with regard to at least the processor module 250, sensor processing module 253, location module 252, calibration module 251, television interface module 235, user interface module 240 and/or communication interface modules 210, 220.
Step 930 may, for example, comprise generating such information in any of a variety of manners. For example, step 930 may comprise generating on-screen pointing location data to communicate to internal modules of the television receiver, to equipment external to the television receiver (e.g., to the television and/or television controller), to television network components, to a television programming source, etc. Such information may, for example, be communicated to various system components and may also be presented to the user (e.g., utilizing visual feedback displayed on a screen of a television, television controller, etc.). Such information may, for example, be generated in the form of screen coordinates, identification of a video content object (e.g., a programming object or person) to which an on-screen pointing location corresponds, generation of an on-screen cursor or highlight or other graphical feature, etc.
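As a non-limiting sketch of the object-identification form of output mentioned above: given a determined on-screen pointing location in normalized screen coordinates, a simple hit test against bounding boxes of video content objects can yield an object identifier. The object names and box coordinates here are hypothetical and for illustration only.

```python
def identify_object(pointing_xy, object_boxes):
    """Return the identifier of the first video content object whose
    bounding box (x0, y0, x1, y1) contains the pointing location,
    or None when the user is not pointing at a known object."""
    px, py = pointing_xy
    for name, (x0, y0, x1, y1) in object_boxes.items():
        if x0 <= px <= x1 and y0 <= py <= y1:
            return name
    return None

# Hypothetical object boxes in normalized screen coordinates.
boxes = {"actor": (0.1, 0.2, 0.4, 0.9), "car": (0.5, 0.5, 0.9, 0.8)}
hit = identify_object((0.25, 0.5), boxes)
miss = identify_object((0.95, 0.95), boxes)
```

The returned identifier could then drive highlighting of the object or retrieval of information about it, as described above.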
In general, step 930 may comprise generating information indicative of a determined on-screen pointing location. Accordingly, the scope of various aspects of the present invention should not be limited by characteristics of any particular manner of generating such information unless explicitly claimed.
The exemplary method 900 may, at step 995, comprise performing continued processing. Such continued processing may comprise characteristics of any of a variety of types of continued processing, various examples of which were presented previously. For example and without limitation, step 995 may comprise looping execution flow back up to any earlier step (e.g., step 910). Also for example, step 995 may comprise presenting (or causing the presentation of) visual feedback indicia of the on-screen pointing location for a user. Additionally for example, step 995 may comprise communicating information of the on-screen pointing location to system components external to the television receiver implementing the method 900 (e.g., to a television, television controller, etc.). Further for example, step 995 may comprise utilizing the on-screen pointing information to identify a video content object (e.g., an object presented in television programming) to which a user is pointing, etc.
In general, step 995 may comprise performing continued processing. Accordingly, the scope of various aspects of the present invention should not be limited by characteristics of any particular manner of performing continued processing unless explicitly claimed.
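The overall flow of the exemplary method 900 (receive sensor information at step 910, determine the pointing location at step 920, generate location information at step 930, and continue at step 995) might be sketched as a simple loop. The callback structure and names below are illustrative assumptions, not the claimed method.

```python
def run_pointing_loop(receive_sensor_info, determine_location,
                      emit_location, frames=1):
    """Minimal sketch of method 900: receive (910), determine (920),
    generate/communicate (930), then loop as continued processing (995)."""
    outputs = []
    for _ in range(frames):
        info = receive_sensor_info()        # step 910
        loc = determine_location(info)      # step 920
        outputs.append(emit_location(loc))  # step 930
    return outputs                          # step 995: loop back / continue

# Demo with stubbed-out steps: a fixed sensor reading mapped to a
# fixed on-screen location, repeated for two iterations.
out = run_pointing_loop(lambda: {"x": 0.3},
                        lambda s: (s["x"], 0.7),
                        lambda loc: {"pointing": loc},
                        frames=2)
```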
Turning next to FIG. 10, such figure is a flow diagram 1000 illustrating the generation of on-screen pointing information (e.g., in a television receiver) in accordance with various aspects of the present invention. The exemplary method 1000 may, for example, share any or all characteristics with the television receiver operation discussed previously (e.g., in reference to FIGS. 1-9).
The exemplary method 1000 may begin executing at step 1005. Step 1005 may, for example, share any or all characteristics with step 905 of the exemplary method 900 illustrated in FIG. 9 and discussed previously.
The exemplary method 1000 may, for example at step 1008, comprise performing a calibration procedure with the user. Such a calibration procedure may, for example, be performed to develop a manner of processing received sensor information to determine on-screen pointing location. Step 1008 may, for example, comprise any or all calibration aspects discussed previously (e.g., with reference to the calibration module 251).
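As a non-limiting sketch of one possible calibration procedure of the kind step 1008 contemplates: the user points at known on-screen targets, and an affine transform from raw sensor-derived coordinates to screen coordinates is fitted by least squares. The sample coordinates and the choice of a two-dimensional affine model are illustrative assumptions only.

```python
import numpy as np

def fit_calibration(raw_points, screen_points):
    """Fit an affine map from raw sensor coordinates to screen
    coordinates using calibration samples gathered while the user
    points at known on-screen targets."""
    raw = np.asarray(raw_points, dtype=float)
    scr = np.asarray(screen_points, dtype=float)
    A = np.hstack([raw, np.ones((len(raw), 1))])  # rows of [x, y, 1]
    M, *_ = np.linalg.lstsq(A, scr, rcond=None)   # 3x2 affine transform
    return M

def apply_calibration(M, raw_xy):
    """Map a raw sensor-derived coordinate to a screen coordinate."""
    x, y = raw_xy
    return tuple(np.array([x, y, 1.0]) @ M)

# Hypothetical raw readings that are offset and scaled relative to the
# true normalized screen coordinates of four calibration targets.
raw = [(0.1, 0.1), (0.9, 0.1), (0.1, 0.9), (0.9, 0.9)]
screen = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]
M = fit_calibration(raw, screen)
est = apply_calibration(M, (0.5, 0.5))
```

The fitted transform would then be applied to subsequent raw readings (e.g., at steps 1015-1020) to produce calibrated on-screen pointing locations.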
The exemplary method 1000 may, for example at step 1010, comprise receiving pointing sensor information. For example and without limitation, step 1010 may comprise any or all sensor information receiving characteristics described previously with regard to the various modules of the exemplary television receivers illustrated in FIGS. 1-8 and FIG. 9 (e.g., step 910) and discussed previously.
The exemplary method 1000 may, for example at step 1015, comprise determining user position (e.g., determining position of a user pointing device). For example and without limitation, step 1015 may comprise any or all position determining characteristics discussed previously with regard to FIGS. 1-9. Note that position may also, for example, include orientation.
For example, step 1015 may share any or all position determining characteristics discussed previously with regard to at least the processor module 250, sensor processing module 253, location module 252 and calibration module 251. For example, step 1015 may comprise determining user position based, at least in part, on received sensor signals. Also for example, step 1015 may comprise determining user position based, at least in part, on position information received from one or more systems external to the television receiver implementing the method 1000.
In general, step 1015 may comprise determining user position (e.g., pointing device position). Accordingly, the scope of various aspects of the present invention should not be limited by characteristics of any particular manner of determining user position unless explicitly claimed.
The exemplary method 1000 may, for example, at step 1020, comprise processing received sensor information (e.g., as received at step 1010) and/or user position information (e.g., as determined at step 1015) to determine a location on a screen of the television to which a user is pointing (e.g., pointing with a pointing device). For example and without limitation, step 1020 may comprise any or all pointing location determination characteristics described previously with regard to the various modules of the exemplary television receivers illustrated in FIGS. 1-8 and FIG. 9 (e.g., step 920) and discussed previously. For example, step 1020 may share any or all pointing location determining characteristics discussed previously with regard to at least the processor module 250, sensor processing module 253, location module 252 and calibration module 251.
Step 1020 may, for example, comprise determining on-screen pointing location in any of a variety of manners. For example, step 1020 may comprise determining on-screen pointing location based on a location of a selected sensor, based on location of the pointing device, based on interpolation between sensor locations (e.g., linear and/or non-linear interpolation), based on energy pattern intersection points, etc. Many examples of such determining were provided previously.
In general, step 1020 may comprise processing received sensor information and/or user position information to determine a location on a screen of the television to which a user is pointing (e.g., pointing with a pointing device). Accordingly, the scope of various aspects of the present invention should not be limited by characteristics of any particular manner of performing such processing unless explicitly claimed.
The exemplary method 1000 may, at step 1030, comprise generating information indicative of a determined on-screen pointing location (e.g., as determined at step 1020). For example and without limitation, step 1030 may comprise any or all information generation characteristics described previously with regard to the various modules of the exemplary television receivers illustrated in FIGS. 1-8 and FIG. 9 (e.g., step 930) and discussed previously. For example, step 1030 may share any or all information generation characteristics discussed previously with regard to at least the processor module 250, sensor processing module 253, location module 252, calibration module 251, television interface module 235, user interface module 240 and/or communication interface modules 210, 220.
The exemplary method 1000 may, at step 1095, comprise performing continued processing. Such continued processing may comprise characteristics of any of a variety of types of continued processing, various examples of which were presented previously. For example and without limitation, step 1095 may comprise looping execution flow back up to any earlier step (e.g., step 1008). Also for example, step 1095 may comprise presenting (and/or causing the presentation of) visual feedback indicia of the on-screen pointing location for a user. Additionally for example, step 1095 may comprise communicating information of the on-screen pointing location to system components external to the television receiver implementing the method 1000. Further for example, step 1095 may comprise utilizing the on-screen pointing information to identify a video content object (e.g., an object presented in television programming) to which a user is pointing, etc.
In general, step 1095 may comprise performing continued processing. Accordingly, the scope of various aspects of the present invention should not be limited by characteristics of any particular manner of performing continued processing unless explicitly claimed.
In summary, various aspects of the present invention provide a system and method in a television receiver for generating screen pointing information. While the invention has been described with reference to certain aspects and embodiments, it will be understood by those skilled in the art that various changes may be made and equivalents may be substituted without departing from the scope of the invention. In addition, many modifications may be made to adapt a particular situation or material to the teachings of the invention without departing from its scope. Therefore, it is intended that the invention not be limited to the particular embodiment disclosed, but that the invention will include all embodiments falling within the scope of the appended claims.