CROSS-REFERENCE TO RELATED APPLICATIONS

This application is a continuation of U.S. patent application Ser. No. 10/143,123, filed May 10, 2002, which claims priority to U.S. provisional application having Ser. No. 60/290,315, filed May 11, 2001, both of which are entirely incorporated herein by reference.
This application is related to copending U.S. patent application Ser. No. 10/143,647, filed May 10, 2002, which is entirely incorporated herein by reference.
TECHNICAL FIELD

The present invention is generally related to television systems, and, more particularly, is related to personal video recording.
BACKGROUND OF THE INVENTION

With recent advances in digital transmission technology, subscriber television systems are now capable of providing much more than the traditional analog broadcast video. In implementing enhanced programming, the home communication terminal device (“HCT”), otherwise known as the set-top box, has become an important computing device for accessing media content services (and media content within those services) and navigating a user through a maze of available services. In addition to supporting traditional analog broadcast video functionality, digital HCTs (or “DHCTs”) now also support an increasing number of two-way digital services such as video-on-demand and personal video recording.
Typically, a DHCT is connected to a cable or satellite, or generally, a subscriber television system, and includes hardware and software necessary to provide the functionality of the digital television system at the user's site. Some of the software executed by a DHCT may be downloaded and/or updated via the subscriber television system. Each DHCT also typically includes a processor, communication components, and memory, and is connected to a television or other display device, such as a personal computer. While many conventional DHCTs are stand-alone devices that are externally connected to a television, a DHCT and/or its functionality may be integrated into a television or personal computer or even an audio device such as a programmable radio, as will be appreciated by those of ordinary skill in the art.
DHCTs are typically capable of providing users with a very large number and variety of media content choices. As the number of available media content choices increases, viewing conflicts arise whereby the user must choose between watching two or more media content instances (e.g., discrete, individual instances of media content such as, for a non-limiting example, a particular television show or “program” episode), all of which the user would like to view. Further, because of the large number of viewing choices, the user may miss viewing opportunities. Buffering of media content instances in memory, or more recently, in storage devices (e.g., hard disk drives, CD-ROM, etc.) coupled to the DHCT, has provided some relief from the conflict in viewing choices while providing personal video recording functionality. However, current buffering mechanisms for personal video recording make inefficient use of tuner and buffer resources across a plurality of display channel changes because they operate under single-tuner constraints and/or assumptions. Therefore, there exists a need to exploit multi-tuner functionality to make more efficient use of DHCT resources.
Thus, a heretofore unaddressed need exists in the industry to address the aforementioned deficiencies and inadequacies.
BRIEF DESCRIPTION OF THE DRAWINGS

The preferred embodiments of the invention can be better understood with reference to the following drawings. The components in the drawings are not necessarily to scale, emphasis instead being placed upon clearly illustrating the principles of the present invention. Moreover, in the drawings, like reference numerals designate corresponding parts throughout the several views.
FIG. 1A is a block diagram of an example subscriber television system (STS), in accordance with one embodiment of the invention.
FIG. 1B is a block diagram of the transmission signals supported by the STS of FIG. 1A, and input into the digital home communication terminal (DHCT) from the headend, in accordance with one embodiment of the invention.
FIG. 2 is a block diagram of an example headend as depicted in FIG. 1A and related equipment, in accordance with one embodiment of the invention.
FIG. 3A is a block diagram of an example DHCT as depicted in FIG. 1A and related equipment, in accordance with one embodiment of the invention.
FIG. 3B is a block diagram of example memory for the example DHCT depicted in FIG. 3A, in accordance with one embodiment of the invention.
FIG. 3C is a schematic diagram of an example hard disk and hard disk elements located within the storage device coupled to the DHCT depicted in FIG. 3A.
FIG. 3D is a schematic diagram of an example remote control device to provide input to the DHCT illustrated in FIG. 3A, in accordance with one embodiment of the invention.
FIG. 4 is a schematic diagram illustrating an example conflict scenario in a two tuner, two buffer system that would require the establishment of priorities for receiving and buffering media content from a plurality of display channels, in accordance with one embodiment of the invention.
FIGS. 5-11 are timing diagrams that illustrate some example resource interactions included in tuning, buffering, and displaying media content among a variety of analog and digital signal flow path configurations, in accordance with several embodiments of the invention.
FIGS. 12A-12C are flow diagrams that illustrate an example resource management process for tuning, buffering, and displaying media content from a plurality of display channels for the configurations illustrated in FIGS. 5-11, in accordance with one embodiment of the invention.
FIGS. 13-17 are flow diagrams that illustrate steps for prioritizing between tuner resources and buffer space based on a plurality of download durations in order to address the example scenario of FIG. 4, in accordance with several embodiments of the invention.
FIG. 18 is a screen diagram of an example decision barker screen that enables a user to make decisions between which downloaded display channel media content to displace in order to receive new media content from a new display channel, in accordance with one embodiment of the invention.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS

The preferred embodiments of the invention now will be described more fully hereinafter with reference to the accompanying drawings, in which preferred embodiments of the invention are shown. One way of understanding the preferred embodiments of the invention includes viewing them within the context of a subscriber television system, and more particularly within the context of a media client device, such as a digital home communication terminal (DHCT). The DHCT provides for user interaction with what is displayed on a television and what is buffered into an associated storage device. Although other communication environments are considered to be within the scope of the preferred embodiments, the preferred embodiments of the invention will be described in the context of a DHCT that receives media content from a headend over a subscriber network as one example implementation among many.
Because the preferred embodiments of the invention can be understood in the context of a subscriber television system environment, an initial description of a subscriber television system is followed with a description of the types of transmission signals that are included in the subscriber television system, in addition to further description of the headend and DHCT (and coupled storage device) that are included within the subscriber television system. The preferred embodiments of the invention include controlling rules for displaying and buffering media content in a multi-tuner, multi-display channel changing environment. Thus, the description that follows the DHCT discussion will help to illustrate what resources are included in tuning, displaying and/or storing media content at the DHCT, and how those resources are managed and allocated to implement a plurality of display channel changes.
Following the discussion on resource allocation and management is a description of input variables, and how the input variables are used in the context of a rule-based system of the preferred embodiments to produce a set of consequences and/or outputs to make decisions on what media content to receive and buffer during a plurality of display channel changes.
This description of input variables in the context of a rule-based system is followed by a description of some example implementations that rely on the rule-based system to provide for efficient functioning of the personal video recording (PVR) system of the DHCT.
The preferred embodiments of the invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the invention to those having ordinary skill in the art. Furthermore, all “examples” given herein are intended to be non-limiting, and are provided as an exemplary list among many other examples contemplated but not shown.
One embodiment of the invention is generally implemented as part of a subscriber television system (STS), which includes digital broadband delivery systems (DBDS) and cable television systems (CTS). As a non-limiting example, a subscriber television system (STS) and its operation will be described initially, with the understanding that other conventional data delivery systems are within the scope of the preferred embodiments of the invention. FIG. 1A shows a block diagram view of an STS 10, which is generally a high quality, reliable and integrated network system that is typically capable of delivering video, audio, voice and data services to digital home communication terminals (DHCTs) 16. Although FIG. 1A depicts a high level view of an STS 10, it should be appreciated that a plurality of subscriber television systems can tie together a plurality of regional networks into an integrated global network so that DHCT users can receive media content provided from anywhere in the world. Further, it will be appreciated that the STS 10 shown in FIG. 1A is merely illustrative and should not be construed as implying any limitations upon the scope of the preferred embodiments of the invention. For instance, subscriber television systems also included within the scope of the preferred embodiments of the invention include systems not utilizing physical structured cabling for transmission, such as, but not limited to, satellite systems. Further, transmission media included within the scope of the preferred embodiments of the invention include, but are not limited to, Hybrid Fiber/Coax (HFC), optical, satellite, radio frequency (RF), frequency modulated (FM), and microwave. Further, data provided from the headend 11 to the DHCTs 16 and programming necessary to perform the functions discussed below will be understood to be present in the STS 10, in accordance with the description below.
The STS 10 typically delivers broadcast video signals as digitally formatted signals in addition to delivering traditional broadcast analog video signals. Furthermore, the system can typically support one-way broadcast services as well as both one-way data services and two-way media content and data services. The two-way operation of the network typically allows for user interactivity with services, such as Pay-Per-View programming, Near Video-On-Demand (NVOD) programming according to any of several known NVOD implementation methods, Video-on-Demand (VOD) programming (according to any of several VOD implementation methods), and interactive applications, such as Internet connections.
The STS 10 also provides the interfaces, network control, transport control, session control, and servers to access media content from media content services, and distributes media content to DHCT users. As shown in FIG. 1A, a typical STS 10 comprises a headend 11, hubs 12, an HFC access network 17, nodes 13, taps 14, and DHCTs 16. It should be appreciated that although a single component (e.g., a headend) is illustrated in FIG. 1A, an STS 10 can feature a plurality of any one of the illustrated components, can omit components, or may be configured with alternative embodiments for any one of the individual components or with yet other additional components not enumerated above.
Media content provided by one or more content providers (not shown) is communicated by the content providers to one or more headends 11. From those headends 11 the media content is then communicated over a communications network 18 that includes a plurality of HFC access networks 17 (only one HFC access network 17 is illustrated). The HFC access network 17 typically comprises a plurality of HFC nodes 13, each of which may serve a local geographical area. The hub 12 connects to the HFC node 13 through a fiber portion of the HFC access network 17. The HFC node 13 is connected to a tap 14 that is connected to a digital home communication terminal (DHCT) 16. Coaxial cables are typically used to couple nodes 13 and taps 14 because the electrical signals can be easily repeated with RF amplifiers. As the high-level operations of many of the functions of an STS 10 are well known to those of ordinary skill in the art, further high-level description of the overall STS 10 of FIG. 1A will not be contained herein.

FIG. 1B is a block diagram illustrating the transmission signals supported by the STS 10 (FIG. 1A), where the transmission signals 60, 64, 68, and 72 are input into a DHCT 16 in accordance with one embodiment of the invention. One or more content providers (not shown) are the source of the information that is included in the transmission signals. Before passing through the network 17 (FIG. 1A), transmission signals can be generated at a headend 11 or at a hub 12 (FIG. 1A) that might function as a mini-headend and which therefore possesses some of the headend functionality.
As depicted in FIG. 1B, the STS 10 (FIG. 1A) can simultaneously support a number of transmission signal types, transmission rates, and modulation formats. The ability to carry analog and digital signals over a large bandwidth is a characteristic of an HFC network typically employed in an STS, as in the STS 10 of FIG. 1A. As will be appreciated by those of ordinary skill in the art, analog and digital signals in HFC networks can be multiplexed using Frequency Division Multiplexing (FDM), which enables many different types of signals to be transmitted over the STS 10 to the DHCT 16. Typically, an STS 10 using HFC supports downstream (i.e., in the direction from the headend 11 to the DHCT 16) frequencies from 50 megahertz (MHz) to 870 MHz, whereas upstream frequencies (i.e., in the direction from the DHCT 16 to higher levels of the system) are in the 5 MHz to 42 MHz band. Generally, the RF bandwidth spacing for analog and digital services is 6 MHz. Furthermore, for a typical 870 MHz system in the United States, a possible downstream RF spectrum subdivision plan uses 6 MHz frequency subdivisions, or spans, within the 50 MHz to 550 MHz band for analog video transmission signals and within the 550 MHz to 870 MHz range for digital transmission signals.
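The example subdivision plan described above can be sketched in a few lines of code. This is an illustrative sketch only: the band edges are the example values given in the text (not a normative channel plan), and the function names are assumptions introduced here.

```python
# Illustrative sketch of the example 870 MHz downstream plan described
# above: 6 MHz spans, analog video in 50-550 MHz, digital in 550-870 MHz.
# Band edges and function names are illustrative assumptions, not a spec.

CHANNEL_WIDTH_MHZ = 6

def classify_downstream(freq_mhz: float) -> str:
    """Classify a downstream carrier frequency under the example plan."""
    if 50 <= freq_mhz < 550:
        return "analog"
    if 550 <= freq_mhz < 870:
        return "digital"
    return "upstream/other"  # e.g., the 5-42 MHz upstream band

def span_index(freq_mhz: float, band_start_mhz: float = 50) -> int:
    """Index of the 6 MHz span containing freq_mhz, counted from band start."""
    return int((freq_mhz - band_start_mhz) // CHANNEL_WIDTH_MHZ)
```

For example, a 500 MHz carrier falls in the analog band, while a 600 MHz carrier falls in the digital band.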
Analog Transmission Signals (ATSs) 60 shown in FIG. 1B are typically broadcast in 6 MHz frequency subdivisions, typically referred to in analog broadcasting as channels, having an analog broadcast signal composed of analog video and analog audio, and include Broadcast TV Systems Committee (BTSC) stereo and Secondary Audio Program (SAP) audio. Referring again to FIG. 1B, the downstream direction transmission signals, having been multiplexed, and in one embodiment using FDM, are often referred to as in-band transmission signals and include the ATSs 60 and Digital Transmission Signals (DTSs) 64, 68, 72 (also known as Digital Transport Signals). These transmission signals carry video, audio, and/or data services. For example, these transmission signals may carry television signals, Internet data, and/or any additional types of data, such as Interactive Program Guide (IPG) data. Additionally, as will be appreciated by those of ordinary skill in the art, additional data can be sent with the analog video image in the Vertical Blanking Interval (VBI) of the video signal and stored in DHCT memory or a DHCT local physical storage device (not shown). It should be appreciated, however, that the amount of data that can be transmitted in the VBI of the analog video signal is typically significantly less than data transmitted in a DTS.
Like the ATSs 60, the DTSs 64, 68, 72 each typically occupy 6 MHz of the RF spectrum. However, the DTSs 64, 68, and 72 are digital transmission signals consisting of 64- or 256-Quadrature Amplitude Modulated (QAM) digital signals formatted preferably using Moving Picture Experts Group (MPEG) standards such as MPEG-2 transport streams, allocated in a separate frequency range. The MPEG-2 transport stream enables transmission of a plurality of DTS types over each 6 MHz RF subdivision, as compared to a 6 MHz ATS. The three types of digital transport signals illustrated in FIG. 1B include broadcast digital transmission signals 64, carousel digital transmission signals 68, and on-demand transmission signals 72.
MPEG-2 transport may be used to multiplex video, audio, and data in each of these Digital Transmission Signals (DTSs). However, because an MPEG-2 transport stream allows video, audio, and data to be multiplexed into the same stream, the DTSs do not necessarily have to be allocated in separate 6 MHz RF frequencies, unlike the ATSs 60 in one embodiment. On the other hand, each DTS is capable of carrying multiple broadcast digital media content instances, multiple cycling data carousels containing broadcast data, and data requested on-demand by the subscriber. Data is formatted, such as in Internet Protocol (IP), mapped into MPEG-2 packets, and inserted into the multiplexed MPEG-2 transport stream. Encryption can be applied to the data stream for security so that the data may be received only by authorized DHCTs. The authorized DHCT 16 is provided with the mechanisms to receive, among other things, additional data or enhanced services. Such mechanisms can include “keys” that are required to decrypt encrypted data.
Each 6 MHz RF subdivision assigned to a digital transmission signal can carry the video and audio streams of the media content instances of multiple television (TV) stations, as well as media content and data that is not necessarily related to those TV media content instances, as compared to one TV channel broadcast over one ATS 60 that consumes the entire 6 MHz. The digital data is inserted into MPEG transport streams carried through each 6 MHz frequency subdivision assigned for digital transmission, and then demultiplexed at the subscriber DHCT so that multiple sets of data can be produced within each tuned 6 MHz frequency span, or subdivision.
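The packetization and PID-based demultiplexing described above can be illustrated with a minimal sketch. The fixed 188-byte packet size, the 0x47 sync byte, and the 13-bit PID field come from the MPEG-2 systems standard; everything else here (helper names, the simplified payload handling with no continuity counter, adaptation fields, or PSI tables) is an illustrative assumption.

```python
# Minimal sketch of MPEG-2 transport packetization: a fixed 188-byte
# packet whose 4-byte header carries a sync byte (0x47) and a 13-bit
# PID identifying which elementary stream or data service it belongs to.
# Simplified: continuity counter, adaptation fields, and PSI are omitted.

TS_PACKET_SIZE = 188
SYNC_BYTE = 0x47

def make_ts_packet(pid: int, payload: bytes) -> bytes:
    """Wrap a payload fragment in one simplified transport packet."""
    if not 0 <= pid <= 0x1FFF:
        raise ValueError("PID is a 13-bit field")
    if len(payload) > TS_PACKET_SIZE - 4:
        raise ValueError("payload exceeds one packet")
    header = bytes([
        SYNC_BYTE,
        (pid >> 8) & 0x1F,   # top 5 bits of the PID (flag bits cleared)
        pid & 0xFF,          # low 8 bits of the PID
        0x10,                # payload present, no adaptation field
    ])
    # Pad to the fixed packet size with 0xFF stuffing bytes.
    return header + payload + b"\xff" * (TS_PACKET_SIZE - 4 - len(payload))

def packet_pid(packet: bytes) -> int:
    """Demultiplexing side: recover the PID from a packet header."""
    return ((packet[1] & 0x1F) << 8) | packet[2]
```

A demultiplexer in the DHCT would, in this simplified picture, filter the incoming packet stream by `packet_pid`, routing each PID's packets to the decoder or data buffer for the corresponding service.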
Although broadcast in nature, the carousel DTSs 68 and on-demand DTSs 72 offer different functionality. Continuing with FIG. 1B, the broadcast DTSs 64 and carousel DTSs 68 typically function as continuous feeds for an indefinite time, whereas the on-demand DTSs 72 are continuous feed sessions for a limited time. In one embodiment, all DTS types are capable of being transmitted at high data rates. The broadcast DTSs 64 preferably carry data comprising multiple digitally-MPEG-2 compressed and formatted TV source signals and other continuously fed data information. The carousel DTSs 68 carry broadcast media content or data that is systematically broadcast in a cycling fashion but updated and revised as needed. Thus, the carousel DTSs 68 serve to carry high-volume data, such as media content and data and, possibly, other data, at high data rates.
The carousel DTSs 68 preferably carry data formatted in directories and files by a Broadcast File System (BFS) (not shown), which is used for producing and transmitting data streams throughout the STS 10, and which provides an efficient means for the delivery of application executables and application media content and data to the DHCT, as will be described below. Media content and data received by the DHCT 16 in such manner can then be saved in the DHCT memory and/or transferred to the DHCT storage device for later use. The on-demand DTSs 72, on the other hand, can carry particular information such as compressed video and audio pertaining to subscriber-requested media content instance previews and/or media content instance descriptions, as well as other specialized data information.
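The cycling broadcast behavior described above can be sketched roughly as follows. This is an abstract illustration, not the DSM-CC wire format: the module names and helper functions are assumptions, and the real data carousel defines its own message and directory structures.

```python
# Rough sketch of a broadcast data carousel: modules are transmitted in
# a repeating cycle, and a receiver simply waits for the next occurrence
# of the module it wants rather than requesting it. Illustrative only;
# the actual mechanism (a DSM-CC data carousel) is far more elaborate.
from itertools import cycle, islice

def carousel_schedule(modules, slots):
    """First `slots` transmission slots of an endlessly cycling carousel."""
    return list(islice(cycle(modules), slots))

def slots_until(modules, wanted, start_slot):
    """Slots a receiver waits, from start_slot, until `wanted` is broadcast."""
    n = len(modules)
    for wait in range(n):
        if modules[(start_slot + wait) % n] == wanted:
            return wait
    raise KeyError(wanted)
```

This also shows why a DHCT with local storage reduces access latency, as noted below: a cached module costs zero wait, while an uncached one costs up to a full carousel cycle.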
The User-to-Network Download Protocol of the MPEG-2 standard's DSM-CC (Digital Storage Media-Command and Control) specification preferably provides the data carousel protocol used for broadcasting data from a server located at the headend 11, or located elsewhere. It also provides the interactive download protocol for reliable downloading of data from a server (possibly the same server) to an individual DHCT through the on-demand DTSs. Each carousel and on-demand DTS is preferably defined by a DSM-CC session. Therefore, some of the basic functionality reflected in the DHCT 16 when the DHCT does not have a local physical storage device is somewhat similar to that of a networked computer (i.e., a computer without a persistent storage device), in addition to traditional set-top box functionality, as is well known to those of ordinary skill in the art. A DHCT 16 with a storage device reduces data access latency when the data is stored in the local physical storage device ahead of time.
Also shown in FIG. 1B are Out-Of-Band (OOB) signals that provide continuously available two-way signaling to the subscribers' DHCTs 16 regardless of which in-band signals are tuned to by the individual DHCT in-band tuners. The OOB signals consist of a Forward Data Signal (FDS) 76 and a Reverse Data Signal (RDS) 80. The OOB signals can comply with any one of a number of well-known transport protocols, but preferably comply with either a Digital Audio Visual Council (DAVIC) 1.1 Transport Protocol, with an FDS of 1.544 mega-bits per second (Mbps) or more using quadrature phase shift keying (QPSK) modulation and an RDS of 1.544 Mbps or more using QPSK modulation, or a Data Over Cable Service Interface Specification (DOCSIS) Transport Protocol, with an FDS of 27 Mbps using 64-QAM modulation and an RDS of 1.544 Mbps or more using QPSK modulation or 16-QAM modulation. The OOB signals provide the two-way operation of the network, which allows for subscriber interactivity with the applications and services provided by the network. Furthermore, the OOB signals are not limited to a 6 MHz spectrum, but generally occupy a smaller spectrum, such as 1.5 or 3 MHz.
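The two OOB transport profiles above can be summarized in a short sketch. Only the rates and modulations come from the text; the class and field names are illustrative assumptions introduced here.

```python
# Illustrative summary of the two example OOB transport profiles
# described above. Names are assumptions; rates and modulations are
# the example figures from the text.
from dataclasses import dataclass

@dataclass(frozen=True)
class OOBProfile:
    name: str
    fds_mbps: float      # forward (headend -> DHCT) data signal rate
    fds_modulation: str
    rds_mbps: float      # reverse (DHCT -> headend) data signal rate
    rds_modulation: str

DAVIC_1_1 = OOBProfile("DAVIC 1.1", 1.544, "QPSK", 1.544, "QPSK")
DOCSIS = OOBProfile("DOCSIS", 27.0, "64-QAM", 1.544, "QPSK or 16-QAM")

def forward_advantage(a: OOBProfile, b: OOBProfile) -> float:
    """Ratio of forward-path rates, e.g. to compare download headroom."""
    return a.fds_mbps / b.fds_mbps
```

Under these example figures the DOCSIS forward path carries roughly 17 times the data rate of the DAVIC 1.1 forward path, while the reverse paths are comparable.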
FIG. 2 is an overview of one example headend 11, which provides the interface between the STS 10 (FIG. 1A) and the service and content providers. The overview of FIG. 2 is equally applicable to one example hub 12 (FIG. 1A), and the same elements and principles may be implemented at the hub 12 instead of the headend 11 as described herein. It will be understood that the headend 11 shown in FIG. 2 is merely illustrative and should not be construed as implying any limitations upon the scope of the preferred embodiments of the invention. The headend 11 receives content from a variety of service and content providers, which can provide input in a variety of ways. The headend 11 combines the content from the various sources and distributes the content to subscribers via the distribution systems of the network 18.
In a typical system, the programming, services, and other information from content providers can be distributed according to a variety of mechanisms. The input signals may be transmitted from sources to the headend 11 via a variety of transmission paths, including satellites (not shown), and terrestrial broadcast transmitters and antennas (not shown). The headend 11 can also receive content from a direct feed source 210 via a direct line 212. Other input sources from content providers include a video camera 214, an analog input source 208, or an application server 216. The application server 216 may include more than one line of communication. One or more components such as the analog input source 208, the input source 210, the video camera 214, and the application server 216 can be located external to the headend 11, as shown, or internal to the headend 11, as would be appreciated by one having ordinary skill in the art. The signals provided by the content or programming input sources can include a single media content instance or a multiplex that includes several media content instances.
The headend 11 generally includes one or more receivers 218 that are each associated with a content source. In one implementation, MPEG encoders, such as encoder 220, are included for digitally encoding local programming or a real-time feed from the video camera 214, or the like. In other implementations, an encoder can be located externally to the headend 11. The encoder 220 outputs the respective compressed video and audio streams corresponding to the analog audio/video signal received at its input. For example, the encoder 220 can output formatted MPEG-2 or MPEG-1 packetized elementary (PES) streams or transport streams compliant to the syntax and semantics of the ISO MPEG-2 standard, respectively. The PES or transport streams may be multiplexed with input signals from a switch 230, the receiver 218, and a control system 232. The multiplexing logic 222 processes the input signals and multiplexes at least a portion of the input signals into a transport stream 240. The analog input source 208 can provide an analog audio/video broadcast signal that can be input into a modulator 227. From the modulator 227, a modulated analog output signal can be combined at a combiner 246 along with other modulated signals for transmission into a transmission medium 250. Alternatively, an analog audio/video broadcast signal from the analog input source 208 can be input into the modulator 228. Alternatively, an analog audio/video broadcast signal can be input directly from the modulator 227 to the transmission medium 250. The analog broadcast media content instances are transmitted via respective RF channels, each assigned for transmission of an analog audio/video signal such as National Television Standards Committee (NTSC) video, as described in association with FIG. 1B.
The switch, such as an asynchronous transfer mode (ATM) switch 230, provides an interface to an application server 216. There can be multiple application servers 216 providing a variety of services such as a Pay-Per-View service, including video-on-demand (VOD), a data service, an Internet service, a network system, or a telephone system. Service and content providers may download content to an application server located within the STS 10. The application server 216 may be located within the headend 11 or elsewhere within the STS 10, such as in a hub 12. The various inputs into the headend 11 are then combined with the other information from the control system 232, which is specific to the STS 10, such as local programming and control information, which can include, among other things, conditional access information.
The headend 11 contains one or more modulators 228 to convert the received transport streams 240 into modulated output signals suitable for transmission over the transmission medium 250 through the network 18. Each modulator 228 may be a multimodulator including a plurality of modulators, such as, but not limited to, QAM modulators, that radio-frequency modulate at least a portion of the transport streams 240 to become output transport streams 242. The output signals 242 from the various modulators 228 or multimodulators are combined, using equipment such as the combiner 246, for input into the transmission medium 250, which is sent via the in-band delivery path 254 to subscriber locations (not shown). The in-band delivery path 254 can include DTSs 64, 68, 72, and ATSs 60, as described with FIG. 1B.
In one embodiment, the server 216 also provides various types of data 288 to the headend 11. The data, in part, is received by media access control functions 224 that output MPEG transport packets containing data 266 instead of digital audio/video MPEG streams. The control system 232 enables the television system operator to control and monitor the functions and performance of the STS 10. The control system 232 interfaces with various components, via communication link 270, in order to monitor and/or control a variety of functions, including the frequency spectrum lineup of the programming for the STS 10, billing for each subscriber, and conditional access for the content distributed to subscribers. Information, such as conditional access information, is communicated from the control system 232 to the multiplexing logic 222, where it is multiplexed into the transport stream 240.
Among other things, the control system 232 provides input to the modulator 228 for setting the operating parameters, such as selecting certain media content instances or portions of transport streams for inclusion in one or more output transport streams 242, system-specific MPEG table packet organization, and/or conditional access information. Control information and other data can be communicated to hubs 12 (FIG. 1A) and DHCTs 16 (FIG. 1A) via an in-band delivery path 254 or via an out-of-band delivery path 256.
The out-of-band data is transmitted via the out-of-band FDS 76 of the transmission medium 250 by, but not limited to, a Quadrature Phase-Shift Keying (QPSK) modem array 226. Two-way communication utilizes the RDS 80 of the out-of-band delivery path 256. Hubs 12 (FIG. 1A) and DHCTs 16 (FIG. 1A) transmit out-of-band data through the transmission medium 250, and the out-of-band data is received in the headend 11 via the out-of-band RDS 80. The out-of-band data is routed through router 264 to the application server 216 and/or to the control system 232. The out-of-band control information includes such information as, among others, a pay-per-view purchase instruction and a pause viewing command from the subscriber location to a video-on-demand type application server located internally or external to the headend 11, such as the application server 216, as well as any other data sent from the DHCT 16 or hubs 12, all of which will preferably be properly timed. The control system 232 also monitors, controls, and coordinates all communications in the subscriber television system, including video, audio, and data. The control system 232 can be located at the headend 11 or remotely.
The transmission medium 250 distributes signals from the headend 11 to the other elements in the subscriber television system, such as the hub 12, the node 13, and subscriber locations (FIG. 1A). The transmission medium 250 can incorporate one or more of a variety of media, such as optical fiber, coaxial cable, HFC, satellite, direct broadcast, or other transmission media.
FIG. 3A is a block diagram illustration of an example DHCT 16 that is coupled to the headend 11 and to a television 341, in accordance with one embodiment of the invention. It will be understood that the DHCT 16 shown in FIG. 3A is merely illustrative and should not be construed as implying any limitations upon the scope of the preferred embodiments of the invention. For example, some of the functionality performed by applications executed in the DHCT 16 (such as an MOD application 363 (FIG. 3B)) may instead be performed completely or in part at the headend 11 and vice versa, or not at all in some embodiments. A DHCT 16 is typically situated at the residence or place of business of a user and may be a stand-alone unit or integrated into another device such as, for example, a television set, a personal computer, or another display or audio device. The DHCT 16 preferably includes a communications interface 342 for receiving signals (video, audio and/or other data) from the headend 11 through the network 18 and for providing any reverse information to the headend 11 through the network 18.
The DHCT 16 further preferably includes one or more processors, such as processor 344, for controlling operations of the DHCT 16, and a tuner system 345, which preferably comprises two tuners, tuner 1 354 and tuner 2 358, for tuning to a particular television channel or frequency to display media content and for sending and receiving various types of data or media content to and from the headend 11. The DHCT 16 may include, in other embodiments, more than two tuners for receiving downloaded (or transmitted) media content. The tuner system 345 can select from a plurality of transmission signals (FIG. 1B) provided by the subscriber television system. The tuner system 345 enables the DHCT 16 to tune to downstream media and data transmissions, thereby allowing a user to receive digital and/or analog media content delivered in the downstream transmission via the subscriber television system. The tuner system 345 includes, in one implementation, an out-of-band tuner for bi-directional QPSK data communication and two in-band QAM tuners (tuner 1 354 and tuner 2 358) for receiving television signals. The tuners 354 and 358 of the tuner system 345 can be used to simultaneously receive a plurality of signals at different carrier frequencies and/or at different program identification (PID) locations to receive different services (e.g., receiving HBO and CNN on one carrier frequency, but at different PID locations) in an MPEG-formatted signal. Additionally, a receiver 346 receives externally generated information, such as user inputs or commands from an input device, such as a remote control device 380, or from other devices.
According to another embodiment of the invention, a telephone modem (not shown) in the DHCT 16 can be utilized for upstream data transmission, and the headend 11, the hub 12 (FIG. 1A), or other components located upstream in the STS 10 (FIG. 1A) can receive data from a telephone network corresponding with the telephone modem and can route the upstream data to a destination internal or external to the STS 10, such as an application data server in the headend 11 or a content provider.
The DHCT 16 includes a signal processing system 314, which comprises a demodulating system 313 and a transport demultiplexing and parsing system 315 (herein, demultiplexing system) to process broadcast media content and/or data. One or more of the systems of the signal processing system 314 can be implemented with software, with a combination of software and hardware, or, preferably, in hardware. The demodulating system 313 comprises functionality for demodulating an RF signal carrying either an analog transmission signal or a digital transmission signal. For instance, the demodulating system 313 can demodulate a digital transmission signal in a carrier frequency that was modulated, among other possibilities, as a QAM-modulated signal.
When tuned to a carrier frequency corresponding to an analog TV signal transmission, the demultiplexing system 315 is bypassed, and the demodulated analog TV signal output by the demodulating system 313 is instead routed to an analog video decoder 316. The analog video decoder 316 converts the analog video signal (i.e., the video portion of a media content instance that comprises a video portion and an audio portion, such as NTSC video) received at its input into a respective non-compressed digital representation comprising a sequence of digitized pictures and their respective digitized audio. In one implementation, the video consists of a sequence of fields spaced apart at approximately one-sixtieth of a second. A pair of consecutive fields constitutes a picture: the odd field contains the odd-numbered lines of the picture, and the even field contains the even-numbered lines. The analog video decoder 316 outputs the corresponding sequence of digitized pictures and respective digitized audio. Each picture is a two-dimensional array of picture elements, and each picture element contains a respective set of values. A picture element value comprises luminance and chrominance information representative of the brightness and color at the spatial location of the picture element within the picture.
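The pairing of odd and even fields into a full picture described above can be sketched as follows; this is a minimal illustration of field interleaving, with all names hypothetical and each field modeled simply as a list of scan lines:

```python
def weave_fields(odd_field, even_field):
    """Interleave two fields into one full picture.

    odd_field holds the odd-numbered picture lines (1, 3, 5, ...),
    even_field the even-numbered lines (2, 4, 6, ...), as in the
    description above. Each field is a list of scan lines.
    """
    picture = []
    for odd_line, even_line in zip(odd_field, even_field):
        picture.append(odd_line)   # line 1, 3, 5, ...
        picture.append(even_line)  # line 2, 4, 6, ...
    return picture

# Two consecutive fields of two lines each weave into a four-line picture.
frame = weave_fields(["line1", "line3"], ["line2", "line4"])
```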
Digitized pictures and respective audio output by the analog video decoder 316 are presented at the input of a compression engine 317. Digitized pictures and respective audio output by the analog video decoder 316 can also be presented to a bypass 308, dedicated to non-compressed digitized analog video and audio, which acts through an interface (not shown) such as ITU-656 (International Telecommunications Union, or ITU), for display on the TV 341. The compression engine 317 is coupled to memory 349 and additionally to a local dedicated memory 309, preferably DRAM, for input and processing of the input digitized pictures and the respective digitized audio. Alternatively, the compression engine 317 can have its own integrated memory (not shown). The compression engine 317 processes the sequence of digitized pictures and digitized audio and converts them into a video compressed stream and an audio compressed stream, respectively. The compressed audio and video streams are produced in accordance with the syntax and semantics of a designated audio and video coding method, such as specified by the MPEG-2 audio and MPEG-2 video ISO (International Organization for Standardization, or ISO) standards, so that they can be interpreted by a video decoder 323 (also known as a video decompression engine) and an audio decoder 325 (also known as an audio decompression engine) for decompression and reconstruction at a future time. Each compressed stream comprises a sequence of data packets containing a header and a payload. Each header includes a unique program identification, or PID, associated with the respective compressed stream.
The compression engine 317 multiplexes the audio and video compressed streams into a transport stream, such as an MPEG-2 transport stream, for output. Furthermore, the compression engine 317 can compress audio and video corresponding to more than one media content instance in parallel (e.g., from two tuned analog TV signals when the DHCT 16 possesses multiple tuners) and multiplex the respective audio and video compressed streams into a single transport stream. The output of compressed streams and/or transport streams produced by the compression engine 317 is preferably input to the signal processing system 314. Parsing capabilities within the signal processing system 314 allow for interpretation of sequence and picture headers, for instance, annotating their locations within their respective compressed stream for future retrieval from a storage device 373. A compressed analog media content instance (e.g., a TV program episode or show) corresponding to a tuned analog transmission channel can be output as a transport stream by the signal processing system 314 and presented as input for storage in the storage device 373 via an interface 375, as will be described below. The packetized compressed streams can also be output by the signal processing system 314 and presented as input to the media engine 322 for decompression by the video decompression engine 323 and the audio decompression engine 325 for display on the TV 341, as will be described below.
The demultiplexing system 315 can include MPEG-2 transport demultiplexing. When tuned to carrier frequencies carrying a digital transmission signal, the demultiplexing system 315 enables the separation of packets of data corresponding to the compressed streams of information belonging to the desired media content instances for further processing. Concurrently, the demultiplexing system 315 excludes from further processing those packets in the multiplexed transport stream that are irrelevant or not desired, such as packets of data corresponding to compressed streams of media content instances of other media content signal sources (e.g., other TV channels).
The parsing capabilities of the demultiplexing system 315 include reading and interpreting the received transport stream without disturbing its content, such as interpreting sequence and picture headers, for instance, to annotate their locations and corresponding time offsets within their respective compressed stream for future retrieval from the storage device 373. Thus, the components of the signal processing system 314 are capable of QAM demodulation, forward error correction, demultiplexing of MPEG-2 transport streams, and parsing of elementary streams and packetized elementary streams, among other functions. A compressed media content instance corresponding to a tuned carrier frequency carrying a digital transmission signal can be output as a transport stream by the signal processing system 314 and presented as input for storage in the storage device 373 via the interface 375, as will be described below. The packetized compressed streams can also be output by the signal processing system 314 and presented as input to the media engine 322 for decompression by the video decompression engine 323 and the audio decompression engine 325, and output to an output stage 348, as will be described below.
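The core selection step described above, passing through only the packets whose PIDs belong to the desired media content instance and dropping the rest, can be sketched as a simple filter; the tuple-based packet model is an assumption for illustration (real MPEG-2 transport packets are 188 bytes with a 13-bit PID in the header):

```python
def demultiplex(transport_packets, wanted_pids):
    """Separate packets of the desired compressed streams from a
    multiplexed transport stream, excluding packets of other services.

    Each packet is modeled as a (pid, payload) tuple.
    """
    return [pkt for pkt in transport_packets if pkt[0] in wanted_pids]

# Keep the video and audio PIDs of one service; drop another channel's packets.
packets = [(0x20, b"video"), (0x21, b"audio"), (0x30, b"other-channel")]
selected = demultiplex(packets, {0x20, 0x21})
```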
One having ordinary skill in the art will appreciate that the signal processing system 314 will preferably include other components not shown, including memory, decryptors, samplers, digitizers (e.g., analog-to-digital converters), and multiplexers. Further, other embodiments will be understood by those having ordinary skill in the art to be within the scope of the preferred embodiments of the present invention, including embodiments in which analog signals (e.g., NTSC) bypass one or more elements of the signal processing system 314 and are forwarded directly to the output stage 348. Further, outputs presented at corresponding next-stage inputs for the aforementioned signal processing flow may be connected via accessible memory 349, in which the outputting device stores the output data and the inputting device thereafter reads the output data written to memory 349 by the respective outputting device. Outputting and inputting devices include the analog video decoder 316, the compression engine 317, the media engine 322, the signal processing system 314, and components or subcomponents thereof. Further, it will be understood by those having ordinary skill in the art that components of the signal processing system 314 can be spatially located in different areas of the DHCT 16. Further, it will be understood by those having ordinary skill in the art that, although the components of the signal processing system 314 are illustrated as being in communication with an incoming signal from the communications interface 342, not all signals necessarily traverse these components in the order shown.
The DHCT 16 also includes the media engine 322, which includes the digital video decoder 323 (or video decompression engine), the digital audio decoder 325 (or audio decompression engine), the output stage 348, the bypass 308, and other digital signal processing components not shown, as would be appreciated by those having ordinary skill in the art. For example, the demultiplexing system 315 is in communication with the tuner system 345 and the processor 344 to effect reception of digital compressed video streams, digital compressed audio streams, and data streams corresponding to one or more media content instances, which are separated from other media content instances and/or streams transported in the tuned transmission channel and stored in a first part (not shown) of DRAM 352 of the DHCT 16 assigned to receive packets of one or more media content instances. Other dedicated memory may also be used for media content instance packets.
Furthermore, while conducting this process, the demultiplexing system 315 demultiplexes and separates the desired compressed streams from the received transport stream without disturbing its content. Further, the demultiplexing system 315 parses (i.e., reads and interprets) compressed streams, such as to interpret sequence headers and picture headers, and deposits a transport stream carrying compressed streams of a media content instance into DRAM 352. The processor 344 causes the transport stream in DRAM 352 to be transferred to the storage device 373 via the interface 375. Under program control by the processor 344, the demultiplexing system 315, in communication with the digital video decoder 323, the storage device 373, and the processor 344, effects notification and/or transfer of received packets of one or more compressed streams corresponding to one or more media content instances from a first part of DRAM 352 to a second part (not shown) of DRAM 352 assigned to the digital video decoder 323 and the digital audio decoder 325. In other embodiments, the media engine 322 can have access to a dedicated localized DRAM, such as A/V decoder memory 306, to facilitate such transfers. Upon demultiplexing and parsing the transport stream carrying one or more media content instances, and in communication with the processor 344, the signal processing system 314 outputs to DRAM 352 ancillary data in the form of a table or data structure (not shown) comprising the relative or absolute location of the beginning of certain pictures in the compressed media content instance, for convenience in retrieval during future operations.
In another embodiment, with a plurality of tuners and a respective number of demodulating systems 313, demultiplexing systems 315, and signal processing systems 314, a respective number of broadcast digital media content instances are received and routed to the hard disk 300 of the storage device 373 simultaneously, while the necessary data annotations are performed for each of the respective compressed media streams for their future retrieval from the storage device 373. Alternatively, a single demodulating system 313, a single demultiplexing system 315, and a single signal processing system 314, each with sufficient processing capabilities, can serve to process more than one digital media content instance. One or more of the received broadcast digital media content instances routed to the storage device 373 can be routed simultaneously to the media engine 322 for decoding and display on the TV 341.
In another embodiment according to the aforementioned description, a first tuner, for example tuner 1 354 of the tuner system 345, receives an analog video signal corresponding to a first media content instance, and a second tuner, for example tuner 2 358, receives a digital compressed stream corresponding to a second media content instance. The first media content instance is processed as an analog signal and the second media content instance is processed as a digital compressed stream, as described above. The compressed digital version of the analog video signal, or the second media content instance, or both, can be routed to the storage device 373 while the respective data annotations required for future retrieval are performed simultaneously. Additionally, either or both of the media content instances can be routed simultaneously to the media engine 322 for decoding and display on the TV 341.
In one implementation, the compression engine 317 can output formatted MPEG-2 or MPEG-1 packetized elementary streams (PES) inside a transport stream, all compliant with the syntax and semantics of the ISO MPEG-2 standard. Alternatively, the compression engine 317 can output other digital formats that are compliant with other standards. The digital compressed streams output by the compression engine 317 corresponding to a media content instance are preferably deposited in the local memory 309 for the compression engine 317 and routed to the demultiplexing system 315. The demultiplexing system 315 parses (i.e., reads and interprets) the transport stream generated by the compression engine 317 without disturbing its content, such as to interpret picture headers, and deposits the transport stream into DRAM 352. The processor 344 causes the transport stream in DRAM 352 to be transferred to the storage device 373. While parsing the transport stream, the demultiplexing system 315 outputs to DRAM 352 ancillary data in the form of a table or data structure (not shown) comprising the relative or absolute location of the beginning of certain pictures in the compressed media content stream for the media content instance, for convenience in retrieval during future operations. In this way, random access operations such as fast forward, rewind, and jumping to a location in the compressed media content instance can be attained.
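The ancillary annotation table described above, a list of byte offsets marking where pictures begin, is what makes random access practical, since trick modes can seek directly to a picture boundary instead of scanning the stream. A minimal sketch, assuming an MPEG-2-style picture start code of 00 00 01 00 (the function name and in-memory model are illustrative, not the actual data structure in DRAM 352):

```python
def build_picture_index(stream, start_code=b"\x00\x00\x01\x00"):
    """Record the byte offset of every picture start code so that
    fast forward, rewind, and jump operations can later seek
    directly to a picture boundary.
    """
    offsets = []
    pos = stream.find(start_code)
    while pos != -1:
        offsets.append(pos)
        pos = stream.find(start_code, pos + 1)
    return offsets

# Two pictures at offsets 0 and 6 in a toy compressed stream.
index = build_picture_index(b"\x00\x00\x01\x00AA\x00\x00\x01\x00BB")
```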
In another embodiment, with a plurality of tuners and a respective number of analog video decoders 316 and compression engines 317, the aforementioned compression of analog video and audio is performed and routed to the hard disk 300 of the storage device 373 simultaneously for a respective number of analog media content instances. Alternatively, a single compression engine with sufficient processing capabilities can serve to compress more than one analog media content instance.
The media engine 322 also includes the output stage 348. In one implementation, the output stage 348 can include a digital encoder (DENC) (not shown) for driving the TV display. In parallel to feeding a DENC, the output stage 348 can also route the video/audio signal for output in multiple formats. Such formats can include analog component YPbPr, which can be used in High Definition Television (HDTV), some standard televisions, and digital video disk (DVD) players. Another format can be RGB component for personal computer displays. Another format can include a digital version of an analog component, which is used with fiber optic cable connections. The DENC output can feed digital-to-analog converters (DACs, not shown) for output as composite video (CVBS), also known as baseband (e.g., the V-output connection on a VCR or DVD player). In other implementations, the luma and chroma signals can be kept separate to output "separate video" through DACs (e.g., the S-video connection on a VCR or DVD player). The DENC output of the output stage 348 can also feed an RF channel 3 and 4 modulator (not shown) that feeds a DAC. In another implementation, the output stage 348 outputs in parallel through an ITU-656 output port, either for internal routing of the video or to drive an external DENC (e.g., for VCR recording). In other embodiments, the DENC can be external to the output stage 348. In other embodiments, a DENC internal to the output stage 348 can be used to drive an external DENC (not shown), and the external DENC can be used to drive a channel 3 and/or 4 RF modulator.
The DHCT 16 also includes a media memory 305 and associated components, which can include software and/or hardware to compose and store graphical information created by the processor 344. These components enable the compositing of graphical data with video into a picture for TV display, as provided by capabilities in the media engine 322.
In one implementation, compressed video and audio streams received through an in-band tuner of the tuner system 345, or read from the local storage device 373, are deposited continuously into a compressed audio and video section 306 of the media memory 305. Thereafter, one or more video decompression engines 323 within the media engine 322 decompress compressed MPEG-2 Main Profile/Main Level video streams read into the video decompression engine 323 from the compressed video buffer 306 of the media memory 305. Each picture decompressed by the video decompression engine 323 is written to a reconstruction portion 307 of the media memory 305, where the reconstructed pictures are retained.
Alternatively, the pictures may be decompressed in the video decompression engine 323 and then scaled down as they are being reconstructed, in a procedural fashion, by feeding data of the reconstructed pictures in raster-scan order from the video decompression engine 323 to a video scaling unit (not shown). According to this alternative, the scaled-down reconstructed pictures can be stored in one of multiple scaled video picture buffers (not shown) in the media memory 305 in raster-scan order as they are reconstructed, such that a respective scaled video picture buffer is dedicated to the motion video picture of a program or video object (read from the local storage device 373) included in the displayed presentation.
Additionally, one or more digital audio decompression engines 325 in the media engine 322 can decode the compressed digital audio streams associated with the compressed digital video, or read as an audio object from the local storage device 373, in a similar fashion, allocating respective buffers as necessary. It should be appreciated that in some implementations only one audio buffer may be required. Note that, in some embodiments, the system memory 349 and the media memory 305 can be unified as one physical memory device. It should be appreciated that the media memory 305 is a memory of a finite number of bytes, and it serves as a repository for different data components. Compressed MPEG-2 video streams are deposited in the A/V decoder memory 306 allocated for compressed video and compressed audio, as described above. Decompressed audio is fed into an audio port (not shown) for playback. Further information on the media memory and subcomponents thereof, in addition to other DHCT components, can be found in the patent application entitled DIGITAL SUBSCRIBER TELEVISION NETWORKS WITH LOCAL PHYSICAL STORAGE DEVICES AND VIRTUAL STORAGE, filed on Jul. 30, 2001, assigned Ser. No. 09/918,376, assigned to Scientific Atlanta, Inc., and herein incorporated by reference.
One or more programmed software applications, herein referred to as applications, are executed by utilizing the computing resources in the DHCT 16. Note that an application typically includes a client part and a server counterpart that cooperate to provide the complete functionality of the application. FIG. 3B is a block diagram of an example system memory 349. The applications may be resident in FLASH memory 351 or downloaded (or uploaded) into DRAM 352. Applications stored in FLASH memory 351 or DRAM 352 are executed by the processor 344 (e.g., a central processing unit or digital signal processor) under the auspices of the operating system 353. Data required as input by an application is stored in DRAM 352 or FLASH memory 351 and read by the processor 344 as needed during the course of application execution. Input data may be data stored in DRAM 352 by a secondary application or other source, either internal or external to the DHCT 16, or may be data anticipated by the application and thus created with the application at the time it was generated as a software application, in which case it is stored in FLASH memory 351. Data generated by an application is stored in DRAM 352 by the processor 344 during the course of application execution. DRAM 352 also includes application memory 370 that various applications may use for storing and/or retrieving data.
An application referred to as a navigator 355 is also resident in FLASH memory 351 for providing a navigation framework for services provided by the DHCT 16. The navigator 355 registers for, and in some cases reserves, certain user inputs related to navigational keys such as channel increment/decrement, last channel, favorite channel, etc. The navigator 355 also provides users with television-related menu options that correspond to DHCT functions such as, for example, blocking a display channel or a group of display channels from being displayed in a display channel menu presented on a screen display.
The FLASH memory 351 also contains a platform library 356. The platform library 356 is a collection of utilities useful to applications, such as a timer manager, a compression manager, a configuration manager, a hypertext markup language (HTML) parser, a database manager, a widget toolkit, a string manager, and other utilities (not shown). These utilities are accessed by applications via application programming interfaces (APIs) as necessary, so that each application does not have to contain these utilities. Two components of the platform library 356 that are shown in FIG. 3B are a window manager 359 and a service application manager (SAM) client 357.
The window manager 359 provides a mechanism for implementing the sharing of screen regions and user input. The window manager 359 on the DHCT 16 is responsible for, as directed by one or more applications, implementing the creation, display, and de-allocation of the limited DHCT 16 screen resources. It allows multiple applications to share the screen by assigning ownership of screen regions, or windows. The window manager 359 also maintains, among other things, a user input registry 350 in DRAM 352 so that when a user enters a key or a command via the remote control device 380 or another input device such as a keyboard or mouse, the user input registry 350 is accessed to determine which of the various applications running on the DHCT 16 should receive data corresponding to the input key and in which order. As an application is executed, it registers a request to receive certain user input keys or commands. When the user presses a key corresponding to one of the commands on the remote control device 380, the command is received by the receiver 346 and relayed to the processor 344. The processor 344 dispatches the event to the operating system 353, where it is forwarded to the window manager 359, which ultimately accesses the user input registry 350 and routes data corresponding to the incoming command to the appropriate application.
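The registration-and-dispatch pattern described above can be sketched as follows. This is a minimal illustration of a user input registry, not the actual window manager API; all class and method names are hypothetical:

```python
class UserInputRegistry:
    """Applications register interest in keys; an incoming key event
    is routed to the registered applications in registration order,
    mirroring the lookup the window manager performs on key press.
    """

    def __init__(self):
        self._registry = {}  # key name -> list of handler callbacks

    def register(self, key, handler):
        """Called as an application starts, to request certain keys."""
        self._registry.setdefault(key, []).append(handler)

    def dispatch(self, key):
        """Route an incoming key event to each registered handler."""
        for handler in self._registry.get(key, []):
            handler(key)

# One application registers for a navigation key; unregistered keys
# are simply dropped.
registry = UserInputRegistry()
received = []
registry.register("channel_up", received.append)
registry.dispatch("channel_up")
registry.dispatch("volume_up")  # no handler registered; ignored
```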
The SAM client 357 is a client component of a client-server pair of components, with the server component (not shown) being located at the headend 11, preferably in the control system 232 (FIG. 2). A SAM database 360 (i.e., structured data such as a database or data structure) in DRAM 352 includes a data structure of services and a data structure of channels that are created and updated by the headend 11. Herein, "database" will refer to a database, structured data, or other data structures, as is well known to those of ordinary skill in the art. Many services can be defined using the same application component with different parameters. Examples of services include, without limitation and in accordance with one implementation, presenting television instances (available through a WatchTV application 362), pay-per-view events (available through a PPV application 364), digital music (not shown), media-on-demand (available through an MOD application 363), and an interactive program guide (IPG) 397. In general, the identification of a service includes the identification of an executable application that provides the service, along with a set of application-dependent parameters that indicate to the application the service to be provided. As an example, a service of presenting a television instance (media content instance) could be executed by the WatchTV application 362 with a set of parameters specifying HBO to view HBO, or with a separate set of parameters to view CNN. Each association of the application component (tune video) and one parameter component (HBO or CNN) represents a particular service that has a unique service ID. The SAM client 357 also interfaces with the resource manager 367 to control resources of the DHCT 16.
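The service model described above, one executable application reused with different parameters, each pairing carrying a unique service ID, can be sketched as a small lookup table. The table contents, names, and function are hypothetical illustrations of the idea, not the actual SAM database 360:

```python
# Hypothetical service table: each unique service ID maps to an
# (application, parameters) pair, so the same application defines
# many services when paired with different parameters.
SERVICES = {
    1001: ("WatchTV", {"channel": "HBO"}),
    1002: ("WatchTV", {"channel": "CNN"}),  # same application, new parameters
    1003: ("PPV", {"event": "title-fight"}),
}

def resolve_service(service_id, services=SERVICES):
    """Return the executable application that provides the service and
    the application-dependent parameters telling it which service
    to provide.
    """
    application, parameters = services[service_id]
    return application, parameters
```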
Applications can also be downloaded into DRAM 352 at the request of the SAM client 357, typically in response to a request by the user or in response to a message from the headend 11. In the example DHCT memory illustrated in FIG. 3B, DRAM 352 includes a media-on-demand (MOD) application 363, an e-mail application 365, a PVR application 377, and a web browser application 366. It should be clear to one with ordinary skill in the art that these applications are not limiting and merely serve as examples for embodiments of the invention. Furthermore, one or more DRAM-based applications may, as an alternative embodiment, be resident in FLASH memory 351. These applications, and others provided by the subscriber television system operator, are top-level software entities on the network for providing services to the user.
In one implementation, applications executing on the DHCT 16 work with the navigator 355 by abiding by several guidelines. First, an application utilizes the SAM client 357 for the provision, activation, and suspension of services. Second, an application shares DHCT 16 resources with other applications and abides by the resource management policies of the SAM client 357, the operating system 353, and the DHCT 16. Third, an application handles situations where resources are only available with navigator 355 intervention. Fourth, when an application loses service authorization while providing a service, the application suspends the service via the SAM (the navigator 355 will reactivate an individual service application when it later becomes authorized). Finally, an application is designed to not have access to certain user input keys reserved by the navigator 355 (e.g., power, channel +/-, volume +/-).
The MOD application 363 provides the user with lists of available media content titles to choose from and with the media content instances requested by the user. The MOD application 363 provides media content to the user by typically engaging in a direct two-way IP (Internet Protocol) connection with VOD content servers (not shown) that would be located, in one embodiment, in the headend 11.
An executable program or algorithm corresponding to an operating system (OS) component, or to a client platform component, or to an application, or to respective parts thereof, can reside in and execute out of DRAM 352 and/or FLASH memory 351. Likewise, data input into or output from any executable program can reside in DRAM 352 or FLASH memory 351. Furthermore, an executable program or algorithm corresponding to an operating system component, or to a client platform component, or to an application, or to respective parts thereof, can reside in FLASH memory 351, or in a local storage device (such as the storage device 373) externally connected to or integrated into the DHCT 16, and be transferred into DRAM 352 for execution. Likewise, data input for an executable program can reside in FLASH memory 351 or in a storage device and be transferred into DRAM 352 for use by an executable program or algorithm. In addition, data output by an executable program can be written into DRAM 352 by an executable program or algorithm and be transferred into FLASH memory 351 or into a storage device. In other embodiments, the executable code is not transferred; instead, functionality is effected by other mechanisms.
Referring again to FIG. 3A, the DHCT 16 may also include one or more wireless or wired interfaces, also called communication ports 374, for receiving data from and/or transmitting data to other devices. For instance, the DHCT 16 may feature USB (Universal Serial Bus), Ethernet (for connection to a computer), IEEE-1394 (for connection to media content devices in an entertainment center), serial, and/or parallel ports. The user inputs may be provided, for example, by an input device including a computer or transmitter with buttons or keys located on the exterior of the terminal, by a hand-held remote control device 380 or keyboard that includes user-actuated buttons, or even by aural input (e.g., voice activation).
The DHCT 16 includes at least one storage device 373 to provide storage for downloaded media content. The storage device 373 can be an optical storage device or a magnetic storage device, and is preferably a hard disk drive. The storage device 373 comprises storage for media content and/or data that can be written to for storage and later read from for retrieval for presentation. The storage device 373 preferably includes two hard disks 300 and 301, each including a corresponding buffer space, TSB1 376 and TSB2 378, as will be explained further below. Alternatively, the DHCT 16 can be coupled to two storage devices, each with one hard disk. Alternatively, a storage device can be used that uses different buffer spaces on one hard disk, or the storage device can include more than two hard disks, or platters. Throughout this disclosure, references relating to writing to or reading from the storage device 373, or references regarding recordings from or to the storage device 373, will be understood to mean that such read or write operations occur to the actual medium (for example, the hard disk 300 and/or 301) of the storage device 373. The storage device 373 also comprises a controller 379 that receives operating instructions from a device driver 311 of the operating system 353 (as described below) and implements those instructions to cause read and/or write operations to the hard disks 300 and/or 301.
The device driver 311 communicates with the storage device controller 379 to format the hard disks 300 and 301, causing the hard disks to be divided radially into sectors 304 and concentric circles called tracks 302, as illustrated by the schematic diagram of the example hard disk 300 in FIG. 3C. It will be understood by one having ordinary skill in the art that the discussion that follows pertains to hard disk 301 as well as hard disk 300. Note from FIG. 3C that the same number of sectors 304 per track 302 is illustrated, but other embodiments with a different number of tracks per side, sectors per track, bytes per sector, and different zones of tracks, are within the scope of the preferred embodiments of the invention. The sector 304 is the basic unit of storage on the hard disk 300. In one implementation, each sector 304 of a hard disk 300 can store 512 bytes of user data. While data is preferably stored in 512-byte sectors on the hard disk 300, the cluster, such as example cluster 303, is typically the minimum unit of data storage the operating system 353 uses to store information. Two or more sectors on a single track make up a cluster.
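The sector-to-cluster arithmetic described above can be sketched as follows; the sectors-per-cluster count used in the example is an assumed illustration, not a value from the disclosure.

```python
# Illustrative sketch of the storage units described above: 512-byte sectors
# are the hard disk's basic unit of storage, while the operating system
# allocates in clusters of two or more sectors. The sectors-per-cluster
# grouping chosen below is an assumed example.
SECTOR_SIZE = 512  # bytes of user data per sector, per the description

def cluster_size(sectors_per_cluster: int) -> int:
    """Bytes per cluster for a given sectors-per-cluster grouping."""
    if sectors_per_cluster < 2:
        raise ValueError("a cluster comprises two or more sectors")
    return sectors_per_cluster * SECTOR_SIZE
```

For instance, grouping eight sectors per cluster yields 4096-byte clusters, so even a small write consumes a full cluster of disk space, which is why cluster bookkeeping in the FAT (described below) is done per cluster rather than per sector.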
Referring again to FIGS. 3A and 3B, the storage device 373 is preferably internal to the DHCT 16, coupled to a common bus through the communication interface 375, preferably an integrated drive electronics (IDE) interface or small computer system interface (SCSI), although IEEE-1394 or USB can be used. In other embodiments, the storage device 373 can be externally connected to (and thus removable from) the DHCT 16 via the communication port 374 implemented as IEEE-1394 or USB or as a data interface port such as a SCSI or an IDE interface. In one implementation, under the auspices of the real-time operating system 353 (as described below) and executed by the processor 344, and in coordination with the personal video recording (PVR) application 377, the device driver 311, and the device controller 379 (the latter three components described below), downloaded media content (herein understood to also refer to other types of data, in addition to, or instead of, media content instances) is received in the DHCT 16 via the communications interface 342, processed as described above, and stored in a temporary cache (not shown) in memory 349.
The temporary cache is implemented and managed to enable media content transfers from the temporary cache to the storage device 373 in concert with the insertion of newly arriving media content into the temporary cache. In one implementation, the fast access time and high data transfer rate characteristics of the storage device 373 enable media content to be read from the temporary cache in memory 349 and written to the storage device 373 in a sufficiently fast manner. Orchestration of multiple simultaneous data transfer operations is effected so that while media content is being transferred from the cache in memory 349 to the storage device 373, new media content is received and stored in the temporary cache of memory 349. In other implementations, the downloaded media content is received through the communications port 374 in the DHCT 16 and then transferred directly to the storage device 373, thus bypassing the temporary cache.
The operating system 353, the device driver 311, and the controller 379 communicate under program execution in the processor 344 and/or via the interrupt and messaging capabilities of the DHCT 16, and thus cooperate to create a special file, called a file allocation table (FAT) (not shown), in one of the hard disk sectors of each hard disk 300 and 301. The FAT is where the operating system 353 stores the cluster and file information about each of the hard disks 300 and 301, including which clusters are assigned to a file and thus which media content instance files they store. The operating system 353 can determine where a file's data is located by using the directory entry (not shown) for the file and the entries of the FAT. The directory entry gives information about a directory, such as its related files and subdirectories, creation time, and special permissions. A FAT entry describes the physical locations of data associated with media content downloaded to the hard disks 300 and 301 of the storage device 373. The FAT also keeps track of which clusters are free, or open, and thus available for use. Updates to the FAT are provided for by the operating system 353, or the device driver 311, or a combination of both. Writes to each of the hard disks 300 and 301 are coordinated between the PVR application 377 (described below), the operating system 353, the device driver 311, and the storage device controller 379.
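A minimal sketch of the FAT bookkeeping just described follows: a directory maps filenames to a first cluster, each FAT entry links a cluster to the next one in its file's chain, and free clusters remain available for allocation. The class and method names are hypothetical, chosen only to illustrate the structure.

```python
# Hedged sketch (not the actual firmware) of a file allocation table: the
# directory maps a filename to its first cluster, and the FAT array links
# each cluster to the next cluster of the same file, with free clusters
# marked available for reuse.
FREE = 0          # sentinel for an unused, writeable cluster
END_OF_CHAIN = -1 # sentinel terminating a file's cluster chain

class SimpleFAT:
    def __init__(self, n_clusters):
        self.fat = [FREE] * n_clusters  # next-cluster link per cluster
        self.directory = {}             # filename -> first cluster index

    def allocate(self, filename, n_needed):
        """Assign n_needed free clusters to filename and chain them."""
        free = [i for i, v in enumerate(self.fat) if v == FREE][:n_needed]
        if len(free) < n_needed:
            raise OSError("insufficient free clusters")
        for cur, nxt in zip(free, free[1:]):
            self.fat[cur] = nxt          # link each cluster to the next
        self.fat[free[-1]] = END_OF_CHAIN
        self.directory[filename] = free[0]
        return free

    def clusters_of(self, filename):
        """Walk the chain to recover the file's physical cluster list."""
        chain, cur = [], self.directory[filename]
        while cur != END_OF_CHAIN:
            chain.append(cur)
            cur = self.fat[cur]
        return chain
```

Deleting a media content instance from a TSB, in this model, amounts to marking its chain's clusters FREE again rather than erasing data, which matches the disclosure's description of deletion as making clusters writeable.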
The PVR application 377, the operating system 353, and the device driver 311 execute respective programmed instructions in the processor 344. The processor 344, the storage controller 379, and the demultiplexing system 315 communicate via interrupt and messaging capabilities of the DHCT 16. The PVR application 377, in communication with the operating system 353, the device driver 311, the storage device controller 379, and the demultiplexing system 315, effects retrieval of compressed video streams, compressed audio streams, and data streams corresponding to one or more media content instances from the storage device 373. The retrieved streams are deposited in an output cache in the storage device 373, transferred to DRAM 352, and then processed for playback according to mechanisms well known to those having ordinary skill in the art. In some embodiments, the media content instances are retrieved and routed from the hard disks 300 and/or 301 to the video and audio decoding systems 323 and 325 simultaneously, and then further processed for eventual presentation on a display device or other device.
The PVR application 377 provides for media content recording functionality by enabling the temporary writing to, and if requested, more permanent recording (i.e., relatively permanent) to the storage device 373. Media content can be transmitted (or downloaded) from a remote location, such as, for example, a remote server located in the head end 11, or from a home communication network, or from other consumer electronic devices. Downloaded media content that is received at each tuner of the tuner system 345 is temporarily buffered, or stored, on the hard disk of the storage device. The corresponding space on each hard disk is called a buffer space, or a time shift buffer (TSB). In a preferred embodiment, each tuner in the tuner system 345 has a respective TSB. In one implementation, tuner 1 354 receives media content for buffering to TSB1 376. Likewise, the second tuner 358 receives media content for buffering to TSB2 378. Moreover, media content instances sourced from a device such as a camera attached to the DHCT 16 via the communication port 374 have a respective TSB (not shown). Note that buffering is understood to mean temporarily storing into the buffer spaces (or TSBs) of the storage device 373 media content received from a locally attached device, from reception of a broadcast digital channel, and/or from a digital compressed version of a broadcast analog channel, and/or data.
Under normal operation, the PVR application 377 effectively associates a temporary recording designation with the media content received into the TSBs. The media content stored in the TSBs will either be deleted (i.e., the clusters storing the media content will be configured as writeable for eventual write operations that overwrite the media content within those clusters) or retained (through election by the user, as one example) as a permanent recording. A permanent recording will be understood to include media content that is stored for an extended period of time as decided by the user. Permanent recordings are stored in non-buffer clusters (i.e., not in clusters assigned to the TSBs) in instances when the user elects in advance to make a scheduled recording of a media content instance that has not yet been tuned to at the DHCT 16. A permanent recording can also be achieved by selecting a media content instance stored in the TSBs and designating the media content instance as permanent. In this latter implementation, the designated media content is stored in clusters that are re-designated from TSB clusters to permanent recording clusters (non-buffer clusters). To compensate for the re-designation of clusters to a permanent recording, the device driver 311 preferably assigns and associates an equivalent number of clusters to the TSB, obtained from a pool of available unused and/or writeable (e.g., repossessed) clusters, thus permitting continuance of normal TSB behavior and management. Thus, permanent recordings will preferably be more permanent than media content in the TSBs, and permanent recordings can eventually be deleted from the disk space, typically at the explicit request of a user, as one example.
There is a duration associated with the TSBs, which represents how much data is held by the TSBs. This duration could represent, in one embodiment, actual media content instance time. The PVR application 377, in such a time-duration embodiment, will preferably maintain a substantially constant buffer space capacity suitable for a certain duration of media content instance time, for example, 3-4 hours' worth of media content instances. Media content instance-time tracking is related to hard disk space tracking if a constant data rate, or buffering rate, is assumed or estimated. In a preferred embodiment, the duration of the TSBs represents hard disk space. The PVR application 377 can set a buffer size capacity, for example 3 gigabytes (GB), and then track the disk space used for the TSBs to ensure a substantially constant TSB capacity. For example, before the PVR application 377 effects a write to the storage device 373, it can query the device driver 311 (through the operating system 353) to determine the available hard disk space. After the write operation, the PVR application 377 again can poll the device driver 311 to get an update on available hard disk space.
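The constant-capacity behavior described above can be sketched as a circular log: before a write would exceed the configured buffer budget, the oldest buffered segments are reclaimed. The class name, segment granularity, and capacity value below are illustrative assumptions, not details from the disclosure.

```python
# Hedged sketch of substantially-constant TSB capacity: writes that would
# exceed the configured byte budget first evict the oldest buffered media,
# so total usage stays at or below the budget.
from collections import deque

class TimeShiftBuffer:
    def __init__(self, capacity_bytes):
        self.capacity = capacity_bytes
        self.used = 0
        self.segments = deque()  # sizes of buffered segments, oldest first

    def write(self, size):
        """Buffer a new segment, reclaiming the oldest segments as needed."""
        while self.used + size > self.capacity and self.segments:
            self.used -= self.segments.popleft()  # oldest media made writeable
        self.segments.append(size)
        self.used += size
```

With, say, a 3 GB budget, continuous buffering settles into a steady state where each new write displaces roughly an equal amount of the oldest content, which is the "substantially constant TSB capacity" the text describes.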
The TSBs can be managed according to several mechanisms. In one embodiment, each media content instance that is received at either of the tuners of the tuner system 345 prompts the PVR application 377 to cause the media content instance to be downloaded to the hard disk 300 or 301 and associated as a media content instance file under a designated media content instance filename. This media content instance filename is recorded in a FAT that maintains a list of the corresponding clusters storing the media content instance file. The PVR application 377 also creates a management file that maintains a data record that preferably points to the corresponding filename and includes, among other elements, guide data and the receipt time of the downloaded media content instance. The guide data includes the scheduled start time and stop time of the downloaded media content instance as well as other attributes and information that pertain to the media content instance.
The receipt of the downloaded media content instance is also recorded by the PVR application 377 (through coordination with the operating system 353 and an internal clock 372) as a real-time value. The PVR application 377 is alerted to the start of a media content instance, in one implementation, by a keypress event (e.g., when a user tunes to a desired display channel). In another implementation, the PVR application 377 can use a polling or timing mechanism (via timer 371, as one example) in cooperation with the internal real-time clock 372 and guide data. The PVR application 377 provides the operating system 353 with the scheduled stop time (from guide data, such as from an interactive program guide) of the downloaded media content instance in order to set up a timer interrupt with the operating system 353 (or, in other embodiments, polls the operating system 353). The operating system 353, in coordination with the real-time clock 372 within the DHCT 16, alerts the PVR application 377 (FIG. 3B) to the end of the received media content instance.
Further, the PVR application 377 preferably maintains the management files with an organization mechanism such as a linked list, wherein each management file is associated with one of the media content instances located on the hard disks 300 and 301. A read request for one of the downloaded media content instances in the TSB 378 occurs by the PVR application 377 searching the linked list for the requested media content instance and providing a graphical user interface (GUI) (not shown) on a display screen based on the information maintained in the corresponding management file. Furthermore, a bi-directional linked-list mechanism can be employed for arbitrary entry and for searching forward or backward among media content instances.
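One way to picture the bi-directional linked list of management files is sketched below; the node fields (filename, guide title, receipt time) follow the management-file contents described above, but all names are hypothetical illustrations.

```python
# Illustrative sketch of the doubly linked list of management files: each
# node references its media content instance file and guide data, and links
# to both neighbors so a search can enter at an arbitrary node and walk
# forward or backward.
class ManagementFile:
    def __init__(self, filename, guide_title, receipt_time):
        self.filename = filename        # media content instance file on disk
        self.guide_title = guide_title  # guide data, simplified to a title
        self.receipt_time = receipt_time
        self.prev = None
        self.next = None

def link(*nodes):
    """Chain management files into a doubly linked list; return the head."""
    for a, b in zip(nodes, nodes[1:]):
        a.next, b.prev = b, a
    return nodes[0]

def find(node, title, forward=True):
    """Search forward or backward from `node` for a matching guide title."""
    while node is not None:
        if node.guide_title == title:
            return node
        node = node.next if forward else node.prev
    return None
```

The backward direction is what permits "arbitrary entry": a search can begin at the node for the currently viewed instance and rewind through earlier buffered instances without restarting from the head.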
Further information pertaining to this embodiment for creating and maintaining the TSBs can be found in the patent application entitled, SYSTEM AND METHOD FOR CONTROLLING SUBSTANTIALLY CONSTANT BUFFER CAPACITY FOR PERSONAL VIDEO RECORDING WITH CONSISTENT USER INTERFACE OF AVAILABLE DISK SPACE, filed Dec. 6, 2001 under Ser. No. 10/010,270 and assigned to Scientific Atlanta, herein incorporated by reference.
Another embodiment for maintaining and managing the TSBs includes associating a single file with each TSB, and controlling the allocation and deallocation of clusters in the disk space at the device driver 311 level. In this embodiment, further described in the patent application entitled, DISK DRIVER CLUSTER MANAGEMENT OF TIME SHIFT BUFFER WITH FILE ALLOCATION TABLE STRUCTURE, filed Dec. 5, 2001 under Ser. No. 10/005,628 and assigned to Scientific Atlanta, herein incorporated by reference, the PVR application 377 requests the allocation of disk space for a single file for each TSB. For each TSB 378, the device driver 311, implemented as either a separate software module or integrated with the operating system 353, allocates enough clusters and assigns them to the respective file to meet the size requirement designated by the PVR application 377. Media content instances downloaded and written to the TSBs are preferably tracked by time. The device driver 311 provides a software-generated pointer, called Normal Play Time (NPT), which points to locations within files and locations within media content instances within those files. Based on the Lightweight Stream Control Protocol, NPT can be thought of as the clock associated with a video asset (as distinguished from the real-time clock 372 for the DHCT 16).
For every file that is created for media content downloaded to the storage device 373, an NPT is generated. There is an NPT for the read head of the storage device 373 and for the write head of the storage device 373. For writing media content to the storage device 373 for a newly created file (e.g., a TSB1 file), an NPT is created for the write head of the storage device 373 with an initial value of zero. In one implementation, the device driver 311 receives a periodic interrupt (for example every 5-10 msec) set up by the PVR application 377 through the computer services of the operating system 353. This interrupt is synchronized with the internal real-time clock 372 of the DHCT 16 in order to advance the pointer (i.e., the NPT) at a substantially constant rate. The NPT continues to increase in value (from an initial value of zero) until the associated file is closed. For the read head of the storage device 373, the NPT starts at zero at the start of the file, advances in real time in normal play mode, advances faster than real time in fast-forward mode, decrements in rewind mode, and is fixed when the video is paused.
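The read-head NPT behavior just described can be sketched as a clock whose rate depends on the trick-play mode; the tick stands in for the periodic 5-10 ms interrupt, and the specific rate multipliers are assumed for illustration.

```python
# Hedged sketch of a read-head Normal Play Time pointer: it starts at zero,
# advances in real time during normal play, runs faster in fast-forward,
# decrements in rewind, and is fixed while paused. Rate values are assumed.
RATES = {"play": 1.0, "ffwd": 4.0, "rewind": -4.0, "pause": 0.0}

class NPT:
    def __init__(self):
        self.value = 0.0   # seconds of normal play time from start of file
        self.mode = "play"

    def tick(self, elapsed_real_seconds):
        """Advance the pointer for one real-time interval; clamp at zero."""
        self.value = max(0.0, self.value + RATES[self.mode] * elapsed_real_seconds)
        return self.value
```

In the actual system the tick would be driven by the interrupt synchronized with the real-time clock 372, so the pointer advances at a substantially constant rate per mode.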
The PVR application 377 maintains a data structure for every downloaded media content instance. There are one or more data structures preferably maintained on the hard disks 300 and 301 of the storage device 373, but the data structures can also be maintained in memory 349. The data structures include, for example, the NPT values defining the start and end times of the downloaded media content instance, the real-time values corresponding to the start and end times of the media content instance, as well as the corresponding media content instance guide data, among other elements. The device driver 311 maintains the mapping between NPT and the cluster and sector locations of the media content in a separate look-up table data structure (not shown), preferably located on the hard disks 300 and 301. In one embodiment, the device driver 311 can sample the current write location (i.e., the cluster and sector location provided by the storage device controller 379) as the write head of the storage device 373 advances and store that cluster and sector location in the look-up table data structure along with a corresponding NPT value. This sampling can occur, for example, every 5-10 msec. In an alternative embodiment, the device driver 311 can record an initial sample and, through an estimation algorithm (e.g., interpolation), estimate file locations and locations within those files. When the PVR application 377 references a particular media content instance (for example, where a user seeks to rewind to a downloaded media content instance on the hard disk 300), the PVR application 377 passes the stored start and stop NPT values for that media content instance to the device driver 311, and the device driver 311 determines the hard disk locations from the look-up table data structure. The PVR application 377 correlates NPT read values for locations within the media content instances to the real-time clock value.
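The look-up table variant with interpolation can be sketched as below; for simplicity the on-disk location is modeled as a single byte offset rather than a (cluster, sector) pair, and the function name is hypothetical.

```python
# Hedged sketch of mapping an NPT value to an on-disk location from sampled
# (NPT, location) pairs, with linear interpolation between samples as in the
# estimation-algorithm embodiment. Locations are simplified to byte offsets.
from bisect import bisect_right

def location_for_npt(samples, npt):
    """samples: (npt_seconds, byte_offset) pairs, sorted by NPT,
    taken periodically while the write head advances."""
    times = [t for t, _ in samples]
    i = bisect_right(times, npt) - 1
    if i < 0:
        return samples[0][1]          # before first sample: clamp to start
    if i >= len(samples) - 1:
        return samples[-1][1]         # at/after last sample: clamp to end
    (t0, o0), (t1, o1) = samples[i], samples[i + 1]
    return int(o0 + (o1 - o0) * (npt - t0) / (t1 - t0))
```

Interpolation trades look-up table size for accuracy: sparse samples suffice if the buffering data rate between samples is roughly constant, matching the text's note that instance-time tracking relates to disk-space tracking under a constant-rate assumption.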
With the real-time start and stop values and guide data maintained in a data structure, as well as the correlated read-NPT to real-time values, the PVR application 377 can produce a GUI that provides the user with information that includes what portion of a buffered media content instance the user is currently viewing.
As described above, the user preferably permanently records from the TSBs by designating as permanent a currently viewed media content instance during real-time viewing, or by returning (e.g., rewinding) to any part of a media content instance in the TSBs and selecting record from a remote device 380, or alternatively, by selecting a record button (not shown) on the DHCT 16. An example remote control device 380 to provide input to the DHCT 16 is illustrated in FIG. 3D. A display channel is selected, and changed, by a user, typically by pressing a key or button on the remote control device 380. Rewind 388 and fast-forward 387 buttons enable a user to access buffered media content instances in the TSBs 376 and 378. A record button 390 enables the user to designate as permanently recorded any media content instance buffered into the TSBs 376 and 378, as described below. A pause button 391 (and a stop button 389) enables the user to pause a media content instance, or pause during a search for a particular media content instance. A playback button 392 enables the playback of a media content instance. “A” 381, “B” 382, and “C” 383 buttons can correspond to certain application-defined functions that have a corresponding “A”, “B”, or “C” symbol displayed in a GUI presented on a display device. A list button 384 can be used to invoke various PVR application 377 user interface screens. Also included are a select button 393 for selecting an option on a display screen, and up and down arrows 394 and 395 for scrolling through displayed options. Many alternative methods of providing user input may be used, including a remote control device with different buttons and/or button layouts, a keyboard device, a voice-activated device, etc. The embodiments of the invention described herein are not limited by the type of device used to provide user input.
FIG. 4 is a schematic diagram illustrating an example scenario in a two-tuner (and two-buffer) system that would require the establishment of priorities for downloading media content from a plurality of display channels, in accordance with one embodiment of the invention. One or more of these display channels can also be presented to a display device, such as a television 341, as shown by the dotted line from the second display channel. For example, assume the user was initially watching media content of the first display channel, and then selected a second display channel for display (while buffering and displaying the first display channel). This display of media content of the second display channel can be a time-shifted display (sourced from the buffer), or the buffering and the display can occur in parallel. For the descriptions that follow, it will be assumed that the newly requested display channel will always be displayed in favor of the current display. Shown are two hard disks 300 and 301. These hard disks are preferably located in the storage device 373, as described above. The hard disk 300 includes a buffer space (TSB1) 376 for receiving and storing downloaded media content from a first display channel. The first display channel media content is preferably provided from the head end 11 (FIG. 2) to one of the tuners, for example tuner 1 354 of the tuner system 345 (FIG. 3A). The second display channel can be received at tuner 2 358 of the tuner system 345. The hard disk 301 includes a buffer space (TSB2) 378 for receiving and storing downloaded media content of the second display channel. Alternatively, TSB1 376 and TSB2 378 can be buffer spaces on a single hard disk (e.g., one physical hard disk, or other storage medium, partitioned to function and be controlled as two “logical” hard disks), or on disks in separate storage devices.
While tuner 1 354 receives media content of the first display channel for storage into TSB1 376 and display on the TV 341, assume a user changes from the first display channel to the second display channel. In order to receive media content at tuner 2 358, the operating system 353 (FIG. 3B) preferably performs a resource query to determine whether the resources exist to tune and buffer both display channels. For instance, if tuner 2 358 were currently resourced to perform a scheduled recording, then the second display channel, in one implementation, may not be received at tuner 2 358, and thus the content of the second display channel may have to be received at tuner 1 354 and TSB1 376 at the expense of the media content stored in TSB1 376 and received from the first display channel.
In another implementation, if the second display channel is transmitted over the same center RF frequency as the first display channel, as determined by the operating system 353 (FIG. 3B) from a display channel number to center frequency association table (not shown) in memory 349 (FIG. 3A), resourcing of tuner 2 358 is not necessary, and media content of both the first and second display channels is received via tuner 1 354 and deposited into the respective TSB1 376 and TSB2 378.
Assume the user changed from the first display channel to the second display channel, and that the second tuner 358 is an available resource. The media content of the first display channel, in one embodiment, will not be "deleted" (i.e., written over or its associated clusters made writeable) but instead retained for now, and media content (after the display channel change) will continue to be received at tuner 1 354 and downloaded into TSB1 376 from the first display channel. The point in time when the user resources tuner 2 358 to receive the second display channel is "marked" and stored in memory 349, and thereafter copied to a PVR application data structure (not shown) associated with the buffer spaces of the storage device 373 (FIG. 3A) when enough time of buffering the second display channel elapses. Buffering to the storage device 373 is preferably effected by caching the data to memory 349 (FIG. 3A) and thereafter reading the respective data from memory 349 and writing it to the storage device 373. The buffering is timely orchestrated so that previously deposited data in memory 349 is read while additional data is being written to memory. Further, a pointer to the location on the hard disk 300 where the media content of the first display channel is buffered when tuner 2 358 is resourced is also recorded in the data structure to enable later access and/or retrieval, as described below.
In an alternate implementation, memory 349 (FIG. 3A) in the DHCT 16 (FIG. 3A) is of sufficient size to permit buffering of data for up to a predetermined elapsed time, defined by a threshold, before proceeding to transfer the cached data in memory 349 to the storage device 373 (FIG. 3A). Consequently, less bookkeeping is required on the storage device 373.
The amount of elapsed time deemed to be sufficient is determined by comparison to a first programmable threshold value. The threshold value can be preset at compilation time by the application developer. In one preferred embodiment, the programmer programs the threshold to have an initial default value that can be modified over time by the viewer according to the viewer's preference via an interactive configuration application (not shown), in which the viewer makes selections in a displayed graphical user interface (GUI) by entering information with key presses or by entering alphanumeric information with an input device such as a remote control device 380 (FIG. 3D). For instance, desired preferences and configuration can be entered as part of an overall general settings application 312 (FIG. 3B).
The PVR application 377 (FIG. 3B) recognizes the keypress event associated with the display channel change, and determines the time of the key press from time and clock provisions in the DHCT 16 (FIG. 3A) and from services of the operating system 353 (FIG. 3B). The time of the key press is stored in memory 349 (FIG. 3A), and thereafter, if enough time elapses while buffering the second display channel, as defined by a first threshold, for example, the time of the key press is also annotated (written) into the storage device 373 with annotations associated with the respective media content. Annotations include information required to retrieve the media content downloaded to the storage device 373 (FIG. 3A). Such information includes the time that the media content was received, information to fulfill playback and other navigating modes, characteristics of the media content, elapsed time of buffering, and a pointer to the location of the downloaded media content, among other information. If enough time elapses, an address marker associated with the start of where the media content corresponding to the second display channel is downloaded is also stored on the hard disk 300 or 301 (FIG. 3A) in the storage device 373 (FIG. 3A) as part of the media content's annotations. The PVR application 377 stores these values in memory until it determines that sufficient time (according to the first threshold) has elapsed and then also stores them on the hard disk 300 or 301 by reading their respective information from memory 349 and writing it to the hard disk 300 or 301.
The PVR application 377 (FIG. 3B) waits until a next input state regardless of whether enough time has elapsed to store annotations in the storage device 373 (FIG. 3A). A next input state from a set of possible next input states includes a viewer's input such as a keypress event that is possibly associated with returning to the first display channel (when that occurs). A next input state can alternatively correspond to a key press corresponding to the currently displayed channel (that is, the second display channel) or to a third display channel. A key press that selects the second display channel while the user is currently viewing the media content of the second display channel may be a viewer's mistake in conventional television systems. However, in the preferred embodiments, it can actually signify an effect desired by the viewer, such as giving precedence to buffering the second display channel over buffering the first display channel. In one embodiment, this viewer-desired precedence behavior for buffering the second display channel becomes effective indefinitely. In an alternate embodiment, the viewer-desired precedence behavior for buffering the second display channel becomes effective if the viewer subsequently selects to display a third display channel within a specific amount of elapsed time corresponding to less than a second threshold.
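The first-threshold rule running through the preceding paragraphs reduces to a simple elapsed-time test: the key-press time is held in memory and annotations are committed to the storage device only once the viewer has stayed on the new channel long enough. The default value below is an assumed illustration; per the text it would be developer-set and viewer-configurable.

```python
# Hedged sketch of the first-threshold test: annotations for a newly tuned
# display channel are written to the storage device only after enough
# buffering time has elapsed since the channel-change key press. The default
# threshold value is an assumed placeholder, not from the disclosure.
FIRST_THRESHOLD_SECONDS = 30.0  # assumed default; configurable by the viewer

def should_annotate(keypress_time, now, threshold=FIRST_THRESHOLD_SECONDS):
    """True once enough buffering time has elapsed on the new channel."""
    return (now - keypress_time) >= threshold
```

Until this predicate becomes true, the marker and key-press time live only in memory 349; once true, they are also written to the hard disk as part of the media content's annotations.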
The return to the first display channel causes the PVR application 377 (FIG. 3B) to, in one implementation, return to the point in the media content instance where the user left off viewing the media content associated with the first display channel to go to the media content of the second display channel. In other implementations, according to a programmed default behavior or to a viewer's configured preference, the user can be returned to the real-time tuned position to receive the current media content, with the ability to rewind back to where the user left the media content of the first display channel. The first display channel and the second display channel can download a single media content instance to each of the buffer spaces 376 and 378 between display channel changes, or a plurality of media content instances can be downloaded to each respective buffer space, depending on the amount of time, or the download duration, that the media content is received in each respective buffer space. In other words, the user is free to change the display channel at any point within or after a media content instance presentation, and thus at any point in time, either buffer space can temporarily store, for example, a portion (e.g., 30 seconds) of a media content instance, or enough media content instances to reach buffer space capacity, among other examples. Such buffering behavior is determined in part by a set of programmed thresholds, including a first and a second threshold as described above. Alternatively, each threshold in the set of thresholds can be fixed to a respective empirically determined value by the application developer. In another embodiment, some of the thresholds in the set may be programmed to be configurable and others fixed to set values.
When the user decides to change display channels (e.g., switch to a third display channel), the media content of the third display channel can go to either TSB1 376 or TSB2 378. In one embodiment, according to a programmed behavior, when the media content of the third display channel is downloaded to one of these buffer spaces, the media content already stored there is deleted (e.g., overwritten or made writeable) in order to receive and store the media content of the third display channel. A user can implement a preference as to which buffer space's media content to delete.
In other embodiments, a currently buffered display channel's media content is deleted after a display channel change according to a set of controlling rules based on the respective values of a set of input variables that effect one or more outcomes or resulting behaviors. The controlling rules are programmed by the application developer and employ measured input variables that preferably connote elapsed time as measured in the background throughout the course of time by the PVR application 377 (FIG. 3B). Relational input variables are also employed by the programmed controlling rules. The relational input variables express respective comparisons of the value of an input variable type that expresses elapsed time for a first display channel to the value for a second display channel. As a non-limiting example, a relational input variable may be the fraction obtained by division of the values of the two corresponding input variables being compared, or its reciprocal.
The PVR application 377, the operating system 353, the controller 379, the general settings application 312, and the device driver 311 (FIGS. 3A and 3B) execute respective programmed instructions in the processor 344 (FIG. 3A). The communication between these entities is preferably via interrupt and/or messaging capabilities of the DHCT 16, as well as by sharing of data output by one of these entities and written in memory 349 that serves as input to one or more of any of these entities by reading the data from memory 349. The data output by one or more of these entities can be input data to itself. Programmed controlling rules are a programmed component part of the PVR application 377. The controlling rules comprise ingredients employed in the preferred embodiments of the invention to make decisions, for example decisions as to priority of buffering and tuning resources. This rule-based system of the preferred embodiments includes variables that feed into the rules' antecedents, and each rule produces a consequence or a prediction. The set of consequences may in turn be input to another set of rules to provide an overall output, or the consequences may be input to other mechanisms (e.g., statistical classifiers, syntactical classifiers, or other inference engine mechanisms used with inference engines in expert systems) to make a decision. As will be described further below, variables can be assigned coefficients. Consequences produced by the rules can be assigned a weight as well. The weights can change throughout the course of time, for example in a dynamic system. If a probability or a confidence/certainty factor is assigned to a variable (e.g., a favorites channel), then the rule-based system can be generically categorized as a fuzzy-set system.
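A toy instance of such a controlling rule is sketched below: a relational input variable compares the elapsed buffering time of the two TSBs, and one possible programmed rule displaces the longer-buffered channel. This is one illustrative rule under assumed names, not the system's full rule set with weighted consequences and classifiers.

```python
# Hedged sketch of a single controlling rule over relational input variables.
# The relational variable is the ratio of the two elapsed buffering times
# (or its reciprocal), per the non-limiting example in the text; the rule
# choosing which TSB to displace is an assumed illustration.
def relational(elapsed_a, elapsed_b):
    """Relational input variable: ratio of elapsed buffering times."""
    return elapsed_a / elapsed_b if elapsed_b else float("inf")

def displacement_score(elapsed):
    """One possible programmed rule: displace the TSB whose channel has been
    buffered the longest relative to the other."""
    ratio = relational(elapsed["TSB1"], elapsed["TSB2"])
    return "TSB1" if ratio >= 1.0 else "TSB2"
```

In the full system described above, many such rules would each emit a weighted consequence, and the consequences would be combined (possibly by further rules or classifiers) before a buffer is actually repossessed.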
The scope of outcomes or actions conducted by the execution of the programmed controlling rules includes one or more of the following:
- A. Determination of available unemployed resources such as tuners, TV displays to display the newly requested display channel, storage device capacity (including the time shift buffers) to store the media content of the newly displayed channel, access capability to and/or from the storage device (e.g., via interfaces), compression engines, and decompression engines. Note that the access capacity may be an issue with high definition television streams. For example, in an implementation that includes a single hard disk drive, available bandwidth for transfers to and from the hard disk through the IDE interface, for example, can be limited, especially for high bit rates.
- B. Determination of whether to discontinue a display channel that is sourcing media content (and that is currently being buffered) in order to repossess a lacking resource required to effect the viewer's newly requested display channel's media content.
- C. Upon determination of a need for discontinuation of a display channel, selection of a currently buffered display channel (selected display channel) to discontinue.
- D. Upon the lack of an available tuner to receive another display channel, discontinuation of sourcing the selected display channel's media content via the tuner, and
- E. Commencement of sourcing of the viewer's newly requested display channel's media content via the repossessed tuner.
- F. Discontinuation of buffering the currently buffered display channel's media content in a TSB from a first tuner, and deletion (or displacement) of that media content,
- G. Continuation of buffering of a non-displaced display channel's media content associated with a second tuner in a TSB,
- H. Commencement of buffering into the repossessed TSB of the viewer's newly requested display channel's media content,
- I. Upon the lack of an available TV display to display another display channel, discontinuation of display on the TV display of the selected display channel's media content that is being buffered,
- J. Commencement of display on the TV of the viewer's newly requested display channel's media content on the repossessed TV display.
The order of the actions conducted can differ, and no particular order is implied by the list A-J described above. A tuner resource change comprises the discontinuation of receiving a display channel's media content, and receiving the media content of another display channel requested by the viewer. A display resource change comprises the discontinuation of displaying a received display channel's media content and displaying on the TV 341 the media content of a newly requested display channel. A buffer resource change comprises the discontinuation of a received display channel and deletion of the corresponding media content that is buffered, in addition to buffering the media content of another received display channel in the TSB.
With continued reference to FIG. 4, if all the resources are available to tune, buffer, and display the media content of the third display channel (i.e., the newly requested display channel), no discontinuation is required in the signal flow path through the DHCT. However, if there is a conflict in resources used, such that the current resources are fully employed for tuning, buffering, and/or displaying the first and second display channel media content, then the controlling rules of the preferred embodiments will provide for a discontinuation preferably in reverse order (e.g., from display to source) to the order used to establish a connection between resources to tune, display, and/or buffer the media content, and connection in the forward order (e.g., from source to display) for receiving, buffering, and displaying the media content for the newly requested display channel.
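The forward-connection and reverse-discontinuation order described above can be sketched as follows. The stage names are hypothetical labels for the resources named in the text, not identifiers from the specification.

```python
# Illustrative sketch: a signal flow path is connected source-to-display,
# and discontinued in the reverse order (display back to source), per the
# preferred embodiments described above. Stage names are hypothetical.

digital_display_path = ["tuner", "signal_processor", "media_engine",
                        "output_stage", "tv"]

def connect_order(path):
    # Forward order: from the source (tuner) toward the display.
    return list(path)

def discontinue_order(path):
    # Reverse order: from the display back toward the source.
    return list(reversed(path))

print(connect_order(digital_display_path))
print(discontinue_order(digital_display_path))
```

The same reversal applies to the step sequences of the timing diagrams that follow (e.g., steps 505, 504, 503 in FIG. 5's path).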
For the example implementations provided, it will be assumed that a newly requested display channel is to be both displayed on the television 341 and buffered, with the understanding that the preferred embodiments can be used to accomplish either one of these operations alone or in combination. A couple of considerations are worth noting in the context of proper resource management. For example, according to the rules and precedences configured, and with continued reference to FIG. 4, the second display channel may be discontinued from being displayed, but its buffering may continue. The first display channel that was not being displayed but only buffered may stop buffering, and its associated tuner may discontinue sourcing the media content of the first display channel in favor of sourcing the newly requested display channel. Further, the media content of the first display channel downloaded to the buffer can be deleted in favor of the media content of the newly requested display channel. Thus, although tuning, displaying, and/or buffering media content from one display channel can be completely discontinued in favor of a newly requested display channel, it is also possible that the signal flow for both display channels (the first and the second) will in some way be affected to provide resources for the newly requested display channel media content. As described above, the discontinuation in signal flow is preferably done in the reverse order (i.e., reverse to its forward connection), such that the display for the second display channel is discontinued first, and then the buffering and then the tuning are discontinued for the first display channel, as one example.
Another consideration in resource management includes the fact that discontinuation will vary in the signal flow path and the resources affected, in some implementations, depending on whether the received display channel is analog or digital. FIGS. 5-11 will now be used to further illustrate the signal flow paths through a DHCT, and in particular, how resources are employed to receive (tune), buffer, and/or display media content sourced from analog transmission signals and digital transmission signals. The timing diagrams of FIGS. 5-11 will also help to provide added understanding of the outcomes described above in A-J.
FIG. 5 is a timing diagram that illustrates one example implementation of resource connections for a digital signal flow path where there is a display of the media content with no buffering. Step 501 includes forwarding the digital transmission signal from a resourced tuner of the tuner system 345 to the signal processing system 314. At the signal processing system 314, the digital signal is preferably demodulated and demultiplexed, and forwarded to the media engine 322 (step 502). Step 503 includes decoding the signal in cooperation with media memory 305, and then the media content is readied and reconstructed (step 504) for output (via output stage 348 (FIG. 3A), as understood herein) to the TV 341 (step 505). Discontinuation of the signal flow path, according to the preferred embodiments, will preferably occur in the reverse order (e.g., steps 505, 504, 503, etc.), as is generally true for FIGS. 6-11 that follow.
FIG. 6 is a timing diagram of one example implementation for buffering and displaying media content in a digital signal path. Steps 601 through 605 are similar to steps 501 through 505 described in association with FIG. 5. Step 610 preferably occurs in parallel to step 602, and includes caching the demultiplexed signal in DRAM 352. Step 611 includes routing the signal from DRAM 352 to an interface 375, and step 612 includes routing the signal to a time shift buffer in the storage device 373.
FIG. 7 is a timing diagram of one example implementation that includes displaying a time-shifted (i.e., buffered) digital signal. Step 701 includes forwarding the digital signal to the signal processing system 314 for demodulation and demultiplexing. The signal is then cached to DRAM 352 (step 702), routed to the interface 375 (step 703), and then to the storage device 373 (step 704). From the storage device 373, the signal is routed back to the interface 375 (step 705), and then to the media engine 322 (step 706). The media engine 322 and the media memory 305 cooperate to decode the signal (step 707), then the signal is reconstructed and conditioned for output (step 708), and then the signal is output to the TV 341 (step 709).
FIG. 8 is a timing diagram of one example implementation that includes buffering the media content from one of the display channels to the storage device 373. As shown, the digital transmission signal received at a tuner of the tuner system 345 is forwarded to the signal processing system 314 (step 801), cached to DRAM 352 (step 802), routed to the interface 375 (step 803), and then to the storage device 373 (step 804).
FIGS. 9-11 will now be used to illustrate the signal flow path and resources employed for tuning, buffering, and/or displaying media content modulated in an analog transmission signal. FIG. 9 is a timing diagram illustrating an example signal flow path and resources employed for decoding and displaying an analog signal. Step 901 includes forwarding the analog signal received at a tuner of the tuner system 345 to the signal processing system 314, where it is demodulated and forwarded to the analog decoder 316 (step 902) for decoding. The resulting decoded, digitized signal is then forwarded to the media engine 322 (step 903), and then forwarded to the media memory 305 (step 904), where the signal is readied for output and then forwarded to the TV 341 (step 905).
FIG. 10 is a timing diagram of one example implementation where the analog signal is buffered and displayed in parallel processes. Step 1001 includes forwarding the analog signal from the tuner system 345 to the signal processing system 314, where it is demodulated and then forwarded to the analog decoder 316 (step 1002). From the analog decoder 316, processing occurs in two parallel paths. Step 1003 includes forwarding the decoded digitized signal to the media engine 322, and then to steps 1004 and 1005, which mirror steps 904 and 905 of FIG. 9. The other processing path from the analog decoder 316 includes forwarding the decoded digitized signal to the compression engine 317 (step 1010) for compression, which occurs in cooperation with compression engine memory 309 (step 1011), and then caching the compressed signal (step 1012). The compressed signal is then routed from DRAM 352 to the interface 375 (step 1013), and then routed for storage in the storage device 373 (step 1014).
FIG. 11 is a timing diagram of one example implementation where the media content displayed is time shifted. Step 1101 includes forwarding the analog signal received at the tuner system 345 to the signal processing system 314 for demodulation. The demodulated signal is then forwarded to the analog decoder 316 (step 1102), where it is decoded, and then forwarded to the compression engine 317 (step 1103) for compression in cooperation with compression engine memory 309 (step 1104). From the compression engine 317, the signal is then cached in DRAM 352 (step 1105), transferred to the interface 375 (step 1106), and then routed to the storage device 373 (step 1107). From the storage device 373, the signal is then routed to the interface 375 (step 1108) and then to the media engine 322 (step 1109), where it undergoes decoding (step 1110) in cooperation with media memory 305; then the signal is reconstructed and output processed (step 1111), and then the signal is output to the TV 341 (step 1112).
Thus, as shown, the management of resources for tuning, displaying, and/or buffering media content from a plurality of display channels includes, in addition to the considerations mentioned above, determining whether the resources are processing analog signals, digital signals, or a combination of both. FIG. 12A is a flow diagram of an example resource management process that can be implemented when the first and/or second display channel signal flow needs to be discontinued, in whole or in part, to provide for the tuning, buffering, and display of a newly requested display channel. Assume that the desired signal flow for the newly requested display channel is as shown in the configuration illustrated in FIG. 6 (e.g., display in parallel to buffering), or in other embodiments, the configuration shown in FIG. 7 (time-shifted display). Step 1201 provides that media content is provided in a first display channel for display to the display device and, in parallel, for buffering to a storage device, much like the configuration shown in FIG. 6. Step 1202 provides that the second display channel is now being displayed and buffered (which results in discontinuation of the display of the media content of the first display channel), and the first display channel is resourced within a configuration that appears much like the configuration shown in FIG. 8. Similar actions and configurations can apply for analog signals, as will be described below.
Step 1203 includes receiving a request for a new display channel. For purposes of discussion, it will be assumed that the new display channel is provided as a digital signal, with the understanding that the steps provided herein can be generally applied for an analog signal, or a combination of analog and digital signals. Step 1204 includes discontinuing the display of the media content of the second display channel on a display device (e.g., TV). For example, if the second display channel was currently providing media content to the display device, such as that shown in the timing diagram of FIG. 6, the following forward path steps would be reversed, preferably in the order given: steps 605, 604, 603, and then 602. At this stage, the first and the second display channel would have a forward signal path that closely resembles FIG. 8 for a digital signal (or steps 1001, 1002, and 1010 through 1014 of FIG. 10 for an analog signal).
Step 1205 includes determining a precedence for resourcing a newly requested display channel according to a set of controlling rules, as will be described below. If the rules mandate that the first display channel has precedence, then the buffering of the media content of the second display channel is discontinued (step 1206, FIG. 12B). According to the example configuration shown in FIG. 6, such an action would result in discontinuing the signal flow path in the following order: steps 612, 611, 610, 602, and 601 (or steps 1014-1010, 1002, and 1001 of FIG. 10 for an analog signal). Step 1207 (FIG. 12B) includes using the tuner resources, the buffer resources, and the display resources previously employed for the second display channel to resource the media content of the newly requested display channel.
Referring to FIG. 12C, with continued reference to FIG. 12A, if the rules mandate that the second display channel has precedence, then buffering for the second display channel (for example, as shown in steps 801 through 804 of FIG. 8 for a digital signal, or steps 1001, 1002, and 1010-1014 of FIG. 10 for an analog signal) will continue, and the first display channel buffering path will be discontinued (step 1208). Step 1209 includes using the tuner resources, the buffer resources, and the display resources previously employed for the first display channel to resource the media content of the newly requested display channel.
In one implementation, the repossession of "buffering resources" of the received display channel to be displaced uses a form of bookkeeping. In one embodiment, the point on the hard disk (e.g., 300 or 301, FIG. 3A) where the media content to be deleted started will preferably be recorded (for example, in the PVR data structure or in annotations in the storage device 373 (FIG. 3A)) such that the newly requested display channel's media content will start at the point where the deleted media content started. Another form of bookkeeping that can be used in other embodiments includes subtracting the deleted amount from the total buffer capacity after such deletion. As described briefly above, the determination of precedence preferably relies on a set of controlling rules, which in turn rely on a set of input variables and/or user-inputted preferences. One of the input variables to the controlling rules, including those rules that control discontinuation of buffering of a display channel's media content (and deletion of buffered media content), can be the length of contiguous time of buffering the display channel's media content. Alternatively, the total time of buffering a respective media content instance (i.e., not necessarily contiguous time) can be employed.
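The bookkeeping idea above can be sketched as a small routine. This is a minimal sketch under stated assumptions: the annotation structure, field names, and the `repossess_buffer` function are all hypothetical, standing in for the PVR data structure and storage-device annotations the specification refers to.

```python
# Illustrative sketch of buffer-repossession bookkeeping: record where the
# deleted media content started so the newly requested channel's content can
# begin at that same point, and track freed capacity. All names hypothetical.

def repossess_buffer(annotations: dict, displaced_channel: str,
                     new_channel: str) -> dict:
    # Record where the displaced channel's media content started on disk ...
    start_offset = annotations[displaced_channel]["start_offset"]
    deleted_bytes = annotations[displaced_channel]["length"]
    del annotations[displaced_channel]
    # ... so the new channel's media content starts at that same point.
    annotations[new_channel] = {"start_offset": start_offset, "length": 0}
    # Alternative bookkeeping: adjust remaining capacity after the deletion.
    annotations["_free_bytes"] = annotations.get("_free_bytes", 0) + deleted_bytes
    return annotations

ann = {"ch3": {"start_offset": 4096, "length": 1_000_000}, "_free_bytes": 0}
ann = repossess_buffer(ann, "ch3", "ch4")
print(ann["ch4"]["start_offset"])  # 4096: new content begins where deleted content began
```

Either form of bookkeeping (start-point annotation or capacity subtraction) could be used alone, as the text describes.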
Another input variable can be the length of time that a buffered display channel's media content is displayed on the TV. Whether a buffered display channel's media content is currently displayed or was last displayed on the TV may cause this input variable to be weighted more significantly in the controlling rules. Hence, the multiplicative coefficient of an input variable may be dynamically adjusted according to a user's viewing patterns. The relationship of the elapsed buffering time of the media content of a first display channel currently being buffered to that of the media content of a second currently buffered display channel can also be another input variable.
The relationship of the elapsed buffering time of a display channel's media content currently being buffered to its total time displayed on the TV throughout the course of buffering can also be an input variable. Another input variable to the controlling rules can be whether a display channel currently being buffered is included in a favorites channel list, such as a list of preferred display channels selected or entered by the viewer during the course of a configuration session. A favorites channel list, as well as all preferences and configurations aforementioned herein, is stored in memory (e.g., DRAM 352) and in non-volatile memory, such as FLASH memory 351 (FIG. 3A), for recovery in the event of a power outage. The storage device 373 (FIG. 3A) can also serve for storing preferences and configurations and for their recovery upon power-up. Whether the favorites channel list influences the controlling rules is preferably a configuration and/or preference entered by the viewer during a configuration session.
In one embodiment, when the favorites channel list is employed as input to the controlling rules, a different set of controlling rules is executed. Alternate values for the thresholds that control relational input variables can be employed when the favorites channel list is part of the input to the controlling rules. Hence, if a display channel being buffered is in the favorites channel list, it will influence the controlling rules toward continuing its buffering (or, equivalently, make it less likely to be displaced and discontinued by the sourcing of the tuner and respective TSB for media content of a newly requested display channel).
An input variable may be designed to exhibit a non-linear range when controlled by a respective threshold or set of thresholds. A first input threshold is preferably assigned to control the value of a first variable. A first type of input variable may be assigned a value of "no significance" (e.g., zero) if below its respective first controlling threshold value. Furthermore, the value of a second type of input variable may be controlled with a first respective controlling threshold and a second respective controlling threshold. If the value of the second type of input variable is above or equal to its first respective controlling threshold but below its second respective controlling threshold value, it retains its original value. Furthermore, the value of a third type of input variable may be controlled with a first respective controlling threshold and a second respective controlling threshold, but may be assigned a maximum value when its value is above the second respective controlling threshold value. As would be understood and appreciated by those having ordinary skill in the art, the comparison of an input variable's value to a threshold value may be based on programmable operations, including "less than," "less than or equal to," "greater than," "greater than or equal to," "equal to," and/or "not equal to."
A fourth type of input variable may be controlled by assigning a respective “no significance” value indirectly or directly per a viewer's input. A fifth type of input variable may be controlled by assigning a respective “no significance” value indirectly or directly per a viewer's input to configure a desired buffering, tuning and/or display behavior during a configuration session. A sixth type of input variable may be controlled by assigning a respective “maximum value” indirectly or directly per a viewer's input. A seventh type of input variable may be controlled by assigning a respective “maximum value” indirectly or directly per a viewer's input to configure a desired buffering, tuning and/or display behavior during a configuration session. In this way, the value of some and possibly all input variables can be modified to a non-linear range of values according to their actual value. The actual value of an input variable can be a measured value, a user input value, or a default value assigned by the programmed application. Each input variable may be further weighted multiplicatively with a respective coefficient that pertains to the importance of the respective input variable with respect to the complete set of input variables employed by the controlling rules. For example, a first controlling rule has a first multiplicative coefficient for a first input variable and a second controlling rule has a second multiplicative coefficient for the same first input variable.
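The threshold-controlled non-linear mapping described above can be sketched as a single function, under the assumption that the first three variable types compose into one piecewise mapping: below a first threshold the value becomes "no significance" (zero), between the thresholds it is retained, and at or above a second threshold it is clamped to a maximum. The function name, threshold values, and the buffering-duration example are illustrative assumptions.

```python
# Illustrative sketch of mapping an input variable to a non-linear range
# via two controlling thresholds, per the text above. Values hypothetical.

def map_input_variable(value: float, t1: float, t2: float,
                       max_value: float) -> float:
    if value < t1:          # below the first controlling threshold
        return 0.0          # assigned "no significance"
    if value < t2:          # at/above the first, below the second threshold
        return value        # retains its original value
    return max_value        # at/above the second threshold: maximum value

# Example: buffering duration in minutes, thresholds at 5 and 240 minutes.
print(map_input_variable(3.0, 5.0, 240.0, 240.0))    # 0.0 (insignificant)
print(map_input_variable(90.0, 5.0, 240.0, 240.0))   # 90.0 (retained)
print(map_input_variable(300.0, 5.0, 240.0, 240.0))  # 240.0 (clamped)
```

A per-rule multiplicative coefficient would then be applied to the mapped value, as the text describes.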
Throughout the course of time, the PVR application 377 (FIG. 3B) employs OS services, such as time posting and clock access, to update the values of each respective input variable in the set of input variables according to the "time posting granularity" of the OS 353 and according to the granularity of the timer 371 and clocks 372 (one is shown) of the DHCT 16. Values of input variables are stored in memory 349 (FIG. 3A) with annotations pertaining to a file in the storage device 373 (FIG. 3A).
Upon a viewer's input entered via an input device, such as a remote control device 380 (FIG. 3D), the PVR application 377 executes on the processor 344 (FIG. 3A) to enter a machine state wherein the PVR application 377 (FIG. 3B) assesses and determines available resources, such as (1) tuners for sourcing media content of a display channel selected by the viewer, (2) a TV display for displaying a display channel's media content, (3) compression engines for compressing analog signals, (4) decompression engines for decoding digital signals, (5) communication interfaces for access to and from the storage device, and (6) TSBs for buffering the respective display channel's media content. Determination of resources is part of the controlling rules in one embodiment. When the PVR application 377 determines the lack of a required resource, controlling rules are executed to effect the desired change according to the viewer's input requesting a new display channel, a previously entered viewer configuration and prioritization, and/or a set of input variables.
Prior to the execution of the controlling rules, pre-processing of the applicable input variables is preferably conducted. The pre-processing module 385 (FIG. 3B) includes processing of the controlling rules and is preferably included in the PVR application 377 (FIG. 3B), although similar functionality can be found in other applications or as a separate module. In the pre-processing module 385, the respective values of the input variables are read from memory 349. For an applicable input variable, its respective value is compared to one or more thresholds assigned to that particular input variable, and a modified value for the input variable is obtained. Hence, the pre-processing module 385 maps the actual value of each input variable required to be modified to a desired range according to the predetermined set of thresholds assigned to control the respective input variable. Thereafter, each pre-processed input variable is multiplied by its respective weight coefficient, and the set of controlling rules is executed to obtain one or more outcomes that effect one or more resource changes.
The following description will include an example implementation, based on FIG. 4, using some input variables to illustrate a rule-based system that determines the priority of resources to tune, buffer, and display a third display channel when two other display channels are consuming the resources needed for the third display channel. One way to prioritize which buffer space of the storage device 373 (FIG. 3A) to clear of media content is to prioritize the cumulative media content in each buffer space by evaluating input variables such as download duration. For example, if the user was viewing the first display channel for 5 hours, and then decided to "channel surf" (i.e., change display channels in brief succession) over 1 through "N" display channels, then it is probable that the content downloaded from the first display channel is of higher significance, or of a higher priority, than the media content downloaded from the surfed display channels. Thus, with each display channel change, a tuner and buffer conflict arises whereby media content downloaded from prior display channels into TSB1 376 and/or TSB2 378 is at risk of being deleted to accommodate the download of the newly requested display channel. It will be assumed that the most recently tuned display channel will be displayed. The conflict of tuning and buffering can be addressed, in one implementation, by establishing a system that prioritizes the download duration to each buffer space according to controlling rules.
In the preferred embodiments, the buffering duration of the media content of the first display channel up to the point of the display channel change (to the second display channel) is compared to the buffering duration of the media content of the second display channel. For example, the user can be viewing a series of media content instances over a span of hours on the first display channel. The user then decides to look at the score of a football game on the second display channel. The point in the presentation of the media content instance of the first display channel when the user switched to the second display channel is "marked," and this "mark," the buffering duration, and the displayed duration for the first display channel are stored in a PVR data structure with its pertinent annotations (not shown). The user is now viewing the football game on the second display channel, which is preferably received at tuner2 358 and buffered to TSB2 378. The media content of the first display channel continues to be downloaded to TSB1 376. Assume the user waits a few minutes before the score comes up on the screen display for the football game, and then decides to go to a third display channel.
Assuming the two-tuner DHCT illustrated in FIG. 4, the user selects the third display channel on his or her remote control device 380 (FIG. 3D). The PVR application 377 (FIG. 3B) executes the set of controlling rules and, according to the outcome of the rules, makes a determination as to which tuner (tuner1 354 or tuner2 358) to resource, as well as the respective applicable behavior to effect. The PVR application 377 makes this determination by executing the programmed controlling rules that effectively compare the buffering duration (stored in the PVR data structure) of the first display channel with the buffering duration of the second display channel (the football game), through the pre-processing (via the pre-processing module 385 (FIG. 3B)) of applicable input variables and the execution of controlling rules. Since the difference in buffering durations is measurable in hours, the second display channel is discontinued; the second tuner 358, in one implementation, is then resourced for the third display channel, and the associated buffered media content (i.e., the game in TSB2 378) is deleted in order to receive and store the media content of the third display channel.
If, shortly after switching to the third display channel, the user decides to switch to a fourth display channel, the PVR application 377 (FIG. 3B) performs a similar evaluation of buffering durations. The buffering duration of the first display channel (stored in the PVR data structure) is compared to the buffering duration of the third display channel. Again, in one implementation, due to the large difference in buffering durations (or because the buffering duration of the third display channel did not meet sufficient elapsed time according to a first threshold), the third display channel is discontinued, the second tuner 358 is again resourced for the fourth display channel, and the TSB2 media content of the third display channel is deleted to receive the media content of the fourth display channel, while the media content of the first display channel continues to be buffered and retained in TSB1 376.
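The duration-comparison behavior of the channel-surfing example above can be sketched as follows. The function name and the minute figures are illustrative; the actual PVR data structure stores durations alongside other annotations, per the text.

```python
# Illustrative sketch: on a request for a new display channel, repossess the
# tuner whose channel has the shorter buffering duration, so the long-buffered
# channel's media content is retained. Names and values are hypothetical.

def tuner_to_repossess(buffering_minutes: dict) -> str:
    # buffering_minutes maps a tuner name to the minutes of media content
    # buffered from the display channel that tuner is currently sourcing.
    return min(buffering_minutes, key=buffering_minutes.get)

# First display channel buffered ~5 hours on tuner1; the football game was
# buffered only a few minutes on tuner2 before the third channel was requested.
print(tuner_to_repossess({"tuner1": 300, "tuner2": 6}))  # tuner2
```

The same comparison then repeats for the fourth channel request: the briefly buffered third channel loses the tuner again, while the first channel's hours of content stay in its TSB.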
FIGS. 13-17 are a series of flow charts illustrating steps for prioritizing between tuner and buffer resources based on a plurality of buffering durations in order to address the example scenario of FIG. 4, in accordance with several embodiments of the invention. The blocks in the flow charts should be understood as representing modules, segments, or portions of code that include one or more executable instructions for implementing specific logical functions or steps in the process. Alternate implementations are included within the scope of the preferred embodiment of the present invention in which functions may be executed out of the order shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those reasonably skilled in the art of the present invention.
With continued reference to FIG. 4, FIG. 13 is a flow chart that establishes an arbitrary starting point for prioritizing the buffering durations to the first buffer (TSB1 376) and the second buffer (TSB2 378). Step 1302 includes receiving the media content of the first display channel (via tuner1 354) into the first buffer (TSB1 376), and displaying the media content on the TV 341. Thus, as a non-limiting example, a two-tuner system is assumed, and tuner1 354 is employed to receive media content of the first display channel. Upon a viewer's input via an input device (e.g., a remote control device 380, FIG. 3D) to display a second display channel's media content, step 1304 includes receiving the media content from a second display channel (via tuner2 358) into the second buffer (TSB2 378), and displaying the media content on the TV 341. The second tuner (tuner2 358) is assumed to be an available resource, and thus receives the media content of the second display channel. The newly requested display channel will preferably have display resource precedence over the currently displayed display channel.
Step 1306 includes receiving user input requesting a third display channel. Unless the DHCT 16 contains additional tuners, a conflict arises, since one of the tuners (354 or 358) has to be resourced to receive the media content of the third display channel for display on the TV 341 (FIG. 4). This example scenario assumes that no other buffer resource conflicts are present, for instance from a scheduled permanent recording or from preference filter mechanisms, among others. Otherwise, additional constraints or behavior would be enacted based on the outcome of the execution of the programmed controlling rules. Some conflicts, when they arise, can give rise to conflict barkers that are presented on a display screen to give the user the opportunity to resolve the conflict. Other conflicts can be automatically resolved by execution of the programmed controlling rules. In some embodiments, a system settings menu (not shown) can be utilized at start-up, or at times convenient to the user, that enables the user to select and/or configure priorities between these and other conflicts that may arise. Assuming that the DHCT 16 has only two tuners, step 1308 includes a decision to be made by the PVR application 377 (FIG. 3B). Namely, the PVR application 377, upon receiving a request for a display channel change to a third display channel, decides whether the media content of the first two display channels was buffered for a minimum threshold buffering duration prior to the respective display channel changes. The determination of which display channel did not meet the minimum buffering threshold will be left for step 1410 of FIG. 14, as described below.
If a minimum buffering duration threshold was not met for the buffering of the media content to its respective buffer, then the prioritizing continues to point A in FIG. 14; otherwise, it continues to point B in FIG. 15. Continuing the example at point A in FIG. 14, the PVR application 377 decides whether the media content buffered to either buffer was buffered over at least a minimum threshold duration (step 1410). If a buffering duration threshold was met for the buffering of the media content to one buffer, then step 1412 includes resourcing, for the third display channel media content, the tuner that is sourcing the display channel for which the buffered media content does not satisfy the minimum duration according to a first threshold. In turn, the media content buffered for less than the minimum threshold is displaced with the media content buffered from the third display channel, while the media content buffered for at least the minimum threshold duration is retained, allowing for a continued tuning and buffering operation.
If the buffering of the media content to neither buffer met a minimum threshold duration, then either tuner 1 354 or tuner 2 358 (FIG. 3A) is resourced for the third display channel, and the media content from either buffer is displaced while retaining the media content in the non-displaced buffer. At this point, other input variables (as described above) can be employed to decide priority. In one embodiment, the media content consuming the least buffered space is displaced. In another implementation, the most recently displayed channel's media content is retained. In yet another embodiment, the last displayed channel (e.g., the first display channel media content) is retained when the viewer has pressed the key for the current display channel media content being displayed (e.g., the second display channel media content) prior to changing to a third display channel. Another embodiment includes retaining media content sourced from a higher priority display channel (e.g., a favorites channel). Alternatively, the user can be presented at this point with a user interface screen (decision barker screen), as explained below, that provides the user with a choice of buffers to clear (and a choice of tuners to resource) to make room for the third display channel media content.
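The minimum-threshold decision of FIG. 14 might be sketched as follows. This is a non-authoritative illustration: the threshold value and the choice of "least buffered space" as the tie-break are assumptions (the patent describes several alternative tie-break embodiments and leaves the threshold unspecified):

```python
MIN_THRESHOLD_MINUTES = 5.0  # assumed value; not disclosed in the patent


def choose_buffer_to_displace(first_minutes, second_minutes,
                              min_threshold=MIN_THRESHOLD_MINUTES):
    """Return which buffer's media content to displace for the third
    display channel: 'first', 'second', or 'both_met' when both buffers
    satisfied the threshold and prioritizing continues at point B (FIG. 15).

    Mirrors steps 1410/1412: if exactly one buffer fails the minimum
    threshold, its tuner is resourced and its content displaced."""
    first_ok = first_minutes >= min_threshold
    second_ok = second_minutes >= min_threshold
    if first_ok and not second_ok:
        return "second"   # second failed the threshold: displace it
    if second_ok and not first_ok:
        return "first"    # first failed the threshold: displace it
    if not first_ok and not second_ok:
        # Neither met the threshold: apply a tie-break; here, displace
        # the content consuming the least buffered space (one embodiment).
        return "first" if first_minutes <= second_minutes else "second"
    return "both_met"     # both met the threshold: continue to FIG. 15


print(choose_buffer_to_displace(30.0, 2.0))  # only second under threshold
print(choose_buffer_to_displace(1.0, 2.0))   # neither met: least buffered
```

A production implementation would instead consult the configured priority rules (or raise a decision barker) rather than hard-coding a single tie-break.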
FIG. 15 is a flow chart that illustrates example prioritizing steps starting from point B, which follows the threshold determination of FIG. 13. Thus, if the buffering duration to both buffers met a minimum threshold, step 1516 determines whether the media content of the first display channel and the second display channel was buffered into the respective buffers for an equal duration. If so, step 1518 shows that a decision barker is provided to enable the user to decide which media content to displace (and consequently, which tuner to resource and which media content to retain). The decision barker is described in further detail below. Alternatively, other input variables can be evaluated to provide a decision, for example retaining the media content of a higher priority display channel (e.g., a favorites channel).
However, if the media content to both buffers was not buffered for an equal buffering duration, then the next step (step 1520) is to determine whether the media content of the first display channel was buffered for a longer duration than the media content of the second display channel (i.e., relational input variables). It will be understood by those of ordinary skill in the art that the reference comparison can be reversed. For example, step 1520 can just as easily be described as determining whether the second display channel was buffered for a longer duration, or whether the media content of the second display channel was buffered for a shorter duration, etc. If step 1520 results in an affirmative determination, the prioritizing steps continue to point C (FIG. 16); otherwise, the steps continue at point D (FIG. 17).
FIG. 16 is a flow chart that begins from point C, wherein step 1622 compares the buffering durations of the first and the second display channels and determines whether the media content of the second display channel was buffered for a duration at least equal to a defined percentage of the buffering duration of the media content of the first display channel. For instance, if the media content of the first display channel was buffered to the first buffer (TSB 1 376, FIG. 4) for a duration of 30 minutes, a threshold percentage buffering duration, such as 50%, could be established, in one embodiment, at a user settings menu (not shown) at start-up, or configured at any other point when using the PVR mechanisms in other embodiments. With a threshold percentage of, for example, 50%, if the media content of the second display channel was buffered into the second buffer (TSB 2 378, FIG. 4) for 10 minutes, then 10 minutes is less than 50% of 30 minutes, and the determination of step 1622 leads to step 1624, wherein tuner 2 358 (FIG. 4) is resourced for the third display channel and the media content of the second display channel is displaced by the media content buffered from the third display channel.
However, continuing the example, if the media content of the second display channel was buffered for 25 minutes, this buffering duration exceeds the 50% threshold (i.e., 15 minutes), and thus a decision barker can be provided (step 1626) to enable the user to determine priority in this case. In one embodiment, the viewer configures, by entering input during a configuration session, when the decision barker is allowed to be displayed. Furthermore, the viewer can configure the percentage difference from the set threshold at which the decision barker is displayed. Alternatively, a decision barker is presented to the viewer only in cases deemed a "close-case," such as when the buffering duration is near the percentage threshold. Alternatively, default threshold percentages can be provided in the software, the user can configure these threshold percentages in a user interface screen at start-up or at other times, or additional input variables can be evaluated.
FIG. 17 continues the prioritizing steps from point D, which results from decision step 1520 in FIG. 15. In other words, the media content of the first display channel was buffered for a shorter duration than the media content of the second display channel. Step 1728 mirrors decision step 1622 (FIG. 16), and determines whether the media content of the first display channel was buffered for a duration at least equal to a defined percentage of the buffering duration of the media content of the second display channel. If so, the decision barker is displayed for user interaction (step 1730); otherwise, the first tuner (tuner 1 354, FIG. 4) is resourced and the media content of the first display channel is displaced (step 1732) to make room for the media content of the third display channel (while retaining the media content of the other buffer).
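Taken together, the steps of FIGS. 15 through 17 amount to a symmetric percentage-threshold comparison. A minimal sketch follows, using the 50% figure from the worked example above; the function name, return values, and the default threshold are illustrative assumptions:

```python
def prioritize_equal_or_unequal(first_minutes, second_minutes,
                                threshold_pct=0.50):
    """Sketch of points B, C, and D: return which buffer's media content
    to displace for the third display channel ('first' or 'second'), or
    'decision_barker' when the user is asked to decide.

    - Equal durations (step 1516): show the decision barker (step 1518).
    - Shorter-buffered content at least threshold_pct of the longer
      (steps 1622 / 1728): a close case, show the decision barker.
    - Otherwise, displace the shorter-buffered channel's content."""
    if first_minutes == second_minutes:
        return "decision_barker"                    # step 1518
    longer, shorter, displace = (
        (first_minutes, second_minutes, "second")
        if first_minutes > second_minutes
        else (second_minutes, first_minutes, "first"))
    if shorter >= threshold_pct * longer:
        return "decision_barker"                    # steps 1626 / 1730
    return displace                                 # steps 1624 / 1732


# The worked example: 30 minutes vs. 10 minutes (10 < 50% of 30 = 15),
# so the second buffer is displaced; at 25 minutes (25 >= 15), the
# close case yields a decision barker instead.
print(prioritize_equal_or_unequal(30.0, 10.0))
print(prioritize_equal_or_unequal(30.0, 25.0))
```

Generalizing the comparison to "shorter versus longer" captures both FIG. 16 (first buffered longer) and FIG. 17 (first buffered shorter) in one routine, consistent with the observation at step 1520 that the reference comparison can be reversed.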
FIG. 18 is a screen diagram of an example decision barker screen provided by the PVR application 377 (FIG. 3B). The decision barker screen 1800 can be overlaid on a display of a media content instance, or preferably, is a separate screen that contains a scaled-down media content display 1834. The decision barker screen 1800 further includes an instruction section 1836 that informs the user of the buffer space conflict, as well as directions as to how to resolve the conflict. A list portion 1838 includes a select symbol 1893 and scroll arrow icons 1894 and 1895 that suggest corresponding functionality to the select button 393 and up and down arrows 394 and 395 of the remote control device 380 (FIG. 3D). The list portion 1838 preferably also includes a list of titles of the media content instances buffered for each display channel, the times the instances were buffered, and the duration of time each media content instance was buffered. The user scrolls to the display channel that the user wants to remove to make room for the third display channel. If the user decides that he or she wants to permanently record buffered content, he or she selects the record options button "B".
The PVR application 377 (FIG. 3B) of the present invention can be implemented in hardware, software, firmware, or a combination thereof. In the preferred embodiment(s), the PVR application 377 is implemented in software or firmware that is stored in a memory and that is executed by a suitable instruction execution system. If implemented in hardware, as in an alternative embodiment, the PVR application 377 may be implemented with any or a combination of the following technologies, which are all well known in the art: a discrete logic circuit(s) having logic gates for implementing logic functions upon data signals, an application specific integrated circuit (ASIC) having appropriate combinational logic gates, a programmable gate array(s) (PGA), a field programmable gate array (FPGA), etc.
The PVR application 377 (FIG. 3B), which comprises an ordered listing of executable instructions for implementing logical functions, can be embodied in any computer-readable medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions. In the context of this document, a "computer-readable medium" can be any means that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. The computer-readable medium can be, for example but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, device, or propagation medium. More specific examples (a nonexhaustive list) of the computer-readable medium would include the following: an electrical connection (electronic) having one or more wires, a portable computer diskette (magnetic), a random access memory (RAM) (electronic), a read-only memory (ROM) (electronic), an erasable programmable read-only memory (EPROM or Flash memory) (electronic), an optical fiber (optical), and a portable compact disc read-only memory (CDROM) (optical). Note that the computer-readable medium could even be paper or another suitable medium upon which the program is printed, as the program can be electronically captured, via for instance optical scanning of the paper or other medium, then compiled, interpreted or otherwise processed in a suitable manner if necessary, and then stored in a computer memory.
It should be emphasized that the above-described embodiments of the present invention, particularly any "preferred embodiments," are merely possible examples of implementations, set forth merely for a clear understanding of the principles of the invention. Many variations and modifications may be made to the above-described embodiments of the invention without departing substantially from the spirit of the principles of the invention. All such modifications and variations are intended to be included herein within the scope of the disclosure and the present invention and protected by the following claims.