CN117812341A - Display equipment and media asset playing method - Google Patents

Display equipment and media asset playing method
Download PDF

Info

Publication number
CN117812341A
Authority
CN
China
Prior art keywords
data
media
player
metadata
display device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310575869.8A
Other languages
Chinese (zh)
Inventor
罗贤之
陆华色
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hisense Electronic Technology Shenzhen Co ltd
Original Assignee
Hisense Electronic Technology Shenzhen Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hisense Electronic Technology Shenzhen Co Ltd
Priority to CN202310575869.8A
Priority to PCT/CN2023/140315 (published as WO2024239631A1)
Publication of CN117812341A
Legal status: Pending

Links

Classifications

Landscapes

Abstract

Some embodiments of the present application provide a display device and a method for playing media assets. The method may obtain a data transmission stream of media asset data in response to a play command for the media asset data, where the data transmission stream includes MPU metadata, fragment metadata, and MFU data. The transmission order of the MPU metadata, fragment metadata, and MFU data in the data transmission stream is detected. If the transmission order is the target order, the data transmission stream is injected into the player so that the player decodes and plays it in real time. If the transmission order is not the target order, the data transmission stream is encapsulated into media transport packets according to the target order, and the media transport packets are injected into the player. By injecting the data transmission stream directly into the player when its transmission order is already the target order, the method accelerates rendering on the display device and improves the playing efficiency of the media asset data.

Description

Display equipment and media asset playing method
Technical Field
The application relates to the technical field of display equipment, in particular to display equipment and a media asset playing method.
Background
A display device is a terminal device capable of outputting a specific display screen, such as a smart television, a communication terminal, a smart advertising screen, or a projector. Taking a smart television as an example: a smart television is based on Internet application technology, has an open operating system and chip as well as an open application platform, can implement bidirectional human-machine interaction, and is a television product integrating multiple functions such as video, entertainment, and data, used to meet users' diversified and personalized needs.
The display device may play different types of media asset data based on a protocol stack. For example, media asset data for television programs may be transmitted via MMTP (MPEG Media Transport Protocol) and ROUTE (Real-time Object delivery over Unidirectional Transport), both defined in the Advanced Television Systems Committee (ATSC) 3.0 protocol stack. When playing media asset data through MMTP, the media asset data needs to be packaged into MMT data packets for the display device to decode and play.
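The relationship among the three unit types carried in such a stream can be pictured with a short sketch. This is only an illustration: the type names, field layout, and payload contents below are assumptions chosen for readability, not the actual MMT wire format.

```python
from dataclasses import dataclass
from enum import Enum

class UnitType(Enum):
    """Unit types carried in an MPU-mode MMTP stream (illustrative names)."""
    MPU_METADATA = 1       # describes the media processing unit as a whole
    FRAGMENT_METADATA = 2  # describes one media fragment within the MPU
    MFU = 3                # media fragment unit: the coded media samples

@dataclass
class TransportUnit:
    """One unit of the data transmission stream."""
    unit_type: UnitType
    payload: bytes

# A data transmission stream is then an ordered sequence of such units;
# in the target order, metadata precedes the samples it describes:
stream = [
    TransportUnit(UnitType.MPU_METADATA, b"..."),
    TransportUnit(UnitType.FRAGMENT_METADATA, b"..."),
    TransportUnit(UnitType.MFU, b"..."),
]
```

The point of the model is simply that the MFU payloads cannot be decoded until both kinds of metadata have arrived, which is why the order of units in the stream matters.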
However, when the display device first accesses a channel or switches channels, new signaling data and MMT data packets need to be acquired. The process of sending and receiving MMT data packets is time-consuming, which delays rendering of the media asset data on the display device and results in low playing efficiency.
Disclosure of Invention
The application provides a display device and a media asset playing method, which are used for solving the problem of low media asset data playing efficiency in the display device.
In a first aspect, some embodiments of the present application provide a display device including a display and a controller. Wherein the display is configured to display a play screen of the media asset data; the controller is configured to:
Responding to a playing instruction of the media data, and acquiring a data transmission stream of the media data; the data transmission stream comprises MPU metadata, fragment metadata and MFU data;
detecting the transmission sequence of MPU metadata, fragment metadata and MFU data in the data transmission stream;
if the transmission sequence is the target sequence, the data transmission stream is injected into a player so as to execute decoding playing on the data transmission stream in real time through the player;
and if the transmission sequence is not the target sequence, packaging the data transmission stream into media transmission packets according to the target sequence, and injecting the media transmission packets into the player.
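The branch described in the steps above can be sketched as follows. This is a minimal illustration under assumed representations: each unit is a `(type, payload)` tuple, and `repackage` merely stands in for the MMT media-transport-packet encapsulation, approximated here by a stable sort into the target order.

```python
# Target order per the embodiments: MPU metadata, fragment metadata, MFU data.
TARGET_ORDER = ("mpu_metadata", "fragment_metadata", "mfu")

def is_target_order(stream):
    """Check that unit types first appear in the order
    MPU metadata -> fragment metadata -> MFU data."""
    seen = []
    for unit_type, _payload in stream:
        if unit_type not in seen:
            seen.append(unit_type)
    return tuple(seen) == TARGET_ORDER

def repackage(stream):
    """Re-encapsulate out-of-order units into the target order (stand-in:
    a stable sort by target-order rank; real code would rebuild the
    media transport packets)."""
    return sorted(stream, key=lambda u: TARGET_ORDER.index(u[0]))

def handle_stream(stream, inject):
    """Inject directly when already in target order, else repackage first."""
    if is_target_order(stream):
        inject(stream)             # fast path: real-time decode and play
    else:
        inject(repackage(stream))  # slow path: repackage, then inject
```

The fast path is the optimization the embodiments claim: when the stream already arrives in the target order, no repackaging step sits between reception and the player.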
In some embodiments of the present application, the controller is configured to: monitoring a control event of an upper layer application; detecting a target channel accessed by the display equipment in response to a channel switching or channel accessing control event; inquiring media data corresponding to the target channel; and generating the playing instruction according to the media asset data.
In some embodiments of the present application, the controller is further configured to: establishing a media asset transmission channel according to a media transmission protocol of media asset data; acquiring a media resource presentation information signaling through the media resource transmission channel, wherein the media resource presentation information signaling comprises a presentation information table and a component description table; and inquiring the media resource data according to the presentation information table and the component description table.
In some embodiments of the present application, in performing the acquisition of the data transmission stream of the media asset data, the controller is configured to: calling a channel interface of the protocol stack middleware; controlling, through the channel interface, the target channel accessed by the display device; and receiving the media asset presentation information signaling and media asset data of the target channel.
In some embodiments of the present application, the controller is further configured to: starting a media server of the protocol stack middleware; establishing a connection relation between the protocol stack middleware and the player through the media server; invoking the player based on the connection relationship, and injecting the data transport stream or the media transport packet into the player.
In some embodiments of the present application, in performing decoding playback on the data transport stream, the controller is configured to: initializing a decoder of the player; performing decoding on the data transport stream through the decoder; and calling bottom-layer resources to render a playing picture of the media asset data on the display.
In some embodiments of the present application, the target order is, in sequence: MPU metadata, fragment metadata, and MFU data.
In a second aspect, some embodiments of the present application further provide a display apparatus including a display and a controller. Wherein the display is configured to display a play screen of the media asset data; the controller is configured to:
receiving a data transmission stream of media data in response to a playing instruction of the media data, wherein the data transmission stream comprises MPU metadata, fragment metadata and MFU data;
caching the data transmission stream to a player cache area;
decapsulating the data in the player cache region and injecting the decapsulated data into the player when both the MPU metadata and the fragment metadata have been cached in the player cache region.
In some embodiments of the present application, the player buffer includes a metadata buffer and an MFU data buffer, and the controller is configured to buffer the data transport stream to the player buffer, where the controller is configured to: detecting a data type of data in the data transmission stream; if the data type is metadata, caching the metadata into the metadata cache area, wherein the metadata comprises the MPU metadata and the fragment metadata; and if the data type is MFU data, caching the MFU data into the MFU data cache area.
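The two cache regions and the injection trigger of this second aspect can be sketched as follows. The class and region names are illustrative, and a simple callback stands in for the real player.

```python
class PlayerBuffer:
    """Sketch of the two-region player buffer: metadata (MPU metadata and
    fragment metadata) and MFU data are cached separately, and
    decapsulation/injection fires only once both metadata kinds have
    arrived. All names are illustrative."""

    def __init__(self, inject):
        self.metadata = []    # metadata cache region
        self.mfu = []         # MFU data cache region
        self.inject = inject  # callback standing in for the player

    def cache(self, unit_type, payload):
        """Route an incoming unit to its cache region by data type."""
        if unit_type in ("mpu_metadata", "fragment_metadata"):
            self.metadata.append((unit_type, payload))
        elif unit_type == "mfu":
            self.mfu.append(payload)
        if self._ready():
            self._flush()

    def _ready(self):
        """Both metadata kinds must be cached before injection."""
        kinds = {t for t, _ in self.metadata}
        return {"mpu_metadata", "fragment_metadata"} <= kinds

    def _flush(self):
        """'Decapsulate' cached data and inject it into the player."""
        for unit in self.metadata + [("mfu", p) for p in self.mfu]:
            self.inject(unit)
        self.metadata.clear()
        self.mfu.clear()
```

Note the design choice this models: MFU data that arrives early is held back rather than dropped, so an out-of-order stream still reaches the player complete and in a decodable order.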
In a third aspect, some embodiments of the present application further provide a media asset playing method, including:
responding to a playing instruction of the media data, and acquiring a data transmission stream of the media data; the data transmission stream comprises MPU metadata, fragment metadata and MFU data;
detecting the transmission sequence of MPU metadata, fragment metadata and MFU data in the data transmission stream;
if the transmission sequence is the target sequence, the data transmission stream is injected into a player so as to execute decoding playing on the data transmission stream in real time through the player;
and if the transmission sequence is not the target sequence, packaging the data transmission stream into media transmission packets according to the target sequence, and injecting the media transmission packets into the player.
According to the above technical solutions, the display device and media asset playing method provided in some embodiments of the present application can acquire a data transmission stream of media asset data in response to a play instruction for the media asset data, where the data transmission stream includes MPU metadata, fragment metadata, and MFU data. The transmission order of the MPU metadata, fragment metadata, and MFU data in the data transmission stream is detected. If the transmission order is the target order, the data transmission stream is injected into the player so that the player decodes and plays it in real time. If the transmission order is not the target order, the data transmission stream is encapsulated into media transport packets according to the target order, and the media transport packets are injected into the player. By injecting the data transmission stream directly into the player when its transmission order is already the target order, the method accelerates rendering on the display device and improves the playing efficiency of the media asset data.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings that are required in the embodiments or the description of the prior art will be briefly described below, it being obvious that the drawings in the following description are only some embodiments of the present application, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a schematic diagram of an operation scenario between a display device and a control device according to some embodiments of the present application;
fig. 2 is a schematic hardware configuration diagram of a display device according to some embodiments of the present application;
fig. 3 is a schematic hardware configuration diagram of a control device according to some embodiments of the present application;
fig. 4 is a schematic software configuration diagram of a display device according to some embodiments of the present application;
fig. 5 is a schematic diagram of connection between a display device and a server according to some embodiments of the present application;
FIG. 6 is a schematic diagram of an architecture of an ATSC3.0 system protocol stack provided in some embodiments of the present application;
FIG. 7 is a schematic diagram of an architecture for transmitting media transport packets over an MMT protocol session according to some embodiments of the present application;
FIG. 8 is a diagram of a data sequence of MMT packets in normal order as provided by some embodiments of the present application;
FIG. 9 is a diagram of a data sequence of MMT packets in an abnormal order according to some embodiments of the present application;
FIG. 10 is a conventional flow chart of a protocol stack middleware and a player for transmitting MPU data according to some embodiments of the present application;
fig. 11 is a flowchart illustrating a media transport stream transmission method according to some embodiments of the present application;
FIG. 12 is a flowchart illustrating a method for playing media assets according to some embodiments of the present disclosure;
fig. 13 is a schematic diagram of a playing architecture of a display device according to some embodiments of the present application;
FIG. 14 is an interaction diagram of an optimization flow for transmitting MPU data between a protocol stack middleware and a player according to some embodiments of the present application;
fig. 15 is a schematic flow chart of performing data standard setting on a data transmission stream according to some embodiments of the present application;
fig. 16 is a flowchart of decapsulating player buffer data according to some embodiments of the present application;
fig. 17 is a flowchart of a data transport stream buffering method according to some embodiments of the present application;
fig. 18 is a schematic diagram of a determination flow of decapsulated player buffer data according to some embodiments of the present application.
Detailed Description
For the purposes of making the objects, technical solutions and advantages of the exemplary embodiments of the present application more apparent, the technical solutions in the exemplary embodiments of the present application will be clearly and completely described below with reference to the drawings in the exemplary embodiments of the present application, and it is apparent that the described exemplary embodiments are only some embodiments of the present application, but not all embodiments.
All other embodiments obtained by a person of ordinary skill in the art based on the exemplary embodiments shown in the present application without inventive effort fall within the scope of the present application. Furthermore, while the disclosure has been presented in terms of one or more exemplary embodiments, it should be understood that various aspects of the disclosure may also be practiced separately as complete technical solutions.
It should be understood that the terms "first," "second," "third," and the like in the description, the claims, and the above figures are used to distinguish between similar objects, and are not necessarily used to describe a particular sequence or chronological order. It is to be understood that data so used may be interchanged where appropriate, so that the embodiments of the present application can be implemented in orders other than those illustrated or described herein.
Furthermore, the terms "comprise" and "have," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a product or apparatus that comprises a list of elements is not necessarily limited to those elements expressly listed, but may include other elements not expressly listed or inherent to such product or apparatus.
The display device provided in the embodiment of the application may have various implementation forms, for example, may be a television, an intelligent television, a laser projection device, a display (monitor), an electronic whiteboard (electronic bulletin board), an electronic desktop (electronic table), and the like. Fig. 1 and 2 are specific embodiments of a display device of the present application.
Fig. 1 is a schematic diagram of an operation scenario between a display device and a control apparatus according to an embodiment. As shown in fig. 1, a user may operate the display device 200 through the smart device 300 or the control apparatus 100.
In some embodiments, the control apparatus 100 may be a remote controller. The remote controller communicates with the display device through infrared protocol communication, Bluetooth protocol communication, or other short-range communication modes, and controls the display device 200 wirelessly or by wire. The user may control the display device 200 by inputting user instructions through keys on the remote control, voice input, control panel input, and the like.
In some embodiments, a smart device 300 (e.g., mobile terminal, tablet, computer, notebook, etc.) may also be used to control the display device 200. For example, the display device 200 is controlled using an application running on a smart device.
In some embodiments, the display device may also receive the user's control through touch, gesture, or the like, rather than through the smart device or control apparatus described above.
In some embodiments, the display device 200 may also be controlled in a manner other than through the control apparatus 100 and the smart device 300. For example, a module configured inside the display device 200 for acquiring voice commands may directly receive the user's voice command control, or a voice control apparatus configured outside the display device 200 may receive the user's voice command control.
In some embodiments, the display device 200 is also in data communication with a server 400. The display device 200 may communicate via a Local Area Network (LAN), a Wireless Local Area Network (WLAN), or other networks. The server 400 may provide various contents and interactions to the display device 200. The server 400 may be one cluster or multiple clusters, and may include one or more types of servers.
Fig. 2 exemplarily shows a block diagram of a configuration of the control apparatus 100 in accordance with an exemplary embodiment. As shown in fig. 2, the control device 100 includes a controller 110, a communication interface 130, a user input/output interface 140, a memory, and a power supply. The control apparatus 100 may receive an input operation instruction of a user and convert the operation instruction into an instruction recognizable and responsive to the display device 200, and function as an interaction between the user and the display device 200.
As shown in fig. 3, the display apparatus 200 includes at least one of a modem 210, a communicator 220, a detector 230, an external device interface 240, a controller 250, a display 260, an audio output interface 270, a memory, a power supply, and a user interface.
In some embodiments, the controller includes a processor, a video processor, an audio processor, a graphics processor, RAM, ROM, and first to nth interfaces for input/output.
The display 260 includes a display screen component for presenting pictures and a driving component for driving image display; it receives image signals output by the controller and displays video content, image content, menu manipulation interfaces, and the user manipulation UI.
The display 260 may be a liquid crystal display, an OLED display, a projection device, or a projection screen.
The communicator 220 is a component for communicating with external devices or servers according to various communication protocol types. For example: the communicator may include at least one of a Wifi module, a bluetooth module, a wired ethernet module, or other network communication protocol chip or a near field communication protocol chip, and an infrared receiver. The display apparatus 200 may establish transmission and reception of control signals and data signals with the control device 100 or the server 400 through the communicator 220.
A user interface, which may be used to receive control signals from the control device 100 (e.g., an infrared remote control, etc.).
The detector 230 is used to collect signals of the external environment or of interaction with the outside. For example, the detector 230 may include a light receiver, a sensor for capturing ambient light intensity; or an image collector, such as a camera, which may be used to collect external environment scenes, user attributes, or user interaction gestures; or a sound collector, such as a microphone, which is used to receive external sounds.
The external device interface 240 may include, but is not limited to, the following: high Definition Multimedia Interface (HDMI), analog or data high definition component input interface (component), composite video input interface (CVBS), USB input interface (USB), RGB port, etc. The input/output interface may be a composite input/output interface formed by a plurality of interfaces.
The modem 210 receives broadcast television signals in a wired or wireless manner and demodulates, from among a plurality of wireless or wired broadcast television signals, audio/video signals and data signals such as EPG data.
In some embodiments, the controller 250 and the modem 210 may be located in separate devices, i.e., the modem 210 may also be located in an external device to the main device in which the controller 250 is located, such as an external set-top box or the like.
The controller 250 controls the operation of the display device and responds to the user's operations through various software control programs stored on the memory. The controller 250 controls the overall operation of the display apparatus 200. For example: in response to receiving a user command to select a UI object to be displayed on the display 260, the controller 250 may perform an operation related to the object selected by the user command.
In some embodiments, the controller includes at least one of a central processing unit (Central Processing Unit, CPU), a video processor, an audio processor, a graphics processor (Graphics Processing Unit, GPU), RAM (Random Access Memory), ROM (Read-Only Memory), first to nth interfaces for input/output, a communication bus (Bus), and the like.
The user may input a user command through a Graphical User Interface (GUI) displayed on the display 260, and the user input interface receives the user input command through the Graphical User Interface (GUI). Alternatively, the user may input the user command by inputting a specific sound or gesture, and the user input interface recognizes the sound or gesture through the sensor to receive the user input command.
A "user interface" is a media interface for interaction and exchange of information between an application or operating system and a user, which enables conversion between an internal form of information and a user-acceptable form. A commonly used presentation form of the user interface is a graphical user interface (Graphic User Interface, GUI), which refers to a user interface related to computer operations that is displayed in a graphical manner. It may be an interface element such as an icon, a window, a control, etc. displayed in a display screen of the electronic device, where the control may include a visual interface element such as an icon, a button, a menu, a tab, a text box, a dialog box, a status bar, a navigation bar, a Widget, etc.
Fig. 4 illustrates a software configuration diagram of a display device 200 according to some embodiments. In some embodiments, as shown in fig. 4, the system of the display device may include a kernel (Kernel), a command parser (shell), a file system, and application programs. The kernel, shell, and file system together form the basic operating system architecture that allows users to manage files, run programs, and use the system. After power-up, the kernel is started and kernel space is activated; the kernel abstracts hardware, initializes hardware parameters, and operates and maintains virtual memory, the scheduler, signals, and inter-process communication (IPC). After the kernel is started, the shell and user application programs are then loaded. An application program is compiled into machine code after being started, forming a process.
As shown in fig. 4, the system of the display device is divided into three layers, an application layer, a middleware layer, and a hardware layer, from top to bottom. In some embodiments, the system of the display device further includes a UI layer (not shown in the figure) located above the application layer, the UI layer receiving data transmissions of the application layer to enable a visual presentation of the display 260.
The application layer mainly comprises common applications on the television and an application framework (Application Framework). The common applications are mainly applications developed based on a browser, such as HTML5 apps, as well as native applications (Native APPs).
The application framework (Application Framework) is a complete program model with all the basic functions required by standard application software, such as file access and data exchange, and the interfaces for using these functions (toolbars, status bars, menus, dialog boxes).
Native applications (Native APPs) may support online or offline use, message pushing, or local resource access.
The middleware layer includes middleware such as various television protocols, multimedia protocols, and system components. The middleware can use basic services (functions) provided by the system software to connect various parts of the application system or different applications on the network, so that the purposes of resource sharing and function sharing can be achieved.
The hardware layer mainly comprises the HAL interface, hardware, and drivers. The HAL interface is a unified interface against which all television chips are docked, with the specific logic implemented by each chip. The drivers mainly include: audio driver, display driver, Bluetooth driver, camera driver, WiFi driver, USB driver, HDMI driver, sensor drivers (e.g., fingerprint sensor, temperature sensor, pressure sensor), power supply driver, and the like.
In some embodiments, the application layer of display device 200 contains at least one application, such as: a live television application icon control, a video on demand application icon control, a media center application icon control, an application center icon control, a game application icon control, and the like.
In some embodiments, the live television application may provide live or broadcast television through different signal sources. For example, a live television application may provide television signals using input from cable television, radio broadcast, satellite services, or other types of live television services. The live television application may display the media asset data of the live television signal on the display device 200.
In some embodiments, the video on demand application may provide video from different storage sources. Unlike live television applications, video-on-demand provides media asset data from some storage sources. For example, video-on-demand may come from a server side of cloud storage or from a local hard disk storage containing stored video programs, etc.
In some embodiments, the media center application may provide various multimedia content playing applications. For example, a media center may be a different service than live television or video on demand, and a user may access various images or audio through a media center application.
In some embodiments, an application center may be provided to store various applications. The application may be a game, an application, or some other application associated with a computer system or other device but which may be run in a smart television. The application center may obtain these applications from different sources, store them in local storage, and then run on display device 200.
It should be noted that, the media data described in the embodiments of the present application includes audio data and video data, and may be one or a combination of the two data.
Based on the above applications, in order to play corresponding media asset data on the display device 200, as shown in fig. 5, in some embodiments the display device 200 may communicate with the server 400 to implement data interaction during use. For example, the user may trigger the display device 200 to display a program list through an interactive instruction. The program list may include the title, start time, detailed description, program rating, and media asset item of each program in multiple channels, where each media asset item corresponds to a network address used to download the corresponding media asset data. The display device 200 may, in response to an interactive instruction input by the user, send an acquisition request for media asset data to the server 400. By selecting a media asset item in the program list, the user may request the server 400 to deliver the corresponding media asset data for playback by the display device 200.
After receiving the acquisition request, the server 400 may extract the media items included in the corresponding channel from the storage module according to the acquisition request, and feed back the extracted media item information to the display device 200. The display device 200 then generates a program list according to the media item information fed back by the server 400, and displays the program list in the display 260, so as to provide a good program navigation mechanism for the display device 200.
After the user selects the corresponding media items in the program list to play, the display device 200 may acquire media data from the server 400 in real time during playing the media items, form a media data stream, and continuously obtain a media picture through processing such as decoding and rendering.
In order to achieve data interaction between the display device 200 and the server 400, the display device 200 needs to establish a communication connection with the server 400. For example, the display device 200 and the server 400 may establish a communication connection through a transmission network, and interactive data may be transferred between the display device 200 and the server 400 through the transmission network.
In some embodiments, it may be necessary to provide components for establishing a communication connection on the display device 200 and the server 400, respectively. That is, as shown in fig. 5, a communicator 220 may be provided in the display device 200, and a communication module may be provided in the server 400, and the communicator 220 and the communication module may simultaneously support at least one identical communication manner to establish a communication connection relationship. For example, the communicator 220 on the display device 200 includes a fiber optic interface, such that the display device 200 may connect to a network through the fiber optic interface; meanwhile, the communication module of the server 400 also includes an optical fiber interface, and the network may also be connected through the optical fiber interface, so as to implement communication connection between the display device 200 and the server 400.
It should be noted that, the display device 200 and the server 400 may also use other connection methods to establish a communication connection relationship. Such as wired broadband, wireless local area network, cellular network, bluetooth, infrared, radio frequency communications, etc.
The display device 200 and the server 400 may be in a "many-to-one" connection relationship, i.e., multiple display devices 200 may establish communication connections with the same server 400, so that the server 400 can provide services to multiple display devices 200. The display device 200 and the server 400 may also be in a "many-to-many" connection relationship, i.e., multiple display devices 200 may establish communication with multiple servers 400, so that the multiple servers 400 can respectively provide different services for the display devices 200. In some application scenarios, there may also be a "one-to-one" connection between the display device 200 and the server 400, i.e., one server 400 is dedicated to serving one display device 200.
In order to be able to provide services to the display device 200, a storage module may be further included in the server 400, and the storage module may store various resource data, information files, and control programs. In response to the user's interaction, the display device 200 may acquire different data in the storage module of the server 400, for example, the display device 200 may send a play data acquisition request to the server 400 when a certain media item is requested, and after receiving the request, the server 400 may extract the media data to be played in the storage module and transmit the media data to the display device 200, so that the display device 200 may decode and display the media data. The control program stored in the storage module may be executed by the control module of the server 400, so that the control module may perform a corresponding function according to the control program.
Based on the display device 200, in order to play the media data of the access channel, in some embodiments, the display device 200 may transmit the media data corresponding to the channel accessed by the display device 200 based on an advanced television systems committee (Advanced Television Systems Committee, ATSC) 3.0 protocol stack to form a data transmission stream of the media data. As shown in fig. 6, the protocol layer of the ATSC3.0 protocol stack uses the all-IP protocol, and the ATSC3.0 protocol stack can provide not only a broadcasting service for the display device 200 but also an interactive service for the display device 200 due to the support of the bi-directional channel. Wherein the broadcast service is based on the UDP/IP protocol and the interactive service is based on the TCP/IP protocol.
Accordingly, the display apparatus 200 may transmit media asset data through a broadcast service provided by the ATSC3.0 protocol stack. In some embodiments, the broadcast services provided by the ATSC3.0 protocol stack include the MPEG Media Transport Protocol (MMTP) and Real-time Object delivery over Unidirectional Transport (ROUTE). MMTP is used to transmit media processing units (Media Processing Unit, MPU) and MMT-specific signaling; an MPU is the basic encapsulation unit in MPEG media transport, based on the ISO Base Media File Format (ISO BMFF), and MMT-specific signaling may include two types of signaling, for consumption and for presentation. ROUTE is used to transport DASH segments, ROUTE-specific signaling, and non-timed content. DASH segments use the ISO BMFF-based encapsulation format of Dynamic Adaptive Streaming over HTTP, and non-timed content may include non-timed media content, EPG data, etc.
It should be noted that an MMTP packet is carried over the ALP/IP/UDP/MMTP protocol stack, which delivers the data required by the designated MMT program during playback. The ALP protocol is the link-layer protocol in the ATSC3.0 protocol stack; the IP protocol is the protocol for interconnection between networks; and the UDP protocol is the User Datagram Protocol.
Also, in some embodiments, non-timed content may be transmitted directly over UDP. Signaling of the ATSC3.0 protocol stack may be distributed by MMTP and/or ROUTE, while bootstrap signaling information may be provided in the form of a Service List Table (SLT).
As shown in fig. 6, in some embodiments, to implement hybrid services, one or more program elements in the display device 200 are transmitted over a broadband path. The ATSC3.0 protocol stack at the broadband end of the display device 200 employs MPEG-DASH over the HTTP/TCP/IP protocol layers, while using the ISO BMFF-based MPU and DASH files as the encapsulation and synchronization formats for broadcast and broadband delivery.
In some embodiments, the display device 200 transmits MMT signaling information in signaling-message form within the MMT protocol sessions that carry MMT data packets. Each MMT protocol session needs to carry specific MMT signaling information along with each component data it transmits. MMT signaling may include Media Presentation Information (MPI) signaling, hypothetical receiver buffer model signaling, receiver buffer model removal signaling, Clock Relation Information (CRI) signaling, etc. The media presentation information signaling carries all or part of the presentation information document.
As shown in FIG. 7, in some embodiments, one MMT Asset corresponds to one content component, with a corresponding component ID, i.e., packet-ID. Such as 0x0001, 0x0002 and 0x0003 shown in fig. 7. Each MMT Asset is a set of one or more media processing units with the same Asset ID, and there is no overlap in presentation time for the media processing units; one MMT package is a set of one or more MMT Asset, such as Asset a, asset B, and Asset C in fig. 7. As shown in fig. 7, the data transport stream of the media data is composed of one or more MMT packages (media transport packets), and there is no overlap in presentation time of the MMT packages.
It should be noted that, the mapping information between the MMT package and the MMT protocol session is transferred to the receiving end of the display device 200 by the MMT signaling information.
In some embodiments, a complete MMT package is an MPU. The data in the MPU includes MPU metadata, fragment (Movie Fragment) metadata, and MFU (Media Fragment Unit) data. An MFU is an I-frame, and the I-frame contains all the picture information, which affects the presentation quality of the playing picture of the media asset data. When media asset data is transmitted, the data is cut into UDP packets for transmission, and each MPU includes MPU metadata, fragment metadata, and a plurality of MFUs. The display device 200 can play and display the received MFU data only after receiving the MPU metadata and the fragment metadata.
Thus, in some embodiments, the display device 200 also needs to detect the order of the individual data units in the MMT package when transmitting the MMT package. The display device 200 can play the media asset data normally only when the MMT package is ordered as MPU metadata, fragment metadata, and MFU data.
For example: as shown in fig. 8, the MPU structure shown in fig. 8 is an MMT package in normal order. As can be seen from fig. 8, the MMT package contains, in order, MPU metadata, fragment metadata, and MFU data. When such an MMT package is transmitted through the MMT protocol, it does not need to be re-packaged, and the complete MMT package can be directly transmitted to the player of the display device 200.
Note that, in the MPU structure shown in fig. 8 in the embodiment of the present application, the "mdat" header data including MFU data is also processed as fragment metadata.
Obviously, when the order of the MPU structure is abnormal, in order for the display device 200 to play the MMT package of the media data normally, the display device 200 needs to package the MMT package of the abnormal order again to form the MMT package of the normal order. That is, in some embodiments, when the order of the MPU structure is an abnormal order, the display apparatus 200 re-packages the MMT package in the normal order.
For example: as shown in fig. 9, the MPU structure shown in fig. 9 is an out-of-order MMT package, and the display device 200 needs to reassemble the MMT package in the order MPU metadata → fragment metadata → MFU data (mpu metadata → fragment metadata → mdat) to form the MMT package in the normal order shown in fig. 8.
Accordingly, when playing the media asset data, the display device 200 needs to acquire media asset data composed of a plurality of MPU data units to continuously form the playing picture of the media asset data in the display 260. That is, in some embodiments, the display device 200 enables interaction of the media asset data with the player via the protocol stack. When playing the media asset data, the display device 200 starts the player through the protocol stack middleware, and the player requests MPU data from the protocol stack middleware to acquire the media asset data. As shown in fig. 10, in response to the request sent by the player, the protocol stack middleware of the display apparatus 200 receives the MPU data corresponding to the media asset data sent by the server 400. After the protocol stack middleware receives the complete MPU data, it returns the complete MPU data to the player. After receiving the complete MPU data, the player can perform decoding and playing on the MPU data and request the next MPU data from the protocol stack middleware.
However, when a certain time is required for the protocol stack of the display device 200 to acquire the complete MPU data, the drawing speed of the display device 200 may be delayed. In this way, a user may experience a longer waiting time while viewing a channel program through the display device 200. For example, when a user continuously switches channels in the display apparatus 200, the display apparatus 200 needs to continuously acquire new signaling data and MPU data, and since the protocol stack of the display apparatus 200 needs to wait for complete MPU data, a problem of frame discontinuity may occur, resulting in a decrease in the playing efficiency of media data. If the protocol stack of the display device 200 does not wait for the complete MPU data, the received MPU data is directly injected into the player, and the display device 200 may not play the media data normally due to the MPU data transmitted in disorder.
Based on the above application scenario, in order to improve the problem of the degradation of the playing efficiency of the media data in the display device 200, some embodiments of the present application provide a display device 200, as shown in fig. 11, including a display 260 and a controller 250. Wherein the display 260 is configured to display a play screen of the media asset data; as shown in fig. 12, the controller 250 is configured to perform the following program steps:
S100: and responding to the playing instruction of the media asset data, and acquiring the data transmission stream of the media asset data.
After receiving the play instruction of the media data, the display device 200 acquires MMT package of the media data, that is, MPU data, from the server 400 based on the protocol stack to form a data transport stream of the media data. Wherein the data transport stream includes MPU metadata, segment metadata, and MFU data.
The play instruction of the media asset data may be generated according to a manipulation event of the upper layer application, and thus, in some embodiments, the display apparatus 200 monitors the manipulation event of the upper layer application. The manipulation event may be a manipulation event for increasing volume, a manipulation event for adjusting brightness, a manipulation event for switching channels, a manipulation event for accessing channels, etc. After detecting a manipulation event of switching a channel or accessing a channel, the display apparatus 200 detects a target channel to which the display apparatus 200 is accessed in response to the manipulation event of switching a channel or accessing a channel.
For example: the channel accessed by the display device 200 when the device is started is DC a, and the target channel is DC a at the moment; the user tunes the channel to DC b through the remote control device associated with the display apparatus 200, and the target channel is DC b. That is, the target channel is the channel to which the display apparatus 200 is currently connected, and is not fixed.
After detecting the target channel, the display device 200 queries the media data corresponding to the target channel, and generates a play instruction according to the media data. After the display device 200 generates a play command for playing the media asset data, the data transmission stream of the media asset data may be acquired in response to the play command.
In order to facilitate filtering of the media asset data when querying the media asset data corresponding to the target channel, in some embodiments, the display device 200 establishes a media asset transmission channel according to the media transmission protocol of the media asset data. For example, an MMT protocol session is created according to the MMT protocol. After the media asset transmission channel is established, media asset presentation information signaling is acquired through the media asset transmission channel. The media asset presentation information signaling includes a presentation information table and a component description table. These two tables are used for filtering the media asset data, and the component description table (User Service Description, USD) includes the ID of each component, i.e., packet_id. Therefore, the media asset data, i.e., the MPU data corresponding to the media asset data, is queried according to the presentation information table and the component description table.
In some embodiments, the display device 200 may obtain the component data corresponding to the media asset data by matching the packet_id value of the MMT package. For example, the corresponding asset data may be queried according to the packet_id recorded in the USD to form MPU data of the asset data.
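As a rough illustration of this filtering step, the following Python sketch selects the MMT packets whose packet_id matches a component ID recorded in the USD. The dictionary field names are assumptions made for illustration, not the on-wire MMTP layout.

```python
def select_asset_packets(packets, usd_packet_ids):
    """Keep only the MMT packets whose packet_id is listed in the
    component description table (USD); field names are illustrative."""
    wanted = set(usd_packet_ids)
    return [p for p in packets if p["packet_id"] in wanted]
```

For instance, with `usd_packet_ids = [0x0001, 0x0002]`, only the packets of Asset A and Asset B would be kept to form the MPU data of the media asset.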
Thus, in some embodiments, when the display device 200 obtains the data transport stream of the media data, the channel interface of the protocol stack middleware is invoked, and the target channel accessed by the display device 200 is controlled through the channel interface. And then receiving the media information signaling and media data of the target channel.
For example: as shown in fig. 13, the display apparatus 200 transmits media asset data based on an ATSC3.0 protocol stack, where ATSC3.0 contains both protocols for ROUTE and MMTP transmission data. After the user inputs the control event of switching channels in the upper layer application of the display device 200, the display device 200 invokes a channel interface for switching to the protocol stack middleware, starts a switching process in the display device 200, and receives media presentation information signaling and media data of the target channel.
It can be understood that, when switching the target channel, the display device 200 may receive, in addition to the media presentation signaling data, other signaling of the media data, for example, CRI signaling, receiver buffer model removal signaling, and hypothesized receiver buffer model signaling, so that the display device 200 may play the media data corresponding to the target channel.
S200: MPU metadata, clip metadata, and transmission order of MFU data in a data transmission stream are detected.
After obtaining the data transport stream of the media asset data, the display device 200 also detects the transmission order of the data in the data transport stream. Since the display apparatus 200 cannot normally play media asset data whose MPU data is out of order, the display apparatus 200 needs to detect in the protocol stack middleware whether the transmission order of the MPU data is normal when obtaining the data transport stream. That is, when the protocol stack middleware of the display apparatus 200 receives the data transport stream, it detects whether the order of the data units in the data transport stream is, in sequence, MPU metadata, fragment metadata, and MFU data, that is, the order shown in fig. 8.
Accordingly, the display apparatus 200 may set the order shown in fig. 8 as the target order to measure whether the data transmission stream of the media asset data is out of order by the target order. That is, in some embodiments, the target order is MPU metadata, segment metadata, and MFU data in that order.
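The order check described above can be sketched as follows. This is a minimal illustration, assuming each received unit has already been classified as MPU metadata, fragment metadata, or MFU data; an MPU may carry many consecutive MFU units.

```python
from enum import IntEnum

class UnitType(IntEnum):
    MPU_METADATA = 0
    FRAGMENT_METADATA = 1
    MFU_DATA = 2

# Target order: MPU metadata first, then fragment metadata, then MFU data.
TARGET_ORDER = [UnitType.MPU_METADATA, UnitType.FRAGMENT_METADATA, UnitType.MFU_DATA]

def is_target_order(units):
    """Return True if the unit types observed for one MPU arrive in the
    target order; consecutive duplicates (many MFUs) are collapsed."""
    collapsed = []
    for u in units:
        if not collapsed or collapsed[-1] != u:
            collapsed.append(u)
    return collapsed == TARGET_ORDER
```

When `is_target_order` returns True, the stream can be forwarded to the player as-is; otherwise the repackaging path of step S400 is taken.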
S300: if the transmission order is the target order, the data transmission stream is injected into the player to perform decoding playing on the data transmission stream in real time through the player.
Since the data transport stream of the media asset data is composed of a plurality of MMT packages, that is, a plurality of MPU data units, in order to ensure that the MPU data corresponding to the media asset data can be played normally, the display apparatus 200 may compare the detected transmission order with the target order after detecting the order of the data transport stream. That is, the display apparatus 200 needs to detect the transmission order of each MPU data unit in the data transport stream to determine whether the currently transmitted MPU data is in the target order.
Since the display device 200 can display and play the received MFU data only after receiving the MPU metadata and the fragment metadata, the MPU data can be played normally when its transmission order is the target order of MPU metadata, fragment metadata, and MFU data in sequence. At this time, in order to shorten the waiting time of the player, when the data transport stream of the media asset data is in the target order, the display device 200 injects the current MPU data into the player, so that the player can perform decoding and playing on the data transport stream in real time, improving the drawing speed of the player.
For example: the display apparatus 200 detects that the transmission order of the current data transport stream is, in sequence, MPU metadata, fragment metadata, and MFU data, which shows that the transmission order of the current data transport stream is the target order. As shown in fig. 14, the display apparatus 200 transmits the received data transport stream to the player in real time so that the player can perform decoding and playing on the data transport stream in real time. After the current MPU data is received, the player requests the next MPU data from the protocol stack middleware, and the display apparatus 200 continues to detect the transmission order of the MPUs, so as to switch the transmission mode of the MPU data according to the transmission order. Obviously, compared with the interaction mode shown in fig. 10, the interaction mode shown in fig. 14 can shorten the waiting time of the player in the display device 200 and improve the drawing speed of the display device 200.
S400: if the transmission order is not the target order, encapsulating the data transmission stream into media transmission packets according to the target order, and injecting the media transmission packets into the player.
Similarly, in order to enable the data transport stream of the media asset data to be played normally, when the MPU data currently transmitted in the data transport stream is not in the target order, the current MPU data is arranged out of order. Since the player cannot normally play MPU data arranged out of order, the MPU data needs to be repackaged. Therefore, when the data transport stream of the media asset data is not in the target order, the display apparatus 200 repackages the data transport stream into media transport packets according to the target order, i.e., the order shown in fig. 8. Thus, the order of the media transport packet is, in sequence, MPU metadata, fragment metadata, and MFU data, and the player of the display device can normally display the playing picture of the media transport packet. After the encapsulation is complete, the display apparatus 200 injects the encapsulated media transport packets into the player of the display apparatus 200 so that the display apparatus 200 can perform decoding and playing on the media transport packets.
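The repackaging step can be sketched as follows: a minimal Python illustration, assuming each received unit is a `(unit_type, payload)` tuple where the type code 0/1/2 denotes MPU metadata / fragment metadata / MFU data (an assumed encoding for this sketch).

```python
def repackage(units):
    """Reassemble one out-of-order MPU into the target order:
    MPU metadata, then fragment metadata, then MFU data in arrival order.
    `units` is a list of (unit_type, payload) tuples."""
    mpu_meta = [p for t, p in units if t == 0]
    frag_meta = [p for t, p in units if t == 1]
    mfus = [p for t, p in units if t == 2]
    # Media transport packet in the normal order expected by the player.
    return mpu_meta + frag_meta + mfus
```

The resulting list corresponds to the normal-order MMT package of fig. 8 and can be injected into the player as one media transport packet.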
The process of receiving and detecting the transmission sequence by the display device 200 is implemented based on the protocol stack middleware, and in order for the protocol stack middleware of the display device 200 to inject data into the player, the protocol stack middleware needs to establish a connection between the protocol stack middleware and the player before injecting data into the player.
Thus, in some embodiments, the display device 200 initiates a media server of the protocol stack middleware and establishes a connection relationship of the protocol stack middleware with the player through the media server. After establishing the connection between the protocol stack middleware and the player, the display device 200 may call the player based on the connection, and inject the data transmission stream or the media transmission packet into the player, so that the player may perform decoding and playing on the data transmission stream or the media transmission packet.
For example: as shown in fig. 13, the protocol stack middleware of the display apparatus 200 includes an AV server module. The display device 200 starts the AV server module to establish a connection with the player before transmitting the data transmission stream of the media data. After the connection is established, the protocol stack middleware can inject the transmission data stream or the media transmission packet of the media data into the player.
To facilitate detection of the transmission order of the data transport stream, in some embodiments, the FT field in the header of a metadata unit is set to 0 (MPU metadata) or 1 (fragment metadata), and the FT field in the header of MFU data is set to 2. In this way, the boundary of a metadata unit or an MFU data unit can be explicitly characterized by the FT field, while carrying the minimal information needed to recover the association between MFU data and metadata, such as movie fragment sequence numbers and sample sequence numbers.
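Following the FT-field convention described above, classifying an incoming unit reduces to a simple mapping; the function below is a sketch of that convention (the string labels are illustrative).

```python
def classify_by_ft(ft):
    """Map the FT field of a payload header to a data type under the
    convention described above: 0 or 1 marks metadata (MPU metadata /
    fragment metadata), 2 marks MFU data."""
    if ft in (0, 1):
        return "metadata"
    if ft == 2:
        return "mfu"
    return "unknown"
```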
Thus, as shown in fig. 15, in some embodiments, the protocol stack middleware of the display device 200, after receiving the data transport stream of the media data, also detects the data type of the data in the data transport stream. If the data type is MPU metadata, the display device 200 is set according to MPU metadata standards; if the data type is fragment metadata, the display apparatus 200 is set according to the fragment metadata standard; if the data type is MFU data, the display device 200 is set according to the MFU standard. That is, the protocol stack middleware of the display apparatus 200 sets the data transport stream through the corresponding data standard so that the display apparatus 200 performs subsequent processing of the data transport stream.
After the protocol stack middleware of the display device 200 injects data into the player, the player may perform decoding playback on the injected data. Thus, in some embodiments, the display device 200 initializes a decoder of the player and performs decoding of the data transport stream by the decoder. After decoding the data transport stream, the display device 200 invokes the underlying resource to render a play-out of the media asset data in the display 260 to form a continuous play-out.
Based on the above embodiments, some embodiments of the present application further provide a display device 200, including a display 260 and a controller 250. Wherein the display 260 is configured to display a play screen of the media asset data; as shown in fig. 16, the controller 250 is configured to perform the following program steps:
S1: and receiving a data transmission stream of the media asset data in response to a playing instruction of the media asset data.
After receiving the play instruction of the media data, the display device 200 acquires MMT package of the media data, that is, MPU data, from the server 400 based on the protocol stack to form a data transport stream of the media data. Wherein the data transport stream includes MPU metadata, segment metadata, and MFU data.
S2: and caching the data transmission stream to a player cache area.
In order to accelerate the decoding speed of the player, the display device 200 may create a player buffer at the player end to buffer the data transmission stream of the media data in the player buffer. Therefore, by buffering the data transport stream to the player, the transmission efficiency of the data transport stream can be improved, and the response speed of the display device 200 can be increased.
In some embodiments, the player buffer includes a metadata buffer area and an MFU data buffer area. The metadata buffer area is used for caching MPU metadata and fragment metadata, and the MFU data buffer area is used for caching MFU data. Accordingly, as shown in fig. 17, when buffering the data transport stream to the player buffer, the display apparatus 200 also detects the data type of the data in the data transport stream. If the data type is metadata, the metadata is cached in the metadata buffer area; the metadata includes MPU metadata and fragment metadata. That is, when the data of the data transport stream is MPU metadata or fragment metadata, the MPU metadata or fragment metadata is cached in the metadata buffer area on the player side. Similarly, if the data type is MFU data, the MFU data is cached in the MFU data buffer area. That is, when the data of the data transport stream is MFU data, the MFU data is cached in the MFU buffer area on the player side.
Further, in some embodiments, when the display device 200 detects the data type of the data transport stream and the data is neither metadata nor MFU data, the data is discarded to reduce the system resource consumption occupied by the player buffer in the display device 200.
S3: when the MPU metadata and the fragment metadata are cached in the player cache region, the data in the player cache region is unpacked, and the data in the player cache region is injected into the player.
After the display device 200 buffers the data transport stream of the media asset data in the player buffer, the data can be stored separately as metadata and MFU data. Since the display device 200 can play and display a received MFU only after receiving the MPU metadata and the fragment metadata, the display device 200 decapsulates the data in the player buffer only after the MPU metadata and the fragment metadata have been received there, and then injects the decapsulated data into the player, so that the player can decode and play the data cached in the player buffer regardless of the transmission order of the data transport stream.
For example: as shown in fig. 18, the protocol stack of the display device 200 sends a data transmission stream of media data to the player, and the display device 200 caches the metadata and MFU data in the corresponding player cache area by detecting the data type of the data transmission stream, and detects whether the complete metadata is received in the player cache area. When the complete metadata is received in the player buffer, the data in the player buffer is unpacked, so that the player can decode and play the data in the player buffer.
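The completeness check and decapsulation can be sketched as a single function: inject only once both metadata units are present in the metadata buffer area, otherwise keep buffering. The unit labels and list shapes are assumptions for illustration.

```python
def maybe_inject(metadata_units, mfu_units):
    """Return the decapsulated injection payload (metadata first, then the
    buffered MFUs) once the metadata buffer holds both MPU metadata and
    fragment metadata; otherwise return None and keep buffering."""
    types = {t for t, _ in metadata_units}
    if {"mpu_metadata", "fragment_metadata"} <= types:
        return [p for _, p in metadata_units] + list(mfu_units)
    return None
```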
Based on the display device 200, some embodiments of the present application further provide a media playing method, as shown in fig. 12, where the method includes the following program steps:
s100: responding to a playing instruction of the media data, and acquiring a data transmission stream of the media data; the data transmission stream comprises MPU metadata, fragment metadata and MFU data;
s200: detecting the transmission sequence of MPU metadata, fragment metadata and MFU data in the data transmission stream;
s300: if the transmission sequence is the target sequence, the data transmission stream is injected into a player so as to execute decoding playing on the data transmission stream in real time through the player;
S400: and if the transmission sequence is not the target sequence, packaging the data transmission stream into media transmission packets according to the target sequence, and injecting the media transmission packets into the player.
According to the technical scheme, the display equipment and the media asset playing method provided by some embodiments of the application can respond to the playing instruction of the media asset data to acquire the data transmission stream of the media asset data. Wherein the data transport stream includes MPU metadata, segment metadata, and MFU data. And detecting the transmission sequence of the MPU metadata, the fragment metadata and the MFU data in the data transmission stream. If the transmission order is the target order, the data transmission stream is injected into the player to perform decoding playing on the data transmission stream in real time through the player. And if the transmission order is not the target order, encapsulating the data transmission stream into media transmission packets according to the target order, and injecting the media transmission packets into the player. The method can directly inject the data transmission stream into the player when the transmission sequence of the data transmission stream is the target sequence, can accelerate the drawing speed of the display equipment and improves the playing efficiency of the media data.
Finally, it should be noted that: the above embodiments are only for illustrating the technical solution of the present application, and are not limited thereto; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some or all of the technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit of the corresponding technical solutions from the scope of the technical solutions of the embodiments of the present application.
The foregoing description, for purposes of explanation, has been presented in conjunction with specific embodiments. However, the illustrative discussions above are not intended to be exhaustive or to limit the embodiments to the precise forms disclosed above. Many modifications and variations are possible in light of the above teaching. The embodiments were chosen and described in order to best explain the principles and the practical application, to thereby enable others skilled in the art to best utilize the embodiments and various embodiments with various modifications as are suited to the particular use contemplated.

Claims (10)

CN202310575869.8A2023-05-192023-05-19Display equipment and media asset playing methodPendingCN117812341A (en)

Priority Applications (2)

Application Number    Priority Date    Filing Date    Title
CN202310575869.8A     2023-05-19       2023-05-19     Display equipment and media asset playing method (CN117812341A)
PCT/CN2023/140315     2023-05-19       2023-12-20     Display device and media asset playing method (WO2024239631A1)

Applications Claiming Priority (1)

Application Number    Priority Date    Filing Date    Title
CN202310575869.8A     2023-05-19       2023-05-19     Display equipment and media asset playing method (CN117812341A)

Publications (1)

Publication Number    Publication Date
CN117812341A (en)     2024-04-02

Family

ID=90425561

Family Applications (1)

Application Number    Priority Date    Filing Date    Title
CN202310575869.8A     2023-05-19       2023-05-19     Display equipment and media asset playing method (CN117812341A, Pending)

Country Status (1)

Country    Link
CN (1)     CN117812341A (en)

Similar Documents

Publication               Title
CN114302219B (en)         Display equipment and variable frame rate display method
CN112367543B (en)         Display device, mobile terminal, screen projection method and screen projection system
CN113141524B (en)         Resource transmission method, device, terminal and storage medium
US20090300108A1 (en)      Information Processing System, Information Processing Apparatus, Information Processing Method, and Program
CN113141523B (en)         Resource transmission method, device, terminal and storage medium
CN115209208B (en)         Video cyclic playing processing method and device
WO2020098504A1 (en)       Video switching control method and display device
CN114630101B (en)         Display device, VR device and display control method of virtual reality application content
CN113630654B (en)         Display equipment and media resource pushing method
WO2021217435A1 (en)       Streaming media synchronization method and display device
CN117157987A (en)         Split-screen playing method and display device
CN114615529B (en)         Display device, external device and audio playback method
CN111654753B (en)         Application program starting method and display device
CN117812341A (en)         Display equipment and media asset playing method
CN115270030A (en)         Display device and media asset playing method
CN117609651A (en)         Display equipment and webpage screen projection method
CN115623275A (en)         Subtitle display method and display equipment
CN113992963A (en)         Display device and screen projection method
CN113596546A (en)         Multi-stream program playing method and display equipment
CN115174991B (en)         Display equipment and video playing method
CN111629250A (en)         Display device and video playing method
US20240427583A1 (en)      Display apparatus and processing method for display apparatus
CN117749906A (en)         Display device and broadcast program playing method
CN117915139A (en)         Display equipment and sound and picture synchronization method
WO2024239631A1 (en)       Display device and media asset playing method

Legal Events

Code    Description
PB01    Publication
SE01    Entry into force of request for substantive examination
