Detailed Description
To make the purpose and embodiments of the present application clearer, the exemplary embodiments of the present application will be described clearly and completely below with reference to the accompanying drawings. Obviously, the described exemplary embodiments are only some of the embodiments of the present application, not all of them.
It should be noted that the brief descriptions of the terms in the present application are only for the convenience of understanding the embodiments described below, and are not intended to limit the embodiments of the present application. These terms should be understood in their ordinary and customary meaning unless otherwise indicated.
The terms "first," "second," "third," and the like in the description and claims of this application and in the foregoing drawings are used for distinguishing between similar or analogous objects or entities and are not necessarily intended to limit the order or sequence in which they are presented unless otherwise indicated. It is to be understood that the terms so used are interchangeable under appropriate circumstances.
The terms "comprises" and "comprising," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a product or apparatus that comprises a list of elements is not necessarily limited to all elements expressly listed, but may include other elements not expressly listed or inherent to such product or apparatus.
The display device provided by the embodiments of the present application may have various implementation forms; for example, the display device may be a television, a smart television, a laser projection device, a display (monitor), an electronic whiteboard, an electronic table, and the like. The embodiments of the present application do not limit the specific form of the display device. In the embodiments of the present application, a television is taken as an example of the display device for schematic description. Fig. 1 and Fig. 2 show specific embodiments of a display device of the present application.
Fig. 1 illustrates a usage scenario diagram of a display device according to some embodiments of the present application. As shown in fig. 1, a user may operate the display device 200 through the smart device 300 or the control device 100.
In some embodiments, the control device 100 may be a remote controller. The communication between the remote controller and the display device includes infrared protocol communication, Bluetooth protocol communication, and other short-distance communication methods, and the remote controller controls the display device 200 in a wireless or wired manner. The user may input user instructions through keys on the remote controller, voice input, control panel input, etc., to control the display device 200.
In some embodiments, the smart device 300 (e.g., a mobile terminal, a tablet, a computer, a laptop, etc.) may also be used to control the display device 200. For example, the display device 200 is controlled using an application program running on the smart device.
In some embodiments, the display device 200 may receive the user's control through touch, gestures, etc., instead of receiving instructions through the smart device 300 or the control device 100 described above.
In some embodiments, the display device 200 may also be controlled in manners other than through the control device 100 and the smart device 300. For example, a voice command from the user may be received directly by a module configured inside the display device 200 for acquiring voice commands, or by a voice control device provided outside the display device 200.
In some embodiments, the display device 200 is also in data communication with a server 400. The display device 200 may be communicatively connected through a local area network (LAN), a wireless local area network (WLAN), or other networks. The server 400 may provide various contents and interactions to the display device 200. The server 400 may be one cluster or a plurality of clusters, and may include one or more types of servers.
Fig. 2 shows a block configuration diagram of the control device 100 according to some embodiments of the present application. As shown in fig. 2, the control device 100 includes a controller 110, a communication interface 130, a user input/output interface 140, a memory, and a power supply. The control device 100 may receive an input operation instruction from a user and convert the operation instruction into an instruction that the display device 200 can recognize and respond to, thereby mediating the interaction between the user and the display device 200.
Fig. 3 shows a hardware configuration block diagram of a display device 200 according to some embodiments of the present application. As shown in fig. 3, the display device 200 includes at least one of a tuner demodulator 210, a communicator 220, a detector 230, an external device interface 240, a controller 250, a display 260, an audio output interface 270, a memory, a power supply, and a user interface.
In some embodiments, the controller 250 includes a processor, a video processor, an audio processor, a graphics processor, a RAM, a ROM, and a first interface to an nth interface for input/output.
The display 260 includes a display screen component for presenting pictures and a driving component for driving image display, as well as a component for receiving image signals output from the controller 250 and displaying video content, image content, menu manipulation interfaces, and user manipulation UI interfaces.
The display 260 may be a liquid crystal display, an OLED display, or a projection display, and may also be a projection device and a projection screen.
The communicator 220 is a component for communicating with an external device or a server according to various communication protocol types. For example, the communicator may include at least one of a Wi-Fi module, a Bluetooth module, a wired Ethernet module, other network communication protocol chips or near field communication protocol chips, and an infrared receiver. The display device 200 may transmit and receive control signals and data signals to and from the control device 100 or the server 400 through the communicator 220.
The user interface may be used to receive control signals from the control device 100 (e.g., an infrared remote controller, etc.).
The detector 230 is used to collect signals of the external environment or signals of interaction with the outside. For example, the detector 230 includes a light receiver, a sensor for collecting ambient light intensity; alternatively, the detector 230 includes an image collector, such as a camera, which may be used to collect external environment scenes, attributes of the user, or user interaction gestures; or the detector 230 includes a sound collector, such as a microphone, which is used to receive external sounds.
The external device interface 240 may include, but is not limited to, the following: a High Definition Multimedia Interface (HDMI), an analog or data high definition component input interface (Component), a composite video input interface (CVBS), a USB input interface (USB), an RGB port, and the like. The interface may also be a composite input/output interface formed by the above-mentioned plurality of interfaces.
The tuner demodulator 210 receives broadcast television signals in a wired or wireless manner, and demodulates audio/video signals, as well as EPG data signals, from a plurality of wireless or wired broadcast television signals.
In some embodiments, the controller 250 and the tuner demodulator 210 may be located in different separate devices; that is, the tuner demodulator 210 may also be located in a device external to the main device where the controller 250 is located, such as an external set-top box.
The controller 250 controls the operation of the display device and responds to the user's operations through various software control programs stored in the memory. The controller 250 controls the overall operation of the display device 200. For example, in response to receiving a user command for selecting a UI object to be displayed on the display 260, the controller 250 may perform an operation related to the object selected by the user command.
In some embodiments, the controller 250 includes at least one of a central processing unit (CPU), a video processor, an audio processor, a graphics processing unit (GPU), a random access memory (RAM), a read-only memory (ROM), a first interface to an nth interface for input/output, a communication bus (Bus), and the like.
The user may input a user command through a graphical user interface (GUI) displayed on the display 260, and the user input interface receives the user input command through the GUI. Alternatively, the user may input a user command through a specific sound or gesture, and the user input interface recognizes the sound or gesture through a sensor to receive the user input command.
A "user interface" is a media interface for interaction and information exchange between an application or operating system and a user that enables the conversion of the internal form of information to a form acceptable to the user. A commonly used presentation form of the User Interface is a Graphical User Interface (GUI), which refers to a User Interface related to computer operations and displayed in a graphical manner. It may be an interface element such as an icon, a window, a control, etc. displayed in the display screen of the electronic device, where the control may include a visual interface element such as an icon, a button, a menu, a tab, a text box, a dialog box, a status bar, a navigation bar, a Widget, etc.
Fig. 4 shows a software configuration block diagram of thedisplay device 200 according to some embodiments of the present application. As shown in fig. 4, in some embodiments, the system is divided into four layers, which are, from top to bottom, an Application (Applications) layer (abbreviated as "Application layer"), an Application Framework (Application Framework) layer (abbreviated as "Framework layer"), an Android runtime (Android runtime) and system library layer (abbreviated as "system runtime library layer"), and a kernel layer.
In some embodiments, at least one application program runs in the application layer. These applications may be window (Window) programs built into the operating system, system setting programs, clock programs, or the like; they may also be applications developed by third-party developers. In particular implementations, the applications in the application layer are not limited to the above examples.
The framework layer provides an application programming interface (API) and a programming framework for the applications. The application framework layer includes a number of predefined functions. The application framework layer acts as a processing center that decides the actions of the applications in the application layer. Through the API interface, an application can access the resources in the system and obtain the services of the system during execution.
In some embodiments, the system runtime library layer provides support for the upper layer, i.e., the framework layer. When the framework layer is used, the Android operating system runs the C/C++ libraries included in the system runtime library layer to implement the functions required by the framework layer.
In some embodiments, the kernel layer is a layer between hardware and software. The kernel layer includes at least one of the following drivers: an audio driver, a display driver, a Bluetooth driver, a camera driver, a Wi-Fi driver, a USB driver, an HDMI driver, sensor drivers (such as a fingerprint sensor, a temperature sensor, a pressure sensor, etc.), a microphone (MIC) driver, a power driver, and the like.
Fig. 5 illustrates an icon control interface display of applications in the display device 200 according to some embodiments of the present application. In some embodiments, as shown in fig. 5, the application layer containing at least one application may display corresponding icon controls on the display, such as a live television application icon control, a video-on-demand application icon control, a media center application icon control, an application center icon control, a game application icon control, and the like.
In some embodiments, the live television application may provide live television via different signal sources. For example, the live television application may provide television signals using inputs from cable television, radio broadcasts, satellite services, or other types of live television sources. The live television application may display the video of the live television signal on the display device 200.
In some embodiments, a video-on-demand application may provide video from different storage sources. Unlike the live television application, video on demand provides video playback from certain storage sources. For example, the video on demand may come from the server side of a cloud storage, or from a local hard disk storage containing stored video programs.
In some embodiments, the media center application may provide playback of various multimedia content. For example, a media center may provide services other than live television or video on demand; a user may access various images or audio through the media center application.
In some embodiments, an application center may provide storage for various applications. An application may be a game, an application program, or some other application that is associated with a computer system or other device and can run on the smart television. The application center may obtain these applications from different sources, store them in local storage, and then run them on the display device 200.
In some embodiments, when the display device performs live broadcast using the configured player, live broadcast data is mainly transmitted using HTTP Live Streaming (HLS), an adaptive bitrate streaming protocol based on the Hypertext Transfer Protocol (HTTP). The working principle of the HLS protocol is to divide the whole media stream into small HTTP-based files for download, only one of which is downloaded at a time, so that the client can download and play the media stream simultaneously rather than having to download the whole media stream before playing. Compared with the Real-time Transport Protocol (RTP), HLS data can pass through any firewall or proxy server that allows HTTP data to pass, and therefore has greater stability. HTTP is a simple request-response protocol that usually runs on top of the Transmission Control Protocol (TCP) and is used to define the process of exchanging data between a client and a server. After the client connects to the server, if it wants to obtain a certain resource on the server, it needs to follow a certain communication format; the HTTP protocol is used to define the communication format between the client and the server.
Fig. 6 illustrates a structural schematic of HLS data of some embodiments of the present application. In some embodiments, as shown in fig. 6, the HLS data includes an index file (Index file) and Transport Stream (TS) fragment files. The index file is an M3U8 file. An M3U8 file is actually a playlist (Playlist), similar to a media playlist (Media Playlist); its contents record a series of media clip resources which, when played in sequence, present the complete multimedia resource. In some embodiments, as shown in fig. 6, the addresses of at least one secondary index file may be recorded in the M3U8 file; for example, the M3U8 file records the addresses of three secondary index files: an Alternate-A index file, an Alternate-B index file, and an Alternate-C index file. Each secondary index file records the download address of each TS fragment file of the multimedia resource, and the client can download the corresponding TS fragment file through the download address of each TS fragment file. The secondary index files act as alternate sources; the client may choose to download the same resource at different rates from many different alternate sources, thereby allowing the streaming session between the client and the server to adapt to different data rates. In other embodiments, each secondary index file may also continue to be nested; for example, each secondary index file records the address of at least one tertiary index file. In still other embodiments, the M3U8 file may directly record the download address of each TS fragment file of the multimedia resource, without nested levels of index files.
For example, the specific format of the M3U8 file may be:

    #EXTM3U
    #EXT-X-MEDIA-SEQUENCE:0
    #EXT-X-TARGETDURATION:10
    #EXTINF:10.0,
    http://media.example.com/first.ts
    #EXTINF:9.99,
    http://media.example.com/second.ts
    #EXT-X-ENDLIST
Here, #EXTM3U denotes the M3U8 header and is placed in the first line; #EXT-X-MEDIA-SEQUENCE indicates the sequence number of the first TS fragment file, typically 0, but in a live scenario this sequence number identifies the start position of the live segment; #EXT-X-TARGETDURATION:10 denotes that the maximum duration of each TS fragment file is 10 seconds (s); #EXTINF indicates the duration of each TS fragment file; #EXT-X-ENDLIST denotes the terminator of the M3U8 file. If an M3U8 file does not have the #EXT-X-ENDLIST tag, it can be considered live, and new TS fragment files are continuously appended to the end of the playlist. When playing a live broadcast, the display device of the client needs to continuously update the M3U8 file to obtain the latest TS fragment files for playing.
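To make the index handling concrete, the following is a minimal parsing sketch in Python; the tag handling follows the format described above, while the function name and return structure are illustrative assumptions rather than part of the present application:

    # Minimal M3U8 playlist parser sketch (assumes the playlist text is already fetched).
    def parse_m3u8(text):
        media_sequence = 0
        target_duration = None
        is_live = True                # live until #EXT-X-ENDLIST is seen
        segments = []                 # list of (duration_seconds, url)
        pending_duration = None
        for line in (l.strip() for l in text.splitlines()):
            if line.startswith("#EXT-X-MEDIA-SEQUENCE:"):
                media_sequence = int(line.split(":", 1)[1])
            elif line.startswith("#EXT-X-TARGETDURATION:"):
                target_duration = float(line.split(":", 1)[1])
            elif line.startswith("#EXTINF:"):
                pending_duration = float(line.split(":", 1)[1].split(",")[0])
            elif line == "#EXT-X-ENDLIST":
                is_live = False       # complete playlist: not a live stream
            elif line and not line.startswith("#"):
                segments.append((pending_duration, line))  # URL follows its #EXTINF
                pending_duration = None
        return media_sequence, target_duration, is_live, segments

For the example playlist above, this sketch would return two segments of 10.0 and 9.99 seconds with is_live set to False; a live playlist, lacking #EXT-X-ENDLIST, would be re-fetched periodically to pick up newly appended TS fragment files.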
Fig. 7 illustrates a flow diagram of HLS data live broadcast of some embodiments of the present application. In some embodiments, as shown in fig. 7, audio/video input data (Audio/video inputs) provided by a media producer is transmitted to a server (Server), where it is converted into HLS data; the encoding format of the data provided by the media producer may be any audio/video encoding format. The server includes a transcoding module responsible for transcoding the audio/video input data into data in a target encoding format. In some embodiments, the target encoding format may be the MPEG2-TS format. After the data is transcoded into the target encoding format, it is sliced by the stream segmenter module, and the result of the slicing is HLS data, which includes an index file and TS fragment files. The HLS data is transmitted to the distribution component (Distribution), which is an ordinary HTTP file server. The client (Client) can play the whole audio/video stream provided by the media producer by sequentially acquiring and playing each TS fragment file through the HTTP protocol.
In some embodiments, the duration of a TS fragment file may be an integer multiple of the duration of a complete Group of Pictures (GOP). A GOP is a group of consecutive pictures and includes a plurality of data frames. The first data frame of a GOP must be an I-frame, which is an intra-coded frame, i.e., a key frame, and can be understood as a complete picture. MPEG encoding divides data frames into three types: I-frames, P-frames, and B-frames. A P-frame is a forward-predicted frame and a B-frame is a bi-directionally interpolated frame; P-frames and B-frames record changes relative to I-frames: a P-frame represents the difference from the previous frame, and a B-frame represents the differences from the previous and next frames.
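As a small numeric illustration of this integer-multiple relationship (the frame rate and GOP length below are assumed example values, not parameters specified by the present application):

    fps = 25.0                 # assumed frame rate
    frames_per_gop = 50        # assumed GOP length: one I-frame followed by P/B frames
    gop_duration = frames_per_gop / fps            # 2.0 seconds per complete GOP

    segment_duration = 10.0    # matches #EXT-X-TARGETDURATION:10 above
    gops_per_segment = segment_duration / gop_duration
    assert gops_per_segment == int(gops_per_segment)   # 5 complete GOPs, none split

Keeping whole GOPs in each TS fragment file means every segment starts with an I-frame, so a client can begin decoding at any segment boundary.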
In some embodiments, when the client implements live broadcast of any channel using the display device, in order to avoid playback stalls caused by the latest TS fragment file of the channel's HLS data not being ready, the display device usually starts acquiring data from the third TS fragment file from the end in the start-up stage of channel live broadcast, that is, from a TS fragment file traced back a certain period from the current live broadcast time. Since playing HLS data proceeds by indexing and downloading files, the playback delay is affected by the size of the TS fragment files, and live broadcast generally lags by about the duration of three TS fragment files. For example, if the duration of each TS fragment file is about 10 seconds, the live broadcast delay is about 30 seconds; if the duration of each TS fragment file is about 2 seconds, the live broadcast delay can be reduced to about 6 seconds. Reducing the duration of the TS fragment files reduces the live broadcast delay, but increases the number of TS fragment files per unit time, which in turn brings performance loss caused by a large number of HTTP GET requests in a short time.
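The delay arithmetic in this paragraph can be restated directly as a sketch (the helper name is illustrative; the three-segment backtrack is the value from the text):

    def startup_delay_seconds(segment_seconds, backtrack_segments=3):
        # Live delay is roughly the total duration of the backtracked TS fragment files.
        return backtrack_segments * segment_seconds

    assert startup_delay_seconds(10.0) == 30.0   # ~10 s segments -> ~30 s live delay
    assert startup_delay_seconds(2.0) == 6.0     # ~2 s segments -> ~6 s delay, 5x the GETs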
When a user switches channels, that is, when the live broadcast source needs to be switched, the display device needs to acquire the HLS data of the new live broadcast source again. Based on the characteristics of the HLS protocol, the display device needs to parse the HLS data to obtain the M3U8 file, obtain the download addresses of the TS fragment files from the M3U8 file, and then download the corresponding TS fragment files. This process wastes time and prevents the user from switching channels smoothly, so playback stalls during channel switching and the viewing experience of the user is affected.
Based on this, in order to solve the playback stall problem of the display device during live channel switching, some embodiments of the present application provide a live channel switching method, which can parse HLS data while playing multicast data after a channel switching instruction is received, disconnect the multicast data after the multicast data is aligned with the parsed HLS data, and continue to play the HLS data. Because some embodiments of the present application parse the HLS data and play the multicast data at the same time, no stall occurs during channel switching; and because the multicast data is disconnected only after it is aligned with the parsed HLS data, after which the HLS data continues to play, the live data watched by the user remains continuous. Moreover, HLS can pass through any firewall or proxy server that allows HTTP data to pass, and it is easy to use a content distribution network to transmit the media stream, so the method has better live broadcast advantages and better user experience.
Fig. 8 illustrates a flow chart of a live channel switching method of some embodiments of the present application. As shown in fig. 8, the method specifically includes the following steps:
s101: responding to a channel switching instruction for switching the first live broadcast source to the second live broadcast source, acquiring HLS data corresponding to the second live broadcast source, and simultaneously acquiring multicast data corresponding to the second live broadcast source.
In some embodiments, the user may issue a channel switching instruction to the display device through the smart device or the control device. For example, the user can control the display device by inputting a channel switching instruction on a control device such as a remote controller, through keys, voice input, control panel input, or the like. The user can also issue a voice channel switching instruction through a module configured on the display device for acquiring voice instructions, or through an external voice control device. The user may also issue a channel switching instruction by performing a touch operation or a gesture operation on the display device.
Illustratively, the channel switching instruction may be received by a communicator in the display device; after the communicator receives the channel switching instruction, the controller in the display device may, based on the channel switching instruction, acquire the HLS data corresponding to the second live broadcast source and simultaneously acquire the multicast data corresponding to the second live broadcast source.
In some embodiments, the servers interacting with the display device may include a streaming server and a multicast server. The streaming media server is used for converting live broadcast data of each live broadcast source into HLS data, and the multicast server can send the live broadcast data of each live broadcast source to a corresponding client according to the multicast address of each live broadcast source. Wherein the multicast data represents real-time live data without delay. The display device may obtain HLS data corresponding to the second live broadcast source from the streaming server, and obtain multicast data corresponding to the second live broadcast source from the multicast server based on the multicast address of the second live broadcast source. When the display device acquires the HLS data corresponding to the second live broadcast source from the streaming media server, since the live broadcast data is continuously increased, the display device continuously acquires updated HLS data from the streaming media server, so as to acquire the latest TS fragment file.
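As an illustration of acquiring both sources in parallel, the following Python sketch joins a multicast group while the HLS side runs on a background thread; the socket calls are standard, but the addresses, the fetch_hls placeholder, and the packet handling are assumptions, not details specified by the present application:

    import socket
    import struct
    import threading

    def join_multicast(group_ip, port, on_packet):
        # Join the live source's multicast group and hand raw UDP datagrams
        # (MPEG-TS packets) to the caller as they arrive, with no extra delay.
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
        sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        sock.bind(("", port))
        mreq = struct.pack("4sl", socket.inet_aton(group_ip), socket.INADDR_ANY)
        sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)
        while True:
            on_packet(sock.recv(2048))

    def fetch_hls(playlist_url):
        # Placeholder for the HLS side: repeatedly fetch and parse the M3U8 file,
        # then download TS fragment files (sketched elsewhere in this section).
        ...

    def switch_to_second_source(playlist_url, group_ip, port):
        # S101: start acquiring HLS data in the background while the multicast
        # data is received and played immediately.
        threading.Thread(target=fetch_hls, args=(playlist_url,), daemon=True).start()
        join_multicast(group_ip, port, on_packet=lambda pkt: None)  # feed the decoder here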
S102: and playing the multicast data corresponding to the second live broadcast source.
After the multicast data corresponding to the second live broadcast source is acquired, the display device decapsulates and decodes the multicast data to obtain data frames, including I-frames, B-frames, P-frames, and the like. The decoded data frames are stored in a buffer, and the pictures corresponding to the data frames are displayed. For example, after the display device acquires the multicast data corresponding to the second live broadcast source at the current live broadcast time, it can start playing the corresponding pictures from the data frame at the current live broadcast time.
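A schematic of this decode-and-buffer step might look like the following, where demux and decode stand in for the device's real decapsulation and decoding components, which the present application does not detail:

    from collections import deque

    frame_buffer = deque()   # decoded I/P/B data frames awaiting display

    def on_multicast_data(ts_bytes, demux, decode):
        # Decapsulate one burst of multicast transport-stream bytes, decode the
        # resulting elementary-stream packets into data frames, and buffer them.
        for es_packet in demux(ts_bytes):
            for frame in decode(es_packet):
                frame_buffer.append(frame)   # the renderer pops frames from here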
In some embodiments, while the multicast data corresponding to the second live broadcast source is played, the display device may also simultaneously parse the HLS data corresponding to the second live broadcast source to obtain the index file in that HLS data, and then download the TS fragment files based on the index file. The display device traces back a certain period from the current live broadcast time and starts downloading the TS fragment files from there until the newly generated TS fragment files are downloaded. For example, the display device traces back three TS fragment files from the current live broadcast time and starts downloading.
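The start-position choice described above reduces to slicing the parsed playlist; here is a minimal sketch, reusing the segment list shape from the earlier parse_m3u8 sketch, with the three-segment backtrack taken from this paragraph's example:

    BACKTRACK_SEGMENTS = 3    # start from the third TS fragment file from the end

    def pick_start_segments(segments):
        # segments: [(duration_seconds, url), ...] in playlist order.
        if len(segments) <= BACKTRACK_SEGMENTS:
            return segments                      # short playlist: take everything
        return segments[-BACKTRACK_SEGMENTS:]    # trace back from the live edge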
In some embodiments, the display device downloads the TS fragment files based on the index file as follows: the display device sends a first request message to the streaming media server based on the index file and receives a first response message from the streaming media server.
The first request message is used to request a TS fragment file and includes the download address of the TS fragment file. The first response message carries the TS fragment file. The first request message and the first response message are HTTP messages based on the HTTP protocol.
In some embodiments, the first request message is sent over a persistent (long) TCP connection, and the first response message is sent using chunked transfer. The Connection field in the first request message is set to keep-alive to ensure that the streaming media server and the display device can exchange data over an HTTP persistent connection. When keep-alive is enabled, the streaming media server does not close the TCP connection after returning a response, the display device does not close the TCP connection after receiving the response message, and the same TCP connection continues to be used when the next HTTP request is sent. The Transfer-Encoding field in the first response message is set to chunked, which means that the content length is not fixed and the display device keeps receiving data until the connection is closed. Because a request is sent to the server for each requested TS fragment file, re-establishing the TCP connection for every request would cause great performance loss; adopting a persistent TCP connection effectively reduces the overhead of the many GET requests, thereby reducing the resource consumption caused by the HLS protocol.
For example: the specific format of the first request packet may be:
{POST cctvx.m3u8 HTTP/1.1,…,Connection:Keep-Alive,…}
For example, the specific format of the first response message may be:

    HTTP/1.1 206 Partial Content
    …
    Transfer-Encoding: chunked
    …
    Connection: Keep-Alive
    …
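The persistent-connection behavior described above can be sketched with Python's standard http.client, which reuses one TCP connection across requests and transparently reassembles chunked response bodies; the host and paths below are placeholders taken from the examples in this section, and GET is used here for illustration although the example request above shows POST:

    import http.client

    # One TCP connection, opened once and reused for every request (keep-alive),
    # instead of a fresh TCP handshake per TS fragment file.
    conn = http.client.HTTPConnection("media.example.com")

    def get_over_keepalive(path):
        conn.request("GET", path, headers={"Connection": "Keep-Alive"})
        resp = conn.getresponse()
        # http.client decodes Transfer-Encoding: chunked internally; read()
        # returns the reassembled body once the final zero-length chunk arrives.
        return resp.read()

    playlist = get_over_keepalive("/cctvx.m3u8")
    first_ts = get_over_keepalive("/first.ts")   # same connection, no reconnect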
s103: and under the condition that the multicast data are aligned with the data frames of the HLS data, stopping acquiring the multicast data corresponding to the second live broadcast source, and playing the HLS data.
While the multicast data is played, the HLS data is parsed and the TS fragment files are downloaded. The download speed of the TS fragment files differs from the playback speed of the data frames: although downloading starts from TS fragment files traced back a period before the current live broadcast time, the download speed usually exceeds the playback speed, so at some moment the display device will have downloaded the corresponding newly generated TS fragment file. For example, the display device starts to play real-time multicast data from 10:00 and simultaneously starts downloading from the TS fragment file of 9:59. After one minute, the display device has downloaded the latest TS fragment file corresponding to 10:01, that is, it has obtained the newest generated data frames; at this time, the multicast data has also played to the newest generated data frame, and the multicast data is aligned with the data frames of the HLS data.
In some embodiments, to detect whether the data frames of the multicast data and the HLS data are aligned, the data acquired in each preset time period may be compared for sameness. The preset time period may be set according to the duration of a complete GOP, for example, 2 seconds. Illustratively, starting from the current live broadcast time, 2 seconds of multicast data and 2 seconds of HLS data are acquired and compared; if they differ, the comparison continues for the next 2 seconds, and so on, until the data are completely the same, at which point it is determined that the data frames of the multicast data and the HLS data are aligned.
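This window-by-window comparison might be sketched as follows; both parameters are assumed callables that yield the next preset-period window of demultiplexed bytes from each stream, or None while the HLS download has not produced one yet:

    WINDOW_SECONDS = 2.0   # preset period: one complete GOP in this example

    def detect_alignment(next_multicast_window, next_hls_window):
        # Compare successive windows of both streams; alignment is declared
        # only when a whole window is byte-for-byte identical.
        while True:
            mcast = next_multicast_window()
            hls = next_hls_window()
            if mcast is None or hls is None:
                return False       # HLS has not caught up yet; keep playing multicast
            if mcast == hls:
                return True        # identical window: data frames are aligned
            # otherwise move on and compare the next 2-second window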
In some embodiments, when the multicast data is not aligned with the data frames of the HLS data, the display device continues to play the multicast data corresponding to the second live broadcast source, continues to parse the HLS data corresponding to the second live broadcast source and download the TS fragment files, and compares the data with the multicast data in the next preset time period, until the multicast data is aligned with the data frames of the HLS data.
In some embodiments, after the multicast data is disconnected, the HLS data may be played from the aligned data frame or from the data frame following the aligned data frame. The aligned data frame is the data frame at which the multicast data is aligned with the HLS data. In this way, after the multicast data is disconnected, the HLS data can continue to be played, the picture display remains coherent, the delay caused by the HLS protocol is avoided, and resources are saved.
In some embodiments, when the channel switching instruction for switching the first live source to the second live source is not received, the display device continues to acquire HLS data corresponding to the first live source and plays the HLS data corresponding to the first live source. The HLS data playing process is described in the related description, and is not described herein again.
Fig. 9 is a data flow diagram illustrating a live channel switching method according to some embodiments of the present application. As shown in fig. 9, if the channel switching instruction is not received, the HLS data corresponding to the first live broadcast source is obtained, and the HLS data corresponding to the first live broadcast source is played. And if the channel switching instruction is received, acquiring HLS data corresponding to the second live broadcast source and multicast data corresponding to the second live broadcast source at the same time. And analyzing and downloading the TS fragment file aiming at the HLS data corresponding to the second live broadcast source, and playing the multicast data aiming at the multicast data corresponding to the second live broadcast source. And if the data frames of the multicast data and the HLS data are aligned, disconnecting the multicast data and continuing to play the HLS data. And if the data frames of the multicast data and the HLS data are not aligned, continuously acquiring the HLS data corresponding to the second live broadcast source and the multicast data corresponding to the second live broadcast source.
According to the live channel switching method provided by some embodiments of the present application, after a channel switching instruction is received, the HLS data can be parsed while the multicast data is played; after the multicast data is aligned with the HLS data obtained by parsing, the multicast data is disconnected and the HLS data continues to be played. Because the multicast data is played while the HLS data is parsed, no stall occurs during channel switching. Because the multicast data is disconnected only after it is aligned with the parsed HLS data, after which the HLS data continues to play, the live data watched by the user remains continuous; in addition, HLS can pass through any firewall or proxy server that allows HTTP data to pass, and a content distribution network can easily be used to transmit the media stream, so the method has better live broadcast advantages and better user experience.
Based on the live channel switching method provided in the above embodiments, some embodiments of the present application further provide a display device. The display device 200 includes a display 260 and a controller 250, where the display 260 is configured to display the play data. For the case where the controller 250 plays the HLS data corresponding to the first live broadcast source normally and no channel switching instruction is received, Fig. 10 illustrates an effect diagram of the display device playing the data of the first live broadcast source according to some embodiments of the present application. As shown in fig. 10, the controller 250 continues to acquire the HLS data corresponding to the first live broadcast source and controls the display 260 to play the HLS data corresponding to the first live broadcast source.
For the case where the controller 250 plays the HLS data corresponding to the first live broadcast source normally and receives a channel switching instruction for switching the first live broadcast source to the second live broadcast source, Fig. 11 illustrates an effect diagram of the display device playing the data of the second live broadcast source according to some embodiments of the present application. As shown in fig. 11, after receiving the channel switching instruction, the controller 250 may simultaneously acquire the HLS data and the multicast data corresponding to the second live broadcast source, and control the display 260 to play the multicast data corresponding to the second live broadcast source.
Fig. 12 is a schematic diagram illustrating an interaction flow between a display device and a server according to some embodiments of the present application. As shown in fig. 12, when the controller 250 receives a channel switching instruction for switching the first live broadcast source to the second live broadcast source, the controller 250, the display 260, the streaming media server, and the multicast server are respectively configured to execute the following program steps:
s201: in response to the channel switching instruction for switching the first live broadcast source to the second live broadcast source, thecontroller 250 acquires multicast data corresponding to the second live broadcast source from the multicast server.
S202: thecontroller 250 controls thedisplay 260 to play the multicast data corresponding to the second live source.
S203: thecontroller 250 obtains HLS data corresponding to the second live source from the streaming server.
Thecontroller 250 may obtain the multicast data corresponding to the second live source from the multicast server, and may obtain the HLS data corresponding to the second live source from the streaming server.
S204: thecontroller 250 analyzes the HLS data corresponding to the second live source to obtain an index file in the HLS data corresponding to the second live source.
S205: thecontroller 250 downloads the TS slice file from the streaming server based on the index file.
The above-mentioned step 202 and the steps 203 to 205 may be performed simultaneously.
S206: The controller 250 detects whether the data frames of the multicast data and the HLS data are aligned.
S207: in the case that the data frames of the multicast data and the HLS data are not aligned, thecontroller 250 continues to acquire the multicast data corresponding to the second live broadcast source from the multicast server and play the multicast data.
S208: in the case where the multicast data is aligned with the data frame of the HLS data, thecontroller 250 stops acquiring the multicast data corresponding to the second live source from the multicast server.
S209: and controlling the display to play the HLS data corresponding to the second live broadcast source.
Wherein step S208 and step S209 are performed simultaneously.
The display device provided by some embodiments of the present application can parse HLS data while playing multicast data after receiving a channel switching instruction, disconnect the multicast data after the multicast data is aligned with the parsed HLS data, and continue playing the HLS data. Because the multicast data is played while the HLS data is parsed, the display device does not stall during channel switching; and because the multicast data is disconnected only after it is aligned with the parsed HLS data, after which the HLS data continues to play, the live data watched by the user remains continuous. In addition, HLS can pass through any firewall or proxy server that allows HTTP data to pass, and a content distribution network can easily be used to transmit the media stream, so the display device has better live broadcast advantages and better user experience.
Some embodiments of the present application further provide a computer-readable storage medium. The computer-readable storage medium may store a program which, when executed, may perform some or all of the steps in the embodiments of the live channel switching method provided in the present application. The storage medium may be a magnetic disk, an optical disk, a read-only memory (ROM), a random access memory (RAM), or the like.
It can be seen from the foregoing technical solutions that, in a live broadcast process, in response to a channel switching instruction for switching the first live broadcast source to the second live broadcast source, the display device provided in some embodiments of the present application can simultaneously acquire the HLS data and the multicast data corresponding to the second live broadcast source, play the multicast data corresponding to the second live broadcast source, and, when the data frames of the multicast data and the HLS data are aligned, stop acquiring the multicast data and play the HLS data. Because the HLS data and the multicast data are acquired simultaneously when the channel is switched, and the multicast data is played before it is aligned with the data frames of the HLS data, the playback delay and stall problems caused by acquiring the HLS data after channel switching can be solved, and the viewing experience of the user is improved.
The embodiments provided in the present application are only a few examples of the general concept of the present application and do not limit the scope of the present application. For a person skilled in the art, any other embodiments extended according to the solutions of the present application without inventive effort shall fall within the protection scope of the present application.