CN115209208B - Video cyclic playing processing method and device - Google Patents

Video cyclic playing processing method and device

Info

Publication number
CN115209208B
Authority
CN
China
Prior art keywords
multimedia data
video
data
circulated
decoded
Prior art date
Legal status
Active
Application number
CN202110374967.6A
Other languages
Chinese (zh)
Other versions
CN115209208A (en)
Inventor
吕鹏
张仁义
李斌
李乃金
吕显浩
Current Assignee
Hisense Visual Technology Co Ltd
Original Assignee
Hisense Visual Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Hisense Visual Technology Co Ltd
Priority to CN202110374967.6A
Publication of CN115209208A
Application granted
Publication of CN115209208B
Status: Active
Anticipated expiration


Abstract

The application provides a processing method and device for video cyclic (loop) playing, which are used for solving the problem of stuttering caused by re-downloading the video during cyclic playing. The method comprises the following steps: acquiring a multimedia data frame, wherein the multimedia data frame belongs to one data frame in a video stream to be circulated; decapsulating the multimedia data frame to obtain first multimedia data, and storing the first multimedia data into a first cache queue; before decoding the first multimedia data, starting to download the video stream to be circulated again when it is determined that the first multimedia data belongs to the end point of the video stream to be circulated and the number of loops of the video stream to be circulated has not reached the set number of times; acquiring the first multimedia data from the first cache queue, decoding it, and storing the decoded first multimedia data into a second cache queue; and acquiring the decoded first multimedia data from the second cache queue and rendering it.

Description

Video cyclic playing processing method and device
Technical Field
The present application relates to the field of video processing, and in particular, to a method and apparatus for processing video cyclic playback.
Background
The existing video player can meet a user's basic needs, such as playing, fast-forwarding, fast-rewinding or variable-speed playing. However, if the user wants to play a video in a loop, the user must rewind to the beginning of the video after the whole video has finished playing; during the rewind the video player clears all buffered video data and downloads the video data again.
Disclosure of Invention
The embodiment of the application provides a processing method and device for video cyclic playing, which are used for solving the problem of stuttering caused by having to re-download the video during cyclic playing.
In a first aspect, the present application provides a display apparatus comprising:
an input interface for receiving a multimedia data frame belonging to one data frame in a video stream to be circulated;
The controller is used for decapsulating the multimedia data frames to obtain first multimedia data, and storing the first multimedia data into a first cache queue;
The controller is further configured to, before decoding the first multimedia data, start to download the video stream to be recycled and buffer the video stream to be recycled to the first buffer queue when it is determined that the first multimedia data belongs to an end point of the video stream to be recycled and a number of times of recycling of the video stream to be recycled does not reach a set number of times;
The controller is further configured to obtain the first multimedia data from the first buffer queue, decode the obtained first multimedia data, and store the decoded first multimedia data into a second buffer queue;
The controller is further configured to obtain the decoded first multimedia data from the second buffer queue, and render the decoded first multimedia data to a display screen.
The display screen is used for displaying the first multimedia data.
Based on the above scheme, the controller starts to re-download the video stream to be circulated as soon as it determines that the received first multimedia data is the end point of the video stream to be circulated. There is no need to wait for playback of the entire video stream to be circulated to finish before starting to download again; the copy of the video stream needed for the next loop can be downloaded before the current playback completes. This removes the delay between the end of one playback and the next download, and thereby removes the stuttering in the cyclic playing process.
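As a purely illustrative sketch (not the patented implementation), the following Python snippet shows how such a check could look at the point where a demuxed frame enters the first buffer queue; the function and parameter names are hypothetical.

```python
# Illustrative only: decide, when a demuxed frame enters the first buffer
# queue (i.e. before decoding), whether to start re-downloading the stream.
def should_restart_download(frame_pts_ms: int, loop_end_pts_ms: int,
                            loops_done: int, set_times: int) -> bool:
    is_loop_end = frame_pts_ms == loop_end_pts_ms   # frame is the end point
    return is_loop_end and loops_done < set_times   # set number not yet reached

# Example: the end-point frame arrives and only 1 of 3 loops has played,
# so the next download starts while the current iteration is still rendering.
print(should_restart_download(60_000, 60_000, loops_done=1, set_times=3))  # True
```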
In some embodiments, before the input interface receives the frame of multimedia data, the controller is further configured to:
determine, in response to a control operation of the user, the end point of the video stream to be circulated and the set number of times.
Based on the above scheme, the controller has already determined the end point of the video stream to be circulated and the set number of times before the input interface receives the first multimedia data. The controller can therefore accurately determine, from the determined end point, whether the first multimedia data is the end point of the video stream to be circulated, and whether the number of loops has reached the set number of times.
In some embodiments, the controller is further configured to determine a start point of the video stream to be circulated in response to the control operation, where the controller is specifically configured to, when starting to re-download the video stream to be circulated:
And re-downloading the video stream to be circulated from the starting point of the video stream to be circulated according to the starting point of the video stream to be circulated and the ending point of the video stream to be circulated.
Based on this scheme, the controller determines the start point and the end point of the video stream to be circulated according to the control operation of the user, and re-downloads the video stream to be circulated according to that start point and end point. This allows the video stream to be circulated to be downloaded accurately and avoids downloading redundant, unused data.
In some embodiments, the second buffer queue further includes decoded second multimedia data, where the decoded second multimedia data includes audio data and video data, and the decoded second multimedia data belongs to a starting point of the video stream to be circulated, and the controller is further configured to:
When a main time axis used by the current rendering multimedia data is a video time axis, determining that the video time axis in the current rendering multimedia data is larger than the audio time axis in the current rendering multimedia data, determining the delay amount of the audio data in the current rendering multimedia data, and deleting the audio data in the current rendering multimedia data corresponding to the delay amount of the audio data in the current rendering multimedia data from the second cache queue.
In some embodiments, the second buffer queue further includes decoded second multimedia data, where the decoded second multimedia data includes audio data and video data, and the decoded second multimedia data belongs to a starting point of the video stream to be circulated, and the controller is further configured to:
when a main time axis used by the current rendering multimedia data is a video time axis, determining that an audio time axis in the current rendering multimedia data is larger than the video time axis in the current rendering multimedia data, and determining the delay amount of the video data in the current rendering multimedia data; and determining a first time length according to the delay amount of the video data in the current rendering multimedia data, and stopping outputting the audio data within the first time length.
Based on the above scheme, when a main time axis used when rendering multimedia data is a video time axis, if the video time axis in the currently rendered multimedia data is greater than an audio time axis, determining a delay amount of the audio data, and deleting the audio data corresponding to the delay amount; if the audio time axis in the currently rendered multimedia data is larger than the video time axis, determining a first time length according to the delay amount of the video data, and stopping outputting the audio data within the first time length. By adopting the method, the synchronous output of the audio and video data can be ensured, and the watching experience of a user is improved.
In some embodiments, the second buffer queue further includes decoded second multimedia data, where the decoded second multimedia data includes audio data and video data, and the decoded second multimedia data belongs to a starting point of the video stream to be circulated, and the controller is further configured to:
When the main time axis used by the current rendering multimedia data is an audio time axis, switching the main time axis used by the current rendering multimedia data into a video time axis, determining that the video time axis in the current rendering multimedia data is larger than the audio time axis in the current rendering multimedia data, determining the delay amount of the audio data in the current rendering multimedia data, and deleting the audio data in the current rendering multimedia data corresponding to the delay amount of the audio data in the current rendering multimedia data from the second cache queue.
In some embodiments, the second buffer queue further includes decoded second multimedia data, where the decoded second multimedia data includes audio data and video data, and the decoded second multimedia data belongs to a starting point of the video stream to be circulated, and the controller is further configured to:
When a main time axis used by the current rendering multimedia data is an audio time axis, switching the main time axis used by the current rendering multimedia data into a video time axis, determining that the audio time axis in the current rendering multimedia data is larger than the video time axis in the current rendering multimedia data, and determining the delay amount of the video data in the current rendering multimedia data; and determining a first time length according to the delay amount of the video data in the current rendering multimedia data, and stopping outputting the audio data within the first time length.
Based on the above scheme, when the main time axis used in rendering the multimedia data is an audio time axis, the currently used main time axis is switched to a video time axis. If the video time axis in the currently rendered multimedia data is larger than the audio time axis, determining the delay amount of the audio data, and deleting the audio data corresponding to the delay amount; if the audio time axis in the currently rendered multimedia data is larger than the video time axis, determining a first time length according to the delay amount of the video data, and stopping outputting the audio data within the first time length. By adopting the method, the synchronous output of the audio and video data can be ensured, and the watching experience of a user is improved.
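For illustration only, the following Python sketch summarizes the two correction rules above with the video time axis as the master; drop_audio_ms and mute_audio_ms are hypothetical stand-ins for the player's internal operations, and timestamps are in milliseconds.

```python
# Illustrative only: the two correction rules with the video time axis as master.
def resync(video_pts: int, audio_pts: int, drop_audio_ms, mute_audio_ms):
    if video_pts > audio_pts:
        # Audio lags the video time axis: delete the late audio data
        # (its delay amount) from the second buffer queue.
        drop_audio_ms(video_pts - audio_pts)
    elif audio_pts > video_pts:
        # Video lags: keep the audio but stop outputting it for a
        # "first duration" equal to the video delay.
        mute_audio_ms(audio_pts - video_pts)

# Example: video at 12 s, audio at 10 s -> 2 s of audio is dropped.
resync(12_000, 10_000,
       drop_audio_ms=lambda ms: print(f"drop {ms} ms of audio"),
       mute_audio_ms=lambda ms: print(f"mute audio for {ms} ms"))
```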
In a second aspect, an embodiment of the present application provides a processing method for video cyclic playing, including:
acquiring a multimedia data frame, wherein the multimedia data frame belongs to one data frame in a video stream to be circulated;
decapsulating the multimedia data frame to obtain first multimedia data, and storing the first multimedia data into a first cache queue;
Before decoding the first multimedia data, determining that the first multimedia data belongs to an end point of the video stream to be circulated and that the circulation times of the video stream to be circulated do not reach the set times, starting to re-download the video stream to be circulated and caching the video stream to be circulated in the first cache queue;
Acquiring the first multimedia data from the first cache queue, decoding the acquired first multimedia data, and storing the decoded first multimedia data into a second cache queue;
and acquiring the decoded first multimedia data from the second cache queue, and rendering the decoded first multimedia data.
In some embodiments, the method further comprises:
Before the multimedia data frame is acquired, an ending point of the video stream to be circulated and the set times, which are sent by an upper layer application, are received through an application program interface.
In some embodiments, the application program interface is further configured to receive a start point of the video stream to be recycled sent by an upper layer application, and start to re-download the video stream to be recycled, including:
And re-downloading the video stream to be circulated from the starting point of the video stream to be circulated according to the starting point of the video stream to be circulated and the ending point of the video stream to be circulated.
In some embodiments, the second buffer queue further includes decoded second multimedia data, where the decoded second multimedia data includes audio data and video data, and the decoded second multimedia data belongs to a starting point of the video stream to be circulated, and the method further includes:
When a main time axis used by the current rendering multimedia data is a video time axis, determining that the video time axis in the current rendering multimedia data is larger than the audio time axis in the current rendering multimedia data, determining the delay amount of the audio data in the current rendering multimedia data, and deleting the audio data in the current rendering multimedia data corresponding to the delay amount of the audio data in the current rendering multimedia data from the second cache queue.
In some embodiments, the second buffer queue further includes decoded second multimedia data, where the decoded second multimedia data includes audio data and video data, and the decoded second multimedia data belongs to a starting point of the video stream to be circulated, and the method further includes:
When a main time axis used for currently rendering the multimedia data is a video time axis, determining that an audio time axis in the currently rendered multimedia data is larger than the video time axis in the currently rendered multimedia data, and determining the delay amount of the video data in the currently rendered multimedia data; and determining a first time length according to the delay amount of the video data in the current rendering multimedia data, and stopping outputting the audio data within the first time length.
In some embodiments, the second buffer queue further includes decoded second multimedia data, where the decoded second multimedia data includes audio data and video data, and the decoded second multimedia data belongs to a starting point of the video stream to be circulated, and the method further includes:
When the main time axis used by the current rendering multimedia data is an audio time axis, switching the main time axis used by the current rendering multimedia data into a video time axis, determining that the video time axis in the current rendering multimedia data is larger than the audio time axis in the current rendering multimedia data, determining the delay amount of the audio data in the current rendering multimedia data, and deleting the audio data in the current rendering multimedia data corresponding to the delay amount of the audio data in the current rendering multimedia data from the second cache queue.
In some embodiments, the second buffer queue further includes decoded second multimedia data, where the decoded second multimedia data includes audio data and video data, and the decoded second multimedia data belongs to a starting point of the video stream to be circulated, and the method further includes:
When a main time axis used by current rendering multimedia data is an audio time axis, switching the main time axis used by the current rendering multimedia data into a video time axis, determining that the audio time axis in the current rendering multimedia data is larger than the video time axis in the current rendering multimedia data, and determining the delay amount of the video data in the current rendering multimedia data; and determining a first time length according to the delay amount of the video data in the current rendering multimedia data, and stopping outputting the audio data within the first time length.
In a third aspect, an embodiment of the present application provides a processing apparatus for video cyclic playback, including:
An input unit for receiving a multimedia data frame belonging to one data frame in a video stream to be circulated;
The control unit is used for decapsulating the multimedia data frames to obtain first multimedia data, and storing the first multimedia data into a first cache queue;
The control unit is further configured to, before decoding the first multimedia data, start to download the video stream to be recycled again and buffer the video stream to be recycled to the first buffer queue when it is determined that the first multimedia data belongs to an end point of the video stream to be recycled and a number of times of recycling of the video stream to be recycled does not reach a set number of times;
The control unit is further configured to obtain the first multimedia data from the first buffer queue, decode the obtained first multimedia data, and store the decoded first multimedia data into a second buffer queue;
The control unit is further configured to obtain the decoded first multimedia data from the second buffer queue, and render the decoded first multimedia data to a display unit.
The display unit is used for displaying the first multimedia data.
In some embodiments, before the input unit receives the multimedia data frame, the control unit is further configured to:
determine, in response to a control operation of the user, the end point of the video stream to be circulated and the set number of times.
In some embodiments, the control unit is further configured to determine, in response to the control operation, a start point of the video stream to be circulated, where the control unit is specifically configured to, when starting to re-download the video stream to be circulated:
And re-downloading the video stream to be circulated from the starting point of the video stream to be circulated according to the starting point of the video stream to be circulated and the ending point of the video stream to be circulated.
In some embodiments, the second buffer queue further includes decoded second multimedia data, where the decoded second multimedia data includes audio data and video data, and the decoded second multimedia data belongs to a starting point of the video stream to be circulated, and the control unit is further configured to:
When a main time axis used by the current rendering multimedia data is a video time axis, determining that the video time axis in the current rendering multimedia data is larger than the audio time axis in the current rendering multimedia data, determining the delay amount of the audio data in the current rendering multimedia data, and deleting the audio data in the current rendering multimedia data corresponding to the delay amount of the audio data in the current rendering multimedia data from the second cache queue.
In some embodiments, the second buffer queue further includes decoded second multimedia data, where the decoded second multimedia data includes audio data and video data, and the decoded second multimedia data belongs to a starting point of the video stream to be circulated, and the control unit is further configured to:
when a main time axis used by the current rendering multimedia data is a video time axis, determining that an audio time axis in the current rendering multimedia data is larger than the video time axis in the current rendering multimedia data, and determining the delay amount of the video data in the current rendering multimedia data; and determining a first time length according to the delay amount of the video data in the current rendering multimedia data, and stopping outputting the audio data within the first time length.
In some embodiments, the second buffer queue further includes decoded second multimedia data, where the decoded second multimedia data includes audio data and video data, and the decoded second multimedia data belongs to a starting point of the video stream to be circulated, and the control unit is further configured to:
When the main time axis used by the current rendering multimedia data is an audio time axis, switching the main time axis used by the current rendering multimedia data into a video time axis, determining that the video time axis in the current rendering multimedia data is larger than the audio time axis in the current rendering multimedia data, determining the delay amount of the audio data in the current rendering multimedia data, and deleting the audio data in the current rendering multimedia data corresponding to the delay amount of the audio data in the current rendering multimedia data from the second cache queue.
In some embodiments, the second buffer queue further includes decoded second multimedia data, where the decoded second multimedia data includes audio data and video data, and the decoded second multimedia data belongs to a starting point of the video stream to be circulated, and the control unit is further configured to:
When a main time axis used by the current rendering multimedia data is an audio time axis, switching the main time axis used by the current rendering multimedia data into a video time axis, determining that the audio time axis in the current rendering multimedia data is larger than the video time axis in the current rendering multimedia data, and determining the delay amount of the video data in the current rendering multimedia data; and determining a first time length according to the delay amount of the video data in the current rendering multimedia data, and stopping outputting the audio data within the first time length.
In a fourth aspect, an embodiment of the present application further provides a computer storage medium having stored therein computer program instructions which, when executed on a computer, cause the computer to perform the processing method for video cyclic playing described in the second aspect.
The technical effects caused by any implementation manner of the second aspect to the fourth aspect may refer to the technical effects caused by the corresponding implementation manner of the first aspect, and are not described herein.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the description of the embodiments will be briefly described below, it will be apparent that the drawings in the following description are only some embodiments of the present application, and that other drawings can be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a flowchart of a method for video cyclic playing in the prior art according to an embodiment of the present application;
Fig. 2A is a hardware configuration block diagram of a terminal device according to an embodiment of the present application;
Fig. 2B is a block diagram illustrating a configuration of a control apparatus 100 according to an embodiment of the present application;
Fig. 2C is a software architecture block diagram of a terminal according to an embodiment of the present application;
Fig. 3 is a flow chart of a processing method for video cyclic playing according to an embodiment of the present application;
Fig. 4A is an interface diagram for setting video cyclic playing according to an embodiment of the present application;
FIG. 4B is an interface diagram for setting video loop types according to an embodiment of the present application;
FIG. 4C is an interface diagram for setting video cycle times according to an embodiment of the present application;
Fig. 5 is a flowchart of a method for synchronizing time axes of audio data and video data in multimedia data according to an embodiment of the present application;
fig. 6 is a flow chart of a processing method for video overall cyclic playing provided by an embodiment of the present application;
FIG. 7A is an interface diagram for setting a video cycle number, a cycle start point, and a cycle end point according to an embodiment of the present application;
FIG. 7B is an interface diagram for setting a video cycle start point position according to an embodiment of the present application;
FIG. 7C is an interface diagram for setting the position of the end point of the video loop according to an embodiment of the present application;
FIG. 8 is a flowchart of a method for determining a start frame nearest to a cycle start point according to an embodiment of the present application;
fig. 9 is a flowchart of another processing method for video part cyclic playing according to an embodiment of the present application;
fig. 10 is a schematic structural diagram of a display device according to an embodiment of the present application;
Fig. 11 is a schematic structural diagram of a processing device for video cyclic playing according to an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application will be described in detail below with reference to the accompanying drawings, wherein it is apparent that the described embodiments are only some embodiments of the present application, not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the application without making any inventive effort, are intended to be within the scope of the application.
In the embodiments of the present application, the term "and/or" describes an association relationship between associated objects and indicates that three relationships may exist; for example, A and/or B may represent: A exists alone, A and B exist together, or B exists alone. The character "/" generally indicates that the associated objects before and after it are in an "or" relationship.
The terms "first," "second," and the like, are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defining "a first" or "a second" may explicitly or implicitly include one or more such feature. In the description of the present application, unless otherwise indicated, the meaning of "a plurality" is two or more.
When an existing video application implements the video loop function, two cases arise. If the whole video needs to be looped, referring to flow A shown in fig. 1, after the video player finishes the first playback the video application issues a loop-play instruction to the video player according to the user's setting or fast-rewind operation, instructing the video player to download the video from the network again and play it again. Because downloading the video from the network takes the video player a certain time, stuttering occurs each time a loop iteration starts to play. If only a small segment of the whole video needs to be looped, referring to flow B shown in fig. 1, the video application instructs the video player to start downloading again from the start point of the segment according to the user's operation of adjusting the video progress bar, and plays that segment again. Likewise, because downloading the video takes a certain time, stuttering occurs each time playback reaches the start point of the segment. In addition, the user has to operate manually for every repetition and drag the progress bar back to the start point, which is inconvenient. Based on this, the embodiment of the application provides a processing method and device for video cyclic playing: the video player can directly start to re-download the video stream to be circulated while the current video stream to be circulated is still being played; there is no need to wait until the video stream to be circulated has been completely played before downloading it again, which ensures that no stuttering occurs when the start point of the video stream to be circulated is played.
In the following, in order to facilitate understanding of the solution of the present application, the solution of the present application will be described in a plurality of embodiments in different scenarios. It should be noted that the scheme provided by the application can be applied to video applications installed on various terminal devices such as computers, televisions, smart phones or tablet computers.
As an example, the structure of the terminal device according to the present application will be described in detail. Referring to fig. 2A, a schematic diagram of a possible hardware configuration of a terminal device 200 is shown. In some embodiments, the terminal device includes at least one of a modem 210, a communicator 220, a detector 230, an external device interface 240, a processor 250, a display component 260, an audio output interface 270, memory, a power supply, a user interface 280.
In some embodiments, the display unit 260 includes a display screen component for presenting a picture and a driving component for driving image display, and is used for receiving image signals output from the processor and displaying video content, image content, menu manipulation interfaces, user interface (UI) elements, and the like.
In some embodiments, the display part 260 may be at least one of a liquid crystal display, an organic light-Emitting Diode (OLED) display, and a projection display.
In some embodiments, the modem 210 receives broadcast television signals via wired or wireless reception and demodulates audio-video signals from a plurality of wireless or wired broadcast television signals.
In some embodiments, communicator 220 is a component for communicating with external devices or servers according to various communication protocol types. The terminal 200 may communicate data with the target health detection device or the peer device 300 via the communicator 220.
In some embodiments, the detector 230 is used to collect signals of the external environment or interaction with the outside. For example, the detector 230 includes a sound collector, such as a microphone, for receiving external sound.
In some embodiments, the external device interface 240 may include, but is not limited to, the following: high Definition Multimedia Interface (HDMI), analog or data high definition component input interface (component), composite video input interface (CVBS), USB input interface (USB), RGB port, camera interface, etc. The input/output interface may be a composite input/output interface formed by a plurality of interfaces.
In some embodiments, the processor 250 and the modem 210 may be located in separate devices, i.e., the modem 210 may also be located in an external device to the host device in which the processor 250 is located, such as an external set-top box or the like.
In some embodiments, processor 250 includes at least one of a central processing unit (Central Processing Unit, CPU), a video processor, an audio processor, a graphics processor (Graphics Processing Unit, GPU), RAM (Random Access Memory), ROM (Read-Only Memory).
In some embodiments, the CPU is configured to execute operating system and application instructions stored in memory, and to execute various applications, data, and content in accordance with various interactive instructions received from external inputs, to ultimately display and play various audio and video content. The CPU processor may include a plurality of processors. Such as one main processor and one or more sub-processors.
In some embodiments, a graphics processor is used to generate various graphical objects, such as at least one of icons, operation menus, and graphics displayed in response to user input instructions. The graphics processor comprises an arithmetic unit, which receives the various interactive instructions input by a user, performs the corresponding operations, and displays various objects according to display attributes; it further comprises a renderer for rendering the objects produced by the arithmetic unit, and the rendered objects are displayed on the display part.
In some embodiments, the video processor is configured to receive an external video signal, perform at least one of decompression, decoding, scaling, noise reduction, frame rate conversion, resolution conversion, image synthesis, and other video processing according to a standard codec protocol of an input signal, and obtain a signal that is directly displayed or played on the terminal 200.
In some embodiments, the audio processor is configured to receive an external audio signal, decompress and decode according to a standard codec protocol of an input signal, and at least one of noise reduction, digital-to-analog conversion, and amplification, to obtain a sound signal that can be played in the speaker.
In some embodiments, a user may input a user command through a Graphical User Interface (GUI) displayed on the display part 260, and the user input interface receives the user input command through the Graphical User Interface (GUI).
In some embodiments, a "display interface" is a media interface for interaction and exchange of information between an application or operating system and a user that enables conversion between an internal form of information and a form acceptable to the user. A commonly used presentation form of a user interface is a graphical user interface (Graphic User Interface, GUI), which refers to a graphically displayed user interface that is related to computer operations.
In some embodiments, the user interface 280 is an interface (e.g., physical keys on the body of the peer device, or the like) that may be used to receive control inputs.
In some embodiments, the system of a peer device may include a kernel (Kernel), a command parser (shell), a file system, and applications. The kernel, shell, and file system together form the basic operating system architecture that allows users to manage files, run programs, and use the system. After power-up, the kernel starts, activates kernel space, abstracts the hardware, initializes hardware parameters, and operates and maintains virtual memory, the scheduler, signals, and inter-process communication (IPC). After the kernel starts, the shell and user applications are then loaded. An application is compiled into machine code after being started, forming a process.
In some embodiments, the control device 100 may be further included in the embodiments of the present application, where the control device 100 is configured to control the terminal device 200 as shown in fig. 2A, and may receive an operation instruction input by a user, and convert the operation instruction into an instruction that the terminal device 200 can recognize and respond, so as to perform an intermediary of interaction between the user and the terminal device 200.
The control device 100 may be a remote controller 100A, including infrared protocol communication or bluetooth protocol communication, and other short-range communication modes, etc., and controls the terminal device 200 in a wireless or other wired mode. The user may control the terminal device 200 by inputting user instructions through keys on a remote controller, voice input, control panel input, etc.
The control device 100 may also be an intelligent device, such as a mobile terminal 100B, a tablet computer, a notebook computer, or the like. For example, the terminal device 200 is controlled using an application running on the smart device. The application program, by configuration, can provide various controls to the user through an intuitive User Interface (UI) on a screen associated with the smart device.
A block diagram of the configuration of the control apparatus 100 is exemplarily shown in fig. 2B. As shown in fig. 2B, the control device 100 may include a controller 110, a memory 120, a communicator 130, a user input interface 140, an output interface 150, and a power supply 160. It should be appreciated that fig. 2B is only an example, and that the control device 100 may include more or fewer components than those in fig. 2B, as the application is not particularly limited in this regard.
The controller 110 includes a Random Access Memory (RAM) 111, a Read Only Memory (ROM) 112, a processor 113, a communication interface, and a communication bus. The controller 110 is used to control the operation and operation of the control device 100, as well as the communication collaboration between the internal components, external and internal data processing functions.
For example, when an interaction in which a user presses a key arranged on the remote controller 100A or an interaction in which a touch panel arranged on the remote controller 100A is touched is detected, the controller 110 may control to generate a signal corresponding to the detected interaction and transmit the signal to the terminal device 200.
The memory 120 stores various operation programs, data, and applications for driving and controlling the control device 100 under the control of the controller 110.
The communicator 130 performs communication of control signals and data signals with the terminal equipment 200 under the control of the controller 110. Such as: the control apparatus 100 transmits a control signal (e.g., a touch signal or a button signal) to the terminal device 200 via the communicator 130, and the control apparatus 100 may receive the signal transmitted by the terminal device 200 via the communicator 130. Communicator 130 may include an infrared signal interface 131 and a radio frequency signal interface 132.
The user input interface 140 may include at least one of a microphone 141, a touch pad 142, a sensor 143, keys 144, etc., so that a user may input a user instruction regarding controlling the terminal device 200 to the control apparatus 100 through voice, touch, gesture, press, etc.
The output interface 150 outputs a user instruction received by the user input interface 140 to the terminal device 200 or outputs an image or voice signal received by the terminal device 200. Here, the output interface 150 may include an LED interface 151, a vibration interface 152 generating vibrations, a sound output interface 153 outputting sound, a display 154 outputting an image, and the like.
A power supply 160 for providing operating power support for the various elements of the control device 100 under the control of the controller 110.
Referring to fig. 2C, a block diagram of the architectural configuration of the terminal device operating system is shown schematically. The operating system architecture is an application layer, a middleware layer and a kernel layer in sequence from top to bottom.
The application layer: applications built into the system and non-system applications belong to the application layer, which is responsible for direct interaction with the user. The application layer may include a plurality of applications, such as a settings application, a media center application, and the like.
The middleware layer: it may provide some standardized interfaces to support the operation of various environments and systems. For example, the middleware layer may implement Multimedia and Hypermedia information coding Experts Group (MHEG) middleware functions related to data broadcasting, and the like.
And a kernel layer for providing core system services, such as a kernel based on a Linux operating system.
The kernel layer also provides communication between system software and hardware at the same time, and provides device driving services for various hardware.
The hardware configuration and software structure of different terminals may be different, and thus both fig. 2A and fig. 2B are exemplary illustrations.
In order to facilitate understanding of the solution of the present application, a specific embodiment of the processing method for video cyclic playing according to the present application is described first. Referring to fig. 3, a schematic flow chart of the processing method for video cyclic playing is provided.
301, A video player acquires a multimedia data frame belonging to one data frame in a video stream to be circulated.
As an example, the terminal device may determine the start point, the end point, and the number of loops of the video stream to be circulated in response to the user's operation in a display interface of the video application, and send them to the video application. The video application may in turn send the start point, the end point, and the number of loops of the video stream to be circulated, together with the source address of the video stream, to the video player through a setting interface. The setting interface is used for transferring data between the upper-layer video application and the underlying video player.
The video player may then download the video stream to be circulated according to its start point, end point, and source address, obtaining the multimedia data frames of the video stream to be circulated. Illustratively, the video player may download the video stream to be circulated from the network, or read it from local storage, through the HyperText Transfer Protocol (HTTP), the HTTP Live Streaming (HLS) protocol, a file descriptor (FD), or the like, and store the video stream to be circulated in a buffer. The multimedia data may include video data, audio data, and subtitle data.
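As an illustration of the kind of information passed through the setting interface, the following Python sketch defines a possible configuration record; the field names and the URL are assumptions made for this example, not the patent's actual interface.

```python
# Illustrative only: a possible shape for the data handed to the player
# through the setting interface.
from dataclasses import dataclass

@dataclass
class LoopConfig:
    source_url: str     # source address of the video stream to be circulated
    start_pts_ms: int   # start point of the loop
    end_pts_ms: int     # end point of the loop
    set_times: int      # set number of loops

config = LoopConfig(source_url="http://example.com/video_a.ts",  # hypothetical
                    start_pts_ms=0, end_pts_ms=60_000, set_times=3)
```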
302, The video player decapsulates the multimedia data frame to obtain first multimedia data, and stores the first multimedia data in a first buffer queue.
It should be noted that, the first buffer queue is configured to store multimedia data obtained after the multimedia data frame is decapsulated. The unpackaged multimedia data may include one or more of audio data, video data, or subtitle data. As an example, when the unpacked multimedia data includes different data types, the data belonging to the different data types may be buffered in different buffer queues. For example, the audio data after decapsulation is cached in an audio cache queue, the video data after decapsulation is cached in a video cache queue, and the caption data after decapsulation is cached in a caption cache queue. The first buffer queue mentioned above may include a first audio buffer queue, a first video buffer queue, or a first subtitle buffer queue.
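The following Python sketch illustrates how the first buffer queue could be split by data type as described above; it is a simplified assumption, not the player's actual data structures.

```python
# Illustrative only: the first buffer queue split by data type.
import queue

first_audio_q = queue.Queue()     # decapsulated audio data
first_video_q = queue.Queue()     # decapsulated video data
first_subtitle_q = queue.Queue()  # decapsulated subtitle data

def enqueue_decapsulated(packet: dict) -> None:
    """Route a decapsulated packet to the queue matching its data type."""
    {"audio": first_audio_q,
     "video": first_video_q,
     "subtitle": first_subtitle_q}[packet["type"]].put(packet)

enqueue_decapsulated({"type": "video", "pts": 0, "payload": b""})
```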
303, Before decoding the first multimedia data, the video player starts to re-download the video stream to be circulated when it is determined that the first multimedia data belongs to an end point of the video stream to be circulated and the number of circulation times of the video stream to be circulated does not reach the set number of times.
In some embodiments, the video player may compare the timestamp of the first multimedia data with the timestamp of the multimedia data corresponding to the end point of the video stream to be circulated (received from the video application); if the two timestamps are the same, the video player determines that the first multimedia data is the end point of the video stream to be circulated.
304, The video player obtains the first multimedia data from the first buffer queue, decodes the obtained first multimedia data, and stores the decoded first multimedia data into the second buffer queue.
The second buffer queue is used for storing decoded multimedia data, and the decoded multimedia data may include one or more of audio data, video data, or subtitle data. As an example, when the decoded multimedia data includes different data types, data belonging to different data types may be buffered in different buffer queues. For example, the decoded audio data is buffered in a second audio buffer queue, the decoded video data is buffered in a second video buffer queue, and the decoded subtitle data is buffered in a second subtitle buffer queue. The second buffer queue mentioned above may therefore include a second audio buffer queue, a second video buffer queue, or a second subtitle buffer queue.
It should be noted that, after the video player obtains the first multimedia data in the first buffer queue, the video player may delete the first multimedia data in the first buffer queue.
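A minimal sketch of step 304 follows, assuming a placeholder decode() function: data is removed from the first buffer queue as it is consumed and the decoded result is stored in the second buffer queue.

```python
# Illustrative only: step 304 as a worker loop. Taking an item off a
# queue.Queue removes it, matching the deletion from the first buffer queue.
import queue
import threading

def decode(packet: dict) -> dict:
    # Placeholder for the real audio/video/subtitle decoder.
    return {"type": packet["type"], "pts": packet["pts"], "frame": b"decoded"}

def decode_worker(first_q: queue.Queue, second_q: queue.Queue,
                  stop: threading.Event) -> None:
    while not stop.is_set():
        try:
            packet = first_q.get(timeout=0.1)   # removed from the first queue
        except queue.Empty:
            continue
        second_q.put(decode(packet))            # decoded data into the second queue
```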
305, The video player acquires the decoded first multimedia data from the second cache queue and renders the decoded first multimedia data.
Generally, the rate of downloading, decapsulating, and decoding is higher than the playback speed, so that no stuttering occurs; therefore, there may be a certain time difference between the video player performing step 304 and step 305. For example, the rendering of the multimedia data in the video stream to be circulated lags behind the downloading and decapsulating of that multimedia data: when the nth multimedia data frame is being rendered, the frame being downloaded is the (n+m)th multimedia data frame.
In addition, it should be noted that after the video player extracts the first multimedia data from the second buffer queue, it may delete the first multimedia data stored in the second buffer queue, so as to avoid keeping already-consumed data in the second buffer queue and wasting resources.
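Correspondingly, a minimal sketch of step 305 is shown below, with render_frame and clock_ms as hypothetical stand-ins for the display path and the master clock; frames are removed from the second buffer queue as they are rendered.

```python
# Illustrative only: step 305 as a worker loop.
import queue
import threading
import time

def render_worker(second_q: queue.Queue, stop: threading.Event,
                  render_frame, clock_ms) -> None:
    while not stop.is_set():
        try:
            frame = second_q.get(timeout=0.1)   # removed from the second queue
        except queue.Empty:
            continue
        wait_s = (frame["pts"] - clock_ms()) / 1000.0
        if wait_s > 0:
            time.sleep(wait_s)                  # hold until presentation time
        render_frame(frame)
```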
The processing method for video cyclic playing provided by the embodiment of the application can be summarized as follows: a setting interface is provided for transferring data between the upper-layer video application and the underlying video player, and the video application sends the start point, the end point, and the set number of loops of the video stream to be circulated to the video player through this setting interface. The set number of loops is hereinafter referred to as the set number of times. The video player modifies the existing loop-playing flow shown in fig. 1 accordingly. In the present application, the video player no longer notifies the video application after the video stream to be circulated has finished playing and then waits for the video application to issue an instruction to re-download and replay it. Instead, before the current playback finishes, it directly starts to download the video stream needed for the next playback according to the start point, the end point, and the set number of times, thereby achieving seamless cyclic playing. For example, when the video player determines that the multimedia data frame at the end point of the video stream to be circulated has been downloaded, it does not need to check whether the preceding frames, or the end-point frame itself, have finished playing; it can directly start to re-download the video stream to be circulated.
The processing method of video cyclic playing provided by the application is described below by taking a specific scene as an example.
Scene one: scene of whole video overall circulation.
In this scenario, the video stream to be circulated is a whole-segment video, that is, the starting point of the video stream to be circulated mentioned in the above embodiment is the head of the whole-segment video, and the ending point of the video stream to be circulated is the tail of the whole-segment video. For convenience of description, a video requiring an entire cycle in the present scene is referred to as video a.
In some embodiments, during playback of the video A, the terminal device may determine, in response to a user operation in a display interface of the video application, that the video A is to be played in a loop. For example, the display interface shown in fig. 4A may be presented while the video application plays the video A, and includes a loop-play option 401. In response to the user's touch or remote-control operation on the option 401 in the display interface shown in fig. 4A, the terminal device may determine that the user wants to play the video A in a loop, and may further transmit an instruction for loop-playing the video A to the video application. Upon receiving the instruction, the video application may present a loop-type option in the display interface, for example the display interface shown in fig. 4B. The loop type may be either whole-video loop or partial loop. In this scenario the video A is looped as a whole, so in response to the user selecting the whole-video loop type in the display interface shown in fig. 4B, the terminal device may determine that the user wants to loop the whole of the video A and may transmit a corresponding instruction to the video application. The selection operation in fig. 4B may likewise be a touch operation or a remote-control operation. After receiving this instruction, the video application may further display an option for selecting the set number of times, for example the display interface shown in fig. 4C. In some embodiments, the terminal device determines the number of times the user wants to loop the video A in response to the user's selection in the display interface shown in fig. 4C. The user may enter the set number of times into the corresponding input box, or operate the increment and decrement controls next to it, one of which adds one to the set number of times and the other subtracts one; this is not specifically limited here. After determining the set number of times in response to the above operations, the terminal device transmits it to the video application, and the video application further transmits the loop type and the set number of times to the video player through the setting interface. After receiving the loop type and the set number of times through the setting interface, the video player determines that the currently playing video A needs to be played in a loop. When it then determines that first multimedia data among the decapsulated multimedia data of the video A belongs to the tail of the video A, and that the number of loops completed so far has not reached the set number of times, it starts to download the video A again, that is, from the head of the video A. Since the downloading, decapsulating, and decoding rates are generally higher than the rendering rate, with the above scheme the video player starts to re-download the video A from its head at the moment it determines that the decapsulated first multimedia data belongs to the tail of the video A.
This ensures that the multimedia data at the head of the video A to be played next has already been downloaded before the rendering of the multimedia data at the tail of the video A has finished. Therefore, when the rendering of the tail of the video A finishes, the head of the video A can be rendered immediately, and no stuttering occurs. It should be noted that fig. 4A to fig. 4C provided in this embodiment are only examples, and the loop type and loop count of the video A may be set in other ways.
In other embodiments, the user may set the loop type and the number of loops before the video A starts playing. In this case, the video application sends the loop type and the number of loops to the video player before the video A starts playing. After the video player starts downloading the video A, it checks whether the decapsulated multimedia data belongs to the tail of the video A and whether the number of loops has reached the set number of times; if the data belongs to the tail of the video A and the number of loops has not reached the set number of times, the video player starts to download the video A again from the head of the video A. In some embodiments, the decapsulated multimedia data may be stored in the first buffer queue, and the video player may take multimedia data from the first buffer queue both to determine whether it belongs to the tail of the video A and to decode it.
In the above two embodiments, after determining that the multimedia data in the first buffer queue belongs to the tail of the video A and that the number of loops has not reached the set number of times, the video player may further set the header offset to 0; the purpose of this operation is to instruct that the video A be downloaded again from its head. The header offset refers to the offset between the current multimedia data and the multimedia data belonging to the head of the video A.
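For illustration, the following sketch expresses this restart logic with the header offset reset; the downloader object and its attributes are assumptions made for the example, not the player's real interface.

```python
# Illustrative only: restart logic for whole-video looping.
def maybe_restart_from_head(packet_pts_ms: int, tail_pts_ms: int,
                            loops_done: int, set_times: int, downloader) -> int:
    if packet_pts_ms == tail_pts_ms and loops_done < set_times:
        downloader.header_offset = 0   # offset 0: download video A from its head
        downloader.start()             # begin re-downloading into the first queue
        return loops_done + 1
    return loops_done
```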
In some embodiments, after decoding the multimedia data in the first buffer queue, the video player may store the decoded multimedia data in the second buffer queue. In some embodiments, before the video player renders a given piece of decoded multimedia data in the second buffer queue, it may first determine whether that data belongs to the head of the video A and whether it is being rendered for the first time. For convenience of description, this decoded multimedia data is hereinafter referred to as the decoded second multimedia data. If the video player determines that the decoded second multimedia data belongs to the head of the video A and is not being rendered for the first time, it synchronizes the time axis of the video data in the decoded second multimedia data with the time axis of the audio data in the decoded second multimedia data. As an example, the time axes of the audio data and the video data in the decoded second multimedia data may be synchronized using the following method, shown in fig. 5:
501, the video player determines whether the currently used main time axis is the time axis of video data.
Wherein the main time axis may be referred to as a play time axis or a reference time axis. If the main time axis is the time axis of the video data, the main time axis may be used to indicate the time corresponding to the video data, such as the total duration, the start time, the end time, and the current playing time of the video data. That is, the main time axis is the time line of video playback, and is linearly increasing. Similarly, if the main time axis is a time axis of the audio data, the main time axis may be used to indicate a time corresponding to the audio data, such as information of a total duration, a start time, an end time, and the like of the audio data.
If yes, go to step 503.
If not, step 502 is performed.
502, The video player switches the main time axis to the time axis of the video data.
503, The video player calculates the delay amount of the audio data in the currently rendered multimedia data.
In some embodiments, the delay amount of the audio data may be calculated as follows: subtract the time axis of the video data from the time axis of the audio data, and take the resulting difference as the delay amount of the audio data. For example, if the time axis of the audio data in the currently rendered multimedia data is 00:55 and the time axis of the video data in the currently rendered multimedia data is 00:40, the delay amount of the audio data is 15 s.
504, The video player determines whether the delay amount of the audio data is greater than 0.
If the delay amount of the audio data is smaller than 0, it may be that in the last playing process, all the video data is played, but not all the audio data is played, and some remaining audio data exists in the second buffer queue, where the remaining audio data corresponds to the delay amount. Step 505 is performed.
If the delay amount of the audio data is greater than 0, it may be that the audio data is completely played in the previous playing process, but the video data is not completely played, and some remaining video data is still present in the second buffer queue, then step 507 is performed.
505, The video player calculates the data amount of the remaining audio data.
In some embodiments, the calculation may be performed as follows:
Remaining data amount of audio data = absolute value of the delay amount of the audio data (that is, the amount of audio data corresponding to the absolute value of the delay amount).
506, The video player discards the remaining audio data of the calculated data amount.
507, The video player determines a first time period according to the delay amount of the audio data, and stops outputting the audio data within the first time period.
For example, if the time axis of the audio data in the currently rendered multimedia data is 00:55 and the time axis of the video data in the currently rendered multimedia data is 00:40, the delay amount of the audio data is 15 s and the first duration is 15 s. The main time axis currently used is at 00:40, and output of the audio data is stopped for 15 s starting from 00:40. When the main time axis reaches 00:55, synchronous output of the audio data and the video data begins.
508, The audio data and video data time axis synchronization is completed, and the main time axis is switched back to the original time axis.
If the video player is taking the time axis of the audio data as the main time axis before executing step 502, the main time axis is switched from the time axis of the video data to the time axis of the audio data.
If the video player is taking the time axis of the video data as the main time axis before executing step 502, the video player continues taking the time axis of the video data as the main time axis.
It should be noted that, other methods may be used to correct the time axes of the audio data and the video data in the second multimedia data, and fig. 5 is only an example.
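For orientation only, the following Python sketch summarizes the fig. 5 flow under the assumptions of this description; player, main_axis, audio_clock, video_clock, drop_audio and mute_audio_for are hypothetical names introduced for the sketch and are not defined by the present embodiments.

```python
def synchronize_time_axes(player):
    """Hypothetical sketch of the synchronization flow of fig. 5 (steps 501-508)."""
    # 501/502: make sure the video time axis is the main (reference) time axis.
    original_main_axis = player.main_axis
    if player.main_axis != "video":
        player.main_axis = "video"

    # 503: delay amount of the audio data = audio time axis - video time axis.
    delay = player.audio_clock - player.video_clock

    # 504-507: branch on the sign of the delay amount.
    if delay < 0:
        # Audio lags behind: audio left over from the previous pass is still queued.
        remaining = abs(delay)                    # 505: amount of leftover audio, in time units
        player.drop_audio(duration=remaining)     # 506: discard the leftover audio data
    elif delay > 0:
        # Audio runs ahead: keep rendering video silently for the first duration.
        player.mute_audio_for(duration=delay)     # 507: stop audio output for the first duration

    # 508: synchronization done, switch back to the original main time axis.
    player.main_axis = original_main_axis
```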
Next, a specific embodiment of the processing method for video loop playback in this scenario is described, with reference to the flowchart shown in fig. 6. The flowchart in fig. 6 takes as an example the case where the set number of times and the loop type are set before video A starts playing.
601, The video player starts downloading video A.
Specifically, when the video application determines to play the video a, the video application sends the source address of the video a to the video player, and the video player can download the video a from the network according to the source address of the video a.
602, The video player decapsulates video A to obtain the multimedia data of video A.
In some embodiments, after the video player performs the decapsulation processing on the video a, the decapsulated multimedia data may be stored in the first buffer queue.
603, The video player determines whether the decapsulated multimedia data belongs to the trailer of video a.
If so, step 604 is performed.
If not, go to step 607.
604, The video player determines whether it is necessary to play video A in a loop.
If so, step 605 is performed.
If not, step 607 is performed.
605, The video player sets the slice header offset to 0.
The slice header offset refers to an offset between the current multimedia data and the multimedia data of the slice header.
606, The video player starts to re-download video A.
The video player downloads video A from the head according to the slice header offset of 0.
After the download is completed, the steps 602-603 are continued.
607, The video player obtains the multimedia data in the first buffer queue and performs decoding operation on the multimedia data.
In some embodiments, the video player may store the decoded multimedia data in the second cache queue after performing a decoding operation on the multimedia data in the first cache queue.
608, The video player determines whether the decoded multimedia data belongs to the head of video A.
If so, the decoded multimedia data belonging to the slice header of the video a is referred to as the decoded second multimedia data, and the process continues to step 609.
If not, go to step 611.
609, The video player determines whether the decoded second multimedia data is being rendered for the first time.
In some embodiments, the video player may include a counter; the counter is started when the video player begins rendering the multimedia data belonging to the head of video A, and is incremented each time the multimedia data belonging to the tail of video A has been rendered. Thus, if the decoded second multimedia data is being rendered for the first time, the counter reads 0. The video player may determine whether the decoded second multimedia data is being rendered for the first time based on the value of the counter.
If not, that is, if the decoded second multimedia data is not being rendered for the first time, step 610 is performed.
If yes, step 611 is performed.
610, The video player synchronizes the time axes of the audio data and the video data in the decoded second multimedia data.
As an example, the time axes of the audio data and the video data in the decoded second multimedia data may be synchronized in the manner shown in fig. 5.
611, The video player renders the multimedia data in the second buffer queue.
612, The video player determines whether the number of loops has reached the set number of times.
If the set number of times is reached, the cycle is ended, and the playing is exited.
If the set number of times is not reached, then step 608 is continued.
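As a rough illustration of the rendering-side checks in steps 608-612 (including the counter mentioned in step 609), a hedged Python sketch is given below; second_buffer_queue, is_head, is_tail, render and synchronize_time_axes are hypothetical names assumed for the sketch, and the sketch assumes decoded data is always available because decoding runs ahead of rendering.

```python
def render_loop(player, set_times):
    """Hypothetical sketch of the rendering-side flow of fig. 6, steps 608-612."""
    completed_loops = 0                                   # the counter: 0 means first pass
    while True:
        media = player.second_buffer_queue.popleft()      # decoded multimedia data (608 input)
        # 608/609: head data that is not being rendered for the first time needs its
        # audio and video time axes re-synchronized (fig. 5) before rendering.
        if media.is_head and completed_loops > 0:
            player.synchronize_time_axes()                # 610
        player.render(media)                              # 611
        if media.is_tail:
            completed_loops += 1                          # one full pass of video A finished
            if completed_loops >= set_times:              # 612
                break                                     # end the loop and exit playback
```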
Scene II: a partially cycled scenario.
In this scenario, the video stream to be circulated is a part of a video, for convenience of description, the video to be circulated in this scenario is referred to as video B, the start point of the video stream to be circulated is referred to as a circulation start point, and the end point of the video stream to be circulated is referred to as a circulation end point.
In some embodiments, during playback of a video, the terminal device may determine to play the video in a loop in response to a user operation in a display interface of the video application. For example, while the video is playing, the video application may provide the display interface shown in fig. 4A in scene one, which includes the loop playback option 401. In response to a touch or remote control operation on option 401 in the display interface shown in fig. 4A, the terminal device may determine that the user wants to play the video in a loop, and may transmit an instruction to play the video in a loop to the video application. Upon receiving the instruction, the video application may present an option for selecting a loop type in the display interface, for example the display interface shown in fig. 4B in scene one. The loop type can be divided into whole loop and partial loop; in this scenario partial looping is taken as an example, and the part to be played in a loop is video B. In some embodiments, the terminal device may, in response to the user selecting the loop type as partial loop in the display interface shown in fig. 4B, transmit an instruction to partially loop the video to the video application, and after receiving the instruction the video application may display options for setting the loop start point, setting the loop end point and setting the set number of times in the display interface, for example the display interface shown in fig. 7A. The operation of setting the set number of times may refer to the manner provided in scene one and is not repeated here. The display interface shown in fig. 7A further includes an option for setting the loop start point; the terminal device sends an instruction for setting the loop start point to the video application in response to a touch or remote control operation of the user on this option, and after receiving the instruction the video application may display the selectable loop start point in the display interface, for example the interface shown in fig. 7B, in which the progress bar below the video is shown in a selectable state, indicating that the user may click any position of the progress bar as the loop start point.
In some embodiments, the terminal device determines the loop start point in response to a user operation in the display interface shown in fig. 7B, and transmits the loop start point to the video application. In some embodiments, the display interface shown in fig. 7A also includes an option for setting the loop end point; the terminal device may send an instruction indicating that the user wants to set the loop end point to the video application in response to a touch or remote control operation of the user on this option, and after receiving the instruction the video application may display the selectable loop end point in the display interface, for example the interface shown in fig. 7C, in which the progress bar below the video is likewise shown in a selectable state, indicating that the user may click any position of the progress bar as the loop end point. In some embodiments, the terminal device determines the loop end point in response to a user operation in the display interface shown in fig. 7C, and sends the loop end point to the video application. After the video application receives the set number of times, the loop start point and the loop end point, it can send them to the video player through the set interface. After receiving the set number of times, the loop start point and the loop end point, the video player downloads video B from the network according to the loop start point, the loop end point and the source address of the complete video containing video B; for convenience of description, the complete video containing video B is hereinafter referred to as video C. The video player may further decapsulate the downloaded video B to obtain the multimedia data of video B, and may store the multimedia data of video B in the first buffer queue. When the video player determines, according to the loop end point received from the video application, that first multimedia data among the decapsulated multimedia data belongs to the loop end point, it starts to re-download video B.
As an alternative, after determining that the first multimedia data among the decapsulated multimedia data belongs to the loop end point and before starting to download video B again, the video player may further set the start offset to 0; this operation indicates that video B is to be downloaded from the loop start point. The start offset refers to the offset between the current multimedia data and the multimedia data at the loop start point.
In addition, it should be noted that the loop start point is not necessarily a decodable start frame. For example, if the loop start point corresponds to a time of 3:00 and the nearest start frame corresponds to a time of 2:58, then downloading video B from 3:00 would leave the video player unable to decode the multimedia data at 3:00, so the video player needs to download video B from 2:58. That is, after setting the start offset, the video player also needs to determine the start frame nearest to the loop start point, set the offset of that nearest start frame to 0, and start downloading from the start frame nearest to the loop start point.
As an example, the method shown in fig. 8 may be used to determine the start frame. It should be noted that, the video player cannot learn the start frame closest to the cycle start point when downloading the video B for the first time, so that the data packet including the cycle start point needs to be downloaded for the first time when downloading the video B, and the start frame closest to the cycle start point is determined after the data packet is decapsulated. As shown in fig. 8:
801, The video player determines the loop start point.
802, The video player downloads a data packet containing the loop start point.
Specifically, when downloading according to the source address of the video and the loop start point and loop end point, the video player may download a data packet that includes the loop start point. For example, if the time corresponding to the loop start point is 3:00 and the data packet containing the loop start point covers 2:00-3:30, the data of that packet is downloaded in its entirety. For convenience of description, the data packet containing the loop start point is referred to as data packet P.
803, The video player performs a decapsulation process on the data packet P.
804, The video player records the start frame contained in data packet P that is closest to the loop start point.
805, Before starting to re-download video B, the video player first sets the start offset to 0, and then further sets the offset of the nearest start frame to 0.
806, The video player starts downloading video B from the start frame nearest to the start of the loop.
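As an illustration of step 804 (and only under the assumption that a "start frame" is a decodable frame such as a key frame), the following hypothetical Python sketch picks, from the decapsulated frames of data packet P, the latest start frame that is not later than the loop start point; demuxed_frames, is_start_frame and pts are assumed names.

```python
def find_nearest_start_frame(demuxed_frames, loop_start_pts):
    """Hypothetical sketch of fig. 8, step 804: find the start frame nearest to the loop start point."""
    nearest = None
    for frame in demuxed_frames:                          # frames obtained by decapsulating packet P
        if frame.is_start_frame and frame.pts <= loop_start_pts:
            # Keep the latest start frame that is still not later than the loop start point.
            if nearest is None or frame.pts > nearest.pts:
                nearest = frame
    return nearest                                        # downloading later restarts from this frame (806)
```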
In some embodiments, after the video player downloads and decapsulates video B, the multimedia data of video B obtained after decapsulation may be stored in the first buffer queue. After decoding the multimedia data in the first buffer queue, the video player may store the decoded multimedia data in the second buffer queue. In some embodiments, before rendering the decoded multimedia data in the second buffer queue, the video player may further determine whether the decoded multimedia data belongs to the loop start point and whether it is being rendered for the first time. For convenience of description, the decoded multimedia data is referred to as the decoded second multimedia data. If the video player determines that the decoded second multimedia data belongs to the loop start point and that it is not being rendered for the first time, the video player discards the data from the start frame up to the loop start point in the decoded second multimedia data, and synchronizes the time axis of the video data in the decoded second multimedia data with the time axis of the audio data in the decoded second multimedia data. As an example, the method shown in fig. 5 in scene one may be used to synchronize the time axes of the audio data and the video data in the decoded second multimedia data, which is not repeated here.
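Purely as a sketch of the rendering-side handling just described (hypothetical names, not the claimed implementation), the check for partial looping could look roughly like this:

```python
def handle_decoded_data(player, media, start_frame_pts, loop_start_pts, first_pass):
    """Hypothetical sketch: render-side handling of decoded data for partial looping."""
    # Frames between the nearest start frame and the loop start point were downloaded only
    # so that decoding can begin at a decodable frame; they are discarded, not rendered.
    if start_frame_pts <= media.pts < loop_start_pts:
        return                                            # discard
    # Data at the loop start point that is not being rendered for the first time
    # triggers audio/video time-axis synchronization (fig. 5).
    if media.pts == loop_start_pts and not first_pass:
        player.synchronize_time_axes()
    player.render(media)
```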
Next, a specific embodiment will be described for a processing method of video loop playback in the present scenario. See in particular the flow chart shown in fig. 9.
901, The video player starts playing video C.
Wherein video C refers to the complete video containing video B.
902, The video player downloads video B.
Specifically, in the process of playing the video C by the video player, the terminal device sends an instruction of playing the video B in a circulating way to the video application in response to the operation of the user, and the video application sends a circulating start point, a circulating end point and a set number of times to the video player. And the video player starts downloading the video B after receiving the cycle starting point, the cycle ending point and the set times.
903, The video player decapsulates the video B to obtain multimedia data of the video B.
As an alternative, after the video B is decapsulated by the video player, the decapsulated multimedia data may be stored in the first buffer queue.
904, The video player determines whether the decapsulated multimedia data belongs to the loop end point.
If so, step 905 is performed.
If not, step 908 is performed.
905, The video player determines whether it is necessary to play the video B in a loop.
If so, step 906 is performed.
If not, step 908 is performed.
906, The video player sets the start offset to 0 and the offset of the nearest start frame to 0.
Wherein the start offset refers to the offset between the current multimedia data and the multimedia data at the loop start point, and the offset of the nearest start frame refers to the offset between the current multimedia data and the start frame nearest to the loop start point.
907, The video player initiates a re-download of video B.
The video player starts downloading video B from the start frame nearest to the start point of the loop according to the offset of the nearest start frame being 0.
After the download is completed, the steps 903-904 are continued.
908, The video player obtains the multimedia data in the first buffer queue and performs a decoding operation on the multimedia data.
In some embodiments, the video player may store the decoded multimedia data in the second cache queue after performing a decoding operation on the multimedia data in the first cache queue.
909, The video player determines whether the decoded multimedia data belongs to the loop start point.
If so, the decoded multimedia data belonging to the loop start point is referred to as the decoded second multimedia data, and the process continues to step 910.
If not, go to step 912.
910, The video player determines whether the decoded second multimedia data is being rendered for the first time.
If not, that is, if the decoded second multimedia data is not being rendered for the first time, step 911 is performed.
If yes, step 913 is performed.
911, The video player synchronizes the time axis of the audio data and the video data in the decoded second multimedia data.
As an example, the time axes of the audio data and the video data in the decoded second multimedia data may be synchronized in the manner shown in fig. 5 in scene one.
912, The video player determines whether the decoded multimedia data belongs to the multimedia data between the start frame nearest to the loop start point and the loop start point.
If so, the multimedia data between the start frame nearest to the loop start point and the loop start point is discarded.
If not, go to step 913.
913, The video player renders the multimedia data in the second buffer queue.
914, The video player determines if the set number of times has been reached.
If the set number of times is reached, the loop is ended, the multimedia data of video B remaining in the first buffer queue is deleted, the decoded multimedia data of video B in the second buffer queue is deleted, and playback of video C continues. For example, if the duration of video C is 0-10:00 and video B covers 3:00-5:00, then after the loop ends, the multimedia data of video B remaining in the first buffer queue is deleted, the decoded multimedia data of video B in the second buffer queue is deleted, and video C is downloaded from 6:00 onward.
If the set number of times is not reached, the process continues to step 909.
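For completeness, a hypothetical Python sketch of the clean-up branch of step 914 is given below; first_buffer_queue, second_buffer_queue, belongs_to_video_b and downloader.start are assumed names used only for illustration.

```python
def finish_partial_loop(player, resume_position):
    """Hypothetical sketch of the clean-up branch of step 914."""
    # Remove whatever is left of video B from both queues so that video C can resume.
    player.first_buffer_queue = [m for m in player.first_buffer_queue
                                 if not m.belongs_to_video_b]
    player.second_buffer_queue = [m for m in player.second_buffer_queue
                                  if not m.belongs_to_video_b]
    # Continue playing video C from the position after the loop, e.g. resume the download there.
    player.downloader.start(position=resume_position)
```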
Based on the same concept as the above method, as shown in fig. 10, a display apparatus 1000 is provided. The display device 1000 is capable of performing the various steps of the method described above and will not be described in detail herein in order to avoid repetition. The display device 1000 includes: an input interface 1001, a controller 1002 and a display screen 1003.
An input interface 1001 for receiving a multimedia data frame, the multimedia data frame belonging to one data frame in a video stream to be circulated;
A controller 1002, configured to decapsulate the multimedia data frame to obtain first multimedia data, and store the first multimedia data in a first buffer queue;
The controller 1002 is further configured to, before decoding the first multimedia data, start to re-download the video stream to be circulated and buffer the video stream to be circulated in the first buffer queue when it is determined that the first multimedia data belongs to an end point of the video stream to be circulated and a number of times of circulation of the video stream to be circulated does not reach a set number of times;
The controller 1002 is further configured to obtain the first multimedia data from the first buffer queue, decode the obtained first multimedia data, and store the decoded first multimedia data in a second buffer queue;
The controller 1002 is further configured to obtain the decoded first multimedia data from the second buffer queue, and render the decoded first multimedia data to the display screen 1003.
The display 1003 is configured to display the first multimedia data.
In some embodiments, before the input interface 1001 receives the multimedia data frame, the controller 1002 is further configured to:
and responding to the control operation of the user, and determining the ending point of the video stream to be circulated and the set times.
In some embodiments, the controller 1002 is further configured to determine, in response to the control operation, a start point of the video stream to be circulated, where the controller 1002 is specifically configured to, when starting to re-download the video stream to be circulated:
And re-downloading the video stream to be circulated from the starting point of the video stream to be circulated according to the starting point of the video stream to be circulated and the ending point of the video stream to be circulated.
In some embodiments, the second buffer queue further includes decoded second multimedia data, where the decoded second multimedia data includes audio data and video data, the decoded second multimedia data belongs to a starting point of the video stream to be circulated, and the controller 1002 is further configured to:
When a main time axis used by the current rendering multimedia data is a video time axis, determining that the video time axis in the current rendering multimedia data is larger than the audio time axis in the current rendering multimedia data, determining the delay amount of the audio data in the current rendering multimedia data, and deleting the audio data in the current rendering multimedia data corresponding to the delay amount of the audio data in the current rendering multimedia data from the second cache queue.
In some embodiments, the second buffer queue further includes decoded second multimedia data, where the decoded second multimedia data includes audio data and video data, the decoded second multimedia data belongs to a starting point of the video stream to be circulated, and the controller 1002 is further configured to:
when a main time axis used by the current rendering multimedia data is a video time axis, determining that an audio time axis in the current rendering multimedia data is larger than the video time axis in the current rendering multimedia data, and determining the delay amount of the video data in the current rendering multimedia data; and determining a first time length according to the delay amount of the video data in the current rendering multimedia data, and stopping outputting the audio data within the first time length.
In some embodiments, the second buffer queue further includes decoded second multimedia data, where the decoded second multimedia data includes audio data and video data, the decoded second multimedia data belongs to a starting point of the video stream to be circulated, and the controller 1002 is further configured to:
When the main time axis used by the current rendering multimedia data is an audio time axis, switching the main time axis used by the current rendering multimedia data into a video time axis, determining that the video time axis in the current rendering multimedia data is larger than the audio time axis in the current rendering multimedia data, determining the delay amount of the audio data in the current rendering multimedia data, and deleting the audio data in the current rendering multimedia data corresponding to the delay amount of the audio data in the current rendering multimedia data from the second cache queue.
In some embodiments, the second buffer queue further includes decoded second multimedia data, where the decoded second multimedia data includes audio data and video data, the decoded second multimedia data belongs to a starting point of the video stream to be circulated, and the controller 1002 is further configured to:
When a main time axis used by the current rendering multimedia data is an audio time axis, switching the main time axis used by the current rendering multimedia data into a video time axis, determining that the audio time axis in the current rendering multimedia data is larger than the video time axis in the current rendering multimedia data, and determining the delay amount of the video data in the current rendering multimedia data; and determining a first time length according to the delay amount of the video data in the current rendering multimedia data, and stopping outputting the audio data within the first time length.
Based on the same concept as the above method, as shown in fig. 11, a processing apparatus 1100 for video loop playback is provided. The apparatus 1100 is capable of performing the various steps of the method described above, and details are not repeated here in order to avoid repetition. The apparatus 1100 comprises: an input unit 1101, a control unit 1102, and a display unit 1103.
An input unit 1101 for receiving a multimedia data frame, the multimedia data frame belonging to one data frame in a video stream to be circulated;
the control unit 1102 is configured to decapsulate the multimedia data frame to obtain first multimedia data, and store the first multimedia data in a first buffer queue;
the control unit 1102 is further configured to, before decoding the first multimedia data, start to download the video stream to be recycled again and buffer the video stream to be recycled to the first buffer queue when it is determined that the first multimedia data belongs to an end point of the video stream to be recycled and a number of times of recycling of the video stream to be recycled does not reach a set number of times;
The control unit 1102 is further configured to obtain the first multimedia data from the first buffer queue, decode the obtained first multimedia data, and store the decoded first multimedia data in a second buffer queue;
The control unit 1102 is further configured to obtain the decoded first multimedia data from the second buffer queue, and render the decoded first multimedia data to the display unit 1103.
The display unit 1103 is configured to display the first multimedia data.
The embodiments of the present application also provide a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of any of the methods described above.
Those of ordinary skill in the art will appreciate that: all or part of the steps for implementing the method embodiments described above may be performed by hardware associated with program instructions. The foregoing program may be stored in a computer readable storage medium. The program, when executed, performs steps including the method embodiments described above; and the aforementioned storage medium includes: various media that can store program code, such as ROM, RAM, magnetic or optical disks.
While specific embodiments of the application have been described above, it will be appreciated by those skilled in the art that these are by way of example only, and the scope of the application is defined by the appended claims. Various changes and modifications to these embodiments may be made by those skilled in the art without departing from the principles and spirit of the application, but such changes and modifications fall within the scope of the application. While preferred embodiments of the present application have been described, additional variations and modifications in those embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. It is therefore intended that the following claims be interpreted as including the preferred embodiments and all such alterations and modifications as fall within the scope of the application.
It will be apparent to those skilled in the art that various modifications and variations can be made to the present application without departing from the scope of the application. Thus, it is intended that the present application also include such modifications and alterations insofar as they come within the scope of the appended claims or the equivalents thereof.

Claims (10)

Decapsulating the multimedia data frame to obtain first multimedia data, and storing the first multimedia data into a first cache queue; before decoding the first multimedia data, determining that the first multimedia data belongs to an end point of the video stream to be circulated and that the circulation times of the video stream to be circulated do not reach the set times, starting to re-download the video stream to be circulated and caching the video stream to be circulated in the first cache queue; acquiring the first multimedia data from the first cache queue, decoding the acquired first multimedia data, and storing the decoded first multimedia data into a second cache queue; acquiring the decoded first multimedia data from the second buffer queue, and rendering the decoded first multimedia data to a display screen;