Disclosure of Invention
Embodiments of the present application aim to provide a data processing method, a data processing apparatus, a computer-readable medium and an electronic device, which overcome, at least to a certain extent, technical problems such as a network live broadcast being easily interrupted and poor live broadcast stability.
Other features and advantages of embodiments of the application will be apparent from the following detailed description, or may be learned by the practice of embodiments of the application.
According to an aspect of the embodiments of the present application, there is provided a data processing method including: acquiring a plurality of live broadcast data sources for performing a content live broadcast from a live broadcast initiating end to a live broadcast display end, wherein the live broadcast data sources are used for acquiring data of different modality types to form live broadcast content; respectively acquiring data acquisition states of the live broadcast data sources, and determining whether each live broadcast data source is a normal data source or an abnormal data source according to its data acquisition state; when a live broadcast data source is an abnormal data source, acquiring a substitute data source with the same data modality type as the abnormal data source; and acquiring live broadcast content based on multi-modal data through at least two live broadcast data sources among the normal data sources and the substitute data source, and sending the live broadcast content to the live broadcast display end.
According to an aspect of the embodiments of the present application, there is provided a data processing apparatus including: a data source acquisition module configured to acquire a plurality of live broadcast data sources for performing a content live broadcast from a live broadcast initiating end to a live broadcast display end; a state acquisition module configured to respectively acquire the data acquisition states of the live broadcast data sources and determine whether each live broadcast data source is a normal data source or an abnormal data source according to its data acquisition state; a data source substitution module configured to acquire a substitute data source with the same data modality type as the abnormal data source when a live broadcast data source is an abnormal data source; and a content acquisition module configured to acquire live broadcast content based on multi-modal data through at least two live broadcast data sources among the normal data sources and the substitute data source, and send the live broadcast content to the live broadcast display end.
In some embodiments of the present application, based on the above technical solution, the state obtaining module includes: a connection state obtaining unit configured to obtain device connection states of respective data acquisition interfaces corresponding to respective live broadcast data sources, respectively, the device connection states being used to indicate whether the data acquisition interfaces are connected to a data acquisition device; and the first state determining unit is configured to determine the data acquisition state of the live broadcast data source corresponding to the data acquisition interface according to the equipment connection state.
In some embodiments of the present application, based on the above technical solution, the state obtaining module includes: a data stream receiving unit configured to receive data streams through the respective live data sources, respectively; a transmission state acquisition unit configured to acquire data stream transmission states of data streams received by the respective live data sources, respectively; and the second state determining unit is configured to determine the data acquisition state of the live data source corresponding to the data stream according to the data stream transmission state.
In some embodiments of the present application, based on the above technical solutions, the data source substitution module includes: a modality type acquisition unit configured to acquire the data modality type of the data acquired by the abnormal data source; a data source set acquisition unit configured to acquire a candidate data source set corresponding to the abnormal data source according to the data modality type; and a substitute data source selection unit configured to select a substitute data source from the candidate data source set.
In some embodiments of the present application, based on the above technical solution, the substitute data source selection unit includes: a priority acquisition subunit configured to acquire a data source priority of each candidate data source in the candidate data source set; a data source ranking subunit configured to rank each of the candidate data sources according to the data source priorities; and a data source selection subunit configured to select a substitute data source from the candidate data source set according to a ranking result of the candidate data sources.
In some embodiments of the present application, based on the above technical solutions, the live broadcast data sources include a video acquisition data source that acquires video modality data through a video acquisition device and an audio acquisition data source that acquires audio modality data through an audio acquisition device; when the video acquisition data source is an abnormal data source and the audio acquisition data source is a normal data source, the substitute data source includes a video file data source for generating video modality data through a video file.
In some embodiments of the present application, based on the above technical solution, the content acquisition module includes: an audio acquisition unit configured to acquire, through the audio acquisition data source, audio modality data collected by the audio acquisition device; a first video acquisition unit configured to read file content of a video file through the video file data source and generate video modality data according to the file content; and a data combination unit configured to combine the audio modality data and the video modality data into live broadcast content based on multi-modal data.
In some embodiments of the present application, based on the above technical solutions, the file content of the video file includes one or more image files for displaying a prompt text; the first video acquisition unit includes: a file arrangement subunit configured to arrange the one or more image files in an image display order to form an image sequence; and a first video encoding subunit configured to video-encode the image sequence at a video frame rate to generate video modality data.
In some embodiments of the present application, based on the above technical solutions, the live broadcast data sources include a video acquisition data source that acquires video modality data through a video acquisition device and an audio acquisition data source that acquires audio modality data through an audio acquisition device; when the video acquisition data source is an abnormal data source and the audio acquisition data source is a normal data source, the substitute data source includes a terminal interface data source for generating video modality data through an interactive interface of the terminal device.
In some embodiments of the present application, based on the above technical solution, the content acquisition module includes: an audio acquisition unit configured to acquire, through the audio acquisition data source, audio modality data collected by the audio acquisition device; a second video acquisition unit configured to acquire interface content of an interactive interface of the terminal device through the terminal interface data source and generate video modality data according to the interface content; and a data combination unit configured to combine the audio modality data and the video modality data into live broadcast content based on multi-modal data.
In some embodiments of the present application, based on the above technical solution, the interface content includes an interface image of the interactive interface; the second video acquisition unit includes: an interface image acquisition subunit configured to capture images of the interactive interface of the terminal device at a video frame rate to obtain an interface image sequence composed of interface images of the interactive interface; and a second video encoding subunit configured to video-encode the interface image sequence at the video frame rate to generate video modality data.
In some embodiments of the present application, based on the above technical solutions, the data processing apparatus further includes: a switching instruction response module configured to respond to a data source switching instruction and acquire a data source to be switched from among the live broadcast data sources; and a data source switching module configured to acquire a target data source with the same data modality type as the data source to be switched, and replace the data source to be switched with the target data source.
According to an aspect of the embodiments of the present application, there is provided a computer-readable medium having stored thereon a computer program which, when executed by a processor, implements a data processing method as in the above technical solutions.
According to an aspect of an embodiment of the present application, there is provided an electronic apparatus including: a processor; and a memory for storing executable instructions of the processor; wherein the processor is configured to perform the data processing method as in the above technical solution via execution of the executable instructions.
According to an aspect of embodiments of the present application, there is provided a computer program product or computer program comprising computer instructions stored in a computer readable storage medium. The processor of the computer device reads the computer instructions from the computer-readable storage medium, and the processor executes the computer instructions so that the computer device performs the data processing method as in the above technical solution.
In the technical solutions provided by the embodiments of the present application, by acquiring the data acquisition state of each live broadcast data source used by the live broadcast initiating end for content live broadcast, it can be monitored whether each live broadcast data source is able to acquire data normally. Once a live broadcast data source is found to be in an abnormal state, it can be immediately replaced by a substitute data source capable of generating data of the same modality, so that the network live broadcast can be initiated normally and live broadcast content can be generated continuously, the dependence of the network live broadcast on hardware devices is reduced, and the stability and reliability of the network live broadcast are improved.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the application as claimed.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. However, the exemplary embodiments may be embodied in many forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of the example embodiments to those skilled in the art.
Furthermore, the described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a thorough understanding of embodiments of the application. One skilled in the relevant art will recognize, however, that an embodiment of the application can be practiced without one or more of the specific details, or with other methods, components, devices, steps, etc. In other instances, well-known methods, devices, implementations, or operations are not shown or described in detail to avoid obscuring aspects of embodiments of the application.
The block diagrams depicted in the figures are merely functional entities and do not necessarily correspond to physically separate entities. That is, the functional entities may be implemented in software, or in one or more hardware modules or integrated circuits, or in different networks and/or processor devices and/or microcontroller devices.
The flow diagrams depicted in the figures are exemplary only, and do not necessarily include all of the elements and operations/steps, nor must they be performed in the order described. For example, some operations/steps may be decomposed, and some operations/steps may be combined or partially combined, so that the order of actual execution may be changed according to actual situations.
Fig. 1 schematically shows a block diagram of the system architecture of a content live broadcast system to which the technical solutions of the embodiments of the present application can be applied.
As shown in fig. 1, the content live broadcast system 100 may mainly include a live broadcast initiator 110, a server 120, and a live broadcast display end 130.
The live broadcast initiator 110 is a terminal device located on the network anchor side and used for generating live broadcast content to initiate a network live broadcast. The network anchor may generate live broadcast content by using various electronic devices such as a smart phone, a tablet computer, a notebook computer or a desktop computer, and upload the live broadcast content to the server 120 through a network (i.e., a communication medium such as a wired communication link or a wireless communication link).
The server 120 may be an independent physical server, a server cluster or a distributed system formed by a plurality of physical servers, or a cloud server providing cloud computing services.
The live broadcast display end 130 is a terminal device located on the live broadcast viewer side and used for presenting live broadcast content. The live broadcast viewer can receive and watch, in real time through a network, the live broadcast content distributed by the server 120.
The content live broadcast system 100 in the embodiment of the present application may have any number of terminal devices and servers therein, as required by implementation. For example, the server 120 may be a server group composed of a plurality of server devices. In addition, the technical solution provided in the embodiment of the present application may be applied to the live broadcast initiator 110, may also be applied to the server 120, or may be implemented by the live broadcast initiator 110 and the server 120 together, which is not limited in particular in the embodiment of the present application.
The live broadcast initiator 110 needs to continuously collect and process data to form continuous live broadcast content during the process of initiating or conducting a network live broadcast. Depending on the live broadcast form, the data used to form the live broadcast content may generally include streaming media data of a variety of different modalities, such as a video stream, an audio stream, a text stream, an image stream, and the like. For example, the network anchor may collect a video stream through a camera and an audio stream through a microphone; the video stream data and the audio stream data are uploaded to the server 120 synchronously, and the server 120 may integrate the two paths of data into live broadcast content and push the live broadcast content to the live broadcast display end 130 through the network for display to the live broadcast audience.
For live broadcast content formed from data of multiple different modalities, the data of each modality is generally acquired by a fixed data source. In some technologies related to the present application, if a data source used for collecting data fails, the live broadcast may fail to be initiated or may be interrupted. For example, when video stream data is designated to be collected by a camera, if no camera is installed on the terminal device used by the network anchor, or the camera fails, the network anchor cannot start the live broadcast. In addition, if video stream data cannot be acquired because the camera is accidentally unplugged during the network live broadcast, the live broadcast will be forced to end; the network anchor then needs to reconnect the camera and re-initiate the live broadcast, and the live broadcast audience needs to re-enter the live broadcast room, resulting in a poor user experience. To address these problems, the embodiments of the present application provide a method for processing live broadcast data, which reduces the excessive dependence of the live broadcast initiating end on hardware devices by configuring a substitute data source, ensures that the live broadcast initiating end can continuously and uninterruptedly generate live broadcast content, and thereby ensures the overall stability and reliability of the network live broadcast.
The data processing method applied to a network live broadcast scenario provided by the present application is described in detail below with reference to specific embodiments.
Fig. 2 schematically shows a flow chart of steps of a data processing method in an embodiment of the application, which may be performed by a terminal device of a live initiator, by a server, or by both the terminal device and the server. The embodiment of the application is illustrated by taking a data processing method executed by a terminal device of a live broadcast initiator as an example, and as shown in fig. 2, the data processing method mainly may include the following steps S210 to S240.
Step S210: and acquiring a plurality of live broadcast data sources for carrying out content live broadcast from the live broadcast initiating terminal to the live broadcast displaying terminal, wherein the live broadcast data sources are used for acquiring data of different modality types to form live broadcast content.
Step S220: and respectively acquiring the data acquisition states of the live broadcast data sources, and determining whether the live broadcast data sources are normal data sources or abnormal data sources according to the data acquisition states.
Step S230: and when the live broadcast data source is an abnormal data source, acquiring a substitute data source with the same data mode type as the abnormal data source.
Step S240: and acquiring live broadcast content based on the multi-mode data through at least two live broadcast data sources in the normal data source and the alternative data source, and sending the live broadcast content to the live broadcast display end.
In the data processing method provided by the embodiments of the present application, by acquiring the data acquisition state of each live broadcast data source used by the live broadcast initiating end for content live broadcast, it can be monitored whether each live broadcast data source is able to perform data acquisition normally. Once a live broadcast data source is found to be in an abnormal state, it can be immediately replaced by a substitute data source capable of generating data of the same modality, so that the network live broadcast can be initiated normally and live broadcast content can be generated continuously, the dependence of the network live broadcast on hardware devices is reduced, and the stability and reliability of the network live broadcast are greatly improved.
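By way of illustration only, the following is a minimal Python sketch of the overall flow of steps S210 to S240. The data-source registry, the health probes and the push function are hypothetical names introduced for this example and are not part of the embodiments described above.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class LiveDataSource:
    name: str
    modality: str                      # e.g. "video" or "audio"
    is_normal: Callable[[], bool]      # probe of the data acquisition state
    read_chunk: Callable[[], bytes]    # produces one chunk of modality data

def live_broadcast_tick(sources: Dict[str, LiveDataSource],
                        substitutes: Dict[str, List[LiveDataSource]],
                        push: Callable[[Dict[str, bytes]], None]) -> None:
    """One pass over steps S210-S240 for a single content interval."""
    active: Dict[str, LiveDataSource] = {}
    for modality, source in sources.items():          # S220: check each acquisition state
        if source.is_normal():
            active[modality] = source                 # normal data source
        else:
            candidates = substitutes.get(modality, [])
            if candidates:                            # S230: same-modality substitute
                active[modality] = candidates[0]
    if len(active) >= 2:                              # S240: multi-modal live content
        content = {m: s.read_chunk() for m, s in active.items()}
        push(content)                                 # send to the live broadcast display end
```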
The following describes each step of the data processing method provided in the embodiment of the present application in detail in connection with an application scenario of live webcast.
In step S210, a plurality of live broadcast data sources for performing a content live broadcast from the live broadcast initiating end to the live broadcast display end are acquired, where the plurality of live broadcast data sources are used for acquiring data of different modality types to form live broadcast content.
The live broadcast data sources are data sources used for generating live broadcast content on the terminal device of the live broadcast initiating end. In order to provide live broadcast audiences with live broadcast content that combines various media forms such as visual images, sounds and text, the live broadcast initiating end may be preconfigured with a plurality of live broadcast data sources respectively used for collecting data of different modality types. For example, a video acquisition data source for acquiring video modality data through a video acquisition device and an audio acquisition data source for acquiring audio modality data through an audio acquisition device may be configured at the live broadcast initiating end.
Fig. 3 schematically illustrates a scenario in which a network live broadcast is performed based on a plurality of live broadcast data sources in one embodiment of the present application. As shown in fig. 3, in this application scenario, the client program of the network anchor may collect video data through the camera 301 to form a video stream, while audio data may be collected through the microphone 302 to form an audio stream. The video stream and the audio stream are sent together to the server 303, and the server 303 can integrate the two paths of data into playable live broadcast content and push it to the client program on the audience side, so as to display the live broadcast content to the audience.
In step S220, the data acquisition state of each live broadcast data source is respectively acquired, and each live broadcast data source is determined to be a normal data source or an abnormal data source according to its data acquisition state.
The data acquisition state of a live broadcast data source is used to indicate whether the live broadcast data source can acquire data normally. If a live broadcast data source is able to perform data acquisition normally to form a data stream for live broadcast presentation, the live broadcast data source may be determined to be a normal data source. If a live broadcast data source cannot collect data normally or cannot transmit data normally, the live broadcast data source is determined to be an abnormal data source.
In one embodiment of the present application, a live broadcast data source collects data through an external device (such as a camera or a microphone) connected to the terminal device; if the external device used for collecting data is not normally connected to the terminal device, the external device cannot provide the corresponding live broadcast data. On this basis, the embodiments of the present application can judge whether the external device is normally connected by detecting the connection state of the corresponding external device, so as to further determine the data acquisition state of the live broadcast data source.
Specifically, the embodiments of the present application may respectively acquire the device connection state of each data acquisition interface corresponding to each live broadcast data source, where the device connection state is used to indicate whether the data acquisition interface is connected to a data acquisition device, and then determine the data acquisition state of the live broadcast data source corresponding to the data acquisition interface according to the device connection state. In an embodiment of the present application, each live broadcast data source may correspond to a data acquisition interface on the terminal device, where the data acquisition interface may be a physical interface or a network interface with a specified function. For example, a computer device is typically provided with a physical interface, such as a USB interface, for connecting a camera or another external device; the embodiments of the present application can judge the device connection state by detecting whether each physical interface is connected with the corresponding external device. As another example, a computer device may be provided with a network interface for a wireless communication connection such as a Bluetooth connection or a Wi-Fi connection; the embodiments of the present application can judge the device connection state by detecting whether the network interface has established a normal wireless communication connection with the external device.
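As an informal illustration of such a device-connection check, the sketch below assumes OpenCV is installed and probes whether a capture device answers on a given interface index; the index value and the state labels are assumptions made for this example, not a required implementation.

```python
import cv2  # OpenCV, assumed available


def camera_connected(device_index: int = 0) -> bool:
    """Probe one data acquisition interface: does a capture device answer on it?"""
    capture = cv2.VideoCapture(device_index)
    try:
        return capture.isOpened()      # False -> no camera connected on this interface
    finally:
        capture.release()


def acquisition_state_from_connection(connected: bool) -> str:
    # The device connection state maps directly onto the data acquisition state.
    return "normal" if connected else "abnormal"


if __name__ == "__main__":
    state = acquisition_state_from_connection(camera_connected(0))
    print(f"video acquisition data source: {state} data source")
```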
In another embodiment of the present application, when a live broadcast data source is able to collect data to form a data stream, the transmission state of the data stream may be monitored to determine the data acquisition state of the live broadcast data source. Specifically, the embodiments of the present application may respectively receive data streams through each live broadcast data source, then respectively acquire the data stream transmission state of the data stream received by each live broadcast data source, and further determine the data acquisition state of the live broadcast data source corresponding to the data stream according to the data stream transmission state. The data stream transmission state can be characterized in multiple dimensions such as data delay, jitter and packet loss; if the data stream transmitted by a live broadcast data source suffers from a high delay or a high packet loss rate, its transmission state can be judged to be abnormal. On this basis, even if the live broadcast data source is normally connected to an external device for data acquisition, the live broadcast data source is still determined to be an abnormal data source, because it cannot provide a continuous and stable data stream.
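A minimal sketch of such a transmission-state check follows, assuming packet arrival records of the form (sequence number, send time, receive time) are available; the thresholds for delay, jitter and packet loss are illustrative values chosen for this example, not values prescribed by the embodiments.

```python
from statistics import pstdev
from typing import List, Tuple

PacketRecord = Tuple[int, float, float]   # (sequence number, send time, receive time)


def data_stream_transmission_state(records: List[PacketRecord],
                                   max_delay_s: float = 0.5,
                                   max_jitter_s: float = 0.2,
                                   max_loss_rate: float = 0.05) -> str:
    """Judge the transmission state from delay, jitter and packet loss."""
    if not records:
        return "abnormal"                               # no data stream received at all
    delays = [recv - sent for _, sent, recv in records]
    average_delay = sum(delays) / len(delays)
    jitter = pstdev(delays)                             # variation of the delay
    sequence_numbers = {seq for seq, _, _ in records}
    expected = max(sequence_numbers) - min(sequence_numbers) + 1
    loss_rate = 1.0 - len(sequence_numbers) / expected  # gaps indicate lost packets
    if average_delay > max_delay_s or jitter > max_jitter_s or loss_rate > max_loss_rate:
        return "abnormal"
    return "normal"
```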
In step S230, when a live broadcast data source is an abnormal data source, a substitute data source having the same data modality type as the abnormal data source is acquired.
When determining that a live broadcast data source is an abnormal data source, the embodiments of the present application may first acquire the data modality type of the data collected by the abnormal data source; the data modality types may include data types in various media forms such as video, audio, image and text. According to the data modality type of the data collected by the abnormal data source, a candidate data source set corresponding to the abnormal data source can be acquired, and a substitute data source can then be selected from the candidate data source set. The candidate data source set is a set of at least one candidate data source, each of which is capable of providing live broadcast data of the same data modality type as the abnormal data source.
In one embodiment of the present application, when selecting the substitute data source, the data source priority of each candidate data source in the candidate data source set may be acquired, each candidate data source may be ranked according to its data source priority, and the substitute data source may then be selected from the candidate data source set according to the ranking result. In the embodiments of the present application, a data source priority can be configured in advance for each candidate data source, and the substitute data source is selected in order of priority. In another embodiment of the present application, the selectable candidate data sources in the candidate data source set may also be displayed visually; for example, a data source selection page may be displayed to the user through the interactive interface of the terminal device, and the substitute data source is then determined according to the user's selection operation.
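The priority-based selection can be sketched as below; the candidate names and priority values are invented for illustration, and the convention that a lower number stands for a higher priority is merely an assumption of this example.

```python
from typing import List, Optional, Tuple

Candidate = Tuple[str, int]   # (candidate data source name, data source priority)


def select_substitute(candidates: List[Candidate]) -> Optional[str]:
    """Rank the candidate data source set by priority and return the top-ranked candidate."""
    if not candidates:
        return None
    ranked = sorted(candidates, key=lambda candidate: candidate[1])
    return ranked[0][0]


# Example: candidates for an abnormal video acquisition data source.
print(select_substitute([("terminal_interface", 2), ("video_file", 1)]))  # -> video_file
```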
In step S240, live broadcast content based on multi-modal data is acquired through at least two live broadcast data sources among the normal data sources and the substitute data source, and the live broadcast content is sent to the live broadcast display end.
After the abnormal data source is replaced by the substitute data source, data acquisition is performed through at least two live broadcast data sources among the normal data sources and the substitute data source to form live broadcast content based on multi-modal data, so that the live broadcast content can still be sent normally to the live broadcast display end to start and conduct the network live broadcast even when a preset live broadcast data source is abnormal.
In a technology related to the present application, the live broadcast initiating end may preset a live broadcast type of the network live broadcast and determine, based on the live broadcast type, each live broadcast data source used for generating the live broadcast content. For example, when the live broadcast type is a video live broadcast, at least two paths of data signals, including a video stream and an audio stream, need to be acquired from the plurality of live broadcast data sources. When the network video live broadcast is started, or during the network video live broadcast, if any live broadcast data source becomes abnormal, the live broadcast will be interrupted due to the loss of the corresponding data signal. In the technical solutions provided by the embodiments of the present application, when a live broadcast data source is detected to be abnormal, the abnormal data source can be immediately switched to the substitute data source to keep the data signal transmitted continuously, thereby avoiding live broadcast interruption. The method for switching data sources may include the following steps: acquiring a data transmission protocol used for pushing live broadcast content from the live broadcast initiating end to the live broadcast display end, where the data transmission protocol may include, for example, the Real Time Streaming Protocol (RTSP), the Real Time Messaging Protocol (RTMP), HTTP Live Streaming (HLS), and the like; replacing the first address of the abnormal data source configured in the data transmission protocol with the second address of the substitute data source; and acquiring a data signal from the second address according to the data transmission protocol and continuing to push the data stream based on the substitute data source to the live broadcast display end, thereby keeping the data stream continuous.
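A hedged sketch of the address replacement described above is given below. The push configuration is modelled as a plain dictionary and the source addresses are placeholders, since the embodiments do not prescribe a specific RTSP/RTMP/HLS client implementation.

```python
from typing import Dict


def switch_source_address(push_config: Dict[str, str],
                          first_address: str,
                          second_address: str) -> Dict[str, str]:
    """Replace the abnormal source's first address with the substitute's second address."""
    if push_config.get("source_address") == first_address:
        push_config["source_address"] = second_address   # the pushed data stream stays continuous
    return push_config


push_config = {"protocol": "rtmp", "source_address": "camera://0"}
switch_source_address(push_config, "camera://0", "file:///no_camera_prompt.mp4")
print(push_config)  # {'protocol': 'rtmp', 'source_address': 'file:///no_camera_prompt.mp4'}
```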
In one embodiment of the present application, the live broadcast data sources include a video acquisition data source that acquires video modality data through a video acquisition device and an audio acquisition data source that acquires audio modality data through an audio acquisition device. When the video acquisition data source is an abnormal data source and the audio acquisition data source is a normal data source, the substitute data source includes a video file data source that generates video modality data from a video file.
On this basis, the embodiments of the present application can use the video file data source in place of the video acquisition data source to generate video modality data. Specifically: audio modality data collected by the audio acquisition device is acquired through the audio acquisition data source; file content of a video file is read through the video file data source, and video modality data is generated according to the file content; and the audio modality data and the video modality data are combined into live broadcast content based on multi-modal data.
In some alternative embodiments, the file content of the video file may include one or more image files for displaying a prompt text; for example, a prompt such as "no camera" may be displayed on an image file. When generating video modality data according to the file content, the embodiments of the present application can arrange the one or more image files in an image display order to form an image sequence, and then video-encode the image sequence at a video frame rate to generate the video modality data.
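As a sketch of this image-sequence encoding, assuming OpenCV is available; the file names, output path and frame rate are examples only.

```python
from typing import List

import cv2  # OpenCV, assumed available


def images_to_video(image_paths: List[str],
                    out_path: str = "prompt_video.mp4",
                    fps: float = 25.0) -> str:
    """Arrange the image files in display order and video-encode them at the frame rate."""
    frames = [cv2.imread(path) for path in image_paths]
    frames = [frame for frame in frames if frame is not None]
    if not frames:
        raise ValueError("no readable image files")
    height, width = frames[0].shape[:2]
    writer = cv2.VideoWriter(out_path, cv2.VideoWriter_fourcc(*"mp4v"), fps, (width, height))
    for frame in frames:
        writer.write(cv2.resize(frame, (width, height)))   # one encoded video frame per image
    writer.release()
    return out_path


# Example: a single "no camera" prompt image repeated to fill one second of video.
# images_to_video(["no_camera_prompt.png"] * 25, fps=25.0)
```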
Fig. 4 schematically illustrates a scenario in which a network live broadcast is performed based on a video file serving as the substitute data source in one embodiment of the present application. As shown in fig. 4, in this application scenario, the video acquisition data source originally used for acquiring video data (for example, the camera in fig. 3) becomes an abnormal data source due to an abnormal device connection or abnormal data transmission. On this basis, the embodiments of the present application may determine, based on the data source priorities or based on a user selection, a video file data source that generates video data from the video file 401 as the substitute data source replacing the video acquisition data source.
The client program of the network anchor can form a video stream by reading the file content of the video file 401, while audio data may be collected through the microphone 402 to form an audio stream. The video stream and the audio stream are sent together to the server 403, and the server 403 can integrate the two paths of data into playable live broadcast content and push it to the client program on the audience side, so as to display the live broadcast content to the audience.
In another embodiment of the present application, the live broadcast data sources include a video acquisition data source that acquires video modality data through a video acquisition device and an audio acquisition data source that acquires audio modality data through an audio acquisition device. When the video acquisition data source is an abnormal data source and the audio acquisition data source is a normal data source, the substitute data source includes a terminal interface data source that generates video modality data through the interactive interface of the terminal device.
On this basis, the embodiments of the present application can use the terminal interface data source in place of the video acquisition data source to generate video modality data. Specifically: audio modality data collected by the audio acquisition device is acquired through the audio acquisition data source; interface content of the interactive interface of the terminal device is acquired through the terminal interface data source, and video modality data is generated according to the interface content; and the audio modality data and the video modality data are combined into live broadcast content based on multi-modal data.
In some alternative embodiments, the interface content may include an interface image of the interactive interface. For example, when the network anchor uses a mobile phone to conduct the network live broadcast, the picture displayed on the mobile phone screen in real time is the interface image of the interactive interface. When generating video modality data according to the interface content, the embodiments of the present application can capture images of the interactive interface of the terminal device at a video frame rate to obtain an interface image sequence composed of interface images of the interactive interface, and then video-encode the interface image sequence at the video frame rate to generate the video modality data.
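For the interface-image capture, the following is a sketch under the assumption that Pillow, NumPy and OpenCV are installed; the frame rate and capture duration are placeholders, and a real implementation would keep capturing for as long as the substitute data source stays active.

```python
import time

import cv2                      # OpenCV, assumed available
import numpy as np
from PIL import ImageGrab       # Pillow screen capture, assumed available


def interface_to_video(out_path: str = "interface_capture.mp4",
                       fps: int = 15,
                       seconds: int = 2) -> str:
    """Capture interface images at the video frame rate and video-encode the sequence."""
    first_image = ImageGrab.grab()                          # one interface image
    width, height = first_image.size
    writer = cv2.VideoWriter(out_path, cv2.VideoWriter_fourcc(*"mp4v"), fps, (width, height))
    for _ in range(fps * seconds):
        frame = cv2.cvtColor(np.array(ImageGrab.grab()), cv2.COLOR_RGB2BGR)
        writer.write(cv2.resize(frame, (width, height)))    # encode into the image sequence
        time.sleep(1.0 / fps)                               # pace the capture at the frame rate
    writer.release()
    return out_path
```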
Fig. 5 schematically illustrates a scenario in which a network live broadcast is performed based on the interface content of the interactive interface serving as the substitute data source in one embodiment of the present application. As shown in fig. 5, in this application scenario, the video acquisition data source originally used for acquiring video data (for example, the camera in fig. 3) becomes an abnormal data source due to an abnormal device connection or abnormal data transmission. On this basis, the terminal interface data source that generates video data from the interface content of the interactive interface may be determined, based on the data source priorities or based on a user selection, as the substitute data source replacing the video acquisition data source.
The client program of the network anchor may form a video stream by continuously capturing the pictures on the screen 501, while audio data may be collected through the microphone 502 to form an audio stream. The video stream and the audio stream are sent together to the server 503, and the server 503 can integrate the two paths of data into playable live broadcast content and push it to the client program on the audience side, so as to display the live broadcast content to the audience.
In one embodiment of the present application, a live broadcast data source may also be actively replaced by the user according to live broadcast needs. Specifically: in response to a data source switching instruction, a data source to be switched is acquired from among the live broadcast data sources; a target data source with the same data modality type as the data source to be switched is then acquired, and the data source to be switched is replaced with the target data source.
In the embodiments of the present application, a target data source can be actively selected according to a data source switching instruction triggered by the user, and the data source to be switched is replaced with the target data source, so as to adjust the live broadcast content. For example, the user may switch between the camera data and the screen data of the terminal device to change the live broadcast video picture.
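A minimal sketch of handling such a switching instruction follows; the registry of active sources and the modality lookup table are hypothetical structures introduced only for this example.

```python
from typing import Dict


def handle_switch_instruction(active_sources: Dict[str, str],
                              source_to_switch: str,
                              target_source: str,
                              modality_of: Dict[str, str]) -> Dict[str, str]:
    """Replace the data source to be switched with a target of the same modality type."""
    modality = modality_of[source_to_switch]
    if modality_of.get(target_source) != modality:
        raise ValueError("target data source has a different data modality type")
    active_sources[modality] = target_source
    return active_sources


active_sources = {"video": "camera", "audio": "microphone"}
modality_of = {"camera": "video", "screen_capture": "video", "microphone": "audio"}
# Example: the anchor switches the live picture from the camera to the screen.
handle_switch_instruction(active_sources, "camera", "screen_capture", modality_of)
print(active_sources)  # {'video': 'screen_capture', 'audio': 'microphone'}
```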
Based on the technical solutions provided by the embodiments of the present application, the network anchor can start a live broadcast even when the terminal device has no camera. During the live broadcast, if the camera is accidentally unplugged or a connection fault or data transmission fault occurs, the video picture can be automatically switched to the content of a preconfigured video file or to the display picture of the terminal device screen, so as to maintain the stability and reliability of the network live broadcast. In addition, according to live broadcast needs, data sources can also be switched actively during the live broadcast to enrich the live broadcast content and improve the flexibility of the network live broadcast.
It should be noted that although the steps of the methods in the embodiments of the present application are depicted in the drawings in a particular order, this does not require or imply that the steps must be performed in that particular order, or that all of the illustrated steps must be performed, to achieve desirable results. Additionally or alternatively, certain steps may be omitted, multiple steps combined into one step to perform, and/or one step decomposed into multiple steps to perform, etc.
The following describes embodiments of the apparatus of the present application, which may be used to perform the data processing method in the above-described embodiments of the present application. Fig. 6 schematically shows a block diagram of a data processing apparatus according to an embodiment of the present application. As shown in fig. 6, the data processing apparatus 600 includes: a data source acquisition module 610 configured to acquire a plurality of live broadcast data sources for performing a content live broadcast from the live broadcast initiating end to the live broadcast display end; a state acquisition module 620 configured to respectively acquire the data acquisition states of the live broadcast data sources and determine whether each live broadcast data source is a normal data source or an abnormal data source according to its data acquisition state; a data source substitution module 630 configured to acquire a substitute data source with the same data modality type as the abnormal data source when a live broadcast data source is an abnormal data source; and a content acquisition module 640 configured to acquire live broadcast content based on multi-modal data through at least two live broadcast data sources among the normal data sources and the substitute data source, and send the live broadcast content to the live broadcast display end.
In some embodiments of the present application, based on the above embodiments, the state acquisition module 620 includes: a connection state obtaining unit configured to obtain device connection states of respective data acquisition interfaces corresponding to respective live broadcast data sources, respectively, the device connection states being used to indicate whether the data acquisition interfaces are connected to a data acquisition device; and the first state determining unit is configured to determine the data acquisition state of the live broadcast data source corresponding to the data acquisition interface according to the equipment connection state.
In some embodiments of the present application, based on the above embodiments, the state acquisition module 620 includes: a data stream receiving unit configured to receive data streams through the respective live data sources, respectively; a transmission state acquisition unit configured to acquire data stream transmission states of data streams received by the respective live data sources, respectively; and the second state determining unit is configured to determine the data acquisition state of the live data source corresponding to the data stream according to the data stream transmission state.
In some embodiments of the present application, based on the above embodiments, the data source substitution module 630 includes: a modality type acquisition unit configured to acquire the data modality type of the data acquired by the abnormal data source; a data source set acquisition unit configured to acquire a candidate data source set corresponding to the abnormal data source according to the data modality type; and a substitute data source selection unit configured to select a substitute data source from the candidate data source set.
In some embodiments of the present application, based on the above embodiments, the substitute data source selection unit includes: a priority acquisition subunit configured to acquire a data source priority of each candidate data source in the candidate data source set; a data source ranking subunit configured to rank each of the candidate data sources according to the data source priorities; and a data source selection subunit configured to select a substitute data source from the candidate data source set according to a ranking result of the candidate data sources.
In some embodiments of the present application, based on the above embodiments, the live broadcast data sources include a video acquisition data source that acquires video modality data through a video acquisition device and an audio acquisition data source that acquires audio modality data through an audio acquisition device; when the video acquisition data source is an abnormal data source and the audio acquisition data source is a normal data source, the substitute data source includes a video file data source for generating video modality data through a video file.
In some embodiments of the present application, based on the above embodiments, the content acquisition module 640 includes: an audio acquisition unit configured to acquire, through the audio acquisition data source, audio modality data collected by the audio acquisition device; a first video acquisition unit configured to read file content of a video file through the video file data source and generate video modality data according to the file content; and a data combination unit configured to combine the audio modality data and the video modality data into live broadcast content based on multi-modal data.
In some embodiments of the present application, based on the above embodiments, the file content of the video file includes one or more image files for displaying a prompt text; the first video acquisition unit includes: a file arrangement subunit configured to arrange the one or more image files in an image display order to form an image sequence; and a first video encoding subunit configured to video-encode the image sequence at a video frame rate to generate video modality data.
In some embodiments of the present application, based on the above embodiments, the live broadcast data sources include a video acquisition data source that acquires video modality data through a video acquisition device and an audio acquisition data source that acquires audio modality data through an audio acquisition device; when the video acquisition data source is an abnormal data source and the audio acquisition data source is a normal data source, the substitute data source includes a terminal interface data source for generating video modality data through the interactive interface of the terminal device.
In some embodiments of the present application, based on the above embodiments, the content acquisition module 640 includes: an audio acquisition unit configured to acquire, through the audio acquisition data source, audio modality data collected by the audio acquisition device; a second video acquisition unit configured to acquire interface content of the interactive interface of the terminal device through the terminal interface data source and generate video modality data according to the interface content; and a data combination unit configured to combine the audio modality data and the video modality data into live broadcast content based on multi-modal data.
In some embodiments of the present application, based on the above embodiments, the interface content includes an interface image of the interactive interface; the second video acquisition unit includes: an interface image acquisition subunit configured to capture images of the interactive interface of the terminal device at a video frame rate to obtain an interface image sequence composed of interface images of the interactive interface; and a second video encoding subunit configured to video-encode the interface image sequence at the video frame rate to generate video modality data.
In some embodiments of the present application, based on the above embodiments, the data processing apparatus 600 further includes: a switching instruction response module configured to respond to a data source switching instruction and acquire a data source to be switched from among the live broadcast data sources; and a data source switching module configured to acquire a target data source with the same data modality type as the data source to be switched, and replace the data source to be switched with the target data source.
Specific details of the data processing apparatus provided in the embodiments of the present application have been described in the corresponding method embodiments and are not repeated here.
Fig. 7 schematically shows a block diagram of a computer system of an electronic device for implementing an embodiment of the application.
It should be noted that, the computer system 700 of the electronic device shown in fig. 7 is only an example, and should not impose any limitation on the functions and the application scope of the embodiments of the present application.
As shown in fig. 7, the computer system 700 includes a central processing unit 701 (Central Processing Unit, CPU) which can execute various appropriate actions and processes according to a program stored in a Read-Only Memory 702 (ROM) or a program loaded from a storage section 708 into a random access Memory 703 (Random Access Memory, RAM). In the random access memory 703, various programs and data necessary for the system operation are also stored. The central processing unit 701, the read only memory 702, and the random access memory 703 are connected to each other via a bus 704. An Input/Output interface 705 (i.e., an I/O interface) is also connected to bus 704.
The following components are connected to the input/output interface 705: an input section 706 including a keyboard, a mouse, and the like; an output section 707 including a cathode ray tube (CRT) or a liquid crystal display (LCD), a speaker, and the like; a storage section 708 including a hard disk or the like; and a communication section 709 including a network interface card such as a local area network card or a modem. The communication section 709 performs communication processing via a network such as the Internet. A drive 710 is also connected to the input/output interface 705 as needed. A removable medium 711, such as a magnetic disk, an optical disk, a magneto-optical disk or a semiconductor memory, is mounted on the drive 710 as necessary, so that a computer program read therefrom is installed into the storage section 708 as needed.
In particular, the processes described in the various method flowcharts may be implemented as computer software programs according to embodiments of the application. For example, embodiments of the present application include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method shown in the flowcharts. In such an embodiment, the computer program may be downloaded and installed from a network via the communication portion 709, and/or installed from the removable medium 711. The computer programs, when executed by the central processor 701, perform the various functions defined in the system of the present application.
It should be noted that, the computer readable medium shown in the embodiments of the present application may be a computer readable signal medium or a computer readable storage medium, or any combination of the two. The computer readable storage medium can be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-Only Memory (ROM), an erasable programmable read-Only Memory (Erasable Programmable Read Only Memory, EPROM), flash Memory, an optical fiber, a portable compact disc read-Only Memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In the present application, however, the computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, with the computer-readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wired, etc., or any suitable combination of the foregoing.
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams or flowchart illustration, and combinations of blocks in the block diagrams or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
It should be noted that although in the above detailed description several modules or units of a device for action execution are mentioned, such a division is not mandatory. Indeed, the features and functions of two or more modules or units described above may be embodied in one module or unit in accordance with embodiments of the application. Conversely, the features and functions of one module or unit described above may be further divided into a plurality of modules or units to be embodied.
From the above description of embodiments, those skilled in the art will readily appreciate that the example embodiments described herein may be implemented in software, or may be implemented in software in combination with the necessary hardware. Thus, the technical solution according to the embodiments of the present application may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (may be a CD-ROM, a U-disk, a mobile hard disk, etc.) or on a network, and includes several instructions to cause a computing device (may be a personal computer, a server, a touch terminal, or a network device, etc.) to perform the method according to the embodiments of the present application.
Other embodiments of the application will be apparent to those skilled in the art from consideration of the specification and practice of the application disclosed herein. This application is intended to cover any variations, uses, or adaptations of the application following, in general, the principles of the application and including such departures from the present disclosure as come within known or customary practice within the art to which the application pertains.
It is to be understood that the application is not limited to the precise arrangements and instrumentalities shown in the drawings, which have been described above, and that various modifications and changes may be effected without departing from the scope thereof. The scope of the application is limited only by the appended claims.