Detailed Description
The embodiments of the application provide a method, a device and a storage medium for displaying live broadcast content, which allow a data hiding instruction to be triggered according to actual requirements during a live broadcast, so that the live broadcast additional data on the live broadcast interface is hidden and only the live broadcast video data is displayed, thereby simplifying the live broadcast interface as much as possible and presenting a more complete live broadcast picture.
The terms "first," "second," "third," "fourth," and the like in the description and in the claims of the present application and in the drawings described above, if any, are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the application described herein are, for example, capable of operation in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "corresponding" and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
It should be understood that the live content display method provided by the application can be applied to various live broadcast scenes, in which live video data and live additional data are fed back to the terminal device in real time during the live broadcast so that a user can view the live content. In the application, when the user triggers a data hiding instruction, the live additional data in the live content can be hidden; when the user triggers a data display instruction, the live additional data in the live content can be displayed again. It should be noted that the live broadcast types referred to in the present application may include, but are not limited to, game live broadcast, shopping live broadcast, show live broadcast, and sports live broadcast, where the game live broadcast includes, but is not limited to, multiplayer online battle arena (MOBA) game live broadcast, role-playing game (RPG) live broadcast, first-person shooter (FPS) game live broadcast, and strategy game (SLG) live broadcast.
For convenience of understanding, a game live broadcast scene is taken as an example for explanation. Please refer to fig. 1, which is an interface schematic diagram based on a game live broadcast scene in the embodiment of the present application. As shown in fig. 1 (a), a live interface based on an MOBA game is shown, in which live video data Y1 and live additional data Y2 are displayed; for the MOBA game, the live additional data includes, but is not limited to, game equipment information, game character information, player information, game map information, and game state information. Fig. 1 (B) shows a live interface based on an FPS game, in which live video data X1 and live additional data X2 are displayed; for the FPS game, the live additional data includes, but is not limited to, game equipment information, game map information, enjoyment information, chat information, and in-room person information. In the live broadcast process, a data hiding instruction is triggered according to actual requirements, the live additional data on the live interface is hidden, and only the live video data is displayed, so that the live interface is simplified as much as possible and a more complete live picture is displayed.
In order to display a more complete live broadcast picture in the various scenes, the present application provides a live broadcast content display method, which is applied to the live broadcast content display system shown in fig. 2. Please refer to fig. 2, which is an environment schematic diagram of the live broadcast content display system in the embodiment of the present application. As shown in the figure, the live broadcast content display system includes a server and a terminal device. The terminal device can send a live broadcast content request to the server, and the server sends live broadcast video data and live broadcast additional data to the terminal device according to the live broadcast content request, so that the terminal device displays the live broadcast video data and the live broadcast additional data on a live broadcast interface. When a data hiding instruction is identified, the terminal device continues to display the live broadcast video data on the live broadcast interface based on the data hiding instruction and stops displaying the live broadcast additional data.
The server in fig. 2 may be an independent server, a server cluster composed of multiple servers, a cloud computing center, or the like, which is not limited herein. The terminal device may be a tablet computer, a notebook computer, a palm computer, a mobile phone, a personal computer (PC), a smart television, a voice interaction device, or the like shown in fig. 2. The terminal device is provided with a client, which may be a video client, a browser client, an instant messaging client, an education client, or the like.
Although only five terminal devices and one server are shown in fig. 2, it should be understood that the example in fig. 2 is only used for understanding the present solution, and the number of the specific terminal devices and the number of the servers should be flexibly determined according to actual situations.
Referring to fig. 3, fig. 3 is a schematic flow chart of a live content display method in an embodiment of the present application, as shown in the figure, specifically:
in step S1, when the user needs to watch the live content, the terminal device may trigger a request for the live content, download live video data from the server, render and display the live video data, and optionally, display live additional data on the live interface.
In step S2, the terminal device determines whether it is necessary to present live additional data to the user, and if the user wants to view the live additional data, step S3 is performed, and if the user wants to continue viewing only live video data, step S5 is performed.
In step S3, if the live additional data is not currently shown on the live interface and the user wants to view it, the terminal device receives a data display instruction triggered by the user, and then requests the live additional data from the server in response to the data display instruction.
In step S4, after receiving the live broadcast additional data sent by the server, the terminal device refreshes the local component and displays it on the upper layer of the terminal device, so that the live broadcast video data and the live broadcast additional data are simultaneously displayed on the live broadcast interface.
In step S5, if the live additional data is currently displayed on the live interface and the user wants to continue watching only the live video data, the terminal device receives a data hiding instruction triggered by the user.
In step S6, the terminal device continues to display the live video data on the live interface according to the data hiding instruction, and stops displaying the live additional data.
With reference to the above description, the following description will describe a method for displaying live content in the present application, please refer to fig. 4, where fig. 4 is a schematic view of an embodiment of a method for displaying live content in an embodiment of the present application, and an embodiment of the method for displaying live content in the embodiment of the present application includes:
101. The method comprises the steps that terminal equipment sends a live broadcast content request to a server, so that the server obtains live broadcast video data and live broadcast additional data according to the live broadcast content request, wherein the live broadcast additional data comprises additional data related to the live broadcast video data;
In this embodiment, before the live content is displayed on the terminal device, a long connection is established with the server, and a live content request is then sent to the server through the long connection, so that the server obtains the live video data and the live additional data according to the live content request, where the live additional data may include additional data associated with the live video data. The long connection means that, after the terminal device and the server establish a connection once, the connection is not closed and remains in a connected state for a long time.
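As a minimal illustrative sketch only, the long connection may be realized with, for example, an OkHttp WebSocket that stays open after the first request; the class name, server URL, message format, room id, and field names below are assumptions introduced for illustration and are not part of the claimed method.

```kotlin
// Illustrative sketch: keeping a long connection to the live broadcast server open
// with a WebSocket and sending the live content request over it.
import okhttp3.OkHttpClient
import okhttp3.Request
import okhttp3.Response
import okhttp3.WebSocket
import okhttp3.WebSocketListener
import java.util.concurrent.TimeUnit

class LiveContentChannel(private val serverUrl: String) {

    private val client = OkHttpClient.Builder()
        .pingInterval(30, TimeUnit.SECONDS)   // periodic pings keep the long connection alive
        .build()

    private var socket: WebSocket? = null

    // Establish the long connection once; it is not closed after the first request.
    fun connect(onMessage: (String) -> Unit) {
        val request = Request.Builder().url(serverUrl).build()
        socket = client.newWebSocket(request, object : WebSocketListener() {
            override fun onOpen(webSocket: WebSocket, response: Response) {
                // Send the live content request as soon as the connection is ready.
                webSocket.send("""{"type":"live_content_request","roomId":"12345"}""")
            }

            override fun onMessage(webSocket: WebSocket, text: String) {
                // Live video data notifications and live additional data arrive
                // over the same long connection.
                onMessage(text)
            }
        })
    }

    // The same open connection is reused for later additional data requests,
    // so no new connection has to be established and closed each time.
    fun requestAdditionalData(dataTypeId: String) {
        socket?.send("""{"type":"additional_data_request","dataType":"$dataTypeId"}""")
    }
}
```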
For convenience of understanding, the description is given by taking the application to a live video scene of an MOBA game as an example, where the live video data is the basic video data of the MOBA game live broadcast. Since information such as game characters, equipment, and the game map may change as the game progresses, live additional data associated with the live video data, such as game equipment information, game character information, player information, game map information, and game state information, may be acquired while the live video data is acquired.
102. The method comprises the steps that terminal equipment receives live video data and live additional data sent by a server;
In this embodiment, after the server acquires the live video data and the live additional data, the server sends the live video data and the live additional data to the terminal device. Specifically, the terminal device sends the live content request through the long connection established with the server, and the server sends the live video data and the live additional data to the terminal device through the same long connection. As long as the long connection is not closed, the terminal device and the server can continue to transmit data over it without establishing a new connection, which reduces the signaling required for establishing and closing connections, increases the data transmission speed, and improves the timeliness of the live content.
103. The terminal equipment displays live video data and live additional data on a live interface;
In this embodiment, after receiving the live video data and the live additional data, the terminal device may display the live video data and the live additional data on a live interface. For convenience of understanding, taking an application to a live broadcast scene of an MOBA game as an example for explanation, please refer to fig. 5, which is an interface diagram based on the live broadcast scene of the MOBA game in the embodiment of the present application. As shown in the figure, a1 indicates game state information in the live additional data, a21, a22, and a23 all indicate game character information in the live additional data, a31 and a32 both indicate game player information in the live additional data, a4 indicates game equipment information in the live additional data, and a5 indicates game map information in the live additional data. The unmarked areas in fig. 5 are the live video data. During the live broadcast of the MOBA game, a user can see the live broadcast picture (namely the live video data) on the live interface and can also see additional information (namely the live additional data) that helps the user understand the progress of the game.
104. And if the terminal equipment identifies the data hiding instruction, continuing to display the live video data on the live interface according to the data hiding instruction, and stopping displaying the live additional data.
In this embodiment, when the terminal device identifies the data hiding instruction, the live video data is continuously displayed on the live interface according to the data hiding instruction, and the display of the live additional data is stopped. Specifically, the terminal device may disconnect from the server based on the data hiding instruction, whereupon the server stops sending the live additional data to the terminal device, and the terminal device only displays the live video data. Optionally, the terminal device may also continue to receive the live additional data sent by the server, and then stop displaying the live additional data according to the data hiding instruction.
When a user triggers the switch button, the terminal device may recognize a corresponding instruction (e.g., a data hiding instruction). The switch button may be triggered by the user directly clicking it on the interface, or the instruction may be recognized by the terminal device through voice control or gesture control. Referring to fig. 5, the switch button may be displayed on the interface as "one-key hide" a6, which indicates that all live additional data is stopped from being displayed. In addition, the switch button may be displayed on the interface as "select hide" a7, which indicates that part of the live additional data can be selected for hiding; that is, after the terminal device recognizes the data hiding instruction, the live video data continues to be displayed on the live interface, but the display of the selected live additional data is stopped. For example, in the MOBA live broadcast, if the game equipment information, the game character information, and the game player information are selected to be hidden, the terminal device displays only the live video data together with the game map information and the game state information in the live additional data, and no longer displays the game equipment information, the game character information, and the game player information.
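As a minimal illustrative sketch only, the terminal device may react to the "one-key hide" and "select hide" instructions by changing the visibility of the components that carry the live additional data; the class name, the category keys, and the view map below are assumptions introduced for illustration.

```kotlin
// Illustrative sketch: hiding all live additional data ("one-key hide") or only
// the selected categories ("select hide") while the live video keeps playing.
import android.view.View

class AdditionalDataController(
    private val additionalDataViews: Map<String, View>  // e.g. "equipment" -> equipment component view
) {
    // "One-key hide": stop displaying every piece of live additional data;
    // the live video stays visible on the layer underneath.
    fun hideAll() {
        additionalDataViews.values.forEach { it.visibility = View.GONE }
    }

    // "Select hide": stop displaying only the chosen categories,
    // e.g. setOf("equipment", "character", "player").
    fun hideSelected(categories: Set<String>) {
        categories.forEach { additionalDataViews[it]?.visibility = View.GONE }
    }
}
```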
This embodiment can also be applied to a smart television: the "one-key hide" button can be selected through a remote controller so that the smart television stops displaying the live broadcast additional data, and the remote controller can also be used to select part of the live broadcast additional data to be hidden so that the selected part of the live broadcast additional data stops being displayed.
For convenience of understanding, please refer to fig. 6, which is an interface schematic diagram of a live broadcast scene based on an MOBA game in the embodiment of the present application. As shown in fig. 6 (a), a live interface including a "one-key hide" button B1 and a "select hide" button B2 is shown, and the live interface simultaneously displays the live video data and the live additional data. When the user does not need to view any of the live additional data, the "one-key hide" button B1 may be clicked, so that the terminal device recognizes a data hiding instruction and continues to display the live video data but no longer displays the live additional data; that is, the interface shown in fig. 6 (B) is entered, which displays only the live video data, and a "one-key display" button B3 and a "select display" button B4 are shown in the live interface. Alternatively, when the user does not need to view part of the live additional data, the "select hide" button B2 is clicked and the live additional data to be hidden is selected; for example, if the game character information and the game player information are selected to be hidden, the terminal device recognizes the data hiding instruction and continues to display the live video data but does not display the game character information and the game player information, that is, the interface shown in fig. 6 (C) is entered, in which only the live video data, the game equipment information, the game map information, and the game state information are displayed, and a "one-key display" button B5 and a "select display" button B6 are shown in the live interface.
In the embodiment of the application, a live content display method is provided, and through the above manner, in the live broadcasting process, a data hiding instruction can be triggered according to actual requirements, so that live additional data on a live interface is hidden, and only live video data is displayed, so that the live interface is simplified as much as possible, and a more complete live picture is displayed.
Optionally, on the basis of the embodiment corresponding to fig. 4, in an optional embodiment of the method for displaying live content provided in the embodiment of the present application, after the terminal device continues to display live video data on the live interface according to the data hiding instruction and stops displaying live additional data, the method may further include:
and if the data display instruction is identified, displaying the live broadcast additional data when the terminal equipment displays the live broadcast video data on the live broadcast interface.
In this embodiment, after the terminal device stops displaying the live broadcast additional data, when the user needs to watch the live broadcast additional data again, the data display instruction may be triggered, and if the terminal device recognizes the data display instruction, the live broadcast additional data is displayed when the live broadcast video data is displayed on the live broadcast interface.
Specifically, there are two situations when the terminal device stops displaying the live broadcast additional data. In the first situation, the server disconnects the long connection according to the data hiding instruction sent by the terminal device and stops sending the live broadcast additional data to the terminal device. In the second situation, the terminal device keeps the long connection with the server, continues to receive the live broadcast additional data sent by the server while the live broadcast additional data is not displayed, and re-renders the hidden live broadcast additional data to the live broadcast interface after a data display instruction is identified.
Further, the user can click a switch button in the live interface to trigger the data display instruction, and the data display instruction can also be triggered through voice control or gesture control. The switch button can be displayed as "one-key display" on the interface, in which case the data display instruction is used to instruct the terminal device to display all live additional data. The switch button can also be displayed as "select display" on the interface, in which case the data display instruction is used to instruct the terminal device to display part of the live additional data. For example, in the MOBA live broadcast, if only the live video data is displayed after "one-key hide", and the game equipment information, the game map information, and the game state information are then selected by operating the "select display" switch button, the game equipment information, the game map information, and the game state information in the live additional data can be displayed while the live video data is displayed on the live interface.
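As a minimal illustrative sketch only, the two situations described above may be handled as follows when a data display instruction is recognized; the class name, the identifier values, and the callback used to re-request data from the server are assumptions introduced for illustration.

```kotlin
// Illustrative sketch: handling a data display instruction in the two situations
// described above ("one-key display" or "select display").
import android.view.View

class DataDisplayHandler(
    private val additionalDataViews: Map<String, View>,
    private val requestFromServer: (dataTypeId: String) -> Unit  // re-requests data over the long connection
) {
    // selected = null means "one-key display"; otherwise "select display".
    fun onDisplayInstruction(selected: Set<String>?, dataStillArriving: Boolean) {
        if (dataStillArriving) {
            // Situation 2: the long connection stayed open and the data kept arriving,
            // so simply re-render the hidden components on the live interface.
            val targets = selected ?: additionalDataViews.keys
            targets.forEach { additionalDataViews[it]?.visibility = View.VISIBLE }
        } else {
            // Situation 1: the server stopped sending the additional data, so request it
            // again from the server (the identifier values here are placeholders).
            requestFromServer(if (selected == null) "1" else selected.joinToString(","))
        }
    }
}
```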
For convenience of understanding, taking an application to a live broadcast scene of an MOBA game as an example, please refer to fig. 7, which is an interface diagram illustrating that live broadcast additional data is displayed based on the live broadcast scene of the MOBA game in the embodiment of the present application. As shown in fig. 7, the live interface shown in (a) of fig. 7 includes a "one-key display" button C1 and a "select display" button C2, and the live interface displays only the live video data. When the user needs to view the complete live additional data, the "one-key display" button C1 can be clicked, the terminal device recognizes a data display instruction, and the live video data and the complete live additional data are displayed simultaneously, thereby obtaining the live interface shown in (B) of fig. 7, which includes a "one-key hide" button C3 and a "select hide" button C4. When the user only needs to view part of the live additional data, the "select display" button C2 can be clicked and the live additional data to be displayed is selected; the terminal device recognizes the data display instruction and presents the selected part of the live additional data together with the live video data. For example, if the game state information, the game character information, and the player information are selected for display, the terminal device presents the game state information, the game character information, and the player information while presenting the live video data, thereby obtaining the live interface shown in (C) of fig. 7, which includes a "one-key hide" button C5 and a "select hide" button C6. It should be understood that the example in fig. 7 is only used for understanding the present solution, and the specific live additional data should be flexibly determined in combination with actual requirements.
In the embodiment of the application, a method for displaying live broadcast additional data is provided, and in the above manner, in the live broadcast process, a data display instruction can be triggered according to actual requirements, and when live broadcast video data is displayed, live broadcast additional data on a live broadcast interface can be displayed, so that a more complete live broadcast picture can be displayed.
Optionally, on the basis of the embodiment corresponding to fig. 4, in an optional embodiment of the method for displaying live content provided in the embodiment of the present application, the method for displaying live content may further include:
if the data display instruction is identified, the terminal equipment acquires live broadcast additional data from the server according to the data display instruction;
the acquiring, by the terminal device, the live broadcast additional data from the server according to the data display instruction may include:
the terminal equipment responds to the data display instruction and establishes long connection with the server;
the method comprises the steps that terminal equipment sends an additional data request to a server through long connection, so that the server determines live additional data and position information corresponding to the live additional data according to the additional data request, wherein the additional data request carries a data type identifier, and the data type identifier is used for indicating the type of the live additional data;
the method comprises the steps that terminal equipment receives live broadcast additional data sent by a server and position information corresponding to the live broadcast additional data;
the terminal device displays the live broadcast additional data, and the method can include the following steps:
the terminal equipment determines a display area corresponding to a target component in a live broadcast interface according to position information corresponding to the live broadcast additional data, wherein the target component is displayed on a first layer, the live broadcast video data is displayed on a second layer, and the first layer is located above the second layer;
and the terminal equipment displays the live broadcast additional data on a display area corresponding to the target component in the live broadcast interface.
In this embodiment, the live broadcast additional data displayed in the live broadcast has strong timeliness, and if the terminal device recognizes the data display instruction, the live broadcast additional data can be acquired from the server again according to the data display instruction. Specifically, the terminal device can respond to the data display instruction by establishing a long connection with the server, so that after one transmission of live video data and live additional data is completed, the terminal device and the server can keep a connected state through the long connection. The terminal device can send an additional data request to the server through the long connection, so that the server determines the live additional data and the position information corresponding to the live additional data according to the additional data request, where the additional data request carries a data type identifier that indicates the type of the live additional data. The terminal device receives the live additional data and the corresponding position information sent by the server, and determines a display area corresponding to a target component in the live interface according to the position information, where the target component is displayed on a first layer, the live video data is displayed on a second layer, and the first layer is located above the second layer; the terminal device then displays the live additional data on the display area corresponding to the target component in the live interface. Because the long connection is not closed, if the live additional data needs to be transmitted between the terminal device and the server in real time, the established long connection can be used directly without establishing a new connection, which reduces the resource consumption of establishing and closing connections between the terminal device and the server during the transmission of the live additional data, reduces the total time consumed by multiple data transmissions, increases the data transmission speed, and ensures the timeliness of live content display.
Specifically, the long connection may be used to transmit all the live broadcast additional data, or may be used to transmit only part of the information in the live broadcast additional data. For example, if the data display instruction indicates that the game state information in the live broadcast additional data is to be displayed, a long connection that only transmits the game state information may be established; if all the live broadcast additional data is selected for display, a long connection for transmitting all the live broadcast additional data needs to be established. Accordingly, the same interface can be adopted for all the live broadcast additional data, or a corresponding interface can be set for each type of additional data in the live broadcast additional data, so that both full transmission and partial transmission of the live broadcast additional data can be realized.
The data type identifier carried by the additional data request is used to indicate the type of the live additional data. For example, if the data type identifier is "1", it indicates that the "one-key display" button is triggered, that is, all the live additional data is to be displayed; the server then sends the full amount of live additional data to the terminal device and determines the position information corresponding to each piece of additional data in the live additional data. If the data type identifier is "2", "3", or another value, it indicates that a different type of live additional data is triggered, and the server can determine the live additional data to be displayed according to the type identifier and determine the position information corresponding to that live additional data. The server may maintain a table recording the position information corresponding to each piece of live additional data, where the position information of a piece of live additional data is the position of the display area corresponding to the target component on the live interface. Taking a coordinate origin located at the upper left corner of the screen as an example, the position information may be expressed by screen coordinates (x, y, w, h), where x and y are the abscissa and ordinate, in pixels, of the target component in the live interface, and w and h are the width and height, in pixels, of the display area corresponding to the target component in the live interface. (x, y) is sent by the server, and (w, h) may be sent by the server or may be stored on the terminal device itself, which is not specifically limited here. For example, the position information of the game equipment information may be represented by (x, y) or (x, y, w, h), and the position information of the game state information may be represented by (x, y) or (x, y, w, h).
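As a minimal illustrative sketch only, the table maintained by the server may map data type identifiers to the corresponding live additional data and its (x, y, w, h) position information; the identifier values "1" and "11" to "15" follow the examples in this description, the (60, 150, 90, 40) entry follows the game equipment example used later, and the remaining coordinates, field names, and category names are assumptions introduced for illustration.

```kotlin
// Illustrative sketch: a server-side table mapping data type identifiers to live
// additional data categories and their (x, y, w, h) position information.
data class PositionInfo(val x: Int, val y: Int, val w: Int, val h: Int)

data class AdditionalDataEntry(val category: String, val position: PositionInfo)

val additionalDataTable: Map<String, AdditionalDataEntry> = mapOf(
    "11" to AdditionalDataEntry("game_equipment_info", PositionInfo(60, 150, 90, 40)),
    "12" to AdditionalDataEntry("game_character_info", PositionInfo(300, 10, 120, 60)),
    "13" to AdditionalDataEntry("game_player_info",    PositionInfo(500, 10, 160, 90)),
    "14" to AdditionalDataEntry("game_map_info",       PositionInfo(10, 10, 100, 100)),
    "15" to AdditionalDataEntry("game_state_info",     PositionInfo(200, 0, 240, 30))
)

// Resolving an additional data request: identifier "1" ("one-key display") returns
// the full table; any other identifier returns only the matching entry.
fun resolveRequest(dataTypeId: String): List<AdditionalDataEntry> =
    if (dataTypeId == "1") additionalDataTable.values.toList()
    else listOfNotNull(additionalDataTable[dataTypeId])
```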
Further, after receiving the live broadcast additional data sent by the server and the position information corresponding to the live broadcast additional data, the terminal device determines a display area corresponding to a target component in the live broadcast interface according to the position information corresponding to the live broadcast additional data. The target components corresponding to different live broadcast additional data are different; for example, an image in the game equipment information needs to be rendered and displayed through an image view (ImageView), while text in the game equipment information needs to be rendered and displayed through a text view (TextView), and the target components corresponding to specific live broadcast additional data are not exhaustively listed here. Therefore, after the position information corresponding to the live broadcast additional data is acquired, the terminal device sets the corresponding target component at the corresponding position according to the positions corresponding to the different live broadcast additional data, so that the corresponding live broadcast additional data can be displayed on the display area corresponding to the target component.
For convenience of understanding, an example in which the coordinate origin is located at the upper left corner of the screen and which is applied to a live broadcast scene of an MOBA game is described. Please refer to fig. 8, which is a schematic diagram of determining a display area of a component based on a live broadcast scene of an MOBA game in an embodiment of the present application. As shown in the figure, (a) in fig. 8 is the live video data displayed in the second layer of the live interface, and (B) in fig. 8 illustrates the position information corresponding to the game equipment information. Since the coordinate origin is located at the upper left corner of the screen, the pixel coordinate of the upper left corner of the screen of the terminal device is (0, 0), the direction extending toward the bottom of the screen is the Y axis, and the direction extending toward the right edge of the screen is the X axis. Assuming that the position information of the game equipment information is (60, 150, 90, 40), the upper left corner of the display area corresponding to the target component is 60 pixels from the origin along the X axis and 150 pixels from the origin along the Y axis, and the display area extends 90 pixels to the right and 40 pixels toward the bottom from that corner. In this way, the display area corresponding to the target component of the game equipment information (namely the equipment information component) in the live interface can be determined, and the display area is displayed on the first layer. Other live additional data can also be displayed through similar steps, which is not described in detail here. The first layer is located above the second layer; therefore, as shown in (C) of fig. 8, the corresponding live additional data is displayed on the display area corresponding to the target component.
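As a minimal illustrative sketch only, placing a target component on the first layer according to the (x, y, w, h) position information may look as follows, assuming the first layer is an overlay container stacked above the view that plays the live video; the function and parameter names are assumptions introduced for illustration.

```kotlin
// Illustrative sketch: positioning a target component on the overlay (first layer)
// above the live video (second layer) using pixel coordinates with the origin at
// the upper-left corner of the screen.
import android.view.View
import android.widget.FrameLayout

// x, y: upper-left corner of the display area in pixels; w, h: width and height in pixels.
fun placeTargetComponent(overlay: FrameLayout, component: View, x: Int, y: Int, w: Int, h: Int) {
    val params = FrameLayout.LayoutParams(w, h).apply {
        leftMargin = x
        topMargin = y
    }
    // Adding the component to the overlay draws the live additional data
    // on top of the live video data.
    overlay.addView(component, params)
}
```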
It is to be understood that the foregoing examples are only for understanding the present solution, and the specific data type identification, the location information, the target component and the live additional data should be flexibly determined according to actual requirements.
According to the method, the live broadcast additional data can be displayed on a higher layer by utilizing the relation of the layers, and a user can see the live broadcast additional data while seeing the live broadcast video data. In addition, the display area of the target component is updated by utilizing the position information provided by the server, the position of the live broadcast additional data to be displayed on the live broadcast interface can be more accurately determined, the terminal equipment is not required to search the position information corresponding to different live broadcast additional data, and the data rendering and displaying efficiency of the terminal equipment side is improved.
Optionally, on the basis of the embodiment corresponding to fig. 4, the receiving, by the terminal device, live video data and live additional data sent by the server may include:
the method comprises the steps that terminal equipment receives live video data sent by a server;
the method comprises the steps that terminal equipment receives game equipment information and position information corresponding to the game equipment information, wherein the game equipment information comprises text information and image information;
the terminal equipment displays live video data and live additional data on a live interface, and the method can include the following steps:
the terminal equipment determines a display area of an equipment information component in a live interface according to the position information corresponding to the game equipment information;
the method comprises the steps that the terminal device creates an image view corresponding to game equipment information and a text view corresponding to the game equipment information, wherein the image view is used for displaying the image information, and the text view is used for displaying the text information;
the terminal equipment displays an image view corresponding to the game equipment information and a text view corresponding to the game equipment information on a display area corresponding to the equipment information component in the live broadcast interface;
and the terminal equipment displays the live video data on a live interface.
In this embodiment, the live content request is used to request live video data and live additional data, and the live video data and the live additional data correspond to different data interfaces, respectively, so the live content request includes an additional data request for requesting the live additional data and a video data request for requesting the live video data. And the server sends the live video data to the terminal equipment according to the video data request. And the server sends the live broadcast additional data of corresponding type to the terminal equipment according to the data type identification carried in the additional data request.
If the data type identifier carried by the additional data request is a first type identifier, and the first type identifier is used for indicating game equipment information, the terminal device can also receive the game equipment information and position information corresponding to the game equipment information, which are sent by the server, and the game equipment information comprises text information and image information. The terminal equipment determines a display area of an equipment information component in the live interface according to the position information corresponding to the game equipment information, creates ImageView corresponding to the game equipment information and TextView corresponding to the game equipment information, wherein the ImageView is used for displaying image information, and the TextView is used for displaying text information, and then displays the ImageView corresponding to the game equipment information and the TextView corresponding to the game equipment information on the display area corresponding to the equipment information component in the live interface.
Specifically, assuming that the first type identifier is "11", when the data type identifier carried in the additional data request is the first type identifier "11", it is determined that the terminal device requests the game equipment information in the live broadcast additional data. The terminal device therefore receives the game equipment information sent by the server and the position information corresponding to the game equipment information, determines the display area of the equipment information component, and then creates an ImageView corresponding to the game equipment information and a TextView corresponding to the game equipment information. The ImageView is a concrete form of the View component and may be used to display an image; in practical applications, the ImageView can display not only images but also other drawable objects. The TextView is a component for displaying a character string, that is, an area on the terminal device where a piece of text is displayed, and the displayed text cannot be directly edited by the user. The equipment information component represents a component in which the game equipment information is packaged.
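As a minimal illustrative sketch only, creating the ImageView and TextView for the game equipment information and attaching them to the display area of the equipment information component may look as follows; the container type, the bitmap source, and the text content are assumptions introduced for illustration.

```kotlin
// Illustrative sketch: rendering game equipment information with an ImageView
// (equipment image) and a TextView (kill/death counts etc.) created at runtime.
import android.content.Context
import android.graphics.Bitmap
import android.widget.ImageView
import android.widget.LinearLayout
import android.widget.TextView

fun showEquipmentInfo(context: Context, container: LinearLayout,
                      equipmentImage: Bitmap, equipmentText: String) {
    val imageView = ImageView(context).apply {
        setImageBitmap(equipmentImage)   // image part of the game equipment information
    }
    val textView = TextView(context).apply {
        text = equipmentText             // text part, e.g. "kills 5 / deaths 2"
    }
    // The container occupies the display area of the equipment information component
    // determined from the position information sent by the server.
    container.addView(imageView)
    container.addView(textView)
}
```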
For convenience of understanding, taking an application to a live broadcast scene of an MOBA game as an example for explanation, please refer to fig. 9, which is an interface schematic diagram illustrating game equipment information based on the live broadcast scene of the MOBA game in the embodiment of the present application. As shown in fig. 9, (a) illustrates a selection box further shown on the live interface after the user clicks the "select display" button. When the user selects the "game equipment information" button D1, the terminal device may send an additional data request carrying the first type identifier to the server, and the terminal device then receives the position information corresponding to the game equipment information sent by the server, thereby determining the display area corresponding to the equipment information component in the live interface. The terminal device then creates the ImageView D2 (such as an equipment image of a game character) and the ImageView D3 (such as a head portrait of the game character) corresponding to the game equipment information as illustrated in fig. 9 (B), and the TextView D4 corresponding to the game equipment information (e.g., the number of kills, attacks, and deaths of the game character), and displays them in the display area corresponding to the equipment information component in the live interface, obtaining the live interface illustrated in (C) of fig. 9, through which the user can view the live video data and the game equipment information at the same time.
In the embodiment of the application, the method for displaying the game equipment information is provided, and through the mode, the game equipment information can be displayed on a live broadcast interface based on the requirements of the viewers, so that the viewers can be helped to better understand the live broadcast game-play situation, more information can be provided for game explanation, and the live broadcast flexibility is improved.
Optionally, on the basis of the embodiment corresponding to fig. 4, in an optional embodiment of the method for displaying live content provided in the embodiment of the present application, the receiving, by the terminal device, live video data and live additional data sent by the server may include:
the method comprises the steps that terminal equipment receives live video data sent by a server;
the method comprises the steps that terminal equipment receives game role information sent by a server and position information corresponding to the game role information, wherein the game role information comprises text information and image information;
the terminal equipment displays live video data and live additional data on a live interface, and the method can include the following steps:
the terminal equipment determines a display area of a role information component in a live broadcast interface according to the position information corresponding to the game role information;
the method comprises the steps that terminal equipment creates an image view corresponding to game role information and a text view corresponding to the game role information, wherein the image view is used for displaying the image information, and the text view is used for displaying the text information;
the method comprises the steps that terminal equipment displays an image view corresponding to game role information and a text view corresponding to the game role information on a display area corresponding to a role information component in a live broadcast interface;
and the terminal equipment displays the live video data on a live interface.
In this embodiment, the live content request is used to request live video data and live additional data, and the live video data and the live additional data correspond to different data interfaces, respectively, so the live content request includes an additional data request for requesting the live additional data and a video data request for requesting the live video data. And the server sends the live video data to the terminal equipment according to the video data request. And the server sends the live broadcast additional data of corresponding type to the terminal equipment according to the data type identification carried in the additional data request.
If the data type identifier carried by the additional data request is a second type identifier, the second type identifier is used for indicating game role information, then the terminal device can receive the game role information and the position information corresponding to the game role information sent by the server, the game role information comprises text information and image information, then the display area of a role information component in a live broadcast interface is determined according to the position information corresponding to the game role information, an ImageView corresponding to the game role information and a TextView corresponding to the game role information are created, the ImageView is used for displaying the image information, the TextView is used for displaying the text information, and then the ImageView corresponding to the game role information and the TextView corresponding to the game role information are displayed in the display area corresponding to the role information component in the live broadcast interface.
Specifically, assuming that the second type identifier is "12", when the data type identifier carried in the additional data request is the second type identifier "12", it is determined that the terminal device requests the game role information in the live broadcast additional data, and the terminal device then displays the ImageView corresponding to the game role information and the TextView corresponding to the game role information according to steps similar to those in the foregoing embodiment, which are not described here again.
For convenience of understanding, taking an application to a live broadcast scene of an MOBA game as an example for explanation, please refer to fig. 10, which is a schematic diagram of an interface for presenting game role information based on the live broadcast scene of the MOBA game in the embodiment of the present application. As shown in the figure, (a) of fig. 10 shows a selection box further presented on the live interface after the user clicks the "select display" button. When the user selects the "game role information" button E1, the terminal device may send an additional data request carrying the second type identifier to the server, and the terminal device then receives the position information corresponding to the game role information sent by the server, thereby determining the display area corresponding to the role information component in the live interface. The terminal device then creates the ImageView E2 (such as a head portrait of a game role) and the ImageView E3 (such as an image of a skill possessed by the game role) corresponding to the game role information shown in fig. 10 (B), and the TextView E4 (such as a game name) corresponding to the game role information, and then displays the live interface illustrated in fig. 10 (C) on the display area corresponding to the role information component in the live interface, so that the user can view the live video data and the game role information through the live interface at the same time.
In the embodiment of the application, the method for displaying the game role information is provided, and the game role information can be displayed on a live broadcast interface based on the requirements of the viewers, so that the viewers can be helped to better understand the live broadcast game-play situation, more information can be provided for game explanation, and the live broadcast flexibility is improved.
Optionally, on the basis of the embodiment corresponding to fig. 4, in an optional embodiment of the method for displaying live content provided in the embodiment of the present application, the receiving, by the terminal device, live video data and live additional data sent by the server may include:
the method comprises the steps that terminal equipment receives live video data sent by a server;
the method comprises the steps that terminal equipment receives game player information and position information corresponding to the game player information, wherein the game player information is sent by a server and comprises text information and video information;
the terminal equipment displays live video data and live additional data on a live interface, and the method can include the following steps:
the terminal equipment determines a display area of an image information component in a live broadcast interface according to the position information corresponding to the game player information;
the method comprises the steps that terminal equipment creates a text view corresponding to the game player information and a video view corresponding to the game player information, wherein the text view is used for displaying the text information, and the video view is used for displaying the video information;
the terminal equipment displays a text view corresponding to the information of the game player and a video view corresponding to the information of the game player on a display area corresponding to the image information component in the live broadcast interface;
and the terminal equipment displays the live video data on a live interface.
In this embodiment, the live content request is used to request live video data and live additional data, and the live video data and the live additional data correspond to different data interfaces, respectively, so the live content request includes an additional data request for requesting the live additional data and a video data request for requesting the live video data. And the server sends the live video data to the terminal equipment according to the video data request. And the server sends the live broadcast additional data of corresponding type to the terminal equipment according to the data type identification carried in the additional data request.
If the data type identifier carried by the additional data request is a third type identifier, and the third type identifier is used for indicating game player information, the terminal device can receive the game player information and the position information corresponding to the game player information sent by the server, where the game player information includes text information and video information. The terminal device then determines, according to the position information corresponding to the game player information, the display area of the image information component in the live interface, and creates a TextView corresponding to the game player information and a video view (VideoView) corresponding to the game player information, where the TextView is used for displaying the text information and the VideoView is used for displaying the video information. The TextView corresponding to the game player information and the VideoView corresponding to the game player information are then displayed in the display area corresponding to the image information component in the live interface.
Specifically, assuming that the third type identifier is "13", when the data type identifier carried in the additional data request is the third type identifier "13", it is determined that the terminal device requests the game player information in the live additional data, and the terminal device then determines the display area of the image information component through a procedure similar to that in the foregoing embodiments. In addition, a VideoView corresponding to the game player information and a TextView corresponding to the game player information need to be created. The VideoView can keep the screen always on by acquiring the corresponding permission, and the VideoView is usually added to an eXtensible Markup Language (xml) layout and then initialized and configured, so that the VideoView corresponding to the game player information can be created.
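As a minimal illustrative sketch only, initializing the VideoView declared in the xml layout and the TextView for the game player information may look as follows; the function name, the video source, and the nickname parameter are assumptions introduced for illustration.

```kotlin
// Illustrative sketch: binding the game player information to a VideoView
// (player video feed) and a TextView (player nickname) after they have been
// declared in the xml layout and looked up from the live interface.
import android.net.Uri
import android.widget.TextView
import android.widget.VideoView

fun bindPlayerInfo(videoView: VideoView, nameView: TextView,
                   playerVideoUri: Uri, nickname: String) {
    videoView.keepScreenOn = true                       // keep the screen on while the feed plays
    videoView.setVideoURI(playerVideoUri)               // video information of the game player
    videoView.setOnPreparedListener { videoView.start() }  // start playback once prepared
    nameView.text = nickname                            // text information of the game player
}
```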
For convenience of understanding, taking the application to the live broadcast scene of the MOBA game as an example for explanation, please refer to fig. 11, fig. 11 is a schematic diagram of an interface for presenting game player information based on the live broadcast scene of the MOBA game in the embodiment of the present application, as shown in the figure, after a "select display" button is clicked by a user, a selection pop-up box further presented on the live broadcast interface is illustrated in (a) in fig. 11, when the user selects "game player information" button F1, a terminal device may send an additional data request carrying a third type identifier to a server, so that the terminal device receives position information corresponding to the game player information sent by the server, thereby determining a display area corresponding to an image information component in the live broadcast interface, and then creates a VideoView F2 (e.g. a recorded player picture) corresponding to the game player information and a text view F3 (e.g. a nickname of a player) corresponding to the game player information as illustrated in (B) in fig. 11, then, a live interface as illustrated in fig. 11 (C) is displayed in a display area corresponding to the image information component in the live interface, and a user can view live video data and information of game players through the live interface at the same time.
In the embodiment of the application, the method for displaying the information of the game players is provided, and through the mode, the information of the game players can be displayed on a live broadcast interface based on the requirements of the viewers, so that the viewers can be helped to better understand the live broadcast game-play situation, more information can be provided for game commentary, and the live broadcast flexibility is improved.
Optionally, on the basis of the embodiment corresponding to fig. 4, in an optional embodiment of the method for displaying live content provided in the embodiment of the present application, the receiving, by the terminal device, live video data and live additional data sent by the server may include:
the method comprises the steps that terminal equipment receives live video data sent by a server;
the method comprises the steps that terminal equipment receives game map information sent by a server and position information corresponding to the game map information, wherein the game map information comprises image information;
the terminal equipment displays live video data and live additional data on a live interface, and the method can include the following steps:
the terminal equipment determines a display area of a map information component in a live broadcast interface according to the position information corresponding to the game map information;
the method comprises the steps that terminal equipment creates an image view corresponding to game map information, wherein the image view is used for displaying image information;
the terminal equipment displays an image view corresponding to the game map information on a display area corresponding to a map information component in a live broadcast interface;
and the terminal equipment displays the live video data on a live interface.
In this embodiment, the live content request is used to request live video data and live additional data, and the live video data and the live additional data correspond to different data interfaces, respectively, so the live content request includes an additional data request for requesting the live additional data and a video data request for requesting the live video data. And the server sends the live video data to the terminal equipment according to the video data request. And the server sends the live broadcast additional data of corresponding type to the terminal equipment according to the data type identification carried in the additional data request.
If the data type identifier carried by the additional data request is a fourth type identifier, the fourth type identifier is used for indicating game map information, so that the terminal device can receive the game map information sent by the server and the position information corresponding to the game map information, the game map information comprises image information, then the display area of a map information component in a live broadcast interface is determined according to the position information corresponding to the game map information, an ImageView corresponding to the game map information is created, the ImageView is used for displaying the image information, and then the ImageView corresponding to the game map information is displayed on the display area corresponding to the map information component in the live broadcast interface.
Specifically, assuming that the fourth type identifier is "14", when the data type identifier carried in the additional data request is the fourth type identifier "14", it is determined that the terminal device requests the game map information in the live broadcast additional data, and the terminal device then displays the ImageView corresponding to the game map information according to steps similar to those in the foregoing embodiment, which are not described here again.
For convenience of understanding, taking an application to a live broadcast scene of an MOBA game as an example for explanation, please refer to fig. 12, which is a schematic diagram of an interface for displaying game map information based on the live broadcast scene of the MOBA game in the embodiment of the present application. As shown in the figure, after the user clicks the "select display" button in (a) of fig. 12, the live interface further displays a selection box. When the user selects the "game map information" button G1, the terminal device sends an additional data request carrying the fourth type identifier to the server, and the terminal device then receives the position information corresponding to the game map information sent by the server, thereby determining the display area corresponding to the map information component in the live interface. The terminal device then creates the ImageView G2 (such as an image of the game map) corresponding to the game map information shown in (B) of fig. 12, and presents, on the display area corresponding to the map information component in the live interface, the live interface illustrated in fig. 12 (C), through which the user can simultaneously view the live video data and the game map information.
In the embodiment of the application, a method for displaying the game map information is provided. The game map information can be displayed on the live interface based on the requirements of viewers, which helps viewers better understand the live game-play situation, provides more information for game commentary, and improves the flexibility of live broadcasting.
Optionally, on the basis of the embodiment corresponding to fig. 4, in an optional embodiment of the method for displaying live content provided in the embodiment of the present application, the receiving, by the terminal device, live video data and live additional data sent by the server may include:
the method comprises the steps that terminal equipment receives live video data sent by a server;
the method comprises the steps that terminal equipment receives game state information sent by a server and position information corresponding to the game state information, wherein the game state information comprises real-time information;
and the displaying, by the terminal device, of the live video data and the live additional data on the live interface may include:
the terminal equipment determines a display area of a state information component in a live broadcast interface according to the position information corresponding to the game state information;
the terminal equipment creates a customized view corresponding to the game state information, wherein the customized view is used for displaying real-time information in an animation effect;
the terminal equipment displays a customized view corresponding to the game state information on a display area corresponding to the state information component in the live broadcast interface;
and the terminal equipment displays the live video data on a live interface.
In this embodiment, the live content request is used to request live video data and live additional data, and the live video data and the live additional data correspond to different data interfaces, respectively, so the live content request includes an additional data request for requesting the live additional data and a video data request for requesting the live video data. And the server sends the live video data to the terminal equipment according to the video data request. And the server sends the live broadcast additional data of corresponding type to the terminal equipment according to the data type identification carried in the additional data request.
If the data type identifier carried by the additional data request is the fifth type identifier, which is used to indicate game state information, the terminal device may receive the game state information and the position information corresponding to the game state information sent by the server, where the game state information includes real-time information; the terminal device then determines the display area of the state information component in the live interface according to the position information corresponding to the game state information, creates a customized view corresponding to the game state information, where the customized view is used to display the real-time information with an animation effect, and then displays the customized view corresponding to the game state information on the display area corresponding to the state information component in the live interface.
Specifically, assuming that the fifth type identifier is "15", when the data type identifier carried in the additional data request is the fifth type identifier "15", it is determined that the terminal device is requesting the game state information in the live additional data. The game state information includes real-time information, and the real-time information includes, but is not limited to, a countdown, a score, gold coins, an economic difference, and the like. The terminal device then determines the display area of the state information component according to steps similar to those in the foregoing embodiment, which are not described herein again.
For convenience of understanding, taking the application to a live MOBA game scene as an example for explanation, please refer to fig. 13, where fig. 13 is a schematic diagram of an interface for displaying game state information based on the live MOBA game scene in the embodiment of the present application. As shown in the figure, after the user clicks the "select display" button in fig. 13 (A), the live interface further displays a selection pop-up box. When the user selects the "game state information" button H1, the terminal device sends an additional data request carrying the fifth type identifier to the server, so that the terminal device receives the game state information and the position information corresponding to the game state information sent by the server, determines the display area corresponding to the state information component in the live interface accordingly, and creates the customized view H2 (for example, score information) corresponding to the game state information shown in fig. 13 (B); the customized view is then displayed on the display area corresponding to the state information component, so that the live interface illustrated in fig. 13 (C) is presented, through which the user can simultaneously view the live video data and the game state information.
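As an illustrative aid only, the following sketch shows one way a customized view could present such real-time information with a simple fade animation on an Android terminal; the class name GameStateView and the method updateRealTimeInfo are hypothetical and are not the claimed implementation.

```kotlin
import android.content.Context
import androidx.appcompat.widget.AppCompatTextView

// A customized view for game state information: updates real-time text with a fade animation.
class GameStateView(context: Context) : AppCompatTextView(context) {

    // Display a new piece of real-time information (e.g. score or countdown) with an animation effect.
    fun updateRealTimeInfo(info: String) {
        animate().alpha(0f).setDuration(150L).withEndAction {
            text = info
            animate().alpha(1f).setDuration(150L).start()
        }.start()
    }
}
```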
It can be understood that the foregoing embodiments each describe the display of only one selected kind of live additional data. In practical applications, multiple kinds of live additional data may also be selected for display, for example, the game map information and the game state information, or the game character information and the game player information; which live additional data are selected for display should be determined flexibly according to practical requirements.
In the embodiment of the application, a method for displaying the game state information is provided. The game state information can be displayed on the live interface based on the requirements of viewers, which helps viewers better understand the live game-play situation, provides more information for game commentary, and improves the flexibility of live broadcasting.
Optionally, on the basis of the embodiment corresponding to fig. 4, in an optional embodiment of the method for displaying live content provided in the embodiment of the present application, the step of continuing to display live video data on the live interface by the terminal device according to the data hiding instruction and stopping displaying live additional data may include:
the method comprises the steps that the terminal device sends a data hiding instruction to the server, so that the server disconnects the long connection according to the data hiding instruction and stops sending the live additional data to the terminal device, where the long connection is a connection established with the server according to the live content request;
and the terminal equipment continues to display the live video data on the live interface and stops displaying the live additional data.
In this embodiment, the terminal device may send the data hiding instruction to the server, so that the server disconnects the long connection according to the data hiding instruction and stops sending the live additional data to the terminal device, where the long connection is a connection established with the server according to the live content request. The terminal device can therefore continue to display the live video data on the live interface; since the server has stopped sending the live additional data, the terminal device can no longer acquire the live additional data in real time and no longer displays it.
For convenience of understanding, please refer to fig. 14, where fig. 14 is a schematic flowchart illustrating data transmission in a long connection state in the embodiment of the present application. As shown in the figure, the terminal device first establishes a long connection with the server and, after the long connection is successfully established, receives the data transmitted by the server in real time. After the terminal device sends the data hiding instruction to the server, the server disconnects the long connection with the terminal device according to the data hiding instruction and no longer sends the live additional data to the terminal device, so that the terminal device can only display the live video data and stops displaying the live additional data.
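Purely as a non-limiting illustration, and assuming the long connection is carried over a WebSocket (here using the OkHttp client), the following sketch shows a terminal-side connection object that receives live additional data in real time and notifies the server when a data hiding instruction is triggered; the message string "hide_additional_data" is an assumption, and the server side is expected to disconnect the connection after receiving it.

```kotlin
import okhttp3.OkHttpClient
import okhttp3.Request
import okhttp3.WebSocket
import okhttp3.WebSocketListener

class AdditionalDataConnection(private val serverUrl: String) {
    private val client = OkHttpClient()
    private var webSocket: WebSocket? = null

    fun connect(onAdditionalData: (String) -> Unit) {
        val request = Request.Builder().url(serverUrl).build()
        webSocket = client.newWebSocket(request, object : WebSocketListener() {
            override fun onMessage(webSocket: WebSocket, text: String) {
                // Live additional data pushed in real time over the long connection.
                onAdditionalData(text)
            }
        })
    }

    // Triggered by the data hiding instruction: notify the server, which then
    // disconnects the long connection and stops pushing live additional data.
    fun sendHideInstruction() {
        webSocket?.send("hide_additional_data")
    }
}
```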
In the embodiment of the application, a method for stopping the display of the live additional data is provided. In this manner, when there is no requirement for displaying the live additional data, the long connection is disconnected to stop the transmission of the live additional data, which reduces the resource consumption of data transmission.
Optionally, on the basis of the embodiment corresponding to fig. 4, in an optional embodiment of the method for displaying live content provided in the embodiment of the present application, continuously displaying live video data on a live interface according to the data hiding instruction, and stopping displaying live additional data, may include:
responding to the data hiding instruction, and continuing to receive the live additional data through the long connection, where the long connection is a connection established with the server according to the live content request;
and hiding the live broadcast additional data on the live broadcast interface, and continuously displaying the live broadcast video data.
In this embodiment, the terminal device may continue to receive the live additional data through the long connection in response to the data hiding instruction, where the long connection is a connection established with the server according to the live content request. In other words, after the server sends the live video data and the live additional data to the terminal device, the long connection is not disconnected; instead, after the terminal device receives the live additional data, the received live additional data is hidden on the live interface while the live video data continues to be displayed. When the live additional data needs to be displayed again, there is no need to re-establish the long connection with the server, and the live additional data that is still being received is displayed directly, which gives good timeliness.
For convenience of understanding, please refer to fig. 15, where fig. 15 is another schematic flowchart illustrating data transmission in a long connection state in the embodiment of the present application. As shown in the figure, the terminal device first establishes a long connection with the server and, after the long connection is successfully established, receives the data transmitted by the server in real time. When the terminal device receives the data hiding instruction, the instruction is not forwarded to the server; therefore, the server still transmits the live additional data to the terminal device in real time, but the terminal device hides the received live additional data on the live interface and continues to display the live video data.
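A minimal sketch of this alternative, assuming the live additional data is rendered by a set of Android views whose visibility is simply toggled while the long connection keeps receiving data; the class name and view references are illustrative assumptions.

```kotlin
import android.view.View

class AdditionalDataPresenter(private val additionalDataViews: List<View>) {

    // Data hiding instruction: keep receiving the data, but stop showing it.
    fun hide() {
        additionalDataViews.forEach { it.visibility = View.GONE }
    }

    // Data display instruction: the data kept arriving, so it can be shown immediately.
    fun show() {
        additionalDataViews.forEach { it.visibility = View.VISIBLE }
    }
}
```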
In the embodiment of the application, another method for stopping the display of the live additional data is provided. In this manner, the terminal device does not disconnect the long connection with the server and keeps receiving the live additional data in real time; the live additional data is merely hidden instead of being displayed, so that when it needs to be displayed again later it can be shown directly, which improves the timeliness of the scheme.
With reference to the above description, the following description will describe a method for displaying live content in the present application, please refer to fig. 16, where fig. 16 is a schematic view of another embodiment of a method for displaying live content in an embodiment of the present application, and as shown in the drawing, another embodiment of the method for displaying live content in the embodiment of the present application includes:
201. a server receives a live broadcast content request sent by a terminal device;
in this embodiment, before the terminal device displays the live content, a long connection is established with the server, and then a live content request is sent to the server through the long connection, so that the server obtains live video data and live additional data according to the live content request, and the live additional data may include additional data associated with the live video data.
202. The server acquires live video data and live additional data according to the live content request, wherein the live additional data comprises additional data related to the live video data;
in this embodiment, the server obtains live video data and live additional data according to a live content request, where the live additional data may include additional data associated with the live video data. Specifically, the description is given by taking an application to a live broadcast scene of an MOBA game as an example, the live broadcast video data is video data generated in live broadcast of the MOBA game, and the live broadcast additional data includes game equipment information, game character information, game player information, game map information, and game state information.
203. The server sends live video data and live additional data to the terminal equipment so that the terminal equipment can display the live video data and the live additional data on a live interface;
in this embodiment, the server sends the live video data and the live additional data to the terminal device through the long connection established with the terminal device, so that the terminal device can display the live video data and the live additional data on a live interface after receiving the live video data and the live additional data.
204. If the terminal equipment identifies the data hiding instruction, the server receives the data hiding instruction sent by the terminal equipment;
in this embodiment, when the terminal device recognizes the data hiding instruction, the terminal device sends the data hiding instruction to the server, and the server may receive the data hiding instruction sent by the terminal device.
205. And the server stops sending the live broadcast additional data to the terminal equipment according to the data hiding instruction so that the terminal equipment continues to display the live broadcast video data on a live broadcast interface and stops displaying the live broadcast additional data.
In this embodiment, the server disconnects the long connection according to the data hiding instruction and stops sending the live broadcast additional data to the terminal device, so that the terminal device can continue to display the live broadcast video data on the live broadcast interface, but the live broadcast additional data cannot be acquired in real time, and therefore the terminal device stops displaying the live broadcast additional data.
Specifically, the server sends the live video data and the live additional data to the terminal device, and the terminal device displays the received live video data and live additional data on the live interface. If the user does not need to watch the live additional data, a data hiding instruction can be triggered; the terminal device sends the data hiding instruction to the server, the server stops sending the live additional data to the terminal device, and the terminal device then displays only the live video data on the live interface and stops displaying the live additional data.
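For illustration, the following server-side sketch (Kotlin with coroutines) shows one possible way the push behavior of steps 203 to 205 could be organized: video data is always sent, while additional data stops being sent once a hiding instruction arrives. The send/receive functions, the "hide" message, and the push interval are assumptions for the sketch, not the claimed implementation.

```kotlin
import kotlinx.coroutines.coroutineScope
import kotlinx.coroutines.delay
import kotlinx.coroutines.launch
import java.util.concurrent.atomic.AtomicBoolean

suspend fun serveLiveContent(
    sendVideoChunk: suspend () -> Unit,
    sendAdditionalData: suspend () -> Unit,
    receiveInstruction: suspend () -> String
) = coroutineScope {
    val additionalDataEnabled = AtomicBoolean(true)

    // Listen for instructions from the terminal device (step 204).
    launch {
        while (true) {
            if (receiveInstruction() == "hide") additionalDataEnabled.set(false)
        }
    }

    // Push loop (steps 203 and 205): video data is always sent; additional data
    // is sent only while no data hiding instruction has been received.
    while (true) {
        sendVideoChunk()
        if (additionalDataEnabled.get()) sendAdditionalData()
        delay(40L) // illustrative push interval
    }
}
```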
In the embodiment of the application, a live content display method is provided. In the live broadcasting process, the terminal device can trigger a data hiding instruction according to actual requirements, and the server stops sending the live additional data to the terminal device, so that the terminal device hides the live additional data on the live interface and displays only the live video data, thereby simplifying the live interface as much as possible and displaying a more complete live picture.
Optionally, on the basis of the embodiment corresponding to fig. 16, in an optional embodiment of the method for displaying live content provided in the embodiment of the present application, after stopping sending live additional data to the terminal device according to the data hiding instruction, the method for displaying live content further includes:
if the terminal equipment identifies the data display instruction, receiving the data display instruction sent by the terminal equipment;
establishing long connection with the terminal equipment according to the data display instruction;
receiving an additional data request sent by the terminal equipment through long connection;
determining live broadcast additional data and position information corresponding to the live broadcast additional data according to the additional data request, wherein the additional data request carries a data type identifier, and the data type identifier is used for indicating the type of the live broadcast additional data;
and sending the live broadcast additional data and the position information corresponding to the live broadcast additional data to the terminal equipment so that the terminal equipment determines a display area corresponding to a target assembly in a live broadcast interface according to the position information corresponding to the live broadcast additional data and displays the live broadcast additional data on the display area corresponding to the target assembly in the live broadcast interface, wherein the target assembly is displayed on a first layer, the live broadcast video data is displayed on a second layer, and the first layer is positioned on the second layer.
In this embodiment, after receiving the data display instruction sent by the terminal device, the server establishes a long connection with the terminal device, and then sends the live additional data and the position information corresponding to the live additional data to the terminal device through the long connection, so that the terminal device determines the display area corresponding to the target component in the live interface according to the position information corresponding to the live additional data and displays the live additional data on the display area corresponding to the target component in the live interface, where the target component is displayed on a first layer, the live video data is displayed on a second layer, and the first layer is located on the second layer.
Specifically, the additional data request may carry a data type identifier, and the data type identifier indicates the type of the live additional data. For example, the data type identifier "1" indicates that all the live additional data are displayed, and other values of the data type identifier indicate that only part of the live additional data is displayed. The specific display manner of the live additional data is described in the foregoing embodiments and is not described herein again.
Specifically, the server sends the live video data to the terminal device, and the terminal device displays the live video data. When the user needs to watch the live additional data, a data display instruction is triggered on the terminal device, so that the server establishes a long connection with the terminal device according to the data display instruction and receives, through the long connection, an additional data request sent by the terminal device J2. The server J1 determines the live additional data J4 and the position information corresponding to the live additional data J4 according to the additional data request and feeds them back to the terminal device J2. Therefore, the terminal device J2 can determine the display area corresponding to the target component in the live interface according to the position information corresponding to the live additional data J4, and display the live additional data J4 on the display area corresponding to the target component in the live interface.
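As a non-limiting sketch of the layer arrangement described above (the target component on a first layer above the live video on a second layer), an Android FrameLayout can be used, where children added later are drawn on top of earlier ones; the function and parameter names are illustrative assumptions.

```kotlin
import android.content.Context
import android.view.View
import android.view.ViewGroup
import android.widget.FrameLayout

fun buildLiveInterface(
    context: Context,
    videoView: View,          // renders the live video data (second layer)
    targetComponent: View,    // renders the live additional data (first layer)
    x: Int, y: Int, width: Int, height: Int
): FrameLayout {
    val root = FrameLayout(context)
    // Second layer: the live video fills the whole live interface.
    root.addView(videoView, FrameLayout.LayoutParams(
        ViewGroup.LayoutParams.MATCH_PARENT,
        ViewGroup.LayoutParams.MATCH_PARENT))
    // First layer: the target component sits above the video in its display area,
    // which is determined from the position information sent by the server.
    root.addView(targetComponent, FrameLayout.LayoutParams(width, height).apply {
        leftMargin = x
        topMargin = y
    })
    return root
}
```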
In this manner, when the terminal device displays the live video data, the live additional data can also be displayed on the live interface, so that a more complete live picture can be displayed.
Referring to fig. 17, fig. 17 is a schematic view of an embodiment of a terminal device in an embodiment of the present application, and as shown in the drawing, the terminal device 30 includes:
a sending module 301, configured to send a live broadcast content request to a server, so that the server obtains live broadcast video data and live broadcast additional data according to the live broadcast content request, where the live broadcast additional data includes additional data associated with the live broadcast video data;
a receiving module 302, configured to receive live video data and live additional data sent by a server;
the display module 303 is configured to display the live video data and the live additional data received by the receiving module on a live interface;
and the hiding module 304 is configured to, if the data hiding instruction is identified, continue to display the live video data on the live interface according to the data hiding instruction, and stop displaying the live additional data.
Optionally, on the basis of the embodiment corresponding to fig. 17, in another embodiment of the terminal device 30 provided in the embodiment of the present application,
the display module 303 is further configured to, after the live video data continues to be displayed on the live interface according to the data hiding instruction and the display of the live additional data is stopped, display the live additional data while displaying the live video data on the live interface if a data display instruction is identified.
Optionally, on the basis of the embodiment corresponding to fig. 17, in another embodiment of the terminal device 30 provided in the embodiment of the present application, the terminal device further includes an obtaining module 305;
the acquisition module is specifically used for responding to the data display instruction and establishing long connection with the server;
sending an additional data request to a server through long connection so that the server determines live additional data and position information corresponding to the live additional data according to the additional data request, wherein the additional data request carries a data type identifier, and the data type identifier is used for indicating the type of the live additional data;
receiving live broadcast additional data sent by a server and position information corresponding to the live broadcast additional data;
the display module 303 is specifically configured to determine a display area corresponding to a target component in a live broadcast interface according to position information corresponding to live broadcast additional data, where the target component is displayed on a first layer, live broadcast video data is displayed on a second layer, and the first layer is located on the second layer;
and displaying the live broadcast additional data on a display area corresponding to the target component in the live broadcast interface.
Optionally, on the basis of the embodiment corresponding to fig. 17, in another embodiment of the terminal device 30 provided in the embodiment of the present application,
a receiving module 302, specifically configured to receive live video data sent by a server;
receiving game equipment information and position information corresponding to the game equipment information sent by a server, wherein the game equipment information comprises text information and image information;
the display module 303 is specifically configured to determine a display area of an equipment information component in a live interface according to position information corresponding to the game equipment information;
creating an image view corresponding to the game equipment information and a text view corresponding to the game equipment information, wherein the image view is used for displaying the image information, and the text view is used for displaying the text information;
displaying an image view corresponding to the game equipment information and a text view corresponding to the game equipment information on a display area corresponding to the equipment information component in the live broadcast interface;
and displaying the live video data on a live interface.
Optionally, on the basis of the embodiment corresponding to fig. 17, in another embodiment of the terminal device 30 provided in the embodiment of the present application,
a receiving module 302, specifically configured to receive live video data sent by a server;
receiving game role information sent by a server and position information corresponding to the game role information, wherein the game role information comprises text information and image information;
the display module 303 is specifically configured to determine a display area of a role information component in a live interface according to position information corresponding to game role information;
creating an image view corresponding to the game role information and a text view corresponding to the game role information, wherein the image view is used for displaying the image information, and the text view is used for displaying the text information;
displaying an image view corresponding to the game role information and a text view corresponding to the game role information on a display area corresponding to the role information component in the live broadcast interface;
and displaying the live video data on a live interface.
Optionally, on the basis of the embodiment corresponding to fig. 17, in another embodiment of the terminal device 30 provided in the embodiment of the present application,
a receiving module 302, specifically configured to receive live video data sent by a server;
receiving game player information and position information corresponding to the game player information sent by a server, wherein the game player information comprises text information and video information;
the display module 303 is specifically configured to determine a display area of an image information component in a live interface according to position information corresponding to the information of the game player;
creating a text view corresponding to the game player information and a video view corresponding to the game player information, wherein the text view is used for displaying the text information, and the video view is used for displaying the video information;
displaying a text view corresponding to the information of the game player and a video view corresponding to the information of the game player on a display area corresponding to the image information component in the live broadcast interface;
and displaying the live video data on a live interface.
Optionally, on the basis of the embodiment corresponding to fig. 17, in another embodiment of the terminal device 30 provided in the embodiment of the present application,
a receiving module 302, specifically configured to receive live video data sent by a server;
receiving game map information and position information corresponding to the game map information sent by a server, wherein the game map information comprises image information;
the display module 303 is specifically configured to determine a display area of a map information component in the live broadcast interface according to the position information corresponding to the game map information;
creating an image view corresponding to the game map information, wherein the image view is used for displaying the image information;
displaying an image view corresponding to the game map information on a display area corresponding to a map information component in a live broadcast interface;
and displaying the live video data on a live interface.
Optionally, on the basis of the embodiment corresponding to fig. 17, in another embodiment of the terminal device 30 provided in the embodiment of the present application,
a receiving module 302, specifically configured to receive live video data sent by a server;
receiving game state information sent by a server and position information corresponding to the game state information, wherein the game state information comprises real-time information;
the display module 303 is specifically configured to determine a display area of a state information component in the live interface according to the position information corresponding to the game state information;
creating a customized view corresponding to the game state information, wherein the customized view is used for displaying the real-time information in an animation effect;
displaying a customized view corresponding to the game state information on a display area corresponding to the state information component in the live broadcast interface;
and displaying the live video data on a live interface.
Optionally, on the basis of the embodiment corresponding to fig. 17, in another embodiment of the terminal device 30 provided in the embodiment of the present application,
the hiding module 304 is specifically configured to send a data hiding instruction to the server, so that the server disconnects the long connection according to the data hiding instruction, and stops sending the live broadcast additional data to the terminal device, where the long connection is a connection established with the server according to a live broadcast content request;
and continuously displaying the live video data on the live interface, and stopping displaying the live additional data.
Optionally, on the basis of the embodiment corresponding to fig. 17, in another embodiment of the terminal device 30 provided in the embodiment of the present application,
a hiding module 304, specifically configured to respond to a data hiding instruction, and continue to receive live broadcast additional data through a long connection, where the long connection is a connection established with a server according to a live broadcast content request;
and hiding the live broadcast additional data on the live broadcast interface, and continuously displaying the live broadcast video data.
Referring to fig. 18, fig. 18 is a schematic view of an embodiment of a live content presentation apparatus according to an embodiment of the present application, and as shown in the drawing, the live content presentation apparatus 40 includes:
a receiving module 401, configured to receive a live content request sent by a terminal device;
an obtaining module 402, configured to obtain live broadcast video data and live broadcast additional data according to the live broadcast content request received by the receiving module, where the live broadcast additional data includes additional data associated with the live broadcast video data;
a sending module 403, configured to send the live video data and the live additional data acquired by the obtaining module to a terminal device, so that the terminal device displays the live video data and the live additional data on a live interface;
the receiving module 401 is further configured to receive a data hiding instruction sent by the terminal device if the terminal device identifies the data hiding instruction;
and the display module 404 is configured to stop sending the live broadcast additional data to the terminal device according to the data hiding instruction received by the receiving module, so that the terminal device continues to display the live broadcast video data on a live broadcast interface and stops displaying the live broadcast additional data.
Optionally, on the basis of the embodiment corresponding to fig. 18, in another embodiment of the live content presentation apparatus 40 provided in this embodiment of the present application, after the display module stops sending live additional data to the terminal device according to the data hiding instruction, the live content presentation apparatus 40 further includes an establishing module 405 and a determining module 406,
the receiving module 401 is further configured to receive a data display instruction sent by the terminal device if the terminal device identifies the data display instruction;
an establishing module 405, configured to establish a long connection with the terminal device according to the data display instruction;
the receiving module 401 is further configured to receive, through the long connection, an additional data request sent by the terminal device;
a determining module 406, configured to determine, according to the additional data request, live additional data and location information corresponding to the live additional data, where the additional data request carries a data type identifier, and the data type identifier is used to indicate the type of the live additional data;
the sending module 403 is further configured to send the live additional data and the location information corresponding to the live additional data to the terminal device, so that the terminal device determines a display area corresponding to a target component in a live interface according to the location information corresponding to the live additional data, and displays the live additional data on the display area corresponding to the target component in the live interface, where the target component is displayed on a first layer, the live video data is displayed on a second layer, and the first layer is located on the second layer.
Next, an embodiment of the present application further provides a terminal device, configured to execute the steps executed by the terminal device in the embodiment corresponding to fig. 4. As shown in fig. 19, for convenience of explanation, only the portions related to the embodiments of the present application are shown; for specific technical details that are not disclosed, please refer to the method portion of the embodiments of the present application. A terminal device in the form of a mobile phone is taken as an example for explanation:
fig. 19 is a block diagram illustrating a partial structure of a mobile phone related to a terminal provided in an embodiment of the present application. Referring to fig. 19, the mobile phone includes: a Radio Frequency (RF) circuit 510, a memory 520, an input unit 530, a display unit 540, a sensor 550, an audio circuit 560, a wireless fidelity (WiFi) module 570, a processor 580, and a power supply 590. Those skilled in the art will appreciate that the mobile phone structure shown in fig. 19 is not intended to be limiting, and the mobile phone may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components may be used.
The following describes each component of the mobile phone in detail with reference to fig. 19:
The RF circuit 510 may be used for receiving and transmitting signals during information transmission and reception or during a call; in particular, after receiving downlink information of a base station, the RF circuit 510 delivers the downlink information to the processor 580 for processing, and in addition, transmits uplink data to the base station. In general, the RF circuit 510 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a Low Noise Amplifier (LNA), a duplexer, and the like. In addition, the RF circuit 510 may also communicate with networks and other devices via wireless communication. The wireless communication may use any communication standard or protocol, including but not limited to Global System for Mobile communication (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), Long Term Evolution (LTE), email, Short Messaging Service (SMS), and the like.
The memory 520 may be used to store software programs and modules, and the processor 580 executes various functional applications and data processing of the mobile phone by running the software programs and modules stored in the memory 520. The memory 520 may mainly include a program storage area and a data storage area, where the program storage area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, etc.), and the like; the data storage area may store data (such as audio data, a phonebook, etc.) created according to the use of the mobile phone, and the like. Further, the memory 520 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or other non-volatile solid-state storage devices.
The input unit 530 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the mobile phone. Specifically, the input unit 530 may include a touch panel 531 and other input devices 532. The touch panel 531, also called a touch screen, can collect touch operations of a user on or near the touch panel 531 (for example, operations of the user on or near the touch panel 531 by using any suitable object or accessory such as a finger or a stylus), and drive the corresponding connection device according to a preset program. Optionally, the touch panel 531 may include two parts, a touch detection device and a touch controller. The touch detection device detects the touch direction of the user, detects a signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives touch information from the touch detection device, converts the touch information into touch point coordinates, sends the touch point coordinates to the processor 580, and can receive and execute commands sent by the processor 580. In addition, the touch panel 531 may be implemented in various types such as resistive, capacitive, infrared, and surface acoustic wave. The input unit 530 may include other input devices 532 in addition to the touch panel 531. In particular, the other input devices 532 may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control keys, switch keys, etc.), a trackball, a mouse, a joystick, and the like.
The display unit 540 may be used to display information input by the user or information provided to the user and various menus of the mobile phone. The display unit 540 may include a display panel 541, and optionally, the display panel 541 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like. Further, the touch panel 531 may cover the display panel 541, and when the touch panel 531 detects a touch operation on or near the touch panel 531, the touch operation is transmitted to the processor 580 to determine the type of the touch event, and then the processor 580 provides a corresponding visual output on the display panel 541 according to the type of the touch event. Although the touch panel 531 and the display panel 541 are shown as two separate components in fig. 19 to implement the input and output functions of the mobile phone, in some embodiments, the touch panel 531 and the display panel 541 may be integrated to implement the input and output functions of the mobile phone.
The handset may also include at least one sensor 550, such as a light sensor, motion sensor, and other sensors. Specifically, the light sensor may include an ambient light sensor and a proximity sensor, wherein the ambient light sensor may adjust the brightness of the display panel 541 according to the brightness of ambient light, and the proximity sensor may turn off the display panel 541 and/or the backlight when the mobile phone is moved to the ear. As one of the motion sensors, the accelerometer sensor can detect the magnitude of acceleration in each direction (generally, three axes), can detect the magnitude and direction of gravity when stationary, and can be used for applications of recognizing the posture of a mobile phone (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), vibration recognition related functions (such as pedometer and tapping), and the like; as for other sensors such as a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor, which can be configured on the mobile phone, further description is omitted here.
The audio circuit 560, the speaker 561, and the microphone 562 may provide an audio interface between the user and the mobile phone. The audio circuit 560 may transmit the electrical signal converted from the received audio data to the speaker 561, and the speaker 561 converts the electrical signal into a sound signal for output; on the other hand, the microphone 562 converts the collected sound signals into electrical signals, which are received by the audio circuit 560 and converted into audio data; the audio data are then output to the processor 580 for processing and sent through the RF circuit 510 to, for example, another mobile phone, or output to the memory 520 for further processing.
WiFi belongs to short distance wireless transmission technology, and the mobile phone can help the user to send and receive e-mail, browse web pages, access streaming media, etc. through the WiFi module 570, which provides wireless broadband internet access for the user. Although fig. 19 shows the WiFi module 570, it is understood that it does not belong to the essential components of the handset.
The processor 580 is a control center of the mobile phone, connects various parts of the entire mobile phone by using various interfaces and lines, and performs various functions of the mobile phone and processes data by operating or executing software programs and/or modules stored in the memory 520 and calling data stored in the memory 520, thereby performing overall monitoring of the mobile phone. Optionally, the processor 580 may include one or more processing units; preferably, the processor 580 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor 580.
The handset also includes a power supply 590 (e.g., a battery) for powering the various components, which may preferably be logically coupled to the processor 580 via a power management system, such that the power management system may be used to manage charging, discharging, and power consumption.
Although not shown, the mobile phone may further include a camera, a bluetooth module, etc., which are not described herein.
In the embodiment of the present application, the terminal includes a processor 580 for executing the embodiments corresponding to fig. 4.
An embodiment of the present application further provides another live content display apparatus, where the live content display apparatus may be deployed on a computer device, such as a server or a terminal device; in the present application, deployment on the server is taken as an example. Please refer to fig. 20, where fig. 20 is a schematic structural diagram of the server in the embodiment of the present application. As shown in the figure, the server 600 may vary considerably due to different configurations or performances, and may include one or more Central Processing Units (CPUs) 622 (e.g., one or more processors), a memory 632, and one or more storage media 630 (e.g., one or more mass storage devices) for storing an application program 642 or data 644. The memory 632 and the storage medium 630 may be transient or persistent storage. The program stored in the storage medium 630 may include one or more modules (not shown), and each module may include a series of instruction operations for the server. Still further, the central processing unit 622 may be configured to communicate with the storage medium 630 and execute, on the server 600, the series of instruction operations in the storage medium 630.
The server 600 may also include one or more power supplies 626, one or more wired or wireless network interfaces 650, one or more input/output interfaces 658, and/or one or more operating systems 641, such as Windows Server™, Mac OS X™, Unix™, Linux™, FreeBSD™, and so on.
The steps performed by the server in the above embodiment may be based on the server structure shown in fig. 20.
In the embodiment of the present application, the server includes a CPU 622 for executing the respective embodiments corresponding to fig. 16.
Embodiments of the present application also provide a computer-readable storage medium, in which a computer program is stored, and when the computer program runs on a computer, the computer is caused to execute the steps of the foregoing embodiments.
Embodiments of the present application also provide a computer program product comprising a program, which when run on a computer causes the computer to perform the steps of the various embodiments as described above.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other manners. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application, in essence, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present application. And the aforementioned storage medium includes: various media capable of storing program codes, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
The above embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions in the embodiments of the present application.