CN113318442B - Live broadcast interface display method, data uploading method and data issuing method - Google Patents


Info

Publication number
CN113318442B
CN113318442B
Authority
CN
China
Prior art keywords
live
virtual scene
virtual
interface
live broadcast
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110586123.8A
Other languages
Chinese (zh)
Other versions
CN113318442A (en)
Inventor
周林军
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangzhou Fanxing Huyu IT Co Ltd
Original Assignee
Guangzhou Fanxing Huyu IT Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangzhou Fanxing Huyu IT Co Ltd
Priority to CN202110586123.8A
Publication of CN113318442A
Application granted
Publication of CN113318442B
Legal status: Active (Current)
Anticipated expiration

Abstract

The embodiments of this application disclose a live broadcast interface display method, a data uploading method, and a data issuing method, belonging to the field of computer technology. The method includes: displaying a virtual scene in a live interface of a live room, where the live room is used to live broadcast a first process in which the virtual scene changes with the anchor's operations; acquiring a view switching operation triggered via the live interface; and, in response to the view switching operation, updating the virtual scene displayed in the live interface. In this way, when a user watches a live broadcast in the live room, the user can trigger a view switching operation to make the live interface display the virtual scene from a view the user specifies, which improves the flexibility of live broadcasting.

Description

Live broadcast interface display method, data uploading method and data issuing method
Technical Field
The embodiments of this application relate to the field of computer technology, and in particular to a live broadcast interface display method, a data uploading method, and a data issuing method.
Background
With the continuous development of computer technology, sharing information has become increasingly convenient. Live broadcasting is currently a popular way to share information: an anchor presents the information they want to convey in their own live room, and users enter the live rooms they are interested in to watch what the anchor presents.
For example, while a game anchor plays in the live client, the client records the terminal screen, and the recorded video stream is distributed by the live server to the viewer clients in the live room, so that viewers can watch the anchor's gameplay.
Disclosure of Invention
The embodiments of this application provide a live broadcast interface display method, a data uploading method, and a data issuing method, which improve the flexibility of live broadcasting. The technical solutions are as follows:
In one aspect, a live interface display method is provided. The method includes:
displaying a virtual scene in a live interface of a live room, where the live room is used to live broadcast a first process in which the virtual scene changes with the anchor's operations;
acquiring a view switching operation triggered via the live interface; and
in response to the view switching operation, updating the virtual scene displayed in the live interface.
In one aspect, a live interface display method is provided. The method includes:
in response to an access operation on a live room, acquiring state data of the virtual scene corresponding to the live room, where the state data indicates the states of virtual objects in the virtual scene, and the live room is used to live broadcast a first process in which the virtual scene changes with the anchor's operations;
displaying the virtual scene according to the state data in a live interface of the live room;
acquiring instructions executed in the state indicated by the state data; and
executing the acquired instructions on the virtual scene in the live interface.
In one aspect, a data uploading method is provided. The method includes:
uploading state data of the virtual scene corresponding to a live room to a live server, where the state data indicates the states of virtual objects in the virtual scene, and the live room is used to live broadcast a first process in which the virtual scene changes with the anchor's operations;
acquiring instructions executed by the application program corresponding to the virtual scene; and
uploading the acquired instructions to the live server.
In one aspect, a data issuing method is provided. The method includes:
acquiring instructions executed on a virtual scene; and
sending the acquired instructions to the viewer clients in a live room, where the live room is used to live broadcast a first process in which the virtual scene changes with the anchor's operations.
Each viewer client executes the acquired instructions against its local state data of the virtual scene, so as to update that local state data.
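The aspects above describe a state-replication scheme: rather than receiving video, the viewer client receives the state data of the virtual scene and then a stream of instructions, and replays each instruction against its local copy. A minimal sketch of the viewer side, assuming an illustrative state and instruction format (the class name `ViewerScene` and all field names are assumptions, not from the patent):

```python
import copy

class ViewerScene:
    """Local replica of the anchor's virtual scene, kept in sync by replaying instructions."""

    def __init__(self, state_data):
        # State data received on entering the live room, e.g. the positions and
        # attributes of each virtual object in the virtual scene.
        self.state = copy.deepcopy(state_data)

    def execute(self, instruction):
        # Each instruction names a virtual object and describes how its state changes.
        obj = self.state[instruction["object"]]
        if instruction["op"] == "move":
            obj["x"] += instruction["dx"]
            obj["y"] += instruction["dy"]
        elif instruction["op"] == "damage":
            obj["hp"] -= instruction["amount"]

# A viewer enters the live room: state data first, then the instruction stream.
scene = ViewerScene({"hero": {"x": 10, "y": 20, "hp": 100}})
for ins in [{"object": "hero", "op": "move", "dx": 5, "dy": 0},
            {"object": "hero", "op": "damage", "amount": 30}]:
    scene.execute(ins)

print(scene.state["hero"])  # {'x': 15, 'y': 20, 'hp': 70}
```

Because the replica holds the full scene state rather than a rendered frame, the client is free to render it from any view the user chooses.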
In one aspect, a live interface display device is provided. The device includes:
a display module, configured to display a virtual scene in a live interface of a live room, where the live room is used to live broadcast a first process in which the virtual scene changes with the anchor's operations; and
an acquisition module, configured to acquire a view switching operation triggered via the live interface.
The display module is further configured to update, in response to the view switching operation, the virtual scene displayed in the live interface.
In one aspect, a live interface display device is provided. The device includes:
a display module, configured to display, during a live broadcast of a live room, the virtual scene at the current time in a live interface of the live room, where the live room is used to live broadcast a first process in which the virtual scene changes with the anchor's operations; and
a determining module, configured to determine a target time selected by a playback operation triggered via the live interface, where the playback operation indicates playback from the target time.
The display module is further configured to display, in the live interface, the virtual scene at the target time.
In one aspect, a live interface display device is provided. The device includes:
a data acquisition module, configured to acquire, in response to an access operation on a live room, state data of the virtual scene corresponding to the live room, where the state data indicates the states of virtual objects in the virtual scene, and the live room is used to live broadcast a first process in which the virtual scene changes with the anchor's operations;
a display module, configured to display the virtual scene according to the state data in a live interface of the live room;
an instruction acquisition module, configured to acquire instructions executed in the state indicated by the state data; and
an instruction execution module, configured to execute the acquired instructions on the virtual scene in the live interface.
In one aspect, a data uploading device is provided. The device includes:
an uploading module, configured to upload state data of the virtual scene corresponding to a live room to a live server, where the state data indicates the states of virtual objects in the virtual scene, and the live room is used to live broadcast a first process in which the virtual scene changes with the anchor's operations; and
an acquisition module, configured to acquire instructions executed by the application program corresponding to the virtual scene.
The uploading module is further configured to upload the acquired instructions to the live server.
In one aspect, a data issuing device is provided. The device includes:
an instruction acquisition module, configured to acquire instructions executed on a virtual scene; and
a sending module, configured to send the acquired instructions to the viewer clients in a live room, where the live room is used to live broadcast a first process in which the virtual scene changes with the anchor's operations.
Each viewer client executes the acquired instructions against its local state data of the virtual scene, so as to update that local state data.
In another aspect, a computer device is provided, including a processor and a memory, where the memory stores at least one piece of program code that is loaded and executed by the processor to implement the operations performed in the live interface display method of the above aspects, the data uploading method of the above aspects, or the data issuing method of the above aspects.
In another aspect, a computer-readable storage medium is provided, storing at least one piece of program code that is loaded and executed by a processor to implement the operations performed in the live interface display method, the data uploading method, or the data issuing method of the above aspects.
In yet another aspect, a computer program is provided, including at least one piece of program code that is loaded and executed by a processor to implement the operations performed in the live interface display method, the data uploading method, or the data issuing method of the above aspects.
With the live interface display method provided by the embodiments of this application, the display content of the live interface is updated in response to a view switching operation. Therefore, when a user watches a live broadcast in a live room, the user can trigger a view switching operation to make the live interface display the virtual scene from a view the user specifies, which improves the flexibility of live broadcasting.
With the live interface display method provided by the embodiments of this application, a playback operation can be performed while the live room is still broadcasting, and playback starts from the target time selected by that operation. Therefore, when a user watches a live room that is currently live, the user can use a playback operation to make the live interface display the virtual scene at a moment the user specifies, which improves the flexibility of live broadcasting.
With the live interface display method provided by the embodiments of this application, the process in which the virtual scene in the anchor client changes with the anchor's operations can be restored from the acquired state data and instructions. This reduces the amount of data that must be acquired during the live broadcast and keeps the live room fluent.
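The data-volume claim can be made concrete with illustrative numbers. All figures below are assumptions chosen for the comparison, not measurements from the patent:

```python
# Assumed figures, for illustration only.
video_bitrate_bps = 2_000_000   # a typical live video stream, on the order of 2 Mbps
instr_size_bytes = 40           # one serialized instruction, e.g. a small message
instr_rate_per_s = 30           # anchor operations forwarded per second

# Bandwidth needed by the instruction stream instead of the video stream.
instr_bitrate_bps = instr_size_bytes * 8 * instr_rate_per_s
print(instr_bitrate_bps)                      # 9600 bps
print(video_bitrate_bps / instr_bitrate_bps)  # the instruction stream is ~200x smaller
```

Even with generous assumptions about instruction size and rate, sending state data plus instructions is orders of magnitude cheaper than streaming recorded video.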
Drawings
To explain the technical solutions in the embodiments of this application more clearly, the drawings needed to describe the embodiments are briefly introduced below. Obviously, the drawings described below show only some embodiments of this application, and a person skilled in the art can derive other drawings from them without inventive effort.
FIG. 1 is a schematic diagram of an implementation environment provided by an embodiment of this application;
FIG. 2 is a flowchart of a live interface display method provided by an embodiment of this application;
FIG. 3 is a flowchart of another live interface display method provided by an embodiment of this application;
FIG. 4 is a schematic diagram of a live interface provided by an embodiment of this application;
FIG. 5 is a flowchart of sliding a live interface provided by an embodiment of this application;
FIG. 6 is a flowchart of switching viewing angles provided by an embodiment of this application;
FIG. 7 is a flowchart of live playback provided by an embodiment of this application;
FIG. 8 is a flowchart of a live interface display method provided by an embodiment of this application;
FIG. 9 is a flowchart of another live interface display method provided by an embodiment of this application;
FIG. 10 is a flowchart of a live interface display method provided by an embodiment of this application;
FIG. 11 is a flowchart of a data uploading method provided by an embodiment of this application;
FIG. 12 is a flowchart of a data issuing method provided by an embodiment of this application;
FIG. 13 is a flowchart of a live broadcast method provided by an embodiment of this application;
FIG. 14 is a schematic structural diagram of a live interface display device provided by an embodiment of this application;
FIG. 15 is a schematic structural diagram of another live interface display device provided by an embodiment of this application;
FIG. 16 is a schematic structural diagram of a live interface display device provided by an embodiment of this application;
FIG. 17 is a schematic structural diagram of a live interface display device provided by an embodiment of this application;
FIG. 18 is a schematic structural diagram of another live interface display device provided by an embodiment of this application;
FIG. 19 is a schematic structural diagram of a data uploading device provided by an embodiment of this application;
FIG. 20 is a schematic structural diagram of another data uploading device provided by an embodiment of this application;
FIG. 21 is a schematic structural diagram of a data issuing device provided by an embodiment of this application;
FIG. 22 is a schematic structural diagram of a terminal provided by an embodiment of this application;
FIG. 23 is a schematic structural diagram of a server provided by an embodiment of this application.
Detailed Description
To make the objects, technical solutions, and advantages of the embodiments of this application clearer, the embodiments are described in detail below with reference to the accompanying drawings.
It should be understood that the terms "first," "second," "third," "fourth," "fifth," "sixth," and the like used herein may describe various concepts, but the concepts are not limited by these terms unless otherwise specified. These terms are only used to distinguish one concept from another. For example, a first time period may be referred to as a second time period, and a second time period as a first time period, without departing from the scope of this application.
As used herein, "at least one" includes one, two, or more; "a plurality" includes two or more; "each" refers to every one of a corresponding plurality; and "any" refers to any one of the plurality. For example, if a plurality of virtual objects includes 3 virtual objects, "each" refers to every one of those 3 virtual objects, and "any" refers to any one of them: the first, the second, or the third.
The virtual scene may be a two-dimensional (2D) scene, a three-dimensional (3D) scene, or the like.
In one possible implementation, the virtual scene includes one or more virtual environments, for example at least one of a sea, a mountain, a bustling city, a glacier, a grassland, and the like. In one possible implementation, the virtual scene includes at least one virtual character, including a virtual character controlled through the anchor's account; through this account, the anchor can make the virtual character move, attack, or perform any other operation in the virtual scene. Optionally, the virtual character has a character level, experience points, equipment, a warehouse, virtual coins, and the like. Optionally, the virtual scene includes a plurality of virtual characters, including virtual characters controlled through the anchor's account and virtual characters controlled through other accounts. Optionally, the plurality of virtual characters further includes NPCs (Non-Player Characters). A player character (a virtual character controlled through an account) can obtain information and services from an NPC; for example, an NPC in the virtual scene may answer queries, give directions, or transfer items. A virtual character may be any kind of character, for example a human character or an animal character.
Optionally, the virtual scene further includes virtual items, where a virtual item is any item such as a weapon prop, clothing, or food.
It should be noted that the embodiments of this application only illustrate the virtual scene by example and do not limit it.
The live interface display methods and the data uploading method provided by the embodiments of this application are executed by a terminal, such as a mobile phone, a tablet computer, or a computer. The data issuing method provided by the embodiments of this application is executed by a server, which may be a single server, a server cluster composed of several servers, or a cloud computing service center.
FIG. 1 is a schematic diagram of an implementation environment provided by an embodiment of this application. Referring to FIG. 1, the implementation environment includes a first terminal 101, at least one second terminal 102 (3 are illustrated in FIG. 1), and a server 103. The first terminal 101 is connected to the server 103 through a wireless or wired network, and each second terminal 102 is connected to the server 103 through a wireless or wired network.
A target application served by the server 103 is installed on the first terminal 101 and on each second terminal 102; through this application, the first terminal 101 and the second terminals 102 can implement functions such as data transmission and message interaction. Optionally, the first terminal 101 and the second terminals 102 are computers, mobile phones, tablet computers, or other terminals. Optionally, the server 103 is a backend server of the target application or a cloud server providing services such as cloud computing and cloud storage.
Optionally, the target application is a target application in the terminal's operating system or a target application provided by a third party. For example, the target application is a content sharing application with live broadcasting and live viewing functions; of course, the target application can also have other functions, such as a comment function, a game function, or a shopping function.
Optionally, the first terminal 101 runs an anchor client and each second terminal 102 runs a viewer client. The first terminal 101 uploads the current state data of the virtual scene to the server 103 and then continues to upload the instructions executed by the application program corresponding to the virtual scene. When any second terminal 102 accesses the anchor's live room, the server 103 sends the state data and the instructions uploaded by the first terminal 101 to that second terminal 102, so that the second terminal restores the virtual scene displayed in the anchor client.
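The server's role in this environment can be sketched as keeping the latest state snapshot plus the instructions received after it, and handing both to any viewer who joins mid-broadcast. A minimal illustrative sketch; the names `LiveServer`, `upload_state`, and `join`, and the use of plain lists as viewer inboxes, are assumptions rather than the patent's implementation:

```python
class LiveServer:
    """Relays one live room's state data and instruction stream to viewer clients."""

    def __init__(self):
        self.snapshot = None    # latest state data uploaded by the anchor client
        self.instructions = []  # instructions executed since that snapshot
        self.viewers = []       # connected viewer clients (here: lists of pending messages)

    def upload_state(self, state_data):
        # The anchor client uploads a fresh snapshot; older instructions are superseded.
        self.snapshot = state_data
        self.instructions = []

    def upload_instruction(self, instruction):
        # Keep each new instruction for late joiners and forward it to every viewer.
        self.instructions.append(instruction)
        for inbox in self.viewers:
            inbox.append(instruction)

    def join(self):
        # A late joiner receives the snapshot plus all instructions since it,
        # then an inbox for instructions that arrive afterwards.
        inbox = []
        self.viewers.append(inbox)
        return self.snapshot, list(self.instructions), inbox

server = LiveServer()
server.upload_state({"hero": {"x": 0}})
server.upload_instruction({"op": "move", "dx": 3})
snapshot, backlog, inbox = server.join()            # a viewer joins mid-broadcast
server.upload_instruction({"op": "move", "dx": 2})  # delivered live via the inbox
print(snapshot, backlog, inbox)
```

The viewer applies the snapshot, replays the backlog, then keeps draining the inbox, so its scene converges to the anchor's regardless of when it joined.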
The methods provided by the embodiments of this application can be applied to live broadcast scenarios, for example to live game broadcasting.
When a client broadcasts using the methods provided by the embodiments of this application, the state data of the virtual scene, together with the instructions executed on the virtual scene in the state indicated by that state data, are sent to the viewer clients, so each viewer client can restore the complete virtual scene. By continuously updating the virtual scene through executing the instructions, the viewer client keeps its virtual scene consistent with the anchor client's virtual scene. And because the viewer client can restore the complete virtual scene, it can display the scene as observed from any view; that is, the user can make the live interface display the virtual scene from a specified view, which improves the flexibility of live broadcasting.
It should be noted that the embodiments of this application take live game broadcasting only as an example and do not limit the live content.
FIG. 2 is a flowchart of a live interface display method provided by an embodiment of this application. The embodiment is described taking a terminal as the execution body by way of example. Referring to FIG. 2, the method includes:
201. The terminal displays a virtual scene in a live interface of a live room.
The live room is used to live broadcast a first process in which the virtual scene changes with the anchor's operations. In some cases, the virtual scene includes a virtual character that the anchor controls through the anchor's account, and an anchor operation is an operation that controls this virtual character: for example, an operation that moves the virtual character forward, or an operation that makes the virtual character cast a skill. Since the virtual scene includes the virtual character controlled by the anchor, controlling the virtual character according to an anchor operation may change the character's state and thereby change the virtual scene; therefore, the virtual scene changes with the anchor's operations.
The terminal is a terminal on which a live application is installed, and it runs a viewer client of the live room; by accessing the live room, the terminal can enter the live room and watch the picture displayed in the live interface of the live room.
If the terminal has just entered the live room, displaying the virtual scene in the live interface of the live room includes: in response to the access operation on the live room, displaying the live interface of the live room, and displaying the virtual scene in the live interface. After the terminal enters the live room, the live interface is displayed throughout, but the virtual scene presented in it is not constant: it changes with the anchor's operations or with operations triggered by the terminal's user.
In addition, the live room is also used to live broadcast a second process in which the virtual scene changes with viewers' view switching operations. It should be noted that different viewer clients may perform different view switching operations, so the second process presented by the live room may differ between viewer clients.
202. The terminal acquires a view switching operation triggered via the live interface.
When displaying the virtual scene, the live interface may display the global area of the virtual scene or only a partial area of it. For example, the global area of the virtual scene is 500 meters by 900 meters, while the area displayed in the live interface is the 200-meter-by-200-meter area centered on the virtual character controlled by the anchor.
A view switching operation triggered via the live interface is an operation, triggered on the live interface, for switching the view. The virtual scene differs between views, so after the view is switched the live interface presents a different virtual scene. For example, the virtual scene includes a first virtual character and a second virtual character; the live interface displays the virtual scene observed from the first virtual character's perspective, and after the view switching operation it displays the virtual scene observed from the second virtual character's perspective.
Optionally, acquiring the view switching operation triggered via the live interface means that the terminal detects a view switching operation on the live interface. Optionally, it means that the terminal acquires the operation information of the view switching operation. Optionally, the operation information includes the operation type of the view switching operation, such as a click, a double click, a long press, or a slide; optionally, the operation information includes the trigger position of the view switching operation, and the like. The embodiments of this application do not limit how the terminal acquires the view switching operation.
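The operation information described here, an operation type plus a trigger position, is enough to route a gesture to a kind of view switch. A hedged sketch of such a dispatch; the event fields, the hit-test radius, and the policy of mapping a double click on a character to an observation-target switch are assumptions for illustration, not from the patent:

```python
def handle_view_switch(op_info, characters):
    """Map a gesture on the live interface to a view switching action.

    op_info: {"type": ..., "position": (x, y)} -- illustrative operation information.
    characters: {name: (x, y)} -- screen positions of virtual characters.
    """
    if op_info["type"] == "double_click":
        # Example policy: a double click near a character switches the view to it.
        px, py = op_info["position"]
        for name, (cx, cy) in characters.items():
            if abs(px - cx) < 20 and abs(py - cy) < 20:
                return ("observe", name)
    if op_info["type"] == "slide":
        # A slide pans the displayed area instead of changing the observation target.
        return ("pan", op_info["position"])
    return ("ignore", None)

chars = {"first_character": (100, 100), "second_character": (300, 200)}
print(handle_view_switch({"type": "double_click", "position": (305, 195)}, chars))
# ('observe', 'second_character')
```

Whatever the concrete policy, only this small operation record needs to be interpreted locally; the full scene state is already on the viewer client, so any resulting view can be rendered without asking the server.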
203. The terminal updates, in response to the view switching operation, the virtual scene displayed in the live interface.
After the view switching operation, the live interface displays the virtual scene in the switched view.
With the live interface display method provided by the embodiments of this application, the display content of the live interface is updated in response to a view switching operation. Therefore, when a user watches a live broadcast in a live room, the user can trigger a view switching operation to make the live interface display the virtual scene from a view the user specifies, which improves the flexibility of live broadcasting.
The view switching operation may be any operation, for example a click or a slide. It may be an operation for replacing the observation target, switching the viewing angle, or the like; the embodiments of this application do not limit this. The embodiment shown in FIG. 3 illustrates, by way of example, how a view switching operation updates the virtual scene displayed in the live interface.
FIG. 3 is a flowchart of a live interface display method provided by an embodiment of this application. Referring to FIG. 3, the embodiment is described taking a terminal as the execution body by way of example. The method includes:
301. And the terminal displays the virtual scene in a live interface of the live broadcasting room.
The live broadcasting room is used for broadcasting a first change process of the virtual scene along with the operation of the host broadcasting, as shown in fig. 4, and fig. 4 shows a live broadcasting interface of the live broadcasting room. In some embodiments, the virtual scene includes a virtual character controlled by a host based on a host account, and the host is operated to control the virtual character, for example, the host is operated to control the virtual character to move forward; as another example, the anchor operation is an operation of controlling the skill of the character's transmission. Since the virtual character controlled by the anchor is included in the virtual scene, the state of the virtual character may be changed after the virtual character is controlled according to the anchor operation, resulting in a change in the virtual scene, and thus, the virtual scene may be changed according to the anchor operation.
The terminal is a terminal provided with a live broadcast application program, and the terminal is a client of a viewer in the live broadcast room, and the terminal can enter the live broadcast room to watch a picture displayed in a live broadcast interface of the live broadcast room by accessing the live broadcast room.
If the terminal is a terminal that has just entered the live broadcast room, the terminal displays a virtual scene in a live broadcast interface of the live broadcast room, including: and responding to the access operation of the live broadcasting room, displaying a live broadcasting interface of the live broadcasting room, and displaying a virtual scene in the live broadcasting interface. After the terminal enters the live broadcast room, the live broadcast interface is always displayed, but the virtual scene presented in the live broadcast interface is not constant and can be changed along with the operation of a host broadcast or the operation triggered by the terminal user.
The terminal may display the virtual scene in the live interface of the live broadcasting room, which may be the global or local of the virtual scene. In one possible implementation, when a user enters a live room, a virtual environment screen targeted for viewing by a virtual character controlled by the anchor is displayed.
The virtual environment screen taking the virtual role controlled by the host as an observation target is displayed in a live broadcast interface. Optionally, a virtual camera is set in the virtual scene, an observation target of the virtual camera is a virtual role controlled by a host, and the live broadcast interface displays a virtual environment picture acquired by the virtual camera.
In another possible implementation, when the user enters the live room, what is displayed is a virtual environment picture observed from the first-person perspective of the virtual character controlled by the anchor. In another possible implementation, when the user enters the live room, the whole virtual scene is displayed.
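The observation-target mechanism above can be sketched as a short example. All names here (`VirtualCamera`, `visible_region`, the 200 m view size) are hypothetical illustrations, not part of the claimed implementation; the sketch only shows how a camera centered on an observation target determines the region displayed in the live interface.

```python
from dataclasses import dataclass

@dataclass
class VirtualCamera:
    """Hypothetical camera whose observation target decides the displayed region."""
    target_x: float  # position of the observation target (e.g. the anchor's character)
    target_y: float
    view_w: float    # size of the region captured around the target
    view_h: float

    def visible_region(self):
        """(left, top, right, bottom) of the region shown in the live interface."""
        return (self.target_x - self.view_w / 2,
                self.target_y - self.view_h / 2,
                self.target_x + self.view_w / 2,
                self.target_y + self.view_h / 2)

# A camera following the anchor-controlled character at (250, 450),
# capturing a 200 m x 200 m region of the virtual scene.
cam = VirtualCamera(250.0, 450.0, 200.0, 200.0)
print(cam.visible_region())  # (150.0, 350.0, 350.0, 550.0)
```

Switching the field of view then amounts to changing the camera's target, which is what the operations described in the following steps do.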
In addition, the live room is also used for live broadcasting a second change process in which the virtual scene changes along with the view switching operation of the viewer. It should be noted that the view switching operations performed by different viewer clients may be different, and therefore the second change process presented by the live room in different viewer clients may be different.
302. The terminal acquires a sliding operation triggered based on the live interface.
The sliding operation is an operation for moving the picture displayed in the current live interface. For example, when the virtual scene is displayed in the live interface, only a partial region of the virtual scene may be displayed; for example, the whole virtual scene is a region of 500 m by 900 m, while the live interface displays a 200 m by 200 m region centered on the virtual character controlled by the anchor. If a viewer in the live room wants to know the situation of other regions, the viewer can trigger a sliding operation in the live interface to drag the displayed picture to move, so that other regions of the virtual scene are displayed in the live interface; for example, dragging the displayed picture to move leftwards, as shown in fig. 5, causes the virtual scene on the right side to be displayed.
Optionally, the terminal acquiring the sliding operation triggered based on the live interface means: the terminal detects the sliding operation based on the live interface. Optionally, it means: the terminal acquires operation information of the sliding operation. Optionally, the operation information includes the operation type of the sliding operation; optionally, the operation information includes the sliding direction, the sliding distance, and the like of the sliding operation. The embodiment of the present application does not limit how the terminal acquires the sliding operation.
It should be noted that, in the embodiment of the present application, as soon as the sliding operation produces a sliding direction and a sliding distance, the sliding operation may be acquired and the display content in the live interface updated based on it; there is no need to wait until the sliding operation is released, that is, until the sliding operation is finished, before updating the display content in the live interface.
303. And the terminal responds to the sliding operation, and updates the currently displayed virtual scene into the virtual scene corresponding to the target position indicated by the sliding operation in the live interface.
The sliding operation is an operation for moving the picture displayed in the current live interface. The sliding direction and the sliding distance of the sliding operation determine in which direction and by how much the picture displayed in the current live interface moves; therefore, the sliding operation can indicate a target position, and the terminal can display the virtual scene corresponding to the target position in the live interface.
For example, the virtual scene currently displayed in the live interface is a virtual environment picture taking the virtual character controlled by the anchor as an observation target; after the sliding operation is triggered, the terminal determines the target position according to the sliding direction and the sliding distance of the sliding operation and the position of the virtual character controlled by the anchor, and displays in the live interface the virtual environment picture taking the target position as the observation target.
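The mapping from a sliding operation to a target position can be sketched as follows. The function name, the one-to-one pixels-to-metres scale, and the clamping to the 500 m by 900 m bounds from the earlier example are all assumptions for illustration.

```python
def target_position(anchor_pos, slide_dx, slide_dy, scale, scene_w, scene_h):
    """Map a slide gesture to a new observation target.

    Dragging the picture leftwards (negative dx) reveals the scene on the
    right, so the observation target moves opposite to the drag direction.
    """
    tx = anchor_pos[0] - slide_dx * scale
    ty = anchor_pos[1] - slide_dy * scale
    # Clamp so the target stays inside the global bounds of the virtual scene.
    tx = min(max(tx, 0.0), float(scene_w))
    ty = min(max(ty, 0.0), float(scene_h))
    return (tx, ty)

# Anchor character at (250, 450); dragging the picture 50 units leftwards
# moves the observation target 50 m to the right.
print(target_position((250.0, 450.0), -50.0, 0.0, 1.0, 500, 900))  # (300.0, 450.0)
```

Because the target is recomputed continuously from the current sliding direction and distance, the display can be updated while the gesture is still in progress, as noted above.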
The above steps 302 and 303 only illustrate how the terminal displays in the live interface while the sliding operation is in progress. An exemplary description of how the terminal displays when the sliding operation is released is provided below.
In one possible implementation manner, the live interface display method further includes: the terminal responds to the sliding operation to be released, and the virtual scene corresponding to the target position is continuously displayed in the live interface; that is, when the sliding operation is released, the terminal keeps the current display content of the live interface unchanged.
Or, in response to the sliding operation being released, the terminal updates, in the live interface, the currently displayed virtual scene to the virtual scene displayed before the sliding operation was triggered; that is, after the user releases the sliding operation, the terminal restores the previously displayed picture.
Or, the virtual scene includes a virtual character controlled based on the anchor account; in response to the sliding operation being released, the terminal cancels, in the live interface, the display of the virtual scene corresponding to the target position, and displays the virtual scene corresponding to the virtual character. The virtual scene corresponding to the virtual character is the virtual scene observed by taking the virtual character as an observation target, or the virtual scene observed from the first-person perspective of the virtual character.
It should be noted that, if, before the sliding operation was triggered, the terminal displayed the virtual scene corresponding to a target character in the live interface, then in one possible implementation, in response to the sliding operation being released, the terminal cancels, in the live interface, the display of the virtual scene corresponding to the target position, and displays the virtual scene corresponding to the target character. The target character may be the virtual character controlled by the anchor or a virtual character controlled by another player.
It should be noted that, in one possible implementation manner, the terminal may also switch the viewing angle, and display a virtual scene corresponding to any viewing angle, and the following exemplary description is given of switching the viewing angle of the terminal through step 304 and step 305.
304. The terminal acquires a triggering operation on the character identifier of any virtual character in the live interface.
The live interface also displays the character identifier of at least one virtual character in the virtual scene, where the virtual characters in the virtual scene include the virtual character controlled by the anchor and virtual characters controlled by other players. Optionally, the live interface displays the character identifiers of all virtual characters; optionally, the live interface displays the character identifier of the virtual character controlled by the anchor and the character identifiers of the virtual characters belonging to the same team as the virtual character controlled by the anchor. The embodiment of the present application does not limit which character identifiers are displayed in the live interface.
The triggering operation may be any operation, for example, a clicking operation, a double-clicking operation, or a long-pressing operation, which is not limited in the embodiment of the present application.
Optionally, the terminal acquiring the triggering operation on any virtual character identifier in the live interface means: the terminal detects the triggering operation on any character identifier based on the live interface. Optionally, it means: the terminal acquires the trigger object of the triggering operation, where the trigger object is the character identifier of any virtual character. The embodiment of the present application does not limit how the terminal acquires the triggering operation on any character identifier.
305. In response to the triggering operation on the character identifier, the terminal displays the virtual scene corresponding to the virtual character in the live interface.
The virtual scene corresponding to the virtual character refers to the virtual scene observed by taking the virtual character as an observation target, or the virtual scene observed from the first-person perspective of the virtual character.
The viewer client can watch the virtual scene corresponding to any virtual character through a triggering operation on the character identifier of that virtual character, making the display of the virtual scene in the live interface more flexible.
As shown in fig. 6, the user performs a triggering operation on the character identifier of any virtual character in the live interface, and in response, the terminal displays the virtual scene observed by taking that virtual character as an observation target.
306. The terminal acquires a triggering operation on the global identifier.
The live interface also displays a global identification.
The triggering operation may be any operation, for example, a clicking operation, a double-clicking operation, or a long-pressing operation, which is not limited in the embodiment of the present application.
Optionally, the terminal acquiring the triggering operation of the global identifier refers to: the terminal detects triggering operation on the global identification based on the live broadcast interface. Optionally, the terminal acquiring the triggering operation of the global identifier refers to: the terminal acquires a trigger object of the trigger operation, wherein the trigger object is a global identification. The embodiment of the application does not limit the triggering operation of the terminal to acquire the global identifier.
307. In response to the triggering operation on the global identifier, the terminal displays the whole virtual scene in the live interface.
It should be noted that the above steps 301 to 307 may be performed during a live broadcast, or may be performed while watching a historical live broadcast, which is not limited in the embodiment of the present application.
It should be noted that, the above steps 302-303, 304-305, and 306-307 are optional execution schemes, and may or may not be executed during the live broadcast viewing process.
It should be noted that the embodiment shown in fig. 3 only illustrates the view switching operation by taking the sliding operation and the triggering operation on an identifier as examples. In some embodiments, the view switching operation may also be an enlarging operation or a reducing operation. The enlarging operation is used to enlarge the virtual scene displayed in the live interface; since the virtual scene in the live interface is enlarged, the region of the virtual scene displayed in the live interface becomes smaller. The reducing operation is used to reduce the virtual scene displayed in the live interface; since the virtual scene in the live interface is reduced, the region of the virtual scene displayed in the live interface becomes larger.
In addition to triggering the view switching operation, other operations may be triggered in the live interface, for example, a live room switching operation, a playback operation, and the like.
In one possible implementation, after displaying the virtual scene in the live interface, the live interface display method further includes: in response to a playback operation triggered based on the live interface, determining the target time selected by the playback operation, the playback operation indicating that playback starts from the target time; and displaying, in the live interface, the virtual scene at the target time. The target time selected by the playback operation may be any historical time; that is, the terminal may play back the virtual scene in the live interface from any historical time.
Alternatively, as shown in fig. 7, the live interface includes a progress bar, and in response to an update operation of a progress value of the progress bar, the updated progress value is determined as a target time selected by a playback operation. The update operation of the progress value may be any operation such as a sliding operation or a clicking operation, which is not limited in the embodiment of the present application.
For example, during the live broadcast of a live room, a viewer client enters the live room and displays the live interface; the virtual scene at the current time is displayed in the live interface, and a progress bar is also displayed, whose progress value at this time is the current total playing duration of the live room. By clicking any position of the progress bar, the time corresponding to that position is determined as the target time selected by the update operation, and in the live room, playback starts from the target time.
Optionally, the live interface includes an input box, and the time acquired based on the input box is determined as the target time. The embodiment of the present application only takes the two modes of a progress bar and an input box as examples to exemplarily illustrate the playback operation, and does not limit it. In one possible implementation, the playback operation may also be another operation.
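Both playback-operation variants (progress bar and input box) reduce to computing a target time. A minimal sketch, with hypothetical function names and an assumed "minutes:seconds" input format:

```python
def target_time_from_progress(progress_ratio, total_duration_s):
    """Progress-bar variant: a ratio in [0, 1] of the current total live duration."""
    return max(0.0, min(1.0, progress_ratio)) * total_duration_s

def target_time_from_input(text):
    """Input-box variant: parse a 'minutes:seconds' string into seconds."""
    minutes, seconds = text.split(":")
    return int(minutes) * 60 + int(seconds)

print(target_time_from_input("3:45"))      # 225 (the "3 minutes 45 seconds" example)
print(target_time_from_progress(0.5, 600))  # 300.0
```

Whichever variant is used, the terminal then displays the virtual scene at the resulting target time in the live interface.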
According to the live interface display method provided by the embodiment of the present application, the view switching operation can be responded to so as to update the display content of the live interface; therefore, when a user watches a live broadcast in a live room, the user can trigger a view switching operation to control the live interface to display the virtual scene of a user-specified field of view, improving the flexibility of live broadcasting.
According to the live interface display method provided by the embodiment of the present application, a triggering operation on the character identifier of any virtual character in the live interface can be responded to, and the virtual scene corresponding to that virtual character is displayed in the live interface; therefore, when a user watches a live broadcast in a live room, the user can trigger an operation on any character identifier to control the live interface to display the virtual scene corresponding to the user-specified virtual character, improving the flexibility of live broadcasting.
According to the live interface display method provided by the embodiment of the present application, a sliding operation detected based on the live interface can be responded to so as to control the picture displayed in the live interface to move, allowing a user to watch any region of the virtual scene and improving the flexibility of live broadcasting.
According to the live interface display method provided by the embodiment of the present application, a playback operation can be performed during the live broadcast of a live room, and playback starts from the target time selected by the playback operation; therefore, when a user watches a live broadcast in a live room that is currently live, the user can, through the playback operation, control the live interface to display the virtual scene at a user-specified time, improving the flexibility of live broadcasting.
Fig. 8 is a flowchart of a live interface display method according to an embodiment of the present application. Referring to fig. 8, an embodiment of the present application is exemplarily illustrated by taking an execution body as a terminal, where the method includes:
801. During the live broadcast of the live room, the terminal displays the virtual scene at the current time in the live interface of the live room.
The live room is used for live broadcasting a first change process of the virtual scene along with the anchor operation. If the live room is in the live broadcast process, after a viewer enters the live room, the virtual scene at the current time is displayed in the live interface of the live room; in the live interface of the viewer client, the displayed virtual scene is continuously updated over time, so that the virtual scene at the current time is always displayed.
802. The terminal determines a target time selected by the playback operation based on the playback operation triggered by the live interface, wherein the playback operation indicates playback from the target time.
The embodiment of the present application provides a playback function for watching a live broadcast in progress, where the target time selected by the playback operation is any historical time of the live room; that is, the playback function provided by the present application can start playback at any time.
803. And the terminal displays the virtual scene under the target time in the live interface.
In response to the playback operation triggered based on the live interface, the terminal displays the virtual scene at the target time in the live interface, that is, starts playback from the target time.
According to the live interface display method provided by the embodiment of the present application, a playback operation can be performed during the live broadcast of a live room, and playback starts from the target time selected by the playback operation; therefore, when a user watches a live broadcast in a live room that is currently live, the user can, through the playback operation, control the live interface to display the virtual scene at a user-specified time, improving the flexibility of live broadcasting.
Fig. 9 is a flowchart of a live interface display method according to an embodiment of the present application. Referring to fig. 9, an embodiment of the present application is exemplarily described using an execution body as a terminal, where the method includes:
901. During the live broadcast of the live room, the terminal displays the virtual scene at the current time in the live interface of the live room.
The live room is used for live broadcasting a first change process of the virtual scene along with the anchor operation. If the live room is in the live broadcast process, after a viewer enters the live room, the virtual scene at the current time is displayed in the live interface of the live room; in the live interface of the viewer client, the displayed virtual scene is continuously updated over time, so that the virtual scene at the current time is always displayed.
It should be noted that the embodiment of the present application provides a playback function, so that when a viewer watches a video being live broadcast, playback can be started from any time. In one possible implementation, the terminal determines the target time selected by the playback operation based on a playback operation triggered on the live interface, where the playback operation indicates that playback starts from the target time. The embodiment of the present application takes step 902 and step 903 as examples of how the terminal determines the target time selected by the playback operation, but does not limit the playback operation.
902. The terminal responds to the updating operation of the progress value of the progress bar, and the updated progress value is determined to be the target time selected by the playback operation.
The live broadcast interface comprises a progress bar, wherein the progress bar is used for indicating the total live broadcast duration of the current live broadcast and the current play progress, and the total live broadcast duration can be updated continuously along with the live broadcast. The update operation of the progress value may be any operation such as a sliding operation or a clicking operation, which is not limited in the embodiment of the present application.
For example, during the live broadcast of a live room, a viewer client enters the live room and displays the live interface; the virtual scene at the current time is displayed in the live interface, and a progress bar is also displayed, whose progress value at this time is the current total playing duration of the live room. By clicking any position of the progress bar, the time corresponding to that position is determined as the target time selected by the update operation, and in the live room, playback starts from the target time.
903. The terminal determines the time acquired based on the input box as a target time.
The live interface includes an input box for acquiring a time; the user can input a time in the input box, and the terminal determines the time acquired through the input box as the target time, which may be any time, for example, 3 minutes 45 seconds.
904. And the terminal displays the virtual scene under the target time in the live interface.
In response to the playback operation triggered based on the live interface, the terminal displays the virtual scene at the target time in the live interface, that is, starts playback from the target time.
According to the live interface display method provided by the embodiment of the present application, a playback operation can be performed during the live broadcast of a live room, and playback starts from the target time selected by the playback operation; therefore, when a user watches a live broadcast in a live room that is currently live, the user can, through the playback operation, control the live interface to display the virtual scene at a user-specified time, improving the flexibility of live broadcasting.
Fig. 10 is a flowchart of a live interface display method according to an embodiment of the present application. Referring to fig. 10, an embodiment of the present application is exemplarily illustrated by taking an execution body as a terminal, where the method includes:
1001. In response to the access operation of the live room, the terminal acquires the state data of the virtual scene corresponding to the live room, where the state data indicates the states of the virtual objects in the virtual scene.
The live room is used for live broadcasting a first change process of the virtual scene along with the operation of the anchor. The access operation of the live broadcasting room can be any operation, for example, clicking operation of the identification of the live broadcasting room, etc., and the embodiment of the application does not limit the access operation of the live broadcasting room.
The state of a virtual object in the virtual scene may be the position of the virtual object, the blood volume of the virtual object, the rendering parameters of the virtual object, and the like.
It should be noted that, in the embodiment of the present application, after the state data of the virtual scene corresponding to the live room is acquired, the virtual scene corresponding to the live room can be restored based on the state data.
The terminal is a terminal provided with a live broadcast application program, and in the embodiment of the application, the terminal is a client of a audience in a live broadcast room.
1002. And the terminal displays the virtual scene in a live interface of the live broadcasting room according to the state data.
In the embodiment of the present application, the viewer client does not acquire a video recorded by the anchor client from the live server; instead, it acquires the state data and displays the virtual scene according to the state data. Since the state data indicates the states of the virtual objects in the virtual scene, the states of the virtual objects in the virtual scene displayed according to the state data are identical to those in the anchor client; therefore, the viewer client restores the virtual scene of the anchor client according to the state data.
1003. The terminal acquires an instruction executed in a state indicated by the state data.
The virtual objects in the virtual scene include the virtual character controlled by the anchor; optionally, they may also include virtual characters controlled by other players, NPC characters, and the like. In the process of controlling a virtual character, instructions are generated, and by executing an instruction a virtual object completes a certain action, so that the virtual scene is updated.
The virtual scene displayed by the terminal according to the state data is static; in order to keep the virtual scenes of the viewer client and the anchor client synchronized, the viewer client can execute the instructions executed by the anchor client. Thus, the viewer client needs to obtain the instructions that the anchor client executes in the state indicated by the state data.
1004. And the terminal executes the acquired instruction on the virtual scene in the live broadcast interface.
By executing the acquired instructions on the virtual scene, the virtual scene is updated, so that the state of the virtual scene in the viewer client is consistent with the state of the virtual scene in the anchor client.
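Steps 1001-1004 amount to acquiring a state snapshot and then replaying instructions against it. A minimal sketch, where the state-data layout and the instruction fields ("move", "damage") are invented for illustration and are not the actual protocol:

```python
def apply_instruction(state, instr):
    """Execute one instruction on the local state data, mutating it in place."""
    obj = state[instr["object_id"]]
    if instr["op"] == "move":
        obj["x"] += instr["dx"]
        obj["y"] += instr["dy"]
    elif instr["op"] == "damage":
        obj["hp"] = max(0, obj["hp"] - instr["amount"])

# State data acquired on entering the live room (step 1001).
state = {"hero": {"x": 0, "y": 0, "hp": 100}}
# Instructions executed by the anchor client in that state (step 1003).
instructions = [
    {"object_id": "hero", "op": "move", "dx": 5, "dy": 0},
    {"object_id": "hero", "op": "damage", "amount": 30},
]
# Step 1004: replay the instructions to stay in sync with the anchor client.
for instr in instructions:
    apply_instruction(state, instr)
print(state["hero"])  # {'x': 5, 'y': 0, 'hp': 70}
```

Because only the snapshot and the instruction stream are transferred, the viewer client reconstructs the same scene without receiving a recorded video.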
According to the live interface display method provided by the embodiment of the present application, the process in which the virtual scene in the anchor client changes along with the anchor operation can be restored through the acquired state data and instructions, thereby reducing the amount of data that needs to be acquired during live broadcasting and ensuring the fluency of the live room.
Fig. 11 is a flowchart of a data uploading method according to an embodiment of the present application. Referring to fig. 11, an embodiment of the present application is exemplarily described by taking an execution body as a terminal, where the method includes:
1101. The terminal uploads state data of a virtual scene corresponding to a live broadcasting room to the live broadcasting server, wherein the state data indicates the state of a virtual object in the virtual scene, and the live broadcasting room is used for live broadcasting a first change process of the virtual scene along with a main broadcasting operation.
The terminal is a terminal where the anchor client is located, and the anchor client uploads the state data of the virtual scene to the server.
1102. And the terminal acquires an instruction executed by the application program corresponding to the virtual scene.
When the application program corresponding to the virtual scene executes an instruction, the virtual scene may change. By acquiring the instructions executed by the application program corresponding to the virtual scene and then executing the acquired instructions on the virtual scene, the virtual scene can be kept consistent with the virtual scene in the application program.
The application program corresponding to the virtual scene may be a game program or the like, and it may be a part of the live application program or a separate application program. Optionally, the terminal acquiring the instructions executed by the application program corresponding to the virtual scene includes: the terminal calls the live application program to acquire the instructions executed by the application program corresponding to the virtual scene.
1103. And uploading the acquired instruction to the live broadcast server by the terminal.
The anchor client not only uploads the state data of the virtual scene but also uploads the instructions executed by the application program corresponding to the virtual scene. Subsequently, by transmitting the state data and instructions acquired from the anchor client to the viewer clients, the live server enables the viewer clients to display the process in which the virtual scene changes along with the anchor operation.
According to the data uploading method provided by the embodiment of the present application, the process in which the virtual scene in the anchor client changes along with the anchor operation can be restored through the acquired state data and instructions, thereby reducing the amount of data that needs to be acquired during live broadcasting and ensuring the fluency of the live room.
Fig. 12 is a flowchart of a data issuing method according to an embodiment of the present application. Referring to fig. 12, an embodiment of the present application is exemplarily described by taking an execution body as a server, where the method includes:
1201. the server obtains instructions for executing the virtual scene.
The instructions executed on the virtual scene may be acquired by the server from the anchor client, or from another server corresponding to the anchor client. Optionally, the terminal where the anchor client is located also installs the application program corresponding to the virtual scene, and the other server is a server that provides services for that application program.
The live room is used for live broadcasting a first change process of the virtual scene along with the anchor operation; in response to the anchor operation, the anchor client executes instructions on the virtual scene and updates it. Therefore, the server can acquire the instructions executed on the virtual scene and issue them to the viewer clients in the live room, so that each viewer client updates the virtual scene according to the instructions and its virtual scene stays synchronized with that of the anchor client.
1202. The server sends the acquired instruction to the viewer client in the living room.
1203. The viewer client executes the acquired instructions on the local state data of the virtual scene, so as to update the local state data.
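Steps 1201-1203 can be sketched as a relay: the server fans each acquired instruction out to every viewer client in the live room. All class and method names here are hypothetical illustrations.

```python
class ViewerClient:
    """Hypothetical viewer client that applies instructions to its local state data."""
    def __init__(self):
        self.received = []

    def execute(self, instr):
        self.received.append(instr)  # a real client would update local state here

class LiveServer:
    """Hypothetical relay server: forwards anchor instructions per live room."""
    def __init__(self):
        self.rooms = {}  # room_id -> list of viewer clients

    def join(self, room_id, viewer):
        self.rooms.setdefault(room_id, []).append(viewer)

    def on_anchor_instruction(self, room_id, instr):
        # Step 1202: issue the acquired instruction to every viewer in the room.
        for viewer in self.rooms.get(room_id, []):
            viewer.execute(instr)

server = LiveServer()
v1, v2 = ViewerClient(), ViewerClient()
server.join("room1", v1)
server.join("room1", v2)
server.on_anchor_instruction("room1", {"op": "move", "object_id": "hero", "dx": 5, "dy": 0})
print(v1.received == v2.received)  # True: every viewer receives the same instruction
```

Since all viewers in a room receive the same instruction stream, their locally reconstructed virtual scenes remain consistent with one another and with the anchor client.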
According to the data issuing method provided by the embodiment of the present application, the instructions executed on the virtual scene can be acquired and issued to the viewer clients, so that each viewer client restores the process in which the virtual scene in the anchor client changes along with the anchor operation, thereby reducing the amount of data that needs to be acquired during live broadcasting and ensuring the fluency of the live room.
Fig. 13 is a flowchart of a live broadcast method according to an embodiment of the present application. Referring to fig. 13, an embodiment of the present application is exemplarily illustrated with an execution subject as a hosting client, a viewer client, and a live server, and the method includes:
1301. The anchor client uploads the state data of the virtual scene corresponding to the live broadcast room to the live broadcast server, where the state data indicates the state of a virtual object in the virtual scene.

The live broadcast room is used for live broadcasting a first change process of the virtual scene along with the anchor operation.

The state of the virtual object in the virtual scene may be the position of the virtual object, the blood volume of the virtual object, the rendering parameters of the virtual object, and the like.
The state data uploaded by the anchor client may be the initial state data of the virtual scene or the latest state data of the virtual scene. In one possible implementation, the anchor client uploading the state data of the virtual scene corresponding to the live broadcast room to the live broadcast server includes: uploading the initial state data of the virtual scene to the live broadcast server. The initial state data is initial rendering data obtained from the server corresponding to the application program of the virtual scene and is used for rendering the virtual scene; the anchor subsequently operates in the rendered virtual scene.

In another possible implementation, the anchor client uploading the state data of the virtual scene corresponding to the live broadcast room to the live broadcast server includes: uploading the current state data of the virtual scene to the live broadcast server every second duration. The second duration may be any duration, for example, 1 second, 2 seconds, etc.
For example, when the live broadcast has run for 3 minutes and 54 seconds, the current state data of the virtual scene is acquired and uploaded to the live broadcast server; when the live broadcast has run for 3 minutes and 56 seconds, the current state data of the virtual scene is acquired again and uploaded to the live broadcast server.
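The periodic upload in this implementation can be sketched as follows; the `get_current_state` and `upload` callbacks are hypothetical stand-ins, not an API named by the embodiment:

```python
import time

def upload_state_periodically(get_current_state, upload,
                              second_duration=2.0, rounds=3):
    """Every `second_duration` seconds, snapshot the current state data
    of the virtual scene and upload it to the live broadcast server."""
    for _ in range(rounds):
        upload(get_current_state())
        time.sleep(second_duration)
```

In practice this loop would run for the whole live session rather than a fixed number of rounds; `rounds` is only there to make the sketch terminate.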
It should be noted that, during the live broadcast, if only the initial state data of the virtual scene is sent to the audience client, the audience client needs to acquire and execute a large number of instructions to obtain the current state data before it can display the virtual scene in the live interface according to that state data, which increases the time it takes for the audience client to display the virtual scene in the live interface.
It should be noted that, in some embodiments, the communication connection between the anchor client and the live broadcast server may be interrupted. To avoid this, in an embodiment of the present application, the live broadcast server includes a plurality of receiving servers and a management server, and the anchor client uploading the state data of the virtual scene corresponding to the live broadcast room to the live broadcast server includes: uploading the state data of the virtual scene to each receiving server. Each receiving server sends the received state data to the management server, and the management server performs de-duplication processing on the state data received from the receiving servers.

Because the anchor client is connected to a plurality of receiving servers, even if the communication connection with one of them is interrupted, the anchor client can still communicate through the other receiving servers and upload the state data to the server, thereby ensuring the stability of the live broadcast.
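The de-duplication performed by the management server could be sketched as follows, under the assumption that each state snapshot carries a sequence number assigned by the anchor client (the `seq` field is illustrative, not named in the text):

```python
class ManagementServer:
    """Drops duplicate state snapshots: the same snapshot arrives once
    per receiving server, and copies share the anchor-assigned `seq`."""

    def __init__(self):
        self.seen_seqs = set()
        self.snapshots = []  # accepted snapshots, in arrival order

    def on_state_data(self, snapshot):
        if snapshot["seq"] in self.seen_seqs:
            return False  # duplicate relayed by another receiving server
        self.seen_seqs.add(snapshot["seq"])
        self.snapshots.append(snapshot)
        return True
```

With this scheme, uploading the same snapshot to every receiving server costs only redundant bandwidth, never duplicated state on the management server.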
1302. The live broadcast server receives the state data uploaded by the anchor client.

The live broadcast server receives the state data uploaded by the anchor client and stores it in correspondence with the identifier of the live broadcast room, or stores it in correspondence with the anchor account of the anchor, so that the state data can be accurately issued to an audience client when that audience client watches the live broadcast.
1303. Every first duration, the anchor client acquires the instructions executed by the application program corresponding to the virtual scene during the period corresponding to that first duration.

The first duration may be any duration, for example, the first duration is 1 second, etc.
In some embodiments, there are multiple player characters in the virtual scene. The control end corresponding to each player character sends its triggered instructions to the server, the server packages the instructions and returns them to the control end of each player character, and the control end of each player character executes the instructions returned by the server, so that the state of each player character in the virtual scene is updated.
Optionally, the instructions acquired by the anchor client may include: the time of occurrence of the instruction, the sequence number of the instruction, the content of the instruction, and the like. The sequence number of the instruction indicates the order in which the instruction occurred, and the content of the instruction indicates how to execute the instruction; for example, the instruction content is: character A moves to the right.
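The instruction fields listed above can be modeled, purely for illustration, as:

```python
from dataclasses import dataclass

@dataclass
class Instruction:
    occurred_at: float  # time at which the instruction occurred
    seq: int            # sequence number: the order of occurrence
    content: str        # how to execute, e.g. "character A moves to the right"
```

Any concrete encoding (JSON, Protocol Buffers, etc.) of these three fields would serve the same role; the dataclass above is only a convenient in-memory shape.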
It should be noted that the embodiment of the present application only takes acquiring, every first duration, the instructions executed by the application program during the corresponding period as an example of the anchor client acquiring the instructions executed by the application program corresponding to the virtual scene. In another embodiment, the anchor client acquiring the instructions executed by the application program corresponding to the virtual scene includes: acquiring the instructions executed by the application program corresponding to the virtual scene in real time. In some embodiments, the anchor fights in the virtual scene, and after the fight ends, the anchor client acquires the instructions executed by the application program corresponding to the virtual scene.
1304. The anchor client uploads the acquired instructions to the live broadcast server.
The anchor client can directly upload the acquired instruction to the live broadcast server.
Optionally, to facilitate management of the acquired instructions by the live broadcast server, the anchor client may package the acquired instructions before uploading. In one possible implementation, the anchor client uploading the acquired instructions to the live broadcast server includes: the anchor client generates an instruction packet based on the acquired instructions, where the instruction packet includes at least the acquired instructions and a packet sequence number, the packet sequence number indicating the generation order of the instruction packet; and uploads the generated instruction packet to the live broadcast server.

It should be noted that the embodiment of the present application only takes packaging the instructions executed by the application program during the period corresponding to the first duration as an example of the anchor client generating an instruction packet. In some embodiments, in response to the number of instructions acquired by the anchor client reaching a threshold, an instruction packet is generated based on the acquired instructions.
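The packaging step, including the threshold-based variant, can be sketched as follows; the threshold value and field names are assumptions for illustration only:

```python
class InstructionPacker:
    """Packs acquired instructions into numbered instruction packets."""

    def __init__(self, threshold=10):
        self.threshold = threshold
        self.pending = []
        self.next_packet_seq = 0

    def add(self, instruction):
        # Accumulate instructions; emit a packet once the threshold is hit.
        self.pending.append(instruction)
        if len(self.pending) >= self.threshold:
            return self.flush()
        return None

    def flush(self):
        # The packet sequence number records the generation order, which
        # lets the audience client detect gaps and request missing packets.
        packet = {"packet_seq": self.next_packet_seq,
                  "instructions": self.pending}
        self.next_packet_seq += 1
        self.pending = []
        return packet
```

A timer-based variant would simply call `flush()` every first duration instead of waiting for the threshold.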
1305. The live broadcast server receives the instructions uploaded by the anchor client.
1306. The audience client, in response to an access operation on the live broadcast room, sends an access request for the live broadcast room to the live broadcast server, and acquires the state data of the virtual scene corresponding to the live broadcast room from the live broadcast server.

The live broadcast room is used for live broadcasting the process by which the virtual scene changes with the anchor operation. The access operation on the live broadcast room may be any operation, for example, a click operation on the identifier of the live broadcast room; the embodiment of the application does not limit the access operation on the live broadcast room.
The state of the virtual object in the virtual scene may be the position of the virtual object, the blood volume of the virtual object, the rendering parameters of the virtual object, and the like.

It should be noted that, in the embodiment of the present application, the state data of the virtual scene corresponding to the live broadcast room is acquired so that the virtual scene corresponding to the live broadcast room can then be restored based on the state data.

In one possible implementation, the audience client acquires the initial state data of the virtual scene corresponding to the live broadcast room in response to the access request for the live broadcast room. In another possible implementation, the audience client acquires the latest state data of the virtual scene corresponding to the live broadcast room in response to the access request. The latest state data of the virtual scene refers to: among the state data of the virtual scene that the server stores for a plurality of times, the state data of the latest time.
1307. The audience client displays the virtual scene in the live interface of the live broadcast room according to the state data.
It should be noted that if the audience client acquires the initial state data of the virtual scene corresponding to the live broadcast room from the live broadcast server, then optionally, after acquiring the initial state data, the audience client may display the initial virtual scene based on the initial state data. Subsequently, the audience client may also acquire the instructions executed on the virtual scene and update the initial virtual scene according to the instructions; that is, the audience client starts playback from the beginning after entering the live broadcast room.
Optionally, after acquiring the initial state data of the virtual scene, the viewer client acquires an instruction executed in a state indicated by the initial state data, executes the acquired instruction on the virtual scene to update the state data of the virtual scene, and displays the virtual scene based on the updated state data of the virtual scene.
If the audience client acquires the latest state data of the virtual scene corresponding to the live broadcast room from the live broadcast server, the virtual scene is displayed in the live interface according to the latest state data.
1308. The viewer client obtains instructions from the live server that are executed in the state indicated by the state data.
The audience client obtains the instructions from the live broadcast server in either of two ways: the audience client sends an instruction acquisition request to the live broadcast server, so that the live broadcast server sends the instructions to the audience client; or, each time the live broadcast server receives instructions uploaded by the anchor client, it actively pushes the uploaded instructions to each audience client in the live broadcast room.
The instruction executed in the state indicated by the state data is an instruction generated after the generation time of the state data. Optionally, the audience client acquiring the instruction executed in the state indicated by the state data includes: the audience client acquires, according to the generation time of the acquired state data, the instructions generated after that generation time.
Optionally, when the anchor client uploads the instructions, it packages them and uploads the resulting instruction packet to the live broadcast server, so that when the audience client acquires the instructions, it can acquire the instruction packet uploaded by the anchor client from the live broadcast server. Optionally, acquiring the instruction executed in the state indicated by the state data includes: receiving an instruction packet returned by the live broadcast server, where the instruction packet includes a packet sequence number and at least one instruction, the at least one instruction being generated in a target period corresponding to the packet sequence number, and the target period being a period after the generation time of the state data.

In addition, since the instruction packet includes the packet sequence number, the audience client can accurately acquire instruction packets from the live broadcast server based on the packet sequence numbers of its local instruction packets. Optionally, receiving an instruction packet returned by the live broadcast server includes: after the communication connection with the live broadcast server is disconnected and then re-established, sending an instruction acquisition request to the live broadcast server, where the request carries the reference packet sequence number of the last instruction packet acquired locally; and receiving the instruction packets, returned by the live broadcast server, whose packet sequence numbers are greater than the reference packet sequence number.
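The reconnect-and-resume behavior can be sketched as follows, with a hypothetical `fetch_packets_after` callback standing in for the instruction acquisition request:

```python
def resume_after_reconnect(fetch_packets_after, last_packet_seq):
    """After the connection is re-established, request every instruction
    packet whose packet sequence number is greater than the reference
    packet sequence number of the last packet obtained locally."""
    packets = fetch_packets_after(last_packet_seq)
    # Apply the missed packets in generation order.
    return sorted(packets, key=lambda p: p["packet_seq"])
```

Because packets are numbered consecutively, the client resumes exactly where it left off and never replays an instruction twice.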
1309. The audience client executes the acquired instructions on the virtual scene in the live broadcast interface.

If multiple instructions are acquired, the audience client executing the acquired instructions on the virtual scene in the live broadcast interface includes: the audience client executes the multiple instructions on the virtual scene in the live interface in sequence according to the timestamps of the acquired instructions.
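Executing multiple instructions in timestamp order can be sketched as follows; the `timestamp` field and `apply` callback are illustrative assumptions:

```python
def replay_instructions(state, instructions, apply):
    """Execute the acquired instructions on the local scene state in
    timestamp order, so the audience-side scene evolves exactly as
    the anchor-side scene did."""
    for instr in sorted(instructions, key=lambda i: i["timestamp"]):
        state = apply(state, instr)
    return state
```

Sorting matters whenever instructions arrive out of order (for example from different instruction packets): replaying them in arrival order could diverge from the anchor's scene.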
It should be noted that the embodiment of the present application exemplarily describes the live broadcast process by taking, as an example, the anchor client uploading the state data of the virtual scene and the instructions executed on the virtual scene to the server, and the server issuing the state data and the instructions to the audience client.
In another embodiment, the terminal where the anchor client is located is provided with an application client corresponding to the virtual scene in addition to the anchor client, and the embodiment takes the application client corresponding to the virtual scene as a game client for example for illustration. In this embodiment, the game client is configured to obtain initial state data of the virtual scene from the corresponding game server, generate an instruction for executing the virtual scene according to the hosting operation, and send the instruction to the game server. In addition, the game server can also receive instructions uploaded by other participants in the virtual scene. In this way, the game server can acquire each instruction executed on the virtual scene, and execute the instruction on the virtual scene according to the initial state data of the virtual scene, so as to acquire the current state data of the virtual scene. Accordingly, the game server may include therein status data of the virtual scene and instructions executed on the virtual scene. Thus, the live server may obtain status data and instructions from the game server. The embodiment of the application does not limit the sources of the state data and the instructions acquired by the live broadcast server.
It should be noted that, in the embodiment of the present application, the live broadcast room is used not only for live broadcasting a first change process of the virtual scene changing with the anchor operation, but also for live broadcasting a second change process of the virtual scene changing with a view switching operation triggered by the audience. After the audience client displays the virtual scene according to the state data in the live interface of the live broadcast room, the method further includes: the audience client, in response to a view switching operation triggered based on the live interface, acquires the operation information of the view switching operation; determines the display style parameters of the virtual scene according to the operation information, where the display style parameters indicate the center point and the size of the virtual scene presented in the live interface; and displays the virtual scene corresponding to the display style parameters in the live interface.
The operation information may be the operation type of the view switching operation, for example, a click operation, a slide operation, or the like; the operation information may also be the trigger position of the view switching operation. The embodiment of the application does not limit the operation information.

Optionally, the display style parameters include center point coordinates and a target size, and displaying the virtual scene corresponding to the display style parameters in the live interface includes: displaying, in the live interface, a region of the virtual scene centered on the center point coordinates and having the target size. The target size may be any size; the embodiment of the present application does not limit the target size.
In the embodiment of the present application, the view switching operation is taken as a sliding operation as an example to describe updating the display style parameters of the virtual scene in response to the view switching operation. In one possible implementation, the display style parameters include center point coordinates, and determining the display style parameters of the virtual scene according to the operation information includes: in response to a sliding operation triggered based on the live interface, determining the sliding direction and sliding distance of the sliding operation; and updating the center point coordinates in the display style parameters according to the sliding direction and the sliding distance.
Wherein the sliding direction and the sliding distance indicate a relative positional relationship of the center point coordinates before updating and the center point coordinates after updating.
For example, the center point coordinates are (550, 700); if the sliding direction of a sliding operation triggered based on the live interface is to the left and the sliding distance is 2 cm, the updated center point coordinates are (750, 700).
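A sketch of the center-point update for a sliding operation, assuming (as the example above implies) that 1 cm of sliding corresponds to 100 coordinate units and that the view center moves opposite to the slide direction — a slide to the left reveals content to the right:

```python
def update_center(center, direction, distance_cm, units_per_cm=100):
    """Update the center point coordinates in the display style
    parameters from the sliding direction and sliding distance."""
    x, y = center
    d = distance_cm * units_per_cm
    if direction == "left":
        x += d
    elif direction == "right":
        x -= d
    elif direction == "up":
        y += d
    elif direction == "down":
        y -= d
    return (x, y)
```

The `units_per_cm` ratio is an assumption chosen to reproduce the worked example; a real client would derive it from screen density and the current zoom level.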
In one possible implementation, the view switch operation is a trigger operation for character identification in the live interface. Optionally, the display style parameter includes a center point coordinate, and determining the display style parameter of the virtual scene according to the operation information includes: the live interface also displays character identifications of at least one virtual character in the virtual scene, and the coordinates of the virtual characters are determined to be the center point coordinates in response to triggering operation of the character identifications of any virtual character, so that the virtual characters are used as observation targets.
In one possible implementation, the spectator client may also display the global of the virtual scene. Optionally, the display style parameters include center point coordinates and target size; according to the operation information, determining display style parameters of the virtual scene, including: the live interface is also displayed with a global identifier, coordinates of a center point of the virtual scene are determined to be the coordinates of the center point in response to triggering operation of the global identifier, and the size of the virtual scene is determined to be the target size, so that the whole virtual scene is displayed in the live interface.
It should be noted that the audience client may also play back the live broadcast. In one possible implementation, after the virtual scene is displayed according to the state data in the live interface of the live broadcast room, the method further includes: determining, in response to a playback operation triggered based on the live interface, the target time selected by the playback operation; acquiring target state data based on the target time, where the target state data indicates the state of the virtual object in the virtual scene at the target time; displaying the virtual scene in the live interface according to the target state data; acquiring, based on the target time, the instructions executed after the target time; and executing the acquired instructions on the virtual scene in the live interface.
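One way the playback step could obtain the target state data when no snapshot exists exactly at the target time is to restore the latest snapshot at or before it and replay the intervening instructions; the field names below are illustrative assumptions:

```python
def playback_from(target_time, snapshots, instructions, apply):
    """Reconstruct the scene state at `target_time` from timestamped
    state snapshots and instructions: restore the latest snapshot at or
    before the target time, then replay the instructions issued between
    the snapshot time and the target time."""
    base = max((s for s in snapshots if s["time"] <= target_time),
               key=lambda s: s["time"])
    state = dict(base["state"])
    for instr in sorted(instructions, key=lambda i: i["time"]):
        if base["time"] < instr["time"] <= target_time:
            state = apply(state, instr)
    return state
```

From the reconstructed state, playback continues by executing the instructions generated after the target time, exactly as in the live path.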
It should be noted that, in the embodiment of the present application, data is uploaded and downloaded frequently. Optionally, the anchor client, the audience client, and the live broadcast server send and receive data over TCP (Transmission Control Protocol) long connections. If the latency of sending and receiving data needs to be reduced, the KCP protocol (a fast and reliable protocol) or the QUIC protocol (a UDP-based low-latency internet transport layer protocol), both carried over UDP (User Datagram Protocol), can also be used, and the PB (ProtoBuf, a flexible, efficient, automated method for structured data serialization) protocol is used to compress the transmitted data.
According to the live broadcast interface display method provided by the embodiment of the application, the process by which the virtual scene in the anchor client changes with the anchor operation can be restored through the acquired state data and instructions, which reduces the amount of data that needs to be acquired during live broadcast and ensures the fluency of the live broadcast room.
According to the live broadcast interface display method provided by the embodiment of the application, the display content of the live interface can be updated in response to the view switching operation. Therefore, when a user watches live broadcast in a live broadcast room, the user can, by triggering the view switching operation, control the live interface to display the virtual scene of a user-specified field of view, which improves the flexibility of live broadcast.
According to the live broadcast interface display method provided by the embodiment of the application, a playback operation can be performed during the live broadcast of the live broadcast room, and playback starts from the target time selected by the playback operation. Therefore, when a user watches a live broadcast room that is currently live, the user can, through the playback operation, control the live interface to display the virtual scene at a user-specified time, which improves the flexibility of live broadcast.
Fig. 14 is a schematic structural diagram of a live interface display device provided by the present application. Referring to fig. 14, the apparatus includes:
a display module 1401, configured to display a virtual scene in a live interface of a live broadcast room, where the live broadcast room is configured to live broadcast a first change process of the virtual scene along with a hosting operation;
an obtaining module 1402, configured to obtain a view switching operation triggered based on the live interface;
the display module 1401 is configured to update a virtual scene displayed in the live interface in response to the view switching operation.
As shown in fig. 15, in one possible implementation manner, the display module 1401 is configured to respond to a sliding operation triggered based on the live interface, and update, in the live interface, a virtual scene that is currently displayed to a virtual scene that corresponds to a target position indicated by the sliding operation.
In one possible implementation, the display module 1401 is configured to continue displaying, in response to the sliding operation being released, the virtual scene corresponding to the target location in the live interface; or alternatively
The display module 1401 is configured to update, in response to the sliding operation being released, a currently displayed virtual scene to a virtual scene displayed before the sliding operation is triggered in the live interface; or alternatively
The virtual scene includes a virtual character controlled based on a anchor account, and the display module 1401 is configured to cancel display of the virtual scene corresponding to the target position in the live interface and display the virtual scene corresponding to the virtual character in response to the release of the sliding operation.
In one possible implementation manner, the live broadcast interface further displays a character identifier of at least one virtual character in the virtual scenes, and the display module 1401 is configured to respond to a triggering operation on the character identifier of any virtual character, and display, in the live broadcast interface, a virtual scene corresponding to the any virtual character; or alternatively
The live interface further displays a global identifier, and the display module 1401 is configured to respond to a triggering operation on the global identifier, and display, in the live interface, a global of the virtual scene.
In one possible implementation, the apparatus further includes:
a determining module 1403, configured to determine a target time selected by a playback operation triggered based on the live interface, where the playback operation indicates playback from the target time;
The display module 1401 is configured to display, in the live interface, a virtual scene at the target time.
In one possible implementation manner, the live interface includes a progress bar, and the determining module 1403 is configured to determine, in response to an update operation on a progress value of the progress bar, the updated progress value as a target time selected by the playback operation; or alternatively
The live interface includes an input box, and the determining module 1403 is configured to determine a time acquired based on the input box as the target time.
Fig. 16 is a schematic structural diagram of a live interface display device provided by the present application. Referring to fig. 16, the apparatus includes:
The display module 1601 is configured to display, in a live interface of a live broadcast room, the virtual scene at the current time during the live broadcast of the live broadcast room, where the live broadcast room is configured to live broadcast a first change process of the virtual scene along with the anchor operation;
A determining module 1602, configured to determine a target time selected by a playback operation based on a playback operation triggered by the live interface, the playback operation indicating playback starting from the target time;
The display module 1601 is configured to display, in the live interface, a virtual scene at the target time.
Fig. 17 is a schematic structural diagram of a live interface display device provided by the application. Referring to fig. 17, the apparatus includes:
the data acquisition module 1701 is configured to acquire, in response to an access operation on a live broadcast room, state data of a virtual scene corresponding to the live broadcast room, where the state data indicates the state of a virtual object in the virtual scene, and the live broadcast room is used for live broadcasting a first change process of the virtual scene along with the anchor operation;
the display module 1702 is configured to display the virtual scene according to the state data in a live interface of the live broadcasting room;
An instruction acquisition module 1703 for acquiring an instruction executed in a state indicated by the state data;
And the instruction execution module 1704 is configured to execute the acquired instruction on the virtual scene in the live broadcast interface.
As shown in fig. 18, in one possible implementation manner, the data obtaining module 1701 is configured to obtain initial state data of a virtual scene corresponding to the live broadcast room in response to an access operation to the live broadcast room; or alternatively
The data obtaining module 1701 is configured to obtain, in response to an access operation to the live broadcast room, latest state data of a virtual scene corresponding to the live broadcast room.
In one possible implementation, the instruction acquiring module 1703 is configured to acquire an instruction generated after a generation time of the acquired state data according to the generation time.
In one possible implementation, the instruction acquiring module 1703 is configured to receive an instruction packet returned by the live broadcast server, where the instruction packet includes a packet sequence number and at least one instruction, the at least one instruction being generated in a target period corresponding to the packet sequence number, and the target period being a period after the generation time of the state data.
In one possible implementation, the instruction acquiring module 1703 includes:
A sending unit 1713, configured to send an instruction acquisition request to the live broadcast server after the communication connection with the live broadcast server is disconnected and reconnected, where the instruction acquisition request carries a reference packet sequence number of a last instruction packet that has been acquired locally;
and a receiving unit 1723, configured to receive an instruction packet with a packet sequence number greater than the reference packet sequence number returned by the live broadcast server.
In one possible implementation, the apparatus further includes:
an operation obtaining module 1705, configured to acquire, in response to a view switching operation triggered based on the live interface, the operation information of the view switching operation;
a parameter determining module 1706, configured to determine, according to the operation information, a display style parameter of the virtual scene, where the display style parameter indicates a center point and a size of the virtual scene presented in the live interface;
The display module 1702 is configured to display, in the live interface, a virtual scene corresponding to the display style parameter.
In one possible implementation manner, the display style parameter includes a center point coordinate and a target size, and the display module 1702 is configured to display, in the live interface, a region in the virtual scene centered on the center point coordinate and having the target size as a size.
In one possible implementation, the display style parameters include center point coordinates;
The parameter determining module 1706 is configured to determine a sliding direction and a sliding distance of the sliding operation in response to the sliding operation triggered based on the live interface;
The display module 1702 is configured to update the center point coordinate in the display style parameter according to the sliding direction and the sliding distance.
In one possible implementation manner, the display style parameter includes a center point coordinate, and the live interface further displays a character identifier of at least one virtual character in the virtual scene;
the parameter determining module 1706 is configured to determine coordinates of any virtual character as the center point coordinates in response to a triggering operation for character identification of the virtual character.
In one possible implementation manner, the display style parameter includes a center point coordinate and a target size, and the live interface also displays a global identifier;
The parameter determining module 1706 is configured to determine, in response to a triggering operation on the global identifier, coordinates of a center point of the virtual scene as the center point coordinates, and a size of the virtual scene as the target size.
In one possible implementation, the apparatus further includes:
a time determining module 1707, configured to, in response to a playback operation triggered based on the live interface, determine a target time selected by the playback operation;
the data obtaining module 1701 is configured to obtain target state data based on the target time, where the target state data indicates a state of a virtual object in the virtual scene at the target time;
the display module 1702 is configured to display, in the live interface, the virtual scene according to the target state data;
The instruction acquiring module 1703 is configured to acquire an instruction executed after the target time based on the target time;
The instruction execution module 1704 is configured to execute, in the live interface, the acquired instruction on the virtual scene.
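The playback flow above (restore the state data for the target time, then re-execute the instructions recorded after that time) can be sketched as below. The snapshot list, timed instruction records, and the `apply` callback are hypothetical stand-ins for the embodiment's state data and instruction execution:

```python
def play_back(snapshots, instructions, target_time, apply):
    """snapshots: list of (time, state) in ascending time order.
    instructions: list of (time, instr) in ascending time order.
    Restore the latest snapshot at or before target_time, then replay
    every instruction recorded after target_time."""
    state = None
    for t, s in snapshots:
        if t <= target_time:
            state = dict(s)  # latest snapshot not later than the target
        else:
            break
    for t, instr in instructions:
        if t > target_time:
            state = apply(state, instr)
    return state

# Toy example: state is a hit-point counter, each instruction adds its value.
snaps = [(0, {"hp": 100}), (10, {"hp": 70})]
instrs = [(12, -5), (15, -10)]
print(play_back(snaps, instrs, 10, lambda s, d: {"hp": s["hp"] + d}))  # {'hp': 55}
```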
Fig. 19 is a schematic structural diagram of a data uploading device provided by the present application. Referring to fig. 19, the apparatus includes:
An uploading module 1901, configured to upload, to a live broadcast server, state data of a virtual scene corresponding to a live broadcast room, where the state data indicates a state of a virtual object in the virtual scene, and the live broadcast room is configured to live broadcast a first change process of the virtual scene along with a hosting operation;
an obtaining module 1902, configured to obtain an instruction executed by an application program corresponding to the virtual scene;
the uploading module 1901 is configured to upload the acquired instruction to the live broadcast server.
As shown in fig. 20, in one possible implementation manner, the obtaining module 1902 is configured to obtain, at intervals of a first duration, the instructions executed by the application program within the period corresponding to that first duration.
In one possible implementation, the uploading module 1901 includes:
a generating unit 1911, configured to generate an instruction packet based on the acquired instruction, where the instruction packet includes at least the acquired instruction and a packet sequence number, and the packet sequence number indicates a generation order of the instruction packet;
and the sending unit 1921 is used for uploading the generated instruction packet to the live broadcast server.
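A minimal sketch of the generating unit's behavior, assuming JSON serialization and a monotonically increasing counter for the packet sequence number (both illustrative choices; the embodiment only requires that the packet contain the instructions and a sequence number reflecting generation order):

```python
import itertools
import json

_seq = itertools.count(1)  # monotonically increasing packet sequence numbers

def make_packet(instructions):
    """Wrap the instructions collected in the last interval into a packet
    whose sequence number records its generation order, so the receiver
    can detect loss and restore ordering."""
    return json.dumps({"seq": next(_seq), "instructions": instructions})

p1 = json.loads(make_packet(["move:up", "skill:1"]))
p2 = json.loads(make_packet(["move:left"]))
assert p1["seq"] == 1 and p2["seq"] == 2
```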
In one possible implementation, the uploading module 1901 is configured to upload initial state data of the virtual scene to the live broadcast server; or alternatively
The uploading module 1901 is configured to upload current state data of the virtual scene to the live broadcast server every second duration.
In one possible implementation, the live broadcast server includes a plurality of receiving servers and a management server, and the uploading module 1901 is configured to upload the state data of the virtual scene to each receiving server; each receiving server is configured to send the received state data to the management server, and the management server is configured to perform de-duplication processing on the state data received from each receiving server.
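Because every receiving server forwards the same state data, the management server must discard duplicates. A sketch keyed on a per-upload identifier follows; the key choice and class shape are assumptions — the embodiment only states that de-duplication is performed:

```python
class ManagementServer:
    """Collects state data forwarded by several receiving servers and
    keeps only one copy of each upload."""

    def __init__(self):
        self.seen = set()
        self.state_log = []

    def receive(self, upload_id, state_data):
        # The same upload arrives once per receiving server; keep the first.
        if upload_id in self.seen:
            return False
        self.seen.add(upload_id)
        self.state_log.append(state_data)
        return True

m = ManagementServer()
# Three receiving servers forward the same upload.
results = [m.receive("room1:tick42", {"hp": 80}) for _ in range(3)]
print(results, len(m.state_log))  # [True, False, False] 1
```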
Fig. 21 is a schematic structural diagram of a data issuing device provided by the present application. Referring to fig. 21, the apparatus includes:
an acquisition module 2101 for acquiring an instruction to be executed on a virtual scene;
a sending module 2102, configured to send the acquired instruction to an audience client in a live broadcast room, where the live broadcast room is configured to live-broadcast a first change process of the virtual scene following the anchor's operations;
the audience client is configured to execute the acquired instruction according to local state data of the virtual scene, so as to update the local state data.
In one possible implementation, the apparatus further includes:
The acquiring module 2101 is further configured to acquire status data of a virtual scene corresponding to the live broadcast room in response to receiving an access request to the live broadcast room;
The sending module 2102 is further configured to send, according to the account carried by the access request, the status data to an audience client that logs in to the account, so that the audience client displays the virtual scene in a live broadcast interface of the live broadcast room according to the status data.
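Putting the two halves together, a joining audience client first installs the state data returned for its access request and then applies each delivered instruction to that local copy. The `apply` callback is again a hypothetical instruction interpreter standing in for the client's instruction execution:

```python
class AudienceClient:
    """Holds a local copy of the virtual scene's state and advances it
    with instructions delivered by the live broadcast server."""

    def __init__(self, apply):
        self.state = None
        self.apply = apply

    def on_access_granted(self, state_data):
        # Initial state sent in response to the room access request.
        self.state = dict(state_data)

    def on_instruction(self, instr):
        # Each delivered instruction updates the local state data.
        self.state = self.apply(self.state, instr)

c = AudienceClient(lambda s, d: {"hp": s["hp"] + d})
c.on_access_granted({"hp": 100})
for instr in (-20, -5):
    c.on_instruction(instr)
print(c.state)  # {'hp': 75}
```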
The embodiment of the application further provides a computer device comprising a processor and a memory, where the memory stores at least one program code that is loaded and executed by the processor to implement the operations performed in the live interface display method of the above embodiments, the operations performed in the data uploading method of the above embodiments, or the operations performed in the data delivery method of the above embodiments.
Optionally, the computer device is provided as a terminal. Fig. 22 is a schematic structural diagram of a terminal according to an embodiment of the present application. The terminal 2200 may be a portable mobile terminal such as a smart phone, a tablet computer, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, a notebook computer, or a desktop computer. The terminal 2200 may also be referred to by other names such as user equipment, portable terminal, laptop terminal, or desktop terminal.
Terminal 2200 includes: a processor 2201 and a memory 2202.
The processor 2201 may include one or more processing cores, such as a 4-core processor or an 8-core processor. The processor 2201 may be implemented in at least one hardware form of DSP (Digital Signal Processing), FPGA (Field-Programmable Gate Array), or PLA (Programmable Logic Array). The processor 2201 may also include a main processor and a coprocessor; the main processor is a processor for processing data in an awake state, also called a CPU (Central Processing Unit), and the coprocessor is a low-power processor for processing data in a standby state. In some embodiments, the processor 2201 may be integrated with a GPU (Graphics Processing Unit), which is responsible for rendering and drawing the content that the display screen needs to display. In some embodiments, the processor 2201 may also include an AI (Artificial Intelligence) processor for processing computing operations related to machine learning.
Memory 2202 may include one or more computer-readable storage media, which may be non-transitory. Memory 2202 may also include high-speed random access memory, as well as non-volatile memory, such as one or more magnetic disk storage devices, flash memory storage devices. In some embodiments, a non-transitory computer readable storage medium in memory 2202 is used to store at least one program code for execution by processor 2201 to perform the operations performed in the live interface display method of the above embodiments or to perform the operations performed in the data upload method of the above embodiments; or to implement the operations performed in the data delivery method as described in the above embodiments.
In some embodiments, terminal 2200 may optionally further comprise: a peripheral interface 2203 and at least one peripheral device. The processor 2201, memory 2202, and peripheral interface 2203 may be connected by a bus or signal line. The individual peripheral devices may be connected to the peripheral device interface 2203 by buses, signal lines or circuit boards. Specifically, the peripheral device includes: at least one of radio frequency circuitry 2204, a display 2205, a camera assembly 2206, audio circuitry 2207, a positioning assembly 2208, and a power source 2209.
The peripheral interface 2203 may be used to connect at least one Input/Output (I/O) related peripheral device to the processor 2201 and the memory 2202. In some embodiments, the processor 2201, memory 2202, and peripheral interface 2203 are integrated on the same chip or circuit board; in some other embodiments, either or both of the processor 2201, the memory 2202, and the peripheral interface 2203 may be implemented on separate chips or circuit boards, which is not limited in this embodiment.
The radio frequency circuit 2204 is configured to receive and transmit RF (Radio Frequency) signals, also referred to as electromagnetic signals. The radio frequency circuit 2204 communicates with a communication network and other communication devices via electromagnetic signals. The radio frequency circuit 2204 converts an electrical signal into an electromagnetic signal for transmission, or converts a received electromagnetic signal into an electrical signal. Optionally, the radio frequency circuit 2204 includes an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and so forth. The radio frequency circuit 2204 may communicate with other terminals via at least one wireless communication protocol. The wireless communication protocol includes, but is not limited to: the world wide web, metropolitan area networks, intranets, mobile communication networks of various generations (2G, 3G, 4G, and 5G), wireless local area networks, and/or Wi-Fi (Wireless Fidelity) networks. In some embodiments, the radio frequency circuit 2204 may further include NFC (Near Field Communication) related circuits, which is not limited by the present application.
The display 2205 is used to display a UI (User Interface). The UI may include graphics, text, icons, video, and any combination thereof. When the display 2205 is a touch display, the display 2205 also has the ability to collect touch signals at or above its surface. The touch signal may be input to the processor 2201 as a control signal for processing. At this point, the display 2205 may also be used to provide virtual buttons and/or a virtual keyboard, also referred to as soft buttons and/or a soft keyboard. In some embodiments, there may be one display 2205, disposed on the front panel of the terminal 2200; in other embodiments, there may be at least two displays 2205, disposed respectively on different surfaces of the terminal 2200 or in a folded configuration; in still other embodiments, the display 2205 may be a flexible display disposed on a curved or folded surface of the terminal 2200. The display 2205 may even be configured in a non-rectangular irregular pattern, i.e., a shaped screen. The display 2205 may be made of LCD (Liquid Crystal Display), OLED (Organic Light-Emitting Diode), or other materials.
The camera assembly 2206 is used to capture images or video. Optionally, the camera assembly 2206 includes a front camera and a rear camera. Generally, the front camera is disposed on the front panel of the terminal, and the rear camera is disposed on the back of the terminal. In some embodiments, there are at least two rear cameras, each being any one of a main camera, a depth-of-field camera, a wide-angle camera, and a telephoto camera, so that the main camera and the depth-of-field camera can be fused to realize a background blurring function, or the main camera and the wide-angle camera can be fused to realize panoramic shooting, VR (Virtual Reality) shooting, or other fused shooting functions. In some embodiments, the camera assembly 2206 may also include a flash. The flash may be a single-color-temperature flash or a dual-color-temperature flash. A dual-color-temperature flash refers to a combination of a warm-light flash and a cold-light flash, and can be used for light compensation under different color temperatures.
The audio circuit 2207 may include a microphone and a speaker. The microphone is used for collecting sound waves of a user and the environment, converting the sound waves into electric signals, inputting the electric signals to the processor 2201 for processing, or inputting the electric signals to the radio frequency circuit 2204 for realizing voice communication. For purposes of stereo acquisition or noise reduction, a plurality of microphones may be provided at different portions of the terminal 2200, respectively. The microphone may also be an array microphone or an omni-directional pickup microphone. The speaker is used to convert electrical signals from the processor 2201 or the radio frequency circuit 2204 into sound waves. The speaker may be a conventional thin film speaker or a piezoelectric ceramic speaker. When the speaker is a piezoelectric ceramic speaker, not only the electric signal can be converted into a sound wave audible to humans, but also the electric signal can be converted into a sound wave inaudible to humans for ranging and other purposes. In some embodiments, audio circuit 2207 may also include a headphone jack.
The positioning component 2208 is used to locate the current geographic location of the terminal 2200 to implement navigation or LBS (Location Based Service). The positioning component 2208 may be based on the GPS (Global Positioning System) of the United States, the BeiDou system of China, the GLONASS system of Russia, or the Galileo system of the European Union.
A power supply 2209 is used to power the various components in terminal 2200. The power source 2209 may be alternating current, direct current, disposable or rechargeable. When the power source 2209 comprises a rechargeable battery, the rechargeable battery may be a wired rechargeable battery or a wireless rechargeable battery. The wired rechargeable battery is a battery charged through a wired line, and the wireless rechargeable battery is a battery charged through a wireless coil. The rechargeable battery may also be used to support fast charge technology.
In some embodiments, terminal 2200 further comprises one or more sensors 2210. The one or more sensors 2210 include, but are not limited to: acceleration sensor 2211, gyroscope sensor 2212, pressure sensor 2213, fingerprint sensor 2214, optical sensor 2215, and proximity sensor 2216.
The acceleration sensor 2211 can detect the magnitudes of acceleration on the three coordinate axes of the coordinate system established with the terminal 2200. For example, the acceleration sensor 2211 may be used to detect components of gravitational acceleration on the three coordinate axes. The processor 2201 may control the display 2205 to display the user interface in a landscape view or a portrait view according to the gravitational acceleration signal acquired by the acceleration sensor 2211. The acceleration sensor 2211 may also be used for the acquisition of game or user motion data.
The gyro sensor 2212 may detect a body direction and a rotation angle of the terminal 2200, and the gyro sensor 2212 may collect a 3D motion of the user to the terminal 2200 in cooperation with the acceleration sensor 2211. The processor 2201 may implement the following functions according to the data collected by the gyro sensor 2212: motion sensing (e.g., changing UI according to a tilting operation by a user), image stabilization at shooting, game control, and inertial navigation.
The pressure sensor 2213 may be disposed at a side frame of the terminal 2200 and/or at a lower layer of the display 2205. When the pressure sensor 2213 is disposed at a side frame of the terminal 2200, a grip signal of the terminal 2200 by a user may be detected, and the processor 2201 performs left-right hand recognition or quick operation according to the grip signal collected by the pressure sensor 2213. When the pressure sensor 2213 is disposed at the lower layer of the display screen 2205, the processor 2201 controls the operability control on the UI interface according to the pressure operation of the user on the display screen 2205. The operability controls include at least one of a button control, a scroll bar control, an icon control, and a menu control.
The fingerprint sensor 2214 is used to collect the user's fingerprint, and the user's identity is identified either by the processor 2201 according to the fingerprint collected by the fingerprint sensor 2214, or by the fingerprint sensor 2214 itself. Upon identifying the user's identity as a trusted identity, the processor 2201 authorizes the user to perform relevant sensitive operations, including unlocking the screen, viewing encrypted information, downloading software, making payments, changing settings, and the like. The fingerprint sensor 2214 may be provided on the front, back, or side of the terminal 2200. When a physical key or a vendor logo is provided on the terminal 2200, the fingerprint sensor 2214 may be integrated with the physical key or the vendor logo.
The optical sensor 2215 is used to collect the intensity of ambient light. In one embodiment, the processor 2201 may control the display brightness of the display 2205 based on the intensity of ambient light collected by the optical sensor 2215. Specifically, when the intensity of the ambient light is high, the display brightness of the display screen 2205 is turned up; when the ambient light intensity is low, the display brightness of the display screen 2205 is turned down. In another embodiment, the processor 2201 may also dynamically adjust the shooting parameters of the camera assembly 2206 based on the ambient light intensity collected by the optical sensor 2215.
A proximity sensor 2216, also referred to as a distance sensor, is provided at the front panel of the terminal 2200. The proximity sensor 2216 is used to collect the distance between the user and the front of the terminal 2200. In one embodiment, when the proximity sensor 2216 detects a gradual decrease in the distance between the user and the front face of the terminal 2200, the processor 2201 controls the display 2205 to switch from the bright screen state to the off screen state; when the proximity sensor 2216 detects that the distance between the user and the front surface of the terminal 2200 gradually increases, the display 2205 is controlled by the processor 2201 to switch from the off-screen state to the on-screen state.
It will be appreciated by those skilled in the art that the structure shown in fig. 22 is not limiting and that more or fewer components than shown may be included or certain components may be combined or a different arrangement of components may be employed.
Optionally, the computer device is provided as a server. Fig. 23 is a schematic diagram of a server according to an exemplary embodiment, where the server 2300 may include one or more processors (Central Processing Units, CPU) 2301 and one or more memories 2302, where the memory 2302 stores at least one program code that is loaded and executed by the processor 2301 to implement the methods provided by the above-described method embodiments. Of course, the server may also have a wired or wireless network interface, a keyboard, an input/output interface, and other components for implementing the functions of the device, which are not described herein.
The embodiment of the present application also provides a computer readable storage medium, in which at least one program code is stored, where the at least one program code is loaded and executed by a processor, to implement an operation performed in the live interface display method of the above embodiment, or to implement an operation performed in the data upload method of the above embodiment; or to implement the operations performed in the data delivery method as described in the above embodiments.
The embodiment of the present application also provides a computer program, where at least one program code is stored, where the at least one program code is loaded and executed by a processor, to implement an operation performed in the live interface display method of the above embodiment, or to implement an operation performed in the data uploading method of the above embodiment; or to implement the operations performed in the data delivery method as described in the above embodiments.
It will be understood by those skilled in the art that all or part of the steps for implementing the above embodiments may be implemented by hardware, or may be implemented by a program for instructing relevant hardware, where the program may be stored in a computer readable storage medium, and the above storage medium may be a read-only memory, a magnetic disk or an optical disk, etc.
The foregoing is merely an alternative embodiment of the present application and is not intended to limit the embodiment of the present application, and any modifications, equivalent substitutions, improvements, etc. made within the spirit and principle of the embodiment of the present application should be included in the protection scope of the present application.

Claims (25)

Displaying a virtual scene in a live interface of the live broadcast room, wherein the live broadcast room is configured to live-broadcast a first change process of the virtual scene following the anchor's operations, the virtual scene comprises an anchor-controlled virtual character that can move in the virtual environment, the live broadcast room reproduces the first change process of the virtual scene following the anchor's operations through acquired state data and instructions, the anchor's operations comprise an operation of controlling the virtual character to move and an operation of controlling the virtual character to release a skill, and the instructions are at least one instruction in an instruction packet obtained by packaging the instructions when the anchor server uploads the instructions;
The live interface further displays a global identifier and a character identifier of at least one virtual character in the virtual scene; in response to a trigger operation on the global identifier, a global view of the virtual scene is displayed in the live interface; in response to a trigger operation on the character identifier of any virtual character, a virtual scene corresponding to that virtual character is displayed in the live interface; in response to a sliding operation triggered based on the live interface, a target position is determined according to the sliding direction and sliding distance of the sliding operation and the position of the virtual character controlled by the anchor, and a live environment picture with the target position as the observation target is displayed in the live interface, the sliding operation being an operation of moving the picture currently displayed in the live interface; in response to the sliding operation being released, the virtual scene corresponding to the target position continues to be displayed in the live interface; or, in response to the sliding operation being released, the currently displayed virtual scene in the live interface is updated to the virtual scene displayed before the sliding operation was triggered; or, the virtual scene comprises a virtual character controlled based on an anchor account, and in response to the sliding operation being released, display of the virtual scene corresponding to the target position is canceled in the live interface and the virtual scene corresponding to that virtual character is displayed.
In response to an access operation on the live broadcast room, acquiring state data of a virtual scene corresponding to the live broadcast room, wherein the state data indicates a state of a virtual object in the virtual scene, the live broadcast room is configured to live-broadcast a first change process of the virtual scene following the anchor's operations, the virtual scene comprises an anchor-controlled virtual character that can move in the virtual environment, the live broadcast room reproduces the first change process of the virtual scene following the anchor's operations through the acquired state data and instructions, the anchor's operations comprise an operation of controlling the virtual character to move and an operation of controlling the virtual character to release a skill, and the instructions are at least one instruction in an instruction packet obtained by packaging the instructions when the anchor server uploads the instructions;
The live interface further displays a global identifier and a character identifier of at least one virtual character in the virtual scene; in response to a trigger operation on the global identifier, a global view of the virtual scene is displayed in the live interface; in response to a trigger operation on the character identifier of any virtual character, a virtual scene corresponding to that virtual character is displayed in the live interface; in response to a sliding operation triggered based on the live interface, a target position is determined according to the sliding direction and sliding distance of the sliding operation and the position of the virtual character controlled by the anchor, and a live environment picture with the target position as the observation target is displayed in the live interface, the sliding operation being an operation of moving the picture currently displayed in the live interface; in response to the sliding operation being released, the virtual scene corresponding to the target position continues to be displayed in the live interface; or, in response to the sliding operation being released, the currently displayed virtual scene in the live interface is updated to the virtual scene displayed before the sliding operation was triggered; or, the virtual scene comprises a virtual character controlled based on an anchor account, and in response to the sliding operation being released, display of the virtual scene corresponding to the target position is canceled in the live interface and the virtual scene corresponding to that virtual character is displayed.
Uploading state data of a virtual scene corresponding to a live broadcast room to a live broadcast server, wherein the state data indicates a state of a virtual object in the virtual scene, the live broadcast room is configured to live-broadcast a first change process of the virtual scene following the anchor's operations, the virtual scene comprises an anchor-controlled virtual character that can move in the virtual environment, the live broadcast room reproduces the first change process of the virtual scene following the anchor's operations through the acquired state data and instructions, the anchor's operations comprise an operation of controlling the virtual character to move and an operation of controlling the virtual character to release a skill, and the instructions are at least one instruction in an instruction packet obtained by packaging the instructions when the anchor server uploads the instructions;
The live interface displays a global identifier and a character identifier of at least one virtual character in the virtual scene; in response to a trigger operation on the global identifier, a global view of the virtual scene is displayed in the live interface; in response to a trigger operation on the character identifier of any virtual character, a virtual scene corresponding to that virtual character is displayed in the live interface; in response to a sliding operation triggered based on the live interface, a target position is determined according to the sliding direction and sliding distance of the sliding operation and the position of the virtual character controlled by the anchor, and a live environment picture with the target position as the observation target is displayed in the live interface, the sliding operation being an operation of moving the picture currently displayed in the live interface; in response to the sliding operation being released, the virtual scene corresponding to the target position continues to be displayed in the live interface; or, in response to the sliding operation being released, the currently displayed virtual scene in the live interface is updated to the virtual scene displayed before the sliding operation was triggered; or, the virtual scene comprises a virtual character controlled based on an anchor account, and in response to the sliding operation being released, display of the virtual scene corresponding to the target position is canceled in the live interface and the virtual scene corresponding to that virtual character is displayed.
the live interface displays a global identifier and a character identifier of at least one virtual character in the virtual scene, and the audience client is further configured to: in response to a trigger operation on the global identifier, display a global view of the virtual scene in the live interface; in response to a trigger operation on the character identifier of any virtual character, display a virtual scene corresponding to that virtual character in the live interface; in response to a sliding operation triggered based on the live interface, determine a target position according to the sliding direction and sliding distance of the sliding operation and the position of the virtual character controlled by the anchor, and display, in the live interface, a live environment picture with the target position as the observation target, the sliding operation being an operation of moving the picture currently displayed in the live interface; in response to the sliding operation being released, continue to display the virtual scene corresponding to the target position in the live interface; or, in response to the sliding operation being released, update the currently displayed virtual scene in the live interface to the virtual scene displayed before the sliding operation was triggered; or, where the virtual scene comprises a virtual character controlled based on an anchor account, in response to the sliding operation being released, cancel display of the virtual scene corresponding to the target position in the live interface and display the virtual scene corresponding to that virtual character.
a display module, configured to display a virtual scene in a live interface of the live broadcast room, wherein the live broadcast room is configured to live-broadcast a first change process of the virtual scene following the anchor's operations, the virtual scene comprises an anchor-controlled virtual character that can move in the virtual environment, the live broadcast room reproduces the first change process of the virtual scene following the anchor's operations through acquired state data and instructions, the anchor's operations comprise an operation of controlling the virtual character to move and an operation of controlling the virtual character to release a skill, and the instructions are at least one instruction in an instruction packet obtained by packaging the instructions when the anchor server uploads the instructions;
the live interface further displays a global identifier and a character identifier of at least one virtual character in the virtual scene, and the display module is further configured to: in response to a trigger operation on the global identifier, display a global view of the virtual scene in the live interface; in response to a trigger operation on the character identifier of any virtual character, display a virtual scene corresponding to that virtual character in the live interface; in response to a sliding operation triggered based on the live interface, determine a target position according to the sliding direction and sliding distance of the sliding operation and the position of the virtual character controlled by the anchor, and display, in the live interface, a live environment picture with the target position as the observation target, the sliding operation being an operation of moving the picture currently displayed in the live interface; in response to the sliding operation being released, continue to display the virtual scene corresponding to the target position in the live interface; or, in response to the sliding operation being released, update the currently displayed virtual scene in the live interface to the virtual scene displayed before the sliding operation was triggered; or, where the virtual scene comprises a virtual character controlled based on an anchor account, in response to the sliding operation being released, cancel display of the virtual scene corresponding to the target position in the live interface and display the virtual scene corresponding to that virtual character;
The data acquisition module is configured to, in response to an access operation on the live room, acquire state data of the virtual scene corresponding to the live room, the state data indicating states of virtual objects in the virtual scene. The live room is used to broadcast a first change process of the virtual scene under the anchor's operations; the virtual scene includes an anchor-controlled virtual character that can move within the virtual environment, and the live room restores the first change process of the virtual scene under the anchor's operations from the acquired state data and instructions. The anchor's operations include operations for controlling movement of the virtual character and operations for controlling skill casting by the virtual character, and the instructions are at least one instruction from an instruction packet obtained by packaging instructions when they are uploaded from the anchor side to the server;
The live interface further displays a global-view identifier and a character identifier of at least one virtual character in the virtual scene, and the display module is further configured to: in response to a trigger operation on the global-view identifier, display the full view of the virtual scene in the live interface; in response to a trigger operation on the character identifier of any virtual character, display the virtual scene corresponding to that virtual character in the live interface; in response to a sliding operation triggered on the live interface, determine a target position from the sliding direction and sliding distance of the operation together with the position of the anchor-controlled virtual character, and display in the live interface an environment picture that takes the target position as the observation target, the sliding operation being an operation for moving the picture currently displayed in the live interface; in response to the sliding operation being released, continue to display the virtual scene corresponding to the target position in the live interface; or, in response to the sliding operation being released, update the currently displayed virtual scene in the live interface to the virtual scene displayed before the sliding operation was triggered; or, where the virtual scene includes a virtual character controlled through an anchor account, in response to the sliding operation being released, cancel display of the virtual scene corresponding to the target position in the live interface and display the virtual scene corresponding to that virtual character.
The uploading module is configured to upload, to the server, state data of the virtual scene corresponding to the live room, the state data indicating states of virtual objects in the virtual scene. The live room is used to broadcast a first change process of the virtual scene under the anchor's operations; the virtual scene includes an anchor-controlled virtual character that can move within the virtual environment, and the live room restores the first change process of the virtual scene under the anchor's operations from the acquired state data and instructions. The anchor's operations include operations for controlling movement of the virtual character and operations for controlling skill casting by the virtual character, and the instructions are at least one instruction from an instruction packet obtained by packaging instructions when they are uploaded from the anchor side to the server;
The live interface displays a global-view identifier and a character identifier of at least one virtual character in the virtual scene; in response to a trigger operation on the global-view identifier, the full view of the virtual scene is displayed in the live interface; in response to a trigger operation on the character identifier of any virtual character, the virtual scene corresponding to that virtual character is displayed in the live interface; in response to a sliding operation triggered on the live interface, a target position is determined from the sliding direction and sliding distance of the operation together with the position of the anchor-controlled virtual character, and an environment picture taking the target position as the observation target is displayed in the live interface, the sliding operation being an operation for moving the picture currently displayed in the live interface; in response to the sliding operation being released, the virtual scene corresponding to the target position continues to be displayed in the live interface; or, in response to the sliding operation being released, the currently displayed virtual scene in the live interface is updated to the virtual scene displayed before the sliding operation was triggered; or, where the virtual scene includes a virtual character controlled through an anchor account, in response to the sliding operation being released, display of the virtual scene corresponding to the target position is canceled in the live interface and the virtual scene corresponding to that virtual character is displayed.
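The uploading path described above, in which anchor-side operations are packaged into instruction packets before reaching the server, might be sketched like this. The JSON layout, the `max_batch` size, and the field names (`seq`, `instructions`, `op`) are illustrative assumptions, not the patent's wire format.

```python
import json


def pack_instructions(instructions, max_batch=16):
    """Group pending operation instructions into packets for upload.

    Each instruction might look like {"op": "move", "dx": 1, "dy": 0}
    or {"op": "cast_skill", "skill_id": 3}; batching several per packet
    reduces per-message overhead on the uplink. Returns one JSON string
    per packet, each tagged with a sequence number.
    """
    packets = []
    for i in range(0, len(instructions), max_batch):
        packets.append(json.dumps({
            "seq": i // max_batch,
            "instructions": instructions[i:i + max_batch],
        }))
    return packets
```

On the receiving side, the server can forward each packet as-is to the audience clients, so "at least one instruction in an instruction packet" is exactly what a viewer replays.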
The transmitting module is configured to deliver the acquired instructions to the audience clients in the live room, the live room being used to broadcast a first change process of the virtual scene under the anchor's operations. The live room restores the first change process of the virtual scene under the anchor's operations from the acquired state data and instructions; the anchor's operations include operations for controlling movement of the virtual character and operations for controlling skill casting by the virtual character, and the instructions are at least one instruction from an instruction packet obtained by packaging instructions when they are uploaded from the anchor side to the server;
The live interface displays a global-view identifier and a character identifier of at least one virtual character in the virtual scene, and the audience client is further configured to: in response to a trigger operation on the global-view identifier, display the full view of the virtual scene in the live interface; in response to a trigger operation on the character identifier of any virtual character, display the virtual scene corresponding to that virtual character in the live interface; in response to a sliding operation triggered on the live interface, determine a target position from the sliding direction and sliding distance of the operation together with the position of the anchor-controlled virtual character, and display in the live interface an environment picture that takes the target position as the observation target, the sliding operation being an operation for moving the picture currently displayed in the live interface; in response to the sliding operation being released, continue to display the virtual scene corresponding to the target position in the live interface; or, in response to the sliding operation being released, update the currently displayed virtual scene in the live interface to the virtual scene displayed before the sliding operation was triggered; or, where the virtual scene includes a virtual character controlled through an anchor account, in response to the sliding operation being released, cancel display of the virtual scene corresponding to the target position in the live interface and display the virtual scene corresponding to that virtual character.
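Putting the two data sources together, a late-joining audience client could restore the first change process roughly as the modules above describe: start from the state snapshot acquired on room entry, then replay the delivered instruction packets. The instruction vocabulary (`move`, `cast_skill`) and the snapshot layout are assumptions for illustration only.

```python
import copy


def restore_scene(state_snapshot, instruction_packets):
    """Rebuild the scene a late-joining viewer sees.

    Starts from the state snapshot fetched on room entry (e.g.
    {"anchor": {"x": 0, "y": 0}}), then applies each instruction in
    each delivered packet in order. The snapshot is deep-copied so the
    caller's copy is left untouched.
    """
    scene = copy.deepcopy(state_snapshot)
    for packet in instruction_packets:
        for ins in packet["instructions"]:
            if ins["op"] == "move":
                scene["anchor"]["x"] += ins["dx"]
                scene["anchor"]["y"] += ins["dy"]
            elif ins["op"] == "cast_skill":
                scene.setdefault("skills_cast", []).append(ins["skill_id"])
    return scene
```

Because the snapshot fixes the starting state and the packets are applied in sequence order, every viewer converges on the same scene regardless of when they entered the room, which is the point of combining state data with instruction replay.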
CN202110586123.8A · 2021-05-27 · 2021-05-27 · Live broadcast interface display method, data uploading method and data issuing method · Active · CN113318442B (en)

Priority Applications (1)

Application Number · Priority Date · Filing Date · Title
CN202110586123.8A · CN113318442B (en) · 2021-05-27 · 2021-05-27 · Live broadcast interface display method, data uploading method and data issuing method

Applications Claiming Priority (1)

Application Number · Priority Date · Filing Date · Title
CN202110586123.8A · CN113318442B (en) · 2021-05-27 · 2021-05-27 · Live broadcast interface display method, data uploading method and data issuing method

Publications (2)

Publication Number · Publication Date
CN113318442A (en) · 2021-08-31
CN113318442B (en) · 2024-07-19

Family

ID=77421675

Family Applications (1)

Application Number · Title · Priority Date · Filing Date
CN202110586123.8A · Active · CN113318442B (en) · 2021-05-27 · 2021-05-27

Country Status (1)

Country · Link
CN (1) · CN113318442B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number · Priority date · Publication date · Assignee · Title
CN113765908B (en)* · 2021-09-01 · 2023-07-07 · 南京炫佳网络科技有限公司 · Data acquisition method, device, equipment and storage medium
CN116088739A · 2021-11-04 · 2023-05-09 · 北京字跳网络技术有限公司 · Live broadcast interface display method, device, equipment, storage medium and program product
WO2023133801A1 (en)* · 2022-01-14 · 2023-07-20 · 上海莉莉丝科技股份有限公司 · Data processing method, system, medium, and computer program product
CN114745598B (en)* · 2022-04-12 · 2024-03-19 · 北京字跳网络技术有限公司 · Video data display method and device, electronic equipment and storage medium
CN116170614A (en)* · 2023-02-10 · 2023-05-26 · 北京达佳互联信息技术有限公司 · Live broadcast method, live broadcast device, electronic equipment and computer readable storage medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number · Priority date · Publication date · Assignee · Title
CN106162369A (en)* · 2016-06-29 · 2016-11-23 · 腾讯科技(深圳)有限公司 · A kind of realize in virtual scene interactive method, Apparatus and system
CN108668163A (en)* · 2018-05-03 · 2018-10-16 · 广州虎牙信息科技有限公司 · Live broadcast method, device, computer-readable storage medium and computer equipment
CN110557625A (en)* · 2019-09-17 · 2019-12-10 · 北京达佳互联信息技术有限公司 · Live virtual image broadcasting method, terminal, computer equipment and storage medium
CN111629225A (en)* · 2020-07-14 · 2020-09-04 · 腾讯科技(深圳)有限公司 · Visual angle switching method, device and equipment for live broadcast of virtual scene and storage medium

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number · Priority date · Publication date · Assignee · Title
US20060251382A1 (en)* · 2005-05-09 · 2006-11-09 · Microsoft Corporation · System and method for automatic video editing using object recognition
JP2020088588A (en)* · 2018-11-23 · 2020-06-04 · ネクシオン株式会社 · Remote production system


Also Published As

Publication number · Publication date
CN113318442A (en) · 2021-08-31

Similar Documents

Publication · Publication Date · Title
CN113318442B (en) · Live broadcast interface display method, data uploading method and data issuing method
CN111589167B (en) · Event sightseeing method, device, terminal, server and storage medium
CN110740340B (en) · Video live broadcast method and device and storage medium
CN112118477B (en) · Virtual gift display method, device, equipment and storage medium
CN110139116B (en) · Live broadcast room switching method and device and storage medium
CN111246236B (en) · Interactive data playing method, device, terminal, server and storage medium
CN111921197B (en) · Method, device, terminal and storage medium for displaying game playback picture
CN113230655B (en) · Virtual object control method, device, equipment, system and readable storage medium
CN111918090B (en) · Live broadcast picture display method and device, terminal and storage medium
CN111050189A (en) · Live broadcast method, apparatus, device, storage medium, and program product
CN113204672B (en) · Resource display method, device, computer equipment and medium
CN112612439A (en) · Bullet screen display method and device, electronic equipment and storage medium
CN110772793A (en) · Virtual resource configuration method and device, electronic equipment and storage medium
CN111818358A (en) · Audio file playing method and device, terminal and storage medium
CN114116053A (en) · Resource display method and device, computer equipment and medium
CN112188268B (en) · Virtual scene display method, virtual scene introduction video generation method and device
CN113411680A (en) · Multimedia resource playing method, device, terminal and storage medium
KR102756416B1 (en) · Method for controlling virtual objects, apparatus, device and computer-readable storage medium
CN111544897B (en) · Video clip display method, device, equipment and medium based on virtual scene
CN112104648A (en) · Data processing method, device, terminal, server and storage medium
CN113204671A (en) · Resource display method, device, terminal, server, medium and product
CN113134232A (en) · Virtual object control method, device, equipment and computer readable storage medium
CN111669640A (en) · Virtual article transfer special effect display method, device, terminal and storage medium
CN113141538B (en) · Media resource playing method, device, terminal, server and storage medium
CN112973116B (en) · Virtual scene picture display method and device, computer equipment and storage medium

Legal Events

Date · Code · Title · Description
PB01 · Publication
SE01 · Entry into force of request for substantive examination
GR01 · Patent grant
