Disclosure of Invention
The embodiments of the present application provide a picture display method and apparatus, a computer device, and a storage medium. Two independent window processes are created, so that a live broadcast picture is played in a main window while special-effect animations are played in a sub-window. This avoids the stuttering that occurs when a live client plays a live picture and multiple special-effect animations at the same time, and improves the fluency of picture display in the live client.
An embodiment of the present application provides a picture display method, which includes the following steps:
responding to a picture display request triggered at a live client, displaying a live picture in a main window of the live client, and creating a sub-window in the live client; wherein the sub-window is superimposed on the main window;
when a special-effect playing instruction of a virtual gift sent by a live broadcast server is received, obtaining special-effect information corresponding to the special-effect playing instruction;
and generating and displaying special effect animation in the child window based on the special effect information.
Optionally, the creating a sub-window in the live client includes:
determining a main window process associated with the main window, and acquiring a main window handle corresponding to the main window process;
creating a child window process, forwarding window attribute data corresponding to the main window handle to the child window process, and generating the child window handle based on the window attribute data, wherein the window attribute data at least comprises window size data and window position data;
the child window process creates the child window based on the window attribute data, wherein the main window process and the child window process communicate via an interprocess communication link.
Optionally, when a special-effect playing instruction sent by a live broadcast server is received, obtaining special-effect information corresponding to the special-effect playing instruction includes:
when the main window process receives the special-effect playing instruction sent by the live broadcast server, the main window process forwards the special-effect playing instruction to the sub-window process through the inter-process communication link, so that the sub-window process obtains special-effect information corresponding to the special-effect playing instruction.
Optionally, after creating the sub-window in the live client, the method further includes:
when detecting that the main window process receives a processing instruction, the main window process forwards the processing instruction to the sub-window process through the inter-process communication link, so that the sub-window process responds to the processing instruction.
Optionally, the main window displays a first function control;
when it is detected that the main window process receives the processing instruction, the main window process forwards the processing instruction to the sub-window process through the inter-process communication link, further comprising:
generating a processing instruction in response to a triggering operation for the first function control, wherein the processing instruction comprises: a size adjustment parameter;
the main window process forwards the processing instruction to the sub-window process through the inter-process communication link, including:
adjusting window size data corresponding to the main window handle according to the size adjustment parameter so as to adjust the size of the main window;
and the main window process forwards the processing instruction and the size adjusting parameter to the sub-window process through the inter-process communication link, so that the sub-window process responds to the processing instruction and adjusts window size data corresponding to the sub-window handle according to the size adjusting parameter to adjust the size of the sub-window.
Optionally, a second function control is displayed in the main window;
when it is detected that the main window process receives the processing instruction, the main window process forwards the processing instruction to the sub-window process through the inter-process communication link, further comprising:
generating a processing instruction in response to a triggering operation for the second function control, wherein the processing instruction comprises: a status indication parameter;
the main window process forwards the processing instruction to the sub-window process through the inter-process communication link, including:
when the main window process is detected to receive the processing instruction, the main window process forwards the state indication parameter to the sub-window process through the inter-process communication link, so that the sub-window process responds to the state indication parameter and closes the sub-window; meanwhile, the main window process responds to the state indication parameter and closes the main window.
Optionally, a third function control is displayed in the main window;
when it is detected that the main window process receives the processing instruction, the main window process forwards the processing instruction to the sub-window process through the inter-process communication link, further comprising:
responding to a triggering operation of the current user for the third function control, and generating a session object list in the main window, wherein the session object list shows a plurality of virtual avatars, and each virtual avatar is associated with a virtual account;
generating a processing instruction in response to a triggering operation for a target virtual avatar, wherein the processing instruction comprises: information display parameters;
the main window process forwards the processing instruction to the sub-window process through the inter-process communication link, including:
the main window process sends the processing instruction to the sub-window process so that the sub-window process generates and displays an information interaction interface in the sub-window based on the information display parameter; the information interaction interface is used for displaying interaction information between users.
Optionally, after creating the sub-window in the live client, the method further includes:
the child window process receives a movement instruction, wherein the movement instruction comprises a position adjustment parameter;
and the child window process adjusts the window position data corresponding to the child window handle according to the position adjustment parameter so as to adjust the position relation between the child window and the main window.
Optionally, after creating the sub-window in the live client, the method further includes:
receiving a bullet screen playing instruction sent by the live broadcast server;
and generating and displaying bullet screen information in the child window based on the bullet screen playing instruction, wherein the bullet screen information is used for describing the virtual gift and a source user of the virtual gift.
Optionally, the generating and displaying the bullet screen information in the sub-window based on the bullet screen playing instruction further includes:
acquiring position data of bullet screen information to be displayed and position data of special-effect animation to be displayed, which are indicated by the bullet screen playing instruction;
the child window process determines whether the position data of the bullet screen information to be displayed is the same as the position data of the special effect animation to be displayed;
and if so, the child window process sets the display priority of the special effect animation to be higher than the display priority of the bullet screen information to be displayed.
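The position check and priority rule above can be sketched as follows. This is an illustrative sketch only: the position representation (simple tuples) and the numeric priority scale are assumptions, not the application's actual data model.

```python
def resolve_display_priority(bullet_pos, effect_pos):
    """If the bullet screen information to be displayed and the special
    effect animation to be displayed occupy the same position, give the
    special effect animation the higher display priority (a larger value
    means drawn on top; the numeric scale is illustrative)."""
    if bullet_pos == effect_pos:
        return {"effect": 1, "bullet": 0}  # effect drawn above the comment
    return {"effect": 0, "bullet": 0}      # no overlap: default priorities
```

When the two positions differ, both items keep their default priority and can be rendered independently.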
An embodiment of the present application further provides a picture display apparatus, including:
a response unit, configured to display, in response to a picture display request triggered at a live client, a live picture in a main window of the live client and create a sub-window in the live client, wherein the sub-window is superimposed on the main window;
an obtaining unit, configured to obtain, when a special-effect playing instruction of a virtual gift sent by a live broadcast server is received, special-effect information corresponding to the special-effect playing instruction; and
a generating unit, configured to generate and display a special-effect animation in the sub-window based on the special-effect information.
Optionally, the picture display apparatus further includes a determination unit, and the determination unit is configured to:
determining a main window process associated with the main window, and acquiring a main window handle corresponding to the main window process;
creating a child window process, forwarding window attribute data corresponding to the main window handle to the child window process, and generating the child window handle based on the window attribute data, wherein the window attribute data at least comprises window size data and window position data;
the child window process creates the child window based on the window attribute data, wherein the main window process and the child window process communicate via an interprocess communication link.
Optionally, the picture display apparatus further includes a sending unit, and the sending unit is configured to:
when the main window process receives the special-effect playing instruction sent by the live broadcast server, the main window process forwards the special-effect playing instruction to the sub-window process through the inter-process communication link, so that the sub-window process obtains special-effect information corresponding to the special-effect playing instruction.
Optionally, the sending unit is further configured to:
when detecting that the main window process receives a processing instruction, the main window process forwards the processing instruction to the sub-window process through the inter-process communication link, so that the sub-window process responds to the processing instruction.
Optionally, the picture display apparatus further includes a first response unit, and the first response unit is configured to:
the main window is displayed with a first function control;
generating a processing instruction in response to a triggering operation for the first function control, wherein the processing instruction comprises: a size adjustment parameter;
the main window process forwards the processing instruction to the sub-window process through the inter-process communication link, including:
adjusting window size data corresponding to the main window handle according to the size adjustment parameter so as to adjust the size of the main window;
and the main window process forwards the processing instruction and the size adjusting parameter to the sub-window process through the inter-process communication link, so that the sub-window process responds to the processing instruction and adjusts window size data corresponding to the sub-window handle according to the size adjusting parameter to adjust the size of the sub-window.
Optionally, the picture display apparatus further includes a second response unit, and the second response unit is configured to:
the main window is displayed with a second function control;
generating a processing instruction in response to a triggering operation for the second function control, wherein the processing instruction comprises: a status indication parameter;
the main window process forwards the processing instruction to the sub-window process through the inter-process communication link, including:
when the main window process is detected to receive the processing instruction, the main window process forwards the state indication parameter to the sub-window process through the inter-process communication link, so that the sub-window process responds to the state indication parameter and closes the sub-window; meanwhile, the main window process responds to the state indication parameter and closes the main window.
Optionally, the picture display apparatus further includes a third response unit, and the third response unit is configured to:
the main window is displayed with a third function control;
responding to a triggering operation of the current user for the third function control, and generating a session object list in the main window, wherein the session object list shows a plurality of virtual avatars, and each virtual avatar is associated with a virtual account;
generating a processing instruction in response to a triggering operation for a target virtual avatar, wherein the processing instruction comprises: information display parameters;
the main window process forwards the processing instruction to the sub-window process through the inter-process communication link, including:
the main window process sends the processing instruction to the sub-window process so that the sub-window process generates and displays an information interaction interface in the sub-window based on the information display parameter; the information interaction interface is used for displaying interaction information between users.
Optionally, the picture display apparatus further includes an adjusting unit, and the adjusting unit is configured to:
the child window process receives a movement instruction, wherein the movement instruction comprises a position adjustment parameter;
and the child window process adjusts the window position data corresponding to the child window handle according to the position adjustment parameter so as to adjust the position relation between the child window and the main window.
Optionally, the picture display apparatus further includes a receiving unit, and the receiving unit is configured to:
receiving a bullet screen playing instruction sent by the live broadcast server;
and generating and displaying bullet screen information in the child window based on the bullet screen playing instruction, wherein the bullet screen information is used for describing the virtual gift and a source user of the virtual gift.
Optionally, the obtaining unit is further configured to:
acquiring position data of bullet screen information to be displayed and position data of special-effect animation to be displayed, which are indicated by the bullet screen playing instruction;
the child window process determines whether the position data of the bullet screen information to be displayed is the same as the position data of the special effect animation to be displayed;
and if so, the child window process sets the display priority of the special effect animation to be higher than the display priority of the bullet screen information to be displayed.
An embodiment of the present application further provides a computer device, where the computer device includes a memory and a processor, where the memory stores a computer program, and the processor executes the steps in the picture display method according to any one of the above embodiments by calling the computer program stored in the memory.
An embodiment of the present application further provides a computer-readable storage medium storing a computer program, where the computer program is adapted to be loaded by a processor to perform the steps in the picture display method according to any one of the above embodiments.
According to the picture display method and apparatus, the computer device, and the storage medium provided above, in response to a picture display request triggered at a live client, a live picture is displayed in a main window of the live client and a sub-window is created in the live client, wherein the sub-window is superimposed on the main window; when a special-effect playing instruction sent by a live broadcast server is received, special-effect information corresponding to the special-effect playing instruction is obtained; and a special-effect animation is generated and displayed in the sub-window based on the special-effect information. In the embodiments of the present application, two independent window processes are created at the live broadcast receiving end and communicate with each other, so that the live picture is played in the main window and the special-effect animations are played in the sub-window. This avoids the stuttering caused when the live client plays a live picture and multiple special-effect animations simultaneously, and improves the fluency of picture display in the live client.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The embodiments of the present application provide a picture display method, a picture display apparatus, a computer device, and a storage medium. Specifically, the picture display method according to the embodiments of the present application may be executed by a computer device, where the computer device may be a terminal or a server. The terminal may be a terminal device such as a smartphone, a tablet computer, a notebook computer, a touch screen, a personal computer (PC), or a personal digital assistant (PDA), and may also include a client, which may be a live application client, a browser client carrying a live program, an instant messaging client, or the like. The server may be an independent physical server, a server cluster or a distributed system formed by a plurality of physical servers, or a cloud server providing basic cloud computing services such as cloud service, a cloud database, cloud computing, a cloud function, cloud storage, network service, cloud communication, middleware service, domain name service, security service, CDN, and a big data and artificial intelligence platform.
Referring to fig. 1, fig. 1 is a system schematic diagram of a picture display apparatus according to an embodiment of the present disclosure. The system may include at least one computer device 1000, at least one server 2000, and a network 3000. A computer device 1000 held by a user may connect to a server for live applications over the network 3000. The computer device 1000 is any device having computing hardware capable of supporting and executing a software product corresponding to live video. In addition, the computer device 1000 has one or more multi-touch-sensitive screens for sensing and obtaining input performed by a user through touch or slide operations at multiple points of the one or more touch-sensitive display screens. When the system includes a plurality of computer devices 1000, a plurality of servers 2000, and a plurality of networks 3000, different computer devices 1000 may be connected to each other through different networks 3000 and through different servers 2000. The network 3000 may be a wireless network or a wired network, such as a wireless local area network (WLAN), a local area network (LAN), a cellular network, a 2G network, a 3G network, a 4G network, or a 5G network. In addition, different computer devices 1000 may also be connected to other terminals or servers using their own Bluetooth network or hotspot network. For example, multiple users may be online through different computer devices 1000 so as to connect and synchronize with each other over an appropriate network.
The following describes the picture display method according to an embodiment of the present application with reference to fig. 2 to 13.
Referring to fig. 2, fig. 2 is a schematic flow chart of a picture display method according to an embodiment of the present application, and a specific flow of the picture display method may be as follows:
101, responding to a picture display request triggered at a live client, displaying a live picture in a main window of the live client, and creating a sub-window in the live client; wherein the sub-window is superimposed on the main window.
Wherein the computer device may display a main window of the live client on its user interface. A plurality of live broadcast room identifiers are displayed in the main window of the live client, and each live broadcast room identifier is associated with a live broadcast room, so that a user can jump to a corresponding target live broadcast room through a triggering operation for the target live broadcast room identifier, and a live picture corresponding to the target live broadcast room is displayed in the main window of the live client. For example, referring to fig. 3, the computer device displays, on its user interface, the main window of the live client, whose display priority is higher than that of the user interface; that is, the main window of the live client is displayed on the user interface in an overlapping manner. The live broadcast room identifiers displayed in the main window may be static pictures, dynamic pictures, or the like.
Further, when the user performs a triggering operation on the target live broadcast room identifier, the computer device determines a main window process associated with the main window and obtains a main window handle corresponding to the main window process. Then, the computer device creates a sub-window process and forwards the window attribute data corresponding to the main window handle to the sub-window process, and a sub-window handle is generated based on the window attribute data; that is, the window attribute data carried by the main window handle of the main window process is handed over to the newly created sub-window process. The window attribute data includes at least window size data and window position data. Thereafter, the sub-window process creates a sub-window based on the window attribute data, wherein the main window process communicates with the sub-window process via an inter-process communication link.
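The creation flow above can be sketched as follows. This is a minimal, platform-neutral illustration in Python: a simple record type stands in for real OS window handles (e.g. Win32 HWNDs), and an in-memory hand-off stands in for the actual forwarding of attribute data to a separate process; all names are hypothetical.

```python
from dataclasses import dataclass, replace

# Hypothetical stand-in for the window attribute data that a window
# handle points to; a real implementation would query the OS windowing API.
@dataclass(frozen=True)
class WindowAttributes:
    width: int
    height: int
    x: int
    y: int

class WindowProcess:
    """Illustrative window process holding its window's attribute data."""
    def __init__(self, attrs: WindowAttributes):
        self.attrs = attrs

def create_sub_window_process(main: WindowProcess) -> WindowProcess:
    # Forward a copy of the main window's attribute data to the newly
    # created sub-window process, so the sub-window is created with the
    # same size and position and therefore exactly overlays the main window.
    forwarded = replace(main.attrs)
    return WindowProcess(forwarded)
```

A sub-window created this way starts with the main window's size and position, matching the overlapping display described above.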
For example, referring to fig. 4, when the user performs a triggering operation on the target live broadcast room identifier, the live client jumps to the corresponding target live broadcast room and displays the live picture corresponding to the target live broadcast room in the main window of the live client. Meanwhile, a sub-window is displayed on the main window in an overlapping manner: the display priority of the sub-window is higher than that of the main window, so the sub-window is superimposed on the main window.
102, when a special-effect playing instruction sent by a live broadcast server is received, obtaining special-effect information corresponding to the special-effect playing instruction.
Specifically, when the main window process receives a special-effect playing instruction sent by the live broadcast server, the main window process forwards the special-effect playing instruction to the sub-window process through the inter-process communication link, so that the sub-window process receives and responds to the special-effect playing instruction. The sub-window process then responds to the special-effect playing instruction by obtaining, from a locally pre-stored database, the special-effect information corresponding to the special-effect playing instruction.
The special-effect playing instruction may be triggered from the live broadcast receiving end. Referring to fig. 4, the main window of the live client displays a plurality of virtual gift identifiers, and each virtual gift identifier is associated with a virtual special effect. When a user performs a triggering operation on a target virtual gift identifier, the main window process sends a trigger instruction generated by the triggering operation to the live broadcast server; the live broadcast server generates a special-effect playing instruction based on the triggering operation on the target virtual gift identifier and returns the special-effect playing instruction to the main window process of the live broadcast receiving end. When the main window process receives the special-effect playing instruction sent by the live broadcast server, the computer device obtains the special-effect information corresponding to the special-effect playing instruction.
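The relay from the main window process to the sub-window process can be illustrated with Python's `multiprocessing.Pipe` standing in for the inter-process communication link. The message shape and the local effect database below are assumptions made for illustration, not the application's actual protocol.

```python
from multiprocessing import Pipe

# Hypothetical locally pre-stored database mapping gift identifiers
# to special-effect information.
EFFECT_DB = {"gift_rocket": {"effect": "rocket", "duration_ms": 3000}}

def main_window_on_instruction(instruction, ipc_conn):
    # The main window process does not play the effect itself; it relays
    # the special-effect playing instruction to the sub-window process
    # over the inter-process communication link.
    ipc_conn.send(instruction)

def sub_window_handle_instruction(ipc_conn):
    # The sub-window process receives the forwarded instruction and looks
    # up the corresponding special-effect information.
    instruction = ipc_conn.recv()
    return EFFECT_DB.get(instruction["gift_id"])
```

For brevity, both ends of the pipe are exercised in one process here; in the design described above, each end lives in its own window process.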
And 103, generating and displaying the special effect animation in the child window based on the special effect information.
In an embodiment, after the sub-window process obtains the special-effect information corresponding to the special-effect playing instruction, the sub-window process generates and displays a special-effect animation in the sub-window based on the special-effect information, where the special-effect information may include the size of the special-effect animation, a special-effect picture, a start timestamp, an end timestamp, and position information.
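The special-effect information fields listed above could be modeled as a simple record; the field names below are illustrative, not the application's actual schema.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class EffectInfo:
    # Fields suggested by the description: animation size, effect
    # picture(s), start/end timestamps, and position information.
    width: int
    height: int
    frames: List[str] = field(default_factory=list)  # effect picture resources
    start_ts_ms: int = 0
    end_ts_ms: int = 0
    x: int = 0
    y: int = 0

    def duration_ms(self) -> int:
        # Playback duration derived from the two timestamps.
        return self.end_ts_ms - self.start_ts_ms
```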
To sum up, in the embodiments of the present application, two independent window processes are created at the live broadcast receiving end and communicate with each other, so that the live picture is played in the main window and the special-effect animations are played in the sub-window. This avoids the stuttering caused when the live client plays a live picture and multiple special-effect animations simultaneously, and improves the fluency of picture display in the live client.
Referring to fig. 5, fig. 5 is another schematic flow chart of the picture display method according to an embodiment of the present application, and the specific flow of the picture display method may be as follows:
and 201, responding to a picture display request triggered at the live client, and displaying the live picture in a main window of the live client.
Wherein the computer device may display a main window of the live client on its user interface. A plurality of live broadcast room identifiers are displayed in the main window of the live client, and each live broadcast room identifier is associated with a live broadcast room, so that a user can jump to a corresponding target live broadcast room through a triggering operation for the target live broadcast room identifier, and a live picture corresponding to the target live broadcast room is displayed in the main window of the live client. For example, referring to fig. 3, the computer device displays, on its user interface, the main window of the live client, whose display priority is higher than that of the user interface; that is, the main window of the live client is displayed on the user interface in an overlapping manner. The live broadcast room identifiers displayed in the main window may be static pictures, dynamic pictures, or the like.
202, determining a main window process associated with the main window, and obtaining a main window handle corresponding to the main window process.
Further, when the user performs a trigger operation on the target live broadcast room identifier, the computer device determines a main window process associated with the main window, and obtains a main window handle corresponding to the main window process. The main window handle points to window attribute data corresponding to the main window, and the window attribute data at least comprises window size data and window position data.
203, creating a child window process, forwarding window attribute data corresponding to the main window handle to the child window process, and generating the child window handle based on the window attribute data; the child window process creates a child window based on the window attribute data, wherein the main window process communicates with the child window process via an interprocess communication link.
Optionally, after creating the sub-window process and generating the sub-window, when it is detected that the main window process receives the processing instruction, the main window process forwards the processing instruction to the sub-window process through the inter-process communication link, so that the sub-window process responds to the processing instruction.
This avoids a situation in which, after the size of the main window is changed, the size of the special-effect picture no longer matches the size of the live picture, and thereby ensures that the special effect and the live picture are displayed together correctly. In one embodiment, the main window displays a first function control, and the processing instruction includes a size adjustment parameter. Specifically, the computer device may generate, in response to a triggering operation on the first function control, a processing instruction to adjust the size of the main window and the size of the sub-window. When it is detected that the main window process receives the processing instruction, the window size data corresponding to the main window handle is adjusted according to the size adjustment parameter so as to adjust the size of the main window. Then, the main window process forwards the processing instruction and the size adjustment parameter to the sub-window process through the inter-process communication link, so that the sub-window process responds to the processing instruction and adjusts the window size data corresponding to the sub-window handle according to the size adjustment parameter to adjust the size of the sub-window.
For example, referring to fig. 6, when the user performs a triggering operation on the first function control, the computer device may generate a processing instruction in response to the triggering operation and send the processing instruction to the main window process. When it is detected that the main window process receives the processing instruction, the window size data corresponding to the main window handle is adjusted according to the size adjustment parameter so as to adjust the size of the main window. Then, the main window process forwards the processing instruction and the size adjustment parameter to the sub-window process through the inter-process communication link, so that the sub-window process responds to the processing instruction and adjusts the window size data corresponding to the sub-window handle according to the size adjustment parameter to adjust the size of the sub-window; in this way, the size of the sub-window changes along with the size of the main window.
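The size-synchronization step can be sketched as below, with plain dictionaries standing in for the window attribute data addressed by the two handles; the parameter names are illustrative assumptions.

```python
def apply_size_adjustment(main_attrs, sub_attrs, size_params):
    # The main window process adjusts its own window size data, then
    # forwards the same size adjustment parameter to the sub-window
    # process, which applies it to the data addressed by the sub-window
    # handle, so the sub-window's size tracks the main window's size.
    for attrs in (main_attrs, sub_attrs):
        attrs["width"] = size_params["width"]
        attrs["height"] = size_params["height"]
    return main_attrs, sub_attrs
```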
In one embodiment, the main window displays a second function control, and the processing instruction includes a status indication parameter. Specifically, the computer device may generate, in response to a triggering operation on the second function control, a processing instruction to close the main window and the sub-window. When it is detected that the main window process receives the processing instruction, the main window process forwards the status indication parameter to the sub-window process through the inter-process communication link, so that the sub-window process responds to the status indication parameter and closes the sub-window; meanwhile, the main window process responds to the status indication parameter and closes the main window.
For example, referring to fig. 7, when the user performs the triggering operation on the second function control, the computer device may generate a processing instruction in response to that operation and send the processing instruction to the main window process. When it is detected that the main window process receives the processing instruction, the main window process forwards the state indication parameter to the sub-window process through the inter-process communication link, so that the sub-window process responds to the state indication parameter and closes the sub-window; meanwhile, the main window process closes the main window in response to the state indication parameter. Both the main window and the sub-window are destroyed, so that the user interface is displayed without being blocked by either window.
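The close flow can be sketched in the same style. All names here are illustrative assumptions; a dictionary tracks which windows are still open, and a `Queue` again stands in for the inter-process communication link.

```python
from queue import Queue

def main_process_close(ipc_link: Queue, open_windows: dict) -> None:
    # Forward the state indication parameter to the child window process...
    ipc_link.put({"status": "close"})
    # ...and close the main window in response to the same parameter.
    open_windows["main"] = False

def child_process_close(ipc_link: Queue, open_windows: dict) -> None:
    instruction = ipc_link.get()
    if instruction.get("status") == "close":
        # Child window process responds to the state indication parameter
        # and destroys the child window.
        open_windows["child"] = False

windows = {"main": True, "child": True}
link = Queue()
main_process_close(link, windows)
child_process_close(link, windows)
print(windows)  # both windows closed; the UI is no longer blocked
```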
In one embodiment, the main window displays a third function control, and the processing instruction includes an information display parameter. Specifically, the main window process responds to a triggering operation of the current user on the third function control and generates a session object list in the main window, where the session object list shows a plurality of virtual avatars, and each virtual avatar is associated with a virtual account. Then, the main window process responds to a triggering operation on a target virtual avatar, generates an information interaction instruction, and sends the information display parameter to the sub-window process, so that the sub-window process generates and displays an information interaction interface in the sub-window based on the information display parameter; the information interaction interface is used for information interaction between the virtual account corresponding to the current user and the virtual account corresponding to the target virtual avatar.
In order to ensure that the special effect and the live broadcast picture are displayed in corresponding combination, in one embodiment, the main window displays a fourth function control, and the processing instruction includes a position adjustment parameter. Specifically, the computer device may generate a processing instruction for adjusting the display position of the main window and the display position of the sub-window in response to a triggering operation on the fourth function control. When it is detected that the main window process receives the processing instruction, the main window process adjusts the window position data corresponding to the main window handle according to the position adjustment parameter, so as to adjust the display position of the main window. Then, the main window process forwards the processing instruction and the position adjustment parameter to the sub-window process through the inter-process communication link, so that the sub-window process responds to the processing instruction and adjusts the window position data corresponding to the sub-window handle according to the position adjustment parameter, thereby adjusting the display position of the sub-window.
Optionally, in order to meet the personalized requirements of the user, the user may adjust the display position of the special-effect picture as needed. In the embodiment of the application, when the computer device detects that the child window process receives the processing instruction, the child window process responds to the processing instruction, where the processing instruction includes a position adjustment parameter. Then, the child window process adjusts the window position data corresponding to the child window handle according to the position adjustment parameter, so as to adjust the positional relationship between the child window and the main window.
For example, referring to fig. 8, when the user performs a triggering operation on the child window, the computer device may generate a processing instruction in response to that operation and send the processing instruction to the child window process. When it is detected that the child window process receives the processing instruction, the child window process responds to the position adjustment parameter and adjusts the window position data corresponding to the child window handle according to that parameter, so as to adjust the display position of the child window, thereby changing the relative positional relationship between the main window and the child window.
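Unlike the resize and close flows, this adjustment is handled by the child window process directly, without going through the main window process. A minimal sketch, with assumed names (`ChildWindow`, `on_move_instruction`):

```python
class ChildWindow:
    """Holds the window position data associated with the child window handle."""

    def __init__(self, x: int, y: int):
        self.x, self.y = x, y

    def on_move_instruction(self, dx: int, dy: int) -> None:
        # Respond to the position adjustment parameter carried by the
        # processing instruction and update the position data, changing
        # the child window's position relative to the main window.
        self.x += dx
        self.y += dy

child = ChildWindow(100, 50)
child.on_move_instruction(30, -10)  # e.g. the user drags the child window
print(child.x, child.y)
```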
204, when receiving a special-effect playing instruction sent by the live broadcast server, obtaining special-effect information corresponding to the special-effect playing instruction.
Specifically, when the main window process receives a special-effect playing instruction sent by the live broadcast server, the computer device first obtains the special-effect information corresponding to the special-effect playing instruction from a local pre-stored database. Then, the main window process forwards the special-effect playing instruction to the sub-window process through the inter-process communication link, so that the sub-window process receives and responds to the special-effect playing instruction and obtains the special-effect information corresponding to it.
The special-effect playing instruction may be triggered at the live broadcast receiving end. Referring to fig. 4, the main window of the live broadcast client displays a plurality of virtual gift identifications, and each virtual gift identification is associated with a virtual special effect. When the user performs a triggering operation on a target virtual gift identification, the main window process sends a trigger instruction generated by the triggering operation to the live broadcast server, the live broadcast server generates a special-effect playing instruction based on the triggering operation on the target virtual gift identification, and returns the special-effect playing instruction to the main window process of the live broadcast receiving end. When the main window process receives the special-effect playing instruction sent by the live broadcast server, the computer device obtains the special-effect information corresponding to the special-effect playing instruction.
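A minimal round trip for this gift trigger can be sketched as follows, assuming an in-process stand-in for the live broadcast server; the gift identifier, the instruction shape, and the local database contents are all illustrative assumptions.

```python
# Hypothetical local pre-stored database mapping gift ids to effect info.
EFFECTS = {"gift_rocket": {"picture": "rocket.webp", "duration_ms": 3000}}

def server_handle_trigger(gift_id: str) -> dict:
    # The live broadcast server generates a special-effect playing
    # instruction based on the triggered virtual gift identification.
    return {"type": "play_effect", "gift_id": gift_id}

def main_window_on_instruction(instruction: dict) -> dict:
    # The main window process looks up the special-effect information
    # corresponding to the instruction in the local database.
    return EFFECTS[instruction["gift_id"]]

instruction = server_handle_trigger("gift_rocket")  # user taps the gift
info = main_window_on_instruction(instruction)
print(info)
```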
And 205, generating and displaying the special effect animation in the child window based on the special effect information.
In an embodiment, after the sub-window process obtains the special-effect information corresponding to the special-effect playing instruction, the sub-window process generates and displays a special-effect animation in the sub-window based on the special-effect information, where the special-effect information may include the size of the special-effect animation, a special-effect picture, a start timestamp, an end timestamp, and position information.
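The listed fields can be grouped into one record. The concrete schema below (field names, units) is an assumption made for illustration; the embodiment only enumerates the kinds of information carried.

```python
from dataclasses import dataclass

@dataclass
class EffectInfo:
    width: int        # size of the special-effect animation
    height: int
    picture: str      # special-effect picture resource
    start_ts: float   # start timestamp, seconds
    end_ts: float     # end timestamp, seconds
    x: int            # position information inside the child window
    y: int

    def duration(self) -> float:
        # Playback length implied by the two timestamps.
        return self.end_ts - self.start_ts

info = EffectInfo(400, 300, "shoes.webp", 10.0, 13.5, 120, 80)
print(info.duration())
```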
Referring to fig. 9, fig. 9 is another schematic flow chart of a screen display method according to an embodiment of the present application, where a specific flow of the screen display method may be as follows:
301, responding to a picture display request triggered at a live client, displaying a live picture in a main window of the live client, and creating a sub-window in the live client; wherein the sub-window is superimposed on the main window.
302, when receiving a special-effect playing instruction sent by a live broadcast server, obtaining special-effect information corresponding to the special-effect playing instruction.
And 303, generating and displaying the special effect animation in the child window based on the special effect information.
And 304, receiving a bullet screen playing instruction sent by the live broadcast server.
Specifically, a bullet screen playing instruction sent by the live broadcast server is received, and bullet screen information is generated and displayed in the child window based on the bullet screen playing instruction, where the bullet screen information is used for describing a virtual gift and the source user of the virtual gift. For example, when the main window process receives a bullet screen playing instruction sent by the live broadcast server, the computer device first obtains the bullet screen information corresponding to the bullet screen playing instruction from a local pre-stored database. Then, the main window process forwards the bullet screen playing instruction to the sub-window process through the inter-process communication link, so that the sub-window process receives and responds to the bullet screen playing instruction and obtains the bullet screen information corresponding to it.
Further, the sub-window process generates and displays the bullet screen information, which may include the size of the target bullet screen, the bullet screen content, a start timestamp, an end timestamp, and position information.
305, acquiring position data of bullet screen information to be displayed and position data of special-effect animation to be displayed, which are indicated by a bullet screen playing instruction; the child window process determines whether the position data of the bullet screen information to be displayed is the same as the position data of the special effect animation to be displayed; if yes, the child window process sets the display priority of the special effect animation to be higher than the priority of the target bullet screen.
For example, referring to fig. 10, when it is detected that a special effect animation (high-heeled shoes) and a bullet screen are displayed in the current sub-window, and the sub-window process determines that there is display position overlap between the special effect animation and the position data of the target bullet screen, the sub-window process sets the display priority of the special effect animation to be higher than that of the target bullet screen, that is, the special effect animation is displayed on the target bullet screen in an overlapping manner.
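Step 305 can be sketched as a simple overlap check. The rectangle layout `(x, y, width, height)` and the priority encoding (a larger number drawn on top) are assumptions made for illustration.

```python
def rects_overlap(a: tuple, b: tuple) -> bool:
    # Axis-aligned rectangles given as (x, y, width, height).
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    return ax < bx + bw and bx < ax + aw and ay < by + bh and by < ay + ah

def assign_priority(effect_rect: tuple, bullet_rect: tuple) -> dict:
    priorities = {"effect": 1, "bullet": 1}  # equal priority by default
    if rects_overlap(effect_rect, bullet_rect):
        # Position data coincides: raise the special-effect animation's
        # display priority so it is drawn over the target bullet screen.
        priorities["effect"] = 2
    return priorities

print(assign_priority((0, 0, 100, 100), (50, 50, 200, 30)))   # overlap
print(assign_priority((0, 0, 10, 10), (500, 500, 40, 10)))    # disjoint
```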
In summary, the display priority of the special-effect animation is higher than that of the target bullet screen, so that the special-effect animation is not shielded by the target bullet screen when the special-effect animation is displayed, and the display effect of the special-effect animation can be improved.
Referring to fig. 11, fig. 11 is another schematic flow chart of the screen display method according to the embodiment of the present application, taking interaction between a live broadcast server and a main window process and a sub window process of a live broadcast client as an example, the specific flow is as follows:
401, the main window process of the live client responds to a trigger operation triggered at the live client.
In one embodiment, a plurality of virtual gift identifications are displayed in the main window of the live client, and each virtual gift identification is associated with a virtual special effect. The user may perform a trigger operation on a target virtual gift identification; for example, the trigger operation may be clicking or long-pressing the target virtual gift identification.
402, the main window process of the live client sends the trigger operation to the live server.
Specifically, when it is detected that a trigger operation is performed in the main window of the live broadcast client, the main window process sends a trigger instruction generated by the trigger operation to the live broadcast server, so that the live broadcast server generates a special-effect playing instruction.
403, the live broadcast server generates a special-effect playing instruction based on the trigger operation.
404, the live broadcast server sends the special-effect playing instruction to the main window process of the live broadcast client.
405, a main window process of the live broadcast client receives a special-effect playing instruction returned by the live broadcast server, and obtains special-effect information corresponding to the special-effect playing instruction.
Specifically, when the main window process receives a special-effect playing instruction sent by the live broadcast server, first, the computer device obtains special-effect information corresponding to the special-effect playing instruction from a local pre-stored database.
406, the main window process of the live client sends the special-effect playing instruction to the sub-window process of the live client.
The main window process forwards the special-effect playing instruction to the sub-window process through the inter-process communication link, so that the sub-window process receives and responds to the special-effect playing instruction.
407, the sub-window process of the live broadcast client responds to the special-effect playing instruction, and generates and displays a special-effect animation in the sub-window based on the special-effect information.
In an embodiment, after the sub-window process obtains the special-effect information corresponding to the special-effect playing instruction, the sub-window process generates and displays a special-effect animation in the sub-window based on the special-effect information, where the special-effect information may include the size of the special-effect animation, a special-effect picture, a start timestamp, an end timestamp, and position information.
To sum up, in the embodiment of the present application, two independent window processes are created at a live broadcast receiving end, and communication is performed between the two independent window processes, so that a live broadcast picture is played on a main window, and a special-effect animation is played in a sub-window; therefore, the blocking caused when the live broadcast client plays the live broadcast picture and a plurality of special-effect animations simultaneously is avoided, and the picture display fluency in the live broadcast client is improved.
In order to better implement the picture display method according to the embodiments of the present application, a picture display apparatus is further provided in the embodiments of the present application. Please refer to fig. 12, which is a schematic structural diagram of a screen display device according to an embodiment of the present application. The screen display apparatus may include a response unit 501, an acquisition unit 502, and a generation unit 503.
The response unit 501 is configured to, in response to a picture display request triggered at a live broadcast client, display a live broadcast picture in a main window of the live broadcast client, and create a sub-window in the live broadcast client; the sub-window is superimposed on the main window.
The obtaining unit 502 is configured to obtain special effect information corresponding to a special effect playing instruction when the special effect playing instruction of the virtual gift sent by the live broadcast server is received.
A generating unit 503, configured to generate and display a special effect animation in the child window based on the special effect information.
Optionally, the screen display apparatus further includes a determination unit, the determination unit is configured to:
determining a main window process associated with the main window, and acquiring a main window handle corresponding to the main window process;
creating a child window process, forwarding window attribute data corresponding to the main window handle to the child window process, and generating the child window handle based on the window attribute data, wherein the window attribute data at least comprises window size data and window position data;
the child window process creates the child window based on the window attribute data, wherein the main window process and the child window process communicate via an interprocess communication link.
Optionally, the screen display apparatus further includes a sending unit, and the sending unit is configured to:
when the main window process receives the special-effect playing instruction sent by the live broadcast server, the main window process forwards the special-effect playing instruction to the sub-window process through the inter-process communication link, so that the sub-window process obtains special-effect information corresponding to the special-effect playing instruction.
Optionally, the sending unit is further configured to:
when detecting that the main window process receives a processing instruction, the main window process forwards the processing instruction to the sub-window process through the inter-process communication link, so that the sub-window process responds to the processing instruction.
Optionally, the screen display device further includes a first response unit, where the first response unit is configured to:
the main window is displayed with a first function control;
generating a processing instruction in response to a triggering operation for the first function control, wherein the processing instruction comprises: a size adjustment parameter;
the main window process forwards the processing instruction to the sub-window process through the inter-process communication link, including:
adjusting window size data corresponding to the main window handle according to the size adjustment parameter so as to adjust the size of the main window;
and the main window process forwards the processing instruction and the size adjusting parameter to the sub-window process through the inter-process communication link, so that the sub-window process responds to the processing instruction and adjusts window size data corresponding to the sub-window handle according to the size adjusting parameter to adjust the size of the sub-window.
Optionally, the screen display device further includes a second response unit, and the second response unit is configured to:
the main window is displayed with a second function control;
generating a processing instruction in response to a triggering operation for the second function control, wherein the processing instruction comprises: a status indication parameter;
the main window process forwards the processing instruction to the sub-window process through the inter-process communication link, including:
when the main window process is detected to receive the processing instruction, the main window process forwards the state indication parameter to the sub-window process through the inter-process communication link, so that the sub-window process responds to the state indication parameter and closes the sub-window; meanwhile, the main window process responds to the state indication parameter and closes the main window.
Optionally, the screen display device further includes a third response unit, and the third response unit is configured to:
the main window is displayed with a third function control;
responding to the triggering operation of the current user for the third function control, and generating a session object list in the main window, wherein the session object list shows a plurality of virtual head portraits, and the virtual head portraits are associated with virtual account numbers;
generating a processing instruction in response to a triggering operation for a target virtual avatar, wherein the processing instruction comprises: information display parameters;
the main window process forwards the processing instruction to the sub-window process through the inter-process communication link, including:
the main window process sends the processing instruction to the sub-window process so that the sub-window process generates and displays an information interaction interface in the sub-window based on the information display parameter; the information interaction interface is used for displaying interaction information between users.
Optionally, the screen display apparatus further includes an adjusting unit, and the adjusting unit is configured to:
the child window process receives a movement instruction, wherein the movement instruction comprises a position adjustment parameter;
and the child window process adjusts the window position data corresponding to the child window handle according to the position adjustment parameter so as to adjust the position relation between the child window and the main window.
Optionally, the screen display apparatus further includes a receiving unit, where the receiving unit is configured to:
receiving a bullet screen playing instruction sent by the live broadcast server;
and generating and displaying bullet screen information in the child window based on the bullet screen playing instruction, wherein the bullet screen information is used for describing the virtual gift and a source user of the virtual gift.
Optionally, the obtaining unit 502 is further configured to:
acquiring position data of bullet screen information to be displayed and position data of special-effect animation to be displayed, which are indicated by the bullet screen playing instruction;
the child window process determines whether the position data of the bullet screen information to be displayed is the same as the position data of the special effect animation to be displayed;
and if so, the child window process sets the display priority of the special effect animation to be higher than the priority of the target bullet screen.
An embodiment of the present application discloses a picture display device, including: a response unit 501 that responds to a picture display request triggered at a live client, displays a live picture in a main window of the live client, and creates a sub-window in the live client, where the sub-window is superimposed on the main window; an obtaining unit 502 that, when a special-effect playing instruction sent by a live broadcast server is received, obtains special-effect information corresponding to the special-effect playing instruction; and a generating unit 503 that generates and displays a special-effect animation in the sub-window based on the special-effect information. According to the method and the device, two independent window processes are established at the live broadcast receiving end, and the two processes communicate with each other, so that the live broadcast picture is played in the main window and the special-effect animations are played in the sub-window; this avoids the blocking caused when the live broadcast client plays the live broadcast picture and a plurality of special-effect animations simultaneously, and improves the picture display fluency in the live broadcast client.
Correspondingly, the embodiment of the present application further provides a computer device, where the computer device may be a terminal or a server, and the terminal may be a terminal device such as a smart phone, a tablet computer, a notebook computer, a touch screen, a Personal Computer (PC), a Personal Digital Assistant (PDA), and the like. As shown in fig. 13, fig. 13 is a schematic structural diagram of a computer device according to an embodiment of the present application. The computer device 1000 includes a processor 1001 with one or more processing cores, a memory 1002 with one or more computer-readable storage media, and a computer program stored on the memory 1002 and executable on the processor. The processor 1001 is electrically connected to the memory 1002. Those skilled in the art will appreciate that the computer device configurations illustrated in the figures are not meant to be limiting of computer devices and may include more or fewer components than those illustrated, or some components may be combined, or a different arrangement of components may be used.
The processor 1001 is a control center of the computer device 1000, connects various parts of the entire computer device 1000 using various interfaces and lines, and performs various functions of the computer device 1000 and processes data by running or loading software programs and/or modules stored in the memory 1002 and calling data stored in the memory 1002, thereby performing overall monitoring of the computer device 1000.
In this embodiment of the application, the processor 1001 in the computer device 1000 loads instructions corresponding to processes of one or more applications into the memory 1002, and the processor 1001 runs the applications stored in the memory 1002 according to the following steps, so as to implement various functions:
responding to a picture display request triggered at a live client, displaying a live picture in a main window of the live client, and creating a sub-window in the live client; the sub-window is superimposed on the main window. When a special-effect playing instruction sent by a live broadcast server is received, special-effect information corresponding to the special-effect playing instruction is obtained. And generating and displaying special effect animation in the child window based on the special effect information. According to the method and the device, two independent window processes are established at a live broadcast receiving end, and the two independent window processes are communicated with each other, so that live broadcast pictures are played on a main window, and special-effect animations are played in sub-windows; therefore, the blocking caused when the live broadcast client plays the live broadcast picture and a plurality of special-effect animations simultaneously is avoided, and the picture display fluency in the live broadcast client is improved.
The above operations can be implemented in the foregoing embodiments, and are not described in detail herein.
Optionally, as shown in fig. 13, the computer device 1000 further includes: a touch display screen 1003, a radio frequency circuit 1004, an audio circuit 1005, an input unit 1006, and a power supply 1007. The processor 1001 is electrically connected to the touch display screen 1003, the radio frequency circuit 1004, the audio circuit 1005, the input unit 1006, and the power supply 1007, respectively. Those skilled in the art will appreciate that the computer device configuration illustrated in fig. 13 does not constitute a limitation of computer devices, and may include more or fewer components than those illustrated, or some components may be combined, or a different arrangement of components may be used.
The touch display screen 1003 may be used to display a user interface and receive an instruction generated by a trigger operation of a user on the user interface. The touch display screen 1003 may include a display panel and a touch panel. The display panel may be used to display information entered by or provided to the user as well as various graphical user interfaces of the computer device, which may be made up of graphics, text, icons, video, and any combination thereof. Alternatively, the display panel may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like. The touch panel may be used to collect touch operations of the user on or near it (for example, operations of the user on or near the touch panel using a finger, a stylus pen, or any other suitable object or accessory) and generate corresponding operation instructions, which drive the corresponding programs. Alternatively, the touch panel may include two parts: a touch detection device and a touch controller. The touch detection device detects the touch orientation of the user, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into touch point coordinates, sends the coordinates to the processor 1001, and can receive and execute commands sent by the processor 1001. The touch panel may cover the display panel; when the touch panel detects a touch operation on or near it, the touch panel transmits the operation to the processor 1001 to determine the type of the touch event, and the processor 1001 then provides a corresponding visual output on the display panel according to the type of the touch event.
In the embodiment of the present application, the touch panel and the display panel may be integrated into the touch display screen 1003 to implement the input and output functions. However, in some embodiments, the touch panel and the display panel may be implemented as two separate components to perform the input and output functions. That is, the touch display screen 1003 may also be used as a part of the input unit 1006 to implement an input function.
In this embodiment of the application, a live application is executed by the processor 1001 to generate a user interface, a main window, and a sub-window on the touch display screen 1003, and a virtual gift identifier is displayed on the main window. The touch display screen 1003 is used for presenting the user interface and receiving an operation instruction generated by the user acting on a function control.
The radio frequency circuit 1004 may be used for transceiving radio frequency signals, so as to establish wireless communication with a network device or other computer devices and transceive signals with them.
The audio circuit 1005 may be used to provide an audio interface between the user and the computer device through a speaker and a microphone. The audio circuit 1005 may transmit the electrical signal converted from the received audio data to the speaker, which converts it into a sound signal for output; on the other hand, the microphone converts a collected sound signal into an electrical signal, which is received by the audio circuit 1005 and converted into audio data, and the audio data is then output to the processor 1001 for processing, for example to be sent to another computer device via the radio frequency circuit 1004, or output to the memory 1002 for further processing. The audio circuit 1005 may also include an earbud jack to provide communication between a peripheral headset and the computer device.
Theinput unit 1006 may be used to receive input numbers, character information, or user characteristic information (e.g., fingerprint, iris, facial information, etc.), and generate keyboard, mouse, joystick, optical, or trackball signal inputs related to user settings and function control.
The power supply 1007 is used to power the various components of the computer device 1000. Optionally, the power supply 1007 may be logically connected to the processor 1001 through a power management system, so as to implement functions such as charging management, discharging management, and power consumption management through the power management system. The power supply 1007 may also include one or more dc or ac power sources, recharging systems, power failure detection circuitry, power converters or inverters, power status indicators, and the like.
Although not shown in fig. 13, the computer device 1000 may further include a camera, a sensor, a wireless fidelity module, a Bluetooth module, and the like, which are not described in detail herein.
In the foregoing embodiments, each embodiment is described with its own emphasis; for parts not described in detail in one embodiment, reference may be made to the related descriptions of other embodiments.
As can be seen from the above, the computer device provided in this embodiment responds to a picture display request triggered at a live client by displaying a live picture in a main window of the live client and creating a sub-window in the live client, the sub-window being superimposed on the main window. When a special-effect playing instruction sent by a live broadcast server is received, special-effect information corresponding to the special-effect playing instruction is obtained, and a special-effect animation is generated and displayed in the sub-window based on the special-effect information. According to the method and the device, two independent window processes are established at the live broadcast receiving end and communicate with each other, so that the live picture is played in the main window while special-effect animations are played in the sub-window; this avoids the blocking caused when the live client plays the live picture and a plurality of special-effect animations simultaneously, and improves the fluency of picture display in the live client.
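The two-process arrangement described above can be illustrated with a minimal sketch. This is not the claimed implementation: the patent describes native window handles and a GUI rendering pipeline, whereas the sketch below simulates the main window process, the child window process, and their inter-process communication link with Python's `multiprocessing` queues. All names (`child_window`, `run_main_window`, the queue variables, and the event tuples) are illustrative assumptions, not terms from the embodiment.

```python
from multiprocessing import Process, Queue


def child_window(attr_queue, effect_queue, log_queue):
    # Child window process: builds its window from the attribute data
    # forwarded by the main window process, then plays each special-effect
    # instruction received over the IPC link until a shutdown sentinel.
    attrs = attr_queue.get()                      # window size/position data
    log_queue.put(("child_window_created", attrs))
    while True:
        effect = effect_queue.get()               # special-effect play instruction
        if effect is None:                        # sentinel: shut down
            break
        log_queue.put(("effect_played", effect))


def run_main_window():
    # Main window process: spawns the child window process, forwards the
    # window attribute data, and relays effect instructions to it through
    # queues standing in for the inter-process communication link.
    attr_queue, effect_queue, log_queue = Queue(), Queue(), Queue()
    child = Process(target=child_window,
                    args=(attr_queue, effect_queue, log_queue))
    child.start()
    # Superimpose the child on the main window: same position and size.
    attr_queue.put({"pos": (0, 0), "size": (1280, 720)})
    effect_queue.put("rocket-gift animation")     # e.g. relayed from the live server
    effect_queue.put(None)
    child.join()
    # Exactly two events are expected: window creation, then one effect.
    return [log_queue.get() for _ in range(2)]


if __name__ == "__main__":
    for event in run_main_window():
        print(event)
```

Because rendering runs in a separate process, a burst of effect instructions only queues work for the child; the main process (which would be drawing the live picture) is never blocked, which is the stated point of the two-process design.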
It will be understood by those skilled in the art that all or part of the steps of the methods in the above embodiments may be performed by instructions, or by associated hardware controlled by instructions; the instructions may be stored in a computer-readable storage medium and loaded and executed by a processor.
To this end, embodiments of the present application provide a computer-readable storage medium in which a plurality of computer programs are stored, and the computer programs can be loaded by a processor to execute the steps of any one of the picture display methods provided by the embodiments of the present application. For example, the computer program may perform the following steps:
responding to a picture display request triggered at a live client, displaying a live picture in a main window of the live client, and creating a sub-window in the live client; the sub-window is superimposed on the main window. When a special-effect playing instruction sent by a live broadcast server is received, special-effect information corresponding to the special-effect playing instruction is obtained. And generating and displaying special effect animation in the child window based on the special effect information.
The above operations can be implemented as described in the foregoing embodiments and are not described in detail herein.
The storage medium may include: read-only memory (ROM), random access memory (RAM), a magnetic disk or an optical disk, and the like.
Since the computer program stored in the storage medium can execute the steps of any picture display method provided in the embodiments of the present application, it can achieve the beneficial effects achievable by any picture display method provided in these embodiments, which are detailed in the foregoing embodiments and are not described herein again.
The foregoing describes in detail a picture display method and apparatus, a computer device, and a storage medium provided in the embodiments of the present application. Specific examples are used herein to explain the principles and implementations of the present application, and the descriptions of the above embodiments are only intended to help understand the method and core ideas of the present application. Meanwhile, those skilled in the art may, according to the ideas of the present application, make changes to the specific implementations and the application scope. In summary, the content of this specification should not be construed as limiting the present application.