CN113706719B - Virtual scene generation method and device, storage medium and electronic equipment - Google Patents

Virtual scene generation method and device, storage medium and electronic equipment

Info

Publication number
CN113706719B
Authority
CN
China
Prior art keywords
light source
virtual
data
source data
virtual scene
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202111011155.1A
Other languages
Chinese (zh)
Other versions
CN113706719A (en)
Inventor
庄宇轩
马若超
詹澍祺
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangzhou Boguan Information Technology Co Ltd
Original Assignee
Guangzhou Boguan Information Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangzhou Boguan Information Technology Co Ltd
Priority to CN202111011155.1A
Publication of CN113706719A
Application granted
Publication of CN113706719B
Legal status: Active (current)
Anticipated expiration

Abstract

The disclosure belongs to the technical field of live video broadcasting, and relates to a virtual scene generation method and device, a storage medium, and electronic equipment. The method comprises the following steps: obtaining virtual light source data in the virtual scene, and displaying the virtual light source data on a live broadcast interface; and adjusting the virtual light source data in response to an adjustment triggering operation acting on the live broadcast interface. Because the virtual light source data are displayed on the live broadcast interface, the anchor side can view the light source data in the virtual scene, which provides a data basis for autonomous interaction and personalized settings at the anchor side. Further, since the virtual light source data in the virtual scene are adjusted according to the adjustment triggering operation, the anchor can interactively change the lighting effect of the virtual scene and complete the customization of a personalized live broadcast scene. The method has a high degree of automation and intelligence, shortens the generation period of the virtual scene, reduces the time and labor cost of replacing virtual scenes, satisfies users' demand for fresh and diversified live broadcast scenes, and improves the viewing experience of the audience.

Description

Virtual scene generation method and device, storage medium and electronic equipment
Technical Field
The disclosure relates to the technical field of live video broadcasting, and in particular relates to a virtual scene generation method, a virtual scene generation device, a computer readable storage medium and electronic equipment.
Background
In a live broadcast scene, the live broadcast room setting is an important viewing element besides the host. The effect of the live broadcast scene directly influences the viewing experience of users, and thereby the activity level of users in the live broadcast room and the atmosphere of the room.
However, in live-action scenes, the number of available live broadcast rooms is limited. Meanwhile, in virtual live broadcast scenes, the required virtual scenes must be produced one by one according to the anchor's requirements; the production period is long, and the cost of replacing a virtual scene is high. It is therefore difficult, in both live-action and virtual live broadcast scenes, to satisfy users' demand for fresh and diversified live broadcast scenes.
In view of this, there is a need in the art to develop a new virtual scene generation method and apparatus.
It should be noted that the information disclosed in the above background section is only for enhancing understanding of the background of the present disclosure and thus may include information that does not constitute prior art known to those of ordinary skill in the art.
Disclosure of Invention
The present disclosure aims to provide a virtual scene generating method, a virtual scene generating device, a computer-readable storage medium and an electronic device, so as to overcome, at least to some extent, the technical problems of monotonous live broadcast scenes, high scene production cost, and long production period caused by the limitations of the related art.
Other features and advantages of the present disclosure will be apparent from the following detailed description, or may be learned in part by the practice of the disclosure.
According to a first aspect of an embodiment of the present invention, there is provided a method for generating a virtual scene, where a live broadcast interface is provided by a hosting end, the live broadcast interface includes a video display area, and the video display area displays the virtual scene, the method includes:
obtaining virtual light source data in the virtual scene, and displaying the virtual light source data on the live broadcast interface;
And responding to an adjustment triggering operation acted on the live broadcast interface, and adjusting the virtual light source data.
In an exemplary embodiment of the present invention, the acquiring virtual light source data in the virtual scene includes:
Providing a light source adjustment area on the live broadcast interface;
And responding to function triggering operation acting on the light source adjustment area, and acquiring virtual light source data in the virtual scene.
In an exemplary embodiment of the present invention, the virtual light source data includes: light source type data, light source position data, and light source attribute parameters.
In an exemplary embodiment of the present invention, the displaying the virtual light source data on the live interface includes:
And generating a light control panel area on the live broadcast interface so as to display the virtual light source data in the light control panel area.
In an exemplary embodiment of the present invention, the displaying the virtual light source data in the light control panel area includes:
performing position projection processing on the light source position data to obtain projection position data;
and displaying the light source attribute parameters and the projection position data corresponding to the light source type data in the light control panel area based on the light source type data.
In an exemplary embodiment of the present invention, the light source attribute parameter includes: a light source information parameter and a light source color parameter.
In an exemplary embodiment of the invention, the method further comprises:
And displaying a virtual light source identifier corresponding to the light source type data on the live broadcast interface.
In an exemplary embodiment of the present invention, the adjusting the virtual light source data in response to an adjustment trigger operation acting on the live interface includes:
Providing a prescribed sliding path corresponding to the virtual light source identifier in the light control panel area;
Responding to the adjustment triggering operation acted on the virtual light source identifier, sliding the virtual light source identifier according to the specified sliding path to obtain a current sliding position, and adjusting the projection position data according to the current sliding position.
In an exemplary embodiment of the present invention, the adjusting the virtual light source data in response to an adjustment trigger operation acting on the live interface includes:
Providing a virtual light source slide bar corresponding to the virtual light source identifier in the light control panel area, and acquiring a mapping relation of the virtual light source identifier and a current identification parameter of the virtual light source identifier;
And responding to an adjustment triggering operation acted on the virtual light source slide bar, and adjusting the light source information parameters according to the mapping relation and the current identification parameters.
In an exemplary embodiment of the present invention, the mapping relationship is established according to a light source identification parameter of the virtual light source identification and the light source attribute parameter.
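The mapping relation between a slide-bar identification parameter and a light source attribute parameter can be illustrated with a minimal sketch. The function name, the [0, 1] slider range, and the intensity bounds are hypothetical assumptions for illustration; the disclosure does not specify them.

```python
def slider_to_intensity(slider_value, min_intensity=0.0, max_intensity=10.0):
    """Linearly map a slide-bar position in [0, 1] to a light intensity value.

    This is one simple mapping relation between the current identification
    parameter of the slide bar and the light source information parameter.
    """
    # Clamp the slider position to the valid range before mapping.
    slider_value = max(0.0, min(1.0, slider_value))
    return min_intensity + slider_value * (max_intensity - min_intensity)
```

For example, a slider dragged to its midpoint would set the light intensity to the midpoint of the configured intensity range.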
In an exemplary embodiment of the present invention, the adjusting the virtual light source data in response to an adjustment trigger operation acting on the live interface includes:
providing a palette control corresponding to the virtual light source identifier in the light control panel area;
And responding to an adjustment triggering operation acted on the palette control, and adjusting the color parameters of the light source.
In an exemplary embodiment of the invention, said adjusting said light source color parameter comprises:
Acquiring color value data corresponding to the adjustment triggering operation, and performing color value mapping processing on the color value data to obtain an image color file;
And adjusting the light source color parameters by using the image color file.
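The color value mapping step above can be sketched as clamping the picked color values and encoding them in a portable form. Encoding as a hex string is an assumption for illustration; the disclosure does not specify the format of the "image color file".

```python
def color_values_to_hex(rgb):
    """Clamp RGB color values to 0-255 and encode them as a hex color string.

    One simple reading of "color value mapping processing": normalize the
    values picked from the palette control into a single color record.
    """
    r, g, b = (max(0, min(255, int(c))) for c in rgb)
    return f"#{r:02X}{g:02X}{b:02X}"
```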
In an exemplary embodiment of the invention, the method further comprises:
Acquiring anchor display data under the irradiation of the adjusted virtual light source data, and determining anchor display rules of the adjusted virtual light source data;
And adjusting the entity light source data corresponding to the virtual scene according to the anchor display rule and the anchor display data.
In an exemplary embodiment of the present invention, the adjusting entity light source data corresponding to the virtual scene according to the anchor display rule and the anchor display data includes:
Acquiring to-be-dimmed source data corresponding to entity light source data, and performing light source data calculation on the to-be-dimmed source data and the anchor display data to obtain a fusion display difference value;
And acquiring a fusion display threshold value of the anchor display rule, and adjusting entity light source data corresponding to the virtual scene according to the fusion display difference value and the fusion display threshold value.
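The comparison of the fusion display difference against the fusion display threshold can be sketched as follows. Treating the light source data as a single intensity value and using an absolute difference are simplifying assumptions; the names are illustrative, not from the disclosure.

```python
def fusion_display_difference(physical_intensity, anchor_display_intensity):
    """Difference between the to-be-dimmed physical light data and the anchor display data."""
    return abs(physical_intensity - anchor_display_intensity)

def should_adjust_physical_light(physical_intensity, anchor_display_intensity, fusion_threshold):
    """Adjust the physical light only when the fusion display difference exceeds the rule's threshold."""
    diff = fusion_display_difference(physical_intensity, anchor_display_intensity)
    return diff > fusion_threshold
```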
In an exemplary embodiment of the present invention, the anchor display rule includes: the same display rule and the complementary display rule.
In an exemplary embodiment of the invention, the method further comprises:
and generating result identification data according to the adjustment result of the entity light source data, and displaying the result identification data on the live broadcast interface.
In an exemplary embodiment of the invention, the method further comprises:
and adjusting the virtual light source data again according to the result identification data.
In an exemplary embodiment of the present invention, said readjusting said virtual light source data according to said result identification data comprises:
Acquiring target light source data after the virtual light source data and the entity light source data are adjusted, and acquiring original light source data before the entity light source data are adjusted;
performing light source average value calculation on the original light source data and the target light source data to obtain light source average value data;
And when the result identification data is that the entity light source data is successfully adjusted, the virtual light source data is adjusted again according to the light source mean value data.
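The light source mean value calculation described above can be sketched as an element-wise average of the original data and the adjusted target data. Representing the light source data as numeric tuples is an assumption for illustration.

```python
def light_source_mean(original, target):
    """Element-wise mean of the original light source data and the adjusted target data."""
    return tuple((o + t) / 2 for o, t in zip(original, target))
```

When the result identification data indicates the physical light was adjusted successfully, this mean could serve as the new virtual light source data, blending the pre- and post-adjustment states.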
According to a second aspect of the embodiment of the present invention, there is provided a virtual scene generating apparatus, which provides a live broadcast interface through a hosting end, the live broadcast interface including a video display area, the video display area displaying a virtual scene, including:
The data display module is configured to acquire virtual light source data in the virtual scene and display the virtual light source data on the live broadcast interface;
and the data adjustment module is configured to respond to adjustment triggering operation acted on the live broadcast interface and adjust the virtual light source data.
According to a third aspect of an embodiment of the present invention, there is provided an electronic apparatus including: a processor and a memory; wherein the memory has stored thereon computer readable instructions which, when executed by the processor, implement the method of generating a virtual scene in any of the above-described exemplary embodiments.
According to a fourth aspect of embodiments of the present invention, there is provided a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the method of generating a virtual scene in any of the above-described exemplary embodiments.
As can be seen from the above technical solutions, the method for generating a virtual scene, the device for generating a virtual scene, the computer storage medium, and the electronic device according to the exemplary embodiments of the present disclosure have at least the following advantages and positive effects:
In the method and the device provided by the exemplary embodiments of the present disclosure, the virtual light source data are displayed on the live broadcast interface, so that the anchor side can view the light source data in the virtual scene, which provides a data basis for autonomous interaction and personalized settings at the anchor side. Further, the virtual light source data in the virtual scene are adjusted according to the adjustment triggering operation, and the anchor interactively changes the lighting effect in the virtual scene, thereby completing the customization of a personalized live broadcast scene. The method has a high degree of automation and intelligence, shortens the generation period of the virtual scene, reduces the time and labor cost of replacing virtual scenes, satisfies users' demand for fresh and diversified live broadcast scenes, and improves the viewing experience of the audience.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the disclosure and together with the description, serve to explain the principles of the disclosure. It will be apparent to those of ordinary skill in the art that the drawings in the following description are merely examples of the disclosure and that other drawings may be derived from them without undue effort.
Fig. 1 schematically illustrates a flowchart of a method for generating a virtual scene in an exemplary embodiment of the disclosure;
FIG. 2 schematically illustrates a flow diagram of a method of obtaining virtual light source data in an exemplary embodiment of the present disclosure;
FIG. 3 schematically illustrates a flow diagram of a method of displaying virtual light source data in an exemplary embodiment of the present disclosure;
FIG. 4 schematically illustrates a flow diagram of a method of adjusting virtual light source data in an exemplary embodiment of the present disclosure;
FIG. 5 schematically illustrates a flow chart of another method of adjusting virtual light source data in an exemplary embodiment of the present disclosure;
FIG. 6 schematically illustrates a flow chart of a method of still another adjustment of virtual light source data in an exemplary embodiment of the present disclosure;
FIG. 7 schematically illustrates a flow chart of a method of adjusting a light source color parameter in an exemplary embodiment of the present disclosure;
FIG. 8 schematically illustrates a flow diagram of a method of adjusting physical light source data in an exemplary embodiment of the present disclosure;
FIG. 9 schematically illustrates a flow chart of a method of further adjusting physical light source data in an exemplary embodiment of the disclosure;
FIG. 10 schematically illustrates a flow diagram of a method of readjusting virtual light source data in an exemplary embodiment of the present disclosure;
Fig. 11 schematically illustrates an interface schematic diagram of a virtual broadcast at the anchor side in an application scenario in an exemplary embodiment of the present disclosure;
fig. 12 schematically illustrates an interface schematic diagram of a light control panel area in an application scenario in an exemplary embodiment of the present disclosure;
FIG. 13 schematically illustrates an interface diagram of a prescribed sliding path in an application scenario in an exemplary embodiment of the present disclosure;
Fig. 14 schematically illustrates a virtual live effect graph after adjusting virtual light source data in an application scenario in an exemplary embodiment of the present disclosure;
Fig. 15 schematically illustrates a structural diagram of a virtual scene generating apparatus in an exemplary embodiment of the present disclosure;
fig. 16 schematically illustrates an electronic device for implementing a method for generating a virtual scene in an exemplary embodiment of the present disclosure;
fig. 17 schematically illustrates a computer-readable storage medium for implementing a virtual scene generation method in an exemplary embodiment of the present disclosure.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. However, the exemplary embodiments may be embodied in many forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of the example embodiments to those skilled in the art. The described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a thorough understanding of embodiments of the present disclosure. One skilled in the relevant art will recognize, however, that the aspects of the disclosure may be practiced without one or more of the specific details, or with other methods, components, devices, steps, etc. In other instances, well-known technical solutions have not been shown or described in detail to avoid obscuring aspects of the present disclosure.
The terms "a," "an," "the," and "said" are used in this specification to denote the presence of one or more elements/components/etc.; the terms "comprising" and "having" are intended to be inclusive and mean that there may be additional elements/components/etc. in addition to the listed elements/components/etc.; the terms "first" and "second" and the like are used merely as labels, and are not intended to limit the number of their objects.
Furthermore, the drawings are merely schematic illustrations of the present disclosure and are not necessarily drawn to scale. The same reference numerals in the drawings denote the same or similar parts, and thus a repetitive description thereof will be omitted. Some of the block diagrams shown in the figures are functional entities and do not necessarily correspond to physically or logically separate entities.
Aiming at the problems in the related art, the present disclosure provides a method for generating a virtual scene, wherein a live broadcast interface is provided by a host, the live broadcast interface includes a video display area, and the video display area displays the virtual scene. Fig. 1 shows a flowchart of a method for generating a virtual scene, and as shown in fig. 1, the method for generating a virtual scene at least includes the following steps:
s110, obtaining virtual light source data in the virtual scene, and displaying the virtual light source data on a live broadcast interface.
And S120, responding to adjustment triggering operation acted on the live broadcast interface, and adjusting the virtual light source data.
In an exemplary embodiment of the present disclosure, the virtual light source data are displayed on the live broadcast interface, so that the anchor side can view the light source data in the virtual scene, which provides a data basis for autonomous interaction and personalized settings at the anchor side. Further, the virtual light source data in the virtual scene are adjusted according to the adjustment triggering operation, and the anchor interactively changes the lighting effect in the virtual scene, thereby completing the customization of a personalized live broadcast scene. The method has a high degree of automation and intelligence, shortens the generation period of the virtual scene, reduces the time and labor cost of replacing virtual scenes, satisfies users' demand for fresh and diversified live broadcast scenes, and improves the viewing experience of the audience.
The steps of the virtual scene generation method are described in detail below.
In step S110, virtual light source data in the virtual scene is acquired, and the virtual light source data is displayed on the live interface.
In an exemplary embodiment of the present disclosure, the anchor may complete a virtual broadcast through the virtual broadcast function of the live broadcast software.
Specifically, the anchor stands in front of a green screen and enters the process by clicking the virtual broadcast option in the background. Further, a virtual background of the UE (Unreal Engine) is selected to preview the matting effect in real time. Then, the camera angle and other parameters are adjusted, and the broadcast control is clicked to complete the virtual broadcast, after which the anchor is in a virtual on-air state.
It should be noted that, after the UE virtual background is selected, the customization of the personalized scene may be adjusted before or during the broadcast.
Correspondingly, a user who enters the anchor's live room during the virtual broadcast can watch the live stream normally, and can see the personalized virtual scene adjusted by the anchor.
In an alternative embodiment, fig. 2 shows a schematic flow chart of a method for obtaining virtual light source data, and as shown in fig. 2, the method at least includes the following steps: in step S210, a light source adjustment area is provided on the live interface.
The light source adjustment area may be an area for judging whether a function trigger operation of the anchor is valid. And, the light source adjustment area may be an area belonging to any size or any shape within the live interface, which is not particularly limited in the present exemplary embodiment.
In step S220, virtual light source data in the virtual scene is acquired in response to a function trigger operation acting in the light source adjustment area.
The anchor can trigger an interaction event, such as clicking, on the preview picture of the virtual live broadcast, that is, the live broadcast interface; this constitutes a function triggering operation. The function triggering operation may also be a long press or a sliding operation instead of a click, and the present exemplary embodiment is not limited thereto.
Further, the location information of the anchor acting the function triggering operation may be determined. Only when the action position of the function triggering operation belongs to the light source adjustment area, the virtual light source data in the virtual scene can be acquired.
And, the location information of the function triggering operation may be that the anchor side of the live broadcast platform sends to the anchor side UE instance, and the anchor side UE instance determines that the action location of the function triggering operation belongs to the light source adjustment area.
When the action position of the function triggering operation is not located in the light source adjustment area, the anchor terminal continues to preview the current virtual live broadcast picture; when the action position of the function triggering operation belongs to the light source adjustment area, the calling logic of the scene lamplight control is met. Specifically, the anchor UE instance reads the virtual light source data in the current virtual scene. And further, the virtual light source data is sent to a main broadcasting end of the live broadcasting platform.
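The check that the action position of the function triggering operation falls within the light source adjustment area can be sketched as an axis-aligned hit test. Representing the area as a rectangle is an assumption; the disclosure allows the area to be of any shape.

```python
def in_adjustment_area(point, area):
    """Return True when the trigger position lies inside the light source adjustment area.

    `area` is assumed to be an axis-aligned rectangle (left, top, width, height);
    the disclosure permits arbitrary shapes, for which a different test would apply.
    """
    x, y = point
    left, top, width, height = area
    return left <= x <= left + width and top <= y <= top + height
```

If the test fails, the anchor side simply continues previewing the current virtual live picture; if it succeeds, the scene light control logic is invoked.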
In an alternative embodiment, the virtual light source data includes: light source type data, light source position data, and light source attribute parameters.
The light source type data may include Directional light sources (directors), point light sources (points), condensing light sources (Spot), and Sky light (Sky); the light source position data may include position data of each type of light source included in the virtual scene, and the position data may be in the form of three-dimensional point coordinates; the light source attribute parameters may include the light intensity of each light source, the light source color, and parameters affecting the scene. In addition, the virtual light source data may include other data, and the light source type data, the light source position data, and the light source attribute parameters may further include other data, which is not particularly limited in the present exemplary embodiment.
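The virtual light source data enumerated above can be modeled with a small sketch. The class and field names, default values, and value ranges are hypothetical, chosen only to illustrate the structure described in the disclosure.

```python
from dataclasses import dataclass
from enum import Enum

class LightType(Enum):
    # The four light source types named in the disclosure.
    DIRECTIONAL = "Directional"
    POINT = "Point"
    SPOT = "Spot"
    SKY = "Sky"

@dataclass
class VirtualLightSource:
    light_type: LightType
    position: tuple            # 3D point coordinates (x, y, z)
    intensity: float = 1.0     # light source information parameter
    color: tuple = (255, 255, 255)  # light source color parameter (RGB)

# A hypothetical virtual scene containing two light sources.
scene_lights = [
    VirtualLightSource(LightType.DIRECTIONAL, (0.0, 0.0, 500.0), 3.0),
    VirtualLightSource(LightType.POINT, (120.0, -40.0, 80.0), 1.5, (255, 200, 120)),
]
```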
In the present exemplary embodiment, whether virtual light source data is acquired is defined by the light source adjustment area, the interaction setting logic is perfect, and a function entry is provided for autonomous interaction and setting by the anchor.
Further, virtual light source data may also be displayed on the live interface.
In an alternative embodiment, a light control panel region is generated on the live interface to display virtual light source data at the light control panel region.
After the live broadcast platform host receives the virtual light source data, a corresponding light control panel area can be generated on the live broadcast interface. The light control panel area is used for displaying the obtained virtual light source data.
In an alternative embodiment, fig. 3 shows a flow chart of a method for displaying virtual light source data, as shown in fig. 3, the method at least comprises the following steps: in step S310, the position projection processing is performed on the light source position data to obtain projection position data.
In order to display the light source position data in the light control panel region, first, it is necessary to perform a position projection process on the light source position data.
Specifically, the light source position data of each light source in the virtual scene may be read, and the light source position data may be sequentially projected onto a plane formed by XY coordinate axes to obtain two-dimensional projection position data.
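The position projection step above can be sketched as a simple orthographic projection that drops the Z component; this is one straightforward reading of "projected onto a plane formed by XY coordinate axes", and the function name is illustrative.

```python
def project_to_xy(position):
    """Project a 3D light source position onto the XY plane by dropping Z."""
    x, y, _z = position
    return (x, y)
```

The resulting two-dimensional projection position data can then be drawn directly in the light control panel area.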
In step S320, light source attribute parameters and projection position data corresponding to the light source type data are displayed in the lamp control panel region based on the light source type data.
When the virtual light source data is displayed in the light control panel area, the light source type data can be sequentially read, and corresponding light source attribute parameters and projection position data are set for different light source type data.
Accordingly, the light source attribute parameter and the projection position data can be displayed in the light source type data within the light control panel region.
In an alternative embodiment, the light source attribute parameters include: a light source information parameter and a light source color parameter.
The light source information parameter may include light intensity, and may also include other light source information data; the light source color parameter may include a light source color value, a light source color temperature, and other data reflecting the light source color. Neither is particularly limited in the present exemplary embodiment.
In order to more remarkably show the light source attribute and the projection position data corresponding to different light source types for the anchor end, a virtual light source identifier can be generated.
In an alternative embodiment, a virtual light source identification corresponding to the light source type data is displayed on the live interface.
The virtual light source identification may be generated for different light source type settings. And, the virtual light source mark may be in a sun pattern style, a bulb pattern style, or a desk lamp pattern style, which is not particularly limited in the present exemplary embodiment.
In order to display the light source attribute parameters, the parameters can be read in turn, and controls such as a light movement track mapping and mappings of light attribute and color information are set for the different light source attribute parameters; these are displayed in the light control panel area in the form of a slide bar, a color wheel, or input parameter values.
In the exemplary embodiment, the converted virtual light source data can be displayed by utilizing the light control panel area, the display of the virtual light source data is more beneficial to the viewing and adjustment of a host, a light source interaction mode is provided for the host, and the interaction dimension in the virtual live broadcast scene is perfected.
In step S120, the virtual light source data is adjusted in response to the adjustment trigger operation acting on the live interface.
In an exemplary embodiment of the present disclosure, after the virtual light source data is displayed on the live interface, the corresponding virtual light source data may be adjusted by adjusting the trigger operation.
In an alternative embodiment, fig. 4 shows a flow chart of a method for adjusting virtual light source data, and as shown in fig. 4, the method at least includes the following steps: in step S410, a prescribed sliding path corresponding to the virtual light source identifier is provided in the light control panel area.
For the light source type data in the virtual scene, a prescribed sliding path can be correspondingly set. The specified sliding path is a mapping track of lamplight movement, for example, a 3D-2D mapping track, and is used for restricting the position of the corresponding light source, so that virtual light source data can be displayed on the specified sliding path for the anchor terminal to act on the virtual light source identifier. In order to make the anchor end easy to view, the same dotted line track as the prescribed sliding path can be generated and projected in the preview window of the anchor end so that the anchor can drag according to the dotted line track.
In step S420, in response to the adjustment triggering operation acting on the virtual light source identifier, the virtual light source identifier is slid according to the prescribed sliding path to obtain a current sliding position, and the projection position data is adjusted according to the current sliding position.
After the prescribed sliding path is provided, the anchor may apply an adjustment triggering operation to the virtual light source identifiers of different types of light sources according to requirements. The adjustment triggering operation may be a click operation, a long press operation, a slide operation, or the like, which is not particularly limited in the present exemplary embodiment.
The virtual light source identifier can be slid to the current sliding position according to the adjustment triggering operation, and projection position data is adjusted according to the current sliding position, so that adjustment of the virtual light source data is achieved.
The anchor can apply the adjustment triggering operation through the anchor side of the virtual live broadcast; the anchor-side UE instance then changes the light position of the corresponding light source in the virtual scene in real time according to the prescribed sliding path and updates the picture video stream to the preview window in real time. The anchor side can then check the effect after the light has been moved.
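As an illustrative sketch only (not the disclosed implementation; all function and variable names are assumptions), constraining a dragged light identifier to the prescribed sliding path can be modeled as snapping the drag position to the nearest point of a 2D polyline:

```python
def nearest_on_segment(p, a, b):
    """Project point p onto segment a-b and clamp to the segment ends."""
    ax, ay = a
    bx, by = b
    px, py = p
    dx, dy = bx - ax, by - ay
    seg_len2 = dx * dx + dy * dy
    if seg_len2 == 0:
        return a
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / seg_len2))
    return (ax + t * dx, ay + t * dy)

def snap_to_path(drag_pos, path):
    """Return the point on the polyline path closest to the drag position."""
    best, best_d2 = None, float("inf")
    for a, b in zip(path, path[1:]):
        q = nearest_on_segment(drag_pos, a, b)
        d2 = (q[0] - drag_pos[0]) ** 2 + (q[1] - drag_pos[1]) ** 2
        if d2 < best_d2:
            best_d2, best = d2, q
    return best

# An L-shaped prescribed sliding path in panel coordinates (hypothetical).
path = [(0.0, 0.0), (10.0, 0.0), (10.0, 10.0)]
print(snap_to_path((5.0, 3.0), path))  # -> (5.0, 0.0)
```

A real panel would run such a snap on every drag event and then map the snapped 2D point back to the corresponding 3D light position via the 3D-to-2D mapping track.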
In the present exemplary embodiment, the position data in the virtual light source data may be adjusted through the prescribed sliding path corresponding to the virtual light source identifier, which provides an interactive way for the anchor to adjust the position of the light source, and ensures the personalized effect of the virtual scene from the perspective of the position of the light source, and the adjustment way is simple and easy to operate.
In an alternative embodiment, fig. 5 shows a flow chart of another method for adjusting virtual light source data, and as shown in fig. 5, the method at least includes the following steps: in step S510, a virtual light source slide bar corresponding to the virtual light source identifier is provided in the light control area panel, and the mapping relationship of the virtual light source identifier and the current identifier parameter of the virtual light source identifier are obtained.
For the currently displayed virtual light source identifier, the anchor-side UE instance can read the corresponding attribute parameter interface to obtain the current identifier parameter.
A virtual light source slide bar can be generated in the light control area panel; the virtual light source slide bar is used to adjust the light source information parameters. Further, a mapping relation for the virtual light source identifier is established.
In an alternative embodiment, the mapping relationship is established according to the light source identification parameter and the light source attribute parameter of the virtual light source identification.
A mapping relation can be established between the light source information parameters and the slide bar values of the virtual light source slide bar. For example, the light intensity may take values in [0, 1000] and the virtual light source slide bar may take values in [0, 100]; a mapping relationship can then be established between the light intensity range [0, 1000] and the slide bar range [0, 100]. The mapping relationship may be linear or nonlinear, and the present exemplary embodiment is not particularly limited thereto.
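Assuming the linear case and the example ranges above (the ranges come from the text; the function names are illustrative assumptions), the slider-to-intensity mapping can be sketched as:

```python
SLIDER_RANGE = (0.0, 100.0)      # slide bar values, per the example
INTENSITY_RANGE = (0.0, 1000.0)  # light intensity values, per the example

def slider_to_intensity(slider_value):
    """Linearly map a slide bar value to a light intensity."""
    s_min, s_max = SLIDER_RANGE
    i_min, i_max = INTENSITY_RANGE
    t = (slider_value - s_min) / (s_max - s_min)
    return i_min + t * (i_max - i_min)

def intensity_to_slider(intensity):
    """Inverse mapping, used to initialize the slider from the current parameter."""
    i_min, i_max = INTENSITY_RANGE
    s_min, s_max = SLIDER_RANGE
    t = (intensity - i_min) / (i_max - i_min)
    return s_min + t * (s_max - s_min)

print(slider_to_intensity(25.0))  # -> 250.0
```

A nonlinear mapping would replace the interpolation factor `t` with, for example, a gamma curve, while keeping the same endpoints.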
In step S520, the light source information parameter is adjusted according to the mapping relation and the current identification parameter in response to the adjustment triggering operation acting on the virtual light source slide bar.
Further, the user can apply an adjustment triggering operation to the virtual light source slide bar to adjust the light source information parameters. The adjustment triggering operation may be a click operation, a long press operation, or a slide operation, which is not particularly limited in the present exemplary embodiment.
After receiving the data of the adjustment triggering operation of the anchor terminal, the anchor terminal UE instance can change the light source information parameters in the virtual scene in real time according to the mapping relation, and update the picture video stream to the preview window in real time so that the anchor terminal can check the effect after the light value is changed.
In the exemplary embodiment, the light source information parameters can be adjusted in real time through the arrangement of the virtual light source slide bar, an interaction mode for adjusting the light value is provided for a host, the personalized effect of the virtual scene is ensured from the angle of the light value, and the adjustment mode is simple and easy to operate.
In an alternative embodiment, fig. 6 is a flowchart illustrating a further method for adjusting virtual light source data, and as shown in fig. 6, the method at least includes the following steps: in step S610, a palette control corresponding to the virtual light source identifier is provided in the light control area panel.
The palette control may be a control that exposes a range of color values. Moreover, the palette controls of different virtual light source identifications may be the same or different, and the present exemplary embodiment is not particularly limited thereto.
In step S620, the light source color parameters are adjusted in response to an adjustment triggering operation acting on the palette control.
The anchor-side UE instance may read the attribute parameter interface of the current virtual light source identifier, and the anchor side may return the color value range (sRGB, Standard Red Green Blue) of the palette control currently selected by the anchor. The anchor-side UE instance may then adjust the light source color parameters according to the received color value data determined by the adjustment triggering operation.
In an alternative embodiment, fig. 7 shows a schematic flow chart of a method for adjusting color parameters of a light source, and as shown in fig. 7, the method at least comprises the following steps: in step S710, color value data corresponding to the adjustment triggering operation is obtained, and color value mapping processing is performed on the color value data to obtain an image color file.
After the color value data is obtained, a color value mapping process may be performed on the color value data.
Specifically, the color value data may be mapped to a scene LUT (Look-Up Table) map to obtain an image color file.
In step S720, the light source color parameters are adjusted using the image color file.
After the image color file is obtained, the overall light range in the virtual scene can be adjusted using the LUT, and the anchor side can check the effect of the changed light source color parameters.
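As a minimal, hedged sketch of the LUT idea, a 1D per-channel table is used below for brevity (scene LUTs such as those in UE are typically 3D, and the warm-tone tables here are invented for illustration, not taken from the disclosure):

```python
def build_warm_lut():
    """Per-channel 256-entry LUT: boost red slightly, dim blue (integer math)."""
    red = [min(255, v * 11 // 10) for v in range(256)]
    green = list(range(256))
    blue = [v * 9 // 10 for v in range(256)]
    return red, green, blue

def apply_lut(pixel, lut):
    """Look each channel of an (r, g, b) pixel up in its table."""
    r, g, b = pixel
    lut_r, lut_g, lut_b = lut
    return (lut_r[r], lut_g[g], lut_b[b])

lut = build_warm_lut()
print(apply_lut((100, 100, 100), lut))  # -> (110, 100, 90)
```

The same indexing principle extends to a 3D LUT, where the (r, g, b) triple jointly indexes one table entry instead of three independent ones.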
In the exemplary embodiment, the color parameters of the light source can be adjusted in real time through the setting of the palette control, an interaction mode for adjusting the color parameters is provided for a host, the personalized effect of the virtual scene is ensured from the light color perspective, and the adjustment mode is simple and easy to operate.
After the anchor UE instance adjusts the virtual light source data, the physical light source data in the real environment where the virtual scene is located may be further changed to adapt to the light change in the virtual scene.
In an alternative embodiment, fig. 8 is a flow chart of a method for adjusting physical light source data, and as shown in fig. 8, the method at least includes the following steps: in step S810, the anchor display data under the irradiation of the adjusted virtual light source data is acquired, and the anchor display rule with the adjusted virtual light source data is determined.
After the virtual light source data is adjusted, the irradiation of the virtual light source data can affect the display effect of the face and other parts or other areas of the anchor, so that the anchor display data under the irradiation of the virtual light source data can be obtained.
The anchor display data may be a color temperature value of an anchor face, or may be other data, which is not particularly limited in the present exemplary embodiment.
Furthermore, the anchor-side UE instance may obtain a corresponding anchor display rule, that is, a fusion template of the virtual scene and the physical light.
In an alternative embodiment, the anchor display rules include: the same display rule and the complementary display rule.
The same display rule may be a rule by which, when the anchor display data is cool-tone data, the physical light source data is also adjusted to cool-tone data, that is, a rule of the same color temperature value; the complementary display rule may be a rule for adjusting the physical light source data to contrasting-color data of the anchor display data, that is, a complementary color temperature value. In addition, the same display rule and the complementary display rule may be other rules set according to actual requirements, which are not particularly limited in the present exemplary embodiment.
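A hedged sketch of the two rules, using color temperature in kelvin as the anchor display data (the neutral point and the mirror-style complementary rule are illustrative assumptions; a product would tune the actual rule):

```python
NEUTRAL_K = 4500  # assumed boundary between warm and cool tones (kelvin)

def same_rule(anchor_temp_k):
    """Same display rule: the physical light follows the anchor's color temperature."""
    return anchor_temp_k

def complementary_rule(anchor_temp_k):
    """Complementary display rule: mirror the tone around the neutral point."""
    return 2 * NEUTRAL_K - anchor_temp_k

print(same_rule(3000))           # warm stays warm -> 3000
print(complementary_rule(3000))  # warm maps to cool -> 6000
```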
In step S820, the entity light source data corresponding to the virtual scene is adjusted according to the anchor display rule and the anchor display data.
In an alternative embodiment, fig. 9 is a flow chart of a method for further adjusting physical light source data, and as shown in fig. 9, the method at least includes the following steps: in step S910, the to-be-dimmed source data corresponding to the entity light source data is obtained, and the light source data calculation is performed on the to-be-dimmed source data and the anchor display data to obtain a fusion display difference value.
The to-be-dimmed source data may be the target value to which the physical light source data needs to be adjusted.
Further, light source data calculation may be performed on the to-be-dimmed source data and the anchor display data. Specifically, difference calculation can be performed on the to-be-dimmed source data and the anchor display data to obtain a corresponding fusion display difference value.
In step S920, a fusion display threshold of the anchor display rule is obtained, and the entity light source data corresponding to the virtual scene is adjusted according to the fusion display difference value and the fusion display threshold.
The fusion display threshold is a threshold for judging whether the entity light source data needs to be adjusted.
When the fusion display difference value is larger than the fusion display threshold value, the corresponding entity light source data can be adjusted according to the fusion display difference value and the fusion display threshold value, so that when the virtual light source data is warm tone, the entity light source data is also adjusted to be warm tone, and the condition that the face color of the anchor is distorted is avoided.
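The threshold check of steps S910-S920 can be sketched as follows (the values, the function name, and the simple snap-to-target adjustment are illustrative assumptions):

```python
def adjust_physical_light(current_value, target_value, anchor_value, threshold):
    """Return the new physical light value; leave it unchanged when the fusion
    display difference is within the fusion display threshold."""
    fusion_diff = abs(target_value - anchor_value)  # light source data calculation
    if fusion_diff > threshold:
        return target_value  # adjust toward the to-be-dimmed target value
    return current_value

print(adjust_physical_light(100, 180, 120, threshold=50))  # -> 180 (adjusted)
print(adjust_physical_light(100, 140, 120, threshold=50))  # -> 100 (unchanged)
```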
The anchor-side UE instance can send the physical light source data to be adjusted to the physical light controller in the offline scene through the physical light control plug-in and a wireless communication module. After the physical light controller receives the physical light source data to be adjusted, it can parse the data and adjust the physical light source data according to the corresponding values.
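The wire format between the plug-in and the controller is not described in the text; purely as a hypothetical sketch, a dimming command could be serialized as a length-prefixed JSON frame that the controller parses back into its fields:

```python
import json
import struct

def pack_command(light_id, brightness, color_temp_k):
    """Serialize a dimming command as a length-prefixed JSON frame (hypothetical)."""
    payload = json.dumps(
        {"id": light_id, "bri": brightness, "ct": color_temp_k}
    ).encode("utf-8")
    return struct.pack(">I", len(payload)) + payload

def unpack_command(frame):
    """Parse a frame back into the command dictionary (controller side)."""
    (length,) = struct.unpack(">I", frame[:4])
    return json.loads(frame[4:4 + length].decode("utf-8"))

frame = pack_command("key_light", 80, 4500)
print(unpack_command(frame))  # -> {'id': 'key_light', 'bri': 80, 'ct': 4500}
```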
In this exemplary embodiment, the corresponding physical light source data is adjusted under the indication of the virtual light source data, achieving a fusion display effect that combines the virtual scene and the real scene and thereby completing the customization of the personalized scene at the anchor side.
In the process of adjusting the entity light source data, the entity light source data may be successfully adjusted, or the entity light source data may be failed to be adjusted, so that corresponding result identification data may be generated for indication.
In an alternative embodiment, the result identification data is generated according to the adjustment result of the entity light source data, and the result identification data is displayed on the live broadcast interface.
The result identification data may be any data capable of showing the adjustment result of the physical light source data, for example text data such as "success" or "failure", or symbol data such as a check mark and a cross, which is not particularly limited in the present exemplary embodiment. The result identification data may also be returned to the live UE instance.
And after the physical light source data is successfully adjusted, the virtual light source data can be further finely adjusted.
In an alternative embodiment, the virtual light source data is again adjusted according to the result identification data.
At this time, the result identification data may be data showing that the adjustment of the physical light source data succeeds.
In an alternative embodiment, fig. 10 shows a flow chart of a method for readjusting virtual light source data, and as shown in fig. 10, the method at least includes the following steps: in step S1010, target light source data after adjusting the virtual light source data and the physical light source data is obtained, and original light source data before adjusting the physical light source data is obtained.
The target light source data may be current light source data of the physical light source.
In step S1020, a light source mean value calculation is performed on the original light source data and the target light source data to obtain light source mean value data.
After the original light source data and the target light source data are obtained, average calculation can be carried out on the target light source data and the original light source data to obtain light source average data before and after the entity light source data are adjusted.
In step S1030, when the result identification data is that the adjustment of the physical light source data is successful, the virtual light source data is adjusted again according to the light source mean value data.
After the physical light source data is successfully adjusted, the virtual light source data can be adjusted again using the light source mean value data computed from the physical light source data before and after adjustment; that is, the virtual light source data is brought closer to this mean value so that the virtual scene stays consistent with the offline environment. The picture video stream can also be updated to the preview window in real time so that the anchor side can check the effect after the light value is changed.
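Steps S1010-S1030 can be sketched as below (an illustrative sketch; the scalar light values, the function name, and snapping directly to the mean are assumptions):

```python
def readjust_virtual(virtual_value, original_physical, target_physical, success):
    """Pull the virtual light value toward the mean of the physical values
    before and after adjustment, but only when the adjustment succeeded."""
    if not success:
        return virtual_value  # adjustment failed; keep the virtual data as-is
    return (original_physical + target_physical) / 2.0  # light source mean value

print(readjust_virtual(500.0, 400.0, 800.0, success=True))   # -> 600.0
print(readjust_virtual(500.0, 400.0, 800.0, success=False))  # -> 500.0
```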
In this exemplary embodiment, the adjustment result of the physical light source data may further fine tune the virtual light source data, so as to maximize the effect of the interaction between the virtual light source data and the physical light source data, and to maximally ensure the scene display effect of the virtual-real combination.
After the adjustment of the virtual light source data, the adjustment of the physical light source data and the fine adjustment of the virtual light source data are completed, the anchor can click a save button at the anchor end to synchronize the adjusted personalized scene to the user end so as to be presented in the live client of the user end, thereby completing the presentation of the customized virtual scene.
The method for generating the virtual scene in the embodiment of the present disclosure is described in detail below in connection with an application scene.
Fig. 11 shows an interface schematic diagram of virtual broadcasting at the anchor side in an application scenario. As shown in fig. 11, the anchor may complete virtual broadcasting through the virtual broadcast function of the live broadcast software.
Specifically, the anchor sits in front of the green screen and clicks the virtual broadcast option in the background to enter the process. A UE virtual background is then selected so that the matted effect can be previewed in real time. Next, the camera angle and other parameters are adjusted, and the start-broadcast control is clicked to complete virtual broadcasting, after which the anchor is in a virtual on-air state.
It should be noted that, after the UE virtual background is selected, the personalized scene may be customized before or during broadcasting.
Correspondingly, a user who enters the live room of the anchor during virtual broadcasting can watch the live broadcast normally and can see the personalized virtual scene adjusted by the anchor.
Fig. 12 is a schematic view of an interface of a light control panel area in an application scenario, and as shown in fig. 12, first, a light source adjustment area is provided on a live interface.
The light source adjustment area may be an area for judging whether a function trigger operation of the anchor is valid. And, the light source adjustment area may be an area belonging to any size or any shape within the live interface, which is not particularly limited in the present exemplary embodiment.
And responding to the function triggering operation acted in the light source adjusting area, and acquiring virtual light source data in the virtual scene.
The anchor can trigger an interaction event by, for example, clicking on the preview picture of the virtual live broadcast, that is, the live broadcast interface; this is the function triggering operation. The function triggering operation may also be a long press or slide operation rather than a click operation, and the present exemplary embodiment is not limited thereto.
Further, the position at which the anchor applies the function triggering operation may be determined. Only when the action position of the function triggering operation falls within the light source adjustment area can the virtual light source data in the virtual scene be acquired.
The position information of the function triggering operation may be sent by the anchor side of the live broadcast platform to the anchor-side UE instance, and the anchor-side UE instance determines whether the action position of the function triggering operation falls within the light source adjustment area.
When the action position of the function triggering operation is not located in the light source adjustment area, the anchor side continues to preview the current virtual live broadcast picture; when the action position of the function triggering operation falls within the light source adjustment area, the calling logic of the scene light control is satisfied. Specifically, the anchor-side UE instance reads the virtual light source data in the current virtual scene and then sends the virtual light source data to the anchor side of the live broadcast platform.
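The hit test that gates this call logic can be sketched as a simple point-in-rectangle check (the rectangle coordinates are a hypothetical panel region, not from the disclosure):

```python
def in_adjustment_area(x, y, area):
    """area = (left, top, width, height) in live-interface coordinates."""
    left, top, width, height = area
    return left <= x < left + width and top <= y < top + height

LIGHT_AREA = (800, 0, 480, 720)  # hypothetical light source adjustment area

print(in_adjustment_area(900, 100, LIGHT_AREA))  # True  -> read light data
print(in_adjustment_area(100, 100, LIGHT_AREA))  # False -> keep previewing
```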
Wherein the virtual light source data may include: light source type data, light source position data, and light source attribute parameters.
The light source type data may include directional lights, point lights, spotlights, and sky light; the light source position data may include the position data of each type of light source contained in the virtual scene, and the position data may take the form of three-dimensional point coordinates; the light source attribute parameters may include the light intensity of each light source, the light source color, and other parameters affecting the scene. In addition, the virtual light source data may include other data, and the light source type data, light source position data, and light source attribute parameters may each further include other data, which is not particularly limited in the present exemplary embodiment.
The virtual light source data may then also be displayed on the live interface.
Specifically, a light control panel area is generated on the live broadcast interface to display virtual light source data in the light control panel area.
After the live broadcast platform host receives the virtual light source data, a corresponding light control panel area can be generated on the live broadcast interface. The light control panel area is used for displaying the obtained virtual light source data.
And performing position projection processing on the light source position data to obtain projection position data.
In order to display the light source position data in the light control panel region, first, it is necessary to perform a position projection process on the light source position data.
Specifically, the light source position data of each light source in the virtual scene may be read, and the light source position data may be sequentially projected onto a plane formed by XY coordinate axes to obtain two-dimensional projection position data.
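The projection described above amounts to an orthographic drop of the Z component; as a minimal sketch (light names and coordinates are illustrative):

```python
def project_to_xy(position_3d):
    """Orthographic projection: drop the Z component of a 3D light position."""
    x, y, _z = position_3d
    return (x, y)

lights = {
    "point_light": (1.5, 2.0, 3.0),
    "spot_light": (-4.0, 0.5, 2.5),
}
projected = {name: project_to_xy(pos) for name, pos in lights.items()}
print(projected)  # -> {'point_light': (1.5, 2.0), 'spot_light': (-4.0, 0.5)}
```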
Based on the light source type data, the light source attribute parameters and projection position data corresponding to each item of light source type data are displayed in the light control panel area.
When the virtual light source data is displayed in the light control panel area, the light source type data can be sequentially read, and corresponding light source attribute parameters and projection position data are set for different light source type data.
Accordingly, the light source attribute parameters and the projection position data can be displayed under each item of light source type data within the light control panel area.
Wherein, the light source attribute parameters may include: a light source information parameter and a light source color parameter.
The light source color data may include a light source color value and a light source color temperature, and may also include other data reflecting the light source color, which is not particularly limited in the present exemplary embodiment. The light source information data may include light intensity and may also include other light source information data, which is not particularly limited in the present exemplary embodiment.
To show the anchor the light source attribute parameters and projection position data corresponding to different light source types more prominently, a virtual light source identifier can be generated.
And displaying the virtual light source identification corresponding to the light source type data on the live broadcast interface.
A virtual light source identifier may be generated for each light source type. The virtual light source identifier may be displayed in the style of a light bulb icon.
To display the light source attribute parameters, they can be read in turn, and controls such as light movement track mapping and light attribute and color information mapping are set for the different parameters and displayed in the light control panel area in the form of a slide bar, a color palette, or input parameter values.
Further, after the virtual light source data is displayed on the live interface, the corresponding virtual light source data can be adjusted through adjustment triggering operation.
A prescribed sliding path corresponding to the virtual light source identifier is provided in the light control area panel.
Fig. 13 shows an interface schematic diagram of a prescribed sliding path in an application scenario. As shown in fig. 13, a prescribed sliding path may be set for the light source type data represented by the bulb icon in the virtual scene. The prescribed sliding path is a mapping track of light movement, for example a 3D-to-2D mapping track, used to constrain the position of the corresponding light source; the virtual light source data can therefore be displayed on the prescribed sliding path so that the anchor can operate on the virtual light source identifier. To make viewing easier for the anchor, a dotted-line track identical to the prescribed sliding path can be generated and projected in the preview window of the anchor side so that the anchor can drag along it.
Responding to the adjustment triggering operation acted on the virtual light source identifier, sliding the virtual light source identifier according to a specified sliding path to obtain a current sliding position, and adjusting projection position data according to the current sliding position.
After the prescribed sliding path is provided, the anchor may apply an adjustment triggering operation to the virtual light source identifiers of different types of light sources according to requirements. The adjustment triggering operation may be a click operation, a long press operation, a slide operation, or the like, which is not particularly limited in the present exemplary embodiment.
The virtual light source identifier can be slid to the current sliding position according to the adjustment triggering operation, and projection position data is adjusted according to the current sliding position, so that adjustment of the virtual light source data is achieved.
The anchor can apply the adjustment triggering operation through the anchor side of the virtual live broadcast; the anchor-side UE instance then changes the light position of the corresponding light source in the virtual scene in real time according to the prescribed sliding path and updates the picture video stream to the preview window in real time. The anchor side can then check the effect after the light has been moved.
And providing a virtual light source sliding bar corresponding to the virtual light source identifier in the light control area panel, and acquiring the mapping relation of the virtual light source identifier and the current identifier parameter of the virtual light source identifier.
For the currently displayed virtual light source identifier, the anchor-side UE instance can read the corresponding attribute parameter interface to obtain the current identifier parameter.
A virtual light source slide bar can be generated in the light control area panel; the virtual light source slide bar is used to adjust the light source information parameters. Further, a mapping relation for the virtual light source identifier is established.
A mapping relation can be established between the light source information parameters and the slide bar values of the virtual light source slide bar. For example, the light intensity may take values in [0, 1000] and the virtual light source slide bar may take values in [0, 100]; a mapping relationship can then be established between the light intensity range [0, 1000] and the slide bar range [0, 100]. The mapping relationship may be linear or nonlinear, and the present exemplary embodiment is not particularly limited thereto.
And responding to the adjustment triggering operation acted on the virtual light source slide bar, and adjusting the light source information parameters according to the mapping relation and the current identification parameters.
Further, the user can apply an adjustment triggering operation to the virtual light source slide bar to adjust the light source information parameters. The adjustment triggering operation may be a click operation, a long press operation, or a slide operation, which is not particularly limited in the present exemplary embodiment.
After receiving the data of the adjustment triggering operation of the anchor terminal, the anchor terminal UE instance can change the light source information parameters in the virtual scene in real time according to the mapping relation, and update the picture video stream to the preview window in real time so that the anchor terminal can check the effect after the light value is changed.
And providing a palette control corresponding to the virtual light source identifier in the light control area panel.
The palette control may be a control that exposes a range of color values. Moreover, the palette controls of different virtual light source identifications may be the same or different, and the present exemplary embodiment is not particularly limited thereto.
And adjusting the color parameters of the light source in response to an adjustment triggering operation acting on the palette control.
The anchor-side UE instance may read the attribute parameter interface of the current virtual light source identifier, and the anchor side may return the color value range of the palette control currently selected by the anchor. The anchor-side UE instance may then adjust the light source color parameters according to the received color value data determined by the adjustment triggering operation.
And obtaining color value data corresponding to the adjustment triggering operation, and performing color value mapping processing on the color value data to obtain an image color file.
After the color value data is obtained, a color value mapping process may be performed on the color value data.
Specifically, the color value data may be mapped to a scene LUT (Look-Up Table) map to obtain an image color file.
And adjusting the color parameters of the light source by using the image color file.
After the image color file is obtained, the overall light range in the virtual scene can be adjusted using the LUT, and the anchor side can check the effect of the changed light source color parameters.
Fig. 14 shows a virtual live broadcast effect diagram after adjusting the virtual light source data in an application scenario. As shown in fig. 14, the anchor can freely customize the desired light scene by adjusting the virtual light source data in the virtual live broadcast scene. After the virtual light source data is adjusted, the lighting layout in the virtual scene is better and the virtual-real combination effect is better.
According to the method for generating the virtual scene under the application scene, the virtual light source data are displayed on the live broadcast interface, so that the host broadcast end can view the light source data in the virtual scene, and a data base is provided for autonomous interaction and personality setting of the host broadcast end. And the virtual light source data in the virtual scene is adjusted according to the adjustment triggering operation, and the anchor automatically and interactively changes the light distribution effect in the virtual scene, so that the customization of the personalized live broadcast scene is finished, the degree of automation and intelligence is high, the generation period of the virtual scene is shortened, the time cost and the labor cost of virtual scene replacement are also reduced, the requirements of users on freshness and diversification of the live broadcast scene are met, and the viewing experience of audiences is optimized.
In addition, in an exemplary embodiment of the present disclosure, a generating device of a virtual scene is further provided, and a live broadcast interface is provided through a host, where the live broadcast interface includes a video display area, and the video display area displays the virtual scene. Fig. 15 shows a schematic structural diagram of a virtual scene generating apparatus, and as shown in fig. 15, a virtual scene generating apparatus 1500 may include: the data display module 1510 and the data adjustment module 1520. Wherein:
A data display module 1510 configured to acquire virtual light source data within the virtual scene and display the virtual light source data on the live interface; a data adjustment module 1520 configured to adjust the virtual light source data in response to an adjustment trigger operation acting on the live interface.
In an exemplary embodiment of the present invention, the acquiring virtual light source data in the virtual scene includes:
Providing a light source adjustment area on the live broadcast interface;
And responding to function triggering operation acting on the light source adjustment area, and acquiring virtual light source data in the virtual scene.
In an exemplary embodiment of the present invention, the virtual light source data includes: light source type data, light source position data, and light source attribute parameters.
In an exemplary embodiment of the present invention, the displaying the virtual light source data on the live interface includes:
And generating a light control panel area on the live broadcast interface so as to display the virtual light source data in the light control panel area.
In an exemplary embodiment of the present invention, the displaying the virtual light source data in the light control panel area includes:
performing position projection processing on the light source position data to obtain projection position data;
and displaying the light source attribute parameters and the projection position data corresponding to the light source type data in the light control panel area based on the light source type data.
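The "position projection processing" step above maps a light source's 3D position onto 2D panel coordinates. A minimal sketch follows; the top-down orthographic mapping, the scene bounds, and the panel size are all assumptions, since the patent does not fix a particular projection model.

```python
# Project a light source's 3D scene position onto 2D light-control-panel
# coordinates by dropping the height axis and normalizing into pixels.

def project_to_panel(pos3d, scene_min=(-10.0, -10.0), scene_max=(10.0, 10.0),
                     panel_size=(200, 200)):
    x, _, z = pos3d  # discard the vertical (y) component for a top-down view
    u = (x - scene_min[0]) / (scene_max[0] - scene_min[0]) * panel_size[0]
    v = (z - scene_min[1]) / (scene_max[1] - scene_min[1]) * panel_size[1]
    return (round(u), round(v))

# A light at the scene origin projects to the center of a 200x200 panel.
center = project_to_panel((0.0, 5.0, 0.0))
```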
In an exemplary embodiment of the present invention, the light source attribute parameter includes: a light source information parameter and a light source color parameter.
In an exemplary embodiment of the invention, the method further comprises:
And displaying a virtual light source identifier corresponding to the light source type data on the live broadcast interface.
In an exemplary embodiment of the present invention, the adjusting the virtual light source data in response to an adjustment trigger operation acting on the live interface includes:
Providing a specified sliding path corresponding to the virtual light source identifier in the light control panel area;
Responding to the adjustment triggering operation acting on the virtual light source identifier, sliding the virtual light source identifier along the specified sliding path to obtain a current sliding position, and adjusting the projection position data according to the current sliding position.
In an exemplary embodiment of the present invention, the adjusting the virtual light source data in response to an adjustment trigger operation acting on the live interface includes:
Providing a virtual light source slide bar corresponding to the virtual light source identifier in the light control panel area, and acquiring a mapping relationship of the virtual light source identifier and the current identification parameter of the virtual light source identifier;
And responding to an adjustment triggering operation acting on the virtual light source slide bar, and adjusting the light source information parameters according to the mapping relationship and the current identification parameter.
In an exemplary embodiment of the present invention, the mapping relationship is established according to a light source identification parameter of the virtual light source identification and the light source attribute parameter.
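A simple way to realize such a mapping relationship is a linear map from the slide bar's identification parameter range to the attribute parameter range. The sketch below is illustrative only; the ranges and names are assumptions, not the patent's concrete scheme.

```python
# Build a linear mapping from a slide bar's identification parameter
# (e.g. 0..100) to a light source attribute parameter (e.g. intensity 0..1).

def make_mapping(id_range=(0, 100), attr_range=(0.0, 1.0)):
    (i0, i1), (a0, a1) = id_range, attr_range
    return lambda ident: a0 + (ident - i0) / (i1 - i0) * (a1 - a0)

slider_to_intensity = make_mapping()
intensity = slider_to_intensity(75)  # slide bar at 75 maps to intensity 0.75
```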
In an exemplary embodiment of the present invention, the adjusting the virtual light source data in response to an adjustment trigger operation acting on the live interface includes:
providing a palette control corresponding to the virtual light source identifier in the light control panel area;
And responding to an adjustment triggering operation acted on the palette control, and adjusting the color parameters of the light source.
In an exemplary embodiment of the invention, said adjusting said light source color parameter comprises:
Acquiring color value data corresponding to the adjustment triggering operation, and performing color value mapping processing on the color value data to obtain an image color file;
And adjusting the light source color parameters by using the image color file.
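The "color value mapping processing" step can be sketched as converting the palette's color value (here assumed to be a hex string) into normalized RGB channels for the light source color parameter. The hex input format is an assumption for illustration.

```python
# Map a palette color value to normalized RGB floats usable as a
# light source color parameter.

def color_value_to_rgb(color_value):
    value = color_value.lstrip("#")
    # Split the hex string into the R, G, B byte pairs and normalize to 0..1.
    r, g, b = (int(value[i:i + 2], 16) / 255.0 for i in (0, 2, 4))
    return (r, g, b)

warm_white = color_value_to_rgb("#FFCC88")
```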
In an exemplary embodiment of the invention, the method further comprises:
Acquiring anchor display data under the irradiation of the adjusted virtual light source data, and determining anchor display rules of the adjusted virtual light source data;
And adjusting the entity light source data corresponding to the virtual scene according to the anchor display rule and the anchor display data.
In an exemplary embodiment of the present invention, the adjusting entity light source data corresponding to the virtual scene according to the anchor display rule and the anchor display data includes:
Acquiring to-be-adjusted light source data corresponding to the entity light source data, and performing light source data calculation on the to-be-adjusted light source data and the anchor display data to obtain a fusion display difference value;
And acquiring a fusion display threshold value of the anchor display rule, and adjusting entity light source data corresponding to the virtual scene according to the fusion display difference value and the fusion display threshold value.
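The threshold test above can be sketched as: compute the fusion display difference value, and adjust the entity light source data only when the difference exceeds the fusion display threshold. The scalar light values and the "snap to the anchor display data" policy are assumptions for illustration.

```python
# Adjust an entity (physical) light source value against the anchor display
# data, using a fusion display threshold to decide whether any change is needed.

def adjust_entity_light(to_adjust, anchor_display, fusion_threshold):
    diff = abs(to_adjust - anchor_display)
    if diff <= fusion_threshold:
        return to_adjust, diff  # within threshold: fusion is acceptable as-is
    # Difference too large: move the entity light to match the anchor display.
    return anchor_display, 0.0

new_value, residual = adjust_entity_light(to_adjust=0.9, anchor_display=0.6,
                                          fusion_threshold=0.1)
```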
In an exemplary embodiment of the present invention, the anchor display rule includes: the same display rule and the complementary display rule.
In an exemplary embodiment of the invention, the method further comprises:
and generating result identification data according to the adjustment result of the entity light source data, and displaying the result identification data on the live broadcast interface.
In an exemplary embodiment of the invention, the method further comprises:
and adjusting the virtual light source data again according to the result identification data.
In an exemplary embodiment of the present invention, said readjusting said virtual light source data according to said result identification data comprises:
Acquiring target light source data after the virtual light source data and the entity light source data are adjusted, and acquiring original light source data before the entity light source data are adjusted;
performing light source average value calculation on the original light source data and the target light source data to obtain light source average value data;
And when the result identification data is that the entity light source data is successfully adjusted, the virtual light source data is adjusted again according to the light source mean value data.
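The light source mean value calculation above can be sketched as a per-channel average of the original and target light source data. Representing the light source data as RGB-like tuples is an assumption for the sketch.

```python
# After the entity light source is adjusted successfully, readjust the virtual
# light source to the mean of the original and target light source data.

def light_source_mean(original, target):
    return tuple((o + t) / 2.0 for o, t in zip(original, target))

mean_rgb = light_source_mean(original=(0.2, 0.4, 0.6), target=(0.6, 0.8, 1.0))
```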
The specific details of the virtual scene generating apparatus 1500 are described in detail in the corresponding virtual scene generating method, and thus are not described herein.
It should be noted that although several modules or units of the virtual scene generating apparatus 1500 are mentioned in the above detailed description, such division is not mandatory. Indeed, the features and functionality of two or more modules or units described above may be embodied in one module or unit in accordance with embodiments of the present disclosure. Conversely, the features and functions of one module or unit described above may be further divided into a plurality of modules or units to be embodied.
In addition, in an exemplary embodiment of the present disclosure, an electronic device capable of implementing the above method is also provided.
An electronic device 1600 according to such an embodiment of the invention is described below with reference to fig. 16. The electronic device 1600 shown in fig. 16 is merely an example, and should not be construed as limiting the functionality and scope of use of embodiments of the present invention.
As shown in fig. 16, the electronic device 1600 is embodied in the form of a general purpose computing device. The components of the electronic device 1600 may include, but are not limited to: the at least one processing unit 1610, the at least one memory unit 1620, a bus 1630 connecting the different system components (including the memory unit 1620 and the processing unit 1610), and a display unit 1640.
Wherein the storage unit stores program code that is executable by the processing unit 1610 such that the processing unit 1610 performs steps according to various exemplary embodiments of the present invention described in the above-described "exemplary methods" section of the present specification.
The memory unit 1620 may include readable media in the form of volatile memory units, such as Random Access Memory (RAM) 1621 and/or cache memory 1622, and may further include Read Only Memory (ROM) 1623.
The storage unit 1620 may also include a program/utility 1624 having a set (at least one) of program modules 1625, such program modules 1625 including, but not limited to: an operating system, one or more application programs, other program modules, and program data, each or some combination of which may include an implementation of a network environment.
Bus 1630 may represent one or more of several types of bus structures, including a memory unit bus or memory unit controller, a peripheral bus, an accelerated graphics port, a processing unit, or a local bus using any of a variety of bus architectures.
Electronic device 1600 may also communicate with one or more external devices 1800 (e.g., a keyboard, a pointing device, a Bluetooth device, etc.), with one or more devices that enable a user to interact with the electronic device 1600, and/or with any device (e.g., a router, a modem, etc.) that enables the electronic device 1600 to communicate with one or more other computing devices. Such communication may occur through an input/output (I/O) interface 1650. Also, the electronic device 1600 can communicate with one or more networks, such as a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network such as the Internet, through a network adapter 1660. As shown, the network adapter 1660 communicates with the other modules of the electronic device 1600 over the bus 1630. It should be appreciated that although not shown, other hardware and/or software modules may be used in connection with the electronic device 1600, including, but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, data backup storage systems, and the like.
From the above description of embodiments, those skilled in the art will readily appreciate that the example embodiments described herein may be implemented in software, or in combination with the necessary hardware. Thus, the technical solution according to the embodiments of the present disclosure may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (may be a CD-ROM, a U-disk, a mobile hard disk, etc.) or on a network, and includes several instructions to cause a computing device (may be a personal computer, a server, a terminal device, or a network device, etc.) to perform the method according to the embodiments of the present disclosure.
In an exemplary embodiment of the present disclosure, a computer-readable storage medium having stored thereon a program product capable of implementing the method described above in the present specification is also provided. In some possible embodiments, the various aspects of the invention may also be implemented in the form of a program product comprising program code for causing a terminal device to carry out the steps according to the various exemplary embodiments of the invention as described in the "exemplary methods" section of this specification, when said program product is run on the terminal device.
Referring to fig. 17, a program product 1700 for implementing the above-described method according to an embodiment of the present invention is described, which may employ a portable compact disc read only memory (CD-ROM) and include program code, and may be run on a terminal device, such as a personal computer. However, the program product of the present invention is not limited thereto, and in this document, a readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. The readable storage medium can be, for example, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium would include the following: an electrical connection having one or more wires, a portable disk, a hard disk, Random Access Memory (RAM), Read-Only Memory (ROM), Erasable Programmable Read-Only Memory (EPROM or flash memory), optical fiber, portable compact disk read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
The computer readable signal medium may include a data signal propagated in baseband or as part of a carrier wave with readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A readable signal medium may also be any readable medium that is not a readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Program code for carrying out operations of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device, partly on a remote computing device, or entirely on the remote computing device or server. In the case of remote computing devices, the remote computing device may be connected to the user computing device through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computing device (e.g., connected via the Internet using an Internet service provider).
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any adaptations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.

Claims (20)

Application: CN202111011155.1A | Priority date: 2021-08-31 | Filing date: 2021-08-31 | Title: Virtual scene generation method and device, storage medium and electronic equipment | Status: Active | Publication: CN113706719B (en)

Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
CN202111011155.1A | 2021-08-31 | 2021-08-31 | Virtual scene generation method and device, storage medium and electronic equipment

Applications Claiming Priority (1)

Application Number | Priority Date | Filing Date | Title
CN202111011155.1A | 2021-08-31 | 2021-08-31 | Virtual scene generation method and device, storage medium and electronic equipment

Publications (2)

Publication Number | Publication Date
CN113706719A (en) | 2021-11-26
CN113706719B (en) | 2024-07-12

Family

ID=78657906

Family Applications (1)

Application Number | Status | Publication | Priority Date | Filing Date
CN202111011155.1A | Active | CN113706719B (en) | 2021-08-31 | 2021-08-31

Country Status (1)

Country | Link
CN | CN113706719B (en)

Families Citing this family (8)

* Cited by examiner, † Cited by third party

Publication number | Priority date | Publication date | Assignee | Title
CN114202576B (en)* | 2021-12-13 | 2025-08-08 | 广州博冠信息科技有限公司 | Virtual scene processing method and device, storage medium, and electronic device
CN114201095A (en)* | 2021-12-14 | 2022-03-18 | 广州博冠信息科技有限公司 | Control method, device, storage medium and electronic device for live broadcast interface
CN114554240B (en)* | 2022-02-25 | 2025-01-10 | 广州博冠信息科技有限公司 | Interactive method and device in live broadcast, storage medium, and electronic device
CN115243065A (en)* | 2022-07-19 | 2022-10-25 | 广州博冠信息科技有限公司 | Method and device for scheduling light and package and electronic equipment
CN115774926A (en)* | 2022-11-18 | 2023-03-10 | 广州彩熠灯光股份有限公司 | Simulation model of light source display effect, generation method, system, medium and device
CN117424969B (en)* | 2023-10-23 | 2024-12-10 | 神力视界(深圳)文化科技有限公司 | Lighting control method, device, mobile terminal and storage medium
CN117440184B (en)* | 2023-12-20 | 2024-03-26 | 深圳市亿莱顿科技有限公司 | Live broadcast equipment and control method thereof
CN119031150A (en)* | 2024-07-29 | 2024-11-26 | 北京达佳互联信息技术有限公司 | Virtual background generation method, device, electronic device and storage medium

Citations (1)

* Cited by examiner, † Cited by third party

Publication number | Priority date | Publication date | Assignee | Title
CN112116695A (en)* | 2020-09-24 | 2020-12-22 | 广州博冠信息科技有限公司 | Virtual light control method and device, storage medium and electronic equipment

Family Cites Families (8)

* Cited by examiner, † Cited by third party

Publication number | Priority date | Publication date | Assignee | Title
GB2519363A (en)* | 2013-10-21 | 2015-04-22 | Nokia Technologies Oy | Method, apparatus and computer program product for modifying illumination in an image
GB201709199D0 (en)* | 2017-06-09 | 2017-07-26 | Delamont Dean Lindsay | IR mixed reality and augmented reality gaming system
CN109785423B (en)* | 2018-12-28 | 2023-10-03 | 广州方硅信息技术有限公司 | Image light supplementing method and device and computer equipment
CN111050189B (en)* | 2019-12-31 | 2022-06-14 | 成都酷狗创业孵化器管理有限公司 | Live broadcast method, apparatus, device and storage medium
CN111756956B (en)* | 2020-06-23 | 2023-04-14 | 网易(杭州)网络有限公司 | Virtual light control method and device, medium and equipment in virtual studio
CN112188228A (en)* | 2020-09-30 | 2021-01-05 | 网易(杭州)网络有限公司 | Live broadcast method and device, computer readable storage medium and electronic equipment
CN112562056B (en)* | 2020-12-03 | 2024-11-22 | 广州博冠信息科技有限公司 | Control method, device, medium and equipment for virtual lighting in virtual studio
CN112770135B (en)* | 2021-01-21 | 2021-12-10 | 腾讯科技(深圳)有限公司 | Live broadcast-based content explanation method and device, electronic equipment and storage medium

Patent Citations (1)

* Cited by examiner, † Cited by third party

Publication number | Priority date | Publication date | Assignee | Title
CN112116695A (en)* | 2020-09-24 | 2020-12-22 | 广州博冠信息科技有限公司 | Virtual light control method and device, storage medium and electronic equipment

Also Published As

Publication number | Publication date
CN113706719A | 2021-11-26

Similar Documents

Publication | Title
CN113706719B (en) | Virtual scene generation method and device, storage medium and electronic equipment
CN111698390B (en) | Virtual camera control method and device, and virtual studio implementation method and system
US11856322B2 (en) | Display apparatus for image processing and image processing method
CN113436343B (en) | Picture generation method and device for virtual concert hall, medium and electronic equipment
CN106534343B (en) | Exhibition hall cloud control system
CN112449229B (en) | Sound and picture synchronous processing method and display equipment
CN109887066B (en) | Lighting effect processing method and device, electronic equipment and storage medium
CN111866596A (en) | Bullet screen publishing and displaying method and device, electronic equipment and storage medium
CN112543344B (en) | Live broadcast control method and device, computer readable medium and electronic equipment
CN110780598B (en) | Intelligent device control method and device, electronic device and readable storage medium
US11917329B2 (en) | Display device and video communication data processing method
CN111683260A (en) | Program video generation method, system and storage medium based on virtual anchor
CN114092671B (en) | Virtual live scene processing method and device, storage medium, and electronic device
CN113676690A (en) | Method, device and storage medium for realizing video conference
CN114302221B (en) | Virtual reality equipment and screen-throwing media asset playing method
CN112533037A (en) | Method for generating Lian-Mai chorus works and display equipment
CN113407289A (en) | Wallpaper switching method, wallpaper generation method, device and storage medium
CN112423052A (en) | Display system and display method
CN114286077B (en) | Virtual reality device and VR scene image display method
CN112269553B (en) | Display system, display method and computing device
CN112017264B (en) | Display control method and device for virtual studio, storage medium and electronic equipment
CN112399225B (en) | Service management method for projection hall and display equipment
CN115129280B (en) | Virtual reality device and screen projection media playback method
CN114339174B (en) | Projection equipment and control method thereof
CN111385631A (en) | Display device, communication method and storage medium

Legal Events

Code | Title
PB01 | Publication
SE01 | Entry into force of request for substantive examination
GR01 | Patent grant
