CN116152423A - Virtual reality live broadcasting room illumination processing method, device, equipment and storage medium - Google Patents

Virtual reality live broadcasting room illumination processing method, device, equipment and storage medium

Info

Publication number
CN116152423A
CN116152423A
Authority
CN
China
Prior art keywords
virtual
real
illumination
anchor
information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310172343.5A
Other languages
Chinese (zh)
Inventor
曾衍
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangzhou Cubesili Information Technology Co Ltd
Original Assignee
Guangzhou Cubesili Information Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangzhou Cubesili Information Technology Co Ltd
Priority to CN202310172343.5A
Publication of CN116152423A
Legal status: Pending (current)

Abstract

The application relates to the technical field of network live broadcasting, and provides a virtual reality live broadcast room illumination processing method, device, electronic equipment and storage medium. By combining the real illumination information of the anchor terminal, the method and device can simulate a more realistic illumination effect in the virtual reality live broadcast room. The method comprises the following steps: displaying an anchor avatar in a virtual reality live broadcast room; acquiring the real geographic position and real orientation of the anchor terminal through a sensor of the anchor terminal; acquiring real illumination information corresponding to the anchor terminal according to the real geographic position and real orientation; obtaining corresponding virtual illumination information based on the real illumination information and the virtual position and virtual orientation of the anchor avatar in the virtual reality live broadcast room; and displaying a corresponding illumination effect in the virtual reality live broadcast room according to the virtual illumination information.

Description

Virtual reality live broadcasting room illumination processing method, device, equipment and storage medium
Technical Field
The present disclosure relates to the field of network live broadcast technologies, and in particular, to a virtual reality live broadcast room illumination processing method, device, electronic device, and computer-readable storage medium.
Background
With the development of network live broadcast technology, live broadcast modes are increasingly diverse, and virtual reality live broadcast is an emerging mode that immerses users in a virtual reality live broadcast room and improves the live broadcast experience. Virtual reality live broadcast combines virtual reality with live broadcast: it upgrades the ordinary two-dimensional planar live broadcast the user usually sees into a panoramic virtual reality live broadcast. Compared with ordinary two-dimensional planar live broadcast, virtual reality live broadcast gives the audience a feeling of being personally on the scene; the audience is no longer constrained by a fixed live broadcast picture, since the picture changes freely with the viewing angle, bringing the audience a brand-new visual live broadcast experience.
In a virtual reality live broadcast room, an anchor may be represented by an avatar, and the anchor can control that avatar. The anchor avatar can be placed in a virtual scene, and a virtual light source is needed in the virtual scene to build a corresponding illumination effect, so as to achieve a better display effect in the virtual reality live broadcast room and improve the live broadcast experience.
Current technology generally sets a fixed light source to generate the illumination effect in a virtual reality live broadcast room, which makes the displayed illumination effect relatively monotonous and its sense of reality relatively poor.
Disclosure of Invention
In view of the foregoing, it is desirable to provide a virtual reality live room illumination processing method, apparatus, electronic device, and computer readable storage medium.
In a first aspect, the application provides a virtual reality live broadcasting room illumination processing method. The method comprises the following steps:
displaying an anchor avatar in a virtual reality live broadcast room;
acquiring the real geographic position and real orientation of the anchor terminal through a sensor of the anchor terminal;
acquiring real illumination information corresponding to the anchor terminal according to the real geographic position and the real orientation;
obtaining corresponding virtual illumination information based on the real illumination information and the virtual position and virtual orientation of the anchor avatar in the virtual reality live broadcast room;
and displaying the corresponding illumination effect in the virtual reality live broadcast room according to the virtual illumination information.
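The five claimed steps can be sketched end to end as follows. This is a minimal illustrative sketch in Python; all class and function names (`RealPose`, `LightInfo`, `process_lighting`, and so on) are hypothetical and not taken from the patent, and the placeholder illumination lookup stands in for the server-side computation described later in the description.

```python
from dataclasses import dataclass

# Illustrative data holders; field and function names are hypothetical.
@dataclass
class RealPose:
    latitude: float     # degrees north
    longitude: float    # degrees east
    heading_deg: float  # clockwise from true north

@dataclass
class LightInfo:
    azimuth_deg: float  # light bearing relative to the subject
    intensity: float    # arbitrary units

def get_real_illumination(pose: RealPose) -> LightInfo:
    # Placeholder for steps S202-S203: a real system would query a
    # sun-position / weather service using the geographic position.
    return LightInfo(azimuth_deg=(180.0 - pose.heading_deg) % 360.0,
                     intensity=1.0)

def map_to_virtual(real: LightInfo, avatar_heading_deg: float) -> LightInfo:
    # Step S204: keep the light's bearing relative to the avatar equal
    # to its bearing relative to the real anchor terminal.
    return LightInfo(azimuth_deg=(real.azimuth_deg + avatar_heading_deg) % 360.0,
                     intensity=real.intensity)

def process_lighting(pose: RealPose, avatar_heading_deg: float) -> LightInfo:
    real = get_real_illumination(pose)               # steps 2-3
    return map_to_virtual(real, avatar_heading_deg)  # step 4 (step 5 = render)
```

A renderer would then place a virtual light source at the returned bearing, which corresponds to step S205.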
In a second aspect, the present application provides a virtual reality live broadcast room illumination processing apparatus. The apparatus comprises:
an avatar display module, configured to display the anchor avatar in the virtual reality live broadcast room;
a first information acquisition module, configured to acquire the real geographic position and real orientation of the anchor terminal through a sensor of the anchor terminal;
a second information acquisition module, configured to acquire real illumination information corresponding to the anchor terminal according to the real geographic position and the real orientation;
a third information acquisition module, configured to obtain corresponding virtual illumination information based on the real illumination information and the virtual position and virtual orientation of the anchor avatar in the virtual reality live broadcast room;
and an illumination display module, configured to display the corresponding illumination effect in the virtual reality live broadcast room according to the virtual illumination information.
In a third aspect, the present application provides an electronic device. The electronic device comprises a memory and a processor, the memory stores a computer program, and the processor executes the computer program to realize the following steps:
displaying an anchor avatar in a virtual reality live broadcast room; acquiring the real geographic position and real orientation of the anchor terminal through a sensor of the anchor terminal; acquiring real illumination information corresponding to the anchor terminal according to the real geographic position and the real orientation; obtaining corresponding virtual illumination information based on the real illumination information and the virtual position and virtual orientation of the anchor avatar in the virtual reality live broadcast room; and displaying the corresponding illumination effect in the virtual reality live broadcast room according to the virtual illumination information.
In a fourth aspect, the present application provides a computer-readable storage medium. The computer readable storage medium having stored thereon a computer program which when executed by a processor performs the steps of:
displaying an anchor avatar in a virtual reality live broadcast room; acquiring the real geographic position and real orientation of the anchor terminal through a sensor of the anchor terminal; acquiring real illumination information corresponding to the anchor terminal according to the real geographic position and the real orientation; obtaining corresponding virtual illumination information based on the real illumination information and the virtual position and virtual orientation of the anchor avatar in the virtual reality live broadcast room; and displaying the corresponding illumination effect in the virtual reality live broadcast room according to the virtual illumination information.
According to the virtual reality live broadcast room illumination processing method, device, electronic equipment and storage medium, an anchor avatar is displayed in the virtual reality live broadcast room; the real geographic position and real orientation of the anchor terminal are obtained through a sensor of the anchor terminal; real illumination information corresponding to the anchor terminal is obtained according to the real geographic position and real orientation; corresponding virtual illumination information is obtained based on the real illumination information and the virtual position and virtual orientation of the anchor avatar in the virtual reality live broadcast room; and the corresponding illumination effect is displayed in the virtual reality live broadcast room according to the virtual illumination information. Because the scheme combines the real illumination information of the anchor terminal, a realistic illumination effect can be simulated in the virtual reality live broadcast room. This enriches the expression forms of the illumination effect in the virtual reality live broadcast room and makes the illumination effect more realistic; in a metaverse live broadcast scene, it likewise enriches the expression forms of the illumination effect, simulates a real illumination effect, and improves the metaverse live broadcast experience.
Drawings
Fig. 1 is an application scenario diagram of a virtual reality live broadcasting room illumination processing method in an embodiment of the present application;
fig. 2 is a flow chart of a virtual reality live room illumination processing method in an embodiment of the present application;
FIG. 3 is a flowchart illustrating a step of determining a virtual light source type according to an embodiment of the present application;
FIG. 4 is a flowchart illustrating steps of displaying corresponding lighting effects according to an embodiment of the present disclosure;
fig. 5 is a schematic flow chart of a virtual reality live room illumination processing method in an embodiment of the present application;
fig. 6 is a block diagram of a virtual reality live room lighting processing device in an embodiment of the present application;
fig. 7 is an internal structural diagram of an electronic device in an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application will be further described in detail with reference to the accompanying drawings and examples. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the present application.
The virtual reality live broadcast room illumination processing method of the present application can be applied to the application scenario shown in fig. 1. The application scenario may include a terminal and a server; the terminal may specifically include a live broadcast room anchor terminal and a plurality of viewer terminals (such as viewer terminal 1 and viewer terminal 2). The anchor terminal and the viewer terminals can each communicate with the server through the Internet, and the server provides live-broadcast-related services for the anchor terminal and the viewer terminals of the live broadcast room. The terminal can be, but is not limited to, a smart phone, a tablet computer, a head-mounted virtual reality device and the like; the server may be implemented as a stand-alone server or as a server cluster composed of a plurality of servers. The application scenario can be a metaverse live broadcast scene: the server can create independent metaverse scenes for the anchor and viewers of the live broadcast room, building a virtual space parallel to the real world, that is, a virtual world that maps to and interacts with the real world by technological means. Three-dimensional avatars can be displayed in the virtual space, and the anchor and viewers can freely interact with the three-dimensional avatars there. Specifically, a user in the live broadcast room can perform human-computer interaction through terminals such as mobile phones and head-mounted virtual reality devices to realize basic control operations such as gesture and movement switching and entering or exiting; the terminal can receive and display data such as live streams sent by the server, display the metaverse scene, and provide metaverse information for the user.
For a metaverse live broadcast scene, the anchor can start a metaverse live broadcast. An anchor avatar simulating the anchor's facial and body features can be generated by matting out the anchor's figure and applying a three-dimensional virtual person generation technology, or a pre-designed three-dimensional avatar can be directly selected as the anchor avatar for the live broadcast, so the anchor avatar carries corresponding three-dimensional information. After the live broadcast starts, the anchor avatar can be placed in a virtual three-dimensional space; for example, in a concert live broadcast scene, the anchor avatar can be placed on the concert stage. The position of a virtual camera can be adjusted in the virtual space; for convenience of calculation, the virtual camera can be placed at the origin, and the positions of the stage and the anchor avatar can be adjusted so that the virtual camera shoots the stage at a good angle. During the live broadcast, the anchor terminal can collect data about the anchor through its sensors (such as acceleration, speed, direction, light and sound sensors); this data can be converted into limb movements or sounds of the anchor avatar during the live broadcast, and real illumination effects can be simulated in the live broadcast room by combining the sensor data, thereby achieving a better combination of the virtual and the real and improving the live broadcast experience.
In live broadcast scenes including but not limited to metaverse live broadcast, the virtual reality live broadcast room illumination processing method provided by the application acquires the real geographic position and real orientation of the anchor terminal through its sensors and obtains the corresponding real illumination information; it then obtains the corresponding virtual illumination information based on the real illumination information and the virtual position and virtual orientation of the anchor avatar in the live broadcast room, and displays the corresponding illumination effect in the live broadcast room according to the virtual illumination information. In this way, a real illumination effect can be simulated in the live broadcast room in combination with the real illumination information of the anchor terminal, which enriches the expression forms of the illumination effect in the live broadcast room, gives the illumination effect a sense of reality, and improves the live broadcast experience.
The following describes a method for processing illumination of a virtual reality live broadcasting room according to the present application based on an application environment as shown in fig. 1, with reference to various embodiments and corresponding drawings.
In one embodiment, as shown in fig. 2, there is provided a virtual reality live room illumination processing method, which may be applied to a terminal as in fig. 1, and the method may include the steps of:
step S201, a main cast avatar is shown in the virtual reality live room.
In this step, when the anchor starts a virtual live broadcast at the anchor terminal, an avatar simulated from the anchor, that is, the anchor avatar, can be set in a virtual scene, which may be a three-dimensional virtual scene; the anchor can also directly select a pre-designed three-dimensional avatar as the anchor avatar for the live broadcast. The anchor terminal and each viewer terminal in the virtual live broadcast room can then display the anchor avatar and the virtual scene. A virtual light source is needed in the virtual live broadcast room to display the corresponding illumination effect; in the subsequent steps of the application, the sensors of the anchor terminal collect relevant data, a real illumination effect is simulated in the virtual live broadcast room accordingly, and the live broadcast experience is improved.
Step S202, acquiring the real geographic position and the real orientation of the anchor through a sensor of the anchor.
Step S203, obtaining real illumination information corresponding to the anchor terminal according to the real geographic position and the real orientation.
Step S204, based on the real illumination information and the virtual position and virtual orientation of the host avatar in the virtual reality live broadcast room, corresponding virtual illumination information is obtained.
Steps S202 to S204 acquire, through data collected by the sensors of the anchor terminal, the virtual illumination information corresponding to the real illumination conditions at the anchor terminal. In step S202, the anchor terminal is generally configured with various sensors, and one or more of them can be used to obtain the real geographic position and real orientation of the anchor terminal. The real geographic position refers to the geographic position of the anchor terminal in the real world and can be represented by longitude and latitude; the real orientation refers to the orientation of the anchor terminal in the real world and can be represented relative to the four cardinal directions. As an example, the real geographic position of the anchor terminal may be obtained by a GPS (Global Positioning System) sensor among the various sensors of the anchor terminal, and the real orientation may be collected by a direction sensor. Then, in step S203, the real geographic position and real orientation of the anchor terminal may be sent to the server, and the server calculates, according to them, the real illumination information corresponding to the anchor terminal at the current time. The real illumination information may include information such as the position of the real illumination relative to the anchor terminal; in a specific implementation this may be approximate azimuth information, such as upper right, upper left, or in front of the anchor terminal, or a precise azimuth angle relative to the anchor terminal.
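As one way the server could realize step S203 for outdoor daylight, the sun's direction can be estimated from the anchor's latitude, day of year, and local solar time using a textbook low-accuracy solar-position approximation, then expressed as a bearing relative to the direction the anchor terminal faces. The formulas below are a standard approximation assumed for illustration; the patent does not specify any particular computation.

```python
import math

def solar_position(lat_deg, day_of_year, solar_hour):
    """Return (elevation_deg, azimuth_deg clockwise from north).

    Low-accuracy textbook model: Cooper-style declination plus the
    standard elevation/azimuth spherical-trigonometry formulas.
    """
    lat = math.radians(lat_deg)
    # Approximate solar declination for the given day of the year.
    decl = math.radians(
        -23.44 * math.cos(math.radians(360.0 / 365.0 * (day_of_year + 10))))
    hour_angle = math.radians(15.0 * (solar_hour - 12.0))  # 0 at solar noon

    sin_el = (math.sin(lat) * math.sin(decl)
              + math.cos(lat) * math.cos(decl) * math.cos(hour_angle))
    elevation = math.asin(max(-1.0, min(1.0, sin_el)))

    cos_az = ((math.sin(decl) - math.sin(elevation) * math.sin(lat))
              / (math.cos(elevation) * math.cos(lat)))
    azimuth = math.acos(max(-1.0, min(1.0, cos_az)))
    if hour_angle > 0:                 # afternoon: sun is west of south
        azimuth = 2 * math.pi - azimuth
    return math.degrees(elevation), math.degrees(azimuth)

def bearing_relative_to_anchor(sun_azimuth_deg, anchor_heading_deg):
    """Sun bearing relative to the direction the anchor terminal faces."""
    return (sun_azimuth_deg - anchor_heading_deg) % 360.0
```

For example, near the March equinox at solar noon the model places the sun due south for a northern-hemisphere observer, at an elevation of roughly 90 degrees minus the latitude.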
Then, in step S204, the server may obtain the virtual illumination information corresponding to the real illumination conditions of the anchor terminal according to the calculated real illumination information at the current time and the virtual position and virtual orientation of the anchor avatar in the virtual reality live broadcast room. The position and orientation of the anchor avatar in the virtual live broadcast room are recorded as the virtual position and virtual orientation. As described above, the anchor terminal can collect the anchor's motion data through its acceleration, speed, direction and other sensors during the live broadcast, and this data can be converted into motions of the anchor avatar, so the anchor avatar may change its virtual position and virtual orientation, for example by moving and turning in the virtual live broadcast room. To simulate a real illumination effect in the virtual live broadcast room, step S204 obtains the virtual illumination information according to the real illumination information acquired in step S203. The virtual illumination information may correspondingly include the azimuth information of the virtual light source relative to the anchor avatar, such as upper right, upper left, or in front of the anchor avatar, or a precise azimuth angle relative to the anchor avatar.
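One plausible realization of step S204 is to keep the virtual light source at the same bearing and elevation relative to the anchor avatar as the real light has relative to the anchor terminal, then convert that bearing into a position in the virtual scene. The function below is an illustrative sketch under that assumption (Y axis up), not the patent's specified method.

```python
import math

def virtual_light_position(rel_azimuth_deg, elevation_deg,
                           avatar_pos, avatar_heading_deg, distance=10.0):
    """Place a virtual light `distance` units from the avatar so that its
    bearing relative to the avatar's facing direction, and its elevation,
    match the real light's bearing relative to the anchor terminal.

    `avatar_pos` is (x, y, z) with Y up; headings/azimuths are degrees
    clockwise when viewed from above.
    """
    az = math.radians((avatar_heading_deg + rel_azimuth_deg) % 360.0)
    el = math.radians(elevation_deg)
    x = avatar_pos[0] + distance * math.cos(el) * math.sin(az)
    y = avatar_pos[1] + distance * math.sin(el)
    z = avatar_pos[2] + distance * math.cos(el) * math.cos(az)
    return (x, y, z)
```

For instance, a light bearing 90 degrees relative to an avatar facing the virtual north ends up directly to the avatar's right in the scene.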
Step S205, according to the virtual illumination information, the corresponding illumination effect is displayed in the virtual reality live broadcasting room.
In this step, the terminal can receive the virtual illumination information fed back by the server and display the corresponding illumination effect in the virtual reality live broadcast room according to it, so that a more realistic illumination effect corresponding to the real illumination conditions of the anchor terminal is simulated in the virtual reality live broadcast room. As a possible implementation, a virtual light source of a preset type, such as a point light source, can be placed at a certain distance above and to the right of the anchor avatar, with its illumination intensity set to a preset value, to simulate an illumination effect corresponding to the real illumination conditions of the anchor terminal.
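The preset fallback just described (a point light a fixed distance above and to the right of the avatar, at a preset intensity) could look like the following sketch; the field names and offset values are invented for illustration and are not specified by the patent.

```python
def preset_point_light(avatar_pos, offset=(2.0, 3.0, 0.0), intensity=0.8):
    """Build a preset point-light description relative to the avatar.

    `offset` is (right, up, forward) of the avatar in scene units;
    the returned dict is a hypothetical renderer-agnostic description.
    """
    x, y, z = avatar_pos
    dx, dy, dz = offset
    return {
        "type": "point",
        "position": (x + dx, y + dy, z + dz),
        "intensity": intensity,
    }
```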
According to the above virtual reality live broadcast room illumination processing method, an anchor avatar is displayed in the virtual reality live broadcast room; the real geographic position and real orientation of the anchor terminal are obtained through a sensor of the anchor terminal; real illumination information corresponding to the anchor terminal is obtained according to the real geographic position and real orientation; corresponding virtual illumination information is obtained based on the real illumination information and the virtual position and virtual orientation of the anchor avatar in the virtual reality live broadcast room; and the corresponding illumination effect is displayed in the virtual reality live broadcast room according to the virtual illumination information. By combining the real illumination information of the anchor terminal, a realistic illumination effect can be simulated in the virtual reality live broadcast room, which enriches the expression forms of the illumination effect and makes it more realistic; in a metaverse live broadcast scene, the scheme likewise enriches the expression forms of the illumination effect, simulates a real illumination effect, and improves the metaverse live broadcast experience.
In some embodiments, further, the method of the present application may further comprise the steps of:
acquiring the real light intensity and the real temperature of the anchor terminal through a sensor of the anchor terminal; and acquiring real weather information of the anchor terminal according to the real geographic position.
The real light intensity of the anchor terminal's real environment can be collected by a light sensor among the various sensors of the anchor terminal, and the real temperature by a temperature sensor; the server can also obtain real weather information of the anchor terminal's real environment according to the real geographic position collected by the GPS sensor.
Based on this, in step S203, according to the real geographic location and the real orientation, the obtaining real illumination information corresponding to the anchor terminal further includes: and acquiring the real illumination information corresponding to the anchor terminal according to the real geographic position, the real orientation and the real light intensity.
That is, when the real light intensity is collected, the server may further obtain real illumination information corresponding to the anchor terminal according to the real geographic position, the real orientation and the real light intensity; besides the azimuth of the real illumination relative to the anchor terminal, the real illumination information may then include information such as the real light intensity illuminating the anchor terminal.
Based on this, the step S204 of acquiring the corresponding virtual illumination information based on the real illumination information and the virtual position and virtual orientation of the anchor avatar in the virtual reality live broadcast room further includes: and acquiring corresponding virtual illumination information according to the real illumination information, the real weather information and the real temperature.
Specifically, when real weather information and the real temperature are collected, the server can further obtain virtual illumination information better suited to the illumination environment of the anchor terminal according to the real illumination information, the real weather information and the real temperature. Besides the azimuth of the virtual light source relative to the anchor avatar in the virtual reality live broadcast room, the virtual illumination information may include information such as the illumination intensity and light color of the virtual light source. As an example, the azimuth and illumination intensity of the virtual light source relative to the anchor avatar can be set according to the real illumination information, and the light color of the virtual light source can be adjusted according to the real weather information and real temperature. Therefore, when the anchor terminal is configured with the corresponding sensors and device performance allows, the scheme of this embodiment can invoke these sensors to collect relevant data, so that an illumination effect better adapted to the real illumination environment of the anchor terminal is simulated in the virtual reality live broadcast room, further enhancing the sense of reality of the illumination effect.
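As a hedged illustration of how weather and temperature readings might adjust the light color and intensity as described above: the tint table and blending constants below are invented for the example and are not specified by the patent.

```python
# Hypothetical per-weather base tints as (R, G, B) in [0, 1].
WEATHER_TINTS = {
    "clear":  (1.00, 0.96, 0.88),   # warm direct sunlight
    "cloudy": (0.85, 0.88, 0.95),   # diffuse bluish light
    "rain":   (0.70, 0.75, 0.85),
}

def virtual_light_params(real_intensity, weather, temperature_c):
    """Pick a light color and intensity from sensed conditions.

    Shifts the tint slightly warmer in hot weather and cooler in cold
    weather, and dims the light under non-clear weather.
    """
    r, g, b = WEATHER_TINTS.get(weather, (1.0, 1.0, 1.0))
    warm = max(0.0, min(1.0, (temperature_c - 10.0) / 30.0))
    color = (min(1.0, r + 0.05 * warm), g, max(0.0, b - 0.05 * warm))
    scale = 1.0 if weather == "clear" else 0.6
    return {"color": color, "intensity": real_intensity * scale}
```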
In some embodiments, as shown in fig. 3, the method of the present application may further include the following steps: step S310, obtaining the camera-on state of the anchor terminal; step S320, determining the corresponding virtual light source type according to the camera-on state. Based on this, displaying the corresponding illumination effect in the virtual reality live broadcast room according to the virtual illumination information in step S205 further includes: displaying the corresponding illumination effect in the virtual reality live broadcast room according to the virtual illumination information and a virtual light source of the corresponding virtual light source type.
In this embodiment, the camera-on state of the anchor terminal may be obtained, that is, whether the anchor has turned on the camera for the virtual live broadcast may be determined, and the corresponding virtual light source type may then be selected according to the camera-on state. The selectable virtual light source types may generally include a point light source type, a parallel light source type, a spotlight type, an area light source type, and the like, so that a virtual light source type adapted to the anchor's live environment can be selected as far as possible in combination with the camera-on state, avoiding the use of a fixed virtual light source type to generate the illumination effect every time and further enhancing the realism of the illumination effect in the virtual reality live broadcast room.
As shown in fig. 3, in some embodiments, step S320, that is, determining the corresponding virtual light source type according to the camera-on state, may include: step S3210, if the camera is turned on, identifying whether the anchor is indoors or outdoors through the camera; step S3220, selecting the corresponding virtual light source type according to whether the anchor is indoors or outdoors.
In this embodiment, when the camera is turned on, whether the anchor is broadcasting indoors or outdoors can be identified through image recognition on the camera feed, so that the corresponding virtual light source type can be selected from the selectable types according to whether the anchor is indoors or outdoors, so as to adapt to the real live broadcast scene.
As an embodiment, as shown in fig. 3, step S3220, that is, selecting a corresponding virtual light source type according to whether the anchor is indoors or outdoors, specifically includes: step S3221, if the anchor is indoors, selecting the point light source type as the corresponding virtual light source type; step S3222, if the anchor is outdoors, selecting the parallel light source type as the corresponding virtual light source type.
In this embodiment, when the anchor turns on the camera to perform virtual live broadcast and is identified through the camera as being outdoors, the parallel light source type may be selected as the corresponding virtual light source type, that is, the corresponding illumination effect is generated in the virtual reality live broadcast room through a parallel light source. Specifically, the azimuth and illumination intensity of the parallel light source relative to the anchor avatar may be set according to the real illumination information, and the light color of the parallel light source may be adjusted according to the real weather information and real temperature, so as to generate an illumination effect adapted to the live broadcast scene and its real illumination environment. When the anchor turns on the camera to perform virtual live broadcast and is identified as being indoors, the point light source type may be selected as the corresponding virtual light source type, that is, the corresponding illumination effect is generated in the virtual reality live broadcast room through a point light source; correspondingly, the azimuth and illumination intensity of the point light source relative to the anchor avatar may be set according to the real illumination information, and the light color of the point light source may be adjusted according to the real weather information and real temperature to generate an adapted illumination effect.
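The indoor/outdoor branch described above can be sketched as follows. This is a minimal illustration rather than the patent's implementation: all function and field names are invented, and the weather-to-color tint table and temperature adjustment are assumptions.

```python
def build_virtual_light(is_indoor, real_azimuth_deg, real_intensity,
                        weather, temperature_c):
    """Pick a point light indoors, a parallel light outdoors, then derive
    its parameters from the real illumination environment."""
    # Illustrative weather -> light-color tints (not from the patent).
    tints = {"sunny": (1.0, 0.97, 0.9), "cloudy": (0.85, 0.87, 0.9),
             "rainy": (0.7, 0.73, 0.8)}
    color = tints.get(weather, (1.0, 1.0, 1.0))
    if temperature_c < 0:
        # Assumed adjustment: shift toward a cooler cast in freezing weather.
        color = (color[0] * 0.9, color[1] * 0.95, color[2])
    return {
        "type": "point" if is_indoor else "parallel",
        "azimuth_deg": real_azimuth_deg,  # bearing relative to the avatar
        "intensity": real_intensity,      # mapped from the light sensor
        "color": color,
    }
```

A caller would feed this with the sensor-derived real illumination information and hand the resulting description to the renderer.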
As shown in fig. 3, in other embodiments, step S320 of determining the corresponding virtual light source type according to the on state of the camera may include: step S3230, if the camera is not turned on, determining that the corresponding virtual light source type is a preset type, or selecting the corresponding virtual light source type according to the scene range feature of the virtual reality live broadcasting room.
In this embodiment, when the anchor has not turned on the camera to perform virtual live broadcast, whether the anchor is indoors or outdoors cannot be identified through the camera, and two schemes are provided for selecting the virtual light source type. In the first scheme, a preset type is selected as the corresponding virtual light source type; for example, the parallel light source type may be uniformly selected, that is, the corresponding illumination effect is uniformly generated in the virtual reality live broadcast room through a parallel light source, after which the azimuth and illumination intensity of the parallel light source relative to the anchor avatar are set according to the real illumination information, and the light color of the parallel light source is adjusted according to the real weather information and real temperature. In the second scheme, the virtual light source type is selected according to the scene range feature of the virtual reality live broadcast room. The scene range feature may specifically be the size of the three-dimensional virtual scene arranged in the virtual reality live broadcast room, or the size of the range within which the anchor avatar can roam in the three-dimensional virtual scene; a virtual light source type matched with the scene range feature is selected according to that size, where the selectable types include the point light source type, parallel light source type, spotlight type, area light source type, and the like as described above. The azimuth and illumination intensity of the selected virtual light source relative to the anchor avatar are then set according to the real illumination information, and its light color is adjusted according to the real weather information and real temperature.
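The camera-off fallback can be sketched as a small selection function. The scene-extent thresholds below are purely illustrative assumptions; the patent does not specify numeric cut-offs.

```python
def pick_light_type_without_camera(scene_extent_m=None):
    """Camera off: fall back to the preset parallel light, or pick a type
    by the scene range feature when it is known. Thresholds are invented."""
    if scene_extent_m is None:
        return "parallel"  # preset scheme: uniform parallel light source
    if scene_extent_m < 5.0:
        return "point"     # small, room-like scene
    if scene_extent_m < 20.0:
        return "area"      # medium scene: softer area light
    return "parallel"      # wide scene: sun-like parallel light
```

The returned type name would then be combined with the real illumination, weather, and temperature data to parameterize the light, as in the surrounding text.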
In some embodiments, as shown in fig. 4, displaying the corresponding illumination effect in the virtual reality live broadcast room according to the virtual illumination information in step S205 may include:
step S401, acquiring the device information of the local end;
step S402, judging whether the performance of the local end reaches the preset performance according to the device information of the local end;
step S403, if the performance of the local end reaches the preset performance, displaying the corresponding illumination effect with real-time illumination in the virtual reality live broadcast room according to the virtual illumination information;
step S404, if the performance of the local end does not reach the preset performance, displaying the corresponding illumination effect in the virtual reality live broadcast room with an illumination map according to the virtual illumination information.
In this embodiment, it is considered that real-time baking of illumination consumes considerable performance, so whether to bake with real-time illumination or with an illumination map is decided according to the performance of the device. Specifically, the device information of the local end is obtained first; the local end may be the anchor end or a viewer end of the virtual reality live broadcast room, and the device information may include hardware configuration information of the local end device. Whether the performance of the local end reaches the preset performance is then judged according to the device information, for example whether the local end has specific hardware, or whether the parameters of that hardware meet preset requirements. If the performance of the local end reaches the preset performance, the local end can be considered sufficiently capable, so the corresponding illumination effect is displayed with real-time illumination in the virtual reality live broadcast room according to the virtual illumination information; if the performance of the local end does not reach the preset performance, the local end is considered less capable, and the illumination effect is displayed directly with an illumination map, thereby reducing the performance cost of illumination baking.
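The performance gate described above can be sketched as follows. The device-info keys and numeric thresholds are illustrative assumptions; a real system would compare against whatever preset performance criteria it defines.

```python
def choose_lighting_path(device_info, min_gpu_mem_mb=2048, min_cpu_cores=4):
    """Return 'realtime' when the local end meets the preset performance bar,
    otherwise 'lightmap'. Keys and thresholds are invented for illustration."""
    meets = (device_info.get("gpu_mem_mb", 0) >= min_gpu_mem_mb
             and device_info.get("cpu_cores", 0) >= min_cpu_cores)
    return "realtime" if meets else "lightmap"
```

The anchor end or a viewer end would call this once at startup and route rendering accordingly.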
In some embodiments, after the corresponding illumination effect is displayed in the virtual reality live broadcast room according to the virtual illumination information in step S205, the method may further include the following steps:
mixing a live broadcast picture containing an illumination effect with an original live broadcast stream to obtain a target live broadcast stream; pushing the target live stream to other ends in the virtual reality live broadcasting room through the server for display.
The scheme of this embodiment keeps the live broadcast picture rendered by illumination baking synchronized in real time with the original live broadcast stream (the original live broadcast audio and video stream), and is applicable to all ends of the same virtual reality live broadcast room. Specifically, after the corresponding illumination effect is displayed in the virtual reality live broadcast room at the anchor end according to the virtual illumination information, the live broadcast picture containing the illumination effect is mixed with the original live broadcast stream at the anchor end; the resulting stream is called the target live broadcast stream. The target live broadcast stream can then be pushed through the server to the other viewer ends in the virtual reality live broadcast room for display, and at this time the viewers see the illumination baking and rendering result synchronized in real time with the live audio and video. In addition, if the illumination baking result does not need to be synchronized with the live audio and video in real time at each end, the server can instead distribute data such as the virtual illumination information, and each end performs illumination baking and rendering when it receives that data.
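The mixing-and-pushing step can be sketched as pairing each illumination-rendered frame with the matching original packet. Everything here is illustrative: `push` stands in for the server-side stream push, and the packet layout is invented.

```python
def mix_and_push(lit_frames, original_packets, push):
    """Merge each illumination-rendered frame with the matching original
    audio/video packet (paired in order here; a real pipeline would match
    by timestamp) and hand each merged unit to `push`."""
    merged_count = 0
    for frame, packet in zip(lit_frames, original_packets):
        target = {"ts": packet["ts"], "video": frame, "audio": packet["audio"]}
        push(target)  # stand-in for pushing the target live stream
        merged_count += 1
    return merged_count
```

Collecting the pushed units into a list shows the merged target stream preserves the original timestamps while carrying the baked frames.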
As a specific example, the virtual reality live room illumination processing method of the present application may specifically include the steps shown in fig. 5:
For the three-dimensional model and related material production: the anchor avatar can be formed from a three-dimensional model designed by a designer according to a certain specification, and designing to that specification keeps the action scripts reusable. When the anchor avatar performs certain actions, each bone joint of the whole avatar moves along with the action; the bone joints determine the fineness of the movement, and in general the more bone joints, the finer the movement. Certain actions may also be accompanied by material resources such as sound and pictures: sound material resources such as background music and expression music, and picture material resources such as costumes and props of the anchor avatar. These material resources need to be prefabricated and recorded to form a material resource library, and the various material resources in the library can each be associated with action scripts, so that obtaining an action script also obtains the corresponding material resources.
For the preparation of prefabricated action scripts: different anchor avatars have some specific actions, and performing a specific action may need to be accompanied by specific sound, picture, and other material resources. These specific actions and material resources can be controlled and bound through code scripts, for example for actions of the anchor avatar such as walking, standing, raising hands, singing, and dancing. The code scripts corresponding to the bone joints of the model and the required material resources therefore need to be written according to the model of the anchor avatar, and the action scripts for actions such as walking, standing, raising hands, singing, and dancing are saved. An identification and description information are set for each action script; the description information may specifically include the action control command corresponding to the action script, so that action control commands and action scripts can be matched to each other.
For the management of the avatar-related scripts: so that the action scripts are convenient to use, reuse, and manage later, a unique identification can be set for each action script and the name of the action script defined (the name can describe the meaning of the action, such as walking, standing, raising hands, singing, or dancing). Labels such as the corresponding anchor avatar, live broadcast scene, avatar attributes, and action control instruction can also be set for each action script, which is then classified and stored to form a script database. When the anchor end obtains an action control instruction from the anchor, it can send a script acquisition request carrying the action control instruction to the server; the action control instruction can be obtained from the anchor's action data collected by the sensors of the anchor end (pedometer, acceleration sensor, and the like). The server obtains the action script corresponding to the action control instruction from the script database as the target action script and returns it to the anchor end, so that the anchor end obtains the target action script. The anchor end loads different target action scripts to trigger corresponding action rendering of the anchor avatar, so that the anchor avatar produces the corresponding action; for example, the anchor avatar can walk according to the issued target action script. The target action script may specify which bones of the anchor avatar move, in which direction, to which coordinates, for how long, and so forth.
Specifically, for the coordination between action scripts and the motion control of the anchor avatar: different anchor avatars may have different shapes and joints, so when controlling the motion of the anchor avatar, the action scripts preconfigured in the script database based on the motion process of the joints need to be acquired, for example the motion process of the legs, soles, hands, head, body, and other joints during walking. When the obtained target action script is the walking action script, it is applied to the anchor avatar to make walking motions. As another example, a singing action script can be prefabricated from the motion of the face, mouth, eyes, and so on of the anchor avatar during singing, with corresponding audio material resources added; the two are then applied cooperatively to the anchor avatar to perform the singing action while playing the corresponding audio material resources. When the anchor end receives the target action script issued by the server, it can first judge whether the identification of the target action script matches the identification of the current anchor avatar. If they match, the target action script can be bound to the anchor avatar and the corresponding material resources such as music and pictures downloaded, so that the anchor avatar performs the corresponding action according to the target action script; if the issued target action script does not match the identification of the current anchor avatar, no processing is performed.
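The identification-matching and binding logic above can be sketched as follows. The script field names (`avatar_id`, `assets`) and the `download` callback are illustrative stand-ins for the patent's script database entries and resource fetching.

```python
def bind_target_script(avatar_id, script, download):
    """Bind the issued target action script only when its avatar tag matches
    the current anchor avatar, then fetch its material resources; otherwise
    perform no processing."""
    if script.get("avatar_id") != avatar_id:
        return False  # mismatched script: ignore it entirely
    for asset in script.get("assets", []):
        download(asset)  # e.g. music or picture material resources
    return True
```

A matching script triggers resource downloads and binding; a mismatched one is silently dropped, as the text specifies.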
When the anchor starts the virtual reality live broadcast, an avatar simulated from the anchor, that is, the anchor avatar, can be placed in a virtual scene, which can be a three-dimensional virtual scene; the anchor can also directly select a pre-designed three-dimensional avatar as the anchor avatar for live broadcast. The anchor end and each viewer end in the virtual reality live broadcast room display the anchor avatar and the virtual scene. The virtual reality live broadcast room needs a virtual light source to display the corresponding illumination effect; the virtual light source can be a point light source, parallel light source, spotlight, area light source, and the like as described above, and the azimuth, illumination intensity, light color, and so on of the virtual light source relative to the anchor avatar can be adjusted.
For recognizing the data fed back by the sensors: so that the viewers in the virtual reality live broadcast room, when seeing the anchor avatar, more truly perceive the real illumination conditions at that moment, the data collected by the sensors at the anchor end can be combined to render the corresponding illumination effect, including but not limited to collecting the real light intensity through a light sensor, the real temperature through a temperature sensor, the real orientation of the anchor end through a direction sensor, and the real geographic position of the anchor through a GPS sensor; the real weather information at the anchor's real geographic position can also be queried through the server.
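The sensor-aggregation step can be sketched as a small record plus a combining function. The field names and the `weather_lookup` callback are illustrative; the latter stands in for the server-side weather query described above.

```python
from dataclasses import dataclass

@dataclass
class AnchorReadings:
    light_intensity: float  # from the light sensor
    temperature_c: float    # from the temperature sensor
    heading_deg: float      # from the direction sensor (real orientation)
    lat: float              # from the GPS sensor
    lon: float

def gather_real_environment(readings, weather_lookup):
    """Combine the anchor-end sensor readings with the server-side weather
    query for the GPS fix into one real-environment description."""
    return {
        "intensity": readings.light_intensity,
        "temperature_c": readings.temperature_c,
        "heading_deg": readings.heading_deg,
        "weather": weather_lookup(readings.lat, readings.lon),
    }
```

The returned description would feed the virtual light source selection and parameterization steps.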
For selecting the light source type: when the anchor turns on the camera to perform virtual live broadcast and is identified through the camera as being outdoors, the parallel light source type can be selected as the corresponding virtual light source type, that is, the corresponding illumination effect is generated in the virtual reality live broadcast room through a parallel light source; specifically, the azimuth and illumination intensity of the parallel light source relative to the anchor avatar can be set according to the real illumination information, and the light color of the parallel light source adjusted according to the real weather information and real temperature. When the anchor turns on the camera to perform virtual live broadcast and is identified as being indoors, the point light source type can be selected as the corresponding virtual light source type, that is, the corresponding illumination effect is generated through a point light source; the azimuth and illumination intensity of the point light source relative to the anchor avatar can be set according to the real illumination information, and the light color of the point light source adjusted according to the real weather information and real temperature. When the anchor has not turned on the camera for virtual live broadcast, whether the anchor is indoors or outdoors cannot be identified through the camera; at this time the parallel light source type can be uniformly selected as the corresponding virtual light source type, that is, the corresponding illumination effect is uniformly generated through a parallel light source, with its azimuth and illumination intensity relative to the anchor avatar set according to the real illumination information and its light color adjusted according to the real weather information and real temperature. Alternatively, the corresponding virtual light source type can be selected according to the scene range feature of the virtual reality live broadcast room, with the azimuth and illumination intensity of the selected light source relative to the anchor avatar set according to the real illumination information and its light color adjusted according to the real weather information and real temperature.
For judging the device performance: real-time baking of illumination consumes considerable performance, so whether to bake with real-time illumination or with an illumination map needs to be decided according to the performance of the device. Whether the performance of the local end reaches the preset performance is judged according to the device information of the local end; if it does, the corresponding illumination effect is displayed with real-time illumination in the virtual reality live broadcast room according to the virtual illumination information, and if it does not, indicating that the performance of the local end is relatively poor, the corresponding illumination effect is displayed directly with an illumination map so as to reduce the performance cost of illumination baking.
For synchronizing the rendered result with the live broadcast stream: after the anchor end displays the corresponding illumination effect in the virtual reality live broadcast room according to the virtual illumination information, the anchor end mixes the live broadcast picture containing the illumination effect with the original live broadcast stream to obtain the target live broadcast stream, which is pushed through the server to the other viewer ends in the virtual reality live broadcast room for display; at this time the viewers see the illumination baking and rendering result synchronized in real time with the live audio and video, which suits scenarios requiring real-time synchronization between the illumination-baked live broadcast picture and the original live broadcast stream. Alternatively, the server distributes data such as the virtual illumination information, and each end performs illumination baking and rendering when it receives that data, which suits scenarios where the illumination baking result does not need to be synchronized with the live audio and video picture in real time.
According to the illumination processing method for the virtual reality live broadcast room described above, the anchor avatar can be placed in different virtual scenes during virtual reality live broadcast, and differentiated presentations are generated in the virtual reality live broadcast room according to the different real illumination environments of the anchor end, so that the illumination effect in the virtual reality live broadcast room is more realistic and the live viewing experience is improved.
It should be understood that, although the steps in the flowcharts of the embodiments described above are shown sequentially as indicated by the arrows, these steps are not necessarily performed in that order. Unless explicitly stated herein, the order of execution of these steps is not strictly limited, and they may be performed in other orders. Moreover, at least some of the steps in the flowcharts of the above embodiments may include multiple sub-steps or stages, which are not necessarily performed at the same moment but may be performed at different moments, and whose order is not necessarily sequential; they may be performed in turn or alternately with at least some of the other steps or sub-steps.
Based on the same inventive concept, an embodiment of the present application further provides a virtual reality live broadcast room illumination processing apparatus for implementing the virtual reality live broadcast room illumination processing method described above. The implementation scheme provided by the apparatus is similar to that described in the above method, so for the specific limitations in the embodiments of the virtual reality live broadcast room illumination processing apparatus provided below, reference may be made to the limitations of the virtual reality live broadcast room illumination processing method above, which are not repeated here.
In one embodiment, as shown in fig. 6, a virtual reality live broadcast room illumination processing apparatus is provided, and the apparatus 600 may include:
the image display module 601 is used for displaying the anchor virtual image in the virtual reality live broadcasting room;
a first information obtaining module 602, configured to obtain, through a sensor of the anchor terminal, a real geographic position and a real orientation of the anchor terminal;
a second information obtaining module 603, configured to obtain, according to the real geographic position and the real orientation, real illumination information corresponding to the anchor terminal;
a third information obtaining module 604, configured to obtain corresponding virtual illumination information based on the real illumination information and a virtual position and a virtual orientation of the anchor avatar in the virtual reality live broadcast room;
and the illumination display module 605, configured to display a corresponding illumination effect in the virtual reality live broadcast room according to the virtual illumination information.
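One concern of the third information obtaining module, mapping the real orientation into the avatar's frame so the virtual light falls on the avatar from the same relative side, can be sketched as below. The plain subtraction is an illustrative assumption about the mapping, not the patent's formula.

```python
def virtual_light_azimuth(real_light_bearing_deg, avatar_yaw_deg):
    """Rotate the real-world light bearing into the avatar's local frame so
    the virtual light strikes the avatar from the same relative direction."""
    return (real_light_bearing_deg - avatar_yaw_deg) % 360.0
```

For example, a light bearing of 90 degrees with the avatar facing 30 degrees yields a relative azimuth of 60 degrees.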
In one embodiment, the apparatus 600 may further include: a fourth information acquisition module, configured to acquire, through a sensor of the anchor terminal, a real light intensity and a real temperature of the anchor terminal, and to acquire real weather information of the anchor terminal according to the real geographic position; the second information obtaining module 603 is configured to obtain real illumination information corresponding to the anchor terminal according to the real geographic position, the real orientation, and the real light intensity; the third information obtaining module 604 is configured to obtain corresponding virtual illumination information according to the real illumination information, the real weather information, and the real temperature.
In one embodiment, the apparatus 600 may further include: a light source type determining module, configured to acquire the camera on state of the anchor terminal and determine a corresponding virtual light source type according to the camera on state; the illumination display module 605 is configured to display a corresponding illumination effect in the virtual reality live broadcast room according to the virtual illumination information and the virtual light source corresponding to the virtual light source type.
In one embodiment, the light source type determining module is configured to identify, by the camera, that the anchor is indoor or outdoor if the camera is turned on; and selecting a corresponding virtual light source type according to whether the anchor is indoor or outdoor.
In one embodiment, the light source type determining module is configured to select a corresponding virtual light source type as a point light source type if the anchor is in a room; and if the anchor is outdoors, selecting the corresponding virtual light source type as the parallel light source type.
In one embodiment, the light source type determining module is configured to determine that the corresponding virtual light source type is a preset type if the camera is not turned on, or select the corresponding virtual light source type according to the scene range feature of the virtual reality live broadcasting room.
In one embodiment, the illumination display module 605 is configured to obtain device information of the local end; judge whether the performance of the local end reaches the preset performance according to the device information of the local end; if the performance of the local end reaches the preset performance, display a corresponding illumination effect with real-time illumination in the virtual reality live broadcast room according to the virtual illumination information; and if the performance of the local end does not reach the preset performance, display the corresponding illumination effect in the virtual reality live broadcast room with an illumination map according to the virtual illumination information.
In one embodiment, the illumination display module 605 may be further configured to mix a live broadcast picture containing the illumination effect with an original live broadcast stream to obtain a target live broadcast stream, and push the target live broadcast stream through the server to the other ends in the virtual reality live broadcast room for display.
The modules in the virtual reality live room lighting processing device can be implemented in whole or in part by software, hardware and a combination thereof. The above modules may be embedded in hardware or independent of a processor in the electronic device, or may be stored in software in a memory in the electronic device, so that the processor may call and execute operations corresponding to the above modules.
In one embodiment, an electronic device is provided, which may be a terminal, and an internal structure diagram thereof may be as shown in fig. 7. The electronic device includes a processor, a memory, a communication interface, a display screen, and an input device connected by a system bus. Wherein the processor of the electronic device is configured to provide computing and control capabilities. The memory of the electronic device includes a nonvolatile storage medium and an internal memory. The non-volatile storage medium stores an operating system and a computer program. The internal memory provides an environment for the operation of the operating system and computer programs in the non-volatile storage media. The communication interface of the electronic device is used for conducting wired or wireless communication with an external device, and the wireless communication can be realized through WIFI, a mobile cellular network, NFC (near field communication) or other technologies. The computer program, when executed by a processor, implements a virtual reality live room illumination processing method. The display screen of the electronic equipment can be a liquid crystal display screen or an electronic ink display screen, and the input device of the electronic equipment can be a touch layer covered on the display screen, can also be keys, a track ball or a touch pad arranged on the shell of the electronic equipment, and can also be an external keyboard, a touch pad or a mouse and the like.
It will be appreciated by those skilled in the art that the structure shown in fig. 7 is merely a block diagram of a portion of the structure associated with the present application and is not limiting of the electronic device to which the present application is applied, and that a particular electronic device may include more or fewer components than shown, or may combine certain components, or have a different arrangement of components.
In an embodiment, there is also provided an electronic device including a memory and a processor, the memory storing a computer program, the processor implementing the steps of the method embodiments described above when executing the computer program.
In one embodiment, a computer-readable storage medium is provided, on which a computer program is stored which, when executed by a processor, carries out the steps of the method embodiments described above.
Those skilled in the art will appreciate that implementing all or part of the above described methods may be accomplished by way of a computer program stored on a non-transitory computer readable storage medium, which when executed, may comprise the steps of the embodiments of the methods described above. Any reference to memory, database, or other medium used in the various embodiments provided herein may include at least one of non-volatile and volatile memory. The nonvolatile Memory may include Read-Only Memory (ROM), magnetic tape, floppy disk, flash Memory, optical Memory, high density embedded nonvolatile Memory, resistive random access Memory (ReRAM), magnetic random access Memory (Magnetoresistive Random Access Memory, MRAM), ferroelectric Memory (Ferroelectric Random Access Memory, FRAM), phase change Memory (Phase Change Memory, PCM), graphene Memory, and the like. Volatile memory can include random access memory (Random Access Memory, RAM) or external cache memory, and the like. By way of illustration, and not limitation, RAM can be in the form of a variety of forms, such as static random access memory (Static Random Access Memory, SRAM) or dynamic random access memory (Dynamic Random Access Memory, DRAM), and the like. The databases referred to in the various embodiments provided herein may include at least one of relational databases and non-relational databases. The non-relational database may include, but is not limited to, a blockchain-based distributed database, and the like. The processors referred to in the embodiments provided herein may be general purpose processors, central processing units, graphics processors, digital signal processors, programmable logic units, quantum computing-based data processing logic units, etc., without being limited thereto.
It should be noted that the user information (including but not limited to user equipment information and user personal information) and data (including but not limited to data for analysis, stored data, and presented data) referred to in the present application are information and data authorized by the user or fully authorized by the relevant parties.
The technical features of the above embodiments may be combined arbitrarily. For brevity of description, not all possible combinations of these technical features are described; however, any combination of these technical features that involves no contradiction should be considered within the scope of this specification.
The above examples represent only a few embodiments of the present application; although they are described in some detail, they are not to be construed as limiting the scope of the present application. It should be noted that those skilled in the art may make various modifications and improvements without departing from the spirit of the present application, and such modifications and improvements fall within the scope of the present application. Accordingly, the scope of protection of the present application shall be subject to the appended claims.

Claims (11)

Application CN202310172343.5A (filed 2023-02-24): Virtual reality live broadcasting room illumination processing method, device, equipment and storage medium. Status: Pending. Publication: CN116152423A (en).

Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
CN202310172343.5A | 2023-02-24 | 2023-02-24 | Virtual reality live broadcasting room illumination processing method, device, equipment and storage medium


Publications (1)

Publication Number | Publication Date
CN116152423A (en) | 2023-05-23

Family ID: 86373385

Family Applications (1)

Application Number | Status | Publication | Priority Date | Filing Date | Title
CN202310172343.5A | Pending | CN116152423A (en) | 2023-02-24 | 2023-02-24 | Virtual reality live broadcasting room illumination processing method, device, equipment and storage medium

Country Status (1)

Country | Link
CN (1) | CN116152423A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication Number | Priority Date | Publication Date | Assignee | Title
CN119225534A (en)* | 2024-09-23 | 2024-12-31 | 深圳市帝景光电科技有限公司 | Multi-light source lighting display method, device and electronic equipment



Legal Events

Date | Code | Title | Description
| PB01 | Publication |
| SE01 | Entry into force of request for substantive examination |
