Detailed Description
The present application will be described in further detail below with reference to the drawings and embodiments, so that the objects, technical solutions, and advantages of the present application become more apparent. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the application. All other embodiments obtained by those skilled in the art based on the embodiments of the present application without inventive effort fall within the scope of protection of the present application.
It should be noted that the descriptions of "first," "second," etc. in the embodiments of the present application are for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature qualified by "first" or "second" may explicitly or implicitly include at least one such feature. In addition, the technical solutions of the embodiments may be combined with each other, provided that the combined solution can be implemented by those skilled in the art; when technical solutions are contradictory or cannot be implemented, their combination should be regarded as non-existent and outside the scope of protection claimed in the present application.
In the description of the present application, it should be understood that the numerical references before the steps do not indicate the order in which the steps are performed; they merely facilitate the description of the present application and distinguish the steps from one another, and thus should not be construed as limiting the present application.
First, terms related to the present application are explained:
Unreal Engine: a comprehensive game engine developed by Epic Games for creating interactive 3D and 2D content. The Unreal Engine is used in fields such as game development, Virtual Reality (VR), Augmented Reality (AR), film production, and architectural visualization.
Dyed area: a specific portion or region of a 3D model or object that can be used to control certain appearance or behavioral characteristics of the object or character.
Material: a virtual material used to control and render the appearance of a target model. A material defines visual and physical properties of an object such as color, texture, gloss, reflection, and transparency.
Next, to help those skilled in the art understand the technical solutions provided by the embodiments of the present application, the related technologies are described as follows:
The Unreal Engine may be used to implement real-time interactive experiences, for example in game development and Virtual Reality (VR). Appearance attributes and effects of a 3D model, such as its color, may be set through the Unreal Engine so that the 3D model switches from one color to another. However, the current color-switching effect of 3D models is poor, which degrades the user experience.
Therefore, the embodiments of the present application provide an Unreal Engine-based technical solution for dynamic model color generation. In this solution, (1) for the same 3D model, a dynamic effect in which the model changes through a plurality of different colors over different time periods can be achieved by setting different dynamic color effect parameters; (2) different color transition algorithms can be substituted so that different dynamic color-change effects are achieved for the same time period and color sequence; and (3) in the engine or while the game is running, the content of each functional module can be flexibly adjusted according to actual requirements, so that a smooth dynamic change of the 3D model's color is achieved. Details are given below.
Example 1
FIG. 1 schematically illustrates a flow chart of an Unreal Engine-based method for dynamic model color generation according to one embodiment of the application.
As shown in FIG. 1, the Unreal Engine-based method for dynamic model color generation may include steps S100 to S108, where:
Step S100, material information of the target model is obtained.
Step S102, dyed area information of each material on the target model is obtained according to the material information, where, at any given time, the same dyed area corresponds to the same color on each material.
Step S104, dynamic color parameters for the dyed area are obtained.
Step S106, a frame color value corresponding to each frame of the dyed area during the dynamic change is obtained according to the dynamic color parameters.
Step S108, the dyed area is dynamically rendered according to the frame color value corresponding to each frame, so as to generate a target model whose color changes dynamically.
According to the Unreal Engine-based method for dynamic model color generation provided by this embodiment, the dyed area of the target model and the dynamic color parameters of the dyed area are obtained. A frame color value corresponding to each frame of the dyed area during the dynamic change is determined according to the dynamic color parameters. The dyed area is then dynamically rendered frame by frame according to the frame color value corresponding to each frame, so that the color of the target model changes continuously and dynamically with the frame number, thereby improving the dynamic color-switching effect of the target model and the user experience.
Each of steps S100 to S108, together with optional further steps, is described in detail below with reference to FIG. 1.
Step S100, material information of the target model is obtained.
The target model may be a 3D model generated by the Unreal Engine. A 3D model is a virtual object in three-dimensional space, such as a character, scene, object, or building in a game world. In an exemplary application, the target model may be a clothing object such as a coat or a skirt, or may be another object.
The material information may be configuration information used to describe and render the material properties of the target model. The appearance of the target model, such as visual characteristics including color, texture, reflection, illumination, and transparency, can be determined based on the material information. Illustratively, the material information may include color partition information on a map, color partition mark information on a material, and the like.
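As a concrete illustration of step S100, the following is a minimal Unreal Engine C++ sketch (not the claimed implementation) that collects the material slots of a target mesh component; the function name and the use of UMeshComponent are assumptions made for this example.

#include "Components/MeshComponent.h"
#include "Materials/MaterialInterface.h"

// Collect the materials (one per slot) of the target model's mesh component.
TArray<UMaterialInterface*> GatherTargetMaterials(UMeshComponent* TargetMesh)
{
    TArray<UMaterialInterface*> Materials;
    if (TargetMesh == nullptr)
    {
        return Materials;
    }
    const int32 NumSlots = TargetMesh->GetNumMaterials();
    Materials.Reserve(NumSlots);
    for (int32 SlotIndex = 0; SlotIndex < NumSlots; ++SlotIndex)
    {
        // Each slot holds one material of the target model (e.g., material A, B, C).
        Materials.Add(TargetMesh->GetMaterial(SlotIndex));
    }
    return Materials;
}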
Step S102, dyed area information of each material on the target model is obtained according to the material information, where, at any given time, the same dyed area corresponds to the same color on each material.
A target model may include one or more dyed areas. In an exemplary application, as shown in FIG. 2, the target model contains two dyed areas: dyed area 1 (orange) and dyed area 2 (blue).
Each dyed area may include one or more color partitions, and the one or more color partitions have the same color. It should be noted that modifying the color of a dyed area changes the color of the one or more color partitions corresponding to that dyed area at the same time, and the color of these color partitions changes identically.
Step S104, dynamic color parameters for the dyed area are obtained.
Each dyed area corresponds to a group of dynamic color parameters. When the target model is designed, the corresponding dynamic color parameters can be modified in real time in the engine or in an in-game interface, so as to adjust in real time the specific color values and the duration of the dynamic change of the dyed area and to preview the dyed area's dynamic color-change effect.
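A minimal sketch of how one group of dynamic color parameters for a dyed area might be exposed for real-time editing in the engine; the struct and field names (FColorKey, FDyedAreaColorParams, DyedAreaId, Duration, ColorKeys) are illustrative assumptions, not names prescribed by the application.

#include "CoreMinimal.h"
#include "DyedAreaColorParams.generated.h" // assumed header name for this sketch

// One point of the color sequence: the color the dyed area should have at a given time.
USTRUCT(BlueprintType)
struct FColorKey
{
    GENERATED_BODY()

    // Time in seconds within the dynamic-change period.
    UPROPERTY(EditAnywhere, BlueprintReadWrite)
    float Time = 0.f;

    UPROPERTY(EditAnywhere, BlueprintReadWrite)
    FLinearColor Color = FLinearColor::White;
};

// One group of dynamic color parameters, editable in the editor or an in-game interface.
USTRUCT(BlueprintType)
struct FDyedAreaColorParams
{
    GENERATED_BODY()

    // Identifier shared by every color partition belonging to this dyed area.
    UPROPERTY(EditAnywhere, BlueprintReadWrite)
    FName DyedAreaId;

    // Total duration of the dynamic color change, in seconds.
    UPROPERTY(EditAnywhere, BlueprintReadWrite)
    float Duration = 2.f;

    // Color sequence over the period (the "first correspondence"); consecutive keys
    // can also express the sub-periods described later.
    UPROPERTY(EditAnywhere, BlueprintReadWrite)
    TArray<FColorKey> ColorKeys;
};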
In an alternative embodiment, the target model includes one or more materials, and each material includes one or more color partitions. Each color partition is provided with an identifier, and color partitions belonging to the same dyed area have the same identifier.
Correspondingly, step S104 may further include:
obtaining, from the one or more materials, the target color partition corresponding to the dyed area according to the identifiers of the color partitions.
The dynamic color parameters of the dyed area are used to instruct the target color partition to change dynamically.
A material is a virtual material used to control and render the appearance of the target model. Visual properties of the target model, such as color, illumination, and texture, can be defined by the material, and a material may use a map to divide itself into a plurality of different color partitions.
The identifier indicates which dyed area each color partition on a material belongs to.
In the above alternative embodiment, the target color partition corresponding to the dyed area can be quickly obtained from the one or more materials of the target model according to the identifiers of the color partitions, which improves the efficiency of obtaining the target color partition corresponding to the dyed area and, in turn, the efficiency of dynamically changing the color of the target color partition.
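The identifier-based lookup can be sketched as follows, under the assumption that each color partition record stores the dyed-area identifier, the index of the material it lives in, and the material parameter that drives its color; FColorPartition and its fields are illustrative names only, and the partitions collected here together form the target color partition of the dyed area.

#include "CoreMinimal.h"

// Assumed record for one color partition on one material of the target model.
struct FColorPartition
{
    FName DyedAreaId;    // identifier of the dyed area this partition belongs to
    int32 MaterialIndex; // material (slot) on the target model containing this partition
    FName ParamName;     // vector parameter in that material driving the partition's color
};

// Collect the color partitions of a dyed area by matching identifiers.
TArray<FColorPartition> FindTargetColorPartitions(const TArray<FColorPartition>& AllPartitions,
                                                  FName DyedAreaId)
{
    TArray<FColorPartition> TargetPartitions;
    for (const FColorPartition& Partition : AllPartitions)
    {
        if (Partition.DyedAreaId == DyedAreaId)
        {
            TargetPartitions.Add(Partition);
        }
    }
    return TargetPartitions;
}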
In an alternative embodiment, each material is associated with a dynamic sub-material, and the target color partition includes a plurality of sub-partitions. Correspondingly, as shown in FIG. 3, step S104 further includes:
Step S300, determining a plurality of target dynamic sub-materials corresponding one-to-one to the plurality of sub-partitions.
Step S302, allocating corresponding dynamic color parameters to each target dynamic sub-material. The corresponding dynamic color parameters are one group of parameters among the dynamic color parameters of the dyed area and are used to instruct the corresponding sub-partition to change dynamically.
A dynamic sub-material is an instance of a material that is created and modified dynamically at runtime. Dynamic sub-materials allow properties of the material, such as color, texture, illumination, and transparency, to be modified in real time while the game or application is running, thereby achieving dynamic visual effects and interactivity for the target model.
The target color partition may include a plurality of sub-partitions that have identical identifiers and are therefore marked as belonging to the same dyed area. In an exemplary application, as shown in FIG. 4, dyed area 1 comprises sub-partition A2, sub-partition B1, and sub-partition C2. It should be noted that modifying the color of the dyed area changes the colors of all sub-partitions corresponding to the dyed area at the same time.
In an exemplary application, as shown in FIG. 4, the target model may include three materials, namely material A, material B, and material C. Correspondingly, the target model may include three dynamic sub-materials: material A is associated with dynamic sub-material A, material B with dynamic sub-material B, and material C with dynamic sub-material C. Furthermore, the target model comprises two dyed areas, namely dyed area 1 and dyed area 2. Dyed area 1 comprises sub-partition A2, sub-partition B1, and sub-partition C2; dyed area 2 comprises sub-partition A1, sub-partition A3, sub-partition B2, sub-partition C1, and sub-partition C3. According to the association between materials and dynamic sub-materials, the target dynamic sub-material corresponding to each sub-partition can be determined. Specifically, for dyed area 1, the target dynamic sub-material corresponding to sub-partition A2 is dynamic sub-material A, that corresponding to sub-partition B1 is dynamic sub-material B, and that corresponding to sub-partition C2 is dynamic sub-material C. Correspondingly, for dyed area 2, the target dynamic sub-material corresponding to sub-partition A1 is dynamic sub-material A, that corresponding to sub-partition A3 is dynamic sub-material A, that corresponding to sub-partition B2 is dynamic sub-material B, that corresponding to sub-partition C1 is dynamic sub-material C, and that corresponding to sub-partition C3 is dynamic sub-material C.
In the above alternative embodiment, the target dynamic sub-materials corresponding to the sub-partitions are determined, and corresponding dynamic color parameters are then allocated to the target dynamic sub-materials, so that the colors of the corresponding sub-partitions can be changed dynamically according to the dynamic color parameters allocated to the target dynamic sub-materials, improving the color-switching effect of the target model.
In an alternative embodiment, the method further comprises:
in the case where the target material corresponding to a sub-partition is not provided with a dynamic sub-material, creating a target dynamic sub-material corresponding to the target material.
In an exemplary application, it is determined whether a dynamic sub-material is being set for the target material corresponding to the sub-partition for the first time in the target model. If no dynamic color effect has been set for the target material, that is, no dynamic sub-material has been set, a target dynamic sub-material corresponding to the target material is created and stored in the target material.
In the above alternative embodiment, when the target material corresponding to the sub-partition is not provided with a dynamic sub-material, a target dynamic sub-material corresponding to the target material is created, so that the color of the target model can be changed dynamically in real time according to the created target dynamic sub-material, thereby improving the color-switching effect of the target model.
In an alternative embodiment, the method may further include caching the target dynamic sub-material in a preset data structure.
The data structure may take the form of an array, a struct, or the like. It should be noted that the data structure may be arranged according to the color partitions in the corresponding materials, so as to improve the efficiency of subsequently looking up the corresponding material by color. In the above alternative embodiment, caching the target dynamic sub-material in a preset data structure improves the efficiency with which the target model accesses the target dynamic sub-material during dynamic color changes.
In an alternative embodiment, the method may further comprise:
in the case where the target material corresponding to the sub-partition is already provided with a historical dynamic sub-material, setting the historical dynamic sub-material as the target dynamic sub-material.
In an exemplary application, it is determined whether a dynamic sub-material is being set for the target material corresponding to the sub-partition for the first time in the target model. If the target material has already created a historical dynamic sub-material, that is, the target material is already provided with a historical dynamic sub-material, the historical dynamic sub-material is set as the target dynamic sub-material.
In the above alternative embodiment, setting the historical dynamic sub-material as the target dynamic sub-material allows the dynamic sub-material to be used directly when the color of the sub-partition is modified later, without creating a new dynamic sub-material each time the color is modified, thereby reducing runtime stalls caused by repeatedly creating dynamic sub-materials.
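The three optional steps above (create if absent, cache, reuse) can be combined into a single get-or-create helper. UMaterialInstanceDynamic and CreateDynamicMaterialInstance are existing Unreal Engine APIs; the cache keyed by material slot index is an assumed layout for this sketch (in a full implementation the cache would typically be a UPROPERTY so the instances are not garbage collected).

#include "Components/MeshComponent.h"
#include "Materials/MaterialInstanceDynamic.h"

// Return the dynamic sub-material for a material slot, reusing a historical one if it
// exists and otherwise creating and caching a new one.
UMaterialInstanceDynamic* GetOrCreateDynamicSubMaterial(
    UMeshComponent* TargetMesh,
    int32 MaterialIndex,
    TMap<int32, UMaterialInstanceDynamic*>& Cache)
{
    // A historical dynamic sub-material already exists for this slot: reuse it so a new
    // instance is not created on every color change.
    if (UMaterialInstanceDynamic** Existing = Cache.Find(MaterialIndex))
    {
        return *Existing;
    }

    // No dynamic sub-material yet: create one from the slot's material and assign it
    // back to the slot, then cache it in the preset data structure.
    UMaterialInstanceDynamic* NewSubMaterial =
        TargetMesh->CreateDynamicMaterialInstance(MaterialIndex);
    Cache.Add(MaterialIndex, NewSubMaterial);
    return NewSubMaterial;
}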
Step S106, a frame color value corresponding to each frame of the dyed area during the dynamic change is obtained according to the dynamic color parameters.
The frame color value is the color attribute value that the dyed area presents in each frame (a single image frame of a game or animation) during the dynamic change. The frame color value may be a color vector and may be represented in various ways, such as RGB (red, green, blue), HSV (hue, saturation, value), or CMYK (cyan, magenta, yellow, and key/black). For example, when the frame color value is expressed in RGB, it may be (R=255, G=0, B=0).
In an alternative embodiment, the dynamic color parameters of the dyed area comprise a first correspondence between a color sequence of the dyed area and a time period. Correspondingly, as shown in FIG. 5, step S106 includes:
Step S500, setting a preset display frame number.
Step S502, setting a second correspondence between the preset display frame number and the time period.
Step S504, acquiring a frame color value corresponding to each frame of the dyed area according to the first correspondence and the second correspondence.
The dynamic color-change parameters of the dyed area may also include the color sequence combinations corresponding to sub-periods within the time period of the dynamic color change. Illustratively, the dynamic color change of the dyed area spans a time period of 0-8 seconds, during which the color changes from red to yellow. The time period may include a plurality of sub-periods corresponding to different color changes; specifically, the color of the dyed area changes from red to orange in the 0-2 second sub-period, from orange to red in the 2-5 second sub-period, from red to orange in the 5-6 second sub-period, and from orange to yellow in the 6-8 second sub-period. It should be noted that the colors and times of the sub-periods may be selected according to the actual situation and are not limited here.
The color sequence describes the order in which the colors of the dyed area change over time. The time period is the period over which the color of the dyed area changes. It should be noted that the color sequence and the time period in the dynamic color-change parameters may be designed according to actual requirements, so as to achieve a dynamic-change effect of the target model over different time periods and through different colors.
In an exemplary application, as shown in FIG. 4, assume a time period of 2 seconds for dyed area 1, frame color values represented in RGB (red, green, blue), and a color sequence of the dyed area that changes dynamically from orange to blue, i.e., the RGB color value of dyed area 1 changes from orange (237,125,49) to blue (91,155,213). Accordingly, part of the first correspondence may be: at 0 seconds the RGB color value of dyed area 1 is orange (237,125,49), at 1 second it is (200,130,100), at 1.5 seconds it is (150,140,160), and at 2 seconds it is blue (91,155,213). Note that the first correspondence may be set according to the actual situation and is not limited here.
The preset display frame number is the display frame rate required in the engine or game; for example, the preset display frame number may be 30 frames per second or 60 frames per second.
In an exemplary application, the preset display frame number may be 30 frames per second and the time period may be 2 seconds. The time period is associated with the preset display frame number to obtain the second correspondence. Part of the second correspondence may be: the dyed area is at frame 0 at second 0, at frame 30 at second 1, at frame 45 at second 1.5, and at frame 60 at second 2.
In an exemplary application, the frame color value corresponding to each frame within the time period of the dyed area's dynamic color change can be obtained according to a color transition algorithm. The color transition algorithm can be selected according to the actual situation, such as the color representation used for the dyed area and whether the correspondence between the color sequence and the frame number is linear or nonlinear. The color transition algorithm may be: (1) linear interpolation (Lerp), which creates transition colors by interpolating linearly between two colors; for example, in the RGB color space, linear interpolation from red to green may be performed separately on each color channel, and linear interpolation between red (255,0,0) and green (0,255,0) passes through various intermediate shades such as yellows. (2) Bezier curves, which create complex color transition effects by defining colors at control points. (3) HSL/HSV transitions, i.e., color transition algorithms based on the HSL (hue, saturation, lightness) or HSV (hue, saturation, value) color models, which allow transitions along the hue wheel so that the target model achieves a more natural color transition effect. It should be noted that, when obtaining the frame color value corresponding to each frame of the dyed area, different color transition algorithms can be selected according to the actual situation, so that different color-change effects can be achieved for the dyed area over the same time period and the same color sequence, thereby improving the dynamic color-switching effect of the target model and the user experience.
In the above alternative embodiment, the frame color value corresponding to each frame of the dyed area is obtained according to the first correspondence and the second correspondence, so that the dyed area can be rendered frame by frame according to the frame color value corresponding to each frame, improving the dynamic color-change effect of the dyed area.
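Putting steps S500 to S504 together, a frame color value can be computed by mapping the frame index to a time through the preset display frame number (the second correspondence), locating the bounding color keys of that time (the first correspondence), and interpolating with a chosen transition algorithm. FMath::Lerp and FLinearColor::LerpUsingHSV are existing Unreal Engine functions; FDyedAreaColorParams and FColorKey are the illustrative structs sketched earlier.

#include "CoreMinimal.h"

FLinearColor ComputeFrameColor(const FDyedAreaColorParams& Params,
                               int32 FrameIndex,
                               float FramesPerSecond,  // preset display frame number
                               bool bUseHsvTransition) // choice of transition algorithm
{
    if (Params.ColorKeys.Num() == 0)
    {
        return FLinearColor::White;
    }

    // Second correspondence: frame index -> time in seconds within the period.
    const float Time = FMath::Fmod(FrameIndex / FramesPerSecond, Params.Duration);

    // First correspondence: find the pair of color keys bracketing this time.
    for (int32 i = 0; i + 1 < Params.ColorKeys.Num(); ++i)
    {
        const FColorKey& From = Params.ColorKeys[i];
        const FColorKey& To = Params.ColorKeys[i + 1];
        if (Time >= From.Time && Time <= To.Time)
        {
            const float Alpha = (Time - From.Time) / FMath::Max(To.Time - From.Time, KINDA_SMALL_NUMBER);
            // Different transition algorithms yield different effects for the same color
            // sequence and period: plain linear interpolation in RGB, or interpolation
            // along the hue wheel (HSV) for a more natural transition.
            return bUseHsvTransition
                ? FLinearColor::LerpUsingHSV(From.Color, To.Color, Alpha)
                : FMath::Lerp(From.Color, To.Color, Alpha);
        }
    }
    return Params.ColorKeys.Last().Color;
}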
Step S108, the dyed area is dynamically rendered according to the frame color value corresponding to each frame, so as to generate a target model whose color changes dynamically.
In an exemplary application, as shown in FIG. 6, the frame color value of each frame of dyed area 1 varies with the frame number. Dyed area 1 comprises sub-partition A2, sub-partition B1, and sub-partition C2. The RGB color value of dyed area 1 is orange (237,125,49) at frame 0, blue (91,155,213) at frame 30, and orange (237,125,49) again at frame 60. When the dyed area is dynamically rendered according to the frame color value corresponding to each frame as shown in FIG. 6, dyed area 1 changes from orange to blue during 0-1 s, then from blue back to the initial color (orange) during 1-2 s, and this dynamic color-change process can be repeated cyclically. Correspondingly, the color changes of the sub-partitions corresponding to dyed area 1, namely sub-partition A2, sub-partition B1, and sub-partition C2, are the same as that of dyed area 1.
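A minimal sketch of the frame-by-frame rendering: each frame, the computed frame color value is written to the cached dynamic sub-materials of all sub-partitions of the dyed area. SetVectorParameterValue is an existing UMaterialInstanceDynamic API; FColorPartition, the cache, and the parameter naming follow the illustrative sketches above.

#include "Materials/MaterialInstanceDynamic.h"

// Apply one frame color value to every sub-partition of a dyed area.
void ApplyFrameColor(const TArray<FColorPartition>& TargetPartitions,
                     const TMap<int32, UMaterialInstanceDynamic*>& SubMaterialCache,
                     const FLinearColor& FrameColor)
{
    for (const FColorPartition& Partition : TargetPartitions)
    {
        if (UMaterialInstanceDynamic* const* SubMaterial = SubMaterialCache.Find(Partition.MaterialIndex))
        {
            // Every sub-partition of the same dyed area receives the same color value,
            // so they all change together in the same frame.
            (*SubMaterial)->SetVectorParameterValue(Partition.ParamName, FrameColor);
        }
    }
}

Calling such a function from the component's Tick, or from a timer bound to the preset display frame number, yields the continuous frame-by-frame change described above.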
In some embodiments, multiple dynamic color-change schemes can be applied to the target model. When previewing the effects of different color schemes, the colors of the target model do not need to be switched and modified repeatedly; the corresponding dynamic color-change scheme can be pulled directly from a server, and the dynamic color-change operation is then performed on the target model automatically.
In an alternative embodiment, as shown in FIG. 7, the method further comprises:
Step S700, acquiring the real-time display frame rate and the number of rendered dynamic color objects in the target model displayed on the same screen;
Step S702, adjusting the number of rendered dynamic color objects in the target model according to the real-time display frame rate, so that the display frame number remains at a preset value.
The more dynamic color objects are rendered in the target model displayed on screen, the lower the frame rate at which the game runs. For example, when the number of rendered dynamic color objects is 1, the game may run at 60 frames per second, and when the number is 5, the frame rate may drop to 30 frames per second. To keep the frame number stable, the number of dynamic color objects rendered in the target model displayed on screen may be adjusted. For example, changes in the frame rate at which the game runs may be monitored in real time, and as the frame rate decreases, certain effects may be turned off, i.e., the number of rendered dynamic color objects in the target model is reduced, thereby keeping the frame number stable.
In the above alternative embodiment, the number of dynamic color objects rendered in the target model is adjusted according to the real-time display frame rate so that the display frame number stays at a preset value. Keeping the display frame number stable allows the frame color value corresponding to each frame of the dyed area to be obtained on the basis of a stable display frame number, which in turn ensures that the dyed area is subsequently rendered dynamically frame by frame according to the frame color value of each frame, achieving a continuous dynamic change of the target model's color with the frame number.
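A sketch of this frame-rate safeguard, assuming the dynamic color objects currently being updated are tracked in an array: the real-time frame rate is derived from the world's delta time (UWorld::GetDeltaSeconds is an existing API), and when it falls noticeably below the preset display frame number, one object is dropped from the update list; the threshold and bookkeeping are illustrative choices.

#include "Engine/World.h"
#include "GameFramework/Actor.h"

// Reduce the number of rendered dynamic color objects when the real-time frame rate
// drops below the preset display frame number.
void AdjustRenderedDynamicColorObjects(UWorld* World,
                                       float PresetFrameRate,
                                       TArray<AActor*>& RenderedDynamicColorObjects)
{
    if (World == nullptr || RenderedDynamicColorObjects.Num() <= 1)
    {
        return; // keep at least one dynamic color object
    }

    const float DeltaSeconds = World->GetDeltaSeconds();
    const float RealTimeFrameRate = (DeltaSeconds > 0.f) ? (1.f / DeltaSeconds) : PresetFrameRate;

    // When the measured frame rate is noticeably below the preset value, stop updating
    // one object's dynamic color effect; its material simply keeps its last color,
    // reducing per-frame work and stabilizing the display frame number.
    if (RealTimeFrameRate < PresetFrameRate * 0.9f)
    {
        RenderedDynamicColorObjects.Pop();
    }
}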
To make the application easier to understand, an exemplary application is described below in connection with FIG. 8.
In this exemplary application, the target model includes one or more materials, and each material includes one or more color partitions. The target model includes one or more dyed areas, and each dyed area (of a single color) includes one or more sub-partitions, which may come from different materials. Each sub-partition is provided with an identifier, and sub-partitions belonging to the same dyed area have the same identifier.
S11, acquiring material information of the target model;
S12, obtaining the dyed areas of the target model according to the material information;
S13, judging whether the target material is provided with a target dynamic sub-material; specifically:
S131, if not, creating a target dynamic sub-material;
S132, if so, setting the historical dynamic sub-material as the target dynamic sub-material;
S14, acquiring dynamic color parameters for the dyed area; specifically:
① obtaining, from the materials, the target color partition corresponding to the dyed area according to the identifiers of the color partitions, where the target color partition comprises a plurality of sub-partitions;
② determining the target dynamic sub-material corresponding to each sub-partition;
③ allocating corresponding dynamic color parameters to each target dynamic sub-material.
S15, acquiring, according to the dynamic color parameters, the frame color value corresponding to each frame of the dyed area during the dynamic change; specifically:
① setting a preset display frame number;
② setting a second correspondence between the preset display frame number and the time period;
③ acquiring the frame color value corresponding to each frame of the dyed area according to the dynamic color parameters of the dyed area (the first correspondence) and the second correspondence.
S16, dynamically rendering the dyed area according to the frame color value corresponding to each frame.
Example 2
FIG. 9 schematically shows a block diagram of an Unreal Engine-based model dynamic color generation apparatus according to a second embodiment of the present application. The Unreal Engine-based model dynamic color generation apparatus may be divided into one or more program modules, which are stored in a storage medium and executed by one or more processors to implement the embodiments of the present application. A program module in the embodiments of the present application refers to a series of computer program instruction segments capable of performing specified functions; each program module is described in detail below. As shown in FIG. 9, the Unreal Engine-based model dynamic color generation apparatus 900 may include a first obtaining module 910, a second obtaining module 920, a third obtaining module 930, a fourth obtaining module 940, and a rendering module 950, where:
a first obtaining module 910, configured to obtain material information of the target model;
a second obtaining module 920, configured to obtain, according to the material information, dyed area information of each material on the target model, where, at any given time, the same dyed area corresponds to the same color on each material;
a third obtaining module 930, configured to obtain dynamic color parameters for the dyed area;
a fourth obtaining module 940, configured to obtain, according to the dynamic color parameters, a frame color value corresponding to each frame of the dyed area during the dynamic change;
and a rendering module 950, configured to dynamically render the dyed area according to the frame color value corresponding to each frame, so as to generate a target model whose color changes dynamically.
As an alternative embodiment, the target model comprises one or more materials, each material comprises one or more color partitions, each color partition is provided with an identifier, and color partitions belonging to the same dyed area have the same identifier;
correspondingly, the third obtaining module 930 is further configured to:
obtain, from the one or more materials, the target color partition corresponding to the dyed area according to the identifiers of the color partitions;
the dynamic color parameters of the dyed area are used to instruct the target color partition to change dynamically.
As an alternative embodiment, each material is associated with a dynamic sub-material, and the target color partition comprises a plurality of sub-partitions;
correspondingly, the third obtaining module 930 is further configured to:
determine a plurality of target dynamic sub-materials corresponding one-to-one to the plurality of sub-partitions;
and allocate corresponding dynamic color parameters to each target dynamic sub-material, where the corresponding dynamic color parameters are one group of parameters among the dynamic color parameters of the dyed area and are used to instruct the corresponding sub-partition to change dynamically.
As an alternative embodiment, the apparatus further comprises a creation module configured to:
create a target dynamic sub-material corresponding to the target material in the case where the target material corresponding to the sub-partition is not provided with a dynamic sub-material.
As an alternative embodiment, the apparatus further includes a caching module configured to:
cache the target dynamic sub-material in a preset data structure.
As an alternative embodiment, the apparatus further comprises a setting module configured to:
set the historical dynamic sub-material as the target dynamic sub-material in the case where the target material corresponding to the sub-partition is provided with a historical dynamic sub-material.
As an alternative embodiment, the dynamic color parameters of the dyed area include a first correspondence between a color sequence of the dyed area and a time period;
correspondingly, the fourth obtaining module 940 is further configured to:
set a preset display frame number;
set a second correspondence between the preset display frame number and the time period;
and acquire a frame color value corresponding to each frame of the dyed area according to the first correspondence and the second correspondence.
As an alternative embodiment, the apparatus further comprises an adjustment module configured to:
acquire the real-time display frame rate and the number of rendered dynamic color objects in the target model displayed on the same screen;
and adjust the number of rendered dynamic color objects in the target model according to the real-time display frame rate, so that the display frame number remains at a preset value.
Example 3
FIG. 10 schematically shows a hardware architecture diagram of a computer device 10000 adapted to implement the Unreal Engine-based method for dynamic model color generation according to a third embodiment of the application. In some embodiments, the computer device 10000 may be a smartphone, a wearable device, a tablet computer, a personal computer, a vehicle-mounted terminal, a game console, a virtual device, a workstation, a digital assistant, a set-top box, a robot, or the like. In other embodiments, the computer device 10000 may be a rack server, a blade server, a tower server, or a cabinet server (including a stand-alone server or a server cluster composed of multiple servers), or the like. As shown in FIG. 10, the computer device 10000 includes, but is not limited to, a memory 10010, a processor 10020, and a network interface 10030, which can be communicatively connected to each other through a system bus. Wherein:
The memory 10010 includes at least one type of computer-readable storage medium, including flash memory, a hard disk, a multimedia card, a card-type memory (e.g., SD or DX memory), Random Access Memory (RAM), Static Random Access Memory (SRAM), Read-Only Memory (ROM), Electrically Erasable Programmable Read-Only Memory (EEPROM), Programmable Read-Only Memory (PROM), magnetic memory, a magnetic disk, an optical disc, and the like. In some embodiments, the memory 10010 may be an internal storage module of the computer device 10000, such as a hard disk or internal memory of the computer device 10000. In other embodiments, the memory 10010 may also be an external storage device of the computer device 10000, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, or a flash card provided on the computer device 10000. Of course, the memory 10010 may also include both an internal storage module of the computer device 10000 and an external storage device thereof. In this embodiment, the memory 10010 is typically used to store an operating system installed on the computer device 10000 and various application software, such as the program code of the Unreal Engine-based method for dynamic model color generation. In addition, the memory 10010 may be used to temporarily store various types of data that have been output or are to be output.
The processor 10020 may be a central processing unit (Central Processing Unit, CPU), controller, microcontroller, microprocessor, or other chip in some embodiments. The processor 10020 is typically configured to control overall operation of the computer device 10000, such as performing control and processing related to data interaction or communication with the computer device 10000. In this embodiment, the processor 10020 is configured to execute program codes or process data stored in the memory 10010.
The network interface 10030 may comprise a wireless network interface or a wired network interface, and the network interface 10030 is typically used to establish a communication link between the computer device 10000 and other computer devices. For example, the network interface 10030 is used to connect the computer device 10000 to an external terminal through a network, and to establish a data transmission channel and a communication link between the computer device 10000 and the external terminal. The network may be a wireless or wired network such as an Intranet, the Internet, the Global System for Mobile communications (GSM), Wideband Code Division Multiple Access (WCDMA), a 4G network, a 5G network, Bluetooth, or Wi-Fi.
It should be noted that FIG. 10 shows only a computer device having the components 10010-10030, but it should be understood that not all of the illustrated components need to be implemented, and more or fewer components may be implemented instead.
In this embodiment, the Unreal Engine-based method for dynamic model color generation stored in the memory 10010 may be further divided into one or more program modules and executed by one or more processors (such as the processor 10020) to implement the embodiments of the application.
Example 4
The embodiments of the application also provide a computer-readable storage medium having a computer program stored thereon, where the computer program, when executed by a processor, implements the steps of the Unreal Engine-based method for dynamic model color generation in the embodiments.
In this embodiment, the computer-readable storage medium includes flash memory, a hard disk, a multimedia card, a card-type memory (e.g., SD or DX memory), Random Access Memory (RAM), Static Random Access Memory (SRAM), Read-Only Memory (ROM), Electrically Erasable Programmable Read-Only Memory (EEPROM), Programmable Read-Only Memory (PROM), magnetic memory, a magnetic disk, an optical disc, and the like. In some embodiments, the computer-readable storage medium may be an internal storage unit of a computer device, such as a hard disk or internal memory of the computer device. In other embodiments, the computer-readable storage medium may also be an external storage device of the computer device, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, or a flash card provided on the computer device. Of course, the computer-readable storage medium may also include both an internal storage unit of a computer device and an external storage device. In this embodiment, the computer-readable storage medium is typically used to store an operating system installed on the computer device and various application software, such as the program code of the Unreal Engine-based method for dynamic model color generation in the embodiments. Furthermore, the computer-readable storage medium may also be used to temporarily store various types of data that have been output or are to be output.
It will be apparent to those skilled in the art that the modules or steps of the embodiments of the application described above may be implemented by a general-purpose computing device; they may be concentrated on a single computing device or distributed over a network of multiple computing devices. Optionally, they may be implemented as program code executable by a computing device, so that they may be stored in a storage device and executed by the computing device; in some cases, the steps shown or described may be performed in an order different from that shown or described here. Alternatively, they may be fabricated as individual integrated circuit modules, or multiple modules or steps among them may be fabricated as a single integrated circuit module. Thus, the embodiments of the application are not limited to any specific combination of hardware and software.
It should be noted that the foregoing describes only preferred embodiments of the present application and is not intended to limit the scope of the present application; any equivalent structure or equivalent process transformation made using the description and drawings of the present application, or any direct or indirect application thereof in other related technical fields, is likewise included in the scope of protection of the present application.