Detailed Description
Exemplary embodiments of the present disclosure will be described more fully hereinafter with reference to the accompanying drawings.
The drawings are schematic illustrations of the present disclosure and are not necessarily drawn to scale. Some of the block diagrams shown in the figures may be functional entities and do not necessarily correspond to physically or logically separate entities. These functional entities may be implemented in software, or in hardware modules or integrated circuits, or in networks, processors or microcontrollers. Embodiments may be embodied in many forms and should not be construed as limited to the examples set forth herein. The described features, structures, or characteristics of the disclosure may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a thorough description of embodiments of the present disclosure. However, it will be recognized by one skilled in the art that one or more of the specific details may be omitted, or other methods, components, devices, steps, etc. may be used instead of one or more of the specific details in implementing the aspects of the present disclosure.
In the related art, when editing motion information for a virtual object in a game, a user generally needs to edit information such as the position of the virtual object at different times. For example, in motion trail editing for a three-dimensional virtual object, a common approach is that the user sets key frames at different moments by moving an indication point on a time axis and edits the position of the three-dimensional virtual object in each key frame, and a program calculates a speed from the distance, the time and the like, thereby obtaining complete motion information, from which, for example, a coherent motion trail animation can be generated. However, in this method the information visible to the user is insufficient: when the user edits the information at a certain time, the information at other times is invisible or only partly visible, and the relevant editing interface always displays the state of the virtual object at only one time, so the user cannot see the states at a plurality of times simultaneously, which adversely affects the editing result.
In view of the above, exemplary embodiments of the present disclosure provide a motion editing processing method in a game.
Fig. 1 shows a system architecture diagram. The system architecture 100 may include a terminal device 110 and a server 120. The terminal device 110 may be a mobile phone, a tablet computer, a personal computer, a smart wearable device, a game machine, or the like, which has a display function and is capable of displaying a graphical user interface, and the graphical user interface may include an interface of an operating system or an interface of an application program, or the like. The terminal device 110 has installed thereon a game program, such as a client program that may be a network game. When the terminal device 110 runs the game program, a game editing scene in which the user can edit the scene components may be displayed in the graphical user interface. The server 120 generally refers to a background system that provides game services in the present exemplary embodiment, and may be one server or a cluster of multiple servers. The server 120 is deployed with a server program of a network game, and is used for executing game data processing of the server. The connection between the terminal device 110 and the server 120 may be formed by a wired or wireless communication link for data transmission. The motion editing processing method in the game may be executed independently by the terminal device 110 or may be executed together by the terminal device 110 and the server 120.
In one embodiment, the motion editing processing method in the game can be implemented and executed based on a cloud interaction system. The cloud interaction system may be the system architecture 100 described above. Various cloud applications can run under the cloud interaction system, for example, cloud games. Taking a cloud game as an example, a cloud game refers to a game mode based on cloud computing. In the cloud game running mode, the body that runs the game program is separated from the body that presents the game picture: the storage and running of the in-game control and interaction logic are completed on a cloud game server (such as the server 120), while the cloud game client (such as the terminal device 110) receives and sends data and presents the game picture. For example, the cloud game client may be a display device with a data transmission function near the user side, such as a mobile terminal, a television, a computer, or a palm computer, while the information processing is performed by the cloud game server. When playing the game, the user operates the cloud game client to send an operation instruction to the cloud game server; the cloud game server runs the game according to the operation instruction, encodes and compresses data such as game pictures, and returns the data to the cloud game client through the network; finally the cloud game client decodes the data and outputs the game pictures.
In one embodiment, the motion editing processing method in the game may be implemented in a stand-alone game: without needing to deploy a server, the terminal device can install the complete game program and execute the motion editing processing method in the game.
In one embodiment, referring to fig. 2, the in-game motion editing processing method may include the following steps S210 to S230:
step S210, displaying a game editing scene in a graphical user interface provided by running a game program; the game editing scene comprises a scene component configured to generate a corresponding virtual model at a game run stage;
step S220, in response to a preset operation of a motion path to be edited of a target scene component, displaying a first visual object corresponding to the target scene component at a position of a target waypoint of the motion path to be edited in a game editing scene, and representing state information of the target scene component moving to the target waypoint through state information of the first visual object;
wherein the target scene component is one or more of the scene components of the game editing scene; the target scene component has one or more motion paths, and the motion path to be edited is one of these motion paths; the target scene component is configured to move according to its motion path during the game play phase; the motion path to be edited comprises one or more waypoints, and the target waypoint is one of these waypoints.
In the method shown in fig. 2, on one hand, a first visualized object is displayed at the position of the target waypoint, the state information of the first visualized object represents the state information of the target scene component moving to the target waypoint, and the effect of the target scene component moving to the target waypoint is presented, so that the user can simultaneously see the states of the target scene component at different moments. On the other hand, the waypoints are displayed in a highly visualized manner, providing the user with visual feedback corresponding to the editing operations and improving the convenience and experience of user operation.
Each step in fig. 2 is described in detail below.
Referring to fig. 2, in step S210, a game editing scene is displayed in a graphical user interface provided by running a game program; the game editing scenario includes a scenario component configured to generate a corresponding virtual model at a game run stage.
Wherein, when the terminal device runs the game program, a game editing scene can be displayed in the graphical user interface. The game program may be a game main program in which a game scene editing function (e.g., a game editor is built in the game program) is provided, and when the user uses the function, a game editing scene may be displayed in the game main program. Alternatively, the game program may be an editing program that is associated with the game main program, and the editing program may be executed independently without executing the game main program, and when editing a game scene using the editing program, the game editing scene may be displayed.
The present exemplary embodiment supports custom editing of scenes by players; thus, users herein may refer to game makers of game manufacturers (e.g., art designers) as well as players.
The user can choose to create a new game scene and edit it, or choose to edit an existing game scene. The game editing scene is the game scene currently being edited. The game editing scene may include a background of the scene and one or more generated scene components. The scene components may come with the game editing scene (for example, the game program may provide a plurality of different types of preset game scenes, the user may select one of them to edit, and the selected preset game scene may initially be provided with scene components) or may be generated by the user. A scene component may be a person or thing, or part of a person or thing, in the game, such as an NPC (Non-Player Character). During the game play phase (i.e., the phase in which a player plays a game using the edited scene), the scene component may generate a corresponding virtual model.
In one embodiment, in the case of displaying a game editing scene, one or more scene component selection controls may also be displayed in the graphical user interface. Referring to FIG. 3, the scenario component selection controls may include "tile component," "cylindrical component," "semi-cylindrical component," etc. controls for generating corresponding scenario components in the game-editing scenario in response to user manipulation of the controls, e.g., a user clicking on the "tile component" control may generate tile components in the game-editing scenario.
In one embodiment, the game program may come with one or more built-in scene components, such as scene components preconfigured by an artist and stored in the game program, and may provide their corresponding scene component selection controls, so that a player can conveniently use these scene components for scene editing, for example adding them to the game editing scene in a one-click manner.
In one embodiment, the scene components may be preconfigured by a player, who can obtain scene components not originally in the game program by modeling in the game editing scene or in another editing interface. A corresponding scene component selection control may also be provided for a player-preconfigured scene component. When a scene component is preconfigured, one or more of its size, position, direction, color, texture, shape, and other information can be configured. Thus, when the user uses the scene component in the game editing scene, the configured information can be called directly, which is very convenient and efficient. Of course, the user may also adjust the configured information of the scene component, such as one or more of the above items, to better meet the user's needs and preferences.
In one embodiment, a virtual camera may be provided in the game editing scene. The virtual camera is a tool for simulating a real camera in a game program to shoot a game scene picture, can be arranged at any position in a game editing scene, and shoots the game scene at any view angle, namely the virtual camera can have any pose in the game scene, and the pose can be fixed or dynamically changed. In addition, any number of virtual cameras can be arranged in the game editing scene, and different virtual cameras can shoot different game scene pictures.
Referring to fig. 4A, the game editing scene may be presented at two different viewing angles, namely an observation view angle and a game view angle, and the user may select which to use in the relevant setting interface of the game editing scene. The observation view angle means the game editing scene is observed from a third-person perspective; as shown in fig. 4B, under the observation view angle, the user does not manipulate a game character in the game editing scene but directly manipulates the virtual camera to move the view angle (the virtual camera itself is not displayed). The game view angle means the game editing scene is observed from a first-person perspective; referring to fig. 4C, under the game view angle, the user may control a game character in the game editing scene, and the game character may be bound to the virtual camera, that is, the positional relationship between the game character and the virtual camera is fixed. For example, the game character may be located at the focus of the virtual camera, and when the user controls the game character to move, the virtual camera moves synchronously, thereby moving the view angle. Of course, under the observation view angle, an invisible game character may also be set in the game scene to be edited, which is equivalent to hiding the game character in fig. 4C, so that when moving the view angle the user moves the virtual camera by moving the game character. Under the observation view angle or the game view angle, a virtual joystick, ascend or descend controls, and the like may be arranged in the game editing scene, and the user may move the virtual camera or the game character by operating these controls.
With continued reference to fig. 2, in step S220, in response to a preset operation on the motion path to be edited of the target scene component, a first visualized object corresponding to the target scene component is displayed at a position of a target waypoint of the motion path to be edited in the game editing scene, and state information of the target scene component moving to the target waypoint is represented by state information of the first visualized object.
The target scene component is the scene component whose motion trail the user currently needs to edit or view; it is one or more of the scene components of the game editing scene and may be any scene component therein. For example, the user may select any scene component in the game editing scene as the target scene component. Alternatively, the user may generate a scene component in the game editing scene by operating a scene component selection control or the like, and after the scene component is generated, it becomes the target scene component by default.
The target scene component has one or more motion paths, and the motion path to be edited is one of these motion paths. A motion path is motion information that has been edited for the target scene component, which is configured to move in accordance with its motion path during the game play phase. The motion path to be edited may be one or more motion paths selected manually by the user among all the motion paths of the target scene component, or selected automatically by the game program.
Waypoints refer to points on the motion path. In this exemplary embodiment, a waypoint is a set of motion information and may include information such as position and direction, indicating the position, direction, etc. of the target scene component when it moves to the waypoint. Recording and storing motion information in the form of waypoints can provide richer and more comprehensive motion information.
Each motion path may have waypoints. Alternatively, as shown in fig. 5, a movement manner may be set for a motion path, for example movement manners including movement along waypoints, uniform straight line, uniform circumference, simple pendulum, etc.; a motion path whose movement manner is movement along waypoints has waypoints, while motion paths of other movement manners may have no waypoints or may not store waypoints. It should be appreciated that the waypoints of each motion path may have a number or order indicating the order in which the target scene component moves through them. For example, if a motion path includes waypoint 1, waypoint 2 and waypoint 3, the target scene component goes from waypoint 1 to waypoint 2 and then to waypoint 3 along the motion direction of the path. It can be seen that if the waypoint number or order changes, the motion path of the target scene component also changes.
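For ease of understanding, a minimal illustrative sketch (in Python) of the waypoint and motion path data described above is given below. All names (Waypoint, MotionPath, SceneComponent) and field choices are assumptions made for this description only and do not correspond to actual identifiers in the game program.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

Vec3 = Tuple[float, float, float]

@dataclass
class Waypoint:
    order: int                          # number/order deciding the movement order
    position: Vec3                      # position when the component reaches this waypoint
    direction: Vec3 = (0.0, 0.0, 0.0)   # optional rotation angles about each axis

@dataclass
class MotionPath:
    movement_manner: str = "along_waypoints"  # or "uniform_line", "uniform_circle", "pendulum"
    path_type: str = "straight"               # or "spline", "linear_interpolation"
    waypoints: List[Waypoint] = field(default_factory=list)

    def ordered(self) -> List[Waypoint]:
        # The component visits waypoint 1, then waypoint 2, then waypoint 3, ...;
        # changing the numbers or order changes the resulting motion path.
        return sorted(self.waypoints, key=lambda w: w.order)

@dataclass
class SceneComponent:
    position: Vec3
    height: float = 1.0
    motion_paths: List[MotionPath] = field(default_factory=list)
```

The later sketches in this section reuse these illustrative types.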
The motion path to be edited comprises one or more waypoints, and the target waypoint is one of these waypoints, namely the waypoint the user currently needs to edit or view. The preset operation may be an operation that triggers editing or viewing of waypoints. In response to the preset operation, a first visualized object corresponding to the target scene component is displayed at the position of the target waypoint in the game editing scene. The first visualized object has state information that may be equivalent to, or map, the state information of the target scene component moving to the target waypoint. The state information may include, but is not limited to, one or more of position, direction, size, form, and color (including transparency). During the movement of the target scene component, not only does its position change; it may also rotate to produce a change of direction, scale to produce a change of size, deform to produce a change of form, or change color. By displaying the first visualized object, the state information of the target scene component moving to the target waypoint can be presented, so that the user can intuitively see the movement effect.
In one embodiment, referring to fig. 6, in response to the above-mentioned preset operation on the motion path to be edited of the target scene component, displaying, at the position of the target waypoint of the motion path to be edited in the game editing scene, a first visualized object corresponding to the target scene component, where the state information of the first visualized object indicates the state information of the motion of the target scene component to the target waypoint, and may include the following steps S610 and S620:
Step S610, generating a first visualized object identical to the target scene component in response to the preset operation on the motion path to be edited;
step S620, displaying the first visualized object at the position of the target waypoint in the game editing scene based on the state information of the target scene component moving to the target waypoint.
Referring to FIG. 7, the target scene component 701 is a chair component, and the first visual object 702 is the same as the target scene component 701, also a chair component, which is displayed at the location of the target waypoint. The first visualized object 702 may be displayed based on the state information of the target scene component 701 moving to the target waypoint, i.e. the state information of the displayed first visualized object 702 may be the same as the state information of the target scene component 701 moving to the target waypoint. For example, the direction, size, etc. of the movement of the target scene component 701 to the target waypoint may be determined, the direction, size, etc. of the first visual object 702 on the target waypoint may be set, and the first visual object 702 may be displayed. Since the first visualization object 702 is the same as the target scene component 701, the user can very intuitively see the state and effect of the target scene component 701 moving to the target waypoint.
In one embodiment, before displaying the first visualized object corresponding to the target scene component at the position of the target waypoint of the motion path to be edited in the game editing scene in response to the preset operation on the motion path to be edited of the target scene component, the motion editing processing method may further include the steps of:
Superposing all the motion paths of the target scene component, and determining the state information of the target scene component moving to the target waypoint based on the superposed motion path.
If the target scene component has multiple motion paths, then in the game running stage the target scene component moves under the superposition of these motion paths, and the actual motion path may be the motion path obtained by superposing all of them. Referring to fig. 8, point O represents the position of the target scene component, which here has 3 motion paths O1, O2 and O3; superposing the 3 motion paths by vector addition may form a motion path O4, which is the actual motion path of the target scene component. If a waypoint P exists on the motion path O3 and serves as the target waypoint, a mapping point P' of the target waypoint P may be determined on the superposed motion path O4 (satisfying OP/O1 = OP'/O4), the state information of the target scene component moving to point P' may be determined, and the first visualized object may be displayed based on this state information. The first visualized object may be displayed at point P or at point P'. In this way the user can see the motion effect of multiple superposed motion paths, which facilitates achieving complex and varied motion effects by editing multiple motion paths.
Fig. 8 shows the case where multiple motion paths are simply superposed. In one embodiment, a motion path may include information such as a trigger condition and motion time: the trigger condition indicates that the target scene component is triggered to move along the motion path when the condition is satisfied, and the motion time information may include the total time, cycle time, or time delay of the target scene component moving along the motion path (indicating how long movement is delayed after the trigger condition is satisfied), etc. The motion paths can therefore be superposed based on their respective trigger conditions, motion times, and the like. Specifically, a motion time period may be determined from the earliest motion start time and the latest motion end time among all motion paths, and information such as the position and direction of the target scene component at different moments in each motion path is aligned to a time axis, so that the different motion paths are time-axis aligned; the position, direction and other information of all motion paths at each moment are then superposed to obtain the superposed position, direction and other information of the target scene component at each moment, thereby forming the superposed motion path.
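As a hedged illustration of the superposition described above, the sketch below samples every motion path on a shared time axis and vector-adds the per-path offsets; offset_at is a hypothetical helper that returns a path's positional offset at time t (and zero outside the window defined by its trigger condition and time delay). The actual program may superpose direction and other information in the same manner.

```python
from typing import Callable, List, Tuple

Vec3 = Tuple[float, float, float]

def add(a: Vec3, b: Vec3) -> Vec3:
    return (a[0] + b[0], a[1] + b[1], a[2] + b[2])

def superimpose(origin: Vec3,
                offsets_at: List[Callable[[float], Vec3]],
                t_start: float, t_end: float, step: float) -> List[Vec3]:
    # Sample each motion path on the shared time axis and vector-add the
    # per-path offsets, yielding points of the superposed motion path.
    points: List[Vec3] = []
    t = t_start
    while t <= t_end:
        total: Vec3 = (0.0, 0.0, 0.0)
        for offset_at in offsets_at:
            total = add(total, offset_at(t))  # zero outside the path's active window
        points.append(add(origin, total))
        t += step
    return points
```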
In one embodiment, the preset operation includes a waypoint selection operation; in response to a preset operation on a motion path to be edited of a target scene component, displaying a first visualized object corresponding to the target scene component at a position of a target waypoint of the motion path to be edited in a game editing scene, may include the steps of:
In response to a waypoint selection operation on the motion path to be edited, determining the target waypoint according to the waypoint selection operation, and displaying a first visualized object corresponding to the target scene component at the position of the target waypoint in the game editing scene.
The waypoint selection operation can be used for selecting one or more waypoints from all the waypoints of the motion path to be edited; the selected waypoints serve as target waypoints, and display of the first visualized object at their positions is triggered, so that the user may choose to view information of specific waypoints. In addition, the waypoint selection operation may further be used to trigger editing of waypoints; for example, in response to the waypoint selection operation, the target waypoint is determined and an editing interface of the target waypoint is displayed, where information such as the position and direction stored in the target waypoint may be shown for the user to edit.
In one embodiment, in response to a waypoint switching operation of a motion path to be edited, other waypoints than the currently determined target waypoint are determined as target waypoints. For example, the waypoint switching operation may be implemented by clicking "last waypoint", "next waypoint", "jump to end point (end point is last waypoint)" in the editing interface of the motion path to be edited, and when the user performs the waypoint switching operation, the currently selected waypoint may be switched to another waypoint, and the other waypoint is used as a new target waypoint, and the first visualization object is displayed at the position of the new target waypoint.
In one embodiment, in response to a waypoint selection operation or a waypoint switching operation, when determining a target waypoint, a lens of a virtual camera in a game editing scene may be aimed at the target waypoint to better present information of the target waypoint and a first visualized object on the target waypoint.
In one embodiment, before determining the target waypoint according to the waypoint selection operation in response to the waypoint selection operation of the motion path to be edited, and displaying the first visualized object corresponding to the target scene component at the position of the target waypoint in the game editing scene, the motion editing processing method may further include the steps of:
In response to an editing triggering operation on the motion path to be edited, displaying an editing interface of the motion path to be edited in the graphical user interface; the editing interface of the motion path to be edited comprises a waypoint operation control.
The editing triggering operation is an operation for starting to edit the motion path to be edited. Illustratively, after the target scene component is determined, an editing interface of the target scene component is displayed. Fig. 9 shows a schematic diagram of an editing interface of a target scene component; more specifically, the setting interface on the right side of fig. 9 is the editing interface of the target scene component. The target scene component is a block component A, and the editing interface may display all its motion paths; for example, block component A has 3 motion paths, namely "motion 1", "motion 2" and "motion 3" shown in fig. 9. The user can select the motion path to be edited among them, and the operation of selecting the motion path to be edited is the editing triggering operation on it, such as clicking the "motion 1", "motion 2" or "motion 3" control. In addition, in the editing interface of the target scene component the user can copy and paste a motion path (i.e., the "mover" shown in fig. 9), add a new motion path, and perform operations such as motion preview (or motion test, i.e., previewing the motion effect of a motion path).
In one embodiment, after the target scene component is determined, the motion paths of the target scene component are displayed in the game editing scene, for example as curves, straight lines, or the like; the user can select the motion path to be edited among them, and the operation of selecting it is the editing triggering operation on the motion path to be edited. Alternatively, after the target scene component is determined, if it has only one motion path, that motion path may automatically serve as the motion path to be edited, and the operation of determining the target scene component is the editing triggering operation on the motion path to be edited.
As can be seen from the above, the target scene component may have multiple motion paths, and waypoints on these motion paths may display the state information of all motion paths superposed. For example, referring to fig. 9, the motion path "motion 1" of block component A has two waypoints; if the 2nd waypoint is taken as the target waypoint, the state information of block component A moving to this waypoint along the path superposed from the 3 motion paths (such as the dashed block component in the left diagram of fig. 9) can be displayed at the waypoint, thereby showing the motion effect of block component A under the superposition of multiple motions such as translation and rotation.
In response to the editing triggering operation on the motion path to be edited, an editing interface of the motion path to be edited is displayed; this interface may be a next-level interface of the editing interface of the target scene component. The editing interface of the motion path to be edited comprises a waypoint operation control. Referring to fig. 10A, taking motion path 1 as the motion path to be edited, its editing interface is shown; motion path 1 includes two waypoints, namely waypoint 1 and waypoint 2, and the editing interface includes an operation control for waypoint 1 and an operation control for waypoint 2. Accordingly, the waypoint selection operation may be an operation performed through a waypoint operation control. For example, in fig. 10A, if the user clicks the operation control of waypoint 1, waypoint 1 is selected as the target waypoint. Of course, the waypoint operation control is not limited to the specific form shown in fig. 10A; it may, for example, be a single control corresponding to all waypoints. By displaying the editing interface of the motion path to be edited and providing the waypoint operation controls, the user can conveniently select the target waypoint for editing or viewing.
In one embodiment, the editing interface of the motion path to be edited may also display motion information of the motion path to be edited, such as a motion speed, a path type, etc., and the user may edit and modify the information.
In one embodiment, the motion editing processing method may further include the steps of:
In response to the editing triggering operation on the motion path to be edited, displaying a second visualized object at the position of each waypoint of the motion path to be edited in the game editing scene.
The second visualized object is used to present the position of a waypoint without presenting the complete state information of the target scene component moving to that waypoint. That is, the second visualized object may carry no state information, presenting simpler information than the first visualized object. Illustratively, the second visualized object is a virtual object capable of indicating position information, such as a point or an arrow (the position the arrow points to being the position of the waypoint). Referring to fig. 10B, the target scene component 701 is a chair component, and the motion path to be edited of the target scene component 701 includes two waypoints; in response to the editing triggering operation on the motion path to be edited, second visualized objects 703, such as points, may be displayed at the positions of the two waypoints respectively, thereby indicating the positions of the waypoints. It should be understood that in response to the editing triggering operation, before the user selects a target waypoint, second visualized objects can be displayed for all waypoints of the motion path to be edited, so that the user sees the positions of all waypoints; since the information presented by the second visualized objects is simple, the conciseness of the game editing scene interface can be maintained. After the user selects a target waypoint, the first visualized object is displayed at the position of the target waypoint, while second visualized objects can continue to be displayed at the positions of the non-target waypoints, so that the positions of all waypoints are presented and the information of the target waypoint is presented as well.
In one embodiment, the motion editing processing method may further include the steps of:
in response to an edit triggering operation of the motion path to be edited, a third visual object for representing the motion path to be edited is displayed in the game editing scene.
The type of the third visualized object matches the path type of the motion path to be edited: if the path type of the motion path to be edited is a curve, the third visualized object is a curve; if the path type is a straight line, the third visualized object is a straight line. Referring to fig. 10B, in response to the editing triggering operation on the motion path to be edited, a third visualized object of dashed-line type (belonging to the straight-line category) is displayed, which connects the different waypoints and shows the complete motion path to be edited. The user can thus see the path information of the motion path to be edited, which facilitates the corresponding editing operations.
In one embodiment, in response to an edit trigger operation of a motion path to be edited, a third visual object representing the motion path to be edited is displayed in a game editing scene, and a second visual object is displayed at a position of each waypoint of the motion path to be edited. In response to the waypoint selection operation, a target waypoint is determined, the first visual object is displayed at the location of the target waypoint, while the second visual object and the third visual object of non-target waypoints may remain displayed. For a second visualized object on the target waypoint, in the case of displaying the first visualized object, the second visualized object may be removed, or the second visualized object may remain displayed. For example, as shown with reference to fig. 7 above, in the case of determining a target waypoint, a chair assembly (first visualized object) and a point (second visualized object) may be displayed simultaneously on the target waypoint.
In one embodiment, the preset operation includes a waypoint adding operation; the displaying, in response to the preset operation on the motion path to be edited of the target scene component, the first visualized object corresponding to the target scene component at the position of the target waypoint of the motion path to be edited in the game editing scene may include the following steps:
In response to a waypoint adding operation on the motion path to be edited, adding a new waypoint to the motion path to be edited, determining the new waypoint as the target waypoint, and displaying a first visualized object corresponding to the target scene component at the position of the target waypoint in the game editing scene.
The user can add a new waypoint in the editing interface of the motion path to be edited; when a new waypoint is added, it can automatically be determined as the target waypoint, and the first visualized object is displayed at its position, so that the user immediately sees the motion effect corresponding to the new waypoint.
In an embodiment, the preset operation includes an edit triggering operation, and in response to the preset operation on the motion path to be edited of the target scene component, displaying, in the game editing scene, a first visualized object corresponding to the target scene component at a position of a target waypoint of the motion path to be edited, may include the following steps:
In response to the editing triggering operation on the motion path to be edited, taking all the waypoints of the motion path to be edited as target waypoints, or taking the last waypoint of the motion path to be edited as the target waypoint, and displaying a first visualized object corresponding to the target scene component at the position of each target waypoint in the game editing scene.
The editing triggering operation refers to an operation of starting to edit the motion path to be edited. In response to the editing triggering operation, all the waypoints of the motion path to be edited may be taken as target waypoints, i.e., a first visualized object is displayed at every waypoint, to comprehensively present the motion effect of the motion path. Alternatively, the last waypoint of the motion path to be edited may be taken as the target waypoint; the last waypoint is the end point of the motion path to be edited, and displaying the first visualized object at the end point highlights the motion effect at the end.
In one embodiment, the motion editing processing method may further include the steps of:
in response to a motion path adding operation to the target scene component, adding a new motion path to the target scene component;
if the movement manner of the new motion path is movement along waypoints, adding a first waypoint to the new motion path, and setting the position of the first waypoint to the position of the target scene component plus a first preset offset parameter.
The editing interface of the target scene component may provide a control for adding a motion path, such as a button for adding a motion path or the "add mover" control in fig. 9, and the user may operate this control to perform the motion path adding operation. When adding a new motion path, a movement manner may be selected for it; referring to fig. 5, movement manners such as movement along waypoints, uniform straight line, uniform circumference, and simple pendulum may be selected. If the user selects movement along waypoints, a first waypoint may be automatically added to the new motion path, whose position is the position of the target scene component plus a first preset offset parameter. The first preset offset parameter may be a vector; it may be a fixed parameter, or may be determined according to the size of the target scene component, the path type of the new motion path, and the like. For example, the first preset offset parameter may be (0, 0, h+2), where h is the height of the target scene component and 2 represents 2 units of length in the game editing scene. If the position of the target scene component is (x0, y0, z0), the first waypoint position is (x0, y0, z0+h+2). Of course, the user may later adjust the position of the first waypoint, for example by directly modifying its position coordinates, or by moving the first visualized object or the second visualized object on the first waypoint.
When a new motion path whose movement manner is movement along waypoints is added, automatically adding the first waypoint and setting its position reduces the cost for the user of understanding waypoints and motion paths, makes it convenient for the user to edit this waypoint and to add further waypoints, and improves editing efficiency.
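A minimal sketch of this behaviour, reusing the types from the earlier sketch and assuming the fixed first preset offset parameter (0, 0, h+2); as noted above, the actual parameter may instead be determined from the path type or other factors.

```python
def add_motion_path(component: SceneComponent, movement_manner: str) -> MotionPath:
    path = MotionPath(movement_manner=movement_manner)
    if movement_manner == "along_waypoints":
        x0, y0, z0 = component.position
        # First waypoint = component position + first preset offset (0, 0, h+2).
        path.waypoints.append(
            Waypoint(order=1, position=(x0, y0, z0 + component.height + 2)))
    component.motion_paths.append(path)
    return path
```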
In one embodiment, the motion editing processing method may further include the steps of:
In response to a waypoint adding operation on the motion path to be edited, adding a new waypoint to the motion path to be edited, and setting the position of the new waypoint to the position of a reference waypoint of the motion path to be edited plus a second preset offset parameter; the reference waypoint is one or more of the waypoints existing in the motion path to be edited before the new waypoint is added.
Referring to fig. 7, the editing interface of the motion path to be edited may provide a control for adding a waypoint, such as an "add waypoint" button, and the user may operate this control to perform the waypoint adding operation. Each time a waypoint adding operation is performed, a new waypoint may be added, located in the vicinity of the reference waypoint of the motion path to be edited. The reference waypoint may be any one or more waypoints existing in the motion path to be edited before the new waypoint is added; for example, the last waypoint before the addition may serve as the reference waypoint, or, if the user performs the waypoint adding operation with a waypoint selected (e.g., a target waypoint), the waypoint selected by the user (e.g., the target waypoint) is the reference waypoint. The position of the new waypoint is the position of the reference waypoint plus a second preset offset parameter. The second preset offset parameter may be a vector; it may be a fixed parameter, or may be determined according to the path type of the motion path to be edited, the path type of the reference waypoint, and the like. For example, the second preset offset parameter may be (0, 0, 2). If the position of the reference waypoint is (xm, ym, zm), the new waypoint position is (xm, ym, zm+2). Of course, the user may later adjust the position of the new waypoint, for example by directly modifying its position coordinates, or by moving the first visualized object or the second visualized object on the new waypoint. In this way the user can conveniently add waypoints to a motion path, improving editing efficiency.
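The sketch below illustrates this, assuming the fixed second preset offset parameter (0, 0, 2) and reusing the earlier types; the caller passes the reference waypoint, e.g. the last existing waypoint or the currently selected target waypoint.

```python
def add_waypoint(path: MotionPath, reference: Waypoint) -> Waypoint:
    xm, ym, zm = reference.position
    new_wp = Waypoint(order=len(path.waypoints) + 1,
                      position=(xm, ym, zm + 2))  # reference position + (0, 0, 2)
    path.waypoints.append(new_wp)
    return new_wp
```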
In one embodiment, the motion editing processing method may further include the steps of:
In response to a copying operation on any waypoint i of the motion path to be edited, adding a new waypoint at the position of waypoint i, and moving waypoint i backwards along the motion path to be edited.
Here, waypoint i denotes any waypoint of the motion path to be edited; assuming the motion path to be edited has m waypoints in total, i is any positive integer in [1, m]. The copying operation on a waypoint may be performed through the waypoint operation control; as shown in fig. 10A, the editing interface of the motion path to be edited provides operation controls for waypoint 1 and waypoint 2, and the user may click an option icon in the operation control of waypoint 1 and select copy, thereby copying waypoint 1.
When waypoint i is copied, a new waypoint is added and serves as the new waypoint i; the new waypoint i inherits the information of the original waypoint i (including the waypoint's number, identification, motion information, and the like). The original waypoint i is moved backwards along the motion path to be edited, for example to a certain position (such as the midpoint) between the original waypoint i and waypoint i+1; alternatively, waypoint i and the waypoints after it can be moved as a whole. In this way the user can conveniently copy waypoints, improving editing efficiency.
In one embodiment, the motion editing processing method may further include the steps of:
In response to a copying operation on any waypoint i of the motion path to be edited, adding a new waypoint at the position of waypoint i, and moving waypoint i and each waypoint after waypoint i to the position of its next waypoint respectively.
Specifically, when waypoint i is copied, a new waypoint is added and serves as the new waypoint i, inheriting the information of the original waypoint i. Waypoint i is moved to the position of waypoint i+1, waypoint i+1 to the position of waypoint i+2, and so on. For the last waypoint m, since it has no next waypoint, a new waypoint m+1 may be added whose position is the position of waypoint m plus the second preset offset parameter, and waypoint m is moved to the position of waypoint m+1. This is equivalent to shifting waypoint i and all waypoints after it one position backwards.
Further, a copying operation may be performed on a plurality of waypoints of the motion path to be edited, and the above processing is performed for each copied waypoint; a sketch of this copy-and-shift behaviour follows.
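A sketch of the second copy behaviour (waypoint i and all waypoints after it shift one position backwards) under the same assumptions as the earlier sketches, with 1-based waypoint indices as in the text:

```python
import copy

def copy_waypoint(path: MotionPath, i: int) -> None:
    wps = path.ordered()
    idx = i - 1
    old_positions = [wp.position for wp in wps]
    # The last waypoint has no successor: create one extra slot at its own
    # position plus the second preset offset parameter (0, 0, 2).
    x, y, z = old_positions[-1]
    old_positions.append((x, y, z + 2))
    # Move waypoint i and every waypoint after it to the next position.
    for j in range(idx, len(wps)):
        wps[j].position = old_positions[j + 1]
    # The copy sits at waypoint i's original position and inherits its information.
    new_wp = copy.deepcopy(wps[idx])
    new_wp.position = old_positions[idx]
    wps.insert(idx, new_wp)
    # Renumber so the numbers still reflect the movement order.
    for order, wp in enumerate(wps, start=1):
        wp.order = order
    path.waypoints = wps
```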
In one embodiment, the motion editing processing method may further include the steps of:
In response to an operation of changing the path type of the motion path to be edited, keeping the positions of the waypoints of the motion path to be edited unchanged, and updating the motion path to be edited according to the waypoint positions and the changed path type.
The path type of the motion path to be edited represents the path type between two adjacent waypoints and may include, but is not limited to: straight line, spline interpolation, and linear interpolation. If the path type is straight line, adjacent waypoints are connected by straight lines to form the complete motion path. If the path type is spline interpolation, a curve is formed between adjacent waypoints by spline interpolation, yielding the complete motion path. If the path type is linear interpolation, a curve or straight line is formed between adjacent waypoints by linear interpolation, yielding the complete motion path. It can be seen that different path types of the motion path to be edited produce different final motion paths. The user may change the path type of the motion path to be edited, for example in its editing interface. Changing the path type does not affect the positions of the waypoints, which can be kept unchanged while the motion path to be edited is updated according to the changed path type. For example, if the path type is changed from straight line to spline interpolation, the path can be regenerated between every two adjacent waypoints to form a new complete motion path, thereby updating the motion path to be edited. In this way the user can conveniently change the path type of a motion path to achieve the desired motion effect.
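The sketch below illustrates rebuilding a path from unchanged waypoint positions when the path type changes. The interpolators are deliberately simplified stand-ins (straight connection and pointwise sampling between adjacent waypoints); an actual spline implementation would also blend neighbouring waypoints.

```python
def rebuild_path(points: List[Vec3], path_type: str, samples: int = 8) -> List[Vec3]:
    # Waypoint positions stay fixed; only the segments between them change.
    if path_type == "straight":
        return list(points)  # adjacent waypoints joined directly by straight lines
    curve: List[Vec3] = []
    for a, b in zip(points, points[1:]):
        for k in range(samples):
            t = k / samples
            # Pointwise interpolation between two adjacent waypoints; a spline
            # path type would additionally take neighbouring waypoints into account.
            curve.append((a[0] + (b[0] - a[0]) * t,
                          a[1] + (b[1] - a[1]) * t,
                          a[2] + (b[2] - a[2]) * t))
    curve.append(points[-1])
    return curve
```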
In one embodiment, the motion editing processing method may further include the steps of:
In response to an operation of changing the path type of any waypoint i of the motion path to be edited, keeping the position of waypoint i unchanged, and updating the sub-path corresponding to waypoint i in the motion path to be edited according to the position of waypoint i and the changed path type.
The motion path to be edited may include a plurality of sub-paths, each corresponding to a different waypoint. For example, the sub-path from the start point (i.e., the position of the target scene component) to waypoint 1 may be taken as the sub-path corresponding to waypoint 1, the sub-path from waypoint 1 to waypoint 2 as the sub-path corresponding to waypoint 2, and so on. Alternatively, the sub-path from the start point to the midpoint between waypoints 1 and 2 may be taken as the sub-path corresponding to waypoint 1, the sub-path from the midpoint between waypoints 1 and 2 to the midpoint between waypoints 2 and 3 as the sub-path corresponding to waypoint 2, and so on, with the sub-path from the midpoint between waypoints m-1 and m to waypoint m finally taken as the sub-path corresponding to waypoint m. The present disclosure does not limit the manner of dividing sub-paths or the correspondence between sub-paths and waypoints.
A path type may be set for a waypoint, representing the path type of the sub-path corresponding to that waypoint, and may include, but is not limited to: straight line, spline interpolation, and linear interpolation. If the path type of the waypoint is straight line, the start point, the waypoint and the end point of the corresponding sub-path are connected in sequence by straight lines to form the sub-path. If the path type of the waypoint is spline interpolation, the start point, the waypoint and the end point of the corresponding sub-path form the sub-path through spline interpolation. If the path type of the waypoint is linear interpolation, the start point, the waypoint and the end point of the corresponding sub-path form the sub-path through linear interpolation. It can be seen that different waypoint path types produce different sub-paths. The user can change the path type of a waypoint, for example through the waypoint operation control in the editing interface of the motion path to be edited, or in an editing interface of the waypoint (which may be opened when the user selects the waypoint, or through the waypoint operation control). Changing the path type of the waypoint does not affect its position, which can be kept unchanged while the sub-path corresponding to the waypoint is updated according to the changed path type. For example, if the path type of the waypoint is changed from straight line to spline interpolation, the sub-path can be regenerated by spline interpolation among the start point of the corresponding sub-path, the waypoint, and the end point of the sub-path, thereby updating the sub-path. In this way the user can conveniently change the path type of one segment of the motion path to achieve the desired motion effect. Supporting independent setting of each waypoint's path type improves the flexibility of editing: one motion path can include sub-paths of different path types, which increases the diversity of motion paths.
In one embodiment, the motion editing processing method may further include the steps of:
In response to an operation of deleting any waypoint i of the motion path to be edited, updating the sub-path between the waypoints adjacent to waypoint i in the motion path to be edited according to those adjacent waypoints.
The user can delete a waypoint through the waypoint operation control, or select the waypoint by selecting its first visualized object or second visualized object in the game editing scene and delete it through a deletion control, a deletion button, or the like. Deleting waypoint i does not affect the positions of the other waypoints, which can be kept unchanged while the sub-path between the waypoints adjacent to waypoint i is updated. For example, after waypoint i is deleted, a sub-path may be formed anew between waypoint i-1 and waypoint i+1; a sub-path of straight line, spline interpolation, or linear interpolation type may be generated according to the path type of the motion path to be edited, the path type of waypoint i-1, or the path type of waypoint i+1, thereby updating the sub-path. If waypoint i is the first waypoint, after it is deleted the sub-path can be formed anew between the start point and waypoint i+1. If waypoint i is the last waypoint, after it is deleted waypoint i-1 can serve as the last waypoint, and the sub-path from waypoint i-1 to waypoint i can be deleted. In this way the user can conveniently delete waypoints while the accuracy of the motion path after deletion is ensured.
Naturally, after deleting the waypoint i, the numbers and the sequence of the remaining waypoints may be updated. Specifically, the numbers of all waypoints following waypoint i may be decremented by one.
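A minimal sketch of deletion with renumbering, under the same assumptions as the earlier sketches:

```python
def delete_waypoint(path: MotionPath, i: int) -> None:
    wps = path.ordered()
    del wps[i - 1]                       # i is 1-based, as in the text
    for order, wp in enumerate(wps, start=1):
        wp.order = order                 # numbers after waypoint i decrease by one
    path.waypoints = wps
    # A new sub-path between waypoint i-1 and the former waypoint i+1 would be
    # regenerated here according to the relevant path type, as described above.
```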
In one embodiment, the motion editing processing method may further include the steps of:
in response to a movement operation on the target scene component, each waypoint in each motion path of the target scene component is moved according to the movement operation, so that the relative position of each waypoint and the target scene component is unchanged.
For each waypoint in each motion path of the target scene component, the relative position with respect to the target scene component can remain unchanged, since the position of each waypoint may be a position in the coordinate system of the target scene component, which moves as a whole as the target scene component moves; the absolute position of each waypoint moves with it, but the relative position to the target scene component is unchanged. In one embodiment, in response to a movement operation on the target scene component, the movement vector of the target scene component may be obtained and added to the position of each waypoint, so that each waypoint moves synchronously with the target scene component. Keeping the relative positions of the waypoints and the target scene component unchanged reduces the complexity of motion editing and makes the motion easier for the user to understand.
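A sketch of this synchronous movement, reusing add() and the types from the earlier sketches:

```python
def move_component(component: SceneComponent, delta: Vec3) -> None:
    component.position = add(component.position, delta)
    for path in component.motion_paths:
        for wp in path.waypoints:
            # Adding the movement vector keeps each waypoint's position
            # relative to the component unchanged.
            wp.position = add(wp.position, delta)
```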
It should be appreciated that a movement operation on the target scene component changes its initial position. In the game editing scene, when a motion test is performed, the target scene component is triggered to move according to its motion path so that the user can preview the motion effect. The initial position of the target scene component is not changed during the motion test, and the motion test is not a movement operation on the target scene component; therefore, the positions of the waypoints of the target scene component do not change during the motion test.
In one embodiment, the motion editing processing method may further include the steps of:
In a case where the direction option of the motion path to be edited is activated, recording the direction information corresponding to each waypoint of the motion path to be edited, the direction information representing the direction of the target scene component when moving to each waypoint.
The target scene component may rotate during movement along the motion path, resulting in a change of direction. Activating the direction option of the motion path to be edited indicates that the direction information corresponding to each waypoint is to be recorded. For example, a direction option may be provided in the editing interface of the motion path to be edited; if the user checks this option, the direction option is activated, and if not, the direction option is not activated and direction information is not recorded. The direction information corresponding to each waypoint represents the direction of the target scene component when it moves to that waypoint and may include rotation angles about different axes. By recording the direction information of the waypoints, the user can see this information, and when the first visualized object is displayed at the position of the target waypoint, its state information can be quickly determined from the direction information corresponding to the target waypoint, improving processing efficiency.
In one embodiment, the motion editing processing method may further include the steps of:
If the motion path to be edited includes non-smooth waypoints, recording the motion-direction abrupt-change information corresponding to the non-smooth waypoints.
For example, when the motion path to be edited is of the straight-line path type, or contains straight-line sub-paths, there may be non-smooth waypoints, typically inflection points of a broken line. At a non-smooth waypoint the movement direction of the target scene component changes abruptly, and the motion-direction abrupt-change information corresponding to the non-smooth waypoint can be recorded, for example including the movement direction before the abrupt change and the movement direction after it. If a non-smooth waypoint serves as the target waypoint, the first visualized object before the abrupt change and the first visualized object after the abrupt change may both be displayed at the position of the target waypoint, or only the first visualized object after the abrupt change may be displayed. During a motion test or in the game running stage, when the target scene component moves to a non-smooth waypoint, the effect of an instantaneous abrupt change of movement direction can be presented.
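The sketch below shows one hypothetical way of detecting non-smooth waypoints on a polyline and recording the movement direction before and after the abrupt change; the detection rule (comparing normalized segment directions) is an assumption made for illustration.

```python
import math

def sudden_turns(points: List[Vec3], tol: float = 1e-6) -> List[dict]:
    def direction(a: Vec3, b: Vec3) -> Vec3:
        dx, dy, dz = b[0] - a[0], b[1] - a[1], b[2] - a[2]
        n = math.sqrt(dx * dx + dy * dy + dz * dz) or 1.0
        return (dx / n, dy / n, dz / n)

    records = []
    for i in range(1, len(points) - 1):
        before = direction(points[i - 1], points[i])
        after = direction(points[i], points[i + 1])
        if any(abs(before[k] - after[k]) > tol for k in range(3)):
            # Record both directions as the abrupt-change information.
            records.append({"waypoint": i, "before": before, "after": after})
    return records
```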
In one embodiment, the waypoints may be sub-components of the target scene component. That is, the target scene component is the parent component of the waypoints, and parent and child components are associated with each other's information. In particular, a child component may inherit the motion information of its parent component, or the motion information of the child component may be synchronized to the parent component. Denser track points between the waypoints can be generated according to the movement speed and the path type of the motion path to be edited of the target scene component: the movement speed affects the density of the track points, and the path type affects their positions. During a motion test or at the game running stage, the target scene component may move according to the track points. A minimum distance (e.g., 0.01) for updating the position of the target scene component can be set; each time the position change of the target scene component reaches the minimum distance, the position is updated once, thereby achieving a degree of real-time position updating.
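The following sketch illustrates one plausible way to generate track points whose spacing depends on the movement speed, committing a position update only once the minimum distance is reached; the fixed tick length `dt` and the linear interpolation are assumptions, not part of the disclosure:

```python
import math
from typing import List, Optional, Tuple

Vec3 = Tuple[float, float, float]

def generate_track_points(waypoints: List[Vec3],
                          speed: float,
                          dt: float = 1 / 60,
                          min_step: float = 0.01) -> List[Vec3]:
    """Interpolate dense track points along a linear path.

    One sample is placed per tick of length `dt`, so a higher speed spaces
    the track points further apart; a position is only committed once it has
    moved at least `min_step` (the minimum update distance, e.g. 0.01).
    """
    track: List[Vec3] = []
    last: Optional[Vec3] = None
    for a, b in zip(waypoints, waypoints[1:]):
        segment_length = math.dist(a, b)
        steps = max(1, int(segment_length / max(speed * dt, 1e-9)))
        for k in range(steps + 1):
            t = k / steps
            p = (a[0] + (b[0] - a[0]) * t,
                 a[1] + (b[1] - a[1]) * t,
                 a[2] + (b[2] - a[2]) * t)
            if last is None or math.dist(last, p) >= min_step:
                track.append(p)
                last = p
    return track
```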
In one embodiment, a game scene may be established and released through the terminal device. For example, after the game editing scene has been edited, a "release map" control may be clicked on an interface such as that shown in fig. 4A to generate game scene information corresponding to the game editing scene. The game scene information may be stored in a first preset location, which may be a map file; the map file may store not only the game scene information but also other map information (including, but not limited to, a screenshot, a map name, a log, and the like). After the game scene information is saved, the map file may be uploaded to a server. After the server review passes, the game scene generated from the game scene information can be released to a preset map pool, so that a terminal device connected to the server can download the corresponding game scene information from the server, generate the corresponding game scene from the downloaded information through a game program, and then play in that game scene. In this way, game scene information can be released from the game editor and experienced by other players, realizing a rapid UGC (User Generated Content) function.
In one embodiment, after the motion information of a scene component is edited through the terminal device, edited motion path data may be generated and stored in a second preset location, which may be a scene component file (or a map file). After the motion path data is saved, the scene component file may be uploaded to a server. After the server review passes, the motion path corresponding to the motion path data can be released to a preset motion path pool, so that a terminal device connected to the server can download the corresponding motion path data from the server, generate the corresponding motion path from the downloaded data through a game program, and then set the motion path for one or more scene components in the game. In this way, motion path data can be released from the game editor and experienced by other players, realizing a rapid UGC function.
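A minimal sketch of this save-upload-review-release flow is shown below; the `upload` and `review_passes` callables stand in for server interactions whose actual protocol is not specified by the disclosure, and the JSON on-disk format is likewise an assumption:

```python
import json
from pathlib import Path
from typing import Callable, List

def save_motion_path(path_data: dict, component_file: Path) -> None:
    """Store edited motion path data in a scene component file."""
    component_file.write_text(json.dumps(path_data, indent=2))

def publish_motion_path(component_file: Path,
                        upload: Callable[[dict], None],
                        review_passes: Callable[[dict], bool],
                        motion_path_pool: List[dict]) -> bool:
    """Upload the saved data; once the server review passes, release it to
    the preset motion path pool for other clients to download."""
    data = json.loads(component_file.read_text())
    upload(data)              # stands in for the client-to-server transfer
    if review_passes(data):   # stands in for the server-side review
        motion_path_pool.append(data)
        return True
    return False
```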
Exemplary embodiments of the present disclosure also provide a motion editing processing apparatus in a game. Referring to fig. 11, the in-game motion editing processing apparatus 1100 may include the following program modules:
a game editing scene display processing module 1110 configured to display a game editing scene in a graphical user interface provided by running a game program; the game editing scene includes a scene component configured to generate a corresponding virtual model at the game running stage;
a first visualized object display processing module 1120 configured to display, in response to a preset operation on the motion path to be edited of the target scene component, a first visualized object corresponding to the target scene component at the position of a target waypoint of the motion path to be edited in the game editing scene, the state information of the first visualized object representing the state information of the target scene component moving to the target waypoint;
wherein the target scene component is one or more of the scene components of the game editing scene; the target scene component is provided with one or more motion paths, and the motion path to be edited is one of those motion paths; the target scene component is configured to move according to the motion paths at the game running stage; the motion path to be edited includes one or more waypoints, and the target waypoint is one of those waypoints.
In one embodiment, the preset operation includes a waypoint selection operation; displaying, in response to the preset operation on the motion path to be edited of the target scene component, a first visualized object corresponding to the target scene component at the position of the target waypoint of the motion path to be edited in the game editing scene includes:
determining, in response to the waypoint selection operation on the motion path to be edited, the target waypoint according to the waypoint selection operation, and displaying the first visualized object corresponding to the target scene component at the position of the target waypoint in the game editing scene.
In one embodiment, the first visualized object display processing module 1120 is further configured to:
displaying, before the target waypoint is determined in response to the waypoint selection operation on the motion path to be edited and the first visualized object corresponding to the target scene component is displayed at the position of the target waypoint in the game editing scene, an editing interface of the motion path to be edited in the graphical user interface in response to an editing trigger operation on the motion path to be edited; the editing interface of the motion path to be edited includes a waypoint operation control;
the waypoint selection operation is an operation performed through the waypoint operation control.
In one embodiment, the first visualized object display processing module 1120 is further configured to:
displaying, in response to the editing trigger operation on the motion path to be edited, a second visualized object at the position of each waypoint of the motion path to be edited in the game editing scene.
In one embodiment, the first visualized object display processing module 1120 is further configured to:
displaying, in response to the editing trigger operation on the motion path to be edited, a third visualized object representing the motion path to be edited in the game editing scene.
In one embodiment, displaying, in response to a preset operation on the motion path to be edited of the target scene component, a first visualized object corresponding to the target scene component at the position of the target waypoint of the motion path to be edited in the game editing scene, with the state information of the first visualized object representing the state information of the target scene component moving to the target waypoint, includes:
generating, in response to the preset operation on the motion path to be edited, a first visualized object identical to the target scene component;
displaying the first visualized object at the position of the target waypoint in the game editing scene based on the state information of the target scene component moving to the target waypoint.
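As an illustrative sketch (the `RenderState` structure and the translucent-ghost opacity are assumptions), the first visualized object can be a clone of the component's render state placed at the target waypoint:

```python
from dataclasses import dataclass, replace
from typing import Optional, Tuple

Vec3 = Tuple[float, float, float]

@dataclass(frozen=True)
class RenderState:
    # Hypothetical render state of a scene component (illustrative only).
    position: Vec3
    rotation: Vec3 = (0.0, 0.0, 0.0)
    opacity: float = 1.0

def make_first_visualized_object(component_state: RenderState,
                                 waypoint_position: Vec3,
                                 waypoint_direction: Optional[Vec3]) -> RenderState:
    """Clone the target scene component's state and place the clone at the
    target waypoint so it shows the state the component would have there."""
    return replace(
        component_state,
        position=waypoint_position,
        rotation=waypoint_direction or component_state.rotation,
        opacity=0.5,  # assumption: a translucent ghost marks it as a preview
    )
```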
In one embodiment, the apparatus 1100 may further include an edit processing module configured to:
in response to a motion path adding operation to the target scene component, adding a new motion path to the target scene component;
if the new motion path is of a move-along-waypoints motion mode, adding a first waypoint to the new motion path and setting the position of the first waypoint to the position of the target scene component plus a first preset offset parameter.
In one embodiment, the apparatus 1100 may further include an edit processing module configured to:
adding, in response to a waypoint adding operation on the motion path to be edited, a new waypoint to the motion path to be edited, and setting the position of the new waypoint to the position of a reference waypoint of the motion path to be edited plus a second preset offset parameter; the reference waypoint is one of the waypoints existing in the motion path to be edited before the new waypoint is added.
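The offset-based waypoint additions of the two preceding embodiments can be sketched as follows; the concrete offset values, and the choice of the last existing waypoint as the reference waypoint, are assumptions for illustration:

```python
from typing import List, Tuple

Vec3 = Tuple[float, float, float]

# Assumed values for the preset offset parameters (illustrative only).
FIRST_PRESET_OFFSET: Vec3 = (1.0, 0.0, 0.0)
SECOND_PRESET_OFFSET: Vec3 = (1.0, 0.0, 0.0)

def vec_add(a: Vec3, b: Vec3) -> Vec3:
    return (a[0] + b[0], a[1] + b[1], a[2] + b[2])

def add_first_waypoint(component_position: Vec3, new_path: List[Vec3]) -> None:
    """First waypoint of a new move-along-waypoints path: the component's
    position plus the first preset offset parameter."""
    new_path.append(vec_add(component_position, FIRST_PRESET_OFFSET))

def add_waypoint(path: List[Vec3]) -> None:
    """New waypoint: the reference waypoint's position (assumed here to be
    the last existing waypoint) plus the second preset offset parameter."""
    path.append(vec_add(path[-1], SECOND_PRESET_OFFSET))
```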
In one embodiment, the apparatus 1100 may further include an edit processing module configured to:
adding, in response to a copy operation on any waypoint of the motion path to be edited, a new waypoint at the position of that waypoint, and moving that waypoint and each waypoint after it, respectively, to the position of the next waypoint.
In one embodiment, the apparatus 1100 may further include an edit processing module configured to:
keeping, in response to an operation of changing the path type of the motion path to be edited, the positions of the waypoints of the motion path to be edited unchanged, and updating the motion path to be edited according to the positions of the waypoints and the changed path type.
In one embodiment, the apparatus 1100 may further include an edit processing module configured to:
keeping, in response to an operation of changing the path type of any waypoint of the motion path to be edited, the position of that waypoint unchanged, and updating the sub-path corresponding to that waypoint in the motion path to be edited according to the position of that waypoint and the changed path type.
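A minimal sketch of such a type change, with a hypothetical `PathBuilder` callable per path type, keeps the waypoint positions fixed and regenerates only the connecting geometry:

```python
from typing import Callable, List, Tuple

Vec3 = Tuple[float, float, float]
# A path builder turns fixed waypoint positions into path geometry.
PathBuilder = Callable[[List[Vec3]], List[Tuple[Vec3, Vec3]]]

def linear_builder(waypoints: List[Vec3]) -> List[Tuple[Vec3, Vec3]]:
    # Linear path type: straight segments between consecutive waypoints.
    return list(zip(waypoints, waypoints[1:]))

def change_path_type(waypoints: List[Vec3],
                     new_builder: PathBuilder) -> List[Tuple[Vec3, Vec3]]:
    """Change of path type: the waypoint positions stay unchanged; only the
    geometry connecting them is regenerated by the new type's builder
    (a curved type would re-fit a spline here; that part is engine-specific)."""
    return new_builder(waypoints)
```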
In one embodiment, the apparatus 1100 may further include an edit processing module configured to:
updating, in response to an operation of deleting any waypoint of the motion path to be edited, the sub-path between the waypoints adjacent to the deleted waypoint according to those adjacent waypoints.
In one embodiment, the apparatus 1100 may further include an edit processing module configured to:
moving, in response to a move operation on the target scene component, each waypoint in each motion path of the target scene component according to the move operation, so that the relative position of each waypoint with respect to the target scene component is unchanged.
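Sketched minimally, the move operation applies one and the same translation to the component and to every waypoint of every path, which preserves each waypoint's offset from the component:

```python
from typing import List, Tuple

Vec3 = Tuple[float, float, float]

def move_component_with_waypoints(component_position: Vec3,
                                  motion_paths: List[List[Vec3]],
                                  delta: Vec3) -> Vec3:
    """Translate the component and every waypoint of every motion path by the
    same delta, so each waypoint keeps its position relative to the component."""
    def shift(p: Vec3) -> Vec3:
        return (p[0] + delta[0], p[1] + delta[1], p[2] + delta[2])
    for path in motion_paths:
        path[:] = [shift(p) for p in path]
    return shift(component_position)
```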
In one embodiment, the apparatus 1100 may further include an edit processing module configured to:
recording, in a case where a direction option of the motion path to be edited is activated, direction information corresponding to each waypoint of the motion path to be edited, the direction information representing the direction of the target scene component when moving to each waypoint.
In one embodiment, the apparatus 1100 may further include an edit processing module configured to:
recording, if the motion path to be edited includes a non-smooth waypoint, abrupt motion direction change information corresponding to the non-smooth waypoint.
In one embodiment, the first visualized object display processing module 1120 is further configured to:
superimposing, before the first visualized object corresponding to the target scene component is displayed at the position of the target waypoint of the motion path to be edited in the game editing scene in response to the preset operation on the motion path to be edited of the target scene component, all the motion paths of the target scene component, and determining the state information of the target scene component moving to the target waypoint based on the superimposed motion paths.
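Since the disclosure does not fix how the superposition is computed, the sketch below shows one plausible reading in which each motion path contributes a displacement at the moment the component reaches the target waypoint, and the displayed state sums those displacements:

```python
from typing import List, Tuple

Vec3 = Tuple[float, float, float]

def superimpose_paths(initial_position: Vec3,
                      per_path_displacements: List[Vec3]) -> Vec3:
    """Combine the displacement each motion path contributes at the target
    waypoint; the summed result gives the state used to place the first
    visualized object."""
    x, y, z = initial_position
    for dx, dy, dz in per_path_displacements:
        x, y, z = x + dx, y + dy, z + dz
    return (x, y, z)
```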
The specific details of each part of the above apparatus have been described in the method embodiments; for details not disclosed here, reference may be made to the method embodiments, and they are not repeated.
Exemplary embodiments of the present disclosure also provide a computer-readable storage medium, which may be implemented in the form of a program product comprising program code; when the program product runs on an electronic device, the program code causes the electronic device to carry out the steps of the various exemplary embodiments of the disclosure described in the "exemplary method" section above. In an alternative embodiment, the program product may be implemented as a portable compact disc read-only memory (CD-ROM) including program code, and may run on an electronic device such as a personal computer. However, the program product of the present disclosure is not limited thereto; in this document, a readable storage medium may be any tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. The readable storage medium can be, for example, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include the following: an electrical connection having one or more wires, a portable disk, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disk read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
The computer readable signal medium may include a data signal propagated in baseband or as part of a carrier wave with readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A readable signal medium may also be any readable medium that is not a readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Program code for carrying out operations of the present disclosure may be written in any combination of one or more programming languages, including object-oriented programming languages such as Java and C++, as well as conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server. In the case of a remote computing device, the remote computing device may be connected to the user's computing device through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computing device (e.g., through the Internet using an Internet service provider).
The exemplary embodiments of the present disclosure also provide an electronic device, which may be the terminal device described above. The electronic device may include a processor and a memory. The memory stores executable instructions of the processor, such as program code. The processor performs the method of the present exemplary embodiment by executing the executable instructions. The electronic device may further comprise a display for displaying the graphical user interface.
An electronic device is illustrated in the form of a general purpose computing device with reference to fig. 12. It should be understood that the electronic device 1200 shown in fig. 12 is merely an example and should not be construed as limiting the functionality and scope of use of embodiments of the present disclosure.
As shown in fig. 12, an electronic device 1200 may include: processor 1210, memory 1220, bus 1230, I/O (input/output) interface 1240, network adapter 1250, display 1260.
Memory 1220 may include volatile memory, such as a RAM 1221 and a cache unit 1222, and nonvolatile memory, such as a ROM 1223. Memory 1220 may also include one or more program modules 1224, such program modules 1224 including, but not limited to: an operating system, one or more application programs, other program modules, and program data, each or some combination of which may include an implementation of a network environment. Program modules 1224 may include, for example, the modules of the apparatus described above.
Processor 1210 may include one or more processing units, for example, an AP (Application Processor), a modem processor, a GPU (Graphics Processing Unit), an ISP (Image Signal Processor), a controller, an encoder, a decoder, a DSP (Digital Signal Processor), a baseband processor, and/or an NPU (Neural-Network Processing Unit), and the like.
Processor 1210 may be configured to execute the executable instructions stored in memory 1220, for example, to perform any one or more of the method steps of the present exemplary embodiment. Illustratively, the processor 1210 may perform the method shown in fig. 2.
Bus 1230 is used to enable connections between the different components of electronic device 1200 and may include a data bus, an address bus, and a control bus.
The electronic device 1200 may communicate with one or more external devices 1300 (e.g., keyboard, mouse, external controller, etc.) via the I/O interface 1240.
The electronic device 1200 may communicate with one or more networks through the network adapter 1250, e.g., the network adapter 1250 may provide a mobile communication solution such as 3G/4G/5G, or a wireless communication solution such as wireless local area network, bluetooth, near field communication, etc. The network adapter 1250 may communicate with other modules of the electronic device 1200 via the bus 1230.
The electronic device 1200 may display a graphical user interface, such as displaying a game editing scene, an associated editing interface or settings interface, etc., via the display 1260.
Although not shown in fig. 12, other hardware and/or software modules may also be provided in the electronic device 1200, including but not limited to: microcode, device drivers, redundant processors, external disk drive arrays, RAID systems, tape drives, data backup storage systems, and the like.
It should be noted that although in the above detailed description several modules or units of a device for action execution are mentioned, such a division is not mandatory. Indeed, the features and functionality of two or more modules or units described above may be embodied in one module or unit in accordance with exemplary embodiments of the present disclosure. Conversely, the features and functions of one module or unit described above may be further divided into a plurality of modules or units to be embodied.
Those skilled in the art will appreciate that the various aspects of the present disclosure may be implemented as a system, method, or program product. Accordingly, various aspects of the disclosure may be embodied in the following forms: an entirely hardware embodiment, an entirely software embodiment (including firmware, micro-code, etc.), or an embodiment combining hardware and software aspects, which may be referred to herein as a "circuit," "module," or "system." Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice in the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It is to be understood that the present disclosure is not limited to the precise arrangements and instrumentalities shown in the drawings, and that various modifications and changes may be effected without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.