Detailed Description
To make the objects, technical solutions and advantages of the present application more clear, embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
First, the related terms referred to in the present application will be explained.
1. Virtual environment
A virtual environment is a scene displayed (or provided) by a client of an application program (e.g., a game application program) when running on a terminal, and refers to a scene created for a virtual object to perform an activity (e.g., a game competition), such as a virtual house, a virtual island, a virtual map, and the like. The virtual environment may be a simulation scene of a real world, a semi-simulation semi-fictional scene, or a pure fictional scene. The virtual environment may be a two-dimensional virtual environment, a 2.5-dimensional virtual environment, or a three-dimensional virtual environment, which is not limited in this embodiment of the present application.
2. Virtual object
The virtual object refers to a virtual role controlled by the user account in the application program. Taking an application as a game application as an example, the virtual object refers to a game character controlled by a user account in the game application. The virtual object may be in the form of a character, an animal, a cartoon or other forms, which is not limited in this application. The virtual object may be displayed in a three-dimensional form or a two-dimensional form, which is not limited in the embodiment of the present application.
The operations that a user account can perform to control a virtual object may also vary from game application to game application. For example, in a shooting-type game application, the user account may control the virtual object to perform shooting, running, jumping, picking up a firearm, replacing a firearm, adding bullets to a firearm, and the like.
Of course, in addition to game applications, other types of applications may present virtual objects to a user and provide corresponding functionality to the virtual objects. For example, an AR (Augmented Reality) application, a social application, an interactive entertainment application, and the like, which are not limited in this embodiment. In addition, for different applications, the forms of the virtual objects provided by the applications may also be different, and the corresponding functions may also be different, which may be configured in advance according to actual requirements, and this is not limited in the embodiments of the present application.
Referring to fig. 1, a schematic diagram of an implementation environment provided by an embodiment of the application is shown. The implementation environment may include: a terminal 10 and a server 20.
The terminal 10 may be a device such as a mobile phone, a PC (Personal Computer), a tablet PC, an e-book reader, an electronic game machine, an MP4 (Moving Picture Experts Group Audio Layer IV) player, and the like.
The terminal 10 may have a client of a game application installed therein, such as a client of a shooting-type game application, where the shooting-type game application may be any one of an FPS (First Person Shooting) game application, a TPS (Third Person Shooting) game application, an MOBA (Multiplayer Online Battle Arena) game application, a multiplayer gun-battle survival game application, and the like. Alternatively, the game application may be a stand-alone application, such as a stand-alone 3D game application; or may be a web-enabled version of the application.
The server 20 is used to provide background services for clients of applications (e.g., game applications) in the terminal 10. For example, the server 20 may be a backend server for the above-described applications (e.g., game applications). The server 20 may be one server, a server cluster composed of a plurality of servers, or a cloud computing service center.
The terminal 10 and the server 20 can communicate with each other through the network 30. The network 30 may be a wired network or a wireless network.
In the embodiment of the method, the execution subject of each step may be a terminal. Please refer to fig. 2, which illustrates a schematic structural diagram of a terminal according to an embodiment of the present application. The terminal 10 may include: a main board 110, an external input/output device 120, a memory 130, an external interface 140, a touch system 150, and a power supply 160.
The main board 110 has integrated therein processing elements such as a processor and a controller.
The external input/output device 120 may include a display component (e.g., a display screen), a sound playing component (e.g., a speaker), a sound collecting component (e.g., a microphone), various keys, and the like.
The memory 130 has program codes and data stored therein.
The external interface 140 may include a headset interface, a charging interface, a data interface, and the like.
The touch system 150 may be integrated into a display component or a key of the external input/output device 120, and the touch system 150 is used to detect a touch operation performed by a user on the display component or the key.
The power supply 160 is used to power various other components in the terminal 10.
In this embodiment, the processor in the main board 110 may generate a user interface (e.g., a game interface) by executing or calling the program codes and data stored in the memory 130, and present the generated user interface (e.g., the game interface) through the external input/output device 120. In the process of presenting the user interface (e.g., the game interface), a touch operation performed when the user interacts with the user interface (e.g., the game interface) may be detected by the touch system 150 and responded to.
Referring to fig. 3, a flowchart of a display method of a virtual article according to an embodiment of the present application is shown. The method may be applied in the terminal described above, such as in a client of an application of the terminal (e.g. a shooting-type game application). The method may include the steps of:
step 301, displaying a user interface.
The user may run a client of an application installed in the terminal, and the client may display the user interface. The user interface includes a virtual environment screen, which is a display screen corresponding to the virtual environment, and the virtual environment includes a virtual item, and optionally also includes a virtual object holding the virtual item. In addition, elements such as virtual buildings, virtual weapons, virtual props, etc. can also be included in the virtual environment. Details regarding the virtual environment and the virtual objects are described above and will not be repeated here.
In the embodiment of the present application, the virtual article may be a throwing-type article, that is, the virtual article is thrown in the virtual environment to trigger a target function. For example, the virtual article may be a throwing-type article such as a virtual grenade, a viscous grenade, a virtual smoke bomb, a virtual flash bomb, and the like.
Optionally, the virtual articles may be divided into multiple types, with different types of virtual articles being used to trigger different target functions. Illustratively, the virtual articles may include combat-type articles and tactical-type articles. A combat-type article is thrown to cause damage to virtual objects, such as a virtual grenade or a viscous grenade (a viscous grenade has an adhesive effect and adheres to the first virtual object it touches); when the function trigger duration is reached after the article is thrown, the virtual objects within the blast range take damage. A tactical-type article is thrown to cause interference to virtual objects, such as a virtual smoke bomb or a virtual flash bomb; when the function trigger duration is reached after the article is thrown, an interference effect (e.g., a smoke diffusion effect) is triggered, so that the affected virtual objects cannot see the virtual environment clearly.
Illustratively, as shown in fig. 4, a schematic diagram of a combat-type item is illustratively shown. The combat-type items may include a laser tripper 41, a combat hatchet 42, a fragment grenade 43, and a viscous grenade 44.
Illustratively, as shown in fig. 5, a schematic diagram of a tactical type of item is illustrated. The tactical items may include a smoke bomb 51, a flash bomb 52, an explosion prevention device 53, a cold bomb 54, a shock bomb 55 and an electromagnetic pulse 56.
Optionally, each virtual object may be equipped with at least one virtual item (e.g., 2); further, optionally, each virtual object may be equipped with different types of virtual items, at most one of each type of virtual item.
Illustratively, as shown in fig. 6, the virtual object may be equipped with 2 types of virtual items, such as combat-type items and tactical-type items, and only one virtual item of each type can be equipped, such as the fragment grenade 43 among the combat-type items and the smoke bomb 51 among the tactical-type items.
Optionally, some operation controls are also included in the user interface, and the operation controls are controls for the user to operate, and may include, for example, buttons, sliders, icons, and the like.
Optionally, the user interface includes a first view layer and a second view layer. A view layer is a layer for displaying interface content. The display level of the first view layer is higher than that of the second view layer, that is, the first view layer is located on the upper layer of the second view layer. The first view layer can be used for displaying operation controls for human-computer interaction of the user, and the second view layer can be used for displaying the virtual environment picture. Because the display level of the first view layer is higher than that of the second view layer, the operation controls are displayed on the upper layer of the virtual environment picture, so that the operation controls can be guaranteed to respond to the touch operations of the user. It should be noted that although the first view layer is located on the upper layer of the second view layer, the display of the content in the second view layer may not be blocked; for example, a part or all of the operation controls in the first view layer may be displayed in a semi-transparent state. As shown in fig. 7, the user interface includes a first view layer and a second view layer, the display content in the first view layer includes operation controls such as a virtual joystick 71, an attack button 72, a posture control button 73, a throwing object switching button 74, and a map thumbnail 75, and the display content in the second view layer includes a virtual environment screen 76 including a three-dimensional virtual environment and some people or objects in the virtual environment.
Step 302, in response to the virtual item being in a pre-cast state, playing a pre-cast animation.
When detecting that a virtual item held by the virtual object is in a pre-cast state, the client can play a pre-cast animation of the virtual item.
The pre-throwing state is a state before the virtual article held by the virtual object is thrown. When the client receives a pre-throwing instruction for the virtual article, the client controls the virtual article to enter the pre-throwing state and starts to play the pre-throwing animation. The pre-throwing instruction is an operation instruction for controlling the virtual article to enter the pre-throwing state; for example, the pre-throwing instruction can be triggered by a pressing operation signal acting on a throwing control in the user interface. The pre-throwing animation is used to show the operation of the virtual article in the pre-throwing state. For some virtual articles, the pull ring of the virtual article needs to be pulled manually, which is also called a bolt-pulling (pin-pulling) process; in shooting-type games, the bolt-pulling process is displayed by playing the pre-throwing animation.
Illustratively, as shown in FIG. 7, a schematic diagram of a pre-cast animation is illustratively shown. The virtual object holds a virtual article such as a fragment grenade 43 in its right hand, and completes the bolt-pulling process of the fragment grenade 43 by pulling the ring with its left hand.
It should be noted that the contents of the pre-cast animation may be different for different virtual articles, and the embodiment of the present application is not limited to this.
Step 303, in response to the playing of the pre-cast animation being finished, displaying the cast line.
When the playing of the pre-cast animation is finished, a cast line can be displayed in the user interface. The throwing line is used for displaying the motion trail of the thrown virtual article, and the motion trail refers to the path along which the client controls the virtual article to move in the virtual environment. By displaying the throwing line in the user interface before the virtual article is actually thrown, the user can conveniently preview the motion trail and the landing point of the virtual article. The user can determine, according to the throwing line, whether the motion trail and the landing point accord with the user's expectation; if so, the throwing of the virtual article can be triggered, and if not, the throwing line can be adjusted through an operation, thereby adjusting the motion trail and the landing point of the virtual article.
In the related art, the throwing line is displayed as soon as the user clicks the throwing control, at which time the operation before the virtual item is thrown has not been completed (i.e., the pre-throw animation has not yet finished playing). Taking the pre-throwing animation being a bolt-pulling animation as an example, the related art displays the throwing line immediately when the bolt-pulling animation starts to play. However, according to real-world logic, the user can throw the virtual article based on the throwing line only after completing the pre-throw operation (such as pulling the pin before throwing a grenade). Therefore, displaying the throwing line after the playing of the pre-cast animation is finished is more in accordance with real-world logic than the related art.
Optionally, the starting point of the throwing line is the position at which the virtual article leaves the hand of the virtual object when the virtual object throws the virtual article, and the end point of the throwing line is the landing point after the virtual article is thrown out. Optionally, identification information may be displayed at the end point of the throwing line to indicate the landing point position of the virtual item, thereby allowing the user to better aim the throw.
Illustratively, as shown in FIG. 8, a schematic diagram of a throwing line display is illustratively shown. After the pre-cast animation of the virtual item is finished playing, a throwing line 81 may be displayed in the user interface.
Optionally, the above-mentioned throwing line displaying method may include the steps of: acquiring throwing line parameters; and displaying the throwing line according to the throwing line parameters.
Wherein, the throwing line parameters are used for determining the trajectory of the throwing line; the throwing line parameters may include at least one of: a throwing starting point, a throwing direction, a throwing initial speed and a throwing acceleration.
Wherein, the throwing starting point is the starting point of the throwing line. The throwing direction refers to the direction in which the virtual article is thrown, and may include a throwing direction in the horizontal direction and a throwing direction in the gravity direction; the throwing direction in the horizontal direction corresponds to the facing direction of the virtual object in the virtual environment, and the throwing direction in the gravity direction corresponds to the throwing height of the virtual article. The initial throwing speed refers to the speed at which the virtual article is thrown from the throwing starting point, and may include an initial throwing speed in the horizontal direction or an initial throwing speed in the gravity direction. The throwing acceleration is the acceleration at the time when the virtual article is thrown from the throwing starting point, and may include a throwing acceleration in the horizontal direction or a throwing acceleration in the gravity direction (i.e., the gravitational acceleration).
Using the throwing line parameters, a parabola can be calculated according to the relevant physical formulas. The parabola is calculated frame by frame, and a point is sampled at intervals in each frame, so that a number of waypoints are finally obtained. The waypoints are then passed to a special-effect line, and the special-effect line forms a parabola from the obtained position information, namely the throwing line.
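The waypoint-sampling process described above can be sketched as follows. This is a minimal illustration assuming a simple projectile model in which gravity acts only on the height axis; the function name, parameter names, and numeric values are assumptions for illustration, not part of the original disclosure.

```python
def sample_throwing_line(start, direction, speed, gravity=-9.8,
                         interval=0.05, max_points=60, ground_y=0.0):
    """Sample waypoints along the parabola defined by the throwing
    start point, throwing direction, initial speed and gravity."""
    x0, y0, z0 = start
    dx, dy, dz = direction
    # Velocity components: throwing direction scaled by the initial speed.
    vx, vy, vz = dx * speed, dy * speed, dz * speed
    waypoints = []
    for i in range(max_points):
        t = i * interval
        x = x0 + vx * t
        y = y0 + vy * t + 0.5 * gravity * t * t  # gravity acts on height only
        z = z0 + vz * t
        waypoints.append((x, y, z))
        if i > 0 and y <= ground_y:  # stop once the article reaches the ground
            break
    return waypoints

# The first waypoint is the throwing start point; the last approximates
# the landing point where an end-point marker could be displayed.
line = sample_throwing_line(start=(0.0, 1.5, 0.0),
                            direction=(0.8, 0.6, 0.0), speed=12.0)
```

In a real client the resulting waypoint list would be handed to the special-effect line component, which connects the points to render the throwing line.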
Optionally, the throw line parameters may be adjusted by control of an interface control. For example, the user interface may include a throwing control of a virtual article, and the throwing direction may be adjusted by performing a long-press operation on the throwing control and sliding the throwing control in different directions (e.g., up, down, left, and right).
Optionally, before the throwing line is displayed, the method further includes: calling an animation state machine corresponding to the pre-throwing animation to obtain the animation duration of the pre-throwing animation; and determining, through the animation state machine corresponding to the pre-throwing animation, whether the playing of the pre-throwing animation is finished.
The animation state machine corresponding to the pre-throwing animation is used for managing the pre-throwing animation, including basic information (such as animation duration) of the pre-throwing animation, and playing control and other related processing of the pre-throwing animation. The animation duration of the pre-throwing animation refers to the total playing duration of the pre-throwing animation.
Through the animation state machine corresponding to the pre-cast animation, the client can know the animation duration of the pre-cast animation and monitor the playing process of the pre-cast animation so as to judge whether the playing of the pre-cast animation is finished.
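A minimal sketch of the animation state machine described above, assuming it simply tracks elapsed playing time against the animation duration; the class and method names are illustrative assumptions, not taken from the disclosure.

```python
class AnimationStateMachine:
    """Manages one animation: exposes its total duration and tracks
    playback so the client can tell when playing has finished."""

    def __init__(self, duration):
        self.duration = duration  # total animation duration, in seconds
        self.elapsed = 0.0        # playing time so far

    def tick(self, dt):
        """Advance playback by one frame of dt seconds."""
        self.elapsed = min(self.elapsed + dt, self.duration)

    def finished(self):
        """True once the animation has played to completion."""
        return self.elapsed >= self.duration

# The client queries the duration, then polls every frame; the throwing
# line is displayed only once the pre-throw animation has finished.
pre_cast = AnimationStateMachine(duration=0.5)
for _ in range(20):          # simulate 20 frames at roughly 30 fps
    pre_cast.tick(1 / 30)
show_throwing_line = pre_cast.finished()
```

Monitoring via a per-frame tick keeps the check independent of any particular game engine's animation API.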
And step 304, responding to the virtual article switched from the pre-throwing state to the throwing state, and playing the throwing animation.
When the client detects that the virtual item is switched from the pre-cast state to the cast state, a cast animation of the virtual item may be played. When the client receives a throwing instruction for the virtual article, the client controls the virtual article to switch from the pre-throwing state to the throwing state, and starts playing the throwing animation. The throwing instruction is an operation instruction for controlling the virtual article to enter the throwing state; for example, the throwing instruction can be triggered by cancelling the pressing operation signal acting on the throwing control.
The throwing state refers to a state in which the virtual article is thrown, and the throwing animation is used to show the process in which the virtual article is thrown. A throwing-type virtual article needs to be thrown to trigger its target function, such as an attack function of exploding, releasing smoke, and the like. The throwing animation can show the action process in which the virtual object swings its arm to throw out the virtual article and withdraws the arm after the throw.
It should be noted that, for different virtual articles, the corresponding throwing animations may be the same or different, and this is not limited in the embodiment of the present application.
Optionally, the user interface further includes a throwing control for controlling the virtual object to throw the virtual item. In this case, the client controls the virtual article to enter a pre-cast state in response to receiving the operation signal corresponding to the cast control; and controlling the virtual article to be switched from the pre-throwing state to the throwing state in response to the detection of the disappearance of the operation signal.
That is, when the throwing control is included in the user interface, the user may operate the throwing control to trigger the virtual article to enter a pre-throwing state; correspondingly, when the client receives an operation signal corresponding to the throwing control, the client controls the virtual article to enter a pre-throwing state. When the user releases the hand and stops operating the throwing control, the virtual article is controlled to be switched from the pre-throwing state to the throwing state; correspondingly, when the client detects that the operation signal disappears, the client controls the virtual article to be switched from the pre-throwing state to the throwing state.
The operation signal can be generated by clicking the throwing control. For example, for a mobile phone end configured with a touch screen, the user clicks the throwing control with a finger to generate the operation signal. For another example, for a PC end, the user may click the throwing control through a mouse to generate the operation signal; alternatively, the user generates the operation signal by pressing a key (e.g., the R key) associated with the throwing control. It should be noted that the key may be set by the user according to personal habits.
It should be noted that the operation on the throwing control may be a long-press operation, a single-click operation, a press operation, a slide operation, or the like, which is not limited in this embodiment of the present application.
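The control logic described above, entering the pre-throwing state when the operation signal on the throwing control is received and switching to the throwing state when that signal disappears, can be sketched as a small state machine. The state names and method names are assumptions for illustration only.

```python
IDLE, PRE_THROW, THROWN = "idle", "pre_throw", "thrown"

class ThrowableArticle:
    """Tracks the throwing state of one virtual article."""

    def __init__(self):
        self.state = IDLE

    def on_press(self):
        """Operation signal received on the throwing control:
        enter the pre-throwing state (the pre-throw animation starts)."""
        if self.state == IDLE:
            self.state = PRE_THROW

    def on_release(self):
        """Operation signal disappears (the user lets go):
        switch to the throwing state (the throwing animation starts)."""
        if self.state == PRE_THROW:
            self.state = THROWN

grenade = ThrowableArticle()
grenade.on_press()    # long-press the throwing control
grenade.on_release()  # release to throw
```

Guarding each transition on the current state means stray signals (e.g., a release with no preceding press) are simply ignored.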
Additionally, in some other embodiments, a cancel toss control may also be included in the user interface for the user to cancel tossing the virtual item.
Step 305, responsive to the play duration of the throwing animation reaching the first duration, canceling the display of the throwing line.
The client can monitor the playing process of the throwing animation, and cancel the display of the throwing line when the playing time of the throwing animation reaches the first time.
In the related art, when the user releases the throwing control, that is, when the client receives the throwing instruction, the throwing line disappears immediately; in other words, the throwing line has already disappeared before the throwing animation starts to play. However, according to real-world logic, a throwing motion (such as a swing-arm motion) needs to be completed before the article is actually thrown, so the disappearance of the throwing line should be delayed until the throwing motion is completed. Therefore, making the throwing line disappear only after the throwing animation has been played for a certain duration is more realistic and more in accordance with real-world logic than the related art.
Illustratively, as shown in FIG. 9, a schematic diagram of the disappearance of a throwing line is illustratively shown. After the playing duration of the throwing animation reaches a certain duration, the displayed throwing line 81 is cancelled in the user interface.
The game designer can set a throwing-line cancellation moment: when the playing duration of the throwing animation reaches the first duration, the throwing line displayed in the user interface disappears. For example, assuming that the animation duration of the throwing animation is 1 s and the first duration set by the game designer is 0.2 s, the throwing line displayed in the user interface disappears when the throwing animation has been played to 0.2 s.
In summary, according to the technical scheme provided by the embodiment of the application, the throwing line is displayed after the playing of the pre-throwing animation is finished, and the throwing line is cancelled when the playing duration of the throwing animation reaches the first duration. This differs from the related art, in which the throwing line is displayed as soon as the player clicks the throwing control and disappears as soon as the player releases it. By optimally configuring the appearance and disappearance of the throwing line, the technical scheme provided by the embodiment of the application makes the appearance and disappearance of the throwing line more in line with real-world logic, and improves the accuracy of the appearance and disappearance of the throwing line.
In an alternative embodiment provided based on the above-mentioned embodiment of fig. 3, after the step 304 (in response to the virtual article being switched from the pre-cast state to the cast state, playing the cast animation), the following step may be further performed: in response to the playing duration of the throwing animation reaching a second duration, displaying a throwing special effect.
In the process of playing the throwing animation, the client can monitor the playing process of the throwing animation, and display the throwing special effect when it is monitored that the playing duration of the throwing animation reaches the second duration. The throwing special effect refers to a special effect of the movement of the virtual article after it is thrown out. Optionally, the second duration is greater than or equal to the first duration; for example, the first duration is 0.2 seconds, and the second duration is 0.2 seconds or 0.3 seconds. In this way, the throwing line disappears when the virtual article leaves the hand of the virtual object (is thrown by the virtual object), and after the throwing line disappears, the throwing special effect starts to be displayed to show the motion process of the virtual article after being thrown, which is more in line with real-world logic and more realistic.
Illustratively, as shown in fig. 10, a schematic diagram of a throw special effects display is illustratively shown. When the playing duration of the throwing animation reaches the second duration, a throwing special effect 101, namely a special effect of the movement of the virtual article after throwing, is displayed in the user interface. The second duration may also be set by the game designer; for example, the game designer may set the second duration based on the animation duration of the throwing animation and the first duration, such that the second duration is greater than or equal to the first duration and less than the animation duration of the throwing animation. For example, the animation duration of the throwing animation is 1 second, the first duration is set to 0.2 s, and the second duration is set to 0.2 s or 0.3 s.
Optionally, before displaying the special throwing effect, the method may further include the following steps: acquiring animation duration and a zooming value of the throwing animation; and determining the second time length according to the animation time length and the zooming value of the throwing animation.
The client can call an animation state machine corresponding to the throwing animation; the animation state machine corresponding to the throwing animation is used for managing the throwing animation, including basic information (such as animation duration) of the throwing animation, playing control, and other related processing of the throwing animation. The animation duration of the throwing animation refers to the total playing duration of the throwing animation. The zoom value, which may also be referred to as a scale value, is used for setting the display time of the throwing special effect, and represents the ratio of the second duration to the animation duration. Optionally, the scaling value is a value greater than 0 and less than 1. For example, the second duration is the product of the animation duration of the throwing animation and the scaling value. The display time of the throwing special effect can thus be determined according to the animation duration of the throwing animation and the zoom value, where the zoom value can be set by the game designer. For example, assuming that the animation duration of the throwing animation is 2 s in total and the zoom value is 0.5, the throwing special effect may be displayed when the throwing animation is played to 2 × 0.5 = 1 s. In this way, no matter how long the animation duration of the throwing animation is, the display time scales with the animation duration according to the scaling value, and the configuration does not need to be modified when the animation duration changes. It should be noted that the first duration can also be determined by configuring a scaling value in a similar manner. In order for the second duration to be greater than or equal to the first duration, the scaling value used to determine the second duration should be greater than or equal to the scaling value used to determine the first duration.
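The arithmetic above, second duration equals animation duration multiplied by the scaling value, can be illustrated with a small helper; the function name is a hypothetical one introduced only for this sketch.

```python
def special_effect_time(animation_duration, scale):
    """Second duration = animation duration x scaling value,
    where the scaling value lies strictly between 0 and 1."""
    if not 0.0 < scale < 1.0:
        raise ValueError("scaling value must be greater than 0 and less than 1")
    return animation_duration * scale

# A 2 s throwing animation with a zoom value of 0.5: the special effect
# is displayed when playback reaches 1 s, matching the example above.
second_duration = special_effect_time(2.0, 0.5)
```

Because the result scales with the animation duration, swapping in a longer or shorter throwing animation needs no change to the configured zoom value.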
To sum up, the technical scheme provided by the embodiment of the application displays the throwing special effect when the playing time of the throwing animation reaches the second time, so that the appearance of the throwing special effect is more consistent with the reality logic, and the accuracy of the appearance of the throwing special effect is improved.
In an optional embodiment provided based on the embodiment of fig. 3, the user interface further includes n article slot positions and a type switching control, the virtual articles assembled in the n article slot positions belong to different types, and n is a positive integer.
The article slot position is used for being equipped with a virtual article.
In this case, the method for controlling display of a virtual article may further include:
in response to receiving a selection signal corresponding to a target article slot position of the n article slot positions, controlling the virtual object to use a first virtual article assembled in the target article slot position;
and in response to receiving a trigger signal corresponding to the type switching control, controlling the virtual object to switch the held first virtual article to the second virtual article.
The first virtual article and the second virtual article are two different types of virtual articles.
The virtual object in the user interface can hold a virtual weapon, and when a user selects to use a target article slot position in the n article slot positions, the client can receive a selection signal corresponding to the target article slot position and control the virtual object to use a first virtual article assembled in the target article slot position; then, when the user wants to use another type of virtual article, the user can touch the type switching control to trigger and generate the trigger signal corresponding to the type switching control; correspondingly, the client controls the virtual object to switch the held first virtual article into the second virtual article when receiving the trigger signal corresponding to the type switching control. The second virtual item is a different type of virtual item than the first virtual item.
Illustratively, as shown in FIG. 11, a schematic diagram of a user interface is illustrated. The user interface 110 can include an article slot 111 therein, and the article slot 111 can be equipped with a combat-type item, such as a fragment grenade 43. The user can click the article slot 111 to control the virtual object to use the fragment grenade 43; thereafter, the user may click on the type switching control 112 to control the virtual object to switch the fragment grenade 43 in the held combat-type items to a smoke bomb in the tactical-type items.
To sum up, according to the technical scheme provided by the embodiment of the application, the user can quickly switch between different types of virtual articles through the type switching control in the user interface, so that the switching efficiency of the virtual articles is improved, and the user experience is improved.
Referring to FIG. 12, a flowchart of a display method of a virtual article according to an embodiment of the present application is exemplarily shown. The method may be applied in the terminal described above, such as in a client of an application program of the terminal (e.g., a shooting-type game application program). The method may include the following steps:
step 1201, controlling the virtual object to equip the virtual article.
Step 1202, detecting whether a touch operation corresponding to the article slot position is received.
Step 1203, in response to receiving the touch operation corresponding to the article slot, controlling the virtual object to use the virtual article.
Illustratively, as shown in FIG. 13, a schematic diagram of another user interface is illustrated. The user may click the article slot position 111, and correspondingly, the client controls the virtual object to use the virtual article, such as the fragment grenade 43, in response to receiving the touch operation corresponding to the article slot position 111.
Step 1204, detecting whether an operation signal corresponding to the throwing control is received.
Step 1205, in response to receiving the operation signal corresponding to the throwing control, playing the pre-throwing animation.
Step 1206, detecting whether the playing of the pre-throwing animation is finished.
Step 1207, in response to the end of the playing of the pre-throwing animation, displaying the throwing line.
Step 1208, detecting whether the operation signal disappears.
Step 1209, in response to detecting that the operation signal disappears, playing the throwing animation.
Step 1210, detecting whether the playing duration of the throwing animation reaches a first duration.
Step 1211, in response to the playing duration of the throwing animation reaching the first duration, canceling the display of the throwing line.
Step 1212, detecting whether the playing duration of the throwing animation reaches a second duration.
Step 1213, in response to the playing duration of the throwing animation reaching the second duration, displaying the throwing special effect.
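The flow of steps 1201 to 1213 can be sketched as a small event-driven controller on the client. The following Python sketch is illustrative only; the class, the state names, and the concrete first/second duration values are assumptions, not values given by the embodiment:

```python
from enum import Enum, auto


class ThrowState(Enum):
    IDLE = auto()       # article equipped, not yet in use
    PRE_THROW = auto()  # pre-throwing animation / throwing line phase
    THROWING = auto()   # throwing animation playing


class ThrowController:
    """Illustrative sketch of steps 1201-1213; all names and the
    duration constants are hypothetical."""

    FIRST_DURATION = 0.3   # seconds: when the throwing line is hidden
    SECOND_DURATION = 0.6  # seconds: when the throwing effect is shown

    def __init__(self):
        self.state = ThrowState.IDLE
        self.throw_line_visible = False
        self.effect_shown = False
        self.anim_time = 0.0

    def on_throw_control_pressed(self):
        # Steps 1204-1205: operation signal received, play pre-throw animation.
        self.state = ThrowState.PRE_THROW

    def on_pre_throw_animation_end(self):
        # Steps 1206-1207: show the throwing line when the animation ends.
        self.throw_line_visible = True

    def on_throw_control_released(self):
        # Steps 1208-1209: operation signal disappears, play throw animation.
        self.state = ThrowState.THROWING
        self.anim_time = 0.0

    def tick(self, dt):
        # Steps 1210-1213: hide the line at the first duration and
        # show the throwing special effect at the second duration.
        if self.state is ThrowState.THROWING:
            self.anim_time += dt
            if self.anim_time >= self.FIRST_DURATION:
                self.throw_line_visible = False
            if self.anim_time >= self.SECOND_DURATION:
                self.effect_shown = True
```

Note that, as in the described scheme, the line only appears once the pre-throwing animation has finished, and it disappears on a timer keyed to the throwing animation rather than on the release of the control itself.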
The following are embodiments of the apparatus of the present application that may be used to perform embodiments of the method of the present application. For details which are not disclosed in the embodiments of the apparatus of the present application, reference is made to the embodiments of the method of the present application.
Referring to FIG. 14, a block diagram of a display apparatus of a virtual article according to an embodiment of the present application is shown. The apparatus has the function of implementing the above example of the display method of a virtual article, and the function may be implemented by hardware, or by hardware executing corresponding software. The apparatus may be the terminal described above, or may be provided on the terminal. The apparatus 1400 may include: an interface display module 1401, a first playing module 1402, a throwing line display module 1403, a second playing module 1404, and a cancellation display module 1405.
The interface display module 1401 is configured to display a user interface, where the user interface includes a display screen corresponding to a virtual environment, and the virtual environment includes a virtual article.
The first playing module 1402 is configured to play a pre-throwing animation in response to the virtual article being in a pre-throwing state.
The throwing line display module 1403 is configured to display a throwing line in response to the end of the playing of the pre-throwing animation, where the throwing line is used to indicate a motion trajectory of the virtual article after being thrown.
The second playing module 1404 is configured to play a throwing animation in response to the virtual article being switched from the pre-throwing state to a throwing state.
The cancellation display module 1405 is configured to cancel the display of the throwing line in response to the playing duration of the throwing animation reaching a first duration.
In summary, according to the technical scheme provided by the embodiment of the present application, the throwing line is displayed after the playing of the pre-throwing animation is finished, and the display of the throwing line is canceled when the playing duration of the throwing animation reaches the first duration. In the related art, by contrast, the throwing line is displayed as soon as the player clicks the throwing control, and disappears as soon as the player releases the hand to stop touching the throwing control. The technical scheme provided by the embodiment of the present application optimizes the timing at which the throwing line appears and disappears, so that the appearance and disappearance of the throwing line better conform to practical logic, and the accuracy of the appearance and disappearance of the throwing line is improved.
In some possible designs, the throwing line display module 1403 is configured to obtain throwing line parameters for determining a trajectory of the throwing line, and to display the throwing line according to the throwing line parameters.
In some possible designs, the throwing line parameters include at least one of: a throwing starting point, a throwing direction, an initial throwing speed, and a throwing acceleration.
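A throwing line determined by these four parameters is an ordinary constant-acceleration arc. The following Python sketch shows one plausible way to sample such a trajectory; the function name, the 2D simplification, and the sampling step are illustrative assumptions, not details given by the embodiment:

```python
def sample_throw_line(start, direction, speed, acceleration,
                      step=0.05, num_points=20):
    """Sample points along a throwing line from the four parameters:
    throwing starting point, throwing direction (unit vector),
    initial throwing speed, and throwing acceleration (e.g. gravity).
    2D for brevity; a 3D version is analogous. Names are hypothetical."""
    sx, sy = start
    dx, dy = direction
    ax, ay = acceleration
    points = []
    for i in range(num_points):
        t = i * step
        # Constant-acceleration kinematics: p = p0 + v0*t + 0.5*a*t^2
        x = sx + dx * speed * t + 0.5 * ax * t * t
        y = sy + dy * speed * t + 0.5 * ay * t * t
        points.append((x, y))
    return points
```

The client could then render the returned points as the visible throwing line, updating them whenever the throwing direction or starting point changes.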
In some possible designs, as shown in FIG. 15, the apparatus 1400 further comprises: a special effect display module 1406.
The special effect display module 1406 is configured to display a throwing special effect in response to the playing duration of the throwing animation reaching a second duration, where the throwing special effect refers to the special effect of the movement of the virtual article after being thrown.
In some possible designs, as shown in FIG. 15, the apparatus 1400 further comprises: a parameter obtaining module 1407 and a duration determining module 1408.
The parameter obtaining module 1407 is configured to obtain an animation duration of the throwing animation and a zoom value, where the zoom value is used to set a display time of the throwing special effect.
The duration determining module 1408 is configured to determine the second duration according to the animation duration of the throwing animation and the zoom value.
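The embodiment does not spell out the formula combining the animation duration and the zoom value. One plausible reading, stated here purely as an assumption, is that the zoom value is a fraction of the throwing animation at which the special effect should appear:

```python
def second_duration(animation_duration, zoom_value):
    """Hypothetical rule: the second duration is the throwing
    animation's duration scaled by the zoom value, so a zoom value
    of 0.6 shows the throwing special effect 60% of the way through
    a 1-second animation. This formula is an assumption."""
    return animation_duration * zoom_value
```

Under this reading, tuning the zoom value lets a designer move the effect's appearance without re-authoring the animation itself.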
In some possible designs, as shown in FIG. 15, the apparatus 1400 further comprises: a state machine calling module 1409 and a play detection module 1410.
The state machine calling module 1409 is configured to call an animation state machine corresponding to the pre-throwing animation, and obtain an animation duration of the pre-throwing animation.
The play detection module 1410 is configured to determine, through the animation state machine corresponding to the pre-throwing animation, whether the playing of the pre-throwing animation is finished.
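A minimal animation state machine supporting these two queries (animation duration, and whether playback is finished) could look like the following Python sketch; the class and method names are illustrative assumptions, not an API defined by the embodiment:

```python
class AnimationStateMachine:
    """Minimal sketch of an animation state machine that exposes an
    animation's duration and reports whether playback has finished.
    All names are hypothetical."""

    def __init__(self, duration):
        self.duration = duration  # total animation duration in seconds
        self.elapsed = 0.0

    def update(self, dt):
        # Advance playback, clamping at the end of the animation.
        self.elapsed = min(self.elapsed + dt, self.duration)

    def get_duration(self):
        return self.duration

    def is_finished(self):
        return self.elapsed >= self.duration
```

The client would call `get_duration` when entering the pre-throwing state and poll `is_finished` each frame to decide when to display the throwing line.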
In some possible designs, the user interface further includes n article slot positions and a type switching control, the virtual articles assembled in the n article slot positions belong to different types, and n is a positive integer; as shown in FIG. 15, the apparatus 1400 further includes: an article use module 1411 and an article switching module 1412.
The article use module 1411 is configured to control the virtual object to use the first virtual article assembled in a target article slot position in response to receiving a selection signal corresponding to the target article slot position of the n article slot positions.
The article switching module 1412 is configured to control the virtual object to switch the held first virtual article to a second virtual article in response to receiving a trigger signal corresponding to the type switching control, where the first virtual article and the second virtual article are two different types of virtual articles.
In some possible designs, the user interface further includes a throwing control; as shown in FIG. 15, the apparatus 1400 further includes: a state entry module 1413 and a state switching module 1414.
The state entry module 1413 is configured to control the virtual article to enter the pre-throwing state in response to receiving an operation signal corresponding to the throwing control.
A state switching module 1414, configured to control the virtual article to switch from the pre-throwing state to the throwing state in response to detecting that the operation signal disappears.
It should be noted that, when the apparatus provided in the foregoing embodiment implements the functions thereof, only the division of the functional modules is illustrated, and in practical applications, the functions may be distributed by different functional modules according to needs, that is, the internal structure of the apparatus may be divided into different functional modules to implement all or part of the functions described above. In addition, the apparatus and method embodiments provided by the above embodiments belong to the same concept, and specific implementation processes thereof are described in the method embodiments for details, which are not described herein again.
Referring to FIG. 16, a block diagram of a terminal according to an embodiment of the present application is shown. Generally, the terminal 1600 includes: a processor 1601 and a memory 1602.
The processor 1601 may include one or more processing cores, such as a 4-core processor, an 8-core processor, and so on. The processor 1601 may be implemented in at least one hardware form of a DSP (Digital Signal Processing), an FPGA (Field Programmable Gate Array), and a PLA (Programmable Logic Array). The processor 1601 may also include a main processor and a coprocessor, where the main processor is a processor for processing data in an awake state, also referred to as a CPU (Central Processing Unit), and the coprocessor is a low-power processor for processing data in a standby state. In some embodiments, the processor 1601 may be integrated with a GPU (Graphics Processing Unit), which is responsible for rendering and drawing the content that the display screen needs to display. In some embodiments, the processor 1601 may further include an AI (Artificial Intelligence) processor for processing computing operations related to machine learning.
The memory 1602 may include one or more computer-readable storage media, which may be non-transitory. The memory 1602 may also include high-speed random access memory, as well as non-volatile memory, such as one or more magnetic disk storage devices or flash memory storage devices. In some embodiments, a non-transitory computer-readable storage medium in the memory 1602 is used to store at least one instruction, at least one program, a code set, or an instruction set, which is executed by the processor 1601 to implement the display method of a virtual article provided by the method embodiments of the present application.
In some embodiments, the terminal 1600 may optionally further include: a peripheral interface 1603 and at least one peripheral. The processor 1601, the memory 1602, and the peripheral interface 1603 may be connected by buses or signal lines. Each peripheral may be connected to the peripheral interface 1603 via a bus, a signal line, or a circuit board. Specifically, the peripheral may include: at least one of a communication interface 1604, a display 1605, an audio circuit 1606, a camera assembly 1607, and a power supply 1609.
Those skilled in the art will appreciate that the structure shown in FIG. 16 does not constitute a limitation on the terminal 1600, and the terminal may include more or fewer components than those shown, or combine some components, or adopt a different arrangement of components.
In an exemplary embodiment, a computer device is also provided. The computer device may be a terminal or a server. The computer device comprises a processor and a memory, wherein at least one instruction, at least one program, a code set, or an instruction set is stored in the memory, and the at least one instruction, the at least one program, the code set, or the instruction set is loaded and executed by the processor to implement the above display method of a virtual article.
In an exemplary embodiment, a computer-readable storage medium is also provided, in which at least one instruction, at least one program, a code set, or an instruction set is stored, which, when executed by a processor, implements the above display method of a virtual article.
In an exemplary embodiment, a computer program product is also provided, which, when executed by a processor, implements the above display method of a virtual article.
It should be understood that reference herein to "a plurality" means two or more. "And/or" describes the association relationship of the associated objects, and means that there may be three relationships; for example, A and/or B may mean: A exists alone, A and B exist simultaneously, or B exists alone. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship.
The above description is only exemplary of the present application and should not be taken as limiting the present application, and any modifications, equivalents, improvements and the like that are made within the spirit and principle of the present application should be included in the protection scope of the present application.