This application claims priority to Chinese Patent Application No. 202111289599.1, filed on November 2, 2021 and entitled "Method, Apparatus, Terminal, Storage Medium, and Program Product for Using Virtual Props", the entire contents of which are incorporated herein by reference.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the present application more apparent, the embodiments of the present application will be described in further detail with reference to the accompanying drawings.
References herein to "a plurality" mean two or more. The term "and/or" describes an association between associated objects and indicates that three relationships may exist; for example, "A and/or B" may represent three cases: A alone, both A and B, and B alone. The character "/" generally indicates an "or" relationship between the associated objects.
First, the terms involved in the embodiments of the present application are described:
Virtual environment: the virtual environment that an application displays (or provides) while running on a terminal. The virtual environment may be a simulation of the real world, a semi-simulated and semi-fictional environment, or a purely fictional environment. The virtual environment may be any one of a two-dimensional virtual environment, a 2.5-dimensional virtual environment, and a three-dimensional virtual environment, which is not limited in the present application. The following embodiments are described taking the example in which the virtual environment is a three-dimensional virtual environment.
Virtual object: a movable object in the virtual environment. The movable object may be a virtual character, a virtual animal, a cartoon character, or the like, such as a character or an animal displayed in the three-dimensional virtual environment. Optionally, the virtual object is a three-dimensional model created based on skeletal animation technology. Each virtual object has its own shape and volume in the three-dimensional virtual environment and occupies a portion of the space in the three-dimensional virtual environment.
Shooting games include first-person shooting games and third-person shooting games. A first-person shooting game is a shooting game played from a first-person viewing angle, and the screen of the virtual environment in the game is a screen in which the virtual environment is observed from the viewing angle of a first virtual object. A third-person shooting game is a shooting game played from a third-person viewing angle, and the screen of the virtual environment in the game is a screen in which the virtual environment is observed from the third-person viewing angle (for example, from behind the head of the first virtual object).
In the game, at least two virtual objects play in a single-round battle mode in the virtual environment. A virtual object survives in the virtual environment by avoiding injuries initiated by other virtual objects and dangers existing in the virtual environment (such as poison gas circles, swamps, and the like). When the health points of a virtual object in the virtual environment reach zero, the life of the virtual object in the virtual environment ends, and the virtual object that ultimately survives in the virtual environment is the winner. Optionally, the battle may take the moment when the first client joins the battle as a start time and the moment when the last client exits the battle as an end time, and each client may control one or more virtual objects in the virtual environment. Optionally, the competitive mode of the battle may include a solo mode, a two-person team mode, or a multi-person team mode, which is not limited by the embodiments of the present application.
Virtual props refer to props that can be used by virtual objects in the virtual environment, and include attack-type virtual props, supply-type props, defense-type props, and the like, which can change attribute values of other virtual objects. Attack-type virtual props capable of changing attribute values of other virtual objects include long-range virtual props, melee virtual props, throwing virtual props, and the like.
The method provided in the present application may be applied to a virtual reality application, a three-dimensional map program, a first-person/third-person shooting game, a multiplayer online battle arena game (Multiplayer Online Battle Arena, MOBA), and the like. The following embodiments are exemplified by application in a game.
Games based on virtual environments are often composed of one or more maps of the game world. The virtual environment in the game simulates scenes of the real world. A user can control a virtual object in the game to walk, run, jump, shoot, fight, drive, switch between virtual props, use virtual props to injure other virtual objects, and the like in the virtual environment. The interactivity is high, and multiple users can form online teams to play competitive games.
During play, a user can control a virtual object to attack other virtual objects by using a virtual firearm. When the virtual firearm is used for an attack, other players can identify the attack position through the attack sound effect, thereby revealing the position of the target virtual object controlled by the user.
It should be noted that, in the embodiments of the present application, the virtual firearm, the virtual bullet, the virtual weapon, and the like are all virtual props in the game.
Referring to fig. 1, a schematic diagram of an implementation environment provided by an embodiment of the present application is shown. The implementation environment may include a first terminal 110, a server 120, and a second terminal 130.
The first terminal 110 runs an application program 111 supporting a virtual environment, and the application program 111 may be a multiplayer online battle program. When the first terminal runs the application 111, a user interface of the application 111 is displayed on the screen of the first terminal 110. The application 111 may be any one of a multiplayer online battle arena (Multiplayer Online Battle Arena, MOBA) game, a battle royale shooting game, and a simulation strategy game (SLG). In this embodiment, the application 111 is illustrated as a first-person shooter (First-Person Shooter, FPS) game. The first terminal 110 is a terminal used by the first user 112, and the first user 112 uses the first terminal 110 to control a first virtual object located in the virtual environment to perform activities, where the first virtual object may be referred to as a master virtual object of the first user 112. The activities of the first virtual object include, but are not limited to, at least one of adjusting body posture, crawling, walking, running, riding, flying, jumping, driving, picking up, shooting, attacking, throwing, and releasing skills. Illustratively, the first virtual object is a first virtual character, such as a simulated character or a cartoon character.
The second terminal 130 runs an application 131 supporting a virtual environment, and the application 131 may be a multiplayer online battle program. When the second terminal 130 runs the application 131, a user interface of the application 131 is displayed on the screen of the second terminal 130. The application 131 may be any one of a MOBA game, a battle royale shooting game, and an SLG game; in this embodiment, the application 131 is illustrated as an FPS game. The second terminal 130 is a terminal used by the second user 132, and the second user 132 uses the second terminal 130 to control a second virtual object located in the virtual environment to perform activities, where the second virtual object may be referred to as a master virtual object of the second user 132. Illustratively, the second virtual object is a second virtual character, such as a simulated character or a cartoon character.
Optionally, the first virtual object and the second virtual object are in the same virtual world. Optionally, the first virtual object and the second virtual object may belong to the same camp, the same team, the same organization, have a friend relationship, or have temporary communication rights. Alternatively, the first virtual object and the second virtual object may belong to different camps, different teams, different organizations, or have hostile relationships.
Optionally, the applications running on the first terminal 110 and the second terminal 130 are the same, or the applications running on the two terminals are the same type of application on different operating system platforms (Android or iOS). The first terminal 110 may generally refer to one of a plurality of terminals and the second terminal 130 may generally refer to another of the plurality of terminals; this embodiment is illustrated with only the first terminal 110 and the second terminal 130. The device types of the first terminal 110 and the second terminal 130 are the same or different, and the device types include at least one of a smartphone, a tablet computer, an e-book reader, a Moving Picture Experts Group Audio Layer III (MP3) player, a Moving Picture Experts Group Audio Layer IV (MP4) player, a laptop computer, and a desktop computer.
Only two terminals are shown in fig. 1, but in different embodiments there are a plurality of other terminals that can access the server 120. Optionally, there are one or more terminals corresponding to a developer, on which a development and editing platform for the application program supporting the virtual environment is installed. The developer may edit and update the application program on the terminal and transmit the updated application program installation package to the server 120 through a wired or wireless network, and the first terminal 110 and the second terminal 130 may download the application program installation package from the server 120 to update the application program.
The first terminal 110, the second terminal 130, and other terminals are connected to the server 120 through a wireless network or a wired network.
The server 120 includes at least one of a server, a server cluster formed by a plurality of servers, a cloud computing platform, and a virtualization center. The server 120 is used to provide background services for applications supporting a three-dimensional virtual environment. Optionally, the server 120 performs the primary computing work and the terminal performs the secondary computing work; or the server 120 performs the secondary computing work and the terminal performs the primary computing work; or a distributed computing architecture is used between the server 120 and the terminal for collaborative computing.
In one illustrative example, the server 120 includes a memory 121, a processor 122, a user account database 123, a combat service module 124, and a user-oriented input/output interface (I/O Interface) 125. The processor 122 is configured to load instructions stored in the server 120 and process data in the user account database 123 and the combat service module 124. The user account database 123 is configured to store data of user accounts used by the first terminal 110, the second terminal 130, and other terminals, such as an avatar of a user account, a nickname of a user account, a combat power index of a user account, and a service area where a user account is located. The combat service module 124 is configured to provide a plurality of combat rooms for users to fight in, such as 1V1 combat, 3V3 combat, 5V5 combat, and the like. The user-oriented I/O interface 125 is configured to establish communication with the first terminal 110 and/or the second terminal 130 through a wireless network or a wired network to exchange data.
Referring to fig. 2, a flowchart of a method for using a virtual prop according to an exemplary embodiment of the present application is shown. This embodiment will be described by taking the example that the method is used for the first terminal 110 or the second terminal 130 in the implementation environment shown in fig. 1 or other terminals in the implementation environment, and the method includes the following steps:
step 201, a prop throwing control is displayed, wherein the prop throwing control is used for triggering throwing of a target virtual prop, and the target virtual prop is used for simulating attack sound effects of a virtual weapon.
The method of the embodiment of the application is applied to a virtual environment, where the virtual environment includes a first virtual object and a second virtual object, and the first virtual object and the second virtual object belong to different camps. In one possible implementation, the terminal displays the virtual environment through a virtual environment picture. Optionally, the virtual environment picture is a picture in which the virtual environment is observed from the perspective of a virtual object. The perspective refers to an observation angle at which the virtual environment is observed from the first-person or third-person view of the virtual object. Optionally, in an embodiment of the present application, the perspective is an angle at which the virtual object is observed through a camera model in the virtual environment.
Optionally, the camera model automatically follows the virtual object in the virtual environment, that is, when the position of the virtual object in the virtual environment changes, the camera model simultaneously changes along with the position of the virtual object in the virtual environment, and the camera model is always within a preset distance range of the virtual object in the virtual environment. Optionally, the relative positions of the camera model and the virtual object do not change during the automatic following process.
The camera model is a three-dimensional model positioned around the virtual object in the virtual environment. When a first-person viewing angle is adopted, the camera model is positioned near or at the head of the virtual object. When a third-person viewing angle is adopted, the camera model may be positioned behind the virtual object and bound to the virtual object, or positioned at any position at a preset distance from the virtual object, so that the virtual object located in the virtual environment can be observed from different angles through the camera model. Optionally, when the third-person viewing angle is an over-the-shoulder viewing angle, the camera model is positioned behind the virtual object (for example, at the head and shoulders of a virtual character). Optionally, in addition to the first-person viewing angle and the third-person viewing angle, the viewing angle may include other viewing angles, such as a top-down viewing angle; when the top-down viewing angle is adopted, the camera model may be positioned above the head of the virtual object, and the top-down viewing angle is a viewing angle for observing the virtual environment from an overhead perspective. Optionally, the camera model is not actually displayed in the virtual environment, that is, the camera model is not displayed in the virtual environment shown in the user interface.
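The following is a minimal illustrative sketch (not part of the claimed method) of how a camera model might be positioned relative to the virtual object for the viewing angles described above; the offset values and identifiers are assumptions chosen solely for illustration:

```python
# Illustrative sketch only: positions a hypothetical camera model relative to a
# virtual object for different viewing angles. Offset values are assumptions.

FIRST_PERSON = "first_person"
THIRD_PERSON = "third_person"
TOP_DOWN = "top_down"

def camera_position(object_pos, view_angle):
    """Return an (x, y, z) camera position near the virtual object."""
    x, y, z = object_pos
    if view_angle == FIRST_PERSON:
        return (x, y + 1.7, z)        # near or at the head of the virtual object
    if view_angle == THIRD_PERSON:
        return (x, y + 1.8, z - 2.5)  # behind the virtual object (over the shoulder)
    if view_angle == TOP_DOWN:
        return (x, y + 10.0, z)       # above the head, looking down
    raise ValueError("unknown viewing angle")

# The camera follows the virtual object as its position changes.
print(camera_position((3.0, 0.0, 5.0), THIRD_PERSON))
```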
In the embodiment of the application, a target virtual prop is provided. The prop is a throwing-type virtual prop and, after being thrown, can simulate the attack sound effect of a virtual weapon. The virtual weapon may be a virtual firearm; for example, the attack sound effect corresponding to a virtual firearm prop can be simulated, and different virtual weapons may correspond to different attack sound effects.
In one possible implementation, the user may equip the virtual object with the target virtual prop in advance before entering the game. After entering the game, when using the target virtual prop, the user may throw the target virtual prop by triggering the prop throwing control, thereby triggering the prop effect of the target virtual prop. The prop throwing control may be displayed in the virtual environment picture after the target virtual object is equipped with the target virtual prop, or may be displayed in the virtual environment picture after the user triggers the use control corresponding to the target virtual prop.
The target virtual prop may be pre-equipped before entering the game, or may be picked up at random or fixed positions in the virtual environment after entering the game. It should be noted that, when the target virtual prop is pre-equipped before entering the game, there is a limit on the equipped quantity, that is, the number of pre-equipped target virtual props needs to be less than a quantity threshold; for example, the quantity threshold may be 2. In one possible implementation, a user may increase the number of equipped target virtual props through an expansion virtual prop, where the expansion virtual prop is used for adding tactical equipment, such as target virtual props, smoke bombs, and bulletproof devices, to the virtual object.
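As a minimal illustrative sketch of the equipment quantity limit described above (the function name is an assumption; the threshold of 2 follows the example, and the effect of the expansion virtual prop on the limit is assumed):

```python
# Illustrative sketch: checking the pre-equipment quantity threshold for the
# target virtual prop; an expansion virtual prop raises the limit (assumed +1).

def within_equip_limit(equipped_count, quantity_threshold=2, has_expansion_prop=False):
    limit = quantity_threshold + (1 if has_expansion_prop else 0)  # assumed expansion effect
    # The number of pre-equipped target virtual props must stay below the threshold.
    return equipped_count < limit

print(within_equip_limit(1), within_equip_limit(2))  # True False
```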
Illustratively, as shown in FIG. 3, when a user clicks on a target virtual prop correspondence control 301, the attack control will switch to a prop throwing control 302.
Step 202, in response to triggering operation of the prop throwing control, controlling the target virtual object to throw the target virtual prop.
The user can control the target virtual object to throw the target virtual prop through a triggering operation on the prop throwing control. When the target virtual object is controlled to throw the target virtual prop, the throwing direction and the throwing distance can be adjusted so that the target virtual prop is thrown to a target position.
Alternatively, the triggering operation may be at least one of a click operation, a long press operation, and a slide operation.
Schematically, as shown in fig. 3, the target virtual object 303 is controlled to throw the target virtual prop 304.
Step 203, controlling the thrown target virtual prop to trigger at least two prop effects, wherein the positions of the target virtual prop when the at least two prop effects are triggered are different, and the moments when the at least two prop effects are triggered are different.
After the target virtual prop is thrown, the terminal can control the target virtual prop to trigger prop effects, where the prop effects include playing the attack sound effect of the virtual weapon. Because the target virtual prop is used for simulating the attack sound effect of the virtual weapon, and because a user controlling the target virtual object to attack with a virtual weapon usually moves to different positions to attack, in the embodiment of the application the target virtual prop can be controlled to trigger the prop effect at different positions and at different moments, thereby improving the realism of simulating the target virtual object attacking with the virtual weapon.
Optionally, when a virtual object controlled by another user is within a certain range of the trigger position of the target virtual prop, the attack sound effect can be heard, and the attack sound effect gradually becomes quieter as the distance increases.
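A minimal illustrative sketch of the distance-dependent audibility described above is given below; the audible range and the linear attenuation curve are assumptions for illustration only:

```python
# Illustrative sketch: attack sound effect volume heard by another virtual
# object, attenuating linearly with distance. Range and curve are assumptions.
import math

AUDIBLE_RANGE = 50.0  # hypothetical audible range, in virtual-world meters

def heard_volume(trigger_pos, listener_pos, base_volume=1.0):
    """Return the volume of the attack sound effect at the listener's position."""
    distance = math.dist(trigger_pos, listener_pos)
    if distance >= AUDIBLE_RANGE:
        return 0.0  # outside the audible range, nothing is heard
    # Volume gradually decreases as the distance increases.
    return base_volume * (1.0 - distance / AUDIBLE_RANGE)

print(round(heard_volume((0, 0, 0), (30, 0, 0)), 2))  # 0.4
```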
It should be noted that controlling the target virtual object to throw the target virtual prop may be executed by the terminal or by the server. Likewise, controlling the target virtual prop to trigger the prop effect may be executed by the terminal or by the server. When it is executed by the server, upon receiving the triggering operation of the prop throwing control, the terminal sends a throwing instruction to the server, and the server controls the target virtual object to throw the target virtual prop to the target position according to the throwing instruction and controls the target virtual prop to trigger prop effects at different positions. This embodiment is not limited thereto. For convenience of explanation, the following embodiments are explained taking terminal execution as an example.
Schematically, as shown in FIG. 3, after a target virtual prop 304 is thrown, the prop effect of target virtual prop 304 may be triggered at different locations.
In summary, in the embodiment of the application, a target virtual prop is provided that can simulate the attack sound effect of a virtual weapon, and the target virtual prop can be controlled to trigger the prop effect at different positions, so that the target virtual prop simulates the attack sound effect of the virtual weapon at different positions and at different moments. This improves the realism of simulating the user-controlled target virtual object attacking with the virtual weapon, reduces the probability that the user reveals position information when using the virtual weapon, improves the battle success rate, and improves the utilization rate of the virtual weapon.
In the embodiment of the application, the target virtual prop can automatically bounce in different directions after being thrown, so that the position of the target virtual prop in the virtual environment changes, and the target virtual prop is thereby controlled to trigger the prop effect at different positions. Exemplary embodiments are described below.
Referring to fig. 4, a flowchart of a method for using a virtual prop according to another exemplary embodiment of the present application is shown. This embodiment will be described by taking the example that the method is used for the first terminal 110 or the second terminal 130 in the implementation environment shown in fig. 1 or other terminals in the implementation environment, and the method includes the following steps:
step 401, a prop throwing control is displayed.
Step 402, in response to a triggering operation of the prop throwing control, controlling the target virtual object to throw the target virtual prop.
The implementation of step 401 and step 402 may refer to the above steps 201 to 202, and this embodiment is not repeated here.
Step 403, controlling the target virtual prop to play the attack sound effect of the virtual weapon at the ith trigger position.
After the target virtual prop is thrown, the terminal can control the target virtual prop to play the attack sound effect of the virtual weapon at different trigger positions. When the target virtual prop contacts the virtual ground, the terminal can control the attack sound effect of the virtual weapon to be played at the corresponding position on the virtual ground. In one possible implementation, after the target virtual prop is thrown, it can emit a detection ray over a certain distance during flight, where the ray detection is used for detecting obstacles the target virtual prop is about to touch: when the ray collides with a wall, the target virtual prop is controlled to rebound, and when the ray collides with the ground, the target virtual prop can be controlled to stop moving. With ray detection, whether the target virtual prop is about to touch the ground or a wall can be determined according to the detected material.
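The ray detection described above may be sketched as follows; the prop and scene objects, their methods, and the material names are hypothetical and serve only to illustrate the detection logic:

```python
# Illustrative sketch: during flight, the thrown prop casts a short ray along
# its direction of motion to detect what it is about to hit. The scene-query
# interface and material names below are hypothetical assumptions.

DETECT_DISTANCE = 1.0  # hypothetical probe length

def on_flight_update(prop, scene):
    hit = scene.raycast(origin=prop.position,
                        direction=prop.velocity_direction,
                        max_distance=DETECT_DISTANCE)
    if hit is None:
        return                    # nothing ahead, keep flying
    if hit.material == "wall":
        prop.rebound(hit.normal)  # bounce off the wall and continue moving
    elif hit.material == "ground":
        prop.stop_moving()        # come to rest; this becomes a trigger position
```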
When the target virtual prop falls to the ground for the first time, its contact position with the virtual ground can be determined as the first trigger position. In one possible case, the target virtual prop becomes stationary immediately after landing, and the landing position is determined as the first trigger position; in another possible case, the target virtual prop may bounce several times after landing before coming to rest, and the contact position with the virtual ground after it comes to rest is determined as the first trigger position. After the first trigger position is determined, the terminal controls the target virtual prop to play the attack sound effect of the virtual weapon at the first trigger position. Subsequently, whenever the target virtual prop changes position, the attack sound effect of the virtual weapon is played at the new trigger position.
Because there is a certain time interval between attacks at different positions when the target virtual object attacks using a virtual weapon, in order to improve the realism of simulating the movement of the target virtual object while it uses the virtual weapon, there is also a certain time interval when the target virtual prop is controlled to trigger the prop effect at different positions. For example, this step may be replaced by the following step:
In step 403a, in response to the duration at the ith trigger position reaching a duration threshold, the target virtual prop is controlled to play the attack sound effect of the virtual weapon at the ith trigger position, where the duration threshold is a fixed value, or the duration threshold is a random value within the target duration range.
When the duration of the target virtual prop at a trigger position reaches the duration threshold, the terminal controls the target virtual prop to play the attack sound effect of the virtual weapon at that trigger position. In one possible implementation, the duration threshold corresponding to each trigger position may be the same; for example, the duration threshold may be 2s. When the duration of the target virtual prop at the first trigger position after being thrown reaches 2s, the target virtual prop is controlled to play the attack sound effect of the virtual weapon at the first trigger position; when the position of the target virtual prop changes and its duration at the second trigger position reaches 2s, the target virtual prop is controlled to play the attack sound effect of the virtual weapon at the second trigger position, and so on.
In another possible implementation, because the time interval between position switches may differ when the target virtual object is controlled to attack at different positions using the virtual weapon, in order to further improve the realism of the simulation, the duration threshold corresponding to each trigger position may be a random value. The terminal stores a target duration range in advance, and when the trigger position of the target virtual prop changes, a random value within the target duration range is determined as the duration threshold. For example, the target duration range may be 1-2s. When the target virtual prop is thrown, the first duration threshold is selected as 2s, and after its duration at the first trigger position reaches 2s, the target virtual prop is controlled to play the attack sound effect of the virtual weapon at the first trigger position; when the position of the target virtual prop changes, the second duration threshold is selected as 1.5s, and after its duration at the second trigger position reaches 1.5s, the target virtual prop is controlled to play the attack sound effect of the virtual weapon at the second trigger position, and so on, such that the target virtual prop is controlled to play the attack sound effect of the virtual weapon at different positions.
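A minimal illustrative sketch of selecting the duration threshold (fixed, or random within the target duration range) is as follows; the values follow the examples above and the function name is an assumption:

```python
# Illustrative sketch: the duration threshold before playing the attack sound
# effect at each trigger position can be fixed or drawn at random from a
# target duration range.
import random

TARGET_DURATION_RANGE = (1.0, 2.0)  # seconds, per the example above

def next_duration_threshold(use_random=True, fixed_value=2.0):
    if use_random:
        # A new random threshold is chosen each time the trigger position changes.
        return random.uniform(*TARGET_DURATION_RANGE)
    return fixed_value

print(next_duration_threshold())
```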
Step 404, controlling the target virtual prop to perform the ith bounce, wherein the target virtual prop moves to the (i+1)th trigger position after the ith bounce.
In this embodiment, the trigger position of the target virtual prop is changed by controlling the target virtual prop to bounce. When the target virtual prop is controlled to play the attack sound effect of the virtual weapon at the ith trigger position, the terminal controls the target virtual prop to perform the ith bounce, so that the target virtual prop moves to the (i+1)th trigger position. If the target virtual prop collides with a wall during the bounce, it rebounds until it falls to the ground, and the trigger position after the bounce is the contact position with the virtual ground after the prop lands and comes to rest; that is, the (i+1)th trigger position is the contact position between the target virtual prop and the virtual ground when the prop is at rest after the ith bounce.
In order to make the user more clearly aware of the state of the target virtual prop, when the target virtual prop plays the attack sound effect of the virtual weapon and bounces, a first explosion animation is displayed in the virtual environment picture; for example, the animation may be an animation of sparks splashing, so as to prompt the user that the target virtual prop is playing the attack sound effect.
In this embodiment, each time the target virtual prop is controlled to bounce, the bounce is random. This step may be replaced by the following steps:
Step 404a, determining an ith bounce initial speed and an ith bounce direction corresponding to the ith bounce, wherein the ith bounce initial speed is within a target speed range, and the included angle between the ith bounce direction and the virtual ground is within a target angle range.
In one possible implementation, the ith bounce initial speed and the ith bounce direction corresponding to the ith bounce are first determined, so that the target virtual prop is controlled to perform the ith bounce according to the ith bounce initial speed and the ith bounce direction. The ith bounce initial speed refers to the upward initial speed of the target virtual prop when it bounces; the terminal can select a random value within the target speed range and determine the random value as the ith bounce initial speed. For example, the target speed range may be 2-5 m/s. The ith bounce direction refers to the included angle between the bounce direction of the target virtual prop and the virtual ground; the terminal can select a random value within the target angle range to determine the ith bounce direction. For example, the target angle range may be 50° to 80°.
In another possible implementation, because the attack sound effect played by the target virtual prop at each trigger position is the same, that is, the attack sound effect of the same virtual weapon is simulated, the simulated virtual weapon may be one with a high stability requirement during use, for example, a sniper rifle. When a user attacks other virtual objects using such a virtual weapon, large-scale movement is generally not performed within a short time; therefore, when the target virtual prop is controlled to bounce, the target virtual prop needs to be controlled to move within a small range in order to improve the realism of using the target virtual prop to simulate the attack sound effect of the virtual weapon. Thus, the bounce initial speed and bounce direction of the target virtual prop may be determined according to the weapon type, which may include the following cases:
1. In response to the weapon type of the virtual weapon being a first type, the ith bounce initial speed is determined within a first speed range and the ith bounce direction is determined within a first angle range, where the first speed range belongs to the target speed range and the first angle range belongs to the target angle range.
After the target virtual prop plays the attack sound effect at the first trigger position, the terminal can determine the virtual weapon corresponding to the attack sound effect, and thereby determine the weapon type of the virtual weapon. In one possible implementation, the terminal may pre-store correspondences between virtual weapons and weapon types, and when the virtual weapon simulated by the target virtual prop is determined, the corresponding weapon type can be determined.
The first type of virtual weapon refers to a virtual weapon with a high stability requirement during use, that is, the stability requirement of the first type of virtual weapon during use is higher than that of the second type of virtual weapon. When the target virtual prop is used to simulate such a virtual weapon, the target virtual prop needs to be controlled to move within a small range; therefore, the ith bounce initial speed needs to be determined within a small speed range and the ith bounce direction within a small angle range, so that the position change of the target virtual prop after bouncing is relatively short. The ith bounce initial speed may be determined within the first speed range, which is a smaller range within the target speed range; following the above example, the first speed range may be 2-3 m/s. The first angle range is a smaller range within the target angle range; the first angle range may be 50°-60°.
Illustratively, when the virtual weapon is a sniper rifle, the corresponding weapon type may be determined to be the first type; a random value within the range of 2-3 m/s is selected when determining the bounce initial speed of the target virtual prop, and a random angle within 50°-60° is selected as the bounce direction.
2. In response to the weapon type of the virtual weapon being a second type, the ith bounce initial speed is determined within a second speed range and the ith bounce direction is determined within a second angle range, where the second speed range belongs to the target speed range and the second angle range belongs to the target angle range.
The second type of virtual weapon is a virtual weapon with a lower stability requirement during use. When the target virtual prop is used to simulate such a virtual weapon, the target virtual prop may be controlled to move within a larger range. Therefore, the ith bounce initial speed may be determined within a larger speed range and the ith bounce direction within a larger angle range; that is, the ith bounce initial speed is determined within a second speed range greater than the first speed range, and the ith bounce direction is determined within a second angle range greater than the first angle range. The second speed range is a larger range within the target speed range, for example 3-5 m/s, and the second angle range is a larger range within the target angle range, for example 60°-80°.
Illustratively, when the virtual weapon is a rifle, the corresponding weapon type may be determined to be the second type; a random value within the range of 3-5 m/s is selected when determining the bounce initial speed of the target virtual prop, and a random angle within 60°-80° is selected as the bounce direction.
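The selection of the ith bounce initial speed and bounce direction according to the weapon type may be sketched as follows; the dictionary keys and function name are assumptions, and the numeric ranges follow the examples above:

```python
# Illustrative sketch: the bounce initial speed and direction are drawn from
# sub-ranges of the target ranges according to the simulated weapon type.
# Range values follow the examples above (2-5 m/s, 50-80 degrees).
import random

RANGES = {
    # weapon type: (speed range in m/s, angle range in degrees)
    "first_type": ((2.0, 3.0), (50.0, 60.0)),   # e.g. sniper rifle: small movement
    "second_type": ((3.0, 5.0), (60.0, 80.0)),  # e.g. rifle: larger movement allowed
}

def bounce_parameters(weapon_type):
    speed_range, angle_range = RANGES[weapon_type]
    initial_speed = random.uniform(*speed_range)    # ith bounce initial speed
    direction_angle = random.uniform(*angle_range)  # angle with the virtual ground
    return initial_speed, direction_angle

print(bounce_parameters("first_type"))
```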
Step 404b, controlling the target virtual prop to perform the ith bounce based on the ith bounce initial speed and the ith bounce direction.
The terminal can control the target virtual prop to perform the ith bounce according to the ith bounce initial speed and the ith bounce direction, so that the target virtual prop moves to the (i+1)th trigger position.
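As an illustrative worked example only, the bounce initial speed and the included angle with the virtual ground can be decomposed into velocity components to estimate how far the prop lands from its previous trigger position; the simple ballistic model and gravity constant are assumptions, not part of the claimed method:

```python
# Illustrative sketch: decompose the ith bounce initial speed and the angle
# with the virtual ground into vertical/horizontal components and estimate
# the landing offset under a simple, assumed ballistic model.
import math

GRAVITY = 9.8  # m/s^2, assumed

def bounce_offset(initial_speed, angle_deg):
    theta = math.radians(angle_deg)
    v_up = initial_speed * math.sin(theta)       # vertical component
    v_forward = initial_speed * math.cos(theta)  # horizontal component
    flight_time = 2.0 * v_up / GRAVITY           # time to return to ground level
    return v_forward * flight_time               # horizontal distance to the (i+1)th position

print(round(bounce_offset(2.5, 55.0), 2))
```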
Schematically, as shown in fig. 5, when the target virtual prop is thrown and falls to the first trigger position 501, the terminal controls the target virtual prop to play the attack sound effect at the first trigger position 501 and controls the target virtual prop to bounce, so that the target virtual prop moves to the second trigger position 502. The terminal subsequently continues to control the target virtual prop to play the attack sound effect at the second trigger position 502 and to bounce.
In this embodiment, when the target virtual prop is controlled to play the attack sound effect of the virtual weapon, the target virtual prop is simultaneously controlled to bounce, so that its next trigger position changes, thereby improving the realism of simulating an attack with the virtual weapon.
In this embodiment, the bounce initial speed and direction are randomly selected for each bounce so that the target virtual prop moves randomly. When the speed and direction are randomly selected, the selection range can further be determined according to the weapon type of the simulated virtual weapon, so that when a virtual weapon with a high stability requirement is simulated, the movement range of the target virtual prop is controlled to be smaller, further improving the realism of simulating an attack with the virtual weapon.
Because the target virtual prop bounces in the virtual environment, in order to enable the user to know the real-time position of the target virtual prop, in this embodiment a prop identifier is displayed in the virtual environment picture at the display position corresponding to the trigger position of the prop, so as to prompt the user. In addition, a removal condition is set for the target virtual prop, and when the removal condition is met, the target virtual prop is removed. Exemplary embodiments are described below.
Referring to fig. 6, a flowchart of a method for using a virtual prop according to another exemplary embodiment of the present application is shown. This embodiment will be described by taking the example that the method is used for the first terminal 110 or the second terminal 130 in the implementation environment shown in fig. 1 or other terminals in the implementation environment, and the method includes the following steps:
Step 601, a prop throwing control is displayed.
In step 602, a target virtual object is controlled to throw a target virtual prop in response to a triggering operation of a prop throwing control.
The implementation of steps 601 to 602 may refer to steps 201 to 202, and details are not repeated here in this embodiment.
Step 603, randomly selecting target attack sound effects from a sound effect library, wherein the sound effect library comprises at least one attack sound effect corresponding to the virtual weapon.
In this embodiment, the attack sound effect of the virtual weapon simulated by the target virtual prop may be the attack sound effect corresponding to a randomly selected virtual weapon. The terminal pre-stores a sound effect library, where the sound effect library includes attack sound effects corresponding to different virtual weapons. For example, the attack sound effect of a rifle is a 1s burst of gunfire, the attack sound effect of a light machine gun is a 2s burst of gunfire, the attack sound effect of a submachine gun is a 1s burst of gunfire, and the attack sound effects of a sniper rifle and a shotgun are each a single shot; different virtual firearms correspond to different attack sound effects.
After the target virtual object is controlled to throw the target virtual prop, a target attack sound effect can be randomly selected from the sound effect library and determined as the attack sound effect subsequently played by the target virtual prop.
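A minimal illustrative sketch of randomly selecting the target attack sound effect from the pre-stored sound effect library is given below; the library entries and file names are assumptions based on the examples above:

```python
# Illustrative sketch: the terminal randomly selects a target attack sound
# effect from a pre-stored sound effect library when the prop is thrown.
import random

SOUND_EFFECT_LIBRARY = {
    "rifle": "rifle_burst_1s.wav",
    "light_machine_gun": "lmg_burst_2s.wav",
    "submachine_gun": "smg_burst_1s.wav",
    "sniper_rifle": "sniper_single_shot.wav",
    "shotgun": "shotgun_single_shot.wav",
}

def select_target_attack_sound():
    weapon, sound = random.choice(list(SOUND_EFFECT_LIBRARY.items()))
    # The same target attack sound effect is then played at every trigger position.
    return weapon, sound

print(select_target_attack_sound())
```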
Step 604, controlling the thrown target virtual prop to play the target attack sound effect at least twice, wherein the attack sound effect of each play of the target virtual prop is the same.
After the target attack sound effect is selected, the terminal can control the target attack sound effect to be played at each trigger position.
Step 605, displaying a prop identifier at a display position corresponding to the triggering position of the target virtual prop in the virtual environment picture, where the prop identifier includes prompt information, and the prompt information is used for indicating the remaining residence time or the remaining triggering times of the target virtual prop.
After the target virtual prop is thrown, in order to enable the user to clearly know the trigger position of the target virtual prop, a prop identifier is displayed in the virtual environment picture at the display position corresponding to the trigger position of the target virtual prop. When the trigger position of the target virtual prop changes, the display position of the prop identifier changes accordingly. That is, when the trigger position of the target virtual prop is the ith trigger position, the display position of the prop identifier is the display position corresponding to the ith trigger position in the virtual environment picture, and when the trigger position of the target virtual prop is the (i+1)th trigger position, the display position of the prop identifier is the display position corresponding to the (i+1)th trigger position in the virtual environment picture.
Optionally, the prop identifier may be displayed in the virtual environment screen as an icon, text, model, or the like.
In order to enable the user to further know the prop state of the target virtual prop, the prop identifier is displayed in different display forms when the target virtual prop is in different states, which may include the following cases:
1. In response to the target virtual prop being in a triggered state, the prop identifier is displayed in a first display form.
The triggered state refers to a state in which the prop effect of the target virtual prop is being triggered, that is, a state in which the target virtual prop is playing the attack sound effect of the virtual weapon and bouncing.
When the target virtual prop is in the triggered state, the prop identifier may be displayed in a first display form, for example, the prop identifier may be in a shaking state.
2. In response to the target virtual prop being in a non-triggered state, the prop identifier is displayed in a second display form.
The non-triggered state is a state in which the target virtual prop is stationary; at this time, the prop identifier can be displayed in a second display form different from the first display form.
Schematically, as shown in fig. 7, when the target virtual prop is in the triggered state, the prop identifier is displayed in a first display form 701; after the target virtual prop is triggered, it moves to another trigger position, at which time the target virtual prop is in the non-triggered state and the prop identifier is displayed in a second display form 702.
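The switching of the prop identifier between display forms according to the prop state may be sketched as follows; the display-form descriptors are assumptions for illustration:

```python
# Illustrative sketch: choose the display form of the prop identifier according
# to whether the target virtual prop is currently triggering its prop effect.

def prop_identifier_form(is_triggering):
    if is_triggering:
        # First display form: shown while the attack sound effect is playing
        # and the prop is bouncing (e.g. the identifier shakes).
        return {"form": "first", "shaking": True}
    # Second display form: shown while the prop is stationary (not triggered).
    return {"form": "second", "shaking": False}

print(prop_identifier_form(True), prop_identifier_form(False))
```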
In addition, in this embodiment, users in the same camp may be prompted in the same manner, that is, the prop identifier is also displayed in the virtual environment picture of the client corresponding to the first virtual object, where the first virtual object and the target virtual object belong to the same camp.
After the terminal obtains the trigger position of the target virtual prop, it can send the trigger position information to the server, and the server forwards the trigger position information to the client corresponding to the first virtual object. After receiving the trigger position information, the client corresponding to the first virtual object can display the prop identifier at the corresponding position in its virtual environment picture. When the trigger position of the target virtual prop changes, the updated trigger position is correspondingly sent to the server, and the server forwards the trigger position information, so that the position of the prop identifier in the virtual environment picture of the client corresponding to the first virtual object changes accordingly. The prop identifier is displayed only in the virtual environment picture of the client corresponding to the first virtual object, and is not displayed in the virtual environment picture of the client corresponding to the second virtual object, where the second virtual object and the target virtual object belong to different camps.
In order to further improve the realism of the simulation by the target virtual prop, when the target virtual prop plays the attack sound effect of the virtual weapon, a sound effect identifier can be displayed in the map display control of the client corresponding to the second virtual object, where the sound effect identifier is the same as the identifier displayed in the map display control when the virtual weapon is actually used to attack. The sound effect identifier is used for indicating that a virtual object is attacking with the virtual weapon at the virtual environment position corresponding to the position of the sound effect identifier, and the sound effect identifier is displayed only while the target virtual prop is playing the attack sound effect. The sound effect identifier is displayed only in the map display control of the client corresponding to the second virtual object, and is not displayed in the map display control of the client corresponding to the first virtual object.
In one possible implementation, after the terminal sends the trigger position information to the server, the server may determine the corresponding display position in the map display control according to the trigger position information, forward the display position to the client corresponding to the second virtual object, and cause the client corresponding to the second virtual object to display the sound effect identifier at the position corresponding to the display position.
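A minimal illustrative sketch of converting a trigger position in the virtual environment into a display position in the map display control is given below; the map bounds and control size are hypothetical values:

```python
# Illustrative sketch: the server converts a trigger position in the virtual
# environment into a display position inside the map display control before
# forwarding it to the client corresponding to the second virtual object.

MAP_BOUNDS = ((0.0, 0.0), (1000.0, 1000.0))  # virtual-environment extent (assumed)
MINIMAP_SIZE = (200, 200)                    # map display control size in pixels (assumed)

def minimap_position(trigger_pos):
    (min_x, min_z), (max_x, max_z) = MAP_BOUNDS
    x, z = trigger_pos
    u = (x - min_x) / (max_x - min_x) * MINIMAP_SIZE[0]
    v = (z - min_z) / (max_z - min_z) * MINIMAP_SIZE[1]
    return int(u), int(v)

print(minimap_position((250.0, 500.0)))  # (50, 100)
```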
Schematically, as shown in fig. 8, when the target virtual prop triggers the prop effect, a sound effect identifier 801 is displayed in the map display control of the client corresponding to the second virtual object, indicating that a virtual object is attacking with a virtual weapon at the corresponding position in the virtual environment.
Step 606, in response to a removal condition being satisfied, removing the target virtual prop from the virtual environment, where the removal condition includes at least one of a residence time condition, a trigger count condition, and a prop restraining condition.
In one possible implementation, to ensure game fairness, a removal condition is set for the target virtual prop to prevent the target virtual prop from continuously triggering the prop effect.
Optionally, when the removal condition is the residence time condition, this step may be implemented as:
In response to the residence time of the target virtual prop in the virtual environment being greater than a time threshold, it is determined that the removal condition is met, and the target virtual prop is removed from the virtual environment.
The residence time of the target virtual prop in the virtual environment may be the residence time of the target virtual prop after being thrown to the first trigger position. When the target virtual prop is thrown, falls to the ground, and comes to rest, the terminal can start timing, and when the residence time reaches the time threshold, the terminal removes the target virtual prop. For example, the time threshold may be 10s.
In order to enable the user to intuitively know the remaining residence time of the target virtual prop, prompt information can be correspondingly displayed in the prop identifier to indicate the remaining residence time. Schematically, as shown in FIG. 7, the remaining residence time is displayed around the periphery of the prop identifier.
Optionally, when the removal condition is the trigger count condition, this step may be implemented as:
In response to the trigger count of the target virtual prop reaching a count threshold, it is determined that the removal condition is met, and the target virtual prop is removed from the virtual environment.
In another possible implementation, a trigger count threshold, that is, a count threshold, may be set for the target virtual prop: when the trigger count of the target virtual prop reaches the count threshold, the terminal stops controlling the target virtual prop to bounce and removes the target virtual prop. For example, the count threshold may be 3; when the prop effect of the target virtual prop has been triggered 3 times, it is determined that the count threshold is reached, and the target virtual prop is removed from the virtual environment.
Similarly, when the removal condition is the trigger count condition, prompt information can be correspondingly displayed in the prop identifier to indicate the remaining trigger count.
Optionally, when the removal condition is the prop restraining condition, this step may be implemented as:
In response to the trigger position of the target virtual prop being within the action range of a virtual restraining prop, it is determined that the removal condition is met, and the target virtual prop is removed from the virtual environment, where the virtual restraining prop is used for destroying the target virtual prop.
In this embodiment, the target virtual prop may be destroyed by a virtual restraining prop. The virtual restraining prop can be used to counter throwing-type virtual props and has a certain action range; when a throwing-type virtual prop is located within the action range of the virtual restraining prop, the prop is destroyed and can no longer trigger its prop effect.
The virtual restraining prop is a prop placed by the second virtual object, and the target virtual prop is removed only when it is located within the action range of a virtual restraining prop placed by the second virtual object. In one possible implementation, because the target virtual prop can bounce, it may not fall within the action range of the virtual restraining prop when first thrown; when the target virtual prop bounces into the action range of the virtual restraining prop, the terminal removes the target virtual prop.
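The three removal conditions described above can be combined as in the following illustrative sketch; the threshold values follow the examples above, and the distance-based range check for the virtual restraining prop is an assumption:

```python
# Illustrative sketch: the target virtual prop is removed when any of the
# three removal conditions is met (residence time, trigger count, or being
# within the action range of a virtual restraining prop).
import math

TIME_THRESHOLD = 10.0    # seconds of residence, per the example above
TRIGGER_THRESHOLD = 3    # trigger count, per the example above

def should_remove(residence_time, trigger_count, prop_pos, restraining_props):
    if residence_time > TIME_THRESHOLD:
        return True                      # residence time condition
    if trigger_count >= TRIGGER_THRESHOLD:
        return True                      # trigger count condition
    for center, radius in restraining_props:
        if math.dist(prop_pos, center) <= radius:
            return True                  # inside a virtual restraining prop's range
    return False

print(should_remove(4.0, 1, (10.0, 0.0), [((12.0, 0.0), 5.0)]))  # True
```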
When the target virtual prop is removed, a second explosion animation can be displayed in the virtual environment picture, and the target virtual prop is removed after the display is completed, where the second explosion animation is different from the first explosion animation.
Schematically, as shown in fig. 9, a virtual restraining prop 901 placed by the second virtual object exists in the virtual environment, and the target virtual prop is within the action range 902 of the virtual restraining prop 901, so the target virtual prop needs to be removed.
Step 607, in response to the target virtual prop being removed, stopping display of the prop identifier.
When the target virtual prop is removed, the terminal stops displaying the prop identifier. Correspondingly, the server forwards removal information to the client corresponding to the first virtual object, and the client corresponding to the first virtual object stops displaying the prop identifier in its virtual environment picture.
In this embodiment, a prop identifier is displayed in the virtual environment picture at the display position corresponding to the trigger position of the target virtual prop, so as to prompt the user of the position of the target virtual prop. This embodiment further provides removal conditions: when the target virtual prop meets a removal condition, it is removed, so as to avoid continuously triggering the prop effect and affecting the fairness of the game.
In connection with the above embodiments, in one illustrative example, the process of using the virtual prop is shown in FIG. 10.
Step 1001, a prop throwing control is displayed.
Step 1002, determining whether a triggering operation on the prop throwing control is received; if yes, step 1003 is executed.
Step 1003, controlling the target virtual object to throw the target virtual prop.
Step 1004, controlling the target virtual prop to play the attack sound effect of the virtual weapon at the ith trigger position and controlling the target virtual prop to bounce.
Step 1005, determining whether a removal condition is satisfied; if yes, step 1006 is executed, and if not, step 1004 is executed.
At step 1006, the target virtual prop is removed.
Fig. 11 is a block diagram of a virtual prop using apparatus according to an exemplary embodiment of the present application, the apparatus including:
a first display module 1101, configured to display a prop throwing control, where the prop throwing control is configured to trigger throwing a target virtual prop, and the target virtual prop is configured to simulate an attack sound effect of a virtual weapon;
The first control module 1102 is configured to control a target virtual object to throw the target virtual prop in response to a triggering operation of the prop throwing control;
The second control module 1103 is configured to control the thrown target virtual prop to trigger at least two prop effects, where the positions of the target virtual prop are different when the prop effects are triggered at least two times, and the moments when the prop effects are triggered at least two times are different.
Optionally, the second control module 1103 includes:
The first control unit is used for controlling the target virtual prop to play the attack sound effect of the virtual weapon at the ith trigger position;
The second control unit is used for controlling the target virtual prop to perform the ith bounce, wherein the target virtual prop moves to the (i+1)th trigger position after the ith bounce.
Optionally, the first control unit is further configured to:
and controlling the target virtual prop to play the attack sound effect of the virtual weapon at the ith trigger position in response to the time length at the ith trigger position reaching a time length threshold, wherein the time length threshold is a fixed value or a random value within a target time length range.
Optionally, the second control unit is further configured to:
Determining an ith bounce initial velocity and an ith bounce direction corresponding to the ith bounce, wherein the ith bounce initial velocity is within a target speed range, and an included angle between the ith bounce direction and the virtual ground is within a target angle range;
and controlling the target virtual prop to perform the ith bounce based on the ith bounce initial velocity and the ith bounce direction.
Optionally, the attack sound effects played by the target virtual prop at each trigger position are the same.
Optionally, the second control unit is further configured to:
determining the ith bounce initial velocity in a first velocity range and the ith bounce direction in a first angle range in response to the weapon type of the virtual weapon being of a first type, the first velocity range being of the target velocity range, the first angle range being of the target angle range;
Determining the ith bounce initial velocity in a second velocity range and the ith bounce direction in a second angle range in response to the weapon type of the virtual weapon being of a second type, the second velocity range being of the target velocity range, the second angle range being of the target angle range;
Wherein the first type of virtual weapon has a higher stability requirement than the second type of virtual weapon when in use, the first range of speeds being less than the second range of speeds, and the first range of angles being less than the second range of angles.
Optionally, the apparatus further includes:
the selecting module is used for randomly selecting target attack sound effects from a sound effect library, wherein the sound effect library comprises attack sound effects corresponding to at least one virtual weapon;
The second control module is further configured to:
And controlling the thrown target virtual prop to play the target attack sound effect at least twice, wherein the attack sound effect of each play of the target virtual prop is the same.
Optionally, the apparatus further includes:
and the removing module is used for removing the target virtual prop in the virtual environment in response to a removal condition being met, wherein the removal condition includes at least one of a residence time condition, a trigger count condition, and a prop restraining condition.
Optionally, the removal condition is the residence time condition, and the removal module is further configured to:
Responsive to the target virtual prop remaining in the virtual environment for greater than a time threshold, determining that the removal condition is met, removing the target virtual prop in the virtual environment;
or,
The removal condition is the trigger number condition, and the removal module is further configured to:
determining that the removal condition is met in response to the trigger times of the target virtual props reaching a quantity threshold, and removing the target virtual props in the virtual environment;
or,
The removal condition is the prop-restraining condition, and the removal module is further configured to:
and in response to the trigger position of the target virtual prop being within the action range of a virtual restraining prop, determining that the removal condition is met, and removing the target virtual prop in the virtual environment, wherein the virtual restraining prop is used for destroying the target virtual prop.
Optionally, the apparatus further includes:
The second display module is used for displaying the prop identifier at a display position corresponding to the trigger position of the target virtual prop in the virtual environment picture, wherein the prop identifier includes prompt information, and the prompt information is used for indicating the remaining residence time or the remaining trigger count of the target virtual prop.
Optionally, the second display module includes:
the first display unit is used for responding to the triggering state of the target virtual prop and displaying the prop identification in a first display mode;
the second display unit is used for responding to the fact that the target virtual prop is in a non-triggering state and displaying the prop identification in a second display mode;
The apparatus further comprises:
and the stopping display module is used for stopping displaying the prop identification in response to the removal of the target virtual prop.
Optionally, the prop identifier is displayed in the virtual environment picture of the client corresponding to the first virtual object, and a sound effect identifier is displayed in the map display control of the client corresponding to the second virtual object, where the sound effect identifier is displayed when the target virtual prop plays the attack sound effect, the first virtual object and the target virtual object belong to the same camp, and the second virtual object and the target virtual object belong to different camps.
In summary, in the embodiment of the application, a target virtual prop is provided that can simulate the attack sound effect of a virtual weapon, and the target virtual prop can be controlled to trigger the prop effect at different positions, so that the target virtual prop simulates the attack sound effect of the virtual weapon at different positions and at different moments. This improves the realism of simulating the user-controlled target virtual object attacking with the virtual weapon, reduces the probability that the user reveals position information when using the virtual weapon, improves the battle success rate, and improves the utilization rate of the virtual weapon.
Referring to fig. 12, a block diagram of a terminal 1200 according to an exemplary embodiment of the present application is shown. The terminal 1200 may be a portable mobile terminal such as a smart phone, tablet, MP3 player, MP4 player. Terminal 1200 may also be referred to as a user device, portable terminal, or the like.
In general, terminal 1200 includes a processor 1201 and memory 1202.
The processor 1201 may include one or more processing cores, such as a 4-core processor or an 8-core processor. The processor 1201 may be implemented in at least one hardware form of digital signal processing (Digital Signal Processing, DSP), a field-programmable gate array (Field-Programmable Gate Array, FPGA), and a programmable logic array (Programmable Logic Array, PLA). The processor 1201 may also include a main processor and a coprocessor. The main processor is a processor for processing data in a wake-up state, also referred to as a central processing unit (Central Processing Unit, CPU); the coprocessor is a low-power processor for processing data in a standby state. In some embodiments, the processor 1201 may be integrated with a graphics processing unit (Graphics Processing Unit, GPU), which is responsible for rendering and drawing the content that the display screen needs to display. In some embodiments, the processor 1201 may also include an artificial intelligence (Artificial Intelligence, AI) processor for processing computing operations related to machine learning.
Memory 1202 may include one or more computer-readable storage media, which may be tangible and non-transitory. Memory 1202 may also include high-speed random access memory, as well as non-volatile memory, such as one or more magnetic disk storage devices, flash memory storage devices. In some embodiments, a non-transitory computer readable storage medium in memory 1202 is used to store at least one instruction for execution by processor 1201 to implement the methods provided by embodiments of the present application.
In some embodiments, terminal 1200 may also optionally include a peripheral interface 1203 and at least one peripheral. For example, peripheral devices include radio frequency circuitry, touch display screens, power supplies, and the like.
It will be appreciated by those skilled in the art that the structure shown in fig. 12 is not limiting and that more or fewer components than shown may be included or certain components may be combined or a different arrangement of components may be employed.
Embodiments of the present application also provide a computer readable storage medium storing at least one instruction that is loaded and executed by the processor to implement the method of using a virtual prop as described in the above embodiments.
According to one aspect of the present application, there is provided a computer program product or computer program comprising computer instructions stored in a computer readable storage medium. The processor of the terminal reads the computer instructions from the computer-readable storage medium, and the processor executes the computer instructions, so that the terminal performs the method of using the virtual prop provided in the various optional implementations of the above aspect.
Those skilled in the art will appreciate that in one or more of the examples described above, the functions described in the embodiments of the present application may be implemented in hardware, software, firmware, or any combination thereof. When implemented in software, these functions may be stored on or transmitted over as one or more instructions or code on a computer-readable storage medium. Computer-readable storage media includes both computer storage media and communication media including any medium that facilitates transfer of a computer program from one place to another. A storage media may be any available media that can be accessed by a general purpose or special purpose computer.
The foregoing descriptions are merely preferred embodiments of the present application and are not intended to limit the present application, the scope of protection of which is defined by the appended claims.