Detailed Description
For the purpose of making the objects, technical solutions and advantages of the present application more apparent, the embodiments of the present application will be described in further detail with reference to the accompanying drawings.
First, the terms involved in the embodiments of the present application will be described:
Virtual environment: the virtual environment displayed (or provided) by an application when it runs on a terminal. The virtual environment may be a simulation of the real world, a semi-simulated, semi-fictional three-dimensional environment, or a purely fictional three-dimensional environment. The virtual environment may be any one of a two-dimensional virtual environment, a 2.5-dimensional virtual environment, and a three-dimensional virtual environment. Optionally, the virtual environment is further used for a battle between at least two virtual characters, and virtual resources available to the at least two virtual characters are provided in the virtual environment.
Virtual character: a movable object in the virtual environment. The movable object may be at least one of a virtual person, a virtual animal, and a cartoon character. Optionally, when the virtual environment is a three-dimensional virtual environment, the virtual characters are three-dimensional models, each having its own shape and volume in the three-dimensional virtual environment and occupying part of its space. Optionally, the virtual character is a three-dimensional character built on three-dimensional human-skeleton technology, which presents different appearances by wearing different skins. In some implementations, the virtual character may instead be implemented with a 2.5-dimensional or two-dimensional model; the embodiments of the present application are not limited in this regard. In the embodiments of the present application, a virtual character generally refers to a character controllable by a user in the virtual environment, as distinguished from auxiliary characters not controllable by a user, such as a soldier, a monster, or a non-player character (Non-Player Character, NPC).
Task object: the object that needs to be defeated in a task. It may also be referred to as a Boss or a captain.
Roguelike games are a subtype of role-playing games (Role-Playing Game, RPG), characterized by randomly generated maps, turn-based mechanics, randomly generated levels, props, and character abilities, and a restart from scratch after the character dies. Roguelike games typically place the game character in a randomly generated virtual environment, where the player needs to explore the map, fight enemies, collect items and resources, and eventually reach the destination and defeat the final BOSS (i.e., the task object). In a Roguelike game, each action needs to be carefully considered, because once the character dies the game starts over and the map and character attributes are regenerated, so each playthrough is unique.
On the map provided by the virtual environment, different virtual teams belonging to at least two hostile camps occupy their respective map areas and compete with a certain victory condition as the goal. The victory conditions include, but are not limited to, at least one of occupying or destroying a stronghold of a hostile camp, killing virtual characters of a hostile camp, surviving in a specified scene for a specified time, seizing a certain resource, and outscoring the opponent within a specified time. A tactical competition may be carried out in units of rounds, and the map of each round may be the same or different. Each virtual team includes one or more virtual characters, for example 1, 3, or 5.
Virtual environment picture: a picture obtained by observing the three-dimensional virtual environment from a third-person top-down perspective. Optionally, the top-down angle is 45 degrees. It should be noted that a virtual camera is generally disposed in the game scene, and the content of the game scene captured by the virtual camera is the virtual environment picture presented on the user interface. In a third-person 3D game, the virtual camera can capture game-scene content that includes the virtual character, so that the virtual character is displayed in the virtual environment picture. As another example, in some games without virtual characters, the movement and rotation of the virtual camera can be controlled directly to update the game picture presented on the user interface.
Skill: a capability of a virtual character in the three-dimensional virtual environment that can produce an effect on other virtual characters. A skill may be an attack skill, a defensive skill, a skill with a specific target, a group-attack skill, a skill that knocks virtual characters of another camp into the air, or a skill that slows virtual characters of another camp; the present application is not limited in this regard.
Fig. 1 is a schematic diagram of a task object selection method according to an exemplary embodiment of the present application. Information such as a current season name 12, a task entry 13 of an opened task, and a task entry 14 of an unopened task is displayed on the first user interface 11. A task is an activity that the player controls the virtual character to complete so as to obtain growth or rewards for the virtual character; a task may also be called a stage, a dungeon, a match, and the like. Each task entry displays the avatars of the two candidate task objects facing off in the task, a versus bar of the two candidate task objects, and an attribute tag, where the versus bar is used to indicate the historical support rates of the two candidate task objects, and the attribute tag is used to display the attribute promotion made for the captain with the lower historical support rate. Taking the task entry 13 of an opened task as an example, displayed on the task entry are the two candidate task objects facing off in the task, a versus bar 15 of the two candidate task objects, and an attribute tag 16 for the candidate task object with the lower historical support rate; the attribute tag 16 reading "difficulty +10%" indicates that the difficulty of defeating that candidate task object as the target task object is increased by 10%. For the entry of an unopened task, the avatars of the two captains displayed on it are shown in silhouette to preserve mystery.
Fig. 2 is a schematic diagram of a task object selection method according to an exemplary embodiment of the present application. Fig. 2 shows a display manner of the task entry. The entry 21 of an opened task in the unselected state displays the avatars 22 of the two candidate task objects facing off in the task, an attribute tag 23 (e.g., "difficulty +10%") indicating the attribute promotion for the candidate task object with the lower support rate of the two, and a versus bar 24 showing the historical support rates of the two candidate task objects; the entry 25 of an opened task in the selected state has its background highlighted. The entry 26 of an unopened task in the unselected state displays silhouettes 28 of the avatars of the two candidate task objects facing off in the task and the opening date 29 of the task; the entry 27 of an unopened task in the selected state has its background highlighted.
When the user enters a task by triggering a task entry on the first user interface, a second user interface 30 is displayed, as shown in Fig. 3. Two selection circles are displayed on the second user interface 30: character 1 (i.e., one candidate task object) is displayed in selection circle 34, and character 2 (i.e., the other candidate task object) is displayed in selection circle 35. The player character 33 operated by the user may enter either selection circle; entering a selection circle represents supporting, or wanting to challenge, the character in that circle.
In one possible implementation, ten users form a team. After the ten users each select the candidate task object they support, the candidate task object supported by more than half of the users is taken as the candidate task object supported by the team, and the character with the lower support rate is taken as the target task object the team will face. For example, when more than five users choose to support character 1, the team faces character 2 (the target task object); when the numbers of users supporting character 1 and character 2 are the same, the character the team will face is selected at random.
FIG. 4 is a block diagram illustrating a computer system according to an exemplary embodiment of the present application. The computer system 100 includes a first terminal 110, a server 120, and a second terminal 130.
The first terminal 110 installs and runs a client 111 supporting the virtual environment, and the client 111 may be a Roguelike-type game program. When the first terminal 110 runs the client 111, a user interface of the client 111 is displayed on the screen of the first terminal 110. The client may be any one of a strategy game (SLG), a Roguelike-type game, and a multiplayer online battle arena (Multiplayer Online Battle Arena, MOBA) game. In this embodiment, the client is illustrated as a Roguelike-type game. The first terminal 110 is a terminal used by the first user 112. The first user 112 uses the first terminal 110 to control a first virtual character located in the virtual environment to perform activities, and the first virtual character may be referred to as the master virtual character of the first user 112. The activities of the first virtual character include, but are not limited to, at least one of adjusting body posture, crawling, walking, running, riding, flying, jumping, driving, picking up, shooting, attacking, and throwing. Illustratively, the first virtual character is a simulated character or a cartoon character.
The second terminal 130 installs and runs a client 131 supporting the virtual environment, and the client 131 may be a Roguelike-type game program. When the second terminal 130 runs the client 131, a user interface of the client 131 is displayed on the screen of the second terminal 130. The client may be any one of an SLG game, a Roguelike-type game, and a MOBA game. In this embodiment, the client is illustrated as a Roguelike-type game. The second terminal 130 is a terminal used by the second user 113. The second user 113 uses the second terminal 130 to control a second virtual character located in the virtual environment to perform activities, and the second virtual character may be referred to as the master virtual character of the second user 113. The activities of the second virtual character include, but are not limited to, at least one of adjusting body posture, crawling, walking, running, riding, flying, jumping, driving, picking up, shooting, attacking, and throwing. Illustratively, the second virtual character is a simulated character or a cartoon character.
Optionally, the first virtual character and the second virtual character are in the same virtual environment. Optionally, the first virtual character and the second virtual character may belong to the same camp, the same team, or the same organization, have a friend relationship, or have temporary communication rights. Optionally, the first virtual character and the second virtual character may belong to different camps, different teams, or different organizations, or have a hostile relationship.
Optionally, the clients installed on the first terminal 110 and the second terminal 130 are the same, or the clients installed on the two terminals are the same type of client on different operating system platforms (Android or iOS). The first terminal 110 may refer broadly to one of a plurality of terminals, and the second terminal 130 may refer broadly to another of the plurality of terminals; the present embodiment is illustrated with only the first terminal 110 and the second terminal 130. The device types of the first terminal 110 and the second terminal 130 are the same or different, and include at least one of a smartphone, a tablet computer, an e-book reader, an MP3 player, an MP4 player, a laptop computer, and a desktop computer.
Only two terminals are shown in Fig. 4, but in different embodiments there are a plurality of other terminals 140 that can access the server 120. Optionally, there are one or more terminals 140 corresponding to a developer. A development and editing platform supporting the client of the virtual environment is installed on the terminal 140; the developer can edit and update the client on the terminal 140 and transmit the updated client installation package to the server 120 through a wired or wireless network, and the first terminal 110 and the second terminal 130 can download the client installation package from the server 120 to update the client.
The first terminal 110, the second terminal 130, and the other terminals 140 are connected to the server 120 through a wireless network or a wired network.
The server 120 includes at least one of one server, a plurality of servers, a cloud computing platform, and a virtualization center. The server 120 is used to provide background services for clients supporting a three-dimensional virtual environment (e.g., Roguelike-type game clients). Optionally, the server 120 performs the primary computing work and the terminal performs the secondary computing work; or the server 120 performs the secondary computing work and the terminal performs the primary computing work; or a distributed computing architecture is adopted between the server 120 and the terminal for collaborative computing.
In one illustrative example, the server 120 includes a processor 122, a user account database 123, a combat service module 124, and a user-oriented input/output interface (I/O Interface) 125. The processor 122 is configured to load instructions stored in the server 120 and process data in the user account database 123 and the combat service module 124. The user account database 123 is configured to store data of the user accounts used by the first terminal 110, the second terminal 130, and the other terminals 140, such as the avatar of a user account, the nickname of a user account, the combat index of a user account, and the service area where a user account is located. The combat service module 124 is configured to provide a combat environment for users to fight in. The user-oriented I/O interface 125 is configured to establish communication and exchange data with the first terminal 110 and/or the second terminal 130 through a wireless network or a wired network.
Fig. 5 is a flowchart illustrating a task object selection method according to an exemplary embodiment of the present application. The method may be performed by any of the terminals in fig. 4 described above, or by a client running on any of the terminals, on which the first account number is logged in. The method comprises the following steps:
Step 220, displaying a first user interface;
The first user interface includes a task entry. Optionally, one or more task entries may be displayed on the first user interface. A task entry is an entry control for entering the task interface; that is, a trigger operation on the task entry jumps to the task interface (e.g., the second user interface described below). Optionally, the avatars of the candidate task objects involved in the task are displayed on the task entry.
Optionally, the first user interface further includes at least two candidate task objects, where the candidate task objects correspond to a task; that is, different tasks may correspond to at least two different candidate task objects. The candidate task objects are selectable objects serving as task targets; that is, the user needs to choose among the at least two candidate task objects to select a target task object.
Optionally, the historical support rates of the at least two candidate task objects are obtained, and the at least two candidate task objects and a versus bar are displayed on the first user interface, where the versus bar is used to indicate the historical support rates of the at least two candidate task objects.
The historical support rate refers to the support rate of a candidate task object determined based on historical data of players who played the task before the first user interface is displayed. For example, historical data of all players who played the task before the first user interface is displayed is obtained; as another example, historical data of players who played the task within the three months before the first user interface is displayed is obtained. Each time a player selects the candidate task object he or she supports in one task, one support is counted for that candidate task object.
Specifically, the number of supports obtained by each candidate task object is acquired, and the support counts of the at least two candidate task objects are converted into percentages, namely the historical support rates: the numerator of a historical support rate is the number of supports obtained by that candidate task object, and the denominator is the sum of the supports obtained by all the candidate task objects.
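The percentage conversion described above can be sketched as follows. This is an illustrative sketch only; the function and variable names are not part of the application.

```python
def historical_support_rates(support_counts):
    """Convert per-object support counts into percentage support rates.

    The numerator for each candidate task object is its own support
    count; the shared denominator is the sum of the support counts of
    all candidate task objects.
    """
    total = sum(support_counts.values())
    if total == 0:
        # No history yet: treat every candidate as equally supported.
        return {name: 100.0 / len(support_counts) for name in support_counts}
    return {name: 100.0 * count / total
            for name, count in support_counts.items()}
```

For example, support counts of 110 and 90 yield historical support rates of 55% and 45%.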
Optionally, an attribute tag is further displayed on the first user interface, where the attribute tag is used to indicate an attribute promotion for the candidate task object with the lower historical support rate, and the value of the attribute promotion is related to the historical support rates of the at least two candidate task objects. That is, for a candidate task object with a low support rate, its attribute value is raised in order to raise its support rate: by increasing the attribute value, the candidate task object becomes more dominant in the game (for example, gaining a skill improvement or a vitality enhancement), so that subsequent players are more inclined to support it when selecting a candidate task object, thereby achieving the purpose of increasing its support rate.
It should be noted that the attribute promotion displayed on the first user interface is determined based on the historical support rates of the candidate task objects before the first user interface is displayed. As more players select candidate task objects in the task, the support rates of the candidate task objects change and the value of the attribute promotion changes accordingly; that is, the value of the attribute promotion is dynamically adjusted according to the support rate.
Ways to promote the attributes of a candidate task object with a lower historical support rate include enhancing its vitality value, increasing its skill values, increasing its task difficulty (i.e., the difficulty of defeating the candidate task object), and so on. That is, the attribute tags displayed on the first user interface include at least one of:
displaying, on the first user interface, an attribute tag indicating a task-difficulty increase of a first value;
displaying, on the first user interface, an attribute tag indicating a vitality-value increase of a second value;
displaying, on the first user interface, an attribute tag indicating a skill-effect enhancement of a third value.
The first value, the second value, and the third value are determined based on the historical support rates of the candidate task objects.
In one possible embodiment, the value of the attribute promotion is determined based on the difference between the historical support rates of the at least two candidate task objects; that is, the greater the difference in historical support rates between candidate task objects, the greater the attribute promotion for the candidate task object with the lower historical support rate. By promoting the attributes of the candidate task object with the low support rate, that candidate task object becomes more dominant in the task, so that more players tend to select it, each candidate task object in the task is sufficiently supported or challenged, and the existing resources in the game are used more efficiently.
In one possible embodiment, in the case where there are two candidate task objects, the difference between their historical support rates is determined as the value of the attribute promotion of the candidate task object with the lower historical support rate. For example, in the case where the historical support rates of the two candidate task objects are 45% and 55%, respectively, the attribute promotion of the candidate task object with the lower historical support rate is 10% (for example, its vitality value is increased by 10%).
In another possible embodiment, in the case where there are more than two candidate task objects, the difference between the historical support rate of a lower-rated candidate task object and that of the candidate task object with the highest historical support rate is determined as the value of that candidate task object's attribute promotion. For example, taking three candidate task objects with historical support rates of 25%, 35%, and 40%, respectively, the attribute promotion value of the candidate task object with the lowest historical support rate is 15%, and the attribute promotion value of the candidate task object with the second-lowest historical support rate is 5%.
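The two embodiments above can be combined into one small sketch (illustrative only; the names are hypothetical): each candidate below the highest historical support rate is promoted by its gap to that highest rate, which reduces to the simple difference in the two-candidate case.

```python
def attribute_promotions(rates):
    """Map each under-supported candidate to its attribute-promotion value.

    rates: {candidate name: historical support rate in percent}.
    A candidate's promotion is the gap between the highest rate and
    its own rate; the top-rated candidate receives no promotion.
    """
    top = max(rates.values())
    return {name: top - rate
            for name, rate in rates.items() if rate < top}
```

With rates of 45% and 55% the lower candidate is promoted by 10%; with rates of 25%, 35%, and 40% the promotions are 15% and 5%, matching the examples above.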
Optionally, the tasks include unopened tasks and opened tasks. For an opened task, the avatars of the at least two candidate task objects in the task are displayed on its task entry; for an unopened task, silhouettes of the avatars of the at least two candidate task objects in the task are displayed on its task entry.
Optionally, the opening status of each task is obtained. An entry of a first task is displayed on the first user interface, with the avatars and the versus bar of the at least two candidate task objects in the first task displayed on the entry; and/or an entry of a second task is displayed on the first user interface, with silhouettes of the avatars of the at least two candidate task objects in the second task displayed on the entry, where the first task is an opened task and the second task is an unopened task.
Step 240, displaying a second user interface in response to a trigger operation on the task entry;
The second user interface includes at least two candidate task objects. The at least two candidate task objects are displayed on the second user interface, awaiting selection by the first account and the team where the first account is located. Optionally, the avatars of the at least two candidate task objects are displayed on the second user interface to indicate that the at least two candidate task objects are in a to-be-selected state.
Optionally, the second user interface further comprises a first player character manipulated by the first account.
Optionally, both the player character and the candidate task object are displayed in a three-dimensional representation in the virtual world.
For example, in response to the trigger operation on the task entry, selection circles equal in number to the candidate task objects are displayed on the second user interface, each selection circle corresponding to one candidate task object. The selection circles are used for the player character under the player's control to select a candidate task object; that is, which selection circle the player enters represents choosing the candidate task object in that circle as the party the player supports or the party the player seeks to challenge.
In some possible implementations, the player entering a selection circle represents supporting the candidate task object in that circle. For example, if selection circle A corresponds to candidate virtual character A and selection circle B corresponds to candidate virtual character B, then the player entering selection circle A represents supporting candidate virtual character A and challenging candidate virtual character B.
In other possible embodiments, the player entering a selection circle represents choosing to challenge the candidate task object in that circle. For example, if selection circle A corresponds to candidate virtual character A and selection circle B corresponds to candidate virtual character B, then the player entering selection circle A represents choosing to challenge candidate virtual character A.
Step 260, generating, in response to a selection operation on one of the at least two candidate task objects, a selection result of the first account for the at least two candidate task objects;
Illustratively, the first player character controlled by the first account is made to enter the selection circle of one selected candidate task object, and the selected candidate task object serves as the selection result of the first account for the at least two candidate task objects. That is, the first player character corresponding to the first account entering the selection circle of a candidate task object indicates that the first account selects the candidate task object in that circle to support or to challenge.
Optionally, the operation by which the first player character controlled by the first account enters the selection circle of the first candidate task object may be at least one of a click operation, a double-click operation, a voice operation, a pressure touch operation, gaze control, and somatosensory control. The present application is not limited in this regard.
Optionally, the selection circle where the first player character is located is displayed in a selected state. The selected state is indicated by at least one of: the selection circle being highlighted in color, the outline of the selection circle being bolded, the selection circle carrying a shadow, an animation being displayed on the selection circle, an audio effect of the candidate task object in the circle being played, the candidate task object in the circle being highlighted in color, the outline of the candidate task object in the circle being bolded, and the candidate task object in the circle carrying a shadow.
Optionally, after a first player character in the team where the first account is located enters any selection circle, a countdown prompt is displayed on the second user interface, where the countdown prompt is used to indicate the remaining time for selecting a candidate task object. For example, a 15-second countdown is set; timing starts after any player character in the team where the first account is located enters a selection circle, and when the countdown ends, the number of player characters in each selection circle is counted.
Step 280, displaying, on the second user interface, a task picture taking the target task object as the task target.
The target task object is one of the at least two candidate task objects, and is determined based on the selection results of the accounts in the team where the first account is located. For example, in the case where a player character entering a selection circle represents challenging the candidate task object in that circle, the candidate task object in the selection circle with the most entered player characters is taken as the target task object; in the case where a player character entering a selection circle represents supporting the candidate task object in that circle, the candidate task object in the selection circle with the fewest entered player characters (i.e., the fewest supports) is taken as the target task object.
That is, the target task object is determined by one of the following means:
The target task object is a candidate task object in a first selection circle, the first selection circle being the selection circle with the largest number of entered player characters;
The target task object is a candidate task object in a second selection circle, the second selection circle being the selection circle with the least number of entered player characters;
In the case where the number of player characters entering each selection circle is the same, the target task object is determined from the candidate task objects by random selection.
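The selection rules above can be sketched as follows. This is a hedged sketch: the mode labels "challenge" and "support" are illustrative names for the two implementations described above, not terms from the application.

```python
import random

def choose_target(circle_counts, mode="challenge"):
    """Pick the target task object from per-selection-circle player counts.

    circle_counts: {candidate task object: number of player characters
                    that entered its selection circle}.
    mode "challenge": entering a circle means wanting to fight that
                      object, so the most-entered circle wins.
    mode "support":   entering a circle means supporting that object,
                      so the least-supported object becomes the target.
    If every circle holds the same number of players, the target is
    chosen at random.
    """
    if len(set(circle_counts.values())) == 1:
        return random.choice(list(circle_counts))
    pick = max if mode == "challenge" else min
    return pick(circle_counts, key=circle_counts.get)
```

For instance, with six players in character 1's circle and four in character 2's, the target is character 1 in challenge mode and character 2 in support mode.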
For example, after the target task object is determined, a task picture taking the target task object as a task target, that is, a task picture of the current task, is displayed on the second user interface.
It should be noted that the attribute value displayed on the first user interface in step 220 is determined based on the historical support rates before the first user interface is displayed. Because the first account and its team select candidate task objects in step 260, the support rate of each candidate task object changes, and thus the value of the attribute promotion for the candidate task object with the lower support rate in the present task (i.e., the task the first account and its team are to complete in this step) also changes. Therefore, after the target task object is determined, the attribute value of the target task object needs to be updated by combining the selection of the team where the first account is located with the historical support rates. That is, the attribute value of the target task object in the present task is determined based on the selection of the team where the first account is located and the historical support rates.
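One way to realize this update can be sketched as follows (illustrative only): the current team's selections are folded into the historical support counts, the rates are recomputed, and the promotion again follows as the gap to the highest rate. The function names and the equal weighting of one team selection against one historical support are assumptions, not specified by the application.

```python
def updated_promotions(history_counts, team_votes):
    """Recompute attribute promotions after folding in the team's votes.

    history_counts: {candidate: historical number of supports}.
    team_votes:     {candidate: supports cast by the current team}.
    Returns {candidate: promotion value in percent} for every
    candidate below the new highest support rate.
    """
    # Merge the team's selections into the historical counts.
    merged = {name: history_counts.get(name, 0) + team_votes.get(name, 0)
              for name in set(history_counts) | set(team_votes)}
    total = sum(merged.values())
    # Recompute the support rates, then the promotion gaps.
    rates = {name: 100.0 * count / total for name, count in merged.items()}
    top = max(rates.values())
    return {name: round(top - rate, 2)
            for name, rate in rates.items() if rate < top}
```

For example, with historical counts of 55 and 45 and a ten-player team splitting 6 to 4, the lower-supported candidate's promotion moves from 10% to roughly 10.91%.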
In summary, in the method provided by the embodiments of the present application, the candidate task objects in a task are selected by players, and the target task object is determined based on the selection result of the team where the player is located, so that the collective behavior of players can influence the task. This provides players with different modes of play, allows limited resources to support more types of players, and improves resource utilization.
In addition, the method dynamically adjusts the attribute values of the candidate task objects based on their support rates and, by strengthening the attributes of a candidate task object with a low support rate, guides players to select it, so that players gain sufficient experience with each candidate task object and human-computer interaction efficiency is improved.
Fig. 6 is a flowchart illustrating a task object selection method according to an exemplary embodiment of the present application. The method may be performed by any of the terminals in Fig. 4 described above, or by a client running on any of the terminals, on which a first account is logged in. The method is described taking two candidate task objects as an example. The method comprises the following steps:
step 320, displaying a first user interface;
The first user interface includes a task entry. Optionally, one or more task entries may be displayed on the first user interface. As shown in Fig. 1, the first user interface displays a plurality of task entries, including the task entry 13 of an opened task and the task entry 14 of an unopened task.
Optionally, the first user interface further includes two candidate task objects, where the candidate task objects correspond to the task; that is, different tasks may correspond to two different candidate task objects. The candidate task objects are selectable objects serving as task targets; that is, the user needs to choose between the two candidate task objects to select the target task object.
Optionally, an avatar of the candidate task object is displayed on the task portal of the first user interface. In one possible implementation, for an open task, an avatar of a candidate task object, such as avatar 22 of the candidate task object in FIG. 2, is displayed, and for an unopened task, a silhouette of the avatar of the candidate task object is displayed. Such as the avatar silhouette 28 of the candidate task object in fig. 2.
Optionally, a support rate bar is also displayed on the first user interface, the support rate bar being used to indicate the historical support rates of the two candidate task objects in the task. The historical support rate of a candidate task object is determined based on the historical data of all players who played the task before the first user interface is displayed. For example, the total number of supports acquired by each candidate task object before the first user interface is displayed is obtained, and the support counts of the two candidate task objects are converted into percentages, which are the historical support rates.
Optionally, an attribute tag is further displayed on the first user interface, where the attribute tag is used to indicate an attribute promotion for the candidate task object with the lower historical support rate, and the value of the attribute promotion is related to the historical support rates of the two candidate task objects. That is, for a candidate task object with a low support rate, its attribute values are raised in order to raise its support rate. For example, the value of the attribute promotion is determined based on the difference between the historical support rates of the at least two candidate task objects; that is, the greater the difference in historical support rates between the candidate task objects, the greater the attribute promotion for the candidate task object with the lower historical support rate.
Ways to promote attributes for a candidate task object with a lower historical support rate include enhancing its life value, increasing its skill values, increasing its task difficulty (i.e., the difficulty of defeating the candidate task object), and so forth. That is, the attribute tags displayed on the first user interface include at least one of:
displaying, on the first user interface, an attribute tag indicating a task difficulty increase of a first value;
displaying, on the first user interface, an attribute tag indicating a life value increase of a second value;
displaying, on the first user interface, an attribute tag indicating a skill effect enhancement of a third value.
Wherein the first value, the second value, and the third value are determined based on the historical support rates of the candidate task objects.
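The support-rate and attribute-promotion logic described above can be sketched as follows. The function name, the dictionary-based API, and the scaling factors for the three promotion values are illustrative assumptions and are not specified by the application:

```python
def attribute_promotion(support_counts):
    """Convert accumulated support counts into percentage support rates and
    compute an attribute promotion for the less-supported candidate.

    `support_counts` maps candidate name -> total historical support times.
    The 0.5 / 2.0 / 0.3 scaling factors are illustrative only.
    """
    total = sum(support_counts.values())
    # Convert accumulated support times into percentage support rates.
    rates = {name: count / total * 100 for name, count in support_counts.items()}
    low, high = sorted(rates, key=rates.get)
    diff = rates[high] - rates[low]
    # The larger the support-rate gap, the stronger the promotion
    # applied to the candidate with the lower historical support rate.
    promotion = {
        "candidate": low,
        "task_difficulty": round(diff * 0.5, 1),  # first value
        "life_value": round(diff * 2.0, 1),       # second value
        "skill_effect": round(diff * 0.3, 1),     # third value
    }
    return rates, promotion
```

For example, a 60/40 split in historical supports yields a 20-point gap, and all three promotion values for the weaker candidate scale with that gap.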
Fig. 7 shows a flowchart of a method of displaying a first user interface according to an exemplary embodiment of the present application, corresponding to step 320, the method comprising the steps of:
Step 3201, the user enters the first user interface. For example, the user triggers a jump control on another user interface to enter the first user interface, and the first client displays the first user interface.
Step 3202, judging whether the server where the first account is located is in the season: step 3203 is executed when the server is not in the season, and step 3204 is executed when the server is in the season.
Step 3203, the first user interface does not display the two candidate task objects, the support rate bar, or the attribute tag. This step is executed when the server where the first account is located is not in the season; that is, when the server is not in the season, the task has a fixed task object that cannot be selected by the player, so the first user interface does not display the two candidate task objects, the support rate bar, or the attribute tag.
Step 3204, the first user interface displays the two candidate task objects, the support rate bar, and the attribute tag. This step is executed when the server where the first account is located is in the season; that is, the player can select the task object through the collective decision of the team where the first account is located, so the first user interface displays the two candidate task objects, the support rate bar, and the attribute tag.
Step 3205, judging whether the task is open: the first client acquires the open state of each task from the server. Step 3207 is executed if the task selected by the first account is open, and step 3206 is executed if it is not open.
Step 3206, displaying the silhouettes of the avatars of the two candidate task objects. This step is executed when the task selected by the first account is not open; that is, the silhouettes of the avatars of the two candidate task objects are displayed. Optionally, the silhouettes are displayed at the task portal of the unopened task.
Step 3207, acquiring the historical support rates and determining the value of the attribute promotion. The first client acquires the historical support rates of the two candidate task objects from the server, and determines the value of the attribute promotion for the candidate task object with the lower historical support rate based on the difference between the historical support rates. A support rate bar is displayed on the first user interface based on the historical support rates, and an attribute tag including the value of the attribute promotion is displayed.
Step 340, responding to the triggering operation of the task entrance, and displaying a second user interface;
The second user interface includes the two candidate task objects, which are displayed on the second user interface awaiting the selection of the first account and the team where the first account is located. Optionally, the avatars of the two candidate task objects are displayed on the second user interface to indicate that the two candidate task objects are in a to-be-selected state.
Illustratively, in response to the triggering operation on the task portal, two selection circles are displayed on the second user interface, each selection circle corresponding to one candidate task object. A selection circle is used for the player character operated by the player to select a candidate task object; for example, the selection circle a player enters represents choosing the candidate task object in that selection circle as the party supported by the player or the party the player seeks to challenge.
For example, as shown in fig. 3, two selection circles, namely, a selection circle 31 and a selection circle 32, are displayed on the second user interface 30, the selection circle 31 corresponds to the character 1 (candidate task object 1), and the selection circle 32 corresponds to the character 2 (candidate task object 2).
Step 360, controlling a first player character operated by a first account to enter a selection circle of one selected candidate task object, and taking the selected candidate task object as a selection result of the first account on at least two candidate task objects;
For example, the first player character operated by the first account is controlled to enter the selection circle of the candidate task object that the first account wants to select, and the selected candidate task object is used as the selection result of the first account on the at least two candidate task objects. That is, when the first player character corresponding to the first account enters the selection circle of a candidate task object, the first account selects the candidate task object in that selection circle to support or challenge.
Optionally, the manner in which the first account controls the first player character to enter the selection circle of the first candidate task object may be at least one of a click operation, a double click operation, a voice operation, a pressure touch operation, an eye control, and a somatosensory control. The application is not limited in this regard.
Optionally, the selection circle where the first player character is located is displayed in a selected state, where the selected state includes at least one of: a selection circle with a highlighted color, a selection circle with a bolded outline, a selection circle with a shadow, a selection circle displaying an animation, an audio effect of the candidate task object in the selection circle being played, a candidate task object in the selection circle with a highlighted color, a candidate task object in the selection circle with bolded lines, and a candidate task object in the selection circle with a shadow.
Optionally, after the first player character in the team where the first account is located enters any selection circle, a countdown prompt is displayed on the second user interface, where the countdown prompt is used to indicate the remaining time for selecting a candidate task object. For example, a 10-second countdown is set; the timer starts after any player character in the team where the first account is located enters a selection circle, and after the countdown stops, the number of player characters in each selection circle is counted.
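The countdown behavior described above can be sketched as follows. The `get_circle_members` callback, the polling loop, and the function signature are illustrative assumptions about how a client might expose circle membership; the application does not specify this API:

```python
import time

def run_selection_countdown(get_circle_members, seconds=10, tick=0.01):
    """Start a countdown once any teammate enters a selection circle, then
    freeze and return the member count per circle.

    `get_circle_members` is an assumed callback returning the current
    {circle_id: set_of_player_ids} mapping.
    """
    # Wait until at least one player character has entered any circle;
    # the countdown timer only starts at that moment.
    while not any(get_circle_members().values()):
        time.sleep(tick)
    deadline = time.monotonic() + seconds
    while time.monotonic() < deadline:
        time.sleep(tick)
    # After the countdown stops, count the player characters in each circle.
    return {circle: len(members) for circle, members in get_circle_members().items()}
```

A real client would drive this from its frame loop or a server timer rather than a blocking sleep; the sketch only shows the start-on-first-entry, count-on-expiry sequence.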
Step 380, displaying a task picture taking the target task object as a task target on the second user interface.
The target task object is one of the two candidate task objects, and is determined based on the selection result of each account of the team where the first account is located. For example, when entering a selection circle represents choosing the candidate task object in that circle as the one to challenge, the candidate task object in the selection circle with the largest number of entered player characters is set as the target task object; when entering a selection circle represents supporting the candidate task object in that circle, the candidate task object in the selection circle with the smallest number of entered player characters is set as the target task object.
That is, the target task object is determined by one of the following means:
The target task object is a candidate task object in a first selection circle, the first selection circle being the selection circle with the largest number of entered player characters;
The target task object is a candidate task object in a second selection circle, the second selection circle being the selection circle with the least number of entered player characters;
In the case where the number of player characters entering each selection circle is the same, the target task object is determined from the candidate task objects by means of random selection.
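The three determination rules above can be sketched as follows. The function shape, the `mode` parameter names, and tie-breaking via `random.choice` are illustrative assumptions rather than the application's implementation:

```python
import random

def determine_target(circle_counts, mode, rng=random):
    """Determine the target task object from per-circle player counts.

    mode="challenge": entering a circle means choosing its candidate as the
    one to challenge, so the circle with the MOST players wins.
    mode="support":   entering a circle means supporting its candidate, so
    the LESS-supported candidate becomes the target.
    Equal counts in every circle are resolved by random selection.
    """
    counts = list(circle_counts.values())
    if counts.count(counts[0]) == len(counts):
        # Same number of player characters in each circle: pick randomly.
        return rng.choice(list(circle_counts))
    pick = max if mode == "challenge" else min
    return pick(circle_counts, key=circle_counts.get)
```

With six players in circle A and four in circle B, "challenge" semantics target candidate A, while "support" semantics target the less-supported candidate B.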
For example, after determining the target task object, a task screen targeting the target task object as a task object is displayed on the second user interface.
Since the selections made by the team where the first account is located change the support rate of each candidate task object, the value of the attribute promotion for the candidate task object with the lower support rate also changes. Therefore, after the target task object is determined, the attribute value of the target task object needs to be updated by combining the selection of the team where the first account is located with the historical support rate. That is, the attribute value of the target task object in the task is determined based on the selection of the team where the first account is located and the historical support rate.
In one possible implementation of calculating the change in the support rate of a candidate task object, the change is calculated from the selection made by each account in the team where the first account is located. For example, when six accounts in the team select candidate task object 1 and four accounts select candidate task object 2, the support count of candidate task object 1 is increased by six, the support count of candidate task object 2 is increased by four, and the support rates are recalculated on that basis. In another possible implementation, the team's selection result is counted uniformly for the whole team. For example, when six accounts select candidate task object 1 and four accounts select candidate task object 2, the whole team is counted as having selected candidate task object 1, i.e., the support count of candidate task object 1 is increased by ten.
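The two counting modes can be sketched as follows. The function name, the `per_account` flag, and the dictionary-based bookkeeping are illustrative assumptions:

```python
from collections import Counter

def update_support_counts(support_counts, team_choices, per_account=True):
    """Fold one team's selections into the accumulated support counts.

    per_account=True:  each account's choice adds one support (a 6/4 split
                       adds 6 and 4 supports respectively).
    per_account=False: the whole team is counted for the majority choice
                       (the same 6/4 split adds 10 supports to the winner).
    """
    votes = Counter(team_choices)
    updated = dict(support_counts)
    if per_account:
        # Mode 1: credit every account's individual choice.
        for candidate, n in votes.items():
            updated[candidate] = updated.get(candidate, 0) + n
    else:
        # Mode 2: the whole team counts toward the majority choice.
        majority = max(votes, key=votes.get)
        updated[majority] = updated.get(majority, 0) + len(team_choices)
    return updated
```

The recalculated percentages (and hence the next attribute promotion) then follow from the updated counts, as in the support-rate computation described earlier.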
FIG. 8 is a flow chart of a method of selecting task objects provided by an exemplary embodiment of the present application, corresponding to steps 340 through 380, taking as an example that entering a selection circle represents supporting candidate task objects in the selection circle, the method comprising the steps of:
Step 3401, the user enters the second user interface. For example, the user triggers a task portal on the first user interface to enter the second user interface, and the first client displays the second user interface.
Step 3402, generating selection circle A and selection circle B on the second user interface. Selection circles corresponding to the task are displayed on the second user interface, each selection circle corresponding to one candidate task object; for example, captain A (candidate task object A) is displayed in selection circle A, and captain B (candidate task object B) is displayed in selection circle B.
In the embodiment of the application, entering a selection circle represents supporting the captain in that selection circle, and each player character of the team where the first account is located enters a selection circle to indicate the captain that player supports.
Step 3404, displaying a countdown on the second user interface. The server starts the countdown from the moment the first player character in the team where the first account is located enters a selection circle, and the countdown is displayed on the second user interface. For example, a 10-second countdown starts from the moment the first player character in the team enters a selection circle; after the countdown stops, player characters can no longer enter the selection circles.
Step 3405, comparing the numbers of people in selection circle A and selection circle B. The number of people in a selection circle represents the number of people supporting the captain in that circle. The numbers of people in the two selection circles are compared: steps 3406 to 3408 are executed if selection circle A has more people than selection circle B, steps 3412 to 3414 are executed if selection circle A has fewer people than selection circle B, and steps 3409 to 3411 are executed if the two selection circles have equal numbers of people.
It is noted that a player character entering a selection circle supports the captain in that circle, i.e., desires to challenge the captain in the other circle. For example, a player character entering selection circle A supports captain A and desires to challenge captain B; if more than half of the player characters in the team enter selection circle A, the team challenges captain B (i.e., captain B is subsequently displayed on the second user interface), the task of the team is to defeat captain B, and a corresponding reward is obtained after defeating captain B.
Step 3406, displaying captain B on the second user interface. This step is executed when the number of people in selection circle A is greater than that in selection circle B; captain B, with fewer supporters, is taken as the target task object, and the task target is to defeat captain B.
Step 3407, adjusting the attribute values of captain B according to the support rate. Since the support rates of captain A and captain B change after the team where the first account is located makes its selection, the value of the attribute promotion changes accordingly. The first client acquires the current support rates of captain A and captain B from the server (i.e., the combination of the historical support rate and the selection made by the team where the first account is located), and determines the value of the attribute promotion for captain B based on the difference between the current support rates of captain A and captain B.
Step 3408, increasing the support rate of captain A. The support rates of captain A and captain B are adjusted according to the selection of the team where the first account is located and recorded in the server, wherein the support rate of captain A is increased.
Step 3409, randomly displaying captain A or captain B. Since captain A and captain B have the same number of supporters, whether captain A or captain B is displayed as the target task object to be defeated is determined randomly.
Step 3410, adjusting the attribute values of the displayed captain according to the support rate. Since the support rates of captain A and captain B change after the team where the first account is located makes its selection, the value of the attribute promotion changes accordingly. The first client acquires the current support rates of captain A and captain B from the server (i.e., the combination of the historical support rate and the selection made by the team where the first account is located), and determines the value of the attribute promotion for the captain displayed in step 3409 based on the difference between the current support rates of captain A and captain B.
Step 3411, adjusting the support rates of captain A and captain B according to the selection of the team where the first account is located and recording them in the server, wherein the support rate of the captain not displayed on the second user interface is increased.
Step 3412, displaying captain A on the second user interface. This step is executed when the number of people in selection circle B is greater than that in selection circle A; captain A, with fewer supporters, is taken as the target task object, and the task target is to defeat captain A.
Step 3413, adjusting the attribute values of captain A according to the support rate. Since the support rates of captain A and captain B change after the team where the first account is located makes its selection, the value of the attribute promotion changes accordingly. The first client acquires the current support rates of captain A and captain B (i.e., the support rates determined by combining the historical support rate with the selection made by the team where the first account is located in the task), and determines the value of the attribute promotion for captain A in the task based on the difference between the current support rates of captain A and captain B.
Step 3414, adjusting the support rates of captain A and captain B according to the selection of the team where the first account is located and recording them in the server, wherein the support rate of captain B is increased. That is, the historical support rate of captain B increases for subsequent players.
A task picture taking the target task object (captain A or captain B, determined according to the above steps) as the task target is then displayed on the second user interface; that is, the player characters of the team where the first account is located battle the target task object to obtain the reward for defeating the target task object.
In summary, according to the method provided by the embodiment of the application, the candidate task objects in the task are selected by the players, and the target task object is determined based on the selection result of the team where the player is located, so that the collective behavior of the players can influence the task, different modes can be provided for the players, limited resources can serve more types of players, and the resource utilization rate is improved.
In addition, players can choose which captain to support or challenge based on both numerical considerations and personal preference, giving them a greater degree of freedom and sense of participation.
In addition, the method dynamically adjusts the attribute values of the candidate task objects based on their support rates, and guides players toward the candidate task object with the lower support rate by strengthening its attributes, so that players can obtain sufficient experience with each candidate task object, and the human-computer interaction efficiency is improved.
In addition to the above embodiment, a progress bar may be displayed on the interface to show the progress of each task in real time, and each task may dynamically adjust its task difficulty or the attribute values of its task object according to the progress of the other tasks; alternatively, the number of task clearances may be displayed on the first user interface, and each task may dynamically adjust its task difficulty or the attribute values of its task object, or modify the clearance rewards, according to the number of players who have cleared it. The application does not limit the manner in which the attribute values of the task objects are dynamically adjusted.
The following are device embodiments of the application; for details not described in detail in the device embodiments, reference is made to the above-described method embodiments.
Fig. 9 is a block diagram of a task object selection apparatus provided in an exemplary embodiment of the present application. The device comprises:
a display module 420 for displaying a first user interface, the first user interface comprising a task portal;
the display module 420 is configured to display a second user interface in response to a triggering operation on the task entry, where the second user interface includes at least two candidate task objects;
A generating module 440, configured to generate a selection result of the at least two candidate task objects by the first account in response to a selection operation of one candidate task object of the at least two candidate task objects;
The display module 420 is configured to display, on the second user interface, a task screen that targets a target task object, where the target task object is one of the at least two candidate task objects, and the target task object is determined in the at least two candidate task objects based on a selection result of each account of the team on which the first account is located.
In an alternative embodiment, the display module 420 is configured to display, in response to a trigger operation on the task entry, selection circles equal in number to the candidate task objects in the second user interface, where each selection circle corresponds to one candidate task object; and the generating module 440 is configured to control a first player character operated by the first account to enter the selection circle of a selected candidate task object, and to use the selected candidate task object as the selection result of the first account on the at least two candidate task objects.
In an alternative embodiment, the target task object is determined in one of the following manners: the target task object is the candidate task object in a first selection circle, the first selection circle being the selection circle with the largest number of entered player characters; the target task object is the candidate task object in a second selection circle, the second selection circle being the selection circle with the smallest number of entered player characters; or, in the case where the number of player characters entering each selection circle is the same, the target task object is determined from the candidate task objects by random selection.
In an alternative embodiment, the display module 420 is configured to display a countdown prompt on the second user interface after the first player character in the team where the first account is located enters any of the selection circles, where the countdown prompt is used to indicate a remaining time for selecting the candidate task object.
In an alternative embodiment, the display module 420 is configured to obtain the historical support rates of the at least two candidate task objects, and to display the at least two candidate task objects and a support rate bar on the first user interface, where the support rate bar is used to indicate the historical support rates of the at least two candidate task objects.
In an alternative embodiment, the display module 420 is configured to display an attribute tag on the first user interface, where the attribute tag is configured to indicate an attribute promotion for the candidate task object with the lower historical support rate, and a value of the attribute promotion is related to the historical support rates of the at least two candidate task objects.
In an alternative embodiment, the display module 420 is configured to display, on the first user interface, an attribute tag indicating a task difficulty increase of a first value, an attribute tag indicating a life value increase of a second value, and an attribute tag indicating a skill effect enhancement of a third value.
In an alternative embodiment, the apparatus further comprises a determining module 460, the determining module 460 being configured to determine the value of the attribute promotion based on a difference in historical support rates of the at least two candidate task objects.
In an alternative embodiment, the display module 420 is configured to display an entry of a first task on the first user interface, where the entry of the first task displays the avatars and a support rate bar of at least two candidate task objects in the first task, the first task being an open task.
In an alternative embodiment, the display module 420 is configured to display, on the first user interface, an entry for a second task, where the second task is an unopened task, and display a silhouette of an avatar of at least two candidate task objects in the second task on the entry for the second task.
In an alternative embodiment, the determining module 460 is configured to determine the attribute value of the target task object in the current task based on the selection of the team where the first account is located and the historical support rate.
It should be noted that, in the task object selecting device provided in the above embodiment, only the division of the above functional modules is used for illustration, in practical application, the above functional allocation may be performed by different functional modules according to needs, that is, the internal structure of the device is divided into different functional modules, so as to complete all or part of the functions described above. In addition, the task object selecting device and the task object selecting method provided in the foregoing embodiments belong to the same concept, and specific implementation processes of the task object selecting device and the task object selecting method are detailed in the method embodiments and are not repeated herein.
The application also provides a terminal which comprises a processor and a memory, wherein at least one instruction is stored in the memory, and the at least one instruction is loaded and executed by the processor to realize the task object selection method provided by each method embodiment. It should be noted that the terminal may be a terminal as provided in fig. 10 below.
Fig. 10 shows a block diagram of a terminal 900 according to an exemplary embodiment of the present application. The terminal 900 may be a smart phone, a tablet computer, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, a notebook computer, or a desktop computer. The terminal 900 may also be referred to by other names such as user equipment, portable terminal, laptop terminal, or desktop terminal.
In general, terminal 900 includes a processor 901 and memory 902.
Processor 901 may include one or more processing cores, such as a 4-core processor or an 8-core processor. The processor 901 may be implemented in at least one hardware form of DSP (Digital Signal Processing), FPGA (Field-Programmable Gate Array), and PLA (Programmable Logic Array). The processor 901 may also include a main processor and a coprocessor; the main processor is a processor for processing data in a wake-up state, also referred to as a CPU (Central Processing Unit), and the coprocessor is a low-power processor for processing data in a standby state. In some embodiments, the processor 901 may be integrated with a GPU (Graphics Processing Unit), which is responsible for rendering the content that the display screen needs to display. In some embodiments, the processor 901 may also include an AI (Artificial Intelligence) processor for processing computing operations related to machine learning.
The memory 902 may include one or more computer-readable storage media, which may be non-transitory. The memory 902 may also include high-speed random access memory, as well as non-volatile memory, such as one or more magnetic disk storage devices, flash memory storage devices. In some embodiments, a non-transitory computer readable storage medium in memory 902 is used to store at least one instruction for execution by processor 901 to implement the method of selecting a task object provided by a method embodiment of the present application.
In some embodiments, terminal 900 can optionally further include a peripheral interface 903 and at least one peripheral. The processor 901, memory 902, and peripheral interface 903 may be connected by a bus or signal line. The individual peripheral devices may be connected to the peripheral device interface 903 via buses, signal lines, or circuit boards. Specifically, the peripheral devices include at least one of a radio frequency circuit 904, a display 905, a camera assembly 906, an audio circuit 907, a location assembly 908, and a power source 909.
The peripheral interface 903 may be used to connect at least one I/O (Input/Output) related peripheral device to the processor 901 and the memory 902. In some embodiments, the processor 901, the memory 902, and the peripheral interface 903 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 901, the memory 902, and the peripheral interface 903 may be implemented on a separate chip or circuit board, which is not limited in this embodiment.
The radio frequency circuit 904 is configured to receive and transmit RF (Radio Frequency) signals, also known as electromagnetic signals. The radio frequency circuit 904 communicates with a communication network and other communication devices via electromagnetic signals. The radio frequency circuit 904 converts an electrical signal into an electromagnetic signal for transmission, or converts a received electromagnetic signal into an electrical signal. Optionally, the radio frequency circuit 904 includes an antenna system, an RF transceiver, one or more amplifiers, tuners, oscillators, digital signal processors, codec chipsets, subscriber identity module cards, and so forth. The radio frequency circuit 904 may communicate with other terminals via at least one wireless communication protocol. The wireless communication protocols include, but are not limited to, the World Wide Web, metropolitan area networks, intranets, various generations of mobile communication networks (2G, 3G, 4G, and 5G), wireless local area networks, and/or WiFi (Wireless Fidelity) networks. In some embodiments, the radio frequency circuit 904 may further include NFC (Near Field Communication) related circuits, which is not limited by the present application.
The display 905 is used to display a UI (User Interface). The UI may include graphics, text, icons, video, and any combination thereof. When the display 905 is a touch display, the display 905 also has the ability to capture touch signals at or above its surface. A touch signal may be input to the processor 901 as a control signal for processing. In this case, the display 905 may also be used to provide virtual buttons and/or a virtual keyboard, also referred to as soft buttons and/or a soft keyboard. In some embodiments, there may be one display 905, provided on the front panel of the terminal 900; in other embodiments, there may be at least two displays 905, provided on different surfaces of the terminal 900 or in a folded design; in still other embodiments, the display 905 may be a flexible display provided on a curved or folded surface of the terminal 900. The display 905 may even be arranged in an irregular, non-rectangular pattern, i.e., a shaped screen. The display 905 may be made of LCD (Liquid Crystal Display), OLED (Organic Light-Emitting Diode), or other materials.
The camera assembly 906 is used to capture images or video. Optionally, the camera assembly 906 includes a front camera and a rear camera. Typically, the front camera is disposed on the front panel of the terminal and the rear camera is disposed on the rear surface of the terminal. In some embodiments, there are at least two rear cameras, each being any one of a main camera, a depth camera, a wide-angle camera, and a telephoto camera, so that the main camera and the depth camera may be fused to realize a background blurring function, and the main camera and the wide-angle camera may be fused to realize panoramic shooting, Virtual Reality (VR) shooting, or other fused shooting functions. In some embodiments, the camera assembly 906 may also include a flash. The flash may be a single-color-temperature flash or a dual-color-temperature flash. A dual-color-temperature flash is a combination of a warm-light flash and a cold-light flash, and can be used for light compensation under different color temperatures.
The audio circuit 907 may include a microphone and a speaker. The microphone is used to collect sound waves from the user and the environment, convert the sound waves into electrical signals, and input the electrical signals to the processor 901 for processing or to the radio frequency circuit 904 for voice communication. For stereo acquisition or noise reduction, there may be a plurality of microphones disposed at different portions of the terminal 900. The microphone may also be an array microphone or an omnidirectional pickup microphone. The speaker is used to convert electrical signals from the processor 901 or the radio frequency circuit 904 into sound waves. The speaker may be a conventional thin-film speaker or a piezoelectric ceramic speaker. A piezoelectric ceramic speaker can convert an electrical signal not only into sound waves audible to humans, but also into sound waves inaudible to humans for purposes such as ranging. In some embodiments, the audio circuit 907 may also include a headphone jack.
The location component 908 is used to locate the current geographic position of the terminal 900 to enable navigation or LBS (Location Based Service). The location component 908 may be a location component based on the GPS (Global Positioning System), the BeiDou system, or the Galileo system.
The power supply 909 is used to supply power to the various components in the terminal 900. The power supply 909 may use alternating current, direct current, a disposable battery, or a rechargeable battery. When the power supply 909 includes a rechargeable battery, the rechargeable battery may be a wired rechargeable battery (charged through a wired line) or a wireless rechargeable battery (charged through a wireless coil). The rechargeable battery may also be used to support fast-charge technology.
In some embodiments, terminal 900 can further include one or more sensors 910. The one or more sensors 910 include, but are not limited to, an acceleration sensor 911, a gyroscope sensor 912, a pressure sensor 913, an optical sensor 915, and a proximity sensor 916.
The acceleration sensor 911 can detect the magnitudes of acceleration on the three coordinate axes of a coordinate system established with respect to the terminal 900. For example, the acceleration sensor 911 may be used to detect the components of gravitational acceleration on the three coordinate axes. The processor 901 may control the display 905 to display the user interface in a landscape view or a portrait view according to the gravitational acceleration signal acquired by the acceleration sensor 911. The acceleration sensor 911 may also be used to collect motion data of a game or of the user.
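The landscape/portrait decision described above can be sketched as a comparison of the gravity components along the screen axes. The function below is a hypothetical illustration only; the axis convention (y along the long edge, x along the short edge) and the threshold-free comparison are assumptions, not part of the embodiment:

```python
def choose_orientation(ax: float, ay: float, az: float) -> str:
    """Pick a display orientation from gravity components (m/s^2).

    Assumed convention: the y axis runs along the long edge of the
    screen and the x axis along the short edge, as on many handsets.
    When gravity mostly projects onto the long edge, the device is
    being held upright, so a portrait layout is chosen.
    """
    return "portrait" if abs(ay) >= abs(ax) else "landscape"
```

A real implementation would typically also apply hysteresis and ignore samples taken while the device is in free fall or being shaken.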
The gyro sensor 912 may detect the body direction and rotation angle of the terminal 900, and may cooperate with the acceleration sensor 911 to collect the user's 3D motion of the terminal 900. Based on the data acquired by the gyro sensor 912, the processor 901 can realize functions such as motion sensing (e.g., changing the UI according to a tilting operation by the user), image stabilization during photographing, game control, and inertial navigation.
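One common way to fuse gyroscope and accelerometer data, as in the cooperation described above, is a complementary filter: the integrated gyroscope rate tracks fast rotation, while the gravity-derived accelerometer angle corrects the gyroscope's long-term drift. The sketch below is illustrative; the blend factor `alpha` and the single-axis formulation are assumptions:

```python
def complementary_filter_step(angle_deg: float, gyro_rate_dps: float,
                              accel_angle_deg: float, dt: float,
                              alpha: float = 0.98) -> float:
    """One update of a complementary filter for a single tilt axis.

    angle_deg:       previous filtered tilt estimate (degrees)
    gyro_rate_dps:   gyroscope angular rate (degrees per second)
    accel_angle_deg: tilt computed from the accelerometer's gravity vector
    dt:              time step in seconds
    alpha:           weight given to the integrated gyro path (0..1)
    """
    gyro_path = angle_deg + gyro_rate_dps * dt  # responsive, but drifts
    return alpha * gyro_path + (1.0 - alpha) * accel_angle_deg
```

With `alpha` near 1, short-term motion follows the gyroscope, while the small accelerometer contribution slowly pulls the estimate back toward the true gravity direction.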
The pressure sensor 913 may be provided at a side frame of the terminal 900 and/or at a lower layer of the display 905. When the pressure sensor 913 is provided at a side frame of the terminal 900, it can detect the user's grip signal on the terminal 900, and the processor 901 performs left-hand/right-hand recognition or a shortcut operation according to the grip signal collected by the pressure sensor 913. When the pressure sensor 913 is provided at the lower layer of the display 905, the processor 901 controls an operability control on the UI according to the user's pressure operation on the display 905. The operability controls include at least one of a button control, a scroll-bar control, an icon control, and a menu control.
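Left-hand versus right-hand recognition from side-frame pressure can be sketched as comparing the aggregate pressure on the two frames. The heuristic below (the side carrying the stronger, broader pressure is taken as the palm side, hence the gripping hand) and the noise threshold are simplifying assumptions for illustration:

```python
def detect_grip_hand(left_pressures, right_pressures, noise_floor=0.5):
    """Guess the gripping hand from side-frame pressure samples.

    Assumption: a palm wrapped around one side frame produces stronger,
    broader pressure there than the fingertips resting on the opposite
    frame. Readings below noise_floor are treated as sensor noise.
    """
    left = sum(p for p in left_pressures if p >= noise_floor)
    right = sum(p for p in right_pressures if p >= noise_floor)
    if left == right:
        return "unknown"
    return "left" if left > right else "right"
```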
The optical sensor 915 is used to collect the intensity of ambient light. In one embodiment, the processor 901 may control the display brightness of the display 905 based on the ambient light intensity collected by the optical sensor 915. Specifically, the display brightness of the display 905 is increased when the ambient light intensity is high, and decreased when the ambient light intensity is low. In another embodiment, the processor 901 may also dynamically adjust the shooting parameters of the camera assembly 906 based on the ambient light intensity collected by the optical sensor 915.
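The brightness adjustment described above can be sketched as a clamped mapping from measured illuminance to a brightness level. The linear ramp and the constants below are illustrative assumptions; real devices use tuned, often nonlinear curves:

```python
def screen_brightness(lux: float, min_level: float = 0.1,
                      max_level: float = 1.0,
                      max_lux: float = 1000.0) -> float:
    """Map ambient illuminance (lux) to a display brightness fraction.

    Bright surroundings raise the display brightness so content stays
    readable; dim surroundings lower it to save power and reduce glare.
    """
    frac = min(max(lux / max_lux, 0.0), 1.0)  # clamp to [0, 1]
    return min_level + frac * (max_level - min_level)
```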
The proximity sensor 916, also referred to as a distance sensor, is typically provided on the front panel of the terminal 900. The proximity sensor 916 is used to collect the distance between the user and the front of the terminal 900. In one embodiment, when the proximity sensor 916 detects that the distance between the user and the front of the terminal 900 is gradually decreasing, the processor 901 controls the display 905 to switch from a bright-screen state to an off-screen state; when the proximity sensor 916 detects that the distance is gradually increasing, the processor 901 controls the display 905 to switch from the off-screen state back to the bright-screen state.
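The bright-screen/off-screen switching described above can be sketched as a small state machine with hysteresis, so that jitter around a single cutoff does not make the display flicker. The two distance thresholds below are illustrative assumptions:

```python
class ProximityScreenController:
    """Switch the display off as the device nears the user's face and
    back on as it moves away.

    Two thresholds (hysteresis) prevent rapid toggling when the measured
    distance hovers near a single cutoff.
    """

    def __init__(self, off_below_cm: float = 3.0, on_above_cm: float = 5.0):
        self.off_below_cm = off_below_cm
        self.on_above_cm = on_above_cm
        self.screen_on = True

    def update(self, distance_cm: float) -> bool:
        """Feed one proximity reading; return the resulting screen state."""
        if self.screen_on and distance_cm < self.off_below_cm:
            self.screen_on = False   # approaching: darken the screen
        elif not self.screen_on and distance_cm > self.on_above_cm:
            self.screen_on = True    # receding: relight the screen
        return self.screen_on
```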
Those skilled in the art will appreciate that the structure shown in fig. 10 is not limiting of the terminal 900, and that the terminal may include more or fewer components than shown, combine certain components, or employ a different arrangement of components.
The memory further stores one or more programs, and the one or more programs include instructions for performing the method for selecting a task object provided by the embodiments of the present application.
The present application provides a computer-readable storage medium having at least one instruction stored therein, the at least one instruction being loaded and executed by a processor to implement the method for selecting a task object provided by the respective method embodiments above.
The present application also provides a computer program product or computer program comprising computer instructions stored in a computer-readable storage medium. A processor of a computer device reads the computer instructions from the computer-readable storage medium and executes them, causing the computer device to perform the method for selecting a task object provided in the alternative implementations described above.
The foregoing embodiment numbers of the present application are merely for description and do not represent the relative merits of the embodiments.
It will be understood by those skilled in the art that all or part of the steps for implementing the above embodiments may be implemented by hardware, or by a program instructing relevant hardware, where the program may be stored in a computer-readable storage medium such as a read-only memory, a magnetic disk, or an optical disk.
The foregoing description of the preferred embodiments of the present application is not intended to limit the application; the scope of the application is instead defined by the appended claims.
It should be noted that, before and during the collection of user-related data, the present application may display a prompt interface or popup window, or output voice prompt information, to inform the user that the relevant data is about to be collected. The step of obtaining the user-related data is executed only after a confirmation operation by the user on the prompt interface or popup window is obtained; otherwise (i.e., when no confirmation operation is obtained), the step of obtaining the user-related data is terminated and the data is not obtained. In other words, all user data collected by the present application is collected with the consent and authorization of the user, and the collection, use, and processing of relevant user data comply with the relevant laws, regulations, and standards of the relevant countries and regions.
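The consent flow described above can be sketched as a gate that runs the data-collection step only after an explicit confirmation. The callback names below are hypothetical, introduced purely for illustration:

```python
def collect_user_data(ask_confirmation, fetch_data):
    """Run fetch_data() only if the user confirms via the prompt.

    ask_confirmation: callable taking a prompt message, returning True
                      only on an explicit confirmation operation.
    fetch_data:       callable performing the actual data collection.
    Returns the collected data, or None when consent is not given.
    """
    prompt = "This application is about to collect your usage data. Allow?"
    if ask_confirmation(prompt):
        return fetch_data()
    return None  # no confirmation: end without obtaining any user data
```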