Detailed Description
The technical solutions in the embodiments of the present application will be described below with reference to the drawings in the embodiments of the present application.
Like reference numerals and letters denote like items in the following figures, and thus once an item is defined in one figure, no further definition or explanation thereof is necessary in the following figures. Meanwhile, in the description of the present application, the terms "first", "second", and the like are used only to distinguish the description, and are not to be construed as indicating or implying relative importance.
Referring to fig. 1, a schematic structural diagram of an electronic device 1 according to an embodiment of the present application is shown. The electronic device 1 includes at least one processor 11 and a memory 12; one processor is taken as an example in fig. 1. The processor 11 and the memory 12 are connected through a bus 10, and the memory 12 stores instructions executable by the at least one processor 11, the instructions being executed by the at least one processor 11 to cause the at least one processor 11 to perform the scene element interaction method in the embodiments described below.
In one embodiment, the processor 11 may be a general-purpose processor, including but not limited to a central processing unit (Central Processing Unit, CPU), a network processor (Network Processor, NP), etc., a digital signal processor (Digital Signal Processor, DSP), an application specific integrated circuit (Application Specific Integrated Circuit, ASIC), a field programmable gate array (Field Programmable Gate Array, FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components. The general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like. The processor 11 is the control center of the electronic device 1 and connects the various parts of the entire electronic device 1 using various interfaces and lines. The processor 11 may implement or perform the methods, steps and logic blocks disclosed in the embodiments of the present application.
In one embodiment, the memory 12 may be implemented by any type of volatile or non-volatile memory device or a combination thereof, including but not limited to random access memory (Random Access Memory, RAM), read-only memory (Read-Only Memory, ROM), static random access memory (Static Random Access Memory, SRAM), programmable read-only memory (Programmable Read-Only Memory, PROM), erasable programmable read-only memory (Erasable Programmable Read-Only Memory, EPROM), electrically erasable programmable read-only memory (Electrically Erasable Programmable Read-Only Memory, EEPROM), and the like.
The structure of the electronic device shown in fig. 1 is only illustrative, and the electronic device 1 may also comprise more or fewer components than shown in fig. 1, or have a different configuration than shown in fig. 1. The components shown in fig. 1 may be implemented in hardware, software, or a combination thereof.
Fig. 2 is a schematic structural diagram of a scene element interaction system 200 according to an embodiment of the present application. The interaction system 200 includes a client 210 and a server 220. The client 210 and the server 220 may exchange data with each other through a wired or wireless connection.
In one embodiment, the client 210 may be, but is not limited to, a mobile terminal such as a smart phone, a tablet computer, a notebook computer, etc.; the server 220 may be, but is not limited to being, a distributed server or the like. The server 220 provides services for the client 210.
Fig. 3 is a schematic flow chart of a scene processing method according to a first embodiment of the present application. The method may be performed by a client 210 as shown in fig. 2, the client 210 being capable of displaying a game scene, the game scene comprising scene elements, the method comprising the steps of:
Step S210: in response to an interaction instruction of a player user for a certain scene element, displaying an interactive editing interface.
The game scene refers to the scenes, scenery and environment presented to the player user in the game. Game developers can pre-configure the scene effects presented by the game scene according to the type of the game and its specific content. In a 3D multi-player simultaneous online game and learning world, player users can explore, learn, and interact in such a game world. The game scene is exemplified by the Carnival world or the future world. Player users refer to the game users or players participating in the Carnival world or future world games, not the game developers.
The game scene comprises scene elements, which can be understood as static facilities such as buildings and dynamic scene elements such as trees, grass and rivers; all the scene elements are combined to form the game background environment. In a game scene of the Carnival world or the future world, a player user may see buildings or other scene elements in many places. By way of example, the buildings may be high-rise buildings, platforms, fountains, steps, street lamps, audio equipment, etc. in the game scene, and the dynamic scene elements may be trees, grass, and rivers.
When a player user wants to edit a building or other scene element after entering the game scene, the player user enters through an interaction portal; after the interaction portal is entered, the client 210 displays an interactive editing interface. By operating the interactive editing interface, the player user creates or edits buildings or other scene elements in the game scene according to his or her own preference. By way of example, the colors of buildings such as high-rise buildings, platforms, fountains, steps, street lamps and audio equipment in the game scene can be edited, authored or adjusted; the rhythms of dynamic scene elements such as trees, grass and the water flow of rivers can be edited, authored or adjusted, where the rhythm may be the swaying rhythm of trees, the swaying rhythm of grass or the flowing speed of the water; the swaying angle of trees and grass, or the spraying angle of a fountain, can be edited, authored or adjusted; the spraying duration of a fountain and the playing duration of background music played by the audio equipment can also be edited, authored or adjusted. The interactive editing interface is used for displaying the various attribute parameters of a building or other scene element that can be edited or authored.
For example, the player user may move to a certain scene element, say the fountain, where a prompt option for an interaction portal is displayed. The player user triggers the scene interaction portal button with the mouse and clicks to enter the scene interaction portal, thereby entering an editing interface for editing and authoring the fountain scene element. For the client 210, when the player user triggers the button of the interaction portal, the client 210 responds to the interaction instruction of the player user for the fountain scene element, and an interactive editing interface for the fountain scene element is displayed on the client 210.
In another embodiment, the player user may move a finger to the side of the fountain by touching the screen, trigger the scene interaction portal button, and tap to enter the scene interaction portal; after the client 210 displays the interactive editing interface, the player user may edit and author the specific scene element. Besides triggering the interaction instruction for a certain scene element by mouse click or finger touch, the instruction may also be triggered by voice input. The client 210 displays the interactive editing interface after receiving an interaction instruction of the player user for a certain scene element.
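For illustration only, the following minimal TypeScript sketch shows one way such an interaction instruction could be dispatched on the client to open the editing interface; the type names, class and parameter list are assumptions made for the sketch and are not part of the embodiments described above.

```typescript
// Hypothetical client-side sketch: dispatching an interaction instruction for a
// scene element and opening its interactive editing interface.
type InteractionSource = "mouseClick" | "touch" | "voice";

interface InteractionInstruction {
  elementId: string;         // e.g. "fountain-01"
  source: InteractionSource; // how the interaction portal was triggered
}

class SceneElementClient {
  // Called when the player user triggers the interaction portal of a scene element.
  onInteractionInstruction(instruction: InteractionInstruction): void {
    const editableParameters = this.loadEditableParameters(instruction.elementId);
    this.showEditingInterface(instruction.elementId, editableParameters);
  }

  private loadEditableParameters(elementId: string): string[] {
    // In practice these come from the element's configuration; the fountain
    // example exposes color, spraying speed, spraying pattern, and so on.
    return ["color", "spraySpeed", "sprayPattern", "repeatCount", "waitTime", "music"];
  }

  private showEditingInterface(elementId: string, parameters: string[]): void {
    console.log(`Interactive editing interface for ${elementId}:`, parameters);
  }
}

new SceneElementClient().onInteractionInstruction({ elementId: "fountain-01", source: "mouseClick" });
```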
Step S220: acquiring, through the interactive editing interface, attribute parameters input by the player user for the certain scene element.
In step S210, a fountain is taken as an example of the building: the player user moves to the side of the fountain, clicks the interaction portal beside the fountain, and the interactive editing interface is displayed on the client 210. The fountain scene element can then be edited on this interactive interface.
The editing of a scene element is mainly performed on the attribute parameters of the scene element. Taking the fountain scene element as an example, please refer to fig. 4; the attribute parameters include: fountain color, fountain spraying speed, fountain spraying pattern, number of spraying repetitions, waiting time between adjacent sprays, music to play, and the like.
There are various ways of inputting the attribute parameters. The attribute parameters may be entered through an input box, a selection item, a scroll bar, a drop-down menu option, or the like. For example, in fig. 4, when the attribute parameter is the fountain color, the player user can scroll through the scroll bar in the editing module and select emerald green, lemon yellow or haze blue to edit the fountain color. When the attribute parameter is the fountain spraying speed, the player user can select the spraying speed through a drop-down menu option; the spraying speed may be 1 m/s, 2 m/s, 3 m/s, 4 m/s, 5 m/s, and so on, for example 4 m/s, including but not limited to these values. When the attribute parameter is the fountain spraying pattern, the player user can select the up, down, left and right directions through selection items, and an inclination angle can be specified for each direction; for example, the player user can manually input an inclination of 30 degrees upward through an input box. When the attribute parameter is the number of spraying repetitions, the player user can select the number of repetitions, for example 1, through a drop-down menu option. When the attribute parameter is the waiting time between adjacent sprays, the player user can select, for example, 1 s through a drop-down menu option. When the attribute parameter is the music to play, the player user can select the desired background music through a selection item.
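As a sketch under assumed data structures (not the actual interface of the embodiments), each attribute parameter can be described together with the input control used to enter it, mirroring the fountain example above:

```typescript
// Assumed structure for illustration: each attribute parameter declares how it
// is entered on the interactive editing interface.
type InputWidget = "scrollBar" | "dropDownMenu" | "selectionItem" | "inputBox";

interface AttributeParameter {
  name: string;
  widget: InputWidget;
  options?: (string | number)[]; // candidate values for scroll bars, drop-downs, selections
  value?: string | number;       // value chosen or typed by the player user
}

const fountainParameters: AttributeParameter[] = [
  { name: "color",        widget: "scrollBar",     options: ["emerald green", "lemon yellow", "haze blue"] },
  { name: "spraySpeed",   widget: "dropDownMenu",  options: [1, 2, 3, 4, 5], value: 4 },      // m/s
  { name: "sprayPattern", widget: "selectionItem", options: ["up", "down", "left", "right"] },
  { name: "tiltAngle",    widget: "inputBox",      value: 30 },                               // degrees, typed manually
  { name: "repeatCount",  widget: "dropDownMenu",  options: [1, 2, 3, 4, 5], value: 1 },
  { name: "waitTime",     widget: "dropDownMenu",  options: [1, 2, 3], value: 1 },            // seconds between sprays
  { name: "music",        widget: "selectionItem", options: ["background track A", "background track B"] },
];
```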
Since there are multiple scene elements in a game scene, and each scene element has multiple attribute parameters, the scene elements in fig. 4 also include, for example, high-rise buildings, street lamps, steps, and the like. When the scene element is a street lamp, its attribute parameters may include the appearance color, illumination color, height, appearance structure and lighting time of the street lamp. By editing the attribute parameters of the street lamp, the player user can customize a personalized dynamic picture of the street lamp element.
It should be noted that the player user can only perform editing or re-authoring operations on buildings or other scene elements that already exist in the game scene.
In a 3D multi-player simultaneous online game scene, player users may author and edit buildings or other scene elements in the game scene. The player user can edit the existing buildings or other scene elements according to his or her own preference, so that the buildings or scene elements present the picture effect the user likes. The player user can thus participate in the creation of the game scene, changing the current situation in which none of the scene elements of a traditional game can be altered after the game is developed and published. Being able to create personalized animation effects in the game scene makes the player user more interested in participating, thereby improving the player user's sense of participation in the game.
Step S230: displaying the certain scene element in the game scene according to the attribute parameters.
As described above, the player user edits the attribute parameters of the fountain according to his or her own preference, so that the edited fountain presents a personalized effect. The player user can freely combine multiple attribute parameters of the fountain scene element in the game scene according to preference. For example, when the player user only wants to edit or author a combination of the fountain color, the fountain spraying speed, the fountain spraying pattern and the number of spraying repetitions, the player can select the editing modules corresponding to these four attribute parameters respectively, and then select the value corresponding to each attribute parameter.
After the selection is completed, clicking the run button on the interactive editing interface generates the display result of the edited scene on the client 210. The display result of the scene is that the fountain color is emerald green, the fountain spraying speed is 4 m/s, the fountain spraying pattern is tilted upward by 30 degrees, and the number of spraying repetitions is 5, as shown in fig. 5. The player user can see the effect display diagram of the finally edited fountain scene element on the client 210.
During the editing of a certain scene element, if the player user does not want a certain attribute parameter, the editing module of that attribute parameter can be dragged onto the label indicated by the trash bin to delete it. By way of example, if the player user only wants to edit the fountain color and the fountain spraying pattern, the editing module of the fountain spraying speed can be dragged onto the trash bin label to be deleted, as sketched below. In this way, the player user can change each attribute parameter as needed, generating rich combinations and obtaining various personalized effects.
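A minimal sketch of the deletion step, with a hypothetical helper and data (the real editing modules are interface components rather than plain objects):

```typescript
// Dropping an editing module onto the trash-bin label removes that attribute
// parameter from the set being edited; the element keeps its default for it.
function dropOnTrashBin<T extends { name: string }>(editedParameters: T[], name: string): T[] {
  return editedParameters.filter(parameter => parameter.name !== name);
}

// Example: keep only the fountain color and spraying pattern edits.
const edited = [{ name: "color" }, { name: "spraySpeed" }, { name: "sprayPattern" }];
const kept = dropOnTrashBin(edited, "spraySpeed"); // -> color and sprayPattern remain
console.log(kept);
```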
In one embodiment, the step S230 specifically includes steps S241-S243:
Step S241: obtaining code data corresponding to the attribute parameters of the certain scene element.
As described above, in the interactive editing interface the player user can freely edit, and adjusts the attribute parameters of an existing scene element through the code data corresponding to those attribute parameters. In effect, the editing or authoring of the attribute parameters of existing elements is realized through the development of the underlying code data.
Step S242: generating an editing instruction data set corresponding to the code data according to a target editing result of the code data.
Illustratively, taking the fountain color attribute parameter as an example, if the player user wants to change the fountain color, the code data corresponding to the attribute parameters of the different fountain colors configured at the bottom layer needs to be obtained. Each fountain color is supported by corresponding code data pre-developed by the game developer.
According to the fountain color which the player user wants to edit or author, the background automatically calls the code data corresponding to the fountain color selected by the player user and runs it. By way of example, if the player user wants to edit the fountain color to emerald green, the background automatically calls the code data corresponding to emerald green and runs it, and the fountain color in the scene is finally displayed as emerald green; the target editing result of the code data is thus that the fountain color is displayed as emerald green. That is, the background calls the code data corresponding to the fountain color the player user wants, according to the player user's instruction. The running instructions of the code data corresponding to all fountain colors are collectively called the editing instruction data set corresponding to the code data. When the game is initially developed, the developer configures and stores the code data of all colors available for the fountain in an underlying database; when the player user edits the fountain color, the background automatically calls the code data of the fountain color currently selected by the player user and runs it.
All the color classes described herein include red, orange, yellow, green, cyan, blue and violet, which are further subdivided. The red series, for example, may be further subdivided into pink, light red, magenta, red, agate, maroon, lotus red, cherry red, garnet, orange, iron red, chrome red, vermilion, sorghum red, scarlet, deep red, dark red, purplish red, rose red, brown and coffee; the subdivision principle of the orange, yellow, green, cyan, blue and violet series is the same as that of the red series and is not repeated here. The code data corresponding to the color classes covered in the underlying database is therefore relatively complete, the player user has a wide selection when editing and authoring the fountain color, the use requirements of the player user can be fully met, and the player user does not need to reconfigure the corresponding fountain color.
Based on the same principle, the attribute parameters such as the fountain spraying speed, spraying duration, number of spraying repetitions, waiting time between adjacent sprays, fountain spraying pattern and music to play all have corresponding code data. When the player user edits or authors the attribute parameters of a scene element, the background calls the corresponding code data and runs it according to the player user's editing or authoring instruction for those attribute parameters.
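The following sketch illustrates, under assumed names, how the background might map each selectable attribute value to its pre-developed code data and run the entry matching the player user's selection; it is not the actual underlying database of the embodiments.

```typescript
// Each attribute value is assumed to map to a pre-developed routine ("code data")
// that renders the corresponding effect in the scene.
type CodeData = () => void;

const fountainColorCodeData: Record<string, CodeData> = {
  "emerald green": () => console.log("render fountain spray in emerald green"),
  "lemon yellow":  () => console.log("render fountain spray in lemon yellow"),
  "haze blue":     () => console.log("render fountain spray in haze blue"),
};

// When the player user selects a fountain color, the background looks up the
// corresponding code data and runs it, producing the target editing result.
function applyFountainColor(selectedColor: string): void {
  const codeData = fountainColorCodeData[selectedColor];
  if (codeData) {
    codeData();
  }
}

applyFountainColor("emerald green"); // fountain is finally displayed as emerald green
```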
Step S243: displaying the certain scene element in the game scene according to the editing instruction data set corresponding to the code data.
The player user edits each attribute parameter of the fountain, and each attribute parameter of the fountain scene element has corresponding code data. Each time the player user requests execution of the code data corresponding to one attribute parameter, the client 210 responds to the player user's editing instruction, and the dynamic effect of the scene element edited by the player user is finally displayed on the client.
In one embodiment, after step S243, the method further includes step S244:
Step S244: storing the editing instruction data set corresponding to the code data on the server.
Because the player user can run and view the editing effect in real time after finishing editing the attribute parameters of a certain scene element, if the player user is satisfied with the current editing and display effect, the save button on the interactive editing interface can be clicked to save each attribute parameter edited or authored this time. The client generates a corresponding editing instruction data set from the code data corresponding to each edited attribute parameter and submits it to the server for storage and recording.
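A sketch of the save step under an assumed payload and endpoint (the endpoint path, field names and use of fetch are illustrative, not the actual protocol between the client 210 and the server 220):

```typescript
// When the player user clicks "save", the client packages the edited parameters
// into an editing instruction data set and submits it to the server for recording.
interface EditInstruction {
  parameter: string;      // e.g. "color"
  value: string | number; // e.g. "emerald green"
}

interface EditInstructionDataSet {
  playerId: string;
  elementId: string;            // e.g. "fountain-01"
  instructions: EditInstruction[];
  submittedAt: number;          // timestamp, useful later when resolving versions
}

async function saveEditInstructionDataSet(dataSet: EditInstructionDataSet): Promise<void> {
  await fetch("/scene/edit-instructions", {   // hypothetical endpoint on server 220
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(dataSet),
  });
}

saveEditInstructionDataSet({
  playerId: "player-001",
  elementId: "fountain-01",
  instructions: [{ parameter: "color", value: "emerald green" }, { parameter: "spraySpeed", value: 4 }],
  submittedAt: Date.now(),
});
```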
In one embodiment, after step S230, the method further includes step S240-step S250:
Step S240: responding to a debugging operation instruction of the player user for the edited scene element.
The player user can arrive at the most satisfactory picture display effect by continuously editing, authoring and debugging the attribute parameters of a certain scene element. This may be a continually explored and repeated process, so the client 210 responds to each debugging instruction of the player user.
Step S250: publishing the display result of the debugged scene element edited by the player user.
When the player user returns to the game scene, the client displays the picture effect of the attribute parameters of the scene element edited by the player user according to the latest editing instruction data set stored by the server. The player user can therefore repeatedly try editing and modifying before publishing, trial-run to view the effect, and then save and publish.
Fig. 6 is a schematic flow chart of a scene processing method according to a second embodiment of the present application. The method may be performed by the server 220 shown in fig. 2, and the method includes steps S610-S630:
Step S610: receiving an editing instruction, submitted by the client 210, of a player user for an attribute parameter of a certain scene element in the game scene.
Step S620: executing the editing instruction of the player user for the attribute parameter of the certain scene element in the game scene according to the login information of the player user.
Step S630: sending, to the client 210, a display result operation instruction corresponding to the editing instruction of the player user for the attribute parameter of the certain scene element in the game scene, so that it is displayed by the client 210.
In the above steps, when the player user edits the attribute parameters of a scene element, the client 210 receives the display result operation instruction corresponding to the player user's editing instruction for the attribute parameters of that scene element in the game scene, and displays the editing picture effect of the player user.
Because the client 210 and the server 220 interact with each other, reference may be made to the description of the execution process of the client 210. After the client 210 sends the player user's editing instruction for the attribute parameter of a certain scene element in the game scene to the server 220, the server 220 executes, based on the ID information of the logged-in player user, the editing instruction for the attribute parameter of that scene element sent by the logged-in player user, and then sends the corresponding display result operation instruction to the client 210, so that it is displayed by the client 210 and the player user can view the picture and animation effect through the client 210.
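A server-side sketch under assumed types and an in-memory store (the real server 220 persists the editing instruction data sets in its database); it shows the execute-and-return flow described above:

```typescript
// The server identifies the logged-in player user, executes the editing
// instruction, records the result, and returns a display result operation
// instruction for the client 210 to render.
interface EditRequest {
  playerId: string;   // taken from the player user's login (ID) information
  elementId: string;
  instructions: { parameter: string; value: string | number }[];
}

interface DisplayResultInstruction {
  elementId: string;
  appliedParameters: Record<string, string | number>;
}

const sceneEditStore = new Map<string, DisplayResultInstruction>(); // keyed by player and element

function handleEditRequest(request: EditRequest): DisplayResultInstruction {
  const applied: Record<string, string | number> = {};
  for (const instruction of request.instructions) {
    applied[instruction.parameter] = instruction.value; // execute the editing instruction
  }
  const result: DisplayResultInstruction = {
    elementId: request.elementId,
    appliedParameters: applied,
  };
  sceneEditStore.set(`${request.playerId}:${request.elementId}`, result); // record against the player's scene
  return result;                                                          // sent back to the client for display
}
```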
In one embodiment, step S610 includes the steps of:
Step S611: receiving code data sets corresponding to attribute parameters of a certain scene element in the game scene submitted by all player users.
Each player user has an editing data instruction set belonging to his or her current game scene. After the player user logs in, the client 210 pulls the player user's editing data instruction set stored on the server 220, including the data of the game scene where the player user is located, so as to load the game scene belonging to that player user. Accordingly, the server 220 receives the code data sets corresponding to the attribute parameters of a certain scene element in the game scene submitted by all player users, and finally the personalized game scene picture of each player user is displayed on the client 210 through the interaction between the client 210 and the server 220.
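A sketch of scene loading on login (the endpoint, field names and the applyParameter stub are assumptions): the client pulls the player user's stored editing data instruction set from the server 220 and replays it to rebuild that player's personalized scene.

```typescript
// Re-apply every stored edit so the game scene loads with the player's edits.
interface StoredDataSet {
  elementId: string;
  instructions: { parameter: string; value: string | number }[];
}

function applyParameter(elementId: string, parameter: string, value: string | number): void {
  console.log(`apply ${parameter}=${value} to ${elementId}`); // stand-in for the rendering call
}

async function loadPlayerScene(playerId: string): Promise<void> {
  const response = await fetch(`/scene/edit-instructions?playerId=${playerId}`); // hypothetical endpoint
  const dataSets: StoredDataSet[] = await response.json();

  for (const dataSet of dataSets) {
    for (const instruction of dataSet.instructions) {
      applyParameter(dataSet.elementId, instruction.parameter, instruction.value);
    }
  }
}

loadPlayerScene("player-001");
```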
Step S612: updating the locally stored code data set corresponding to the attribute parameter of the certain scene element in the old version of the game scene with the code data set corresponding to the attribute parameter of that scene element in the game scene submitted by the target player user.
When there are multiple player users, each player user may submit a code data set corresponding to the attribute parameter of a certain scene element in the game scene. Assuming that two player users both modify the attribute parameters of the same scene element, the server 220 needs to determine which player user's submitted code data set is taken as authoritative, in order to update the code data set corresponding to the attribute parameter of that scene element in the old version of the game scene already stored in the local database of the server 220. It should be noted that each player user can see the code data set corresponding to the attribute parameter of a scene element after it has been modified, updated and submitted to the server 220, and the server 220 may update the locally stored old-version code data set according to the order in which the two player users submitted. In addition, the server 220 may also select, according to a preset judgment principle, one of the code data sets submitted by the two player users to update the code data set corresponding to the attribute parameter of that scene element in the old version of the game scene in its local database.
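A sketch of this selection step under assumed data shapes: the server either keeps the most recent submission or applies the preset judgment principle (such as the ranking rule described below) before updating the locally stored code data set.

```typescript
interface Submission {
  playerId: string;
  elementId: string;
  codeDataSet: Record<string, string | number>;
  submittedAt: number; // submission timestamp
}

function selectAuthoritativeSubmission(
  submissions: Submission[],
  rule: "latest" | "ranking",
  rankOf: (playerId: string) => number, // lower value means higher rank, per the judgment principle
): Submission {
  if (rule === "latest") {
    // Update according to the time order in which the player users submitted.
    return submissions.reduce((a, b) => (b.submittedAt > a.submittedAt ? b : a));
  }
  // Update according to the preset judgment principle (e.g. learning score ranking).
  return submissions.reduce((a, b) => (rankOf(b.playerId) < rankOf(a.playerId) ? b : a));
}
```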
In one embodiment, the judgment principle may be the ranking of each player user's learning score in the game scene, or the ranking of the team score contributed by each player user in the game scene. The target player user is defined herein as the player user with the highest learning score ranking in the game among all player users.
The locally stored code data set corresponding to the attribute parameter of the certain scene element in the old version of the game scene is updated with the code data set submitted by the player user whose learning score ranks highest among all player users in the game. It should be noted that a player user without editing authority cannot edit or author elements in the game scene where that player user is located. Under this rule, not all player users have editing authority; only the player user with the highest learning score ranking in the game is given editing authority.
Taking the future world game scene as an example, each player user can obtain, through an election by vote, the right to edit and author scene element attribute parameters in the future world game scene during a certain period.
In existing future world game scenes, only the player user himself can see the editing result of the scene element attribute parameters, and other player users in the same game scene cannot see the editing process. In this embodiment, the editing of the attribute parameters of game scene elements can be made visible to all player users.
Specifically, a mayor may be elected by voting among all player users in the Carnival town. In a certain future period, for example one week, the player user elected as mayor can edit and author the attribute parameters of the existing scene elements in the future world game scene according to his or her own ideas and preferences, and after continuous debugging and trial running, the player user serving as mayor can publish the display results corresponding to the finally edited scene element attribute parameters to all player users.
The election of the mayor depends on the contributions of the player users in the game scene within a certain period, such as learning score, contributed team score, answer success rate and login duration. The player users are ranked according to a certain rule algorithm, and the top-ranked player user is given the authority to serve as the mayor of the town, i.e., the editing manager. For example, the rule algorithm may calculate a final score from the sub-score of each calculation condition (learning score, contributed team score, answer success rate, login duration) and the weight it carries; the calculation data may refer to table 1.
Table 1 Player user ranking rule calculation data

Calculation condition | Weight
Learning score | 0.3
Contributed team score | 0.25
Answer success rate | 0.3
Login duration | 0.15

(Each calculation condition is first converted into a sub-score according to preset value ranges, and the final score is the weighted sum of the sub-scores.)
The four calculation conditions of each player user are scored based on the calculation data of table 1. For example, player user A has a learning score of 110 (sub-score 10), a contributed team score of 72 (sub-score 5), an answer success rate of 85% (sub-score 10) and a login duration of 85 min (sub-score 10); the final score of player user A is calculated as: S_A = 10*0.3 + 5*0.25 + 10*0.3 + 10*0.15 = 8.75.
Player user B has a learning score of 122 (sub-score 10), a contributed team score of 102 (sub-score 10), an answer success rate of 78% (sub-score 5) and a login duration of 90 min (sub-score 10); the final score of player user B is calculated as: S_B = 10*0.3 + 10*0.25 + 5*0.3 + 10*0.15 = 8.5.
By comparison, S_A > S_B; therefore player user A is ranked higher and can be given the editing management authority. The election is held for a certain period; the player user selected as mayor, i.e., as the manager of the other player users, can edit and publish scenes during the management period, and the edited and published game scene is visible to all users. With this method, the participation of player users can be better improved, their sense of honor increased, and their enthusiasm for creation stimulated; in addition, excellent edited works become visible to more player users.
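The weighted calculation above can be written as the following sketch; the weights come directly from the formulas for players A and B, while the way raw values are turned into sub-scores (e.g. 10 or 5 points) is assumed to follow the preset brackets of table 1.

```typescript
// Sub-scores for the four calculation conditions of one player user.
interface PlayerSubScores {
  learning: number;      // learning score sub-score
  team: number;          // contributed team score sub-score
  answerRate: number;    // answer success rate sub-score
  loginDuration: number; // login duration sub-score
}

// Final score = weighted sum of the sub-scores (weights from table 1).
function finalScore(s: PlayerSubScores): number {
  return s.learning * 0.3 + s.team * 0.25 + s.answerRate * 0.3 + s.loginDuration * 0.15;
}

const scoreA = finalScore({ learning: 10, team: 5,  answerRate: 10, loginDuration: 10 }); // 8.75
const scoreB = finalScore({ learning: 10, team: 10, answerRate: 5,  loginDuration: 10 }); // 8.5
console.log(scoreA > scoreB); // true: player user A gets the editing management authority
```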
Fig. 7 is a flowchart of an interaction method of a scene element according to a third embodiment of the present application. The method comprises the steps of S710-S750:
Step S710: entering the interaction portal by clicking a building in the game scene.
In this step, the manner in which the player user enters the interactive portal may refer to the description of step S210 in detail, and will not be described here again.
Step S720: entering an interactive editing interface.
In this step, the player user enters the interactive editing interface, and the input manner of the scene element attribute parameter can refer to the description of step S220 in detail, which is not described here again.
Step S730: editing the attribute parameters of the scene element, and judging whether to perform a trial run after the editing is completed. If not, return to step S720; if yes, go to step S740.
In this step, the player user can edit the existing buildings or other scene elements in the game scene according to his or her own preference, so that the buildings or scene elements present the picture effect the user likes. The player user can also freely combine multiple attribute parameters of the fountain scene element according to preference. After the player user completes the editing and authoring of the scene element attribute parameters through the interactive editing page, the trial run button can be clicked in order to view the editing effect in time. Depending on how satisfied the player user is with the displayed result of the trial run, the player user may choose to continue editing or to publish directly, as sketched below. Illustratively, when editing is completed, clicking the run button on the interactive editing interface generates a display result of the edited scene on the client 210; if the edits have not reached the effect the player user wants, editing can continue until a satisfactory effect is reached.
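The edit / trial-run / publish loop of steps S720-S740 can be summarized by the following sketch with hypothetical callbacks:

```typescript
// Keep editing and trial-running until the player user is satisfied, then publish.
function editUntilSatisfied(
  edit: () => void,          // steps S720/S730: edit attribute parameters
  trialRun: () => boolean,   // run the edited scene; true when the result is satisfactory
  publish: () => void,       // step S740: publish the display result
): void {
  do {
    edit();
  } while (!trialRun());
  publish();
}
```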
Step S740: publishing.
The player user can arrive at the most satisfactory picture display effect by continuously editing, authoring and debugging the attribute parameters of a certain scene element. When the player user returns to the game scene, the client displays the picture effect of the attribute parameters of the scene element edited by the player user according to the latest editing instruction data set stored by the server. The player user can therefore repeatedly try editing and modifying before publishing, trial-run to view the effect, and then save and publish. For details, reference is made to the description of step S250.
Step S750: displaying the edited scene, where the player user sees the scene edited by himself or herself.
After publishing, the player user can see his or her own work in the game scene, that is, what you see is what you get, which finally brings a sense of accomplishment.
In summary, by adopting the interaction method for scene elements, the player user can participate in the personalized editing of the existing elements in the game scene, which greatly stimulates the player user's personal creativity and motivation to create. The participation of player users can be better improved, their sense of honor increased, and their enthusiasm for creation stimulated; by this method, excellent edited works also become visible to more player users.
Fig. 8 is a schematic structural diagram of an interaction device of the scene element corresponding to fig. 3, where the interaction device of the scene element includes: a first display module 810, an acquisition module 820, a second display module 830.
The first display module 810 is configured to display an interactive editing interface in response to an interactive instruction of a player user for a certain scene element.
The acquisition module 820 is configured to acquire attribute parameters input by the player user for a certain scene element through the interactive editing interface.
The second display module 830 is configured to display a certain scene element in the game scene according to the attribute parameter.
Fig. 9 is a schematic structural diagram of an interaction device of the scene element corresponding to fig. 6, where the interaction device of the scene element includes: a receiving module 910, an executing module 920, and a processing module 930.
The receiving module 910 is configured to receive an edit instruction submitted by the client for an attribute parameter of a certain scene element in the game scene by a player user.
The execution module 920 is configured to execute an editing instruction of the player user for an attribute parameter of a certain scene element in the game scene according to the login information of the player user.
The processing module 930 is configured to send, to the client, a display result operation instruction corresponding to the player user's editing instruction for the attribute parameter of the certain scene element in the game scene, and to display it through the client.
The implementation process of the functions and roles of each module in the above device is specifically detailed in the implementation process of the corresponding steps in the above description, and will not be repeated here.
The embodiment of the invention also provides an electronic device readable storage medium, including a program which, when run on an electronic device, causes the electronic device to perform all or part of the flow of the method in the above-described embodiments. The storage medium may be a magnetic disk, an optical disk, a read-only memory (Read-Only Memory, ROM), a random access memory (Random Access Memory, RAM), a flash memory (Flash Memory), a hard disk (Hard Disk Drive, HDD), a solid state drive (Solid State Drive, SSD), or the like. The storage medium may also include a combination of the above-mentioned types of memory.
In the several embodiments provided in the present application, the disclosed apparatus and method may be implemented in other manners. The apparatus embodiments described above are merely illustrative, for example, flow diagrams and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of apparatus, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s).
In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
In addition, the functional modules in the embodiments of the present application may be integrated together to form a single part, or each module may exist alone, or two or more modules may be integrated to form a single part.
In this specification, each embodiment is described in a progressive manner, and identical and similar parts of each embodiment are all referred to each other, and each embodiment mainly describes differences from other embodiments. The foregoing description is only of the preferred embodiments of the present application, and is not intended to limit the scope of the present application. Any modifications, equivalent substitutions, improvements, or the like, which would occur to one skilled in the art, are intended to be included within the spirit and principles of the present application.