CN116510311A - Scene element interaction method and device, electronic equipment and storage medium - Google Patents

Scene element interaction method and device, electronic equipment and storage medium

Info

Publication number
CN116510311A
Authority
CN
China
Prior art keywords
scene
player user
scene element
game
certain
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202310483558.9A
Other languages
Chinese (zh)
Other versions
CN116510311B (en)
Inventor
贾强强
陈向东
潘卓
陶建霖
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Siming Qichuang Technology Co ltd
Original Assignee
Beijing Siming Qichuang Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Siming Qichuang Technology Co ltd
Priority to CN202310483558.9A
Publication of CN116510311A
Application granted
Publication of CN116510311B
Active legal status (Current)
Anticipated expiration

Abstract

The application provides a scene element interaction method and device, an electronic device, and a storage medium. The method is applied to a client that can display a game scene containing scene elements, and includes: in response to an interaction instruction of a player user for a certain scene element, displaying an interactive editing interface; acquiring, through the interactive editing interface, attribute parameters input by the player user for the scene element; and displaying the scene element in the game scene according to the attribute parameters. With this interaction method, a player user can take part in the personalized editing of existing elements in the game scene, which greatly stimulates the player user's personal creativity and motivation to create. The method can better improve the player user's participation, strengthen the player user's sense of honor, and raise the player user's enthusiasm for creation. It also makes excellent edited works visible to more player users.

Description

Scene element interaction method and device, electronic equipment and storage medium
Technical Field
The present invention relates to the field of data processing technologies, and in particular, to a method and apparatus for interaction of scene elements, an electronic device, and a storage medium.
Background
In a 3D multiplayer simultaneous-online game scene and learning world, users can explore, learn, and interact in the world, or simply accept the game content provided officially in such a world scene.
In a typical 3D multiplayer online world scene, participants can only look at the buildings or scenery in the scene; they cannot edit or create on the scene elements that already exist, so the user has no interaction with the scene and the game scene is monotonous. Alternatively, the user can only create objects in his own area, so the scope of creation is limited. Neither case stimulates the user's enthusiasm for creation or sense of achievement, and the user experience is therefore weak.
Disclosure of Invention
The embodiments of the present application aim to provide an interaction method for scene elements. With this method, a player user can take part in the personalized editing of existing elements in a game scene, which greatly stimulates the player user's personal creativity and motivation to create. The method can better improve the player user's participation, strengthen the player user's sense of honor, and raise the player user's enthusiasm for creation. It also makes excellent edited works visible to more player users.
In a first aspect, the present application provides an interaction method for a scene element, applied to a client, where the client can display a game scene and the game scene includes the scene element. The method includes: in response to an interaction instruction of a player user for a certain scene element, displaying an interactive editing interface; acquiring, through the interactive editing interface, attribute parameters input by the player user for the certain scene element; and displaying the certain scene element in the game scene according to the attribute parameters.
In an embodiment, the displaying the certain scene element in the game scene according to the attribute parameter includes: acquiring code data corresponding to the attribute parameters of a certain scene element; generating an editing instruction data set corresponding to the code data according to a target editing result of the code data; and displaying the certain scene element in the game scene according to the editing instruction data set corresponding to the code data.
In one embodiment, after the displaying the certain scene element in the game scene according to the editing instruction data set corresponding to the code data, the method further includes: and storing the editing instruction data set corresponding to the code data in a server.
In an embodiment, after the displaying the certain scene element in the game scene according to the attribute parameters, the method further includes: responding to a debugging operation instruction of the player user on the edited certain scene element; and publishing the display result of the certain scene element edited and debugged by the player user.
In a second aspect, the present application provides an interaction method of a scene element, applied to a server, including: receiving an editing instruction of a player user submitted by a client for attribute parameters of a certain scene element in a game scene; executing an editing instruction of the player user for attribute parameters of a certain scene element in a game scene according to login information of the player user; and sending a display result operation instruction corresponding to the editing instruction of the player user aiming at the attribute parameter of a certain scene element in the game scene to the client, and displaying the display result operation instruction through the client.
In an embodiment, the receiving the editing instruction of the attribute parameter of the player user for a certain scene element in the game scene submitted by the client includes: receiving code data sets corresponding to attribute parameters of a certain scene element in the game scene submitted by all player users; and updating the code data set corresponding to the attribute parameter of a certain scene element in the game scene of the old version stored locally by using the code data set corresponding to the attribute parameter of the certain scene element in the game scene submitted by the target player user.
In one embodiment, the target player user is the player user with the highest rank of all the player users' learning points in the game; the updating of the locally stored code dataset corresponding to the attribute parameter of a certain scene element in the old version of the game scene with the code dataset corresponding to the attribute parameter of the certain scene element in the game scene submitted by the target player user comprises the following steps: and updating the code data set corresponding to the attribute parameter of a certain scene element in the game scene of the old version stored locally by using the code data set corresponding to the attribute parameter of the certain scene element in the game scene submitted by the player user with the highest learning score ranking of all the player users in the game.
In a third aspect, the present application provides an interaction device for a scene element, including:
the first display module is used for responding to an interaction instruction of a player user aiming at a certain scene element and displaying an interaction editing interface;
the acquisition module is used for acquiring attribute parameters input by the player user aiming at a certain scene element through the interactive editing interface;
and the second display module is used for displaying the certain scene element in the game scene according to the attribute parameter.
In a fourth aspect, the present application provides an interaction device for a scene element, including:
the receiving module is used for receiving an editing instruction of a player user submitted by the client side aiming at an attribute parameter of a certain element in a game scene;
the execution module is used for executing the editing instruction of the player user for the attribute parameter of a certain element in the game scene according to the login information of the player user;
and the processing module is used for sending the display result operation instruction corresponding to the editing instruction of the attribute parameter of the player user aiming at a certain element in the game scene to the client and displaying the display result operation instruction through the client.
In a fifth aspect, the present application provides an electronic device, including: a processor; and a memory for storing processor-executable instructions; wherein the processor is configured to execute the instructions to perform the interaction method of a scene element of the first aspect of the embodiments of the present application and any embodiment thereof, or the interaction method of a scene element of the second aspect and any embodiment thereof.
In a sixth aspect, the present application provides a computer readable storage medium storing a computer program which, when executed by a processor, is configured to perform the method of interaction of a scene element of the first aspect of an embodiment of the present application and any embodiment thereof, or the method of interaction of a scene element of the second aspect of an embodiment and any embodiment thereof.
With the interaction method of scene elements of the present application, a player user can take part in the personalized editing of existing elements in the game scene, which greatly stimulates the player user's personal creativity and motivation to create. The method can better improve the player user's participation, strengthen the player user's sense of honor, and raise the player user's enthusiasm for creation. It also makes excellent edited works visible to more player users.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the following description will briefly explain the drawings that are required to be used in the embodiments of the present application.
Fig. 1 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
FIG. 2 is a schematic structural diagram of an interactive system for providing scene elements according to an embodiment of the present application;
fig. 3 is a flow chart of an interaction method of a scene element according to the first embodiment of the present application;
FIG. 4 is a schematic diagram of attribute parameters of game scene element input according to an embodiment of the present application;
FIG. 5 is a schematic diagram of a game scene display effect according to an embodiment of the present disclosure;
fig. 6 is a flow chart of an interaction method of a scene element according to a second embodiment of the present application;
Fig. 7 is a flow chart of an interaction method of a scene element according to a third embodiment of the present application;
FIG. 8 is a schematic structural diagram of an interaction device of the scene element corresponding to FIG. 3;
fig. 9 is a schematic structural diagram of an interaction device of the scene element corresponding to fig. 6.
Reference numerals:
1-electronic device; 10-bus; 11-processor; 12-memory; 200-interaction system of scene elements; 210-client; 220-server; 810-first display module; 820-acquisition module; 830-second display module; 910-receiving module; 920-execution module; 930-processing module.
Detailed Description
The technical solutions in the embodiments of the present application will be described below with reference to the drawings in the embodiments of the present application.
Like reference numerals and letters denote like items in the following figures, and thus once an item is defined in one figure, no further definition or explanation thereof is necessary in the following figures. Meanwhile, in the description of the present application, the terms "first", "second", and the like are used only to distinguish the description, and are not to be construed as indicating or implying relative importance.
Referring to fig. 1, a schematic structural diagram of an electronic device 1 according to an embodiment of the present application is shown. The electronic device includes at least one processor 11 and a memory 12; one processor is taken as an example in fig. 1. The processor 11 and the memory 12 are connected through the bus 10, and the memory 12 stores instructions executable by the at least one processor 11; the instructions are executed by the at least one processor 11 to cause the at least one processor 11 to perform the scene element interaction method in the embodiments described below.
In one embodiment, the processor 11 may be a general-purpose processor, including but not limited to a central processing unit (Central Processing Unit, CPU), a network processor (Network Processor, NP), and the like, a digital signal processor (Digital Signal Processor, DSP), an application specific integrated circuit (Application Specific Integrated Circuit, ASIC), a field programmable gate array (Field Programmable Gate Array, FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components. The general-purpose processor may be a microprocessor or any conventional processor. The processor 11 is the control center of the electronic device 1 and connects the various parts of the entire electronic device 1 using various interfaces and lines. The processor 11 may implement or perform the methods, steps and logic blocks disclosed in the embodiments of the present application.
In one embodiment, the memory 12 may be implemented by any type of volatile or non-volatile memory device or a combination thereof, including but not limited to random access memory (Random Access Memory, RAM), read-only memory (Read-Only Memory, ROM), static random access memory (Static Random Access Memory, SRAM), programmable read-only memory (Programmable Read-Only Memory, PROM), erasable programmable read-only memory (Erasable Programmable Read-Only Memory, EPROM), and electrically erasable programmable read-only memory (Electric Erasable Programmable Read-Only Memory, EEPROM).
The structure of the electronic device shown in fig. 1 is only illustrative, and the electronic device 1 may also comprise more or fewer components than shown in fig. 1, or have a different configuration than shown in fig. 1. The components shown in fig. 1 may be implemented in hardware, software, or a combination thereof.
As shown in fig. 2, which is a schematic structural diagram of an interaction system 200 of a scene element according to an embodiment of the present application, the interaction system 200 of a scene element includes: a client 210 and a server 220. The client 210 and the server 220 may communicate data with each other through a wired or wireless connection.
In one embodiment, the client 210 may be, but is not limited to, a mobile terminal such as a smart phone, a tablet computer, a notebook computer, etc.; the server 220 may be, but is not limited to being, a distributed server or the like. The server 220 provides services for the client 210.
Fig. 3 is a schematic flow chart of an interaction method of a scene element according to the first embodiment of the present application. The method may be performed by the client 210 shown in fig. 2. The client 210 can display a game scene, and the game scene includes scene elements. The method includes the following steps:
step S210: and responding to the interaction instruction of the player user aiming at a certain scene element, and displaying an interaction editing interface.
A game scene refers to the scenery and environment presented to a player user in a game; game developers can pre-configure the scene effect presented by the game scene according to the type of the game and its specific content. In a 3D multiplayer simultaneous-online game scene and learning world, player users can explore, learn, and interact in the game world. The Carnival world and the future world are taken as examples of such game scenes. A player user refers to a game user or player participating in the Carnival world or future world games, not a game developer.
The game scene comprises scene elements, which can be understood as static facilities such as buildings and dynamic scene elements such as trees, grasses and rivers contained in the game scene, and all the scene elements are combined together to form a game background environment. In a game scenario of the Carnival world or the future world, a player user may see buildings or other scene elements in many game scenarios. By way of example, the building may be a high building, a platform, a fountain, a step, a street lamp, an audio device, etc. in a game scene, and the dynamic scene elements are trees, grass, and rivers.
When a player user wants to edit a building or other scene element after entering the game scene, the player user needs to enter through an interaction portal; after the portal is entered, the client 210 displays an interactive editing interface. By operating the interactive editing interface, the player user creates or edits the building or other scene element in the game scene according to his own preference. For example, the colors of buildings such as high-rise buildings, platforms, fountains, steps, street lamps, and audio equipment in the game scene can be edited, created, or adjusted. The rhythm of dynamic scene elements such as trees, grass, and river water can be edited, created, or adjusted, where the rhythm can be the swaying rhythm of trees, the swaying rhythm of grass, or the flow speed of the water. The swaying angle of trees and grass can be edited, created, or adjusted, as can the spray angle of a fountain. The spray duration of the fountain and the playing duration of the background music played by the audio equipment can also be edited, created, or adjusted. The interactive editing interface is used to display the various attribute parameters of the building or other scene element that can be edited or created.
For example, the player user may move to a certain scene element, such as the fountain. A prompt option for an interaction portal is shown at the fountain; the player user moves the mouse over the scene interaction portal button and clicks to enter the scene interaction portal, which opens an editing interface for editing and creating the fountain scene element. Therefore, when the player user triggers the interaction portal button, the client 210 responds to the interaction instruction of the player user for the fountain scene element and displays an interactive editing interface for the fountain scene element.
In another embodiment, the player user may touch the screen and move a finger to the side of the fountain to trigger the scene interaction portal button, then tap to enter the scene interaction portal; after the client 210 displays the interactive editing interface, the player user can perform editing and creation operations on that scene element. Besides triggering the interaction instruction for a certain scene element by mouse click or finger touch, the instruction can also be triggered by voice input. After receiving an interaction instruction of the player user for a certain scene element, the client 210 displays the interactive editing interface.
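As an illustration of this step only, the following minimal Python sketch shows one way a client could route a player's interaction instruction to an interactive editing interface. The class names, trigger values, element identifiers, and attribute lists are assumptions made for the example, not the actual implementation of the client 210.

```python
# Minimal sketch (assumed names): route a player user's interaction instruction
# for a scene element to an interactive editing interface on the client.
from dataclasses import dataclass

@dataclass
class InteractionInstruction:
    player_id: str
    element_id: str      # e.g. "fountain_01"
    trigger: str         # "mouse_click", "touch" or "voice"

class EditingClient:
    # Editable attribute parameters per element type; contents are illustrative.
    EDITABLE = {
        "fountain": ["color", "spray_speed", "spray_pattern",
                     "repeat_count", "wait_time", "music"],
        "street_lamp": ["body_color", "light_color", "height",
                        "shape", "lighting_time"],
    }

    def on_interaction(self, instr: InteractionInstruction) -> dict:
        # Respond to the interaction instruction by "displaying" an editing
        # interface listing the attribute parameters of that scene element.
        element_type = instr.element_id.rsplit("_", 1)[0]
        return {
            "element_id": instr.element_id,
            "editable_attributes": self.EDITABLE.get(element_type, []),
        }

client = EditingClient()
ui = client.on_interaction(InteractionInstruction("player_a", "fountain_01", "mouse_click"))
print(ui["editable_attributes"])   # ['color', 'spray_speed', ...]
```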
Step S220: and acquiring attribute parameters input by a player user aiming at a certain scene element through an interactive editing interface.
In step S210, a fountain is taken as the example building: the player user moves to the side of the fountain and clicks the interaction portal beside it, and an interactive editing interface is displayed on the client 210. The fountain scene element can then be edited on this interface.
The editing of the scene element is mainly performed on attribute parameters of the scene element. Taking fountain scene elements as an example, please refer to fig. 4, the attribute parameters include: fountain color, fountain spray velocity, fountain spray pattern, spray repetition times, adjacent spray waiting time, play music, etc.
Attribute parameters can be input in various ways, for example through an input box, a selection item, a scroll bar, or a drop-down menu option. In fig. 4, for the fountain color attribute, the player user can scroll through the scroll bar in the editing module and choose emerald green, lemon yellow, or haze blue. For the fountain spray speed, the player user can choose a value from a drop-down menu option, for example 1 m/s, 2 m/s, 3 m/s, 4 m/s, or 5 m/s (including but not limited to these values), such as 4 m/s. For the fountain spray pattern, the player user can choose the up, down, left, or right direction through a selection item, and each direction can be given a specific tilt angle; for example, the player user can manually enter a tilt of 30 degrees upward through an input box. For the number of spray repetitions, the player user can pick a value, for example 1, from a drop-down menu option. For the waiting time between adjacent sprays, the player user can choose, for example, 1 s from a drop-down menu option. For the played music, the player user can choose the desired background music through a selection item.
Since a game scene has multiple scene elements and each scene element has multiple attribute parameters, the same approach applies elsewhere. For example, the scene elements in fig. 4 also include tall buildings, street lamps, steps, and so on. When the scene element is a street lamp, its attribute parameters may include the body color, illumination color, height, appearance structure, and lighting time of the street lamp. By editing these attribute parameters, the player user can customize a personalized dynamic picture of the street lamp element.
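One possible way to describe these editable attribute parameters and their input widgets in data is sketched below; the option lists, units, and field names are invented for the example and are not taken from the application.

```python
# Illustrative schema for a fountain's editable attribute parameters and the
# input widgets mentioned above (scroll bar, drop-down menu, selection, input box).
FOUNTAIN_SCHEMA = {
    "color":         {"widget": "scroll_bar", "options": ["emerald green", "lemon yellow", "haze blue"]},
    "spray_speed":   {"widget": "dropdown",   "options": [1, 2, 3, 4, 5], "unit": "m/s"},
    "spray_pattern": {"widget": "selection",  "options": ["up", "down", "left", "right"]},
    "spray_angle":   {"widget": "input_box",  "unit": "degrees"},
    "repeat_count":  {"widget": "dropdown",   "options": [1, 2, 3, 4, 5]},
    "wait_time":     {"widget": "dropdown",   "options": [1, 2, 3], "unit": "s"},
    "music":         {"widget": "selection",  "options": ["bgm_01", "bgm_02"]},
}

def validate_edit(schema: dict, edits: dict) -> dict:
    """Keep only known attributes whose values are allowed by their widget options."""
    accepted = {}
    for attr, value in edits.items():
        spec = schema.get(attr)
        if spec is None:
            continue                                  # unknown attribute: ignore
        options = spec.get("options")
        if options is None or value in options:       # free input boxes have no options
            accepted[attr] = value
    return accepted

print(validate_edit(FOUNTAIN_SCHEMA,
                    {"color": "emerald green", "spray_speed": 4, "spray_angle": 30, "fog": True}))
# {'color': 'emerald green', 'spray_speed': 4, 'spray_angle': 30}
```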
It should be noted that the player user can only edit or re-create buildings or other scene elements that already exist in the game scene.
In a 3D multiplayer simultaneous-online game scene, player users can create and edit buildings or other scene elements in the game scene. A player user can edit existing buildings or other scene elements according to his own preference so that they present the picture effect he likes. Because the player user can take part in the creation of the game scene, this changes the traditional situation in which no scene element can be modified once a game has been developed and released. Being able to create personalized animation effects in the game scene makes the player user more interested in the game and improves the player user's sense of participation.
Step S230: and displaying a certain scene element in the game scene according to the attribute parameters.
As described above, the player user edits the attribute parameters of the fountain according to his own preference so that the edited fountain presents a personalized effect. The player user can freely combine multiple attribute parameters of the fountain scene element as he likes. For example, if the player user only wants to edit or create a combination of the fountain color, the fountain spray speed, and the fountain spray pattern, he can select the editing modules corresponding to these attribute parameters and then choose the desired value for each one.
After the selection is completed, clicking the run button on the interactive editing interface generates a display result of the edited scene on the client 210. In this example the display result is a fountain whose color is emerald green, whose spray speed is 4 m/s, whose spray pattern is tilted 30 degrees upward, and whose number of spray repetitions is 5, as shown in fig. 5. The player user can see the effect display diagram of the finally edited fountain scene element on the client 210.
During the editing of a certain scene element, if the player user no longer wants a certain attribute parameter, he can drag the editing module of that attribute parameter onto the dustbin label to delete it. For example, if the player user only wants to edit the fountain color and the fountain spray pattern, he can drag the editing module of the fountain spray speed onto the dustbin label to delete it. In this way the player user can change each attribute parameter as needed, generate rich combinations, and obtain various personalized effects.
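The combining and deleting behaviour described above could be modelled roughly as follows. The default values, and the interpretation that a module dragged to the dustbin simply reverts to its default, are assumptions made for this sketch.

```python
# Sketch: merge the player user's edited attribute parameters over the element's
# defaults, and treat a parameter dragged to the dustbin as reverting to default.
DEFAULT_FOUNTAIN = {"color": "white", "spray_speed": 2, "spray_pattern": "up",
                    "spray_angle": 0, "repeat_count": 1, "wait_time": 1, "music": None}

def combine_edits(defaults: dict, edits: dict, deleted: set) -> dict:
    state = dict(defaults)
    state.update(edits)
    for attr in deleted:          # deleted editing modules fall back to defaults
        state[attr] = defaults[attr]
    return state

display_state = combine_edits(
    DEFAULT_FOUNTAIN,
    {"color": "emerald green", "spray_speed": 4, "spray_angle": 30, "repeat_count": 5},
    deleted={"spray_speed"},      # the player keeps only the color and spray pattern edits
)
print(display_state["color"], display_state["spray_speed"])   # emerald green 2
```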
In one embodiment, the step S230 specifically includes steps S241-S243:
step S241: code data corresponding to the attribute parameters of a certain scene element is obtained.
As described above, in the interactive editing interface the player user can edit freely and adjust the attribute parameters of an existing scene element through the code data corresponding to those attribute parameters. Editing or creating the attribute parameters of an existing element is therefore, in effect, carried out on top of the underlying code data.
Step S242: and generating an editing instruction data set corresponding to the code data according to the target editing result of the code data.
Illustratively, taking the attribute parameters of the fountain color as an example, if the player user wants to implement the change of the fountain color, the player user needs to obtain code data corresponding to the attribute parameters of different fountain colors configured at the bottom. Each fountain color implementation is supported by corresponding code data that is pre-developed based on the game developer.
According to the fountain color that the player user wants to edit or create, the background automatically calls the code data corresponding to the selected fountain color and runs it. For example, if the player user wants the fountain color to be emerald green, the background automatically calls the code data corresponding to emerald green and runs it, and the fountain in the scene is finally displayed in emerald green; the target editing result of the code data is therefore that the fountain color is displayed as emerald green. In other words, the background calls the code data corresponding to the fountain color attribute parameter according to the player user's instruction about which fountain color to use. The run instructions of the code data corresponding to all fountain colors are collectively called the editing instruction data set corresponding to the code data. When the game is first developed, the developer configures and stores the code data of all colors supported by the fountain in the underlying database; when the player user edits the fountain color, the background automatically calls the code data of the currently selected color and runs it.
All color classes described herein include red, orange, yellow, green, cyan, blue, violet, further subdivided under the 7 color classes described above, and the red series may be further subdivided into pink, light red, pink, magenta, red, agate, maroon, lotus red, cherry red, garnet, maroon, orange, iron red, chrome red, vermilion, sorghum red, scarlet, deep red, dark red, purplish red, rose red, mature hertz, brown, coffee, maroon; the subdivision principle of orange series, yellow series, green series, cyan series, blue series and purple series is the same as that of red series, and will not be described here. Therefore, the code data corresponding to the color types covered in the bottom database is relatively complete, the player user can edit and create the fountain color more selectively, the use requirement of the player user can be fully met, and the player user does not need to reset the corresponding fountain color according to the requirement.
On the same principle, attribute parameters such as the fountain spray speed, spray duration, number of spray repetitions, waiting time between adjacent sprays, spray pattern, and played music all have corresponding code data. When a player user edits or creates the attribute parameters of a scene element, the background calls the corresponding code data and runs it according to the player user's editing or creation instruction.
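As a rough illustration of this lookup, the sketch below maps each edited attribute value to a piece of pre-developed code data and collects the resulting run instructions into an editing instruction data set. The registry contents and instruction strings are invented for the example.

```python
# Sketch: look up pre-developed code data for each edited attribute value and
# collect the run instructions into an editing instruction data set.
CODE_DATA_REGISTRY = {
    ("color", "emerald green"):    "set_material('fountain', rgb=(80, 200, 120))",
    ("color", "lemon yellow"):     "set_material('fountain', rgb=(255, 244, 79))",
    ("spray_speed", 4):            "set_particle_velocity('fountain', 4.0)",
    ("spray_pattern", "up_30deg"): "set_spray_direction('fountain', pitch_deg=30)",
    ("repeat_count", 5):           "set_loop_count('fountain', 5)",
}

def build_edit_instruction_set(edits: dict) -> list:
    instructions = []
    for attr, value in edits.items():
        code = CODE_DATA_REGISTRY.get((attr, value))
        if code is not None:       # only attributes with pre-developed code data can run
            instructions.append({"attribute": attr, "value": value, "code": code})
    return instructions

edit_set = build_edit_instruction_set(
    {"color": "emerald green", "spray_speed": 4, "spray_pattern": "up_30deg", "repeat_count": 5})
for item in edit_set:
    print(item["code"])
```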
Step S243: and displaying a certain scene element in the game scene according to the editing instruction data set corresponding to the code data.
The player user edits each attribute parameter of the fountain, each attribute parameter of the fountain scene element has corresponding code data, and each time the player user requests to execute the code data corresponding to one attribute parameter, the client 210 responds to the editing instruction of the player user, and finally the dynamic effect of the scene element edited by the player user is displayed on the client.
In one embodiment, after step S243, the method further includes step S244:
step S244: and storing the editing instruction data set corresponding to the code data in a server.
Because the player user can run and show the editing effect in real time after finishing editing the attribute parameters of a certain scene element each time, if the player user is satisfied with the editing and showing effect at this time, a save button on the interactive editing interface can be clicked to save each attribute parameter edited or authored at this time. The client generates a corresponding editing instruction data set from the edited code data corresponding to each attribute parameter, and submits the editing instruction data set to the server for storage and recording.
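A minimal sketch of this save step is given below, assuming a JSON-over-HTTP endpoint; the URL path and payload fields are invented for the example, and the call is left commented out because it requires a running server.

```python
# Sketch: the client serializes the editing instruction data set and submits it
# to the server for storage and recording. Endpoint path and fields are assumed.
import json
import urllib.request

def save_edit_instruction_set(server_url: str, player_id: str, element_id: str,
                              edit_set: list) -> int:
    payload = json.dumps({
        "player_id": player_id,
        "element_id": element_id,
        "edit_instruction_set": edit_set,
    }).encode("utf-8")
    req = urllib.request.Request(
        f"{server_url}/scene-elements/{element_id}/edits",
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:   # the server stores and records the set
        return resp.status

# save_edit_instruction_set("http://127.0.0.1:8000", "player_a", "fountain_01", edit_set)
```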
In one embodiment, after step S230, the method further includes step S240-step S250:
step S240: and responding to the debugging operation instruction of the player user on the edited certain scene element.
The player user can debug the most satisfactory picture display effect by continuously editing, creating and debugging the attribute parameters of a certain scene element. Thus, this may be a continually explored and repeated process, and thus, client 210 may respond according to each debug instruction of the player user.
Step S250: and publishing the display result of a certain scene element edited by the debugged player user.
When the player user returns to the game scene, the client side displays the display picture effect of the attribute parameter of a certain scene element edited by the player user according to the latest editing instruction data set stored by the server side. Therefore, the player user can continuously try editing and modifying before release, try to run the view effect, and then save and release.
Fig. 6 is a schematic flow chart of an interaction method of a scene element according to the second embodiment of the present application. The method may be performed by the server 220 shown in fig. 2 and includes steps S610-S630:
Step S610: an edit instruction submitted by the client 210 for a player user to attribute parameters of a certain scene element in the game scene is received.
Step S620: and executing the editing instruction of the player user aiming at the attribute parameter of a certain scene element in the game scene according to the login information of the player user.
Step S630: and sending a display result operation instruction corresponding to the editing instruction of the player user aiming at the attribute parameter of a certain scene element in the game scene to the client side, and displaying the display result operation instruction through the client side 210.
In the above steps, when the player user edits the attribute parameters of a scene element, the client 210 sends the player user's editing instruction for the attribute parameters of the certain scene element to the server 220, receives the display result operation instruction corresponding to that editing instruction, and displays the picture effect edited by the player user.
Because the client 210 and the server 220 interact with each other, the execution process of the client 210 described above also applies here. After the client 210 sends the player user's editing instruction for the attribute parameters of a certain scene element in the game scene to the server 220, the server 220 executes that editing instruction based on the ID information of the logged-in player user. The server 220 then sends the display result operation instruction corresponding to the editing instruction back to the client 210, the client 210 displays it, and the player user can finally view the picture and animation effect through the client 210.
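A server-side sketch of this exchange might look like the following; the session store, token check, and response fields are assumptions made for illustration only.

```python
# Sketch: the server executes an editing instruction only for a logged-in player
# user and returns the display result operation instruction for the client to run.
LOGGED_IN = {"player_a": {"token": "t-123"}}   # assumed session store

def handle_edit_instruction(player_id: str, token: str, element_id: str,
                            edit_set: list) -> dict:
    session = LOGGED_IN.get(player_id)
    if session is None or session["token"] != token:
        return {"ok": False, "error": "player user not logged in"}
    # "Executing" here just means accepting and recording the instruction;
    # a real server would run the corresponding code data.
    return {
        "ok": True,
        "display_result_operation": {
            "element_id": element_id,
            "run": [item["code"] for item in edit_set],
        },
    }

result = handle_edit_instruction(
    "player_a", "t-123", "fountain_01",
    [{"attribute": "color", "value": "emerald green",
      "code": "set_material('fountain', rgb=(80, 200, 120))"}])
print(result["ok"], result["display_result_operation"]["run"])
```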
In one embodiment, step S610 includes the steps of:
step S611: and receiving code data sets corresponding to attribute parameters of a certain scene element in the game scene submitted by all player users.
Each player user has an edit data instruction set belonging to his current game scene. Each time the player user logs in, the client 210 pulls the player user's edit data instruction set from the server 220, including the data of the game scene where the player user is located, so as to load the game scene belonging to that player user. Accordingly, the server 220 receives the code data sets corresponding to the attribute parameters of a certain scene element in the game scene submitted by all player users, and through the interaction between the client 210 and the server 220 the personalized game scene picture of each player user is finally displayed on the client 210.
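The per-player load at login could be sketched as follows; the storage layout and function names are assumptions made for the example.

```python
# Sketch: at login, pull the player user's stored edit data instruction set from
# the server and apply it on top of the base game scene.
SERVER_STORE = {   # player_id -> {element_id: edit_instruction_set}
    "player_a": {"fountain_01": [{"attribute": "color", "value": "emerald green"}]},
}
BASE_SCENE = {"fountain_01": {"color": "white", "spray_speed": 2}}

def load_player_scene(player_id: str) -> dict:
    scene = {eid: dict(attrs) for eid, attrs in BASE_SCENE.items()}
    for element_id, edit_set in SERVER_STORE.get(player_id, {}).items():
        for edit in edit_set:
            scene[element_id][edit["attribute"]] = edit["value"]
    return scene

print(load_player_scene("player_a")["fountain_01"])   # personalized fountain attributes
```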
Step S612: and updating the code data set corresponding to the attribute parameter of a certain scene element in the game scene of the old version stored locally by using the code data set corresponding to the attribute parameter of the certain scene element in the game scene submitted by the target player user.
When there are multiple player users, each of them may submit a code data set corresponding to the attribute parameters of a certain scene element in the game scene. Suppose two player users both modify the attribute parameters of the same scene element. The server 220 then needs to decide which player user's submitted code data set, i.e. which editing instruction set for that scene element, prevails, so as to update the code data set corresponding to the attribute parameters of that scene element in the old version of the game scene already stored in the local database of the server 220. It should be noted that the code data set corresponding to the attribute parameters of a certain scene element is visible to every player user after it has been modified, updated, and submitted to the server 220. The server 220 may update the locally stored old version according to the order in which the two player users submitted their changes. Alternatively, the server 220 may choose between the code data sets submitted by the two player users according to a preset judgment principle and use the chosen one to update the old version stored in its local database.
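The two update policies described above, latest submission wins or a preset judgment principle, could be expressed roughly as below; the field names and scores are invented for the example.

```python
# Sketch: resolve concurrent submissions for the same scene element either by
# submission time or by a preset judgment principle (here, the highest score).
def update_by_time(submissions: list) -> dict:
    """submissions: [{'player_id', 'submitted_at', 'code_data_set'}, ...]"""
    return max(submissions, key=lambda s: s["submitted_at"])

def update_by_rank(submissions: list, scores: dict) -> dict:
    """Pick the submission of the player user with the highest score."""
    return max(submissions, key=lambda s: scores.get(s["player_id"], 0.0))

subs = [
    {"player_id": "A", "submitted_at": 100, "code_data_set": ["set_material(...)"]},
    {"player_id": "B", "submitted_at": 120, "code_data_set": ["set_loop_count(...)"]},
]
print(update_by_time(subs)["player_id"])                         # B (latest submission)
print(update_by_rank(subs, {"A": 8.75, "B": 8.5})["player_id"])  # A (highest score)
```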
In one embodiment, the judgment principle may be based on each player user's learning score ranking in the game scene, or on each player user's contributed team score ranking in the game scene. The target player user is defined here as the player user with the highest learning score ranking among all player users in the game.
And updating the code data set corresponding to the attribute parameter of a certain scene element in the game scene of the old version stored locally by using the code data set corresponding to the attribute parameter of the certain scene element in the game scene submitted by the player user with the highest learning score ranking of all player users in the game. It should be noted that, among all player users, a player user who does not have editing authority cannot edit or author an element in a game scene where the player user is located. Thus, under such rules, not all player users have editing privileges, and only the player users with the highest rank of learning points in the game are given editing privileges.
Taking the future world game scene as an example, each player user can obtain, through voting, the right to edit and create the attribute parameters of scene elements in the future world game scene during a certain period.
In existing future world game scenes, only the editing player user can see the result of editing the attribute parameters of a scene element, and other player users in the same game scene cannot see the editing process. In this embodiment, the editing of the attribute parameters of a game scene element can be realized and made visible to all player users.
Specifically, a mayor may be elected by voting among all player users in the Carnival town. During a certain future period, for example one week, the player user elected as mayor can edit and create the attribute parameters of existing scene elements in the future world game scene according to his own ideas and preferences; after continuous debugging and trial running, the player user acting as mayor can publish the display results corresponding to the finally edited scene element attribute parameters to all player users.
The election of the mayor depends on each player user's contributions in the game scene during a certain period, such as learning score, contributed team score, answer success rate, and login duration. The player users are ranked by a rule algorithm, which determines who gains the authority to compete for the position of town mayor, i.e. the editing administrator. For example, the rule algorithm may compute a score from the learning score, the contributed team score, the answer success rate, and the login duration, using the sub-score of each calculation condition and the weight it carries; the specific calculation data can refer to Table 1.
Table 1 player user ranking rule calculation data
The above four calculation conditions of each player user are scored based on the calculation data of Table 1. For example, player user A has a learning score of 110 (sub-score 10), a contributed team score of 72 (sub-score 5), an answer success rate of 85% (sub-score 10), and a login duration of 85 min (sub-score 10), so the final score of player user A is: S_A = 10×0.3 + 5×0.25 + 10×0.3 + 10×0.15 = 8.75.
Player user B has a learning score of 122 (sub-score 10), a contributed team score of 102 (sub-score 10), an answer success rate of 78% (sub-score 5), and a login duration of 90 min (sub-score 10), so the final score of player user B is: S_B = 10×0.3 + 10×0.25 + 5×0.3 + 10×0.15 = 8.5.
By comparison, S_A > S_B, so player user A is ranked higher and can be given editing management authority. The election is held over a certain period; the player user selected as mayor, i.e. the administrator of the other player users, can edit and publish scenes during the management period, and the edited and published game scenes are visible to all users. By adopting this method, the participation of player users can be better improved, their sense of honor strengthened, and their enthusiasm for creation raised. The method also makes excellent edited works visible to more player users.
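The weighted calculation in this example can be reproduced directly; only the grading of raw values into sub-scores (10 or 5 above), which comes from Table 1, is taken as given.

```python
# Reproduce the final-score calculation of the example: weight each condition's
# sub-score (already graded per Table 1) and compare the two player users.
WEIGHTS = {"learning_score": 0.30, "team_score": 0.25,
           "answer_success_rate": 0.30, "login_duration": 0.15}

def final_score(sub_scores: dict) -> float:
    return sum(sub_scores[k] * WEIGHTS[k] for k in WEIGHTS)

player_a = {"learning_score": 10, "team_score": 5, "answer_success_rate": 10, "login_duration": 10}
player_b = {"learning_score": 10, "team_score": 10, "answer_success_rate": 5, "login_duration": 10}

s_a, s_b = final_score(player_a), final_score(player_b)
print(s_a, s_b)                                   # 8.75 8.5
print("editing authority:", "A" if s_a > s_b else "B")
```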
Fig. 7 is a flowchart of an interaction method of a scene element according to a third embodiment of the present application. The method comprises the steps of S710-S750:
Step S710: click a building in the game scene to enter the interaction portal.
In this step, the manner in which the player user enters the interactive portal may refer to the description of step S210 in detail, and will not be described here again.
Step S720: entering an interactive editing interface.
In this step, the player user enters the interactive editing interface, and the input manner of the scene element attribute parameter can refer to the description of step S220 in detail, which is not described here again.
Step S730: edit the attribute parameters of the scene element, and after editing is completed, judge whether to perform a trial run. If not, return to step S720; if yes, go to step S740.
In this step, the player user can edit existing buildings or other scene elements in the game scene according to his own preference so that they present the picture effect he likes, and can freely combine multiple attribute parameters of, for example, the fountain scene element. After the player user completes the editing and creation of the scene element attribute parameters through the interactive editing page, he can click the trial run button to view the editing effect in time. Depending on whether the displayed trial-run result is satisfactory, the player user may choose to continue editing or to publish directly. For example, when editing is completed, clicking the run button on the interactive editing interface generates a display result of the edited scene on the client 210. If the result does not yet satisfy the player user, he can continue to edit until a satisfactory effect is reached.
Step S740: publish.
The player user can debug the most satisfactory picture display effect by continuously editing, creating and debugging the attribute parameters of a certain scene element. When the player user returns to the game scene, the client side displays the display picture effect of the attribute parameter of a certain scene element edited by the player user according to the latest editing instruction data set stored by the server side. Therefore, the player user can continuously try editing and modifying before release, try to run the view effect, and then save and release. For details, reference is made to the description of step S250.
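The edit, trial-run, and publish loop of this third embodiment can be sketched as a simple control flow; the callbacks below stand in for player interaction and are assumptions for the example.

```python
# Sketch of the edit / trial-run / publish loop: keep editing until the trial-run
# result satisfies the player user, then publish (steps S720-S750).
def edit_until_published(initial_edits, trial_run, is_satisfied, publish):
    edits = dict(initial_edits)
    while True:
        preview = trial_run(edits)            # trial run shows the edited scene (S730)
        if is_satisfied(preview):             # player user judges the displayed result
            publish(edits)                    # publish (S740); visible in the scene (S750)
            return edits
        edits["spray_speed"] = edits.get("spray_speed", 2) + 1   # continue editing (S720)

edit_until_published(
    {"color": "emerald green", "spray_speed": 2},
    trial_run=lambda e: e,
    is_satisfied=lambda preview: preview["spray_speed"] >= 4,
    publish=lambda e: print("published:", e),
)
```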
Step S750: and displaying the edited scene, wherein the player user sees the scene edited by himself.
After publishing, the player user can see his own work in the game scene, i.e. what you see is what you get, which finally brings a sense of achievement.
In summary, with this scene element interaction method, a player user can take part in the personalized editing of existing elements in the game scene, which greatly stimulates the player user's personal creativity and motivation to create. The method can better improve the player user's participation, strengthen the player user's sense of honor, and raise the player user's enthusiasm for creation. It also makes excellent edited works visible to more player users.
Fig. 8 is a schematic structural diagram of an interaction device of the scene element corresponding to fig. 3, where the interaction device of the scene element includes: a first display module 810, an acquisition module 820, a second display module 830.
The first display module 810 is configured to display an interactive editing interface in response to an interactive instruction of a player user for a certain scene element.
And the acquisition module 820 is used for acquiring attribute parameters input by the player user for a certain scene element through the interactive editing interface.
The second display module 830 is configured to display a certain scene element in the game scene according to the attribute parameter.
Fig. 9 is a schematic structural diagram of an interaction device of the scene element corresponding to fig. 6, where the interaction device of the scene element includes: a receiving module 910, an executing module 920, and a processing module 930.
The receiving module 910 is configured to receive an edit instruction submitted by the client for an attribute parameter of a certain scene element in the game scene by a player user.
The execution module 920 is configured to execute an editing instruction of the player user for an attribute parameter of a certain scene element in the game scene according to the login information of the player user.
And the processing module 930 is configured to send a display result operation instruction corresponding to the edit instruction of the attribute parameter of the certain scene element in the game scene to the client by using the player user, and display the display result operation instruction through the client.
The implementation process of the functions and roles of each module in the above device is specifically detailed in the implementation process of the corresponding steps in the above description, and will not be repeated here.
The embodiment of the invention also provides an electronic-device-readable storage medium, including a program which, when run on an electronic device, causes the electronic device to perform all or part of the flow of the methods in the above embodiments. The storage medium may be a magnetic disk, an optical disk, a read-only memory (ROM), a random access memory (RAM), a flash memory, a hard disk (HDD), a solid state drive (SSD), or the like. The storage medium may also include a combination of the above types of memory.
In the several embodiments provided in the present application, the disclosed apparatus and method may also be implemented in other manners. The apparatus embodiments described above are merely illustrative. For example, the flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of apparatuses, methods, and computer program products according to various embodiments of the present application; in this regard, each block in a flowchart or block diagram may represent a module, segment, or portion of code that comprises one or more executable instructions for implementing the specified logical function(s).
In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
In addition, the functional modules in the embodiments of the present application may be integrated together to form a single part, or each module may exist alone, or two or more modules may be integrated to form a single part.
In this specification, each embodiment is described in a progressive manner, and identical and similar parts of each embodiment are all referred to each other, and each embodiment mainly describes differences from other embodiments. The foregoing description is only of the preferred embodiments of the present application, and is not intended to limit the scope of the present application. Any modifications, equivalent substitutions, improvements, or the like, which would occur to one skilled in the art, are intended to be included within the spirit and principles of the present application.

Claims (11)

CN202310483558.9A | 2023-04-28 | 2023-04-28 | Interaction method and system of scene elements, electronic equipment and storage medium | Active | CN116510311B (en)

Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
CN202310483558.9A CN116510311B (en) | 2023-04-28 | 2023-04-28 | Interaction method and system of scene elements, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number | Priority Date | Filing Date | Title
CN202310483558.9A CN116510311B (en) | 2023-04-28 | 2023-04-28 | Interaction method and system of scene elements, electronic equipment and storage medium

Publications (2)

Publication Number | Publication Date
CN116510311A | 2023-08-01
CN116510311B (en) | 2023-12-01

Family

ID=87404294

Family Applications (1)

Application Number | Title | Priority Date | Filing Date
CN202310483558.9A | Active | CN116510311B (en) | 2023-04-28 | 2023-04-28 | Interaction method and system of scene elements, electronic equipment and storage medium

Country Status (1)

Country | Link
CN (1) | CN116510311B (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
WO2025055693A1 (en) * | 2023-09-11 | 2025-03-20 | 网易(杭州)网络有限公司 | Editing control method and apparatus for game map, and electronic device
WO2025082148A1 (en) * | 2023-10-19 | 2025-04-24 | 网易(杭州)网络有限公司 | Game building editing method and apparatus, storage medium and electronic device
CN119925940A (en) * | 2023-11-02 | 2025-05-06 | 网易(杭州)网络有限公司 | Component editing method, device and electronic device
WO2025092519A1 (en) * | 2023-11-02 | 2025-05-08 | 网易(杭州)网络有限公司 | Method and apparatus for editing ui in game, and electronic device and storage medium
WO2025092520A1 (en) * | 2023-11-02 | 2025-05-08 | 网易(杭州)网络有限公司 | Game editing method and apparatus, and electronic device

Citations (11)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
CN1945588A (en) * | 2006-10-27 | 2007-04-11 | 北京金山软件有限公司 | Network game system and method for establishing game elements
US20090275412A1 (en) * | 2008-05-05 | 2009-11-05 | Microsoft Corporation | Multiple-player collaborative content editing
US20100160039A1 (en) * | 2008-12-18 | 2010-06-24 | Microsoft Corporation | Object model and api for game creation
US20140256389A1 (en) * | 2013-03-06 | 2014-09-11 | Ian Wentling | Mobile game application
CN105617654A (en) * | 2015-12-28 | 2016-06-01 | 北京像素软件科技股份有限公司 | Method and device for user-defined edit of game copy
CN106215420A (en) * | 2016-07-11 | 2016-12-14 | 北京英雄互娱科技股份有限公司 | For the method and apparatus creating scene of game
CN111803951A (en) * | 2019-11-07 | 2020-10-23 | 厦门雅基软件有限公司 | Game editing method and device, electronic equipment and computer readable medium
US20200406137A1 (en) * | 2019-06-28 | 2020-12-31 | Baidu Online Network Technology (Beijing) Co., Ltd. | Voice skill game editing method, apparatus, device and readable storage medium
CN113426140A (en) * | 2021-06-24 | 2021-09-24 | 网易(杭州)网络有限公司 | Screenshot editing method and device in game, storage medium and computer equipment
CN114661284A (en) * | 2022-03-30 | 2022-06-24 | 北京字节跳动网络技术有限公司 | Game editing method, game running method, device and computer equipment
CN115430142A (en) * | 2022-09-05 | 2022-12-06 | 北京有竹居网络技术有限公司 | Game scene editing method, device, equipment and medium


Also Published As

Publication number | Publication date
CN116510311B (en) | 2023-12-01

Similar Documents

Publication | Title
CN116510311B (en) | Interaction method and system of scene elements, electronic equipment and storage medium
US11673053B2 | Open game engine and marketplace with associated game editing and creation tools
Paul | Digital art
US7925703B2 | Graphical interactive interface for immersive online communities
US7809789B2 | Multi-user animation coupled to bulletin board
CN111294663A | Bullet screen processing method, device, electronic device, and computer-readable storage medium
US20150050997A1 | 2.5-dimensional graphical object social network
CN103902804B | Can based on previous user game play the system and method for shadow formula playing video game
CN111641843A | Method, device, medium and electronic equipment for displaying virtual jumping and shaking activities in live broadcast room
US20020082068A1 | Method and apparatus for an educational game and dynamic message entry and display
Ng et al. | Situated game level editing in augmented reality
Colson | The fundamentals of digital art
Bogdanovych | Virtual institutions
CN111641842A | Method and device for realizing collective activity in live broadcast room, storage medium and electronic equipment
CN111641876A | Virtual activity scene interaction method, device, medium and electronic equipment in live broadcast room
Zou et al. | Sounds of history: A digital twin approach to musical heritage preservation in virtual museums
Paoli et al. | Designing badges for a civic media platform: Reputation and named levels
Reed | Exhibition strategies for videogames in art institutions: blank arcade 2016
McCaffery et al. | Exploring heritage through time and space supporting community reflection on the highland clearances
CN119075300A | Intelligent dialogue method and system
JP3585477B2 | Network game system and computer program for realizing the system
Sermon et al. | Liberate your avatar: The revolution will be socially networked
Montemorano | Body language: Avatars, identity formation, and communicative interaction in VRChat
CN101661629A | Device and method for monitoring role behavior in three-dimensional virtual world
KR20250096833A | Method and device for displaying virtual characters, and device and storage medium

Legal Events

Date | Code | Title | Description
PB01 | Publication
SE01 | Entry into force of request for substantive examination
GR01 | Patent grant
