
Method and device for selecting virtual units in game, storage medium and electronic equipment

Info

Publication number
CN113457144A
Authority
CN
China
Prior art keywords
virtual
selection frame
user interface
graphical user
target
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110856819.8A
Other languages
Chinese (zh)
Other versions
CN113457144B (en)
Inventor
桑田
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd
Priority to CN202110856819.8A
Publication of CN113457144A
Application granted
Publication of CN113457144B
Legal status: Active (Current)
Anticipated expiration

Abstract

The disclosure relates to the technical field of human-computer interaction, and provides a method and a device for selecting virtual units in a game, a computer-readable storage medium, and an electronic device. The method comprises the following steps: responding to a control operation of a touch medium on a virtual control in a graphical user interface, and adjusting the display size of an initial selection frame displayed in the graphical user interface to generate a target selection frame; selecting a target virtual unit from a plurality of virtual units to be selected according to the positional relationship between the target selection frame and the plurality of virtual units to be selected; and controlling the target virtual unit to execute a game action. According to this scheme, the size of the initial selection frame can be adjusted directly through the virtual control so as to select virtual units in the game, which improves the selection efficiency of the virtual units.

Description

Method and device for selecting virtual units in game, storage medium and electronic equipment
Technical Field
The present disclosure relates to the field of human-computer interaction technologies, and in particular, to a method for selecting a virtual unit in a game, a device for selecting a virtual unit, a computer-readable storage medium, and an electronic device.
Background
Selecting some virtual units from multiple virtual units in the game to control the selected virtual units to perform other game operations is one of the conventional game operations of RTS (Real-Time Strategy) type games.
In the related art, a selection frame is formed by a two-point touch to select the game units within the selection frame. However, this method requires the player to tap and operate with two fingers; the operation steps are complicated and the efficiency is low, and when the player operates with two fingers on the screen, the player's line of sight is blocked, which affects the player's game experience and the accuracy of the game operation.
It is to be noted that the information disclosed in the above background section is only for enhancement of understanding of the background of the present disclosure, and thus may include information that does not constitute prior art known to those of ordinary skill in the art.
Disclosure of Invention
The present disclosure is directed to a method and an apparatus for selecting a virtual unit in a game, a computer-readable storage medium, and an electronic device, so as to overcome, at least to some extent, the problems of low efficiency when selecting a virtual unit in a game, as well as the line-of-sight occlusion and degraded player experience during the selection process.
Additional features and advantages of the disclosure will be set forth in the detailed description which follows, or in part will be obvious from the description, or may be learned by practice of the disclosure.
According to a first aspect of the present disclosure, a method for selecting virtual units in a game is provided, where a game screen of the game is displayed through a graphical user interface of a display component, the game screen includes a part or all of a game scene and a plurality of virtual units to be selected located in the game scene, the method includes:
responding to the control operation of a touch medium for a virtual control in a graphical user interface, and adjusting the display size of an initial selection frame displayed in the graphical user interface to generate a target selection frame;
selecting a target virtual unit from the plurality of virtual units to be selected according to the position relation between the target selection frame and the plurality of virtual units to be selected;
and controlling the target virtual unit to execute game action.
In an exemplary embodiment of the disclosure, based on the foregoing solution, before responding to a control operation of the touch medium for a virtual control in the graphical user interface, the method further includes:
the method comprises the steps of responding to a trigger operation of a touch medium for a first preset area in a graphical user interface, displaying a virtual control in the graphical user interface, and displaying an initial selection frame at a preset position of the graphical user interface.
In an exemplary embodiment of the present disclosure, based on the foregoing solution, the virtual control includes a rocker and a chassis, where the rocker is located in the chassis;
the adjusting the display size of the initial selection frame displayed in the graphical user interface to generate a target selection frame in response to a control operation of a touch medium on a virtual control in the graphical user interface comprises:
and responding to the sliding operation of a touch medium on the rocker in the chassis, and adjusting the display size of the initial selection frame displayed in the graphical user interface according to the sliding distance of the sliding operation to generate a target selection frame.
In an exemplary embodiment of the present disclosure, based on the foregoing scheme, a game picture obtained by shooting a part or all of a game scene and a plurality of virtual units located in the game scene by a virtual camera is displayed through a graphical user interface; the selecting a target virtual unit from the plurality of virtual units to be selected according to the position relationship between the target selection frame and the plurality of virtual units to be selected includes:
in response to the sliding operation of a second preset area in the graphical user interface, adjusting the pose of a virtual camera in the game so that the display position of the virtual unit to be selected in the graphical user interface is at least partially positioned in the target selection frame;
and selecting a target virtual unit from the plurality of virtual units to be selected according to the virtual unit to be selected which is at least partially positioned in the target selection frame at the display position in the graphical user interface.
In an exemplary embodiment of the present disclosure, based on the foregoing scheme, the selecting, according to a positional relationship between the target selection box and a plurality of virtual units to be selected, a target virtual unit from the plurality of virtual units to be selected includes:
responding to sliding operation of a second preset area in a graphical user interface, and moving the target selection frame in the graphical user interface so that the display position of the virtual unit to be selected in the graphical user interface is at least partially positioned in the target selection frame;
and selecting a target virtual unit from the plurality of virtual units to be selected according to the virtual unit to be selected which is at least partially positioned in the target selection frame at the display position in the graphical user interface.
In an exemplary embodiment of the present disclosure, based on the foregoing solution, the virtual control includes a rocker and a chassis, where the rocker is located in the chassis;
the adjusting, in response to a control operation of a touch medium for a virtual control in a graphical user interface, a display size of an initial selection frame displayed in the graphical user interface to generate a target selection frame includes:
responding to the pressing operation of a touch medium on the rocker, and detecting the pressing force value of the touch medium on the rocker;
and adjusting the display size of the initial selection frame displayed in the graphical user interface according to the pressing force value based on a preset mapping relation so as to generate a target selection frame.
In an exemplary embodiment of the present disclosure, based on the foregoing solution, the virtual control includes a rocker and a chassis, where the rocker is located in the chassis;
the adjusting, in response to a control operation of a touch medium for a virtual control in a graphical user interface, a display size of an initial selection frame displayed in the graphical user interface to generate a target selection frame includes:
when the touch control medium is detected to be in a contact state with the rocker, responding to the sliding operation in a second preset area, and adjusting the display size of the initial selection frame displayed in the graphical user interface according to the sliding distance and/or the sliding direction of the sliding operation to generate a target selection frame.
In an exemplary embodiment of the present disclosure, based on the foregoing scheme, the selecting, according to a positional relationship between the target selection box and a plurality of virtual units to be selected, a target virtual unit from the plurality of virtual units to be selected includes:
responding to the sliding operation of a touch medium on the rocker in the chassis, and moving the target selection frame in the graphical user interface to enable the display position of the virtual unit to be selected in the graphical user interface to be at least partially positioned in the target selection frame;
and selecting a target virtual unit from the plurality of virtual units to be selected according to the virtual unit to be selected which is at least partially positioned in the target selection frame at the display position in the graphical user interface.
In an exemplary embodiment of the disclosure, based on the foregoing solution, the selecting a target virtual unit from the plurality of virtual units to be selected according to the virtual unit to be selected located at least partially within the target selection box at the display position in the graphical user interface includes:
and selecting a target virtual unit from the plurality of virtual units to be selected according to the position overlapping degree between the display position of the virtual unit to be selected and the target selection frame.
In an exemplary embodiment of the present disclosure, based on the foregoing scheme, the selecting, according to a positional relationship between the target selection box and a plurality of virtual units to be selected, a target virtual unit from the plurality of virtual units to be selected includes:
and when the touch medium and the virtual control are detected to be changed from a contact state to a non-contact state, selecting a target virtual unit from the multiple virtual units to be selected according to the position relation between the target selection frame and the multiple virtual units to be selected.
According to a second aspect of the present disclosure, there is provided a virtual unit selecting apparatus in a game, which displays a game screen of the game through a graphical user interface of a display component, where the game screen includes a part or all of a game scene and a plurality of virtual units to be selected located in the game scene, including:
the target selection frame generation module is configured to respond to the control operation of a touch medium on a virtual control in a graphical user interface, adjust the display size of an initial selection frame displayed in the graphical user interface and generate a target selection frame;
the selecting module is configured to select a target virtual unit from the plurality of virtual units to be selected according to the position relation between the target selecting frame and the plurality of virtual units to be selected;
a control module configured to control the target virtual unit to perform a game action.
According to a third aspect of the present disclosure, there is provided a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the method of virtual unit selection in a game as described in the first aspect of the embodiments above.
According to a fourth aspect of the embodiments of the present disclosure, there is provided an electronic apparatus including: a processor; and a storage device for storing one or more programs, which when executed by the one or more processors, cause the one or more processors to implement the method of virtual unit selection in a game as described in the first aspect of the embodiments above.
As can be seen from the foregoing technical solutions, the method for selecting a virtual unit in a game, the device for selecting a virtual unit in a game, and the computer-readable storage medium and the electronic device for implementing the method for selecting a virtual unit in a game in the exemplary embodiments of the present disclosure have at least the following advantages and positive effects:
in some embodiments of the present disclosure, first, in response to a control operation of a touch medium on a virtual control in a graphical user interface, a display size of an initial selection frame displayed in the graphical user interface is adjusted to generate a target selection frame, and then, according to a positional relationship between the target selection frame and a plurality of virtual units to be selected, a target virtual unit may be selected from the plurality of virtual units to control the target virtual unit to execute a game action. Compared with the prior art, on one hand, the method and the device can directly operate the virtual control to adjust the display size of the initial selection frame so as to select the virtual unit in the game, so that the operation steps during virtual unit selection can be simplified, and the selection efficiency of the virtual unit in the game is improved; on the other hand, the mode of selecting the virtual units through the virtual control can reduce or avoid the shielding of the visual field range of the player in the virtual unit selecting process, and improve the accuracy of the game operation of the player.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure. It is to be understood that the drawings in the following description are merely exemplary of the disclosure, and that other drawings may be derived from those drawings by one of ordinary skill in the art without the exercise of inventive faculty.
FIG. 1A illustrates a graphical user interface diagram of a prior art virtual unit selection in an embodiment of the present disclosure;
FIG. 1B illustrates another graphical user interface diagram of a prior art virtual unit selection in an exemplary embodiment of the present disclosure;
FIG. 1C illustrates yet another graphical user interface diagram of a prior art virtual unit selection in an exemplary embodiment of the present disclosure;
FIG. 2 illustrates a flow diagram of a method for virtual unit selection in a game in an exemplary embodiment of the disclosure;
FIG. 3 shows a schematic diagram of a graphical user interface in an example embodiment of the present disclosure;
FIG. 4 shows a schematic diagram of another graphical user interface in an example embodiment of the present disclosure;
FIG. 5 illustrates a flow chart of a method of resizing a display of an initial selection box in an exemplary embodiment of the present disclosure;
fig. 6 is a flowchart illustrating a method for selecting a virtual unit according to a position relationship between a target selection box and a plurality of virtual units to be selected in an exemplary embodiment of the disclosure;
FIG. 7 is a flowchart illustrating another method for selecting a virtual unit according to a target selection box in an exemplary embodiment of the disclosure;
FIG. 8 is a flow chart illustrating a further method for selecting virtual units based on a target selection box in an exemplary embodiment of the disclosure;
FIG. 9 illustrates yet another graphical user interface schematic in an exemplary embodiment of the present disclosure;
FIG. 10 illustrates yet another graphical user interface schematic in an exemplary embodiment of the present disclosure;
FIG. 11 illustrates a graphical user interface diagram of a virtual unit selected according to the selection box of FIG. 10 in an exemplary embodiment of the present disclosure;
FIG. 12 is a schematic diagram illustrating a virtual unit pick device in a game according to an exemplary embodiment of the present disclosure;
FIG. 13 shows a schematic diagram of a structure of a computer storage medium in an exemplary embodiment of the disclosure;
fig. 14 shows a schematic structural diagram of an electronic device in an exemplary embodiment of the present disclosure.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. Example embodiments may, however, be embodied in many different forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of example embodiments to those skilled in the art. The described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a thorough understanding of embodiments of the disclosure. One skilled in the relevant art will recognize, however, that the subject matter of the present disclosure can be practiced without one or more of the specific details, or with other methods, components, devices, steps, and the like. In other instances, well-known technical solutions have not been shown or described in detail to avoid obscuring aspects of the present disclosure.
The terms "a," "an," "the," and "said" are used in this specification to denote the presence of one or more elements/components/parts/etc.; the terms "comprising" and "having" are intended to be inclusive and mean that there may be additional elements/components/etc. other than the listed elements/components/etc.; the terms "first" and "second", etc. are used merely as labels, and are not limiting on the number of their objects.
Furthermore, the drawings are merely schematic illustrations of the present disclosure and are not necessarily drawn to scale. The same reference numerals in the drawings denote the same or similar parts, and thus their repetitive description will be omitted. Some of the block diagrams shown in the figures are functional entities and do not necessarily correspond to physically or logically separate entities.
Selecting some virtual units from multiple virtual units in the game to control the selected virtual units to perform other game operations is one of the conventional game operations of RTS (Real-Time Strategy) type games.
Selecting virtual units is easy on a PC (Personal Computer): the game player can select a single virtual unit by clicking with a mouse, or select virtual units in batches by dragging a selection box. However, for a touch-enabled terminal with a smaller display screen, such as a mobile phone, a tablet computer, or a wearable electronic device, this approach is not suitable.
In the related art, the selection of the virtual unit may be performed by a frame selected in a double-point touch manner. As shown in fig. 1A, a selection box may be formed in an area between the two-point touches according to an operation gesture of the player to select a virtual unit.
However, this method requires the player to tap and operate quickly with the fingers, and has the disadvantages of complicated operation steps, low efficiency, and a high false-touch rate. Further, whether the virtual units in a cluster at the center of the screen or in clusters at the sides of the screen are selected, as shown in fig. 1B and fig. 1C, the fingers block the central line of sight of the player during a competitive match, which affects the player's experience and the accuracy of the game operation.
The method for selecting the virtual units in the game provided by the disclosure overcomes the defects in the related art at least to a certain extent.
Fig. 2 is a schematic flow chart illustrating a method for selecting a virtual unit in a game according to an exemplary embodiment of the present disclosure, where the method for selecting a virtual unit in a game provided by this embodiment displays a game screen of the game through a graphical user interface of a display component. Referring to fig. 2, the method includes:
step S210, responding to the control operation of a touch control medium for a virtual control in a graphical user interface, and adjusting the display size of an initial selection frame displayed in the graphical user interface to generate a target selection frame;
step S220, selecting a target virtual unit from the plurality of virtual units to be selected according to the position relation between the target selection frame and the plurality of virtual units to be selected;
and step S230, controlling the target virtual unit to execute game action.
In the technical solution provided in the embodiment shown in fig. 2, first, in response to a control operation of a touch medium on a virtual control in a graphical user interface, a display size of an initial selection frame displayed in the graphical user interface is adjusted to generate a target selection frame, and then, a target virtual unit may be selected from a plurality of virtual units according to a positional relationship between the target selection frame and the plurality of virtual units to be selected, so as to control the target virtual unit to execute a game action. Compared with the prior art, on one hand, the method and the device can directly operate the virtual control to adjust the display size of the initial selection frame so as to select the virtual unit in the game, so that the operation steps during virtual unit selection can be simplified, and the selection efficiency of the virtual unit in the game is improved; on the other hand, the mode of selecting the virtual units through the virtual control can reduce or avoid the shielding of the visual field range of the player in the virtual unit selecting process, and improve the accuracy of game operation of the player.
The following detailed description of the various steps in the example shown in fig. 2:
in step S210, in response to a control operation of the touch media on a virtual control in the graphical user interface, a display size of an initial selection frame displayed in the graphical user interface is adjusted to generate a target selection frame.
In an exemplary embodiment, the touch medium may include anything capable of multi-touch on the display screen of the terminal device, such as a user's finger or a stylus. The initial selection frame may be a point at a preset position in the graphical user interface, for example, the central point of the graphical user interface, but may also be an initial selection frame having an initial display size at the preset position, such as a 2px × 2px selection frame or an initial selection frame whose display size is smaller than a virtual unit, where px denotes a pixel unit (the same applies hereinafter unless otherwise specified). The preset position may be set by the user as required.
The virtual control can comprise a rocker and a chassis, and the rocker is located in the chassis. Specifically, the touch medium can press the rocker to slide in the chassis at will, but the rocker cannot be slid out of the chassis, that is, the rocker cannot be separated from the chassis in the process of pressing the rocker to slide by the touch medium.
In an optional embodiment, before responding to the control operation of the touch medium for the virtual control in the graphical user interface, the method further comprises: the method comprises the steps of responding to a trigger operation of a touch medium for a first preset area in a graphical user interface, displaying a virtual control in the graphical user interface, and displaying an initial selection frame at a preset position of the graphical user interface.
For example, a certain display area in the graphical user interface may be configured as a virtual control trigger area in advance, for example, a display area with a preset width on the left side in the graphical user interface may be configured as a virtual control trigger area; as shown in fig. 3, 31 may be a virtual control trigger area, that is, a first preset area. In other words, after the user performs the triggering operation in the virtual control triggering area, a virtual control is displayed at the triggering position, such as the virtual control 32 in fig. 3, where 321 is the rocker and 322 is the chassis, and an initial selection box, such as the initial selection box 33 in fig. 3, is displayed at the preset position of the graphical user interface. In fig. 3, a circle like 34 represents a virtual unit.
Specifically, when it is detected that the touch medium is in a contact state with a first preset area in the graphical user interface, a virtual control is displayed at the contact position, as shown in fig. 3 at 32, and an initial selection box is displayed at the center position of the graphical user interface, where the initial selection box may be an initial selection box with an initial display size of 0 × 0, that is, a point, as shown in fig. 3 at 33.
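As a rough illustration of this trigger flow, the following minimal Python sketch (all class and parameter names, the Rect helper, and the screen and strip dimensions are hypothetical, not part of the disclosure) shows how a touch-down inside the first preset area might spawn the virtual control at the contact point and place a zero-size initial selection frame at the screen center.

```python
# Minimal sketch of the trigger flow described above (all names are illustrative).

class Rect:
    def __init__(self, x, y, w, h):
        self.x, self.y, self.w, self.h = x, y, w, h

    def contains(self, px, py):
        return self.x <= px <= self.x + self.w and self.y <= py <= self.y + self.h


class SelectionUI:
    def __init__(self, screen_w, screen_h, trigger_width=200):
        # First preset area: a strip of preset width on the left of the interface.
        self.first_preset_area = Rect(0, 0, trigger_width, screen_h)
        self.screen_w, self.screen_h = screen_w, screen_h
        self.joystick_pos = None          # rocker + chassis shown only after trigger
        self.initial_box = None           # selection frame shown only after trigger

    def on_touch_down(self, x, y):
        if self.first_preset_area.contains(x, y):
            # Display the virtual control at the contact position...
            self.joystick_pos = (x, y)
            # ...and an initial selection frame of size 0x0 (a point) at the center.
            self.initial_box = Rect(self.screen_w / 2, self.screen_h / 2, 0, 0)


ui = SelectionUI(screen_w=1920, screen_h=1080)
ui.on_touch_down(100, 600)     # touch inside the first preset area
print(ui.joystick_pos, vars(ui.initial_box))
```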
In another optional implementation, the virtual control and the initial selection frame may also be directly displayed in the graphical user interface, that is, after the user opens the game, a virtual control and an initial selection frame are directly displayed in the game interface, and the virtual control and the initial selection frame do not need to be triggered and displayed through the first preset area by means of the touch medium.
In order to prevent misoperation caused by false touch of a user, an initial selection frame directly displayed in a graphical user interface may have an initial display state, and when the initial selection frame is in the initial display state (the initial display state may be understood as an inactivated state), selection of a virtual unit based on the initial selection frame cannot be performed, that is, at this time, a sliding operation of the user in the graphical user interface may only be used to control a virtual camera to move so as to update a game scene shot within a visual field of the virtual camera, and may not be used to move the initial selection frame or perform size adjustment on the initial selection frame.
When a virtual control and an initial selection frame are initially displayed in a game interface, in response to a trigger operation of a touch medium for the displayed virtual control, for example, when it is detected that the touch medium is in a contact state with the virtual control, the initial selection frame may be configured to be in a target display state from the initial display state, and the target display state may be understood as an activated state.
In yet another alternative embodiment, only the virtual control may be directly displayed in the graphical user interface, an initial selection box is displayed and activated in response to a trigger operation of the touch medium for the virtual control, and then the virtual unit is selected based on the initial selection box by using the exemplary method of the present disclosure.
In yet another optional implementation, only the virtual control may be directly displayed in the graphical user interface, an initial selection frame in an initial display state is displayed at a preset position of the graphical user interface in response to a first trigger operation of the touch medium on the virtual control, and the initial selection frame is configured from the initial display state to a target display state in response to a second trigger operation of the touch medium on the virtual control. As described above, when the initial selection frame is in the inactivated initial display state, selection of a virtual unit cannot be performed based on the initial selection frame. When the initial selection frame is in the activated target display state, selection of a virtual unit may be made based on it using the exemplary method of the present disclosure.
For example, the control operation in step S210 may include an operation in which the touch medium presses the joystick to slide the joystick in the chassis. Based on this, a specific implementation manner of step S210 may be that, in response to a sliding operation of a touch medium in the chassis with respect to the joystick, a display size of an initial selection frame displayed in the graphical user interface is adjusted according to a sliding distance of the sliding operation to generate a target selection frame.
In an exemplary embodiment, the initial selection box may include a point at a preset position in the graphical user interface, and after the initial selection box is resized, the generated target selection box may include an arbitrary quadrangle (e.g., a rectangle), a circle, or another arbitrary closed polygon. The initial selection box may also comprise a closed figure of any shape having a certain display size in the graphical user interface, such as a rectangle, a circle, etc., when the target selection box and the initial selection box are in accordance with each other.
It can be seen that the generated target selection box may be a rectangle when the initial selection box is a point or a rectangle, and may be a circle when the initial selection box is a point or a circle.
In the following, taking the target selection frame as a rectangle and as a circle respectively, it is described how to adjust the display size of the initial selection frame displayed in the graphical user interface according to the sliding distance of the sliding operation performed by the touch medium on the joystick in the chassis, so as to generate the target selection frame.
When the target selection frame is a rectangle, the sliding distances of the sliding operation mapped in the horizontal direction and in the vertical direction may be used to correspondingly determine the increments of the rectangle's display size in the horizontal and vertical directions, and the display size of the initial selection frame is then adjusted accordingly to generate the target selection frame.
Specifically, a proportional relationship between the sliding distance mapped by the sliding operation in the horizontal direction and the corresponding increment of the length of the rectangle, and a proportional relationship between the sliding distance mapped in the vertical direction and the corresponding increment of the width of the rectangle, may be preset; for example, both may be preset to 1:2. Taking the initial selection frame being a point as an example, when the user slides the rocker in the chassis such that the mapped distance in the horizontal direction is 5px and the mapped distance in the vertical direction is 4px, a target selection frame with a length of 10px and a width of 8px is generated in the graphical user interface, where the point corresponding to the initial selection frame is the intersection of the rectangle's diagonals. As shown in fig. 4, the initial selection box 33 in fig. 3 may be adjusted according to the sliding operation of the user, so as to generate the target selection box 41 in fig. 4.
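A minimal numeric sketch of this mapping, assuming the point-type initial selection frame and the 1:2 ratios of the example above (the function name and the coordinate convention are illustrative assumptions):

```python
def target_box_from_slide(center, dx, dy, ratio=2.0):
    """Map the rocker slide distances (in px) mapped onto the horizontal (dx) and
    vertical (dy) axes to a rectangular target selection frame whose length and
    width grow at the assumed 1:2 ratio, centered on the initial point."""
    length = abs(dx) * ratio          # 5 px of horizontal slide -> 10 px length
    width = abs(dy) * ratio           # 4 px of vertical slide   -> 8 px width
    cx, cy = center
    # The initial point sits at the intersection of the rectangle's diagonals.
    return (cx - length / 2, cy - width / 2, length, width)

print(target_box_from_slide(center=(960, 540), dx=5, dy=4))
# -> (955.0, 536.0, 10.0, 8.0): a 10 px x 8 px target selection frame
```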
It should be noted that, during the process of generating the target selection frame, the user may slide the rocker back and forth to determine the optimal display size of the target selection frame. The sliding direction corresponding to the first sliding operation may be determined as the positive direction; afterwards, whenever the sliding direction of the user's sliding operation mapped in the horizontal or vertical direction coincides with that of the first sliding operation, the increment of the corresponding display boundary is positive, otherwise it is negative. Suppose that, during one adjustment, the user first slides the rocker in the chassis up and to the right by a distance of 3px and then slides it vertically downward by a distance of 1px; the horizontal rightward direction mapped from the first slide is then the positive horizontal direction, and the vertical upward direction is the positive vertical direction. That is, the display sizes of the initial selection frame in the horizontal and vertical directions are first increased correspondingly; then the horizontal display size stays unchanged while the vertical display size is decreased correspondingly, finally generating the target selection frame.
When the target selection frame is circular, the increased length of the radius of the circle can be determined according to the sliding distance of the sliding operation mapping in the horizontal direction or the vertical direction, or the increased length of the radius of the circle can be determined directly according to the sliding distance of the sliding operation, so that the display size of the initial selection frame is adjusted to generate the target selection frame. Similarly, a preset proportional relationship between the sliding distance of the sliding operation and the increasing length of the radius of the circular selection frame may be predetermined, and taking the example of determining the increasing length of the radius of the circular shape based on the sliding distance of the sliding operation mapping in the horizontal direction, the direction in which the initial sliding operation is initially mapped in the horizontal direction may be defined as a positive direction, for example, when the sliding operation is initially slid to the right, the horizontal rightward direction is the positive direction, and when the sliding operation is mapped in the positive direction, the increasing length of the radius is a positive value, otherwise, the increasing length is a negative value.
Illustratively, the center of the rocker coincides with the center of the chassis when the rocker is not being slid. Taking the case where the rocker and the chassis are circular, the initial selection frame is a point in the graphical user interface, and the generated target selection frame is rectangular, the distance that the rocker is slid in the horizontal direction is proportional to the length of the generated target selection frame, and the distance that the rocker is slid in the vertical direction is proportional to the width of the target selection frame. In other words, the area of the target selection frame grows with the distance between the center of the rocker and the center of the chassis; that is, the farther the rocker center is from the chassis center, the larger the generated target selection frame is. Specifically, the length and width of the selection frame may be determined by percentages: the percentage of the distance the rocker is slid along the horizontal axis of the chassis to the radius of the chassis equals the percentage of the length of the target selection frame to the length of the graphical user interface, and the percentage of the distance the rocker is slid along the vertical axis of the chassis to the radius of the chassis equals the percentage of the width of the target selection frame to the width of the graphical user interface.
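The percentage-based variant described above might be sketched as follows (a hypothetical illustration; the argument names and the clamping of the rocker offset to the chassis radius are assumptions):

```python
def target_box_from_rocker_offset(rocker_dx, rocker_dy, chassis_radius,
                                  gui_w, gui_h):
    """Length of the target frame as a fraction of the GUI width equals the
    rocker's horizontal offset as a fraction of the chassis radius; likewise
    width / GUI height in the vertical direction."""
    # Clamp because the rocker cannot leave the chassis.
    fx = min(abs(rocker_dx), chassis_radius) / chassis_radius
    fy = min(abs(rocker_dy), chassis_radius) / chassis_radius
    return fx * gui_w, fy * gui_h     # (length, width) of the target frame

# Rocker pushed halfway out horizontally and a quarter out vertically:
print(target_box_from_rocker_offset(30, 15, 60, 1920, 1080))   # -> (960.0, 270.0)
```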
The above manner is to adjust the display size of the initial selection frame by controlling the rocker to slide in the chassis, so as to generate the target selection frame. In another exemplary embodiment, the control operation in step S210 may include performing pressing operations with different forces on a joystick in the chassis, and specifically, the display size of the initial selection frame may be adjusted by detecting a pressing force value of the touch medium against the joystick based on a 3D touch (three-dimensional touch) technology.
For example, referring to fig. 5, the method for adjusting the display size of the initial selection box based on the 3D touch technology may include steps S510 to S520.
In step S510, in response to a pressing operation of the touch medium on the joystick, the pressing force value of the touch medium on the joystick is detected.
For example, when the duration of the contact state of the touch medium and the rocker is greater than a preset threshold, such as 0.2 second, the current pressing force value of the touch medium may be detected based on a 3D touch technology. Specifically, compared with the operation of multi-point Touch in a planar two-dimensional space, the 3D-Touch technology increases the perception of the finger strength and the finger area, so that the current pressing strength value of the Touch medium on the rocker can be directly detected through the 3D-Touch technology.
Next, in step S520, based on a preset mapping relationship, a display size of an initial selection frame displayed in the graphical user interface is adjusted according to the pressing force value to generate a target selection frame.
In an exemplary embodiment, the preset mapping relationship may include a corresponding relationship between the pressing force value and the increment of the display size of the initial selection frame, and the corresponding relationship may be a proportional relationship, that is, the larger the pressing force value is, the larger the increment of the display size of the initial selection frame is.
For example, the correspondence relationship between the pressing force value and the increment of the display size of the initial selection frame may be preset as y = 3x, where x represents the pressing force value and y represents the increment of the display size of the initial selection frame, so that the increment of the display size of the initial selection frame may be determined based on the detected pressing force value, thereby adjusting the display size of the initial selection frame. When the generated target selection frame is a rectangle, the preset mapping relationship may include a mapping relationship between the pressing force value and the increment of the length of the rectangle and a mapping relationship between the pressing force value and the increment of the width of the rectangle, which may be equal or unequal, and this is not particularly limited in this exemplary embodiment. When the generated target selection frame is a circle, the preset mapping relationship may include a correspondence relationship between the pressing force value and the increment of the radius of the circle.
In an exemplary embodiment, the preset mapping relationship may further include a correspondence relationship between pressing force values of different levels and increments of the display size of the initial selection frame. For example, when the pressing force value is greater than 0 and less than 3, the pressing force value is of a first level, the increment of the display size of the corresponding initial selection frame is 20px, when the pressing force value is not greater than 5 and not less than 3, the pressing force value is of a second level, the increment of the display size of the corresponding initial selection frame is 40px, and the like. And after the current pressing force value is detected, determining a pressing force level corresponding to the current pressing force value, and then adjusting the display size of the initial selection frame according to the increment of the display size corresponding to the pressing force level to generate the target selection frame.
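The two mappings described above, the linear preset relationship y = 3x and the tiered levels, could look roughly like this (an illustrative sketch; reading the force value from the platform's 3D Touch API is outside its scope, and the third-level value is an assumed continuation of the pattern):

```python
# Illustrative only: the pressing force value is assumed to be supplied by the platform.

def size_increment_linear(force):
    """Linear preset mapping y = 3x between pressing force value x and
    display-size increment y, as in the example above."""
    return 3 * force

def size_increment_tiered(force):
    """Tiered mapping from the example: 0 < x < 3 -> first level, 20 px;
    3 <= x <= 5 -> second level, 40 px; further levels assumed analogous."""
    if 0 < force < 3:
        return 20
    if 3 <= force <= 5:
        return 40
    return 60  # assumed continuation of the pattern

print(size_increment_linear(2.0))   # -> 6.0
print(size_increment_tiered(4.0))   # -> 40
```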
In still another exemplary embodiment, the control operation in step S210 may further include an operation of sliding in a second preset region of the graphical user interface. Based on this, in a specific implementation manner of step S210, when it is detected that the touch medium is in a contact state with the joystick, in response to a sliding operation in a second preset area, a display size of an initial selection frame displayed in the graphical user interface is adjusted according to a sliding distance and/or a sliding direction of the sliding operation, so as to generate a target selection frame.
For example, taking the touch medium as the finger of the user as an example, the user may press the rocker with the left hand, and then slide the finger of the right hand in the second preset area to adjust the display size of the initial selection frame, so as to generate the target selection frame. For example, a certain area in the graphic user interface may be previously configured as a selection box resizing area, i.e., a second preset area. When the touch medium slides in the area, the display size of the generated selection frame can be correspondingly adjusted based on the sliding operation of the touch medium.
The specific implementation of adjusting the display size of the initial selection frame displayed in the graphical user interface according to the sliding distance and the sliding direction of the touch medium in the second preset area and according to the sliding distance of the touch medium in the second preset area may refer to the specific implementation of adjusting the display size of the initial selection frame in the graphical user interface according to the sliding operation of the touch medium on the rocker in the chassis, and will not be described herein again.
A specific implementation manner of adjusting the display size of the initial selection frame displayed in the graphical user interface according to the sliding direction of the touch medium in the second preset area may be that the display size of the initial selection frame is adjusted according to the sliding direction based on the size increase of the preset boundary of the initial selection frame.
Taking a rectangular selection frame as an example, the preset increase size of the boundary in the first direction is a and the increase size of the boundary in the second direction is b; that is, during the generation of the target selection frame, each slide of the touch medium increases the boundary in the first direction by a and the boundary in the second direction by b, with the direction in which the first slide is mapped in the first and second directions taken as the positive direction: for the positive direction, the boundary increase is positive, and for the negative direction, it is negative. For example, taking the first direction as the horizontal direction and the second direction as the vertical direction, suppose that in generating a certain target selection frame the touch medium is first slid twice toward the upper left and then slid once toward the lower right, so that the leftward horizontal direction and the upward vertical direction are the positive directions. The display sizes of the initial selection frame in the first and second directions are then first increased by 2a and 2b respectively, and subsequently decreased by a and b respectively, so as to finally generate a rectangular selection frame with a display size of a in the first direction and b in the second direction. The target selection box may be understood as the selection box used when the virtual unit is finally selected.
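A small sketch of this per-slide boundary increment scheme, assuming example values for a and b and representing each slide only by the signs of its horizontal and vertical components (all names and values are illustrative):

```python
def resize_by_swipes(swipes, a=40, b=30):
    """Accumulate fixed per-swipe boundary increments: each swipe adds a to the
    first-direction size and b to the second-direction size, with the sign taken
    from whether the swipe matches the first swipe's (positive) direction."""
    width = height = 0
    positive = None
    for direction in swipes:                  # direction: (+1 or -1, +1 or -1)
        if positive is None:
            positive = direction              # first swipe defines the positive axes
        sx = 1 if direction[0] == positive[0] else -1
        sy = 1 if direction[1] == positive[1] else -1
        width += sx * a
        height += sy * b
    return max(width, 0), max(height, 0)

# Two swipes toward the upper left, then one toward the lower right:
print(resize_by_swipes([(-1, 1), (-1, 1), (1, -1)]))   # -> (40, 30), i.e. a x b
```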
Of course, the display size of the initial selection frame may also be adjusted according to the sliding direction based on the preset increasing speed of the display boundary, which is similar to the embodiment of adjusting the display size of the initial selection frame according to the sliding direction based on the preset increasing size of the display boundary, and is not described herein again.
With continued reference to fig. 2, in step S220, a target virtual unit is selected from the plurality of virtual units to be selected according to the position relationship between the target selection frame and the plurality of virtual units to be selected.
Next, a specific embodiment of step S220 will be further described with reference to fig. 6 to 11.
Exemplarily, fig. 6 is a schematic flowchart illustrating a method for selecting a virtual unit according to a position relationship between a target selection frame and a plurality of virtual units to be selected in an exemplary embodiment of the present disclosure. Referring to fig. 6, the method may include steps S610 to S620. Wherein:
in step S610, in response to a sliding operation in a second preset area in the graphical user interface, adjusting a pose of a virtual camera in the game so that a display position of a virtual unit to be selected in the graphical user interface is at least partially within the target selection frame.
In an exemplary embodiment, a game picture obtained by shooting a part or all of a game scene and a plurality of virtual units positioned in the game scene by a virtual camera is displayed through a graphical user interface. When the pose of the virtual camera changes, the game picture displayed in the graphical user interface is updated, and correspondingly, the display position of the virtual unit to be selected in the graphical user interface also changes along with the update of the game picture.
For example, a certain area in the graphical user interface may be configured as a pose adjustment area of the virtual camera, that is, a second preset area in advance, as shown in 35 in fig. 3, when the touch medium slides in the area, the pose of the virtual camera may be correspondingly adjusted based on the sliding operation of the touch medium, so that the display position of the virtual unit to be selected in the graphical user interface is at least partially located in the target selection frame. The pose of the virtual camera comprises the position and the rotation angle of the virtual camera.
It should be noted that the first preset area and the second preset area are two areas in the graphical user interface that do not overlap. When the second preset area is the area for adjusting the pose of the virtual camera, it cannot be used to control the display size or the movement of the target selection frame; when the second preset area is the area for controlling the display size of the target selection frame, it cannot be used to adjust the pose of the virtual camera or the movement of the target selection frame; and when the second preset area is the area for controlling the movement of the target selection frame, it cannot be used to adjust the pose of the virtual camera or the display size of the target selection frame.
For example, according to a preset proportional relationship between the sliding distance of the sliding operation and the moving distance and/or the rotation angle of the virtual camera, the moving distance and/or the rotation angle of the virtual camera can be determined based on the sliding distance of the sliding operation, the moving direction and/or the rotation direction of the virtual camera can be determined based on the sliding direction of the sliding operation, for example, the moving direction of the virtual camera in the virtual scene is consistent with or opposite to the sliding direction of the sliding operation, and then the virtual camera in the game can be moved or rotated based on the sliding direction and the sliding distance of the sliding operation for the second preset area to adjust the pose of the virtual camera.
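A minimal sketch of such a slide-to-pose mapping (the proportionality constant, the translation-only handling, and the choice of moving the camera opposite to the slide are all assumptions; a rotation-angle mapping would be analogous):

```python
def adjust_camera_pose(camera_pos, slide_dx, slide_dy,
                       move_per_px=0.01, invert=True):
    """Move the virtual camera according to the slide in the second preset area.
    Slide components map to camera translation on the horizontal plane here."""
    sign = -1.0 if invert else 1.0            # opposite to the slide, so content follows the finger
    x, y, z = camera_pos
    return (x + sign * slide_dx * move_per_px,
            y,
            z + sign * slide_dy * move_per_px)

print(adjust_camera_pose((0.0, 10.0, 0.0), slide_dx=200, slide_dy=-100))
# -> (-2.0, 10.0, 1.0)
```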
As described above, when the virtual camera moves, the game scene picture captured by the virtual camera changes, and the game picture displayed in the graphical user interface changes accordingly, so that the display positions of the virtual units of the game scene in the graphical user interface change; by controlling the movement or rotation of the virtual camera, the display position of at least one virtual unit to be selected in the graphical user interface can be moved at least partially into the selection frame. However, it is understood that, at this time, the position of each virtual unit in the game scene itself does not change; what changes is the display position in the graphical user interface to which each virtual unit's position in the game scene is mapped. The virtual unit to be selected can be understood as the virtual unit that the game player wants to select.
In step S620, a target virtual unit is selected from the multiple virtual units to be selected according to the virtual unit to be selected, which is located at least partially in the target selection frame at the display position in the graphical user interface.
For example, the specific implementation of step S620 may include: and selecting a target virtual unit from the plurality of virtual units to be selected according to the position overlapping degree between the display position of the virtual unit to be selected and the target selection frame.
The display position of the virtual unit to be selected may include a display area occupied by the virtual unit to be selected in the graphical user interface. The position overlapping degree can be understood as the overlapping ratio between the area of the display area occupied by the display position of the virtual unit to be selected in the graphical user interface and the area of the display area occupied by the display position of the target selection frame in the graphical user interface.
Take the example that a represents the area of the display area occupied by the display position of a certain virtual unit to be selected in the graphical user interface, and B represents the area of the display area occupied by the display position of the target selection frame in the graphical user interface. Then, the position overlapping ratio between the two can be expressed as the following formula (1) or formula (2):
position overlap ratio = (A ∩ B) / A (1)
position overlap ratio = (A ∩ B) / B (2)
wherein A ∩ B denotes the intersection of A and B, i.e., the area of the region where A and B overlap.
For example, when a target virtual unit is selected according to a position overlapping degree between the display position of the virtual unit to be selected and the target selection frame, the virtual unit to be selected, of which the position overlapping degree is greater than or equal to a preset threshold value, may be determined as the target virtual unit. The preset threshold may be set by self-definition according to actual conditions or requirements, and may be any value greater than 0 and less than or equal to 1. Taking the above formula (1) as an example, when the preset threshold is equal to 1, it may be indicated that the virtual unit in which the display position in the graphical user interface is entirely contained within the target selection box is selected.
In an exemplary embodiment, taking the vertical line passing through the center point of the chassis of the virtual control as a boundary, when the rocker is slid to the right in the chassis, the virtual units contained in the target selection frame may be selected, for example virtual units whose overlap ratio with the target selection frame is greater than 85%; when the rocker is slid to the left in the chassis, all virtual units in contact with the target selection frame may be selected, for example virtual units whose overlap ratio with the target selection frame is greater than 10%. That is, the preset threshold corresponding to the overlap ratio when the rocker is slid rightwards in the chassis may be greater than that when the rocker is slid leftwards, and the reverse is certainly also possible. Alternatively, a single preset threshold may be set without distinguishing whether the rocker is slid rightwards or leftwards in the chassis; for example, whichever way the rocker is slid, virtual units with an overlap ratio greater than 90% are selected as target virtual units.
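Putting the overlap computation of formula (1) together with a preset threshold, a hypothetical selection routine might look like the following (the rectangle format, unit data, and threshold values are illustrative only):

```python
def intersection_area(a, b):
    """Overlap area of two axis-aligned rectangles given as (x, y, w, h)."""
    ix = max(0.0, min(a[0] + a[2], b[0] + b[2]) - max(a[0], b[0]))
    iy = max(0.0, min(a[1] + a[3], b[1] + b[3]) - max(a[1], b[1]))
    return ix * iy

def select_targets(candidates, target_box, threshold):
    """Pick the virtual units whose position overlap ratio with the target
    selection frame, computed as (A ∩ B) / A per formula (1), reaches the
    preset threshold."""
    selected = []
    for unit_id, unit_rect in candidates:
        area_a = unit_rect[2] * unit_rect[3]
        overlap = intersection_area(unit_rect, target_box) / area_a if area_a else 0.0
        if overlap >= threshold:
            selected.append(unit_id)
    return selected

units = [("u1", (100, 100, 20, 20)), ("u2", (300, 300, 20, 20))]
box = (90, 90, 60, 60)
# Direction-dependent thresholds from the text: e.g. 0.85 when the rocker was
# slid rightwards, 0.10 when it was slid leftwards.
print(select_targets(units, box, threshold=0.85))   # -> ['u1']
```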
Through the steps S610 to S620, the display size of the generated target selection frame may be determined based on the distance that the rocker is slid in the chassis, and the pose of the virtual camera may be adjusted based on the sliding of the touch medium in the second preset area, so as to move the display position of the virtual unit to be selected in the graphical user interface to the target selection frame, thereby realizing the selection of the virtual unit, and the target selection frame is not moved in the whole process.
In another exemplary embodiment, the display size of the generated target selection frame may be determined based on the distance that the rocker is slid in the chassis, and the generated target selection frame is moved based on the sliding operation of the touch medium in the second preset area, so as to implement the selection of the virtual unit.
For example, fig. 7 is a flowchart illustrating another method for selecting a virtual unit according to a target selection box in an exemplary embodiment of the disclosure. Referring to fig. 7, the method may include steps S710 to S720. Wherein,
in step S710, in response to a sliding operation in a second preset area in the graphical user interface, the target selection frame is moved in the graphical user interface, so that a display position of the virtual unit to be selected in the graphical user interface is at least partially located in the target selection frame.
The second preset area in step S710 may include any area in the graphical user interface that does not have an overlapping portion with the first preset area, and the display size of the second preset area may be customized according to a requirement. When the touch medium slides in the second preset area, the target selection frame may be moved in the graphical user interface based on the sliding operation of the touch medium, so that the display position of the virtual unit to be selected is at least partially located in the target selection frame.
For example, the moving distance of the target selection frame corresponding to the sliding distance of the current sliding operation may be determined according to a preset proportional relationship between the sliding distance of the sliding operation and the moving distance of the target selection frame, and the moving direction of the target selection frame is determined by the sliding direction of the sliding operation, i.e., the two directions coincide. The target selection frame is then moved according to the sliding distance and sliding direction of the sliding operation, so that the display position of the virtual unit to be selected in the graphical user interface is at least partially located in the target selection frame.
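A sketch of the proportional mapping just described: the movement direction is taken directly from the slide direction, and the movement distance is the slide distance scaled by a preset ratio (the ratio value is an assumption, and the Rect helper from the first snippet is reused).

```python
def move_selection_box(box: Rect, slide_dx: float, slide_dy: float, ratio: float = 1.0) -> Rect:
    # The frame moves in the same direction as the slide, by a proportionally scaled distance.
    return Rect(box.x + slide_dx * ratio, box.y + slide_dy * ratio, box.w, box.h)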
Next, in step S720, a target virtual unit is selected from the plurality of virtual units to be selected according to the virtual unit to be selected, which is at least partially located in the target selection frame at the display position in the graphical user interface.
The specific implementation of step S720 is completely the same as the specific implementation of step S620, and is not described herein again.
It should be noted that, in the embodiment shown in fig. 7, the selection of the virtual unit is performed by moving the target selection frame in the graphical user interface, that is, only virtual units currently displayed in the graphical user interface can be selected. Therefore, when the embodiment shown in fig. 7 is used, the pose of the virtual camera may be adjusted by sliding in the second preset area before the initial selection frame is activated, so that the virtual units to be selected are displayed in the graphical user interface; the initial selection frame is then activated to perform the selection. Once the initial selection frame is activated, the second preset area is used only for adjusting the display position of the generated target selection frame in the graphical user interface and no longer for adjusting the pose of the virtual camera.
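The routing rule in this note (the second preset area drives the camera before the selection frame is activated, and only moves the target selection frame afterwards) can be sketched as a small dispatcher; the camera.pan call is an assumed interface, not something defined by the disclosure, and move_selection_box comes from the earlier snippet.

```python
def handle_second_area_slide(box_select_active: bool, slide_dx: float, slide_dy: float,
                             camera, box: Rect) -> Rect:
    # While frame selection is active, the slide only repositions the target selection frame.
    if box_select_active:
        return move_selection_box(box, slide_dx, slide_dy)
    # Otherwise the slide adjusts the virtual camera pose to update the displayed game picture.
    camera.pan(slide_dx, slide_dy)  # assumed camera interface
    return box
```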
In the embodiment shown in fig. 6, during the actual selection process the display position of the virtual unit to be selected in the graphical user interface is moved into the target selection frame by adjusting the pose of the virtual camera, so any virtual unit to be selected in the virtual scene can be selected without being displayed in the graphical user interface in advance. Of course, to improve selection efficiency, the display position of the virtual unit to be selected may be moved near the initial selection frame in advance by adjusting the pose of the virtual camera, so that during the actual selection the virtual unit can be framed directly by the generated target selection frame with only fine adjustment of the virtual camera, or even without adjusting it at all.
In still another exemplary embodiment, when the display size of the initial selection frame is adjusted in step S210 based on the pressing force value of the touch medium against the rocker, or based on the sliding operation of the touch medium in the second preset area, the target selection frame may be moved in the graphical user interface in step S220 by a sliding operation of the rocker, so that the display position of the virtual unit to be selected in the graphical user interface is at least partially located in the target selection frame, thereby performing the selection of the virtual unit.
Exemplarily, fig. 8 is a flowchart illustrating a method for selecting a virtual unit according to a position relationship between a target selection box and a plurality of virtual units to be selected in an exemplary embodiment of the present disclosure. Referring to fig. 8, the method may include steps S810 to S820. Wherein:
in step S810, in response to a sliding operation of a touch medium in the chassis against the rocker, moving the target selection frame in the graphical user interface so that a display position of the virtual unit to be selected in the graphical user interface is at least partially located in the target selection frame.
Illustratively, the horizontal and vertical central axes of the chassis are used as boundary lines: when the rocker is above the horizontal central axis the target selection frame moves upwards, otherwise it moves downwards; when the rocker is to the right of the vertical central axis the target selection frame moves rightwards, otherwise it moves leftwards. The moving direction of the target selection frame in the graphical user interface is thus determined by the sliding direction of the rocker, and its moving distance is determined by a preset proportional relationship between the sliding distance of the rocker and the moving distance of the target selection frame.
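A sketch of this mapping from the rocker's offset within the chassis to the frame movement, assuming offsets measured from the chassis centre (positive x to the right, positive y upward) and an illustrative proportionality constant; move_selection_box comes from the earlier snippet.

```python
def move_box_by_rocker(box: Rect, offset_x: float, offset_y: float, ratio: float = 2.0) -> Rect:
    # The sign of each offset from the chassis centre gives the movement direction
    # (right/up for positive, left/down for negative); the distance is the offset
    # magnitude scaled by the preset proportional ratio.
    return move_selection_box(box, offset_x * ratio, offset_y * ratio)
```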
In step S820, a target virtual unit is selected from the multiple virtual units to be selected according to the virtual unit to be selected, which is located at least partially in the target selection frame at the display position in the graphical user interface.
For example, the specific implementation of step S820 is completely the same as the specific implementation of step S620, and is not described again here.
When the display size of the initial selection frame is adjusted based on the pressing force value of the touch medium against the rocker in step S210, steps S810 to S820 allow the user to complete the selection of virtual units with one hand. Specifically, the user controls the size of the generated target selection frame by the force with which the finger presses the rocker; as shown in fig. 9, the target selection frame 91 is generated according to the current pressing force value. The user then continues to press the rocker and slides it in the chassis to move the target selection frame 91, obtaining the target selection frame 101 in fig. 10, so that the display positions of the virtual units to be selected in the graphical user interface are at least partially located in the target selection frame 101. If the finger is then released, target virtual units are selected from the multiple virtual units to be selected according to the current display size of the target selection frame 101 and the position overlap degree between its display position and the display positions of the virtual units to be selected in the graphical user interface. As shown in fig. 11, the finally selected target virtual units are 111, 112 and 113. In this way, occlusion of the user's field of view during the selection process is avoided, the accuracy of the user operation is improved, and the selection efficiency of virtual units is improved.
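The one-handed flow of Figs. 9 to 11 can be summarised as a small state holder, reusing the helpers from the earlier snippets: the press force sets the frame size through an assumed linear mapping, the continued slide moves the frame, and lifting the finger triggers the selection. All constants and names here are illustrative assumptions rather than part of the disclosure.

```python
class OneHandSelector:
    def __init__(self, initial_box: Rect, max_force: float = 1.0):
        self.initial_box = initial_box
        self.box = initial_box
        self.max_force = max_force

    def on_press(self, force: float) -> None:
        # Assumed preset mapping: the normalised press force scales the initial frame linearly.
        normalised = min(max(force / self.max_force, 0.0), 1.0)
        self.box = scaled_selection_box(self.initial_box, slide_distance=normalised * 100.0)

    def on_slide(self, dx: float, dy: float) -> None:
        # Continuing to press and slide moves the generated target selection frame.
        self.box = move_selection_box(self.box, dx, dy)

    def on_release(self, units: dict, threshold: float = 0.85) -> list:
        # Lifting the finger performs the selection against the current target frame.
        return select_targets(units, self.box, threshold)
```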
In an exemplary embodiment, the size of the generated target selection frame may be controlled by the degree to which the touch medium presses the rocker, while either the position of the generated target selection frame is adjusted through a sliding operation in the second preset area, or the pose of the virtual camera is adjusted to change the display positions of the virtual units in the graphical user interface, so that the virtual units the user wants to select are framed within the generated target selection frame. In this case the virtual control may include only one touch key, for example only the rocker without the chassis, and the rocker is not slid during the selection of the virtual unit. The present exemplary embodiment is not particularly limited in this regard.
Illustratively, the selecting a target virtual unit from the plurality of virtual units to be selected according to the position relationship between the target selection frame and the plurality of virtual units to be selected includes: and when the touch medium and the virtual control are detected to be changed from a contact state to a non-contact state, selecting a target virtual unit from the multiple virtual units to be selected according to the position relation between the target selection frame and the multiple virtual units to be selected.
For example, in an exemplary application scenario, as shown in fig. 3, the user may press in the first preset area 31 with the left hand; a virtual control 32 is then displayed at the pressing position, and the pressing position becomes the display position of the rocker in the virtual control. The user can then continue to press the rocker with the left hand to control the size of the generated target selection frame, while sliding in the second preset area 35 with the right hand to adjust the pose of the virtual camera so that the display positions of the virtual units the user wants to select are at least partially located in the target selection frame. When the display positions of all the desired virtual units are at least partially within the target selection frame, the user releases the left finger; at this time the virtual units are selected according to the overlap ratio between their display positions in the graphical user interface and the currently generated target selection frame, that is, virtual units whose overlap ratio with the target selection frame is greater than the preset threshold are selected.
In this exemplary application scenario, the display position of the target selection frame in the graphical user interface is not changed during the selection process, that is, the center point of the target selection frame always remains at the preset position of the graphical user interface, and the display positions of the virtual units in the graphical user interface are adjusted through the pose change of the virtual camera so that the virtual units to be selected are framed within the target selection frame. Throughout the process, the position of a virtual unit in the game scene does not change; only its display position in the graphical user interface changes.
In another exemplary application scenario, as shown in fig. 3, the user may press in the first preset area 31 with the left hand; a virtual control 32 is then displayed at the pressed position, and the pressed position becomes the display position of the rocker 321 in the virtual control. The user can then keep pressing the rocker with the left hand and slide it in the chassis to control the size of the generated target selection frame, while sliding in the second preset area 35 with the right hand to move the currently generated target selection frame in the graphical user interface, so that the display positions of the virtual units the user wants to select are at least partially located in the target selection frame. When the display positions of all the desired virtual units are at least partially within the target selection frame, the user releases the left finger; the virtual units are then selected according to the overlap ratio between their display positions in the graphical user interface and the currently generated target selection frame, that is, virtual units whose overlap ratio with the target selection frame is greater than the preset threshold are selected. In this exemplary application scenario, the display positions of the virtual units in the graphical user interface are not changed during the selection process; the virtual units to be selected are framed by moving the target selection frame.
In yet another exemplary application scenario, in the graphical user interface shown in fig. 9, the user may control the size of the generated target selection frame by the force with which the left hand presses the rocker, and then, while still pressing, slide the rocker to adjust the display position of the generated target selection frame in the graphical user interface so that the display positions of the virtual units to be selected are at least partially located in the generated target selection frame. The left hand is then released, and the virtual units are selected according to the overlap ratio between their display positions in the graphical user interface and the currently generated target selection frame, that is, virtual units whose overlap ratio with the target selection frame is greater than the preset threshold are selected. In this exemplary application scenario, the display positions of the virtual units in the graphical user interface are not changed during the selection process; the virtual units to be selected are framed by moving the target selection frame.
In another exemplary application scenario, as shown in fig. 3, the user may press in the first preset area 31 with the left hand; a virtual control 32 is then displayed at the pressed position, and the pressed position becomes the display position of the rocker 321 in the virtual control. The user can then slide in the second preset area with the right hand to adjust the size of the initial selection frame 33, and slide the rocker in the chassis with the left hand to move the selection frame in the graphical user interface, so that the display positions of the virtual units to be selected are at least partially located in the generated target selection frame. When the display positions of all the desired virtual units are at least partially within the target selection frame, the user releases the left finger; target virtual units may then be selected according to the position overlap degree between the display positions of the virtual units in the graphical user interface and the currently generated target selection frame, for example by selecting virtual units whose overlap ratio with the currently generated target selection frame is greater than the preset threshold. In this exemplary application scenario, the movement of the generated target selection frame is controlled by the rocker, and its display size is controlled by sliding in the second preset area.
It should be noted that, in the present disclosure, after the left finger leaves the virtual control, the virtual joystick returns to the center of the chassis.
In an exemplary embodiment, after the virtual units are selected, the successfully selected virtual units may be specially marked to remind the user which virtual units are currently selected, and the generated target selection frame may be cancelled from display. The selected mark may be customized according to requirements, for example a yellow circle may be added to each selected virtual unit, which is not particularly limited in this exemplary embodiment. At the same time, the rocker in the virtual control returns to the center of the chassis. Alternatively, the virtual control may be hidden in the first preset area, that is, its display is also cancelled, and it is displayed again the next time the user triggers the first preset area.
In an exemplary embodiment, after the virtual unit is selected, the game of the current client may exit from the frame selection state (the frame selection state may be understood as a state corresponding to the process of selecting the virtual unit), that is, the game of the current client is in the non-frame selection state. In the non-frame selection state, the second preset area or any area in the whole graphical user interface can be used for controlling the pose adjustment of the virtual camera so as to update the game picture displayed in the graphical user interface.
Continuing to refer to fig. 2, next, in step S230, the target virtual unit is controlled to perform a game action.
In an exemplary application scenario, after selecting target virtual units from the plurality of virtual units to be selected, the game player may control the selected target virtual units to execute corresponding game actions according to their own requirements, for example controlling the target virtual units to move or attack, or grouping the target virtual units, which is not particularly limited in this exemplary embodiment.
In the above method, the selection frame can be rapidly generated through the virtual control to select virtual units, which improves selection efficiency. Meanwhile, the size of the generated target selection frame can be controlled through the cooperation of the virtual control and the second preset area, so that different numbers of virtual units can be selected and the applicability of virtual unit selection is improved; the target selection frame can also be controlled to move, assisting the user in framing the virtual units to be selected within the generated target selection frame, avoiding the selection of unwanted virtual units and ensuring the accuracy of the selection. Furthermore, the selection of virtual units can be completed through operations within a small range, even with one hand, which reduces the occlusion of the user's view during the selection process.
Those skilled in the art will appreciate that all or part of the steps implementing the above embodiments are implemented as computer programs executed by a CPU. The computer program, when executed by the CPU, performs the functions defined by the method provided by the present invention. The program may be stored in a computer readable storage medium, which may be a read-only memory, a magnetic or optical disk, or the like.
Furthermore, it should be noted that the above-mentioned figures are only schematic illustrations of the processes involved in the method according to exemplary embodiments of the invention, and are not intended to be limiting. It will be readily understood that the processes shown in the above figures are not intended to indicate or limit the chronological order of the processes. In addition, it is also readily understood that these processes may be performed synchronously or asynchronously, e.g., in multiple modules.
Fig. 12 is a schematic structural diagram of a virtual unit selection device in a game in an exemplary embodiment of the present disclosure, which displays a game screen of the game through a graphical user interface of a display component, where the game screen includes a part or all of a game scene and a plurality of virtual units to be selected located in the game scene. Referring to fig. 12, the apparatus 1200 may include a target selection box generation module 1210, a selecting module 1220, and a control module 1230. Wherein:

a target selection box generation module 1210 configured to adjust a display size of an initial selection box displayed in a graphical user interface to generate a target selection box in response to a control operation of a touch medium on a virtual control in the graphical user interface;

a selecting module 1220 configured to select a target virtual unit from the multiple virtual units to be selected according to a position relationship between the target selection frame and the multiple virtual units to be selected;

a control module 1230 configured to control the target virtual unit to perform a game action.

In an exemplary implementation manner, based on the foregoing embodiment, the apparatus 1200 may further include an activation display module configured to:
the method comprises the steps of responding to a trigger operation of a touch medium for a first preset area in a graphical user interface, displaying a virtual control in the graphical user interface, and displaying an initial selection frame at a preset position of the graphical user interface.
In an exemplary embodiment, based on the foregoing embodiments, the virtual control includes a rocker and a chassis, the rocker being located in the chassis;
the target selectionbox generation module 1210 may be specifically configured to: and responding to the sliding operation of a touch medium on the rocker in the chassis, and adjusting the display size of the initial selection frame displayed in the graphical user interface according to the sliding distance of the sliding operation to generate a target selection frame.
In an exemplary embodiment, a game picture obtained by shooting a part or all of a game scene and a plurality of virtual units located in the game scene by a virtual camera is displayed through a graphical user interface, and based on the foregoing embodiment, the selectingmodule 1220 may be specifically configured to:
in response to the sliding operation of a second preset area in the graphical user interface, adjusting the pose of a virtual camera in the game so that the display position of the virtual unit to be selected in the graphical user interface is at least partially positioned in the target selection frame;
and selecting a target virtual unit from the plurality of virtual units to be selected according to the virtual unit to be selected which is at least partially positioned in the target selection frame at the display position in the graphical user interface.
In an exemplary implementation manner, based on the foregoing embodiment, the selecting module 1220 may be specifically configured to:
responding to sliding operation of a second preset area in a graphical user interface, and moving the target selection frame in the graphical user interface so that the display position of the virtual unit to be selected in the graphical user interface is at least partially positioned in the target selection frame;
and selecting a target virtual unit from the plurality of virtual units to be selected according to the virtual unit to be selected which is at least partially positioned in the target selection frame at the display position in the graphical user interface.
In an exemplary implementation, based on the foregoing embodiments, the target selection box generation module 1210 may be specifically configured to:

responding to the pressing operation of a touch medium on the rocker, and detecting the pressing force value of the touch medium on the rocker;
and adjusting the display size of the initial selection frame displayed in the graphical user interface according to the pressing force value based on a preset mapping relation so as to generate a target selection frame.
In an exemplary embodiment, based on the foregoing embodiments, the virtual control includes a rocker and a chassis, the rocker being located in the chassis; the target selectionbox generation module 1210 may be specifically configured to:
when the touch control medium is detected to be in a contact state with the rocker, responding to the sliding operation in a second preset area, and adjusting the display size of the initial selection frame displayed in the graphical user interface according to the sliding distance and/or the sliding direction of the sliding operation to generate a target selection frame.
In an exemplary implementation manner, based on the foregoing embodiment, the selecting module 1220 may be specifically configured to:
responding to the sliding operation of a touch medium on the rocker in the chassis, and moving the target selection frame in the graphical user interface to enable the display position of the virtual unit to be selected in the graphical user interface to be at least partially positioned in the target selection frame;
and selecting a target virtual unit from the plurality of virtual units to be selected according to the virtual unit to be selected which is at least partially positioned in the target selection frame at the display position in the graphical user interface.
In an exemplary implementation manner, based on the foregoing embodiment, the selecting a target virtual unit from the plurality of virtual units to be selected according to the virtual unit to be selected located at least partially within the target selection box at the display position in the graphical user interface includes:
and selecting a target virtual unit from the plurality of virtual units to be selected according to the position overlapping degree between the display position of the virtual unit to be selected and the target selection frame.
In an exemplary implementation manner, based on the foregoing embodiment, the selecting module 1220 may be specifically configured to:
and when the touch medium and the virtual control are detected to be changed from a contact state to a non-contact state, selecting a target virtual unit from the multiple virtual units to be selected according to the position relation between the target selection frame and the multiple virtual units to be selected.
The specific details of each module in the above-mentioned device for selecting virtual units in game have been described in detail in the method for selecting virtual units in game, and therefore are not described herein again.
It should be noted that although in the above detailed description several modules or units of the device for action execution are mentioned, such a division is not mandatory. Indeed, the features and functionality of two or more modules or units described above may be embodied in one module or unit, according to embodiments of the present disclosure. Conversely, the features and functions of one module or unit described above may be further divided into embodiments by a plurality of modules or units.
Moreover, although the steps of the methods of the present disclosure are depicted in the drawings in a particular order, this does not require or imply that the steps must be performed in this particular order, or that all of the depicted steps must be performed, to achieve desirable results. Additionally or alternatively, certain steps may be omitted, multiple steps combined into one step execution, and/or one step broken down into multiple step executions, etc.
Through the above description of the embodiments, those skilled in the art will readily understand that the exemplary embodiments described herein may be implemented by software, or by software in combination with necessary hardware. Therefore, the technical solution according to the embodiments of the present disclosure may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (which may be a CD-ROM, a usb disk, a removable hard disk, etc.) or on a network, and includes several instructions to enable a computing device (which may be a personal computer, a server, a mobile terminal, or a network device, etc.) to execute the method according to the embodiments of the present disclosure.
In an exemplary embodiment of the present disclosure, there is also provided a computer storage medium capable of implementing the above method. On which a program product capable of implementing the above-described method of the present specification is stored. In some possible embodiments, various aspects of the disclosure may also be implemented in the form of a program product comprising program code for causing a terminal device to perform the steps according to various exemplary embodiments of the disclosure described in the "exemplary methods" section above of this specification, when the program product is run on the terminal device.
Referring to fig. 13, a program product 1300 for implementing the above method according to an embodiment of the present disclosure is described, which may employ a portable compact disc read only memory (CD-ROM) and include program code, and may be run on a terminal device, such as a personal computer. However, the program product of the present disclosure is not limited thereto, and in this document, a readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
A computer readable signal medium may include a propagated data signal with readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A readable signal medium may also be any readable medium that is not a readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Program code for carrying out operations for the present disclosure may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server. In the case of a remote computing device, the remote computing device may be connected to the user computing device through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computing device (e.g., through the internet using an internet service provider).
In addition, in an exemplary embodiment of the present disclosure, an electronic device capable of implementing the above method is also provided.
As will be appreciated by one skilled in the art, aspects of the present disclosure may be embodied as a system, method or program product. Accordingly, various aspects of the present disclosure may be embodied in the form of: an entirely hardware embodiment, an entirely software embodiment (including firmware, microcode, etc.) or an embodiment combining hardware and software aspects that may all generally be referred to herein as a "circuit," module "or" system.
An electronic device 1400 according to such an embodiment of the present disclosure is described below with reference to fig. 14. The electronic device 1400 shown in fig. 14 is only an example and should not bring any limitations to the functionality and scope of use of the embodiments of the present disclosure.

As shown in fig. 14, the electronic device 1400 is embodied in the form of a general purpose computing device. The components of the electronic device 1400 may include, but are not limited to: the at least one processing unit 1410, the at least one memory unit 1420, the bus 1430 that connects the various system components (including the memory unit 1420 and the processing unit 1410), and the display unit 1440.

Wherein the storage unit stores program code that is executable by the processing unit 1410, such that the processing unit 1410 performs steps according to various exemplary embodiments of the present disclosure described in the "exemplary methods" section above in this specification. For example, the processing unit 1410 may execute the following as shown in fig. 2: step S210, responding to the control operation of a touch control medium for a virtual control in a graphical user interface, and adjusting the display size of an initial selection frame displayed in the graphical user interface to generate a target selection frame; step S220, selecting the virtual units in the game according to the target selection frame.

The storage unit 1420 may include readable media in the form of volatile memory units, such as a random access memory unit (RAM) 14201 and/or a cache memory unit 14202, and may further include a read only memory unit (ROM) 14203.

The storage unit 1420 may also include a program/utility 14204 having a set (at least one) of program modules 14205, such program modules 14205 including but not limited to: an operating system, one or more application programs, other program modules, and program data, each of which, or some combination thereof, may comprise an implementation of a network environment.
Bus 1430 may be any type of bus structure including a memory cell bus or memory cell controller, a peripheral bus, an accelerated graphics port, a processing unit, or a local bus using any of a variety of bus architectures.
The electronic device 1400 may also communicate with one or more external devices 1500 (e.g., keyboard, pointing device, bluetooth device, etc.), with one or more devices that enable a user to interact with the electronic device 1400, and/or with any devices (e.g., router, modem, etc.) that enable the electronic device 1400 to communicate with one or more other computing devices. Such communication can occur via an input/output (I/O) interface 1450. Also, the electronic device 1400 may communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network, such as the internet) via the network adapter 1460. As shown, the network adapter 1460 communicates with the other modules of the electronic device 1400 via the bus 1430. It should be appreciated that although not shown in the figures, other hardware and/or software modules may be used in conjunction with the electronic device 1400, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage systems, among others.
Through the above description of the embodiments, those skilled in the art will readily understand that the exemplary embodiments described herein may be implemented by software, or by software in combination with necessary hardware. Therefore, the technical solution according to the embodiments of the present disclosure may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (which may be a CD-ROM, a usb disk, a removable hard disk, etc.) or on a network, and includes several instructions to enable a computing device (which may be a personal computer, a server, a terminal device, or a network device, etc.) to execute the method according to the embodiments of the present disclosure.
Furthermore, the above-described figures are merely schematic illustrations of processes included in methods according to exemplary embodiments of the present disclosure, and are not intended to be limiting. It will be readily understood that the processes shown in the above figures are not intended to indicate or limit the chronological order of the processes. In addition, it is also readily understood that these processes may be performed synchronously or asynchronously, e.g., in multiple modules.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.

Claims (13)

1. A method for selecting virtual units in a game is characterized in that a game picture of the game is displayed through a graphical user interface of a display assembly, the game picture comprises part or all of game scenes and a plurality of virtual units to be selected positioned in the game scenes, and the method comprises the following steps:
responding to the control operation of a touch medium for a virtual control in a graphical user interface, and adjusting the display size of an initial selection frame displayed in the graphical user interface to generate a target selection frame;
selecting a target virtual unit from the plurality of virtual units to be selected according to the position relation between the target selection frame and the plurality of virtual units to be selected;
and controlling the target virtual unit to execute game action.
2. The in-game virtual unit selection method according to claim 1, wherein prior to responding to a control operation of a touch-sensitive medium with respect to a virtual control in a graphical user interface, the method further comprises:
the method comprises the steps of responding to a trigger operation of a touch medium for a first preset area in a graphical user interface, displaying a virtual control in the graphical user interface, and displaying an initial selection frame at a preset position of the graphical user interface.
3. The in-game virtual unit selection method according to claim 1 or 2, wherein the virtual control comprises a rocker and a chassis, the rocker being located in the chassis;
the adjusting the display size of the initial selection frame displayed in the graphical user interface to generate a target selection frame in response to a control operation of a touch medium on a virtual control in the graphical user interface comprises:
and responding to the sliding operation of a touch medium on the rocker in the chassis, and adjusting the display size of the initial selection frame displayed in the graphical user interface according to the sliding distance of the sliding operation to generate a target selection frame.
4. The method of claim 1, wherein a game picture obtained by a virtual camera capturing part or all of the game scene and the plurality of virtual units located in the game scene is displayed through the graphical user interface;
the selecting a target virtual unit from the plurality of virtual units to be selected according to the position relationship between the target selection frame and the plurality of virtual units to be selected includes:
in response to the sliding operation of a second preset area in the graphical user interface, adjusting the pose of a virtual camera in the game so that the display position of the virtual unit to be selected in the graphical user interface is at least partially positioned in the target selection frame;
and selecting a target virtual unit from the plurality of virtual units to be selected according to the virtual unit to be selected which is at least partially positioned in the target selection frame at the display position in the graphical user interface.
5. The method for selecting a virtual unit in a game according to claim 1, wherein the selecting a target virtual unit from the plurality of virtual units to be selected according to the positional relationship between the target selection frame and the plurality of virtual units to be selected includes:
responding to sliding operation of a second preset area in a graphical user interface, and moving the target selection frame in the graphical user interface so that the display position of the virtual unit to be selected in the graphical user interface is at least partially positioned in the target selection frame;
and selecting a target virtual unit from the plurality of virtual units to be selected according to the virtual unit to be selected which is at least partially positioned in the target selection frame at the display position in the graphical user interface.
6. The in-game virtual unit selection method of claim 1, wherein the virtual control comprises a rocker and a chassis, the rocker being located in the chassis;
the adjusting, in response to a control operation of a touch medium for a virtual control in a graphical user interface, a display size of an initial selection frame displayed in the graphical user interface to generate a target selection frame includes:
responding to the pressing operation of a touch medium on the rocker, and detecting the pressing force value of the touch medium on the rocker;
and adjusting the display size of the initial selection frame displayed in the graphical user interface according to the pressing force value based on a preset mapping relation so as to generate a target selection frame.
7. The in-game virtual unit selection method of claim 1, wherein the virtual control comprises a rocker and a chassis, the rocker being located in the chassis;
the adjusting, in response to a control operation of a touch medium for a virtual control in a graphical user interface, a display size of an initial selection frame displayed in the graphical user interface to generate a target selection frame includes:
when the touch control medium is detected to be in a contact state with the rocker, responding to the sliding operation in a second preset area, and adjusting the display size of the initial selection frame displayed in the graphical user interface according to the sliding distance and/or the sliding direction of the sliding operation to generate a target selection frame.
8. The method for selecting a virtual unit in a game according to claim 6 or 7, wherein the selecting a target virtual unit from the plurality of virtual units to be selected according to the positional relationship between the target selection frame and the plurality of virtual units to be selected includes:
responding to the sliding operation of a touch medium on the rocker in the chassis, and moving the target selection frame in the graphical user interface to enable the display position of the virtual unit to be selected in the graphical user interface to be at least partially positioned in the target selection frame;
and selecting a target virtual unit from the plurality of virtual units to be selected according to the virtual unit to be selected which is at least partially positioned in the target selection frame at the display position in the graphical user interface.
9. The method for selecting virtual units in a game according to claim 4, 5 or 8, wherein the selecting a target virtual unit from the plurality of virtual units to be selected according to the virtual unit to be selected located at least partially within the target selection box at the display position in the graphical user interface comprises:
and selecting a target virtual unit from the plurality of virtual units to be selected according to the position overlapping degree between the display position of the virtual unit to be selected and the target selection frame.
10. The method for selecting a virtual unit in a game according to claim 1, wherein the selecting a target virtual unit from the plurality of virtual units to be selected according to a positional relationship between the target selection frame and the plurality of virtual units to be selected, comprises:
and when the touch medium and the virtual control are detected to be changed from a contact state to a non-contact state, selecting a target virtual unit from the multiple virtual units to be selected according to the position relation between the target selection frame and the multiple virtual units to be selected.
11. A virtual unit selection device in a game is characterized in that a game picture of the game is shown through a graphical user interface of a display component, the game picture comprises part or all of a game scene and a plurality of virtual units to be selected positioned in the game scene, and the device comprises:
the target selection frame generation module is configured to respond to the control operation of a touch medium on a virtual control in a graphical user interface, adjust the display size of an initial selection frame displayed in the graphical user interface and generate a target selection frame;
the selecting module is configured to select a target virtual unit from the plurality of virtual units to be selected according to the position relation between the target selecting frame and the plurality of virtual units to be selected;
a control module configured to control the target virtual unit to perform a game action.
12. A computer-readable medium, on which a computer program is stored, which program, when being executed by a processor, carries out the method according to any one of claims 1 to 10.
13. An electronic device, comprising:
one or more processors;
storage means for storing one or more programs which, when executed by the one or more processors, cause the one or more processors to carry out the method of any one of claims 1 to 10.
