CN112486321B - Three-dimensional model operation control method and device and terminal equipment - Google Patents

Three-dimensional model operation control method and device and terminal equipment

Info

Publication number
CN112486321B
CN112486321B
Authority
CN
China
Prior art keywords
target
gear
operation data
target control
mapping
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202011380120.0A
Other languages
Chinese (zh)
Other versions
CN112486321A (en)
Inventor
张二阳
李志帅
敖亚磊
侯晓龙
郑旭
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhengzhou J&T Hi Tech Co Ltd
Original Assignee
Zhengzhou J&T Hi Tech Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhengzhou J&T Hi Tech Co Ltd
Priority to CN202011380120.0A
Publication of CN112486321A
Application granted
Publication of CN112486321B
Status: Active
Anticipated expiration

Links

Images

Classifications

Landscapes

Abstract

The application provides a three-dimensional model operation control method and device and a terminal device, where the method includes the following steps: when a user operates a target control using a mouse of the terminal device, displaying an operator on an interface of the terminal device, the operator including an operating rod and a mouse indicating ball; acquiring operation data generated when the user operates the target control with the mouse; performing first mapping processing on the operation data based on a first mapping model to obtain a target position at which the operation data is mapped onto the operating rod, and displaying the mouse indicating ball at the target position; and performing second mapping processing on the operation data based on a second mapping model to obtain a target gear to which the operation data is mapped on the target control, and switching the gear of the target control in the three-dimensional model to the target gear. The method avoids the problem of no response or multiple responses and can simulate the damping feel and stutter feel of a real scene.

Description

Three-dimensional model operation control method and device and terminal equipment
Technical Field
The application relates to the technical field of computer three-dimensional simulation, in particular to a three-dimensional model operation control method and device and terminal equipment.
Background
With the continuing adoption of three-dimensional simulation on the personal computer (PC), users' expectations for the three-dimensional interactive experience on the PC side keep rising, and this experience has become a key factor in evaluating the quality of PC-side three-dimensional simulation.
At present, PC-side three-dimensional interaction is mainly driven by converting the user's keyboard and mouse operations and then presenting the resulting effect on a three-dimensional control of the interface. However, depending on the viewing-angle position during a specific operation, the operation may receive no response or multiple responses, and because of the hardware limitations of the keyboard and mouse, the damping feel and stutter feel of real operation cannot be simulated.
Disclosure of Invention
An object of the present invention is to provide a three-dimensional model operation control method, apparatus and terminal device, so as to solve the prior-art problems that an operation receives no response or multiple responses and that the damping feel and stutter feel cannot be simulated during operation.
In order to achieve the above purpose, the technical solutions adopted in the embodiments of the present application are as follows:
in a first aspect, an embodiment of the present application provides a three-dimensional model operation control method, where the method is applied to a terminal device, a three-dimensional model to be operated is displayed on an interface of the terminal device, and the three-dimensional model includes at least one target control with an adjustable gear to be operated, and the method includes:
when a user uses a mouse of the terminal device to operate a target control, displaying an operator on an interface of the terminal device, wherein the operator comprises the following display elements: an operating rod and a mouse indicating ball.
Obtaining operation data when the user uses the mouse to operate the target control, wherein the operation data comprises: the drag direction and the drag distance.
And performing first mapping processing on the operation data based on a first mapping model to obtain a target position of the operation data mapped on the operating rod, and displaying the mouse indicating ball at the target position.
And performing second mapping processing on the operation data based on a second mapping model to obtain a target gear of the operation data mapped on the target control, and switching the gear of the target control in the three-dimensional model to the target gear.
As an optional implementation manner, the performing, based on the first mapping model, the first mapping processing on the operation data to obtain a target position where the operation data is mapped on the operating rod, and displaying the mouse indicating ball at the target position includes:
and acquiring the number of gears of the target control and the distance between adjacent gears.
And inputting the gear number, the distance between the adjacent gears and the operation data into the first mapping model to obtain a target position of the operation data mapped on the operation rod, and displaying the mouse indicating ball at the target position.
As an optional implementation manner, the first mapping model is configured to divide the length of the operating rod evenly according to the number of gears to obtain a plurality of candidate positions equal in number to the gears, and to select one of the candidate positions as the target position according to the distance between adjacent gears and the operation data.
As an optional implementation manner, the performing, on the basis of the second mapping model, the second mapping process on the operation data to obtain a target gear of the operation data mapped on the target control, and switching the gear of the target control in the three-dimensional model to the target gear includes:
and acquiring the distance between adjacent gears of the target control.
And inputting the distance between the adjacent gears and the operation data into the second mapping model to obtain a target gear of the operation data mapped on the target control, and switching the gear of the target control in the three-dimensional model to the target gear.
As an optional implementation manner, the second mapping model is configured to determine whether to perform gear shifting according to the operation data and the distance between adjacent gears, and use a previous gear or a next gear of a current gear on the target control as the target gear when determining to perform gear shifting.
As an optional implementation manner, the operator further includes: a selection frame and a prompt box, where the prompt box is used to display information of the selected target control.
The displaying the operator on the interface of the terminal equipment comprises the following steps:
and determining the display position of the selected middle frame according to the position of the target control on the interface.
And determining the rotation angle of the operator according to the positions of the two ends of the target control and the initial position of the operating rod.
And rotating the operator according to the rotation angle, displaying the selection frame at its display position at the rotated angle, and displaying the operating rod, the mouse indicating ball and the prompt box according to the preset positional relation between the selection frame and the operating rod, the mouse indicating ball and the prompt box.
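As an illustration of the rotation-angle determination described above, the following minimal sketch (function and parameter names are hypothetical, not from the patent) computes the angle between the axis running through the target control's two endpoints and the operating rod's assumed initial orientation:

```python
import math

def operator_rotation_angle(ctrl_start, ctrl_end, rod_initial_deg=0.0):
    """Hypothetical sketch: angle (in degrees) to rotate the operator so
    that its operating rod aligns with the axis between the two endpoints
    of the target control. Assumes the rod initially points along
    rod_initial_deg on screen; coordinates are (x, y) pixel pairs."""
    dx = ctrl_end[0] - ctrl_start[0]
    dy = ctrl_end[1] - ctrl_start[1]
    control_axis_deg = math.degrees(math.atan2(dy, dx))
    return control_axis_deg - rod_initial_deg
```

Under these assumptions, a horizontally laid-out control needs no rotation, while a vertically laid-out one rotates the operator by 90 degrees.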
As an optional implementation manner, after the displaying the operator on the interface of the terminal device, the method further includes:
and hiding a system mouse icon displayed on the interface of the terminal equipment.
In a second aspect, an embodiment of the present application provides a three-dimensional model operation control apparatus, where the apparatus is applied to a terminal device, a three-dimensional model to be operated is displayed on an interface of the terminal device, and the three-dimensional model includes at least one target control with an adjustable gear to be operated, and the apparatus includes:
the display module is used for displaying an operator on an interface of the terminal equipment when a user operates the target control by using a mouse of the terminal equipment, wherein the operator comprises the following display elements: an operating rod and a mouse indicating ball.
An obtaining module, configured to obtain operation data when the user operates the target control using the mouse, where the operation data includes: the drag direction and the drag distance.
And the first mapping module is used for performing first mapping processing on the operation data based on a first mapping model to obtain a target position of the operation data mapped on the operating rod, and displaying the mouse indicating ball at the target position.
And the second mapping module is used for performing second mapping processing on the operation data based on a second mapping model to obtain a target gear of the operation data mapped on the target control, and switching the gear of the target control in the three-dimensional model to the target gear.
As an optional implementation manner, the first mapping module is specifically configured to:
and acquiring the number of gears of the target control and the distance between adjacent gears.
And inputting the gear number, the distance between the adjacent gears and the operation data into the first mapping model to obtain a target position of the operation data mapped on the operation rod, and displaying the mouse indicating ball at the target position.
As an optional implementation manner, the first mapping model is configured to divide the length of the operating rod evenly according to the number of gears to obtain a plurality of candidate positions equal in number to the gears, and to select one of the candidate positions as the target position according to the distance between adjacent gears and the operation data.
As an optional implementation manner, the second mapping module is specifically configured to:
and acquiring the distance between adjacent gears of the target control.
And inputting the distance between the adjacent gears and the operation data into the second mapping model to obtain a target gear of the operation data mapped on the target control, and switching the gear of the target control in the three-dimensional model to the target gear.
As an optional implementation manner, the second mapping model is configured to determine whether to perform gear shifting according to the operation data and the distance between adjacent gears, and use a previous gear or a next gear of a current gear of the target control as the target gear when determining to perform gear shifting.
As an optional implementation manner, the operator further includes: a selection frame and a prompt box, where the prompt box is used to display information of the selected target control.
The display module is specifically configured to:
and determining the display position of the selected middle frame according to the position of the target control on the interface.
And determining the rotation angle of the operator according to the positions of the two ends of the target control and the initial position of the operating rod.
And rotating the operator according to the rotation angle, displaying the selection frame at its display position at the rotated angle, and displaying the operating rod, the mouse indicating ball and the prompt box according to the preset positional relation between the selection frame and the operating rod, the mouse indicating ball and the prompt box.
As an optional implementation manner, the apparatus further includes:
and the hiding module is used for hiding the system mouse icon displayed on the interface of the terminal equipment.
In a third aspect, an embodiment of the present application provides a terminal device, including: the system comprises a processor, a storage medium and a bus, wherein the storage medium stores machine-readable instructions executable by the processor, when a terminal device runs, the processor and the storage medium communicate through the bus, and the processor executes the machine-readable instructions to execute the steps of the three-dimensional model operation control method of the first aspect.
In a fourth aspect, an embodiment of the present application provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and the computer program is executed by a processor to perform the steps of the three-dimensional model operation control method according to the first aspect.
The beneficial effects of the present application are as follows:
according to the three-dimensional model operation control method, the three-dimensional model operation control device and the terminal device, when a user operates a target control of a three-dimensional model by using a mouse, the terminal device displays an operator on an interface, the terminal device further obtains a target position of the user operation mapped on an operating rod of the operator by using the first mapping model, obtains a target gear of the user operation mapped on the target control by using the second mapping model, and presents the mapped effect on the operator and the target control. Because the terminal equipment uses the two mapping models to respectively map the user operation to the operator and the target control, more accurate mapping can be carried out according to the characteristics of the operator and the target control, the accuracy of the obtained target position is higher, the problem of no response or multiple responses can be avoided, and the damping sense and the stuck sense in a real scene can be simulated.
In addition, the system mouse icon is hidden, so that interference on the user can be avoided, and the user experience is improved.
Drawings
To more clearly illustrate the technical solutions of the embodiments of the present application, the drawings needed in the embodiments will be briefly described below, it should be understood that the following drawings only illustrate some embodiments of the present application and therefore should not be considered as limiting the scope, and those skilled in the art can also obtain other related drawings based on the drawings without inventive efforts.
Fig. 1 is a schematic flowchart of a three-dimensional model operation control method according to an embodiment of the present application;
FIG. 2 is an exemplary diagram of a three-dimensional model and an operator interface;
fig. 3 is a schematic flowchart of a mapping process performed by a terminal device based on a first mapping model;
fig. 4 is a schematic flowchart of the terminal device performing mapping processing based on the second mapping model;
FIG. 5 is a schematic flow chart of a display operator of the terminal device;
fig. 6 is a block diagram of a three-dimensional model operation control apparatus according to an embodiment of the present application;
fig. 7 is another block configuration diagram of a three-dimensional model operation control apparatus according to an embodiment of the present application;
fig. 8 is a schematic structural diagram of a terminal device 80 according to an embodiment of the present application.
Detailed Description
In order to make the purpose, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it should be understood that the drawings in the present application are only for illustration and description purposes and are not used to limit the protection scope of the present application. Additionally, it should be understood that the schematic drawings are not necessarily drawn to scale. The flowcharts used in this application illustrate operations implemented according to some embodiments of the present application. It should be understood that the operations of the flow diagrams may be performed out of order, and steps without logical context may be performed in reverse order or simultaneously. In addition, one skilled in the art, under the guidance of the present disclosure, may add one or more other operations to the flowchart, or may remove one or more operations from the flowchart.
In addition, the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. The components of the embodiments of the present application, generally described and illustrated in the figures herein, can be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the present application, presented in the accompanying drawings, is not intended to limit the scope of the claimed application, but is merely representative of selected embodiments of the application. All other embodiments, which can be derived by a person skilled in the art from the embodiments of the present application without making any creative effort, shall fall within the protection scope of the present application.
It should be noted that in the embodiments of the present application, the term "comprising" is used to indicate the presence of the features stated hereinafter, but does not exclude the addition of further features.
At present, on the PC side, the operation effect is presented on a three-dimensional control of the interface by converting the user's keyboard and mouse operations. However, the following two problems may occur:
firstly, the three-dimensional interaction process of the PC terminal mainly comprises a clicking type and a dragging type, and the problems that the operation is not responded or the response is repeated frequently and the like can occur due to the influence of the visual angle position in the specific using process. For high-risk simulation fields or scenarios where operations are to be completed in a short time, such problems may lead to serious consequences.
Second, PC-side three-dimensional interaction is generally completed only through a mouse and keyboard; limited by their hardware, the current interaction process can only present a click or drag effect and cannot simulate the damping feel and stutter feel of a real scene.
To address the above problems, the embodiments of the present application provide a three-dimensional model operation control method: after the user operates the mouse, two mapping models are used to obtain the target positions mapped onto the operator and the three-dimensional control respectively, and the corresponding icons are moved to those target positions. The target positions thus obtained are more accurate, the problem of no response or multiple responses is avoided, and the damping feel and stutter feel of a real scene can be simulated.
The method of the embodiments of the present application can be applied to a three-dimensional simulation scene in which a three-dimensional model is displayed on the display screen of a terminal device. The three-dimensional model includes at least one operation control, which the user can manipulate with a mouse; while the user does so, the terminal device maps the user's operation to target positions on the operator and on the operation control based on the method of the embodiments of the present application, and moves the corresponding icons to those target positions.
Fig. 1 is a schematic flowchart of a three-dimensional model operation control method provided in an embodiment of the present application, where an execution subject of the method is a terminal device including a mouse, for example, a PC. As shown in fig. 1, the method includes:
s101, when a user uses a mouse of the terminal device to operate a target control, displaying an operator on an interface of the terminal device, wherein the operator comprises the following display elements: an operating rod and a mouse indicating ball.
Before executing the step, a three-dimensional model to be operated is displayed on an interface of the terminal device, for example, the three-dimensional model may be a three-dimensional model of a train console, and the like.
The three-dimensional model comprises at least one target control to be operated and capable of adjusting gears, and a user can adjust the gears of the target controls by dragging a mouse.
It should be noted that the operation controls in the three-dimensional model may be of various types, such as knob type and button type. Regardless of its type, as long as a control is adjustable, it may serve as the gear-adjustable target control described herein. For example, a press-type control that opens on the first press, closes on the second press, and so on, may be treated as a gear-adjustable target control and displayed according to the method of the present application.
When the user operates the target control with the mouse, the terminal device may display the operator on the interface. In the embodiments of the present application, the operator is an interface element displayed by the terminal device, independent of the three-dimensional model, and used to locate a target control in time when the user operates it and to promptly present the effect of the user's operation on it. Therefore, in the present application, after the user operates the target control, the operation effect can be presented on the operator and on the three-dimensional model of the target control at the same time.
Fig. 2 is an exemplary diagram of a three-dimensional model and an interface of an operator, as shown in fig. 2, the three-dimensional model displayed by the terminal device includes a target control a with adjustable gears, and when a user operates the target control a through a mouse, the terminal device may display the operator in a semi-transparent manner above the target control a.
Alternatively, the operator displayed by the terminal device may include an operating rod and a mouse indicating ball. The operating rod represents the range of the target control's operation, and the position of the mouse indicating ball on the operating rod represents the effect of the user's operation on the target control. Illustratively, when the user has not dragged the mouse on the target control, the mouse indicating ball is located at the starting end of the operating rod; after the user drags the mouse, the mouse indicating ball is displayed at the corresponding position on the operating rod according to the distance dragged.
S102, obtaining operation data when a user operates a target control by using a mouse, wherein the operation data comprises: the drag direction and the drag distance.
Optionally, the terminal device may determine whether the user starts to drag the mouse by monitoring a click event, a movement event, and a release event of the mouse, and calculate a drag direction and a drag distance when the user drags the mouse based on a position coordinate of a mouse pointer on a display screen of the terminal device in a process that the user drags the mouse.
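The monitoring-and-calculation step above can be sketched as follows. This is an illustration only, with names invented here: it derives the drag distance and a unit drag direction from the coordinate recorded at the mouse-press event and the latest pointer coordinate.

```python
import math

def drag_data(press_pos, current_pos):
    """Hypothetical sketch: derive (drag_distance, drag_direction) from
    the screen coordinate recorded at the mouse-press event and the
    latest mouse-move coordinate. drag_direction is a unit (x, y)
    vector; coordinates are (x, y) pairs in screen pixels."""
    dx = current_pos[0] - press_pos[0]
    dy = current_pos[1] - press_pos[1]
    distance = math.hypot(dx, dy)
    if distance == 0:
        return 0.0, (0.0, 0.0)  # no drag yet
    return distance, (dx / distance, dy / distance)
```

For example, a press at (100, 100) followed by a move to (103, 104) yields a drag distance of 5 pixels in direction (0.6, 0.8).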
And S103, performing first mapping processing on the operation data based on a first mapping model to obtain a target position of the operation data mapped on the operating rod, and displaying the mouse indicating ball at the target position.
The first mapping model is used for mapping the operation of the user to the operator so as to present the effect of the user after the operation on the operator.
Using the first mapping model, the terminal device can calculate the target position at which the mouse indicating ball should be located on the operating rod after the user has dragged the mouse by the dragging distance in the dragging direction on the target control, and the terminal device then moves the mouse indicating ball to that target position, i.e., displays it there.
Optionally, in the embodiments of the present application, the position of an element may be represented by its coordinates on the display screen. Accordingly, the target position of the mouse indicating ball on the operating rod may refer to coordinates on the display screen that fall within the range of the operating rod.
And S104, performing second mapping processing on the operation data based on a second mapping model to obtain a target gear of the operation data mapped on the target control, and switching the gear of the target control in the three-dimensional model to the target gear.
The second mapping model is used for mapping the user operation to the target control so as to present the effect of the user after the user operation on the target control.
By using the second mapping model, the terminal device can calculate whether the gear of the target control needs to be switched to other gears except the current gear after the user moves the mouse by the dragging distance according to the dragging direction for the target control, and if so, the gear of the target control is switched to other corresponding gears.
It should be noted that, in a specific implementation, steps S103 and S104 may be executed in any order.
In this embodiment, when the user operates the target control of the three-dimensional model using the mouse, the terminal device displays the operator on the interface; it further uses the first mapping model to obtain the target position to which the user's operation is mapped on the operating rod of the operator, uses the second mapping model to obtain the target gear to which the user's operation is mapped on the target control, and presents the mapped effects on the operator and the target control. Because the terminal device uses two mapping models to map the user's operation to the operator and the target control separately, the mapping can be tailored to the characteristics of each, the obtained target position is more accurate, the problem of no response or multiple responses is avoided, and the damping feel and stutter feel of a real scene can be simulated.
The following describes the process of the terminal device performing the mapping process based on the first mapping model and the second mapping model, respectively.
Fig. 3 is a schematic flowchart of a terminal device performing mapping processing based on a first mapping model, and as shown in fig. 3, an optional manner of the step S103 includes:
and S301, acquiring the gear number of the target control and the distance between adjacent gears.
Optionally, the number of the gears of the target control is an inherent parameter of the three-dimensional model, and the number of the gears can be obtained by reading parameter information of the three-dimensional model. The distance between adjacent gears can be obtained by calculating the distance between the coordinates of the center points of the adjacent gears.
S302, inputting the gear number, the distance between the adjacent gears and the operation data into the first mapping model to obtain a target position of the operation data mapped on the operation rod, and displaying the mouse indicating ball at the target position.
Optionally, the number of gears, the distance between adjacent gears, and the operation data are used as input parameters of a first mapping model, and the first mapping model performs a first mapping process based on the input parameters, so that a target position of the operation data mapped on the operation lever can be obtained.
Optionally, the first mapping model may be a machine learning model obtained by training in advance using a training sample, or may also be a mapping model composed of several calculation formulas.
Regardless of the form of the first mapping model, the first mapping model may perform the mapping as follows.
As an alternative embodiment, the first mapping model may be configured to divide the length of the operating rod evenly according to the number of gears to obtain a plurality of candidate positions equal in number to the gears, and to select one of the candidate positions as the target position according to the distance between adjacent gears and the operation data.
In one example, assuming that the number of shift positions is 2, after the length of the operating lever is averaged, two candidate positions are obtained, one candidate position is a start position of the operating lever, and the other candidate position is an end position of the operating lever.
In another example, assuming that the number of shift positions is 3, after the length of the operating lever is averaged, three candidate positions are obtained, where a first candidate position is a start position of the operating lever, a second candidate position is a position of a half length of the operating lever, and a third candidate position is an end position of the operating lever.
After the above processing, each gear of the target control corresponds to a candidate position on the operating rod. After the user drags the mouse, the mouse indicating ball can only be located at one of these candidate positions.
Optionally, the terminal device may determine, based on the dragging distance and the distance between adjacent gears, whether the distance dragged by the user has reached the distance to another gear; if so, it determines from the dragging direction whether that gear precedes or follows the current one, and moves the mouse indicating ball to the corresponding candidate position on the operating rod. If not, the terminal device keeps the position of the mouse indicating ball unchanged.
For example, assuming the number of gears is 3: the terminal device determines, based on the dragging distance and the adjacent-gear distance, that the user has dragged far enough to reach a gear adjacent to the current one; it then determines from the dragging direction that the user dragged the mouse toward the next gear, so it takes the candidate position on the operating rod corresponding to the next gear as the target position and moves the mouse indication ball there.
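The selection step just described can be sketched as follows. The snapping threshold (the full adjacent-gear distance) and all names are assumptions for illustration:

```python
def map_drag_to_candidate(current_index, drag_distance, drag_direction,
                          gear_spacing, gear_count):
    """Return the candidate-position index the indication ball should move to.

    Hypothetical sketch of the first mapping model's selection: the ball
    only jumps to the neighbouring candidate position once the drag covers
    the spacing between adjacent gears; otherwise it stays put.
    """
    if drag_distance < gear_spacing:
        return current_index              # not far enough: keep the ball
    step = 1 if drag_direction > 0 else -1
    new_index = current_index + step
    return max(0, min(gear_count - 1, new_index))  # clamp to the rod ends
```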
Fig. 4 is a schematic flowchart of a process of performing mapping processing by a terminal device based on a second mapping model, and as shown in fig. 4, an optional manner of the step S104 includes:
and S401, acquiring the distance between adjacent gears of the target control.
The manner of obtaining the distance between adjacent gears of the target control in this step is the same as that in step S301, and reference may be made to the description of step S301, which is not described herein again.
S402, inputting the distance between the adjacent gears and the operation data into a second mapping model to obtain a target gear of the operation data mapped on the target control, and switching the gear of the target control in the three-dimensional model to the target gear.
Optionally, the distance between adjacent gears and the operation data are used as input parameters of the second mapping model, which performs the second mapping processing based on them to obtain the target gear of the operation data mapped on the target control.
Optionally, the second mapping model may be a machine learning model obtained by training in advance using training samples, or may also be a mapping model composed of several calculation formulas.
Regardless of which form the second mapping model takes, the second mapping model may accomplish the mapping as follows.
As an optional implementation manner, the second mapping model may be configured to determine whether to perform gear shifting according to the operation data and the distance between adjacent gears, and when determining to perform gear shifting, use a previous gear or a next gear of a current gear of the target control as the target gear.
Alternatively, the terminal device may perform linear or nonlinear processing based on the operation data and the distance between adjacent gears. For example, when the dragging distance is greater than or equal to half the distance between adjacent gears, the terminal device may determine to perform a gear shift, and determine from the dragging direction whether the target gear is the previous or the next gear of the current gear.
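The half-spacing rule in the example above can be sketched in code; the names and the 0-based gear indexing are assumptions:

```python
def second_mapping(current_gear, drag_distance, drag_direction,
                   gear_spacing, gear_count):
    """Decide whether to shift and return the target gear.

    Sketch of the example rule: shift one gear when the drag covers at
    least half the adjacent-gear spacing; below that threshold the gear
    stays unchanged, which produces the damping feel described below.
    """
    if drag_distance < gear_spacing / 2:
        return current_gear               # condition not met: no shift
    target = current_gear + (1 if drag_direction > 0 else -1)
    return max(0, min(gear_count - 1, target))  # clamp to valid gears
```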
It is worth noting that when the user drags the mouse, the drag may continue over a long distance, so the total dragging distance may exceed the distance between adjacent gears of the target control. For this situation, the terminal device can monitor the dragging distance of the mouse in real time: once the distance needed to switch to the previous or next gear is reached, the gear switch and its display are completed as described above; the gear after switching is then treated as the new current gear, and the judgment and processing continue in the same manner, and so on. The specific execution process is not repeated here.
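The real-time monitoring across multiple gear distances can be sketched as an accumulator loop. The per-frame signed samples and all names are illustrative assumptions:

```python
def track_drag(start_gear, drag_samples, gear_spacing, gear_count):
    """Consume signed per-frame drag samples and record each gear switch.

    Hypothetical sketch of a long continuous drag: every time the
    accumulated distance reaches the adjacent-gear spacing, switch one
    gear, treat it as the new current gear, and keep going.
    """
    gear = start_gear
    accumulated = 0.0
    switches = []
    for delta in drag_samples:            # signed per-frame mouse movement
        accumulated += delta
        while abs(accumulated) >= gear_spacing:
            step = 1 if accumulated > 0 else -1
            new_gear = max(0, min(gear_count - 1, gear + step))
            if new_gear == gear:          # already at the rod end: absorb drag
                accumulated = 0.0
                break
            gear = new_gear
            switches.append(gear)
            accumulated -= step * gear_spacing
    return gear, switches
```

A single long drag can thus step through several gears in sequence, each switch being displayed as it happens.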
In this embodiment, the second mapping model determines whether to perform shift switching based on the operation data and the distance between adjacent shifts, and does not perform shift switching when the operation data does not satisfy a specific condition, thereby better simulating a damping feeling and a click feeling of an operation.
The following describes a processing procedure when the terminal device displays the above-described operator on the interface.
As mentioned above, the operator may include an operating rod and a mouse indication ball. As an alternative, the operator may further include a selection box and a prompt box. An example interface with the selection box and prompt box can be found in FIG. 2, described previously. The selection box may be displayed in a semi-transparent manner over the target control to indicate that the target control is currently being operated by the user. The prompt box is used to display information about the selected target control; for example, it may show the target control's name, state value, and operation mode.
Fig. 5 is a schematic flowchart of a process for displaying an operator on a terminal device, and as shown in fig. 5, an alternative way for displaying the operator on the interface of the terminal device in step S101 includes:
and S501, determining the display position of the selection box according to the position of the target control on the interface.
Optionally, the position of the target control on the interface may refer to a two-dimensional coordinate of a center point of the target control on the interface. The terminal device may first obtain a three-dimensional position coordinate of the center point of the target control in the three-dimensional model, and map the three-dimensional position coordinate to the two-dimensional interface, so as to obtain a two-dimensional coordinate of the center point of the target control on the interface.
After the two-dimensional coordinate of the center point of the target control on the interface is obtained, it may be used as the display position of the selection box; specifically, this two-dimensional coordinate is the display position of the selection box's center point.
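The three-dimensional-to-two-dimensional mapping in S501 is a standard world-to-screen projection. The following is a generic sketch, not the patent's specific engine call; the 4x4 view-projection matrix and the names are assumptions:

```python
def world_to_screen(point, view_proj, screen_w, screen_h):
    """Map a 3D world-space point to 2D interface coordinates.

    Apply an assumed 4x4 row-major view-projection matrix, perform the
    perspective divide to normalized device coordinates, then scale to
    pixel coordinates (screen y grows downward).
    """
    x, y, z = point
    v = [x, y, z, 1.0]
    clip = [sum(view_proj[r][c] * v[c] for c in range(4)) for r in range(4)]
    w = clip[3] if clip[3] != 0 else 1e-9
    ndc_x, ndc_y = clip[0] / w, clip[1] / w   # each in [-1, 1]
    sx = (ndc_x + 1.0) * 0.5 * screen_w
    sy = (1.0 - ndc_y) * 0.5 * screen_h
    return sx, sy
```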
And S502, determining the rotation angle of the operator according to the positions of the two ends of the target control and the initial position of the operating rod.
Optionally, the direction of each target control in the three-dimensional model may be consistent with the direction of the screen, or there may be a certain angular difference between them. The operator must at all times stay aligned with the target control currently being operated, so its direction can be adjusted in real time according to the direction of that control. Specifically, in this step, the rotation angle of the operator may be determined from the positions of the two ends of the target control and the initial position of the operating rod. The initial position of the operating rod is its position at the moment before the user operates the target control; it can be represented by the positions of the rod's two end points, through which a ray can be obtained.
It should be noted that if the current target control is the first one to be operated, that is, no operator was displayed on the interface before it was operated, the initial position of the operating rod may be a default position parallel to the bottom edge of the screen. The positions of the two ends of the target control represent its two end points, through which another ray can be obtained. The angle formed between the two rays can then be used as the angular difference between the operating rod and the target control, and this difference serves as the rotation angle of the operator.
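The two-ray angle computation in S502 can be sketched as follows; the function and its 2D endpoint representation are assumptions for illustration:

```python
import math

def operator_rotation_angle(rod_start, rod_end, ctrl_start, ctrl_end):
    """Angle in degrees between the operating-rod ray and the control ray.

    Each pair of end points defines a ray; the angle between the two rays
    is the angular difference used as the operator's rotation angle.
    """
    def direction(a, b):
        return (b[0] - a[0], b[1] - a[1])

    ux, uy = direction(rod_start, rod_end)
    vx, vy = direction(ctrl_start, ctrl_end)
    dot = ux * vx + uy * vy
    norm = math.hypot(ux, uy) * math.hypot(vx, vy)
    # clamp guards against floating-point drift outside acos's domain
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))
```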
And S503, rotating the operator by the rotation angle, displaying the selection box at its determined display position at the rotated angle, and displaying the operating rod, the mouse indication ball, and the prompt box according to their preset positional relation to the selection box.
Optionally, the selection box, the operating rod, the mouse indication ball, and the prompt box are the constituent elements of the operator, and their positional relation within the operator is preset and fixed. Therefore, while the selection box is displayed at the determined display position at the operator's rotation angle, the operating rod, the mouse indication ball, and the prompt box can be displayed simultaneously according to this fixed positional relation, so that the operator is displayed above the target control.
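The combination of fixed offsets and rotation in S503 could look like the following sketch. The specific offsets are invented for illustration; the text only states that the positional relation is preset:

```python
import math

# Assumed fixed offsets (in pixels) of each operator element relative to
# the selection-box centre, before the operator's rotation is applied.
OPERATOR_LAYOUT = {
    "selection_box": (0.0, 0.0),
    "operating_rod": (0.0, -40.0),
    "indication_ball": (0.0, -40.0),
    "prompt_box": (0.0, -80.0),
}

def place_operator(center, angle_deg):
    """Rotate the preset offsets by the operator's rotation angle and
    translate them to the selection box's display position."""
    rad = math.radians(angle_deg)
    cos_a, sin_a = math.cos(rad), math.sin(rad)
    placed = {}
    for name, (dx, dy) in OPERATOR_LAYOUT.items():
        rx = dx * cos_a - dy * sin_a      # standard 2D rotation
        ry = dx * sin_a + dy * cos_a
        placed[name] = (center[0] + rx, center[1] + ry)
    return placed
```

Because every element is placed from the same centre and angle, the whole operator rotates and moves as one unit above the target control.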
Optionally, when the prompt box is displayed, the name, state, operation mode, and other information of the currently operated target control may be shown in it. The state of the target control can be its current gear and can be updated in real time as the user's operation changes. The operation mode of the target control may be, for example, rotate or click.
As an alternative implementation, after the terminal device displays the operator on the interface, the system mouse icon displayed on the interface of the terminal device may be hidden.
After the processing of the foregoing embodiment, when the user operates the target control, the terminal device displays the operator on the interface and adjusts the position of the mouse indication ball on the operating rod according to the user's operation of the system mouse. The user can follow the state change of the target control from the position change of the mouse indication ball, so the system mouse icon is of little use in this scene. The terminal device can therefore hide the system mouse icon, avoiding interference with the user and improving the user experience.
Based on the same inventive concept, a three-dimensional model operation control device corresponding to the three-dimensional model operation control method is also provided in the embodiments of the present application, and as the principle of solving the problem of the device in the embodiments of the present application is similar to that of the three-dimensional model operation control method in the embodiments of the present application, the implementation of the device may refer to the implementation of the method, and the repeated parts are not described again.
Fig. 6 is a block diagram of a three-dimensional model operation control apparatus according to an embodiment of the present disclosure, where the apparatus may be applied to a terminal device, and a three-dimensional model to be operated is displayed on an interface of the terminal device, where the three-dimensional model includes at least one target control to be operated and with an adjustable gear. As shown in fig. 6, the apparatus includes:
The display module 601 is configured to display an operator on an interface of the terminal device when a user operates the target control using a mouse of the terminal device, where the operator includes the following display elements: an operating rod and a mouse indicating ball.
An obtaining module 602, configured to obtain operation data when the user operates the target control using the mouse, where the operation data includes: the drag direction and the drag distance.
A first mapping module 603, configured to perform a first mapping process on the operation data based on a first mapping model to obtain a target position of the operation data mapped on the operation rod, and display the mouse pointer ball at the target position.
A second mapping module 604, configured to perform a second mapping process on the operation data based on a second mapping model, to obtain a target gear of the operation data mapped on the target control, and switch the gear of the target control in the three-dimensional model to the target gear.
As an optional implementation manner, the first mapping module 603 is specifically configured to:
and acquiring the number of gears of the target control and the distance between adjacent gears.
And inputting the gear number, the distance between the adjacent gears and the operation data into the first mapping model to obtain a target position of the operation data mapped on the operation rod, and displaying the mouse indicating ball at the target position.
As an alternative embodiment, the first mapping model is configured to evenly divide the length of the operating rod according to the number of gears to obtain candidate positions equal in number to the gears, and to select one candidate position from them as the target position according to the distance between adjacent gears and the operation data.
As an optional implementation manner, the second mapping module 604 is specifically configured to:
and acquiring the distance between adjacent gears of the target control.
And inputting the distance between the adjacent gears and the operation data into the second mapping model to obtain a target gear of the operation data mapped on the target control, and switching the gear of the target control in the three-dimensional model to the target gear.
As an optional implementation manner, the second mapping model is configured to determine whether to perform gear shifting according to the operation data and the distance between adjacent gears, and when it is determined to perform gear shifting, use a previous gear or a next gear of a current gear of the target control as the target gear.
As an optional implementation, the operator further includes: a selection box and a prompt box, where the prompt box is used to display information about the selected target control.
The display module 601 is specifically configured to:
and determining the display position of the selected frame according to the position of the target control on the interface.
And determining the rotation angle of the operator according to the positions of the two ends of the target control and the initial position of the operating rod.
And rotating the operator by the rotation angle, displaying the selection box at its determined display position at the rotated angle, and displaying the operating rod, the mouse indicating ball, and the prompt box according to their preset positional relation to the selection box.
Fig. 7 is another block configuration diagram of a three-dimensional model operation control apparatus according to an embodiment of the present application, and as shown in fig. 7, the apparatus further includes:
A hiding module 605, configured to hide a system mouse icon displayed on the interface of the terminal device.
An embodiment of the present application further provides a terminal device 80. As shown in fig. 8, which is a schematic structural diagram of the terminal device 80 provided in the embodiment of the present application, the terminal device includes: a processor 81, a memory 82, and a bus 83. The memory 82 stores machine-readable instructions executable by the processor 81 (e.g., execution instructions corresponding to the display module, the obtaining module, the first mapping module, the second mapping module, and the hiding module in the apparatuses of fig. 6 and 7). When the terminal device 80 is running, the processor 81 communicates with the memory 82 through the bus 83, and the machine-readable instructions are executed by the processor 81 to perform the method steps in the above method embodiments.
An embodiment of the present application further provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the computer program performs the steps of the three-dimensional model operation control method.
It can be clearly understood by those skilled in the art that, for convenience and simplicity of description, the specific working process of the system and the apparatus described above may refer to the corresponding process in the method embodiment, and is not described in detail in this application.

In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other ways. The above-described apparatus embodiments are merely illustrative; for example, the division of the modules is merely a logical division, and there may be other divisions in actual implementation, and a plurality of modules or components may be combined or integrated into another system, or some features may be omitted or not executed.

In addition, the shown or discussed coupling or direct coupling or communication connection between components may be through some communication interfaces, and the indirect coupling or communication connection between devices or modules may be in an electrical, mechanical or other form.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
The above description is only for the specific embodiments of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present application, and shall be covered by the scope of the present application.

Claims (10)

CN202011380120.0A, priority and filing date 2020-11-30: Three-dimensional model operation control method and device and terminal equipment (Active, granted as CN112486321B (en))

Publications (2)

CN112486321A (en), published 2021-03-12
CN112486321B (en), granted 2022-12-13







