Disclosure of Invention
The invention provides a control method and a control device of an AR virtual model, which determine, according to a control command, a moving track of a first virtual model on a first plane where the first virtual model is located, and control the first virtual model to move on the first plane according to the moving track, so that the first virtual model can move on the plane in a first AR scene and the immersion effect of the AR scene on a user is improved when the AR virtual model is controlled.
The first aspect of the present invention provides a method for controlling an AR virtual model, including:
acquiring a control command, wherein the control command is used for controlling a first virtual model in a first AR scene to move according to a moving track;
determining a first plane in which the first virtual model resides in the first AR scene;
determining the moving track of the first virtual model on the first plane according to the control command;
and controlling the first virtual model to move on the first plane according to the movement track.
In an embodiment of the first aspect of the present invention, the obtaining the control command includes:
acquiring a moving track of a target object on a second plane, wherein the second plane is a plane where the target object is located;
the determining the trajectory of the first virtual model in the first plane according to the control command includes:
and taking the moving track of the target object on the second plane as the moving track of the first virtual model on the first plane.
In an embodiment of the first aspect of the present invention, the taking the moving track of the target object in the second plane as the moving track of the first virtual model in the first plane includes:
determining a moving track of the target object on the second plane;
and mapping the moving track of the target object on the second plane to the first plane to obtain the moving track of the first virtual model on the first plane.
In an embodiment of the first aspect of the present invention, the mapping the moving track of the target object on the second plane to the first plane to obtain the moving track of the first virtual model on the first plane includes:
and after the moving track of the target object on the second plane is subjected to projection matrix mapping, view matrix mapping and imaging plane mapping, the moving track of the first virtual model on the first plane is obtained.
In an embodiment of the first aspect of the present invention, after the moving trajectory of the target object on the second plane is subjected to projection matrix mapping, view matrix mapping, and imaging plane mapping, obtaining the moving trajectory of the first virtual model on the first plane includes:
converting the two-dimensional vector coordinates of the moving track of the target object on the second plane into image screen coordinates and projecting the image screen coordinates onto a far plane and a near plane;
connecting, by using a projection matrix and a view matrix, rays between the far plane and the near plane at the starting point and at the end point of the two-dimensional vector, respectively;
and determining an intersection line of the rays with the imaging plane as the moving track of the first virtual model on the first plane.
In an embodiment of the first aspect of the present invention, the determining a first plane in which the first virtual model is located in the first AR scene includes:
and determining the first plane where the first virtual model is located in the first AR scene according to a simultaneous localization and mapping (SLAM) algorithm.
In an embodiment of the first aspect of the present invention, the controlling the first virtual model to move on the first plane according to the movement trajectory includes:
determining, according to the movement track, a position of the first virtual model on the first plane in each frame of video image of the first AR scene;
and playing each frame of video image of the first AR scene.
In an embodiment of the first aspect of the present invention, a moving track of the first virtual model in the first plane includes a direction and an angle of rotation of the first virtual model in the first plane;
then the controlling the first virtual model to move on the first plane according to the movement trajectory further includes:
and controlling the first virtual model to rotate on the first plane according to the moving track.
In an embodiment of the first aspect of the present invention, the second plane is a display screen for displaying the first AR scene;
the target object is a user finger or a function control on the display screen.
A second aspect of the present invention provides an AR virtual model control apparatus, including:
the receiver is used for acquiring a control command, and the control command is used for controlling a first virtual model in a first AR scene to move according to a moving track;
a processor to determine a first plane in which the first virtual model resides in the first AR scene;
the processor is further used for determining a moving track of the first virtual model on the first plane according to the control command;
and the AR display is used for controlling the first virtual model to move on the first plane according to the moving track.
In an embodiment of the second aspect of the present invention, the receiver is specifically configured to obtain a moving track of a target object on a second plane, where the second plane is a plane where the target object is located;
the processor is specifically configured to use a movement trajectory of the target object in the second plane as a movement trajectory of the first virtual model in the first plane.
In an embodiment of the second aspect of the present invention, the processor is specifically configured to,
determining a moving track of the target object on the second plane;
and mapping the moving track of the target object on the second plane to the first plane to obtain the moving track of the first virtual model on the first plane.
In an embodiment of the second aspect of the present invention, the processor is specifically configured to,
and after the moving track of the target object on the second plane is subjected to projection matrix mapping, view matrix mapping and imaging plane mapping, the moving track of the first virtual model on the first plane is obtained.
In an embodiment of the second aspect of the present invention, the processor is specifically configured to,
converting the two-dimensional vector coordinates of the moving track of the target object on the second plane into image screen coordinates and projecting the image screen coordinates onto a far plane and a near plane;
connecting, by using a projection matrix and a view matrix, rays between the far plane and the near plane at the starting point and at the end point of the two-dimensional vector, respectively;
and determining an intersection line of the rays with the imaging plane as the moving track of the first virtual model on the first plane.
In an embodiment of the second aspect of the present invention, the processor is specifically configured to determine a first plane in which the first virtual model is located in the first AR scene according to a simultaneous localization and mapping (SLAM) algorithm.
In an embodiment of the second aspect of the present invention, the AR display is specifically configured to,
determining, according to the movement track, a position of the first virtual model on the first plane in each frame of video image of the first AR scene;
and playing each frame of video image of the first AR scene.
In an embodiment of the second aspect of the present invention, the moving track of the first virtual model in the first plane includes the direction and angle of rotation of the first virtual model in the first plane;
the AR display is specifically configured to control the first virtual model to rotate on the first plane according to the movement trajectory.
In an embodiment of the second aspect of the present invention, the second plane is a display screen for displaying the first AR scene;
the target object is a user finger or a function control on the display screen.
In a third aspect, an embodiment of the present application provides a control apparatus for an AR virtual model, including: a processor and a memory; the memory is used for storing programs; the processor is configured to call a program stored in the memory to perform the method according to any one of the first aspect of the present application.
In a fourth aspect, embodiments of the present application provide a computer-readable storage medium storing program code, which when executed, performs the method according to any one of the first aspect of the present application.
In summary, the present invention provides a method and an apparatus for controlling an AR virtual model, wherein the method includes: acquiring a control command, wherein the control command is used for controlling a first virtual model in a first AR scene to move according to a moving track; determining a first plane in which the first virtual model is located in the first AR scene; determining the moving track of the first virtual model on the first plane according to the control command; and controlling the first virtual model to move on the first plane according to the moving track. The method and the device for controlling the AR virtual model can determine the moving track of the first virtual model on the first plane where the first virtual model is located according to the control command, and control the first virtual model to move on the first plane according to the moving track, so that the first virtual model can move on the plane in the first AR scene, and the immersion effect of the AR scene on a user is improved when the AR virtual model is controlled.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The terms "first," "second," "third," "fourth," and the like in the description and in the claims, as well as in the drawings, if any, are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the invention described herein are, for example, capable of operation in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
FIG. 1 is a diagram illustrating an embodiment of a display structure of an AR virtual model in the prior art. As shown in fig. 1, in the conventional AR virtual model display structure, when an AR scene including a virtual model provided by an AR display device is displayed to a user, the virtual model itself is usually used as the first viewing angle. As shown in fig. 1, at the central location within the AR display interface 1 is a virtual model 3 in the AR scene, and the virtual model 3 is a virtual character in the AR scene. However, no matter what content or which planes exist in the AR scene, the existing AR scene only displays the virtual model at the center position of the AR display interface 1. In the AR scene shown in fig. 1, the real scene presented to the user is a table 2, and the plane where the desktop of the table 2 is located is not parallel to the plane of the AR display device. At this time, if the virtual model 3 is still displayed at the center position of the AR display interface 1, the virtual model 3 appears to the user to be in a "floating" state in front of the table 2 in the AR scene.
FIG. 2 is a diagram illustrating an embodiment of a display structure of an AR virtual model in the prior art. In the prior art shown in fig. 2, the display structure shown in fig. 1 is improved to a certain extent, and the virtual model is no longer displayed in a central position in the AR display interface 1 of the AR display device, but is placed on a specific plane in the AR display interface 1. For example, as shown in fig. 2, in the AR scene of the AR display interface 1, the virtual model is no longer located at the center of the AR scene, but is located on the plane where the desktop of the table 2 is located in the AR scene. This can improve the sense of immersion of the user when viewing the AR scene.
However, when the display structure of the AR virtual model shown in fig. 2 is adopted, the problem of how to control the movement of the virtual model still needs to be solved. In some games or application programs, a user can control the AR virtual model to move or turn in the AR scene by operating the movement of a control on the interface of a mobile phone, so that the interaction between the user and the AR scene is increased. However, since the AR virtual model shown in fig. 2 is displayed on a specific plane in the AR scene, and the planes in different AR scenes are different, the control method of the AR virtual model provided in the prior art can only make the virtual model move on the plane where the control operated by the user is located; it cannot satisfy the moving manner that the user actually expects the AR virtual model to follow on a specific plane, and cannot provide the user with a good AR scene immersion effect. The following description is made with reference to the display structures shown in fig. 3 to 5, where fig. 3 is a schematic view of a display structure of an embodiment of a control method of an AR virtual model in the prior art; fig. 4 is a schematic view of a display structure of an embodiment of a control method of an AR virtual model in the prior art; and fig. 5 is a schematic view of a display structure of an embodiment of a control method of an AR virtual model in the prior art.
Specifically, in the display structure shown in fig. 3, the virtual model in the AR scene of the AR display interface 1 is no longer located at the center of the AR scene, but is located on the plane where the desktop of the table 2 in the AR scene is located. When the user needs to control the movement of the virtual model in the AR scene, the movement of the virtual model in the AR scene is realized, for example, by moving a control provided on the AR display interface 1 in different directions. Fig. 3 shows four possible moving directions of a control that the user may operate; for example, the control is the virtual model 3 itself, and the user may move it in the four directions, up, down, left and right, shown in fig. 3 by touching the virtual model 3. More specifically, taking the example in fig. 4 where the user operates the virtual model 3 to move to the right, when the user touches the virtual model 3 on the AR display interface 1 and slides to the right, the virtual model moves to the right according to the sliding direction of the user operation. The virtual model 3 shown in fig. 5 has moved to the right in the direction indicated by the user. However, when the virtual model in the AR scene interacts with the user, the plane on which the user can operate is usually only the plane where the AR display interface is located; when the virtual model is located on a different plane in the AR scene, the existing interaction method for controlling the virtual model can only make the virtual model move on the plane where the AR display interface is located. Therefore, in the display structure shown in fig. 5, the virtual model 3 does not move on the desktop of the table 2 in the AR scene where it is located, but moves within the AR display interface 1, which causes the user a visual illusion that the virtual model "drifts". This cannot satisfy the moving manner that the user actually expects the AR virtual model to follow on a specific plane such as the desktop of the table 2, nor provide the user with a good AR scene immersion effect.
Therefore, in order to overcome the above technical problems in the prior art, the present invention determines, through a control command of the user, the movement track of the first virtual model on the first plane where the first virtual model is located, and controls the first virtual model to move on the first plane according to the movement track, so that the first virtual model can move on the plane in the first AR scene and the immersion effect of the AR scene on the user is improved when the AR virtual model is controlled.
The technical solution of the present invention will be described in detail below with specific examples. The following several specific embodiments may be combined with each other, and details of the same or similar concepts or processes may not be repeated in some embodiments.
Fig. 6 is a flowchart illustrating a control method of an AR virtual model according to an embodiment of the present invention. As shown in fig. 6, the method for controlling an AR virtual model provided in this embodiment includes:
s101: and acquiring a control command, wherein the control command is used for controlling the first virtual model in the first AR scene to move according to the moving track.
S102: a first plane in which the first virtual model resides in the first AR scene is determined.
S103: and determining the moving track of the first virtual model in the first plane according to the control command.
S104: and controlling the first virtual model to move on the first plane according to the moving track.
In the application scenario of this embodiment, the execution subject of this method embodiment may be any AR display device, and the AR display device may be any electronic device with an AR display function. The user can view the AR scene and the virtual model in the AR scene through the AR display device, and can control the virtual model to move in the AR scene through the movement of an operated target object, so that the user interacts with the AR scene by operating the target object.
Specifically, the AR display apparatus as the execution subject of the present embodiment first acquires, through S101, a control command for controlling the first virtual model in the first AR scene to move along the moving track. The first AR scene is the AR scene being displayed by the AR display device through its display interface when the control command is acquired; the first AR scene includes at least one virtual model, and the virtual model that can be controlled by the control command is the first virtual model. For example, in the embodiment shown in fig. 7, the first AR scene includes a table 2 being displayed at that time in the AR display interface 1 of the AR display device, and the first virtual model 3 on the table is a character. The control command acquired in S101 may be used to control the movement of the virtual model 3 in the AR scene as shown in fig. 7, or an instruction command may be received before the control command is acquired, so as to instruct the virtual model 3 to move according to the subsequently acquired control command.
Subsequently, in S102, the AR display device determines the first plane where the first virtual model is located in the first AR scene. For example, in the embodiment shown in fig. 7, it is determined that the plane where the first virtual model 3 is located is the plane where the desktop of the table 2 in the first AR scene is located. Optionally, in this step, the first plane where the first virtual model is located in the first AR scene may be determined according to a Simultaneous Localization and Mapping (SLAM) algorithm. Specifically, image recognition processing may be performed on the video image in the display interface 1 acquired by the AR display device, and the planes included in the image are determined according to the SLAM algorithm; the specific image processing implementation is the same as that of the algorithm in the prior art and is not described in detail here, and for details that are not shown, reference may be made to common general knowledge in the art.
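The embodiment does not prescribe a particular plane-detection implementation and relies on the SLAM algorithm of the prior art. As a hedged illustration only, the sketch below fits a plane to a set of 3D map points (such as the sparse points a SLAM system might report for the tabletop) by a least-squares fit; the function name fit_plane and the synthetic points are assumptions made for the example, not part of the described method.

```python
import numpy as np

def fit_plane(points: np.ndarray):
    """Fit a plane n.x + d = 0 to Nx3 map points by least squares (SVD)."""
    centroid = points.mean(axis=0)
    # Right singular vectors of the centered points; the last one is the plane normal.
    _, _, vt = np.linalg.svd(points - centroid)
    normal = vt[-1]
    d = -normal.dot(centroid)
    return normal, d

# Example: noisy points sampled around the plane z = 1 (a stand-in for the tabletop).
rng = np.random.default_rng(0)
pts = np.column_stack([rng.uniform(-1, 1, 200),
                       rng.uniform(-1, 1, 200),
                       1.0 + rng.normal(0, 0.01, 200)])
normal, d = fit_plane(pts)
print(normal, d)   # normal close to (0, 0, +/-1), d close to -/+1
```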
Subsequently, in S103, the AR display device determines, according to the control command acquired in S101, the movement locus of the first virtual model on the first plane determined in S102.
In order to implement S103 in this embodiment, optionally, the control command acquired in S101 may include: a moving track of the target object on a second plane, wherein the second plane is the plane where the target object is located. For example, fig. 8 is a schematic display structure diagram of an embodiment of acquiring a control command according to the present invention, and fig. 9 is a schematic display structure diagram of an embodiment of acquiring a control command according to the present invention. Fig. 8 and 9 show a manner of acquiring the moving track of the target object on the second plane when the target object is the function control 11 or the user's finger 12 on the display screen, where the second plane is the display interface 1 of the AR display apparatus for displaying the first AR scene.
Specifically, in the embodiment shown in fig. 8, the target object is a function control 11 on the display screen, and the user can move the function control 11 in different directions by touching it. In the embodiment of fig. 8, when the AR display apparatus detects that the user operates the function control 11 to move to the right, that is, controls the function control 11 to move on the plane where the display interface 1 is located, a vector 31 of the moving track of the target object on the second plane is determined according to the movement of the function control 11, and the vector 31 points from a point A where the virtual model 3 is located in the figure to a point B. It should be noted that the vector 31 obtained here is a projection of the actual moving direction of the function control 11 onto the virtual model; that is, the function control 11 actually moves rightward by a certain distance, and the vector 31 is obtained by relating this distance to the position of the function control 11 on the display interface 1. In the embodiments of the present application, a vector is used to quantitatively describe the moving track; the moving track may also be described in other manners, such as by a straight line or a line segment, which is not limited in this embodiment.
Alternatively, in the embodiment illustrated in fig. 9, the target object is the user's finger 12, and the user's finger slides from the point A where the virtual model 3 is located to the point B. The AR display device may determine the motion of the user's finger 12 through the touch sensor on the display interface 1, and then obtain a vector 31 representing the moving track of the user's finger 12 on the second plane, where the vector 31 points from the point A where the virtual model 3 is located in the figure to the point B. The embodiments shown in fig. 8 and 9 may each be used as a manner of acquiring the moving track of the target object on the second plane in S101; the AR display device may adopt either of the above embodiments, or may allow the user to obtain the moving track on the second plane of the display interface 1 both by operating the function control 11 and by using the finger 12.
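A minimal sketch of this step is given below: two touch events on the display interface (touch down at point A, touch up at point B) are turned into the two-dimensional vector of the moving track on the second plane, corresponding to a vector such as the vector 31. The TouchEvent type, the coordinates, and the function name are illustrative assumptions, not part of the embodiment.

```python
from dataclasses import dataclass

@dataclass
class TouchEvent:
    x: float  # screen coordinates in pixels
    y: float

def screen_vector(touch_down: TouchEvent, touch_up: TouchEvent) -> tuple:
    """Return the 2D movement vector on the second plane (the display screen),
    i.e. the vector from point A (touch down) to point B (touch up)."""
    return (touch_up.x - touch_down.x, touch_up.y - touch_down.y)

# A finger slides from A = (540, 1200) to B = (840, 1200): a swipe to the right.
print(screen_vector(TouchEvent(540, 1200), TouchEvent(840, 1200)))  # (300, 0)
```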
Optionally, after the moving track of the target object on the second plane is obtained through S101 according to the above embodiment, S103 specifically includes: taking the moving track of the target object on the second plane as the moving track of the first virtual model on the first plane, and then controlling, in the following S104, the first virtual model to move on the first plane according to the moving track. The moving track on the second plane may be mapped onto the first plane to obtain the moving track of the first virtual model on the first plane. That is, S103 may specifically include: determining the moving track of the target object on the second plane; and mapping the moving track of the target object on the second plane to the first plane to obtain the moving track of the first virtual model on the first plane.
The above processes S103 and S104 are illustrated in fig. 10 to 12, wherein fig. 10 is a schematic display structure diagram of an embodiment of the control method of the AR virtual model according to the present invention; fig. 11 is a schematic display structure diagram of an embodiment of the control method of the AR virtual model according to the present invention; and fig. 12 is a schematic display structure diagram of an embodiment of the control method of the AR virtual model according to the present invention.
In particular, the embodiment shown in fig. 10 illustrates several possible moving tracks of the first virtual model on the first plane. Here, the control command obtained in S101 may be used to indicate the moving track of the first virtual model 3 on the first plane 21, determined in S102, where the first virtual model 3 is located; the four directions on the desktop 21 (up, down, left and right) are only examples in the figure.
More specifically, the embodiment shown in fig. 11 illustrates the moving track of the first virtual model on the first plane that is determined by the AR display apparatus according to the moving track of the target object on the second plane shown in fig. 8 or 9. The moving track of the target object on the second plane finally acquired by the AR display device as shown in fig. 8 or 9 is a vector 31 pointing to the right. This vector 31 is relative to the second plane where the display screen 1 is located, but the first virtual model 3 in fig. 11 is on the first plane 21, and the acquired control command also instructs the first virtual model 3 to move on the first plane 21. Therefore, in S103, the AR display apparatus needs to determine the moving track of the first virtual model on the first plane according to the acquired moving track of the target object on the second plane, that is, take the moving track of the target object on the second plane as the moving track of the first virtual model on the first plane. The moving track of the first virtual model 3 on the first plane 21 finally determined through S103 is, as shown in fig. 11, a vector 32 along the first plane 21, which is not parallel to the rightward vector 31 in fig. 8 and 9, but is a vector pointing to the right within the first plane 21 where the desktop of the table 2 is located. This vector 32 is obtained by mapping the vector 31 of the moving track on the second plane shown in fig. 8 and 9 onto the first plane where the first virtual model is located.
Finally, fig. 12 shows the result obtained after the AR display device executes S104: the first virtual model 3 moves on the first plane from the position in fig. 11 along the vector 32 determined in fig. 11, from its starting point to its end point, so that the first virtual model 3 finally moves on the first plane 21 according to the control command. Optionally, in the above embodiment, one possible implementation of S104 is to determine, according to the moving track, the position of the first virtual model 3 on the first plane 21 in each frame of video image of the first AR scene; after the position of the first virtual model 3 is determined in all the video images, each frame of video image of the first AR scene is played, wherein the position of the first virtual model 3 in each frame of video image is different and changes gradually along the vector 32.
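A minimal sketch of this per-frame playback, assuming the mapped vector (such as the vector 32) and the number of frames are already known, could interpolate the model position along the vector for each frame; the names and numeric values below are illustrative assumptions, not taken from the embodiment.

```python
import numpy as np

def positions_per_frame(start: np.ndarray, plane_vector: np.ndarray, num_frames: int):
    """Position of the first virtual model on the first plane in each video frame:
    the model advances gradually along the mapped vector from its starting point."""
    return [start + plane_vector * (i / (num_frames - 1)) for i in range(num_frames)]

start = np.array([0.0, 0.0, 0.0])            # point A on the tabletop plane
plane_vector = np.array([0.30, 0.0, -0.10])  # mapped movement vector in plane coordinates
for frame, pos in enumerate(positions_per_frame(start, plane_vector, num_frames=30)):
    # In a real renderer, the frame containing the model at `pos` would be drawn here.
    print(frame, pos)
```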
In addition, optionally, in the above embodiment, fig. 13 is a schematic display structure diagram of an embodiment of obtaining a control command according to the present invention. In the embodiment shown in fig. 13, the moving track of the first virtual model 3 on the first plane 21 includes a direction and an angle of rotation of the first virtual model 3 on the first plane 21. For example, in fig. 13, the control command contains an arc-shaped vector 33 as the moving track of the target object on the second plane; after this track is mapped to the first plane, it is determined that the moving track of the first virtual model 3 on the first plane 21 is a rotation of the first virtual model 3 on the first plane 21 according to the rotation direction and rotation angle of the mapped vector. In this case, S104 further includes: controlling the first virtual model 3 to rotate on the first plane 21 according to the moving track. Specifically, in fig. 13, the first virtual model 3 is rotated counterclockwise by 180 degrees according to the rotation direction and the rotation angle of the vector of the moving track.
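As an illustration only, rotating the model on the first plane can be expressed as a rotation about the plane normal; the sketch below uses Rodrigues' rotation formula, which is one standard way to express such a rotation and is not necessarily the implementation of the embodiment.

```python
import numpy as np

def rotate_about_normal(point: np.ndarray, center: np.ndarray,
                        normal: np.ndarray, angle_rad: float) -> np.ndarray:
    """Rotate `point` around an axis through `center` along the plane normal
    (Rodrigues' rotation formula), so the point stays on the first plane."""
    k = normal / np.linalg.norm(normal)
    v = point - center
    return (center
            + v * np.cos(angle_rad)
            + np.cross(k, v) * np.sin(angle_rad)
            + k * k.dot(v) * (1.0 - np.cos(angle_rad)))

# Rotate a model vertex 180 degrees counterclockwise about the tabletop normal (z axis).
print(rotate_about_normal(np.array([1.0, 0.0, 0.0]),
                          center=np.array([0.0, 0.0, 0.0]),
                          normal=np.array([0.0, 0.0, 1.0]),
                          angle_rad=np.pi))   # approximately [-1, 0, 0]
```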
Alternatively, in the foregoing embodiment, the moving track of the first virtual model is only illustrated as a straight line by way of example; likewise, the moving track of the first virtual model on the first plane may also be a curve. For example, fig. 14 is a schematic view of a display structure of an embodiment of obtaining a control command according to the present invention. A vector 34 shown in fig. 14 is the moving track of the first virtual model 3 on the first plane 21 determined by the AR display device according to the control command in S103. Here, the moving track of the target object on the second plane in the control command is an irregular curve, and the vector 34, which is obtained by mapping the moving track on the second plane onto the first plane 21 as the moving track of the first virtual model 3 on the first plane 21, is also an irregular curve.
In summary, the control method for the AR virtual model provided in this embodiment can determine, according to the control command, the movement trajectory of the first virtual model in the first plane where the first virtual model is located, and control the first virtual model to move in the first plane according to the movement trajectory, so that the first virtual model can move on the plane in the first AR scene, and the immersion effect of the AR scene on the user is improved when the AR virtual model is controlled.
Optionally, this embodiment further provides a possible implementation of mapping the vector 31 of the moving track of the target object on the second plane to the vector 32 on the first plane where the first virtual model is located: after the moving track of the target object on the second plane is subjected to projection matrix mapping, view matrix mapping and imaging plane mapping, the moving track of the first virtual model on the first plane is obtained. The method specifically includes: converting the two-dimensional vector coordinates of the moving track of the target object on the second plane into image screen coordinates and projecting the image screen coordinates onto a far plane and a near plane; connecting, by using a projection matrix and a view matrix, rays between the far plane and the near plane of the image screen coordinates at the starting point and at the end point of the two-dimensional vector, respectively; and determining an intersection line of the rays with the imaging plane as the moving track of the first virtual model on the first plane.
Fig. 15 and fig. 16 are taken as examples for explanation, wherein fig. 15 is a schematic view of a display structure of the virtual model according to the present invention, and fig. 16 is a schematic diagram illustrating the determination of the moving track of the first virtual model on the first plane according to the present invention. Fig. 15 shows a display structure of the virtual model of the present invention: the first virtual model 3 to be displayed and the imaging plane 43 of the first plane where the first virtual model is located are mapped by the projection matrix and the view matrix to obtain the display structure shown in fig. 15. The image screen position obtained after the first virtual model 3 and the imaging plane 43 are converted into image screen coordinates through the projection matrix and the view matrix lies between the far plane 41 and the near plane 42. The view matrix in this embodiment describes the position of the first virtual model 3 to be displayed relative to the observer, where the position of the observer corresponds to the origin of the view coordinate system, for example the intersection of the dashed lines in the drawing. All models in the world may be regarded as one large model, and each model matrix is left-multiplied by a matrix representing the transformation of the whole world to obtain the view matrix. The projection matrix converts the vertices in the view coordinate system onto a plane; objects between the far plane 41 and the near plane 42 are rendered by the perspective projection method shown in fig. 15, so that far objects appear small and near objects appear large. The display structure used in this embodiment is the same as that in the prior art, and for the definition of the view matrix, the projection matrix and parts not illustrated, such as the screen coordinates, reference may be made to common knowledge in the art. This embodiment focuses mainly on fig. 16. First, the two-dimensional vector coordinates of the moving track of the target object on the second plane are converted into image screen coordinates, and a vector 411 on the far plane 41 and a vector 421 projected onto the near plane 42 are obtained. Subsequently, a ray can be connected through the vector 411 and the vector 421 at the starting point and at the end point of the vector, respectively, where the origin of each ray may be the position of the observer in the above figure; the intersection line 431 of the rays with the imaging plane 43 is the moving track of the first virtual model 3 on the imaging plane, that is, the moving track of the first virtual model 3 on the first plane 21.
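The following is a minimal numpy sketch of this mapping under standard OpenGL-style conventions; it illustrates the general unprojection and ray-casting idea rather than the exact implementation of the embodiment. The two endpoints of the screen vector are unprojected onto the near and far planes with the inverse of the projection and view matrices, a ray is cast through each pair of points, and the intersections of the rays with the plane of the first virtual model give the endpoints of the mapped trajectory. The screen size, camera matrices and tabletop plane equation are assumed example values.

```python
import numpy as np

def perspective(fovy_rad, aspect, near, far):
    """Standard OpenGL-style perspective projection matrix."""
    f = 1.0 / np.tan(fovy_rad / 2.0)
    m = np.zeros((4, 4))
    m[0, 0] = f / aspect
    m[1, 1] = f
    m[2, 2] = (far + near) / (near - far)
    m[2, 3] = 2.0 * far * near / (near - far)
    m[3, 2] = -1.0
    return m

def unproject(px, py, ndc_depth, inv_pv, width, height):
    """Map a pixel to a world-space point at the given NDC depth (-1 near, +1 far)."""
    ndc = np.array([2.0 * px / width - 1.0,
                    1.0 - 2.0 * py / height,   # screen origin is the top-left corner
                    ndc_depth, 1.0])
    world = inv_pv @ ndc
    return world[:3] / world[3]

def ray_plane(origin, direction, normal, d):
    """Intersection of the ray origin + t*direction with the plane n.x + d = 0."""
    t = -(normal.dot(origin) + d) / normal.dot(direction)
    return origin + t * direction

def map_screen_vector_to_plane(p_start, p_end, proj, view, width, height, normal, d):
    """Unproject the endpoints of the screen vector onto the near and far planes, cast a
    ray through each pair, and intersect the rays with the model's plane; the segment
    between the two intersections is the mapped trajectory on that plane."""
    inv_pv = np.linalg.inv(proj @ view)
    hits = []
    for px, py in (p_start, p_end):
        near_pt = unproject(px, py, -1.0, inv_pv, width, height)
        far_pt = unproject(px, py, 1.0, inv_pv, width, height)
        hits.append(ray_plane(near_pt, far_pt - near_pt, normal, d))
    return hits[0], hits[1]

# Assumed example values: a 1080x1920 portrait screen, the camera at the origin looking
# down -z (identity view matrix), and a tabletop plane y = -0.5 in front of the camera.
proj = perspective(np.radians(60.0), 1080 / 1920, 0.1, 100.0)
view = np.eye(4)
start_3d, end_3d = map_screen_vector_to_plane(
    (440, 1400), (640, 1400), proj, view, 1080, 1920,
    normal=np.array([0.0, 1.0, 0.0]), d=0.5)
print(start_3d, end_3d)   # two points on the tabletop; end_3d - start_3d plays the role of vector 32
```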
Alternatively, this embodiment may also determine the moving track on the first plane in another manner, for example by establishing a mapping relationship between vectors of the moving track on the second plane and vectors of the moving track on the first plane. For example, the vectors in the four directions shown in fig. 3 are the moving tracks of the target object that the user may produce on the second plane, and the four directions may be respectively mapped to the four vectors on the first plane shown in fig. 8. When a vector of the moving track on the second plane as in fig. 3 is detected, the mapping relationship is queried to determine the moving track on the first plane; only four vectors are taken as an example in fig. 8, and a mapping relationship of more vectors may exist in practice. As shown in fig. 8, the correspondence between the four directions may be determined according to different directional characteristics of the first plane; for example, in fig. 8, the four sides of the table on which the first plane is located correspond to the four sides of the display screen on which the second plane is located, and the correspondence between the sides may be adjusted according to the angle between the first plane and the second plane. For example, the angle formed by each of the four directions of the first plane and the corresponding direction of the second plane should be less than 45 degrees; when the angle is greater than 45 degrees, the correspondence may be adjusted by rotation so that the angles between the four directions of the first plane and the four directions of the second plane are less than 45 degrees.
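A simplified sketch of this table-lookup alternative is shown below: an arbitrary swipe on the screen is snapped to the nearest of the four screen directions (so the angle to the chosen direction is at most 45 degrees) and then looked up in a pre-established correspondence with four direction vectors on the first plane. The dictionaries, direction values and function name are assumed for illustration; in practice the plane-side vectors would come from the plane pose detected by SLAM.

```python
import math

# Assumed example correspondence between the four screen directions (second plane)
# and four pre-computed direction vectors on the tabletop (first plane).
DIRECTION_MAP = {
    "up":    (0.0, 0.0, -1.0),   # away from the viewer along the tabletop
    "down":  (0.0, 0.0,  1.0),
    "left":  (-1.0, 0.0, 0.0),
    "right": (1.0, 0.0, 0.0),
}

# Unit vectors of the four screen directions (screen y axis points downward).
SCREEN_DIRECTIONS = {"right": (1, 0), "left": (-1, 0), "up": (0, -1), "down": (0, 1)}

def snap_screen_vector(dx: float, dy: float) -> str:
    """Snap an arbitrary screen-space swipe to the nearest of the four screen
    directions, so the angle to the chosen direction is at most 45 degrees."""
    best, best_dot = None, -math.inf
    norm = math.hypot(dx, dy) or 1.0
    for name, (ux, uy) in SCREEN_DIRECTIONS.items():
        dot = (dx * ux + dy * uy) / norm
        if dot > best_dot:
            best, best_dot = name, dot
    return best

# A swipe mostly to the right maps to the "right" direction vector on the first plane.
print(DIRECTION_MAP[snap_screen_vector(300.0, -40.0)])   # (1.0, 0.0, 0.0)
```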
Further optionally, since the user may be moving at any time when using the AR display apparatus, the AR display apparatus may, through the above S101 to S104, determine in real time the first plane where the first virtual model is located when displaying each frame of image in the AR video, and determine and adjust at any time the moving track of the first virtual model on the first plane determined in real time.
Fig. 17 is a flowchart illustrating a control method of an AR virtual model according to an embodiment of the present invention. The embodiment shown in fig. 17 provides a flow combining the AR virtual model control methods in the foregoing embodiments, where the flow includes: (1) The AR algorithm provides the SLAM tracking capability, the capability of recognizing planes in the real environment and the capability of tracking the real position and pose of the camera of the current device, which provides a technical basis for the AR scene character interaction method. (2) A virtual model is placed in the virtual rendering space through the plane information provided by the AR algorithm, and the virtual game character is controlled to carry out the AR game through operation input, where the operation input on the screen includes a game joystick, keys, screen touch, sound, mobile phone vibration and the like. (3) The input controlling the change of the position and angle of the character is equivalent to a two-dimensional vector in screen space, and the two-dimensional vector is transmitted to the script environment to be associated with the corresponding virtual character animation, where the two-dimensional vector finally falls on the moving plane where the actual character is located through projection matrix mapping, view matrix mapping and plane mapping. (4) The rotation and motion vector of the character in the real three-dimensional space is obtained, and the virtual character is driven in the game script to move, rotate and play the corresponding model animation in the corresponding frame.
Fig. 18 is a schematic structural diagram of a control apparatus of an AR virtual model according to an embodiment of the present invention. As shown in fig. 18, the control device 18 of the AR virtual model according to the present embodiment includes: an acquisition module 1801, a processing module 1802, and a display module 1803. The acquisition module 1801 is configured to acquire a control command, where the control command is used to control a first virtual model in a first AR scene to move according to a movement trajectory; the processing module 1802 is configured to determine a first plane in which the first virtual model is located in the first AR scene; the processing module 1802 is further configured to determine a moving trajectory of the first virtual model in the first plane according to the control command; and the display module 1803 is configured to control the first virtual model to move in the first plane according to the movement track.
The control apparatus of the AR virtual model provided in this embodiment may be used to execute the control method of the AR virtual model according to the embodiment shown in fig. 6, and the specific implementation manner and principle thereof are the same and will not be described again.
Optionally, in the foregoing embodiment, the acquisition module 1801 is specifically configured to acquire a moving track of the target object on a second plane, where the second plane is the plane where the target object is located; the processing module 1802 is specifically configured to use the moving track of the target object on the second plane as the moving track of the first virtual model on the first plane.
Optionally, in the foregoing embodiment, the processing module 1802 is specifically configured to determine the moving track of the target object on the second plane, and map the moving track of the target object on the second plane to the first plane to obtain the moving track of the first virtual model on the first plane.
Optionally, in the foregoing embodiment, the processing module 1802 is specifically configured to obtain the moving trajectory of the first virtual model in the first plane after the moving trajectory of the target object in the second plane is subjected to projection matrix mapping, view matrix mapping, and imaging plane mapping.
Optionally, in the above embodiment, the processing module 1802 is specifically configured to convert the two-dimensional vector coordinates of the moving track of the target object on the second plane into image screen coordinates and project the image screen coordinates onto the far plane and the near plane; connect, by using a projection matrix and a view matrix, rays between the far plane and the near plane at the starting point and at the end point of the two-dimensional vector, respectively; and determine an intersection line of the rays with the imaging plane as the moving track of the first virtual model on the first plane.
Optionally, in the above embodiment, the processing module 1802 is specifically configured to determine, according to a simultaneous localization and mapping (SLAM) algorithm, the first plane where the first virtual model is located in the first AR scene.
Optionally, in the foregoing embodiment, the display module 1803 is specifically configured to determine, according to the movement track, a position of the first virtual model on the first plane in each frame of video image of the first AR scene, and to play each frame of video image of the first AR scene.
Optionally, in the above embodiment, the moving track of the first virtual model in the first plane includes the direction and angle of rotation of the first virtual model in the first plane; the display module 1803 is specifically configured to control the first virtual model to rotate on the first plane according to the moving track.
Optionally, in the above embodiment, the second plane is a display screen for displaying the first AR scene; the target object is a user finger or a function control on the display screen.
The control apparatus of the AR virtual model provided in this embodiment may be used to execute the control method of the AR virtual model shown in the foregoing embodiments, and the specific implementation manner and principle thereof are the same and will not be described again.
It should be noted that, in the embodiment of the present application, the division of the module is schematic, and is only one logic function division, and there may be another division manner in actual implementation. Each functional module in the embodiments of the present application may be integrated into one processing module, or each module may exist alone physically, or two or more modules are integrated into one module. The integrated module can be realized in a hardware mode, and can also be realized in a software functional module mode.
The integrated module, if implemented in the form of a software functional module and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present application essentially, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) or a processor to execute all or part of the steps of the method according to the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
In the above embodiments, the implementation may be wholly or partially realized by software, hardware, firmware, or any combination thereof. When implemented in software, the implementation may be wholly or partially in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the processes or functions described in accordance with the embodiments of the present application are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable device. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another, for example, from one website, computer, server, or data center to another website, computer, server, or data center via a wired (e.g., coaxial cable, optical fiber, Digital Subscriber Line (DSL)) or wireless (e.g., infrared, radio, microwave) manner. The computer-readable storage medium may be any available medium that can be accessed by a computer, or a data storage device such as a server or a data center that integrates one or more available media. The available medium may be a magnetic medium (e.g., a floppy disk, a hard disk, or a magnetic tape), an optical medium (e.g., a DVD), or a semiconductor medium (e.g., a Solid State Disk (SSD)), among others.
FIG. 19 is a schematic structural diagram of an embodiment of a control apparatus for an AR virtual model according to the present invention. As shown in fig. 19, the control device 19 of the AR virtual model according to the present embodiment includes: a receiver 1901, a processor 1902, and an AR display 1903. The receiver 1901 is configured to obtain a control command, where the control command is used to control a first virtual model in a first AR scene to move according to a movement trajectory; the processor 1902 is configured to determine a first plane in which the first virtual model resides in the first AR scene; the processor 1902 is further configured to determine a moving trajectory of the first virtual model in the first plane according to the control command; and the AR display 1903 is used to control the first virtual model to move in the first plane according to the movement trajectory.
The control apparatus of the AR virtual model provided in this embodiment may be used to execute the control method of the AR virtual model according to the embodiment shown in fig. 6, and the specific implementation manner and principle thereof are the same and will not be described again.
Optionally, in the foregoing embodiment, the receiver 1901 is specifically configured to acquire a moving track of the target object on a second plane, where the second plane is the plane where the target object is located; the processor 1902 is specifically configured to use the moving track of the target object on the second plane as the moving track of the first virtual model on the first plane.
Optionally, in the above embodiment, the processor 1902 is specifically configured to determine the moving track of the target object on the second plane, and map the moving track of the target object on the second plane to the first plane to obtain the moving track of the first virtual model on the first plane.
Optionally, in the foregoing embodiment, the processor 1902 is specifically configured to obtain the moving trajectory of the first virtual model in the first plane after the moving trajectory of the target object in the second plane is subjected to projection matrix mapping, view matrix mapping, and imaging plane mapping.
Optionally, in the above embodiment, the processor 1902 is specifically configured to convert the two-dimensional vector coordinates of the moving track of the target object on the second plane into image screen coordinates and project the image screen coordinates onto the far plane and the near plane; connect, by using a projection matrix and a view matrix, rays between the far plane and the near plane at the starting point and at the end point of the two-dimensional vector, respectively; and determine an intersection line of the rays with the imaging plane as the moving track of the first virtual model on the first plane.
Optionally, in the above embodiment, the processor 1902 is specifically configured to determine, according to a simultaneous localization and mapping (SLAM) algorithm, the first plane where the first virtual model is located in the first AR scene.
Optionally, in the above embodiment, the AR display 1903 is specifically configured to determine, according to the movement track, a position of the first virtual model on the first plane in each frame of video image of the first AR scene, and to play each frame of video image of the first AR scene.
Optionally, in the above embodiment, the moving track of the first virtual model in the first plane includes the direction and angle of rotation of the first virtual model in the first plane;
the AR display 1903 is specifically configured to control the first virtual model to rotate on the first plane according to the movement trajectory.
Optionally, in the above embodiment, the second plane is a display screen for displaying the first AR scene; the target object is a user finger or a function control on the display screen.
The control apparatus of the AR virtual model provided in this embodiment may be used to execute the control method of the AR virtual model shown in the foregoing embodiments, and the specific implementation manner and principle thereof are the same and will not be described again.
The present invention also provides an electronic device readable storage medium, which includes a program that, when executed on an electronic device, causes the electronic device to execute the method for controlling an AR virtual model according to any of the above embodiments.
An embodiment of the present invention further provides an electronic device, including: a processor; and a memory for storing executable instructions for the processor; wherein the processor is configured to execute the control method of the AR virtual model in any of the above embodiments via execution of the executable instructions.
An embodiment of the present invention also provides a program product, including: a computer program (i.e., executing instructions) stored in a readable storage medium. The computer program may be read from a readable storage medium by at least one processor of the encoding apparatus, and the computer program is executed by the at least one processor to cause the encoding apparatus to implement the control method of the AR virtual model provided in the foregoing various embodiments.
The above description is only a preferred embodiment of the present invention, and is not intended to limit the present invention in any way, and any simple modification, equivalent change and modification made to the above embodiment according to the technical spirit of the present invention are still within the scope of the technical solution of the present invention.
Those of ordinary skill in the art will understand that: all or a portion of the steps of implementing the above-described method embodiments may be performed by hardware associated with program instructions. The program may be stored in a computer-readable storage medium. When executed, the program performs steps comprising the method embodiments described above; and the aforementioned storage medium includes: various media that can store program codes, such as ROM, RAM, magnetic or optical disks.
Finally, it should be noted that: the above embodiments are only used to illustrate the technical solution of the present invention, and not to limit the same; while the invention has been described in detail and with reference to the foregoing embodiments, it will be understood by those skilled in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; and the modifications or the substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present invention.