Disclosure of Invention
The invention aims to provide a method for displaying touch operation information in a head-mounted display device, so as to improve the accuracy of user operations and reduce the difficulty of operation for the user.
The object of the present invention is achieved by the following technical means. The method for displaying touch operation information in a head-mounted display device provided by the invention is applied to a mobile terminal connected to the head-mounted display device, where the mobile terminal comprises a touch area and the head-mounted display device comprises at least one display device, and the method comprises the following steps: acquiring parameter information of a display area of the display device in which an application is currently displayed; acquiring first touch operation information of a user on the touch area; converting, according to the first touch operation information, the first touch operation information into second touch operation information based on a mapping relationship between the display area and the touch area; and displaying, in the display area according to the second touch operation information, an animation effect corresponding to the first touch operation information, so as to prompt the user of the position in the display area to which the current touch position corresponds.
The object of the invention can be further achieved by the following technical measures.
The method for displaying touch operation information in a head-mounted display device further includes: the parameter information comprises at least one identifier, where the identifier is used to distinguish the display area; sending, according to the identifier, the second touch operation information to the display area corresponding to the identifier; and generating the animation effect in the display area based on the second touch operation information received for the display area.
In the method for displaying touch operation information in a head-mounted display device, in response to detecting that an application of a specified type is running, the first touch operation information of the user on the touch area is acquired.
In the method for displaying touch operation information in a head-mounted display device, the converting the first touch operation information into the second touch operation information based on the mapping relationship between the display area and the touch area includes: establishing a mapping relationship between the display area and the touch area; and converting the first touch operation position of the first touch operation information into a second touch operation position of the second touch operation information according to the mapping relationship.
In the foregoing method for displaying touch operation information in a head-mounted display device, the establishing a mapping relationship between the display area and the touch area includes: acquiring a height value and a width value of the resolution of the display area; acquiring a height value and a width value of the resolution of the touch area; and calculating a height ratio and a width ratio of the display area to the touch area, respectively.
In the method for displaying touch operation information in a head-mounted display device, the converting a first touch operation position of the first touch operation information into a second touch operation position of the second touch operation information according to the mapping relationship further includes: and acquiring a height value and a width value of the first touch operation position, and multiplying the height value and the width value of the first touch operation position by the height ratio and the width ratio respectively to obtain a second touch operation position.
In the foregoing method for displaying touch operation information in a head-mounted display device, the method further includes: acquiring the position of at least one preset touch control on the touch area; and simulating, according to the touch operation information of the user on the touch area and the position of the at least one preset touch control, the touch operation information of the user on the touch area to a specified position.
In the method for displaying touch operation information in a head-mounted display device, the simulating, according to the touch operation information of the user on the touch area and the position of the at least one preset touch control, the touch operation information of the user on the touch area to a specified position includes: acquiring the position of a touch operation of the user on the touch area; detecting whether the position of the touch operation overlaps the position of the at least one preset touch control; if it does not, acquiring the position of the preset touch control closest to the position of the touch operation among the at least one preset touch control; and simulating the touch operation to the position of the preset touch control closest to the position of the touch operation.
In an embodiment of the method for displaying touch operation information in a head-mounted display device, the simulating, according to the touch operation information of the user on the touch area and the position of the at least one preset touch control, the touch operation information of the user on the touch area to a specified position further includes: in response to detecting that no preset touch control exists at the position of the touch operation and that the touch frequency at the position of the touch operation exceeds a threshold value, obtaining the position of the preset touch control, among the at least one preset touch control, closest to the position of the touch operation.
The present invention further provides a mobile terminal connected to a head-mounted display device, where the mobile terminal includes a touch area and the head-mounted display device includes at least one display device, and the mobile terminal includes: a display information acquisition module, configured to acquire parameter information of the display area of the at least one display device in which an application is currently displayed; a first touch operation information acquisition module, configured to acquire first touch operation information of a user on the touch area; a touch operation information conversion module, configured to convert, according to the first touch operation information, the first touch operation information into second touch operation information based on the mapping relationship between the display area and the touch area; and a sending module, configured to send the second touch operation information to the head-mounted display device so as to display, in the display area, an animation effect corresponding to the first touch operation information, so as to prompt the user that the current touch position corresponds to a position in the display area.
The invention further provides a computer-readable storage medium, which stores executable instructions that, when executed by a processor, cause the aforementioned method for acquiring touch operation information to be performed.
The beneficial effects of the invention at least comprise:
1. By acquiring parameter information of the display area of the display device, acquiring first touch operation information of the user on the touch area, converting the first touch operation information into second touch operation information based on the mapping relationship between the display area and the touch area, and displaying, in the display area according to the second touch operation information, an animation effect corresponding to the first touch operation information, the user is prompted with the position in the display area to which the current touch position corresponds. The user's current touch information can thus be prompted, which improves the accuracy of user operations and reduces the difficulty of operation.
2. By acquiring the position of at least one preset touch control on the touch area and simulating, according to the touch operation information of the user on the touch area and the position of the at least one preset touch control, the touch operation information of the user to a specified position, the user's touch operation is simulated to the correct touch operation position when there is no preset touch control at the position the user actually touched, which improves the accuracy of user operations.
The foregoing description is only an overview of the technical solutions of the present invention. In order that the technical means of the present invention may be understood more clearly and implemented in accordance with the content of the specification, and in order to make the above and other objects, features, and advantages of the present invention more readily apparent, preferred embodiments are described in detail below with reference to the accompanying drawings.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
Virtual Reality (VR) technology is also called smart environment technology. It combines computer, electronic information, and simulation technologies; its basic implementation is that a processor simulates a virtual environment so as to give people a sense of environmental immersion. Compared with virtual reality, Augmented Reality (AR) technology emphasizes the interaction of information with the real world and, depending on the proportion of virtual to real content, may also be called Mixed Reality (MR). Virtual information is applied to the real world through computer technology, and the real environment and virtual objects are superimposed in the same picture or space in real time and exist simultaneously. VR and AR are largely similar in key devices and terminal form factors, but differ in key technologies and application areas. VR provides an immersive experience through isolated audio and video content and places higher requirements on display image quality, whereas AR emphasizes the seamless integration of virtual information with the real environment and places higher requirements on perception and interaction. In addition, VR focuses on mass markets such as gaming, video, and social live streaming, while AR currently focuses on vertical applications such as industry and the military and is beginning to penetrate the consumer market.
Because smartphones are widely available, using the smartphone as the input device and computing unit of a VR/AR system can reduce the weight and cost of the VR/AR device, and this has become a design trend. In researching display control methods, the inventor found that, because a VR/AR terminal presents virtual reality or a combination of virtual and real content, when the VR/AR terminal is operated through an external touch device, the user cannot directly see the touch operation, and mis-touches easily occur. Therefore, in the embodiments of the present application, the inventor provides a method in which the VR/AR terminal acquires and displays touch operation information from the mobile terminal: touch operation information from the mobile terminal is converted into touch operation information of the VR/AR terminal and displayed on the VR/AR terminal, which can improve the accuracy of user operations, reduce the difficulty of operation, and improve the user experience.
It can be understood that, after the mobile terminal is connected to the VR/AR terminal, the touch screen of the mobile terminal can serve as a touch pad of the VR/AR device for input. The display area of the VR/AR terminal displays the content of the currently running APP, such as a game interface or a video playing interface, and the user operates on the mobile terminal while viewing the game interface and other content in the display area of the VR/AR terminal.
As shown in fig. 1, an exemplary system architecture 100 in accordance with one or more embodiments of the present invention may include a head-mounted display assembly 11. The head-mounted display assembly 11 may include a head-mounted display device 111 and a mobile terminal 112. The head-mounted display device 111 may include one or two display apparatuses 1111. The display apparatus is used for imaging in front of the eyes of a target user. In addition, the head-mounted display device 111 also includes a frame 1112. In some embodiments, the sensors, processing unit, memory, and battery of the head-mounted display device 111 can be placed inside the frame 1112. In some alternative implementations of some embodiments, one or more of the sensors, processing unit, memory, and battery may also be integrated into another separate accessory (not shown) that connects to the frame 1112 via a data cable. In some alternative implementations of some embodiments, the head-mounted display device 111 may have only display functionality and some of the sensors, while the user input interface, data processing, data storage, power supply capabilities, and the like are provided by the mobile terminal 112.
The mobile terminal 112 may include a display screen 1121 that may be touch-operated. In some embodiments, the head-mounted display device 111 and the mobile terminal 112 may communicate through a wireless connection (Wi-Fi or Bluetooth). In some alternative implementations of some embodiments, the head-mounted display device 111 and the mobile terminal 112 may also be connected by a data line (not shown).
It should be understood that the number of head mounted display devices and mobile terminals in fig. 1 is merely illustrative. There may be any suitable number of head mounted display devices and mobile terminals, as desired for implementation.
As shown in fig. 2, the method for displaying touch operation information in a head-mounted display device according to the present invention is applied to a mobile terminal connected to the head-mounted display device, where the mobile terminal includes a touch area, and the head-mounted display device includes at least one display device, and the method includes:
S101, acquiring parameter information of the display area of the display device in which the application is currently displayed. Specifically, the currently displayed application is a running application that can be seen by the user. In some embodiments, the currently displayed application runs in full screen; in this case it occupies all or most of the display area, and no other application is displayed at the same time. In other embodiments, the currently displayed application does not run in full screen; in this case other applications may also be displayed on the display device. It will be appreciated that different applications correspond to different display areas on the display device. For example, in a 2D display space, the display areas of different applications do not overlap on the 2D plane; in a 3D display space, the display areas of different applications may not overlap in the plane, or may not overlap in depth. Therefore, in order to distinguish different currently displayed applications, parameter information of the currently displayed application may be acquired. In one or more embodiments, the display area may be distinguished by obtaining an identifier (Display ID) of the currently displayed application; the identifier is a unique symbol assigned by the system to the application when the application is opened. In other embodiments, the display area may be distinguished by obtaining the coordinate system of the currently displayed application in the display space; for example, for a 3D display space, the coordinates of the center point of the application display may be used. In some embodiments, the resolution of the currently displayed application may also be obtained to distinguish display areas. One skilled in the art will appreciate that one or more of the above display parameters may be acquired simultaneously to distinguish display areas. When multiple displayed applications exist on the current display device, the currently activated application may be taken as the currently displayed application.
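As an illustrative, non-limiting sketch, on an Android-based mobile terminal the identifier and resolution of each connected display can be read through the DisplayManager service; the class name DisplayInfoReader is hypothetical, and it is assumed here that the head-mounted display device appears to the system as an additional display:

```java
import android.content.Context;
import android.graphics.Point;
import android.hardware.display.DisplayManager;
import android.util.Log;
import android.view.Display;

public final class DisplayInfoReader {
    /** Logs the identifier (Display ID) and resolution of every display known to the system. */
    public static void logDisplays(Context context) {
        DisplayManager dm =
                (DisplayManager) context.getSystemService(Context.DISPLAY_SERVICE);
        for (Display display : dm.getDisplays()) {
            Point size = new Point();
            display.getRealSize(size);               // resolution in pixels (width x height)
            int displayId = display.getDisplayId();  // identifier that can later route touch events
            Log.d("DisplayInfo", "id=" + displayId + " size=" + size.x + "x" + size.y);
        }
    }
}
```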
S102, first touch operation information of a user on the touch area is obtained.
Specifically, after the head-mounted display device is connected to the mobile terminal, the mobile terminal may be simulated as a touch device with a pure black display. In response to detecting that a specified application is running, the touch area of the touch device acquires the first touch operation information of the user; the system may designate specified applications among the installed applications, for example, a game or a video player that needs to run in full screen may be set as a specified application. The first touch operation information may include a first touch position and a touch manner, and the touch manner includes but is not limited to clicking, long-pressing, or sliding. Clicking may include several click modes such as a single click, a double click, or multiple consecutive clicks; long-pressing may include long presses of different durations, such as a 2 s long press or a 4 s long press; sliding may likewise include different sliding manners, such as single-finger sliding or three-finger sliding. As an example, the touch information may be acquired through the MotionEvent interface provided by the Android system. The process thereafter advances to step S103.
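As a minimal sketch of this step, a full-screen view covering the touch area can capture the touch position and action through MotionEvent; the class TouchCapture and the forwarding callback onFirstTouchInfo are hypothetical placeholders for the conversion in step S103, not Android APIs:

```java
import android.view.MotionEvent;
import android.view.View;

public final class TouchCapture {
    /** Attaches a listener to the view covering the touch area and captures the first touch operation information. */
    public static void attach(View touchPadView) {
        touchPadView.setOnTouchListener((v, event) -> {
            float x = event.getX();                // first touch position along the width
            float y = event.getY();                // first touch position along the height
            int action = event.getActionMasked();  // DOWN/MOVE/UP sequences distinguish click, long press, slide
            onFirstTouchInfo(x, y, action);        // hypothetical hand-off to the mapping step S103
            return true;
        });
    }

    private static void onFirstTouchInfo(float x, float y, int action) {
        // Placeholder: convert to second touch operation information and send to the head-mounted display device.
    }
}
```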
S103, converting the first touch operation information into second touch operation information based on the mapping relation between the display area and the touch area according to the first touch operation information.
Specifically, step S103 includes: S1031, establishing a mapping relationship between the display area and the touch area.
Specifically, step S1031 includes: S10311, acquiring a height value and a width value of the resolution of the display area. The height value and the width value of the resolution of the display area of the head-mounted display device may be transmitted to the mobile terminal in the form of first information. They may also be obtained by the mobile terminal through the cloud (in the case where both the head-mounted display device and the mobile terminal are connected to the cloud), which is not limited herein. It can be understood that, when the display area is rectangular, the width value and the height value are obtained; when the display area is circular, its radius, center, and the like need to be acquired.
S10312, acquiring a height value and a width value of the resolution of the touch area. The height value and the width value of the resolution of the touch area can be obtained directly from the built-in system parameters of the mobile terminal. It can be understood that, when the touch area is rectangular, the width value and the height value are obtained; when the touch area is circular, its radius, center, and the like are obtained.
S10313, calculating the height ratio and the width ratio of the display area to the touch area, respectively:
fheight = height value of the display area / height value of the touch area;
fwidth = width value of the display area / width value of the touch area.
Step S103 further includes: S1032, converting the first touch operation position of the first touch operation information into a second touch operation position of the second touch operation information according to the mapping relationship. Specifically, the height value and the width value of the first touch operation position may be obtained, and the height value and the width value of the first touch operation position may be multiplied by the height ratio and the width ratio, respectively, to obtain the second touch operation position. Through the above steps, the mapping relationship between the display area of the currently displayed application and the touch area is established, and the first touch operation position of the first touch operation information is then converted into the second touch operation position of the second touch operation information according to the established mapping relationship. For example, when the current first touch operation position is at the center of the touch area, the second touch operation position is also at the center of the display area.
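The following sketch illustrates steps S10311 to S10313 and S1032 for a rectangular display area and a rectangular touch area; the class name TouchMapper is hypothetical, and a uniform ratio-based mapping is assumed:

```java
public final class TouchMapper {
    private final float fHeight; // height value of the display area / height value of the touch area
    private final float fWidth;  // width value of the display area / width value of the touch area

    public TouchMapper(int displayWidth, int displayHeight, int touchWidth, int touchHeight) {
        this.fWidth = (float) displayWidth / touchWidth;
        this.fHeight = (float) displayHeight / touchHeight;
    }

    /** Converts a first touch operation position on the touch area into a second
     *  touch operation position on the display area by multiplying with the two ratios. */
    public float[] map(float touchX, float touchY) {
        return new float[] { touchX * fWidth, touchY * fHeight };
    }
}
```

For example, assuming a 1080 x 2340 touch area and a 1920 x 1080 display area, a touch at the center of the touch area (540, 1170) maps to the center of the display area (960, 540).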
It is understood that the first touch operation information includes a first touch operation position and a touch manner, and the second touch operation information also includes a second touch operation position and a touch manner. In this step, only the first touch operation position of the touch area is converted into the second touch operation position of the display area, but the touch manner is not changed, so the touch manner of the first touch operation information is the same as the touch manner of the second touch operation information. The process thereafter advances to S104.
S104, displaying an animation effect corresponding to the first touch operation information in the display area according to the second touch operation information, so as to prompt the user that the current touch position corresponds to a position in the display area.
Specifically, the mobile terminal may send the second touch operation information to the display area of the current application; the display area of the current application receives the second touch operation information and displays, according to it, an animation effect corresponding to the first touch operation information, so as to prompt the user that the current touch position corresponds to a position in the display area. As an example, the second touch operation information may be sent to the display area by calling the InputEvent and InputManager interfaces of the Android system, and the animation effect corresponding to the first touch operation information may be generated by 3D software such as Unity.
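As a simplified sketch only: InputManager's event-injection interface is a system-level API that ordinary applications cannot call, so the example below instead rebuilds a MotionEvent at the mapped position and dispatches it to the view hosting the display area, where the application's normal click or ripple animation then plays; the class TouchForwarder and the displayAreaRoot reference are assumptions of this example:

```java
import android.os.SystemClock;
import android.view.MotionEvent;
import android.view.View;

public final class TouchForwarder {
    /** Delivers the mapped (second) touch position to the view that hosts the display area. */
    public static void forward(View displayAreaRoot, float mappedX, float mappedY, int action) {
        long now = SystemClock.uptimeMillis();
        MotionEvent mapped = MotionEvent.obtain(now, now, action, mappedX, mappedY, 0);
        displayAreaRoot.dispatchTouchEvent(mapped); // the view tree reacts as if touched at the mapped position
        mapped.recycle();
    }
}
```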
Based on the above technical features, the first touch operation information applied to the mobile terminal is converted into second touch operation information referenced to the head-mounted display device and displayed on the head-mounted display device, so that the user's current touch information can be prompted and the accuracy of user operations is improved.
Optionally, the method for displaying touch operation information in a head-mounted display device further includes:
sending, according to the identifier, the second touch operation information to the display area corresponding to the identifier; and
generating the animation effect in the display area based on the second touch operation information received for the display area.
Specifically, when the mobile terminal runs at least one APP, the VR/AR terminal presents at least one display area, and the VR/AR terminal can mark each display area with a different identifier so as to distinguish the display areas. For example, in the case where a "shooting game" and an "accelerator" are running simultaneously, the VR/AR terminal is in split-screen mode and presents two display areas, namely a "shooting game" display area and an "accelerator" display area, which the VR/AR terminal may mark with different identifiers, for example identifier 1 for the "shooting game" display area and identifier 2 for the "accelerator" display area. The VR/AR terminal has an anchor point that can be moved between display areas. When the anchor point is in the "shooting game" display area, the VR/AR terminal can send identifier 1 to the mobile terminal in the form of second information; the mobile terminal receives identifier 1 and, according to it, sends the second touch operation information to the "shooting game" display area, and the "shooting game" display area generates an animation effect to prompt the user about the touch. When the user needs to adjust the "accelerator", the anchor point can be moved into the "accelerator" display area by head control; the VR/AR terminal then sends identifier 2 to the mobile terminal in the form of second information, the mobile terminal receives identifier 2 and, according to it, sends the second touch operation information to the "accelerator" display area, and the "accelerator" display area generates an animation effect to prompt the user about the touch. These technical features allow the identifier of each application to be accurately recognized, preventing the user's touch operation from being sent to the wrong application.
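The routing by identifier described above can be sketched as follows; the class TouchRouter and the transport method sendToDisplayArea are hypothetical names, and the actual transfer (for example over Wi-Fi or Bluetooth) is only indicated by a placeholder:

```java
import java.util.Map;

public final class TouchRouter {
    private final Map<Integer, String> displayAreas; // identifier -> name of the application display area

    public TouchRouter(Map<Integer, String> displayAreas) {
        this.displayAreas = displayAreas;
    }

    /** Routes the mapped touch position to the display area whose identifier the anchor point currently selects. */
    public void route(int anchoredDisplayId, float mappedX, float mappedY) {
        if (!displayAreas.containsKey(anchoredDisplayId)) {
            return; // unknown identifier: drop rather than send the touch to the wrong application
        }
        sendToDisplayArea(anchoredDisplayId, mappedX, mappedY); // hypothetical transport to the VR/AR terminal
    }

    private void sendToDisplayArea(int displayId, float x, float y) {
        // Placeholder for the actual transfer of the second touch operation information.
    }
}
```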
Optionally, in some other embodiments, the head-mounted display device may determine whether the currently running APP is a game or another preset type, that is, whether the head-mounted display device is currently in an immersion mode. If so, it sends third information to the mobile terminal, where the third information indicates that the head-mounted display device is in the immersion mode. After receiving the third information, the mobile terminal responds to it and performs steps S101-S103 to execute the method for displaying touch operation information in a head-mounted display device according to the present invention. If the head-mounted display device determines that it is not currently in the immersion mode, the method for displaying touch operation information in a head-mounted display device need not be executed. By judging whether it is currently in the immersion mode, that is, by judging the type of the currently running APP, the head-mounted display device decides whether to execute the method for displaying touch operation information according to the invention, which prevents the touch operation information display function from being enabled indiscriminately and saves CPU and memory resources.
Optionally, in some other embodiments, the method for acquiring touch operation information further includes:
S201, obtaining the position of at least one preset touch control on the touch area.
Specifically, the mobile terminal obtains the position of at least one preset touch control according to the currently running APP, where a preset touch control refers to a control on which a touch operation can be performed while the current APP is running. For example, if the currently running APP is a shooting game, the mobile terminal obtains and stores the positions of all touch controls on the interface of the shooting game.
S202, simulating the touch operation information of the user on the touch area to a specified position according to the touch operation information of the user on the touch area and the position of the at least one preset touch control.
Specifically, the touch operation information may include a touch operation position and, in some other embodiments, may further include a touch frequency; in this example, the specified position refers to the position of one of the at least one preset touch control. By simulating the touch operation information of the user on the touch area to the specified position according to the touch operation information of the user on the touch area and the position of the at least one preset touch control, the user can be assisted in performing the correct touch operation.
Step S202 specifically includes: S2021, acquiring the position of the touch operation of the user on the touch area;
S2022, detecting whether the position of the touch operation coincides with the position of the at least one preset touch control.
Specifically, if the position of the current touch operation of the user coincides with the position of one of the at least one preset touch control, the position of the current touch operation is correct, and the subsequent steps need not continue. If the position of the current touch operation of the user does not coincide with the position of any of the at least one preset touch control, there is no preset touch control at the position of the current touch operation, that is, the position of the current touch operation is wrong, and the subsequent steps continue.
S2023, obtaining a position of a preset touch control closest to the position of the touch operation in the at least one preset touch control.
Specifically, at least one preset touch control may exist near the position of the user's touch operation, and the position of the preset touch control closest to the position of the touch operation, among the at least one preset touch control, is taken as the position the user originally intended to touch.
S2024, simulating the touch operation to the position of the preset touch control closest to the position of the touch operation.
Specifically, the current touch operation of the user is directly simulated to the position of the preset touch control closest to the position of the current touch operation, thereby assisting the user in performing a correct touch and improving the accuracy of the user's touch operation.
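Steps S2022 to S2024 can be sketched as follows, assuming each preset touch control is described by its center position and a hit radius; the classes TouchSnapper and Control are hypothetical:

```java
import java.util.List;

public final class TouchSnapper {
    /** A preset touch control: center position and hit radius (hypothetical value type). */
    public static final class Control {
        public final float x, y, radius;
        public Control(float x, float y, float radius) {
            this.x = x; this.y = y; this.radius = radius;
        }
    }

    /** Returns the touch position unchanged when it coincides with a preset touch control,
     *  otherwise returns the position of the nearest preset touch control. */
    public static float[] snap(float touchX, float touchY, List<Control> controls) {
        Control nearest = null;
        double best = Double.MAX_VALUE;
        for (Control c : controls) {
            double d = Math.hypot(touchX - c.x, touchY - c.y);
            if (d <= c.radius) {
                return new float[] { touchX, touchY }; // S2022: the touch coincides with a preset control
            }
            if (d < best) {
                best = d;
                nearest = c;                           // S2023: track the closest preset control
            }
        }
        return nearest == null ? new float[] { touchX, touchY }
                               : new float[] { nearest.x, nearest.y }; // S2024: simulate to the closest control
    }
}
```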
Optionally, step S202 may further include:
in response to detecting that no preset touch control exists at the position of the touch operation and that the touch frequency at the position of the touch operation exceeds a threshold value, obtaining the position of the preset touch control, among the at least one preset touch control, closest to the position of the touch operation.
Specifically, if the position of the current touch operation of the user does not coincide with the position of any of the at least one preset touch control, there is no preset touch control at the position of the touch operation, that is, the position of the current touch operation is wrong, and it is then detected whether the touch frequency at the position of the current touch operation exceeds a threshold value. If the touch frequency at the position of the current touch operation exceeds the threshold value, the user's intention is to touch the position of one of the at least one preset touch control; in this case, the position of the preset touch control closest to the position of the user's touch operation, among the at least one preset touch control, is obtained, and the user is assisted in performing the touch operation. Optionally, the threshold of the touch frequency may be set to three or four times as required, which is not limited in the present invention. By detecting both that no preset touch control exists at the position of the touch operation and the frequency of the user's touch operations, the user's touch intention can be determined more accurately, which avoids forcibly "assisting" the user when the touch was merely accidental.
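A minimal sketch of this optional frequency gate is given below; the class FrequencyGate, the threshold of three touches, and the position tolerance are assumed values chosen only for illustration:

```java
public final class FrequencyGate {
    private static final int THRESHOLD = 3;      // e.g. three repeated touches, configurable as required
    private static final float TOLERANCE = 24f;  // pixels within which touches count as "the same position"

    private float lastX, lastY;
    private int repeatCount;

    /** Returns true when a missed position has been touched often enough to justify snapping
     *  to the nearest preset touch control. */
    public boolean shouldSnap(float x, float y) {
        if (Math.hypot(x - lastX, y - lastY) <= TOLERANCE) {
            repeatCount++;
        } else {
            repeatCount = 1; // a new position: restart counting
            lastX = x;
            lastY = y;
        }
        return repeatCount > THRESHOLD;
    }
}
```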
For easier understanding of the present invention, one or more embodiments of the present application are further described with reference to fig. 3, which shows the touch area and the display area of one embodiment of the present invention. As can be seen from fig. 3, the mobile terminal includes a touch area, the display area is the interface currently shown in the user's field of view, and the display area includes a plurality of touch controls. Since the user does not need to pay attention to the mobile terminal after it is connected to the head-mounted display device, the background of the touch area of the mobile terminal can be pure white or pure black, without displaying any content or control. When the user performs a click operation somewhere in the touch area, the method for acquiring touch operation information according to one or more embodiments of the present invention maps the user's touch information in the touch area of the mobile terminal to the display area of the application on the head-mounted display device, for example as indicated by the dotted line in the figure. In addition, at the touch position in the display area, a preset click animation effect informs the user of the current touch position, so that the user can stay immersed in the displayed content without looking at the touch area.
As shown in fig. 4, the mobile terminal according to an embodiment of the present invention includes:
a display information obtaining module 101, configured to obtain parameter information of the display area of the at least one display device in which the application is currently displayed;
a first touch operation information obtaining module 102, configured to obtain first touch operation information of a user on the touch area;
a touch operation information conversion module 103, configured to convert, according to the first touch operation information, the first touch operation information into second touch operation information based on the mapping relationship between the display area and the touch area; and
a sending module 104, configured to send the second touch operation information to the head-mounted display device so as to display an animation effect corresponding to the first touch operation information on the display area, so as to prompt a user that a current touch position corresponds to a position of the display area.
It is to be understood that, in addition to the above listed modules, the present invention also includes other modules capable of implementing the method for displaying touch operation information in a head-mounted display device according to the present invention.
In another aspect of the present invention, a computer-readable storage medium is provided, which stores executable instructions, software programs, and modules that, when executed by a processor, cause the method for acquiring touch operation information to be performed. The readable storage medium may include a high-speed random access memory, and may further include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device, and may be applied to various terminals, which may be computers, servers, and the like.
Embodiments of the present invention further provide a computer program product, which when running on a computer, causes the computer to execute the above related steps to implement the method for acquiring touch operation information in the above embodiments.
Although the present invention has been described with reference to a preferred embodiment, it should be understood that various changes, substitutions and alterations can be made herein without departing from the spirit and scope of the invention as defined by the appended claims.