CN111984116A - VR perception touch device

VR perception touch device

Info

Publication number
CN111984116A
Authority
CN
China
Prior art keywords
input device
input
display terminal
infrared
touch
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010762943.3A
Other languages
Chinese (zh)
Inventor
张恩泽
赖文杰
胡志发
成茵
窦诚诚
张现阳
焦坦
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Chengdu Visionertech Co ltd
Original Assignee
Chengdu Visionertech Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
2020-07-31
Filing date
2020-07-31
Publication date
2020-11-24
Application filed by Chengdu Visionertech Co ltd
Priority to CN202010762943.3A
Publication of CN111984116A
Legal status: Pending

Abstract

The invention relates to the technical field of virtual reality (VR) and discloses a VR sensing touch device comprising a VR display terminal and a VR head-mounted display. The VR display terminal is communicatively connected to an input device and a position sensing device: the input device is used for text input, and the position sensing device locates the position of the user's fingers on the input device and transmits the positioning data to the VR display terminal. The VR display terminal uses the positioning data to generate a virtual finger image and an input device image and transmits them to the VR head-mounted display for synchronous display. The invention addresses the problem that existing VR equipment lacks an effective and convenient interaction mode for text input and precise key pressing.

Description

VR perception touch device
Technical Field
The invention relates to the technical field of VR (virtual reality), in particular to a VR sensing touch device.
Background
At present, VR glasses have no universal interaction device. Most manufacturers supply a (pair of) custom handheld controllers, which come in different types such as remote-control style, 3DOF (degrees of freedom) and 6DOF, and the technical details differ from vendor to vendor. Whichever implementation is chosen, these are largely motion-sensing controller-type devices: they improve the playability and interactivity of VR glasses, but they have no way to deliver fine, precise control or efficient text input. The desktop-computer era produced the mouse and keyboard, which let users type, position and select quickly; the mobile-internet era produced the touch screen, and with continued practice and adaptation users can reach very high input efficiency with a full keyboard on a small screen. Current VR interaction, especially text input and precise key control, remains incomplete and forms an interaction bottleneck, so most VR applications are centered on viewing and entertainment: users mostly consume content passively, and their capacity for active output is insufficient.
Another major cause of the difficulty of VR interaction is that the user wears a closed VR head-mounted display throughout use and therefore cannot effectively perceive the exact positions of the hands, especially the fingers. Yet an important precondition of accurate input is that the eyes (whether in the main field of view or peripheral vision) observe and track both hands to determine the input position; once the eyes are covered, input can no longer be performed efficiently. The industry has made various attempts to address this problem:
1. Observing the fingers with a camera and feeding the result back to the VR head-mounted display. This scheme locks onto the positions of both hands, but the actual experience is neither flexible nor accurate enough, and the original freedom of VR is lost;
2. Controllers with more buttons. These demand considerable practice and adaptation from the user, offer limited freedom, provide no effective way to express a hovering indication, and invisibly raise the barrier to entry;
3. Changing the operation mode of the input method, for example "wooden-fish tapping" schemes, virtual keyboard schemes and the like.
In summary, whether for text input or precise key pressing, existing VR devices cannot compete with traditional mouse, keyboard and touch-screen solutions, because their interaction devices are cumbersome to wear and use and are inefficient. This has also kept text input and accurate keying on VR devices from gaining mainstream acceptance.
Disclosure of Invention
Based on the above technical problems, the invention provides a VR sensing touch device that solves the problem that existing VR equipment lacks an effective and convenient interaction mode for text input and precise key pressing.
In order to solve the above technical problems, the technical solution adopted by the invention is as follows:
A VR sensing touch device comprises a VR display terminal and a VR head-mounted display. The VR display terminal is communicatively connected to an input device and a position sensing device. The input device is used for text input; the position sensing device locates the position of the user's fingers on the input device and transmits the positioning data to the VR display terminal; and the VR display terminal uses the positioning data to generate a virtual finger image and an input device image and transmits them to the VR head-mounted display for synchronous display.
Preferably, the position sensing device comprises a sensing control module and a plurality of infrared correlation sensors, the infrared correlation sensors being arranged around the input device and forming at least one layer of infrared mesh above the input device.
In a preferred embodiment, the infrared correlation sensor is a single-point infrared correlation sensor.
In a preferred embodiment, the infrared correlation sensor is a grating-type infrared correlation sensor.
Preferably, the input device is an input keyboard or a touch screen, wherein the touch screen is a capacitive touch screen or a resistive touch screen.
Compared with the prior art, the invention has the following beneficial effects:
The position sensing device senses the finger positions used during input operations on the input device, a virtual image is generated from the acquired finger positioning data, and the image is displayed through the VR head-mounted display, so that a user wearing the display can synchronously judge his or her current input state from the generated virtual image. This solves the problem that a user wearing a closed VR head-mounted display cannot perceive the exact positions of the hands, especially the fingers, and therefore cannot conveniently complete text input or precise key pressing, and it improves the user's operating experience and input efficiency.
Drawings
FIG. 1 is a schematic top view of the mechanism of the present invention.
FIG. 2 is a schematic diagram of the present invention in a finger-floating state.
FIG. 3 is a diagram illustrating the use of the present invention in a finger touch state.
Fig. 4 is a flow chart of the present invention.
Reference numerals: 11, input device; 12, position sensing device; 13, finger.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present disclosure more apparent, the technical solutions of the embodiments of the present disclosure will be described clearly and completely with reference to the drawings of the embodiments of the present disclosure. It is to be understood that the described embodiments are only a few embodiments of the present disclosure, and not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the described embodiments of the disclosure without any inventive step, are within the scope of protection of the disclosure.
Unless otherwise defined, technical or scientific terms used herein shall have the ordinary meaning as understood by one of ordinary skill in the art to which this disclosure belongs. The use of "first," "second," and similar terms in this disclosure is not intended to indicate any order, quantity, or importance, but rather is used to distinguish one element from another. The word "comprising" or "comprises", and the like, means that the element or item listed before the word covers the element or item listed after the word and its equivalents, but does not exclude other elements or items. The terms "connected" or "coupled" and the like are not restricted to physical or mechanical connections, but may include electrical connections, whether direct or indirect. "upper", "lower", "left", "right", and the like are used merely to indicate relative positional relationships, and when the absolute position of the object being described is changed, the relative positional relationships may also be changed accordingly.
Example 1:
Referring to FIGS. 1 to 4, a VR sensing touch device comprises a VR display terminal and a VR head-mounted display. The VR display terminal is communicatively connected to an input device and a position sensing device; the input device is used for text input, the position sensing device locates the position of the user's fingers on the input device and transmits the positioning data to the VR display terminal, and the VR display terminal uses the positioning data to generate a virtual finger image and an input device image and transmits them to the VR head-mounted display for synchronous display.
In this embodiment, when the user performs text input or precise key pressing, the position sensing device 12 mounted on the input device 11 senses the position of the user's finger 13 and transmits the positioning data to the VR display terminal, which uses the data to generate images of the virtual finger 13 and of the input device 11 and transmits them to the VR head-mounted display for synchronous display. While wearing the closed VR head-mounted display, the user can therefore use the displayed virtual finger 13 image as visual feedback: taking the virtual finger 13 as a reference, the user can move the finger 13 and correctly and quickly select the target position on the input device 11, achieving fast text input or precise key pressing without looking at the physical hands. Because the user synchronously judges the current input state from the generated virtual image, the problem that a user cannot sense the exact positions of both hands, especially the fingers 13, through the closed VR head-mounted display and therefore cannot conveniently complete text input or precise key pressing is solved, and operating experience and input efficiency are improved.
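The data flow described above can be sketched as a small control loop. The class and function names below are illustrative assumptions for this document, not interfaces defined by the patent:

# Minimal sketch of the sensing-to-display loop, assuming a hardware driver
# for the position sensing device (12) and a renderer on the VR display
# terminal. All names and types here are hypothetical.
from dataclasses import dataclass

@dataclass
class FingerFix:
    x_mm: float      # horizontal position over the input device
    y_mm: float      # depth position over the input device
    z_mm: float      # height above the key/touch surface
    touching: bool   # True when the lowest infrared layer is interrupted

class PositionSensingDevice:
    """Stand-in for the infrared correlation sensor array (item 12)."""
    def read_fixes(self) -> list[FingerFix]:
        raise NotImplementedError  # hardware-specific driver

class VRDisplayTerminal:
    """Stand-in for the terminal that renders the virtual fingers and keyboard."""
    def __init__(self, hmd) -> None:
        self.hmd = hmd
        # The input-device image can be prepared in advance because the
        # sensor/keyboard mounting geometry is fixed (see below).
        self.keyboard_model = "static keyboard mesh"

    def update(self, fixes: list[FingerFix]) -> None:
        frame = {"keyboard": self.keyboard_model, "fingers": fixes}
        self.hmd.show(frame)  # synchronous display in the head-mounted display

def run_loop(sensor: PositionSensingDevice, terminal: VRDisplayTerminal) -> None:
    while True:
        terminal.update(sensor.read_fixes())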
Preferably, the VR display terminal generates the images of the finger 13 and the input device 11 using software such as Quest3D or Unity3D.
Further, the position sensing device 12 consists of a sensing control module and a plurality of infrared correlation sensors, which are arranged around the input device 11 and form at least one layer of infrared mesh above the input device 11.
When the user performs an input operation on the input device 11 with a finger 13, the motion of the finger 13 follows the sequence "hover → approach → touch → move away → hover". During this process, the finger 13 passes through the infrared mesh formed above the input device 11 by the infrared correlation sensors (each infrared correlation sensor comprises an infrared emitter and an infrared receiver, with an infrared beam formed between them). Because the finger 13 blocks some of these beams, the blocked beams cannot reach their receivers, and the space bounded by the remaining unblocked beams adjacent to the blocked ones is the positioning space of the finger 13. By combining multiple layers of infrared mesh, the infrared correlation sensors yield a horizontal-axis sensing area, a vertical-axis sensing area and a height sensing area for the finger, as shown in FIGS. 2 and 3. It follows that the mesh size, density and number of layers of the infrared mesh determine the positioning accuracy of the finger 13: the smaller the mesh, the denser the beams and the more mesh layers, the more accurate the positioning contour of the finger 13. The infrared correlation sensors transmit the positioning data to the VR display terminal through the sensing control module; interference and false triggers are rejected by a filtering algorithm, a multi-layer projection of both hands is formed within the infrared mesh region, and the VR display terminal then generates the virtual finger 13 image in the VR scene as the user's visual-feedback reference.
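A minimal sketch of how a finger position might be recovered from the blocked beams of one mesh layer, assuming evenly spaced emitter/receiver pairs along two edges of the input device; the beam pitch and the debounce filter are illustrative assumptions, not values from the patent:

# Hypothetical single-layer localization from blocked infrared beams.
def locate_finger(x_blocked: list[bool], y_blocked: list[bool],
                  beam_pitch_mm: float = 5.0):
    """Return (x_mm, y_mm) at the centre of the blocked region, or None."""
    xs = [i for i, blocked in enumerate(x_blocked) if blocked]
    ys = [i for i, blocked in enumerate(y_blocked) if blocked]
    if not xs or not ys:
        return None  # no beam in this layer is interrupted
    # The blocked beams bound the finger; the surrounding unblocked beams
    # delimit the positioning space, so take the centre of the blocked span.
    x_mm = (min(xs) + max(xs)) / 2 * beam_pitch_mm
    y_mm = (min(ys) + max(ys)) / 2 * beam_pitch_mm
    return x_mm, y_mm

# A very simple stand-in for the filtering the text mentions: accept a
# blockage only if it persists for several consecutive frames.
def debounce(history: list[bool], min_frames: int = 3) -> bool:
    return len(history) >= min_frames and all(history[-min_frames:])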
The working process of the VR sensing touch device can therefore be summarized as: position sensing device data → gesture projection is generated → gesture action is derived → touch and hover gestures are recognized → the VR head-mounted display shows the virtual gesture.
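One plausible way to derive the touch and hover states in this pipeline is to look at which mesh layers are interrupted; the layer ordering and state names below are assumptions for illustration only:

# Hypothetical classification of the finger state from a multi-layer mesh.
# blocked_layers[0] is the layer closest to the key surface; higher indices
# sit higher above the input device.
def classify(blocked_layers: list[bool]) -> str:
    if blocked_layers and blocked_layers[0]:
        return "touch"   # lowest layer interrupted: finger on the device
    if any(blocked_layers):
        return "hover"   # only upper layers interrupted: finger floating
    return "away"        # no layer interrupted: hand withdrawn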
Since the position sensing device 12 is mounted on the input device 11, the mounting position, distance and size of the position sensing device 12 relative to the input device 11 are easy to determine. A virtual image of the input device 11 can therefore be generated in advance by the VR display terminal, and the position of the finger 13 on the input device 11 can be determined from the position of the position sensing device 12.
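Because that mounting geometry is fixed, mapping a sensor-frame finger position onto a key of the input device reduces to a constant offset and a lookup; the offset, key pitch and layout below are placeholder assumptions, not values from the patent:

# Hypothetical mapping from sensor coordinates to the key under the finger.
SENSOR_TO_KEYBOARD_OFFSET_MM = (-10.0, -8.0)  # assumed fixed mounting offset
KEY_PITCH_MM = 19.0                           # typical keyboard key pitch

KEY_LAYOUT = [
    ["q", "w", "e", "r", "t", "y", "u", "i", "o", "p"],
    ["a", "s", "d", "f", "g", "h", "j", "k", "l", ";"],
    ["z", "x", "c", "v", "b", "n", "m", ",", ".", "/"],
]

def key_under_finger(x_mm: float, y_mm: float):
    """Return the key label under a sensor-frame position, or None."""
    kx = x_mm + SENSOR_TO_KEYBOARD_OFFSET_MM[0]
    ky = y_mm + SENSOR_TO_KEYBOARD_OFFSET_MM[1]
    col, row = int(kx // KEY_PITCH_MM), int(ky // KEY_PITCH_MM)
    if 0 <= row < len(KEY_LAYOUT) and 0 <= col < len(KEY_LAYOUT[row]):
        return KEY_LAYOUT[row][col]
    return None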
Further, the infrared correlation sensor is a single-point infrared correlation sensor. A single-point infrared correlation sensor consists of one pair of infrared emitter and infrared receiver; a position sensing device 12 built from single-point sensors has the advantage that the size and direction of the resulting infrared mesh are adjustable, giving better flexibility.
Furthermore, the infrared correlation sensor is a grating-type infrared correlation sensor. The grating-type infrared correlation sensor consists of a strip-shaped infrared emitter and a matching infrared receiver and carries a plurality of infrared sensing beams; a position sensing device 12 built from grating-type sensors has a more compact structure, fewer parts and is easier to install.
Further, theinput device 11 is an input keyboard or a touch screen, wherein the touch screen is a capacitive touch screen or a resistive touch screen.
The above are embodiments of the present invention. The embodiments and the specific parameters therein are intended only to clearly illustrate the verification process of the invention and are not intended to limit the patent protection scope of the invention, which is defined by the claims; all equivalent structural changes made using the contents of the description and drawings of the present invention shall likewise fall within the protection scope of the present invention.




