


Technical Field
The present invention relates to a wearable gesture interaction technology for three-dimensional graphical interfaces, and in particular to a wearable three-dimensional gesture interaction system based on monocular vision and markers, in which human-computer interaction is carried out through hand gestures, as well as a method of using the system.
Background Art
As users' expectations for natural interaction keep rising, human-computer interaction through hand gestures has become a mainstream trend. Wearable computing is a feasible way to make virtual data operable anytime and anywhere. With the development of pico projectors, it has become an irreversible trend for pico projectors to be standard components of mobile devices. At the same time, users are increasingly dissatisfied with the small display screens of mobile devices, so enlarging the display by projecting the interface onto a surface has become a research hotspot.
Regarding research on wearable computers: Chinese patent application No. 200410043913.8, entitled "Miniature keyboard dedicated to wearable computers", provides a miniature keyboard for wearable computers that is small and structurally simple, but the interaction is still conventional keyboard input and does not meet the requirements of natural interaction. Chinese patent application No. 200920178274.4, entitled "Wearable input device", and Chinese patent application No. 200920204864.X, entitled "A glove-type virtual input device", free the user from traditional mouse-and-keyboard hardware and virtualize the input device, but they require dedicated hardware and are therefore unsuitable for mobile devices. Chinese patent application No. 00107607.8, entitled "Wearable multimedia wireless microcomputer with a Chinese LINUX operating system", uses natural speech for human-computer interaction, but people are more accustomed to interacting with their hands, so it is not suitable for large-scale commercial adoption. Chinese patent application No. 200710023167.X, entitled "A human-computer interaction helmet for wearable computers", proposes a helmet with gaze tracking that, combined with speech, enables interaction with a wearable computer; it can be widely used in aviation, aerospace, machinery manufacturing, electric power, military and other fields where the user's hands cannot be completely freed, but its hardware requirements are too high.
Summary of the Invention
In view of the above problems and deficiencies of the prior art, the purpose of the present invention is to provide a wearable three-dimensional gesture interaction system and a method of using it. The system is structurally simple, the user can interact with virtual three-dimensional objects through hand gestures, and the three-dimensional interface is vivid and concrete. To achieve this purpose, the concept of the present invention is as follows. In accordance with the technical requirements of wearable computing, the invention comprises a portable computer (1), a camera (2), a pico projector (3) and colored markers (4). So that the image captured by the camera is the image the user sees, the camera should be close to the user's eyes, worn on the head or hung on the chest. So that the projected image is not blocked by the hands, the pico projector should be worn on the head. Furthermore, so that the geometric distortion between the image captured by the camera and the image actually projected by the pico projector, caused by the difference in their positions, neither affects subsequent processing nor increases program complexity, the pico projector and the camera should be placed as close together as possible; that is, their distances and angles to the projection surface should be as nearly identical as possible. The pico projector and the camera are connected to a portable computer, which may be a laptop or a smartphone. Colored markers are worn on the user's fingers; in the present invention, the index finger, thumb and little finger of the left hand and the index finger and thumb of the right hand wear markers of different colors.
There are no special requirements for the pico projector or the camera; any commercially available models can be used. The camera and the pico projector are mounted on a hat worn on the head, and the portable computer is carried on the back; together they form the hardware of the wearable three-dimensional gesture interaction system. As mobile devices evolve, once pico projectors become standard mobile-device components just as cameras have, the system can be simplified to a single, fully integrated mobile device.
Based on the above inventive concept, the present invention adopts the following technical solution:
A wearable three-dimensional gesture interaction system comprises a portable computer (1), a camera (2), a pico projector (3) and colored markers (4), and is characterized in that the camera and the pico projector are mounted on a hat so that their distances and angles to the projection surface are as nearly identical as possible, both are connected to a portable computer, and colored markers are worn on the user's fingers.
A method of using the above wearable three-dimensional gesture interaction system, characterized by the following operating steps:
1) Marker recognition: the camera (2) captures the colored markers (4) on the user's fingers, and their coordinates are obtained through image segmentation;
2) Action recognition: the motion vector of each marker and the positional relationships among the markers are computed in real time to determine which action command is being issued;
3) Response output: the display is updated according to the action command and projected onto a surface by the pico projector (3), thereby realizing human-computer interaction.
In the marker recognition of step 1, the image captured by the camera is segmented using mature color-based image processing techniques to obtain the coordinates of each colored marker.
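The specification does not prescribe a particular segmentation algorithm, only that mature color-based techniques are used. As one illustration, the sketch below locates each fingertip marker by HSV thresholding and blob centroids with OpenCV; the marker names and HSV ranges are assumptions chosen for the example, not values from the specification.

```python
import cv2
import numpy as np

# Hypothetical HSV ranges, one per colored fingertip marker (placeholder values).
MARKER_RANGES = {
    "left_index":  ((40, 80, 80),  (80, 255, 255)),   # green-ish
    "left_thumb":  ((100, 80, 80), (130, 255, 255)),  # blue-ish
    "left_little": ((140, 80, 80), (170, 255, 255)),  # magenta-ish
    "right_index": ((0, 80, 80),   (10, 255, 255)),   # red-ish
    "right_thumb": ((20, 80, 80),  (35, 255, 255)),   # yellow-ish
}

def locate_markers(frame_bgr):
    """Return {marker_name: (x, y)} image coordinates of each visible marker."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    positions = {}
    for name, (lo, hi) in MARKER_RANGES.items():
        # Threshold the frame to the marker's color range and remove small noise.
        mask = cv2.inRange(hsv, np.array(lo), np.array(hi))
        mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
        m = cv2.moments(mask)
        if m["m00"] > 0:  # marker visible: take the centroid of the segmented blob
            positions[name] = (m["m10"] / m["m00"], m["m01"] / m["m00"])
    return positions
```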
In the action recognition of step 2, the interaction is gesture-based and covers four categories of gestures: multi-touch gestures, free gestures, mixed gestures and the virtual mouse. Multi-touch gestures are the now-familiar family of multi-touch gestures, which allow images to be zoomed freely, icons to be moved, and so on. Free gestures are the gestures of everyday life: for example, an "OK" gesture with the left hand confirms an operation; crossing the left and right index fingers into an "X" cancels an operation; and holding both hands flat, left hand above and right hand below, with the left thumb touching the right index finger and the left index finger touching the right thumb to form a "framing" gesture, saves the current camera image, i.e. takes a photograph. Mixed gestures use the two hands cooperatively: the left hand controls three-dimensional transformations, such as selecting the viewing angle, while the right hand controls planar transformations, such as panning the display left and right. The virtual mouse is a gesture proposed by the present invention, in which the palm of the left hand is used as a mouse. The left hand is spread open with the palm facing the camera, and the four fingers other than the thumb are bent as far as possible so that their fingernails lie on a straight line. The colored markers on the left index finger and little finger are then collinear; taking the index-finger marker as the top-left vertex and the little-finger marker as the top-right vertex, a square is constructed downward, and the area inside this square is the control region, corresponding to the coordinates of the graphical interface. Since this square lies over the left palm, the right index finger can rest on the palm while moving within the control region, so the palm itself can be regarded as the control region. The advantage of this design is that the right index finger has a surface to brace against while moving, which makes the motion steadier and achieves precise positioning. This is described in detail in the detailed description with reference to the figures.
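To make the dispatch among the four gesture categories concrete, the following sketch classifies a frame from the marker positions returned by the illustrative `locate_markers()` above; the thresholds and the simplified rules (for example, treating collinear left index and little finger markers as the virtual-mouse pose) are assumptions for illustration, not the exact criteria of the invention.

```python
import math

def motion_vectors(prev, curr):
    """Per-marker displacement between two consecutive key frames."""
    return {k: (curr[k][0] - prev[k][0], curr[k][1] - prev[k][1])
            for k in curr if k in prev}

def classify_gesture(prev, curr):
    """Very rough dispatch among the four gesture categories."""
    moves = motion_vectors(prev, curr)
    # Virtual mouse: left index and little finger markers visible and nearly
    # collinear along a horizontal line spanning the palm.
    if {"left_index", "left_little"} <= curr.keys():
        (x1, y1), (x2, y2) = curr["left_index"], curr["left_little"]
        if abs(y1 - y2) < 0.1 * (abs(x1 - x2) + 1e-6):
            return "virtual_mouse"
    # Mixed gesture: markers on both hands are moving (left hand 3D, right hand 2D control).
    left_moving = any(math.hypot(*moves[k]) > 5 for k in moves if k.startswith("left"))
    right_moving = any(math.hypot(*moves[k]) > 5 for k in moves if k.startswith("right"))
    if left_moving and right_moving:
        return "mixed"
    # Multi-touch gesture: thumb and index of one hand move together (e.g. pinch to zoom).
    if {"right_index", "right_thumb"} <= moves.keys() and right_moving:
        return "multitouch"
    # Anything else with a recognizable static marker layout is treated as a free gesture.
    return "free"
```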
In the response output of step 3, the graphical interface of the portable computer is projected by the pico projector onto a physical surface such as a wall or a desktop. The graphical interface may be a conventional two-dimensional interface or a three-dimensional interface, and is compatible with existing commercial three-dimensional interfaces such as BumpTop and Real Desktop. The present invention also proposes a method of realizing a three-dimensional interface using projection screen markers: different markers are drawn at the four corners of the original interface and projected onto the projection surface; these markers are called "projection screen markers". Because of the angle between the user and the normal vector of the projection plane, the projected image undergoes geometric distortion, and the actual projected image is then captured by the camera. From the four markers and their visual transformation matrix, the orientation vector of the projection plane is computed, a virtual three-dimensional object is rendered accordingly and projected onto the surface, and finally the user can interact with the virtual three-dimensional object through gestures. As the user moves, that is, as the angle between the user and the normal vector of the projection plane changes, different views of the virtual object are rendered, giving the user the feeling that the virtual object really exists, and the user can interact with the virtual three-dimensional object through gestures. This is described in detail in the detailed description with reference to the figures.
Compared with the prior art, the present invention has the following obvious substantive features and significant advantages:
Through the projection-screen-marker three-dimensional graphical interface and the virtual mouse gesture, the present invention makes the three-dimensional graphical interface more vivid and the interaction more flexible. The equipment is simple and inexpensive, and meets the requirements of mobile devices.
Brief Description of the Drawings
Fig. 1 is a schematic diagram of an embodiment of the present invention;
Fig. 2 is a flow chart of an embodiment of the present invention;
Fig. 3 is a schematic diagram of the virtual mouse;
Fig. 4 is a schematic diagram of the projection screen markers.
Detailed Description of the Embodiments
Preferred embodiments of the present invention are described in further detail below with reference to the accompanying drawings.
Embodiment 1: As shown in Fig. 1, the wearable three-dimensional gesture interaction system comprises a portable computer (1), a camera (2), a pico projector (3) and several colored markers (4). The system places low demands on hardware, with no special requirements for the pico projector or the camera, so its cost is low. So that the image captured by the camera is the image the user sees, and so that hand movements do not interfere with the projection, the pico projector and the camera are mounted on a hat, as shown in Fig. 1.
Embodiment 2: A method of using the wearable three-dimensional gesture interaction system, as shown in Fig. 2, with the following specific steps:
① Marker recognition;
② Action recognition;
③ Response output.
The marker recognition in step ① captures the colored markers on the user's fingers with the camera and derives their coordinates through image segmentation, as follows:
(1) Acquire a key frame.
(2) Segment the colored markers in the key frame, obtain the coordinates of each marker, and compute the relative positions of the markers as the gesture.
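A minimal capture loop for steps (1) and (2) might look as follows, assuming the head-mounted camera appears as an ordinary video device and reusing the illustrative `locate_markers()` helper from the earlier sketch; treating every n-th frame as a key frame is an assumption, since the specification does not state how key frames are selected.

```python
import cv2

def marker_stream(device_index=0, frame_skip=2):
    """Yield {marker_name: (x, y)} for every key frame captured by the camera."""
    cap = cv2.VideoCapture(device_index)
    count = 0
    try:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            count += 1
            if count % frame_skip:        # treat only every frame_skip-th frame as a key frame
                continue
            yield locate_markers(frame)   # coordinates of each visible colored marker
    finally:
        cap.release()
```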
The action recognition in step ② performs gesture interaction as follows:
(3) Is it a multi-touch gesture? If so, go to step (7); otherwise go to step (4).
(4) Is it a free gesture? If so, go to step (8); otherwise go to step (5).
(5) Is it a mixed gesture? If so, go to step (9); otherwise go to step (6).
(6) Is it the virtual mouse? If so, go to step (10); otherwise return to step (1).
Step (6) involves a special gesture, the virtual mouse, shown in Fig. 3: the colored markers on the left index finger and little finger are collinear; taking the index-finger marker as the top-left vertex and the little-finger marker as the top-right vertex, a square is constructed downward, and the area inside this square, essentially the palm of the hand, is the control region, corresponding to the coordinates of the graphical interface. The advantage of this design is that the right index finger has a surface to brace against while moving, which makes the motion steadier and achieves precise positioning. The virtual mouse can be used with a virtual keyboard for text input, or for precise drawing on the interface.
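The mapping from the palm square to interface coordinates can be sketched as below, again using the marker positions from the illustrative `locate_markers()`; the geometric construction (a square built downward from the line joining the two left-hand markers) follows the description above, while the screen-resolution parameters and function names are assumptions for the example.

```python
import numpy as np

def virtual_mouse_position(markers, screen_w=1280, screen_h=800):
    """Map the right index fingertip inside the palm square to interface coordinates."""
    if not {"left_index", "left_little", "right_index"} <= markers.keys():
        return None
    tl = np.array(markers["left_index"], dtype=float)    # top-left corner of the square
    tr = np.array(markers["left_little"], dtype=float)   # top-right corner of the square
    tip = np.array(markers["right_index"], dtype=float)  # the "cursor" fingertip
    u = tr - tl                                   # top edge of the control square
    side = np.linalg.norm(u)
    if side < 1e-6:
        return None
    u_hat = u / side
    v_hat = np.array([-u_hat[1], u_hat[0]])       # perpendicular edge, pointing down the palm
    d = tip - tl
    s = float(np.dot(d, u_hat)) / side            # horizontal fraction inside the square
    t = float(np.dot(d, v_hat)) / side            # vertical fraction inside the square
    if not (0.0 <= s <= 1.0 and 0.0 <= t <= 1.0):
        return None                               # fingertip is outside the control region
    return int(s * screen_w), int(t * screen_h)
```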
The response output in step ③ updates the display according to the action command and projects it onto a surface with the pico projector, realizing human-computer interaction, as follows:
(7) Perform the multi-touch operation corresponding to the recognized marker gesture; after the interaction is completed, return to step (1).
(8) Perform the free-gesture operation corresponding to the recognized marker gesture; after the interaction is completed, return to step (1).
(9) Perform the mixed-gesture operation corresponding to the recognized marker gesture; after the interaction is completed, return to step (1).
(10) Perform the virtual-mouse operation corresponding to the recognized marker gesture; after the interaction is completed, return to step (1).
The present invention can use an ordinary two-dimensional graphical interface or a three-dimensional graphical interface. This embodiment provides a special three-dimensional interface based on "projection screen markers": as shown in Fig. 4, different markers are drawn at the four corners of the original interface and projected onto the projection surface. Because of the angle between the user and the normal vector of the projection plane, the projected image undergoes geometric distortion, and the actual projected image is captured by the camera. From the four markers and their visual transformation matrix:
$$
\begin{bmatrix} X_c \\ Y_c \\ Z_c \\ 1 \end{bmatrix}
= \mathbf{T}_{cw}
\begin{bmatrix} X_w \\ Y_w \\ Z_w \\ 1 \end{bmatrix},
\qquad
\mathbf{T}_{cw} =
\begin{bmatrix} \mathbf{R} & \mathbf{t} \\ \mathbf{0}^{\top} & 1 \end{bmatrix},
\qquad
h\begin{bmatrix} x \\ y \\ 1 \end{bmatrix}
= \mathbf{P}
\begin{bmatrix} X_c \\ Y_c \\ Z_c \\ 1 \end{bmatrix};
$$
where $\mathbf{T}_{cw}$ is the transformation matrix from the world coordinate system $(X_w, Y_w, Z_w)$ to the camera coordinate system $(X_c, Y_c, Z_c)$, $\mathbf{R}$ is the rotation matrix, $\mathbf{t}$ is the translation matrix, $\mathbf{P}$ is the perspective transformation matrix mapping camera coordinates to the image coordinates $(x, y)$, and $h$ is the homogeneous scale factor. From these relations the orientation vector of the projection plane is computed, the virtual three-dimensional object is rendered and projected onto the surface, and the user can then interact with the virtual three-dimensional object through gestures. In previous marker-based augmented reality systems the camera is fixed while the markers move; with the projection screen markers of the present invention it is the markers that are fixed: the projection surface stays still and the camera moves. Thus, as the user moves, that is, as the angle between the user and the normal vector of the projection plane changes, different views of the virtual object are rendered, and the user has the impression that the virtual object really exists, which makes the interface more vivid and concrete than previous three-dimensional interfaces. The user can also interact with the virtual three-dimensional object through gestures.
Applications Claiming Priority

| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN2011101146045A | 2011-05-05 | 2011-05-05 | Wearable three-dimensional gesture interaction system and using method thereof |
Publication

| Publication Number | Publication Date |
|---|---|
| CN102193631A | 2011-09-21 |
Patent Citations

| Publication Number | Priority Date | Publication Date | Assignee | Title |
|---|---|---|---|---|
| CN101077232A | 2007-06-07 | 2007-11-28 | 南京航空航天大学 (Nanjing University of Aeronautics and Astronautics) | Human-computer interaction helmet for wearable computers |
Non-Patent Citations

| Title |
|---|
| Liu Yang et al., "Hand tracking and gesture recognition method based on wearable vision", Transactions of Beijing Institute of Technology, Vol. 25, No. 12, pp. 1083-1086, December 2005 |
| Li Yan et al., "A virtual-real collision detection method based on real-time hand tracking and positioning", Journal of Computer-Aided Design & Computer Graphics, Vol. 23, No. 4, pp. 713-718, April 2011 |
Legal Events

| Code | Event |
|---|---|
| C06 / PB01 | Publication |
| C10 / SE01 | Entry into substantive examination; request for substantive examination in force |
| C02 / WD01 | Invention patent application deemed withdrawn after publication (application publication date: 2011-09-21) |