CN102193631A - Wearable three-dimensional gesture interaction system and using method thereof - Google Patents

Wearable three-dimensional gesture interaction system and using method thereof

Info

Publication number
CN102193631A
Authority
CN
China
Prior art keywords
gesture
mark
turning
interaction
camera
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN2011101146045A
Other languages
Chinese (zh)
Inventor
凌晨
陈明
张文俊
赵凡
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
University of Shanghai for Science and Technology
Original Assignee
University of Shanghai for Science and Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by University of Shanghai for Science and Technology
Priority to CN2011101146045A
Publication of CN102193631A
Legal status: Pending

Abstract

Translated from Chinese

The invention discloses a wearable three-dimensional gesture interaction system and a method of using it. The system consists of a miniature projector, an ordinary camera, a portable computer, and a set of simple markers. Through simple computer-vision computation, it realizes human-computer interaction with a three-dimensional graphical interface by means of gestures. A three-dimensional graphical interface based on projection-screen markers and a virtual-mouse gesture make the interface more vivid and the interaction more flexible. The equipment is simple and inexpensive, and also meets the requirements of mobile devices.

Description

Translated from Chinese
Wearable three-dimensional gesture interaction system and method of use thereof

Technical Field

The present invention relates to wearable gesture-interaction technology for three-dimensional graphical interfaces, and specifically to a wearable three-dimensional gesture interaction system based on monocular vision and markers, in which human-computer interaction is performed through gestures, and to its method of use.

Background Art

As people's demands for natural interaction keep rising, human-computer interaction through gestures has become a mainstream trend. Wearable-computer technology is one feasible way to make virtual data operable anytime and anywhere. With the development of pico projectors, it is an irreversible trend for them to become standard equipment on mobile devices. At the same time, people are increasingly dissatisfied with the small display screens of mobile devices, so projecting the interface onto a surface to enlarge the screen has become a research hotspot.

Regarding research on wearable computers: Chinese patent "Microminiature keyboard special for wearable computers", application No. 200410043913.8, provides a keyboard that is small and structurally simple, but the interaction is still traditional keyboard interaction, which does not meet the requirements of natural interaction. Chinese patent "Wearable input device", application No. 200920178274.4, and Chinese patent "A glove-type virtual input device", application No. 200920204864.X, free the user from traditional mouse-and-keyboard hardware and virtualize the input device, but they require special-purpose hardware and are therefore unsuitable for mobile devices. Chinese patent "Wearable multimedia wireless microcomputer with a Chinese LINUX operating system", application No. 00107607.8, uses natural speech for human-computer interaction, but people are more accustomed to interacting with their hands, so it is not suited to large-scale commercial deployment. Chinese patent "A human-computer interaction helmet for wearable computers", application No. 200710023167.X, proposes a helmet with eye tracking that, combined with voice, enables human-computer interaction with a wearable computer; it can be widely used in aviation, aerospace, machinery manufacturing, electric power, the military, and other fields where the hands cannot be fully freed, but its hardware requirements are too high.

Summary of the Invention

In view of the problems and deficiencies of the prior art described above, the object of the present invention is to provide a wearable three-dimensional gesture interaction system and its method of use, with a simple system structure, in which users can interact with virtual three-dimensional objects through gestures and the three-dimensional interface is vivid and concrete. To achieve this object, the concept of the invention is as follows. In accordance with the technical requirements of wearable computing, the invention comprises a portable computer (1), a camera (2), a pico projector (3), and colored markers (4). So that the camera captures what the user sees, the camera should be close to the user's eyes, worn on the head or hung on the chest. So that the projected picture is not blocked by the hands, the pico projector should be worn on the head. Furthermore, so that geometric distortion between the camera image and the actually projected picture, caused by the difference in their positions, does not affect later processing, and to reduce program complexity, the pico projector and the camera should be placed as close together as possible, i.e. their distances and angles to the projection screen should be as nearly identical as possible. The pico projector and the camera are connected to a portable computer, which may be a laptop or a smartphone. Colored markers are worn on the user's fingers; in the present invention, the index finger, thumb, and little finger of the left hand and the index finger and thumb of the right hand wear markers of different colors.

There are no special requirements for the pico projector or the camera; any model currently on the market will do. The camera and pico projector are mounted on a hat worn on the head, and the portable computer is carried on the back; together these form the hardware system of the wearable three-dimensional gesture interaction method. As mobile devices develop, once pico projectors become standard mobile-device components just as cameras have, the system can be simplified into a single, fully integrated mobile device.

Based on the above inventive concept, the present invention adopts the following technical scheme:

A wearable three-dimensional gesture interaction system comprises a portable computer (1), a camera (2), a pico projector (3), and colored markers (4), characterized in that the camera and the pico projector are mounted on a hat so that their distances and angles to the projection screen are as nearly identical as possible, both are connected to a portable computer, and colored markers are worn on the user's fingers.

A method of using the above wearable three-dimensional gesture interaction system, characterized by the following operating steps:

1) Mark recognition: obtain the colored markers (4) on the user's fingers through the camera (2), and obtain their coordinates through image segmentation;

2) Action recognition: compute in real time the motion vector of each marker and the positional relationships among the markers, so as to determine which action command is intended;

3) Response output: modify the display according to the action command and project it onto a surface through the pico projector (3), realizing human-computer interaction.
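The per-marker motion vectors of step 2 can be sketched as a frame-to-frame difference of marker centroids (a minimal illustration; the marker ids and the centroid dictionaries are assumptions for this sketch, not part of the patent):

```python
def marker_motion_vectors(prev, curr):
    """Motion vector of each marker between consecutive key frames.

    prev and curr map a marker id (e.g. 'L-index') to its (x, y)
    centroid in image coordinates; markers missing from either
    frame are skipped.
    """
    return {m: (curr[m][0] - prev[m][0], curr[m][1] - prev[m][1])
            for m in prev if m in curr}

prev = {"L-index": (100, 120), "L-thumb": (80, 150)}
curr = {"L-index": (110, 115), "L-thumb": (80, 150)}
print(marker_motion_vectors(prev, curr))
# -> {'L-index': (10, -5), 'L-thumb': (0, 0)}
```

A gesture classifier would then compare these vectors (and the markers' relative positions) against gesture templates.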

For the mark recognition of step 1, the actual image captured by the camera is segmented using mature color-based image-processing techniques, yielding the coordinates of each colored marker.
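A minimal sketch of such color-based segmentation, using only NumPy (the RGB threshold values are illustrative assumptions; a real system would calibrate a threshold range per marker color):

```python
import numpy as np

def find_marker_centroid(frame, lower, upper):
    """Return the (x, y) centroid of pixels whose RGB values fall
    within [lower, upper] (a crude color segmentation), or None
    if no pixel matches."""
    mask = np.all((frame >= lower) & (frame <= upper), axis=-1)
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None
    return float(xs.mean()), float(ys.mean())

# Toy 4x4 frame with a "red marker" occupying the top-left 2x2 block.
frame = np.zeros((4, 4, 3), dtype=np.uint8)
frame[0:2, 0:2] = (200, 30, 30)
print(find_marker_centroid(frame,
                           np.array([150, 0, 0]),
                           np.array([255, 80, 80])))
# -> (0.5, 0.5)
```

Running one such pass per marker color yields the coordinate set that the action-recognition step consumes.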

For the action recognition of step 2, the interaction mode of the present invention is gesture interaction, comprising four classes of gestures: multi-touch gestures, free gestures, hybrid gestures, and the virtual mouse. Multi-touch gestures are the series of gestures now popular on multi-touch devices; with them one can freely zoom images, move icons, and so on. Free gestures are gestures from everyday life: for example, an "OK" gesture with the left hand performs a confirmation; crossing the left and right index fingers into an "X" performs a cancellation; and holding both hands flat, left hand above and right hand below, with the left thumb touching the right index finger and the left index finger touching the right thumb, forms a "viewfinder" gesture that saves the picture currently captured by the camera, i.e. a photo function. Hybrid gestures use both hands in cooperation for human-computer interaction: the left hand controls three-dimensional transformations, such as selecting the viewing angle, while the right hand controls planar transformations, such as moving the picture left and right. The virtual mouse is a gesture proposed by this invention: the palm of the left hand is used as a mouse. The left hand is spread open with the palm facing the camera, and the four fingers other than the thumb are bent as far as possible so that their fingernails lie on a straight line. The colored markers on the left index finger and little finger are then collinear; taking the index-finger marker as the top-left vertex and the little-finger marker as the top-right vertex, a square is extended downward, and the region inside this square is the control area, corresponding to the coordinates of the graphical interface. Since this square lies over the left palm, the right index finger can touch the palm while moving within the control area, so the palm itself can be regarded as the control area. The advantage of this design is that the moving right index finger has something to brace against, making the motion steadier and the positioning precise. This is described in detail in the specific embodiments, with reference to the figures.
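The virtual-mouse mapping described above — two markers fixing the top edge of a square control area that corresponds to interface coordinates — can be sketched as follows (a simplification that assumes an axis-aligned top edge; the function and parameter names are illustrative, not from the patent):

```python
def virtual_mouse_to_screen(index_marker, little_marker, fingertip,
                            screen_w, screen_h):
    """Map the right index fingertip inside the palm 'control square'
    to interface coordinates.

    The square's top edge runs from the left-hand index-finger marker
    (top-left vertex) to the little-finger marker (top-right vertex);
    the square extends downward by the same side length.
    """
    ix, iy = index_marker
    lx, _ = little_marker
    side = lx - ix                    # side length of the square
    u = (fingertip[0] - ix) / side    # 0..1 across the square
    v = (fingertip[1] - iy) / side    # 0..1 down the square
    u = min(max(u, 0.0), 1.0)         # clamp to the control area
    v = min(max(v, 0.0), 1.0)
    return u * screen_w, v * screen_h

# Markers 100 px apart; fingertip at the square's center on an 800x600 UI.
print(virtual_mouse_to_screen((100, 100), (200, 100), (150, 150), 800, 600))
# -> (400.0, 300.0)
```

A full implementation would also rotate the square to follow the actual orientation of the index-to-little-finger line.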

For the response output of step 3, the graphical interface of the portable computer is projected by the pico projector onto a physical surface such as a wall or a desktop. The interface may be a conventional two-dimensional graphical interface or a three-dimensional one, and is compatible with today's commercial three-dimensional interfaces such as BumpTop and Real Desktop. The invention also proposes a method of realizing a three-dimensional interface using projection-screen markers: different markers are drawn at the four corners of the original interface before it is cast onto the projection screen; these markers are called "projection screen markers". Because the angle between the user and the normal vector of the projection plane varies, the projected picture exhibits geometric distortion, and the camera then captures the actual projected image. From the four markers and their visual transformation matrix, the direction vector of the projection plane is computed, a virtual three-dimensional object is rendered and projected onto the surface, and finally the user can interact with the virtual object through gestures. Thus, as the user moves, i.e. as the angle between the user and the projection-plane normal changes, different views of the virtual objects are rendered, and the user perceives the virtual objects as if they really existed. This is described in detail in the specific embodiments, with reference to the figures.
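The relation between the four interface corners and where the camera sees their projected markers is a plane-to-plane homography; a minimal direct-linear-transform estimate (NumPy only, no lens-distortion handling — an illustrative sketch under these assumptions, not the patent's implementation) looks like:

```python
import numpy as np

def homography_dlt(src, dst):
    """Estimate the 3x3 homography H (dst ~ H @ src in homogeneous
    coordinates) from four point correspondences via the direct
    linear transform."""
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        rows.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        rows.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # The homography vector is the right null vector of the 8x9 system.
    _, _, vt = np.linalg.svd(np.array(rows, dtype=float))
    h = vt[-1].reshape(3, 3)
    return h / h[2, 2]

# Corners of the undistorted interface vs. where the camera saw them;
# here the "camera" view is simply translated by (10, 20).
src = [(0, 0), (1, 0), (1, 1), (0, 1)]
dst = [(10, 20), (11, 20), (11, 21), (10, 21)]
H = homography_dlt(src, dst)
print(np.round(H, 6))
```

Decomposing such a homography (given the camera intrinsics) yields the projection plane's orientation, which is what the rendering step needs.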

Compared with the prior art, the present invention has the following evident substantive features and notable advantages:

Through a three-dimensional graphical interface based on projection-screen markers and a virtual-mouse gesture, the present invention makes the three-dimensional graphical interface more vivid and the interaction more flexible. The equipment is simple and inexpensive, and meets the requirements of mobile devices.

Brief Description of the Drawings

Fig. 1 is a schematic diagram of an embodiment of the present invention;

Fig. 2 is a schematic flow diagram of an embodiment of the present invention;

Fig. 3 is a schematic diagram of the virtual mouse;

Fig. 4 is a schematic diagram of the projection screen markers.

Detailed Description of the Embodiments

Preferred embodiments of the present invention are described in further detail below with reference to the accompanying drawings.

Embodiment 1: As shown in Fig. 1, the wearable three-dimensional gesture interaction system comprises a portable computer (1), a camera (2), a pico projector (3), and several colored markers (4). The system makes few demands on hardware, with no special requirements for the pico projector or the camera, so its cost is low. So that the camera captures what the user sees, and so that hand movements do not interfere with the projection, the pico projector and the camera are mounted on a hat, as shown in Fig. 1.

Embodiment 2: The method of using the wearable three-dimensional gesture interaction system, shown in Fig. 2, proceeds as follows:

① Mark recognition;

② Action recognition;

③ Response output.

The mark recognition of step ① obtains the colored markers on the user's fingers through the camera and their coordinates through image segmentation, as follows:

(1) Acquire a key-frame image.

(2) Segment the colored markers in the key-frame image, obtain the coordinates of each colored marker, and compute the relative positions of the markers as the gesture.

The action recognition of step ② performs gesture interaction as follows:

(3) Is it a multi-touch gesture? If so, go to step (7); otherwise go to step (4).

(4) Is it a free gesture? If so, go to step (8); otherwise go to step (5).

(5) Is it a hybrid gesture? If so, go to step (9); otherwise go to step (6).

(6) Is it a virtual mouse? If so, go to step (10); otherwise go to step (1).

Step (6) involves a special gesture, the virtual mouse. As shown in Fig. 3, the colored markers on the left index finger and little finger lie on a straight line; the index-finger marker is taken as the top-left vertex and the little-finger marker as the top-right vertex, a square is extended downward, and the region inside this square, i.e. the palm, is the control area, corresponding to the coordinates of the graphical interface. The advantage of this design is that the moving right index finger has something to brace against, making the motion steadier and the positioning precise. The virtual mouse can serve as a virtual keyboard for entering text, and can also draw precisely on the interface.

The response output of step ③ modifies the display according to the action command and projects it onto a surface through the pico projector, realizing human-computer interaction, as follows:

(7) According to the gesture indicated by the colored markers, perform the corresponding multi-touch operation; when the interaction is complete, return to step (1).

(8) According to the gesture indicated by the colored markers, perform the corresponding free-gesture operation; when the interaction is complete, return to step (1).

(9) According to the gesture indicated by the colored markers, perform the corresponding hybrid-gesture operation; when the interaction is complete, return to step (1).

(10) According to the gesture indicated by the colored markers, perform the corresponding virtual-mouse operation; when the interaction is complete, return to step (1).
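The flow of steps (1)-(10) amounts to a classify-and-dispatch loop; one iteration can be sketched as follows (the gesture labels and handler interface are illustrative assumptions):

```python
def interaction_step(classify, handlers, frame):
    """One pass of the Fig. 2 flow: classify the gesture in the current
    frame (steps 3-6) and dispatch to its response handler (steps 7-10);
    an unrecognized frame falls through back to capture (step 1)."""
    gesture = classify(frame)  # 'multitouch' | 'free' | 'hybrid' | 'vmouse' | None
    if gesture in handlers:
        handlers[gesture](frame)
        return gesture
    return None                # back to step (1): acquire the next frame

log = []
handlers = {g: (lambda f, g=g: log.append(g))
            for g in ("multitouch", "free", "hybrid", "vmouse")}
print(interaction_step(lambda f: "free", handlers, frame=object()))
# -> free
```

Wrapping this in a `while` loop over captured key frames reproduces the flowchart's cycle.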

In the present invention either an ordinary two-dimensional graphical interface or a three-dimensional graphical interface may be used. This embodiment provides a special three-dimensional graphical interface, the "projection screen markers": as shown in Fig. 4, different markers are drawn at the four corners of the original interface, which is then cast onto the projection screen. Because the angle between the user and the normal vector of the projection plane varies, the projected picture exhibits geometric distortion, and these actual projected images are then captured by the camera. From the four markers and their visual transformation matrix

[X_c, Y_c, Z_c, 1]^T = P · [R, T; 0, 1] · [X_w, Y_w, Z_w, 1]^T ,

where the matrix maps the world coordinate system (X_w, Y_w, Z_w) into the camera coordinate system (X_c, Y_c, Z_c), R is the rotation matrix, T is the translation matrix, and P is the perspective transformation matrix, the direction vector of the projection plane is computed; a virtual three-dimensional object is then rendered and projected onto the surface, and finally the user can interact with it through gestures. In previous marker-based augmented-reality systems the camera position was fixed while the markers moved; with the projection screen markers of the present invention it is the markers that are fixed, i.e. the projection screen does not move and the camera moves. Thus, as the user moves, i.e. as the angle between the user and the projection-plane normal changes, different views of the virtual objects are rendered, and the user perceives the virtual objects as if they really existed, more vividly and concretely than in previous three-dimensional interfaces. The user can interact with the virtual three-dimensional objects through gestures.
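The transformation above is the standard pinhole model; the following sketch applies an assumed extrinsic pair (R, T) and an illustrative perspective matrix K to project a world point into the image (all numeric values are assumptions chosen for illustration, not from the patent):

```python
import numpy as np

R = np.eye(3)                      # rotation matrix (world -> camera)
T = np.array([0.0, 0.0, 5.0])      # translation: camera 5 units from the plane
K = np.array([[800.0,   0.0, 320.0],   # perspective/intrinsic matrix:
              [  0.0, 800.0, 240.0],   # 800 px focal length,
              [  0.0,   0.0,   1.0]])  # principal point at (320, 240)

def project(p_world):
    """World point -> pixel coordinates via X_c = R @ X_w + T
    followed by a perspective divide through K."""
    p_cam = R @ p_world + T
    uvw = K @ p_cam
    return uvw[:2] / uvw[2]

# The world origin lands on the principal point.
print(project(np.array([0.0, 0.0, 0.0])))
# -> [320. 240.]
```

Inverting this relation from the four observed markers is what recovers the projection plane's orientation as the user (and camera) move.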

Claims (5)

1. A wearable three-dimensional gesture interaction system, comprising a portable computer (1), a camera (2), a pico projector (3), and colored markers (4), characterized in that the camera and the pico projector are mounted on a hat so that their distances and angles to the projection screen are as nearly identical as possible, both are connected to the portable computer, and the colored markers are worn on the user's fingers.
2. The method of using the wearable three-dimensional gesture interaction system according to claim 1, characterized by the following operating steps:
① mark recognition: the colored markers (4) on the user's fingers are captured by the camera (2), and their coordinates are obtained through image segmentation;
② action recognition: the motion vector of each marker and the positional relationships among the markers are computed in real time, and the action command is determined;
③ response output: the display is modified according to the action command and projected onto a surface through the pico projector (3), realizing human-computer interaction.
3. The method of using the wearable three-dimensional gesture interaction system according to claim 2, wherein the mark recognition of step ① comprises:
(1) acquiring a key-frame image;
(2) segmenting the colored markers in the key-frame image, obtaining the coordinates of each colored marker, and computing the relative positions of the markers as the gesture.
4. The method of using the wearable three-dimensional gesture interaction system according to claim 2, wherein the action recognition of step ② comprises:
(3) is it a multi-touch gesture? If so, go to step (7); otherwise go to step (4);
(4) is it a free gesture? If so, go to step (8); otherwise go to step (5);
(5) is it a hybrid gesture? If so, go to step (9); otherwise go to step (6);
(6) is it a virtual mouse? If so, go to step (10); otherwise go to step (1);
step (6) involves a special gesture, the virtual mouse: the colored markers on the left index finger and little finger lie on a straight line; the index-finger marker is taken as the top-left vertex and the little-finger marker as the top-right vertex; a square is extended downward, and the region inside this square, i.e. the palm, is the control area, corresponding to the coordinates of the graphical interface; the advantage of this design is that the moving right index finger has something to brace against, making the motion steadier and the positioning precise.
5. The method according to claim 2, wherein the response output of step ③ comprises:
(7) performing the corresponding multi-touch operation according to the gesture indicated by the colored markers, and returning to step (1) after the interaction is complete;
(8) performing the corresponding free-gesture operation according to the gesture indicated by the colored markers, and returning to step (1) after the interaction is complete;
(9) performing the corresponding hybrid-gesture operation according to the gesture indicated by the colored markers, and returning to step (1) after the interaction is complete;
(10) performing the corresponding virtual-mouse operation according to the gesture indicated by the colored markers, and returning to step (1) after the interaction is complete.
CN2011101146045A | 2011-05-05 | 2011-05-05 | Wearable three-dimensional gesture interaction system and using method thereof | Pending | CN102193631A (en)

Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
CN2011101146045A (CN102193631A, en) | 2011-05-05 | 2011-05-05 | Wearable three-dimensional gesture interaction system and using method thereof

Applications Claiming Priority (1)

Application Number | Priority Date | Filing Date | Title
CN2011101146045A (CN102193631A, en) | 2011-05-05 | 2011-05-05 | Wearable three-dimensional gesture interaction system and using method thereof

Publications (1)

Publication Number | Publication Date
CN102193631A (en) | 2011-09-21

Family

ID=44601809

Family Applications (1)

Application Number | Title | Priority Date | Filing Date
CN2011101146045A (Pending; CN102193631A, en) | Wearable three-dimensional gesture interaction system and using method thereof | 2011-05-05 | 2011-05-05

Country Status (1)

Country | Link
CN (1) | CN102193631A (en)

Cited By (25)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
WO2014012486A1 (en)* | 2012-07-17 | 2014-01-23 | Gao Shouqian | Wearable wireless intelligent electronic device having removable and freely-combinable functional modules
CN103995592A (en)* | 2014-05-21 | 2014-08-20 | 上海华勤通讯技术有限公司 | Wearable equipment and terminal information interaction method and terminal
CN104063038A (en)* | 2013-03-18 | 2014-09-24 | 联想(北京)有限公司 | Information processing method and device and electronic equipment
CN104102071A (en)* | 2013-04-12 | 2014-10-15 | 王旭东 | Full-automatic photographing equipment
CN104345802A (en)* | 2013-08-08 | 2015-02-11 | 派布勒斯有限公司 | Method and device for controlling a near eye display
CN104460988A (en)* | 2014-11-11 | 2015-03-25 | 陈琦 | Input control method of intelligent cell phone virtual reality device
CN104461277A (en)* | 2013-09-23 | 2015-03-25 | Lg电子株式会社 | Mobile terminal and method of controlling therefor
CN104660941A (en)* | 2013-11-22 | 2015-05-27 | 北京弘天智达科技有限公司 | Micro display communication device, system and method
CN104866103A (en)* | 2015-06-01 | 2015-08-26 | 联想(北京)有限公司 | Relative position determining method, wearable electronic equipment and terminal equipment
WO2015165181A1 (en)* | 2014-04-28 | 2015-11-05 | 京东方科技集团股份有限公司 | Method and apparatus for controlling projection of wearable device, and wearable device
CN105302337A (en)* | 2013-06-04 | 2016-02-03 | 李文傑 | High-resolution and high-sensitivity three-dimensional (3D) mouse movement control system, movement control device and motion detection method thereof
CN106293078A (en)* | 2016-08-02 | 2017-01-04 | 福建数博讯信息科技有限公司 | Virtual reality exchange method based on photographic head and device
CN104090465B (en)* | 2014-06-17 | 2017-01-11 | 福建水立方三维数字科技有限公司 | Three-dimensional interactive projection imaging method
WO2017107182A1 (en)* | 2015-12-25 | 2017-06-29 | 深圳市柔宇科技有限公司 | Head-mounted display device
CN107085467A (en)* | 2017-03-30 | 2017-08-22 | 北京奇艺世纪科技有限公司 | A kind of gesture identification method and device
CN109445599A (en)* | 2018-11-13 | 2019-03-08 | 宁波视睿迪光电有限公司 | Interaction pen detection method and 3D interactive system
CN109496331A (en)* | 2016-05-20 | 2019-03-19 | 奇跃公司 | Contextual awareness of user interface menus
CN110049228A (en)* | 2018-01-17 | 2019-07-23 | 北京林业大学 | A kind of new method and system taken pictures based on gesture control
CN110140099A (en)* | 2017-01-27 | 2019-08-16 | 高通股份有限公司 | System and method for tracking control unit
CN111624770A (en)* | 2015-04-15 | 2020-09-04 | 索尼互动娱乐股份有限公司 | Pinch and hold gesture navigation on head mounted display
CN112416133A (en)* | 2020-11-30 | 2021-02-26 | 魔珐(上海)信息科技有限公司 | Hand motion capture method and device, electronic equipment and storage medium
CN112515661A (en)* | 2020-11-30 | 2021-03-19 | 魔珐(上海)信息科技有限公司 | Posture capturing method and device, electronic equipment and storage medium
CN114138119A (en)* | 2021-12-08 | 2022-03-04 | 武汉卡比特信息有限公司 | Gesture recognition system and method for mobile phone interconnection split screen projection
CN114967927A (en)* | 2022-05-30 | 2022-08-30 | 桂林电子科技大学 | Intelligent gesture interaction method based on image processing
US12406454B2 | 2016-03-31 | 2025-09-02 | Magic Leap, Inc. | Interactions with 3D virtual objects using poses and multiple-dof controllers

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
CN101077232A (en) * | 2007-06-07 | 2007-11-28 | Nanjing University of Aeronautics and Astronautics | Human-computer interaction helmet for wearable computer

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Liu Yang et al., "Hand tracking and gesture recognition method based on wearable vision", Journal of Beijing Institute of Technology, Vol. 25, No. 12, 2005-12-31, pp. 1083-1086 (cited against claims 1-2) *
Li Yan et al., "A virtual-real collision detection method based on real-time hand tracking and localization", Journal of Computer-Aided Design & Computer Graphics, Vol. 23, No. 4, 2011-04-30, pp. 713-718 (cited against claims 1-2) *

Cited By (37)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
WO2014012486A1 (en) * | 2012-07-17 | 2014-01-23 | Gao Shouqian | Wearable wireless intelligent electronic device having removable and freely-combinable functional modules
CN104063038A (en) * | 2013-03-18 | 2014-09-24 | Lenovo (Beijing) Co., Ltd. | Information processing method and device and electronic equipment
CN104102071A (en) * | 2013-04-12 | 2014-10-15 | Wang Xudong | Full-automatic photographing equipment
CN105302337B (en) * | 2013-06-04 | 2019-10-18 | Li Wenjie | High-resolution and high-sensitivity three-dimensional (3D) mouse movement control system, movement control device and motion detection method thereof
CN105302337A (en) * | 2013-06-04 | 2016-02-03 | Li Wenjie | High-resolution and high-sensitivity three-dimensional (3D) mouse movement control system, movement control device and motion detection method thereof
CN104345802B (en) * | 2013-08-08 | 2019-03-22 | Facebook, Inc. | Devices, systems, and methods for controlling near-to-eye displays
CN104345802A (en) * | 2013-08-08 | 2015-02-11 | Pebbles Ltd. | Method and device for controlling a near eye display
CN104461277A (en) * | 2013-09-23 | 2015-03-25 | LG Electronics Inc. | Mobile terminal and control method therefor
CN104660941A (en) * | 2013-11-22 | 2015-05-27 | 北京弘天智达科技有限公司 | Micro display communication device, system and method
WO2015165181A1 (en) * | 2014-04-28 | 2015-11-05 | BOE Technology Group Co., Ltd. | Method and apparatus for controlling projection of wearable device, and wearable device
US9872002B2 (en) | 2014-04-28 | 2018-01-16 | BOE Technology Group Co., Ltd. | Method and device for controlling projection of wearable apparatus, and wearable apparatus
CN103995592A (en) * | 2014-05-21 | 2014-08-20 | Shanghai Huaqin Telecom Technology Co., Ltd. | Wearable device and terminal information interaction method, and terminal
CN104090465B (en) * | 2014-06-17 | 2017-01-11 | 福建水立方三维数字科技有限公司 | Three-dimensional interactive projection imaging method
CN104460988A (en) * | 2014-11-11 | 2015-03-25 | Chen Qi | Input control method for a smartphone virtual reality device
CN104460988B (en) * | 2014-11-11 | 2017-12-22 | Chen Qi | Input control method for a smartphone virtual reality device
CN111624770A (en) * | 2015-04-15 | 2020-09-04 | Sony Interactive Entertainment Inc. | Pinch and hold gesture navigation on a head-mounted display
CN104866103B (en) * | 2015-06-01 | 2019-12-24 | Lenovo (Beijing) Co., Ltd. | Relative position determination method, wearable electronic device and terminal device
CN104866103A (en) * | 2015-06-01 | 2015-08-26 | Lenovo (Beijing) Co., Ltd. | Relative position determination method, wearable electronic device and terminal device
WO2017107182A1 (en) * | 2015-12-25 | 2017-06-29 | Shenzhen Royole Technologies Co., Ltd. | Head-mounted display device
US12406454B2 (en) | 2016-03-31 | 2025-09-02 | Magic Leap, Inc. | Interactions with 3D virtual objects using poses and multiple-DOF controllers
CN109496331A (en) * | 2016-05-20 | 2019-03-19 | Magic Leap, Inc. | Contextual awareness of user interface menus
CN109496331B (en) * | 2016-05-20 | 2022-06-21 | Magic Leap, Inc. | Contextual awareness of user interface menus
CN106293078A (en) * | 2016-08-02 | 2017-01-04 | 福建数博讯信息科技有限公司 | Camera-based virtual reality interaction method and device
US11740690B2 (en) | 2017-01-27 | 2023-08-29 | Qualcomm Incorporated | Systems and methods for tracking a controller
US12141341B2 (en) | 2017-01-27 | 2024-11-12 | Qualcomm Incorporated | Systems and methods for tracking a controller
CN110140099A (en) * | 2017-01-27 | 2019-08-16 | Qualcomm Incorporated | System and method for tracking a controller
CN110140099B (en) * | 2017-01-27 | 2022-03-11 | Qualcomm Incorporated | System and method for tracking a controller
CN107085467A (en) * | 2017-03-30 | 2017-08-22 | Beijing QIYI Century Science & Technology Co., Ltd. | Gesture recognition method and device
CN110049228A (en) * | 2018-01-17 | 2019-07-23 | Beijing Forestry University | New method and system for gesture-controlled photographing
CN109445599A (en) * | 2018-11-13 | 2019-03-08 | 宁波视睿迪光电有限公司 | Interactive pen detection method and 3D interaction system
CN112515661B (en) * | 2020-11-30 | 2021-09-14 | 魔珐(上海)信息科技有限公司 | Posture capture method and device, electronic device and storage medium
WO2022111525A1 (en) * | 2020-11-30 | 2022-06-02 | 魔珐(上海)信息科技有限公司 | Posture capture method and apparatus, electronic device, and storage medium
CN112515661A (en) * | 2020-11-30 | 2021-03-19 | 魔珐(上海)信息科技有限公司 | Posture capture method and device, electronic device and storage medium
CN112416133A (en) * | 2020-11-30 | 2021-02-26 | 魔珐(上海)信息科技有限公司 | Hand motion capture method and device, electronic device and storage medium
CN114138119A (en) * | 2021-12-08 | 2022-03-04 | 武汉卡比特信息有限公司 | Gesture recognition system and method for mobile phone interconnection split-screen projection
CN114967927A (en) * | 2022-05-30 | 2022-08-30 | Guilin University of Electronic Technology | Intelligent gesture interaction method based on image processing
CN114967927B (en) * | 2022-05-30 | 2024-04-16 | Guilin University of Electronic Technology | Intelligent gesture interaction method based on image processing

Similar Documents

Publication | Publication Date | Title
CN102193631A (en) | Wearable three-dimensional gesture interaction system and using method thereof
CN114174960B (en) | Projection in a virtual environment
US10545580B2 (en) | 3D interaction method, device, computer equipment and storage medium
US9395821B2 (en) | Systems and techniques for user interface control
Grossman et al. | Multi-finger gestural interaction with 3D volumetric displays
CN103336575B (en) | Human-computer interaction smart glasses system and interaction method
CN107491174A (en) | Method, apparatus, system and electronic equipment for remote assistance
CN107450714A (en) | Human-computer interaction support test system based on augmented reality and image recognition
CN104063039A (en) | Human-computer interaction method for a wearable computer intelligent terminal
Huang et al. | A survey on human-computer interaction in mixed reality
CN104464414A (en) | Augmented reality teaching system
CN106846496A (en) | DICOM image viewing system and operating method based on mixed reality technology
CN109145802A (en) | Kinect-based multi-hand gesture human-computer interaction method and device
CN104637080B (en) | Three-dimensional drawing system and method based on human-computer interaction
CN109471533A (en) | Student terminal system for VR/AR classrooms and method of use
Lu et al. | Classification, application, challenge, and future of midair gestures in augmented reality
CN115328304A (en) | 2D-3D fusion virtual reality interaction method and device
Qian et al. | Portalware: Exploring free-hand AR drawing with a dual-display smartphone-wearable paradigm
CN118466805A (en) | Non-contact 3D model human-computer interaction method based on machine vision and gesture recognition
Marchal et al. | Designing intuitive multi-touch 3D navigation techniques
Schöning et al. | Bimanual interaction with interscopic multi-touch surfaces
CN104821135B (en) | Method and device for combined display of paper maps and electronic maps
Zhang et al. | A hybrid 2D-3D tangible interface for virtual reality
CN115525151 (en) | Immersive interactive large-screen implementation method
CN115115812A (en) | Virtual scene display method and device, and storage medium

Legal Events

Date | Code | Title | Description

C06 | Publication
PB01 | Publication
C10 | Entry into substantive examination
SE01 | Entry into force of request for substantive examination
C02 | Deemed withdrawal of patent application after publication (patent law 2001)
WD01 | Invention patent application deemed withdrawn after publication

Application publication date: 2011-09-21

