CN103150024B - Computer operation method - Google Patents

Computer operation method
Download PDF

Info

Publication number
CN103150024B
CN103150024B (application CN201310115476.5A; first publication CN103150024A)
Authority
CN
China
Prior art keywords
gesture
user
computer
hand
resolved
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201310115476.5A
Other languages
Chinese (zh)
Other versions
CN103150024A (en)
Inventor
施海昕
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ickey Shanghai Internet Technology Co ltd
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual
Priority to CN201310115476.5A
Publication of CN103150024A
Application granted
Publication of CN103150024B
Legal status: Active
Anticipated expiration


Abstract

The invention relates to computer technology, in particular to a computer operation method. The method comprises the steps of starting a somatosensory (motion-sensing) device; initializing the position and state of a cursor; recognizing, twice in succession, the current position and gesture of the user's hand as the user gestures in the sensing space; updating the position and state of the cursor accordingly; calculating and parsing the change in the hand's position and gesture; and operating and updating the graphical interface based on the result. The method allows existing somatosensory devices to control and operate everyday computers, extending the range of application of such devices, and is broadly applicable to devices with information-processing capability such as PCs, laptops, netbooks, smart terminals, and televisions.

Description

Translated from Chinese
A computer operation method

Technical Field

The present invention relates to computer technology, and in particular to a computer operation method.

Background

Ever since computers have existed, human-computer interaction has evolved alongside them. This is largely self-evident: the computer must understand the commands we enter, then compute and respond accordingly. Human-computer interaction is, simply put, the way people deal with computers.

This history can be traced back to 1880, when Herman Hollerith of the U.S. Census Bureau, weary of processing census data by hand, began looking for a way to tabulate it by machine. The result was the punched-card tabulating machine, for which Hollerith became known as the "father of data processing." It is hard to imagine today that the earliest human-computer interaction consisted of encoding information as punched holes and having a machine read them.

Until the modern computer appeared, the "keyboard" inherited from the typewriter era remained the traditional mode of human-computer interaction. In 1983, however, the mouse arrived. Compared with the keyboard's arrow keys, the mouse is clearly closer to people's natural habits, and it has become an indispensable tool for the vast majority of computer users today.

The keyboard-and-mouse combination carried over from the PC era into the Internet era with little change, until the arrival of smartphones and multi-touch.

Before the iPhone, smartphones followed the keyboard-and-mouse model of input: a keyboard was mandatory, with no fewer than ten keys, while the mouse, being too large, was replaced by touch for indicating position.

Multi-touch, however, opened another window: it made everyone realize that the keyboard could itself become part of the touch surface, and that many commands could be issued by swiping several fingers across the touchscreen in different ways, for example to zoom a picture in or out.

In early November 2010, Kinect, the somatosensory peripheral for Microsoft's Xbox 360 home video game console, went on public sale. With Kinect, ordinary users can control games directly with their bodies, without any gamepad, joystick, mouse, or other remote control.

To play a sports game, for example, you simply plug in the console and stand in front of the TV. All control of the game character is done in the most natural way: whether swinging a table-tennis paddle or throwing a bowling ball, you just perform the motion as you would in real life.

Or take a racing game: how do you steer? Imagine you are actually driving; raise both hands in the air as if gripping a steering wheel and turn it left or right. You can even mime shifting gears in mid-air.

In a word, "you are the remote control."

However, somatosensory peripherals such as the Kinect can currently only be used to control games; they cannot be used to control and operate everyday computers (such as PCs and laptops) for web browsing or office work.

Summary of the Invention

To solve the technical problem that existing somatosensory peripherals can only be used to control games, and cannot be used to control and operate everyday computers (such as PCs and laptops), the present invention proposes a computer operation method comprising:

Step S1: the computer starts the somatosensory device.
Step S2: the computer initializes the position and state of the cursor and draws the cursor on the display.
Step S3: the user makes a gesture in the sensing space established by the somatosensory device.
Step S4: through the somatosensory device, the computer recognizes the current position and gesture of the user's hand for the first time.
Step S5: the user makes a gesture again in the sensing space.
Step S6: the computer again recognizes the current position and gesture of the user's hand.
Step S7: the computer updates the position and state of the cursor on the display so that they correspond to the position and gesture of the user's hand.
Step S8: the computer compares the two most recently acquired samples of the hand's position and gesture and calculates the position change and gesture change.
Step S9: the computer parses the position change and gesture change against predefined gesture patterns.
Step S10: the computer operates and updates the graphical interface according to the parsing result.
Step S11: return to step S5.
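The steps above form a simple polling loop. The following Python sketch illustrates that loop under stated assumptions: `StubDevice` is a hypothetical stand-in for a real somatosensory-device API, and `resolve` is deliberately reduced to two of the step-S9 rules (the full rule sets are given in the embodiments).

```python
# Sketch of the S1–S11 loop; StubDevice and resolve are illustrative only.

class StubDevice:
    """Yields (position, gesture) samples, mimicking steps S3–S6."""
    def __init__(self, samples):
        self._samples = iter(samples)

    def read_hand(self):
        return next(self._samples)

def resolve(prev, curr):
    """Step S9 (reduced): map the change between two samples to an operation."""
    (p0, g0), (p1, g1) = prev, curr
    if g0 == g1 == "stone" and p0 != p1:
        return "move"                       # held fist + motion
    if (g0, g1) == ("cloth", "stone"):
        return "open"                       # open palm closing into a fist
    return None

def run(device, steps):
    cursor = {"pos": (0, 0), "state": "idle"}   # S2: initialise the cursor
    prev = device.read_hand()                   # S3–S4: first recognition
    operations = []
    for _ in range(steps):
        curr = device.read_hand()               # S5–S6: recognise again
        cursor["pos"] = curr[0]                 # S7: cursor tracks the hand
        op = resolve(prev, curr)                # S8–S9: diff and parse
        if op is not None:
            operations.append(op)               # S10: act on the GUI
        prev = curr                             # S11: back to S5
    return cursor, operations
```

Feeding the loop a short synthetic sample sequence shows a "move" (fist displaced) followed by an "open" (palm closing into a fist) being recognized in turn.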

The computer operation method of the present invention can use existing somatosensory devices to control and operate everyday computers (such as PCs and laptops), further extending the range of application of such devices. It is widely applicable to devices with information-processing capability such as PCs, laptops, netbooks, smart terminals, and televisions.

Brief Description of the Drawings

Figure 1 is a flowchart of the computer operation method of the present invention.

Figure 2 shows the correspondence between hand position changes, gesture changes, and operations in Embodiment 1 of the present invention.

Figure 3 shows the correspondence between hand position changes, gesture changes, and operations in Embodiment 2 of the present invention.

Detailed Description

A computer operation method of the present invention is described in detail below with reference to the accompanying drawings.

The present invention provides a computer operation method whose flow is shown in Figure 1. It comprises steps S1 to S11 as set out above: starting the somatosensory device (S1); initializing and drawing the cursor (S2); recognizing the position and gesture of the user's hand twice in succession as the user gestures in the sensing space (S3 to S6); updating the cursor to match the hand (S7); comparing the two most recent samples to calculate the position and gesture changes (S8); parsing those changes against predefined gesture patterns (S9); operating and updating the graphical interface according to the result (S10); and returning to step S5 (S11).

Examples of the computer operation method of the present invention are given below.

Embodiment 1

Since somatosensory devices such as the Kinect can currently only be used to control games, and cannot be used to control and operate everyday computers (such as PCs and laptops), the present invention proposes a computer operation method, shown in Figures 1 and 2, comprising:

Step S1: the computer starts the somatosensory device.
Step S2: the computer initializes the position and state of the cursor and draws the cursor on the display.
Step S3: the user makes a gesture in the sensing space established by the somatosensory device. In this embodiment, the computer recognizes the following gestures: all five fingers extended with the palm open, which may be vividly called the "cloth" gesture; a clenched fist, the "stone" gesture; the index finger extended with the remaining fingers clenched, the "stick" gesture; and the middle and index fingers extended in a scissors shape with the remaining fingers clenched, the "scissors" gesture.
Step S4: through the somatosensory device, the computer recognizes the current position and gesture of the user's hand for the first time.
Step S5: the user makes a gesture again in the sensing space.
Step S6: the computer again recognizes the current position and gesture of the user's hand.
Step S7: the computer updates the position and state of the cursor on the display so that they correspond to the position and gesture of the user's hand.
Step S8: the computer compares the two most recently acquired samples of the hand's position and gesture and calculates the position change and gesture change.
Step S9: the computer parses the position change and gesture change against predefined gesture patterns. In this embodiment:
- If the hand holds the "stone" gesture while its position changes, this is parsed as a "move" operation, in which the cursor on the graphical interface follows the direction of the hand's movement.
- If the hand holds the "cloth" gesture while its position changes, this is likewise parsed as a "move" operation.
- A change from the "cloth" gesture to the "stone" gesture is parsed as an "open" operation: opening a folder, opening a file, or running a program.
- A change from the "cloth" gesture to the "stick" gesture is parsed as a "select" operation, which selects the target object.
- A change from the "cloth" gesture to the "stick" gesture with the hand's position then held unchanged for a certain time is parsed as a "long press" operation; a long press on the selected object puts it into a movable state.
- A change from the "cloth" gesture to the "stick" gesture accompanied by a change of position is parsed as a "drag" operation, which drags the selected object toward a target position.
- While in the "drag" state, a change from the "stick" gesture to the "cloth" gesture is parsed as a "release" operation, which ends the drag and moves the selected object to the current cursor position. Together, the "drag" and "release" operations conveniently implement the familiar drag-and-drop operation.
- If the hand holds the "scissors" gesture while its position changes, the computer parses the movement according to its direction: horizontal movement is parsed as a "horizontal scroll" operation, and non-horizontal movement as a "vertical scroll" operation.
Step S10: the computer operates and updates the graphical interface according to the parsing result.
Step S11: return to step S5.

Embodiment 2

The present invention provides another computer operation method, shown in Figures 1 and 3. Steps S1 through S8, S10, and S11 are identical to those of Embodiment 1, and the same four gestures are recognized in step S3: open palm with all five fingers extended ("cloth"), clenched fist ("stone"), index finger extended with the remaining fingers clenched ("stick"), and middle and index fingers extended in a scissors shape ("scissors"). The parsing rules of step S9 differ as follows:
- If the hand holds the "stick" gesture while its position changes, this is parsed as a "move" operation, in which the cursor on the graphical interface follows the direction of the hand's movement.
- If the hand holds the "cloth" gesture while its position changes, this is likewise parsed as a "move" operation.
- A change from the "cloth" gesture to the "stick" gesture is parsed as an "open" operation: opening a folder, opening a file, or running a program.
- A change from the "cloth" gesture to the "stone" gesture is parsed as a "select" operation, which selects the target object.
- A change from the "cloth" gesture to the "stone" gesture with the hand's position then held unchanged for a certain time is parsed as a "long press" operation; a long press on the selected object pops up a menu related to that object.
- A change from the "cloth" gesture to the "stone" gesture accompanied by a change of position is parsed as a "drag" operation, which drags the selected object toward a target position.
- While in the "drag" state, a change from the "stone" gesture to the "cloth" gesture is parsed as a "release" operation, which ends the drag and moves the selected object to the current cursor position. Together, the "drag" and "release" operations conveniently implement the familiar drag-and-drop operation.
- If the hand holds the "scissors" gesture while its position changes, horizontal movement is parsed as a "horizontal scroll" operation and non-horizontal movement as a "vertical scroll" operation.
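The two embodiments share the same loop and differ only in a few step-S9 rules: which transition opens versus selects, which transition releases a drag, which held gesture moves the cursor, and the effect of a long press. A table-driven resolver makes the difference explicit. The table layout, the key names, and the "long_press_effect" strings below are illustrative choices, not taken from the patent.

```python
# Gesture-transition tables for the two embodiments (illustrative layout).
E1 = {"open": ("cloth", "stone"), "select": ("cloth", "stick"),
      "release": ("stick", "cloth"), "move_hold": "stone",
      "long_press_effect": "make object movable"}
E2 = {"open": ("cloth", "stick"), "select": ("cloth", "stone"),
      "release": ("stone", "cloth"), "move_hold": "stick",
      "long_press_effect": "pop up context menu"}

def resolve_transition(prev_gesture, curr_gesture, table):
    """Map a gesture transition to an operation under one embodiment's table."""
    for op in ("open", "select", "release"):
        if (prev_gesture, curr_gesture) == table[op]:
            return op
    return None
```

The same `resolve_transition` then serves both embodiments by passing the appropriate table: the very transition ("cloth" to "stone") that opens an object under Embodiment 1 selects it under Embodiment 2.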

Claims (8)

Said step S9 means: if the user's hand holds the "stone" gesture and the position of the hand changes, the computer parses this as a "move" operation; if the hand holds the "cloth" gesture and its position changes, this is also parsed as a "move" operation; a change from the "cloth" gesture to the "stone" gesture is parsed as an "open" operation; a change from the "cloth" gesture to the "stick" gesture is parsed as a "select" operation; a change from the "cloth" gesture to the "stick" gesture with the position held unchanged for a certain time is parsed as a "long press" operation; a change from the "cloth" gesture to the "stick" gesture together with a change of position is parsed as a "drag" operation; while in the "drag" state, a change from the "stick" gesture to the "cloth" gesture is parsed as a "release" operation; if the hand holds the "scissors" gesture and its position changes, the computer parses the change according to the direction of movement: horizontal movement is parsed as a "horizontal scroll" operation, and non-horizontal movement as a "vertical scroll" operation.
Said step S9 means: if the user's hand holds the "stick" gesture and the position of the hand changes, the computer parses this as a "move" operation; if the hand holds the "cloth" gesture and its position changes, this is also parsed as a "move" operation; a change from the "cloth" gesture to the "stick" gesture is parsed as an "open" operation; a change from the "cloth" gesture to the "stone" gesture is parsed as a "select" operation; a change from the "cloth" gesture to the "stone" gesture with the position held unchanged for a certain time is parsed as a "long press" operation; a change from the "cloth" gesture to the "stone" gesture together with a change of position is parsed as a "drag" operation; while in the "drag" state, a change from the "stone" gesture to the "cloth" gesture is parsed as a "release" operation; if the hand holds the "scissors" gesture and its position changes, the computer parses the change according to the direction of movement: horizontal movement is parsed as a "horizontal scroll" operation, and non-horizontal movement as a "vertical scroll" operation.
CN201310115476.5A | 2013-04-03 | 2013-04-03 | Computer operation method | Active | CN103150024B (en)

Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
CN201310115476.5A (CN103150024B) | 2013-04-03 | 2013-04-03 | Computer operation method

Applications Claiming Priority (1)

Application Number | Priority Date | Filing Date | Title
CN201310115476.5A (CN103150024B) | 2013-04-03 | 2013-04-03 | Computer operation method

Publications (2)

Publication Number | Publication Date
CN103150024A (en) | 2013-06-12
CN103150024B (en) | 2016-05-04

Family

ID=48548150

Family Applications (1)

Application Number | Title | Priority Date | Filing Date
CN201310115476.5A (CN103150024B, Active) | Computer operation method | 2013-04-03 | 2013-04-03

Country Status (1)

Country | Link
CN | CN103150024B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
CN104750406B (en)* | 2013-12-31 | 2019-12-24 | 深圳迈瑞生物医疗电子股份有限公司 | Monitoring equipment and its display interface layout adjustment method and device
CN103914050B (en)* | 2014-04-08 | 2016-08-31 | 北京中亦安图科技股份有限公司 | A computer room equipment monitoring method and system
CN105915987B (en)* | 2016-04-15 | 2018-07-06 | 济南大学 | An implicit interaction method for smart televisions
CN108052202B (en)* | 2017-12-11 | 2021-06-11 | 深圳市星野信息技术有限公司 | 3D interaction method and device, computer equipment and storage medium

Citations (4)

Publication number | Priority date | Publication date | Assignee | Title
CN102081494A (en)* | 2009-11-27 | 2011-06-01 | 实盈光电股份有限公司 | Recognition method for Windows sign-language cursor control
CN102226880A (en)* | 2011-06-03 | 2011-10-26 | 北京新岸线网络技术有限公司 | Somatosensory operation method and system based on virtual reality
CN102236409A (en)* | 2010-04-30 | 2011-11-09 | 宏碁股份有限公司 | Image-based gesture recognition method and system
CN102629155A (en)* | 2011-11-08 | 2012-08-08 | 北京新岸线网络技术有限公司 | Method and device for implementing non-contact operation

Family Cites Families (1)

Publication number | Priority date | Publication date | Assignee | Title
JP5845002B2 (en)* | 2011-06-07 | 2016-01-20 | ソニー株式会社 | Image processing apparatus and method, and program

Patent Citations (4)

Publication number | Priority date | Publication date | Assignee | Title
CN102081494A (en)* | 2009-11-27 | 2011-06-01 | 实盈光电股份有限公司 | Recognition method for Windows sign-language cursor control
CN102236409A (en)* | 2010-04-30 | 2011-11-09 | 宏碁股份有限公司 | Image-based gesture recognition method and system
CN102226880A (en)* | 2011-06-03 | 2011-10-26 | 北京新岸线网络技术有限公司 | Somatosensory operation method and system based on virtual reality
CN102629155A (en)* | 2011-11-08 | 2012-08-08 | 北京新岸线网络技术有限公司 | Method and device for implementing non-contact operation

Also Published As

Publication number | Publication date
CN103150024A (en) | 2013-06-12

Similar Documents

Publication | Title
US11048333B2 (en) | System and method for close-range movement tracking
TWI471756B (en) | Virtual touch method
JP6253204B2 (en) | Classification of user input intent
JP5701440B1 (en) | Method to improve user input operability
US20120105367A1 (en) | Methods of using tactile force sensing for intuitive user interface
KR101930225B1 (en) | Method and apparatus for controlling touch screen operation mode
TWI590147B (en) | Touch modes
US20150100911A1 (en) | Gesture responsive keyboard and interface
CN101315593A (en) | Touch control type mobile operation device and touch control method applied to same
CN102768595B (en) | Method and device for identifying touch operation instructions on a touch screen
CN102819398A (en) | Method for slidingly controlling camera via touch screen device
CN104965669A (en) | Physical button touch method and apparatus and mobile terminal
JP5374564B2 (en) | Drawing apparatus, drawing control method, and drawing control program
CN103425242B (en) | Electronic device and method of operation thereof
CN103150024B (en) | Computer operation method
TWI615747B (en) | System and method for displaying virtual keyboard
CN113209601A (en) | Interface display method and device, electronic equipment and storage medium
CN105204754 (en) | One-handed operation method and device of touch screen
US20150153925A1 (en) | Method for operating gestures and method for calling cursor
US10860120B2 (en) | Method and system to automatically map physical objects into input devices in real time
CN101546231B (en) | Method and device for multi-object orientation touch selection
JP5705393B1 (en) | Method to improve user input operability
CN108132721B (en) | Method for generating drag gesture, touch device and portable electronic equipment
CN103605460B (en) | Gesture recognition method and related terminal
US20150100912A1 (en) | Portable electronic device and method for controlling the same

Legal Events

Code | Title
C06 | Publication
PB01 | Publication
C10 | Entry into substantive examination
SE01 | Entry into force of request for substantive examination
C14 | Grant of patent or utility model
GR01 | Patent grant
TR01 | Transfer of patent right

Effective date of registration: 2020-07-06

Address after:201612 Shanghai Songjiang District Caohejing Development Zone Songjiang high tech Park Xinzhu Road 258, 32 Building 1101 room.

Patentee after:ICKEY (SHANGHAI) INTERNET TECHNOLOGY Co.,Ltd.

Address before:610041 No. 3, No. 23, No. 177, Tianfu Road, Chengdu hi tech Zone, Sichuan, 3

Patentee before:Shi Haixin

