CN102722253A - Static man-machine interactive control method and application thereof - Google Patents

Static man-machine interactive control method and application thereof

Info

Publication number
CN102722253A
Authority
CN
China
Prior art keywords
user
control
fine motion
virtual unit
virtual
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN2012102004046A
Other languages
Chinese (zh)
Inventor
黄得锋
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual
Priority to CN2012102004046A (CN102722253A/en)
Publication of CN102722253A (CN102722253A/en)
Legal status: Pending

Abstract

The invention provides a static human-machine interaction control method in which a user selects and controls the keys of one or more virtual devices through permitted micro-motions. The method comprises the following steps: 1) creating, in a virtual world, virtual devices N1, N2, ..., Nn for the devices M1, M2, ..., Mn to be controlled; 2) linking the devices M1, M2, ..., Mn with the virtual devices N1, N2, ..., Nn; and 3) linking the user's permitted micro-motions with the virtual devices N1, N2, ..., Nn, so that by performing a micro-motion the user can select, lock, and control a virtual device Nm, and thereby start, stop, or otherwise control the corresponding device Mm. Because the user's body never has to leave its position (the user may operate while lying down or seated), various controls can be carried out freely and easily for long periods, and operation need not stop because of physical exhaustion. The method suits a very wide population: anyone with active muscle function can perform the corresponding human-machine interaction through it.

Description

A static human-machine interaction control method and its applications
Technical field
The present invention relates to a control method, and applications thereof, for bionic human-machine interaction in a virtual world.
Background technology
In daily life we always hope for more convenience, yet inevitably run into inconvenience: a hospital patient must leave the bed to operate the various electronic devices the hospital provides, and in deep winter we would like to stay under the quilt while operating various appliances.
Summary of the invention
The purpose of the invention is to provide a human-machine interaction control method, and applications thereof, that solve the above problems.
To aid understanding of the invention, the related terms are explained as follows.
Permitted micro-motion scheme: when the user performs one or a group of qualifying micro-motions, a control command is sent to the computer. "Micro-motion" here refers specifically to small-amplitude user actions, with any joint displacement less than 20 cm, such as a slight movement of the arm or a slight bend of the foot. The conditions of the invention expressly also cover the case where a defined motion issues no command.
Permitted virtual action scheme: the actions or action schemes that the virtual world grants to the user's avatar or to implements within the virtual world; an action scheme includes continuous action combinations, action force, speed, and so on.
Movable joint: not every movement of every user joint controls the corresponding part of the avatar, especially when the avatar is non-human and the user lacks some of its joints; "movable joint" as used in the invention therefore means a movable part granted to the avatar by the virtual world that corresponds to a joint on the user's actual body. When the avatar has more movable parts than the user has actual movable joints, the other methods introduced in the invention are adopted. Moreover, "joint" here is not limited to skeletal junctions: it refers generally to any movable position on the human body, such as any point on the upper arm.
Palm: includes the wrist and all joints of the hand beyond it, such as the fingers.
Sole: includes the ankle and all joints of the foot beyond it, such as the toes.
Indices for estimating motion amplitude: the displacement and direction of a tracked position, the angle of a tracked position between two time points, and so on.
Action amplification: to preserve the user's sense of reality and meet the synchronization requirement during interaction, two rules are set:
1. within the limits of human perception, amplification preferably scales only the user's range of motion and force;
2. beyond the limits of human perception, amplification may also scale the user's speed.
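The two amplification rules can be sketched as a single scaling function. This is a minimal sketch: the function name, the use of an amplitude threshold to stand in for the "limits of human perception", and the linear gain are all illustrative assumptions, since the patent does not quantify the perceptual limit.

```python
def amplify(amplitude, force, speed, gain, perception_limit=None):
    """Sketch of the two action-amplification rules.

    Rule 1: within the limits of human perception, only the range of
    motion and the force are scaled.
    Rule 2: past the perceptual limit, the speed may be scaled as well,
    so the avatar stays synchronized with the user's intent.

    `perception_limit` is an assumed amplitude threshold standing in
    for the (unquantified) perceptual limit.
    """
    out_amplitude = amplitude * gain
    out_force = force * gain
    if perception_limit is not None and out_amplitude > perception_limit:
        out_speed = speed * gain   # rule 2: speed amplified too
    else:
        out_speed = speed          # rule 1: speed left unchanged
    return out_amplitude, out_force, out_speed
```

With a gain of 5, a 2 cm micro-motion maps to a 10 cm avatar motion at unchanged speed, while a 10 cm motion maps to 50 cm and triggers the speed scaling as well.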
To achieve the above purpose, the technical scheme of the invention is as follows:
A static human-machine interaction control method, in which the user selects and controls the keys of one or more virtual devices by performing permitted micro-motions, comprising the following steps:
1) creating, in a virtual world, virtual devices N1, N2, ..., Nn for the devices M1, M2, ..., Mn to be controlled;
2) linking the devices M1, M2, ..., Mn with the virtual devices N1, N2, ..., Nn;
3) linking the user's permitted micro-motions with the virtual devices N1, N2, ..., Nn, so that by performing a micro-motion the user can select, lock, and control a virtual device, and can then start, stop, or otherwise control a device Mm by controlling its virtual device Nm.
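Steps 1) and 2) amount to building one virtual proxy per real device and keeping a link from each proxy back to its device. The sketch below illustrates this under assumed names (`RealDevice`, `VirtualDevice`, `build_virtual_world` are all hypothetical, not from the patent).

```python
class RealDevice:
    """Stand-in for a device M_i to be controlled (e.g. a ward lamp)."""
    def __init__(self, name):
        self.name = name
        self.received = []          # commands relayed from the virtual side

    def send(self, command):
        self.received.append(command)


class VirtualDevice:
    """Virtual proxy N_i displayed in the virtual world (step 1)."""
    def __init__(self, real_device):
        self.real_device = real_device   # link N_i <-> M_i (step 2)

    def press_key(self, key):
        # Step 3 resolves a micro-motion to a key press here, which is
        # forwarded to the linked real device.
        self.real_device.send(key)


def build_virtual_world(devices):
    """Create one virtual proxy per real device and return the mapping."""
    return {d.name: VirtualDevice(d) for d in devices}
```

Pressing a key on the proxy (`world["lamp"].press_key("on")`) is then all it takes to drive the linked real device.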
The selection of a virtual device Nm and its keys is made through eye tracking: by capturing which virtual device Nm and which key the eye is looking at, the selection is confirmed.
The micro-motion that locks the virtual device Nm and its keys is, for example, a blink of the eyelids.
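The select, lock, control sequence described above (gaze to select, a blink to lock, micro-motions to control) can be sketched as a small state machine; the class and method names are illustrative, not from the patent.

```python
class MicroMotionController:
    """Sketch of the select -> lock -> control sequence."""
    def __init__(self):
        self.selected = None   # device the gaze currently rests on
        self.locked = None     # device confirmed by a blink

    def on_gaze(self, device_name):
        # Eye tracking: the virtual device being looked at is selected.
        if self.locked is None:
            self.selected = device_name

    def on_blink(self):
        # A blink locks in the current selection.
        if self.selected is not None:
            self.locked = self.selected

    def on_micro_motion(self, key):
        # Micro-motions only drive a device once it has been locked;
        # the (device, key) pair would be routed to the real device.
        if self.locked is None:
            return None
        return (self.locked, key)
```

Note that a stray micro-motion before the blink does nothing, which is what makes the blink a deliberate confirmation step.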
The permitted micro-motion scheme sets a maximum amplitude M at which the user (or a prop) performs a given micro-motion, and a maximum amplitude N at which the corresponding virtual key is actuated. With Mt the amplitude of the micro-motion at time t and Nt the amplitude at which the virtual key is actuated, the system satisfies: when Mt >= M, Nt = N.
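The amplitude condition can be written as a single mapping. Note the patent only fixes the behavior at and above the cap (Mt >= M gives Nt = N); the linear response below the cap in this sketch is an assumption for illustration.

```python
def virtual_key_amplitude(Mt, M, N):
    """Map the micro-motion amplitude Mt (user cap M) to the amplitude
    Nt (cap N) at which the virtual key is actuated."""
    if Mt >= M:
        return N               # stated condition: saturate at N
    return (Mt / M) * N        # assumed linear response below the cap
```

With M = 20 cm and N = 1.0, a 25 cm motion saturates the key at 1.0, while a 10 cm motion actuates it at 0.5.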
The permitted micro-motion scheme is restricted so that, when the user completes any micro-motion at the maximum amplitude M, the change in angle between any two adjacent body parts of the trunk (excluding the palms and soles) is less than 30 degrees.
A static human-machine interaction control system, in which the user selects and controls virtual devices by performing micro-motions and thereby controls each device to be controlled in reality, characterized in that it comprises: an imaging device that displays the virtual devices; a recognition and capture device for the user's permitted micro-motion scheme; a synchronization control system that synchronizes the user's selection and control actions with the virtual devices; and a conversion system that links the virtual devices with the devices to be controlled.
The recognition and capture device is provided with a selection system and a locking system for the virtual devices.
The benefits of the above technical scheme are:
Because the user's body need not leave its position, the user may lie down or sit throughout operation, and can therefore carry out each control freely and with ease for long periods without being forced to stop from lack of strength. The method therefore suits an extremely wide population: anyone whose body retains active muscle function can perform the corresponding human-machine interaction through the invention.
The invention is further described below through a specific embodiment.
Specific embodiment
Embodiment 1: a static human-machine interaction control method
A static human-machine interaction control method, in which the user selects and controls the keys of one or more virtual devices by performing permitted micro-motions, comprising the following steps:
1) creating, in a virtual world, virtual devices N1, N2, ..., Nn for the devices M1, M2, ..., Mn to be controlled;
2) linking the devices M1, M2, ..., Mn with the virtual devices N1, N2, ..., Nn;
3) linking the user's permitted micro-motions with the virtual devices N1, N2, ..., Nn, so that by performing a micro-motion the user can select, lock, and control a virtual device, and can then start, stop, or otherwise control a device Mm by controlling its virtual device Nm.
The selection of a virtual device Nm and its keys is made through eye tracking: by capturing which virtual device Nm and which key the eye is looking at, the selection is confirmed.
The micro-motion that locks the virtual device Nm and its keys is, for example, a blink of the eyelids.
The permitted micro-motion scheme sets a maximum amplitude M at which the user (or a prop) performs a given micro-motion, and a maximum amplitude N at which the corresponding virtual key is actuated. With Mt the amplitude of the micro-motion at time t and Nt the amplitude at which the virtual key is actuated, the system satisfies: when Mt >= M, Nt = N.
The permitted micro-motion scheme is restricted so that, when the user completes any micro-motion at the maximum amplitude M, the change in angle between any two adjacent body parts of the trunk (excluding the palms and soles) is less than 30 degrees.
A static human-machine interaction control system, in which the user selects and controls virtual devices by performing micro-motions and thereby controls each device to be controlled in reality, characterized in that it comprises: an imaging device that displays the virtual devices; a recognition and capture device for the user's permitted micro-motion scheme; a synchronization control system that synchronizes the user's selection and control actions with the virtual devices; and a conversion system that links the virtual devices with the devices to be controlled.
The recognition and capture device is provided with a selection system and a locking system for the virtual devices.

Claims (7)

CN2012102004046A | Priority: 2011-09-12 | Filed: 2012-06-18 | Static man-machine interactive control method and application thereof | Pending | CN102722253A (en)

Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
CN2012102004046A | 2011-09-12 | 2012-06-18 | Static man-machine interactive control method and application thereof

Applications Claiming Priority (3)

Application Number | Priority Date | Filing Date | Title
CN201110269433.3 | 2011-09-12 | |
CN201110269433 | 2011-09-12 | |
CN2012102004046A | 2011-09-12 | 2012-06-18 | Static man-machine interactive control method and application thereof

Publications (1)

Publication Number | Publication Date
CN102722253A (en) | 2012-10-10

Family

ID=46948048

Family Applications (1)

Application Number | Title | Priority Date | Filing Date
CN2012102004046A | Static man-machine interactive control method and application thereof (Pending, CN102722253A (en)) | 2011-09-12 | 2012-06-18

Country Status (1)

Country | Link
CN | CN102722253A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
CN103092349A * | 2013-01-23 | 2013-05-08 | 宁凯 | Panoramic experience method based on Kinect somatosensory equipment
CN108776541A * | 2014-04-11 | 2018-11-09 | 黄得锋 | A control method of human-computer interaction

Citations (6)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
US5913727A * | 1995-06-02 | 1999-06-22 | Ahdoot, Ned | Interactive movement and contact simulation game
CN1889016A * | 2006-07-25 | 2007-01-03 | 周辰 | Eye-to-computer cursor automatic positioning controlling method and system
US7205979B2 * | 1987-03-17 | 2007-04-17 | Sun Microsystems, Inc. | Computer data entry and manipulation apparatus and method
CN101890237A * | 2010-07-16 | 2010-11-24 | 叶尔肯·拜山 | Game controller and control method thereof
CN102047201A * | 2008-05-26 | 2011-05-04 | 微软国际控股私有有限公司 | Controlling virtual reality
CN102129292A * | 2010-01-15 | 2011-07-20 | 微软公司 | Recognizing user intent in motion capture system



Similar Documents

Publication | Title
Guidali et al. | A robotic system to train activities of daily living in a virtual environment
Schweighofer et al. | Task-oriented rehabilitation robotics
Kim et al. | Continuous shared control for stabilizing reaching and grasping with brain-machine interfaces
Patton et al. | Robotics and virtual reality: a perfect marriage for motor control research and rehabilitation
Luo et al. | Combined perception, control, and learning for teleoperation: key technologies, applications, and challenges
Carlson et al. | The birth of the brain-controlled wheelchair
CN102541260A | Human-machine interaction control method and application thereof
Pang et al. | Study on the sEMG driven upper limb exoskeleton rehabilitation device in bilateral rehabilitation
Gasser et al. | Design and performance characterization of a hand orthosis prototype to aid activities of daily living in a post-stroke population
Pacchierotti et al. | Improving transparency in passive teleoperation by combining cutaneous and kinesthetic force feedback
Hasegawa et al. | Bilateral control of elbow and shoulder joints using functional electrical stimulation between humans and robots
Zhu et al. | Face-computer interface (FCI): Intent recognition based on facial electromyography (fEMG) and online human-computer interface with audiovisual feedback
Padmanabha et al. | Hat: Head-worn assistive teleoperation of mobile manipulators
Ueki et al. | Development of virtual reality exercise of hand motion assist robot for rehabilitation therapy by patient self-motion control
CN102722253A | Static man-machine interactive control method and application thereof
Lavoie et al. | Comparing eye–hand coordination between controller-mediated virtual reality, and a real-world object interaction task
Velasco-Alvarez et al. | BCI-based navigation in virtual and real environments
Hoshyarmanesh et al. | Evaluation of haptic devices and end-users: novel performance metrics in tele-robotic microsurgery
Ma et al. | Sensing and force-feedback exoskeleton robotic (SAFER) glove mechanism for hand rehabilitation
Kim et al. | Developments in brain–machine interfaces from the perspective of robotics
CN204428386U | An interactive device for realizing hand rehabilitation training
Villani et al. | Natural interaction based on affective robotics for multi-robot systems
Geng et al. | Motor prediction in brain-computer interfaces for controlling mobile robots
Beckerle | Virtual Hand Experience
Naganuma et al. | Promotion of rehabilitation practice for elderly people using robotic pets

Legal Events

Code | Title
C06 | Publication
PB01 | Publication
C10 | Entry into substantive examination
SE01 | Entry into force of request for substantive examination
RJ01 | Rejection of invention patent application after publication (application publication date: 2012-10-10)
