CN207752446U - A kind of gesture identification interaction systems based on Leap Motion equipment - Google Patents

A kind of gesture identification interaction systems based on Leap Motion equipment

Info

Publication number
CN207752446U
Authority
CN
China
Prior art keywords
gesture
user
finger
microcontroller
hand
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201820650784.6U
Other languages
Chinese (zh)
Inventor
林潼
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual
Priority to CN201820650784.6U
Application granted
Publication of CN207752446U
Active (current legal status)
Anticipated expiration


Abstract

The utility model provides a gesture recognition interaction system based on a Leap Motion device, comprising a Leap Motion device, a microcontroller, and a bionic manipulator that are electrically connected in sequence. The Leap Motion device collects the user's hand motion and recognizes the user's gesture from that motion. The microcontroller matches the gesture recognized by the Leap Motion device against preset gestures. When the user's gesture matches a preset gesture, the bionic manipulator receives a control instruction from the microcontroller and performs the corresponding action, realizing human-computer interaction. The system allows autism patients who are reluctant to communicate with people to understand and become familiar with gestures commonly used in daily life, and to learn to imitate the corresponding interactive actions. It increases the interest of human-computer interaction and helps autism patients overcome social interaction disorders.

Description

A kind of gesture identification interaction systems based on Leap Motion equipment
Technical field
The utility model relates to the technical field of human-computer interaction, and more particularly to a gesture recognition interaction system based on a Leap Motion device.
Background technology
There is a group of children in this world who go largely unnoticed. They cannot communicate with ordinary people and are immersed in their own small world, usually unable to understand the gestures of others. Like stars in the night sky, they flicker alone in the darkness; these "children of the stars" are children with autism.
Autism, also known as autistic disorder, is the representative disease of the pervasive developmental disorders (PDD). Its onset is in infancy, and it is mainly characterized by varying degrees of speech development disorder, social interaction disorder, narrow interests, and stereotyped behavior. Gestures are a very important means of communication in people's daily exchanges, so how to help autism patients who are reluctant to communicate with people understand and become familiar with commonly used gestures and interact with others is a problem that urgently needs to be solved.
Utility model content
Based on this, it is necessary to provide a gesture recognition interaction system based on a Leap Motion device, comprising a Leap Motion device, a microcontroller, and a bionic manipulator that are electrically connected in sequence;
The Leap Motion device is configured to collect the user's hand motion and, from that motion, obtain an overall image of the user's hand; the three-dimensional spatial coordinates of the palm center and of each finger joint bone; the bending arc of the palm and fingers; the direction and speed of palm movement or rotation; the pointing direction and normal vector of the palm; and information on whether the fingers are spread apart or brought together. It analyzes the collected information to recognize the user's gesture;
The microcontroller is configured to match the user's gesture recognized by the Leap Motion device against preset gestures;
The preset gestures include: waving, shaking hands, fist bumping, and showing the numbers one, two, three, four, five, and zero with the fingers;
The bionic manipulator is configured to receive, when the user's gesture matches a preset gesture, the control instruction sent by the microcontroller and perform the corresponding action;
The corresponding actions performed include the following (a minimal mapping sketch follows this list):
When the user's gesture matches waving, the bionic manipulator waves to the user;
When the user's gesture matches shaking hands, the bionic manipulator shakes hands with the user;
When the user's gesture matches fist bumping, the bionic manipulator bumps fists with the user;
When the user's gesture matches showing the number one, the fingers of the bionic manipulator show the number four, playing a game with the user of adding up to five;
When the user's gesture matches showing the number two, the fingers of the bionic manipulator show the number three, playing a game with the user of adding up to five;
When the user's gesture matches showing the number three, the fingers of the bionic manipulator show the number two, playing a game with the user of adding up to five;
When the user's gesture matches showing the number four, the fingers of the bionic manipulator show the number one, playing a game with the user of adding up to five;
When the user's gesture matches showing the number five, the fingers of the bionic manipulator show the number zero, playing a game with the user of adding up to five;
When the user's gesture matches showing the number zero, the fingers of the bionic manipulator show the number five, playing a game with the user of adding up to five.
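As an illustration only (not part of the utility model itself), the mapping above amounts to a small lookup table plus a complement-to-five rule; the gesture and action names in this sketch are hypothetical.

```python
# Hypothetical sketch: map a recognized preset gesture to the manipulator's response.
PRESET_ACTIONS = {
    "wave": "wave_back",
    "handshake": "shake_hands",
    "fist_bump": "bump_fist",
}

def complement_to_five(number_gesture: int) -> int:
    """For the 'add up to five' game, the manipulator shows 5 minus the user's number."""
    return 5 - number_gesture

def choose_action(gesture: str):
    if gesture.startswith("number_"):            # e.g. "number_3"
        shown = int(gesture.split("_")[1])
        return f"show_number_{complement_to_five(shown)}"
    return PRESET_ACTIONS.get(gesture)           # None if the gesture is not preset

print(choose_action("number_2"))  # -> show_number_3
```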
In one of the embodiments, the bionic manipulator receives the control instruction sent by the microcontroller, adjusts the state of its servo group accordingly, and completes the specified gesture action.
In one of the embodiments, the system further comprises a gesture update module. The gesture update module is electrically connected to the microcontroller and is used to update the preset gestures and to configure the corresponding actions that the bionic manipulator makes during interaction.
The gesture recognition interaction system based on a Leap Motion device provided by the utility model comprises a Leap Motion device, a microcontroller, and a bionic manipulator that are electrically connected in sequence. The Leap Motion device collects the user's hand motion and, from that motion, obtains an overall image of the user's hand, the three-dimensional spatial coordinates of the palm center and of each finger joint bone, the bending arc of the palm and fingers, the direction and speed of palm movement or rotation, the pointing direction and normal vector of the palm, and information on whether the fingers are spread apart or brought together; it analyzes the collected information to recognize the user's gesture. The microcontroller matches the recognized gesture against the preset gestures, which include waving, shaking hands, fist bumping, and showing the numbers one, two, three, four, five, and zero with the fingers. When the user's gesture matches a preset gesture, the bionic manipulator receives the control instruction sent by the microcontroller and performs the corresponding action: it waves back when the user waves, shakes hands when the user offers a handshake, bumps fists when the user offers a fist bump, and, when the user shows a number from zero to five with the fingers, shows the complementary number so that the two add up to five, playing an "add up to five" game with the user. By capturing, recognizing, and matching preset gestures and controlling the bionic manipulator to make the corresponding gesture actions, the utility model allows autism patients who are reluctant to communicate with people to understand and become familiar with gestures commonly used in daily life and to learn to imitate the corresponding interactive actions. It increases the interest of human-computer interaction, shows care for autism patients, and helps them overcome social interaction disorders. In addition, the utility model has the advantage of recognizing user gestures quickly and accurately.
Description of the drawings
Fig. 1 is a flow chart of a gesture recognition interaction system based on a Leap Motion device in one embodiment;
Fig. 2 is a system block diagram of gesture recognition interaction based on a Leap Motion device in one embodiment;
Fig. 3 is a schematic diagram of the three-dimensional spatial coordinate system of the Leap Motion device in one embodiment;
Fig. 4 is a schematic diagram of the overall hand image and finger directions obtained by the Leap Motion device in one embodiment;
Fig. 5 is a schematic diagram of the overall image of the user's hand and the pointing direction and normal vector of the palm obtained by the Leap Motion device in one embodiment;
Fig. 6 is a three-dimensional schematic view of a recognized user gesture in one embodiment;
Fig. 7 is a schematic diagram of the electric signal control of the bionic manipulator in one embodiment;
Fig. 8 is a schematic diagram of the action in which the fingers of the controlled bionic manipulator show the number five in one embodiment;
Fig. 9 is a schematic diagram of the action in which the controlled bionic manipulator shakes hands with the user in one embodiment.
Specific implementation mode
In order to make the purpose, technical solutions, and advantages of the utility model clearer, the utility model is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are only used to explain the utility model and are not intended to limit it.
Unless the context clearly requires otherwise, the elements and components described in the utility model may exist in either singular or plural form, and the utility model is not limited in this respect. Although the steps in the utility model are labeled, the labels are not intended to limit the order of the steps; unless the order of steps is expressly stated or the execution of a step depends on other steps, the relative order of the steps is adjustable. It should be understood that the term "and/or" as used herein relates to and covers any and all possible combinations of one or more of the associated listed items.
In one embodiment, as shown in Fig. 2, a gesture recognition interaction system based on a Leap Motion device comprises a Leap Motion device 100, a microcontroller 200, and a bionic manipulator 300 that are electrically connected in sequence. The Leap Motion device collects the user's hand motion and, from that motion, obtains an overall image of the user's hand, the three-dimensional spatial coordinates of the palm center and of each finger joint bone, the bending arc of the palm and fingers, the direction and speed of palm movement or rotation, the pointing direction and normal vector of the palm, and information on whether the fingers are spread apart or brought together; it analyzes the collected information to recognize the user's gesture.
As shown in Fig. 3, Fig. 4, Fig. 5, and Fig. 6, the Leap Motion device captures pictures of the user's hand motion from different angles with its two built-in cameras and reconstructs the three-dimensional motion information of the palm in the real world. The detection range is roughly between 25 millimeters and 600 millimeters above the sensor, and the detection volume is approximately a rectangular pyramid. The Leap Motion device establishes a rectangular coordinate system whose origin is the center of the device: the X axis is parallel to the device and points to the right of the screen, the Y axis points upward, and the Z axis points away from the screen.
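For illustration only, a sketch (not taken from the patent or the Leap Motion SDK) of how the detection range stated above could be checked on a palm position expressed in the device's coordinate system; the data class and thresholds simply reuse the numbers quoted in this description.

```python
from dataclasses import dataclass

@dataclass
class PalmPosition:
    # Device-centered frame as described above: X to the right, Y up, Z away from the screen.
    x: float  # millimeters
    y: float  # millimeters (height above the sensor)
    z: float  # millimeters

def in_reliable_range(palm: PalmPosition,
                      min_height_mm: float = 30.0,
                      max_height_mm: float = 550.0) -> bool:
    """True if the palm height is inside the range where, per the description,
    gesture capture works well (best results between 35 mm and 500 mm)."""
    return min_height_mm <= palm.y <= max_height_mm

print(in_reliable_range(PalmPosition(x=10.0, y=200.0, z=-40.0)))  # -> True
```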
When the Leap Motion device detects the user's hand, finger, or gesture, the device assigns it (the user's hand, finger, or gesture) a unique ID number as a label; as long as this entity does not leave the visible area of the device, this ID number remains unchanged. In each frame, the Leap Motion device obtains the overall image of the user's hand, the three-dimensional spatial coordinates of the palm center and of each finger joint bone, the bending arc of the palm and fingers, the pointing direction and normal vector of the palm, and information on whether the fingers are spread apart or brought together; by comparing the current frame with earlier frames it obtains frame-to-frame motion, from which it derives the direction and speed of palm movement or rotation and the opening and closing motion of the fingers. By comprehensively analyzing the collected information it judges and recognizes the user's gesture. It has been found through a large number of experiments that when the distance between the user's hand and the Leap Motion device is between 30 millimeters and 550 millimeters, the Leap Motion device captures the user's gesture well; in one of the embodiments, when the distance between the user's hand and the Leap Motion device is between 35 millimeters and 500 millimeters, the Leap Motion device captures the user's gesture best.
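A minimal sketch, assuming a generic frame structure rather than the actual Leap Motion SDK, of how persistent IDs and frame-to-frame comparison could yield a palm velocity estimate:

```python
from typing import Dict, Tuple

Vec3 = Tuple[float, float, float]

def palm_velocities(prev_frame: Dict[int, Vec3],
                    curr_frame: Dict[int, Vec3],
                    dt: float) -> Dict[int, Vec3]:
    """Each frame maps a persistent hand ID to its palm position (mm).
    Hands present in both frames get a velocity estimate in mm/s."""
    velocities = {}
    for hand_id, (x1, y1, z1) in curr_frame.items():
        if hand_id in prev_frame:                 # same ID -> same tracked hand
            x0, y0, z0 = prev_frame[hand_id]
            velocities[hand_id] = ((x1 - x0) / dt, (y1 - y0) / dt, (z1 - z0) / dt)
    return velocities

prev = {7: (0.0, 200.0, 0.0)}
curr = {7: (5.0, 202.0, -1.0)}
print(palm_velocities(prev, curr, dt=1 / 60))  # hand 7 moved roughly 300 mm/s along X
```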
In one of the embodiments, the Leap Motion device measures the distance from each fingertip to the palm center to calculate finger length information, and combines the finger length information with the finger coordinate information to recognize the gesture of the human hand. In one of the embodiments, the Leap Motion device recognizes gestures by comparing and analyzing the changes of the three-dimensional spatial coordinates of the palm center and of each finger joint bone over several consecutive frames. The solution provided by the utility model can recognize not only static user gestures but also dynamic user gestures, with fast recognition speed and high accuracy.
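As a rough illustration under the same assumptions, extended fingers could be counted from fingertip-to-palm-center distances; the threshold below is a made-up value, not one from the patent.

```python
import math
from typing import List, Tuple

Vec3 = Tuple[float, float, float]

def distance(a: Vec3, b: Vec3) -> float:
    return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))

def count_extended_fingers(palm_center: Vec3,
                           fingertips: List[Vec3],
                           extended_threshold_mm: float = 60.0) -> int:
    """A finger is treated as extended when its tip is far enough from the palm
    center; the count distinguishes number gestures such as 'two' vs 'five'."""
    return sum(1 for tip in fingertips if distance(tip, palm_center) > extended_threshold_mm)

palm = (0.0, 0.0, 0.0)
tips = [(0.0, 80.0, 0.0), (0.0, 75.0, 0.0), (0.0, 30.0, 0.0), (0.0, 25.0, 0.0), (0.0, 20.0, 0.0)]
print(count_extended_fingers(palm, tips))  # -> 2, e.g. the "number two" gesture
```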
In one of the embodiments, the microcontroller matches the user's gesture recognized by the Leap Motion device against the preset gestures;
The recognized user gesture is matched against the gestures preset in the system; the preset gestures include: waving, shaking hands, fist bumping, and showing the numbers one, two, three, four, five, and zero with the fingers;
In one of the embodiments, using the method for template matches and maximum probability and Nearest neighbor rule by user'sGesture is matched with preset gesture in system, specifically includes the direction stretched out towards, hand in the matching centre of the palm, the shape that finger stretches outState and quantity, adjacent Interphalangeal angle, the direction of the nearly articulations digitorum manus of index finger, thumb and index finger tip distance, index finger and middle fingerFinger tip distance, nameless finger tip distance with middle finger, nameless finger tip distance with little finger of toe, each finger tip and the centre of the palmOne or several in the direction of distance, palm movement or rotation.The present invention can be identified fast and accurately and default gestureThe user gesture to match.
The bionic manipulator is configured to receive, when the user's gesture matches a preset gesture, the control instruction sent by the microcontroller and perform the corresponding action;
The corresponding actions performed include:
When the user's gesture matches waving, the bionic manipulator waves to the user;
When the user's gesture matches shaking hands, the bionic manipulator shakes hands with the user;
When the user's gesture matches fist bumping, the bionic manipulator bumps fists with the user;
When the user's gesture matches showing the number one, the fingers of the bionic manipulator show the number four, playing a game with the user of adding up to five;
When the user's gesture matches showing the number two, the fingers of the bionic manipulator show the number three, playing a game with the user of adding up to five;
When the user's gesture matches showing the number three, the fingers of the bionic manipulator show the number two, playing a game with the user of adding up to five;
When the user's gesture matches showing the number four, the fingers of the bionic manipulator show the number one, playing a game with the user of adding up to five;
When the user's gesture matches showing the number five, the fingers of the bionic manipulator show the number zero, playing a game with the user of adding up to five;
When the user's gesture matches showing the number zero, the fingers of the bionic manipulator show the number five, playing a game with the user of adding up to five.
In one of the embodiments, the bionic manipulator receives the control instruction sent by the microcontroller, adjusts the state of its servo group accordingly, and completes the specified gesture action.
By capturing, recognizing, and matching preset gestures and controlling the bionic manipulator to make the corresponding gesture actions for human-computer interaction, the utility model allows autism patients who are reluctant to communicate with people to understand and become familiar with gestures commonly used in daily life and to learn to imitate the corresponding interactive actions. It increases the interest of human-computer interaction, shows care for autism patients, and helps them overcome social interaction disorders. In addition, the utility model has the advantage of recognizing user gestures quickly and accurately.
In one of the embodiments, as shown in Fig. 7, the bionic manipulator 300 comprises a controller and multiple servos. The servos are each electrically connected to the controller, and the controller is electrically connected to the microcontroller 200. The controller adjusts the state of the servo group according to the control instruction received from the microcontroller, thereby completing the specified gesture action. In one of the embodiments, the servo group of the bionic manipulator comprises 15 servos: 2 servos at the two joints of the thumb, 3 servos at the three joints of each of the other four fingers, and 1 servo at the wrist joint. By adjusting the states of the 15 servos, the bionic manipulator controls the bending or stretching of each finger joint and the wrist joint, so that it can make a variety of actions imitating human hand gestures. Of course, as needed, the bionic manipulator may also include more servos, so that the gesture actions it makes are finer and more lifelike.
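A sketch, under assumed names and angle values (nothing here is taken from the patent's actual firmware), of how a 15-servo group mapped to the joints described above might be addressed:

```python
# Hypothetical joint map: servo index -> joint it drives (15 servos in total).
SERVO_MAP = {
    0: "thumb_joint_1", 1: "thumb_joint_2",
    2: "index_joint_1", 3: "index_joint_2", 4: "index_joint_3",
    5: "middle_joint_1", 6: "middle_joint_2", 7: "middle_joint_3",
    8: "ring_joint_1", 9: "ring_joint_2", 10: "ring_joint_3",
    11: "little_joint_1", 12: "little_joint_2", 13: "little_joint_3",
    14: "wrist",
}

def pose_for_number_five() -> dict:
    """All finger joints stretched (0 degrees of bend), wrist neutral."""
    return {idx: 0 for idx in SERVO_MAP}

def pose_for_fist() -> dict:
    """All finger joints fully bent (90 degrees), wrist neutral."""
    return {idx: (90 if SERVO_MAP[idx] != "wrist" else 0) for idx in SERVO_MAP}

def apply_pose(pose: dict) -> None:
    # Placeholder for the hand controller's actual servo drive call.
    for idx, angle in sorted(pose.items()):
        print(f"servo {idx:2d} ({SERVO_MAP[idx]}): {angle} deg")

apply_pose(pose_for_number_five())
```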
In one of the embodiments, as shown in Fig. 8, when the user gesture captured and recognized by the Leap Motion device matches the preset gesture of showing the number zero with the fingers, the microcontroller sends a control instruction to the controller of the bionic manipulator, which adjusts the state of the servo group so that the fingers of the bionic manipulator show the number five, playing the "add up to five" game with the user. As shown in Fig. 9, when the user gesture captured and recognized by the Leap Motion device matches the preset handshake gesture, the microcontroller sends a control instruction to the controller of the bionic manipulator, which adjusts the state of the servo group so that the bionic manipulator performs a handshake action and shakes hands with the user.
In one of the embodiments, the system of the utility model further comprises a gesture update module. The gesture update module is electrically connected to the microcontroller and is used to update the preset gestures and to configure the corresponding actions that the bionic manipulator makes during interaction.
According to the user's needs, the preset gesture library can be customized through the gesture update module. For example, the finger-guessing (rock-paper-scissors) gestures of scissors, rock, and cloth can be added, and the bionic manipulator can be configured either to make the same gesture or to make the gesture that wins the game. In one of the embodiments, when the gesture of the user's hand is recognized as scissors, the bionic manipulator is controlled to make rock; when the gesture is recognized as rock, the bionic manipulator is controlled to make cloth; when the gesture is recognized as cloth, the bionic manipulator is controlled to make scissors, interacting with the user and increasing the fun of human-computer interaction. Of course, the user can also delete gestures from the preset gesture library as needed; for example, when the user is already familiar with and has mastered a gesture in the preset gesture library, it can be deleted from the library. In one of the embodiments, when the user is very familiar with and has mastered the handshake gesture, the handshake gesture is deleted from the preset gesture library, freeing the library's storage space for other gestures that still need to be learned.
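A brief sketch, with invented structure, of how such a customizable gesture library and the winning-response rule for finger-guessing could be represented:

```python
# Hypothetical editable gesture library: gesture name -> manipulator response.
gesture_library = {
    "wave": "wave_back",
    "handshake": "shake_hands",
}

WINNING_MOVE = {"scissors": "rock", "rock": "cloth", "cloth": "scissors"}

def add_finger_guessing(library: dict, play_to_win: bool = True) -> None:
    """Register rock-paper-scissors gestures; respond with the winning move
    or simply mirror the user's gesture."""
    for user_gesture, winner in WINNING_MOVE.items():
        library[user_gesture] = winner if play_to_win else user_gesture

def remove_mastered(library: dict, gesture: str) -> None:
    """Delete a gesture the user has already mastered to free library space."""
    library.pop(gesture, None)

add_finger_guessing(gesture_library)
remove_mastered(gesture_library, "handshake")
print(gesture_library)  # {'wave': 'wave_back', 'scissors': 'rock', 'rock': 'cloth', 'cloth': 'scissors'}
```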
By capturing, recognizing, and matching preset gestures and controlling the bionic manipulator to make the corresponding gesture actions for human-computer interaction, the utility model allows autism patients who are reluctant to communicate with people to understand and become familiar with gestures commonly used in daily life and to learn to imitate the corresponding interactive actions. It increases the interest of human-computer interaction, shows care for autism patients, and helps them overcome social interaction disorders. In addition, the utility model has the advantage of recognizing user gestures quickly and accurately.
In one embodiment, as shown in Fig. 1, the working flow of a gesture recognition interaction system based on a Leap Motion device includes the following steps (a minimal end-to-end sketch follows step S49):
Step S10: collect the user's hand motion through the Leap Motion device, and from that motion obtain an overall image of the user's hand, the three-dimensional spatial coordinates of the palm center and of each finger joint bone, the bending arc of the palm and fingers, the direction and speed of palm movement or rotation, the pointing direction and normal vector of the palm, and information on whether the fingers are spread apart or brought together;
Step S20: recognize the user's gesture by analyzing the information collected above;
As shown in Fig. 3, Fig. 4, Fig. 5, and Fig. 6, the Leap Motion device captures pictures of the user's hand motion from different angles with its two built-in cameras and reconstructs the three-dimensional motion information of the palm in the real world. The detection range is roughly between 25 millimeters and 600 millimeters above the sensor, and the detection volume is approximately a rectangular pyramid. The Leap Motion sensor establishes a rectangular coordinate system whose origin is the center of the sensor: the X axis is parallel to the sensor and points to the right of the screen, the Y axis points upward, and the Z axis points away from the screen.
When the Leap Motion device detects the user's hand, finger, or gesture, the device assigns it (the user's hand, finger, or gesture) a unique ID number as a label; as long as this entity does not leave the visible area of the device, this ID number remains unchanged. In each frame, the Leap Motion device obtains the overall image of the user's hand, the three-dimensional spatial coordinates of the palm center and of each finger joint bone, the bending arc of the palm and fingers, the pointing direction and normal vector of the palm, and information on whether the fingers are spread apart or brought together; by comparing the current frame with earlier frames it obtains frame-to-frame motion, from which it derives the direction and speed of palm movement or rotation and the opening and closing motion of the fingers. By comprehensively analyzing the collected information it judges and recognizes the user's gesture. It has been found through a large number of experiments that when the distance between the user's hand and the Leap Motion device is between 30 millimeters and 550 millimeters, the Leap Motion device captures the user's gesture well; in one of the embodiments, when the distance between the user's hand and the Leap Motion device is between 35 millimeters and 500 millimeters, the Leap Motion device captures the user's gesture best.
In one of the embodiments, the Leap Motion device measures the distance from each fingertip to the palm center to calculate finger length information, and combines the finger length information with the finger coordinate information to recognize the gesture of the human hand. In one of the embodiments, the Leap Motion device recognizes gestures by comparing and analyzing the changes of the three-dimensional spatial coordinates of the palm center and of each finger joint bone over several consecutive frames.
Step S30: match the user's gesture against the preset gestures;
The recognized user gesture is matched against the gestures preset in the system; the preset gestures include: waving, shaking hands, fist bumping, and showing the numbers one, two, three, four, five, and zero with the fingers;
In one of the embodiments, the user's gesture is matched against the preset gestures using template matching together with a maximum-probability and nearest-neighbor rule. Specifically, the matching uses one or several of the following features: the orientation of the palm center, the direction in which the hand is extended, the states and number of extended fingers, the angles between adjacent fingers, the direction of the proximal joint of the index finger, the distance between the thumb tip and the index fingertip, the distance between the index fingertip and the middle fingertip, the distance between the ring fingertip and the middle fingertip, the distance between the ring fingertip and the little fingertip, the distance from each fingertip to the palm center, and the direction of palm movement or rotation. The utility model can quickly and accurately recognize user gestures that match the preset gestures.
Step S41: when the user's gesture matches waving, control the bionic manipulator to wave to the user;
Step S42: when the user's gesture matches shaking hands, control the bionic manipulator to shake hands with the user;
Step S43: when the user's gesture matches fist bumping, control the bionic manipulator to bump fists with the user;
Step S44: when the user's gesture matches showing the number one with the fingers, control the fingers of the bionic manipulator to show the number four, playing a game with the user of adding up to five;
Step S45: when the user's gesture matches showing the number two with the fingers, control the fingers of the bionic manipulator to show the number three, playing a game with the user of adding up to five;
Step S46: when the user's gesture matches showing the number three with the fingers, control the fingers of the bionic manipulator to show the number two, playing a game with the user of adding up to five;
Step S47: when the user's gesture matches showing the number four with the fingers, control the fingers of the bionic manipulator to show the number one, playing a game with the user of adding up to five;
Step S48: when the user's gesture matches showing the number five with the fingers, control the fingers of the bionic manipulator to show the number zero, playing a game with the user of adding up to five;
Step S49: when the user's gesture matches showing the number zero with the fingers, control the fingers of the bionic manipulator to show the number five, playing a game with the user of adding up to five.
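Purely as an illustration of the flow of steps S10 through S49 (all function names below are hypothetical stand-ins, not the patent's implementation):

```python
from typing import Optional

# Hypothetical stand-ins for the real device and manipulator interfaces.
def capture_gesture() -> Optional[str]:
    """S10 and S20: pretend the Leap Motion device recognized a gesture."""
    return "number_2"

def match_preset(gesture: Optional[str]) -> Optional[str]:
    """S30: keep only gestures that are in the preset library."""
    preset = {"wave", "handshake", "fist_bump",
              "number_0", "number_1", "number_2", "number_3", "number_4", "number_5"}
    return gesture if gesture in preset else None

def response_action(gesture: str) -> str:
    """S41-S49: waving, handshake, and fist bump are mirrored; numbers get the complement to five."""
    if gesture.startswith("number_"):
        return f"show_number_{5 - int(gesture.split('_')[1])}"
    return {"wave": "wave_back", "handshake": "shake_hands", "fist_bump": "bump_fist"}[gesture]

def send_to_manipulator(action: str) -> None:
    print(f"manipulator executes: {action}")   # stands in for the servo-group command

gesture = match_preset(capture_gesture())
if gesture is not None:
    send_to_manipulator(response_action(gesture))  # -> manipulator executes: show_number_3
```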
By capturing, recognizing, and matching preset gestures and controlling the bionic manipulator to make the corresponding gesture actions for human-computer interaction, the utility model allows autism patients who are reluctant to communicate with people to understand and become familiar with gestures commonly used in daily life and to learn to imitate the corresponding interactive actions. It increases the interest of human-computer interaction, shows care for autism patients, and helps them overcome social interaction disorders. In addition, the utility model has the advantage of recognizing user gestures quickly and accurately.
Since autism patients have relatively severe social interaction disorders, a simple but likable and interesting "toy" is extremely important for them: such a "toy" can help them learn and become familiar with gestures commonly used in daily life and interact with it, helping autism patients overcome obstacles in interpersonal communication. The utility model provides autism patients with exactly such a system that serves as this kind of "toy".
The above embodiments represent only several embodiments of the utility model, and their description is relatively specific and detailed, but they should not be construed as limiting the scope of the utility model patent. It should be pointed out that, for those of ordinary skill in the art, various modifications and improvements can be made without departing from the concept of the utility model, and all of these fall within the scope of protection of the utility model. Therefore, the scope of protection of the utility model patent shall be determined by the appended claims.

Claims (2)

CN201820650784.6U · Priority date 2018-05-03 · Filing date 2018-05-03 · A kind of gesture identification interaction systems based on Leap Motion equipment · Active · CN207752446U (en)

Priority Applications (1)

Application Number: CN201820650784.6U · Publication: CN207752446U (en) · Priority Date: 2018-05-03 · Filing Date: 2018-05-03 · Title: A kind of gesture identification interaction systems based on Leap Motion equipment

Applications Claiming Priority (1)

Application Number: CN201820650784.6U · Publication: CN207752446U (en) · Priority Date: 2018-05-03 · Filing Date: 2018-05-03 · Title: A kind of gesture identification interaction systems based on Leap Motion equipment

Publications (1)

Publication Number: CN207752446U (en) · Publication Date: 2018-08-21

Family

ID=63154880

Family Applications (1)

Application Number: CN201820650784.6U · Status: Active · Publication: CN207752446U (en) · Priority Date: 2018-05-03 · Filing Date: 2018-05-03 · Title: A kind of gesture identification interaction systems based on Leap Motion equipment

Country Status (1)

Country: CN · Link: CN207752446U (en)


Cited By (8)

* Cited by examiner, † Cited by third party
Publication number · Priority date · Publication date · Assignee · Title
CN109500815A (en)*2018-12-032019-03-22深圳市越疆科技有限公司Robot for the judgement study of preposition posture
CN109670416A (en)*2018-12-032019-04-23深圳市越疆科技有限公司Learning method, learning system and storage medium based on the judgement of preposition posture
CN109670416B (en)*2018-12-032023-04-28深圳市越疆科技有限公司 Learning method, learning system and storage medium based on pre-posture judgment
CN109500815B (en)*2018-12-032023-06-02日照市越疆智能科技有限公司Robot for front gesture judgment learning
CN109717878A (en)*2018-12-282019-05-07上海交通大学A kind of detection system and application method paying attention to diagnosing normal form jointly for autism
CN111409068A (en)*2020-03-132020-07-14兰州大学 Bionic manipulator control system and bionic manipulator
CN115844336A (en)*2023-02-072023-03-28之江实验室Automatic real-time monitoring system and device for epileptic seizure
CN117301059A (en)*2023-10-122023-12-29河海大学Teleoperation system, teleoperation method and storage medium for mobile robot


Legal Events

Code GR01: Patent grant
