CN107247466A - Robot head gesture control method and system - Google Patents

Robot head gesture control method and system

Info

Publication number
CN107247466A
CN107247466A (application CN201710439682.XA)
Authority
CN
China
Prior art keywords
gesture
gesture shape
recognition result
robot
shape recognition
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201710439682.XA
Other languages
Chinese (zh)
Other versions
CN107247466B (en)
Inventor
Huang Yi (黄毅)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
RESEARCH INSTITUTE OF BIT IN ZHONGSHAN
Zhongshan Changfeng Intelligent Automation Equipment Research Institute Co ltd
Original Assignee
Zhongshan Changfeng Intelligent Automation Equipment Research Institute Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhongshan Changfeng Intelligent Automation Equipment Research Institute Co Ltd
Priority to CN201710439682.XA (patent CN107247466B)
Publication of CN107247466A
Application granted
Publication of CN107247466B
Legal status: Expired - Fee Related
Anticipated expiration


Abstract

The invention discloses a robot head gesture control method and system. The method comprises: recognizing the gesture shape of a subject's hand to obtain a gesture shape recognition result; when the result is a first gesture shape, setting a tracking flag bit in the robot and triggering the robot to enter a tracking preparation state; when the result is a second gesture shape and the tracking flag bit is set, having the robot track the subject's hand motion and rotate its head accordingly; and when the result is a third gesture shape, clearing the tracking flag bit, stopping the head rotation, and fixing the robot head at its stopping position. By controlling the robot through different gestures of the subject, the method and system can faithfully simulate the actual interaction between doctor and patient, providing doctors with an effective practice platform for traditional Chinese medicine rotation manipulation.

Description

Robot head gesture control method and system
Technical field
The present invention relates to the field of machine binocular vision and the field of medical equipment, and in particular to a robot head gesture control method and system.
Background technology
With the development of society, more and more people suffer from cervical spondylosis. The treatment currently used is mainly conservative, among which traditional Chinese medicine (TCM) rotation manipulation is widely adopted as a simple and fast-acting therapy. TCM rotation manipulation can be divided into four steps: self-positioning, preloading, quick action, and recovery. Taking right-side rotation as an example: the patient sits upright with the neck naturally relaxed; the doctor loosens the neck soft tissue for 5-10 min with pressing, kneading, and rolling techniques; the patient actively rotates the head horizontally to the extreme angle, then rotates again after maximum flexion until a fixed feeling is reached; the doctor supports the patient's lower jaw with the elbow and gently pulls upward for 3-5 s; the patient relaxes the muscles and the doctor quickly lifts the elbow upward with a short burst of force; one or more snapping sounds can be heard after a successful operation; finally the neck muscles are relaxed again with lifting and pressing techniques.
In the self-positioning step of clinical manipulation, the patient first needs to follow the doctor's hand gesture to rotate the head to the angle the patient can physically endure, and the TCM rotation manipulation is then performed at that angle. Since the patient's neck is fragile, the doctor must apply force and position precisely during the manipulation, which requires a large amount of TCM rotation manipulation practice. In current TCM rotation manipulation training, a robot simulates a cervical spondylosis patient to provide a practice platform for doctors, but the robot is mainly operated by remote control, which is used to turn the robot head to the required position. In actual clinical operation, however, the patient follows the doctor's gesture to rotate the head to the physically endurable angle, so controlling the robot head with a remote control cannot simulate the actual interaction between doctor and patient in clinical practice.
Summary of the invention
The object of the present invention is to provide a robot head gesture control method and system that allow the robot to act according to the gestures of the subject, faithfully simulating the actual interaction between doctor and patient and thereby providing a practice platform for doctors.
To achieve the above object, the invention provides the following scheme:
A robot head gesture control method, the method comprising:
recognizing the gesture shape of the subject's hand to obtain a gesture shape recognition result, the result being one of a first gesture shape, a second gesture shape, and a third gesture shape;
when the gesture shape recognition result is the first gesture shape, setting the tracking flag bit in the robot and triggering the robot to enter a tracking preparation state, ready to start tracking the subject's hand motion;
when the gesture shape recognition result is the second gesture shape and the tracking flag bit has been set, having the robot track the subject's hand motion and perform head rotation;
when the gesture shape recognition result is the third gesture shape, clearing the tracking flag bit, stopping the robot's head rotation, and fixing the robot head at its stopping position.
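The three-gesture flow above amounts to a small state machine guarded by the tracking flag bit. A minimal Python sketch (the class name, gesture labels, and state strings are illustrative, not from the patent):

```python
# Hypothetical sketch of the three-gesture control flow described above.
GESTURE_START, GESTURE_TRACK, GESTURE_STOP = "first", "second", "third"

class RobotHead:
    def __init__(self):
        self.tracking_flag = False   # the "tracking flag bit"
        self.state = "idle"

    def on_gesture(self, gesture):
        if gesture == GESTURE_START:
            self.tracking_flag = True    # set the flag bit
            self.state = "ready"         # tracking preparation state
        elif gesture == GESTURE_TRACK and self.tracking_flag:
            self.state = "tracking"      # follow the hand, rotate the head
        elif gesture == GESTURE_STOP:
            self.tracking_flag = False   # clear the flag bit
            self.state = "stopped"       # head fixed at its stopping position

head = RobotHead()
head.on_gesture(GESTURE_TRACK)   # ignored: flag bit not yet set
print(head.state)                # idle
head.on_gesture(GESTURE_START)
head.on_gesture(GESTURE_TRACK)
print(head.state)                # tracking
head.on_gesture(GESTURE_STOP)
print(head.state)                # stopped
```

Note how the second gesture is a no-op until the first gesture has set the flag bit, mirroring the protective check described later in the embodiment.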
Optionally, recognizing the gesture shape of the subject's hand to obtain the gesture shape recognition result specifically comprises:
acquiring a color image and a depth image of the subject's hand;
obtaining a gesture foreground image from the color image and the depth image;
identifying the subject's gesture shape from the gesture foreground image to obtain the gesture shape recognition result.
Optionally, obtaining the gesture foreground image from the color image and the depth image specifically comprises:
processing the depth image with a threshold segmentation algorithm and extracting the image region whose gray values lie in a set range as the foreground region;
obtaining the color image of the foreground region from the corresponding position of the foreground region in the color image;
building a histogram from skin color features;
transforming the color image of the foreground region into the corresponding color space;
performing back projection in the color space according to the histogram to obtain a probability map;
denoising the probability map with a morphological erosion-dilation algorithm and a threshold segmentation algorithm to obtain the gesture foreground image.
Optionally, identifying the subject's gesture shape from the gesture foreground image to obtain the gesture shape recognition result specifically comprises:
computing the feature vector of the gesture foreground image;
classifying the feature vector with a support vector machine to obtain a gesture classification result;
identifying the gesture shape of the subject's hand from the gesture classification result to obtain the gesture shape recognition result.
Optionally, when the gesture shape recognition result is the second gesture shape and the tracking flag bit has been set, the robot tracking the subject's hand motion to perform head rotation specifically comprises:
determining the rotation direction of the robot head from the probability map;
computing the horizontal and vertical rotation speeds of the robot head from the probability map;
controlling the robot head to rotate horizontally according to the rotation direction and the horizontal rotation speed, and vertically according to the rotation direction and the vertical rotation speed.
The invention also discloses a robot head gesture control system, the system comprising:
a gesture shape recognition result acquisition module for recognizing the gesture shape of the subject's hand and obtaining the gesture shape recognition result, the result being one of a first gesture shape, a second gesture shape, and a third gesture shape;
a first gesture shape control module for, when the gesture shape recognition result is the first gesture shape, setting the tracking flag bit in the robot and triggering the robot to enter a tracking preparation state, ready to start tracking the subject's hand motion;
a second gesture shape control module for, when the gesture shape recognition result is the second gesture shape and the tracking flag bit has been set, controlling the robot to track the subject's hand motion and perform head rotation;
a third gesture shape control module for, when the gesture shape recognition result is the third gesture shape, clearing the tracking flag bit, stopping the robot's head rotation, and fixing the robot head at its stopping position.
Optionally, the gesture shape recognition result acquisition module specifically comprises:
an image acquisition submodule for acquiring the color image and depth image of the subject's hand;
a gesture foreground image acquisition submodule for obtaining the gesture foreground image from the color image and the depth image;
a gesture shape recognition result acquisition submodule for identifying the subject's gesture shape from the gesture foreground image and obtaining the gesture shape recognition result.
Optionally, the gesture foreground image acquisition submodule specifically comprises:
a foreground region extraction unit for processing the depth image with a threshold segmentation algorithm and extracting the image region whose gray values lie in a set range as the foreground region;
a foreground color image acquisition unit for obtaining the color image of the foreground region from the corresponding position of the foreground region in the color image;
a histogram building unit for building a histogram from skin color features;
an image transformation unit for transforming the color image of the foreground region into the corresponding color space;
a probability map acquisition unit for performing back projection in the color space according to the histogram and obtaining the probability map;
a gesture foreground image acquisition unit for denoising the probability map with a morphological erosion-dilation algorithm and a threshold segmentation algorithm and obtaining the gesture foreground image.
Optionally, the gesture shape recognition result acquisition submodule specifically comprises:
a feature vector computation unit for computing the feature vector of the gesture foreground image;
a gesture classification result acquisition unit for classifying the feature vector with a support vector machine and obtaining the gesture classification result;
a gesture shape recognition result acquisition unit for identifying the gesture shape of the subject's hand from the gesture classification result and obtaining the gesture shape recognition result.
Optionally, the second gesture shape control module specifically comprises:
a rotation direction acquisition submodule for determining the rotation direction of the robot head from the probability map;
a rotation speed calculation submodule for computing the horizontal and vertical rotation speeds of the robot head from the probability map;
a rotary motion control submodule for controlling the robot head, according to the rotation direction and the horizontal and vertical rotation speeds, to rotate horizontally with the horizontal rotation speed and vertically with the vertical rotation speed.
According to the specific embodiments provided by the present invention, the invention discloses the following technical effects:
The invention provides a robot head gesture control method and system. The method first recognizes the gesture shape of the subject's hand to obtain the gesture shape recognition result, which is one of a first, second, or third gesture shape. When the result is the first gesture shape, the tracking flag bit in the robot is set and the robot enters a tracking preparation state, ready to start tracking the subject's hand motion. When the result is the second gesture shape and the tracking flag bit has been set, the robot tracks the subject's hand motion and rotates its head. When the result is the third gesture shape, the tracking flag bit is cleared, the head rotation stops, and the robot head is fixed at its stopping position. By controlling the robot through different gestures of the subject, the method and system can faithfully simulate the actual interaction between doctor and patient, providing doctors with an effective TCM rotation manipulation practice platform.
Brief description of the drawings
To explain the embodiments of the present invention or the technical solutions in the prior art more clearly, the accompanying drawings required by the embodiments are briefly described below. Obviously, the drawings in the following description are only some embodiments of the present invention; for those of ordinary skill in the art, other drawings can be obtained from these drawings without creative labor.
Fig. 1 is a flowchart of the robot head gesture control method of the embodiment of the present invention;
Fig. 2 is a schematic diagram of the gesture shape recognition results described in the embodiment of the present invention;
Fig. 3 is a schematic diagram of the coordinate system of the state space and the steady-state region of the present invention;
Fig. 4 is a schematic diagram of robot head motion controlled by the robot head gesture control method of the present invention;
Fig. 5 is a structural diagram of the robot head gesture control system of the embodiment of the present invention.
Embodiment
The technical solutions in the embodiments of the present invention are described below clearly and completely with reference to the accompanying drawings. Obviously, the described embodiments are only a part of the embodiments of the invention rather than all of them. Based on the embodiments of the present invention, all other embodiments obtained by those of ordinary skill in the art without creative labor fall within the protection scope of the present invention.
It is an object of the invention to provide a kind of robot head gestural control method and system.
To make the above objects, features, and advantages of the present invention easier to understand, the present invention is further explained in detail below with reference to the accompanying drawings and specific implementations.
Fig. 1 is the method flow diagram of robot head gestural control method of the embodiment of the present invention.
Referring to Fig. 1, a robot head gesture control method comprises:
Step 101: recognize the gesture shape of the subject's hand and obtain the gesture shape recognition result. The gesture shape recognition result is one of a first gesture shape, a second gesture shape, and a third gesture shape.
In step 101, recognizing the gesture shape of the subject's hand to obtain the gesture shape recognition result specifically comprises:
Step 1011: acquire the color image and depth image of the subject's hand.
The color image and the depth image are captured by an image sensor fixed on the robot head; in this embodiment the image sensor is a Microsoft Kinect.
Step 1012: obtain the gesture foreground image from the color image and the depth image.
Step 1012 specifically comprises:
Step (1): process the depth image with a threshold segmentation algorithm and extract the image region whose gray values lie in a set range as the foreground region.
The brightness of the depth image represents the distance between the object at each pixel and the camera lens. Suppose a cupboard stands 5 m from the lens and a person with a raised hand stands 3 m from it, with the hand 2.5 m from the lens. The resulting depth image then shows a dark cupboard-shaped patch and a brighter human-shaped patch, with an even brighter hand-shaped patch on the human patch (because the hand is closer to the lens than the rest of the body). Objects at different distances can therefore be segmented by setting thresholds on brightness (gray value). In this embodiment the depth image is processed with a threshold segmentation algorithm and the image region whose gray values lie in a set range is extracted as the foreground region precisely in order to segment the hand region from the image background. The specific method is: traverse the depth image, keep the brightness of pixels whose gray values lie in the set range, and set pixels outside the range to 0, thereby segmenting the foreground region from the full depth image.
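The traversal just described reduces to a per-pixel range test. A minimal sketch in pure Python (the toy depth values are illustrative, not calibrated Kinect readings):

```python
# Depth-threshold step: keep pixels whose gray value (distance) lies in
# [lo, hi], set everything else to 0. Pure Python, no OpenCV.
def segment_foreground(depth, lo, hi):
    return [[p if lo <= p <= hi else 0 for p in row] for row in depth]

depth = [
    [200, 200, 200],   # far cupboard
    [120,  80, 120],   # body (120) with a closer hand (80)
]
fg = segment_foreground(depth, 60, 100)
print(fg)  # [[0, 0, 0], [0, 80, 0]] -> only the hand survives
```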
Step (2): obtain the color image of the foreground region from the corresponding position of the foreground region in the color image.
The foreground region is the image region of the subject's hand. After extracting the region whose gray values lie in the set range as the foreground region, the color image of the foreground region is obtained by taking the same region of the color image at the position the foreground region occupies in the depth image. The image is then further segmented by skin color, removing objects that are not skin colored (such as clothing close to the hand), to obtain the gesture foreground image.
Step (3): build a histogram from skin color features.
The skin color features are the characteristics of human skin color, which can be obtained from many sources. In the method of this embodiment, pictures of the subject's hand are chosen in advance and statistics are computed on the skin color of the hand to obtain the skin color features; these are then compared with similar features in the application scene and adjusted by calculation so as to distinguish them, yielding the specific skin color features used.
The histogram in this embodiment is a Cr-Cb two-dimensional histogram. A 50 x 50 two-dimensional histogram is first created, the number of pixels falling into each cell is counted, and the two-dimensional histogram of the skin color features is built. Similarly, the two-dimensional histogram of the current scene in the color image of the foreground region is counted; by contrasting the scene histogram with the skin color histogram, the more significant features of the skin color histogram are kept and the features easily confused with the background are deleted, yielding the final histogram. The histogram is then normalized so that its values fall between 0 and 255.
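The 50 x 50 binning and 0-255 normalization can be sketched in pure Python; the sample skin pixels and bin arithmetic below are illustrative assumptions (Cr/Cb assumed in 0-255):

```python
# Build a 50x50 Cr-Cb histogram and normalize its values into 0..255.
def build_crcb_hist(pixels, bins=50):
    hist = [[0] * bins for _ in range(bins)]
    for cr, cb in pixels:
        hist[cr * bins // 256][cb * bins // 256] += 1   # bin the pixel
    peak = max(max(row) for row in hist) or 1
    # normalize so values fall between 0 and 255, as the text describes
    return [[v * 255 // peak for v in row] for row in hist]

skin = [(150, 110)] * 3 + [(40, 200)]   # three skin-like pixels, one outlier
h = build_crcb_hist(skin)
print(h[150 * 50 // 256][110 * 50 // 256])  # 255 (the dominant skin bin)
print(h[40 * 50 // 256][200 * 50 // 256])   # 85  (three times less frequent)
```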
Step (4): transform the color image of the foreground region into the corresponding color space.
The skin color features take different forms in different color spaces; the color space used in this embodiment is the YCrCb color space.
Step (5): perform back projection in the color space according to the histogram to obtain a probability map.
In the histogram built above, for a given cell the abscissa is the Cr value and the ordinate is the Cb value, and the value of the cell represents the number of pixels with those Cr and Cb values (after normalization this may be regarded as a frequency). The full color image is traversed again; for every pixel, its Cr and Cb values are used to look up the corresponding frequency in the histogram, and that frequency becomes the brightness of the point, yielding the probability map. The brightness of a pixel in the probability map represents the probability that the point belongs to the skin of the subject's hand: the brighter the point, the greater the probability.
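Back projection is the lookup just described: each pixel's (Cr, Cb) is replaced by its histogram frequency. A toy sketch (the one-pixel image and the assumed skin peak are illustrative):

```python
# Histogram back projection: each pixel's (Cr, Cb) looks up its bin
# frequency, which becomes that pixel's brightness in the probability map.
def back_project(crcb_image, hist, bins=50):
    return [[hist[cr * bins // 256][cb * bins // 256]
             for cr, cb in row] for row in crcb_image]

hist = [[0] * 50 for _ in range(50)]
hist[29][21] = 255                      # assume skin peaks at Cr=150, Cb=110
img = [[(150, 110), (10, 10)]]          # one skin pixel, one background pixel
prob = back_project(img, hist)
print(prob)  # [[255, 0]] -> skin pixel bright, background dark
```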
Step (6): denoise the probability map with a morphological erosion-dilation algorithm and a threshold segmentation algorithm to obtain the gesture foreground image.
The probability map is processed with a morphological erosion-dilation algorithm and a threshold segmentation algorithm to remove the influence of noise, yielding the gesture foreground image, which is a black-and-white grayscale image.
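One erosion pass of the denoising step can be sketched on a binary grid; the 4-neighbourhood structuring element below is an illustrative choice, not specified by the patent:

```python
# One binary erosion pass: a pixel survives only if it and its whole
# 4-neighbourhood are foreground, which removes isolated specks of noise.
def erode(img):
    h, w = len(img), len(img[0])
    def on(y, x):
        return 0 <= y < h and 0 <= x < w and img[y][x]
    return [[1 if img[y][x] and on(y-1, x) and on(y+1, x)
                  and on(y, x-1) and on(y, x+1) else 0
             for x in range(w)] for y in range(h)]

noisy = [
    [0, 0, 0, 1],   # isolated speck (noise)
    [0, 1, 1, 0],
    [1, 1, 1, 1],
    [0, 1, 1, 0],
]
print(erode(noisy))  # [[0, 0, 0, 0], [0, 0, 0, 0], [0, 1, 1, 0], [0, 0, 0, 0]]
```

A dilation pass with the same structuring element would then restore the surviving blob roughly to its original extent.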
Step 1013: identify the subject's gesture shape from the gesture foreground image and obtain the gesture shape recognition result.
Step 1013 specifically comprises:
Step ①: compute the feature vector of the gesture foreground image.
The geometric invariant moment (Hu moment) features of the gesture foreground image are computed, the number of fingertips of the hand in the gesture foreground image is counted, and the perimeter-to-area ratio of the gesture foreground image is calculated.
The Hu moment features, the fingertip count, and the perimeter-to-area ratio are concatenated into one row vector as the feature vector of the current gesture foreground image. For example, if the computed Hu features are [0.8, 0.1, 0.01, 0, 0, 0, 0], the fingertip count is 3, and the perimeter-to-area ratio is 0.02, the concatenated feature vector is [0.8, 0.1, 0.01, 0, 0, 0, 0, 3, 0.02].
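The concatenation in the worked example above is just list splicing:

```python
# Feature vector = 7 Hu moments + fingertip count + perimeter/area ratio,
# using the example values from the text.
hu = [0.8, 0.1, 0.01, 0, 0, 0, 0]
fingertips = 3
perimeter_area_ratio = 0.02

feature = hu + [fingertips, perimeter_area_ratio]
print(feature)  # [0.8, 0.1, 0.01, 0, 0, 0, 0, 3, 0.02]
```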
Step ②: classify the feature vector with a support vector machine to obtain the gesture classification result.
The feature vector is classified with a trained classifier: for example, a classifier is trained with the support vector machine algorithm and then used to classify the feature vector, yielding the gesture classification result.
Step ③: identify the gesture shape of the subject's hand from the gesture classification result and obtain the gesture shape recognition result.
Fig. 2 is a schematic diagram of the gesture shape recognition results described in the embodiment of the present invention. The gesture shape recognition result of the present invention covers three gesture shapes: the first, second, and third gesture shapes. The first gesture shape indicates that the robot is triggered to prepare for tracking; the second gesture shape indicates that the robot starts tracking the subject's gesture and performs head rotation; and the third gesture shape indicates that tracking stops. Referring to Fig. 2, in this embodiment the gesture shape shown in Fig. 2(a) is used as the first gesture shape, the gesture shape shown in Fig. 2(b) as the second gesture shape, and the gesture shape shown in Fig. 2(c) as the third gesture shape. In practical applications, different gesture shapes can be assigned as the first, second, and third gesture shapes as required.
Step 102: when the gesture shape recognition result is the first gesture shape, set the tracking flag bit in the robot and trigger the robot to enter the tracking preparation state, ready to start tracking the subject's hand motion.
When the gesture shape recognition result is the first gesture shape shown in Fig. 2(a), the tracking flag bit in the robot is set, the robot enters the tracking preparation state, and it prepares to start tracking the subject's hand motion. The tracking flag bit is a protective setting for the robot's motion: before rotating, the robot always checks whether the tracking flag bit is set, and if it is not, the robot does not execute movement instructions, i.e., it does not follow the subject's hand motion.
Step 103: when the gesture shape recognition result is the second gesture shape and the tracking flag bit has been set, the robot tracks the subject's hand motion and performs head rotation.
When the gesture shape recognition result is the second gesture shape shown in Fig. 2(b) and the tracking flag bit has been set, the robot starts tracking the subject's hand motion and rotates its head. The coordinates of the current gesture shape in the color image can be computed from the probability map, and the movement velocity of each robot joint can then be computed from the image coordinates of the gesture shape.
The robot is a training robot for TCM rotation manipulation training, used to simulate a cervical spondylosis patient so as to provide a practice platform for doctors. The head and neck of the training robot have two joints: the first joint rotates horizontally, the second rotates vertically, and a variable-stiffness structure simulates the human cervical spine.
Step 103 specifically comprises:
Step 1031: determine the rotation direction of the robot head from the probability map.
First, define the coordinates of any point in the probability map as (x, y) with gray value p(x, y). The (p+q)-order geometric moments of the probability map are:
M_pq = Σ x^p y^q p(x, y) (1)
Then:
M00 = Σ p(x, y) (2)
M10 = Σ x p(x, y) (3)
M01 = Σ y p(x, y) (4)
The centroid Pc(xc, yc) of the second gesture shape in the probability map is:
xc = M10 / M00 (5)
yc = M01 / M00 (6)
where xc represents the x coordinate of the centroid and yc its y coordinate.
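Equations (1)-(6) reduce to three accumulations over the probability map followed by two divisions; a pure-Python sketch with an illustrative toy map:

```python
# Centroid of the probability map from the image moments M00, M10, M01:
# xc = M10/M00, yc = M01/M00.
def centroid(prob):
    m00 = m10 = m01 = 0
    for y, row in enumerate(prob):
        for x, p in enumerate(row):
            m00 += p
            m10 += x * p
            m01 += y * p
    return m10 / m00, m01 / m00

prob = [
    [0, 0,   0],
    [0, 255, 255],   # bright hand blob on the right
    [0, 0,   0],
]
print(centroid(prob))  # (1.5, 1.0)
```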
Define the current image plane as the state space of the image. The image plane is the plane defined by the resolution of the image sensor; the image and the image plane have the same resolution as the image sensor. For example, when the image sensor resolution is 1440 x 900, the resolution of the image plane and of the image is also 1440 x 900. The current state is then:
X = (xc, yc)^T (7)
A steady-state region Ωs is defined within the state space:
Ωs = {(u, v) | β·uw ≤ u ≤ (1-β)·uw, β·vh ≤ v ≤ (1-β)·vh} (8)
where uw represents the width of the state space, vh its height, and β is a proportionality coefficient, a positive number less than one half.
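Under the central-band reading of Ωs implied by the definitions of uw, vh, and β (an assumption: the patent's equation (8) itself did not survive extraction), the membership test is two range checks:

```python
# Membership test for the steady-state region Omega_s: with beta < 1/2 the
# region is the central band of the image plane in each axis.
def in_steady_state(xc, yc, uw, vh, beta=0.25):
    u0, u1 = beta * uw, (1 - beta) * uw     # left / right borders
    v0, v1 = beta * vh, (1 - beta) * vh     # upper / lower borders
    return u0 <= xc <= u1 and v0 <= yc <= v1

print(in_steady_state(720, 450, 1440, 900))  # True: hand centred, head holds still
print(in_steady_state(100, 450, 1440, 900))  # False: hand far left, head must turn
```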
The coordinates of the current second gesture shape in the state space are computed from the probability map. The rotation direction of the robot head is determined from the relative position of these coordinates with respect to the border of the steady-state region.
Fig. 3 is a schematic diagram of the coordinate system of the state space and the steady-state region. As shown in Fig. 3, u0, u1, v0, v1 denote the left, right, upper, and lower borders of the steady-state region Ωs, respectively. When the horizontal coordinate of the second gesture shape in the state space lies to the left of the left border of the steady-state region, i.e., the horizontal coordinate is less than u0, the robot head is determined to turn clockwise; when the horizontal coordinate is greater than u1, the robot head is determined to turn counterclockwise. Alternatively, the head may turn counterclockwise when the horizontal coordinate is less than u0 and clockwise when it is greater than u1. When the vertical coordinate in the state space is less than v0, the robot head is determined to rotate in the bowing direction; when the vertical coordinate is greater than v1, in the head-raising direction. Alternatively, the head may rotate in the head-raising direction when the vertical coordinate is less than v0 and in the bowing direction when it is greater than v1.
Step 1032:The horizontal rotation speed and vertical rotation speed of the robot head are calculated according to the probability graphDegree.
The position error is calculated from the coordinate position of the current second gesture shape in the state space and the position of the border of the steady-state region; calculating the position error amounts to taking the difference between the current state-space coordinate of the second gesture shape and the closest border of the steady-state region.

According to the state space and the steady-state region, the current position error is calculated. The calculation formula of the position error e is as follows:

e = R(X̃ − c)  (13)

wherein R represents the conversion matrix of the steady-state region border,

R = [ a  b  0  0 ]
    [ 0  0  c  d ]

and the values of a, b, c and d are respectively:

a = 1 if xc < u0, otherwise a = 0;  b = 1 if xc > u1, otherwise b = 0;
c = 1 if yc < v0, otherwise c = 0;  d = 1 if yc > v1, otherwise d = 0.

Wherein c = (u0, u1, v0, v1)^T is a column vector representing the borders of the steady-state region: u0 represents the left border of the steady-state region Ωs, u1 the right border, v0 the upper border and v1 the lower border.

Wherein X represents the current state-space coordinate, X = (xc, yc)^T, and X̃ = (xc, xc, yc, yc)^T repeats each coordinate once per pair of borders.
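The border-selection error can be sketched numerically as follows. This is a hedged reconstruction of e = R(X̃ − c) under the piecewise selection of a, b, c, d described above; the function name and the encoding of R are illustrative assumptions.

```python
import numpy as np

def position_error(xc, yc, u0, u1, v0, v1):
    """Position error of the gesture centroid relative to the nearest
    steady-state border; zero when the centroid lies inside the region."""
    a = 1.0 if xc < u0 else 0.0
    b = 1.0 if xc > u1 else 0.0
    cc = 1.0 if yc < v0 else 0.0
    d = 1.0 if yc > v1 else 0.0
    R = np.array([[a, b, 0.0, 0.0],
                  [0.0, 0.0, cc, d]])
    borders = np.array([u0, u1, v0, v1], dtype=float)  # the column vector c
    x_tilde = np.array([xc, xc, yc, yc], dtype=float)
    return R @ (x_tilde - borders)  # (e_u, e_v)
```

The sign of each component then encodes which side of the region the gesture is on, which is what the direction logic of step 1031 uses.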
The input ut controlling the rotation speed of the robot head is calculated from the position error e:

ut = ke  (14)

wherein k represents the scaling coefficient used to scale the position error, k = diag(ηu, ηv), where ηu and ηv are two constant proportionality coefficients.
In order to make the rotary motion of the robot smoother, the sign function is applied to the position error, i.e.:

(u̇, v̇)^T = (ηu·sign(eu), ηv·sign(ev))^T  (15)

wherein eu represents the component of the position error in the image horizontal direction, ev the component of the position error in the image vertical direction, u̇ the speed in the image horizontal direction, and v̇ the speed in the image vertical direction.
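The sign-based command of equation (15) can be sketched as below. The gain values are illustrative placeholders; the patent only states that ηu and ηv are constants.

```python
def image_velocity(e_u, e_v, eta_u=0.5, eta_v=0.5):
    """Constant-magnitude image-plane velocity command u_t = k * sign(e).

    Using sign(e) instead of e itself keeps the commanded speed constant
    while the gesture is outside the steady-state region, which smooths
    the head motion.
    """
    sign = lambda x: (x > 0) - (x < 0)
    return eta_u * sign(e_u), eta_v * sign(e_v)
```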
The running speeds ωx and ωy of the robot end are obtained using the image Jacobian; the calculation formula is as follows:

(ωx, ωy)^T = Js^{-1} (u̇, v̇)^T  (16)

wherein ωx represents the rotation speed of the robot head about the horizontal image axis, that is, the vertical rotation speed of the robot head; ωy represents the rotation speed of the robot head about the vertical image axis, that is, the horizontal rotation speed of the robot head; (up, vp) is the principal point of the image coordinate system of the image sensor; λ represents the focal length of the image sensor converted into pixels; u represents the column coordinate of the second gesture shape on the probability map and v its row coordinate; Js represents the image Jacobian. Writing ū = u − up and v̄ = v − vp,

Js = [ ūv̄/λ        −(λ + ū²/λ) ]
     [ λ + v̄²/λ    −ūv̄/λ       ]
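Equation (16) can be sketched with a linear solve. This assumes the rotational image Jacobian as reconstructed above (the standard visual-servoing interaction matrix restricted to the two rotation components); the principal point and focal length values in the test are illustrative.

```python
import numpy as np

def head_angular_velocity(u, v, du, dv, u_p, v_p, lam):
    """Recover (omega_x, omega_y) from image-plane velocities (du, dv).

    omega_x is the vertical (nod) rotation speed of the head, omega_y the
    horizontal (turn) rotation speed; lam is the focal length in pixels.
    """
    ub, vb = u - u_p, v - v_p          # coordinates relative to the principal point
    J_s = np.array([[ub * vb / lam, -(lam + ub**2 / lam)],
                    [lam + vb**2 / lam, -ub * vb / lam]])
    wx, wy = np.linalg.solve(J_s, np.array([du, dv], dtype=float))
    return wx, wy
```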
The robot is a training robot oriented to traditional Chinese medicine cervical-rotation manipulation training; the robot is used to simulate a cervical spondylosis patient so as to provide an exercise platform for doctors. The robot is a two-joint robot: the head and neck of the robot have two joints, of which the first joint can rotate horizontally and the second joint can rotate vertically, and a variable-stiffness structure is used to simulate the human cervical spine.
Step 1033: According to the rotation direction, the vertical rotation speed ωx and the horizontal rotation speed ωy, control the robot head to rotate horizontally according to the rotation direction and the horizontal rotation speed, and to rotate vertically according to the rotation direction and the vertical rotation speed. When the rotation direction indicates clockwise rotation, the first joint is controlled to rotate clockwise at the horizontal rotation speed; when the rotation direction indicates counterclockwise rotation, the first joint is controlled to rotate counterclockwise at the horizontal rotation speed, thereby controlling the left-right rotation of the robot head. Likewise, when the rotation direction indicates rotation in the bowing direction, the second joint is controlled to rotate in the bowing direction at the vertical rotation speed; when the rotation direction indicates rotation in the head-raising direction, the second joint is controlled to rotate in the head-raising direction at the vertical rotation speed, thereby controlling the up-down rotation of the robot head.
Step 104: When the gesture shape recognition result is the third gesture shape, the tracking flag bit is cleared, the head rotation motion of the robot is stopped, and the robot head is fixed at the stop position.

When the robot head has rotated to the position required for the exercise, the operator changes the gesture to the third gesture shape shown in Fig. 2(c). The gesture shape recognition result is then the third gesture shape; the tracking flag bit is cleared, the head rotation motion of the robot is stopped, and the robot head is fixed at the stop position. That is, after the robot head has tracked the second gesture shape to the required position, it remains stationary, fixed at the position required for the exercise.
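The three-gesture control flow of steps 102 to 104 is essentially a small state machine: the first gesture arms tracking, the second gesture tracks while armed, and the third gesture clears the flag and freezes the head. A minimal sketch, with illustrative class and state names:

```python
class HeadGestureController:
    """Tracks the 'tracking flag bit' and the resulting control state."""

    def __init__(self):
        self.tracking = False   # the tracking flag bit
        self.state = 'idle'

    def on_gesture(self, shape):
        if shape == 1:                      # first gesture: enter tracking preparation
            self.tracking = True
            self.state = 'ready'
        elif shape == 2 and self.tracking:  # second gesture: track the hand motion
            self.state = 'tracking'
        elif shape == 3:                    # third gesture: clear flag, fix the head
            self.tracking = False
            self.state = 'fixed'
        return self.state
```

Note that the second gesture has no effect unless the flag was first set, matching the condition "the second gesture shape and the tracking flag bit is set".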
Fig. 4 is a schematic diagram of controlling the motion of the robot head using the robot head gesture control method of the present invention. As shown in Fig. 4, the hand 401 of the tested person is placed in front of the image sensor 402, and the three gesture shapes shown in Fig. 2 are made according to the required rotation position. The robot head has the first joint 403 and the second joint 404.

When the hand of the tested person makes the first gesture shape shown in Fig. 2(a), the gesture shape recognition result is the first gesture shape; the tracking flag bit set in the robot is set, triggering the robot to enter the tracking preparation state and to prepare to start tracking the hand motion of the tested person.

Next, when the hand of the tested person makes the second gesture shape shown in Fig. 2(b), the gesture shape recognition result is the second gesture shape, and since the tracking flag bit is set, the robot starts to track the hand motion of the tested person to perform the head rotation motion. According to the calculated rotation direction, vertical rotation speed ωx and horizontal rotation speed ωy, the first joint 403 of the robot head is controlled to rotate horizontally according to the rotation direction and the horizontal rotation speed: when the rotation direction indicates clockwise rotation, the first joint 403 is controlled to rotate clockwise at the horizontal rotation speed; when it indicates counterclockwise rotation, the first joint 403 is controlled to rotate counterclockwise at the horizontal rotation speed, thereby controlling the left-right rotation of the robot head. At the same time, the second joint 404 of the robot head is controlled to rotate vertically according to the rotation direction and the vertical rotation speed: when the rotation direction indicates rotation in the bowing direction, the second joint 404 is controlled to rotate in the bowing direction at the vertical rotation speed; when it indicates rotation in the head-raising direction, the second joint 404 is controlled to rotate in the head-raising direction at the vertical rotation speed, thereby controlling the up-down rotation of the robot head.

When the robot head has rotated to the position required for the exercise, the tested person (operator) changes the gesture to the third gesture shape shown in Fig. 2(c). The gesture shape recognition result is then the third gesture shape; the tracking flag bit is cleared, the head rotation motion of the robot is stopped, and the robot head is fixed at the stop position. In the end, the robot has tracked the movement of the second gesture shape of the tested person to the angle and position required by the exercise, which can realistically simulate the interactive operation between actual doctors and patients and provides doctors with an exercise platform for treatment skills.
Fig. 5 is a structural schematic diagram of the robot head gesture control system according to an embodiment of the present invention.

As shown in Fig. 5, the robot head gesture control system includes:
a gesture shape recognition result acquisition module 501, configured to recognize the gesture shape of the hand of the tested person and obtain the gesture shape recognition result, the gesture shape recognition result including a first gesture shape, a second gesture shape and a third gesture shape;

a first gesture shape control module 502, configured to, when the gesture shape recognition result is the first gesture shape, set the tracking flag bit set in the robot and trigger the robot to enter the tracking preparation state, preparing to start tracking the hand motion of the tested person;

a second gesture shape control module 503, configured to, when the gesture shape recognition result is the second gesture shape and the tracking flag bit is set, control the robot to track the hand motion of the tested person to perform the head rotation motion;

a third gesture shape control module 504, configured to, when the gesture shape recognition result is the third gesture shape, clear the tracking flag bit, stop the head rotation motion of the robot, and fix the robot head at the stop position.
Wherein, the gesture shape recognition result acquisition module 501 specifically includes:

an image acquisition submodule, configured to obtain a color image and a depth image of the hand of the tested person;

a gesture foreground map acquisition submodule, configured to obtain a gesture foreground map according to the color image and the depth image;

a gesture shape recognition result acquisition submodule, configured to recognize the gesture shape of the tested person according to the gesture foreground map and obtain the gesture shape recognition result.
Wherein, the gesture foreground map acquisition submodule specifically includes:

a foreground area extraction unit, configured to process the depth image using a threshold segmentation algorithm and extract the image area whose gray value lies within a set range as the foreground area;

a foreground color image acquisition unit, configured to obtain the color image of the foreground area according to the corresponding position of the foreground area in the color image;

a histogram establishing unit, configured to establish a histogram according to skin color features;

an image conversion unit, configured to transform the color image of the foreground area into the corresponding color space;

a probability map acquisition unit, configured to perform back projection in the color space according to the histogram to obtain the probability map;

a gesture foreground map acquisition unit, configured to denoise the probability map using a morphological erosion-dilation algorithm and a threshold segmentation algorithm to obtain the gesture foreground map.
Wherein, the gesture shape recognition result acquisition submodule specifically includes:

a feature vector calculation unit, configured to calculate the feature vector of the gesture foreground map;

a gesture classification result acquisition unit, configured to classify the feature vector using a support vector machine and obtain the gesture classification result;

a gesture shape recognition result acquisition unit, configured to recognize the gesture shape of the hand of the tested person according to the gesture classification result and obtain the gesture shape recognition result.
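The SVM classification step can be sketched as below. The patent does not specify the feature descriptor or kernel, so the toy four-dimensional features, the three well-separated clusters standing in for the three gesture shapes, and the RBF kernel are all illustrative assumptions.

```python
import numpy as np
from sklearn.svm import SVC

# Synthetic feature vectors: three clusters, one per gesture shape (labels 1-3).
rng = np.random.default_rng(0)
X_train = np.vstack([rng.normal(loc=m, scale=0.1, size=(20, 4))
                     for m in (0.0, 1.0, 2.0)])
y_train = np.repeat([1, 2, 3], 20)

# Train the support vector machine and classify a new feature vector.
clf = SVC(kernel='rbf').fit(X_train, y_train)
pred = clf.predict([[1.0, 1.0, 1.0, 1.0]])  # feature near the second cluster
```

In the described system, the predicted label would then be mapped back to the first, second or third gesture shape to drive the control modules.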
Wherein, the second gesture shape control module 503 specifically includes:

a rotation direction acquisition submodule, configured to determine the rotation direction of the robot head according to the probability map;

a rotation speed calculation submodule, configured to calculate the horizontal rotation speed and the vertical rotation speed of the robot head according to the probability map;

a rotary motion control submodule, configured to control the robot head, according to the rotation direction, the horizontal rotation speed and the vertical rotation speed, to rotate horizontally according to the rotation direction and the horizontal rotation speed, and to rotate vertically according to the rotation direction and the vertical rotation speed.

The robot head gesture control system of the present invention can control the rotation and stopping of the robot head according to the gesture shape of the hand of the tested person, making the robot head move to the angle and position required by the exercise; it can realistically simulate the interactive operation between actual doctors and patients and provides doctors with an exercise platform for treatment skills.
Specific examples are used herein to set forth the principle and embodiments of the present invention; the description of the above embodiments is only intended to help understand the method of the present invention and its core concept. Meanwhile, for those of ordinary skill in the art, changes may be made to the specific embodiments and the scope of application according to the idea of the present invention. In summary, the content of this specification should not be construed as limiting the present invention.

Claims (10)

CN201710439682.XA | 2017-06-12 | 2017-06-12 | Robot head gesture control method and system | Expired - Fee Related | CN107247466B (en)

Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
CN201710439682.XA (CN107247466B) | 2017-06-12 | 2017-06-12 | Robot head gesture control method and system

Applications Claiming Priority (1)

Application Number | Priority Date | Filing Date | Title
CN201710439682.XA (CN107247466B) | 2017-06-12 | 2017-06-12 | Robot head gesture control method and system

Publications (2)

Publication Number | Publication Date
CN107247466A | 2017-10-13
CN107247466B | 2020-10-20

Family

ID=60019058

Family Applications (1)

Application Number | Title | Priority Date | Filing Date
CN201710439682.XA | Expired - Fee Related | CN107247466B (en)

Country Status (1)

Country | Link
CN (1) | CN107247466B (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
CN108717553A (en) * | 2018-05-18 | 2018-10-30 | 杭州艾米机器人有限公司 | A kind of robot follows the method and system of human body
CN111968723A (en) * | 2020-07-30 | 2020-11-20 | 宁波羽扬科技有限公司 | Kinect-based upper limb active rehabilitation training method
CN111975765A (en) * | 2019-05-24 | 2020-11-24 | 京瓷办公信息系统株式会社 | Electronic device, robot system, and virtual area setting method
CN115145403A (en) * | 2022-09-05 | 2022-10-04 | 北京理工大学 | A gesture-based hand marker tracking method and system

Citations (11)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
US20120290111A1 * | 2011-05-09 | 2012-11-15 | Badavne Nilay C | Robot
WO2013112504A1 * | 2012-01-25 | 2013-08-01 | Chrysler Group LLC | Automotive vehicle power window control using capacitive switches
CN103472920A * | 2013-09-13 | 2013-12-25 | 通号通信信息集团有限公司 | Action-recognition-based medical image control method and system
KR20140022654A * | 2012-08-14 | 2014-02-25 | (주)동부로봇 | Cleaning robot having gesture recognition function, and control method thereof
CN103903011A * | 2014-04-02 | 2014-07-02 | 重庆邮电大学 | Intelligent wheelchair gesture recognition control method based on image depth information
CN104636342A * | 2013-11-07 | 2015-05-20 | 大连东方之星信息技术有限公司 | Archive display system with gesture control function
CN105787471A * | 2016-03-25 | 2016-07-20 | 南京邮电大学 | Gesture identification method applied to control of mobile service robot for the elderly and disabled
CN106200395A * | 2016-08-05 | 2016-12-07 | 易晓阳 | A kind of multidimensional identification appliance control method
CN106502272A * | 2016-10-21 | 2017-03-15 | 上海未来伙伴机器人有限公司 | A kind of target following control method and device
CN106529432A * | 2016-11-01 | 2017-03-22 | 山东大学 | Hand area segmentation method deeply integrating significance detection and prior knowledge
CN107204005A * | 2017-06-12 | 2017-09-26 | 北京理工大学 | A kind of hand marker tracking method and system


Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
MEI WANG et al.: "Hand gesture recognition using valley circle feature and Hu's moments technique for robot movement control", Measurement *
WU Yu: "Gesture recognition simulation in robot visual communication" (in Chinese), Computer Simulation *
ZHOU Kai: "Gesture recognition based on skin color and SVM and its application research" (in Chinese), China Masters' Theses Full-text Database, Information Science and Technology *
WANG Yanling: "Research on cursor control interaction technology based on gesture recognition" (in Chinese), China Masters' Theses Full-text Database, Information Science and Technology *

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
CN108717553A (en) * | 2018-05-18 | 2018-10-30 | 杭州艾米机器人有限公司 | A kind of robot follows the method and system of human body
CN108717553B (en) * | 2018-05-18 | 2020-08-18 | 杭州艾米机器人有限公司 | Method and system for robot to follow human body
CN111975765A (en) * | 2019-05-24 | 2020-11-24 | 京瓷办公信息系统株式会社 | Electronic device, robot system, and virtual area setting method
CN111975765B (en) * | 2019-05-24 | 2023-05-23 | 京瓷办公信息系统株式会社 | Electronic device, robot system, and virtual area setting method
CN111968723A (en) * | 2020-07-30 | 2020-11-20 | 宁波羽扬科技有限公司 | Kinect-based upper limb active rehabilitation training method
CN115145403A (en) * | 2022-09-05 | 2022-10-04 | 北京理工大学 | A gesture-based hand marker tracking method and system

Also Published As

Publication number | Publication date
CN107247466B (en) | 2020-10-20

Similar Documents

Publication | Publication Date | Title
CN106250867B (en)A kind of implementation method of the skeleton tracking system based on depth data
JP7015152B2 (en) Processing equipment, methods and programs related to key point data
Joo et al.Panoptic studio: A massively multiview system for social motion capture
WO2022121645A1 (en)Method for generating sense of reality of virtual object in teaching scene
CN105426827B (en)Living body verification method, device and system
CN108830150A (en)One kind being based on 3 D human body Attitude estimation method and device
Davis et al.Determining 3-d hand motion
US20070098250A1 (en)Man-machine interface based on 3-D positions of the human body
CN111460976B (en) A data-driven real-time hand movement assessment method based on RGB video
KR20220024494A (en) Method and system for human monocular depth estimation
CN107247466A (en)Robot head gesture control method and system
CN110561399A (en)Auxiliary shooting device for dyskinesia condition analysis, control method and device
CN115761901B (en) A method for detecting and evaluating riding posture
CN108364302A (en)A kind of unmarked augmented reality multiple target registration method
CN107967687A (en) A method and system for acquiring the walking posture of a target object
CN112906653A (en)Multi-person interactive exercise training and evaluation system
CN106650628A (en)Fingertip detection method based on three-dimensional K curvature
CN106022211B (en)Method for controlling multimedia equipment by utilizing gestures
Jain et al.Human computer interaction–Hand gesture recognition
JP6770208B2 (en) Information processing device
US20250218222A1 (en)Systems and methods for automatic hand gesture recognition
Kondori et al.Direct hand pose estimation for immersive gestural interaction
CN107204005B (en)Hand marker tracking method and system
CN109895095B (en)Training sample obtaining method and device and robot
CN114779925A (en) A method and device for line-of-sight interaction based on a single target

Legal Events

Date | Code | Title | Description
PB01 | Publication
SE01 | Entry into force of request for substantive examination
TA01 | Transfer of patent application right
Effective date of registration: 2018-02-12
Address after: 528400 Guangdong province Zhongshan Torch Development Zone, Cheung Hing Road 6 No. 11 South Hebei trade building 1110 card
Applicant after: ZHONGSHAN CHANGFENG INTELLIGENT AUTOMATION EQUIPMENT RESEARCH INSTITUTE Co.,Ltd.
Applicant after: RESEARCH INSTITUTE OF BIT IN ZHONGSHAN
Address before: 528400 Guangdong province Zhongshan Torch Development Zone, Cheung Hing Road 6 No. 11 South Hebei trade building 1110 card
Applicant before: ZHONGSHAN CHANGFENG INTELLIGENT AUTOMATION EQUIPMENT RESEARCH INSTITUTE Co.,Ltd.
GR01 | Patent grant
CF01 | Termination of patent right due to non-payment of annual fee
Granted publication date: 2020-10-20

