CN104899591B - The extracting method of wrist point and arm point based on depth camera - Google Patents

The extracting method of wrist point and arm point based on depth camera

Info

Publication number
CN104899591B
CN104899591B
Authority
CN
China
Prior art keywords
point
wrist
arm
hand
palm
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201510336118.6A
Other languages
Chinese (zh)
Other versions
CN104899591A (en)
Inventor
潘志庚
郭双双
张明敏
罗江林
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Jilin Jidong Culture and Art Group Co., Ltd.
Original Assignee
Jilin Jiyuan Space-Time Cartoon Game Science And Technology Group Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Jilin Jiyuan Space-Time Cartoon Game Science And Technology Group Co Ltd
Priority to CN201510336118.6A
Publication of CN104899591A
Application granted
Publication of CN104899591B
Legal status: Active (current)
Anticipated expiration


Abstract

The present invention relates to a method for extracting the wrist point and arm point based on a depth camera, applied to fields such as virtual reality and augmented reality. Using the color image and depth map captured by a Kinect depth camera and the three-dimensional palm-center position provided by OpenNI/NITE (an open natural-interaction API), the hand region is first segmented, the accurate hand region is then extracted with a Bayesian skin-color model, and the wrist point and arm point are extracted with a vision-based method. The extracted wrist point and arm point, together with the palm-center point provided by the NITE library, are used to design a variety of human-computer interaction gestures. Advantages: 1. Since the release of the Kinect depth camera, many researchers have studied gesture recognition with Kinect, but few of these studies are based on the wrist and arm, and vision-based methods for extracting the wrist point and arm point are rare. 2. The wrist-point and arm-point extraction method of the present invention has a small computational cost, is simple to implement, and can extract the wrist point and arm point accurately, stably, and in real time.

Description

The extracting method of wrist point and arm point based on depth camera
Technical field
The present invention relates to a method for extracting the wrist point and arm point based on a depth camera. Using the color image and depth map captured by a Kinect depth camera and the three-dimensional palm-center position provided by OpenNI/NITE (an open natural-interaction API), the hand region is first segmented, and the wrist point and arm point are then extracted with a vision-based method. The extracted wrist point and arm point, together with the palm-center point provided by the NITE library, can be used to design a variety of human-computer interaction gestures, which are then applied to fields such as virtual reality and augmented reality.
Background technology
Since depth cameras such as Kinect became available, many researchers have used Kinect for gesture-recognition research. These studies are mainly based on the fingertips and the palm center; research that adds the wrist and arm and combines them to design gestures is scarce. The role of the wrist in gesture recognition is crucial: the wrist is the hinge linking the arm to the fingers, so most wrist-driven actions also require the coordination and participation of the arm. Accurately extracting the wrist point and arm point is therefore essential for the accuracy of wrist- and arm-based gesture recognition.
Existing gesture-recognition methods based on depth cameras focus on extracting fingertip points or the palm-center point and design gestures from the position or number of fingertips relative to the palm center. Such gestures are relatively simple and their applications are limited. If the wrist point and arm point can also be extracted, gestures can be designed by combining the fingertip points, palm-center point, wrist point, and arm point, giving both a wider variety of gestures and a broader range of applications. For example, an arm-waving gesture can be designed from the positions of the wrist point and arm point and applied to virtual ball games; an arm-raising or arm-lowering gesture can be designed from the palm-center point, wrist point, and arm point and used for virtually lifting a barbell.
Summary of the invention
The object of the present invention is to provide a method for extracting the wrist point and arm point based on a depth camera, solving the problems of the prior art described above. Using the depth map provided by a Kinect depth camera and the three-dimensional palm-center position provided by OpenNI/NITE, the hand region is first segmented; the resulting binary image is then used as a mask to cut the hand region out of the color image, and an accurate hand region is extracted from that color region with a skin-color model and a Bayes classifier. Within the accurate hand region, the wrist point and arm point are extracted with the wrist-point and arm-point extraction methods, and various gestures are then designed from the wrist point, arm point, and palm-center point.
The above-mentioned purpose of the present invention is achieved through the following technical solutions:
A method for extracting the wrist point and arm point based on a depth camera: using the color image and depth map captured by a Kinect depth camera and the three-dimensional palm-center position provided by OpenNI/NITE, the hand region is first segmented and then refined into an accurate hand region with a Bayesian skin-color model; the wrist point and arm point are extracted with a vision-based method. The extracted wrist point and arm point, together with the palm-center point provided by OpenNI/NITE, are used to design a variety of human-computer interaction gestures, which are then applied to the fields of virtual reality and augmented reality.
Hand-region segmentation: the hand region is first segmented in the two-dimensional image plane and then in the depth direction; the segmented hand-region image is used as a mask to cut the hand region out of the color image provided by Kinect, and the accurate hand region is then extracted with the Bayesian skin-color model.
The wrist point is extracted as follows: an end point of the wrist is extracted first; taking this end point as one vertex of a rectangle diagonal and each point on the hand contour as the other, axis-aligned inscribed rectangles are constructed, and the center of the inscribed rectangle with the shortest diagonal is taken as the wrist point.
The arm point is extracted as follows: all inscribed circles of the contour are computed first; among them, only circles whose diameter exceeds a set threshold (see the parameter training of the inscribed-circle diameter threshold) are kept, to prevent the circle from landing on a finger, and circles containing the palm-center point or the wrist point are discarded. The center of the remaining inscribed circle farthest from the palm-center point and the wrist point is the arm point.
Three classes of gestures are designed from the positions of the palm-center point, wrist point, and arm point and the angles between the lines through them; combined with virtual reality and augmented reality, the designed gestures are applied in virtual-reality and augmented-reality settings.
The beneficial effects of the present invention are: most existing methods for extracting the wrist and arm place auxiliary markers at the wrist or arm, for example a band whose color differs from the skin, so that the wrist can be told apart and extracted. Methods that extract the wrist point and arm point purely by vision are rare. In existing gesture-recognition research and applications, using the wrist point and arm point as tracking points to design interactions is also rare. The present method is simple, has a small computational cost, can extract the wrist point and arm point stably in real time, and uses the extracted wrist point and arm point to design gestures applied to the fields of virtual reality and augmented reality.
Brief description of the drawings
The accompanying drawings described here are provided for a further understanding of the present invention and form part of this application. The illustrative embodiments of the present invention and their descriptions are used to explain the invention and do not unduly limit it.
Fig. 1 is the flow chart of the accurate hand-region extraction of the present invention;
Fig. 2 is the input color image used when extracting the accurate hand region;
Fig. 3 is the depth map used when extracting the accurate hand region;
Fig. 4 is the coarse hand region obtained by segmenting the depth map when extracting the accurate hand region;
Fig. 5 is the color image corresponding to the coarse hand and arm region when extracting the accurate hand region;
Fig. 6 is the resulting accurate hand-region image;
Fig. 7 shows the result of the wrist end-point extraction of the present invention;
Fig. 8 is a schematic diagram of an inscribed rectangle in the wrist-point extraction of the present invention;
Fig. 9 is a schematic diagram of a non-inscribed rectangle in the wrist-point extraction of the present invention;
Fig. 10 shows the wrist points extracted for various gestures by the present invention;
Fig. 11 is a schematic diagram of a non-inscribed circle in the arm-point extraction of the present invention;
Fig. 12 is a schematic diagram of an inscribed circle in the arm-point extraction of the present invention;
Fig. 13 is a schematic diagram of the palm-center point lying inside an inscribed circle;
Fig. 14 is a schematic diagram of the wrist point lying inside an inscribed circle;
Fig. 15 is the parameter-training chart of the inscribed-circle diameter threshold;
Fig. 16 shows the arm points of various gestures obtained by computing inscribed circles;
Fig. 17 is a schematic diagram of the arm-stretched-flat gesture of the present invention;
Fig. 18 is a schematic diagram of the fist-clenched arm-lifting gesture of the present invention;
Fig. 19 is a schematic diagram of the application of the arm-stretched-flat and fist-clenched arm-lifting gestures;
Fig. 20 is a schematic diagram of the downward and upward wrist-bending gestures of the present invention;
Fig. 21 is a schematic diagram of the application of the wrist-bending gesture;
Fig. 22 is a schematic diagram of the downward and upward arm-waving gestures of the present invention;
Fig. 23 is a schematic diagram of the application of the arm-waving gesture.
Embodiment
The details of the present invention and its embodiments are further illustrated below in conjunction with the accompanying drawings.
Referring to Fig. 1 to Fig. 23, the method of the present invention for extracting the wrist point and arm point based on a depth camera works as follows. Using the depth map provided by a Kinect depth camera and the three-dimensional palm-center position provided by OpenNI/NITE, the hand region is first segmented; the resulting binary image is used as a mask to cut the hand region out of the color image, and an accurate hand region is extracted from that color region with a skin-color model and a Bayes classifier. Within the accurate hand region, the wrist point and arm point are extracted with the wrist-point and arm-point extraction methods, and various gestures are then designed from the wrist point, arm point, and palm-center point.
1. Segmentation of the hand region
According to the three-dimensional palm-center coordinate provided by OpenNI/NITE, the segmentation is first done in the two-dimensional image plane: taking the palm-center point as the center of a rectangle, a rectangle containing the hand and arm region is cut out. Let (x_c, y_c) be the palm-center coordinate in the image plane and L the side length of the rectangle.
Segmentation in the depth direction: from the depth map provided by Kinect, the depth value of the palm-center point is extracted to determine the hand and arm region. If a pixel (x, y) inside the rectangle has a depth value d(x, y) satisfying |d(x, y) - d_c| <= D, it is a point of the hand and arm region, i.e.:
    M(x, y) = 1 if |d(x, y) - d_c| <= D, and M(x, y) = 0 otherwise,
where M(x, y) marks whether pixel (x, y) belongs to the hand and arm region, d_c is the depth value of the tracked palm-center point, d(x, y) is the depth value at pixel (x, y), and D is the maximum depth range of the hand and arm, taken as 160 mm in the present invention.
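The rule above amounts to a rectangle crop around the palm point followed by a depth threshold. A minimal Python/NumPy sketch is given below, assuming a 16-bit depth map in millimetres, the tracked palm pixel (cx, cy), and the 160 mm range from the text; the function name and the default crop side length are illustrative, not taken from the patent.

import numpy as np

def segment_hand_arm(depth_mm: np.ndarray, cx: int, cy: int,
                     side: int = 200, max_range_mm: int = 160) -> np.ndarray:
    """Return a binary mask of the hand/arm region around the palm point."""
    h, w = depth_mm.shape
    dc = int(depth_mm[cy, cx])                      # depth of the tracked palm-center point

    # 2-D split: rectangle of side length `side` centred on the palm point.
    x0, x1 = max(cx - side // 2, 0), min(cx + side // 2, w)
    y0, y1 = max(cy - side // 2, 0), min(cy + side // 2, h)

    mask = np.zeros((h, w), dtype=np.uint8)
    roi = depth_mm[y0:y1, x0:x1]

    # Depth split: keep pixels whose depth differs from the palm depth
    # by at most max_range_mm (160 mm in the description above).
    mask[y0:y1, x0:x1] = (np.abs(roi.astype(np.int32) - dc) <= max_range_mm).astype(np.uint8) * 255
    return mask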
Extraction of the skin-color region: because of the characteristics of the Kinect data, the edges of the hand and arm region show noticeable jagging, which would affect the accuracy of the subsequent contour extraction. Therefore, after the hand and arm region is obtained, a Bayesian skin-color model is used to remove the non-skin areas and obtain an accurate hand and arm region.
The present invention builds the skin-color model in the YCbCr color space because of its discrete nature, its consistency with human visual perception, the separation of luminance and chrominance, and the compactness of the skin-color cluster. A Bayes classifier is then used to compute the skin-color probability of every (Cb, Cr) pair; pairs with probability greater than 0.6 are treated as skin, and a skin-color lookup table is built. To extract the skin-color region of a given image, the image is converted to the YCbCr color space and the (Cb, Cr) components of each pixel are looked up in the table to decide whether the pixel is skin.
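As an illustration of how such a lookup table can be built, the sketch below estimates P(skin | Cb, Cr) with Bayes' rule from two labelled BGR pixel sets and thresholds it at 0.6 as in the text. The training images, the 0.5 prior, and the helper names are assumptions of this sketch; only the YCbCr space and the 0.6 threshold come from the description.

import cv2
import numpy as np

def build_skin_lut(skin_bgr, nonskin_bgr, prior_skin=0.5, thresh=0.6):
    """Return a 256x256 boolean lookup table indexed by (Cb, Cr)."""
    def cbcr_hist(img):
        ycrcb = cv2.cvtColor(img, cv2.COLOR_BGR2YCrCb)          # channels: Y, Cr, Cb
        cr, cb = ycrcb[..., 1].ravel(), ycrcb[..., 2].ravel()
        hist, _, _ = np.histogram2d(cb, cr, bins=256, range=[[0, 256], [0, 256]])
        return hist / max(hist.sum(), 1)

    p_cbcr_skin = cbcr_hist(skin_bgr)                           # P(Cb,Cr | skin)
    p_cbcr_nonskin = cbcr_hist(nonskin_bgr)                     # P(Cb,Cr | non-skin)
    evidence = p_cbcr_skin * prior_skin + p_cbcr_nonskin * (1 - prior_skin)
    posterior = np.divide(p_cbcr_skin * prior_skin, evidence,
                          out=np.zeros_like(evidence), where=evidence > 0)
    return posterior > thresh                                   # skin iff P(skin|Cb,Cr) > 0.6

def apply_skin_lut(bgr, lut):
    """Mark skin pixels of a BGR image using the precomputed lookup table."""
    ycrcb = cv2.cvtColor(bgr, cv2.COLOR_BGR2YCrCb)
    cb, cr = ycrcb[..., 2], ycrcb[..., 1]
    return lut[cb, cr].astype(np.uint8) * 255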
2. Extraction of the wrist point
2.1. The contour is extracted from the binary image of the hand and arm, and the convex hull of the contour is then extracted. The convex hull consists of several straight line segments whose end points are the intersection points of the hull and the contour.
2.2. Extraction of the wrist end point
Among the line segments forming the convex hull, the longest segment is chosen and the contour section corresponding to it is obtained. The distance from each point on that contour section to the longest segment is computed, and the contour point P with the maximum distance is taken; P is an end point of the wrist.
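A hedged sketch of this step follows: take the convex hull of the hand contour, find its longest edge, and pick the contour point farthest from that edge. For simplicity the sketch measures every contour point against the longest hull edge rather than only the contour section it subtends; it assumes OpenCV 4.x, and function names other than the OpenCV calls are mine.

import cv2
import numpy as np

def wrist_endpoint(mask: np.ndarray):
    """Return the wrist end point P of the largest contour in a binary mask."""
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)
    contour = max(contours, key=cv2.contourArea).reshape(-1, 2).astype(np.float64)

    hull = cv2.convexHull(contour.astype(np.int32)).reshape(-1, 2).astype(np.float64)
    edges = list(zip(hull, np.roll(hull, -1, axis=0)))           # consecutive hull vertices
    a, b = max(edges, key=lambda e: np.linalg.norm(e[1] - e[0])) # longest hull edge

    # Distance from every contour point to the segment a-b (clamped projection).
    ab = b - a
    t = np.clip(((contour - a) @ ab) / (ab @ ab), 0.0, 1.0)
    proj = a + t[:, None] * ab
    dist = np.linalg.norm(contour - proj, axis=1)
    return tuple(contour[np.argmax(dist)].astype(int))           # wrist end point P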
2.3. Extraction of the wrist point
Taking the extracted wrist end point as one vertex and any point on the contour as the other vertex, the segment connecting the two vertices is treated as the diagonal of a rectangle and the rectangle is constructed. It is then checked whether the rectangle is inscribed in the contour, i.e., whether every point of the rectangle lies on or inside the contour. In this way all points on the contour are traversed; each, together with the extracted wrist end point, forms the two diagonal vertices of a rectangle, and the inscribed rectangles are collected.
Among all inscribed rectangles, the one with the shortest diagonal is selected; the center of this inscribed rectangle is the wrist point.
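A simplified sketch of the inscribed-rectangle search is shown below: each contour point is paired with the wrist end point P as the diagonal of an axis-aligned rectangle, the rectangle is kept only if its border stays on or inside the contour, and the kept rectangle with the shortest diagonal gives the wrist point at its center. The border sampling density and the helper names are assumptions of this sketch.

import cv2
import numpy as np

def wrist_point(contour: np.ndarray, p_end):
    """contour: Nx2 int array of contour points, p_end: wrist end point (x, y)."""
    poly = contour.reshape(-1, 1, 2).astype(np.int32)
    px, py = p_end
    best, best_diag = None, np.inf

    def border_inside(x0, y0, x1, y1, samples=16):
        # Sample the rectangle border and test every sample against the contour.
        xs, ys = np.linspace(x0, x1, samples), np.linspace(y0, y1, samples)
        border = [(x, y0) for x in xs] + [(x, y1) for x in xs] + \
                 [(x0, y) for y in ys] + [(x1, y) for y in ys]
        return all(cv2.pointPolygonTest(poly, (float(x), float(y)), False) >= 0
                   for x, y in border)

    for qx, qy in contour.reshape(-1, 2):
        x0, x1 = sorted((int(px), int(qx)))
        y0, y1 = sorted((int(py), int(qy)))
        diag = np.hypot(x1 - x0, y1 - y0)
        if 0 < diag < best_diag and border_inside(x0, y0, x1, y1):
            best_diag, best = diag, ((x0 + x1) // 2, (y0 + y1) // 2)
    return best                                   # center of the shortest inscribed rectangle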
3. Extraction of the arm point
The wrist is a narrow region, so the range of possible wrist points is very small. The arm, in contrast, is a fairly large region, so the arm point can be chosen more flexibly. The inscribed circles of the contour are computed; the diameter of an inscribed circle must be greater than a set threshold (to prevent the circle from landing on a finger), and the palm-center point and the wrist point must not lie inside the circle (to prevent the circle from covering the palm or the wrist). Among all such inscribed circles, the one farthest from the palm-center point and the wrist point is chosen, and its center is the arm point.
3.1. Computing the inscribed circles of the contour
Take a point on the contour as the base point and connect it by a line segment to another point on the contour; take the midpoint of the segment as the circle center and the segment length as the diameter to construct a circle. Then check the distance from every contour point to the center: if one or more contour points are closer to the center than the radius, the circle is not an inscribed circle.
Next, take the line from the base point to the next contour point, use its midpoint as the center and the distance between the two points as the diameter to construct a circle, and check again whether it is inscribed. When all contour points have been traversed for the current base point, the next contour point becomes the base point, circles are again constructed with every contour point, and the inscribed circles are collected, until every contour point has served as the base point.
3.2. Computing the arm point
The distance from the center of each inscribed circle to the palm-center point and to the wrist point is computed. If the distance from the palm-center point or the wrist point to the center is less than the radius, that point lies inside the circle and the inscribed circle is rejected. Otherwise the center, radius, and the distances from the center to the palm-center point and the wrist point are recorded. Among all qualifying inscribed circles, the center of the circle whose radius exceeds the threshold and which is farthest from the palm-center point and the wrist point is the arm point.
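The sketch below is a brute-force reading of sections 3.1 and 3.2, not an optimized or real-time implementation: candidate circles are built from pairs of contour points, kept only if fully inscribed, filtered by the diameter threshold, circles containing the palm or wrist point are discarded, and the center farthest from palm and wrist is returned. Using the summed distance as the "farthest" criterion is my interpretation.

import numpy as np

def arm_point(contour, palm, wrist, diameter_thresh):
    """contour: Nx2 array; palm, wrist: (x, y); diameter_thresh: pixels."""
    pts = contour.reshape(-1, 2).astype(np.float64)
    palm, wrist = np.asarray(palm, float), np.asarray(wrist, float)
    best, best_score = None, -1.0

    for i in range(len(pts)):
        for j in range(i + 1, len(pts)):
            center = (pts[i] + pts[j]) / 2.0
            radius = np.linalg.norm(pts[i] - pts[j]) / 2.0
            if 2 * radius <= diameter_thresh:
                continue                                  # too small: may sit on a finger
            # Inscribed: no contour point may fall strictly inside the circle.
            if np.any(np.linalg.norm(pts - center, axis=1) < radius - 1e-6):
                continue
            d_palm = np.linalg.norm(center - palm)
            d_wrist = np.linalg.norm(center - wrist)
            if d_palm < radius or d_wrist < radius:
                continue                                  # circle covers palm or wrist point
            score = d_palm + d_wrist                      # farthest from palm and wrist (summed)
            if score > best_score:
                best_score, best = score, tuple(center.astype(int))
    return best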
3.3. Parameter training of the inscribed-circle diameter threshold
The parameter-training procedure for the inscribed-circle diameter threshold of the present invention is as follows: within a selected depth range, one finger is stretched out at a time, kept straight and unbent, and moved slowly back and forth inside that depth range. During the movement the fingertip point is extracted with a contour-based method; a circle with the extracted fingertip as center and a set radius r = 20 is drawn, which intersects the finger at two points, and the distance between the two intersection points, i.e., the width of the finger, is computed. Within the same depth range the finger is changed every so often among the five fingers, and the test is repeated to obtain the widths of different fingers in that depth range. The present invention selects six depth ranges, outputs the finger-width values of each range, obtains the range of finger widths, and takes the median of the finger-width range plus 3 px as the threshold used in the arm-point extraction.
Depth value range      Finger width range   Chosen threshold (median + 3 px)
[1001 mm, 1100 mm]     [10 px, 13 px]       12 px + 3 px
[901 mm, 1000 mm]      [12 px, 16 px]       14 px + 3 px
[801 mm, 900 mm]       [15 px, 20 px]       17 px + 3 px
[701 mm, 800 mm]       [18 px, 23 px]       20 px + 3 px
[601 mm, 700 mm]       [23 px, 26 px]       24 px + 3 px
[501 mm, 600 mm]       [24 px, 28 px]       26 px + 3 px
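A small helper, purely illustrative, can map the palm depth to the trained diameter threshold and feed it to the arm-point search above. It assumes the "+ 3 px" margin described in the training procedure is added to the tabulated median width; the table itself comes from the description, the function and constant names are mine.

DEPTH_TO_THRESHOLD_PX = [
    ((501, 600), 26 + 3),
    ((601, 700), 24 + 3),
    ((701, 800), 20 + 3),
    ((801, 900), 17 + 3),
    ((901, 1000), 14 + 3),
    ((1001, 1100), 12 + 3),
]

def diameter_threshold(palm_depth_mm: float):
    """Return the inscribed-circle diameter threshold for the palm depth, if trained."""
    for (lo, hi), thresh in DEPTH_TO_THRESHOLD_PX:
        if lo <= palm_depth_mm <= hi:
            return thresh
    return None  # depth outside the six trained ranges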
4. Design of gestures combining the palm-center, wrist, and arm points
Let the palm-center point be H, the wrist point be W, and the arm point be A. Let the obtuse angle between line HW and line WA be α, and let the angle between line WA and the horizontal axis of the coordinate system be β. Three classes of gestures are designed according to α, β, and the positions of the palm-center point and the arm point.
4.1. Design of the arm-stretched-flat and arm-lifting gestures
When α and β lie within the thresholds set for this case, the gesture is recognized as the arm-stretched-flat gesture.
When α and β lie within the thresholds set for the other case, the gesture is recognized as the fist-clenched arm-lifting gesture.
4.2. Design of the wrist-bending gesture
When the obtuse angle α between line HW and line WA and the accompanying positional condition meet the thresholds set for the downward case, the gesture is the hand bending downward along the wrist; when they meet the thresholds set for the upward case, the gesture is the hand bending upward along the wrist.
4.3. Design of the arm-waving gesture
When β satisfies the threshold condition for the downward case, the arm is waved downward.
When β satisfies the threshold condition for the upward case, the arm is waved upward.
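All three gesture classes reduce to threshold tests on α and β (plus the relative point positions). The sketch below computes the two angles from the palm, wrist, and arm points; the single classification example and its numeric thresholds are placeholders of mine, since the exact threshold values of the patent are not reproduced above.

import numpy as np

def gesture_angles(H, W, A):
    """Return (alpha, beta) in degrees for palm point H, wrist point W, arm point A."""
    H, W, A = (np.asarray(p, dtype=float) for p in (H, W, A))
    wh, wa = H - W, A - W
    cos_t = np.dot(wh, wa) / (np.linalg.norm(wh) * np.linalg.norm(wa))
    theta = np.degrees(np.arccos(np.clip(cos_t, -1.0, 1.0)))   # angle between the two vectors
    alpha = max(theta, 180.0 - theta)                          # obtuse angle between lines HW and WA
    beta = np.degrees(np.arctan2(abs(wa[1]), abs(wa[0])))      # unsigned angle of line WA with the x-axis
    return alpha, beta

def is_arm_stretched_flat(H, W, A, alpha_min=160.0, beta_max=20.0):
    """One possible threshold test for the arm-stretched-flat class (placeholder values)."""
    alpha, beta = gesture_angles(H, W, A)
    return alpha >= alpha_min and beta <= beta_max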
Embodiment 1: Segmentation of the hand region
Referring to Fig. 1 and Fig. 2, the present invention segments the hand region using the depth map provided by the Kinect depth camera and the three-dimensional palm-center position provided by OpenNI/NITE. The lookup table built with the Bayesian skin-color model is then used to extract the accurate hand region.
Embodiment 2: Extraction of the wrist point
Referring to Fig. 3 to Fig. 5, the contour of the hand region is extracted first, then the convex hull of the contour. The contour section corresponding to the longest hull segment is found, the distance from each point on that section to the longest segment is computed, and the contour point with the maximum distance to the longest segment is an end point of the wrist.
With the wrist end point as one vertex and an arbitrary contour point as the other, the two points are taken as the two diagonal vertices of a rectangle and an axis-aligned rectangle is constructed. It is then checked whether every point of the rectangle lies on or inside the contour; if any point of the rectangle lies outside the contour, the rectangle is not inscribed. All contour points are traversed in this way, each forming, with the wrist end point, the two diagonal vertices of a rectangle, and the inscribed rectangles are collected. Among all inscribed rectangles, the one with the shortest diagonal is selected, and its center is the wrist point.
Embodiment 3: Extraction of the arm point
Referring to Fig. 6 to Fig. 8, take a point on the contour as the base point, connect it by a line segment to another contour point, take the midpoint of the segment as the circle center and the segment length as the diameter to construct a circle, and check the distance from every contour point to the center; if one or more contour points are closer to the center than the radius, the circle is not an inscribed circle.
Next, take the line from the base point to the next contour point, use its midpoint as the center and the distance between the two points as the diameter to construct a circle, and check again whether it is inscribed. When all contour points have been traversed for the current base point, the next contour point becomes the base point, circles are again constructed with every contour point, and the inscribed circles are collected, until every contour point has served as the base point.
The distance from the center of each inscribed circle to the palm-center point and to the wrist point is computed. If the distance from the palm-center point or the wrist point to the center is less than the radius, that point lies inside the circle and the inscribed circle is rejected. Otherwise the center, radius, and the distances from the center to the palm-center point and the wrist point are recorded. Among all qualifying inscribed circles, the center of the circle whose radius exceeds the threshold and which is farthest from the palm-center point and the wrist point is the arm point.
Embodiment 4: Design and application of gestures combining the palm-center, wrist, and arm points
Referring to Fig. 9 to Fig. 14, let the palm-center point be H, the wrist point be W, and the arm point be A; let the obtuse angle between line HW and line WA be α, and let the angle between line WA and the horizontal axis of the coordinate system be β. Three classes of gestures are designed according to α, β, and the positions of the palm-center point and the arm point.
Gesture recognition is combined with augmented reality: the designed arm-stretched-flat and fist-clenched arm-lifting gestures drive the stretching and lifting of the arm of a virtual character model. The gesture of the hand bending downward along the wrist represents, in augmented reality, the virtual hand model preparing to open a wine bottle, and the gesture of the hand bending upward along the wrist represents opening the bottle. The arm-waving gesture drives the hand in the virtual scene to play table tennis with the virtual character.
The above are only preferred embodiments of the present invention and are not intended to limit it; various modifications and variations are possible for those skilled in the art. Any modification, equivalent replacement, or improvement made to the present invention shall be included within its scope of protection.

Claims (1)

CN201510336118.6A | 2015-06-17 | 2015-06-17 | The extracting method of wrist point and arm point based on depth camera | Active | CN104899591B (en)

Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
CN201510336118.6A | 2015-06-17 | 2015-06-17 | The extracting method of wrist point and arm point based on depth camera

Applications Claiming Priority (1)

Application Number | Priority Date | Filing Date | Title
CN201510336118.6A | 2015-06-17 | 2015-06-17 | The extracting method of wrist point and arm point based on depth camera

Publications (2)

Publication Number | Publication Date
CN104899591A (en) | 2015-09-09
CN104899591B (en) | 2018-01-05

Family

ID=54032245

Family Applications (1)

Application Number | Title | Priority Date | Filing Date
CN201510336118.6A | The extracting method of wrist point and arm point based on depth camera | 2015-06-17 | 2015-06-17 | Active | CN104899591B (en)

Country Status (1)

Country | Link
CN (1) | CN104899591B (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
CN107358215B (en)* | 2017-07-20 | 2020-10-09 | 重庆工商大学 | An image processing method applied to an augmented reality system for jewelry
CN108563329B (en)* | 2018-03-23 | 2021-04-27 | 上海数迹智能科技有限公司 | Human body arm position parameter extraction algorithm based on depth map
CN108985242B (en)* | 2018-07-23 | 2020-07-14 | 中国联合网络通信集团有限公司 | Gesture image segmentation method and device
CN109344701B (en)* | 2018-08-23 | 2021-11-30 | 武汉嫦娥医学抗衰机器人股份有限公司 | Kinect-based dynamic gesture recognition method
CN111354029B (en)* | 2020-02-26 | 2023-05-05 | 深圳市瑞立视多媒体科技有限公司 | Gesture depth determination method, device, equipment and storage medium
CN112036284B (en)* | 2020-08-25 | 2024-04-19 | 腾讯科技(深圳)有限公司 | Image processing method, device, equipment and storage medium
CN112949542A (en)* | 2021-03-17 | 2021-06-11 | 哈尔滨理工大学 | Wrist division line determining method based on convex hull detection

Citations (2)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
CN103226387A (en)* | 2013-04-07 | 2013-07-31 | 华南理工大学 | Video fingertip positioning method based on Kinect
CN104063059A (en)* | 2014-07-13 | 2014-09-24 | 华东理工大学 | Real-time gesture recognition method based on finger division

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
CN102402680B (en)* | 2010-09-13 | 2014-07-30 | 株式会社理光 | Hand and indication point positioning method and gesture confirming method in man-machine interactive system

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
CN103226387A (en)* | 2013-04-07 | 2013-07-31 | 华南理工大学 | Video fingertip positioning method based on Kinect
CN104063059A (en)* | 2014-07-13 | 2014-09-24 | 华东理工大学 | Real-time gesture recognition method based on finger division

Also Published As

Publication number | Publication date
CN104899591A (en) | 2015-09-09

Similar Documents

Publication | Title
CN104899591B (en) | The extracting method of wrist point and arm point based on depth camera
CN105447466B (en) | A kind of identity integrated recognition method based on Kinect sensor
CN101807114B (en) | Natural interactive method based on three-dimensional gestures
CN103226387B (en) | Video fingertip localization method based on Kinect
KR101259662B1 (en) | A face classification method, a face classification apparatus, a classification map, a face classification program, and a recording medium on which the program is recorded
CN107123083A (en) | Face Editing Method
CN102567703B (en) | Hand motion identification information processing method based on classification characteristic
CN106874861A (en) | A kind of face antidote and system
CN102867313B (en) | Visual saliency detection method with fusion of region color and HoG (histogram of oriented gradient) features
CN106598227A (en) | Hand gesture identification method based on Leap Motion and Kinect
US7404774B1 (en) | Rule based body mechanics calculation
JP2005038375A (en) | Eye configuration classifying method and configuration classifying map and eye make-up method
CN108227922A (en) | Cosmetic method on a kind of real-time digital image of virtual reality
CN105068748A (en) | User interface interaction method in camera real-time picture of intelligent touch screen equipment
CN105303523A (en) | Image processing method and mobile terminal
CN107767335A (en) | A kind of image interfusion method and system based on face recognition features' point location
CN101833654B (en) | Sparse representation face identification method based on constrained sampling
CN101777195A (en) | Three-dimensional face model adjusting method
CN105046199A (en) | Finger tip point extraction method based on pixel classifier and ellipse fitting
CN105447480A (en) | Face recognition game interactive system
CN103745228B (en) | Dynamic gesture identification method on basis of Frechet distance
CN102496002A (en) | Facial beauty evaluation method based on images
CN104679242A (en) | Hand gesture segmentation method based on monocular vision complicated background
CN105107200A (en) | Face change system and method based on real-time deep somatosensory interaction and augmented reality technology
CN103500010A (en) | Method for locating fingertips of person through video

Legal Events

Code | Title

C06 | Publication
PB01 | Publication
C10 | Entry into substantive examination
SE01 | Entry into force of request for substantive examination
CB02 | Change of applicant information

Address after: 130012 No. 168 Boxue Road, Changchun High-tech Industrial Development Zone, Jilin Province

Applicant after: JILIN JIYUAN SPACE-TIME CARTOON GAME SCIENCE AND TECHNOLOGY GROUP CO., LTD.

Address before: No. 2888, Silicon Valley Avenue, Changchun High-tech Zone, Jilin Province

Applicant before: JILIN JIYUAN SPACE-TIME ANIMATION GAME TECHNOLOGY CO., LTD.

GR01 | Patent grant
CP01 | Change in the name or title of a patent holder

Address after: 130012 No. 168 Boxue Road, Changchun High-tech Industrial Development Zone, Jilin Province

Patentee after: Jilin Jidong Culture and Art Group Co., Ltd.

Address before: 130012 No. 168 Boxue Road, Changchun High-tech Industrial Development Zone, Jilin Province

Patentee before: JILIN JIYUAN SPACE-TIME CARTOON GAME SCIENCE AND TECHNOLOGY GROUP CO., LTD.

