CN106227351A - State transition method and apparatus for a vehicle-mounted device - Google Patents

State transition method and apparatus for a vehicle-mounted device

Info

Publication number
CN106227351A
Authority
CN
China
Prior art keywords
gesture
state
vehicle-mounted device
lookup table
judging
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201610611481.9A
Other languages
Chinese (zh)
Inventor
韩宇
何杰
席齐
黄健东
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Guangyun Network Technology Co., Ltd.
Original Assignee
Shenzhen Guangyun Network Technology Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Guangyun Network Technology Co., Ltd.
Priority to CN201610611481.9A
Publication of CN106227351A
Legal status: Pending (current)

Abstract

The present invention relates to a state transition method for a vehicle-mounted device, comprising the following steps: a camera continuously monitors whether a person in its recognition region makes a set gesture; if so, a lookup table stored in the system is searched to find the state parameter corresponding to the acquired gesture; otherwise, this step is repeated and the state of the vehicle-mounted device remains unchanged; after the state parameter corresponding to the gesture is obtained, the state parameter of the vehicle-mounted device is changed so that the device switches to the state corresponding to the acquired gesture. The invention further relates to an apparatus implementing the above method. Implementing the state transition method and apparatus for a vehicle-mounted device of the present invention has the beneficial effects that operation is safe and state transitions do not fail.

Description

State transition method and apparatus for a vehicle-mounted device
Technical field
The present invention relates to automatic control devices and, more particularly, to a state transition method and apparatus for a vehicle-mounted device.
Background technology
State transitions of a vehicle-mounted device are typically triggered by inputting specific signals through some input device. Early devices usually implemented state transitions through keys or buttons. With the development of technology, touch screens appeared, and on many vehicle-mounted devices the state of the device can now be changed through a touch screen. However, although these approaches can achieve state transitions of the vehicle-mounted device, because of the particular situation of a vehicle-mounted device the driver has to reach out to control the state of the device while driving, and his or her line of sight leaves the road at the same time, which introduces unsafe factors into driving. To remedy this shortcoming, the prior art uses voice control to operate the vehicle-mounted device; such operation requires neither the driver's line of sight to leave the road nor reaching out to operate the device. However, it is limited by the maturity of speech recognition and by the particular environment of a vehicle-mounted device (state transitions are usually performed while the vehicle is moving, when the environmental noise is loud and of uncertain type). Therefore, purely speech-based recognition in the prior art always suffers from transition anomalies, for example waking up the device when no wake-up is needed, or failing to wake up when a wake-up is needed.
Summary of the invention
The technical problem to be solved by the present invention is the unsafe operation and failing state transitions of the prior art described above, and the invention provides a state transition method and apparatus for a vehicle-mounted device that is safe to operate and whose state transitions do not fail.
The technical solution adopted by the present invention to solve the technical problem is to construct a state transition method for a vehicle-mounted device, the vehicle-mounted device including a camera, the method comprising the following steps:
A) the camera continuously monitors whether a person in its recognition region makes a set gesture; if so, the next step is performed; otherwise, this step is repeated and the state of the vehicle-mounted device remains unchanged;
B) a lookup table stored in the system is searched to find the state parameter corresponding to the acquired gesture;
C) the state parameter of the vehicle-mounted device is changed so that the device switches to the state corresponding to the acquired gesture.
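For illustration only, the following Python sketch shows how the A)-B)-C) loop could be wired together; the camera, recognizer, lookup table and device objects are placeholder assumptions, not part of the patent disclosure.

```python
import time

def monitor_and_switch(camera, recognizer, lookup_table, device, poll_interval=0.1):
    """Step A: watch for a set gesture; step B: look it up; step C: switch state."""
    while True:
        frames = camera.read_recent_frames()           # current frame plus a few earlier frames
        gesture_code = recognizer.recognize(frames)    # None if no set gesture was made
        if gesture_code is None:
            time.sleep(poll_interval)                  # repeat step A; device state unchanged
            continue
        state_params = lookup_table.get(gesture_code)  # step B: search the stored lookup table
        if state_params is not None:
            device.apply_state(state_params)           # step C: switch to the corresponding state
```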
Further, step A) further comprises the following steps:
A1) by comparing the current frame with several frames of image data before the current frame, it is judged whether the person in the recognition region makes a gesture; if so, the next step is performed; otherwise, this step is repeated;
A2) the image data judged to be the gesture part in the current frame is separated out, and it is judged whether it matches the set gesture content; if so, the code or number of this gesture is obtained and step B) is performed; otherwise, the currently acquired gesture content is considered unrecognizable and the method returns to step A1);
wherein the gesture content includes a gesture motion or a gesture shape; judging a gesture motion requires the current frame and several frames of image data before it, while judging a gesture shape only requires the image data of the current frame.
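A minimal sketch of the A1)/A2) split, assuming simple grayscale frame differencing to decide whether anything moved and treating the motion and shape classifiers as already available; the threshold values and classifier interfaces are illustrative assumptions.

```python
import cv2
import numpy as np

MOTION_THRESHOLD = 25      # assumed per-pixel difference threshold
MIN_CHANGED_PIXELS = 500   # assumed minimum changed area to treat the hand as moving

def classify_gesture(frames, motion_classifier, shape_classifier):
    """A1: compare the current frame with earlier frames; A2: judge motion or shape."""
    current = cv2.cvtColor(frames[-1], cv2.COLOR_BGR2GRAY)
    previous = cv2.cvtColor(frames[-2], cv2.COLOR_BGR2GRAY)
    diff = cv2.absdiff(current, previous)
    changed = int(np.count_nonzero(diff > MOTION_THRESHOLD))

    if changed > MIN_CHANGED_PIXELS:
        # Gesture motion: needs the current frame and several preceding frames.
        return motion_classifier.classify(frames)
    # Gesture shape: the current frame alone is sufficient.
    return shape_classifier.classify(frames[-1])
```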
Further, in step A1), whether the person in the recognition region makes a gesture is judged through the following steps:
A11) the hand position of the person is searched for in the recognition region;
A12) the found hand is tracked and the image data of the hand is obtained, and it is judged whether the hand of the person makes a gesture; if so, step A2) is performed; otherwise, the method returns to step A11).
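The patent does not specify how the hand is located or tracked; the sketch below assumes skin-color segmentation for step A11) and an OpenCV CSRT tracker for step A12), purely as one plausible realization.

```python
import cv2
import numpy as np

# Assumed HSV skin-color range; a real system would calibrate this per camera.
SKIN_LOW = np.array([0, 30, 60], dtype=np.uint8)
SKIN_HIGH = np.array([20, 150, 255], dtype=np.uint8)

def find_hand(frame):
    """A11: locate the hand as the largest skin-colored blob (illustrative heuristic)."""
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, SKIN_LOW, SKIN_HIGH)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    return cv2.boundingRect(max(contours, key=cv2.contourArea))  # (x, y, w, h)

def track_hand(read_frame, first_frame, hand_box):
    """A12: follow the hand across frames and yield its cropped image data."""
    tracker = cv2.TrackerCSRT_create()   # requires opencv-contrib-python
    tracker.init(first_frame, hand_box)
    while True:
        frame = read_frame()
        ok, (x, y, w, h) = tracker.update(frame)
        if not ok:
            return                       # hand lost: caller goes back to A11
        yield frame[int(y):int(y + h), int(x):int(x + w)]
```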
Further, in step A1), a Gaussian mixture model is used to judge whether a gesture occurs; in step A2), the previously saved gesture shapes and position results from several consecutive frames are compared with the currently recognized gesture shape, a hidden Markov model is used to compute the probability of the action sequence, and an action whose probability exceeds a confidence threshold is taken as the recognized gesture motion, the gesture motions including circling clockwise, circling counterclockwise, or sliding left and right; the gesture shape is judged using a random forest algorithm.
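The following sketch shows one way these three pieces could be combined, using OpenCV's MOG2 background subtractor as the Gaussian mixture model, hmmlearn for the hidden Markov models, and scikit-learn for the random forest; the feature extraction, thresholds and pre-trained models are assumptions for illustration.

```python
import cv2
import numpy as np
from hmmlearn import hmm
from sklearn.ensemble import RandomForestClassifier

CONFIDENCE_THRESHOLD = -50.0   # assumed log-likelihood threshold for accepting an action

# Gaussian-mixture foreground model used to judge whether a gesture occurs (step A1).
bg_subtractor = cv2.createBackgroundSubtractorMOG2(history=50, detectShadows=False)

def gesture_occurred(frame, min_foreground=500):
    """A gesture is assumed to be occurring when enough foreground pixels appear."""
    fg_mask = bg_subtractor.apply(frame)
    return int(np.count_nonzero(fg_mask)) > min_foreground

def make_motion_model(n_states=4):
    """Untrained per-gesture HMM template; training data is not shown here."""
    return hmm.GaussianHMM(n_components=n_states)

def recognize_motion(feature_sequence, hmm_models):
    """Score the observed shape/position sequence under one pre-trained HMM per motion."""
    best_code, best_score = None, -np.inf
    for code, model in hmm_models.items():          # e.g. {"circle_cw": <trained GaussianHMM>}
        score = model.score(np.asarray(feature_sequence))
        if score > best_score:
            best_code, best_score = code, score
    return best_code if best_score > CONFIDENCE_THRESHOLD else None

def recognize_shape(hand_features, forest: RandomForestClassifier):
    """Classify a single-frame feature vector of the hand with a trained random forest."""
    return forest.predict([hand_features])[0]
```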
Further, in step B), the number or code of the acquired gesture is compared with the codes or numbers in the lookup table stored in advance in the system, so as to judge whether the gesture is present in the lookup table and to obtain the state parameter corresponding to the gesture number or code.
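A sketch of the lookup table described in step B); the gesture codes and the names of the state parameters are hypothetical examples, not values taken from the patent.

```python
# Hypothetical lookup table: gesture number/code -> corresponding state parameters.
STATE_LOOKUP_TABLE = {
    "circle_cw":  {"locked": False, "mode": "voice_control"},   # unlock and accept voice input
    "circle_ccw": {"locked": False, "mode": "navigation"},      # unlock and enter navigation
    "swipe_lr":   {"locked": False, "mode": "incoming_call"},   # unlock and answer a call
}

def lookup_state(gesture_code):
    """Step B: judge whether the gesture exists in the table and return its state parameters."""
    return STATE_LOOKUP_TABLE.get(gesture_code)   # None means the gesture is not in the table
```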
Further, the state corresponding to the acquired gesture includes activating the device to enter an audio control mode or activating the device to enter an incoming-call answering mode.
Further, the vehicle-mounted device includes a display screen, and the display screen displays all the gestures recognizable in the current state together with the state the system will enter after each gesture is recognized.
The invention further relates to an apparatus implementing the above method, the vehicle-mounted device including a camera, the apparatus comprising:
a gesture recognition module, used for the camera to continuously monitor whether a person in its recognition region makes a set gesture; if so, the lookup table module is called; otherwise, this step is repeated and the state of the vehicle-mounted device remains unchanged;
a lookup table module, which searches the lookup table stored in the system and finds the state parameter corresponding to the acquired gesture;
a state change module, which changes the state parameter of the vehicle-mounted device so that the device switches to the state corresponding to the acquired gesture.
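As a rough structural sketch only, the three modules could be modelled as the classes below; the class and method names are assumptions, and the actual apparatus is not limited to this decomposition.

```python
class GestureRecognitionModule:
    """Monitors camera frames and returns the code of a set gesture, or None."""
    def __init__(self, camera, recognizer):
        self.camera, self.recognizer = camera, recognizer

    def poll(self):
        return self.recognizer.recognize(self.camera.read_recent_frames())


class LookupTableModule:
    """Searches the stored lookup table for the state parameter of a gesture."""
    def __init__(self, table):
        self.table = table

    def find(self, gesture_code):
        return self.table.get(gesture_code)


class StateChangeModule:
    """Changes the state parameters of the vehicle-mounted device."""
    def __init__(self, device):
        self.device = device

    def switch(self, state_params):
        self.device.apply_state(state_params)
```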
Further, the gesture recognition module further comprises:
a gesture search unit, used for judging, by comparing the current frame with several frames of image data before the current frame, whether the person in the recognition region makes a gesture; if so, the gesture identification unit is called;
a gesture identification unit, used for separating out the image data judged to be the gesture part in the current frame and judging whether it matches the set gesture shape; if so, the code or number of this gesture is obtained and the lookup table module is executed; otherwise, the gesture in the current frame is considered unrecognizable and control returns to the gesture search unit;
wherein the gesture identification unit also compares the previously saved gesture shapes and position results from several consecutive frames with the currently recognized gesture shape, computes the probability of the action sequence through a hidden Markov model, and takes an action whose probability exceeds a confidence threshold as the recognized gesture motion, including circling clockwise, circling counterclockwise, or sliding left and right.
Further, in the lookup table module, the number or code of the acquired gesture is compared with the codes or numbers in the lookup table stored in advance in the system, so as to judge whether the gesture is present in the lookup table and to obtain the state parameter corresponding to the gesture number or code;
the apparatus further comprises a gesture display module, which displays on the display screen of the vehicle-mounted device all the gestures recognizable in the current state and the state corresponding to each gesture.
Implementing the state transition method and apparatus for a vehicle-mounted device of the present invention has the following beneficial effects: since the driver's line of sight does not need to leave the road while a gesture is being made, driving safety is guaranteed to a certain extent; at the same time, gesture recognition is unaffected by environmental noise, so even in a harsh environment the state of the vehicle-mounted device can still be changed, which is particularly important for waking up or activating the device. After a vehicle-mounted device has been woken up or activated, gestures can continue to be used to switch the device into states that do not require much information to be input, while voice control can then be used to bring the vehicle-mounted device into states that require more information to be input, such as navigation. In short, the operation is safe and the state transitions do not fail.
Accompanying drawing explanation
Fig. 1 is a flowchart of the state transition method in an embodiment of the state transition method and apparatus for a vehicle-mounted device of the present invention;
Fig. 2 is a detailed flowchart of gesture recognition in the embodiment;
Fig. 3 is a schematic diagram of the display screen showing gestures and their corresponding states in the embodiment;
Fig. 4 is a structural diagram of the apparatus in the embodiment.
Detailed description of the invention
The embodiments of the present invention are further described below with reference to the accompanying drawings.
As shown in Fig. 1, in an embodiment of the state transition method and apparatus for a vehicle-mounted device of the present invention, the vehicle-mounted device includes a camera for obtaining images of a set region inside the vehicle, and the state transition method of the vehicle-mounted device comprises the following steps:
Step S11: judge whether a person in the recognition region makes a set gesture; if so, perform the next step; otherwise, repeat this step. In this embodiment, the vehicle-mounted device has a camera used to monitor a set region and obtain its images; for example, the camera of the vehicle-mounted device may be arranged to monitor the driver or the driver's seat. After the camera has obtained image data of this region for a certain time (for example, the duration of several image frames), it can be judged whether a person in this region makes a gesture and whether the gesture made is a set gesture; if the gesture is the set gesture or one of the set gestures, the next step is performed; otherwise, this step is repeated. It is worth noting that while this step is repeated the state of the vehicle-mounted device does not change; for example, if the vehicle-mounted device is locked or dormant when this step is performed, it remains locked or dormant while the step is repeated. The specific steps of gesture recognition in this step are described in more detail later.
Step S12: obtain the state parameter corresponding to the gesture. In this step, since the previous step has already determined that a gesture exists and that it is the set gesture or one of the set gestures, all that remains is to obtain the state parameter corresponding to the gesture. Specifically, the state parameter determines the state of the vehicle-mounted device: where the device has multiple possible states, a state parameter is a group of parameters, and when the vehicle-mounted device is configured according to this group of parameters its state is the designated one. For example, a vehicle-mounted device may have a locked state, and after being unlocked it may enter a music state, a navigation state and so on; the state parameter for entering the music state after unlocking is different from the state parameter for entering the navigation state after unlocking. In this embodiment, each set gesture has a number or code; when a gesture is recognized, the code of that set gesture is obtained. The system stores a list of the gesture codes or numbers and their corresponding state parameters, so once the number or code of the gesture has been obtained, the corresponding state parameter can be found by looking up the table. In other words, in this embodiment, the number or code of the acquired gesture is compared with the codes or numbers in the lookup table stored in advance in the system, so as to judge whether the gesture is present in the lookup table and to obtain the state parameter corresponding to the gesture number or code.
Step S13: change the state parameter of the vehicle-mounted device to the state parameter corresponding to the gesture. In this step, the state parameter of the vehicle-mounted device is changed so that the device switches to the state corresponding to the acquired gesture; that is, the state of the vehicle-mounted device changes from its original state to the state represented by the gesture. For example, the original locked or dormant state changes to the unlocked, voice-input-accepting state represented by the gesture; or the original locked or dormant state changes to the unlocked navigation state (also accepting voice input) represented by another gesture; or the original locked or dormant state changes to the unlocked music state or the incoming-call answering state represented by yet another gesture.
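A minimal sketch of step S13, modelling the device state as a plain parameter dictionary; the parameter names and the locked/navigation example are illustrative assumptions.

```python
class VehicleDevice:
    """Illustrative device model holding the current state parameters."""
    def __init__(self):
        self.state = {"locked": True, "mode": "dormant"}   # assumed initial state

    def apply_state(self, state_params):
        # Step S13: replace the current parameters with those of the recognized gesture.
        self.state = dict(state_params)


device = VehicleDevice()
device.apply_state({"locked": False, "mode": "navigation"})   # e.g. after an unlock gesture
print(device.state)    # {'locked': False, 'mode': 'navigation'}
```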
In this embodiment, as shown in Fig. 2, the gesture recognition process may include the following steps:
Step S21: obtain the image data of the current frame and of several preceding frames. In this step, as the basis of gesture recognition, the image data captured by the camera is acquired first. In this embodiment, at least the current frame and several preceding frames of image data need to be obtained, for example the data of the 3-5 image frames preceding the current frame.
Step S22: compare the above image data and judge whether a gesture exists; if so, perform the next step; if not, return to step S21. For the judgment of whether a gesture exists in this step, in this embodiment the hand position of the person is first searched for in the recognition region, that is, the hand position of the person is determined in the image data; the hand is then tracked to obtain the image data of the hand, and from the changes of this image data across different frames it is judged whether the hand of the person makes a gesture; if so, the next step is performed; otherwise, the method returns to step S21.
Step S23: judge whether the gesture is a set gesture; if so, perform the next step; otherwise, return to step S21. In this embodiment, the states a gesture can represent are necessarily limited; that is, a given gesture is defined to represent one determined state, and a state can only be represented by one gesture. In other words, a gesture must be predefined to be effective. If a gesture has not been defined in advance, it can only be an illegal or unrecognizable gesture: even if it is made and captured, it will not affect the state of the vehicle-mounted device. Since different gestures naturally produce different image data, this step compares the acquired image data of the gesture with the previously stored image data of the defined gestures; if they match, the gesture is judged to be a preset, recognizable gesture; if they do not match, the gesture is considered unrecognizable and the method returns to step S21. It is worth noting that, in this embodiment, gestures are divided by their content into at least gesture motions and gesture shapes: a gesture motion is a continuous movement the operator makes with the hand or fingers, while a gesture shape is the shape the operator forms with the hand or fingers. Accordingly, judging a gesture motion requires the current frame and several frames of image data before it, while judging a gesture shape only requires the image data of the current frame.
Step S24: obtain the number or code of the set gesture. In this step, the number or code of the gesture that matches the currently acquired image data is obtained; this number or code is used in the subsequent steps to look up the state parameter corresponding to the gesture.
It is worth noting that, in this embodiment, a Gaussian mixture model is used to judge whether a gesture occurs; the previously saved gesture shapes and position results from several consecutive frames are compared with the currently recognized gesture shape, a hidden Markov model is used to compute the probability of the action sequence, and an action whose probability exceeds a confidence threshold is taken as the recognized gesture motion, the gesture motions including circling clockwise, circling counterclockwise, or sliding left and right; when judging a gesture shape, a random forest algorithm is used.
In addition, in this embodiment, the vehicle-mounted device further includes a display screen; therefore, all the gestures recognizable in the current state, together with the state the system will enter after each gesture is recognized, can be shown on the display screen of the vehicle-mounted device, as shown in Fig. 3. The benefit of this arrangement is that it reminds the operator to make the correct gesture choice, shortens the operation time, and improves driving safety.
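The patent does not describe how these hints are rendered; the helper below is a hypothetical sketch that builds the on-screen text for the gestures usable in the current state.

```python
# Hypothetical hint texts; the gesture codes and target states are assumed examples.
GESTURE_HINTS = {
    "circle_cw":  "Circle clockwise        -> voice control mode",
    "circle_ccw": "Circle counterclockwise -> navigation mode",
    "swipe_lr":   "Swipe left/right        -> answer incoming call",
}

def render_gesture_help(current_state, recognizable_codes):
    """Build the text shown on the display for the current state (Fig. 3)."""
    lines = [f"Current state: {current_state}"]
    lines += [GESTURE_HINTS[code] for code in recognizable_codes if code in GESTURE_HINTS]
    return "\n".join(lines)

print(render_gesture_help("locked", ["circle_cw", "circle_ccw"]))
```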
The invention further relates to an apparatus implementing the above method. The apparatus includes a gesture recognition module 1, a lookup table module 2, a state change module 3 and a gesture display module 4. The gesture recognition module 1 is used for the camera to continuously monitor whether a person in its recognition region makes a set gesture; if so, the lookup table module is called; otherwise, this step is repeated and the state of the vehicle-mounted device remains unchanged. The lookup table module 2 searches the lookup table stored in the system and finds the state parameter corresponding to the acquired gesture. The state change module 3 changes the state parameter of the vehicle-mounted device so that the device switches to the state corresponding to the acquired gesture. The gesture display module 4 is used to display, on the display screen of the vehicle-mounted device, all the gestures recognizable in the current state and the state corresponding to each gesture.
Further, the gesture recognition module further includes a gesture search unit 11 and a gesture identification unit 12. The gesture search unit 11 is used for judging, by comparing the current frame with several frames of image data before the current frame, whether the person in the recognition region makes a gesture and, if so, calling the gesture identification unit 12. The gesture identification unit 12 is used for separating out the image data judged to be the gesture part in the current frame and judging whether it matches the set gesture shape; if so, the code or number of the gesture is obtained and the lookup table module is executed; otherwise the gesture in the current frame is considered unrecognizable and control returns to the gesture search unit. The gesture identification unit 12 also compares the previously saved gesture shapes and position results from several consecutive frames with the currently recognized gesture shape, computes the probability of the action sequence through a hidden Markov model, and takes an action whose probability exceeds a confidence threshold as the recognized gesture motion, including circling clockwise, circling counterclockwise, or sliding left and right.
Meanwhile, in the lookup table module 2, the number or code of the acquired gesture is compared with the codes or numbers in the lookup table stored in advance in the system, so as to judge whether the gesture is present in the lookup table and to obtain the state parameter corresponding to the gesture number or code.
The embodiments described above express only several implementations of the present invention, and their description is relatively specific and detailed, but they should not therefore be construed as limiting the scope of the claims of the present invention. It should be pointed out that a person of ordinary skill in the art can also make several variations and improvements without departing from the concept of the present invention, and these all fall within the protection scope of the present invention. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (10)

CN201610611481.9A | 2016-07-30 (priority) | 2016-07-30 (filed) | State transition method and apparatus for a vehicle-mounted device | Pending | CN106227351A (en)

Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
CN201610611481.9A | 2016-07-30 | 2016-07-30 | State transition method and apparatus for a vehicle-mounted device (CN106227351A, en)

Applications Claiming Priority (1)

Application Number | Priority Date | Filing Date | Title
CN201610611481.9A | 2016-07-30 | 2016-07-30 | State transition method and apparatus for a vehicle-mounted device (CN106227351A, en)

Publications (1)

Publication Number | Publication Date
CN106227351A (en) | 2016-12-14

Family

ID=57536478

Family Applications (1)

Application Number | Title | Priority Date | Filing Date
CN201610611481.9A | State transition method and apparatus for a vehicle-mounted device (CN106227351A, en, Pending) | 2016-07-30 | 2016-07-30

Country Status (1)

Country | Link
CN (1) | CN106227351A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
CN108399009A (en)* | 2018-02-11 | 2018-08-14 | 易视腾科技股份有限公司 | Method and device for waking up a smart device using human-computer interaction gestures
CN112684886A (en)* | 2020-12-28 | 2021-04-20 | 上海宽创国际文化科技股份有限公司 | Motion control method and system of a multi-dimensional art interaction device
CN117037237A (en)* | 2023-06-29 | 2023-11-10 | 廊坊市珍圭谷科技有限公司 | Human-machine interaction method and system for an intelligent robot

Citations (5)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
CN101349944A (en)* | 2008-09-03 | 2009-01-21 | 宏碁股份有限公司 | Gesture guiding system and method for controlling a computer system by touch gestures
CN101763515A (en)* | 2009-09-23 | 2010-06-30 | 中国科学院自动化研究所 | Real-time gesture interaction method based on computer vision
CN103576848A (en)* | 2012-08-09 | 2014-02-12 | 腾讯科技(深圳)有限公司 | Gesture operation method and gesture operation device
US20140128144A1 (en)* | 2012-11-08 | 2014-05-08 | Audible, Inc. | In-vehicle gaming system for passengers
CN104216514A (en)* | 2014-07-08 | 2014-12-17 | 深圳市华宝电子科技有限公司 | Method and device for controlling a vehicle-mounted device, and vehicle


Similar Documents

Publication | Publication Date | Title
CN108349389B (en) | Commanding apparatus and method for receiving a character string input by a user in a motor vehicle
EP2969697B1 (en) | System and method for identifying handwriting gestures in an in-vehicle information system
CN109552340B (en) | Gesture and expression control for vehicles
CN111653277A (en) | Vehicle voice control method, device, equipment, vehicle and storage medium
CN106934333B (en) | Gesture recognition method and system
WO2013101047A1 (en) | Systems, methods, and apparatus for in-vehicle fiducial mark tracking and interpretation
KR20150076627A (en) | System and method for learning driving information in vehicle
CN103823632A (en) | Screen unlocking method and terminal thereof
CN109614953A (en) | Control method based on image recognition, vehicle-mounted device and storage medium
CN106227351A (en) | State transition method and apparatus for a vehicle-mounted device
CN108509049B (en) | Method and system for inputting gesture function
CN113548061B (en) | Man-machine interaction method and device, electronic equipment and storage medium
CN103186230A (en) | Man-machine interaction method based on color identification and tracking
JP2016200910A (en) | Driver state determination device
CN106650660A (en) | Vehicle type recognition method and terminal
JP2017510875A (en) | Gesture device, operation method thereof, and vehicle equipped with the same
US10152223B2 (en) | System-initiated help function for operating an apparatus associated with a vehicle - input of spaces
CN105117096A (en) | Image identification based anti-tracking method and apparatus
CN106020479A (en) | Method and apparatus for operating the center console of an intelligent vehicle at a distance by using orientation gestures
JP2023020995A (en) | Autonomous vehicle detection method, device, electronic device and readable storage medium
CN106126055A (en) | Operation interface display control method and device
CN105825148A (en) | Method and device for controlling a mobile terminal application while moving
US9373026B2 (en) | Method and system for recognizing hand gesture using selective illumination
WO2022224332A1 (en) | Information processing device, vehicle control system, information processing method, and non-transitory computer-readable medium
WO2021019876A1 (en) | Information processing device, driver specifying device, and learning model

Legal Events

Date | Code | Title | Description
C06 | Publication
PB01 | Publication
SE01 | Entry into force of request for substantive examination
RJ01 | Rejection of invention patent application after publication | Application publication date: 2016-12-14

