CN108245099A - Robot moving method and device - Google Patents

Robot moving method and device

Info

Publication number
CN108245099A
Authority
CN
China
Prior art keywords
target
camera
robot
detecting
module
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201810036671.1A
Other languages
Chinese (zh)
Inventor
王声平
周毕兴
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Infinite Power Development Co., Ltd.
Original Assignee
Shenzhen Water World Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Water World Co Ltd
Priority to CN201810036671.1A
Priority to PCT/CN2018/077604
Publication of CN108245099A
Legal status: Pending (current)


Abstract

The present invention discloses a robot moving method and device. The method comprises the following steps: carrying out sound source localization through a voice collection device to determine the direction of a target; turning a camera toward the target and detecting the target through the camera; when the target is detected, carrying out visual positioning through the camera to determine the position of the target; and moving to the position of the target. In the robot moving method provided by the embodiments of the present invention, sound source localization determines the direction of the target so that the camera can be turned toward it, and the camera then carries out visual positioning to determine the target's precise location. This enables the robot to move quickly and accurately to a position in front of the user (the target), solving the technical problem that a robot cannot move accurately to a position in front of the user. The orientation and distance relations between the robot and the user are precisely adjusted, ensuring that the robot can better receive the user's voice instructions and greatly improving the user experience.

Description

Robot moving method and device
Technical field
The present invention relates to the field of robot technology, and in particular to a robot moving method and device.
Background technology
With the development of science and technology, more and more smart home appliances have entered the household, substantially improving the comfort and convenience of people's lives. Among them, the sweeping robot is one of the smart home appliances most favored by users. A sweeping robot, also known as an automatic sweeper, intelligent vacuum cleaner, or robot cleaner, relies on a degree of artificial intelligence to complete cleaning work in a room automatically and autonomously.
Current sweeping robots also incorporate a speech recognition function, through which the sweeping robot can receive the user's voice instructions and perform corresponding actions according to those instructions. When the user is far from the sweeping robot, the robot determines the user's position through sound source localization and then moves to that position to receive the user's voice instructions.
However, the positioning accuracy of sound source localization is not high; it can only determine an approximate position. As a result, the sweeping robot cannot move accurately to a position in front of the user in a timely manner, and situations often occur in which the robot does not face the user or remains too far away. The robot then cannot accurately receive the user's voice instructions, which degrades the user experience.
Invention content
The main object of the present invention is to provide a robot moving method and device, intended to solve the technical problem that a robot cannot move accurately to a position in front of the user.
To achieve these objectives, an embodiment of the present invention proposes a robot moving method, comprising the following steps:
Carrying out sound source localization through a voice collection device to determine the direction of a target;
Turning a camera toward the target, and detecting the target through the camera;
When the target is detected, carrying out visual positioning through the camera to determine the position of the target;
Moving to the position of the target.
Optionally, the camera is a monocular camera, and the step of carrying out visual positioning through the camera to determine the position of the target includes:
Acquiring an image of the target through the monocular camera;
Moving a certain distance toward the direction of the target, and acquiring an image of the target again through the monocular camera;
Determining the position of the target according to the two images of the target acquired before and after the movement.
Optionally, the camera is a binocular camera, and the step of carrying out visual positioning through the camera to determine the position of the target includes:
Acquiring images of the target through the binocular camera;
Determining the position of the target according to the images of the target.
Optionally, the step of detecting the target through the camera includes:
When the target is not detected, avoiding the obstacle blocking the target until the target is detected.
Optionally, the step of avoiding the obstacle blocking the target until the target is detected includes:
Moving toward the direction of the target;
When an obstacle is encountered, moving along the boundary of the obstacle until the target is detected.
Optionally, the step of avoiding the obstacle blocking the target until the target is detected includes: moving laterally relative to the direction of the target until the target is detected.
Optionally, the step of detecting the target through the camera includes:
Carrying out face recognition detection through the camera;
When a face is detected, determining that the target is detected.
Optionally, the step of moving to the position of the target includes:
Planning a moving path with the current position as the starting point and the position of the target as the end point;
Moving along the moving path to the end point.
Optionally, the method further includes: when the position of the target changes, planning the moving path again.
Optionally, the robot is a sweeping robot.
An embodiment of the present invention simultaneously proposes a robot moving device, comprising:
A direction determining module for carrying out sound source localization through a voice collection device to determine the direction of a target;
A target detection module for turning a camera toward the target and detecting the target through the camera;
A position determining module for, when the target is detected, carrying out visual positioning through the camera to determine the position of the target;
A movement control module for controlling the robot to move to the position of the target.
Optionally, the camera is a monocular camera, and the position determining module includes:
A first acquisition unit for acquiring an image of the target through the monocular camera;
A second acquisition unit for controlling the robot to move a certain distance toward the direction of the target and acquiring an image of the target again through the monocular camera;
A first determining unit for determining the position of the target according to the two images of the target acquired before and after the movement.
Optionally, the camera is a binocular camera, and the position determining module includes:
A third acquisition unit for acquiring images of the target through the binocular camera;
A second determining unit for determining the position of the target according to the images of the target.
Optionally, the device further includes an obstacle avoidance module, which is used for:
When the target is not detected, avoiding the obstacle blocking the target until the target detection module detects the target.
Optionally, the obstacle avoidance module includes:
A first moving unit for controlling the robot to move toward the direction of the target;
A second moving unit for, when an obstacle is encountered, controlling the robot to move along the boundary of the obstacle until the target detection module detects the target.
Optionally, the obstacle avoidance module includes a third moving unit, which is used for: controlling the robot to move laterally relative to the direction of the target until the target detection module detects the target.
Optionally, the target detection module includes:
A face recognition unit for carrying out face recognition detection through the camera;
A target determining unit for, when a face is detected, determining that the target is detected.
Optionally, the movement control module includes:
A path planning unit for planning a moving path with the current position as the starting point and the position of the target as the end point;
A movement control unit for controlling the robot to move along the moving path to the end point.
Optionally, the movement control module further includes a path update unit, which is used for: when the position of the target changes, planning the moving path again.
An embodiment of the present invention also proposes a sweeping robot, including a memory, a processor, and at least one application program stored in the memory and configured to be executed by the processor, the application program being configured to perform the aforementioned robot moving method.
In the robot moving method provided by the embodiments of the present invention, sound source localization determines the direction of the target so that the camera can be turned toward it, and the camera then carries out visual positioning to determine the target's precise location. This enables the robot to move quickly and accurately to a position in front of the user (the target), solving the technical problem that a robot cannot move accurately to a position in front of the user. The orientation and distance relations between the robot and the user are precisely adjusted, ensuring that the robot can better receive the user's voice instructions and greatly improving the user experience.
Description of the drawings
Fig. 1 is a flow chart of the first embodiment of the robot moving method of the present invention;
Fig. 2 is a structural diagram of the robot in the embodiment of the present invention;
Fig. 3 is a schematic diagram of the camera of the robot facing the target in the embodiment of the present invention;
Fig. 4 is a flow chart of the second embodiment of the robot moving method of the present invention;
Fig. 5 is a schematic diagram of the robot avoiding an obstacle blocking the target in the embodiment of the present invention;
Fig. 6 is another schematic diagram of the robot avoiding an obstacle blocking the target in the embodiment of the present invention;
Fig. 7 is a module diagram of the first embodiment of the robot moving device of the present invention;
Fig. 8 is a module diagram of the target detection module in Fig. 7;
Fig. 9 is a module diagram of the position determining module in Fig. 7;
Fig. 10 is another module diagram of the position determining module in Fig. 7;
Fig. 11 is a module diagram of the movement control module in Fig. 7;
Fig. 12 is another module diagram of the movement control module in Fig. 7;
Fig. 13 is a module diagram of the second embodiment of the robot moving device of the present invention;
Fig. 14 is a module diagram of the obstacle avoidance module in Fig. 13.
The realization of the objects, functional features, and advantages of the present invention will be further described with reference to the accompanying drawings in conjunction with the embodiments.
Specific embodiment
It should be appreciated that the specific embodiments described herein are merely illustrative of the present invention and are not intended to limit the present invention.
The embodiments of the present invention are described in detail below, examples of which are shown in the accompanying drawings, wherein the same or similar reference numerals throughout denote the same or similar elements or elements with the same or similar functions. The embodiments described below with reference to the drawings are exemplary, are only used to explain the present invention, and are not to be construed as limiting the claims.
Those skilled in the art will appreciate that, unless expressly stated otherwise, the singular forms "a", "an", "the", and "said" used herein may also include plural forms. It should be further understood that the word "comprising" used in the specification of the present invention refers to the presence of the stated features, integers, steps, operations, elements, and/or components, but does not exclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. It should be understood that when an element is referred to as being "connected" or "coupled" to another element, it can be directly connected or coupled to the other element, or intervening elements may also be present. In addition, "connection" or "coupling" used herein may include wireless connection or wireless coupling. The phrase "and/or" used herein includes all or any unit and all combinations of one or more of the associated listed items.
Those skilled in the art will appreciate that, unless otherwise defined, all terms used herein (including technical and scientific terms) have the same meaning as commonly understood by one of ordinary skill in the art to which the present invention belongs. It should also be understood that terms such as those defined in general dictionaries should be understood to have meanings consistent with their meanings in the context of the prior art and, unless specifically defined as herein, will not be interpreted in an idealized or overly formal sense.
The robot moving method and device of the embodiments of the present invention can be applied to various robots and are particularly suitable for sweeping robots. The following describes the invention in detail by taking a sweeping robot as an example.
With reference to Fig. 1, the first embodiment of the robot moving method of the present invention is proposed. The method comprises the following steps:
S11. Sound source localization is carried out through a voice collection device to determine the direction of a target.
The voice collection device in the embodiment of the present invention is preferably a microphone array. As shown in Fig. 2, a microphone array composed of four microphones 101 and a camera 102 are provided on the sweeping robot 100. The sweeping robot collects the sound emitted by the sound source through the microphone array and carries out sound source localization using sound source localization technology, thereby determining the direction of the sound source, that is, the target. Sound source localization technology is relatively mature prior art and is not described in detail herein.
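The patent treats sound source localization as mature prior art and names no algorithm. As an illustrative sketch only, the direction of arrival for a single microphone pair can be recovered from the time difference of arrival (TDOA) under a far-field plane-wave assumption; the function name, the 343 m/s speed of sound, and the broadside angle convention are all assumptions of this sketch, not details from the patent:

```python
import math

SPEED_OF_SOUND = 343.0  # m/s, assumed room temperature

def doa_from_pair(tdoa_s: float, mic_spacing_m: float) -> float:
    """Angle of arrival (degrees, 0 = broadside) for one microphone pair,
    using the far-field relation: tdoa = spacing * sin(angle) / c."""
    ratio = SPEED_OF_SOUND * tdoa_s / mic_spacing_m
    ratio = max(-1.0, min(1.0, ratio))  # clamp floating-point noise
    return math.degrees(math.asin(ratio))

# A source exactly broadside to the pair arrives at both mics simultaneously:
print(doa_from_pair(0.0, 0.1))  # → 0.0
```

A real four-microphone array such as the one in Fig. 2 would combine several pairwise estimates (or use a cross-correlation method such as GCC-PHAT) to resolve a full 360-degree azimuth.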
The target mainly refers to a person (the user), but it can of course also be another object capable of emitting sound, which is not limited by the present invention.
S12. The camera is turned toward the target, and the target is detected through the camera. It is judged whether the target is detected; when the target is detected, the method proceeds to step S13; when the target is not detected, the flow ends.
After the direction of the target is determined, the sweeping robot turns the camera toward the target, for example by rotating the camera to aim it at the target or by rotating the entire robot to aim at the target, and starts the camera to detect the target. As shown in Fig. 3, the camera 102 of the sweeping robot 100 is aimed at the target 200 so that the target 200 is within the field of view of the camera 102 (the region between the two lines extending from the camera 102).
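Aiming the camera at the bearing obtained from sound source localization reduces to computing a shortest-arc rotation. The sketch below is illustrative only; the function name and the degree conventions are this example's assumptions, not part of the patent:

```python
def turn_command(current_heading_deg: float, target_bearing_deg: float) -> float:
    """Signed rotation in degrees (positive = counter-clockwise) that turns
    the robot or its camera toward the target by the shortest arc."""
    return (target_bearing_deg - current_heading_deg + 180.0) % 360.0 - 180.0

# Crossing the 0/360 boundary still yields the short way around:
print(turn_command(350.0, 10.0))  # → 20.0
```

The same command works whether the rotation is applied to the camera mount alone or to the whole robot, as both options are described above.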
When the target is a person, the sweeping robot can carry out face recognition detection through the camera. When a face is detected, it is determined that the target is detected; otherwise, it is determined that the target is not detected. Face recognition technology is relatively mature prior art and is not described in detail herein.
In addition to using face recognition technology to identify a person, target detection can also be achieved by recognizing other visual features of the human body, which the present invention does not enumerate herein.
When the target is another sound-emitting object, target detection can be achieved by recognizing the visual features of that object.
S13. Visual positioning is carried out through the camera to determine the position of the target.
When the target is detected, the sweeping robot carries out visual positioning of the target through the camera using visual positioning technology to determine the position of the target.
Optionally, when the camera is a monocular camera, the sweeping robot can first acquire an image of the target in place through the monocular camera, then move a certain distance toward the direction of the target and acquire an image of the target again through the monocular camera, and finally calculate the three-dimensional coordinates of the target by analyzing the two images acquired before and after the movement, thereby determining the position of the target. The specific method of calculating the three-dimensional coordinates of the target from images acquired at two positions is the same as the prior art and is not repeated here.
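The two-position monocular calculation referred to above is essentially triangulation from two bearing measurements separated by a known baseline. A minimal planar sketch under assumptions of this example (the robot advances along its own x-axis and measures the target bearing, relative to the direction of travel, before and after the move):

```python
import math

def triangulate(bearing1_deg: float, bearing2_deg: float, baseline_m: float):
    """Planar target position (x, y) from bearings measured before and after
    the robot advances baseline_m along the x-axis (monocular motion stereo)."""
    t1 = math.tan(math.radians(bearing1_deg))
    t2 = math.tan(math.radians(bearing2_deg))
    # Solve y = x * t1 (first view) and y = (x - baseline) * t2 (second view):
    x = baseline_m * t2 / (t2 - t1)
    return x, x * t1
```

For example, a target 2 m ahead and 1 m to the side is seen at atan(1/2) first and at 45 degrees after a 1 m advance, and the function recovers (2, 1). The full three-dimensional prior-art computation the patent alludes to would additionally use the camera intrinsics.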
Optionally, when the camera is a binocular camera, the sweeping robot can directly acquire images of the target in place through the binocular camera; each acquisition obtains two images with parallax, and the three-dimensional coordinates of the target are calculated by analyzing these two images, thereby determining the position of the target. The specific method of calculating the three-dimensional coordinates of the target by analyzing two images with parallax is the same as the prior art and is not repeated here.
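The binocular case rests on the standard rectified-stereo relation Z = f·B/d, where d is the parallax (disparity) between the two images. As a hedged illustration only (the parameter names and example values are this sketch's, not the patent's):

```python
def stereo_depth(disparity_px: float, focal_px: float, baseline_m: float) -> float:
    """Depth of a matched point from a rectified stereo pair:
    Z = focal_length * baseline / disparity."""
    if disparity_px <= 0.0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px

# e.g. a 20 px disparity with a 700 px focal length and a 6 cm baseline:
print(stereo_depth(20.0, 700.0, 0.06))  # ≈ 2.1 m
```

Finding the disparity itself (stereo matching) is the mature prior art the paragraph above defers to.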
S14. The robot moves to the position of the target.
After the position of the target is determined, the sweeping robot moves toward the position of the target, stops moving once it has reached a certain distance from the target, then receives the target's voice instructions through the voice collection device, and performs corresponding actions according to the voice instructions.
When moving to the position of the target, the sweeping robot can plan a moving path with its current position as the starting point and the position of the target as the end point, then move along the moving path to the end point, finally arriving at the position of the target.
Further, when the position of the target changes during movement, the sweeping robot plans the moving path again and moves along the newly planned moving path to the end point. Thus, when the target moves, the sweeping robot can also follow the target, realizing real-time tracking and accompaniment of the target.
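The plan-then-replan behaviour described above can be sketched as a loop that discards the current path whenever the goal moves. The patent names no planning algorithm, so this sketch assumes a simple straight-line planner; all names here are illustrative:

```python
import math

def plan_path(start, goal, step=0.5):
    """Straight-line path from start to goal as evenly spaced waypoints."""
    dx, dy = goal[0] - start[0], goal[1] - start[1]
    n = max(1, int(math.hypot(dx, dy) / step))
    return [(start[0] + dx * i / n, start[1] + dy * i / n) for i in range(1, n + 1)]

def follow(start, get_goal, step=0.5):
    """Walk the planned path, replanning from the current position
    whenever the goal reported by get_goal() has changed."""
    pos, goal = start, get_goal()
    path = plan_path(pos, goal, step)
    while path:
        new_goal = get_goal()
        if new_goal != goal:              # target moved: replan from here
            goal = new_goal
            path = plan_path(pos, goal, step)
            continue
        pos = path.pop(0)                 # advance one waypoint
    return pos
```

A real sweeping robot would use an obstacle-aware planner in place of `plan_path`, but the replan-on-change loop is the same.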
By determining the precise location of the target through the combination of sound source localization and visual positioning, the robot can move quickly and accurately to a position in front of the user (the target), solving the technical problem that a robot cannot move accurately to a position in front of the user.
With reference to Fig. 4, the second embodiment of the robot moving method of the present invention is proposed. The method comprises the following steps:
S21. Sound source localization is carried out through a voice collection device to determine the direction of a target.
S22. The camera is turned toward the target, and the target is detected through the camera. It is judged whether the target is detected; when the target is not detected, the method proceeds to step S23; when the target is detected, the method proceeds to step S24.
S23. The obstacle blocking the target is avoided until the target is detected.
S24. Visual positioning is carried out through the camera to determine the position of the target.
S25. The robot moves to the position of the target.
Considering that there may be an obstacle between the sweeping robot and the target, when the sweeping robot does not detect the target in place, the field of view of its camera is blocked by an obstacle. The present embodiment therefore adds step S23 on the basis of the first embodiment: when the sweeping robot does not detect the target in place, it avoids the obstacle blocking the target until the target is detected, so that the sweeping robot can automatically avoid the obstacle and reach the user's side even when blocked by an obstacle.
The sweeping robot can avoid the obstacle blocking the target in the following ways:
Optionally, as shown in Fig. 5, when the line of sight of the camera 102 of the sweeping robot 100 is blocked by an obstacle 300 and the target 200 cannot be detected, the sweeping robot 100 first moves toward the direction of the target 200; when it encounters the obstacle 300, it then moves along the boundary of the obstacle 300 until the target 200 is detected. While moving along the boundary of the obstacle 300, the camera 102 always faces the target 200, and when the robot moves to the edge of the obstacle 300, the line of sight of the camera 102 (shown by the dotted line) just clears the obstacle 300 and reaches the target 200, so that the target 200 can be detected. When the target 200 is detected, the sweeping robot 100 carries out visual positioning through the camera 102, determines the position of the target 200, and moves to the position of the target 200.
Optionally, as shown in Fig. 6, when the line of sight of the camera 102 of the sweeping robot 100 is blocked by an obstacle 300 and the target 200 cannot be detected, the sweeping robot 100 moves laterally relative to the direction of the target 200 until the target 200 is detected. The lateral direction is preferably at an acute angle to the direction of the target 200, for example between 45 and 90 degrees, to minimize the moving distance, although a right angle or an obtuse angle is of course also possible. During the lateral movement, the camera 102 always faces the target 200, and when the robot moves to the edge position of the obstacle 300, the line of sight of the camera 102 (shown by the dotted line) just clears the obstacle 300 and reaches the target 200, so that the target 200 can be detected. When the target 200 is detected, the sweeping robot 100 carries out visual positioning through the camera 102, determines the position of the target 200, and moves to the position of the target 200.
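The trade-off behind the 45-to-90-degree recommendation above is simple geometry: only the sideways component of the motion clears the obstacle, so shallower angles mean longer travel. A hedged sketch under an assumption this example introduces (the obstacle is centred on the original sight line):

```python
import math

def lateral_clearance(obstacle_halfwidth_m: float, angle_deg: float) -> float:
    """Distance to travel, at angle_deg to the target direction, before the
    line of sight clears an obstacle of the given half-width centred on the
    original sight line. Only the sin(angle) component moves sideways."""
    return obstacle_halfwidth_m / math.sin(math.radians(angle_deg))

# At 90 degrees the travel equals the half-width; at 30 degrees it doubles:
print(lateral_clearance(0.5, 90.0), lateral_clearance(0.5, 30.0))
```

This is why a right angle minimizes travel in this simplified model, while the acute angles the text prefers also carry the robot forward toward the target.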
In the robot moving method of the embodiments of the present invention, sound source localization determines the direction of the target so that the camera can be turned toward it, and the camera then carries out visual positioning to determine the target's precise location. This enables the robot to move quickly and accurately to a position in front of the user (the target), solving the technical problem that a robot cannot move accurately to a position in front of the user. The orientation and distance relations between the robot and the user are precisely adjusted, ensuring that the robot can better receive the user's voice instructions and greatly improving the user experience.
Moreover, in the embodiments of the present invention, the robot is also able to actively recognize the user and follow the user, making the interactive experience richer and the user experience better.
With reference to Fig. 7, the first embodiment of the robot moving device of the present invention is proposed. The device includes a direction determining module 10, a target detection module 20, a position determining module 30, and a movement control module 40, wherein: the direction determining module 10 is used for carrying out sound source localization through a voice collection device to determine the direction of a target; the target detection module 20 is used for turning a camera toward the target and detecting the target through the camera; the position determining module 30 is used for, when the target is detected, carrying out visual positioning through the camera to determine the position of the target; and the movement control module 40 is used for controlling the robot to move to the position of the target.
The voice collection device in the embodiment of the present invention is preferably a microphone array. As shown in Fig. 2, a microphone array composed of four microphones 101 and a camera 102 are provided on the sweeping robot 100. The direction determining module 10 collects the sound emitted by the sound source through the microphone array and carries out sound source localization using sound source localization technology, thereby determining the direction of the sound source, that is, the target. Sound source localization technology is relatively mature prior art and is not described in detail herein. The target mainly refers to a person (the user), but it can of course also be another object capable of emitting sound, which is not limited by the present invention.
After the direction of the target is determined, the target detection module 20 turns the camera toward the target, for example by rotating the camera to aim it at the target or by controlling the entire robot to rotate and aim at the target, and starts the camera to detect the target. As shown in Fig. 3, the target detection module 20 controls the camera 102 of the sweeping robot 100 to aim at the target 200 so that the target 200 is within the field of view of the camera 102 (the region between the two lines extending from the camera 102).
When the target is a person, the target detection module 20 can carry out target detection through face recognition technology. As shown in Fig. 8, the target detection module 20 includes a face recognition unit 21 and a target determining unit 22, wherein: the face recognition unit 21 is used for carrying out face recognition detection through the camera; and the target determining unit 22 is used for, when a face is detected, determining that the target is detected, and otherwise determining that the target is not detected. Face recognition technology is relatively mature prior art and is not described in detail herein.
In addition to using face recognition technology to identify a person, the target detection module 20 can also achieve target detection by recognizing other visual features of the human body, which the present invention does not enumerate herein.
When the target is another sound-emitting object, the target detection module 20 can achieve target detection by recognizing the visual features of that object.
When the target is detected, the position determining module 30 carries out visual positioning of the target through the camera using visual positioning technology to determine the position of the target.
Optionally, when the camera is a monocular camera, the position determining module 30, as shown in Fig. 9, includes a first acquisition unit 31, a second acquisition unit 32, and a first determining unit 33, wherein: the first acquisition unit 31 is used for acquiring an image of the target through the monocular camera; the second acquisition unit 32 is used for controlling the robot to move a certain distance toward the direction of the target and acquiring an image of the target again through the monocular camera; and the first determining unit 33 is used for calculating the three-dimensional coordinates of the target by analyzing the two images acquired before and after the movement, thereby determining the position of the target. The specific method of calculating the three-dimensional coordinates of the target from images acquired at two positions is the same as the prior art and is not repeated here.
Optionally, when the camera is a binocular camera, the position determining module 30, as shown in Fig. 10, includes a third acquisition unit 34 and a second determining unit 35, wherein: the third acquisition unit 34 is used for acquiring images of the target through the binocular camera, each acquisition obtaining two images with parallax; and the second determining unit 35 is used for determining the position of the target according to the images of the target, for example by calculating the three-dimensional coordinates of the target through analysis of the two images with parallax. The specific method of calculating the three-dimensional coordinates of the target by analyzing two images with parallax is the same as the prior art and is not repeated here.
After the position of the target is determined, the movement control module 40 controls the robot to move toward the position of the target and to stop moving once it has reached a certain distance from the target. The robot can then receive the target's voice instructions through the voice collection device and perform corresponding actions according to the voice instructions.
The movement control module 40, as shown in Fig. 11, includes a path planning unit 41 and a movement control unit 42, wherein: the path planning unit 41 is used for planning a moving path with the current position as the starting point and the position of the target as the end point; and the movement control unit 42 is used for controlling the robot to move along the moving path to the end point, finally arriving at the position of the target.
Further, as shown in Fig. 12, the movement control module 40 can also include a path update unit 43, which is used for: when the position of the target changes, planning the moving path again. The movement control unit 42 then moves along the newly planned moving path to the end point. Thus, when the target moves, the sweeping robot can also follow the target, realizing real-time tracking and accompaniment of the target.
By determining the precise location of the target through the combination of sound source localization and visual positioning, the robot can move quickly and accurately to a position in front of the user (the target), solving the technical problem that a robot cannot move accurately to a position in front of the user.
With reference to Fig. 13, the second embodiment of the robot moving device of the present invention is proposed. The present embodiment adds an obstacle avoidance module 50 on the basis of the first embodiment. The obstacle avoidance module 50 is used for: when the target detection module 20 does not detect the target, avoiding the obstacle blocking the target until the target detection module 20 detects the target, so that the sweeping robot can automatically avoid the obstacle and reach the user's side even when blocked by an obstacle.
In the embodiment of the present invention, the obstacle avoidance module 50, as shown in Fig. 14, includes a first moving unit 51 and a second moving unit 52, wherein: the first moving unit 51 is used for controlling the robot to move toward the direction of the target; and the second moving unit 52 is used for, when an obstacle is encountered, controlling the robot to move along the boundary of the obstacle until the target detection module 20 detects the target.
As shown in Fig. 5, when the line of sight of the camera 102 of the robot 100 is blocked by an obstacle 300 and the target detection module 20 cannot detect the target 200, the first moving unit 51 controls the robot 100 to move toward the direction of the target 200; when the robot encounters the obstacle 300, the second moving unit 52 controls the robot 100 to move along the boundary of the obstacle 300 until the target detection module 20 detects the target 200. While the robot 100 moves along the boundary of the obstacle 300, the camera 102 always faces the target 200, and when the robot moves to the edge of the obstacle 300, the line of sight of the camera 102 (shown by the dotted line) just clears the obstacle 300 and reaches the target 200, so that the target detection module 20 can detect the target 200.
In other embodiments, the obstacle avoidance module 50 includes a third moving unit, which is used for: controlling the robot to move laterally relative to the direction of the target until the target detection module 20 detects the target.
As shown in Fig. 6, when the line of sight of the camera 102 of the robot 100 is blocked by an obstacle 300 and the target detection module 20 cannot detect the target 200, the third moving unit controls the robot 100 to move laterally relative to the direction of the target 200 until the target 200 is detected. The lateral direction is preferably at an acute angle to the direction of the target 200, for example between 45 and 90 degrees, to minimize the moving distance, although a right angle or an obtuse angle is of course also possible. During the lateral movement, the camera 102 of the robot 100 always faces the target 200, and when the robot moves to the edge position of the obstacle 300, the line of sight of the camera 102 (shown by the dotted line) just clears the obstacle 300 and reaches the target 200, so that the target detection module 20 can detect the target 200.
The robot moving device of this embodiment of the present invention determines the direction of the target by sound-source localization so as to point the camera toward the target, and then uses the camera to perform visual positioning to determine the precise position of the target, so that the robot can move quickly and accurately to a position in front of the user (the target). This solves the technical problem that a robot cannot accurately move in front of the user, achieves precise adjustment of the orientation relationship and distance relationship between the robot and the user, ensures that the robot can better receive the user's voice instructions, and greatly improves the user experience.
The present invention further provides a sweeping robot, including a memory, a processor, and at least one application program stored in the memory and configured to be executed by the processor, the application program being configured to perform the robot moving method. The robot moving method includes the following steps: performing sound-source localization through a voice collection device to determine the direction of the target; pointing the camera toward the target and detecting the target through the camera; when the target is detected, performing visual positioning through the camera to determine the position of the target; and moving to the position of the target. The robot moving method described in this embodiment is the robot moving method of the above embodiments of the present invention, and details are not repeated here.
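The four method steps listed above can be sketched as a single control routine. The `robot` interface and every method name below are assumptions made for illustration; the patent does not define an API:

```python
def move_to_speaker(robot) -> None:
    """Steps of the robot moving method, against a hypothetical robot API:
    1) sound-source localization via the voice collection device gives
       the target's direction;
    2) point the camera toward that direction and try to detect the target
       (obstacle avoidance may run while detection fails);
    3) once detected, visual positioning gives the target's position;
    4) move to that position (in front of the user).
    """
    bearing = robot.localize_sound_source()   # step 1: direction of target
    robot.point_camera(bearing)               # step 2: camera toward target
    while not robot.detect_target():          # detection blocked by an
        robot.avoid_obstacle(bearing)         # obstacle: avoid, then retry
    position = robot.visual_positioning()     # step 3: precise position
    robot.move_to(position)                   # step 4: move to the target
```

The loop in step 2 is where the first/second/third movement units of the obstacle avoidance module would plug in.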
It will be understood by those skilled in the art that the present invention covers apparatuses for performing one or more of the operations described herein. These apparatuses may be specially designed and manufactured for the required purposes, or may include known devices in general-purpose computers. These apparatuses have computer programs stored therein, which are selectively activated or reconfigured. Such computer programs may be stored in a device-readable (e.g., computer-readable) medium or in any type of medium suitable for storing electronic instructions and respectively coupled to a bus. The computer-readable medium includes, but is not limited to, any type of disk (including floppy disks, hard disks, optical disks, CD-ROMs, and magneto-optical disks), ROM (Read-Only Memory), RAM (Random Access Memory), EPROM (Erasable Programmable Read-Only Memory), EEPROM (Electrically Erasable Programmable Read-Only Memory), flash memory, magnetic cards, or optical cards. That is, a readable medium includes any medium that stores or transmits information in a form readable by a device (e.g., a computer).
Those skilled in the art will understand that each block of the structure diagrams and/or block diagrams and/or flow diagrams, and combinations of blocks therein, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general-purpose computer, a special-purpose computer, or another programmable data processing method for implementation, so that the solutions specified in one or more blocks of the structure diagrams and/or block diagrams and/or flow diagrams disclosed herein are executed by the processor of the computer or other programmable data processing method.
Those skilled in the art will understand that the steps, measures, and solutions in the various operations, methods, and flows discussed in the present invention may be alternated, changed, combined, or deleted. Further, other steps, measures, and solutions in the various operations, methods, and flows discussed in the present invention may also be alternated, changed, rearranged, decomposed, combined, or deleted. Further, prior-art steps, measures, and solutions corresponding to the various operations, methods, and flows disclosed in the present invention may also be alternated, changed, rearranged, decomposed, combined, or deleted.
The foregoing descriptions are merely preferred embodiments of the present invention and are not intended to limit the scope of the invention. Any equivalent structural or process transformation made using the contents of the description and drawings of the present invention, whether applied directly or indirectly in other related technical fields, shall likewise fall within the scope of patent protection of the present invention.

Claims (10)

CN201810036671.1A | priority 2018-01-15 | filed 2018-01-15 | Robot moving method and device | Pending | published as CN108245099A

Priority Applications (2)

Application Number | Priority Date | Filing Date | Title
CN201810036671.1A (CN108245099A) | 2018-01-15 | 2018-01-15 | Robot moving method and device
PCT/CN2018/077604 (WO2019136808A1) | 2018-01-15 | 2018-02-28 | Robot moving method, robot moving device, floor sweeping robot

Applications Claiming Priority (1)

Application Number | Priority Date | Filing Date | Title
CN201810036671.1A (CN108245099A) | 2018-01-15 | 2018-01-15 | Robot moving method and device

Publications (1)

Publication Number | Publication Date
CN108245099A | 2018-07-06

Family

ID=62727331

Family Applications (1)

Application Number | Status | Publication | Priority Date | Filing Date
CN201810036671.1A | Pending | CN108245099A | 2018-01-15 | 2018-01-15

Country Status (2)

Country | Link
CN (1) | CN108245099A
WO (1) | WO2019136808A1

Cited By (16)

* Cited by examiner, † Cited by third party
Publication | Priority date | Publication date | Assignee | Title
CN110025260A* | 2017-12-20 | 2019-07-19 | 东芝生活电器株式会社 | Autonomous driving body and autonomous driving body system
CN110881909A* | 2019-12-20 | 2020-03-17 | 小狗电器互联网科技(北京)股份有限公司 | Control method and device of sweeper
CN110916576A* | 2018-12-13 | 2020-03-27 | 成都家有为力机器人技术有限公司 | Cleaning method based on voice and image recognition instruction and cleaning robot
CN110946518A* | 2019-12-20 | 2020-04-03 | 小狗电器互联网科技(北京)股份有限公司 | Control method and device of sweeper
CN110946519A* | 2019-12-20 | 2020-04-03 | 小狗电器互联网科技(北京)股份有限公司 | Control method and device of sweeper
CN111008571A* | 2019-11-15 | 2020-04-14 | 万翼科技有限公司 | Indoor garbage treatment method and related product
CN111012252A* | 2019-12-20 | 2020-04-17 | 小狗电器互联网科技(北京)股份有限公司 | Control method and device of sweeper
CN111067354A* | 2018-10-19 | 2020-04-28 | 佛山市顺德区美的饮水机制造有限公司 | Water dispenser and its moving method and device
CN112043206A* | 2020-09-01 | 2020-12-08 | 珠海格力电器股份有限公司 | Sweeping and mopping integrated machine and cleaning method thereof
CN112597910A* | 2020-12-25 | 2021-04-02 | 北京小狗吸尘器集团股份有限公司 | Method and device for monitoring human activities by using sweeping robot
WO2021062681A1* | 2019-09-30 | 2021-04-08 | 中新智擎科技有限公司 | Automatic meal delivery method and apparatus, and robot
CN112656309A* | 2020-12-25 | 2021-04-16 | 北京小狗吸尘器集团股份有限公司 | Function execution method and device of sweeper, readable storage medium and electronic equipment
CN112703504A* | 2018-10-19 | 2021-04-23 | 深圳新物种科技有限公司 | Object identification method and device, electronic equipment and computer readable storage medium
TWI731331B* | 2019-05-10 | 2021-06-21 | 中興保全科技股份有限公司 | Mobile security device
CN113679298A* | 2021-08-27 | 2021-11-23 | 美智纵横科技有限责任公司 | Robot control method, robot control device, robot, and readable storage medium
WO2022143285A1* | 2020-12-31 | 2022-07-07 | 深圳市杉川机器人有限公司 | Cleaning robot and distance measurement method therefor, apparatus, and computer-readable storage medium

Citations (10)

* Cited by examiner, † Cited by third party
Publication | Priority date | Publication date | Assignee | Title
CN1696854A* | 2004-05-14 | 2005-11-16 | 三星光州电子株式会社 | Systems and methods for moving autopilot and compensating for path turns
US20060005160A1* | 1997-08-18 | 2006-01-05 | National Instruments Corporation | Image acquisition device
CN101295016A* | 2008-06-13 | 2008-10-29 | 河北工业大学 | A sound source autonomous search and location method
CN102138769A* | 2010-01-28 | 2011-08-03 | 深圳先进技术研究院 | Cleaning robot and cleaning method thereby
CN103054522A* | 2012-12-31 | 2013-04-24 | 河海大学 | Cleaning robot system based on vision measurement and measurement and control method of cleaning robot system
CN104188598A* | 2014-09-15 | 2014-12-10 | 湖南格兰博智能科技有限责任公司 | Automatic ground cleaning robot
CN104887155A* | 2015-05-21 | 2015-09-09 | 南京创维信息技术研究院有限公司 | Intelligent sweeper
CN106489104A* | 2014-11-26 | 2017-03-08 | 艾罗伯特公司 | Systems and methods for use of optical range sensors in mobile robots
CN106527444A* | 2016-11-29 | 2017-03-22 | 深圳市元征科技股份有限公司 | Control method of cleaning robot and the cleaning robot
CN107491069A* | 2017-08-31 | 2017-12-19 | 珠海市微半导体有限公司 | Robot runs into the processing method and chip of barrier

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication | Priority date | Publication date | Assignee | Title
TWI481980B* | 2012-12-05 | 2015-04-21 | Univ Nat Chiao Tung | Electronic apparatus and navigation method thereof
CN105929827B* | 2016-05-20 | 2020-03-10 | 北京地平线机器人技术研发有限公司 | Mobile robot and positioning method thereof
CN106203259A* | 2016-06-27 | 2016-12-07 | 旗瀚科技股份有限公司 | The mutual direction regulating method of robot and device
CN106210511A* | 2016-06-30 | 2016-12-07 | 纳恩博(北京)科技有限公司 | A kind of method and apparatus positioning user
CN107515606A* | 2017-07-20 | 2017-12-26 | 北京格灵深瞳信息技术有限公司 | Robot implementation method, control method and robot, electronic equipment


Cited By (19)

* Cited by examiner, † Cited by third party
Publication | Priority date | Publication date | Assignee | Title
CN110025260A* | 2017-12-20 | 2019-07-19 | 东芝生活电器株式会社 | Autonomous driving body and autonomous driving body system
CN111067354B* | 2018-10-19 | 2022-06-07 | 佛山市顺德区美的饮水机制造有限公司 | Water dispenser and moving method and device thereof
CN111067354A* | 2018-10-19 | 2020-04-28 | 佛山市顺德区美的饮水机制造有限公司 | Water dispenser and its moving method and device
CN112703504A* | 2018-10-19 | 2021-04-23 | 深圳新物种科技有限公司 | Object identification method and device, electronic equipment and computer readable storage medium
CN110916576A* | 2018-12-13 | 2020-03-27 | 成都家有为力机器人技术有限公司 | Cleaning method based on voice and image recognition instruction and cleaning robot
TWI731331B* | 2019-05-10 | 2021-06-21 | 中興保全科技股份有限公司 | Mobile security device
WO2021062681A1* | 2019-09-30 | 2021-04-08 | 中新智擎科技有限公司 | Automatic meal delivery method and apparatus, and robot
CN111008571B* | 2019-11-15 | 2023-04-18 | 万翼科技有限公司 | Indoor garbage treatment method and related product
CN111008571A* | 2019-11-15 | 2020-04-14 | 万翼科技有限公司 | Indoor garbage treatment method and related product
CN110946519A* | 2019-12-20 | 2020-04-03 | 小狗电器互联网科技(北京)股份有限公司 | Control method and device of sweeper
CN111012252A* | 2019-12-20 | 2020-04-17 | 小狗电器互联网科技(北京)股份有限公司 | Control method and device of sweeper
CN110946518A* | 2019-12-20 | 2020-04-03 | 小狗电器互联网科技(北京)股份有限公司 | Control method and device of sweeper
CN110881909A* | 2019-12-20 | 2020-03-17 | 小狗电器互联网科技(北京)股份有限公司 | Control method and device of sweeper
CN112043206A* | 2020-09-01 | 2020-12-08 | 珠海格力电器股份有限公司 | Sweeping and mopping integrated machine and cleaning method thereof
CN112597910A* | 2020-12-25 | 2021-04-02 | 北京小狗吸尘器集团股份有限公司 | Method and device for monitoring human activities by using sweeping robot
CN112656309A* | 2020-12-25 | 2021-04-16 | 北京小狗吸尘器集团股份有限公司 | Function execution method and device of sweeper, readable storage medium and electronic equipment
CN112597910B* | 2020-12-25 | 2024-05-07 | 北京小狗吸尘器集团股份有限公司 | Method and device for monitoring character activities by using sweeping robot
WO2022143285A1* | 2020-12-31 | 2022-07-07 | 深圳市杉川机器人有限公司 | Cleaning robot and distance measurement method therefor, apparatus, and computer-readable storage medium
CN113679298A* | 2021-08-27 | 2021-11-23 | 美智纵横科技有限责任公司 | Robot control method, robot control device, robot, and readable storage medium

Also Published As

Publication number | Publication date
WO2019136808A1 | 2019-07-18

Similar Documents

Publication | Title
CN108245099A | Robot moving method and device
CN114093052B | Intelligent inspection method and system suitable for computer room management
CN106527444B | Control method of cleaning robot and cleaning robot
CN101794349B | Experimental system and method for augmented reality of teleoperation of robot
Schillebeeckx et al. | Biomimetic sonar: Binaural 3D localization using artificial bat pinnae
CN106162144A | A kind of visual pattern processing equipment, system and intelligent machine for overnight sight
KR101394809B1 | A method and systems for obtaining an improved stereo image of an object
CN106920250B | Robot target identification and localization method and system based on RGB-D video
CN112985263B | Method, device and equipment for detecting geometrical parameters of bow net
JP4677060B1 | Position calibration information collection device, position calibration information collection method, and position calibration information collection program
CN109857112A | Obstacle Avoidance and device
WO2017197919A1 | Wireless charging positioning method, device, and system, and electric vehicle
CN105014675B | A kind of narrow space intelligent mobile robot vision navigation system and method
US11010916B2 | Method of configuring camera position suitable for localization and robot implementing same
CN105629196A | Positioning system based on machine vision and dynamic fingerprint and corresponding method
CN113675923A | Charging method, charging device and robot
CN110881909A | Control method and device of sweeper
CN105301585B | Information displaying method and device
CN106155093A | A kind of robot based on computer vision follows the system and method for human body
Kim et al. | Recognition and localization of generic objects for indoor navigation using functionality
CN114636422A | Positioning and navigation method for information machine room scene
CN112656307B | Cleaning method and cleaning robot
CN103006332A | Scalpel tracking method and device and digital stereoscopic microscope system
CN116787421B | Motion control method and system for survey robots used in field scenes
JP6194992B2 | Object analysis method, object analysis device, and object analysis system

Legal Events

Date | Code | Title | Description
PB01 | Publication
SE01 | Entry into force of request for substantive examination
TA01 | Transfer of patent application right

Effective date of registration: 2019-09-05

Address after: Room 402, 4th Floor, Kanghe Sheng Building, New Energy Innovation Industrial Park, No. 1 Chuangsheng Road, Nanshan District, Shenzhen City, Guangdong Province, 518000

Applicant after: Shenzhen Infinite Power Development Co., Ltd.

Address before: 518000, Blocks 503 and 602, Garden City Digital Building B, 1079 Nanhai Avenue, Shekou, Nanshan District, Shenzhen City, Guangdong Province

Applicant before: SHENZHEN WOTE WODE CO., LTD.

RJ01 | Rejection of invention patent application after publication

Application publication date: 2018-07-06

