CN106774345A - Method and apparatus for multi-robot cooperation - Google Patents

Method and apparatus for multi-robot cooperation

Info

Publication number
CN106774345A
Authority
CN
China
Prior art keywords
robot
cooperation
destination object
instruction
information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201710067320.2A
Other languages
Chinese (zh)
Other versions
CN106774345B (en)
Inventor
戴萧何
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai xianruan Information Technology Co., Ltd
Original Assignee
Shanghai Zhixian Robot Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Zhixian Robot Technology Co Ltd
Priority to CN201710067320.2A
Publication of CN106774345A
Application granted
Publication of CN106774345B
Status: Active
Anticipated expiration

Abstract

The purpose of the present application is to provide a method and apparatus for multi-robot cooperation. A robot obtains a matching cooperation instruction from a network device and, based on that cooperation instruction, performs the corresponding multi-robot cooperation task. Compared with the prior art, in this application multiple independent robots carrying out a cooperative task jointly perform the corresponding multi-robot cooperation task based on the cooperation instructions obtained from the corresponding network device. According to the application needs of a concrete scenario, the cooperation instructions issued by the network device can therefore flexibly combine multiple independent robots, so that the combined robots can cooperatively complete tasks with a large workload or a complex division of labor, which helps decompose complex work and optimize overall resources.

Description

Method and apparatus for multi-robot cooperation
Technical field
The present application relates to the field of computers, and in particular to a technique for multi-robot cooperation.
Background art
Most existing robot applications involve a single robot working independently; for example, a single robot moves on its own or carries goods on its own. Because a single robot is limited in equipment scale and functionality, it can only complete relatively simple tasks, while tasks with a large workload or high complexity either cannot be completed by a single robot or are completed poorly. For example, in transport operations, moving certain large-volume objects strongly calls for multiple robots to carry and move them cooperatively. However, the prior art lacks a technique for efficiently combining multiple independent robots to jointly perform the same task or the same group of tasks.
Summary of the invention
The purpose of the application is to provide a method and apparatus for multi-robot cooperation.
According to one aspect of the application, a method for multi-robot cooperation at the robot end is provided, including:
obtaining, from a network device, a cooperation instruction matching the robot;
performing the corresponding multi-robot cooperation task based on the cooperation instruction.
According to a further aspect of the application, a method for multi-robot cooperation at the network device end is also provided, including:
providing a matching cooperation instruction to one or more robots, wherein the robots perform corresponding multi-robot cooperation tasks based on the corresponding cooperation instructions.
According to another aspect of the application, a robot for multi-robot cooperation is also provided, including:
a first device for obtaining, from a network device, a cooperation instruction matching the robot;
a second device for performing the corresponding multi-robot cooperation task based on the cooperation instruction.
According to another aspect of the application, a network device for multi-robot cooperation is also provided, including:
a fourth device for providing a matching cooperation instruction to one or more robots, wherein the robots perform corresponding multi-robot cooperation tasks based on the corresponding cooperation instructions.
According to another aspect of the application, a system for multi-robot cooperation is also provided, wherein the system includes the robot for multi-robot cooperation provided according to one aspect of the application and the network device for multi-robot cooperation provided according to another aspect of the application.
Compared with the prior art, in this application the multiple independent robots carrying out a cooperative task jointly perform the corresponding multi-robot cooperation task based on the cooperation instructions obtained from the corresponding network device. Here, according to the application needs of a concrete scenario, the cooperation instructions issued by the network device can flexibly combine multiple independent robots, so that the combined robots can cooperatively complete tasks with a large workload or a complex division of labor, which helps decompose complex work and optimize overall resources.
Further, in one implementation of the application, the robots can be used to realize multi-robot formation movement; for example, the cooperating robots can move toward a destination position, or follow a destination object, based on the matching cooperation instructions, so as to realize formation movement of multiple robots. Based on this implementation, all kinds of cooperative tasks that need to be realized through formation movement of multiple robots, such as cooperative moving and carrying tasks, can be realized flexibly and effectively.
Further, in one implementation of the application, after obtaining the cooperation instruction, the robot determines the destination object it is to follow and then recognizes the destination object from the scene it captures in real time, thereby, based on the cooperation instruction, controlling the robot to move toward the destination object along the corresponding movement path. Compared with existing robot-following techniques, the application can accurately lock onto and effectively track the destination object in natural environments that change in real time and contain many interfering factors, thereby improving the accuracy of following and solving the technical problems, common in current robot following, of following the wrong target or losing the target. At the same time, controlling the robot, based on the cooperation instruction, to move toward the destination object along the corresponding movement path makes it possible to realize coordinated formation movement of multiple robots as a whole.
Further, in one implementation, based on the cooperation instruction, the robot is controlled to move toward the destination object along the corresponding movement path, wherein the relative position between the robot and the destination object matches the multi-robot formation status information in the cooperation instruction, and the relative distance between the second robot and the first robot falls within a preset relative distance range threshold. Here, the cooperation instruction can be used to control the queue shape of the multiple robots in the multi-robot cooperation task, or even the relative positions between individual robots, so that the cooperative work between the robots fits together better and the cooperative task is completed more efficiently.
Brief description of the drawings
Other features, objects, and advantages of the application will become more apparent by reading the detailed description of non-limiting embodiments made with reference to the following drawings:
Fig. 1 shows a flow chart of a method for multi-robot cooperation at the robot end and the network device end according to one aspect of the application;
Fig. 2 shows a flow chart of a method for multi-robot cooperation at the robot end according to one aspect of the application;
Fig. 3 shows a system diagram for multi-robot cooperation according to one aspect of the application.
The same or similar reference signs in the drawings denote the same or similar components.
Detailed description of embodiments
The application is described in further detail below in conjunction with the accompanying drawings.
In a typical configuration of the application, a terminal, a device of the service network, and a trusted party each include one or more processors (CPUs), an input/output interface, a network interface, and memory.
The memory may include volatile memory in a computer-readable medium, random access memory (RAM), and/or non-volatile memory such as read-only memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media include permanent and non-permanent, removable and non-removable media, and can store information by any method or technology. The information may be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technologies, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory media such as modulated data signals and carrier waves.
Fig. 1 shows a flow chart of a method for multi-robot cooperation at the robot end and the network device end according to one aspect of the application.
The method includes step S11, step S12, and step S21.
The embodiment of the application provides a method for multi-robot cooperation, and the method can be implemented at the corresponding robot end and network device end. The robot includes all kinds of mechanical equipment that performs work automatically; it may be mechanical equipment with a movement function, a load-carrying function, or other functions, or mechanical equipment that has several of these functions at the same time, for example various artificial-intelligence devices that can move and carry loads. In this application, the functions of the multiple robots performing the same cooperative task may be the same or different. The network device includes, but is not limited to, a computer, a network host, a single network server, a set of multiple network servers, or a cloud server, where the cloud server may be a virtual supercomputer running in a distributed system and composed of a group of loosely coupled computers, used to provide simple, efficient, safe, and reliable computing services whose processing capacity can scale elastically. In this application, the robot may refer to the robot 1 and the network device may refer to the network device 2.
Specifically, in step S21, the network device 2 can provide matching cooperation instructions to one or more robots 1, wherein a robot 1 performs the corresponding multi-robot cooperation task based on its cooperation instruction. Accordingly, in step S11, the corresponding robot 1 obtains from the network device 2 the cooperation instruction matching itself. Here, the multi-robot cooperation task may be any of the various tasks performed jointly by multiple robots 1: for example, multiple robots 1 moving synchronously while keeping the same distance; or multiple robots 1 jointly delivering the same object; or multiple robots 1 performing an assembly task for the parts of an object. In one implementation, the network device 2 may match different cooperation instructions to different robots 1 according to the type of cooperative task or the specific cooperative operation.
In one implementation, the cooperation instruction may include at least any one of the following: multi-robot formation status information of the robots; a speed control rule of the robot; coordinate information of the destination object to be followed by the robot; and other execution-related information of the robot.
Specifically, taking as examples the scenarios in which multiple robots 1 move synchronously while keeping the same distance, or jointly deliver the same object: in one implementation, the network device 2 may, through the cooperation instructions, give each robot 1 the formation status information it must maintain while moving, for example keeping a single-column, single-row, or multi-column formation; in another implementation, the network device 2 may also, through cooperation instructions containing speed control rules, control the running speed of each cooperating robot 1 so as to adjust the distances between the robots 1 and thereby control the movement of the whole queue; in yet another implementation, the network device 2 may also provide one or more robots 1 with the coordinate information of the destination object to be followed, which may be the coordinate information of a destination object determined when the moving operation starts, or coordinate information of the destination object provided in real time during the movement.
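The instruction contents listed above can be pictured as a simple message structure. The following is a minimal sketch in Python of what such a cooperation instruction might look like; the class name, field names, and example values are illustrative assumptions and are not prescribed by the patent.

    from dataclasses import dataclass, field
    from typing import Optional, Tuple

    @dataclass
    class CooperationInstruction:
        # hypothetical container for the instruction contents named in the description
        formation_status: Optional[str] = None          # e.g. "single_column", "single_row", "multi_column"
        speed_rule: Optional[dict] = None               # e.g. {"max_forward": 1.0, "max_turning": 0.5}
        target_coordinates: Optional[Tuple[float, float]] = None  # coordinates of the object to follow
        extra: dict = field(default_factory=dict)       # other execution-related information

    # the network device would issue one such instruction per robot, possibly
    # different for the lead robot and the followers
    lead = CooperationInstruction(formation_status="single_column",
                                  target_coordinates=(12.0, 3.5))
    follower = CooperationInstruction(formation_status="single_column",
                                      speed_rule={"max_forward": 1.0, "max_turning": 0.5})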
Taking as an example the scenario in which multiple robots 1 perform an assembly task for the parts of an object, the cooperation instruction may include the speed control rules that move each robot 1 to its respective assembly position, the coordinate information of the robot's target position, the assembly operation step information of the robot, and so on. In addition, the cooperation instruction can be adapted to the specific needs of other cooperative tasks.
In one implementation, the network device 2 may send cooperation instructions to all robots 1 of a cooperative task at the same time in a unified manner; in another implementation, the network device 2 may also send cooperation instructions to any one or more robots 1 separately at any time. In one implementation, the cooperation instructions corresponding to the multiple robots 1 in the same cooperative task may be identical; they may also differ, or be partly identical and partly different. For example, when multiple robots 1 move synchronously in a single-column formation while keeping the same distance, the cooperation instruction of the first robot 1 in the queue may differ from the cooperation instructions of the other robots 1 in the queue.
Then, in step S12, the robot 1 can perform the corresponding multi-robot cooperation task based on the cooperation instruction. In one implementation, the robots 1 do not need to communicate with each other directly in order to accomplish the cooperative task; instead, the network device 2 can control the cooperating robots 1 in real time through the cooperation instructions, and each robot 1 executes its cooperation instruction so that the cooperative task is completed. In one implementation, the network device 2 may give each robot 1 only the instructions necessary for cooperating with the other robots, while operations that do not require cooperation can be executed by the robot 1 independently. For example, in the scenario where multiple robots 1 move synchronously while keeping the same distance, or jointly deliver the same object, keeping the overall formation and controlling the running speed of the queue can be controlled by the network device 2 through the cooperation instructions, while the specific following operations of each robot 1, such as determining and recognizing the object to follow, can be set and executed by each robot 1 itself.
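To make this division of labor concrete, the sketch below (Python, treating the instruction as a simple mapping and assuming hypothetical robot and network-device interfaces) shows a robot-side loop in which formation and speed come from the network device's cooperation instruction while target determination and recognition are handled locally; it is an illustration under stated assumptions, not the patent's implementation.

    def run_cooperation(robot, network_device):
        """Robot-side execution: instruction-driven where cooperation is needed,
        autonomous for local following operations."""
        instruction = network_device.fetch_instruction(robot.id)   # step S11 (assumed API)
        target = robot.determine_target(instruction)               # local operation
        while not robot.task_done():
            target = robot.recognize_target(target)                # local re-recognition
            path = robot.plan_path(target,
                                   formation=instruction.get("formation"))
            speed = robot.speed_from_rule(instruction.get("speed_rule"))
            robot.move(path, speed)                                # step S12
            # the network device may update the instruction at any time
            instruction = network_device.fetch_instruction(robot.id) or instruction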
In this application, the multiple independent robots 1 carrying out a cooperative task can jointly perform the corresponding multi-robot cooperation task based on the cooperation instructions obtained from the corresponding network device 2. Here, according to the application needs of a concrete scenario, the cooperation instructions issued by the network device 2 can flexibly combine multiple independent robots, so that the combined robots can cooperatively complete tasks with a large workload or a complex division of labor, which helps decompose complex work and optimize overall resources.
In one implementation, in step S12, based on the cooperation instruction the robot 1 can be controlled to move toward a destination position or destination object along a corresponding movement path. Here, the multi-robot cooperation task of the application may be a cooperative task that requires formation movement of multiple robots, for example multiple robots 1 moving synchronously while keeping the same distance, or multiple robots 1 jointly delivering the same object. Specifically, in one implementation, based on the cooperation instruction the robot 1 may be controlled to move toward a destination position along a corresponding movement path; for example, the robot 1 may be one of one or more robots at the front of the queue, which may have no specific destination object but rather a destination position to reach. In one implementation, based on the cooperation instruction the robot 1 may also be controlled to move toward a destination object along a corresponding movement path; for example, one or more robots 1 at the front of the robot queue may have an object to track, such as a moving person or thing, while a robot 1 that is not at the front of the queue needs to follow a destination object, namely a target robot, which may be the closest other robot in front of the robot 1, or another robot that is preset or determined based on the cooperation instruction.
In this implementation, the robots 1 can be used to realize multi-robot formation movement; for example, the cooperating robots 1 can move toward a destination position, or follow a destination object, based on the matching cooperation instructions, so as to realize formation movement of multiple robots 1. Based on this implementation, all kinds of cooperative tasks that need to be realized through formation movement of multiple robots, such as cooperative moving and carrying tasks, can be realized flexibly and effectively.
Further, Fig. 2 shows a flow chart of a method for multi-robot cooperation at the robot end according to one aspect of the application. The method includes step S11 and step S12, and step S12 further includes step S121, step S122, and step S123.
Specifically, in step S121, the robot 1 can determine the destination object to be followed by the robot 1. In one implementation, the destination object includes a target robot, and the robot and its corresponding target robot carry the same transport object; in this case the cooperative task may correspond to a cooperative moving and carrying task. The robot 1 needs to determine the destination object it is to follow when the cooperative task starts.
In one implementation, in step S121, when the robot 1 is set to follow mode, the robot 1 can recognize a corresponding matching object from the surrounding information it captures in real time, and then take the matching object as the destination object to be followed by the robot 1. In one implementation, the follow mode of the robot 1 can be started by a preset trigger operation. When the follow mode starts, the robot 1 can capture surrounding information in real time; in one implementation, the raw data of the surrounding environment information can be obtained through one or more sensing devices on the robot 1, and the raw data may be images, pictures, or point clouds. The robot 1 then detects, from the raw data, the object type that needs to be followed; one or more objects in the environment may belong to that object type. Using machine-learning methods, a classifier is trained in advance, that is, feature information extracted from scan data of a certain class of objects is input into the classifier, and objects of that class are detected from the environment information by comparison. There are often several objects of that class, and the matching object is the object selected from them as the destination object.
Further, in one implementation, the matching object may include, but is not limited to, at least any one of the following: the object around the robot 1 that is closest to the robot 1; the object in front of the robot 1 that is closest to the robot 1; an object that is in front of the robot 1 and closest to the robot 1; an object around the robot 1 that matches the object feature information of the object to be followed; the object around the robot 1 that best matches the object feature information of the object to be followed; and, among multiple objects around the robot 1 that match the object feature information of the object to be followed, the object closest to the robot. In one implementation, the object feature information may include, but is not limited to, one or more of the position information, the motion state information, and the body feature information of the object to be followed.
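As an illustration of how a matching object might be chosen from the detected candidates, the sketch below (Python, with hypothetical helper names and object attributes) picks, among the objects whose features match those of the object to be followed, the one in front of and closest to the robot; this corresponds to only one of the criteria listed above and is not the patent's prescribed algorithm.

    import math

    def select_matching_object(candidates, robot_pose, target_features, match_fn):
        """candidates: detected objects, each with .position (x, y) and .features.
        robot_pose: (x, y, heading in radians). match_fn scores feature similarity."""
        rx, ry, heading = robot_pose
        best, best_dist = None, float("inf")
        for obj in candidates:
            if match_fn(obj.features, target_features) < 0.5:   # assumed similarity threshold
                continue
            dx, dy = obj.position[0] - rx, obj.position[1] - ry
            # keep only objects roughly in front of the robot
            angle = math.atan2(dy, dx) - heading
            if abs(math.atan2(math.sin(angle), math.cos(angle))) > math.pi / 2:
                continue
            dist = math.hypot(dx, dy)
            if dist < best_dist:
                best, best_dist = obj, dist
        return best  # None if no candidate qualifies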
Further, in one implementation, in step S121, the robot 1 can determine the coordinate information of the destination object to be followed based on the cooperation instruction; the robot 1 then obtains the surrounding environment information of the robot in real time, where the distance between the robot 1 and the coordinates is less than or equal to a predetermined distance threshold; then the robot 1 recognizes a corresponding matching object from the surrounding environment information and takes the matching object as the destination object to be followed by the robot 1. Here, the coordinate information may be either absolute coordinate information or relative coordinate information. The robot 1 obtains its surrounding environment information by scanning; if the distance between the robot 1 and the coordinates is less than or equal to the predetermined distance threshold, the matching object corresponding to the coordinate information can be identified from the environment information and set as the destination object.
Further, in one implementation, if, when the robot 1 obtains the cooperation instruction, the distance between its own position and the position of the object to be followed exceeds the predetermined distance threshold, the application further provides a solution for this case: when the distance between the robot 1 and the coordinates exceeds the predetermined distance threshold, the robot 1 is controlled to move toward the coordinates, thereby reducing the distance between the robot 1 and the coordinates; then, during the movement, the surrounding environment information of the robot 1 is obtained in real time, and once the distance between the robot 1 and the coordinates is less than or equal to the predetermined distance threshold, the corresponding matching object can be recognized from the surrounding environment information and taken as the destination object to be followed by the robot 1.
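A compact way to picture this acquisition step is the loop below (Python, assuming the robot exposes simple pose, move, and scan primitives); it is a sketch under those assumptions, not the patent's implementation.

    import math

    def acquire_target(robot, target_xy, dist_threshold, match_fn):
        """Move toward the coordinates from the cooperation instruction until close
        enough, then identify the matching object in the scanned surroundings."""
        while True:
            rx, ry = robot.position()                      # assumed pose interface
            if math.hypot(target_xy[0] - rx, target_xy[1] - ry) <= dist_threshold:
                break
            robot.move_toward(target_xy)                   # reduce the distance first
        for obj in robot.scan_surroundings():              # real-time environment scan
            if match_fn(obj, target_xy):                   # e.g. position near the coordinates
                return obj                                 # becomes the destination object
        return None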
Then, in step S122, the robot 1 can recognize the destination object from the scene it captures in real time. While the robot 1 is moving, the objects in the environment are also in a constantly changing state, so the robot 1 needs to repeat the recognition of the destination object again and again based on the changing environment. In one implementation, the robot 1 can obtain real-time environment data information by periodically scanning the surroundings, then detect from the environment data information all objects belonging to the same class as the destination object, and finally identify the matching destination object according to the detection results of one or more cycles of the continuous scanning.
Specifically, in one implementation, in step S122, the robot 1 can obtain the surrounding environment information of the robot 1 through real-time scanning; then one or more observed objects matching the object feature information of the destination object can be detected from the surrounding environment information. Here, because the destination object determined by the previous object-recognition operation has had its object feature information stored, for example stored in the form of a history observation record, the object feature information of the one or more observed objects determined by scanning the current environment information can be matched for similarity against the stored object feature information of the destination object. The object feature information of an observed object or of the destination object may include, but is not limited to, any of the following: the position information of the object, the motion state information of the object, and the body feature information of the object, where the position information refers to the position of the object at the corresponding scanning moment, the motion state information includes movement information such as the direction of motion and the magnitude of the speed, and the body feature information refers to the appearance features of the object body, including shape, size, and color information. The robot 1 can then identify the destination object from the one or more observed objects; for example, an observed object that reaches a certain matching degree may be judged to be the destination object.
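The similarity matching described here could be realized, for instance, as a weighted comparison of position, motion state, and appearance features. The sketch below (Python) shows one such scoring function; the weights, scales, and feature encoding are assumptions made only for illustration.

    import math

    def feature_similarity(observed, stored, w_pos=0.4, w_motion=0.3, w_body=0.3):
        """Score in [0, 1] how well an observed object matches a stored target record."""
        # position: closer means more similar (assumed 5 m scale)
        d = math.hypot(observed["x"] - stored["x"], observed["y"] - stored["y"])
        s_pos = max(0.0, 1.0 - d / 5.0)
        # motion state: compare speed magnitude and heading direction
        s_motion = max(0.0, 1.0 - abs(observed["speed"] - stored["speed"]) / 2.0) * \
                   max(0.0, math.cos(observed["heading"] - stored["heading"]))
        # body features: e.g. relative size difference (color could be added similarly)
        s_body = 1.0 - min(1.0, abs(observed["size"] - stored["size"]) / stored["size"])
        return w_pos * s_pos + w_motion * s_motion + w_body * s_body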
Further, in one implementation, identifying the destination object from the one or more observed objects may include: determining the association information between each observed object among the one or more observed objects corresponding to the robot 1 and a history observation record, where the one or more observed objects include the destination object and the history observation record includes the object-related information of one or more historically observed objects; then the robot 1 identifies the destination object from the one or more observed objects according to the association information between the observed objects and the history observation record.
Specifically, when the robot 1, repeating the recognition of the destination object again and again based on the changing environment, determines the destination object, this destination object and its corresponding object feature information can be recorded into the history observation record; at the same time, the other observed objects determined together with the destination object, and their corresponding object feature information, can also be matched and likewise recorded into the history observation record. Further, when object recognition is currently being performed, each of the currently obtained observed objects can be data-associated with the history observation record. In one implementation, the data association may mean matching each of the current one or more observed objects against the observation record of each object in the stored history observation record, and the result is the association information. For example, suppose there are N observed objects in the environment in the current scanning cycle, and the robot has previously stored history observation records of M objects, where M may or may not equal N, and the objects corresponding to the N observations and the M records may have one or more objects in common. Performing data association means matching each of the N observed objects, one by one, against the observation records of the M objects in the history observation record, producing a matching degree for each pairing; the overall matching result is a matrix with N rows and M columns whose elements are the corresponding matching degrees, and this matrix is the association information. The observed objects include the destination object. In one implementation, the matching can be a feature matching based on one or more items of object feature information. The destination object is then identified based on the obtained association information: after the association information, that is, the matching-degree matrix, is obtained, a comprehensive analysis is performed to choose the association with the highest overall matching degree, thereby obtaining the destination object.
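The N-by-M matching matrix and the choice of the best overall association can be sketched as follows (Python). For simplicity this sketch resolves the association greedily; an optimal assignment method such as the Hungarian algorithm could equally be used, since the patent text does not prescribe either.

    def associate(observations, history, score_fn):
        """Build the N x M matching-degree matrix and greedily pair observations
        with history records; the pair whose record is the target identifies it."""
        n, m = len(observations), len(history)
        matrix = [[score_fn(observations[i], history[j]) for j in range(m)]
                  for i in range(n)]
        pairs, used_obs, used_hist = [], set(), set()
        for _ in range(min(n, m)):
            best = max(((i, j) for i in range(n) for j in range(m)
                        if i not in used_obs and j not in used_hist),
                       key=lambda ij: matrix[ij[0]][ij[1]], default=None)
            if best is None or matrix[best[0]][best[1]] <= 0:
                break
            pairs.append(best)
            used_obs.add(best[0])
            used_hist.add(best[1])
        return matrix, pairs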
In one implementation, the method further includes step S13 (not shown), in which the robot 1 can update the history observation record according to the one or more observed objects, where the objects in the updated history observation record include the destination object identified from the one or more observed objects. The observed objects corresponding to the robot 1 keep changing as the environment changes; in one implementation, if a new observed object appears, a corresponding observation record is added; if an existing observed object disappears, its corresponding observation record is deleted; and if an existing observed object is still present, the related information in its corresponding observation record is updated.
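Assuming the observations have already been associated with records (for example by the association sketch above), the record update in step S13 might look like the following; the record structure is again an assumption.

    def update_history(history, observations, pairs):
        """history: list of record dicts; observations: current detections (dicts);
        pairs: (observation index, record index) associations."""
        matched_obs = {i for i, _ in pairs}
        matched_hist = {j for _, j in pairs}
        # existing objects still present: refresh their records
        for i, j in pairs:
            history[j].update(observations[i])
        # existing objects that disappeared: drop their records
        history[:] = [rec for j, rec in enumerate(history) if j in matched_hist]
        # newly appeared objects: add fresh records
        for i, obs in enumerate(observations):
            if i not in matched_obs:
                history.append(dict(obs))
        return history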
Then, in step S123, based on the cooperation instruction, the robot 1 can control the robot to move toward the destination object along the corresponding movement path. Specifically, the robot 1 can determine the movement path of the robot 1 to the destination object and then control the robot 1 to move along that movement path. The determination of the movement path and the movement control may both be performed based on the cooperation instruction of the network device 2, or only one of them may be performed based on the cooperation instruction.
In one implementation, based on the cooperation instruction, the robot 1 can control the robot to move toward the destination object along the corresponding movement path, wherein the formation state between the robot and the destination object matches the multi-robot formation status information in the cooperation instruction, and the relative distance between the second robot and the first robot falls within a preset relative distance range threshold. The network device 2 can, through the cooperation instructions, give each robot 1 the formation status information it must maintain while moving, for example keeping a single-column, single-row, or multi-column formation; in one implementation, these formation states can be realized by setting parameters such as the movement path and motion state of the robot 1; in another implementation, the network device 2 can also, through cooperation instructions containing speed control rules, control the running speed of each cooperating robot 1 so as to adjust the distances between the robots 1 and thereby control the movement of the whole queue. Here, the cooperation instruction can be used to control the queue shape of the multiple robots in the multi-robot cooperation task, or even the relative positions between the robots, so that the cooperative work between the robots 1 fits together better and the cooperative task is completed more efficiently.
In one implementation, step S123 may include step S1231 (not shown) and step S1232 (not shown). Specifically, in step S1231, the robot 1 can determine the movement path of the robot 1 to the destination object based on the cooperation instruction; in step S1232, the robot 1 can control the robot 1 to move along the movement path based on the cooperation instruction.
Further, in step S1231, the robot 1 can obtain obstacle information from the surrounding environment information of the robot; then, based on the position information of the identified destination object, determine the target coordinates of the robot 1; and then, based on the cooperation instruction and in combination with the target coordinates and the obstacle information, determine the movement path of the robot to the destination object, where the cooperation instruction includes multi-robot formation status information.
Specifically, the robot 1 first determines the obstacle information between the robot body and the destination object, where obstacles are all objects in the environment other than the destination object; obstacles therefore include static obstacles, such as buildings like walls and pillars when tracking indoors, and moving obstacles, for example observed objects that do not belong to the destination object. Then the current position information of the destination object, for example the position information reported in its corresponding history observation record, is set as the target coordinates of the robot 1. Finally, based on the cooperation instruction, the movement path of the robot to the destination object is determined according to the obstacle distribution and the target coordinates of the robot. In practical applications, since the movement path from one position to another is not unique, the movement path determined for the robot is not unique either; instead, the most suitable path is selected from multiple paths. In the multi-robot cooperation task, the independent movement of each robot must also take into account the coordination between the robots. Here, the cooperation instruction that the network device 2 supplies to each robot 1 includes the multi-robot formation status information used to indicate the movement formation of the cooperating robots 1, for example keeping a single-column, single-row, or multi-column formation; the movement path from the robot to the destination object is then planned with this formation status information. For example, if the robots 1 travel abreast in a row, the path width along the movement route must be considered, and candidate paths of limited width must be excluded. In one implementation, the cooperation instruction containing the formation status information may be received by the corresponding robot 1 before the robot 1 starts to move, or it may be supplied to the robot 1 in real time during the movement, based on changes in the scene.
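As a rough illustration of excluding candidate paths that are too narrow for the commanded formation, consider the filter below (Python); the path representation, the clearance function, and the width margin are assumptions, and any standard planner could produce the candidate paths.

    def feasible_paths(candidate_paths, robots_abreast, clearance_fn, robot_width=0.6):
        """Keep only candidate paths wide enough for the commanded formation.
        robots_abreast: how many robots travel side by side (1 for a single column).
        clearance_fn(path) returns the narrowest free width along the path in metres."""
        required = robots_abreast * robot_width * 1.2   # assumed 20% safety margin
        return [p for p in candidate_paths if clearance_fn(p) >= required]

    # the most suitable path could then be chosen among the survivors,
    # e.g. the shortest one:
    # best = min(feasible_paths(paths, 2, clearance), key=path_length)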
Further, in step S1232, the robot 1 can determine the movement speed of the robot 1 based on the cooperation instruction, where the cooperation instruction includes a speed control rule; then the robot 1 can be controlled to move along the movement path based on the movement speed, where the relative distance between the robot 1 and the destination object, controlled through the movement speed, falls within the preset relative distance range threshold. Specifically, when multiple robots move cooperatively in formation, the relative positions between individual robots 1 must be considered in addition to the formation. For example, in a cooperative moving and carrying task, if the robots 1 move in a single column and the transported object is N metres long, then in order to ensure that all robots carry out the transport task together, the relative position of two adjacent robots 1 is not arbitrary; the two adjacent robots 1 need to stay within a certain distance range. Here, the movement speed of the robot 1 can be determined through the speed control rule in the cooperation instruction, so that the robot 1 can move along the movement path based on that movement speed while keeping the preset distance range to the target robot it follows (which may correspond to another robot 1).
Further, in one implementation, determining the movement speed of the robot 1 based on the cooperation instruction, where the cooperation instruction includes a speed control rule, includes: determining the movement speed of the robot 1 based on the speed control rule, where the movement speed includes a forward speed and/or a turning speed. Here, the motion of the robot 1 is constrained by the kinematics and dynamics of the robot body, and the size of the robot 1 must also be considered for collision avoidance. When the robot 1 is controlled to move along the movement path, on the one hand the direction of motion of the robot 1 is controlled so that it does not leave the path region, and on the other hand the movement speed of the robot 1 needs to be controlled. Further, preferably, the movement speed of the robot 1 is divided into two components, the forward speed and the turning speed; specifically, the forward speed refers to the speed component along the heading direction of the robot 1, and the turning speed refers to the speed component that changes the direction of motion.
On this basis, a further implementation is as follows: when the distance between the robot 1 and the destination object is greater than a distance threshold, the forward speed and the turning speed are planned and controlled at the same time; when the distance between the robot 1 and the destination object is less than the distance threshold, that is, when the robot is already close to the destination object, only the direction of motion of the robot, namely the turning speed, needs to be finely adjusted.
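One way to picture this two-regime speed control is the sketch below (Python); the gains, thresholds, and simple proportional rules are illustrative assumptions, not values given in the patent.

    import math

    def speed_command(robot_pose, target_xy, near_threshold=1.5,
                      k_forward=0.8, k_turn=1.5, cruise=1.0):
        """Return (forward_speed, turning_speed) toward the target.
        Far from the target both components are planned; near it, only the
        heading (turning speed) is fine-tuned."""
        rx, ry, heading = robot_pose
        dx, dy = target_xy[0] - rx, target_xy[1] - ry
        dist = math.hypot(dx, dy)
        heading_error = math.atan2(dy, dx) - heading
        heading_error = math.atan2(math.sin(heading_error), math.cos(heading_error))
        if dist > near_threshold:
            forward = min(cruise, k_forward * dist)     # plan both components
            turning = k_turn * heading_error
        else:
            forward = 0.2                               # assumed slow approach speed
            turning = k_turn * heading_error            # only fine-tune the direction
        return forward, turning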
In this application, after obtaining the cooperation instruction, the robot 1 determines the destination object it is to follow, then recognizes the destination object from the scene captured in real time, and thereby, based on the cooperation instruction, controls the robot 1 to move toward the destination object along the corresponding movement path. Compared with existing robot-following techniques, the application can accurately lock onto and effectively track the destination object in natural environments that change in real time and contain many interfering factors, thereby improving the accuracy of following and solving the technical problems, common in current robot following, of following the wrong target or losing the target. At the same time, controlling the robot, based on the cooperation instruction, to move toward the destination object along the corresponding movement path makes it possible to realize coordinated formation movement of multiple robots as a whole.
In one implementation, in step S21, the network device 2 can provide a first cooperation instruction to a first robot, where the first robot, based on the first cooperation instruction, controls the first robot to move toward a destination object or destination position along a corresponding movement path; and then provide a second cooperation instruction to a second robot, where the second robot, based on the second cooperation instruction, controls the second robot to follow the first robot along a corresponding movement path. Further, in one implementation, the relative formation state between the second robot and the first robot matches the multi-robot formation status information in the cooperation instruction, and the relative distance between the second robot and the first robot falls within a preset relative distance range threshold. Here, the first robot and the second robot may correspond to different robots 1; in one implementation, the same multi-robot cooperation task may be performed jointly by one or more first robots and one or more second robots. In one implementation, the first cooperation instruction and the second cooperation instruction may be identical or different.
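On the network-device side, issuing a leader instruction and follower instructions could be pictured as below (Python, using plain dictionaries as instructions); which robot leads, the formation, and the distance range are assumptions made only for illustration.

    def dispatch_instructions(robot_ids, goal_xy, distance_range=(0.8, 1.5)):
        """First robot gets a move-to-goal instruction; each later robot follows
        its predecessor while keeping the preset relative distance range."""
        instructions = {}
        for idx, rid in enumerate(robot_ids):
            if idx == 0:
                # first cooperation instruction: move toward the destination position
                instructions[rid] = {"formation": "single_column",
                                     "target_coordinates": goal_xy}
            else:
                # second cooperation instruction: follow the robot ahead
                instructions[rid] = {"formation": "single_column",
                                     "follow": robot_ids[idx - 1],
                                     "keep_distance": distance_range}
        return instructions

    # e.g. dispatch_instructions(["R1", "R2", "R3"], (12.0, 3.5))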
Fig. 3 shows a system diagram for multi-robot cooperation according to one aspect of the application. The system includes the robot 1 and the network device 2.
The robot 1 includes a first device 31 and a second device 32, and the network device 2 includes a fourth device 41.
The embodiment of the application provides a system for multi-robot cooperation, and the system may include a robot and a network device. The robot includes all kinds of mechanical equipment that performs work automatically; it may be mechanical equipment with a movement function, a load-carrying function, or other functions, or mechanical equipment that has several of these functions at the same time, for example various artificial-intelligence devices that can move and carry loads. In this application, the functions of the multiple robots performing the same cooperative task may be the same or different. The network device includes, but is not limited to, a computer, a network host, a single network server, a set of multiple network servers, or a cloud server, where the cloud server may be a virtual supercomputer running in a distributed system and composed of a group of loosely coupled computers, used to provide simple, efficient, safe, and reliable computing services whose processing capacity can scale elastically. In this application, the robot may refer to the robot 1 and the network device may refer to the network device 2.
Specifically, the fourth device 41 can provide matching cooperation instructions to one or more robots 1, wherein a robot 1 performs the corresponding multi-robot cooperation task based on its cooperation instruction. Accordingly, the first device 31 obtains from the network device 2 the cooperation instruction matching the robot itself. Here, the multi-robot cooperation task may be any of the various tasks performed jointly by multiple robots 1: for example, multiple robots 1 moving synchronously while keeping the same distance; or multiple robots 1 jointly delivering the same object; or multiple robots 1 performing an assembly task for the parts of an object. In one implementation, the network device 2 may match different cooperation instructions to different robots 1 according to the type of cooperative task or the specific cooperative operation.
In one implementation, the cooperation instruction may include at least any one of the following: multi-robot formation status information of the robots; a speed control rule of the robot; coordinate information of the destination object to be followed by the robot; and other execution-related information of the robot.
Specifically, taking as examples the scenarios in which multiple robots 1 move synchronously while keeping the same distance, or jointly deliver the same object: in one implementation, the network device 2 may, through the cooperation instructions, give each robot 1 the formation status information it must maintain while moving, for example keeping a single-column, single-row, or multi-column formation; in another implementation, the network device 2 may also, through cooperation instructions containing speed control rules, control the running speed of each cooperating robot 1 so as to adjust the distances between the robots 1 and thereby control the movement of the whole queue; in yet another implementation, the network device 2 may also provide one or more robots 1 with the coordinate information of the destination object to be followed, which may be the coordinate information of a destination object determined when the moving operation starts, or coordinate information of the destination object provided in real time during the movement.
Taking as an example the scenario in which multiple robots 1 perform an assembly task for the parts of an object, the cooperation instruction may include the speed control rules that move each robot 1 to its respective assembly position, the coordinate information of the robot's target position, the assembly operation step information of the robot, and so on. In addition, the cooperation instruction can be adapted to the specific needs of other cooperative tasks.
In one implementation, the fourth device 41 may send cooperation instructions to all robots 1 of a cooperative task at the same time in a unified manner; in another implementation, the fourth device 41 may also send cooperation instructions to any one or more robots 1 separately at any time. In one implementation, the cooperation instructions corresponding to the multiple robots 1 in the same cooperative task may be identical; they may also differ, or be partly identical and partly different. For example, when multiple robots 1 move synchronously in a single-column formation while keeping the same distance, the cooperation instruction of the first robot 1 in the queue may differ from the cooperation instructions of the other robots 1 in the queue.
Then, the second device 32 can perform the corresponding multi-robot cooperation task based on the cooperation instruction. In one implementation, the robots 1 do not need to communicate with each other directly in order to accomplish the cooperative task; instead, the network device 2 can control the cooperating robots 1 in real time through the cooperation instructions, and each robot 1 executes its cooperation instruction so that the cooperative task is completed. In one implementation, the network device 2 may give each robot 1 only the instructions necessary for cooperating with the other robots, while operations that do not require cooperation can be executed by the robot 1 independently. For example, in the scenario where multiple robots 1 move synchronously while keeping the same distance, or jointly deliver the same object, keeping the overall formation and controlling the running speed of the queue can be controlled by the network device 2 through the cooperation instructions, while the specific following operations of each robot 1, such as determining and recognizing the object to follow, can be set and executed by each robot 1 itself.
In this application, the multiple independent robots 1 carrying out a cooperative task can jointly perform the corresponding multi-robot cooperation task based on the cooperation instructions obtained from the corresponding network device 2. Here, according to the application needs of a concrete scenario, the cooperation instructions issued by the network device 2 can flexibly combine multiple independent robots, so that the combined robots can cooperatively complete tasks with a large workload or a complex division of labor, which helps decompose complex work and optimize overall resources.
In one implementation, the second device 32 can control the robot 1 to move toward a destination position or destination object along a corresponding movement path based on the cooperation instruction. Here, the multi-robot cooperation task of the application may be a cooperative task that requires formation movement of multiple robots, for example multiple robots 1 moving synchronously while keeping the same distance, or multiple robots 1 jointly delivering the same object. Specifically, in one implementation, based on the cooperation instruction the robot 1 may be controlled to move toward a destination position along a corresponding movement path; for example, the robot 1 may be one of one or more robots at the front of the queue, which may have no specific destination object but rather a destination position to reach. In one implementation, based on the cooperation instruction the robot 1 may also be controlled to move toward a destination object along a corresponding movement path; for example, one or more robots 1 at the front of the robot queue may have an object to track, such as a moving person or thing, while a robot 1 that is not at the front of the queue needs to follow a destination object, namely a target robot, which may be the closest other robot in front of the robot 1, or another robot that is preset or determined based on the cooperation instruction.
In this implementation, the robots 1 can be used to realize multi-robot formation movement; for example, the cooperating robots 1 can move toward a destination position, or follow a destination object, based on the matching cooperation instructions, so as to realize formation movement of multiple robots 1. Based on this implementation, all kinds of cooperative tasks that need to be realized through formation movement of multiple robots, such as cooperative moving and carrying tasks, can be realized flexibly and effectively.
Further, in one implementation, the second device 32 includes a first unit (not shown), a second unit (not shown), and a third unit (not shown).
Specifically, the first unit can determine the destination object to be followed by the robot 1. In one implementation, the destination object includes a target robot, and the robot and its corresponding target robot carry the same transport object; in this case the cooperative task may correspond to a cooperative moving and carrying task. The robot 1 needs to determine the destination object it is to follow when the cooperative task starts.
In one implementation, when the robot 1 is set to follow mode, the first unit can recognize a corresponding matching object from the surrounding information captured by the robot 1 in real time, and then take the matching object as the destination object to be followed by the robot 1. In one implementation, the follow mode of the robot 1 can be started by a preset trigger operation. When the follow mode starts, the robot 1 can capture surrounding information in real time; in one implementation, the raw data of the surrounding environment information can be obtained through one or more sensing devices on the robot 1, and the raw data may be images, pictures, or point clouds. The robot 1 then detects, from the raw data, the object type that needs to be followed; one or more objects in the environment may belong to that object type. Using machine-learning methods, a classifier is trained in advance, that is, feature information extracted from scan data of a certain class of objects is input into the classifier, and objects of that class are detected from the environment information by comparison. There are often several objects of that class, and the matching object is the object selected from them as the destination object.
Further, in one implementation, the matching object may include, but is not limited to, at least any one of the following: the object around the robot 1 that is closest to the robot 1; the object in front of the robot 1 that is closest to the robot 1; an object that is in front of the robot 1 and closest to the robot 1; an object around the robot 1 that matches the object feature information of the object to be followed; the object around the robot 1 that best matches the object feature information of the object to be followed; and, among multiple objects around the robot 1 that match the object feature information of the object to be followed, the object closest to the robot. In one implementation, the object feature information may include, but is not limited to, one or more of the position information, the motion state information, and the body feature information of the object to be followed.
Further, in one implementation, the first unit can determine the coordinate information of the destination object to be followed based on the cooperation instruction; the robot 1 then obtains the surrounding environment information of the robot in real time, where the distance between the robot 1 and the coordinates is less than or equal to a predetermined distance threshold; then the robot 1 recognizes a corresponding matching object from the surrounding environment information and takes the matching object as the destination object to be followed by the robot 1. Here, the coordinate information may be either absolute coordinate information or relative coordinate information. The robot 1 obtains its surrounding environment information by scanning; if the distance between the robot 1 and the coordinates is less than or equal to the predetermined distance threshold, the matching object corresponding to the coordinate information can be identified from the environment information and set as the destination object.
Further, in one implementation, if, when the robot 1 obtains the cooperation instruction, the distance between its own position and the position of the object to be followed exceeds the predetermined distance threshold, the application further provides a solution for this case: when the distance between the robot 1 and the coordinates exceeds the predetermined distance threshold, the robot 1 is controlled to move toward the coordinates, thereby reducing the distance between the robot 1 and the coordinates; then, during the movement, the surrounding environment information of the robot 1 is obtained in real time, and once the distance between the robot 1 and the coordinates is less than or equal to the predetermined distance threshold, the corresponding matching object can be recognized from the surrounding environment information and taken as the destination object to be followed by the robot 1.
Then, second unit can recognize the destination object from the robot 1 in real time scene of capture.In machineIn the moving process of people 1, each object in environment is also at the state being continually changing, therefore, robot 1 is needed based on real-time changeEnvironment again and again repeat destination object identification operation.In one implementation, robot 1 can be by the cycleProperty ground scanning surrounding environment, obtain real time environmental data information, then detected from the environmental data information and destination object categoryIn of a sort all objects, finally according to some cycle or the testing result in multiple cycles of lasting scanning, phase is identifiedThe destination object of matching;
Specifically, in one implementation, the second unit may scan in real time to obtain the surrounding environment information of the robot 1; one or more observation objects matching the object feature information of the target object can then be detected from the surrounding environment information. Here, the object feature information of the target object determined by the last target object recognition operation has been stored, for example in the form of a history observation record, so the object feature information of the one or more observation objects found in the current environment scan can be similarity-matched against the stored object feature information of the target object. The object feature information of an observation object or of the target object may include, but is not limited to, any of the following: the position information of the object, i.e. the position of the object at the corresponding scanning moment; the motion state information of the object, including motion information such as the direction and magnitude of its velocity; and the main body feature information of the object, i.e. the appearance of the object body, including shape, size and color information. The robot 1 can then identify the target object from the one or more observation objects, for example by taking an observation object whose matching degree reaches a certain level as the target object.
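By way of example only, a possible similarity measure over these three kinds of object feature information might look as follows; the feature encoding, the weights and the matching-degree threshold are illustrative assumptions rather than values fixed by the application.

```python
import math

def feature_similarity(observed, target):
    """Combine position, motion-state and main-body-feature similarity into a single
    matching degree in [0, 1]; the weights are illustrative, not prescribed."""
    pos = math.exp(-math.dist(observed["position"], target["position"]))            # position information
    vel = math.exp(-math.dist(observed["velocity"], target["velocity"]))            # motion state information
    size = 1.0 - min(1.0, abs(observed["size"] - target["size"]) / target["size"])  # main body feature: size
    color = 1.0 if observed["color"] == target["color"] else 0.0                    # main body feature: color
    return 0.4 * pos + 0.3 * vel + 0.2 * size + 0.1 * color

def identify_target(observations, stored_target_features, min_degree=0.6):
    """Return the observation object that best matches the stored target features,
    provided its matching degree reaches the required level."""
    if not observations:
        return None
    best = max(observations, key=lambda o: feature_similarity(o, stored_target_features))
    return best if feature_similarity(best, stored_target_features) >= min_degree else None
```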
Further, in one implementation, identifying the target object from the one or more observation objects may include: determining association information between each observation object among the one or more observation objects corresponding to the robot 1 and a history observation record, where the one or more observation objects include the target object and the history observation record includes the object-related information of one or more historical observation objects; the robot 1 then identifies the target object from the one or more observation objects according to the association information between the observation objects and the history observation record.
Specifically, each time the robot 1 repeats the target object recognition operation on the changing environment and determines the target object, the target object and its corresponding object feature information can be recorded into the history observation record; at the same time, the other observation objects determined together with the target object, and their corresponding object feature information, can also be matched and recorded into the history observation record. When the current target object recognition operation is performed, each observation object among the currently obtained one or more observation objects can be data-associated with the history observation record. In one implementation, the data association may mean matching each of the current one or more observation objects against the observation record of each object in the stored history observation record, the result being the association information. For example, suppose there are N observation objects in the environment in the current scanning cycle, and the robot has previously stored the history observation records of M objects, where M may or may not equal N, and the N objects and the M objects may have one or more objects in common. Performing data association means matching each of the N observation objects, one by one, against the observation record of each of the M objects in the history observation record, obtaining a matching degree for each pair; the overall matching result is an N-by-M matrix whose elements are the corresponding matching degrees, and this matrix is the association information. The observation objects include the target object. In one implementation, the matching can be feature matching based on one or more pieces of object feature information of the objects. The target object is then identified based on the obtained association information: after the association information, i.e. the matching-degree matrix, is obtained, the association with the highest overall matching degree is selected through comprehensive analysis, thereby obtaining the target object.
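The application only requires that the association maximizing the overall matching degree be selected; one conventional way to do this, shown here purely as a sketch, is to treat the N-by-M matching-degree matrix as an assignment problem and solve it with the Hungarian algorithm provided by SciPy.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def associate(observations, history_records, similarity):
    """Build the N-by-M matching-degree matrix (the association information) and select
    the assignment with the highest overall matching degree."""
    degrees = np.array([[similarity(obs, rec) for rec in history_records]
                        for obs in observations])                 # N rows, M columns
    rows, cols = linear_sum_assignment(-degrees)                  # negation -> maximize total matching degree
    return degrees, list(zip(rows.tolist(), cols.tolist()))

def find_target(observations, history_records, target_record_index, similarity):
    """The target object is the observation object associated with the history record of
    the target; returns None if no observation was associated with that record."""
    _, pairs = associate(observations, history_records, similarity)
    for obs_index, record_index in pairs:
        if record_index == target_record_index:
            return observations[obs_index]
    return None
```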
In one implementation, the robot 1 further includes a third device (not shown), and the robot 1 can update the history observation record according to the one or more observation objects, where the updated history observation record includes the target object identified from the one or more observation objects. The observation objects corresponding to the robot 1 keep changing as the environment changes; in one implementation, if a new observation object appears, a corresponding observation record is added; if an existing observation object disappears, the observation record corresponding to that observation object is deleted; and if an existing observation object is still present, the relevant information in its observation record is updated.
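A minimal sketch of such a history update is given below, assuming (purely for illustration) that the record is a plain dictionary keyed by integer record ids and that data association returns (observation index, record id) pairs.

```python
def update_history(history, associations, observations, now, max_age=5.0):
    """history:      dict record_id -> {"features": ..., "last_seen": timestamp}
    associations: list of (observation_index, record_id) pairs from data association."""
    associated = {obs_index for obs_index, _ in associations}
    for obs_index, record_id in associations:                     # still-present objects: update the record
        history[record_id] = {"features": observations[obs_index], "last_seen": now}
    for obs_index, obs in enumerate(observations):                # new objects: add an observation record
        if obs_index not in associated:
            history[max(history, default=-1) + 1] = {"features": obs, "last_seen": now}
    stale = [record_id for record_id, record in history.items()   # disappeared objects: delete the record
             if now - record["last_seen"] > max_age]
    for record_id in stale:
        del history[record_id]
```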
The third unit can then, based on the cooperation instruction, control the robot to move to the target object along a corresponding movement path. Specifically, the robot 1 can determine the movement path from the robot 1 to the target object and then control the robot 1 to move along that path. Either the determination of the movement path or the movement control behavior, or both, may be performed based on the cooperation instruction of the network device 2.
In one implementation, the third unit can, based on the cooperation instruction, control the robot to move to the target object along the corresponding movement path, where the formation state between the robot and the target object matches the multi-robot formation state information in the cooperation instruction, and the relative distance between the second robot and the first robot is kept within a preset relative distance range threshold. The network device 2 can provide, through the cooperation instruction, the formation state information that each robot 1 needs to maintain while moving, for example keeping a single-column, single-row or multi-column formation. In one implementation, these formation states can be realized by setting parameters such as the movement path and the motion state of each robot 1; in another implementation, the network device 2 can also control the running speed of each cooperating robot 1 through a cooperation instruction containing a speed control rule, so as to adjust the distance between the robots 1 and thereby control the movement of the whole queue. In this way, the cooperation instruction can control the queue shape of the multiple robots in the multi-robot cooperation task, and even the specific relative positions between the robots, so that the cooperative fit between the robots 1 is improved and the cooperative task is completed more efficiently.
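For concreteness, a cooperation instruction carrying such formation and speed information could be modeled roughly as below; the field names are illustrative assumptions, not a definition of the instruction format.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class CooperationInstruction:
    """Illustrative content of a cooperation instruction sent by the network device."""
    target_coordinate: Tuple[float, float]   # coordinate information of the target object or position
    formation: str                           # multi-robot formation state, e.g. "column", "row", "multi-column"
    min_spacing: float                       # lower bound of the relative distance range threshold
    max_spacing: float                       # upper bound of the relative distance range threshold
    speed_rule: Optional[dict] = None        # speed control rule, e.g. {"v_max": 1.2, "gain": 0.8}
```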
In one implementation, the third unit may include a first subunit (not shown) and a second subunit (not shown). Specifically, the first subunit can determine, based on the cooperation instruction, the movement path from the robot 1 to the target object; the second subunit can control the robot 1 to move along the movement path based on the cooperation instruction.
Further, the first subunit can obtain obstacle information from the surrounding environment information of the robot; then, based on the position information of the identified target object, determine the target coordinates of the robot 1; and then, based on the cooperation instruction and in combination with the target coordinates and the obstacle information, determine the movement path of the robot to the target object, where the cooperation instruction includes multi-robot formation state information.
Specifically, the first subunit first determines the obstacle information between the robot body and the target object, where the obstacles are all objects in the environment other than the target object; the obstacles therefore include static obstacles, such as walls and pillars when tracking indoors, as well as moving obstacles, for example observation objects that do not belong to the target object. Then, the current position information of the target object, for example the position information recorded in the corresponding history observation record, is set as the target coordinates of the robot 1. Finally, based on the cooperation instruction, the movement path of the robot to the target object is determined according to the obstacle distribution and the target coordinates of the robot. In practice, the path from one position to another is not unique, so the movement path determined for the robot is not unique either; the most suitable path is selected from multiple candidate paths. In a multi-robot cooperation task, the independent movement of each robot must also take the cooperation between the robots into account. Here, the cooperation instruction that the network device 2 supplies to each robot 1 includes multi-robot formation state information indicating the formation in which the cooperating robots 1 are to move, for example keeping a single-column, single-row or multi-column formation, and the movement path from the robot to the target object is planned according to that formation state information. For example, if the robots 1 advance side by side in a row, the path width along the movement route must be considered and candidate paths of insufficient width are excluded. In one implementation, the cooperation instruction containing the formation state information may be received by the corresponding robot 1 before the movement starts, or it may be supplied to the robot 1 in real time during the movement based on changes in the scene.
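Purely as an illustration of excluding candidate paths that are too narrow for the commanded formation, the sketch below runs A* on an occupancy grid and only visits cells with enough free clearance around them; the grid representation and the clearance test are assumptions, not the application's prescribed planner.

```python
import heapq

def plan_path(grid, start, goal, required_width):
    """A* on an occupancy grid (grid[r][c] is True where an obstacle was detected) that
    skips cells without a free square of side `required_width` cells around them, which
    excludes candidate paths that are too narrow for the formation."""
    rows, cols = len(grid), len(grid[0])
    half = required_width // 2

    def clear(r, c):
        for dr in range(-half, half + 1):
            for dc in range(-half, half + 1):
                rr, cc = r + dr, c + dc
                if not (0 <= rr < rows and 0 <= cc < cols) or grid[rr][cc]:
                    return False
        return True

    def h(cell):  # Manhattan-distance heuristic toward the target coordinates
        return abs(cell[0] - goal[0]) + abs(cell[1] - goal[1])

    open_set = [(h(start), 0, start, None)]
    came_from, best_cost = {}, {start: 0}
    while open_set:
        _, cost, cell, parent = heapq.heappop(open_set)
        if cell in came_from:
            continue
        came_from[cell] = parent
        if cell == goal:
            path = []
            while cell is not None:              # walk back through the parents
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (cell[0] + dr, cell[1] + dc)
            if 0 <= nxt[0] < rows and 0 <= nxt[1] < cols and clear(*nxt):
                new_cost = cost + 1
                if new_cost < best_cost.get(nxt, float("inf")):
                    best_cost[nxt] = new_cost
                    heapq.heappush(open_set, (new_cost + h(nxt), new_cost, nxt, cell))
    return None  # no candidate path wide enough was found
```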
Further, the second subunit can determine the moving speed of the robot 1 based on the cooperation instruction, where the cooperation instruction includes a speed control rule; the robot 1 is then controlled to move along the movement path at that moving speed, where the moving speed is used to keep the relative distance between the robot 1 and the target object within the preset relative distance range threshold. Specifically, when multiple cooperating robots move in formation, besides the queue shape, the relative positions between specific robots 1 must also be considered. For example, in a cooperative carrying task, if the robots 1 move in a single column and the carried object is N meters long, then, to ensure that each robot carries the object at the same time, the relative position of two adjacent robots 1 cannot be arbitrary; adjacent robots 1 must be kept within a certain distance range. Here, the moving speed of the robot 1 can be determined by the speed control rule in the cooperation instruction, so that the robot 1 can move along the movement path at that speed while maintaining the preset distance range from the target robot it follows (which may correspond to another robot 1).
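A speed control rule of this kind could, for example, be a simple proportional rule on the gap to the robot ahead; the sketch below is illustrative only, with hypothetical parameter values.

```python
def follow_speed(gap, leader_speed, min_gap, max_gap, v_max, gain=0.5):
    """Keep the relative distance to the robot ahead within [min_gap, max_gap] by
    speeding up when the gap grows and slowing down when it shrinks."""
    desired_gap = 0.5 * (min_gap + max_gap)
    speed = leader_speed + gain * (gap - desired_gap)   # proportional correction on the gap error
    return max(0.0, min(v_max, speed))                  # respect the robot's own speed limit

# e.g. follow_speed(gap=1.8, leader_speed=1.0, min_gap=1.0, max_gap=2.0, v_max=1.5) -> 1.15
```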
Further, in one implementation, determining the moving speed of the robot 1 based on the cooperation instruction, where the cooperation instruction includes a speed control rule, includes: determining the moving speed of the robot 1 based on the speed control rule, where the moving speed includes a forward speed and/or a turning speed. Here, the motion of the robot 1 is subject to the kinematic and dynamic constraints of the robot body, and the size of the robot 1 must also be considered for collision avoidance. When the robot 1 is controlled to move along the movement path, on the one hand the direction of motion of the robot 1 is controlled so that it does not leave the path region, and on the other hand the moving speed of the robot 1 is controlled. Further, preferably, the moving speed of the robot 1 is divided into two components, a forward speed and a turning speed; specifically, the forward speed is the speed component along the heading direction of the robot 1, and the turning speed is the speed component used to adjust the direction of motion of the robot 1.
On this basis, a further implementation is as follows: when the distance between the robot 1 and the target object is greater than a distance threshold, both the forward speed and the turning speed are planned and controlled at the same time; when the distance between the robot 1 and the target object is less than the distance threshold, that is, when the robot is already close to the target object, only the direction of motion of the robot, i.e. the turning speed, needs to be finely adjusted.
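This two-mode behavior can be summarized by the following sketch, where plan_forward and plan_turning stand in for the robot's own (unspecified) speed planners and the threshold value is illustrative.

```python
def plan_velocity(distance_to_target, plan_forward, plan_turning, threshold=1.0):
    """Far from the target object, plan the forward speed and the turning speed together;
    close to the target object, only fine-tune the turning speed (the direction of
    motion). Returning None for the forward speed means: leave it unchanged."""
    if distance_to_target > threshold:
        return plan_forward(), plan_turning(fine=False)   # plan both speed components
    return None, plan_turning(fine=True)                  # micro-adjust the heading only
```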
In this application, after obtaining the cooperation instruction, the robot 1 determines the target object it is to follow, then recognizes the target object in real time from the scene captured by the robot, and thereby, based on the cooperation instruction, controls the robot 1 to move to the target object along the corresponding movement path. Compared with existing robot-following techniques, the application can accurately lock onto the target object and track it effectively in natural environments that change in real time and contain many interfering factors, thereby improving the accuracy with which the robot follows and solving the technical problems, common in current robot following, of following the wrong target or losing the target. At the same time, by controlling each robot to move to the target object along the corresponding movement path based on the cooperation instruction, the coordinated formation movement of multiple robots can be realized as a whole.
In one implementation, the fourth device 41 of the network device 2 can provide a first cooperation instruction to a first robot, where the first robot, based on the first cooperation instruction, controls itself to move to a target object or target position along a corresponding movement path; a second cooperation instruction is then provided to a second robot, where the second robot, based on the second cooperation instruction, controls itself to follow the first robot along a corresponding movement path. Further, in one implementation, the formation state between the second robot and the first robot matches the multi-robot formation state information in the cooperation instructions, and the relative distance between the second robot and the first robot is kept within a preset relative distance range threshold. Here, the first robot and the second robot may correspond to different robots 1; in one implementation, the same multi-robot cooperation task may be performed jointly by one or more first robots and one or more second robots. In one implementation, the first cooperation instruction and the second cooperation instruction may be identical or different.
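As a rough sketch of how the network device might split one cooperation task into a first and a second cooperation instruction; all interfaces below, such as network_device.send and the task fields, are assumptions made for illustration.

```python
def dispatch_cooperation_instructions(network_device, first_robots, second_robots, task):
    """Send a move-to instruction to the first robot(s) and a follow instruction to the
    second robot(s); both carry the same formation state and spacing information."""
    first_instruction = {"action": "move_to", "target": task.target,
                         "formation": task.formation, "spacing": task.spacing}
    second_instruction = {"action": "follow", "leader": first_robots[0],
                          "formation": task.formation, "spacing": task.spacing}
    for robot in first_robots:                        # one or more first robots
        network_device.send(robot, first_instruction)
    for robot in second_robots:                       # one or more second robots
        network_device.send(robot, second_instruction)
```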
It is obvious to those skilled in the art that the application is not limited to the details of the above exemplary embodiments, and that the application can be implemented in other specific forms without departing from the spirit or essential characteristics of the application. Therefore, from whichever point of view, the embodiments should be regarded as exemplary and non-restrictive, and the scope of the application is defined by the appended claims rather than by the above description; it is therefore intended that all changes falling within the meaning and scope of equivalency of the claims be included in the application. No reference sign in a claim should be construed as limiting the claim concerned. In addition, it is clear that the word "including" does not exclude other units or steps, and the singular does not exclude the plural. A plurality of units or devices stated in a device claim may also be implemented by a single unit or device through software or hardware. Words such as "first" and "second" are used to denote names and do not denote any particular order.

Claims (24)

CN201710067320.2A2017-02-072017-02-07Method and equipment for multi-robot cooperationActiveCN106774345B (en)

Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
CN201710067320.2ACN106774345B (en)2017-02-072017-02-07Method and equipment for multi-robot cooperation

Applications Claiming Priority (1)

Application Number | Priority Date | Filing Date | Title
CN201710067320.2ACN106774345B (en)2017-02-072017-02-07Method and equipment for multi-robot cooperation

Publications (2)

Publication Number | Publication Date
CN106774345Atrue CN106774345A (en)2017-05-31
CN106774345B CN106774345B (en)2020-10-30

Family

ID=58956308

Family Applications (1)

Application Number | Title | Priority Date | Filing Date
CN201710067320.2AActiveCN106774345B (en)2017-02-072017-02-07Method and equipment for multi-robot cooperation

Country Status (1)

Country | Link
CN (1)CN106774345B (en)

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
CN103608741A (en)*2011-06-132014-02-26微软公司Tracking and following of moving objects by a mobile robot
CN102662377A (en)*2012-05-172012-09-12哈尔滨工业大学Formation system and formation method of multi-mobile robot based on wireless sensor network
CN103901889A (en)*2014-03-272014-07-02浙江大学Multi-robot formation control path tracking method based on Bluetooth communications
CN104950887A (en)*2015-06-192015-09-30重庆大学Transportation device based on robot vision system and independent tracking system
CN105425791A (en)*2015-11-062016-03-23武汉理工大学Swarm robot control system and method based on visual positioning
CN105527960A (en)*2015-12-182016-04-27燕山大学Mobile robot formation control method based on leader-follow
CN106094835A (en)*2016-08-012016-11-09西北工业大学The dynamic formation control method of front-wheel drive vehicle type moving machine device people
CN106155065A (en)*2016-09-282016-11-23上海仙知机器人科技有限公司A kind of robot follower method and the equipment followed for robot

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
中国人工智能学会编著: "《中国人工智能进展》", 31 December 2009*
卢惠民: "《ROS与中型组足球机器人》", 31 October 2016*

Cited By (38)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
CN111065981A (en)*2017-09-252020-04-24日本电产新宝株式会社Moving body and moving body system
CN109683556B (en)*2017-10-182021-02-09苏州宝时得电动工具有限公司Cooperative work control method and device for self-moving equipment and storage medium
CN109683556A (en)*2017-10-182019-04-26苏州宝时得电动工具有限公司From mobile device work compound control method, device and storage medium
CN110153983A (en)*2018-02-152019-08-23欧姆龙株式会社Control system, slave device control unit, control method and storage medium
CN110355751B (en)*2018-03-262023-04-28发那科株式会社Control device and machine learning device
CN110355751A (en)*2018-03-262019-10-22发那科株式会社Control device and machine learning device
CN108428059B (en)*2018-03-272021-07-16昆明理工大学 A method for formation and evolution of pipeline inspection robot queue
CN108428059A (en)*2018-03-272018-08-21昆明理工大学A kind of detecting robot of pipe queue forms and develops method
CN108527367A (en)*2018-03-282018-09-14华南理工大学A kind of description method of multirobot work compound task
CN108527367B (en)*2018-03-282021-11-19华南理工大学Description method of multi-robot cooperative work task
CN108873913A (en)*2018-08-222018-11-23深圳乐动机器人有限公司From mobile device work compound control method, device, storage medium and system
CN109740464A (en)*2018-12-212019-05-10北京智行者科技有限公司The identification follower method of target
CN109765889A (en)*2018-12-312019-05-17深圳市越疆科技有限公司A kind of monitoring method of robot, device and intelligent terminal
CN109676611A (en)*2019-01-252019-04-26北京猎户星空科技有限公司Multirobot cooperating service method, device, control equipment and system
CN111766854A (en)*2019-03-272020-10-13杭州海康机器人技术有限公司Control system and control method for AGV cooperative transportation
CN109947105A (en)*2019-03-272019-06-28科大智能机器人技术有限公司A kind of speed regulating method and speed regulation device of automatic tractor
CN110347159B (en)*2019-07-122022-03-08苏州融萃特种机器人有限公司Mobile robot multi-machine cooperation method and system
CN110347159A (en)*2019-07-122019-10-18苏州融萃特种机器人有限公司Mobile robot Multi computer cooperation method and system
CN112775957A (en)*2019-11-082021-05-11珠海市一微半导体有限公司Control method of working robot, working robot system and chip
CN112540605A (en)*2020-03-312021-03-23深圳优地科技有限公司Multi-robot cooperation clearance method, server, robot and storage medium
CN111443642A (en)*2020-04-242020-07-24深圳国信泰富科技有限公司Cooperative control system and method for robot
CN111612312A (en)*2020-04-292020-09-01深圳优地科技有限公司Robot distribution method, robot, terminal device and storage medium
CN111612312B (en)*2020-04-292023-12-22深圳优地科技有限公司Robot distribution method, robot, terminal device, and storage medium
CN112396653A (en)*2020-10-312021-02-23清华大学Target scene oriented robot operation strategy generation method
CN112873206A (en)*2021-01-222021-06-01中国铁建重工集团股份有限公司Multi-task automatic distribution mechanical arm control system and operation trolley
CN116997442A (en)*2021-03-262023-11-03Abb瑞士股份有限公司 Industrial robots with point-to-point communication interfaces to support collaboration between robots
CN113771033A (en)*2021-09-132021-12-10中冶赛迪技术研究中心有限公司Multi-robot site integrated control system, method, device and medium
CN114019912A (en)*2021-10-152022-02-08上海电机学院 A swarm robot motion planning control method and system
CN114019912B (en)*2021-10-152024-02-27上海电机学院Group robot motion planning control method and system
CN114296460A (en)*2021-12-302022-04-08杭州海康机器人技术有限公司Cooperative transportation method and device, readable storage medium and electronic equipment
CN114296460B (en)*2021-12-302023-12-15杭州海康机器人股份有限公司Collaborative handling method and device, readable storage medium and electronic equipment
CN114227699A (en)*2022-02-102022-03-25乐聚(深圳)机器人技术有限公司Robot motion adjustment method, robot motion adjustment device, and storage medium
CN114227699B (en)*2022-02-102024-06-11乐聚(深圳)机器人技术有限公司Robot motion adjustment method, apparatus, and storage medium
CN114536339A (en)*2022-03-032022-05-27深圳市大族机器人有限公司Method and device for controlling cooperative robot, cooperative robot and storage medium
CN114536339B (en)*2022-03-032024-05-31深圳市大族机器人有限公司Control method and device for cooperative robot, cooperative robot and storage medium
CN115097816A (en)*2022-05-202022-09-23深圳市大族机器人有限公司Modularized multi-robot cooperation control method
CN115218904A (en)*2022-06-132022-10-21深圳市优必选科技股份有限公司 Follow-up navigation method, device, computer-readable storage medium, and mobile device
CN119247892A (en)*2024-08-292025-01-03江苏创新包装科技有限公司 A dual-machine collaborative handling robot system based on parameter optimization ratio and its pairing method

Also Published As

Publication number | Publication date
CN106774345B (en)2020-10-30

Similar Documents

Publication | Publication Date | Title
CN106774345A (en)A kind of method and apparatus for carrying out multi-robot Cooperation
Reif et al.Social potential fields: A distributed behavioral control for autonomous robots
Martinez-Cantin et al.A Bayesian exploration-exploitation approach for optimal online sensing and planning with a visually guided mobile robot
Eich et al.Towards coordinated multirobot missions for lunar sample collection in an unknown environment
US10220510B2 (en)Unified collaborative environments
WO2022192132A1 (en)Controlling multiple simulated robots with a single robot controller
BräunlLocalization and navigation
González-Banos et al.Motion planning with visibility constraints: Building autonomous observers
Eilers et al.Modeling an AGV based facility logistics system to measure and visualize performance availability in a VR environment
Bechtsis et al.Unmanned ground vehicles in precision farming services: An integrated emulation modelling approach
Stipes et al.Cooperative localization and mapping
Gianni et al.ARE: Augmented reality environment for mobile robots
Wei et al.Vision-guided fine-operation of robot and its application in eight-puzzle game
UmariMulti-robot map exploration based on multiple rapidly-exploring randomized trees
Trinidad Barnech et al.Initial Results with a Simulation Capable Robotics Cognitive Architecture
Dewan et al.Advancement in SLAM techniques and their diverse applications
Kang et al.Team Tidyboy at the WRS 2020: A modular software framework for home service robots
Mansour et al.Depth estimation with ego-motion assisted monocular camera
Vithalani et al.Autonomous navigation using monocular ORB SLAM2
Asavasirikulkij et al.A study of digital twin and its communication protocol in factory automation cell
CN113433953A (en)Multi-robot cooperative obstacle avoidance method and device and intelligent robot
Skoglar et al.Concurrent path and sensor planning for a UAV-towards an information based approach incorporating models of environment and sensor
Hutter et al.Robust and resource-efficient cooperative exploration and mapping using homogeneous autonomous robot teams
Shinde et al.Ros simulation-based autonomous navigation systems and object detection
Denysyuk et al.A* Modification for Mobile Robotic Systems

Legal Events

Date | Code | Title | Description
PB01Publication
PB01Publication
SE01Entry into force of request for substantive examination
SE01Entry into force of request for substantive examination
TA01Transfer of patent application right
TA01Transfer of patent application right

Effective date of registration:20200702

Address after:200131 2nd floor, building 13, No. 27, Xinjinqiao Road, China (Shanghai) pilot Free Trade Zone, Pudong New Area, Shanghai

Applicant after:Shanghai xianruan Information Technology Co., Ltd

Address before:201203, Shanghai, Pudong New Area, China (Shanghai) free trade test area, No. 301, Xia Xia Road, room 22

Applicant before:SHANGHAI SEER ROBOTICS TECHNOLOGY Co.,Ltd.

GR01Patent grant
GR01Patent grant
