Specific embodiment
The application is described in further detail below in conjunction with the accompanying drawings.
In one typical configuration of the application, the terminal, the device of the service network, and the trusted party each include one or more processors (CPUs), an input/output interface, a network interface, and memory.
The memory may include volatile memory in a computer-readable medium, for example random access memory (RAM), and/or non-volatile memory, for example read-only memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media include permanent and non-permanent, removable and non-removable media, and may store information by any method or technology. The information may be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory media, such as modulated data signals and carrier waves.
Fig. 1 shows a flow chart of a method for multi-robot cooperation performed at a robot end and a network device end according to one aspect of the application.
The method includes step S11, step S12, and step S21.
An embodiment of the application provides a method for multi-robot cooperation, which may be implemented at a corresponding robot end and network device end. The robot includes any mechanical device that performs work automatically; it may be a mechanical device with a locomotion function, a load-carrying function, or other functions, or a mechanical device combining several of these functions, for example an artificial intelligence device that can both move and carry loads. In this application, the functions of the multiple robots performing the same cooperative task may be identical or different. The network device includes, but is not limited to, a computer, a network host, a single network server, a set of multiple network servers, or a cloud server, where the cloud server may be a virtual supercomputer running in a distributed system and composed of a set of loosely coupled computers, used to provide simple, efficient, safe, and reliable computing services whose processing capacity can scale elastically. In this application, the robot may refer to the robot 1, and the network device may refer to the network device 2.
Specifically, in step S21, the network device 2 may provide matching cooperation instructions to one or more robots 1, where each robot 1 performs the corresponding multi-robot cooperation task based on its cooperation instruction. Correspondingly, in step S11, each robot 1 obtains the cooperation instruction matched with itself from the network device 2. Here, the multi-robot cooperation task may be any of various tasks executed by multiple robots 1: for example, multiple robots 1 moving synchronously while keeping the same distance; or multiple robots 1 jointly delivering the same object; or multiple robots 1 assembling the parts of an object, and so on. In one implementation, the network device 2 may match different cooperation instructions to different robots 1 based on the type of cooperative task or the specific cooperative operation.
In one implementation, the cooperation instruction may include at least any one of the following: multi-robot formation status information for the robot; a speed control rule for the robot; coordinate information of the target object to be followed by the robot; other execution-related information for the robot.
Specifically, taking as an example a scenario in which multiple robots 1 move synchronously while keeping the same distance, or jointly deliver the same object: in one implementation, the network device 2 may use the cooperation instruction to give each robot 1 the formation status information it must maintain while moving, for example keeping a single-column, single-row, or multi-column formation. In another implementation, the network device 2 may use a cooperation instruction containing a speed control rule to control the running speed of each cooperating robot 1, thereby adjusting the distance between the robots 1 and controlling the movement of the whole queue. In yet another implementation, the network device 2 may provide one or more robots 1 with the coordinate information of the target object to be followed; this may be the coordinate information of a target object determined when the moving operation starts, or coordinate information of the target object provided in real time during the movement.
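The kinds of instruction content above (formation status information, a speed control rule, coordinates of the object to follow) can be pictured as one payload. A minimal sketch, assuming illustrative field names (`formation`, `speed_rule`, `target_coord`) that are not part of the application itself:

```python
from dataclasses import dataclass, field
from typing import Optional, Tuple

@dataclass
class CooperationInstruction:
    # Formation the robot must maintain, e.g. "single-column" or "single-row".
    formation: Optional[str] = None
    # Speed control rule, e.g. a desired gap to the robot ahead.
    speed_rule: Optional[dict] = None
    # Coordinates of the target object to follow (may be updated in real time).
    target_coord: Optional[Tuple[float, float]] = None
    # Other execution-related information.
    extra: dict = field(default_factory=dict)

# The network device may send identical or different instructions per robot:
leader = CooperationInstruction(formation="single-column",
                                target_coord=(12.0, 3.5))
follower = CooperationInstruction(formation="single-column",
                                  speed_rule={"gap": 0.5})
```

As the text notes, the head-of-queue robot here receives a coordinate to move toward, while the follower receives only a formation and a speed rule.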
Taking as an example a scenario in which multiple robots 1 perform an assembly task on the parts of an object, the cooperative task may include a speed control rule for moving each robot 1 to its assembly position; the coordinate information of the target position of the robot; the assembly operation step information of the robot; and so on. In addition, the cooperative task may be adapted to the specific needs of other cooperative tasks.
In one implementation, the network device 2 may simultaneously send uniform cooperation instructions to every robot 1 involved in the cooperative task; in another implementation, the network device 2 may send cooperation instructions separately to any one or more robots 1 at any time. In one implementation, the cooperation instructions for multiple robots 1 in the same cooperative task may be identical; they may also differ, or be partly identical and partly different. For example, in a scenario where multiple robots 1 move synchronously in a single-column formation while keeping the same distance, the cooperation instruction of the robot 1 at the head of the queue may differ from the cooperation instructions of the other robots 1 in the queue.
Then, in step S12, the robot 1 may perform the corresponding multi-robot cooperation task based on the cooperation instruction. In one implementation, the robots 1 need not communicate directly with each other to accomplish the cooperative task; instead, the network device 2 may control the cooperating robots 1 in real time through the cooperation instructions, and each robot 1 executes its cooperation instruction separately so that the cooperative task is completed. In one implementation, the network device 2 may give each robot 1 only the instructions necessary for mutual cooperation, while other operations that do not require cooperation can be executed independently by each robot 1. For example, in a scenario where multiple robots 1 move synchronously while keeping the same distance, or jointly deliver the same object, maintaining the overall formation and controlling the running speed of the queue may be controlled by the network device 2 through cooperation instructions, while the specific following operations of each robot 1, such as determining and recognizing the object to follow, may be set and executed by each robot 1 itself.
In this application, multiple independent robots 1 performing a cooperative task can jointly execute the corresponding multi-robot cooperation task based on the cooperation instructions obtained from the corresponding network device 2. Thus, according to the needs of a concrete application scenario, multiple independent robots can be flexibly combined through the cooperation instructions sent by the network device 2, so that the combined robots can cooperatively accomplish complex tasks or tasks with a large workload, which contributes to the decomposition of complex work and the optimization of overall resources.
In one implementation, in step S12, based on the cooperation instruction, the robot 1 may be controlled to move along a corresponding movement path to a target position or a target object. Here, the multi-robot cooperation task of the application may be a cooperative task that requires multiple robots to move in formation, for example multiple robots 1 moving synchronously while keeping the same distance, or multiple robots 1 jointly delivering the same object. Specifically, in one implementation, based on the cooperation instruction, the robot 1 may be controlled to move along a corresponding movement path to a target position; for example, the robot 1 may be one or more robots at the head of the queue, which may have no specific target object and only a target position to reach. In another implementation, based on the cooperation instruction, the robot 1 may be controlled to move along a corresponding movement path to a target object. For example, one or more robots 1 at the head of the queue may have an object to track, such as a moving person or thing; and a robot 1 that is not at the head of the queue needs to follow a target object, namely a target robot to be followed, where the target robot may be the nearest other robot in front of the robot 1, or another robot that is preset or determined based on the cooperation instruction.
In this implementation, the robots 1 can be used to realize multi-robot formation movement; for example, each cooperating robot 1 may move to a target position, or follow a target object, based on its matching cooperation instruction, so as to realize the formation movement of multiple robots 1. Based on this implementation, various cooperative tasks that require formation movement of multiple robots, for example cooperative mobile carrying tasks, can be realized flexibly and effectively.
Further, Fig. 2 shows a flow chart of a method for multi-robot cooperation performed at a robot end according to one aspect of the application. The method includes step S11 and step S12, and step S12 further includes step S121, step S122, and step S123.
Specifically, in step S121, the robot 1 may determine the target object to be followed by the robot 1. In one implementation, the target object includes a target robot, and the robot and its corresponding target robot jointly carry the same transport object; in this case, the cooperative task may correspond to a cooperative mobile carrying task. The robot 1 needs to determine the target object it will follow when the cooperative task starts.
In one implementation, in step S121, when the robot 1 is set to a follow mode, the robot 1 may recognize a corresponding matching object from the surrounding information captured by the robot 1 in real time, and then take the matching object as the target object to be followed by the robot 1. In one implementation, the follow mode of the robot 1 can be started by a preset trigger operation. When the follow mode starts, the robot 1 can capture surrounding information in real time; in one implementation, raw data of the ambient environment information can be obtained through one or more sensing devices on the robot 1, and the raw data may be images, pictures, or point clouds. The robot 1 then detects, from the raw data, the object type that needs to be followed; one or more objects in the environment may belong to this object type. Using machine learning methods, a classifier is trained in advance: the feature information of the scan data of a certain class of objects is extracted and input into the classifier, and objects of that class are detected from the environment information by comparison. There are often multiple objects of that class, and the matching object is the object selected from them to serve as the target object.
Further, in one implementation, the matching object may include, but is not limited to, at least any one of the following: the object around the robot 1 that is closest to the robot 1; the object in front of the robot 1 that is closest to the robot 1; an object around the robot 1 that matches the object feature information of the object to be followed; the object around the robot 1 that best matches the object feature information of the object to be followed; among multiple objects around the robot 1 that match the object feature information of the object to be followed, the object closest to the robot. In one implementation, the object feature information may include, but is not limited to, one or more of the position information, the motion state information, and the body feature information of the object to be followed.
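The selection rules above amount to filtering the detected objects by class and then taking, for example, the nearest one. A minimal sketch, assuming illustrative record keys (`class`, `pos`) that are not part of the application:

```python
import math

def pick_matching_object(robot_pos, candidates, wanted_class):
    """Among detected objects of the wanted class, pick the one closest
    to the robot -- one of the selection rules listed above."""
    same_class = [c for c in candidates if c["class"] == wanted_class]
    if not same_class:
        return None  # no object of the followed type detected
    return min(same_class, key=lambda c: math.dist(robot_pos, c["pos"]))

# Two people and a cart detected; follow mode targets the nearest person.
objs = [{"class": "person", "pos": (2.0, 0.0)},
        {"class": "person", "pos": (5.0, 1.0)},
        {"class": "cart",   "pos": (1.0, 0.0)}]
target = pick_matching_object((0.0, 0.0), objs, "person")
```

The other listed rules (in front of the robot, best feature match) would only change the filter and the key function.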
Further, in one implementation, in step S121, the robot 1 may determine the coordinate information of the target object to be followed based on the cooperation instruction; the robot 1 then obtains the ambient environment information of the robot in real time, where the distance between the robot 1 and the coordinate information is less than or equal to a predetermined distance threshold; the robot 1 then recognizes the corresponding matching object from the ambient environment information, and takes the matching object as the target object to be followed by the robot 1. Here, the coordinate information may be either absolute coordinate information or relative coordinate information. The robot 1 obtains its ambient environment information by scanning; if the distance between the robot 1 and the coordinate information is less than or equal to the predetermined distance threshold, the matching object corresponding to the coordinate information can be recognized from the environment information, and the matching object is set as the target object.
Further, in one implementation, if, when the robot 1 obtains the cooperation instruction, the distance between its position and the position of the object to be followed exceeds the predetermined distance threshold, the application further provides a solution for this case: when the distance between the robot 1 and the coordinate information is greater than the predetermined distance threshold, the robot 1 is controlled to move toward the coordinate information, thereby reducing the distance between the robot 1 and the coordinate information; then, during the movement, the ambient environment information of the robot 1 is obtained in real time until the distance between the robot 1 and the coordinate information is less than or equal to the predetermined distance threshold, at which point the corresponding matching object can be recognized from the ambient environment information and taken as the target object to be followed by the robot 1.
Then, in step S122, the robot 1 may recognize the target object from the scene captured by the robot 1 in real time. During the movement of the robot 1, the objects in the environment are also constantly changing; therefore, the robot 1 needs to repeat the target object recognition operation again and again based on the changing environment. In one implementation, the robot 1 can obtain real-time environment data information by periodically scanning the surrounding environment, then detect from the environment data information all objects belonging to the same class as the target object, and finally identify the matching target object according to the detection results of one or more cycles of continuous scanning.
Specifically, in one implementation, in step S122, the robot 1 may scan in real time to obtain the ambient environment information of the robot 1; then, one or more observed objects matching the object feature information of the target object can be detected from the ambient environment information. Here, because the target object was determined by the previous target object recognition operation, its corresponding object feature information has been stored, for example in the form of a history observation record of the determined target object. Therefore, the object feature information of the one or more observed objects determined by scanning the current environment information can be matched for similarity against the stored object feature information of the target object. Here, the object feature information of an observed object or of the target object may include, but is not limited to, any of the following: the position information of the object; the motion state information of the object; the body feature information of the object; and so on, where the position information refers to the position of the object at the corresponding scanning moment, the motion state information includes movement information such as the direction of motion and the magnitude of the velocity, and the body feature information refers to the appearance features of the object body, including shape, size, and color information. The robot 1 can then identify the target object from the one or more observed objects; for example, an observed object that reaches a certain matching degree may be estimated to be the target object.
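The similarity matching over position, motion state, and body features described above can be sketched as a weighted score; the weights and field names below are illustrative assumptions, not values from the application:

```python
import math

def feature_similarity(obs, target, w_pos=0.5, w_vel=0.3, w_size=0.2):
    """Weighted similarity between an observed object and the stored
    target features (position, velocity, body size)."""
    d_pos = math.dist(obs["pos"], target["pos"])    # position difference
    d_vel = math.dist(obs["vel"], target["vel"])    # motion-state difference
    d_size = abs(obs["size"] - target["size"])      # body-feature difference
    # Turn each distance into a (0, 1] score: larger distance, lower score.
    return (w_pos / (1 + d_pos)
            + w_vel / (1 + d_vel)
            + w_size / (1 + d_size))

stored = {"pos": (1.0, 0.0), "vel": (0.5, 0.0), "size": 1.7}
obs_a  = {"pos": (1.1, 0.0), "vel": (0.5, 0.0), "size": 1.7}
obs_b  = {"pos": (6.0, 2.0), "vel": (0.0, 0.0), "size": 0.4}
best = max([obs_a, obs_b], key=lambda o: feature_similarity(o, stored))
```

An observed object whose score exceeds a chosen matching-degree threshold would then be estimated to be the target object.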
Further, in one implementation, identifying the target object from the one or more observed objects may include: determining association information between each observed object among the one or more observed objects corresponding to the robot 1 and a history observation record, where the one or more observed objects include the target object, and the history observation record includes the object-related information of one or more historical observed objects; then, the robot 1 identifies the target object from the one or more observed objects according to the association information between the observed objects and the history observation record.
Specifically, when the robot 1, repeating the target object recognition operation based on the changing environment, determines the target object, the target object and its corresponding object feature information can be recorded into the history observation record; at the same time, the other observed objects determined together with the target object, and their corresponding object feature information, can also be matched and recorded into the history observation record. Further, when target object recognition is currently performed, each observed object among the one or more currently obtained observed objects can be data-associated with the history observation record. In one implementation, the data association may mean matching each observed object among the current one or more observed objects against the observation record of each object in the stored history observation record, and the result is the association information. For example, suppose there are N observed objects in the environment in a given scanning cycle, and the robot has previously stored history observation records of M objects, where M may or may not equal N, and one or more objects may be common to both the N objects and the M objects. Performing data association means matching each of the N observed objects against the observation record of each of the M objects in the history observation record, obtaining a matching degree for each match; the overall matching result is a matrix of N rows and M columns whose elements are the corresponding matching degrees, and this matrix is the association information. The observed objects include the target object. In one implementation, the matching may be feature matching performed based on one or more items of object feature information of the objects. The target object is then identified based on the obtained association information: after the association information, i.e. the matching degree matrix, is obtained, a comprehensive analysis is performed to choose the association with the highest overall matching degree, thereby obtaining the target object.
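The N-row, M-column matching-degree matrix and the choice of the association with the highest overall matching degree can be sketched as follows. A brute-force search over assignments stands in for a real assignment solver (such as the Hungarian algorithm), which the application does not prescribe:

```python
from itertools import permutations

def best_association(match_matrix):
    """Given an N x M matrix of matching degrees (rows = current
    observed objects, columns = history records), return the one-to-one
    assignment with the highest total matching degree.  Brute force,
    usable only for small N and M."""
    n, m = len(match_matrix), len(match_matrix[0])
    best_score, best_assign = float("-inf"), None
    for perm in permutations(range(m), min(n, m)):
        rows = range(len(perm))
        score = sum(match_matrix[r][perm[r]] for r in rows)
        if score > best_score:
            best_score, best_assign = score, dict(zip(rows, perm))
    return best_assign

# 2 current observed objects vs 3 history records:
M = [[0.9, 0.1, 0.2],
     [0.2, 0.1, 0.8]]
assign = best_association(M)
```

If the target object's history record is, say, column 0, then whichever current observation the best assignment maps to column 0 is identified as the target object.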
In one implementation, the method also includes step S13 (not shown): in step S13, the robot 1 may update the history observation record according to the one or more observed objects, where the objects in the updated history observation record include the target object identified from the one or more observed objects. The observed objects corresponding to the robot 1 change continually as the environment changes. In one implementation, if a new observed object appears, a corresponding observation record is added; if an existing observed object disappears, the observation record corresponding to that observed object is deleted; and if an existing observed object is still present, the related information in its corresponding observation record is updated.
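The three update cases of step S13 (add a new record, delete a vanished one, refresh an existing one) can be sketched as a dictionary update; the id-keyed record layout is an illustrative assumption:

```python
def update_history(history, observations):
    """Update history observation records: drop vanished objects, then
    add new objects and refresh existing ones.  Both arguments map an
    object id to its feature record."""
    seen = set(observations)
    for obj_id in list(history):
        if obj_id not in seen:
            del history[obj_id]          # existing object disappeared
    for obj_id, feats in observations.items():
        history[obj_id] = feats          # new object, or refreshed record
    return history

h = {"a": {"pos": (0, 0)}, "b": {"pos": (1, 1)}}
obs = {"b": {"pos": (1, 2)}, "c": {"pos": (3, 3)}}
update_history(h, obs)   # "a" vanished, "b" refreshed, "c" added
```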
Then, in step S123, based on the cooperation instruction, the robot 1 may be controlled to move along a corresponding movement path to the target object. Specifically, the robot 1 can determine the movement path of the robot 1 to the target object, and the robot 1 is then controlled to move along that movement path. The determination of the movement path and the movement control behavior may both be performed based on the cooperation instruction of the network device 2, or only one of them may be performed based on the cooperation instruction.
In one implementation, based on the cooperation instruction, the robot 1 may be controlled to move along a corresponding movement path to the target object, where the formation state between the robot and the target object matches the multi-robot formation status information in the cooperation instruction, and the relative distance between the robot and the target object is kept within a preset relative distance range threshold. The network device 2 may use the cooperation instruction to give each robot 1 the formation status information it must maintain while moving, for example keeping a single-column, single-row, or multi-column formation. In one implementation, these formation states can be realized by setting parameters such as the movement path and the motion state of each robot 1; in another implementation, the network device 2 may use a cooperation instruction containing a speed control rule to control the running speed of each cooperating robot 1, thereby adjusting the distance between the robots 1 and controlling the movement of the whole queue. Here, the cooperation instruction can be used to control the queue shape of the multiple robots in the multi-robot cooperation task, or specifically the relative positions of the robots with respect to each other, so that the cooperative fit between the robots 1 is higher and the completion efficiency of the cooperative task is improved.
In one implementation, step S123 may include step S1231 (not shown) and step S1232 (not shown). Specifically, in step S1231, the robot 1 may determine the movement path of the robot 1 to the target object based on the cooperation instruction; in step S1232, the robot 1 may be controlled, based on the cooperation instruction, to move along the movement path.
Further, in step S1231, the robot 1 may obtain obstacle information from the ambient environment information of the robot; then determine the target coordinates of the robot 1 based on the position information of the identified target object; and then, based on the cooperation instruction, in combination with the target coordinates and the obstacle information, determine the movement path of the robot to the target object, where the cooperation instruction includes multi-robot formation status information.
Specifically, the robot 1 first determines the obstacle information between the robot body and the target object, where obstacles refer to all objects in the environment other than the target object; there are therefore both static obstacles, such as buildings like walls and pillars when tracking indoors, and moving obstacles, for example observed objects that do not belong to the target object. Then, the current position information of the target object, for example its position information in the corresponding history observation record, is set as the target coordinates of the robot 1. Finally, based on the cooperation instruction, the movement path of the robot to the target object is determined according to the obstacle distribution and the target coordinates of the robot. In practical applications, because the movement path from one position to another is not unique, the movement path determined for the robot is not unique either; rather, the most suitable path is selected from multiple paths. In a multi-robot cooperation task, the independent movement of each robot must also take into account the cooperation among the robots. Here, the cooperation instruction provided by the network device 2 to each robot 1 includes multi-robot formation status information, used to indicate the movement formation of the cooperating robots 1, for example keeping a single-column, single-row, or multi-column formation; the movement path from the robot to the target object is then planned using this formation status information. For example, if the robots 1 travel in a single-row formation, the path width along the movement path must be considered, and candidate paths of limited width are excluded. In one implementation, the cooperation instruction containing the formation status information may be received by the corresponding robot 1 before its movement starts, or it may be provided to the robot 1 in real time during the movement based on changes in the scene.
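Excluding width-limited candidate paths for a given formation, as described above, can be sketched as a simple filter; the path records and width figures are illustrative assumptions:

```python
def feasible_paths(candidates, formation_width):
    """Discard candidate paths too narrow for the commanded formation --
    e.g. a single-row formation needs more lateral clearance than a
    single-column one."""
    return [p for p in candidates if p["min_width"] >= formation_width]

# A narrow corridor is excluded for a 3 m wide single-row formation;
# the most suitable path is then chosen among the remaining candidates.
paths = [{"name": "corridor", "min_width": 1.2, "length": 10.0},
         {"name": "hall",     "min_width": 4.0, "length": 14.0}]
ok = feasible_paths(paths, formation_width=3.0)
shortest = min(ok, key=lambda p: p["length"])
```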
Further, in step S1232, the robot 1 may determine the movement speed of the robot 1 based on the cooperation instruction, where the cooperation instruction includes a speed control rule; the robot 1 is then controlled to move along the movement path based on the movement speed, where the movement speed is used to keep the relative distance between the robot 1 and the target object within a preset relative distance range threshold. Specifically, when multiple cooperating robots move in formation, in addition to the formation, the relative positions of the individual robots 1 must also be considered. For example, in a cooperative mobile carrying task, if the robots 1 move in a single column and the transported object is N meters long, then in order to ensure that all the robots carry out the transport task together, the relative position of two adjacent robots 1 cannot be arbitrary; rather, two adjacent robots 1 must stay within a certain distance range. Here, the movement speed of the robot 1 can be determined by the speed control rule in the cooperation instruction, so that the robot 1 can move along the movement path based on that movement speed while keeping the preset distance range from the target robot it follows (which may correspond to another robot 1).
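Keeping the relative distance to the followed target within a preset range via a speed control rule can be sketched as a proportional adjustment around the leader's speed; the gain and speed limits below are illustrative assumptions, not values from the application:

```python
def follower_speed(gap, desired_gap, leader_speed, k=0.8,
                   v_min=0.0, v_max=2.0):
    """Adjust a follower's speed around the leader's speed so the gap
    converges to the desired value, clamped to the robot's speed limits."""
    v = leader_speed + k * (gap - desired_gap)   # too far behind -> speed up
    return max(v_min, min(v_max, v))

# Follower is 1.5 m behind when 1.0 m is desired: it speeds up a little.
v = follower_speed(gap=1.5, desired_gap=1.0, leader_speed=1.0)
```

Because the adjustment vanishes when the gap equals the desired value, adjacent robots settle into the commanded spacing while the queue as a whole moves at the leader's speed.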
Further, in one implementation, determining the movement speed of the robot 1 based on the cooperation instruction, where the cooperation instruction includes a speed control rule, includes: determining the movement speed of the robot 1 based on the speed control rule, where the movement speed includes a forward speed and/or a turning speed. Here, the motion of the robot 1 must obey the kinematic and dynamic constraints of the robot body, and the size of the robot 1 must also be considered for collision avoidance. When the robot 1 is controlled to move along the movement path, on the one hand the direction of motion of the robot 1 is controlled so that it does not depart from the path region, and on the other hand the movement speed of the robot 1 must be controlled. Further, preferably, the movement speed of the robot 1 is divided into two components, a forward speed and a turning speed; specifically, the forward speed refers to the velocity component along the heading direction of the robot 1, and the turning speed refers to the velocity component perpendicular to the forward direction.
On this basis, a further implementation is as follows: when the distance between the robot 1 and the target object is greater than a distance threshold, both the forward speed and the turning speed are planned and controlled; when the distance between the robot 1 and the target object is less than the distance threshold, that is, when the robot is already close to the target object, only the direction of motion of the robot, i.e. the turning speed, needs to be finely adjusted.
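The two-phase scheme above (plan forward and turning speed together when far from the target; fine-tune only the turning speed when near) can be sketched as follows; the threshold, cruise speed, and gain are illustrative assumptions:

```python
import math

def motion_command(robot_pos, robot_heading, target_pos,
                   near_threshold=1.0, v_cruise=1.0, k_turn=1.5):
    """Far from the target, command both forward and turning speed;
    within the threshold, keep only fine heading corrections."""
    dx = target_pos[0] - robot_pos[0]
    dy = target_pos[1] - robot_pos[1]
    dist = math.hypot(dx, dy)
    heading_err = math.atan2(dy, dx) - robot_heading
    turn = k_turn * heading_err                       # always correct heading
    forward = v_cruise if dist > near_threshold else 0.0
    return forward, turn

far = motion_command((0.0, 0.0), 0.0, (5.0, 0.0))     # drive ahead
near = motion_command((4.5, 0.0), 0.0, (5.0, 0.0))    # only fine-tune heading
```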
In this application, after obtaining the cooperation instruction, the robot 1 determines the target object to be followed by the robot 1, then recognizes the target object from the scene captured by the robot in real time, and is thereby controlled, based on the cooperation instruction, to move along the corresponding movement path to the target object. Compared with existing robot-following techniques, the application can accurately lock onto the target object and track it effectively in a changing natural environment with many interfering factors, thereby improving the accuracy of robot following and solving the technical problems, common in current robot following, of following the wrong target or losing the target. At the same time, controlling the robots to move along their corresponding movement paths to their target objects based on the cooperation instructions makes it possible to realize coordinated formation movement of multiple robots as a whole.
In one implementation, in step S21, the network device 2 may provide a first cooperation instruction to a first robot, where the first robot, based on the first cooperation instruction, is controlled to move along a corresponding movement path to a target object or target position; the network device 2 then provides a second cooperation instruction to a second robot, where the second robot, based on the second cooperation instruction, is controlled to follow the first robot along a corresponding movement path. Further, in one implementation, the formation state between the second robot and the first robot matches the multi-robot formation status information in the cooperation instruction, and the relative distance between the second robot and the first robot is kept within a preset relative distance range threshold. Here, the first robot and the second robot may correspond to different robots 1; in one implementation, the same multi-robot cooperation task may be jointly performed by one or more first robots and one or more second robots. In one implementation, the first cooperation instruction and the second cooperation instruction may be identical or different.
Fig. 3 shows a system for performing multi-robot cooperation according to one aspect of the present application. The system includes a robot 1 and a network device 2.
The robot 1 includes a first device 31 and a second device 32, and the network device 2 includes a fourth device 41.
The embodiment of the present application provides a system for performing multi-robot cooperation; the system may include robots and a network device. Here, a robot is any mechanical device capable of performing work automatically: it may be a device with a locomotion function, a load-carrying function, or other functions, or a device combining several such functions, for example an artificial-intelligence device that can both move and carry loads. In the present application, the robots participating in the same cooperation task may have identical or different capabilities. The network device includes, but is not limited to, a computer, a network host, a single network server, a cluster of network servers, or a cloud server, where the cloud server may run in a distributed system as a virtual supercomputer composed of a group of loosely coupled computers, providing a computing service that is simple, efficient, safe, and reliable, and whose processing capacity can scale elastically. In the present application, the robot may be the robot 1 and the network device may be the network device 2.
Specifically, the fourth device 41 may provide matching cooperation instructions to one or more robots 1, each robot 1 performing the corresponding multi-robot cooperation task based on the cooperation instruction it receives. Correspondingly, the first device 31 obtains from the network device 2 the cooperation instruction that matches its own robot. Here, the multi-robot cooperation task may be any task performed by several robots 1 together: for example, several robots 1 moving synchronously while keeping the same distance from one another; several robots 1 jointly delivering the same object; or several robots 1 assembling the parts of an object. In one implementation, the network device 2 may match different cooperation instructions to different robots 1 based on the type of cooperation task or the specific cooperative operation each robot performs.
In one implementation, the cooperation instruction may include at least any one of the following: multi-robot formation status information for the robot; a speed control rule for the robot; coordinate information of the target object the robot is to follow; and other execution-related information for the robot.
Specifically, taking as examples the scenarios in which several robots 1 move synchronously while keeping the same distance, or jointly deliver the same object: in one implementation, the network device 2 may use the cooperation instruction to give each robot 1 the formation status information it must maintain while moving, for example keeping a single-column, single-row, or multi-column formation. In another implementation, the network device 2 may use a cooperation instruction containing a speed control rule to control the running speed of each cooperating robot 1, thereby adjusting the distances between the robots 1 and achieving control over the movement of the whole queue. In yet another implementation, the network device 2 may provide one or more robots 1 with the coordinate information of the target object to be followed; this may be coordinate information determined once when the movement operation starts, or coordinate information of the target object provided in real time, based on the scene, during the movement.
Taking as an example the scenario in which several robots 1 assemble the parts of an object, the cooperation instruction may include a speed control rule for moving each robot 1 to its assigned assembly position; coordinate information of the robot's target position; assembly-step information for the robot; and so on. For other cooperation tasks, the cooperation instruction is adapted to the specific needs of the task.
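One way to picture the optional contents of a cooperation instruction described above is as a simple record with per-robot fields. This is only an illustrative sketch; the field names and example values are assumptions and not part of the application.

```python
from dataclasses import dataclass, field
from typing import Optional, Tuple

@dataclass
class CooperationInstruction:
    """Illustrative container for the optional fields of a cooperation
    instruction; any subset of the fields may be present."""
    formation_state: Optional[str] = None  # e.g. "single-column", "single-row"
    speed_rule: Optional[dict] = None      # speed control rule parameters
    target_coordinates: Optional[Tuple[float, float]] = None  # object to follow
    extra: dict = field(default_factory=dict)  # other execution-related info

# The network device could issue different instructions to different robots:
lead = CooperationInstruction(formation_state="single-column",
                              target_coordinates=(12.0, 3.5))
follower = CooperationInstruction(formation_state="single-column",
                                  speed_rule={"max_forward": 1.2})
```

As the text notes, the instructions for robots of the same task may be partly identical (here, the formation state) and partly different (here, only the leader gets target coordinates).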
In one implementation, the fourth device 41 may send the cooperation instructions to all robots 1 of the cooperation task at the same time; in another implementation, the fourth device 41 may send cooperation instructions separately to any one or more robots 1 at any time. In one implementation, the cooperation instructions corresponding to the multiple robots 1 of the same cooperation task may be identical; they may also differ, or be partly identical and partly different. For example, in a scenario where several robots 1 move synchronously in a single-column formation while keeping the same distance, the cooperation instruction of the leading robot 1 of the queue may differ from the cooperation instructions of the other robots 1 in the queue.
The second device 32 may then perform the corresponding multi-robot cooperation task based on the cooperation instruction. In one implementation, the robots 1 need not communicate directly with one another to accomplish the cooperation task: the network device 2 can control the cooperating robots 1 in real time through the cooperation instructions, and the task is completed by each robot 1 executing its own instruction. In one implementation, the network device 2 may give each robot 1 only the instructions required for mutual cooperation, while operations that need no coordination can be executed by each robot 1 independently. For example, in the scenario where several robots 1 move synchronously at the same distance, or jointly deliver the same object, the running speed of the queue as a whole is controlled by the cooperation instructions from the network device 2, whereas the specific following operations of each robot 1, such as determining and recognizing the object to follow, may be set and performed by each robot 1 itself.
In the present application, multiple independent robots 1 performing a cooperation task can jointly execute the corresponding multi-robot cooperation task based on the cooperation instructions obtained from the corresponding network device 2. Driven by the application needs of a concrete scenario, the cooperation instructions sent by the network device 2 allow multiple independent robots to be combined flexibly, so that the combined robots can cooperatively accomplish complex tasks or tasks with a large workload, which helps decompose complex work and optimize overall resources.
In one implementation, the second device 32 may, based on the cooperation instruction, control the robot 1 to move along a corresponding movement path toward a target position or a target object. Here, the multi-robot cooperation task of the present application may be one that requires several robots to move in formation, for example several robots 1 moving synchronously at the same distance, or several robots 1 jointly delivering the same object. Specifically, in one implementation, the robot 1 may be controlled, based on the cooperation instruction, to move along a corresponding movement path toward a target position; for example, the robot 1 may be one of the robots at the head of the queue, which has no specific target object but a target position to reach. In another implementation, the robot 1 may be controlled, based on the cooperation instruction, to move along a corresponding movement path toward a target object. For example, a robot 1 at the head of the queue may have an object to track, such as a moving person or thing, while a robot 1 not at the head of the queue follows a target object that is itself a robot, i.e. a target robot; the target robot may be the closest other robot in front of the robot 1, or another robot that is preset or determined based on the cooperation instruction.
In this implementation, the robot 1 can be used to achieve multi-robot formation movement: each cooperating robot 1 moves to a target position, or follows a target object, based on its matching cooperation instruction, thereby achieving the formation movement of the multiple robots 1. On this basis, cooperation tasks that must be accomplished through the formation movement of multiple robots, such as cooperative transport tasks, can be carried out flexibly and effectively.
Further, in one implementation, the second device 32 includes a first unit (not shown), a second unit (not shown), and a third unit (not shown).
Specifically, the first unit may determine the target object that the robot 1 is to follow. In one implementation, the target object includes a target robot, and the robot carries the same transported object as its corresponding target robot; in that case, the cooperation task corresponds to a cooperative transport task. When the cooperation task starts, the robot 1 needs to determine the target object it is to follow.
In one implementation, when the robot 1 is set to follow mode, the first unit may recognize a corresponding matching object in the surrounding information captured by the robot 1 in real time, and take the matching object as the target object the robot 1 is to follow. In one implementation, follow mode may be started by a preset trigger operation. Once follow mode starts, the robot 1 captures surrounding information in real time; in one implementation, raw data about the surrounding environment, such as images, pictures, or point clouds, is acquired through one or more sensing devices on the robot 1. The robot 1 then detects, in the raw data, objects of the category that is to be followed; the environment may contain one or more objects of that category. Using machine learning, a pre-trained classifier takes the extracted feature information of the scan data as input and detects objects of that category in the environment information by comparison. Since there are often several objects of a category, the matching object is the one selected among them to serve as the target object.
Further, in one implementation, the matching object may include, but is not limited to, at least any one of the following: the object closest to the robot 1 in its surroundings; the object closest to the robot 1 among those in front of it; an object in front of the robot 1 and close to the robot 1; an object in the surroundings of the robot 1 whose object feature information matches that of the object to be followed; the object in the surroundings of the robot 1 whose object feature information best matches that of the object to be followed; or, among the surrounding objects whose object feature information matches that of the object to be followed, the object closest to the robot. In one implementation, the object feature information may include, but is not limited to, one or more of the position information, the motion state information, and the body feature information of the object to be followed.
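One of the selection criteria listed above, choosing, among the feature-matched candidates in front of the robot, the one closest to it, might be sketched as follows. The candidate representation, the "in front" test, and the feature test are illustrative assumptions, not the application's method.

```python
import math

def select_matching_object(robot_pos, heading, candidates, target_features=None):
    """Among candidates in front of the robot whose features contain the
    features of the object to be followed, return the closest one.
    Each candidate is a dict with 'pos' (x, y) and 'features' (a set)."""
    def in_front(c):
        dx = c["pos"][0] - robot_pos[0]
        dy = c["pos"][1] - robot_pos[1]
        # "in front" here means within 90 degrees of the robot's heading
        return dx * math.cos(heading) + dy * math.sin(heading) > 0

    def feature_ok(c):
        return target_features is None or target_features <= c["features"]

    def dist(c):
        return math.hypot(c["pos"][0] - robot_pos[0], c["pos"][1] - robot_pos[1])

    eligible = [c for c in candidates if in_front(c) and feature_ok(c)]
    return min(eligible, key=dist) if eligible else None

objs = [{"pos": (2, 0), "features": {"red"}},
        {"pos": (5, 0), "features": {"red", "tall"}},
        {"pos": (-3, 0), "features": {"red"}}]  # behind the robot: excluded
target = select_matching_object((0, 0), 0.0, objs, target_features={"red"})
```

The other criteria in the list (closest overall, best feature match, and so on) would simply swap the filter and the selection key.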
Further, in one implementation, the first unit may determine, based on the cooperation instruction, the coordinate information of the target object to be followed. The robot 1 then obtains its surrounding environment information in real time, where the distance between the robot 1 and the coordinate given by the coordinate information is less than or equal to a preset distance threshold; the robot 1 recognizes the corresponding matching object in that surrounding environment information and takes the matching object as the target object to be followed. Here, the coordinate information may be either absolute or relative coordinate information. The robot 1 obtains its surrounding environment information by scanning; if the distance between the robot 1 and the coordinate is less than or equal to the preset distance threshold, the matching object corresponding to the coordinate information can be identified in the environment information and set as the target object.
Further, in one implementation, if, when the robot 1 obtains the cooperation instruction, the distance between its position and the position of the object to be followed exceeds the preset distance threshold, the application provides the following solution for this case: when the distance between the robot 1 and the coordinate is greater than the preset distance threshold, the robot 1 is controlled to move toward the coordinate, thereby reducing the distance between them. During the movement, the surrounding environment information of the robot 1 is obtained in real time until the distance between the robot 1 and the coordinate is less than or equal to the preset distance threshold, at which point the corresponding matching object can be recognized in the surrounding environment information and taken as the target object the robot 1 is to follow.
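The approach-then-identify fallback described above can be sketched as a simple loop: move toward the instructed coordinate until within the threshold, then hand off to the sensing routine. The step size, the threshold value, and the `scan` stand-in are illustrative assumptions.

```python
import math

def approach_then_identify(robot_pos, target_coord, step, threshold, scan):
    """While farther from the instructed coordinate than the preset
    distance threshold, move toward it; once within the threshold,
    scan the surroundings to identify the matching object.
    `scan` is a stand-in for the robot's real sensing routine."""
    x, y = robot_pos
    tx, ty = target_coord
    path = [(x, y)]
    while math.hypot(tx - x, ty - y) > threshold:
        d = math.hypot(tx - x, ty - y)
        x += step * (tx - x) / d  # move one step straight toward the coordinate
        y += step * (ty - y) / d
        path.append((x, y))
    return path, scan((x, y))     # identify the matching object on arrival

path, found = approach_then_identify((0.0, 0.0), (10.0, 0.0),
                                     step=1.0, threshold=2.5,
                                     scan=lambda p: "matching-object")
```

A real robot would of course plan around obstacles rather than drive in a straight line; the point is only the "reduce distance first, recognize second" ordering.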
The second unit may then recognize the target object in the scene captured by the robot 1 in real time. While the robot 1 is moving, the objects in the environment are also constantly changing, so the robot 1 must repeat the target-object recognition operation again and again based on the changing environment. In one implementation, the robot 1 may periodically scan its surroundings to obtain real-time environment data, detect in that data all objects belonging to the same category as the target object, and finally, from the detection results of one scan cycle or of several consecutive cycles, identify the matching target object.
Specifically, in one implementation, the second unit may scan in real time to obtain the surrounding environment information of the robot 1, and then detect in it one or more observed objects whose object feature information matches that of the target object. Because the object feature information of the target object determined by the previous recognition operation has been stored, for example in the form of a history observation record, the object feature information of the one or more observed objects found in the current environment scan can be similarity-matched against the stored object feature information of the target object. Here, the object feature information of an observed object or of the target object may include, but is not limited to, any of the following: position information, i.e. the object's position at the corresponding scan moment; motion state information, such as the direction and magnitude of its velocity; and body feature information, i.e. the appearance of the object's body, including its shape, size, and color. The robot 1 can then identify the target object among the one or more observed objects, for example by taking an observed object that reaches a certain matching degree as the target object.
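A toy version of the similarity matching over the three feature groups named above (position, motion state, body appearance) might look like this. The weights and per-group scoring rules are illustrative assumptions, not taken from the application.

```python
def matching_degree(observation, stored_target, weights=(0.4, 0.3, 0.3)):
    """Weighted similarity between an observed object and the stored
    feature record of the target object, over three feature groups."""
    w_pos, w_mot, w_body = weights
    # position: closer observations score higher (simple inverse distance)
    dx = observation["pos"][0] - stored_target["pos"][0]
    dy = observation["pos"][1] - stored_target["pos"][1]
    pos_score = 1.0 / (1.0 + (dx * dx + dy * dy) ** 0.5)
    # motion state: 1 if moving in the same direction, else 0
    mot_score = 1.0 if observation["direction"] == stored_target["direction"] else 0.0
    # body features: fraction of stored appearance attributes that match
    body = stored_target["body"]
    body_score = len(observation["body"] & body) / len(body) if body else 1.0
    return w_pos * pos_score + w_mot * mot_score + w_body * body_score

stored = {"pos": (0.0, 0.0), "direction": "north", "body": {"red", "tall"}}
near = {"pos": (1.0, 0.0), "direction": "north", "body": {"red", "tall"}}
far = {"pos": (9.0, 0.0), "direction": "south", "body": {"red"}}
```

An observed object whose score exceeds a chosen matching-degree threshold would then be taken as the target object.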
Further, in one implementation, identifying the target object among the one or more observed objects may include: determining association information between each of the one or more observed objects currently detected by the robot 1 and a history observation record, the one or more observed objects including the target object and the history observation record containing object-related information of one or more previously observed objects; the robot 1 then identifies the target object among the one or more observed objects according to the association information between the observed objects and the history observation record.
Specifically, whenever the robot 1, repeating the recognition operation in the constantly changing environment, determines the target object, it records the target object and its object feature information in the history observation record; the other observed objects determined in the same scan, together with their object feature information, may likewise be matched and recorded in the history observation record. When target-object recognition is next performed, each of the currently acquired observed objects can be data-associated with the history observation record. In one implementation, data association means matching each of the current observed objects against the record of each object stored in the history observation record; the result of this matching is the association information. For example, suppose a scan cycle yields N observed objects in the environment, while the robot has previously stored history observation records for M objects, where M may or may not equal N, and the N objects and the M objects may have one or more objects in common. Data association then matches each of the N observed objects against each of the M records, producing a matching degree for every pair; the overall result is an N-by-M matrix whose elements are the matching degrees, and this matrix is the association information. The observed objects include the target object. In one implementation, the matching may be feature matching based on one or more items of the objects' feature information. The target object is then identified from the obtained association information: once the association information, i.e. the matching-degree matrix, is obtained, a comprehensive analysis selects the association assignment with the highest overall matching degree, from which the target object is obtained.
In one implementation, the robot 1 further includes a third device (not shown), and the robot 1 may update the history observation record according to the one or more observed objects, the updated history observation record including the target object identified among the one or more observed objects. The observed objects corresponding to the robot 1 change continuously with the environment. In one implementation, if a new observed object appears, a corresponding observation record is added; if an existing observed object disappears, its observation record is deleted; and if an existing observed object persists, the related information in its observation record is updated.
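The three-way update of the history observation record (add new, delete vanished, refresh persisting) can be sketched as follows; keying records by an object id is an assumption made for illustration.

```python
def update_history(history, current_observations):
    """Update the history observation record from the current scan:
    vanished objects are deleted, new objects get a fresh record,
    and persisting objects have their record refreshed."""
    current_ids = {obs["id"] for obs in current_observations}
    # delete records of objects that have disappeared
    history = {oid: rec for oid, rec in history.items() if oid in current_ids}
    for obs in current_observations:
        # add a record for a new object, or refresh an existing one
        history[obs["id"]] = {"pos": obs["pos"]}
    return history

old = {"a": {"pos": (0, 0)}, "b": {"pos": (5, 5)}}   # "b" will vanish
new_obs = [{"id": "a", "pos": (1, 0)}, {"id": "c", "pos": (2, 2)}]
updated = update_history(old, new_obs)
```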
The third unit may then, based on the cooperation instruction, control the robot to move along the corresponding movement path toward the target object. Specifically, the robot 1 may determine its movement path to the target object and then be controlled to move along that path. Either the determination of the movement path, or the movement control, or both may be performed based on the cooperation instruction from the network device 2.
In one implementation, the third unit may, based on the cooperation instruction, control the robot to move along the corresponding movement path toward the target object, such that the formation state between the robot and the target object matches the multi-robot formation status information in the cooperation instruction, and the relative distance between the robot and the target object falls within a preset relative-distance range threshold. The network device 2 may use the cooperation instructions to give each robot 1 the formation status information it must maintain while moving, for example keeping a single-column, single-row, or multi-column formation. In one implementation, these formation states can be achieved by setting parameters such as the movement path and motion state of each robot 1; in another implementation, the network device 2 may also use a cooperation instruction containing a speed control rule to control the running speed of each cooperating robot 1, adjusting the distances between the robots 1 and thereby controlling the movement of the whole queue. In this way, the cooperation instructions can control the queue shape of the multiple robots in the cooperation task, or the specific relative positions between the robots, improving the cooperative fit between the robots 1 and the efficiency with which the cooperation task is completed.
In one implementation, the third unit may include a first subunit (not shown) and a second subunit (not shown). Specifically, the first subunit may determine, based on the cooperation instruction, the movement path of the robot 1 to the target object; the second subunit may control the robot 1, based on the cooperation instruction, to move along that movement path.
Further, the first subunit may obtain obstacle information from the surrounding environment information of the robot; determine the target coordinate of the robot 1 based on the position information of the identified target object; and then, based on the cooperation instruction and in combination with the target coordinate and the obstacle information, determine the movement path of the robot to the target object, the cooperation instruction including multi-robot formation status information.
Specifically, the first subunit first determines the obstacle information between the robot body and the target object. Here, obstacles are all objects in the environment other than the target object: they include static obstacles, such as the walls and pillars of a building when tracking indoors, as well as moving obstacles, for example observed objects that do not belong to the target object. Next, the current position information of the target object, for example the position information reported in the corresponding history observation record, is set as the target coordinate of the robot 1. Finally, based on the cooperation instruction, the movement path of the robot to the target object is determined according to the distribution of obstacles and the target coordinate of the robot. In practice, the path from one position to another is not unique, so the movement path determined for the robot is not unique either; rather, the most suitable path is selected among several. In a multi-robot cooperation task, each robot must take the cooperation between robots into account while moving independently. Here, the cooperation instruction that the network device 2 supplies to each robot 1 includes multi-robot formation status information indicating the moving formation of the cooperating robots 1, for example keeping a single-column, single-row, or multi-column formation; the movement path of the robot to the target object is then planned from this formation status information. For example, if the robots 1 advance in a row, the path width along the movement path must be considered, and candidate paths of limited width are excluded. In one implementation, the cooperation instruction containing the formation status information may be received by the corresponding robot 1 before the movement starts, or may be supplied to the robot 1 in real time during the movement, based on changes in the scene.
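The route selection described above, excluding candidate paths too narrow for the instructed formation and choosing the best of the rest, can be sketched as follows. The formation names, width requirements, and path data are all illustrative assumptions.

```python
def plan_route(candidates, formation):
    """Among candidate paths to the target coordinate, exclude those
    narrower than the width the instructed formation requires (a row
    of robots needs more width than a column), then pick the shortest
    remaining path. Returns None if no path is feasible."""
    required_width = {"single-column": 1.0, "single-row": 3.0}[formation]
    feasible = [p for p in candidates if p["width"] >= required_width]
    if not feasible:
        return None
    return min(feasible, key=lambda p: p["length"])

paths = [{"name": "corridor", "length": 10.0, "width": 1.2},
         {"name": "hall",     "length": 14.0, "width": 4.0}]
column_route = plan_route(paths, "single-column")  # corridor is wide enough
row_route = plan_route(paths, "single-row")        # corridor is excluded
```

A full planner would also route around the obstacle distribution; this sketch isolates only the formation-width constraint the text emphasizes.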
Further, the second subunit may determine the movement speed of the robot 1 based on the cooperation instruction, the cooperation instruction including a speed control rule; it then controls the robot 1 to move along the movement path at that movement speed, the movement speed keeping the relative distance between the robot 1 and the target object within the preset relative-distance range threshold. Specifically, when multiple robots cooperate to move in formation, not only the formation shape but also the relative positions between specific robots 1 must be considered. For example, in a cooperative transport task in which the robots 1 move in a single column and the transported object is N meters long, each robot must carry the object at the same time, so the relative positions of adjacent robots 1 are not arbitrary: each pair of adjacent robots 1 must stay within a certain distance range. Here, the speed control rule in the cooperation instruction can determine the movement speed of the robot 1 so that the robot 1 moves along the movement path at that speed while keeping the preset distance range to the target robot it follows (which may correspond to another robot 1).
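A toy speed-control rule in the spirit of the paragraph above: speed up when the follower has dropped too far behind, slow down when it has closed in too much, hold otherwise. The gain and the bang-bang form are assumptions for illustration, not the application's rule.

```python
def adjust_speed(own_speed, gap, gap_min, gap_max, delta=0.1):
    """Keep the gap to the followed robot inside [gap_min, gap_max]
    by nudging the follower's speed by a fixed increment."""
    if gap > gap_max:
        return own_speed + delta             # lagging behind: accelerate
    if gap < gap_min:
        return max(0.0, own_speed - delta)   # too close: decelerate
    return own_speed                         # inside the preset range: hold

faster = adjust_speed(1.0, gap=3.0, gap_min=1.0, gap_max=2.0)
slower = adjust_speed(1.0, gap=0.5, gap_min=1.0, gap_max=2.0)
steady = adjust_speed(1.0, gap=1.5, gap_min=1.0, gap_max=2.0)
```

Applied each control cycle, such a rule holds adjacent robots inside the preset relative-distance range while the queue as a whole advances.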
Further, in one implementation, determining the movement speed of the robot 1 based on the cooperation instruction, the cooperation instruction including a speed control rule, includes: determining the movement speed of the robot 1 based on the speed control rule, the movement speed including a forward speed and/or a turning speed. Here, the motion of the robot 1 is subject to the kinematic and dynamic constraints of the robot body, and the size of the robot 1 must also be taken into account for obstacle avoidance. When the robot 1 is controlled to move along the movement path, on the one hand its direction of motion is controlled so that it does not leave the path region, and on the other hand its movement speed must be controlled. Further, preferably, the movement speed of the robot 1 is divided into two components, a forward speed and a turning speed: the forward speed is the velocity component along the robot's heading, and the turning speed is the component that adjusts the robot's direction of motion.
On this basis, a further implementation is as follows: when the distance between the robot 1 and the target object is greater than a distance threshold, both the forward speed and the turning speed are planned and controlled; when the distance between the robot 1 and the target object is less than the distance threshold, that is, when the robot is already close to the target object, only the robot's direction of motion, i.e. its turning speed, needs to be finely adjusted.
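The two control regimes just described might be sketched as follows: far from the target, both a forward and a turning command are produced; near it, the forward speed is dropped and only the heading is corrected. The gains and the proportional form are illustrative assumptions.

```python
import math

def speed_command(robot_pos, robot_heading, target_pos, distance_threshold,
                  v_max=1.0, k_turn=0.5):
    """Return (forward_speed, turning_speed). Far from the target, plan
    both components; near it, keep only a small turning correction."""
    dx = target_pos[0] - robot_pos[0]
    dy = target_pos[1] - robot_pos[1]
    distance = math.hypot(dx, dy)
    # turning speed corrects the heading error in both regimes
    heading_error = math.atan2(dy, dx) - robot_heading
    turning = k_turn * heading_error
    if distance > distance_threshold:
        forward = min(v_max, distance)  # far regime: drive toward the target
    else:
        forward = 0.0                   # near regime: fine-tune direction only
    return forward, turning

far_cmd = speed_command((0, 0), 0.0, (10, 0), distance_threshold=2.0)
near_cmd = speed_command((0, 0), 0.0, (1, 1), distance_threshold=2.0)
```

Capping the forward speed at `v_max` stands in for the kinematic and dynamic constraints of the robot body mentioned above.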
In the present application, after acquiring a cooperation instruction, the robot 1 determines the target object it is to follow, then recognizes that target object in the scene it captures in real time, and, based on the cooperation instruction, is controlled to move toward the target object along a corresponding movement path. Compared with existing robot-following techniques, the application can accurately lock onto and effectively track the target object even in natural environments that change in real time and contain many interfering factors, thereby improving the accuracy of robot following and addressing the problems, common in current systems, of following the wrong target or losing the target. Moreover, because each robot moves toward its target object along a path controlled by the cooperation instruction, the coordinated formation movement of multiple robots can be achieved as a whole.
In one implementation, the fourth device 41 of the network device 2 may provide a first cooperation instruction to a first robot, the first robot being controlled, based on the first cooperation instruction, to move along a corresponding movement path toward a target object or a target position. It may then provide a second cooperation instruction to a second robot, the second robot being controlled, based on the second cooperation instruction, to follow the first robot along a corresponding movement path. Further, in one implementation, the relative formation state between the second robot and the first robot matches the multi-robot formation status information in the cooperation instructions, and the relative distance between the second robot and the first robot falls within a preset relative-distance range threshold. Here, the first robot and the second robot may each correspond to a different robot 1; in one implementation, the same multi-robot cooperation task may be carried out jointly by one or more first robots and one or more second robots. The first cooperation instruction and the second cooperation instruction may be identical or different.
It is obvious to a person skilled in the art that the present application is not limited to the details of the above exemplary embodiments and may be realized in other specific forms without departing from the spirit or essential characteristics of the application. The embodiments are therefore to be regarded in every respect as exemplary and not restrictive, the scope of the application being defined by the appended claims rather than by the foregoing description; all changes falling within the meaning and range of equivalency of the claims are therefore intended to be embraced in the application. No reference sign in a claim should be construed as limiting the claim concerned. Moreover, the word "comprising" does not exclude other units or steps, and the singular does not exclude the plural. Multiple units or devices recited in a device claim may also be implemented by a single unit or device through software or hardware. Words such as "first" and "second" denote names and do not indicate any specific order.