CN111065981A - Moving body and moving body system - Google Patents


Info

Publication number: CN111065981A
Application number: CN201880057317.5A
Authority: CN (China)
Prior art keywords: obstacle, control circuit, signal indicating, path, agv10
Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Other languages: Chinese (zh)
Inventor: 市川明男
Current Assignee: Nidec Drive Technology Corp (the listed assignees may be inaccurate; Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list)
Original Assignee: Nidec Shimpo Corp
Application filed by Nidec Shimpo Corp
Publication of CN111065981A


Abstract

The management device has: a 1st communication circuit that communicates with each of a plurality of mobile bodies; and a 1st control circuit that determines a travel path for each mobile body and transmits a signal indicating the travel path to each of the plurality of mobile bodies via the 1st communication circuit. Each of the mobile bodies has: a 2nd communication circuit that communicates with the 1st communication circuit; a sensor that detects obstacles; and a 2nd control circuit that moves the mobile body along the travel path determined by the 1st control circuit. When the sensor detects an obstacle, the 2nd control circuit causes the mobile body to avoid the obstacle and transmits a signal indicating the presence of the obstacle via the 2nd communication circuit. When a signal indicating the presence of an obstacle is transmitted from any of the mobile bodies, the 1st control circuit changes the route of every other mobile body that is expected to pass through the route on which the obstacle is present.

Description

Moving body and moving body system
Technical Field
The present disclosure relates to a mobile body and a mobile body system.
Background
Research and development of moving bodies such as automated guided vehicles and mobile robots are advancing. For example, Japanese Laid-Open Patent Publication Nos. 2009-223634, 2009-205652, and 2005-242489 (Patent Documents 1 to 3 below) disclose techniques relating to such moving bodies.
Documents of the prior art
Patent document
Patent Document 1: Japanese Laid-Open Patent Publication No. 2009-223634
Patent Document 2: Japanese Laid-Open Patent Publication No. 2009-205652
Patent Document 3: Japanese Laid-Open Patent Publication No. 2005-242489
Disclosure of Invention
Problems to be solved by the invention
Embodiments of the present disclosure provide techniques for making the operation of a plurality of autonomously movable moving bodies smoother.
Means for solving the problems
A management device according to an exemplary embodiment of the present disclosure manages the operation of a plurality of mobile bodies that can move autonomously. The management device includes: a 1st communication circuit that communicates with each of the plurality of mobile bodies; and a 1st control circuit that determines a travel path for each of the plurality of mobile bodies and transmits a signal indicating the travel path to each of the plurality of mobile bodies via the 1st communication circuit. Each of the plurality of mobile bodies has: a 2nd communication circuit that communicates with the 1st communication circuit; at least one sensor that detects obstacles; and a 2nd control circuit that moves the mobile body along the travel path determined by the 1st control circuit. When the sensor detects an obstacle, the 2nd control circuit causes the mobile body to avoid the obstacle and transmits a signal indicating the presence of the obstacle via the 2nd communication circuit. When the signal indicating the presence of the obstacle is transmitted from any of the plurality of mobile bodies, the 1st control circuit changes the path of each mobile body, among the plurality of mobile bodies, that is expected to pass through the path on which the obstacle is present.
The above general aspects may be implemented by a system, a method, an integrated circuit, a computer program, or a recording medium, or by any combination of systems, apparatuses, methods, integrated circuits, computer programs, and recording media.
Effects of the invention
According to the embodiment of the present disclosure, when a certain moving body performs an operation of avoiding an obstacle, the path of another moving body is changed to a path that does not collide with the obstacle. Therefore, the operation of the moving body system can be smoother.
Drawings
Fig. 1 is a diagram schematically showing the structure of a mobile body system 100 according to an exemplary embodiment of the present disclosure.
Fig. 2A shows an example of a case where there is no obstacle on the traveling path of the mobile body 10A.
Fig. 2B shows an example of the avoidance operation in a case where an obstacle 70 exists between marks M1 and M2 on the traveling path of the mobile body 10A.
Fig. 2C is a diagram showing an example of the route after the change.
Fig. 2D is a diagram showing another example of the route after the change.
Fig. 3 is a diagram showing an example of data of the travel path of each mobile body 10 managed by the management device 50.
Fig. 4 is a flowchart showing an example of the operation of the 1st control circuit 51 of the management device 50.
Fig. 5 is a flowchart showing an example of the operation of the 2nd control circuit 14a of the mobile body 10.
Fig. 6 is a diagram showing an outline of the control system for controlling the travel of each AGV according to the present disclosure.
Fig. 7 is a diagram showing an example of the travel space S in which the AGV is located.
Fig. 8A is a diagram showing an AGV and a traction trolley before connection.
FIG. 8B is a diagram showing the AGV and the traction trolley after being connected.
Fig. 9 is an external view of an exemplary AGV according to the present embodiment.
Fig. 10A is a diagram showing a 1st example of the hardware configuration of an AGV.
Fig. 10B is a diagram showing a 2nd example of the hardware configuration of an AGV.
Fig. 11A is a diagram showing an AGV that generates a map while moving.
Fig. 11B is a diagram showing an AGV that generates a map while moving.
Fig. 11C is a diagram showing an AGV that generates a map while moving.
Fig. 11D is a diagram showing an AGV that generates a map while moving.
Fig. 11E is a diagram showing an AGV that generates a map while moving.
Fig. 11F is a diagram schematically illustrating a part of the completed map.
Fig. 12 is a diagram showing an example of a map in which one floor is configured by a plurality of partial maps.
Fig. 13 is a diagram showing an example of the hardware configuration of the operation management device.
Fig. 14 is a diagram schematically showing an example of the travel path of the AGV determined by the operation management device.
Detailed Description
<Terms>
Before describing the embodiments of the present disclosure, definitions of terms used in this specification are given.
An "automated guided vehicle" (AGV) is a trackless vehicle onto which a load is loaded manually or automatically, which travels automatically to a designated location, and from which the load is then unloaded manually or automatically. "Automated guided vehicles" include unmanned tractors and unmanned forklifts.
The term "unmanned" means that no human is required to maneuver the vehicle, and does not exclude the case where an unmanned vehicle carries a "human (e.g., a person handling goods)".
An "unmanned tractor" is a trackless vehicle that travels automatically to the indicated location, towing a trolley that loads and unloads goods manually or automatically.
An "unmanned forklift" is a trackless vehicle that has a mast for raising and lowering a fork or the like for transferring a load, automatically transfers the load to the fork or the like, automatically travels to a designated place, and performs an automatic load handling operation.
A "trackless vehicle" is a mobile body (vehicle) having wheels and an electric motor or engine that rotates the wheels.
A "mobile body" is a device that moves while carrying a person or a load, and includes a drive device such as wheels that generate a driving force (traction) for movement, a bipedal or multi-legged walking device, or propellers. The term "mobile body" in the present disclosure includes not only automated guided vehicles in the narrow sense but also mobile robots, service robots, and unmanned aerial vehicles.
The "automatic travel" includes travel of the automated guided vehicle based on an instruction from an operation management system of a computer connected by communication and autonomous travel based on a control device included in the automated guided vehicle. The autonomous traveling includes not only traveling of the automated guided vehicle toward the destination along a predetermined route but also traveling following the tracking target. Further, the automated guided vehicle may temporarily perform manual travel based on an instruction from the operator. The "automatic travel" generally includes both "guided" travel and "unguided" travel, but in the present disclosure, it refers to "unguided" travel.
The "guide type" refers to a method of continuously or intermittently providing a guide body and guiding an automated guided vehicle by the guide body.
The "unguided type" refers to a type of guidance without providing a guide body. The automated guided vehicle according to the embodiment of the present disclosure has its own position estimating device, and can travel without guidance.
The "self-position estimation device" is a device that estimates a self-position on an environment map from sensor data acquired by an external sensor such as a laser range finder.
An "external sensor" is a sensor that senses the state of the outside of the moving body. Examples of external sensors include laser range finders (also called range sensors), cameras (or image sensors), LIDAR (Light Detection and Ranging), millimeter-wave radar, and magnetic sensors.
The "internal sensor" is a sensor that senses the state of the inside of the moving body. Examples of the internal sensors include a rotary encoder (hereinafter, may be simply referred to as "encoder"), an acceleration sensor, and an angular acceleration sensor (for example, a gyro sensor).
"SLAM" is an abbreviation for Simultaneous Localization and Mapping, and means performing self-position estimation and environment map creation simultaneously.
< exemplary embodiment >
Hereinafter, examples of the moving body and the moving body system of the present disclosure will be described with reference to the drawings. Unnecessarily detailed descriptions are sometimes omitted; for example, detailed descriptions of well-known matters and repeated descriptions of substantially identical structures may be omitted. This is to avoid needless redundancy and to make the following description easier for those skilled in the art to understand. The drawings and the following description are provided by the inventors so that those skilled in the art can sufficiently understand the present disclosure, and are not intended to limit the subject matter recited in the claims. In the following description, identical or similar components are denoted by the same reference numerals.
Fig. 1 is a diagram schematically showing the structure of a mobile body system 100 according to an exemplary embodiment of the present disclosure. The mobile body system 100 includes a plurality of mobile bodies 10 that can move autonomously, and an operation management device (hereinafter sometimes simply referred to as the "management device") 50 that manages the operation of the plurality of mobile bodies 10. Fig. 1 shows two mobile bodies 10 as an example; the mobile body system 100 may include three or more mobile bodies 10. In the present embodiment, the mobile body 10 is an automated guided vehicle (AGV), and may be referred to below as the "AGV 10". The mobile body 10 may also be another type of mobile body, such as a bipedal or multi-legged walking robot, a hovercraft, or an unmanned aerial vehicle.
The management device 50 has a 1st communication circuit 54 that communicates with each of the plurality of mobile bodies 10 via a network, and a 1st control circuit 51 that controls the 1st communication circuit 54. The 1st control circuit 51 determines a travel path for each of the plurality of mobile bodies 10 and transmits a signal indicating each travel path to the mobile bodies 10 via the 1st communication circuit 54. The travel path may be determined individually for each mobile body 10, or all mobile bodies 10 may move along the same travel path. The traveling paths of at least two of the plurality of mobile bodies 10 at least partially overlap.
The "signal indicating the travel route" transmitted from the management device 50 to each mobile body 10 may include, for example, information indicating the positions of a plurality of points on a route from the initial position to the destination position. In this specification, such a point is sometimes referred to as a "mark". The marks may be set along the travel path of each mobile body 10, for example at intervals of several tens of centimeters (cm) to several meters (m).
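As a non-authoritative sketch of the mark data described above, a route could be represented as an ordered list of marks, each holding a position and the orientation of the mobile body at that point. The class and field names are illustrative assumptions, not from the disclosure.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Mark:
    """One point on a travel route (hypothetical representation)."""
    x: float      # position on the environment map [m]
    y: float
    theta: float  # orientation of the mobile body at the mark [rad]

def make_route(points):
    """Build a travel route as an ordered list of Mark objects."""
    return [Mark(x, y, theta) for (x, y, theta) in points]

# A short route from an initial position toward a destination.
route = make_route([(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (2.0, 0.0, 0.0)])
```

The "signal indicating the travel route" would then carry such a list, either whole or one mark at a time.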
Each of the plurality of mobile bodies 10 moves along its travel path in accordance with instructions from the management device 50. In a typical example, each mobile body 10 has a storage device that stores environment map data (sometimes simply called the "environment map") and an external sensor that periodically scans the environment and outputs sensor data at each scan. In this case, each mobile body 10 moves along the travel path while estimating its position and orientation (pose) by matching the sensor data against the environment map.
Each mobile body 10 has a function of detecting an obstacle on the travel path and a function of avoiding the obstacle. Each mobile body 10 has a 2nd communication circuit 14e capable of communicating with the 1st communication circuit 54 via the network, at least one obstacle sensor 19 that detects obstacles, and a 2nd control circuit 14a that controls the movement and communication of the mobile body 10. The 2nd control circuit 14a controls a drive device, not shown, to move the mobile body 10 along the travel path determined by the 1st control circuit 51. When the sensor 19 detects an obstacle on the travel path, the 2nd control circuit 14a causes the mobile body 10 to avoid the obstacle. At this time, the 2nd control circuit 14a transmits a signal indicating the presence of the obstacle to the 1st communication circuit 54 via the 2nd communication circuit 14e.
The "signal indicating the presence of an obstacle" may include, for example, position information of the obstacle, information on the trajectory of the mobile body after avoiding the obstacle, or information indicating the presence or absence of the obstacle. The signal may also contain information on the size of the obstacle or the area it occupies.
When a signal indicating the presence of an obstacle is transmitted from any of the plurality of mobile bodies 10, the 1st control circuit 51 of the management device 50 changes the route of each mobile body 10, among the plurality of mobile bodies 10, that is expected to pass through the route on which the obstacle is present.
As an example, consider the case where the signal indicating the travel route includes information indicating the positions of a plurality of points (marks) on the route. When a signal indicating the presence of an obstacle is transmitted from any of the plurality of mobile bodies 10, the 1st control circuit 51 determines the two adjacent points between which the obstacle is located. The 1st control circuit 51 then changes the route of each mobile body 10, among the plurality of mobile bodies 10, that is expected to pass through a route including those two points to a route that does not include them.
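One plausible way (an assumption of this sketch, not specified by the disclosure) for the 1st control circuit to determine the two adjacent marks bracketing a reported obstacle is to project the obstacle position onto each route segment and pick the segment whose projection falls on the segment with the smallest perpendicular distance.

```python
import math

def bracketing_marks(route, obstacle):
    """route: list of (x, y) mark positions; obstacle: (x, y).
    Returns index i such that the obstacle lies between route[i]
    and route[i + 1], or None if no segment contains its projection."""
    best_i, best_d = None, float("inf")
    ox, oy = obstacle
    for i in range(len(route) - 1):
        (x1, y1), (x2, y2) = route[i], route[i + 1]
        dx, dy = x2 - x1, y2 - y1
        seg_len2 = dx * dx + dy * dy
        if seg_len2 == 0:
            continue  # degenerate segment: coincident marks
        # Parameter of the obstacle's projection onto the segment (0..1 = on it).
        t = ((ox - x1) * dx + (oy - y1) * dy) / seg_len2
        if 0.0 <= t <= 1.0:
            px, py = x1 + t * dx, y1 + t * dy
            d = math.hypot(ox - px, oy - py)
            if d < best_d:
                best_i, best_d = i, d
    return best_i

route = [(0.0, 0.0), (2.0, 0.0), (4.0, 0.0)]
seg = bracketing_marks(route, (1.0, 0.1))  # obstacle lies between marks 0 and 1
```

Routes through the returned segment would then be redirected away from those two marks.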
With this operation, following mobile bodies 10 can move smoothly along the new route without being affected by the obstacle. After one mobile body 10 finds an obstacle, the other mobile bodies 10 do not need to perform avoidance operations of their own. The operation of the mobile body system can therefore be made smoother.
An example of the operation performed when a path is changed will now be described with reference to Figs. 2A to 2D. Here, as an example, the "signal indicating the travel path" includes information indicating the positions of a plurality of points (marks) on the path from the initial position to the destination position, and the "signal indicating the presence of an obstacle" includes information indicating the position of the obstacle. The "information indicating the position of the obstacle" is not limited to the position (coordinates) of the obstacle itself; it may instead be the position (coordinates) or trajectory of the mobile body 10 after the avoidance operation.
Fig. 2A shows an example of a case where there is no obstacle on the traveling path of the mobile body 10A. In this case, the mobile body 10A moves along a predetermined travel path (the broken-line arrow in the figure). More specifically, the mobile body 10A sequentially tracks a plurality of marks instructed by the 1st control circuit 51 of the management device 50 (only marks M1 and M2 are illustrated in Fig. 2A), moving from the initial position to the destination position. Movement between marks is linear. The mobile body 10A may acquire the position information of all marks on the travel path in advance, or may request the position information of the next mark from the management device 50 each time a mark is reached.
Fig. 2B shows an example of the avoidance operation in a case where an obstacle 70 exists between marks M1 and M2 on the traveling path of the mobile body 10A. The obstacle 70 is an object that does not exist on the environment map; it may be, for example, cargo, a person, or another moving object. The travel path of the mobile body 10A is determined in advance on the assumption that no such obstacle exists.
When the mobile body 10A finds the obstacle 70 on its route using the sensor 19, it performs an operation to avoid the obstacle 70, appropriately combining motions such as turning right, turning left, and rotating. In the example of Fig. 2B, when the mobile body 10A finds the obstacle 70, it performs the following operations.
(1) Just short of the obstacle 70 (for example, several tens of cm before it), rotate the traveling direction about 90 degrees to the right, and advance a distance approximately equal to the width of the obstacle 70. The width of the obstacle 70 can be measured by, for example, the sensor 19 or a laser range finder.
(2) Rotate the traveling direction about 90 degrees to the left, and advance a distance slightly longer than the width of the obstacle 70.
(3) Rotate the traveling direction about 90 degrees to the left, and advance a distance approximately equal to the width of the obstacle 70.
(4) Rotate the traveling direction about 90 degrees to the right, and advance to mark M2.
The avoidance operation for the obstacle 70 by the mobile body 10A is not limited to this example; any algorithm may be used.
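The four avoidance steps above can be sketched as a simple command sequence. The 90-degree turns and the use of the measured obstacle width follow the description; the command vocabulary and the extra margin in step (2) are illustrative assumptions.

```python
def avoidance_commands(obstacle_width, margin=0.1):
    """Return (action, value) commands that route the mobile body around an
    obstacle of the given width [m] and back onto its original line.
    Angles are in degrees; distances in metres. Hypothetical interface."""
    w = obstacle_width
    return [
        ("turn_right", 90), ("forward", w),           # step (1): step aside
        ("turn_left", 90),  ("forward", w + margin),  # step (2): pass, slightly longer
        ("turn_left", 90),  ("forward", w),           # step (3): return to the line
        ("turn_right", 90),                           # step (4): head toward mark M2
    ]
```

A drive controller would consume this list one command at a time, then resume normal mark tracking.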
When it finds the obstacle 70, the mobile body 10A transmits a signal indicating the presence of the obstacle 70 to the management device 50. The mobile body 10A transmits to the management device 50 either a signal indicating that an obstacle 70 is present between marks M1 and M2, or a signal indicating the trajectory (a set of coordinates) of the avoidance operation that the mobile body 10A performed between marks M1 and M2. When the mobile body 10A can measure the coordinates and size of the obstacle 70 using a laser range finder, information on the coordinates and size of the obstacle 70 may be included in the signal.
Upon receiving the signal indicating the presence of the obstacle 70 from the mobile body 10A, the 1st control circuit 51 of the management device 50 determines whether there is a following mobile body 10 expected to pass through a route that includes the two marks M1 and M2. If such a mobile body 10 exists, the 1st control circuit 51 changes its path to one that does not include the two marks M1 and M2.
Fig. 2C is a diagram showing an example of the route after the change. In this example, the path of the following mobile body 10B is changed to a path shifted slightly so as not to collide with the obstacle 70. The 1st control circuit 51 of the management device 50 realizes the path change by changing marks M1 and M2 to marks M1' and M2'.
Fig. 2D is a diagram showing another example of the route after the change. In this example, the route of the following mobile body 10B is changed substantially: the positions of the changed marks M1' and M2' differ greatly from those of the original marks M1 and M2.
By changing routes in this way, the following mobile body 10B can move smoothly to its destination without performing any operation to avoid the obstacle 70.
Fig. 3 is a diagram showing an example of the travel path data of each mobile body 10 managed by the management device 50. Such data may be recorded in a storage device (not shown in Fig. 1) provided in the management device 50. As shown in Fig. 3, the data indicating the travel path of each mobile body 10 may include information on a plurality of points (marks) on the path. The information for each mark may include the position of the mark (for example, its x and y coordinates) and the orientation of the mobile body 10 at that position (for example, the angle θ with respect to the x axis). In Fig. 3 the information for each mark is represented symbolically, for example as M11(x11, y11, θ11), but in practice it is all recorded as specific numerical values.
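A minimal sketch of the route table of Fig. 3 might map each mobile body to its ordered list of marks, each mark holding (x, y, θ) as in the figure. Identifiers and the lookup helper are assumptions for illustration.

```python
# Per-mobile-body route table: id -> ordered list of (x, y, theta) marks.
routes = {
    "AGV_1": [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (2.0, 1.0, 1.57)],
    "AGV_2": [(0.0, 2.0, 0.0), (1.0, 2.0, 0.0)],
}

def bodies_passing_through(routes, mark):
    """Return the ids of mobile bodies whose route contains the given mark,
    as used when deciding whose path must be changed (hypothetical helper)."""
    return sorted(agv for agv, marks in routes.items() if mark in marks)
```

A lookup such as `bodies_passing_through(routes, (1.0, 0.0, 0.0))` identifies which routes would need new marks when an obstacle is reported near that mark.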
Fig. 4 is a flowchart showing an example of the operation of the 1st control circuit 51 of the management device 50. In this example, the 1st control circuit 51 performs the following operations.
Step S101: Determine the movement path of each mobile body 10. The paths are determined in accordance with instructions from a user or administrator, or with a predetermined program.
Step S102: Start instructing each mobile body 10 to move. The timing of the movement instruction to each mobile body 10 is likewise set by an instruction from the user or administrator, or by a predetermined program.
Step S103: Determine whether a notification of the presence of an obstacle has been received from any of the mobile bodies 10. If yes, proceed to step S104; if no, execute step S103 again.
Step S104: Determine whether there is a following mobile body 10 expected to pass through the path on which the obstacle exists. This determination can be made, for example, by comparing the position of the obstacle with the path of each mobile body 10. If yes, proceed to step S105; if no, return to step S103.
Step S105: Change the route of the following mobile body 10 and instruct that mobile body 10 of the route change. Then return to step S103.
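The management-side flow of Fig. 4 (steps S101 to S105) can be sketched as a loop. `receive_notification`, `reroute`, and `send_route` are hypothetical stand-ins for the communication-circuit operations, and the `None` exit is added only so the sketch terminates.

```python
def management_loop(routes, receive_notification, reroute, send_route):
    """routes: dict id -> list of (x, y) marks. Mutates routes in place."""
    # S101/S102: routes are already decided; instruct each body to start moving.
    for agv_id, route in routes.items():
        send_route(agv_id, route)
    while True:
        note = receive_notification()            # S103: wait for an obstacle report
        if note is None:                         # (loop exit added for this sketch)
            break
        # S104: which bodies are expected to pass through the affected segment?
        followers = [a for a, r in routes.items()
                     if note["segment"] in list(zip(r, r[1:]))]
        for agv_id in followers:                 # S105: change the route, re-send it
            routes[agv_id] = reroute(routes[agv_id], note)
            send_route(agv_id, routes[agv_id])
```

The shape of the notification (`{"segment": (mark_a, mark_b)}`) is an assumed encoding of "the obstacle lies between these two marks".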
Fig. 5 is a flowchart showing an example of the operation of the 2nd control circuit 14a of the mobile body 10. In this example, the 2nd control circuit 14a performs the following operations after starting to move.
Step S201: Determine whether the obstacle sensor 19 has detected an obstacle. If yes, proceed to step S202; if no, proceed to step S203.
Step S202: Transmit a signal indicating the presence of the obstacle to the management device 50, and perform an operation to avoid the obstacle.
Step S203: Determine whether an instruction to change the route has been received from the management device 50. If yes, proceed to step S204; if no, return to step S201.
Step S204: Move along the indicated changed path.
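The mobile-body-side flow of Fig. 5 (steps S201 to S204) can be sketched as one iteration of a control loop. `sensor`, `comm`, and `drive` are hypothetical interfaces standing in for the obstacle sensor 19, the 2nd communication circuit 14e, and the drive device.

```python
def mobile_body_step(sensor, comm, drive):
    """One pass of the Fig. 5 loop; returns what the body did (sketch only)."""
    if sensor.detects_obstacle():                          # S201
        comm.report_obstacle(sensor.obstacle_position())   # S202: notify, then avoid
        drive.avoid_obstacle()
        return "avoided"
    change = comm.poll_route_change()                      # S203
    if change is not None:
        drive.follow(change)                               # S204
        return "rerouted"
    return "continue"
```

The 2nd control circuit would call such a step repeatedly while tracking marks.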
The above operations are examples and can be changed as appropriate. Several modifications of the present embodiment are described below.
After a route has been changed, if a signal indicating that the obstacle has been removed is input, the 1st control circuit 51 may return the changed route to the original route. The signal indicating that the obstacle has been removed may be transmitted by another mobile body 10 moving in the vicinity of that location, or may be input manually by an administrator or user.
When a signal indicating the presence of an obstacle is first transmitted, the 1st control circuit 51 may request the following mobile bodies 10 that are expected to pass through the path on which the obstacle is present to avoid the obstacle themselves, without changing their routes. The 1st control circuit 51 may then start changing the paths of mobile bodies 10 expected to pass through that path only when signals indicating the presence of the obstacle have been transmitted from n mobile bodies (n being an integer of 2 or more), or have been transmitted n times. With this operation, routes are changed only when the obstacle remains present for a long time, so frequent route changes can be avoided when the obstacle is present only briefly.
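The n-report threshold described in this modification can be sketched as a small counter keyed by obstacle identity. The class and the idea of keying on the bracketing marks are assumptions of this sketch.

```python
from collections import Counter

class ObstacleFilter:
    """Reroute only after the same obstacle has been reported n times
    (or, equivalently, by n distinct mobile bodies), per the modification
    above. Hypothetical helper, not part of the disclosure."""
    def __init__(self, n=2):
        self.n = n
        self.reports = Counter()

    def should_reroute(self, obstacle_key):
        """obstacle_key identifies the obstacle, e.g. its bracketing marks."""
        self.reports[obstacle_key] += 1
        return self.reports[obstacle_key] >= self.n
```

Until the threshold is reached, the management device would simply ask followers to perform their own avoidance.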
Each mobile body 10 may further include: a laser range finder; a storage device that holds an environment map; and a position estimation device that determines and outputs estimated values of the position and orientation of the mobile body 10 on the environment map by referring to the data output from the laser range finder and the environment map. In this case, the 2nd control circuit 14a moves the mobile body 10 based on the estimated position and orientation output from the position estimation device and the signal indicating the travel path transmitted from the 1st control circuit 51.
The 1st control circuit 51 may transmit the environment map to each mobile body 10, or instruct an update of the environment map, according to the situation. For example, when a signal indicating that the obstacle has been removed is not input for a certain period of time (for example, several hours to several days) after a signal indicating the presence of the obstacle was transmitted from any of the plurality of mobile bodies 10, the 1st control circuit 51 may instruct each mobile body 10 to update its environment map to include information on the obstacle.
A more specific example in which the moving body is an automated guided vehicle is described below. In the following description, the automated guided vehicle is abbreviated "AGV" (Automated Guided Vehicle). Unless a contradiction arises, the following description applies equally to moving bodies other than AGVs, for example bipedal or multi-legged walking robots, unmanned aerial vehicles, hovercraft, and manned vehicles.
(1) Basic structure of system
Fig. 6 shows a basic configuration example of an exemplary moving body management system 100 of the present disclosure. The moving body management system 100 includes at least one AGV 10 and an operation management device 50 that manages the operation of the AGV 10. Fig. 6 also shows a terminal device 20 operated by a user 1.
The AGV 10 is an automated guided vehicle capable of "unguided" travel, requiring no guide such as magnetic tape during travel. The AGV 10 can estimate its own position and transmit the estimation result to the terminal device 20 and the operation management device 50. The AGV 10 can travel automatically in the travel space S in accordance with instructions from the operation management device 50. The AGV 10 can also operate in a "tracking mode" in which it moves following a person or another moving object.
The operation management device 50 is a computer system that manages the travel of each AGV 10 by tracking its position. The operation management device 50 may be a desktop PC, a notebook PC, and/or a server computer. The operation management device 50 communicates with each AGV 10 via a plurality of access points 2. For example, the operation management device 50 transmits to each AGV 10 the coordinates of the position to which that AGV 10 should head next. Each AGV 10 periodically, for example every 100 milliseconds, transmits data indicating its position and orientation (pose) to the operation management device 50. When the AGV 10 reaches the instructed position, the operation management device 50 transmits the coordinates of the next position to head for. The AGV 10 can also travel within the travel space S in accordance with operations that the user 1 inputs to the terminal device 20. An example of the terminal device 20 is a tablet computer. Typically, travel of the AGV 10 using the terminal device 20 is performed during map creation, and travel using the operation management device 50 is performed after the map has been created.
Fig. 7 shows an example of the travel space S in which three AGVs 10a, 10b, and 10c are located. All of the AGVs are assumed to be traveling in the depth direction of the figure. The AGVs 10a and 10b are transporting loads placed on their roofs. The AGV 10c follows the AGV 10b ahead of it. Reference numerals 10a, 10b, and 10c are used in Fig. 7 for convenience of explanation, but hereinafter they are referred to collectively as the AGV 10.
In addition to transporting a load placed on its roof, the AGV 10 can also transport a load by means of a traction trolley connected to it. Fig. 8A shows the AGV 10 and the traction trolley 5 before connection; casters are provided on the legs of the traction trolley 5. Fig. 8B shows the AGV 10 and the traction trolley 5 after connection: the AGV 10 is mechanically connected to the traction trolley 5, and when the AGV 10 travels, the traction trolley 5 is pulled behind it. By towing the traction trolley 5, the AGV 10 can carry a load placed on the traction trolley 5.
The method of coupling the AGV 10 to the traction trolley 5 is arbitrary; one example is described here. A plate 6 is secured to the roof of the AGV 10, and a guide 7 having a slit is provided on the traction trolley 5. The AGV 10 approaches the traction trolley 5 and inserts the plate 6 into the slit of the guide 7. When the insertion is complete, the AGV 10 passes an electromagnetic lock pin, not shown, through the plate 6 and the guide 7 and engages the electromagnetic lock. The AGV 10 is thus physically connected to the traction trolley 5.
Reference is again made to Fig. 6. Each AGV 10 and the terminal device 20 can, for example, be connected one-to-one and communicate in accordance with the Bluetooth (registered trademark) standard. Each AGV 10 and the terminal device 20 can also communicate using Wi-Fi (registered trademark) via one or more access points 2. The plurality of access points 2 are connected to each other via, for example, a switching hub 3. Fig. 6 shows two access points 2a and 2b. The AGV 10 is wirelessly connected to the access point 2a, and the terminal device 20 is wirelessly connected to the access point 2b. Data transmitted from the AGV 10 is received by the access point 2a, transferred to the access point 2b via the switching hub 3, and transmitted from the access point 2b to the terminal device 20. Data transmitted from the terminal device 20 is received by the access point 2b, transferred to the access point 2a via the switching hub 3, and transmitted from the access point 2a to the AGV 10. Bidirectional communication between the AGV 10 and the terminal device 20 is realized in this way. The plurality of access points 2 are also connected to the operation management device 50 via the switching hub 3, which likewise enables bidirectional communication between the operation management device 50 and each AGV 10.
(2) Creation of the environment map
In order for the AGV 10 to travel while estimating its own position, a map of the travel space S is created. A position estimation device and a laser range finder are mounted on the AGV 10, and the map can be created using the output of the laser range finder.
The AGV 10 transitions to a data acquisition mode in response to a user operation. In the data acquisition mode, the AGV 10 starts acquiring sensor data using the laser range finder. The laser range finder periodically emits a laser beam of, for example, infrared or visible light to scan the surrounding space S. The laser beam is reflected by the surfaces of structures such as walls and pillars and by objects placed on the floor. The laser range finder receives the reflected light, calculates the distance to each reflection point, and outputs data indicating the measured position of each reflection point. The position of each reflection point reflects the arrival direction and the distance of the reflected light. This measurement data is sometimes referred to as "measurement data" or "sensor data".
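As a rough illustration (not part of the patent's disclosure), one scan's output can be represented as a list of (bearing, range) pairs, one per emission direction. The hypothetical helper below drops returns with no echo or a range beyond the sensor's usable limit:

```python
def valid_returns(scan, max_range):
    """Keep only usable returns from one scan.

    `scan` is a list of (bearing_rad, range_m) pairs, one per emission
    direction; a range of 0.0 stands for "no echo received".
    """
    return [(a, d) for a, d in scan if 0.0 < d <= max_range]
```

A scan with a missing echo and an out-of-range return would thus be reduced to its single valid measurement.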
The position estimation device accumulates the sensor data in a storage device. When the acquisition of sensor data in the travel space S is completed, the sensor data stored in the storage device is transmitted to an external device. The external device is, for example, a computer that has a signal-processing processor and on which a map creation program is installed.
The signal-processing processor of the external device superimposes the sensor data acquired in each scan on one another. By repeating this superposition processing, the map of the space S can be created. The external device transmits the created map data to the AGV 10, which stores it in an internal storage device. The external device may be the operation management device 50 or another device.
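The superposition step amounts to transforming each scan's sensor-frame returns into a common world frame using the pose at which the scan was taken, then accumulating the resulting points. A minimal sketch, with hypothetical names and a simplified 2D model:

```python
import math

def scan_to_world(pose, scan):
    """Transform one scan's (bearing_rad, range_m) returns into world-frame
    (x, y) reflection points, given the sensor pose (x, y, theta).
    Accumulating such point sets over many poses is the superposition
    that gradually builds the map."""
    x0, y0, th = pose
    return [(x0 + d * math.cos(th + a), y0 + d * math.sin(th + a))
            for a, d in scan]
```

For example, a return 3 m dead ahead of a sensor at (1, 2) facing along the X axis lands at world point (4, 2).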
Instead of an external device, the AGV 10 may create the map itself. The processing described above for the signal-processing processor of the external device may be performed by a circuit such as the microcontroller unit (microcomputer) of the AGV 10. When the map is created within the AGV 10, the accumulated sensor data no longer needs to be transmitted to an external device. Since the data volume of sensor data is generally large, not having to transmit it avoids occupying the communication line.
The AGV 10 can travel in the travel space S to acquire sensor data in accordance with user operations. For example, the AGV 10 wirelessly receives, from the user via the terminal device 20, travel commands instructing it to move forward, backward, left, or right, and travels within the travel space S accordingly to create the map. When the AGV 10 is connected by wire to an operating device such as a joystick, it can likewise travel forward, backward, left, and right within the travel space S in accordance with control signals from that device to create the map. The sensor data may also be acquired by a person walking with a measurement cart on which the laser range finder is mounted.
Although a plurality of AGVs 10 are shown in Figs. 6 and 7, a single AGV may be used. When there are a plurality of AGVs 10, the user 1 can use the terminal device 20 to select one AGV 10 from among the registered AGVs and have it create the map of the travel space S.
After the map is created, each AGV 10 can travel automatically while estimating its own position using the map. The process of estimating the self-position is described later.
(3) AGV structure
Fig. 9 is an external view of an exemplary AGV 10 according to the present embodiment. The AGV 10 has two drive wheels 11a and 11b, four casters 11c, 11d, 11e, and 11f, a frame 12, a conveyance table 13, a travel control device 14, and a laser range finder 15. The two drive wheels 11a and 11b are provided on the right and left sides of the AGV 10, respectively. The four casters 11c, 11d, 11e, and 11f are arranged at the four corners of the AGV 10. The AGV 10 also has a plurality of motors connected to the two drive wheels 11a and 11b, but these motors are not shown in Fig. 9. Fig. 9 shows the drive wheel 11a and the two casters 11c and 11e on the right side of the AGV 10 and the caster 11f at the left rear; the left drive wheel 11b and the left front caster 11d are hidden by the frame 12 and are therefore not shown. The four casters 11c, 11d, 11e, and 11f can turn freely. In the following description, the drive wheels 11a and 11b are also referred to as the wheels 11a and 11b, respectively.
The AGV 10 also has at least one obstacle sensor 19 for detecting obstacles. In the example of Fig. 9, four obstacle sensors 19 are provided at the four corners of the frame 12; their number and arrangement may differ from this example. The obstacle sensor 19 may be any device capable of measuring distance, such as an infrared sensor, an ultrasonic sensor, or a stereo camera. When the obstacle sensor 19 is an infrared sensor, for example, it emits infrared light at regular intervals and measures the time until the reflected light returns, thereby detecting an obstacle within a certain distance. When the AGV 10 detects an obstacle on its path based on the signal output from at least one obstacle sensor 19, it performs an operation to avoid the obstacle.
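The time-of-flight principle behind such a sensor is simple arithmetic: the one-way distance is the round-trip time multiplied by the propagation speed and halved. A sketch with hypothetical names (for an infrared/optical sensor the speed is that of light; an ultrasonic sensor would use the speed of sound instead):

```python
C = 299_792_458.0  # propagation speed of light in m/s (optical sensor assumed)

def distance_from_round_trip(t_seconds):
    """One-way distance from a measured round-trip time-of-flight."""
    return C * t_seconds / 2.0

def obstacle_within(t_seconds, threshold_m):
    """True when the echo implies an obstacle within threshold_m."""
    return distance_from_round_trip(t_seconds) <= threshold_m
```

A round trip of 2/c seconds corresponds to an obstacle exactly 1 m away.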
The travel control device 14 is a device that controls the operation of the AGV 10. It mainly includes an integrated circuit containing a microcomputer (described later), electronic components, and a substrate on which these are mounted. The travel control device 14 performs the above-described transmission and reception of data with the terminal device 20 and the preprocessing calculations.
The laser range finder 15 is, for example, an optical device that measures the distance to a reflection point by emitting a laser beam 15a of infrared or visible light and detecting the reflected light of the laser beam 15a. In the present embodiment, the laser range finder 15 of the AGV 10 emits a pulsed laser beam 15a while changing direction in steps of 0.25 degrees within a space spanning, for example, 135 degrees to the left and right of the front of the AGV 10 (270 degrees in total), and detects the reflected light of each laser beam 15a. This yields distance data for the reflection points in a total of 1081 directions spaced 0.25 degrees apart. In the present embodiment, the scan of the surrounding space by the laser range finder 15 is substantially parallel to the ground and planar (two-dimensional); however, the laser range finder 15 may also scan in the height direction.
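The figure of 1081 directions follows directly from the sweep geometry: 270°/0.25° = 1080 steps, plus one because both endpoints are measured. A small sketch (hypothetical function name) that enumerates the scan angles:

```python
def scan_directions(half_angle_deg=135.0, step_deg=0.25):
    """Bearing angles (degrees, relative to straight ahead) covered by a
    planar sweep of +/- half_angle_deg in increments of step_deg.
    Both endpoints are included, hence the +1."""
    n = int(round(2 * half_angle_deg / step_deg)) + 1
    return [-half_angle_deg + i * step_deg for i in range(n)]
```

With the defaults this produces exactly 1081 angles running from -135° to +135°.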
The AGV 10 can create a map of the space S based on its own position and orientation (direction) and the scanning results of the laser range finder 15. The map can reflect the arrangement of structures such as walls and pillars around the AGV and of objects placed on the floor. The map data is stored in a storage device provided in the AGV 10.
Generally, the position and posture of a mobile body are together referred to as its pose. The position and posture of a mobile body in a two-dimensional plane are expressed by position coordinates (x, y) in an XY rectangular coordinate system and an angle θ with respect to the X axis. Hereinafter, the position and posture of the AGV 10, that is, the pose (x, y, θ), may be simply referred to as the "position".
The position of a reflection point viewed from the emission position of the laser beam 15a can be expressed in polar coordinates determined by an angle and a distance. In the present embodiment, the laser range finder 15 outputs sensor data expressed in polar coordinates. However, the laser range finder 15 may also convert positions expressed in polar coordinates into rectangular coordinates and output those.
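The polar-to-rectangular conversion mentioned here is the standard trigonometric identity. A minimal sketch (hypothetical function name; angle measured from the sensor's forward axis):

```python
import math

def polar_to_rect(angle_rad, distance):
    """Convert a (bearing, range) measurement into sensor-frame (x, y)."""
    return (distance * math.cos(angle_rad), distance * math.sin(angle_rad))
```

A point 2 m dead ahead (angle 0) maps to (2, 0); a point 1 m to the side at +90° maps to (0, 1).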
The construction and operating principle of laser range finders are well known, and a further detailed explanation is therefore omitted in this specification. Examples of objects that the laser range finder 15 can detect are people, loads, shelves, and walls.
The laser range finder 15 is an example of an external sensor for sensing the surrounding space and acquiring sensor data. An image sensor and an ultrasonic sensor are other examples of such external sensors.
The travel control device 14 estimates its current position by comparing the measurement results of the laser range finder 15 with map data that the travel control device itself holds. The held map data may be map data created by another AGV 10.
Fig. 10A shows a 1st example of the hardware configuration of the AGV 10. Fig. 10A also shows a specific configuration of the travel control device 14.
The AGV 10 has the travel control device 14, the laser range finder 15, two motors 16a and 16b, a drive device 17, the wheels 11a and 11b, and two rotary encoders 18a and 18b.
The travel control device 14 includes a microcomputer 14a, a memory 14b, a storage device 14c, a communication circuit 14d, and a position estimation device 14e. These are connected via a communication bus 14f and can transmit and receive data to and from one another. The laser range finder 15 is also connected to the communication bus 14f via a communication interface (not shown) and transmits measurement data, i.e., its measurement results, to the microcomputer 14a, the position estimation device 14e, and/or the memory 14b. The microcomputer 14a also functions as the 2nd control circuit 14a shown in Fig. 1.
The microcomputer 14a is a processor or control circuit (computer) that executes the arithmetic operations for controlling the entire AGV 10, including the travel control device 14. Typically, the microcomputer 14a is a semiconductor integrated circuit. The microcomputer 14a transmits a PWM (Pulse Width Modulation) signal as a control signal to the drive device 17, thereby controlling the drive device 17 so as to adjust the voltage applied to the motors. The motors 16a and 16b are thereby each rotated at the desired rotational speed.
One or more control circuits (for example, microcomputers) for controlling the driving of the left and right motors 16a and 16b may be provided independently of the microcomputer 14a. For example, the motor drive device 17 may have two microcomputers that control the driving of the motors 16a and 16b, respectively. These two microcomputers can perform coordinate calculations using the encoder information output from the encoders 18a and 18b to estimate the distance the AGV 10 has traveled from a given initial position. The two microcomputers may also control the motor drive circuits 17a and 17b using the encoder information.
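The coordinate calculation from encoder information is classic differential-drive dead reckoning: the per-wheel travel distances (derived from encoder counts) give the body's forward travel and heading change. A simplified sketch under the usual straight-arc assumption, with hypothetical names; real firmware would run this per encoder tick and handle wheel radius and gear ratios:

```python
import math

def update_pose(pose, d_left, d_right, wheel_base):
    """Advance a differential-drive pose (x, y, theta) given the distances
    traveled by the left and right wheels since the last update.
    Uses the midpoint-heading approximation for the arc."""
    x, y, th = pose
    d = (d_left + d_right) / 2.0            # forward travel of the body center
    dth = (d_right - d_left) / wheel_base   # heading change
    x += d * math.cos(th + dth / 2.0)
    y += d * math.sin(th + dth / 2.0)
    return (x, y, th + dth)
```

Equal wheel travel of 1 m moves a robot facing along X from the origin to (1, 0) with no heading change.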
The memory 14b is a volatile storage device that stores the computer program executed by the microcomputer 14a. The memory 14b can also be used as a work memory for the microcomputer 14a and the position estimation device 14e to perform calculations.
The storage device 14c is a nonvolatile semiconductor storage device. However, the storage device 14c may instead be a magnetic recording medium typified by a hard disk or an optical recording medium typified by an optical disc. In that case, the storage device 14c may include a head device for writing and/or reading data on the recording medium and a control device for the head device.
The storage device 14c stores the map data M of the travel space S and data R (travel route data) of one or more travel routes. The map data M is created by the AGV 10 operating in the map creation mode and stored in the storage device 14c. The travel route data R is transmitted from the outside after the map data M has been created. In the present embodiment, the map data M and the travel route data R are stored in the same storage device 14c, but they may be stored in different storage devices.
An example of the travel route data R will be described.
When the terminal device 20 is a tablet computer, the AGV 10 receives from it travel route data R indicating the travel route. The travel route data R in this case includes marker data indicating the positions of a plurality of markers. A "marker" indicates a passing position (via point) of the AGV 10. The travel route data R includes at least position information of a start marker indicating the travel start position and of an end marker indicating the travel end position. The travel route data R may further include position information of markers for one or more intermediate via points. When the travel route includes one or more intermediate via points, the route from the start marker through those via points in order to the end marker is defined as the travel route. In addition to the coordinate data of a marker, the data for each marker may include the orientation (angle) and travel speed of the AGV 10 until it moves to the next marker. When the AGV 10 stops temporarily at each marker position to estimate its own position and notify the terminal device 20, the data for each marker may also include the acceleration time required to reach the travel speed and/or the deceleration time required to decelerate from the travel speed and stop at the position of the next marker.
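One plausible in-memory shape for such route data — hypothetical record names, not the patent's actual format — is a list of marker records with optional per-marker orientation and speed, where the first and last entries play the roles of the start and end markers:

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Marker:
    x: float
    y: float
    heading_deg: Optional[float] = None  # orientation until the next marker
    speed: Optional[float] = None        # travel speed toward the next marker

@dataclass
class TravelRoute:
    markers: List[Marker]  # start marker, intermediate via points..., end marker

    def start(self) -> Marker:
        return self.markers[0]

    def end(self) -> Marker:
        return self.markers[-1]
```

A route from (0, 0) via (5, 0) to (5, 5) is then just `TravelRoute([Marker(0, 0), Marker(5, 0), Marker(5, 5)])`.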
The movement of the AGV 10 may also be controlled not by the terminal device 20 but by the operation management device 50 (for example, a PC and/or a server computer). In this case, the operation management device 50 may instruct the AGV 10 to move to the next marker each time the AGV 10 reaches a marker. For example, the AGV 10 receives from the operation management device 50, as travel route data R indicating the travel route, coordinate data of the next destination position, or data of the distance to that position and the angle in which to travel.
The AGV 10 can travel along the stored travel route while estimating its own position using the created map and the sensor data output by the laser range finder 15 during travel.
The communication circuit 14d is a wireless communication circuit that performs wireless communication in accordance with, for example, the Bluetooth (registered trademark) and/or Wi-Fi (registered trademark) standards. Both standards include wireless communication using frequencies in the 2.4 GHz band. In the mode in which the AGV 10 is driven to create a map, for example, the communication circuit 14d performs wireless communication in accordance with the Bluetooth (registered trademark) standard and communicates one-to-one with the terminal device 20.
The position estimation device 14e performs the map creation process and, during travel, the process of estimating its own position. The position estimation device 14e creates the map of the travel space S based on the position and posture of the AGV 10 and the scanning results of the laser range finder. During travel, the position estimation device 14e receives sensor data from the laser range finder 15 and reads out the map data M stored in the storage device 14c. The self-position (x, y, θ) on the map data M is identified by matching local map data (sensor data) created from the scanning results of the laser range finder 15 against the wider-range map data M. The position estimation device 14e also generates "reliability" data indicating how well the local map data matches the map data M. The self-position (x, y, θ) and the reliability data can be transmitted from the AGV 10 to the terminal device 20 or the operation management device 50, which can receive them and display them on a built-in or connected display device.
In the present embodiment, the microcomputer 14a and the position estimation device 14e are separate components, but this is merely an example. They may instead be a single-chip circuit or semiconductor integrated circuit capable of independently performing the operations of both. Fig. 10A shows a chip circuit 14g that includes the microcomputer 14a and the position estimation device 14e. Hereinafter, the case in which the microcomputer 14a and the position estimation device 14e are provided separately will be described.
The two motors 16a and 16b are attached to the two wheels 11a and 11b, respectively, and rotate them. That is, the two wheels 11a and 11b are both drive wheels. In this description, the motors 16a and 16b drive the right and left wheels of the AGV 10, respectively.
The moving body 10 also has an encoder unit 18 that measures the rotational position or rotational speed of the wheels 11a and 11b. The encoder unit 18 includes a 1st rotary encoder 18a and a 2nd rotary encoder 18b. The 1st rotary encoder 18a measures rotation at some position along the power transmission mechanism from the motor 16a to the wheel 11a, and the 2nd rotary encoder 18b measures rotation at some position along the power transmission mechanism from the motor 16b to the wheel 11b. The encoder unit 18 transmits the signals acquired by the rotary encoders 18a and 18b to the microcomputer 14a. The microcomputer 14a may control the movement of the moving body 10 using not only the signals received from the position estimation device 14e but also the signals received from the encoder unit 18.
The drive device 17 has motor drive circuits 17a and 17b for adjusting the voltages applied to the two motors 16a and 16b, respectively. Each of the motor drive circuits 17a and 17b includes a so-called inverter circuit. The motor drive circuits 17a and 17b switch the current flowing in each motor on and off according to the PWM signal transmitted from the microcomputer 14a or the microcomputer in the motor drive circuit 17a, thereby adjusting the voltage applied to the motors.
Fig. 10B shows a 2nd example of the hardware configuration of the AGV 10. The 2nd hardware configuration example differs from the 1st (Fig. 10A) in that a laser positioning system 14h is provided and the microcomputer 14a is connected to each component on a one-to-one basis.
The laser positioning system 14h has the position estimation device 14e and the laser range finder 15, which are connected by, for example, an Ethernet (registered trademark) cable. The operations of the position estimation device 14e and the laser range finder 15 are as described above. The laser positioning system 14h outputs information indicating the pose (x, y, θ) of the AGV 10 to the microcomputer 14a.
The microcomputer 14a has various general-purpose I/O interfaces or general-purpose input/output ports (not shown), through which it is directly connected to the other components in the travel control device 14, such as the communication circuit 14d and the laser positioning system 14h.
Except for the structure described above, Fig. 10B is the same as Fig. 10A, and the description of the common structure is therefore omitted.
The AGV 10 according to the embodiment of the present disclosure may also have safety sensors such as bumper switches (not shown), and inertial measurement devices such as gyro sensors. Using the measurement data of internal sensors such as the rotary encoders 18a and 18b and the inertial measurement device, the travel distance and the amount of change in posture (angle) of the AGV 10 can be estimated. These estimated distances and angles are referred to as odometry data and can complement the position and posture information acquired by the position estimation device 14e.
(4) Map data
Figs. 11A to 11F schematically show an AGV 10 traveling while acquiring sensor data. The user 1 may manually move the AGV 10 while operating the terminal device 20. Alternatively, a unit having the travel control device 14 shown in Figs. 10A and 10B, or the AGV 10 itself, may be placed on a cart, and the user 1 may acquire the sensor data while pushing or pulling the cart.
Fig. 11A shows an AGV 10 scanning the surrounding space using the laser range finder 15. The laser beam is emitted at each predetermined step angle to perform a scan. The illustrated scanning range is a schematic example and differs from the 270-degree total scanning range described above.
In Figs. 11A to 11F, the positions of the reflection points of the laser beam are schematically shown by a plurality of black dots 4 indicated by the symbol "·". The laser beam is scanned in short cycles while the position and posture of the laser range finder 15 change, so the number of actual reflection points is far larger than the number of reflection points 4 shown in the figures. The position estimation device 14e accumulates the positions of the black dots 4 acquired during travel, for example, in the memory 14b. The map data is gradually completed as the AGV 10 continues to scan while traveling. In Figs. 11B to 11E, only the scanning range is shown for simplicity; it is illustrative and differs from the 270-degree total described above.
The map may be created by the microcomputer 14a in the AGV 10 or by an external computer based on the sensor data, after sensor data sufficient for map creation has been acquired. Alternatively, the map may be created in real time from sensor data acquired while the AGV 10 moves.
Fig. 11F schematically shows part of the completed map 80. In the map shown in Fig. 11F, free space is delimited by a point cloud corresponding to the set of reflection points of the laser beam. Another example of a map is an occupancy grid map that distinguishes, in units of grid cells, space occupied by objects from free space. The position estimation device 14e stores the map data (map data M) in the memory 14b or the storage device 14c. The number and density of the black dots shown in the figures are examples.
The map data thus obtained may be shared by multiple AGVs 10.
A typical example of an algorithm by which the AGV 10 estimates its own position from map data is ICP (Iterative Closest Point) matching. As described above, the self-position (x, y, θ) on the map data M can be estimated by matching the local map data (sensor data) created from the scanning results of the laser range finder 15 against the wider-range map data M.
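To give a feel for what "matching" means, here is a deliberately crude stand-in (hypothetical names, not the patent's algorithm): it estimates only the translation that moves the scan points' centroid onto the map points' centroid. Real ICP iterates over nearest-neighbor correspondences and also solves for rotation; this sketch only illustrates the alignment idea.

```python
def align_translation(scan_pts, map_pts):
    """Estimate the 2D translation aligning scan_pts onto map_pts
    by matching their centroids. Both are non-empty lists of (x, y)."""
    def centroid(pts):
        n = len(pts)
        return (sum(p[0] for p in pts) / n, sum(p[1] for p in pts) / n)
    sx, sy = centroid(scan_pts)
    mx, my = centroid(map_pts)
    return (mx - sx, my - sy)
```

For a scan that is an exact translated copy of a map fragment, this recovers the translation exactly; in practice noise and partial overlap are why the iterative correspondence-based formulation is used.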
When the area in which the AGV 10 travels is large, the amount of map data M increases. This may lengthen the time required to create the map or require considerable time for self-position estimation. When such problems occur, the map data M may be divided into a plurality of local maps that are created and recorded separately.
Fig. 12 shows an example in which the entire area of one floor of a factory is covered by a combination of four partial map data M1, M2, M3, and M4. In this example, each partial map covers an area of 50 m × 50 m. A rectangular overlap area 5 m wide is provided at the boundary between each pair of maps adjacent in the X and Y directions; this overlap area is called the "map switching area". When an AGV 10 traveling with reference to one local map reaches the map switching area, it switches to traveling with reference to the adjacent local map. The number of local maps is not limited to four and may be set appropriately according to the floor area over which the AGV 10 travels and the performance of the computer that performs map creation and self-position estimation. The size of the local map data and the width of the overlap area are also not limited to the above example and may be set arbitrarily.
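The tiling and handover logic can be sketched as follows. This is an illustrative assumption, not the patent's implementation: tiles are an axis-aligned 50 m grid, and the 5 m switching band is taken as centered on each tile boundary (2.5 m on each side). All names are hypothetical.

```python
def local_map_index(x, y, tile=50.0):
    """Grid index (col, row) of the 50 m x 50 m local map covering (x, y)."""
    return (int(x // tile), int(y // tile))

def in_switch_zone(x, y, tile=50.0, overlap=5.0):
    """True inside the 5 m-wide band straddling a tile boundary, where the
    vehicle hands over from one local map to the adjacent one."""
    half = overlap / 2.0
    def near_edge(v):
        r = v % tile
        return r < half or r > tile - half
    return near_edge(x) or near_edge(y)
```

A vehicle at x = 49 m is inside the band around the x = 50 m boundary, while one at (25, 25) is in the interior of its tile.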
(5) Configuration example of operation management device
Fig. 13 shows an example of the hardware configuration of the operation management device 50. The operation management device 50 has a CPU 51, a memory 52, a position database (position DB) 53, a communication circuit 54, a map database (map DB) 55, and an image processing circuit 56.
The CPU 51, the memory 52, the position DB 53, the communication circuit 54, the map DB 55, and the image processing circuit 56 are connected by a communication bus 57 and can transmit and receive data to and from one another.
The CPU 51 is a signal processing circuit (computer) that controls the operation of the operation management device 50. Typically, the CPU 51 is a semiconductor integrated circuit. The CPU 51 functions as the 1st control circuit 51 shown in Fig. 1.
The memory 52 is a volatile storage device that stores the computer program executed by the CPU 51. The memory 52 can also be used as a work memory for the CPU 51 to perform calculations.
The position DB 53 stores position data indicating positions that can serve as destinations of the AGVs 10. The position data may be expressed, for example, by coordinates virtually set in the factory by an administrator, who determines the position data.
The communication circuit 54 performs wired communication in accordance with, for example, the Ethernet (registered trademark) standard. The communication circuit 54 is connected to the access point 2 (Fig. 1) by wire and can communicate with the AGV 10 via the access point 2. The communication circuit 54 receives, from the CPU 51 via the bus 57, data to be sent to the AGV 10. It also sends data (notifications) received from the AGV 10 to the CPU 51 and/or the memory 52 via the bus 57.
The map DB 55 stores data of maps of the interior of the factory or the like in which the AGV 10 travels. This map may be the same as the map 80 (Fig. 11F) or different. The data format is not limited as long as the map has a one-to-one correspondence with the position of each AGV 10. For example, the map stored in the map DB 55 may be a map created by CAD.
The position DB 53 and the map DB 55 may be constructed on a nonvolatile semiconductor memory, or on a magnetic recording medium typified by a hard disk or an optical recording medium typified by an optical disc.
The image processing circuit 56 is a circuit that generates the video data to be displayed on the monitor 58. The image processing circuit 56 operates only when the administrator operates the operation management device 50, and a further detailed description is omitted in the present embodiment. The monitor 58 may be integrated with the operation management device 50. The processing of the image processing circuit 56 may also be performed by the CPU 51.
(6) Operation of the operation management device
The operation of the operation management device 50 is briefly described with reference to Fig. 14. Fig. 14 is a diagram schematically showing an example of the travel path of the AGV 10 determined by the operation management device 50.
The operation of the AGV 10 and the operation management device 50 is summarized as follows. Next, an example will be described in which an AGV 10 currently located at a position (marker) M1 travels through several positions to a marker Mn+1 (n: an integer of 1 or more), its final destination. The position DB 53 stores coordinate data of each position, such as the marker M2 to be passed after the marker M1 and the marker M3 to be passed after the marker M2.
The CPU 51 of the operation management device 50 reads the coordinate data of the marker M2 by referring to the position DB 53 and generates a travel command directed toward the marker M2. The communication circuit 54 sends the travel command to the AGV 10 via the access point 2.
The CPU 51 periodically receives data indicating the current position and posture from the AGV 10 via the access point 2. The operation management device 50 can thereby track the position of each AGV 10. When the CPU 51 judges that the current position of the AGV 10 matches the marker M2, it reads the marker M3, generates a travel command directed toward the marker M3, and sends it to the AGV 10. That is, when the operation management device 50 determines that the AGV 10 has reached a certain position, it transmits a travel command toward the next position to be passed. In this way, the AGV 10 can reach the marker Mn+1, its final destination.
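The dispatch loop described above can be sketched as follows: given the ordered markers and the position most recently reported by the AGV, return the next marker to command, or None once the final marker has been reached. All names and the distance tolerance are hypothetical, and a real dispatcher would track per-AGV state rather than rescan the route.

```python
def next_command(route, current_pos, reached_tol=0.1):
    """route: ordered list of (x, y) markers; current_pos: reported (x, y).
    Returns the first marker not yet reached (the next travel target),
    or None when the final marker has been reached."""
    for mx, my in route:
        dx = current_pos[0] - mx
        dy = current_pos[1] - my
        if (dx * dx + dy * dy) ** 0.5 > reached_tol:
            return (mx, my)  # not yet at this marker: command travel toward it
    return None              # final destination reached
```

An AGV sitting on the first of two markers is commanded toward the second; an AGV at the final marker receives no further command.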
Industrial applicability
The moving body and the moving body management system of the present disclosure can be suitably used for moving and carrying loads, parts, finished products, and the like in factories, warehouses, construction sites, logistics facilities, hospitals, and the like.
Description of the reference symbols
1: user; 2a, 2b: access point; 10: AGV (moving body); 14: travel control device; 14a: microcomputer (2nd control circuit); 14b: memory; 14c: storage device; 14d: communication circuit (2nd communication circuit); 14e: position estimation device; 15: laser range finder; 16a, 16b: motor; 17: drive device; 17a, 17b: motor drive circuit; 18: encoder unit; 18a, 18b: rotary encoder; 19: obstacle sensor; 20: terminal device (mobile computer such as a tablet computer); 50: operation management device; 51: CPU (1st control circuit); 52: memory; 53: position database (position DB); 54: communication circuit (1st communication circuit); 55: map database (map DB); 56: image processing circuit; 100: moving body management system; 101: moving body; 103: external sensor; 105: position estimation device; 107: storage device; 109: controller; 111: drive device.

Claims (6)

CN201880057317.5A2017-09-252018-09-20Moving body and moving body systemPendingCN111065981A (en)

Applications Claiming Priority (3)

Application NumberPriority DateFiling DateTitle
JP2017-1835312017-09-25
JP20171835312017-09-25
PCT/JP2018/034905WO2019059307A1 (en)2017-09-252018-09-20Moving body and moving body system

Publications (1)

Publication NumberPublication Date
CN111065981Atrue CN111065981A (en)2020-04-24

Family

ID=65809839

Family Applications (1)

Application Number | Title | Priority Date | Filing Date
CN201880057317.5A | Pending | CN111065981A (en)

Country Status (3)

Country | Link
JP (1) | JP7136426B2 (en)
CN (1) | CN111065981A (en)
WO (1) | WO2019059307A1 (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
CN111806460A (en) * | 2020-07-17 | 2020-10-23 | 青岛蚂蚁机器人有限责任公司 | Automatic guided transport vehicle control system
CN115145259A (en) * | 2021-03-16 | 2022-10-04 | 丰田自动车株式会社 | Movement path calculation device, moving body control system, movement path calculation method, and computer-readable storage medium
TWI784786B (en) * | 2020-11-16 | 2022-11-21 | 日商豐田自動織機股份有限公司 | Automated guided vehicle control device

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
WO2020235392A1 (en) * | 2019-05-17 | 2020-11-26 | 村田機械株式会社 | Transport vehicle system, transport vehicle, and control method
CN110567471B (en) * | 2019-08-09 | 2020-10-09 | 易普森智慧健康科技(深圳)有限公司 | Indoor traffic control method based on position
JP7338611B2 (en) * | 2020-11-16 | 2023-09-05 | 株式会社豊田自動織機 | Controller for automatic guided vehicle
JP7664547B2 (en) * | 2020-12-28 | 2025-04-18 | パナソニックIpマネジメント株式会社 | Transport system, transport method, and transport device
JP7562625B2 | 2022-12-14 | 2024-10-07 | 東芝エレベータ株式会社 | Server system, behavior planning system, behavior planning method and program
JP2024115843A (en) * | 2023-02-15 | 2024-08-27 | 株式会社東芝 | Information processing device, system, mobile object, method and program

Citations (11)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
JPH11259130A (en) * | 1998-03-06 | 1999-09-24 | Nissan Motor Co Ltd | Automatic guided vehicle route setting method and automatic guided vehicle control method
WO2002023297A1 (en) * | 2000-09-11 | 2002-03-21 | Kunikatsu Takase | Mobile body movement control system
JP2010231698A (en) * | 2009-03-30 | 2010-10-14 | Advanced Telecommunication Research Institute International | Network robot system, robot control apparatus, robot control method, and robot control program
CN103268111A (en) * | 2013-05-28 | 2013-08-28 | 重庆大学 | A networked distributed multi-mobile robot system
CN105974925A (en) * | 2016-07-19 | 2016-09-28 | 合肥学院 | AGV trolley driving control method and system
CN106325280A (en) * | 2016-10-20 | 2017-01-11 | 上海物景智能科技有限公司 | A multi-robot anti-collision method and system
CN106548231A (en) * | 2016-11-24 | 2017-03-29 | 北京地平线机器人技术研发有限公司 | Mobile controller, mobile robot and the method for moving to optimal interaction point
CN106774345A (en) * | 2017-02-07 | 2017-05-31 | 上海仙知机器人科技有限公司 | A method and apparatus for carrying out multi-robot cooperation
JP2017130121A (en) * | 2016-01-22 | 2017-07-27 | 株式会社ダイヘン | Mobile body and server
JP2017134794A (en) * | 2016-01-29 | 2017-08-03 | パナソニックIpマネジメント株式会社 | Mobile robot control system and server device for controlling mobile robots
CN107015566A (en) * | 2017-06-05 | 2017-08-04 | 河池学院 | A multi-robot detecting system based on LabVIEW

Also Published As

Publication number | Publication date
JP7136426B2 (en) | 2022-09-13
JPWO2019059307A1 (en) | 2020-10-15
WO2019059307A1 (en) | 2019-03-28

Similar Documents

Publication | Title
JP7168211B2 | Mobile object that avoids obstacles and its computer program
US20190294181A1 | Vehicle, management device, and vehicle management system
JP6825712B2 | Mobiles, position estimators, and computer programs
JP7136426B2 | Management device and mobile system
JP7081881B2 | Mobiles and mobile systems
JP7111424B2 | Mobile object, position estimation device, and computer program
CN110998472A | Mobile object and computer program
CN110998473A | Position estimation system and mobile body having the same
JP2020166702A | Mobile body system, map creation system, route creation program and map creation program
JPWO2019187816A1 | Mobiles and mobile systems
JP2019175137A | Mobile body and mobile body system
JPWO2019054209A1 | Map making system and map making device
JP2019053391A | Mobile body
CN111971633A | Position estimation system, mobile object having the same, and computer program
JP2019079171A | Movable body
JP2019175136A | Mobile body
JP2019179497A | Moving body and moving body system
JP2019148871A | Movable body and movable body system
CN112578789A | Moving body
JPWO2019069921A1 | Mobile
JP2019067001A | Moving body
JP2019175138A | Mobile body and management device
JPWO2019059299A1 | Operation management device
JP2020166701A | Mobile object and computer program

Legal Events

Code | Title
PB01 | Publication
SE01 | Entry into force of request for substantive examination
WD01 | Invention patent application deemed withdrawn after publication

Application publication date: 2020-04-24

