US10001780B2 - Systems and methods for dynamic route planning in autonomous navigation - Google Patents

Systems and methods for dynamic route planning in autonomous navigation

Info

Publication number
US10001780B2
Authority
US
United States
Prior art keywords
route
robot
pose
poses
points
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active, expires
Application number
US15/341,612
Other versions
US20180120856A1 (en)
Inventor
Borja Ibarz Gabardos
Jean-Baptiste Passot
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Brain Corp
Original Assignee
Brain Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Family has litigation
US case filed in Illinois Northern District Court (litigation, critical). Source: District Court; Jurisdiction: Illinois Northern District Court. https://portal.unifiedpatents.com/litigation/Illinois%20Northern%20District%20Court/case/1%3A24-cv-12569 ("Unified Patents Litigation Data" by Unified Patents is licensed under a Creative Commons Attribution 4.0 International License.)
First worldwide family litigation filed: https://patents.darts-ip.com/?family=62021299&utm_source=google_patent&utm_medium=platform_link&utm_campaign=public_patent_search&patent=US10001780(B2) ("Global patent litigation dataset" by Darts-ip is licensed under a Creative Commons Attribution 4.0 International License.)
Application filed by Brain Corp
Priority to US15/341,612 (US10001780B2)
Assigned to BRAIN CORPORATION: assignment of assignors interest (see document for details). Assignors: GABARDOS, BORJA IBARZ; PASSOT, JEAN-BAPTISTE
Priority to PCT/US2017/059379 (WO2018085294A1)
Priority to KR1020197015595A (KR102528869B1)
Priority to CN201780074759.6A (CN110023866B)
Priority to JP2019523827A (JP7061337B2)
Priority to EP17867114.5A (EP3535630A4)
Priority to CA3042532A (CA3042532A1)
Publication of US20180120856A1
Priority to US16/011,499 (US10379539B2)
Publication of US10001780B2
Application granted
Priority to US16/454,217 (US20200004253A1)
Assigned to HERCULES CAPITAL, INC.: security interest (see document for details). Assignors: BRAIN CORPORATION
Legal status: Active
Adjusted expiration

Abstract

Systems and methods for dynamic route planning in autonomous navigation are disclosed. In some exemplary implementations, a robot can have one or more sensors configured to collect data about an environment including detected points on one or more objects in the environment. The robot can then plan a route in the environment, where the route can comprise one or more route poses. The route poses can include a footprint indicative at least in part of a pose, size, and shape of the robot along the route. Each route pose can have a plurality of points therein. Based on forces exerted on the points of each route pose by other route poses, objects in the environment, and others, each route pose can reposition. Based at least in part on interpolation performed on the route poses (some of which may be repositioned), the robot can dynamically route.

Description

COPYRIGHT
A portion of the disclosure of this patent document contains material that is subject to copyright protection. The copyright owner has no objection to the facsimile reproduction by anyone of the patent document or the patent disclosure, as it appears in the Patent and Trademark Office patent files or records, but otherwise reserves all copyright rights whatsoever.
BACKGROUND
Technological Field
The present application relates generally to robotics, and more specifically to systems and methods for dynamic route planning in autonomous navigation.
Background
Robotic navigation can be a complex problem. In some cases, robots can determine a route to travel. By way of illustration, a robot can learn routes demonstrated by a user (e.g., the user can control a robot along a route and/or can upload a map containing a route). As another illustration, a robot can plan its own route in an environment based on its knowledge of the environment (e.g., a map). However, a challenge that can occur is that after a robot determines a route, features of the environment can change. For example, items can fall into the path of the route and/or parts of the environment can change. Current robots may not be able to make real-time adjustments to their planned paths in response to these changes (e.g., blockages). In such situations, current robots may stop, collide with objects, and/or make sub-optimal adjustments to their routes. Accordingly, there is a need for improved systems and methods for autonomous navigation, including systems and methods for dynamic route planning.
SUMMARY
The foregoing needs are satisfied by the present disclosure, which provides for, inter alia, apparatus and methods for dynamic route planning in autonomous navigation. Example implementations described herein have innovative features, no single one of which is indispensable or solely responsible for their desirable attributes. Without limiting the scope of the claims, some of the advantageous features will now be summarized.
In a first aspect, a robot is disclosed. In one exemplary implementation, the robot includes: one or more sensors configured to collect data about an environment including detected points on one or more objects in the environment; and a controller configured to: create a map of the environment based at least in part on the collected data, determine a route in the map in which the robot will travel, generate one or more route poses on the route, wherein each route pose comprises a footprint indicative of poses of the robot along the route and each route pose has a plurality of points disposed therein, determine forces on each of the plurality of points of each route pose, the forces comprising repulsive forces from one or more of the detected points on the one or more objects and attractive forces from one or more of the plurality of points on others of the one or more route poses, reposition one or more route poses in response to the forces on each point of the one or more route poses, and perform interpolation between one or more route poses to generate a collision-free path between the one or more route poses for the robot to travel.
In one variant, the one or more route poses form a sequence in which the robot travels along the route; and the interpolation comprises a linear interpolation between sequential ones of the one or more route poses.
In another variant, the interpolation generates one or more interpolation route poses having substantially similar footprints to the footprint of each route pose. In another variant, the determination of the forces on each point of the one or more route poses further comprises computing a force function that associates, at least in part, the forces on each point of each route pose with one or more characteristics of objects in the environment.
In another variant, the one or more characteristics includes one or more of distance, shape, material, and color. In another variant, the force function associates zero repulsive force exerted by a first detected point on a first object where a distance between the first point and a second point of a first route pose is above a predetermined distance threshold.
In another variant, the footprint of each route pose has substantially similar size and shape as the footprint of the robot.
In another variant, the robot comprises a floor cleaner.
In a second aspect, a method for dynamic navigation of a robot is disclosed. In one exemplary implementation, the method includes: generating a map of the environment using data from one or more sensors; determining a route on the map, the route including one or more route poses, each route pose comprising a footprint indicative at least in part of a pose and a shape of the robot along the route and each route pose having a plurality of points disposed therein; computing repulsive forces from a point on an object in the environment onto the plurality of points of a first route pose of the one or more route poses; repositioning the first route pose in response to at least the repulsive force; and performing an interpolation between the repositioned first route pose and another of the one or more route poses.
In one variant, the method further includes determining attractive forces from a point on another of the one or more route poses exerted on the plurality of points of the first route pose. In another variant, the method further includes detecting a plurality of objects in the environment with the one or more sensors, each of the plurality of objects having detected points; and defining a force function, the force function computing repulsive forces exerted by each of the detected points of the plurality of objects on the plurality of points of the first route pose, wherein each repulsive force is a vector.
In another variant, repositioning the first route pose includes calculating the minimum of the force function.
In another variant, the repositioning of the first route pose includes translating and rotating the first route pose.
In another variant, interpolation includes: generating an interpolation route pose having a footprint substantially similar to a shape of the robot; and determining a translation and rotation of the interpolation route pose based at least on a collision-free path between the translated and rotated first route pose and the another of the one or more route poses.
In another variant, the method further comprises computing a magnitude of the repulsive forces as proportional to a distance between the point on the object and each of the plurality of points of the first route pose if the point on the object is outside of the footprint of the first route pose.
In another variant, the method further comprises computing a magnitude of the repulsive forces as inversely proportional to a distance between the point on the object and each of the plurality of points of the first route pose if the point on the object is inside the footprint of the first route pose.
In another variant, the method further includes computing torque forces onto the plurality of points of the first route pose due to the repulsive forces.
In a third aspect, a non-transitory computer-readable storage apparatus is disclosed. In one embodiment, the non-transitory computer-readable storage apparatus has a plurality of instructions stored thereon, the instructions being executable by a processing apparatus to operate a robot. The instructions are configured to, when executed by the processing apparatus, cause the processing apparatus to: generate a map of the environment using data from one or more sensors; determine a route on the map, the route including one or more route poses, each route pose comprising a footprint indicative at least in part of a pose and a shape of the robot along the route and each route pose having a plurality of points disposed therein; and compute repulsive forces from a point on an object in the environment onto the plurality of points of a first route pose of the one or more route poses.
In one variant, the instructions, when executed by the processing apparatus, further cause the processing apparatus to determine attractive forces from a point on another of the one or more route poses exerted on the plurality of points of the first route pose.
In another variant, the instructions, when executed by the processing apparatus, further cause the processing apparatus to determine torque forces from a point on another of the one or more route poses exerted on the plurality of points of the first route pose.
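By way of a non-limiting illustration of the interpolation described in the variants above, the following Python sketch generates intermediate poses between two sequential (possibly repositioned) route poses. The RoutePose structure, step count, and heading-wrapping rule are assumptions made for illustration only and are not drawn from the claims.

```python
# A minimal sketch (not the patent's implementation) of linear interpolation
# between two repositioned route poses; each generated pose would carry the
# same robot footprint. All names are illustrative.
import math
from dataclasses import dataclass

@dataclass
class RoutePose:
    x: float      # position in map coordinates
    y: float
    theta: float  # heading in radians

def interpolate(a: RoutePose, b: RoutePose, n_steps: int) -> list[RoutePose]:
    """Linearly interpolate position and heading between sequential route poses."""
    poses = []
    # Wrap the heading difference into [-pi, pi] so the robot turns the short way.
    dtheta = math.atan2(math.sin(b.theta - a.theta), math.cos(b.theta - a.theta))
    for i in range(1, n_steps + 1):
        t = i / (n_steps + 1)
        poses.append(RoutePose(
            x=a.x + t * (b.x - a.x),
            y=a.y + t * (b.y - a.y),
            theta=a.theta + t * dtheta,
        ))
    return poses

# Example: five interpolated poses between two route poses.
print(interpolate(RoutePose(0.0, 0.0, 0.0), RoutePose(2.0, 1.0, math.pi / 2), 5))
```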
These and other objects, features, and characteristics of the present disclosure, as well as the methods of operation and functions of the related elements of structure and the combination of parts and economies of manufacture, will become more apparent upon consideration of the following description and the appended claims with reference to the accompanying drawings, all of which form a part of this specification, wherein like reference numerals designate corresponding parts in the various figures. It is to be expressly understood, however, that the drawings are for the purpose of illustration and description only and are not intended as a definition of the limits of the disclosure. As used in the specification and in the claims, the singular form of “a”, “an”, and “the” include plural referents unless the context clearly dictates otherwise.
BRIEF DESCRIPTION OF THE DRAWINGS
The disclosed aspects will hereinafter be described in conjunction with the appended drawings, provided to illustrate and not to limit the disclosed aspects, wherein like designations denote like elements.
FIG. 1 illustrates various side elevation views of exemplary body forms for a robot in accordance with principles of the present disclosure.
FIG. 2A is a diagram of an overhead view of a robot navigating a path in accordance with some implementations of this disclosure.
FIG. 2B illustrates an overhead view of a user demonstrating a route to a robot before the robot autonomously travels a route in an environment.
FIG. 3 is a functional block diagram of a robot in accordance with some principles of this disclosure.
FIG. 4A is a top view diagram illustrating the interaction between a robot and an obstacle in accordance with some implementations of this disclosure.
FIG. 4B is a diagram of a global layer, intermediate layer, and local layer in accordance with implementations of the present disclosure.
FIG. 4C is a process flow diagram of an exemplary method for dynamic route planning in accordance with some implementations of this disclosure.
FIG. 4D illustrates an overhead view of route poses along with repulsive forces exerted by objects in accordance with some implementations of the present disclosure.
FIG. 4E illustrates example points on a route pose in accordance with some implementations of the present disclosure.
FIG. 4F illustrates an overhead view showing attractive forces between route poses in accordance with some implementations of the present disclosure.
FIG. 5 is an overhead view of a diagram showing interpolation between route poses in accordance with some implementations of this disclosure.
FIG. 6 is a process flow diagram of an exemplary method for operation of a robot in accordance with some implementations of this disclosure.
FIG. 7 is a process flow diagram of an exemplary method for operation of a robot in accordance with some implementations of this disclosure.
All Figures disclosed herein are © Copyright 2016 Brain Corporation. All rights reserved.
DETAILED DESCRIPTION
Various aspects of the novel systems, apparatuses, and methods disclosed herein are described more fully hereinafter with reference to the accompanying drawings. This disclosure can, however, be embodied in many different forms and should not be construed as limited to any specific structure or function presented throughout this disclosure. Rather, these aspects are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art. Based on the teachings herein, one skilled in the art should appreciate that the scope of the disclosure is intended to cover any aspect of the novel systems, apparatuses, and methods disclosed herein, whether implemented independently of, or combined with, any other aspect of the disclosure. For example, an apparatus can be implemented or a method can be practiced using any number of the aspects set forth herein. In addition, the scope of the disclosure is intended to cover such an apparatus or method that is practiced using other structure, functionality, or structure and functionality in addition to or other than the various aspects of the disclosure set forth herein. It should be understood that any aspect disclosed herein can be implemented by one or more elements of a claim.
Although particular aspects are described herein, many variations and permutations of these aspects fall within the scope of the disclosure. Although some benefits and advantages of the preferred aspects are mentioned, the scope of the disclosure is not intended to be limited to particular benefits, uses, and/or objectives. The detailed description and drawings are merely illustrative of the disclosure rather than limiting, the scope of the disclosure being defined by the appended claims and equivalents thereof.
The present disclosure provides for improved systems and methods for dynamic route planning in autonomous navigation. As used herein, a robot can include mechanical or virtual entities configured to carry out complex series of actions automatically. In some cases, robots can be machines that are guided by computer programs or electronic circuitry. In some cases, robots can include electro-mechanical components that are configured for navigation, where the robot can move from one location to another. Such navigating robots can include autonomous cars, floor cleaners, rovers, drones, carts, and the like.
As referred to herein, floor cleaners can include floor cleaners that are manually controlled (e.g., driven or remote control) and/or autonomous (e.g., using little to no user control). For example, floor cleaners can include floor scrubbers that a janitor, custodian, or other person operates and/or robotic floor scrubbers that autonomously navigate and/or clean an environment. Similarly, floor cleaners can also include vacuums, steamers, buffers, mops, polishers, sweepers, burnishers, etc.
Detailed descriptions of the various implementations and variants of the system and methods of the disclosure are now provided. While many examples discussed herein are in the context of robotic floor cleaners, it will be appreciated that the described systems and methods contained herein can be used in other robots. Myriad other example implementations or uses for the technology described herein would be readily envisaged by those having ordinary skill in the art, given the contents of the present disclosure.
Advantageously, the systems and methods of this disclosure at least: (i) provide for dynamic route planning in an autonomously navigating robot; (ii) enhance efficiency in navigating environments, which can allow for improved and/or more efficient utilization of resources (e.g., energy, fuel, cleaning fluid, etc.); and (iii) provide computational efficiency, which can reduce consumption of processing power, energy, time, and/or other resources in navigating robots. Other advantages are readily discernable by one having ordinary skill given the contents of the present disclosure.
For example, many current robots that can autonomously navigate are programmed to navigate a route and/or path to a goal. In order to navigate these routes, these robots can create a path plan (e.g., a global solution). Also, these robots can have localized plans in a small area around them (e.g., on the order of a few meters), where the robot can determine how it will navigate around obstacles detected by its sensors (typically with basic commands to turn when an object is detected). The robot can then traverse the space in the pattern and avoid obstacles detected by its sensors by, e.g., stopping, slowing down, deviating left or right, etc. However, in many current applications, such traversal and avoidance can be complicated, and robots can have undesirable results (e.g., stoppages or collisions) and/or be unable to navigate through more complex situations. In some cases, such current applications can also be computationally expensive and/or slow to run, causing robots to act unnaturally.
Advantageously, using systems and methods disclosed herein, robots can deviate from their programming, following more efficient paths and/or making more complex adjustments to avoid obstacles. In some implementations described herein, such movements can be determined in a more efficient, faster way that also appears more natural as a robot plans more complex paths.
A person having ordinary skill in the art would appreciate that a robot, as referred to herein, can have a number of different appearances/forms. FIG. 1 illustrates various side elevation views of exemplary body forms for a robot in accordance with principles of the present disclosure. These are non-limiting examples meant to further illustrate the variety of body forms, but not to restrict robots described herein to any particular body form. For example, body form 100 illustrates an example where the robot is a stand-up shop vacuum. Body form 102 illustrates an example where the robot is a humanoid robot having an appearance substantially similar to a human body. Body form 104 illustrates an example where the robot is a drone having propellers. Body form 106 illustrates an example where the robot has a vehicle shape having wheels and a passenger cabin. Body form 108 illustrates an example where the robot is a rover.
Body form 110 can be an example where the robot is a motorized floor scrubber. Body form 112 can be a motorized floor scrubber having a seat, pedals, and a steering wheel, where a user can drive body form 112 like a vehicle as body form 112 cleans; however, body form 112 can also operate autonomously. Other body forms are further contemplated, including industrial machines that can be robotized, such as forklifts, tugs, boats, planes, etc.
FIG. 2A is a diagram of an overhead view of robot 202 navigating a path 206 in accordance with some implementations of this disclosure. Robot 202 can autonomously navigate through environment 200, which can comprise various objects 208, 210, 212, 218. Robot 202 can start at an initial location and end at an end location. As illustrated, the initial position and the end position are substantially the same, illustrating a substantially closed loop. However, in other cases, the initial location and the end location may not be substantially the same, forming an open loop.
By way of illustration, in some implementations, robot 202 can be a robotic floor cleaner, such as a robotic floor scrubber, vacuum cleaner, steamer, mop, burnisher, sweeper, and the like. Environment 200 can be a space having floors that are desired to be cleaned. For example, environment 200 can be a store, warehouse, office building, home, storage facility, etc. One or more of objects 208, 210, 212, 218 can be shelves, displays, objects, items, people, animals, or any other entity or thing that may be on the floor or otherwise impede the robot's ability to navigate through environment 200. Route 206 can be the cleaning path traveled by robot 202 autonomously. Route 206 can follow a path that weaves between objects 208, 210, 212, 218, as illustrated in example route 206. For example, where objects 208, 210, 212, 218 are shelves in a store, robot 202 can go along the aisles of the store and clean the floors of the aisles. However, other routes are also contemplated, such as, without limitation, weaving back and forth along open floor areas and/or any cleaning path a user could use to clean the floor (e.g., if the user is manually operating a floor cleaner). In some cases, robot 202 can go over a portion of the route a plurality of times. Accordingly, routes can overlap on themselves. Accordingly, route 206 is meant merely as an illustrative example and can appear differently than as illustrated. Also, one example of environment 200 is shown; however, it should be appreciated that environment 200 can take on any number of forms and arrangements (e.g., of any size, configuration, and layout of a room or building) and is not limited by the example illustrations of this disclosure.
In route 206, robot 202 can begin at the initial location, which can be robot 202's starting point. Robot 202 can then clean along route 206 autonomously (e.g., with little or no control from a user) until it reaches an end location, where it can stop cleaning. The end location can be designated by a user and/or determined by robot 202. In some cases, the end location can be the location in route 206 after which robot 202 has cleaned the desired area of floor. As previously described, route 206 can be a closed loop or an open loop. By way of illustrative example, an end location can be a location for storage for robot 202, such as a temporary parking spot, storage room/closet, and the like. In some cases, the end location can be the point where a user training and/or programming tasks for robot 202 stopped the training and/or programming.
In the context of floor cleaners (e.g., floor scrubbers, vacuum cleaners, etc.), robot 202 may or may not clean at every point along route 206. By way of illustration, where robot 202 is a robotic floor scrubber, the cleaning system (e.g., water flow, cleaning brushes, etc.) of robot 202 may only be operating in some portions of route 206 and not others. For example, robot 202 may associate certain actions (e.g., turning, turning on/off water, spraying water, turning on/off vacuums, moving vacuum hose positions, gesticulating an arm, raising/lowering a lift, moving a sensor, turning on/off a sensor, etc.) with particular positions and/or trajectories (e.g., while moving in a certain direction or in a particular sequence along route 206) along the demonstrated route. In the context of floor cleaners, such association may be desirable when only some areas of the floor are to be cleaned but not others and/or in some trajectories. In such cases, robot 202 can turn on a cleaning system in areas where a user demonstrated for robot 202 to clean, and turn off the cleaning system otherwise.
FIG. 2B illustrates an overhead view of a user demonstrating route 216 to robot 202 before robot 202 autonomously travels route 206 in environment 200. In demonstrating route 216, a user can start robot 202 at an initial location. Robot 202 can then weave around objects 208, 210, 212, 218. Robot 202 can stop at an end location, as previously described. In some cases (and as illustrated), autonomously navigated route 206 can be exactly the same as demonstrated route 216. In some cases, route 206 might not be precisely the same as route 216, but can be substantially similar. For example, as robot 202 navigates route 206, robot 202 uses its sensors to sense where it is in relation to its surroundings. Such sensing may be imprecise in some instances, which may cause robot 202 to not navigate the precise route that had been demonstrated and that robot 202 had been trained to follow. In some cases, small changes to environment 200, such as the moving of shelves and/or changes in the items on the shelves, can cause robot 202 to deviate from route 216 when it autonomously navigates route 206. As another example, as previously described, robot 202 can avoid objects by turning around them, slowing down, etc. when autonomously navigating route 206. These objects might not have been present (and avoided) when the user demonstrated route 216. For example, the objects may be temporary and/or transient items, and/or may be transient and/or dynamic changes to the environment 200. As another example, the user may have done a poor job demonstrating route 216. For example, the user may have crashed and/or bumped into a wall, shelf, object, obstacle, etc. As another example, an obstacle may have been present while the user demonstrated route 216, but may no longer be there when robot 202 autonomously navigates route 206. In these cases, robot 202 can store in memory (e.g., memory 302) one or more actions that it can correct, such as crashing and/or bumping into a wall, shelf, object, obstacle, etc. When robot 202 then autonomously navigates demonstrated route 216 (e.g., as route 206), robot 202 can correct such actions and not perform them (e.g., not crash and/or bump into a wall, shelf, object, obstacle, etc.) when it is autonomously navigating. In this way, robot 202 can determine not to autonomously navigate at least a portion of a navigable route, such as a demonstrated route. In some implementations, determining not to autonomously navigate at least a portion of the navigable route includes determining when to avoid an obstacle and/or object.
As previously mentioned, as a user demonstrates route 216, the user can turn on and off the cleaning system of robot 202, or perform other actions, in order to train robot 202 where (e.g., at what position), and/or along what trajectories, to clean along route 216 (and subsequently when robot 202 autonomously cleans route 206). The robot can record these actions in memory 302 and later perform them when autonomously navigating. These actions can include any actions that robot 202 may perform, such as turning, turning on/off water, spraying water, turning on/off vacuums, moving vacuum hose positions, gesticulating an arm, raising/lowering a lift, moving a sensor, turning on/off a sensor, etc.
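As a hedged illustration of how recorded actions could be associated with positions along a demonstrated route and replayed during autonomous navigation, the Python sketch below keys actions to (x, y) coordinates and returns any action recorded near the robot's current position. The action names, coordinates, and trigger radius are hypothetical and are not taken from the disclosure.

```python
# Illustrative only: position-triggered replay of actions recorded during a
# demonstration. Names and values are assumptions.
import math

# Actions recorded during demonstration, keyed by (x, y) positions on the route.
recorded_actions = {
    (2.0, 3.5): "turn_on_water",
    (10.0, 3.5): "turn_off_water",
    (12.0, 8.0): "raise_lift",
}

def actions_near(x: float, y: float, radius: float = 0.5) -> list[str]:
    """Return actions whose recorded position is within `radius` of the robot."""
    return [action for (ax, ay), action in recorded_actions.items()
            if math.hypot(ax - x, ay - y) <= radius]

print(actions_near(2.2, 3.4))  # -> ['turn_on_water']
```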
FIG. 3 is a functional block diagram of a robot 202 in accordance with some principles of this disclosure. As illustrated in FIG. 3, robot 202 can include controller 304, memory 302, user interfaces unit 308, exteroceptive sensors unit 306, proprioceptive sensors unit 310, and communications unit 312, as well as other components and subcomponents (e.g., some of which may not be illustrated). Although a specific implementation is illustrated in FIG. 3, it is appreciated that the architecture may be varied in certain implementations as would be readily apparent to one of ordinary skill given the contents of the present disclosure.
Controller 304 can control the various operations performed by robot 202. Controller 304 can include one or more processors (e.g., microprocessors) and other peripherals. As used herein, processor, microprocessor, and/or digital processor can include any type of digital processing device such as, without limitation, digital signal processors (“DSPs”), reduced instruction set computers (“RISC”), general-purpose (“CISC”) processors, microprocessors, gate arrays (e.g., field programmable gate arrays (“FPGAs”)), programmable logic devices (“PLDs”), reconfigurable computer fabrics (“RCFs”), array processors, secure microprocessors, specialized processors (e.g., neuromorphic processors), and application-specific integrated circuits (“ASICs”). Such digital processors may be contained on a single unitary integrated circuit die, or distributed across multiple components.
Controller 304 can be operatively and/or communicatively coupled to memory 302. Memory 302 can include any type of integrated circuit or other storage device configured to store digital data including, without limitation, read-only memory (“ROM”), random access memory (“RAM”), non-volatile random access memory (“NVRAM”), programmable read-only memory (“PROM”), electrically erasable programmable read-only memory (“EEPROM”), dynamic random-access memory (“DRAM”), Mobile DRAM, synchronous DRAM (“SDRAM”), double data rate SDRAM (“DDR/2 SDRAM”), extended data output (“EDO”) RAM, fast page mode RAM (“FPM”), reduced latency DRAM (“RLDRAM”), static RAM (“SRAM”), “flash” memory (e.g., NAND/NOR), memristor memory, pseudostatic RAM (“PSRAM”), etc. Memory 302 can provide instructions and data to controller 304. For example, memory 302 can be a non-transitory, computer-readable storage medium having a plurality of instructions stored thereon, the instructions being executable by a processing apparatus (e.g., controller 304) to operate robot 202. In some cases, the instructions can be configured to, when executed by the processing apparatus, cause the processing apparatus to perform the various methods, features, and/or functionality described in this disclosure. Accordingly, controller 304 can perform logical and arithmetic operations based on program instructions stored within memory 302.
In some implementations, exteroceptive sensors unit 306 can comprise systems and/or methods that can detect characteristics within and/or around robot 202. Exteroceptive sensors unit 306 can comprise a plurality and/or a combination of sensors. Exteroceptive sensors unit 306 can include sensors that are internal to robot 202 or external, and/or have components that are partially internal and/or partially external. In some cases, exteroceptive sensors unit 306 can include exteroceptive sensors such as sonar, LIDAR, radar, lasers, cameras (including video cameras, infrared cameras, 3D cameras, etc.), time of flight (“TOF”) cameras, antennas, microphones, and/or any other sensor known in the art. In some implementations, exteroceptive sensors unit 306 can collect raw measurements (e.g., currents, voltages, resistances, gate logic, etc.) and/or transformed measurements (e.g., distances, angles, detected points in obstacles, etc.). Exteroceptive sensors unit 306 can generate data based at least in part on measurements. Such data can be stored in data structures, such as matrices, arrays, etc. In some implementations, the data structure of the sensor data can be called an image.
In some implementations, proprioceptive sensors unit 310 can include sensors that can measure internal characteristics of robot 202. For example, proprioceptive sensors unit 310 can measure temperature, power levels, statuses, and/or any other characteristic of robot 202. In some cases, proprioceptive sensors unit 310 can be configured to determine the odometry of robot 202. For example, proprioceptive sensors unit 310 can comprise sensors such as accelerometers, inertial measurement units (“IMU”), odometers, gyroscopes, speedometers, cameras (e.g., using visual odometry), clocks/timers, and the like, which can determine odometry to facilitate autonomous navigation of robot 202. This odometry can include robot 202's position (e.g., where position includes the robot's location, displacement, and/or orientation, and can sometimes be interchangeable with the term pose as used herein) relative to the initial location. In some implementations, proprioceptive sensors unit 310 can collect raw measurements (e.g., currents, voltages, resistances, gate logic, etc.) and/or transformed measurements (e.g., distances, angles, detected points in obstacles, etc.). Such data can be stored in data structures, such as matrices, arrays, etc. In some implementations, the data structure of the sensor data can be called an image.
In some implementations, user interfaces unit 308 can be configured to enable a user to interact with robot 202. For example, user interfaces unit 308 can include touch panels, buttons, keypads/keyboards, ports (e.g., universal serial bus (“USB”), digital visual interface (“DVI”), Display Port, E-Sata, Firewire, PS/2, Serial, VGA, SCSI, audio ports, high-definition multimedia interface (“HDMI”), personal computer memory card international association (“PCMCIA”) ports, memory card ports (e.g., secure digital (“SD”) and miniSD), and/or ports for computer-readable media), mice, rollerballs, consoles, vibrators, audio transducers, and/or any interface for a user to input and/or receive data and/or commands, whether coupled wirelessly or through wires. User interfaces unit 308 can include a display, such as, without limitation, liquid crystal displays (“LCDs”), light-emitting diode (“LED”) displays, LED LCD displays, in-plane-switching (“IPS”) displays, cathode ray tubes, plasma displays, high definition (“HD”) panels, 4K displays, retina displays, organic LED displays, touchscreens, surfaces, canvases, and/or any displays, televisions, monitors, panels, and/or devices known in the art for visual presentation. In some implementations, user interfaces unit 308 can be positioned on the body of robot 202. In some implementations, user interfaces unit 308 can be positioned away from the body of robot 202, but can be communicatively coupled to robot 202 (e.g., via communication units including transmitters, receivers, and/or transceivers) directly or indirectly (e.g., through a network, server, and/or a cloud).
In some implementations, communications unit 312 can include one or more receivers, transmitters, and/or transceivers. Communications unit 312 can be configured to send/receive a transmission protocol, such as BLUETOOTH®, ZIGBEE®, Wi-Fi, induction wireless data transmission, radio frequencies, radio transmission, radio-frequency identification (“RFID”), near-field communication (“NFC”), infrared, network interfaces, cellular technologies such as 3G (3GPP/3GPP2), high-speed downlink packet access (“HSDPA”), high-speed uplink packet access (“HSUPA”), time division multiple access (“TDMA”), code division multiple access (“CDMA”) (e.g., IS-95A, wideband code division multiple access (“WCDMA”), etc.), frequency hopping spread spectrum (“FHSS”), direct sequence spread spectrum (“DSSS”), global system for mobile communication (“GSM”), Personal Area Network (“PAN”) (e.g., PAN/802.15), worldwide interoperability for microwave access (“WiMAX”), 802.20, long term evolution (“LTE”) (e.g., LTE/LTE-A), time division LTE (“TD-LTE”), narrowband/frequency-division multiple access (“FDMA”), orthogonal frequency-division multiplexing (“OFDM”), analog cellular, cellular digital packet data (“CDPD”), satellite systems, millimeter wave or microwave systems, acoustic, infrared (e.g., infrared data association (“IrDA”)), and/or any other form of wireless data transmission.
As used herein, network interfaces can include any signal, data, or software interface with a component, network, or process including, without limitation, those of the FireWire (e.g., FW400, FW800, FWS800T, FWS1600, FWS3200, etc.), universal serial bus (“USB”) (e.g., USB 1.X, USB 2.0, USB 3.0, USB Type-C, etc.), Ethernet (e.g., 10/100, 10/100/1000 (Gigabit Ethernet), 10-Gig-E, etc.), multimedia over coax alliance technology (“MoCA”), Coaxsys (e.g., TVNET™), radio frequency tuner (e.g., in-band or OOB, cable modem, etc.), Wi-Fi (802.11), WiMAX (e.g., WiMAX (802.16)), PAN (e.g., PAN/802.15), cellular (e.g., 3G, LTE/LTE-A/TD-LTE/TD-LTE, GSM, etc.), IrDA families, etc. As used herein, Wi-Fi can include one or more of IEEE-Std. 802.11, variants of IEEE-Std. 802.11, standards related to IEEE-Std. 802.11 (e.g., 802.11 a/b/g/n/ac/ad/af/ah/ai/aj/aq/ax/ay), and/or other wireless standards.
Communications unit 312 can also be configured to send/receive a transmission protocol over wired connections, such as any cable that has a signal line and ground. For example, such cables can include Ethernet cables, coaxial cables, Universal Serial Bus (“USB”), FireWire, and/or any connection known in the art. Such protocols can be used by communications unit 312 to communicate to external systems, such as computers, smart phones, tablets, data capture systems, mobile telecommunications networks, clouds, servers, or the like. Communications unit 312 can be configured to send and receive signals comprising numbers, letters, alphanumeric characters, and/or symbols. In some cases, signals can be encrypted, using algorithms such as 128-bit or 256-bit keys and/or other encryption algorithms complying with standards such as the Advanced Encryption Standard (“AES”), RSA, Data Encryption Standard (“DES”), Triple DES, and the like. Communications unit 312 can be configured to send and receive statuses, commands, and other data/information. For example, communications unit 312 can communicate with a user operator to allow the user to control robot 202. Communications unit 312 can communicate with a server/network in order to allow robot 202 to send data, statuses, commands, and other communications to the server. The server can also be communicatively coupled to computer(s) and/or device(s) that can be used to monitor and/or control robot 202 remotely. Communications unit 312 can also receive updates (e.g., firmware or data updates), data, statuses, commands, and other communications from a server for robot 202.
In some implementations, one or more of the components and/or subcomponents can be instantiated remotely from robot 202. For example, mapping and localization units 262 may be located in a cloud and/or connected to robot 202 through communications unit 312. Connections can be direct and/or through a server and/or network. Accordingly, implementations of the functionality of this disclosure should also be understood to include remote interactions where data can be transferred using communications unit 312, and one or more portions of processes can be completed remotely.
FIG. 4A is a top view diagram illustrating the interaction between robot 202 and an obstacle 402 in accordance with some implementations of this disclosure. In navigating route 216, robot 202 can encounter obstacle 402. Obstacle 402 can impede the path of robot 202, which is illustrated as route portion 404. If robot 202 were to continue following route portion 404, it may collide with obstacle 402. However, in some circumstances, using exteroceptive sensors unit 306 and/or proprioceptive sensors unit 310, robot 202 can stop before colliding with obstacle 402.
This interaction with obstacle 402 illustrates advantages of implementations in accordance with the present disclosure. FIG. 4B is a diagram of global layer 406, intermediate layer 408, and local layer 410 in accordance with implementations of the present disclosure. Global layer 406, intermediate layer 408, and local layer 410 can be hardware and/or software layers instantiated in one or more of memory 302 and/or controller 304. Global layer 406 can include software and/or hardware that implements global mapping and routing. For example, the high-level mapping can include a map of environment 200. The map can also include a representation of route 216, allowing robot 202 to navigate the space in environment 200.
In some implementations, global layer 406 can include a global planner. In this way, global layer 406 can determine one or more of: the location of robot 202 (e.g., in global coordinates such as two-dimensional coordinates, three-dimensional coordinates, four-dimensional coordinates, etc.); the path robot 202 should take to reach its goal; and/or higher-level (e.g., long-range) planning. In this way, robot 202 can determine its general path and/or direction to travel from one location to another.
Local layer 410 includes software and/or hardware that implements local planning. For example, local layer 410 can include short-range planning configured for maneuvering in local constraints of motion. Local layer 410 can process data received from exteroceptive sensors unit 306 and determine the presence and/or positioning of obstacles and/or objects near robot 202. For example, if an object is within range of a sensor of exteroceptive sensors unit 306 (e.g., a LIDAR, sonar, camera, etc.), robot 202 can detect the object. Local layer 410 can compute and/or control motor functionality to navigate around objects, such as by controlling actuators to turn, move forward, reverse, etc. In some cases, processing in local layer 410 can be computationally intensive. For example, local layer 410 can receive data from sensors of exteroceptive sensors unit 306 and/or proprioceptive sensors unit 310. Local layer 410 can then determine motor functions to avoid an object detected by exteroceptive sensors unit 306 (e.g., using a motor to turn a steering column left and right, and/or using a motor to push the robot forward). The interplay of local layer 410 and global layer 406 can allow robot 202 to make local adjustments while still moving generally along a route to its goal.
However, in some circumstances, it can be desirable to make adjustments at a finer level than what would be computed by global layer 406, yet not at the computationally intensive level of precise motor functions of local layer 410. Accordingly, intermediate layer 408 can include hardware and/or software that can determine intermediate adjustments of robot 202 as it navigates around objects.
In intermediate layer 408, robot 202 can plan how to avoid objects and/or obstacles in its environment. In some cases, intermediate layer 408 can be initialized with at least a partial path and/or route from a global path planner of global layer 406.
Because objects (e.g., obstacles, walls, etc.) present things with which robot 202 could collide, objects and/or obstacles can put forth a repulsive force on robot 202. In some cases, by objects repulsing robot 202, robot 202 can navigate along a collision-free path around those objects and/or obstacles.
FIG. 4C is a process flow diagram of an exemplary method 450 for dynamic route planning in accordance with some implementations of this disclosure. In some implementations, method 450 can be performed by intermediate layer 408 and/or by controller 304. Block 452 can include obtaining a route containing one or more route poses. In some cases, this route can be created by robot 202 and/or uploaded onto robot 202. In some cases, the route can be passed from global layer 406 to intermediate layer 408. Block 454 can include selecting a first route pose. Block 456 can include, for the first route pose, determining repulsive forces from objects in the environment. Block 458 can include, for the first route pose, determining attractive forces from other route poses. Block 460 can include determining the translation and/or rotation of the first route pose due to the repulsive forces and attractive forces. Block 462 can include performing interpolation to account for the translated and/or rotated route pose. This process and others will be illustrated throughout this disclosure.
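The following Python sketch traces the overall flow of method 450 (blocks 452-462) under strong simplifying assumptions: each route pose is reduced to a single point, repulsion decays with distance to detected object points, attraction pulls a pose toward its sequential neighbors, and footprint points, torque, and the final interpolation step are omitted. The constants, threshold, and function names are illustrative and are not drawn from the patent.

```python
# Illustrative only: a point-mass approximation of the force-based repositioning loop.
import math

def unit(dx, dy):
    """Unit vector in the direction (dx, dy); avoids division by zero."""
    d = math.hypot(dx, dy) or 1e-9
    return dx / d, dy / d

def reposition_route(route, obstacles, k_rep=0.5, k_att=0.2, step=0.3, iters=10):
    """route: list of [x, y] route-pose positions; obstacles: list of (x, y) detected points."""
    route = [list(p) for p in route]
    for _ in range(iters):
        for i in range(1, len(route) - 1):            # keep the endpoints fixed
            px, py = route[i]
            fx = fy = 0.0
            for ox, oy in obstacles:                   # repulsive forces (block 456)
                d = math.hypot(px - ox, py - oy)
                if d < 2.0:                            # assumed repulsive distance threshold
                    ux, uy = unit(px - ox, py - oy)
                    fx += k_rep * ux / max(d, 1e-3)
                    fy += k_rep * uy / max(d, 1e-3)
            for j in (i - 1, i + 1):                   # attractive forces from neighbors (block 458)
                fx += k_att * (route[j][0] - px)
                fy += k_att * (route[j][1] - py)
            route[i][0] += step * fx                   # reposition the pose (block 460)
            route[i][1] += step * fy
    return route                                       # interpolation (block 462) would follow

# Example: a straight route deforms away from a detected obstacle point near its middle.
print(reposition_route([[0, 0], [1, 0], [2, 0], [3, 0], [4, 0]], [(2.0, 0.3)]))
```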
By way of illustration, FIG. 4D illustrates route poses 414 and 416 along with repulsive forces exerted by objects in accordance with some implementations of the present disclosure. For example, the points on a route can be discretized locations along the path, such as route poses, illustrating the pose of robot 202 throughout its route. In some cases, such discretized locations can also have associated probabilities, such as particles or bubbles. Route poses can identify the position and/or orientation that robot 202 would travel on the route. In a planar application, the route pose can include (x, y, θ) coordinates. In some cases, θ can be the heading of the robot in the plane. The route poses can be regularly or irregularly spaced on robot 202's route. In some cases, the intermediate layer can obtain the route containing one or more route poses from global layer 406, as described in block 452 of method 450. In some implementations, route poses can form a sequence, wherein robot 202 travels between sequential route poses on a route. For example, route poses 414 and 416 could be a sequence of route poses where robot 202 travels to route pose 414 and then to route pose 416.
By way of illustrative example, route poses 414 and 416 illustrate discretized locations along route portion 404. This illustrative example shows route poses 414 and 416 as shaped like robot 202, with substantially similar footprints. The footprints of route poses 414 and 416 can be adjusted in size depending on how conservative one desires to be with respect to robot collisions. A smaller footprint can present higher likelihoods of a collision, but such a smaller footprint can allow robot 202 to clear more areas that it should be able to clear as it autonomously navigates. A larger footprint might decrease the likelihood of a collision, but robot 202 would not autonomously go through some places that it otherwise should be able to. The footprint can be predetermined by a footprint parameter that sets the size (e.g., scales) of the footprint of robot 202, as illustrated in route poses (e.g., route poses 414 and 416). In some cases, there can be a plurality of footprint parameters that control the sizes of route poses of robot 202 asymmetrically.
In FIG. 4D, while route poses 414 and 416 are illustrated and described, it should be appreciated by someone having ordinary skill in the art that there can be any number of route poses throughout a route, and the descriptions of the implementations of this disclosure can be applied to those route poses. Advantageously, having route poses 414 and 416 shaped like robot 202 (e.g., a footprint of robot 202) can allow robot 202 to determine places in which robot 202 can fit while travelling. The footprint parameter(s) can be used to adjust how robot 202 projects itself. For example, a larger footprint used in route poses 414 and/or 416 can be more conservative in that it can cause, at least in part, robot 202 to travel further away from objects. In contrast, a smaller footprint can cause, at least in part, robot 202 to travel closer to objects. Route poses (e.g., route poses 414 and 416) can be of different sizes from one another. By way of illustration, it may be desirable for robot 202 to be more conservative in certain scenarios, such as on turns. Accordingly, in this illustration, the footprint of route poses on turns can be larger than the footprint of route poses on straightaways. Such dynamic reshaping of route poses can be performed by making the size of the route poses dependent on the rotation of the route pose relative to other route poses, or on the changes in translation and/or rotation of the route pose. One or more of the route poses on a route (e.g., route poses 414 and/or 416) can also have a shape other than the shape of robot 202. For example, the route poses can be circular, square, triangular, and/or any other shape.
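As one possible illustration of a route pose carrying a scalable, robot-shaped footprint, the sketch below represents the footprint as a rectangle whose size is set by a footprint parameter. The dimensions, the single uniform scale factor, and the corner computation are assumptions for illustration; the disclosure permits asymmetric and non-rectangular footprints as well.

```python
# Illustrative only: a route pose with a rectangular footprint scaled by a footprint parameter.
import math
from dataclasses import dataclass

@dataclass
class FootprintPose:
    x: float
    y: float
    theta: float
    length: float = 1.2   # assumed robot length (m)
    width: float = 0.8    # assumed robot width (m)
    scale: float = 1.0    # footprint parameter: >1 is more conservative

    def corners(self):
        """Corners of the scaled, rotated footprint in map coordinates."""
        hl, hw = 0.5 * self.length * self.scale, 0.5 * self.width * self.scale
        c, s = math.cos(self.theta), math.sin(self.theta)
        return [(self.x + c * dx - s * dy, self.y + s * dx + c * dy)
                for dx, dy in [(hl, hw), (hl, -hw), (-hl, -hw), (-hl, hw)]]

# Example: enlarge the footprint on a turn to keep the planned path more conservative.
turn_pose = FootprintPose(x=5.0, y=2.0, theta=math.pi / 4, scale=1.3)
print(turn_pose.corners())
```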
As described in block 454 of method 450, one can consider either route pose 414 or 416 as a first route pose. However, for purposes of illustration, and to illustrate the breadth of the described implementations of this disclosure, route poses 414 and 416 will be described together.
Points along objects (e.g., points determined by mapping, detected by sensors of exteroceptive sensors unit 306, etc.) can exert a repulsive force on route poses of robot 202 (e.g., route poses 414 and 416). In this way, the objects can, conceptually, prevent robot 202 from colliding into them. In some cases, these points can represent at least in part poses and/or sets of poses. For example, arrows 412 illustrate repulsive forces from points along object 210.
In some implementations, the forces exerted by points on objects may be uniform in that each point on route poses 414 and 416 can have substantially similar forces exerted on it. However, in other implementations, the forces exerted by points of objects on route poses 414 and 416 may not be uniform and may vary based on a force function.
By way of illustration, a force function (e.g., a repulsive force function) can in some cases determine at least in part the repulsive force exerted on a point on route pose 414 or 416 by an object. For example, the force functions can be used in block 456 of method 450 to determine the repulsive forces from objects in the environment for a first route pose (e.g., a first route pose of route poses 414 and 416). In some implementations, the force function can be dependent on characteristics of where an object appears relative to route poses 414 and 416. The force function can then represent the force experienced by points on route poses 414 and 416 (e.g., one or more points on the surface of route poses 414 and 416, the center of route poses 414 and 416, the center of mass of route poses 414 and 416, and/or any point of and/or around route poses 414 and 416). Because the forces can be dependent on their directions and magnitudes, repulsive forces (and/or attractive forces) can be vectors. In some cases, repulsive forces can exert rotational forces on a route pose, which can manifest as torque forces.
For example, repulsion forces and torque forces can be calculated at n different poses along a path. In some cases, these n different poses can be associated with route poses. Each pose can consist of m points in a footprint. In some cases, these m points can be points on the route poses.
In some cases, a plurality of points can define the body of robot 202 as reflected in route poses 414 and 416, providing representative coverage over a portion of the body of robot 202 and/or substantially all of robot 202. For example, 15-20 points can be distributed throughout the surface and/or interior of robot 202 and be reflected in route poses 414 and 416. However, in some cases, there can be fewer points. FIG. 4E illustrates example points on route pose 414, such as point 418. Each point can experience, at least in part, the forces (e.g., repulsive forces) placed on it by objects in the surroundings of route pose 414.
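A minimal sketch of distributing a small set of points over a rectangular footprint (on the order of the 15-20 points mentioned above) might look like the following; the grid layout and footprint dimensions are assumptions, as the disclosure does not prescribe how the points are arranged.

```python
# Illustrative only: sample a grid of points over a length x width footprint.
def footprint_points(length=1.2, width=0.8, nx=5, ny=4):
    """Return nx*ny points covering the footprint, centered at the pose origin."""
    points = []
    for i in range(nx):
        for j in range(ny):
            px = -length / 2 + length * (i + 0.5) / nx
            py = -width / 2 + width * (j + 0.5) / ny
            points.append((px, py))
    return points

print(len(footprint_points()), footprint_points()[:3])  # 20 points in total
```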
Advantageously, by having a plurality of points on the body of route poses 414 and 416 that can experience forces, points of route poses 414 and 416 can translate and/or rotate relative to one another, causing, at least in part, repositioning (e.g., translation and/or rotation) of route poses 414 and 416. These translations and/or rotations of route poses 414 and 416 can cause deformations of the route navigated by robot 202.
Torsion forces can occur when different points on a route pose experience different magnitudes and directions of forces. Accordingly, the torsion force can cause the route poses to rotate. In some cases, predetermined parameters can define at least in part the torsion experienced by route poses 414 and 416. For example, a predetermined torsion parameter can include a multiplier for the rotational forces experienced at a point on route pose 414 or 416. This predetermined torsion parameter can be indicative of force due to misalignment of route pose 414 or 416 with the path. In some cases, the predetermined torsion parameter may vary based on whether the force is repulsive or cohesive.
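For illustration only, the net rotational effect of per-point forces on a route pose can be sketched as the sum of two-dimensional cross products of lever arms and forces, scaled by a torsion parameter. The force values and the parameter below are assumptions, not values from the disclosure.

```python
# Illustrative only: net planar torque on a route pose from per-point forces.
def net_torque(points, forces, center=(0.0, 0.0), torsion_param=1.0):
    """2D torque about `center`: sum of cross products of lever arms and forces."""
    tau = 0.0
    for (px, py), (fx, fy) in zip(points, forces):
        rx, ry = px - center[0], py - center[1]
        tau += rx * fy - ry * fx      # z-component of r x F
    return torsion_param * tau

# Example: equal and opposite forces on the front and back of the pose rotate it.
print(net_torque([(0.5, 0.0), (-0.5, 0.0)], [(0.0, 1.0), (0.0, -1.0)]))  # -> 1.0
```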
Returning to FIG. 4D, a characteristic on which the force function depends in part can be a position of a point on an object relative to route poses 414 and 416. Distance can be determined based at least in part on sensors of exteroceptive sensors unit 306. As a first example, the repulsive force exerted onto route poses 414 and 416 from a point on an object exterior to robot 202 (e.g., not within the footprint of route poses 414 and 416, such as points on obstacles 210 and 212 as illustrated) can be characterized at least in part by the function r(d) ∝ 1/d, where r is the repulsion of a point on an object and d is the distance between the point on the object and a point on route pose 414 or route pose 416. In this way, the repulsion of a point on an object is inversely proportional to the distance between the point on the object and the point on route pose 414 or route pose 416. Advantageously, such a function allows objects close to route poses 414 and 416 to exert more repulsion, and thereby potentially more strongly influence the course of robot 202 to avoid a collision, than objects further away.
In some cases, a predetermined repulsive distance threshold can be imposed on the distance between a point on route pose 414 or route pose 416 and a point on an object. This predetermined repulsive distance threshold can be indicative at least in part of the maximum distance between a point on either route pose 414 or route pose 416 and a point on an object at which the point on the object can exert a repulsive force (and/or a torsion force) on points on either of route poses 414 and 416. Accordingly, when a point on an object is at a distance (e.g., from a point on either route pose 414 or route pose 416) that is above the threshold (or equal to and/or above, depending on the definition of the threshold), the repulsive force and/or torsion force can be zero or substantially zero. Advantageously, having a predetermined repulsive distance threshold can, in some cases, prevent some points on objects from exerting forces on points on route poses 414 and 416. In this way, when there is a predetermined repulsive distance threshold, robot 202 can get closer to certain objects and/or not be influenced by objects that are further away.
As a second example, a repulsive force can also be exerted onto route poses 414 and 416 from a point in the interior of route poses 414 and 416 (e.g., within the footprint of route poses 414 and 416). For example, object 402 has portion 420 that appears interior to route pose 416. In these cases, a different force function can be exerted by points of object 402 in portion 420 onto points of route pose 416 in portion 420. In some implementations, this force can be characterized at least in part by the function r(d) ∝ d, where the variables are as described above. Advantageously, by having a different force function defined for interior objects, route pose 416 can move asymmetrically, causing rotations.
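The two repulsion regimes described above (r(d) ∝ 1/d for object points outside the footprint, with a cutoff at the predetermined repulsive distance threshold, and r(d) ∝ d for object points inside the footprint) can be sketched as follows. The proportionality constants and threshold value are illustrative assumptions.

```python
# Illustrative only: repulsion magnitude for exterior vs. interior object points.
def repulsion_magnitude(d, inside_footprint, k_out=1.0, k_in=1.0, d_threshold=2.0):
    """d: distance between an object point and a route-pose point (meters)."""
    if inside_footprint:
        return k_in * d                      # interior: r(d) proportional to d
    if d >= d_threshold:
        return 0.0                           # beyond the threshold: no repulsion
    return k_out / max(d, 1e-6)              # exterior: r(d) proportional to 1/d

print(repulsion_magnitude(0.5, inside_footprint=False))  # 2.0
print(repulsion_magnitude(3.0, inside_footprint=False))  # 0.0
print(repulsion_magnitude(0.2, inside_footprint=True))   # 0.2
```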
In some implementations, the force function can also depend on other characteristics of objects, such as shape, material, color, and/or any other characteristic of the object. These characteristics can be determined by one or more sensors of exteroceptive sensors unit 306 in accordance with known methods in the art. Advantageously, taking into account characteristics can be further informative of how robot 202 should navigate around objects. In some instances, the cost map can be used to compute additional repulsion values based on these characteristics.
For example, the shape of an object can be indicative at least in part of an associated repercussion of collision. By way of illustration, a humanoid shape may be indicative of a person. As such, an object detected with this shape can place a greater repulsive force on route poses 414 and 416 in order to push the path further away from the humanoid shape. As another example, the shape of an object can be indicative in part of increased damage (e.g., to the object or robot 202) if a collision occurred. By way of illustration, pointed objects, skinny objects, irregular objects, predetermined shapes (e.g., vase, lamp, display, etc.), and/or any other shape can be indicative at least in part of resulting in increased damage. Size may be another characteristic of shape that can be taken into account. For example, smaller objects may be more fragile in the event of a collision, but larger objects could cause more damage to robot 202. In the case of size, force functions can take into account the size of the objects so that the points on those objects repulse points on route poses 414 and 416 proportionally as desired. By way of illustration, suppose route pose 414 is between a larger object and a smaller object. If points of the larger object have a relatively larger repulsive force as defined at least in part by the force function, route pose 414 will be pushed relatively closer to the smaller object. If points of the smaller object have a relatively larger repulsive force as defined at least in part by the force function, route pose 414 will be pushed relatively closer to the larger object. Accordingly, the repulsive force on route poses 414 and 416 can be adjusted based at least in part on the shape. The shape can be detected at least in part by sensors of exteroceptive sensors unit 306. As another illustrative example, walls can be identified in a cost map, and a repulsive force can be associated with walls due to their size and shape.
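One plausible (purely illustrative) way to fold such characteristics into the force function is as multiplicative weights on the base repulsion; the object classes and weights below are assumptions, not values from the disclosure:

```python
# Hypothetical repulsion multipliers keyed by a detected object class.
SHAPE_MULTIPLIERS = {
    "person": 5.0,      # humanoid shapes push the path furthest away
    "fragile": 3.0,     # vases, lamps, displays, glassware
    "wall": 2.0,
    "generic": 1.0,
}

def weighted_repulsion(base_repulsion, object_class, object_size_m=0.5):
    """Scale a base repulsion value by object class and size (both assumptions)."""
    multiplier = SHAPE_MULTIPLIERS.get(object_class, SHAPE_MULTIPLIERS["generic"])
    return base_repulsion * multiplier * (1.0 + object_size_m)
```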
In some implementations, the force function can also depend on the material of the objects. For example, certain materials can be indicative at least in part of more damage if a collision occurred. By way of illustration, glass, porcelain, mirrors, and/or other fragile material can prove to be more damaging in the event of a collision. In some cases, such as in the case of mirrors, the material can sometimes cause errors in the sensors of exteroceptive sensor units 306. Accordingly, in some cases, it may be desirable for robot 202 to navigate further away from such objects, which can be reflected in the force function (e.g., increasing the repulsion force exerted by points on objects of some materials versus other materials).
In some implementations, color can be detected by sensors of exteroceptive sensor units 306. The force function can be dependent at least in part on the color of an object and/or points on an object. For example, certain objects in an environment may be a certain color (e.g., red, yellow, etc.) to indicate at least in part that robot 202 (or in some cases people) should be cautious of those objects. Accordingly, in some cases, it may be desirable for robot 202 to navigate further away from such objects, which can be reflected in the force function.
In some implementations, the force function can be dependent on other factors, such as the location of an object. For example, certain areas of a map (e.g., as passed from global layer 406) can have characteristics. By way of illustration, some areas of the map (e.g., a cost map) can be areas in which robot 202 should not pass. There can also be places where robot 202 cannot go because they are not accessible (such as into an object). Accordingly, in some cases, the force function can be adjusted to account for such places. In some implementations, the force function can cause points in those places to exert no force (or substantially no force) on points on route poses 414 and 416. Advantageously, exerting no force can be reflective of regions where robot 202 would not go (e.g., inside objects and the like). In contrast, in some implementations, such places can be treated as obstacles, exerting a repulsive force on route poses 414 and 416. Advantageously, having such a repulsive force can keep robot 202 from attempting to enter such areas.
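As a small sketch of the two conventions above, an inaccessible cost-map cell might either be masked out or treated as an obstacle; the NO_GO value and the flag name are assumptions for illustration:

```python
NO_GO = 255  # hypothetical cost-map value marking an inaccessible cell

def no_go_force(cell_cost, base_repulsion, treat_no_go_as_obstacle=False):
    """Force contribution of a cost-map cell the robot cannot enter."""
    if cell_cost != NO_GO:
        return base_repulsion
    # Inaccessible regions either exert no force or repel like obstacles.
    return base_repulsion if treat_no_go_as_obstacle else 0.0
```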
In some implementations, not all forces on route poses 414 and 416 are repulsive. For example, points on route poses (e.g., route poses 414 and 416) can exert attractive (e.g., cohesive) forces, which can, at least in part, pull route poses towards each other. FIG. 4F illustrates attractive forces between route poses 414 and 416 in accordance with some implementations of the present disclosure. The arrows are indicative at least in part that route poses are drawn towards each other along route portion 404. Advantageously, the cohesive force between route poses can cause, at least in part, robot 202 to follow a path substantially similar to the path planned by global layer 406 (e.g., a route substantially similar to an original route, such as an originally demonstrated route that robot 202 should follow in the absence of objects around which to navigate).
The cohesive force can be set by a force function (e.g., a cohesive force function), which can be dependent on characteristics of the path, such as the spacing distance between route poses/particles, the smoothness of the path, how desirable it is for robot 202 to follow the path, etc. In some cases, the cohesive force function can be based at least in part on a predetermined cohesion multiplier, which can determine at least in part the force pulling route poses together. A lower predetermined cohesion multiplier can reduce the cohesive strength of route portion 404 (e.g., the draw of route poses towards it) and, in some cases, may cause a loss in smoothness of the path travelled by robot 202. In some cases, only sequential route poses exert cohesive forces on the points of one another. In other cases, all route poses exert cohesive forces on one another. In still other cases, some route poses exert cohesive forces on others. The determination of which route poses are configured to exert cohesive forces on one another can depend on a number of factors, which may vary on a case-by-case basis. For example, if a route is circular, it may be desirable for all route poses to exert cohesive forces on one another to tighten the circle. As another example, if the route is complex, then it may be desirable for certain complex paths to only have sequential route poses exert cohesive forces on one another. This limitation may allow robot 202 to make more turns and/or have more predictable results because other positioned route poses will not unduly influence it. Routes between the aforementioned examples in complexity may have some of the route poses exert cohesive forces on one another. As another example, the number of route poses may also be a factor. Having a lot of route poses on a route may cause unexpected results if all of them exert cohesive forces on one another. If there are fewer route poses, this might not be a problem, and all or some of the route poses can exert forces. In some cases, there can be a predetermined cohesive force distance threshold, where if a point on a first route pose is at a distance that is more than the predetermined cohesive force distance threshold (or more than or equal to, depending on how it is defined) from a point on a second route pose, the cohesive force can be zero or substantially zero.
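A minimal sketch of such a cohesive force between points on two route poses, assuming a predetermined cohesion multiplier and a cohesive force distance threshold (the parameter names and defaults are illustrative only), might look like this:

```python
import numpy as np

def cohesive_force(point_a, point_b, cohesion_multiplier=0.5,
                   cohesive_distance_threshold=5.0):
    """Attractive force pulling a point on one route pose toward a point on another route pose."""
    delta = np.asarray(point_b, dtype=float) - np.asarray(point_a, dtype=float)
    distance = float(np.linalg.norm(delta))
    if distance == 0.0 or distance > cohesive_distance_threshold:
        return np.zeros_like(delta)       # no pull beyond the threshold (or for coincident points)
    direction = delta / distance
    return cohesion_multiplier * distance * direction  # stronger pull the further apart the poses drift
```

Restricting which pairs of route poses this is evaluated for (sequential only, all pairs, or some subset) would correspond to the alternatives discussed above.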
In some implementations, the cohesive force function and the repulsive force function can be the same force function. In other implementations, the cohesive force function and the repulsive force function are separate. The cohesive force function can be used to determine the attractive forces from other route poses in accordance with block 458 from method 450. In some implementations, both the cohesive forces and repulsive forces can result in torsion (e.g., causing rotation) of a route pose.
As described with reference to intermediate layer 408, route poses 414 and 416 can experience different attractive and repulsive forces. In some implementations, the forces can be stored in arrays. For example, there can be an array of forces indicative of repulsion, torsion, cohesion, etc.
In some cases, forces can be toggled, such as by using an on/off parameter that can turn on or off any individual force and/or group of forces from a point. For example, the on/off parameter can be binary wherein one value turns the force on and another turns the force off. In this way, some forces can be turned off, such as based on the distance an object is from a route pose, whether a point is in the interior of an object or no go zone, distance between route poses, etc.
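As a minimal illustration of such a toggle, each force contribution could simply be gated by a boolean on/off parameter; the dictionary and names below are assumptions for illustration:

```python
# Hypothetical per-force on/off parameters; setting a flag to False zeroes that contribution.
FORCE_TOGGLES = {"repulsion": True, "torsion": True, "cohesion": True}

def total_force(forces_by_type):
    """Sum only the force contributions whose toggle is on."""
    return sum(value for name, value in forces_by_type.items()
               if FORCE_TOGGLES.get(name, True))

# Example: total_force({"repulsion": 1.2, "cohesion": 0.4, "torsion": 0.0})
```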
On balance, the net forces on route poses 414 and 416 can reposition one or more of route poses 414 and 416. For example, route poses 414 and 416 can be displaced. Route poses 414 and 416 can displace (e.g., translate and/or rotate) until their net forces, in any direction, are substantially zero and/or minimized. In this way, route poses 414 and 416 can be displaced to locations indicative at least in part of an adjusted route for robot 202 to travel to avoid objects (e.g., obstacle 402). The translation and/or rotation of a route pose due to the repulsive forces and attractive forces can be determined in accordance with block 460 of method 450.
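One rough sketch of such a relaxation, assuming hypothetical callbacks that return the net force and net torque on a pose, is given below; the step sizes, tolerances, and names are assumptions rather than values from the disclosure:

```python
import numpy as np

def relax_pose(position, heading, net_force_fn, net_torque_fn,
               step=0.05, ang_step=0.02, tol=1e-3, max_iters=500):
    """Translate and rotate one route pose until its net force and net torque are near zero."""
    position = np.asarray(position, dtype=float)
    for _ in range(max_iters):
        force = np.asarray(net_force_fn(position, heading), dtype=float)  # net (x, y) force on the pose
        torque = float(net_torque_fn(position, heading))                  # net planar torque on the pose
        if np.linalg.norm(force) < tol and abs(torque) < tol:
            break                                   # forces balance: the pose has settled
        position = position + step * force          # displace along the net force
        heading = heading + ang_step * torque       # rotate with the net torque
    return position, heading
```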
There can be different adjustments made to determining the displacement of route poses 414 and 416. For example, in some cases, instead of considering all forces on route poses 414 and 416, only attractive forces may be considered. Advantageously, such a system can allow robot 202 to stick to static paths. Based at least in part on the displacement of route poses 414 and 416, robot 202 can set a new path for the route planner. In the new path, the trajectory can be representative of a point on robot 202, such as the center of robot 202, as robot 202 travels the path.
After robot 202 determines the displacement of route poses 414 and 416, robot 202 can determine a path to travel. For example, based on the positions (e.g., locations and/or orientations) of route poses 414 and 416, robot 202 can determine the path to navigate to and/or between route poses 414 and 416, and/or any other route poses, from its present location. In some cases, robot 202 will travel between consecutive (e.g., sequential) route poses in order, defining at least in part a path. For example, this determination can be based at least in part on an interpolation between route poses, taking into account the path robot 202 can travel between those points. In many cases, linear interpolation can be used. By performing interpolation, robot 202 can account for the translated and/or rotated route pose in accordance with block 462 in method 450.
FIG. 5 is an overhead view of a diagram showing interpolation between route poses 414 and 416 in accordance with some implementations of this disclosure. Based on forces placed on route poses 414 and 416, as described herein, route poses 414 and 416 have displaced. As illustrated, route pose 414 has both translated and rotated. The translation can be measured in standard units, such as inches, feet, meters, or any other unit of measurement (e.g., measurements in the metric, US, or other system of measurement) and/or relative/non-absolute units, such as ticks, pixels, percentage of range of a sensor, and the like. Rotation can be measured in degrees, radians, etc. Similarly, route pose 416 has also been translated and/or rotated. Notably, both route poses 414 and 416 clear obstacle 402. Since route poses 414 and 416 represent discretized locations along a path travelled by robot 202, robot 202 can interpolate between them to determine the path it should take. Interpolated poses 502A-502D illustrate a path travelled between route poses 414 and 416. Notably, robot 202 may also interpolate other paths (not illustrated) to move to route poses and/or between route poses.
Interpolated poses 502A-502D can have associated footprints substantially similar to the footprints of one or more of route poses 414 and 416. In some cases, as illustrated in FIG. 5, interpolated poses 502A-502D can be interpolated route poses. Accordingly, interpolated poses 502A-502D can represent the position and/or orientation that robot 202 would have along a route. Advantageously, this can allow the interpolated path to guide robot 202 to places where robot 202 would fit. Moreover, interpolated poses 502A-502D can be determined such that there is no overlap between the footprint of any one of interpolated poses 502A-502D and an object (e.g., obstacle 402, object 210, or object 212), thereby avoiding collisions.
Interpolated poses 502A-502D can also be determined taking into account the rotation and/or translation needed to get from route pose 414 to route pose 416. For example, robot 202 can determine the pose of route pose 414 and the pose of route pose 416. Robot 202 can then find the difference between the poses of route pose 414 and route pose 416, and then determine how to get from the pose of route pose 414 to the pose of route pose 416. For example, robot 202 can distribute the rotation and translation between interpolated poses 502A-502D such that robot 202 would rotate and translate from route pose 414 to route pose 416. In some cases, robot 202 can distribute the rotation and translation substantially equally between interpolated poses 502A-502D. For example, if there are N interpolation positions, robot 202 can divide the difference in location and rotation of the poses of route poses 414 and 416 substantially evenly across those N interpolation positions. Alternatively, robot 202 can divide the difference in location and/or rotation of the poses of route poses 414 and 416 unevenly across those N interpolation positions. Advantageously, even division can allow robot 202 to travel smoothly from route pose 414 to route pose 416. However, uneven division can allow robot 202 to more easily account for and avoid objects by allowing finer movements in some areas as compared to others. For example, in order to avoid an object which interpolated poses 502A-502D come near, robot 202 might have to make a sharp turn. Accordingly, more interpolated poses around that turn may be desirable in order to account for the turn. In some cases, the number of interpolation positions can be dynamic, and more or fewer than N interpolation positions can be used as desired.
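A minimal sketch of the even division described above, for planar poses (x, y, theta) and N interpolated poses, might look like the following; the choice of N and the wrapping of the heading difference to the shorter direction are assumptions for illustration:

```python
import math

def interpolate_poses(pose_a, pose_b, n):
    """Evenly distribute translation and rotation between two route poses.

    pose_a, pose_b: (x, y, theta) tuples with theta in radians.
    Returns n intermediate poses between pose_a and pose_b (exclusive of the endpoints).
    """
    xa, ya, ta = pose_a
    xb, yb, tb = pose_b
    dtheta = math.atan2(math.sin(tb - ta), math.cos(tb - ta))  # shortest angular difference
    poses = []
    for i in range(1, n + 1):
        s = i / (n + 1)                    # even spacing of the N interpolated poses
        poses.append((xa + s * (xb - xa), ya + s * (yb - ya), ta + s * dtheta))
    return poses
```

An uneven division, as discussed above, could be obtained by replacing the evenly spaced fractions s with a denser set near a sharp turn.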
FIG. 6 is a process flow diagram of an exemplary method 600 for operation of a robot in accordance with some implementations of this disclosure. Block 602 includes creating a map of the environment based at least in part on collected data. Block 604 includes determining a route in the map in which the robot will travel. Block 606 includes generating one or more route poses on the route, wherein each route pose comprises a footprint indicative of poses of the robot along the route and each route pose has a plurality of points therein. Block 608 includes determining forces on each of the plurality of points of each route pose, the forces comprising repulsive forces from one or more of the detected points on the one or more objects and attractive forces from one or more of the plurality of points on others of the one or more route poses. Block 610 includes repositioning each route pose in response to the forces on each point of each route pose. Block 612 includes performing interpolation between the one or more repositioned route poses to generate a collision-free path between the one or more route poses for the robot to travel.
FIG. 7 is a process flow diagram of an exemplary method 700 for operation of a robot in accordance with some implementations of this disclosure. Block 702 includes generating a map of the environment using data from one or more sensors. Block 704 includes determining a route on the map, the route including one or more route poses, each route pose comprising a footprint indicative at least in part of a pose, size, and shape of the robot along the route and each route pose having a plurality of points therein. Block 706 includes computing repulsive forces from a point on an object in the environment onto the plurality of points of a first route pose of the one or more route poses. Block 708 includes repositioning the first route pose in response to at least the repulsive force. Block 710 includes performing an interpolation between the repositioned first route pose and another of the one or more route poses.
As used herein, computer and/or computing device can include, but are not limited to, personal computers (“PCs”) and minicomputers, whether desktop, laptop, or otherwise, mainframe computers, workstations, servers, personal digital assistants (“PDAs”), handheld computers, embedded computers, programmable logic devices, personal communicators, tablet computers, mobile devices, portable navigation aids, J2ME equipped devices, cellular telephones, smart phones, personal integrated communication or entertainment devices, and/or any other device capable of executing a set of instructions and processing an incoming data signal.
As used herein, computer program and/or software can include any sequence of human or machine cognizable steps which perform a function. Such computer program and/or software may be rendered in any programming language or environment including, for example, C/C++, C#, Fortran, COBOL, MATLAB™, PASCAL, Python, assembly language, markup languages (e.g., HTML, SGML, XML, VoXML), and the like, as well as object-oriented environments such as the Common Object Request Broker Architecture (“CORBA”), JAVA™ (including J2ME, Java Beans, etc.), Binary Runtime Environment (e.g., BREW), and the like.
As used herein, connection, link, transmission channel, delay line, and/or wireless can include a causal link between any two or more entities (whether physical or logical/virtual), which enables information exchange between the entities.
It will be recognized that while certain aspects of the disclosure are described in terms of a specific sequence of steps of a method, these descriptions are only illustrative of the broader methods of the disclosure, and may be modified as required by the particular application. Certain steps may be rendered unnecessary or optional under certain circumstances. Additionally, certain steps or functionality may be added to the disclosed implementations, or the order of performance of two or more steps permuted. All such variations are considered to be encompassed within the disclosure disclosed and claimed herein.
While the above detailed description has shown, described, and pointed out novel features of the disclosure as applied to various implementations, it will be understood that various omissions, substitutions, and changes in the form and details of the device or process illustrated may be made by those skilled in the art without departing from the disclosure. The foregoing description is of the best mode presently contemplated of carrying out the disclosure. This description is in no way meant to be limiting, but rather should be taken as illustrative of the general principles of the disclosure. The scope of the disclosure should be determined with reference to the claims.
While the disclosure has been illustrated and described in detail in the drawings and foregoing description, such illustration and description are to be considered illustrative or exemplary and not restrictive. The disclosure is not limited to the disclosed embodiments. Variations to the disclosed embodiments can be understood and effected by those skilled in the art in practicing the claimed disclosure, from a study of the drawings, the disclosure and the appended claims.
It should be noted that the use of particular terminology when describing certain features or aspects of the disclosure should not be taken to imply that the terminology is being re-defined herein to be restricted to include any specific characteristics of the features or aspects of the disclosure with which that terminology is associated. Terms and phrases used in this application, and variations thereof, especially in the appended claims, unless otherwise expressly stated, should be construed as open ended as opposed to limiting. As examples of the foregoing, the term “including” should be read to mean “including, without limitation,” “including but not limited to,” or the like; the term “comprising” as used herein is synonymous with “including,” “containing,” or “characterized by,” and is inclusive or open-ended and does not exclude additional, unrecited elements or method steps; the term “having” should be interpreted as “having at least;” the term “such as” should be interpreted as “such as, without limitation;” the term “includes” should be interpreted as “includes but is not limited to;” the term “example” is used to provide exemplary instances of the item in discussion, not an exhaustive or limiting list thereof, and should be interpreted as “example, but without limitation;” adjectives such as “known,” “normal,” “standard,” and terms of similar meaning should not be construed as limiting the item described to a given time period or to an item available as of a given time, but instead should be read to encompass known, normal, or standard technologies that may be available or known now or at any time in the future; and use of terms like “preferably,” “preferred,” “desired,” or “desirable,” and words of similar meaning should not be understood as implying that certain features are critical, essential, or even important to the structure or function of the present disclosure, but instead as merely intended to highlight alternative or additional features that may or may not be utilized in a particular embodiment. Likewise, a group of items linked with the conjunction “and” should not be read as requiring that each and every one of those items be present in the grouping, but rather should be read as “and/or” unless expressly stated otherwise. Similarly, a group of items linked with the conjunction “or” should not be read as requiring mutual exclusivity among that group, but rather should be read as “and/or” unless expressly stated otherwise. The terms “about” or “approximate” and the like are synonymous and are used to indicate that the value modified by the term has an understood range associated with it, where the range can be ±20%, ±15%, ±10%, ±5%, or ±1%. The term “substantially” is used to indicate that a result (e.g., measurement value) is close to a targeted value, where close can mean, for example, the result is within 80% of the value, within 90% of the value, within 95% of the value, or within 99% of the value. Also, as used herein “defined” or “determined” can include “predefined” or “predetermined” and/or otherwise determined values, conditions, thresholds, measurements, and the like.

Claims (20)

What is claimed is:
1. A robot comprising:
one or more sensors configured to collect data about an environment including detected points on one or more objects in the environment; and
a controller configured to:
create a map of the environment based at least in part on the collected data;
determine a route in the map in which the robot will travel;
generate one or more route poses on the route, wherein each route pose comprises a footprint indicative of poses of the robot along the route and each route pose has a plurality of points disposed therein;
determine forces on each of the plurality of points of each route pose, the forces comprising repulsive forces from one or more of the detected points on the one or more objects and attractive forces from one or more of the plurality of points on others of the one or more route poses;
reposition one or more route poses in response to the forces on each point of the one or more route poses; and
perform interpolation between one or more route poses to generate a collision-free path between the one or more route poses for the robot to travel.
2. The robot of claim 1, wherein:
the one or more route poses form a sequence in which the robot travels along the route; and
the interpolation comprises a linear interpolation between sequential ones of the one or more route poses.
3. The robot of claim 1, wherein the interpolation generates one or more interpolation route poses having substantially similar footprints to the footprint of each route pose.
4. The robot of claim 1, wherein the determination of the forces on each point of the one or more route poses further comprises a computation of a force function that associates, at least in part, the forces on each point of each route pose with one or more characteristics of objects in the environment.
5. The robot of claim 4, wherein the one or more characteristics includes one or more of distance, shape, material, and color.
6. The robot of claim 4, wherein:
the force function associates zero repulsive force exerted by a first detected point on a first object where a distance between the first detected point and a second point of a first route pose is above a predetermined distance threshold.
7. The robot of claim 1, wherein the footprint of each route pose has substantially similar size and shape as the footprint of the robot.
8. The robot of claim 1, wherein the robot comprises a floor cleaner.
9. A method for dynamic navigation of a robot in an environment, comprising:
generating a map of the environment using data from one or more sensors;
determining a route on the map, the route including one or more route poses, each route pose comprising a footprint indicative at least in part of a pose and a shape of the robot along the route and each route pose having a plurality of points disposed therein;
computing repulsive forces from a point on an object in the environment onto the plurality of points of a first route pose of the one or more route poses;
repositioning the first route pose in response to at least the repulsive forces; and
performing an interpolation between the repositioned first route pose and another of the one or more route poses.
10. The method of claim 9, further comprising determining attractive forces from a point on another of the one or more route poses exerted on the plurality of points of the first route pose.
11. The method of claim 9, further comprising:
detecting a plurality of objects in the environment with the one or more sensors, each of the plurality of objects having detected points; and
defining a force function, the force function computing repulsive forces exerted by each of the detected points of the plurality of objects on the plurality of points of the first route pose, wherein each repulsive force comprises a vector.
12. The method of claim 11, wherein the repositioning of the first route pose comprises calculating a minimum of the force function.
13. The method of claim 9, wherein the repositioning of the first route pose comprises translating and rotating the first route pose.
14. The method of claim 9, wherein the performing of the interpolation comprises:
generating an interpolation route pose having a footprint substantially similar to the shape of the robot; and
determining a translation and rotation of the interpolation route pose based at least on a collision-free path between the translated and rotated first route pose and the another of the one or more route poses.
15. The method of claim 9, further comprising computing a magnitude of the repulsive forces as proportional to a distance between the point on the object and each of the plurality of points of the first route pose if the point on the object is outside of the footprint of the first route pose.
16. The method of claim 9, further comprising computing a magnitude of the repulsive forces as inversely proportional to a distance between the point on the object and each of the plurality of points of the first route pose if the point on the object is inside the footprint of the first route pose.
17. The method of claim 9, further comprising computing torque forces onto the plurality of points of the first route pose due to the repulsive forces.
18. A non-transitory computer-readable storage apparatus having a plurality of instructions stored thereon, the instructions being executable by a processing apparatus to operate a robot, the instructions configured to, when executed by the processing apparatus, cause the processing apparatus to:
generate a map of an environment using data from one or more sensors;
determine a route on the map, the route comprising one or more route poses, each route pose comprising a footprint indicative at least in part of a pose and a shape of the robot along the route and each route pose having a plurality of points disposed therein; and
compute repulsive forces from a point on an object in the environment onto the plurality of points of a first route pose of the one or more route poses.
19. The non-transitory computer-readable storage apparatus of claim 18, further comprising one or more instructions, which when executed by the processing apparatus, further cause the processing apparatus to determine attractive forces from a point on another of the one or more route poses exerted on the plurality of points of the first route pose.
20. The non-transitory computer-readable storage apparatus of claim 18, further comprising one or more instructions, which when executed by the processing apparatus, further cause the processing apparatus to determine torque forces from a point on another of the one or more route poses exerted on the plurality of points of the first route pose.
US15/341,6122016-11-022016-11-02Systems and methods for dynamic route planning in autonomous navigationActive2036-11-16US10001780B2 (en)

Priority Applications (9)

Application NumberPriority DateFiling DateTitle
US15/341,612US10001780B2 (en)2016-11-022016-11-02Systems and methods for dynamic route planning in autonomous navigation
CA3042532ACA3042532A1 (en)2016-11-022017-10-31Systems and methods for dynamic route planning in autonomous navigation
KR1020197015595AKR102528869B1 (en)2016-11-022017-10-31 System and method for dynamic itinerary planning in autonomous navigation
JP2019523827AJP7061337B2 (en)2016-11-022017-10-31 Robots for maneuvering along the route, systems for dynamic navigation and dynamic route planning of robots in the environment, methods for dynamic navigation and dynamic route planning of robots, and their non-temporary Computer media and their programs
EP17867114.5AEP3535630A4 (en)2016-11-022017-10-31Systems and methods for dynamic route planning in autonomous navigation
CN201780074759.6ACN110023866B (en)2016-11-022017-10-31System and method for dynamic route planning in autonomous navigation
PCT/US2017/059379WO2018085294A1 (en)2016-11-022017-10-31Systems and methods for dynamic route planning in autonomous navigation
US16/011,499US10379539B2 (en)2016-11-022018-06-18Systems and methods for dynamic route planning in autonomous navigation
US16/454,217US20200004253A1 (en)2016-11-022019-06-27Systems and methods for dynamic route planning in autonomous navigation

Applications Claiming Priority (1)

Application NumberPriority DateFiling DateTitle
US15/341,612US10001780B2 (en)2016-11-022016-11-02Systems and methods for dynamic route planning in autonomous navigation

Related Child Applications (1)

Application NumberTitlePriority DateFiling Date
US16/011,499ContinuationUS10379539B2 (en)2016-11-022018-06-18Systems and methods for dynamic route planning in autonomous navigation

Publications (2)

Publication NumberPublication Date
US20180120856A1 US20180120856A1 (en)2018-05-03
US10001780B2true US10001780B2 (en)2018-06-19

Family

ID=62021299

Family Applications (3)

Application NumberTitlePriority DateFiling Date
US15/341,612Active2036-11-16US10001780B2 (en)2016-11-022016-11-02Systems and methods for dynamic route planning in autonomous navigation
US16/011,499ActiveUS10379539B2 (en)2016-11-022018-06-18Systems and methods for dynamic route planning in autonomous navigation
US16/454,217AbandonedUS20200004253A1 (en)2016-11-022019-06-27Systems and methods for dynamic route planning in autonomous navigation

Family Applications After (2)

Application NumberTitlePriority DateFiling Date
US16/011,499ActiveUS10379539B2 (en)2016-11-022018-06-18Systems and methods for dynamic route planning in autonomous navigation
US16/454,217AbandonedUS20200004253A1 (en)2016-11-022019-06-27Systems and methods for dynamic route planning in autonomous navigation

Country Status (7)

CountryLink
US (3)US10001780B2 (en)
EP (1)EP3535630A4 (en)
JP (1)JP7061337B2 (en)
KR (1)KR102528869B1 (en)
CN (1)CN110023866B (en)
CA (1)CA3042532A1 (en)
WO (1)WO2018085294A1 (en)

Cited By (8)

* Cited by examiner, † Cited by third party
Publication numberPriority datePublication dateAssigneeTitle
US20180061137A1 (en)*2016-08-302018-03-01Lg Electronics Inc.Mobile terminal and method of operating thereof
CN108742346A (en)*2018-06-272018-11-06杨扬The method for traversing the method for working environment and establishing grating map
CN110101340A (en)*2019-05-242019-08-09北京小米移动软件有限公司Cleaning equipment, clean operation execute method, apparatus and storage medium
US10650179B2 (en)*2016-01-272020-05-12Capital Normal UniversityMethod and system for formally analyzing the motion planning of a robotic arm based on conformal geometric algebra
US11092458B2 (en)*2018-10-302021-08-17Telenav, Inc.Navigation system with operation obstacle alert mechanism and method of operation thereof
US20210331312A1 (en)*2019-05-292021-10-28Lg Electronics Inc.Intelligent robot cleaner for setting travel route based on video learning and managing method thereof
WO2022140969A1 (en)*2020-12-282022-07-07深圳市优必选科技股份有限公司Method for dynamically generating footprint set, storage medium, and biped robot
CN114947655A (en)*2022-05-172022-08-30安克创新科技股份有限公司Robot control method, device, robot and computer readable storage medium

Families Citing this family (23)

* Cited by examiner, † Cited by third party
Publication numberPriority datePublication dateAssigneeTitle
US10429196B2 (en)*2017-03-082019-10-01Invensense, Inc.Method and apparatus for cart navigation
US10293485B2 (en)*2017-03-302019-05-21Brain CorporationSystems and methods for robotic path planning
FR3065853B1 (en)*2017-04-272019-06-07Peugeot Citroen Automobiles Sa METHOD AND DEVICE FOR CONTROLLING THE TRANSMISSION OF DATA FROM A VEHICLE TO A COMMUNICATION EQUIPMENT
US10422648B2 (en)*2017-10-172019-09-24AI IncorporatedMethods for finding the perimeter of a place using observed coordinates
KR102500634B1 (en)*2018-01-052023-02-16엘지전자 주식회사Guide robot and operating method thereof
EP3823795A4 (en)*2018-07-162022-04-06Brain Corporation SYSTEMS AND METHODS FOR OPTIMIZING ROUTE PLANNING FOR TIGHT TURNS FOR ROBOTIC APPLIANCES
US10809734B2 (en)2019-03-132020-10-20Mobile Industrial Robots A/SRoute planning in an autonomous device
US11958183B2 (en)2019-09-192024-04-16The Research Foundation For The State University Of New YorkNegotiation-based human-robot collaboration via augmented reality
WO2021183898A1 (en)*2020-03-132021-09-16Brain CorporationSystems and methods for route synchronization for robotic devices
US11592299B2 (en)2020-03-192023-02-28Mobile Industrial Robots A/SUsing static scores to control vehicle operations
CN113741550B (en)*2020-05-152024-02-02北京机械设备研究所Mobile robot following method and system
CN112015183B (en)*2020-09-082022-02-08安徽工程大学Obstacle avoidance method for mobile robot in terrain with concave-convex features under constraint of energy consumption
US11927972B2 (en)*2020-11-242024-03-12Lawrence Livermore National Security, LlcCollision avoidance based on traffic management data
CN112595324B (en)*2020-12-102022-03-29安徽工程大学Optimal node wheel type mobile robot path planning method under optimal energy consumption
CN112833899B (en)*2020-12-312022-02-15吉林大学Full-coverage path planning method for unmanned sanitation vehicle
CN112971621A (en)*2021-03-112021-06-18河北工业大学Indoor intelligent cleaning system and control method
US11940800B2 (en)*2021-04-232024-03-26Irobot CorporationNavigational control of autonomous cleaning robots
US20230071338A1 (en)*2021-09-082023-03-09Sea Machines Robotics, Inc.Navigation by mimic autonomy
CN114355925B (en)*2021-12-292024-03-19杭州海康机器人股份有限公司Path planning method, device, equipment and computer readable storage medium
CN114431122B (en)*2022-01-272023-03-24山东交通学院Road greening sprinkling intelligent control system and method
US12037769B1 (en)*2023-06-292024-07-16Built Robotics Inc.Autonomous offroad vehicle path planning with collision avoidance
CN119305544A (en)*2023-07-062025-01-14深圳引望智能技术有限公司 Control method, device and intelligent driving equipment
CN118192548B (en)*2024-03-062025-08-05北京机械工业自动化研究所有限公司 Path optimization method, device, and storage medium for inspection robot

Citations (173)

* Cited by examiner, † Cited by third party
Publication numberPriority datePublication dateAssigneeTitle
US4638445A (en)1984-06-081987-01-20Mattaboni Paul JAutonomous mobile robot
US4763276A (en)1986-03-211988-08-09Actel PartnershipMethods for refining original robot command signals
US4852018A (en)1987-01-071989-07-25Trustees Of Boston UniversityMassively parellel real-time network architectures for robots capable of self-calibrating their operating parameters through associative learning
US5121497A (en)1986-03-101992-06-09International Business Machines CorporationAutomatic generation of executable computer code which commands another program to perform a task and operator modification of the generated executable computer code
US5280179A (en)1979-04-301994-01-18Sensor Adaptive Machines IncorporatedMethod and apparatus utilizing an orientation code for automatically guiding a robot
US5446356A (en)1993-09-091995-08-29Samsung Electronics Co., Ltd.Mobile robot
US5602761A (en)1993-12-301997-02-11Caterpillar Inc.Machine performance monitoring and fault classification using an exponentially weighted moving average scheme
US5612883A (en)1990-02-051997-03-18Caterpillar Inc.System and method for detecting obstacles in the path of a vehicle
US5673367A (en)1992-10-011997-09-30Buckley; Theresa M.Method for neural network control of motion using real-time environmental feedback
US5719480A (en)1992-10-271998-02-17Minister Of National Defence Of Her Majesty's Canadian GovernmentParametric control device
US5841959A (en)1989-10-171998-11-24P.E. Applied Biosystems, Inc.Robotic interface
US5994864A (en)1995-09-111999-11-30Kabushiki Kaisha Yaskawa DenkiRobot controller
US6124694A (en)1999-03-182000-09-26Bancroft; Allen J.Wide area navigation for a robot scrubber
US6169981B1 (en)1996-06-042001-01-02Paul J. Werbos3-brain architecture for an intelligent decision and control system
US6243622B1 (en)1998-10-162001-06-05Xerox CorporationTouchable user interface using self movable robotic modules
WO2001067749A2 (en)2000-03-072001-09-13Sarnoff CorporationCamera pose estimation
US6366293B1 (en)1998-09-292002-04-02Rockwell Software Inc.Method and apparatus for manipulating and displaying graphical objects in a computer display device
US20020107649A1 (en)2000-12-272002-08-08Kiyoaki TakiguchiGait detection system, gait detection apparatus, device, and gait detection method
US6442451B1 (en)2000-12-282002-08-27Robotic Workspace Technologies, Inc.Versatile robot control system
US20020158599A1 (en)2000-03-312002-10-31Masahiro FujitaRobot device, robot device action control method, external force detecting device and external force detecting method
US20020175894A1 (en)2001-03-062002-11-28Vince GrilloHand-supported mouse for computer input
US20020198854A1 (en)2001-03-302002-12-26Berenji Hamid R.Convergent actor critic-based fuzzy reinforcement learning apparatus and method
US20030023347A1 (en)2000-09-282003-01-30Reizo KonnoAuthoring system and authoring method, and storage medium
US20030025082A1 (en)2001-08-022003-02-06International Business Machines CorporationActive infrared presence sensor
US6560511B1 (en)1999-04-302003-05-06Sony CorporationElectronic pet system, network system, robot, and storage medium
US20030108415A1 (en)2001-07-132003-06-12Martin HosekTrajectory planning and motion control strategies for a planar three-degree-of-freedom robotic arm
US6584375B2 (en)2001-05-042003-06-24Intellibot, LlcSystem for a retail environment
US20030144764A1 (en)2001-02-212003-07-31Jun YokonoOperational control method, program, and recording media for robot device, and robot device
US6636781B1 (en)2001-05-222003-10-21University Of Southern CaliforniaDistributed control and coordination of autonomous agents in a dynamic, reconfigurable system
US20030220714A1 (en)2002-02-122003-11-27The University Of TokyoMethod for generating a motion of a human type link system
US20040030449A1 (en)2002-04-222004-02-12Neal SolomonMethods and apparatus for multi robotic system involving coordination of weaponized unmanned underwater vehicles
US20040036437A1 (en)2001-04-032004-02-26Masato ItoLegged mobile robot and its motion teaching method, and storage medium
US20040051493A1 (en)2001-06-072004-03-18Takayuki FurutaApparatus walking with two legs, walking control apparatus, and walking control method thereof
US6760645B2 (en)2001-04-302004-07-06Sony France S.A.Training of autonomous robots
US20040167641A1 (en)2001-06-272004-08-26Masakazu KawaiMethod of estimating floor reactions of bipedal walking body, and method of estimating joint moments of bipedal walking body
US20040172168A1 (en)2003-02-272004-09-02Fanuc Ltd.Taught position modification device
US20040172166A1 (en)2003-02-262004-09-02Paul LapstunRobot
US6812846B2 (en)2001-09-282004-11-02Koninklijke Philips Electronics N.V.Spill detector based on machine-imaging
US20040267404A1 (en)2001-08-312004-12-30George DankoCoordinated joint motion control system
US20050008227A1 (en)2003-07-082005-01-13Computer Associates Think, Inc.Hierarchical determination of feature relevancy
US20050065651A1 (en)2003-07-242005-03-24Joseph AyersProcess and architecture of robotic system to mimic animal behavior in the natural environment
US20050069207A1 (en)2002-05-202005-03-31Zakrzewski Radoslaw RomualdMethod for detection and recognition of fog presence within an aircraft compartment using video images
US20050125099A1 (en)2003-10-242005-06-09Tatsuo MikamiMotion editing apparatus and method for robot device, and computer program
US6961060B1 (en)1999-03-162005-11-01Matsushita Electric Industrial Co., Ltd.Virtual space control data receiving apparatus,virtual space control data transmission and reception system, virtual space control data receiving method, and virtual space control data receiving program storage media
US7002585B1 (en)1999-10-122006-02-21Fanuc LtdGraphic display apparatus for robot system
US20060207419A1 (en)2003-09-222006-09-21Yasunao OkazakiApparatus and method for controlling elastic actuator
US20060250101A1 (en)2005-04-132006-11-09Oussama KhatibTorque-position transformer for task control of position controlled robots
US20070074177A1 (en)2005-09-292007-03-29Hitachi, Ltd.Logic extraction support apparatus
US7212651B2 (en)2003-06-172007-05-01Mitsubishi Electric Research Laboratories, Inc.Detecting pedestrians using patterns of motion and appearance in videos
US20070151389A1 (en)2005-12-202007-07-05Giuseppe PriscoMedical robotic system with programmably controlled constraints on error dynamics
US7243334B1 (en)2002-01-162007-07-10Prelude Systems, Inc.System and method for generating user interface code
US20070200525A1 (en)2004-02-252007-08-30The Ritsumeikan TrustControl system of floating mobile body
US20070255454A1 (en)2006-04-272007-11-01Honda Motor Co., Ltd.Control Of Robots From Human Motion Descriptors
US20070260356A1 (en)2003-05-222007-11-08Abb AbControl Method for a Robot
US20080040040A1 (en)2006-08-082008-02-14Takanori GotoObstacle avoidance method and obstacle-avoidable mobile apparatus
US20080059015A1 (en)*2006-06-092008-03-06Whittaker William LSoftware architecture for high-speed traversal of prescribed routes
US7342589B2 (en)2003-09-252008-03-11Rockwell Automation Technologies, Inc.System and method for managing graphical data
US20080097644A1 (en)2004-11-022008-04-24Rotundus AbBall Robot
US20080112596A1 (en)2006-01-232008-05-15Rhoads Geoffrey BSensing Data From Physical Objects
US20080140257A1 (en)2006-12-072008-06-12Fanuc LtdRobot control apparatus for force control
US20080319929A1 (en)2004-07-272008-12-25Frederic KaplanAutomated Action-Selection System and Method , and Application Thereof to Training Prediction Machines and Driving the Development of Self-Developing Devices
US20090037033A1 (en)2007-05-142009-02-05Emilie PhillipsAutonomous Behaviors for a Remote Vehicle
US7576639B2 (en)2006-03-142009-08-18Mobileye Technologies, Ltd.Systems and methods for detecting pedestrians in the vicinity of a powered industrial vehicle
US20090234501A1 (en)2006-05-252009-09-17Takehiro IshizakiWork Robot
US20090231359A1 (en)2008-03-172009-09-17Intelliscience CorporationMethods and systems for compound feature creation, processing, and identification in conjunction with a data analysis and feature recognition system
US20090265036A1 (en)2007-03-292009-10-22Irobot CorporationRobot operator control unit configuration system and method
US20090272585A1 (en)2008-05-012009-11-05Kenichiro NagasakaActuator Control Device, Actuator Control Method, Actuator, Robot Apparatus, and Computer Program
US7668605B2 (en)2005-10-262010-02-23Rockwell Automation Technologies, Inc.Wireless industrial control user interface
US20100114372A1 (en)2008-10-302010-05-06Intellibot Robotics LlcMethod of cleaning a surface using an automatic cleaning device
US20100152899A1 (en)2008-11-172010-06-17Energid Technologies, Inc.Systems and methods of coordination control for robot manipulation
US20100152896A1 (en)2008-02-062010-06-17Mayumi KomatsuRobot, controlling device and controlling method for robot, and controlling program for robot-controlling device
US20100228264A1 (en)2009-03-092010-09-09David RobinsonAdaptable integrated energy control system for electrosurgical tools in robotic surgical systems
US20100286824A1 (en)2002-08-212010-11-11Neal SolomonSystem for self-organizing mobile robotic collectives
US20100305758A1 (en)2009-05-292010-12-02Fanuc LtdRobot control system provided in machining system including robot and machine tool
US20100312730A1 (en)2009-05-292010-12-09Board Of Trustees Of Michigan State UniversityNeuromorphic spatiotemporal where-what machines
US20110026770A1 (en)2009-07-312011-02-03Jonathan David BrookshirePerson Following Using Histograms of Oriented Gradients
US20110035188A1 (en)2009-07-162011-02-10European Space AgencyMethod and apparatus for analyzing time series data
US20110060460A1 (en)2008-10-312011-03-10Kabushiki Kaisha ToshibaRobot control apparatus
US20110067479A1 (en)2009-09-222011-03-24Gm Global Technology Operations, Inc.System and method for calibrating a rotary absolute position sensor
US20110144802A1 (en)2009-12-102011-06-16The Boeing CompanyControl System for Robotic Vehicles
US20110160906A1 (en)2009-12-282011-06-30Honda Motor Co., Ltd.Control device for mobile robot
US20110160907A1 (en)2009-12-282011-06-30Honda Motor Co., Ltd.Control device for robot
US20110158476A1 (en)2009-12-242011-06-30National Taiwan University Of Science And TechnologyRobot and method for recognizing human faces and gestures thereof
US20110196199A1 (en)2010-02-112011-08-11Intuitive Surgical Operations, Inc.Method and system for automatically maintaining an operator selected roll orientation at a distal tip of a robotic endoscope
US20110218676A1 (en)2009-07-022011-09-08Yasunao OkazakiRobot, control device for robot arm and control program for robot arm
US20110244919A1 (en)2010-03-192011-10-06Aller Joshua VMethods and Systems for Determining Image Processing Operations Relevant to Particular Imagery
US20110282169A1 (en)2008-10-292011-11-17The Regents Of The University Of Colorado, A Body CorporateLong Term Active Learning from Large Continually Changing Data Sets
US20110296944A1 (en)2010-06-022011-12-08Disney Enterprises, Inc.Three-axis robotic joint using four-bar linkages to drive differential side gears
US20120008838A1 (en)2000-08-072012-01-12Health Discovery CorporationSystem and method for remote melanoma screening
US20120017232A1 (en)1991-12-232012-01-19Linda Irene HoffbergAdaptive pattern recognition based controller apparatus and method and human-factored interface thereore
US20120045068A1 (en)2010-08-202012-02-23Korea Institute Of Science And TechnologySelf-fault detection system and method for microphone array and audio-based device
US20120072166A1 (en)2010-09-222012-03-22Invensense, Inc.Deduced reckoning navigation without a constraint relationship between orientation of a sensor platform and a direction of travel of an object
US8145492B2 (en)2004-04-072012-03-27Sony CorporationRobot behavior control system and method, and robot apparatus
US20120079670A1 (en)2010-10-052012-04-05Samsung Electronics Co., Ltd.Dust inflow sensing unit and robot cleaner having the same
US20120109150A1 (en)2002-03-062012-05-03Mako Surgical Corp.Haptic guidance system and method
US8174568B2 (en)*2006-12-012012-05-08Sri InternationalUnified framework for precise vision-aided navigation
US20120143495A1 (en)2010-10-142012-06-07The University Of North TexasMethods and systems for indoor navigation
US20120144242A1 (en)2010-12-022012-06-07Vichare Nikhil MSystem and method for proactive management of an information handling system with in-situ measurement of end user actions
US20120150777A1 (en)2010-12-132012-06-14Kabushiki Kaisha ToshibaAction history search device
US20120209432A1 (en)2005-09-132012-08-16Neurosciences Research Foundation, Inc.Hybrid control device
US20120221147A1 (en)2009-03-092012-08-30Intuitive Surgical Operations, Inc.Control panel for an adjustable ergonomic control console
US20120303091A1 (en)2010-03-262012-11-29Izhikevich Eugene MApparatus and methods for polychronous encoding and multiplexing in neuronal prosthetic devices
US20120303160A1 (en)2005-09-302012-11-29Irobot CorporationCompanion robot for personal interaction
US20120308136A1 (en)2010-03-262012-12-06Izhikevich Eugene MApparatus and methods for pulse-code invariant object recognition
US20120308076A1 (en)2010-03-262012-12-06Filip Lukasz PiekniewskiApparatus and methods for temporally proximate object recognition
US20130000480A1 (en)2010-12-172013-01-03Mayumi KomatsuControl apparatus, control method, and control program for elastic actuator drive mechanism
US8364314B2 (en)2009-04-302013-01-29GM Global Technology Operations LLCMethod and apparatus for automatic control of a humanoid robot
US8380348B2 (en)2008-01-152013-02-19Honda Motor Co., Ltd.Robot
US8380652B1 (en)2011-05-062013-02-19Google Inc.Methods and systems for autonomous robotic decision making
US20130044139A1 (en)2011-08-162013-02-21Google Inc.Systems and methods for navigating a camera
US20130066468A1 (en)2010-03-112013-03-14Korea Institute Of Science And TechnologyTelepresence robot, telepresence system comprising the same and method for controlling the same
US8419804B2 (en)2008-09-042013-04-16Iwalk, Inc.Hybrid terrain-adaptive lower-extremity systems
US8423225B2 (en)2009-11-112013-04-16Intellibot Robotics LlcMethods and systems for movement of robotic device using video signal
US20130096719A1 (en)2011-10-132013-04-18The U.S.A. As Represented By The Administrator Of The National Aeronautics And Space AdministrationMethod for dynamic optimization of a robot control interface
US20130116827A1 (en)2011-11-072013-05-09Seiko Epson CorporationRobot control system, robot system, and sensor information processing apparatus
US8452448B2 (en)2008-04-022013-05-28Irobot CorporationRobotics systems
US20130173060A1 (en)2012-01-042013-07-04Hyundai Motor CompanyMethod of operating a wearable robot
US20130206170A1 (en)2005-12-022013-08-15Irobot CorporationCoverage robot mobility
US8514236B2 (en)2000-11-242013-08-20Cleversys, Inc.System and method for animal gait characterization from bottom view using video analysis
US8515162B2 (en)2009-10-092013-08-20Primax Electronics Ltd.QR code processing method and apparatus thereof
US20130218339A1 (en)2010-07-232013-08-22Aldebaran Robotics"humanoid robot equipped with a natural dialogue interface, method for controlling the robot and corresponding program"
US20130245829A1 (en)2012-03-152013-09-19Jtekt CorporationRobot control method, robot control device, and robot control system
US20130274924A1 (en)2006-06-012013-10-17Samsung Electronics Co., Ltd.Method, medium and apparatus classifying and collecting area feature information according to a robot's moving path, and a robot controlled by the area features
US20130310979A1 (en)2012-04-182013-11-21Massachusetts Institute Of TechnologyNeuromuscular Model-Based Sensing And Control Paradigm For A Robotic Leg
US20130325775A1 (en)2012-06-042013-12-05Brain CorporationDynamically reconfigurable stochastic learning apparatus and methods
US20130346347A1 (en)2012-06-222013-12-26Google Inc.Method to Predict a Communicative Action that is Most Likely to be Executed Given a Context
US20140002843A1 (en)2012-06-292014-01-02Kyocera Document Solutions Inc.Image forming apparatus and control method therefor
US20140016858A1 (en)2012-07-122014-01-16Micah RichertSpiking neuron network sensory processing apparatus and methods
US8639035B2 (en)2008-02-072014-01-28Nec CorporationPose estimation
US8639644B1 (en)2011-05-062014-01-28Google Inc.Shared robot knowledge base for use with cloud computing system
US20140081895A1 (en)2012-09-202014-03-20Oliver CoenenSpiking neuron network adaptive control apparatus and methods
US8679260B2 (en)2009-11-112014-03-25Intellibot Robotics LlcMethods and systems for movement of an automatic cleaning device using video signal
US20140089232A1 (en)2012-06-012014-03-27Brain CorporationNeural network learning and collaboration apparatus and methods
US20140114479A1 (en)2011-11-102014-04-24Panasonic CorporationRobot, robot control device, robot control method, and robot control program
US20140187519A1 (en)2012-12-272014-07-03The Board Of Trustees Of The Leland Stanford Junior UniversityBiomarkers for predicting major adverse events
US20140190514A1 (en)2013-01-082014-07-10Bluebotics SaFloor treatment machine and method for treating floor surfaces
US8793205B1 (en)2012-09-202014-07-29Brain CorporationRobotic learning and evolution apparatus
US20140276951A1 (en)2013-03-152014-09-18Intuitive Surgical Operations, Inc.Software Configurable Manipulator Degrees of Freedom
US20140277718A1 (en)2013-03-152014-09-18Eugene IzhikevichAdaptive predictor apparatus and methods
US8843244B2 (en)*2006-10-062014-09-23Irobot CorporationAutonomous behaviors for a remove vehicle
US20140350723A1 (en)2013-05-232014-11-27Fluor Technologies CorporationUniversal construction robotics interface
US20140358828A1 (en)2013-05-292014-12-04Purepredictive, Inc.Machine learning generated action plan
WO2014196925A1 (en)2013-06-032014-12-11Ctrlworks Pte. Ltd.Method and apparatus for offboard navigation of a robotic device
US20140371912A1 (en)2013-06-142014-12-18Brain CorporationHierarchical robotic controller apparatus and methods
US20140371907A1 (en)2013-06-142014-12-18Brain CorporationRobotic training apparatus and methods
US20150032258A1 (en)2013-07-292015-01-29Brain CorporationApparatus and methods for controlling of robotic devices
US8958911B2 (en)2012-02-292015-02-17Irobot CorporationMobile robot
US8958912B2 (en)2012-06-212015-02-17Rethink Robotics, Inc.Training and operating industrial robots
US8958937B2 (en)2013-03-122015-02-17Intellibot Robotics LlcCleaning machine with collision prevention
US20150094852A1 (en)2013-09-272015-04-02Brain CorporationRobotic control arbitration apparatus and methods
US20150094850A1 (en)2013-09-272015-04-02Brain CorporationApparatus and methods for training of robotic control arbitration
WO2015047195A1 (en)2013-09-242015-04-02Ctrlworks Pte. Ltd.Offboard navigation apparatus capable of being coupled to a movable platform
US9008840B1 (en)2013-04-192015-04-14Brain CorporationApparatus and methods for reinforcement-guided supervised learning
US9015093B1 (en)2010-10-262015-04-21Michael Lamport CommonsIntelligent control with hierarchical stacked neural networks
US20150120128A1 (en)2012-11-022015-04-30Irobot CorporationAutonomous Coverage Robot
US20150127155A1 (en)2011-06-022015-05-07Brain CorporationApparatus and methods for operating robotic devices using selective state space training
US20150185027A1 (en)2014-01-022015-07-02Microsoft CorporationGround truth estimation for autonomous navigation
US20150199458A1 (en)2014-01-142015-07-16Energid Technologies CorporationDigital proxy simulation of robotic hardware
US20150213299A1 (en)2012-07-042015-07-30José Vicente Solano FerrándezInfrared image based early detection of oil spills in water
US9144907B2 (en)2013-10-242015-09-29Harris CorporationControl synchronization for high-latency teleoperation
US20150283703A1 (en)2014-04-032015-10-08Brain CorporationApparatus and methods for remotely controlling robotic devices
US20150306761A1 (en)2014-04-292015-10-29Brain CorporationTrainable convolutional network apparatus and methods for operating a robotic vehicle
US20150317357A1 (en)2014-05-022015-11-05Google Inc.Searchable Index
US20150339589A1 (en)2014-05-212015-11-26Brain CorporationApparatus and methods for training robots utilizing gaze-based saliency maps
US9242372B2 (en)2013-05-312016-01-26Brain CorporationAdaptive robotic interface apparatus and methods
US20160057925A1 (en)2009-06-182016-03-03RoboLabs, Inc.System and method for controlling and monitoring operation of an autonomous robot
US20160075026A1 (en)2014-09-122016-03-17Toyota Jidosha Kabushiki KaishaAnticipatory robot navigation
US20160182502A1 (en)2014-12-232016-06-23Ned M. SmithUser profile selection using contextual authentication
US20160282862A1 (en)2013-01-182016-09-29Irobot CorporationEnvironmental management systems including mobile robots and methods using same
US20160309973A1 (en)2015-04-242016-10-27Avidbots Corp.Apparatus and methods for semi-autonomous cleaning of surfaces
US9746339B2 (en)2014-08-072017-08-29Nokia Technologies OyApparatus, method, computer program and user device for enabling control of a vehicle
US20170329347A1 (en)*2016-05-112017-11-16Brain CorporationSystems and methods for training a robot to autonomously travel a route
US20170329333A1 (en)*2016-05-112017-11-16Brain CorporationSystems and methods for initializing a robot to autonomously travel a trained route

Family Cites Families (20)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
JPS63229503A (en)*1987-03-191988-09-26Fujitsu Ltd Robot posture control method
US7418346B2 (en)*1997-10-222008-08-26Intelligent Technologies International, Inc.Collision avoidance methods and systems
DE19745656C2 (en)*1997-10-162000-06-21Daimler Chrysler Ag Impact absorber for a motor vehicle
AUPS123702A0 (en)*2002-03-222002-04-18Nahla, Ibrahim S. MrThe train navigation and control system (TNCS) for multiple tracks
KR100520049B1 (en)*2003-09-052005-10-10Inha University FoundationPath planning method for the autonomous mobile robot
JP5287051B2 (en)*2008-09-042013-09-11Murata Machinery, Ltd. Autonomous mobile device
DE102009052629A1 (en)*2009-11-102011-05-12Vorwerk & Co. Interholding Gmbh Method for controlling a robot
JP5446765B2 (en)2009-11-172014-03-19Toyota Motor Corporation Route search system, route search method, route search program, and moving body
CN103026396B (en)*2010-07-272015-09-23Toyota Motor Corporation Driving aids
KR101743926B1 (en)*2010-09-202017-06-08Samsung Electronics Co., Ltd.Robot and control method thereof
KR101233714B1 (en)*2010-09-302013-02-18Ajou University Industry-Academic Cooperation FoundationAutonomous mobile robot avoiding obstacle traps and control method for the same
WO2013071190A1 (en)*2011-11-112013-05-16Evolution Robotics, Inc.Scaling vector field slam to large environments
KR101133037B1 (en)*2011-12-012012-04-04Agency for Defense DevelopmentPath updating method for collision avoidance of an autonomous vehicle and apparatus therefor
CN104029203B (en)*2014-06-182017-07-18Dalian UniversityPath planning method for obstacle avoidance of a space manipulator
CN104317291A (en)*2014-09-162015-01-28Harbin Hengyu Mingxiang Technology Co., Ltd.Artificial-potential-field-based robot collision prevention path planning method
US9403275B2 (en)*2014-10-172016-08-02GM Global Technology Operations LLCDynamic obstacle avoidance in a robotic system
CN104875882B (en)*2015-05-212018-02-27Hefei UniversityQuadrotor aircraft
CN105549597B (en)*2016-02-042018-06-26Tongji UniversityDynamic path planning method for an unmanned vehicle based on environmental uncertainty
CN105739507B (en)*2016-04-292018-11-20Kunshan Huaheng Robot Co., Ltd.Optimal path planning method for robot collision avoidance
CN105955273A (en)*2016-05-252016-09-21Qfeeltech (Beijing) Co., Ltd.Indoor robot navigation system and method

Patent Citations (178)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
US5280179A (en)1979-04-301994-01-18Sensor Adaptive Machines IncorporatedMethod and apparatus utilizing an orientation code for automatically guiding a robot
US4638445A (en)1984-06-081987-01-20Mattaboni Paul JAutonomous mobile robot
US5121497A (en)1986-03-101992-06-09International Business Machines CorporationAutomatic generation of executable computer code which commands another program to perform a task and operator modification of the generated executable computer code
US4763276A (en)1986-03-211988-08-09Actel PartnershipMethods for refining original robot command signals
US4852018A (en)1987-01-071989-07-25Trustees Of Boston UniversityMassively parallel real-time network architectures for robots capable of self-calibrating their operating parameters through associative learning
US5841959A (en)1989-10-171998-11-24P.E. Applied Biosystems, Inc.Robotic interface
US5612883A (en)1990-02-051997-03-18Caterpillar Inc.System and method for detecting obstacles in the path of a vehicle
US20120017232A1 (en)1991-12-232012-01-19Linda Irene HoffbergAdaptive pattern recognition based controller apparatus and method and human-factored interface therefore
US20150204559A1 (en)1991-12-232015-07-23Steven M. HoffbergAdaptive pattern recognition based controller apparatus and method and human-interface therefore
US5673367A (en)1992-10-011997-09-30Buckley; Theresa M.Method for neural network control of motion using real-time environmental feedback
US5719480A (en)1992-10-271998-02-17Minister Of National Defence Of Her Majesty's Canadian GovernmentParametric control device
US5446356A (en)1993-09-091995-08-29Samsung Electronics Co., Ltd.Mobile robot
US5602761A (en)1993-12-301997-02-11Caterpillar Inc.Machine performance monitoring and fault classification using an exponentially weighted moving average scheme
US5994864A (en)1995-09-111999-11-30Kabushiki Kaisha Yaskawa DenkiRobot controller
US6169981B1 (en)1996-06-042001-01-02Paul J. Werbos3-brain architecture for an intelligent decision and control system
US6366293B1 (en)1998-09-292002-04-02Rockwell Software Inc.Method and apparatus for manipulating and displaying graphical objects in a computer display device
US6243622B1 (en)1998-10-162001-06-05Xerox CorporationTouchable user interface using self movable robotic modules
US6961060B1 (en)1999-03-162005-11-01Matsushita Electric Industrial Co., Ltd.Virtual space control data receiving apparatus, virtual space control data transmission and reception system, virtual space control data receiving method, and virtual space control data receiving program storage media
US6124694A (en)1999-03-182000-09-26Bancroft; Allen J.Wide area navigation for a robot scrubber
US6560511B1 (en)1999-04-302003-05-06Sony CorporationElectronic pet system, network system, robot, and storage medium
US7002585B1 (en)1999-10-122006-02-21Fanuc LtdGraphic display apparatus for robot system
WO2001067749A2 (en)2000-03-072001-09-13Sarnoff CorporationCamera pose estimation
US20020158599A1 (en)2000-03-312002-10-31Masahiro FujitaRobot device, robot device action control method, external force detecting device and external force detecting method
US20120008838A1 (en)2000-08-072012-01-12Health Discovery CorporationSystem and method for remote melanoma screening
US20030023347A1 (en)2000-09-282003-01-30Reizo KonnoAuthoring system and authoring method, and storage medium
US8514236B2 (en)2000-11-242013-08-20Cleversys, Inc.System and method for animal gait characterization from bottom view using video analysis
US20020107649A1 (en)2000-12-272002-08-08Kiyoaki TakiguchiGait detection system, gait detection apparatus, device, and gait detection method
US6442451B1 (en)2000-12-282002-08-27Robotic Workspace Technologies, Inc.Versatile robot control system
US20030144764A1 (en)2001-02-212003-07-31Jun YokonoOperational control method, program, and recording media for robot device, and robot device
US6697711B2 (en)2001-02-212004-02-24Sony CorporationOperational control method, program, and recording media for robot device, and robot device
US20020175894A1 (en)2001-03-062002-11-28Vince GrilloHand-supported mouse for computer input
US20020198854A1 (en)2001-03-302002-12-26Berenji Hamid R.Convergent actor critic-based fuzzy reinforcement learning apparatus and method
US20040036437A1 (en)2001-04-032004-02-26Masato ItoLegged mobile robot and its motion teaching method, and storage medium
US6760645B2 (en)2001-04-302004-07-06Sony France S.A.Training of autonomous robots
US6584375B2 (en)2001-05-042003-06-24Intellibot, LlcSystem for a retail environment
US6636781B1 (en)2001-05-222003-10-21University Of Southern CaliforniaDistributed control and coordination of autonomous agents in a dynamic, reconfigurable system
US20040051493A1 (en)2001-06-072004-03-18Takayuki FurutaApparatus walking with two legs, walking control apparatus, and walking control method thereof
US20040167641A1 (en)2001-06-272004-08-26Masakazu KawaiMethod of estimating floor reactions of bipedal walking body, and method of estimating joint moments of bipedal walking body
US20030108415A1 (en)2001-07-132003-06-12Martin HosekTrajectory planning and motion control strategies for a planar three-degree-of-freedom robotic arm
US20030025082A1 (en)2001-08-022003-02-06International Business Machines CorporationActive infrared presence sensor
US20040267404A1 (en)2001-08-312004-12-30George DankoCoordinated joint motion control system
US6812846B2 (en)2001-09-282004-11-02Koninklijke Philips Electronics N.V.Spill detector based on machine-imaging
US7243334B1 (en)2002-01-162007-07-10Prelude Systems, Inc.System and method for generating user interface code
US20030220714A1 (en)2002-02-122003-11-27The University Of TokyoMethod for generating a motion of a human type link system
US20120109150A1 (en)2002-03-062012-05-03Mako Surgical Corp.Haptic guidance system and method
US20040030449A1 (en)2002-04-222004-02-12Neal SolomonMethods and apparatus for multi robotic system involving coordination of weaponized unmanned underwater vehicles
US20050069207A1 (en)2002-05-202005-03-31Zakrzewski Radoslaw RomualdMethod for detection and recognition of fog presence within an aircraft compartment using video images
US20100286824A1 (en)2002-08-212010-11-11Neal SolomonSystem for self-organizing mobile robotic collectives
US20040172166A1 (en)2003-02-262004-09-02Paul LapstunRobot
US7148644B2 (en)2003-02-262006-12-12Silverbrook Research Pty LtdMarking robot
US20040172168A1 (en)2003-02-272004-09-02Fanuc Ltd.Taught position modification device
US20070260356A1 (en)2003-05-222007-11-08Abb AbControl Method for a Robot
US7212651B2 (en)2003-06-172007-05-01Mitsubishi Electric Research Laboratories, Inc.Detecting pedestrians using patterns of motion and appearance in videos
US20050008227A1 (en)2003-07-082005-01-13Computer Associates Think, Inc.Hierarchical determination of feature relevancy
US20050065651A1 (en)2003-07-242005-03-24Joseph AyersProcess and architecture of robotic system to mimic animal behavior in the natural environment
US20060207419A1 (en)2003-09-222006-09-21Yasunao OkazakiApparatus and method for controlling elastic actuator
US7342589B2 (en)2003-09-252008-03-11Rockwell Automation Technologies, Inc.System and method for managing graphical data
US20050125099A1 (en)2003-10-242005-06-09Tatsuo MikamiMotion editing apparatus and method for robot device, and computer program
US20070200525A1 (en)2004-02-252007-08-30The Ritsumeikan TrustControl system of floating mobile body
US8145492B2 (en)2004-04-072012-03-27Sony CorporationRobot behavior control system and method, and robot apparatus
US20080319929A1 (en)2004-07-272008-12-25Frederic KaplanAutomated Action-Selection System and Method, and Application Thereof to Training Prediction Machines and Driving the Development of Self-Developing Devices
US20080097644A1 (en)2004-11-022008-04-24Rotundus AbBall Robot
US20060250101A1 (en)2005-04-132006-11-09Oussama KhatibTorque-position transformer for task control of position controlled robots
US20120209432A1 (en)2005-09-132012-08-16Neurosciences Research Foundation, Inc.Hybrid control device
US20070074177A1 (en)2005-09-292007-03-29Hitachi, Ltd.Logic extraction support apparatus
US20120303160A1 (en)2005-09-302012-11-29Irobot CorporationCompanion robot for personal interaction
US7668605B2 (en)2005-10-262010-02-23Rockwell Automation Technologies, Inc.Wireless industrial control user interface
US20130206170A1 (en)2005-12-022013-08-15Irobot CorporationCoverage robot mobility
US20070151389A1 (en)2005-12-202007-07-05Giuseppe PriscoMedical robotic system with programmably controlled constraints on error dynamics
US20080112596A1 (en)2006-01-232008-05-15Rhoads Geoffrey BSensing Data From Physical Objects
US7576639B2 (en)2006-03-142009-08-18Mobileye Technologies, Ltd.Systems and methods for detecting pedestrians in the vicinity of a powered industrial vehicle
US8924021B2 (en)2006-04-272014-12-30Honda Motor Co., Ltd.Control of robots from human motion descriptors
US20070255454A1 (en)2006-04-272007-11-01Honda Motor Co., Ltd.Control Of Robots From Human Motion Descriptors
US20090234501A1 (en)2006-05-252009-09-17Takehiro IshizakiWork Robot
US20130274924A1 (en)2006-06-012013-10-17Samsung Electronics Co., Ltd.Method, medium and apparatus classifying and collecting area feature information according to a robot's moving path, and a robot controlled by the area features
US20080059015A1 (en)*2006-06-092008-03-06Whittaker William LSoftware architecture for high-speed traversal of prescribed routes
US20080040040A1 (en)2006-08-082008-02-14Takanori GotoObstacle avoidance method and obstacle-avoidable mobile apparatus
US8843244B2 (en)*2006-10-062014-09-23Irobot CorporationAutonomous behaviors for a remote vehicle
US20160078303A1 (en)*2006-12-012016-03-17Sri InternationalUnified framework for precise vision-aided navigation
US8174568B2 (en)*2006-12-012012-05-08Sri InternationalUnified framework for precise vision-aided navigation
US20080140257A1 (en)2006-12-072008-06-12Fanuc LtdRobot control apparatus for force control
US20090265036A1 (en)2007-03-292009-10-22Irobot CorporationRobot operator control unit configuration system and method
US20090037033A1 (en)2007-05-142009-02-05Emilie PhillipsAutonomous Behaviors for a Remote Vehicle
US8380348B2 (en)2008-01-152013-02-19Honda Motor Co., Ltd.Robot
US20100152896A1 (en)2008-02-062010-06-17Mayumi KomatsuRobot, controlling device and controlling method for robot, and controlling program for robot-controlling device
US8639035B2 (en)2008-02-072014-01-28Nec CorporationPose estimation
US20090231359A1 (en)2008-03-172009-09-17Intelliscience CorporationMethods and systems for compound feature creation, processing, and identification in conjunction with a data analysis and feature recognition system
US8452448B2 (en)2008-04-022013-05-28Irobot CorporationRobotics systems
US20090272585A1 (en)2008-05-012009-11-05Kenichiro NagasakaActuator Control Device, Actuator Control Method, Actuator, Robot Apparatus, and Computer Program
US8419804B2 (en)2008-09-042013-04-16Iwalk, Inc.Hybrid terrain-adaptive lower-extremity systems
US20110282169A1 (en)2008-10-292011-11-17The Regents Of The University Of Colorado, A Body CorporateLong Term Active Learning from Large Continually Changing Data Sets
US20100114372A1 (en)2008-10-302010-05-06Intellibot Robotics LlcMethod of cleaning a surface using an automatic cleaning device
US20110060460A1 (en)2008-10-312011-03-10Kabushiki Kaisha ToshibaRobot control apparatus
US20100152899A1 (en)2008-11-172010-06-17Energid Technologies, Inc.Systems and methods of coordination control for robot manipulation
US20120221147A1 (en)2009-03-092012-08-30Intuitive Surgical Operations, Inc.Control panel for an adjustable ergonomic control console
US20100228264A1 (en)2009-03-092010-09-09David RobinsonAdaptable integrated energy control system for electrosurgical tools in robotic surgical systems
US8364314B2 (en)2009-04-302013-01-29GM Global Technology Operations LLCMethod and apparatus for automatic control of a humanoid robot
US20100305758A1 (en)2009-05-292010-12-02Fanuc LtdRobot control system provided in machining system including robot and machine tool
US20100312730A1 (en)2009-05-292010-12-09Board Of Trustees Of Michigan State UniversityNeuromorphic spatiotemporal where-what machines
US20160057925A1 (en)2009-06-182016-03-03RoboLabs, Inc.System and method for controlling and monitoring operation of an autonomous robot
US20110218676A1 (en)2009-07-022011-09-08Yasunao OkazakiRobot, control device for robot arm and control program for robot arm
US20110035188A1 (en)2009-07-162011-02-10European Space AgencyMethod and apparatus for analyzing time series data
US20110026770A1 (en)2009-07-312011-02-03Jonathan David BrookshirePerson Following Using Histograms of Oriented Gradients
US20110067479A1 (en)2009-09-222011-03-24Gm Global Technology Operations, Inc.System and method for calibrating a rotary absolute position sensor
US8515162B2 (en)2009-10-092013-08-20Primax Electronics Ltd.QR code processing method and apparatus thereof
US8679260B2 (en)2009-11-112014-03-25Intellibot Robotics LlcMethods and systems for movement of an automatic cleaning device using video signal
US8423225B2 (en)2009-11-112013-04-16Intellibot Robotics LlcMethods and systems for movement of robotic device using video signal
US20110144802A1 (en)2009-12-102011-06-16The Boeing CompanyControl System for Robotic Vehicles
US20110158476A1 (en)2009-12-242011-06-30National Taiwan University Of Science And TechnologyRobot and method for recognizing human faces and gestures thereof
US20110160906A1 (en)2009-12-282011-06-30Honda Motor Co., Ltd.Control device for mobile robot
US20110160907A1 (en)2009-12-282011-06-30Honda Motor Co., Ltd.Control device for robot
US20110196199A1 (en)2010-02-112011-08-11Intuitive Surgical Operations, Inc.Method and system for automatically maintaining an operator selected roll orientation at a distal tip of a robotic endoscope
US20130066468A1 (en)2010-03-112013-03-14Korea Institute Of Science And TechnologyTelepresence robot, telepresence system comprising the same and method for controlling the same
US20110244919A1 (en)2010-03-192011-10-06Aller Joshua VMethods and Systems for Determining Image Processing Operations Relevant to Particular Imagery
US20120308076A1 (en)2010-03-262012-12-06Filip Lukasz PiekniewskiApparatus and methods for temporally proximate object recognition
US20120303091A1 (en)2010-03-262012-11-29Izhikevich Eugene MApparatus and methods for polychronous encoding and multiplexing in neuronal prosthetic devices
US20120308136A1 (en)2010-03-262012-12-06Izhikevich Eugene MApparatus and methods for pulse-code invariant object recognition
US20110296944A1 (en)2010-06-022011-12-08Disney Enterprises, Inc.Three-axis robotic joint using four-bar linkages to drive differential side gears
US20130218339A1 (en)2010-07-232013-08-22Aldebaran Robotics"humanoid robot equipped with a natural dialogue interface, method for controlling the robot and corresponding program"
US20120045068A1 (en)2010-08-202012-02-23Korea Institute Of Science And TechnologySelf-fault detection system and method for microphone array and audio-based device
US20120072166A1 (en)2010-09-222012-03-22Invensense, Inc.Deduced reckoning navigation without a constraint relationship between orientation of a sensor platform and a direction of travel of an object
US20120079670A1 (en)2010-10-052012-04-05Samsung Electronics Co., Ltd.Dust inflow sensing unit and robot cleaner having the same
US20120143495A1 (en)2010-10-142012-06-07The University Of North TexasMethods and systems for indoor navigation
US9015093B1 (en)2010-10-262015-04-21Michael Lamport CommonsIntelligent control with hierarchical stacked neural networks
US20120144242A1 (en)2010-12-022012-06-07Vichare Nikhil MSystem and method for proactive management of an information handling system with in-situ measurement of end user actions
US20120150777A1 (en)2010-12-132012-06-14Kabushiki Kaisha ToshibaAction history search device
US20130000480A1 (en)2010-12-172013-01-03Mayumi KomatsuControl apparatus, control method, and control program for elastic actuator drive mechanism
US8380652B1 (en)2011-05-062013-02-19Google Inc.Methods and systems for autonomous robotic decision making
US8639644B1 (en)2011-05-062014-01-28Google Inc.Shared robot knowledge base for use with cloud computing system
US20150127155A1 (en)2011-06-022015-05-07Brain CorporationApparatus and methods for operating robotic devices using selective state space training
US20130044139A1 (en)2011-08-162013-02-21Google Inc.Systems and methods for navigating a camera
US20130096719A1 (en)2011-10-132013-04-18The U.S.A. As Represented By The Administrator Of The National Aeronautics And Space AdministrationMethod for dynamic optimization of a robot control interface
US20130116827A1 (en)2011-11-072013-05-09Seiko Epson CorporationRobot control system, robot system, and sensor information processing apparatus
US20140114479A1 (en)2011-11-102014-04-24Panasonic CorporationRobot, robot control device, robot control method, and robot control program
US20130173060A1 (en)2012-01-042013-07-04Hyundai Motor CompanyMethod of operating a wearable robot
US8958911B2 (en)2012-02-292015-02-17Irobot CorporationMobile robot
US20130245829A1 (en)2012-03-152013-09-19Jtekt CorporationRobot control method, robot control device, and robot control system
US20130310979A1 (en)2012-04-182013-11-21Massachusetts Institute Of TechnologyNeuromuscular Model-Based Sensing And Control Paradigm For A Robotic Leg
US20140089232A1 (en)2012-06-012014-03-27Brain CorporationNeural network learning and collaboration apparatus and methods
US20130325775A1 (en)2012-06-042013-12-05Brain CorporationDynamically reconfigurable stochastic learning apparatus and methods
US8958912B2 (en)2012-06-212015-02-17Rethink Robotics, Inc.Training and operating industrial robots
US20130346347A1 (en)2012-06-222013-12-26Google Inc.Method to Predict a Communicative Action that is Most Likely to be Executed Given a Context
US20140002843A1 (en)2012-06-292014-01-02Kyocera Document Solutions Inc.Image forming apparatus and control method therefor
US20150213299A1 (en)2012-07-042015-07-30José Vicente Solano FerrándezInfrared image based early detection of oil spills in water
US20140016858A1 (en)2012-07-122014-01-16Micah RichertSpiking neuron network sensory processing apparatus and methods
US20140081895A1 (en)2012-09-202014-03-20Oliver CoenenSpiking neuron network adaptive control apparatus and methods
US8793205B1 (en)2012-09-202014-07-29Brain CorporationRobotic learning and evolution apparatus
US20150120128A1 (en)2012-11-022015-04-30Irobot CorporationAutonomous Coverage Robot
US20140187519A1 (en)2012-12-272014-07-03The Board Of Trustees Of The Leland Stanford Junior UniversityBiomarkers for predicting major adverse events
US20140190514A1 (en)2013-01-082014-07-10Bluebotics SaFloor treatment machine and method for treating floor surfaces
US20160282862A1 (en)2013-01-182016-09-29Irobot CorporationEnvironmental management systems including mobile robots and methods using same
US8958937B2 (en)2013-03-122015-02-17Intellibot Robotics LlcCleaning machine with collision prevention
US20140277718A1 (en)2013-03-152014-09-18Eugene IzhikevichAdaptive predictor apparatus and methods
US20140276951A1 (en)2013-03-152014-09-18Intuitive Surgical Operations, Inc.Software Configurable Manipulator Degrees of Freedom
US9008840B1 (en)2013-04-192015-04-14Brain CorporationApparatus and methods for reinforcement-guided supervised learning
US20140350723A1 (en)2013-05-232014-11-27Fluor Technologies CorporationUniversal construction robotics interface
US20140358828A1 (en)2013-05-292014-12-04Purepredictive, Inc.Machine learning generated action plan
US9242372B2 (en)2013-05-312016-01-26Brain CorporationAdaptive robotic interface apparatus and methods
WO2014196925A1 (en)2013-06-032014-12-11Ctrlworks Pte. Ltd.Method and apparatus for offboard navigation of a robotic device
US20140371907A1 (en)2013-06-142014-12-18Brain CorporationRobotic training apparatus and methods
US20140371912A1 (en)2013-06-142014-12-18Brain CorporationHierarchical robotic controller apparatus and methods
US20150032258A1 (en)2013-07-292015-01-29Brain CorporationApparatus and methods for controlling of robotic devices
WO2015047195A1 (en)2013-09-242015-04-02Ctrlworks Pte. Ltd.Offboard navigation apparatus capable of being coupled to a movable platform
US20150094852A1 (en)2013-09-272015-04-02Brain CorporationRobotic control arbitration apparatus and methods
US20150094850A1 (en)2013-09-272015-04-02Brain CorporationApparatus and methods for training of robotic control arbitration
US9144907B2 (en)2013-10-242015-09-29Harris CorporationControl synchronization for high-latency teleoperation
US20150185027A1 (en)2014-01-022015-07-02Microsoft CorporationGround truth estimation for autonomous navigation
US20150199458A1 (en)2014-01-142015-07-16Energid Technologies CorporationDigital proxy simulation of robotic hardware
US20150283703A1 (en)2014-04-032015-10-08Brain CorporationApparatus and methods for remotely controlling robotic devices
US20150306761A1 (en)2014-04-292015-10-29Brain CorporationTrainable convolutional network apparatus and methods for operating a robotic vehicle
US20150317357A1 (en)2014-05-022015-11-05Google Inc.Searchable Index
US20150339589A1 (en)2014-05-212015-11-26Brain CorporationApparatus and methods for training robots utilizing gaze-based saliency maps
US9746339B2 (en)2014-08-072017-08-29Nokia Technologies OyApparatus, method, computer program and user device for enabling control of a vehicle
US20160075026A1 (en)2014-09-122016-03-17Toyota Jidosha Kabushiki KaishaAnticipatory robot navigation
US20160182502A1 (en)2014-12-232016-06-23Ned M. SmithUser profile selection using contextual authentication
US20160309973A1 (en)2015-04-242016-10-27Avidbots Corp.Apparatus and methods for semi-autonomous cleaning of surfaces
US20170329347A1 (en)*2016-05-112017-11-16Brain CorporationSystems and methods for training a robot to autonomously travel a route
US20170329333A1 (en)*2016-05-112017-11-16Brain CorporationSystems and methods for initializing a robot to autonomously travel a trained route

Non-Patent Citations (27)

* Cited by examiner, † Cited by third party
Title
"Detection of ArUco Markers" accessed Jun. 20, 2016, available at the following Web address:http://docs.opencv.org/3.1.0/d5/dae/tutorial_aruco_detection.html#gsc.tab=0, 18 pgs.
Asensio et al., "Robot Learning Control Based on Neural Network Prediction" ASME 8th Annual Dynamic Systems and Control Conference joint with the JSME 11th Motion and Vibration Conference 2012 [Retrieved on: Jun. 24, 2014]. Retrieved fro internet:http://msc.berkeley.edu/wjchen/publications/DSC12.sub.--8726.sub.--FI-.pdf<http: />, 11 pgs.
Bouganis, Alexandros, et al.,"Training a Spiking Neural Network to Control a 4-DoF Robotic Arm based on Spike Timing-Dependent Plasticity", Proceedings of WCCI 2010 IEEE World Congress on Computational Intelligence, COB, Barcelona, Spain, Jul. 18-23, 2010, pp. 4104-4111.
Brown, et al., Detecting Problems in Buildings Using Infrared Cameras, Fluke Digital Library, retrieved on Jun. 8, 2015 from the Web address: www.fluke.com/library, 3 pgs.
Coupard, Pierre-Philippe, An Availabot-like computer-controlled push puppet for Linux, https://web.archive.org/web/20081106161941/http://myspace.voo.be/pcoupard/push_puppet_to_y/, 2008, 7 pgs.
Hardware and Software Platform for Mobile Manipulation R&D, 2012, https://web.archive.org/web/20120128031010/http://www.willowgarage.com/pages/pr2/design, 4 pgs.
Heikkila J., et al., "A Four-Step Camera Calibration Procedure with Implicit Image Correction," Computer Vision and Pattern Recognition, 1997, Proceedings, 1997 IEEE Computer Society Conference on, San Juan, 1997, pp. 1106-1112.
Hopkins, Chasing Water with Thermal Imaging, Infrared Training Center, 2011, 14 pgs.
Hunt, et al., "Detection of Changes in Leaf Water Content Using Near-and Middle-Infrared Reflectance," Journal of Remote Sensing of Environment, 1989, vol. 30 (1), pp. 43-54.
Jain, Learning Trajectory Preferences for Manipulators via Iterative Improvement, 2013, Advances in Neural Information Processing Systems 26 (NIPS 2013), 8 pgs.
Joshi, Blog Post from Perpetual Enigma Website, "Understanding Camera Calibration" posted May 31, 2014, accessed Jun. 20, 2016 at the following Web address: https://prateekvjoshi.com/2014/05/31/understanding-camera-calibration/, 5 pgs.
Kalman Filter, Wikipedia, 32 pgs.
Kasabov, "Evolving Spiking Neural Networks for Spatio- and Spectro-Temporal Pattern Recognition", IEEE 6th International Conference on Intelligent Systems, 2012 [Retrieved on Jun. 24, 2014], Retrieved from the Internet: http://ncs.ethz.ch/projects/evospike/publications/evolving-spiking-neural-networks-for-spatio-and-spectro-temporal-pattern-recognition-plenary-talk-ieee-is/view, 6 pgs.
Maesen, et al., "Tile Tracker: A Practical and Inexpensive Positioning System for Mobile AR Applications" pp. 1-8.
PR2 User Manual, Oct. 5, 2012, 40 pgs.
Rahman, et al., "An Image Based Approach to Compute Object Distance," International Journal of Computational Intelligence Systems, 2008, vol. 1 (4), pp. 304-315.
Rosebrock, Tutorial "Find Distance from Camera to Object/marker using Python and OpenCV" Jan. 19, 2015, accessed Jun. 20, 2016 at the following Web address: http://www.pyimagesearch.com/2015/01/19/find-distance-camera-objectmarker-using-python-opencv/, 33 pgs.
Rosenhahn, et al., Pose Estimation in Conformal Geometric Algebra Part I: The Stratification of Mathematical Spaces, Journal of Mathematical Imaging and Vision 22:27-48, 2005.
Steele, The Human Touch Makes Robots Defter, Nov. 6, 2013, Cornell Chronicle. http://www.news.cornell.edu/stories/2013/11/human-touch-makes-robots-defter, 2 pgs.
Thermal Imaging for Moisture and Restoration, retrieved on Apr. 5, 2016 from the following Web address: www.flir.com/home, 4 pgs.
Torralba, et al., "Depth Estimation from Image Structure," Journal of IEEE Transactions on Pattern Analysis and Machine Intelligence, 2002, vol. 24 (9), pp. 1226-1238.
Triggs, "Camera Pose and Calibration from 4 or 5 known 3D Points," 7th International Conference on Computer Vision (ICCV '99), IEEE Computer Society, 1999, vol. 1, pp. 278-284.
Tutorial "Camera Calibration with OpenCV" accessed Jun. 20, 2016 from the following Web address http://docs.opencv.org/2.4/doc/tutorials/calib3d/camera_calibration/camera_calibration.html, 11 pgs.
UNCC Machine Lab Wiki Documentation "ROS and Camera Calibration" accessed Jun. 20, 2016 at the following Web address: http://visionlab.uncc.edu/dokuwiki/ros_and_camera_calibration#aruco_-_augmented_reality_library_from_the_university_of cordoba, 5 pgs.
Wan, et al., "Automatic Navigation System with Multiple Sensors," IFIP International Federation for Information Processing, vol. 259, Computer and Computing Technologies in Agriculture, 2008, vol. 2, pp. 769-776.
Wikipedia Page "Pose (Computer Vision)" accessed Jun. 20, 2016, available at the following Web address: https://en.wikipedia.org/wiki/Pose_(computer_vision), 2 pgs.
Zhang, A Flexible New Technique for Camera Calibration, last updated 12-5-9, Technical Report MSR-TR-98-71, Dec. 2, 1998, 22 pgs.

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
US10650179B2 (en)*2016-01-272020-05-12Capital Normal UniversityMethod and system for formally analyzing the motion planning of a robotic arm based on conformal geometric algebra
US20180061137A1 (en)*2016-08-302018-03-01Lg Electronics Inc.Mobile terminal and method of operating thereof
US10373389B2 (en)*2016-08-302019-08-06Lg Electronics Inc.Mobile terminal and method of operating thereof
CN108742346A (en)*2018-06-272018-11-06Yang YangMethod for traversing a working environment and building a grid map
US11092458B2 (en)*2018-10-302021-08-17Telenav, Inc.Navigation system with operation obstacle alert mechanism and method of operation thereof
CN110101340A (en)*2019-05-242019-08-09Beijing Xiaomi Mobile Software Co., Ltd.Cleaning device, cleaning operation execution method, apparatus, and storage medium
US20210331312A1 (en)*2019-05-292021-10-28Lg Electronics Inc.Intelligent robot cleaner for setting travel route based on video learning and managing method thereof
US11565411B2 (en)*2019-05-292023-01-31Lg Electronics Inc.Intelligent robot cleaner for setting travel route based on video learning and managing method thereof
WO2022140969A1 (en)*2020-12-282022-07-07Shenzhen Ubtech Technology Co., Ltd.Method for dynamically generating a footprint set, storage medium, and biped robot
CN114947655A (en)*2022-05-172022-08-30Anker Innovations Technology Co., Ltd.Robot control method and device, robot, and computer-readable storage medium

Also Published As

Publication number | Publication date
EP3535630A1 (en)2019-09-11
US20180120856A1 (en)2018-05-03
US20180364724A1 (en)2018-12-20
US10379539B2 (en)2019-08-13
US20200004253A1 (en)2020-01-02
WO2018085294A1 (en)2018-05-11
JP2020502630A (en)2020-01-23
EP3535630A4 (en)2020-07-29
CA3042532A1 (en)2018-05-11
JP7061337B2 (en)2022-04-28
KR20190077050A (en)2019-07-02
CN110023866B (en)2022-12-06
CN110023866A (en)2019-07-16
KR102528869B1 (en)2023-05-04

Similar Documents

Publication | Publication Date | Title
US10379539B2 (en)Systems and methods for dynamic route planning in autonomous navigation
US11701778B2 (en)Systems and methods for robotic path planning
US12393196B2 (en)Systems and methods for training a robot to autonomously travel a route
US20250068163A1 (en)Systems and methods for optimizing route planning for tight turns for robotic apparatuses
US20210223779A1 (en)Systems and methods for rerouting robots to avoid no-go zones
US20220016778A1 (en)Systems, apparatuses, and methods for cost evaluation and motion planning for robotic devices
US11886198B2 (en)Systems and methods for detecting blind spots for robots
HK40012306A (en)Systems and methods for dynamic route planning in autonomous navigation
HK40012306B (en)Systems and methods for dynamic route planning in autonomous navigation
WO2025151787A1 (en)Systems and methods for modulating speed limits of robotic devices

Legal Events

Date | Code | Title | Description
AS | Assignment

Owner name:BRAIN CORPORATION, CALIFORNIA

Free format text:ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:GABARDOS, BORJA IBARZ;PASSOT, JEAN-BAPTISTE;REEL/FRAME:041921/0848

Effective date:20170306

STCF | Information on status: patent grant

Free format text:PATENTED CASE

AS | Assignment

Owner name:HERCULES CAPITAL, INC., CALIFORNIA

Free format text:SECURITY INTEREST;ASSIGNOR:BRAIN CORPORATION;REEL/FRAME:057851/0574

Effective date:20211004

FEPP | Fee payment procedure

Free format text:MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY

FEPP | Fee payment procedure

Free format text:SURCHARGE FOR LATE PAYMENT, SMALL ENTITY (ORIGINAL EVENT CODE: M2554); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY

MAFP | Maintenance fee payment

Free format text:PAYMENT OF MAINTENANCE FEE, 4TH YR, SMALL ENTITY (ORIGINAL EVENT CODE: M2551); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY

Year of fee payment:4

