Autonomous robot

From Wikipedia, the free encyclopedia
Robot that performs behaviors or tasks with a high degree of autonomy

An autonomous robot is a robot that acts without recourse to human control. Historic examples include space probes. Modern examples include self-driving vacuums and cars.

Industrial robot arms that work on assembly lines inside factories may also be considered autonomous robots, though their autonomy is restricted due to a highly structured environment and their inability to locomote.

Components and criteria of robotic autonomy


Self-maintenance


The first requirement for complete physical autonomy is the ability for a robot to take care of itself. Many of the battery-powered robots on the market today can find and connect to a charging station, and some toys like Sony's Aibo are capable of self-docking to charge their batteries.

Self-maintenance is based on "proprioception", or sensing one's own internal status. In the battery charging example, the robot can tell proprioceptively that its batteries are low, and it then seeks the charger. Another common proprioceptive sensor is for heat monitoring. Increased proprioception will be required for robots to work autonomously near people and in harsh environments. Common proprioceptive sensors include thermal, optical, and haptic sensing, as well as the Hall effect.

Robot GUI display showing battery voltage and other proprioceptive data in lower right-hand corner. The display is for user information only. Autonomous robots monitor and respond to proprioceptive sensors without human intervention to keep themselves safe and operating properly.
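The battery-charging example above amounts to a simple monitoring loop over proprioceptive readings. The following Python sketch is illustrative only and assumes a hypothetical robot object; functions such as read_battery_voltage() and navigate_to_dock(), and the thresholds, are placeholders rather than the API of any particular robot.

    # Minimal sketch of a proprioceptive self-maintenance loop (illustrative only).
    # The hardware-access functions below are hypothetical placeholders.
    import time

    LOW_BATTERY_VOLTS = 11.1   # assumed threshold for a 12 V battery pack
    MAX_MOTOR_TEMP_C = 70.0    # assumed safe motor temperature

    def self_maintenance_loop(robot):
        while True:
            volts = robot.read_battery_voltage()   # proprioception: internal state
            temp = robot.read_motor_temperature()

            if volts < LOW_BATTERY_VOLTS:
                robot.navigate_to_dock()           # seek the charging station
                robot.dock_and_charge()
            elif temp > MAX_MOTOR_TEMP_C:
                robot.stop_motors()                # cool down before continuing
            else:
                robot.continue_task()

            time.sleep(1.0)                        # poll the sensors once per second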

Sensing the environment


Exteroception is sensing things about the environment. Autonomous robots must have a range of environmental sensors to perform their task and stay out of trouble. An autonomous robot can also recognize sensor failures and minimize the performance impact those failures cause.[1]

  • Common exteroceptive sensors include those for the electromagnetic spectrum, sound, touch, chemicals (smell, odor), temperature, range to various objects, and altitude.

Some robotic lawn mowers adapt their programming by detecting the speed at which grass grows so as to maintain a perfectly cut lawn, and some vacuum-cleaning robots have dirt detectors that sense how much dirt is being picked up and use this information to stay in one area longer.
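As a minimal illustration of this kind of exteroceptive feedback, the sketch below extends a cleaning robot's dwell time in an area when a dirt sensor reports high readings. All sensor and motion calls are hypothetical placeholders; commercial robots implement this behavior with proprietary logic.

    # Illustrative sketch: adapt cleaning time to a dirt-detector reading.
    # The sensor and motion functions are hypothetical placeholders.

    BASE_DWELL_S = 5.0      # assumed minimum time spent per area
    MAX_DWELL_S = 60.0      # assumed upper bound so the robot eventually moves on
    DIRT_THRESHOLD = 0.3    # assumed normalized dirt-sensor reading

    def clean_area(robot):
        dwell = BASE_DWELL_S
        robot.start_cleaning()
        while robot.time_in_area() < dwell:
            if robot.read_dirt_sensor() > DIRT_THRESHOLD:
                # Stay longer where more dirt is detected, up to a cap.
                dwell = min(dwell + 2.0, MAX_DWELL_S)
        robot.move_to_next_area()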

Task performance


The next step in autonomous behavior is to actually perform a physical task. A new area showing commercial promise is domestic robots, with a flood of small vacuuming robots beginning with iRobot and Electrolux in 2002. While the level of intelligence is not high in these systems, they navigate over wide areas and pilot in tight situations around homes using contact and non-contact sensors. Both of these robots use proprietary algorithms to increase coverage over simple random bounce.

The next level of autonomous task performance requires a robot to perform conditional tasks. For instance, security robots can be programmed to detect intruders and respond in a particular way depending upon where the intruder is. For example, Amazon launched its Astro for home monitoring, security and eldercare in September 2021.[2]

Autonomous navigation


Indoor navigation


For a robot to associate behaviors with a place (localization) requires it to know where it is and to be able to navigate point-to-point. Such navigation began with wire-guidance in the 1970s and progressed in the early 2000s to beacon-based triangulation. Current commercial robots autonomously navigate based on sensing natural features. The first commercial robots to achieve this were Pyxus' HelpMate hospital robot and the CyberMotion guard robot, both designed by robotics pioneers in the 1980s. These robots originally used manually created CAD floor plans, sonar sensing and wall-following variations to navigate buildings. The next generation, such as MobileRobots' PatrolBot and autonomous wheelchair,[3] both introduced in 2004, have the ability to create their own laser-based maps of a building and to navigate open areas as well as corridors. Their control system changes its path on the fly if something blocks the way.

At first, autonomous navigation was based on planar sensors, such as laser range-finders, that can only sense at one level. The most advanced systems now fuse information from various sensors for both localization (position) and navigation. Systems such as Motivity can rely on different sensors in different areas, depending upon which provides the most reliable data at the time, and can re-map a building autonomously.
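The ability to "change its path on the fly" can be illustrated with a generic grid-based planner: compute a route on the robot's map and replan whenever a newly sensed obstacle blocks it. The sketch below uses textbook A* search together with hypothetical robot.obstacle_at(), robot.move_to() and robot.position() calls; it is not the proprietary algorithm of any product named above.

    # Generic sketch: grid-map path planning with replanning when the route is blocked.
    # The robot.* calls are hypothetical placeholders.
    import heapq

    def astar(grid, start, goal):
        """A* over a 2D occupancy grid (0 = free, 1 = occupied)."""
        rows, cols = len(grid), len(grid[0])
        frontier = [(0, start)]
        came_from = {start: None}
        cost = {start: 0}
        while frontier:
            _, current = heapq.heappop(frontier)
            if current == goal:
                path = []
                while current is not None:
                    path.append(current)
                    current = came_from[current]
                return path[::-1]
            r, c = current
            for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
                if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                    new_cost = cost[current] + 1
                    if (nr, nc) not in cost or new_cost < cost[(nr, nc)]:
                        cost[(nr, nc)] = new_cost
                        priority = new_cost + abs(nr - goal[0]) + abs(nc - goal[1])
                        heapq.heappush(frontier, (priority, (nr, nc)))
                        came_from[(nr, nc)] = current
        return None  # no route to the goal

    def navigate(robot, grid, start, goal):
        path = astar(grid, start, goal)
        while path:
            next_cell = path[1] if len(path) > 1 else path[0]
            if robot.obstacle_at(next_cell):            # newly sensed obstacle
                grid[next_cell[0]][next_cell[1]] = 1    # update the map
                path = astar(grid, robot.position(), goal)  # replan on the fly
                continue
            robot.move_to(next_cell)
            path = path[1:]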

Rather than climb stairs, which requires highly specialized hardware, most indoor robots navigate handicapped-accessible areas, controlling elevators and electronic doors.[4] With such electronic access-control interfaces, robots can now freely navigate indoors. Autonomously climbing stairs and opening doors manually are topics of research at the current time.

As these indoor techniques continue to develop, vacuuming robots will gain the ability to clean a specific user-specified room or a whole floor. Security robots will be able to cooperatively surround intruders and cut off exits. These advances also bring concomitant protections: robots' internal maps typically permit "forbidden areas" to be defined to prevent robots from autonomously entering certain regions.

Outdoor navigation


Outdoor autonomy is most easily achieved in the air, since obstacles are rare. Cruise missiles are dangerous, highly autonomous robots. Pilotless drone aircraft are increasingly used for reconnaissance. Some of these unmanned aerial vehicles (UAVs) are capable of flying their entire mission without any human interaction at all, except possibly for the landing, where a person intervenes using radio remote control. Some drones are capable of safe, automatic landings, however. SpaceX operates a number of autonomous spaceport drone ships, used to safely land and recover Falcon 9 rockets at sea.[5] A few countries, such as India, have started working on robotic deliveries of food and other articles by drone.

Outdoor autonomy is the most difficult for ground vehicles, due to:

  • Three-dimensional terrain
  • Great disparities in surface density
  • Weather exigencies
  • Instability of the sensed environment

Open problems in autonomous robotics


Several open problems in autonomous robotics are special to the field rather than being a part of the general pursuit of AI. According to George A. Bekey's Autonomous Robots: From Biological Inspiration to Implementation and Control, these include ensuring that the robot functions correctly and avoids obstacles autonomously. Reinforcement learning has been used to control and plan the navigation of autonomous robots, specifically when a group of them operates in collaboration with each other.[6]

Energy autonomy and foraging

Researchers concerned with creating true artificial life are concerned not only with intelligent control, but further with the capacity of the robot to find its own resources through foraging (looking for food, which includes both energy and spare parts).

This is related to autonomous foraging, a concern within the sciences of behavioral ecology, social anthropology, and human behavioral ecology, as well as robotics, artificial intelligence, and artificial life.[7]

Systemic robustness and real-world brittleness

Autonomous robots remain highly vulnerable to unexpected changes in real-world environments. Even minor variations, like a sudden beam of sunlight disrupting vision systems or unanticipated terrain irregularities, can cause entire systems to fail.[8] This brittleness stems from robotics being an inherently systems problem, where a deficiency in any module (perception, planning, actuation) can compromise the whole robot.

Open-world scene understanding

Robots often depend on datasets captured under controlled conditions, limiting their ability to generalize to novel, dynamic real-world scenarios. They struggle with unknown objects, occlusions, varying object scales, and rapidly changing environments. Developing self-supervised, lifelong learning systems that adapt to open-world conditions remains a pressing challenge.[9]

Multi-robot coordination and decentralization

Scaling robot systems raises thorny issues in coordination, safety, and communication. In multi-agent navigation, challenges like deadlocks, selfish behaviors, and sample inefficiencies emerge. Innovations such as dividing planning into sub-problems, combining RL with imitation learning, hybrid centralized-decentralized approaches (e.g., prioritized communication learning), attention mechanisms, and graph transformers have shown promise, but large-scale, stable, real-time coordination remains an open frontier.[10]

Simulation-to-real (“reality gap”) transfer

Deep reinforcement learning is a powerful tool for teaching robots navigation and control, but training in simulation introduces discrepancies when deployed in reality. The reality gap (the differences between simulated and real environments) continues to impede reliable deployment, despite strategies to mitigate it.[11][12]

Hardware and biohybrid constraints

Physical limitations of batteries, motors, sensors, and actuators constrain robot autonomy, endurance, and adaptability, especially for humanoid or soft biohybrid robots. While biohybrid systems (e.g., using living muscle tissue) hint at leveraging biological energy and actuation, they introduce radically new challenges in materials, integration, and control.[13][14]

Ethics, liability, and societal integration

As robots become more autonomous, especially in public or collaborative roles, ethical and legal issues grow. Who is responsible when an autonomous system causes harm? Regulatory frameworks are still evolving to address liability, transparency, bias, and safety in systems like self-driving cars or socially interactive robots.[15]

Embodied AI and industrial adoption

While AI algorithms have made strides, embedding them into robots (embodied AI) for real-world use remains slow-moving. Hardware constraints, economic viability, and infrastructure limitations all restrict widespread adoption. For instance, humanoid robots like Pepper failed to achieve ubiquity due to fundamental cost and complexity issues.[14][16]

Societal impact and issues


As autonomous robots have grown in ability and technical levels, there has been increasing societal awareness and news coverage of the latest advances, and also some of the philosophical issues, economic effects, and societal impacts that arise from the roles and activities of autonomous robots.

Elon Musk, a prominent business executive and billionaire, has warned for years of the possible hazards and pitfalls of autonomous robots; however, his own company is among the most prominent developers of new advanced technologies in this area.[17]

In 2021, a United Nations group of government experts, known as the Convention on Certain Conventional Weapons – Group of Governmental Experts on Lethal Autonomous Weapons Systems, held a conference to highlight the ethical concerns which arise from the increasingly advanced technology for autonomous robots to wield weapons and to play a military role.[18]

Technical development


Early robots


The first autonomous robots were known as Elmer and Elsie, constructed in the late 1940s by W. Grey Walter. They were the first robots programmed to "think" the way biological brains do and were meant to have free will.[19] Elmer and Elsie were often labeled as tortoises because of how they were shaped and the manner in which they moved. They were capable of phototaxis, the movement that occurs in response to light stimulus.[20]

Space probes


The Mars rovers MER-A and MER-B (now known as the Spirit and Opportunity rovers) found the position of the Sun and navigated their own routes to destinations, on the fly, by the following cycle (a minimal code sketch follows the list):

  • Mapping the surface with 3D vision
  • Computing safe and unsafe areas on the surface within that field of vision
  • Computing optimal paths across the safe area towards the desired destination
  • Driving along the calculated route
  • Repeating this cycle until either the destination is reached, or there is no known path to the destination
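The sketch below expresses that map/plan/drive cycle as a simple control loop; every function is a hypothetical stand-in for the rovers' actual flight software, included only to illustrate the structure of the loop.

    # Illustrative control loop for the map/plan/drive cycle described above.
    # All functions are hypothetical stand-ins for actual rover flight software.

    def drive_to_target(rover, destination):
        while rover.position() != destination:
            terrain = rover.stereo_map_3d()             # map the surface with 3D vision
            safe = rover.classify_safe_areas(terrain)   # mark safe vs. unsafe areas
            path = rover.plan_path(safe, destination)   # optimal path across the safe area
            if path is None:
                return False                            # no known path to the destination
            rover.drive_segment(path)                   # drive along part of the route
        return True                                     # destination reached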

The planned ESA rover, the Rosalind Franklin rover, is capable of vision-based relative localisation and absolute localisation to autonomously navigate safe and efficient trajectories to targets by:

  • Reconstructing 3D models of the terrain surrounding the Rover using a pair of stereo cameras
  • Determining safe and unsafe areas of the terrain and the general "difficulty" for the Rover to navigate the terrain
  • Computing efficient paths across the safe area towards the desired destination
  • Driving the Rover along the planned path
  • Building up a navigation map of all previous navigation data

During the final NASA Sample Return Robot Centennial Challenge in 2016, a rover, named Cataglyphis, successfully demonstrated fully autonomous navigation, decision-making, and sample detection, retrieval, and return capabilities.[21] The rover relied on a fusion of measurements from inertial sensors, wheel encoders, Lidar, and camera for navigation and mapping, instead of using GPS or magnetometers. During the 2-hour challenge, Cataglyphis traversed over 2.6 km and returned five different samples to its starting position.
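A toy version of such GPS-free dead reckoning can be written by combining wheel-encoder distance with a gyro-derived heading, as sketched below; the actual system additionally fused Lidar and camera data, which this simplified example omits.

    # Toy dead-reckoning sketch: combine wheel-encoder distance with a gyro heading.
    # A real system would also fuse Lidar and camera observations.
    import math

    def update_pose(x, y, heading_rad, encoder_distance_m, gyro_yaw_rate, dt):
        """Propagate a planar pose from one sensor sample to the next."""
        heading_rad += gyro_yaw_rate * dt            # integrate yaw rate from the IMU
        x += encoder_distance_m * math.cos(heading_rad)
        y += encoder_distance_m * math.sin(heading_rad)
        return x, y, heading_rad

    # Example: drive 0.1 m per step while turning slowly.
    pose = (0.0, 0.0, 0.0)
    for _ in range(10):
        pose = update_pose(*pose, encoder_distance_m=0.1, gyro_yaw_rate=0.05, dt=0.1)
    print(pose)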

General-use autonomous robots

The Seekur and MDARS robots demonstrate their autonomous navigation and security capabilities at an airbase.
Sophia, a robot known for human-like appearance and interactions
AMR transfer cart for in-factory transfer needs

The Seekur robot was the first commercially available robot to demonstrate MDARS-like capabilities for general use by airports, utility plants, corrections facilities and Homeland Security.[22]

The DARPA Grand Challenge and DARPA Urban Challenge have encouraged development of even more autonomous capabilities for ground vehicles, while this has been the demonstrated goal for aerial robots since 1990 as part of the AUVSI International Aerial Robotics Competition.

AMR transfer carts developed by Seyiton are used to transfer loads of up to 1500 kilograms inside factories.[23]

Between 2013 and 2017, TotalEnergies held the ARGOS Challenge to develop the first autonomous robot for oil and gas production sites. The robots had to face adverse outdoor conditions such as rain, wind and extreme temperatures.[24]

Some significant current robots include:

  • Sophia is an autonomous robot[25][26] that is known for its human-like appearance and behavior compared to previous robotic variants. As of 2018, Sophia's architecture includes scripting software, a chat system, and OpenCog, an AI system designed for general reasoning.[27] Sophia imitates human gestures and facial expressions and is able to answer certain questions and to make simple conversations on predefined topics (e.g. on the weather).[28] The AI program analyses conversations and extracts data that allows it to improve responses in the future.[29]
  • Nine other humanoid robot "siblings" were also created by Hanson Robotics.[30] Fellow Hanson robots are Alice, Albert Einstein Hubo, BINA48, Han, Jules, Professor Einstein, Philip K. Dick Android, Zeno,[30] and Joey Chaos.[31] Around 2019–20, Hanson released "Little Sophia" as a companion that could teach children how to code, including support for Python, Blockly, and Raspberry Pi.[32]

Military autonomous robots


Lethal autonomous weapons (LAWs) are a type of autonomous robot military system that can independently search for and engage targets based on programmed constraints and descriptions.[33] LAWs are also known as lethal autonomous weapon systems (LAWS), autonomous weapon systems (AWS), robotic weapons, killer robots or slaughterbots.[34] LAWs may operate in the air, on land, on water, under water, or in space. The autonomy of current systems as of 2018 was restricted in the sense that a human gives the final command to attack, though there are exceptions with certain "defensive" systems.

  • UGV Interoperability Profile (UGV IOP), Robotics and Autonomous Systems – Ground IOP (RAS-G IOP), was originally a research program started by the United States Department of Defense (DoD) to organize and maintain open-architecture interoperability standards for Unmanned Ground Vehicles (UGV).[35][36][37][38] The IOP was initially created by the U.S. Army Robotic Systems Joint Project Office (RS JPO).[39][40][41]
  • In October 2019, Textron and Howe & Howe unveiled their Ripsaw M5 vehicle,[42] and on 9 January 2020, the U.S. Army awarded them a contract for the Robotic Combat Vehicle-Medium (RCV-M) program. Four Ripsaw M5 prototypes were to be delivered and used at the company level in late 2021 to determine the feasibility of integrating unmanned vehicles into ground combat operations.[43][44][45] It can reach speeds of more than 40 mph (64 km/h), has a combat weight of 10.5 tons and a payload capacity of 8,000 lb (3,600 kg).[46] The RCV-M is armed with a 30 mm autocannon and a pair of anti-tank missiles. The standard armor package can withstand 12.7×108 mm rounds, with optional add-on armor increasing weight to up to 20 tons. If disabled, it will retain the ability to shoot, with its sensors and radio uplink prioritized to continue transmitting.[47]
  • Crusher is a 13,200-pound (6,000 kg)[48] autonomous off-road Unmanned Ground Combat Vehicle developed by researchers at Carnegie Mellon University's National Robotics Engineering Center for DARPA.[49] It is a follow-up to the previous Spinner vehicle.[50] DARPA's technical name for the Crusher is Unmanned Ground Combat Vehicle and Perceptor Integration System,[51] and the whole project is known by the acronym UPI, which stands for Unmanned Ground Combat Vehicle PerceptOR Integration.[49]
  • CATS Warrior will be an autonomous wingman drone capable of taking off and landing on land and at sea from an aircraft carrier. It will team up with existing IAF fighter platforms such as the Tejas, Su-30 MKI and Jaguar, which will act as its mothership.[52]
  • The Warrior is primarily envisioned for Indian Air Force use, and a similar, smaller version will be designed for the Indian Navy. It would be controlled by the mothership and accomplish tasks such as scouting, absorbing enemy fire, attacking targets with weapons on its internal and external pylons if necessary, or sacrificing itself by crashing into the target.
  • The SGR-A1 is a type of autonomous sentry gun that was jointly developed by Samsung Techwin (now Hanwha Aerospace) and Korea University to assist South Korean troops in the Korean Demilitarized Zone. It is widely considered as the first unit of its kind to have an integrated system that includes surveillance, tracking, firing, and voice recognition.[53] While units of the SGR-A1 have been reportedly deployed, their number is unknown due to the project being "highly classified".[54]

Types of robots


Humanoid


Tesla Robot and NVIDIA GR00T are humanoid robots. Humanoids are machines that are designed to mimic the human form in appearance and behavior. These robots typically have a head, torso, arms, and legs, making them look like humans.

Delivery robot

Main article: Delivery robot
See also: Delivery drone
A food delivery robot

A delivery robot is an autonomous robot used for delivering goods.

Charging robot


An automatic charging robot, unveiled on July 27, 2022, is an arm-shaped robot that charges an electric vehicle. It has been running as a pilot operation at Hyundai Motor Group's headquarters since 2021 and uses a vision AI system based on deep learning. When an electric vehicle is parked in front of the charger, the robot arm recognizes the vehicle's charging port, derives its coordinates, automatically inserts the connector, and starts fast charging. The arm has a vertical multi-joint structure so that it can reach charging ports located differently on each vehicle, and it is waterproof and dustproof.[55]
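The charging sequence described above amounts to a perceive-then-actuate pipeline. The following sketch only illustrates that flow; the detection, coordinate-transform and arm-control calls are hypothetical placeholders, not the actual Hyundai software.

    # Hedged sketch of the charging sequence described above.
    # Detection, transform, and arm-control calls are hypothetical placeholders.

    def charge_parked_vehicle(robot_arm, camera):
        image = camera.capture()
        port = robot_arm.detect_charge_port(image)   # deep-learning port detection
        if port is None:
            return False                             # no charging port visible
        target = robot_arm.to_arm_coordinates(port)  # derive coordinates for the arm
        robot_arm.insert_connector(target)           # plug in the connector
        robot_arm.start_fast_charging()
        return True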

Construction robots


Construction robots are used directly on job sites and perform work such as building, material handling, earthmoving, and surveillance.

Research and education mobile robots


Research and education mobile robots are mainly used during the prototyping phase of building full-scale robots. They are scaled-down versions of larger robots with the same types of sensors, kinematics, and software stack (e.g. ROS). They are often extensible and provide a convenient programming interface and development tools. Besides full-scale robot prototyping, they are also used for education, especially at the university level, where more and more courses on programming autonomous vehicles are being introduced.
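As a concrete example of the kind of programming interface such platforms expose, the following minimal ROS 1 (rospy) node publishes a constant forward velocity. The /cmd_vel topic name is a common convention but varies by robot, so treat this as a sketch rather than a recipe for any specific platform.

    #!/usr/bin/env python
    # Minimal ROS 1 (rospy) example: publish a constant forward velocity.
    # The /cmd_vel topic is a common convention; the actual topic depends on the robot.
    import rospy
    from geometry_msgs.msg import Twist

    def drive_forward():
        rospy.init_node('drive_forward_example')
        pub = rospy.Publisher('/cmd_vel', Twist, queue_size=10)
        rate = rospy.Rate(10)             # publish at 10 Hz
        cmd = Twist()
        cmd.linear.x = 0.2                # 0.2 m/s forward
        while not rospy.is_shutdown():
            pub.publish(cmd)
            rate.sleep()

    if __name__ == '__main__':
        drive_forward()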

Legislation


In March 2016, a bill was introduced in Washington, D.C., allowing pilot ground robotic deliveries.[56] The program was to take place from September 15 through the end of December 2017. The robots were limited to a weight of 50 pounds unloaded and a maximum speed of 10 miles per hour. If a robot stopped moving because of a malfunction, the company was required to remove it from the streets within 24 hours. Only five robots were allowed to be tested per company at a time.[57] A 2017 version of the Personal Delivery Device Act bill was under review as of March 2017.[58]

In February 2017, a bill was passed in the US state of Virginia via the House bill, HB2016,[59] and the Senate bill, SB1207,[60] that will allow autonomous delivery robots to travel on sidewalks and use crosswalks statewide beginning on July 1, 2017. The robots will be limited to a maximum speed of 10 mph and a maximum weight of 50 pounds.[61] In the states of Idaho and Florida there are also talks about passing similar legislation.[62][63]

It has been discussed[by whom?] that robots with characteristics similar to invalid carriages (e.g. a 10 mph maximum and limited battery life) might be a workaround for certain classes of applications. If the robot were sufficiently intelligent and able to recharge itself using the existing electric vehicle (EV) charging infrastructure, it would need only minimal supervision, and a single arm with low dexterity might be enough to enable this function if its visual systems had enough resolution.[citation needed]

In November 2017, the San Francisco Board of Supervisors announced that companies would need to get a city permit in order to test these robots.[64] In addition, the Board banned sidewalk delivery robots from making non-research deliveries.[65]


References

  1. ^Ferrell, Cynthia (March 1994)."Failure Recognition and Fault Tolerance of an Autonomous Robot".Adaptive Behavior.2 (4):375–398.doi:10.1177/105971239400200403.ISSN 1059-7123.S2CID 17611578.
  2. ^Heater, Brian (28 September 2021)."Why Amazon built a home robot".Tech Crunch. Retrieved29 September 2021.
  3. ^Berkvens, Rafael; Rymenants, Wouter; Weyn, Maarten; Sleutel, Simon; Loockx, Willy."Autonomous Wheelchair: Concept and Exploration".AMBIENT 2012 : The Second International Conference on Ambient Computing, Applications, Services and Technologies – viaResearchGate.
  4. ^"Speci-Minder; see elevator and door access"Archived January 2, 2008, at theWayback Machine
  5. ^Bergin, Chris (2014-11-18)."Pad 39A – SpaceX laying the groundwork for Falcon Heavy debut".NASA Spaceflight. Retrieved2014-11-17.
  6. ^Matzliach, Barouch; Ben-Gal, Irad; Kagan, Evgeny (2022)."Detection of Static and Mobile Targets by an Autonomous Agent with Deep Q-Learning Abilities".Entropy.24 (8): 1168.Bibcode:2022Entrp..24.1168M.doi:10.3390/e24081168.PMC 9407070.PMID 36010832.
  7. ^Kagan, E.; Ben-Gal, I. (23 June 2015). Search and Foraging: Individual Motion and Swarm Dynamics (PDF). CRC Press, Taylor and Francis.
  8. ^abBrondmo, Hans Peter."Inside Google's 7-Year Mission to Give AI a Robot Body".Wired.ISSN 1059-1028. Retrieved2025-08-25.
  9. ^"Frontiers | Advancing Autonomous Robots: Challenges and Innovations in Open-World Scene Understanding".www.frontiersin.org. Retrieved2025-08-25.
  10. ^Chung, Jaehoon; Fayyad, Jamil; Younes, Younes Al; Najjaran, Homayoun (2024-02-08)."Learning team-based navigation: a review of deep reinforcement learning techniques for multi-agent pathfinding".Artificial Intelligence Review.57 (2): 41.doi:10.1007/s10462-023-10670-6.ISSN 1573-7462.
  11. ^Majid, Amjad Yousef; van Rietbergen, Tomas; Prasad, R Venkatesha (2024-08-23)."Challenging Conventions Towards Reliable Robot Navigation Using Deep Reinforcement Learning".Computing&AI Connect.1 (1): 1.doi:10.69709/CAIC.2024.194188.ISSN 3104-4719.Archived from the original on 2025-07-09. Retrieved2025-08-25.
  12. ^Wijayathunga, Liyana; Rassau, Alexander; Chai, Douglas (2023-08-31)."Challenges and Solutions for Autonomous Ground Robot Scene Understanding and Navigation in Unstructured Outdoor Environments: A Review".Applied Sciences.13 (17): 9877.doi:10.3390/app13179877.ISSN 2076-3417.
  13. ^Dery, Mikaela (2018-02-16)."10 big robotics challenges that need to be solved in the next 10 years".create digital. Retrieved2025-08-25.
  14. ^ab"Client Challenge".www.ft.com. Retrieved2025-08-25.
  15. ^Herold, Eve."How Smart Should Robots Be?".TIME.Archived from the original on 2025-07-24. Retrieved2025-08-25.
  16. ^Hawkins, Amy (2025-04-21)."Humanoid workers and surveillance buggies: 'embodied AI' is reshaping daily life in China".The Guardian.ISSN 0261-3077. Retrieved2025-08-25.
  17. ^Gomez, Brandon (24 August 2021). "Elon Musk warned of a ‘Terminator’-like AI apocalypse — now he’s building a Tesla robot". CNBC.
  18. ^Convention on Prohibitions or Restrictions on the Use of Certain Conventional Weapons Which May Be Deemed to Be Excessively Injurious or to Have Indiscriminate Effects, July 14, 2021, UN Official website at undocs.org.
  19. ^Ingalis-Arkell, Esther. "The Very First Robot Brains Were Made of Old Alarm Clocks" Archived 2018-09-08 at the Wayback Machine, 7 March 2012.
  20. ^Norman, Jeremy, "The First Electronic Autonomous Robots: the Origin of Social Robotics (1948 – 1949)", Jeremy Norman & Co., Inc., 2004–2018.
  21. ^Hall, Loura (2016-09-08)."NASA Awards $750K in Sample Return Robot Challenge". Retrieved2016-09-17.
  22. ^"Weapons Makers Unveil New Era of Counter-Terror Equipment", Fox News
  23. ^" Autonomous Mobile Robots (AMR)", Seyiton
  24. ^"Enhanced Safety Thanks to the ARGOS Challenge".Total Website. Archived fromthe original on 16 January 2018. Retrieved13 May 2017.
  25. ^"Photographing a robot isn't just point and shoot".Wired. March 29, 2018.Archived from the original on December 25, 2018. RetrievedOctober 10, 2018.
  26. ^"Hanson Robotics Sophia".Hanson Robotics.Archived from the original on November 19, 2017. RetrievedOctober 26, 2017.
  27. ^"The complicated truth about Sophia the robot — an almost human robot or a PR stunt".CNBC. 5 June 2018.Archived from the original on May 12, 2020. Retrieved17 May 2020.
  28. ^"Hanson Robotics in the news".Hanson Robotics.Archived from the original on November 12, 2017. RetrievedOctober 26, 2017.
  29. ^"Charlie Rose interviews ... a robot?".CBS 60 Minutes. June 25, 2017.Archived from the original on October 29, 2017. RetrievedOctober 28, 2017.
  30. ^ab"The first-ever robot citizen has 7 humanoid 'siblings' — here's what they look like".Business Insider.Archived from the original on January 4, 2018. RetrievedJanuary 4, 2018.
  31. ^White, Charlie."Joey the Rocker Robot, More Conscious Than Some Humans".Gizmodo.Archived from the original on December 22, 2017. RetrievedJanuary 4, 2018.
  32. ^Wiggers, Kyle (January 30, 2019)."Hanson Robotics debuts Little Sophia, a robot companion that teaches kids to code".VentureBeat.Archived from the original on August 9, 2020. RetrievedApril 2, 2020.
  33. ^Crootof, Rebecca (2015)."The Killer Robots Are Here: Legal and Policy Implications".Cardozo L. Rev.36: 1837 – via heinonline.org.
  34. ^Johnson, Khari (31 January 2020)."Andrew Yang warns against 'slaughterbots' and urges global ban on autonomous weaponry".venturebeat.com.VentureBeat. Retrieved31 January 2020.
  35. ^Robotics and Autonomous Systems – Ground (RAS-G) Interoperability Profile (IOP) (Version 2.0 ed.). Warren, Michigan, USA: US Army Project Manager, Force Projection (PM FP). 2016. Archived fromthe original on 2018-09-02. Retrieved2022-02-27.
  36. ^"U.S. Army Unveils Common UGV Standards".Aviation Week Network. Penton. 10 January 2012. Retrieved25 April 2017.
  37. ^Serbu, Jared (14 August 2014)."Army turns to open architecture to plot its future in robotics".Federal News Radio. Retrieved28 April 2017.
  38. ^Demaitre, Eugene."Military Robots Use Interoperability Profile for Mobile Arms". Robotics Business Review. Archived fromthe original on August 14, 2020. Retrieved14 July 2016.
  39. ^Mazzara, Mark (2011)."RS JPO Interoperability Profiles". Warren, Michigan: U.S. Army RS JPO. Retrieved20 March 2017.[dead link]
  40. ^Mazzara, Mark (2014)."UGV Interoperability Profiles (IOPs) Update for GVSETS"(PDF). Warren, Michigan: U.S. Army PM FP. Retrieved20 March 2017.[permanent dead link]
  41. ^Demaitre, Eugene (14 July 2016)."Military Robots Use Interoperability Profile for Mobile Arms".Robotics Business Review. EH Publishing. Retrieved28 April 2017.[permanent dead link]
  42. ^Textron Rolls Out Ripsaw Robot For RCV-Light … And RCV-Medium.Breaking Defense. 14 October 2019.
  43. ^US Army picks winners to build light and medium robotic combat vehicles.Defense News. 9 January 2020.
  44. ^GVSC, NGCV CFT announces RCV Light and Medium award selections.Army.mil. 10 January 2020.
  45. ^Army Picks 2 Firms to Build Light and Medium Robotic Combat Vehicles.Military.com. 14 January 2020.
  46. ^Army Setting Stage for New Unmanned Platforms.National Defense Magazine. 10 April 2020.
  47. ^Meet The Army’s Future Family Of Robot Tanks: RCV.Breaking Defense. 9 November 2020.
  48. ^"UPI: UGCV PerceptOR Integration"(PDF) (Press release). Carnegie Mellon University. Archived fromthe original(PDF) on 16 December 2013. Retrieved18 November 2010.
  49. ^ab"Carnegie Mellon's National Robotics Engineering Center Unveils Futuristic Unmanned Ground Combat Vehicles"(PDF) (Press release). Carnegie Mellon University. April 28, 2006. Archived fromthe original(PDF) on 22 September 2010. Retrieved18 November 2010.
  50. ^"Crusher Unmanned Ground Combat Vehicle Unveiled"(PDF) (Press release). Defense Advanced Research Projects Agency. April 28, 2006. Archived fromthe original(PDF) on 12 January 2011. Retrieved18 November 2010.
  51. ^Sharkey, Noel."Grounds for Discrimination: Autonomous Robot Weapons"(PDF).RUSI: Challenges of Autonomous Weapons: 87. Archived fromthe original(PDF) on 28 September 2011. Retrieved18 November 2010.
  52. ^"Strikes from 700km away to drones replacing mules for ration at 15,000ft, India gears up for unmanned warfare – India News".indiatoday.in. 4 February 2021. Retrieved22 February 2021.
  53. ^Kumagai, Jean (March 1, 2007)."A Robotic Sentry For Korea's Demilitarized Zone". IEEE Spectrum.
  54. ^Rabiroff, Jon (July 12, 2010)."Machine Gun Toting Robots Deployed On DMZ". Stars and Stripes. Archived fromthe original on April 6, 2018.
  55. ^"Robotics Lifestyle Innovation Brought by Robots".HyundaiMotorGroup Tech. August 2, 2022. Archived fromthe original on August 3, 2022. RetrievedAugust 3, 2022.
  56. ^"B21-0673 – Personal Delivery Device Act of 2016". Archived fromthe original on 2017-03-06. Retrieved2017-03-05.
  57. ^Fung, Brian (24 June 2016)."It's official: Drone delivery is coming to D.C. in September" – via www.washingtonpost.com.
  58. ^"B22-0019 – Personal Delivery Device Act of 2017". Archived fromthe original on 2017-03-06. Retrieved2017-03-05.
  59. ^"HB 2016 Electric personal delivery devices; operation on sidewalks and shared-use paths".
  60. ^"SB 1207 Electric personal delivery devices; operation on sidewalks and shared-use paths".
  61. ^"Virginia is the first state to pass a law allowing robots to deliver straight to your door". March 2017.
  62. ^"Could delivery robots be on their way to Idaho?". Archived fromthe original on 2017-03-03. Retrieved2017-03-02.
  63. ^Florida senator proposes rules for tiny personal delivery robots January 25, 2017
  64. ^Simon, Matt (6 December 2017)."San Francisco Just Put the Brakes on Delivery Robots".Wired. Retrieved6 December 2017.
  65. ^Brinklow, Adam (6 December 2017)."San Francisco bans robots from most sidewalks".Curbed. Retrieved6 December 2017.
