iCub

From Wikipedia, the free encyclopedia

Open-source humanoid robot testbed for research
iCub
An iCub robot in Trento, Italy, in 2018. The robot is 104 cm tall and weighs around 22 kg.
Manufacturer: Italian Institute of Technology
Country: Italy
Year of creation: 2009–present
Type: Humanoid robot
Purpose: Research, recreational
Website: www.icub.org

iCub software
Developer: Italian Institute of Technology
Initial release: 2009
Stable release: 2.10.0 / 4 September 2025
Written in: C++[1]
Operating systems: Free/libre: Linux, FreeBSD, NetBSD, OpenBSD; non-free: OS X, Windows
Type: Artificial intelligence, robotics
License: GNU GPL / GNU LGPL[2] (free software)
Website: github.com/robotology/icub-main

iCub is a one-meter-tall open-source humanoid robot testbed for research into human cognition and artificial intelligence.

It was designed by the RobotCub Consortium of several European universities, built by the Italian Institute of Technology, and is now supported by other projects such as ITALK.[3] The robot is open source, with the hardware design, software and documentation all released under the GPL license. The name is a partial acronym, cub standing for Cognitive Universal Body. Initial funding for the project was €8.5 million from Unit E5 – Cognitive Systems and Robotics – of the European Commission's Seventh Framework Programme, and ran for 65 months from 1 September 2004 until 31 January 2010.

The motivation behind the strongly humanoid design is the embodied cognition hypothesis: that human-like manipulation plays a vital role in the development of human cognition. A baby learns many cognitive skills by interacting with its environment and other humans using its limbs and senses, and consequently its internal model of the world is largely determined by the form of the human body. The robot was designed to test this hypothesis by allowing cognitive learning scenarios to be acted out by an accurate reproduction of the perceptual system and articulation of a small child, so that it can interact with the world in the same way such a child does.[4]

Specifications

An iCub at a live demo making facial expressions

The dimensions of the iCub are similar to those of a 3.5-year-old child. The robot is controlled by an on-board PC104 controller, which communicates with the actuators and sensors over a CAN bus.

It utilises tendon-driven joints for the hands and shoulders, with the fingers flexed by Teflon-coated cable tendons running inside Teflon-coated tubes and pulling against spring returns. Joint angles are measured using custom-designed Hall-effect sensors, and the robot can be equipped with torque sensors. The fingertips can be equipped with tactile touch sensors, and a distributed capacitive sensor skin is being developed.

The software library is largely written in C++ and uses YARP for external communication over Gigabit Ethernet with off-board software that implements higher-level functionality, the development of which has been taken over by the RobotCub Consortium.[4] The robot was not designed for autonomous operation and is consequently not equipped with the on-board batteries or processors this would require; instead, an umbilical cable provides power and a network connection.[4]
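The following is a minimal sketch of how an off-board module might talk to the robot under this YARP-based scheme: a C++ client that subscribes to a camera stream over the network. It is an illustration only; the port names (/icub/cam/left for the robot's left camera, /viewer/cam:i for the local reader) and the presence of a running YARP name server and image publisher are assumptions of the example rather than part of the cited specification.

    // Minimal sketch of an off-board YARP client reading one of the iCub's camera streams.
    // Assumptions: a YARP name server is reachable, and a robot or simulator is publishing
    // RGB images on /icub/cam/left; /viewer/cam:i is an arbitrary local port name.
    #include <cstdio>
    #include <yarp/os/Network.h>
    #include <yarp/os/BufferedPort.h>
    #include <yarp/sig/Image.h>

    int main() {
        yarp::os::Network yarp;                      // initialise YARP networking for this process
        if (!yarp::os::Network::checkNetwork()) {    // is a name server reachable?
            std::printf("No YARP name server found\n");
            return 1;
        }

        // Open a local input port and connect it to the robot's camera output port.
        yarp::os::BufferedPort<yarp::sig::ImageOf<yarp::sig::PixelRgb>> port;
        port.open("/viewer/cam:i");
        yarp::os::Network::connect("/icub/cam/left", "/viewer/cam:i");

        // Read a few frames over the network and report their dimensions.
        for (int i = 0; i < 10; ++i) {
            yarp::sig::ImageOf<yarp::sig::PixelRgb>* img = port.read();  // blocking read
            if (img != nullptr) {
                std::printf("frame %d: %dx%d\n", i, (int)img->width(), (int)img->height());
            }
        }

        port.close();
        return 0;
    }

Higher-level functions such as gaze control, grasping and force control are organized in the same way, as separate off-board modules exchanging data with the robot through their own YARP ports; this decoupling is what allows the heavy computation to stay off the robot while the umbilical carries only power and the Ethernet link.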

In its final version, the robot has 53 actuated degrees of freedom, organized as follows (a tally appears after the list):

  • 7 in each arm
  • 9 in each hand (3 for the thumb, 2 for the index, 2 for the middle finger, 1 for the coupled ring and little finger, 1 for the adduction/abduction)
  • 6 in the head (3 for the neck and 3 for the cameras)
  • 3 in the torso/waist
  • 6 in each leg
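For readers checking the figure, the per-group counts above do sum to 53 (2×7 + 2×9 + 6 + 3 + 2×6). A trivial sketch of the tally (the group names are labels chosen for this example, not identifiers from the iCub software):

    // Tally of the iCub's 53 actuated degrees of freedom, grouped as in the list above.
    #include <cstdio>

    int main() {
        const int arms  = 2 * 7;   // 7 DOF in each arm
        const int hands = 2 * 9;   // 9 DOF in each hand
        const int head  = 6;       // 3 for the neck, 3 for the cameras
        const int torso = 3;       // torso/waist
        const int legs  = 2 * 6;   // 6 DOF in each leg

        std::printf("total actuated DOF: %d\n", arms + hands + head + torso + legs);  // prints 53
        return 0;
    }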

The head has stereo cameras in a swivel mounting where the eyes would be located on a human, and microphones on the sides. It also has lines of red LEDs, representing the mouth and eyebrows, mounted behind the face panel for making facial expressions.

Since the first robots were constructed, the design has undergone several revisions and improvements, for example smaller and more dexterous hands[5] and lighter, more robust legs with greater joint angles that permit walking rather than just crawling.[6]

Capabilities of iCub

iCub at an exhibition in 2014

The iCub has been demonstrated successfully performing the following tasks, among others:

  • crawling, using visual guidance with optical markers on the floor[7]
  • solving complex 3D mazes[8][9]
  • archery, shooting arrows with a bow and learning to hit the center of the target[10][11]
  • facial expressions, allowing the iCub to express emotions[12]
  • force control, exploiting proximal force/torque sensors[13]
  • grasping small objects, such as balls, plastic bottles, etc.[14]
  • collision avoidance in non-static environments, as well as self-collision avoidance[15][16][17]

iCubs in the world

An iCub robot mounted on a supporting frame

These robots were built by the Istituto Italiano di Tecnologia (IIT) in Genoa and are used by a small but lively community of scientists studying embodied cognition in artificial systems. There are about thirty iCubs in various laboratories, mainly in the European Union, with one in the United States.[18] The first researcher in North America to be granted an iCub was Stephen E. Levinson, for studies of computational models of the brain and mind and of language acquisition.[19]

The robots are constructed by IIT and cost about €250,000[20] each, depending upon the version.[21] Most of the financial support comes from the European Commission's Unit E5 or from the Istituto Italiano di Tecnologia (IIT) via its iCub Facility department.[18] The development and construction of the iCub at IIT features in the independent documentary film Plug & Pray, released in 2010.[22]

References

  1. ^ iCub Source Code
  2. ^ "iCub". Retrieved 27 November 2019. "The iCub is distributed as Open Source following the GPL/LGPL licenses and can now count on a worldwide community of enthusiastic developers."
  3. ^ "An open source cognitive humanoid robotic platform". Official iCub website. Retrieved 30 July 2010.
  4. ^ a b c Metta, Giorgio; Sandini, Giulio; Vernon, David; Natale, Lorenzo; Nori, Francesco (2008). The iCub humanoid robot: an open platform for research in embodied cognition (PDF). PerMIS'08. Retrieved 1 January 2018.
  5. ^ June, Laura (12 March 2010). "iCub gets upgraded with tinier hands, better legs". Engadget. Retrieved 30 July 2010.
  6. ^ Tsagarakis, N. G.; Vanderborght, Bram; Laffranchi, Matteo; Caldwell, D. G. The Mechanical Design of the New Lower Body for the Child Humanoid robot 'iCub' (PDF). IEEE International Conference on Robotics and Automation (ICRA 2009). Archived from the original (PDF) on 20 July 2011. Retrieved 30 July 2010.
  7. ^ Crawling. iCub HumanoidRobot. 16 April 2010. Archived from the original on 11 June 2016. Retrieved 18 February 2022 – via YouTube.
  8. ^ Nath, Vishnu; Levinson, Stephen. Learning to Fire at Targets by an iCub Humanoid Robot. AAAI Spring Symposium 2013: Designing Intelligent Robots: Reintegrating AI II. Archived from the original on 4 March 2016. Retrieved 29 September 2013.
  9. ^ iCub autonomously solving a puzzle. Vishnu Nath. 8 March 2013. Archived from the original on 16 May 2016. Retrieved 18 February 2022 – via YouTube.
  10. ^ Kormushev, Petar; Calinon, Sylvain; Saegusa, Ryo; Metta, Giorgio. Learning the skill of archery by a humanoid robot iCub (PDF). IEEE International Conference on Humanoid Robots (Humanoids 2010). Retrieved 19 March 2011.
  11. ^ Robot Archer iCub. Petar Kormushev. 22 September 2010. Archived from the original on 22 January 2021. Retrieved 18 February 2022 – via YouTube.
  12. ^ iCub facial expressions. Vislab Lisboa. 17 March 2009. Archived from the original on 5 September 2020. Retrieved 18 February 2022 – via YouTube.
  13. ^ Force control exploiting proximal force/torque sensors - pt.2. iCub HumanoidRobot. 11 October 2010. Archived from the original on 9 June 2016. Retrieved 18 February 2022 – via YouTube.
  14. ^ "Toward Intelligent Humanoids". iCub manipulating a variety of objects. Archived from the original on 10 March 2014. Retrieved 22 July 2013.
  15. ^ Frank, Mikhail; Leitner, Jürgen; Stollenga, Marijn; Kaufmann, Gregor; Harding, Simon; Förster, Alexander; Schmidhuber, Jürgen. The Modular Behavioral Environment for Humanoids & other Robots (MoBeE) (PDF). 9th International Conference on Informatics in Control, Automation and Robotics (ICINCO). Archived from the original (PDF) on 8 April 2014. Retrieved 8 April 2014.
  16. ^ Leitner, Jürgen 'Juxi'; Harding, Simon; Frank, Mikhail; Förster, Alexander; Schmidhuber, Jürgen. Transferring Spatial Perception Between Robots Operating in a Shared Workspace (PDF). IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2012). Archived from the original (PDF) on 8 April 2014. Retrieved 8 April 2014.
  17. ^ Stollenga, Marijn; Pape, Leo; Frank, Mikhail; Leitner, Jürgen; Förster, Alexander; Schmidhuber, Jürgen. Task-Relevant Roadmaps: A Framework for Humanoid Motion Planning. IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2013).
  18. ^ a b "The iCub humanoid robot project". Istituto Italiano di Tecnologia (IIT). Retrieved 1 January 2018.
  19. ^ "Humanoid Robot Learns Like a Child". Discovery News. Retrieved 11 February 2013.
  20. ^ "XE: (EUR/USD) Euro to US Dollar Rate". www.xe.com. Retrieved 20 November 2015.
  21. ^ "Archived copy". iCub website. Archived from the original on 17 February 2018. Retrieved 30 July 2010.
  22. ^ Plug & Pray, documentary film about the social impact of robots and related ethical questions.

External links

Wikimedia Commons has media related to iCub.