Falco et al., 2012 - Google Patents

Data fusion based on optical technology for observation of human manipulation


Document ID: 687134722314586959
Authors: Falco P, De Maria G, Natale C, Pirozzi S
Publication year: 2012
Publication venue: International Journal of Optomechatronics

Snippet

The adoption of human observation is becoming increasingly frequent within imitation-learning and programming-by-demonstration (PbD) approaches to robot programming. For robotic systems equipped with anthropomorphic hands, the observation phase is very …
Continue reading at www.tandfonline.com (PDF) (other versions)
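The snippet above describes fusing data from optical sensing to observe human hand manipulation. As a purely illustrative sketch (not the method of this paper), fusing two noisy estimates of the same finger-joint angle — say one optical, one from a wearable sensor — can be as simple as a fixed-weight complementary filter; the function name, weight, and values below are hypothetical:

```python
# Illustrative only: fixed-weight fusion of two hypothetical
# finger-joint angle estimates (degrees). Not the paper's algorithm.

def fuse(optical_deg, wearable_deg, alpha=0.7):
    """Weighted average of two angle estimates; alpha weights the
    optical measurement (assumed the less noisy source here)."""
    return alpha * optical_deg + (1.0 - alpha) * wearable_deg

# Example: optical estimate 30.0 deg, wearable estimate 34.0 deg
fused = fuse(30.0, 34.0)
print(round(fused, 1))  # 31.2
```

In practice, approaches like those surveyed in this literature replace the fixed weight with a filter whose gain reflects each sensor's noise characteristics (e.g., a Kalman filter), but the weighted-combination idea is the same.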

Classifications

The classifications are assigned by a computer and are not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the classifications listed.
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06F ELECTRICAL DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterized by the transducing means
    • G06F3/042 Digitisers, e.g. for touch screens or touch pads, characterized by the transducing means by opto-electronic means
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06F ELECTRICAL DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0354 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G06F3/03545 Pens or stylus
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06F ELECTRICAL DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0354 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G06F3/03547 Touch pads, in which fingers can move on a surface
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06F ELECTRICAL DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/0304 Detection arrangements using opto-electronic means
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06F ELECTRICAL DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/014 Hand-worn input/output arrangements, e.g. data gloves
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06F ELECTRICAL DIGITAL DATA PROCESSING
    • G06F2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/033 Indexing scheme relating to G06F3/033
    • G06F2203/0331 Finger worn pointing device
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00 Program-control systems
    • G05B2219/30 NC systems
    • G05B2219/40 Robotics, robotics mapping to robotics vision

Similar Documents

Lee et al. Visual-inertial hand motion tracking with robustness against occlusion, interference, and contact
Palli et al. The DEXMART hand: Mechatronic design and experimental evaluation of synergy-based control for human-like grasping
US10976863B1 (en) Calibration of inertial measurement units in alignment with a skeleton model to control a computer system based on determination of orientation of an inertial measurement unit from an image of a portion of a user
EP3903080B1 (en) Tactile sensor
KR20200110784A (en) Magnetic sensor-based proximity detection
Kondo et al. Recognition of in-hand manipulation using contact state transition for multifingered robot hand control
Baldi et al. Using inertial and magnetic sensors for hand tracking and rendering in wearable haptics
Su et al. Sensor fusion-based anthropomorphic control of under-actuated bionic hand in dynamic environment
Falco et al. Data fusion based on optical technology for observation of human manipulation
Wang et al. Whole-body teleoperation control of dual-arm robot using sensor fusion
Pan et al. A sensor glove for the interaction with a nursing-care assistive robot
Cavallo et al. Optoelectronic joint angular sensor for robotic fingers
Zhang et al. ADG-Net: A Sim2Real multimodal learning framework for adaptive dexterous grasping
Luo et al. Enhancing Human–Robot Collaboration: Supernumerary Robotic Limbs for Object Balance
Hernoux et al. Investigation of dynamic 3D hand motion reproduction by a robot using a Leap Motion
Gilday et al. A Vision-Based Collocated Actuation-Sensing Scheme for a Compliant Tendon-Driven Robotic Hand
Haschke Grasping and manipulation of unknown objects based on visual and tactile feedback
KR20210035669A (en) Hand motion tracking system and method for safety education of driven agricultural machinery based on virtual reality
Denz et al. A high-accuracy, low-budget Sensor Glove for Trajectory Model Learning
Jin et al. A Vision-Based Motion Retargeting for Teleoperation of Dexterous Robotic Hands
Metta et al. Force control and reaching movements on the iCub humanoid robot
KR102203933B1 (en) Method and apparatus for motion capture interface using multiple fingers
Wang et al. Finger Tracking for Human Computer Interface Using Multiple Sensor Data Fusion
Veber et al. Assessing joint angles in human hand via optical tracking device and calibrating instrumented glove
Huang et al. HandCept: A Visual-Inertial Fusion Framework for Accurate Proprioception in Dexterous Hands
