ethz-asl/vgn

Real-time 6 DOF grasp detection in clutter.

VGN is a 3D convolutional neural network for real-time 6 DOF grasp pose detection. The network accepts a Truncated Signed Distance Function (TSDF) representation of the scene and outputs a volume of the same spatial resolution, where each cell contains the predicted quality, orientation, and width of a grasp executed at the center of the voxel. The network is trained on a synthetic grasping dataset generated with physics simulation.
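For intuition about this interface, here is a minimal sketch, assuming a 40³ voxel grid and PyTorch; it only mirrors the input/output shapes described above, not the actual architecture defined in this repository.

import torch
import torch.nn as nn

# Minimal stand-in for the real network, for shape intuition only: single 3D
# convolutions map a TSDF volume to per-voxel outputs of the same resolution.
N = 40                                    # assumed voxel grid resolution
tsdf = torch.rand(1, 1, N, N, N)          # (batch, channel, depth, height, width)

quality_head = nn.Conv3d(1, 1, kernel_size=5, padding=2)    # grasp quality per voxel
rotation_head = nn.Conv3d(1, 4, kernel_size=5, padding=2)    # quaternion per voxel
width_head = nn.Conv3d(1, 1, kernel_size=5, padding=2)       # gripper width per voxel

quality = torch.sigmoid(quality_head(tsdf))                      # (1, 1, N, N, N)
rotation = nn.functional.normalize(rotation_head(tsdf), dim=1)   # (1, 4, N, N, N)
width = width_head(tsdf)                                         # (1, 1, N, N, N)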

[overview figure]

This repository contains the implementation of the following publication:

  • M. Breyer, J. J. Chung, L. Ott, R. Siegwart, and J. Nieto. Volumetric Grasping Network: Real-time 6 DOF Grasp Detection in Clutter. Conference on Robot Learning (CoRL 2020), 2020. [pdf][video]

If you use this work in your research, please cite accordingly.

The next sections provide instructions for getting started with VGN.

Installation

The following instructions were tested with python3.8 on Ubuntu 20.04. A ROS installation is only required for visualizations and interfacing hardware; simulations and network training work without it. The Robot Grasping section describes the setup for robotic experiments in more detail.

OpenMPI is optionally used to distribute the data generation over multiple cores/machines.

sudo apt install libopenmpi-dev

Clone the repository into the src folder of a catkin workspace.

git clone https://github.com/ethz-asl/vgn

Create and activate a new virtual environment.

cd /path/to/vgn
python3 -m venv --system-site-packages .venv
source .venv/bin/activate

Install the Python dependencies within the activated virtual environment.

pip install -r requirements.txt

Build and source the catkin workspace,

catkin build vgn
source /path/to/catkin_ws/devel/setup.zsh

or alternatively install the project locally in "editable" mode using pip.

pip install -e .

Finally, download the data folder here, then unzip and place it in the repo's root.

Data Generation

Generate raw synthetic grasping trials using the pybullet physics simulator.

python scripts/generate_data.py data/raw/foo --scene pile --object-set blocks [--num-grasps=...] [--sim-gui]
  • python scripts/generate_data.py -h prints a list with all the options.
  • mpirun -np <num-workers> python ... will run multiple simulations in parallel.

The script will create the following file structure within data/raw/foo, which can be inspected as sketched below the list:

  • grasps.csv contains the configuration, label, and associated scene for each grasp,
  • scenes/<scene_id>.npz contains the synthetic sensor data of each scene.
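A quick way to sanity-check the raw output is to read these files directly. The sketch below assumes only pandas and numpy; the exact column names in grasps.csv and the arrays stored in each .npz file are defined by the generation script, so the snippet prints them rather than relying on specific keys.

from pathlib import Path

import numpy as np
import pandas as pd

root = Path("data/raw/foo")

# Grasp configuration, label, and associated scene for each grasp.
grasps = pd.read_csv(root / "grasps.csv")
print(grasps.columns.tolist())            # inspect the available fields
print(len(grasps), "grasps")

# Synthetic sensor data of one scene, stored as numpy arrays.
scene_file = next((root / "scenes").glob("*.npz"))
scene = np.load(scene_file)
print(scene_file.name, "->", list(scene.keys()))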

Clean the generated grasp configurations using the data.ipynb notebook.

Finally, generate the voxel grids/grasp targets required to train VGN.

python scripts/construct_dataset.py data/raw/foo data/datasets/foo
  • Samples of the dataset can be visualized with the vis_sample.py script and vgn.rviz configuration. The script includes the option to apply a random affine transform to the input/target pair to check the data augmentation procedure.
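The same consistency idea can be reproduced outside the provided tooling. The sketch below uses scipy on dummy volumes and only rotates the grids; a full augmentation would also transform the per-voxel grasp orientations, for which vis_sample.py and the dataset code remain the reference.

import numpy as np
from scipy.ndimage import rotate

rng = np.random.default_rng(0)

# Dummy stand-ins for a TSDF input grid and its per-voxel grasp-quality target.
tsdf = rng.random((40, 40, 40)).astype(np.float32)
target = (rng.random((40, 40, 40)) > 0.99).astype(np.float32)

# Apply the SAME random rotation about the z-axis to input and target so the
# pair stays consistent, which is the property the augmentation check is after.
angle = rng.uniform(0.0, 360.0)
tsdf_aug = rotate(tsdf, angle, axes=(0, 1), reshape=False, order=1, mode="nearest")
target_aug = rotate(target, angle, axes=(0, 1), reshape=False, order=0, mode="nearest")

print(tsdf_aug.shape, target_aug.shape)   # both remain (40, 40, 40)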

Network Training

python scripts/train_vgn.py --dataset data/datasets/foo [--augment]

Training and validation metrics are logged to TensorBoard and can be accessed with

tensorboard --logdir data/runs
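After training (or with the pretrained data/models/vgn_conv.pth from the data folder), the checkpoint can be inspected or loaded for offline inference. This is a sketch assuming a standard PyTorch checkpoint; whether it contains a state_dict or a fully serialized model depends on how it was saved, so both cases are shown.

import torch

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
checkpoint = torch.load("data/models/vgn_conv.pth", map_location=device)

if isinstance(checkpoint, dict):
    # state_dict case: instantiate the network class from this repository first,
    # then call net.load_state_dict(checkpoint).
    print("state_dict with", len(checkpoint), "tensors")
else:
    # fully serialized model case (requires the repository's modules on the path)
    net = checkpoint.to(device).eval()
    print(net)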

Simulated Grasping

Run simulated clutter removal experiments.

python scripts/sim_grasp.py --model data/models/vgn_conv.pth [--sim-gui] [--rviz]
  • python scripts/sim_grasp.py -h prints a complete list of optional arguments.
  • To detect grasps using GPD, you first need to install and launch the gpd_ros node (roslaunch vgn gpd.launch).

Use the clutter_removal.ipynb notebook to compute metrics and visualize failure cases of an experiment.

Robot Grasping

This package contains an example of open-loop grasp execution with a Franka Emika Panda and a wrist-mounted Intel Realsense D435. Since the robot drivers are not officially supported on ROS Noetic yet, we used the following workaround:

  • Launch the roscore and hardware drivers on a NUC with libfranka installed.
  • Run MoveIt and the VGN scripts on a second computer with a ROS Noetic installation connected to the same roscore, following these instructions. This requires the latest version of panda_moveit_config.

First, on the NUC, start a roscore and launch the robot and sensor drivers:

roscore &
roslaunch vgn panda_grasp.launch

Then, on the Ubuntu 20.04 computer, run

roslaunch panda_moveit_config move_group.launch
python scripts/panda_grasp.py --model data/models/vgn_conv.pth

Citing

@inproceedings{breyer2020volumetric,
  title={Volumetric Grasping Network: Real-time 6 DOF Grasp Detection in Clutter},
  author={Breyer, Michel and Chung, Jen Jen and Ott, Lionel and Siegwart, Roland and Nieto, Juan},
  booktitle={Conference on Robot Learning},
  year={2020},
}

