stepjam/ARM

Q-attention (within the ARM system) and coarse-to-fine Q-attention (within the C2F-ARM system).

Codebase of Q-attention, coarse-to-fine Q-attention, and other variants. Code from the following papers:

- Q-attention: Enabling Efficient Learning for Vision-based Robotic Manipulation (ARM)
- Coarse-to-Fine Q-attention: Efficient Learning for Visual Robotic Manipulation via Discretisation (C2F-ARM)
- Coarse-to-Fine Q-attention with Learned Path Ranking (C2F-ARM+LPR)
- Coarse-to-Fine Q-attention with Tree Expansion (C2F-ARM+QTE)


Installation

ARM is trained using the YARR framework and evaluated on RLBench 1.1.0.

Install all of the project requirements:

# Create conda environment
conda create -n arm python=3.8
# Activate the environment before installing anything into it
conda activate arm

# Install PyTorch 2.0. Go to the PyTorch website to install other versions.
conda install pytorch=2.0 torchvision torchaudio pytorch-cuda=11.7 -c pytorch -c nvidia

# Install YARR
pip install git+https://github.com/stepjam/YARR.git

# Install CoppeliaSim 4.1.0 for Ubuntu 20.04
# Refer to the PyRep README for other versions
export COPPELIASIM_ROOT=${HOME}/.local/bin/CoppeliaSim
curl -O https://www.coppeliarobotics.com/files/CoppeliaSim_Edu_V4_1_0_Ubuntu20_04.tar.xz
mkdir -p $COPPELIASIM_ROOT && tar -xf CoppeliaSim_Edu_V4_1_0_Ubuntu20_04.tar.xz -C $COPPELIASIM_ROOT --strip-components 1

# Add environment variables into bashrc (or zshrc)
echo "export COPPELIASIM_ROOT=$COPPELIASIM_ROOT
export LD_LIBRARY_PATH=\$LD_LIBRARY_PATH:\$COPPELIASIM_ROOT
export QT_QPA_PLATFORM_PLUGIN_PATH=\$COPPELIASIM_ROOT" >> ~/.bashrc

# Install PyRep
git clone https://github.com/stepjam/PyRep.git .local/PyRep
cd .local/PyRep
pip install -r requirements.txt
pip install .
cd ../..

# Install RLBench
git clone https://github.com/stepjam/RLBench.git .local/RLBench
cd .local/RLBench
pip install -r requirements.txt
pip install .
cd ../..

# Install ARM dependencies
pip install -r requirements.txt
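As a quick sanity check (not part of the original instructions), the installed packages should now import from within the conda environment:

# Optional: verify that YARR, PyRep, and RLBench resolve in the arm environment.
# This only checks imports; actually launching CoppeliaSim still needs a display.
python -c "import yarr, pyrep, rlbench; print('imports ok')"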

Running experiments

Be sure to have RLBench demos saved on your machine before proceeding. To generate demos for a task, go to the tools directory in RLBench (rlbench/tools) and run:

python dataset_generator.py --save_path=/mnt/my/save/dir \
  --tasks=take_lid_off_saucepan --image_size=128,128 \
  --renderer=opengl --episodes_per_task=100 --variations=1 --processes=1
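If generation succeeds, the demos should appear under the save directory, organised per task and variation; a rough sketch, assuming RLBench's usual task/variation/episodes layout:

# Assumed layout, based on the structure RLBench's dataset generator writes
ls /mnt/my/save/dir/take_lid_off_saucepan/variation0/episodes
# episode0  episode1  ...  episode99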

Experiments are launched via Hydra. To start training C2F-ARM on the take_lid_off_saucepan task with the default parameters on GPU 0:

python launch.py method=C2FARM rlbench.task=take_lid_off_saucepan rlbench.demo_path=/mnt/my/save/dir framework.gpu=0
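Since launching goes through Hydra, standard Hydra syntax such as multirun sweeps should also work; a sketch (pick_up_cup is an illustrative second task and assumes you have generated demos for it):

# Hydra multirun (-m) sweeps over comma-separated override values
python launch.py -m method=C2FARM rlbench.task=take_lid_off_saucepan,pick_up_cup \
  rlbench.demo_path=/mnt/my/save/dir framework.gpu=0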

To launch C2F-ARM+LPR:

python launch.py method=LPR rlbench.task=take_lid_off_saucepan rlbench.demo_path=/mnt/my/save/dir framework.gpu=0

To launch C2F-ARM+QTE:

python launch.py method=QTE rlbench.task=take_lid_off_saucepan rlbench.demo_path=/mnt/my/save/dir framework.gpu=0
