BRAX

WARNING: Only brax/training is actively being maintained as of 0.13.0. Instead of brax/envs, users should use MuJoCo Playground, all of which train well with brax/training. If you want to use Brax for physics simulation, please use MJX, available at github.com/google-deepmind/mujoco (pip install mujoco_mjx), or MuJoCo Warp, rather than Brax as a wrapper to MuJoCo physics simulation. We may repurpose brax purely as an RL library in the future.

Brax is a fast and fully differentiable physics engine used for research and development of robotics, human perception, materials science, reinforcement learning, and other simulation-heavy applications.

Brax is written in JAX and is designed for use on acceleration hardware. It is both efficient for single-device simulation, and scalable to massively parallel simulation on multiple devices, without the need for pesky datacenters.
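As a rough illustration of what batched simulation looks like in user code (a minimal sketch, assuming the brax.envs.create entry point and the bundled 'ant' environment; the batch size and zero actions are purely illustrative):

import jax
import jax.numpy as jnp
from brax import envs

# Create a single environment, then vectorize reset/step over a batch of
# independent simulations with jax.vmap and compile with jax.jit.
env = envs.create(env_name='ant', backend='positional')
num_envs = 4096  # illustrative batch size
keys = jax.random.split(jax.random.PRNGKey(0), num_envs)

reset_fn = jax.jit(jax.vmap(env.reset))
step_fn = jax.jit(jax.vmap(env.step))

state = reset_fn(keys)                            # batched simulation state
action = jnp.zeros((num_envs, env.action_size))   # placeholder actions
state = step_fn(state, action)                    # one physics step for every env at once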

Brax simulates environments at millions of physics steps per second on TPU, and includes a suite of learning algorithms that train agents in seconds to minutes.
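For example, training a policy with the PPO implementation in brax/training can look roughly like this (a sketch only; the hyperparameters are illustrative values in the spirit of the training colab, not a recommended configuration):

from brax import envs
from brax.training.agents.ppo import train as ppo

env = envs.create(env_name='ant', backend='positional')

# train() returns a factory for the inference function, the trained
# parameters, and training metrics.
make_inference_fn, params, metrics = ppo.train(
    environment=env,
    num_timesteps=50_000_000,
    episode_length=1000,
    num_envs=2048,
    unroll_length=5,
    num_minibatches=32,
    batch_size=1024,
    learning_rate=3e-4,
    seed=0,
)
inference_fn = make_inference_fn(params)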

One API, Four Pipelines

Brax offers four distinct physics pipelines that are easy to swap:

  • MuJoCo XLA (MJX): a JAX reimplementation of the MuJoCo physics engine.
  • Generalized: calculates motion in generalized coordinates using dynamics algorithms similar to those in MuJoCo and DART.
  • Positional: uses Position Based Dynamics, a fast and stable method popular in the graphics community.
  • Spring: uses simple point-based physics held together by springs.

These pipelines share the same API and can run side by side within the same simulation. This makes Brax well suited for experiments in transfer learning and for closing the gap between simulation and the real world.
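As a sketch of what swapping pipelines can look like (the backend strings passed to brax.envs.create are assumptions for illustration, not spelled out above):

import jax
import jax.numpy as jnp
from brax import envs

rng = jax.random.PRNGKey(0)
for backend in ('generalized', 'positional', 'spring', 'mjx'):
    # Same environment, same API -- only the physics pipeline changes.
    env = envs.create(env_name='ant', backend=backend)
    state = jax.jit(env.reset)(rng)
    state = jax.jit(env.step)(state, jnp.zeros(env.action_size))
    print(backend, float(state.reward))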

Quickstart: Colab in the Cloud

Explore Brax easily and quickly through a series of colab notebooks:

  • Brax Basics introduces the Brax API, and shows how to simulate basic physics primitives.
  • Brax Training introduces Brax's training algorithms, and lets you train your own policies directly within the colab. It also demonstrates loading and saving policies (see the sketch after this list).
  • Brax Training with MuJoCo XLA - MJX demonstrates training in Brax using the MJX physics simulator.
  • Brax Training with PyTorch on GPU demonstrates how Brax can be used in other ML frameworks for fast training, in this case PyTorch.
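
For reference, saving and restoring trained parameters outside the colab can look like this (a minimal sketch using brax.io.model; the path and the params object from a brax/training run are illustrative):

from brax.io import model

# Persist parameters returned by a brax/training algorithm ...
model.save_params('/tmp/ant_policy_params', params)

# ... and load them back later, e.g. to rebuild an inference function.
params = model.load_params('/tmp/ant_policy_params')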

Using Brax Locally

To install Brax from PyPI, run:

python3 -m venv env
source env/bin/activate
pip install --upgrade pip
pip install brax

You may also install from Conda or Mamba:

conda install -c conda-forge brax  # s/conda/mamba for mamba

Alternatively, to install Brax from source, clone this repo, cd to it, and then:

python3 -m venv env
source env/bin/activate
pip install --upgrade pip
pip install -e .

To train a model:

learn

Training on an NVIDIA GPU is supported, but you must first install CUDA, cuDNN, and JAX with GPU support.
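
A quick way to confirm that JAX can actually see the GPU before launching a training run (a small sanity check, not part of the Brax setup itself):

import jax

# Should list GPU devices; if only CPU entries appear, JAX was
# installed without CUDA support.
print(jax.devices())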

Learn More

For a deep dive into Brax's design and performance characteristics, please see our paper, Brax -- A Differentiable Physics Engine for Large Scale Rigid Body Simulation, which appeared in the Datasets and Benchmarks Track at NeurIPS 2021.

Citing Brax

If you would like to reference Brax in a publication, please use:

@software{brax2021github,
  author = {C. Daniel Freeman and Erik Frey and Anton Raichuk and Sertan Girgin and Igor Mordatch and Olivier Bachem},
  title = {Brax - A Differentiable Physics Engine for Large Scale Rigid Body Simulation},
  url = {http://github.com/google/brax},
  version = {0.14.1},
  year = {2021},
}

Acknowledgements

Brax has come a long way since its original publication. We offer gratitude and effusive praise to the following people:

  • Manu Orsini and Nikola Momchev, who provided a major refactor of Brax's training algorithms to make them more accessible and reusable.
  • Erwin Coumans, who has graciously offered advice and mentorship, and many useful references from Tiny Differentiable Simulator.
  • Baruch Tabanpour, a colleague who helped launch brax v2 and overhauled the contact library.
  • Shixiang Shane Gu and Hiroki Furuta, who contributed BIG-Gym and Braxlines, and a scene composer to Brax.
  • Our awesome open source collaborators and contributors. Thank you!
