Evolution Strategies in JAX 🦎


Tired of having to handle asynchronous processes for neuroevolution? Do you want to leverage massive vectorization and high-throughput accelerators for Evolution Strategies? evosax provides a comprehensive, high-performance library that implements Evolution Strategies (ES) in JAX. By leveraging XLA compilation and JAX's transformation primitives, evosax enables researchers and practitioners to efficiently scale evolutionary algorithms to modern hardware accelerators without the traditional overhead of distributed implementations.

The API follows the classical ask-eval-tell cycle of ES, with full support for JAX's transformations (`jit`, `vmap`, `lax.scan`). The library includes 30+ evolution strategies, from classics like CMA-ES and Differential Evolution to modern approaches like OpenAI-ES and Diffusion Evolution.

Get started here 👉 Colab

Basic evosax API Usage 🍲

```python
import jax
from evosax.algorithms import CMA_ES

# Instantiate the search strategy
es = CMA_ES(population_size=32, solution=dummy_solution)
params = es.default_params

# Initialize state
key = jax.random.key(0)
state = es.init(key, params)

# Ask-Eval-Tell loop
for i in range(num_generations):
    key, key_ask, key_eval = jax.random.split(key, 3)

    # Generate a set of candidate solutions to evaluate
    population, state = es.ask(key_ask, state, params)

    # Evaluate the fitness of the population
    fitness = ...

    # Update the evolution strategy
    state = es.tell(population, fitness, state, params)

# Get best solution
state.best_solution, state.best_fitness
```
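
Because the ask-eval-tell calls are functional (state in, state out), the whole loop can be compiled and rolled out on-device. The following is a minimal sketch, not taken from the official docs, that wraps one generation in `jax.lax.scan`, jit-compiles the rollout, and evaluates the population with `jax.vmap`. The `sphere` objective, `num_generations`, and the 10-dimensional `dummy_solution` are illustrative assumptions; the ES calls follow the API shown above.

```python
import jax
import jax.numpy as jnp
from evosax.algorithms import CMA_ES

def sphere(solution):
    # Toy objective: minimize the squared L2 norm (assumed for illustration).
    return jnp.sum(solution ** 2)

num_generations = 100                 # illustrative choice
dummy_solution = jnp.zeros(10)        # illustrative 10-dimensional solution template

es = CMA_ES(population_size=32, solution=dummy_solution)
params = es.default_params

key = jax.random.key(0)
state = es.init(key, params)

def generation_step(carry, _):
    key, state = carry
    key, key_ask = jax.random.split(key)
    population, state = es.ask(key_ask, state, params)
    # Evaluate all candidates in parallel with vmap.
    fitness = jax.vmap(sphere)(population)
    state = es.tell(population, fitness, state, params)
    return (key, state), fitness

@jax.jit
def run(key, state):
    # One XLA-compiled rollout over all generations via lax.scan.
    return jax.lax.scan(generation_step, (key, state), length=num_generations)

(key, state), fitness_log = run(key, state)
print(state.best_fitness)
```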

Implemented Evolution Strategies 🦎

| Strategy | Reference | Import | Example |
|---|---|---|---|
| Simple Evolution Strategy | Rechenberg (1978) | `SimpleES` | Colab |
| OpenAI-ES | Salimans et al. (2017) | `Open_ES` | Colab |
| CMA-ES | Hansen & Ostermeier (2001) | `CMA_ES` | Colab |
| Sep-CMA-ES | Ros & Hansen (2008) | `Sep_CMA_ES` | Colab |
| xNES | Wierstra et al. (2014) | `XNES` | Colab |
| SNES | Wierstra et al. (2014) | `SNES` | Colab |
| MA-ES | Bayer & Sendhoff (2017) | `MA_ES` | Colab |
| LM-MA-ES | Loshchilov et al. (2017) | `LM_MA_ES` | Colab |
| Rm-ES | Li & Zhang (2017) | `Rm_ES` | Colab |
| PGPE | Sehnke et al. (2010) | `PGPE` | Colab |
| ARS | Mania et al. (2018) | `ARS` | Colab |
| ESMC | Merchant et al. (2021) | `ESMC` | Colab |
| Persistent ES | Vicol et al. (2021) | `PersistentES` | Colab |
| Noise-Reuse ES | Li et al. (2023) | `NoiseReuseES` | Colab |
| CR-FM-NES | Nomura & Ono (2022) | `CR_FM_NES` | Colab |
| Guided ES | Maheswaranathan et al. (2018) | `GuidedES` | Colab |
| ASEBO | Choromanski et al. (2019) | `ASEBO` | Colab |
| Discovered ES | Lange et al. (2023a) | `DES` | Colab |
| Learned ES | Lange et al. (2023a) | `LES` | Colab |
| EvoTF | Lange et al. (2024) | `EvoTF_ES` | Colab |
| iAMaLGaM-Full | Bosman et al. (2013) | `iAMaLGaM_Full` | Colab |
| iAMaLGaM-Univariate | Bosman et al. (2013) | `iAMaLGaM_Univariate` | Colab |
| Gradientless Descent | Golovin et al. (2019) | `GLD` | Colab |
| Simulated Annealing | Rasdi Rere et al. (2015) | `SimAnneal` | Colab |
| Hill Climbing | Rasdi Rere et al. (2015) | `SimAnneal` | Colab |
| Random Search | Bergstra & Bengio (2012) | `RandomSearch` | Colab |
| SV-CMA-ES | Braun et al. (2024) | `SV_CMA_ES` | Colab |
| SV-OpenAI-ES | Liu et al. (2017) | `SV_OpenES` | Colab |
| Simple Genetic Algorithm | Such et al. (2017) | `SimpleGA` | Colab |
| MR15-GA | Rechenberg (1978) | `MR15_GA` | Colab |
| SAMR-GA | Clune et al. (2008) | `SAMR_GA` | Colab |
| GESMR-GA | Kumar et al. (2022) | `GESMR_GA` | Colab |
| LGA | Lange et al. (2023b) | `LGA` | Colab |
| Diffusion Evolution | Zhang et al. (2024) | `DiffusionEvolution` | Colab |
| Differential Evolution | Storn & Price (1997) | `DE` | Colab |
| Particle Swarm Optimization | Kennedy & Eberhart (1995) | `PSO` | Colab |
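
Every strategy in the table is used through the same ask-eval-tell interface, so switching algorithms only changes the import and constructor. The snippet below is a sketch under that assumption, swapping in Differential Evolution (`DE`); the toy sphere objective and `dummy_solution` are illustrative placeholders, not part of the library.

```python
import jax
import jax.numpy as jnp
from evosax.algorithms import DE  # any other import from the table works the same way

dummy_solution = jnp.zeros(10)    # illustrative placeholder
es = DE(population_size=32, solution=dummy_solution)
params = es.default_params

key = jax.random.key(0)
state = es.init(key, params)

key, key_ask = jax.random.split(key)
population, state = es.ask(key_ask, state, params)
fitness = jax.vmap(lambda x: jnp.sum(x ** 2))(population)  # toy sphere objective
state = es.tell(population, fitness, state, params)
```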

Installation ⏳

You will need Python 3.10 or later, and a working JAX installation.

Then, install evosax from PyPI:

pip install evosax

To upgrade to the latest development version of evosax directly from GitHub, you can use:

pip install git+https://github.com/RobertTLange/evosax.git@main
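
To verify that the installation succeeded, a quick check (assuming a standard pip install) is:

```python
# Print the installed evosax version using only standard-library metadata.
from importlib.metadata import version

import evosax  # should import without errors

print(version("evosax"))
```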

Examples 📖

Key Features 💎

  • Comprehensive Algorithm Collection: 30+ classic and modern evolution strategies with a unified API
  • JAX Acceleration: Fully compatible with JAX transformations for speed and scalability
  • Vectorization & Parallelization: Fast execution on CPUs, GPUs, and TPUs
  • Production Ready: Well-tested, documented, and used in research environments
  • Batteries Included: Comes with optimizers like ClipUp, fitness shaping, and restart strategies (fitness shaping is sketched below)
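
As a standalone illustration of what fitness shaping means (a conceptual sketch, not evosax's internal API), the centered-rank transform popularized by OpenAI-ES (Salimans et al., 2017) replaces raw fitness values with their ranks rescaled to [-0.5, 0.5], making the update invariant to the scale of the objective:

```python
# Conceptual sketch of centered-rank fitness shaping; not evosax's own implementation.
import jax.numpy as jnp

def centered_rank(fitness):
    # Rank each fitness value (rank 0 = smallest, i.e. best for minimization).
    ranks = jnp.argsort(jnp.argsort(fitness))
    # Rescale ranks to the interval [-0.5, 0.5].
    return ranks / (fitness.shape[0] - 1) - 0.5

fitness = jnp.array([10.0, -3.0, 0.5, 1e6])
print(centered_rank(fitness))  # ranks rescaled to [-0.5, 0.5], robust to the 1e6 outlier
```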

Related Resources 📚

Citingevosax ✏️

If you use evosax in your research, please cite the following paper:

```bibtex
@article{evosax2022github,
  author  = {Robert Tjarko Lange},
  title   = {evosax: JAX-based Evolution Strategies},
  journal = {arXiv preprint arXiv:2212.04180},
  year    = {2022},
}
```

Acknowledgements 🙏

We acknowledge financial support by the Google TRC and the Deutsche Forschungsgemeinschaft (DFG, German Research Foundation) under Germany's Excellence Strategy - EXC 2002/1 "Science of Intelligence" - project number 390523135.

Contributing 👷

Contributions are welcome! If you find a bug or are missing your favorite feature, please open an issue or submit a pull request following our contribution guidelines 🤗.

Disclaimer⚠️

This repository contains independent reimplementations of LES and DES and is unrelated to Google DeepMind. The implementations have been tested to reproduce the official results on a range of tasks.

