An open toolkit for composable, automatic, and scalable learning
Composable: quickly assemble your applications.
Automatic: automatically tune your models.
Scalable: efficiently train your large models.
For machine learning in the real world.
Examples
Scale Across GPUs with Minimal Coding
A novel TensorFlow training engine for distributed deep learning
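For a flavor of what "minimal coding" means here, the sketch below mirrors a small Keras model across the local GPUs with plain TensorFlow's tf.distribute API. It only illustrates the general multi-GPU training pattern under assumed toy settings (MNIST data, an arbitrary two-layer model, example batch size); it is not the CASL training engine's own API.

# A minimal sketch of scaling training across GPUs with a few extra lines,
# using standard TensorFlow (tf.distribute.MirroredStrategy).
# Illustrative only; NOT the CASL/AutoDist engine's own API.
import tensorflow as tf

# MirroredStrategy replicates the model on every local GPU and
# synchronizes gradients across the replicas automatically.
strategy = tf.distribute.MirroredStrategy()

with strategy.scope():
    # Variables created inside the strategy scope are mirrored on all devices.
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(128, activation="relu"),
        tf.keras.layers.Dense(10),
    ])
    model.compile(
        optimizer="adam",
        loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
        metrics=["accuracy"],
    )

# Toy dataset for illustration; model.fit splits each global batch
# across the available replicas.
(x_train, y_train), _ = tf.keras.datasets.mnist.load_data()
x_train = x_train.reshape(-1, 784).astype("float32") / 255.0
model.fit(x_train, y_train, batch_size=256, epochs=2)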
CASL Updates
Latest updates and news about CASL



AdaptDL and AutoDist Tutorial (AAAI 2021)
Simplifying and Automating Parallel Machine Learning via a Programmable and Composable Parallel ML System



Texar and Forte Tutorial (KDD 2020)
Learning from All Types of Experiences: A Unifying Machine Learning Perspective
Research and Technology

OSDI 2022
Alpa: Automating Inter- and Intra-Operator Parallelism for Distributed Deep Learning

AAAI 2021
BANANAS: Bayesian Optimization with Neural Architectures for Neural Architecture Search

Journal of Machine Learning Research (JMLR), 2020
Tuning Hyperparameters without Grad Students: Scalable and Robust Bayesian Optimisation with Dragonfly

AAAI 2021
On Trustworthiness of ML Algorithms -- and implications in AI-driven healthcare