
Oumi: Making AI Models Fully Open Source
I want to talk about a new platform aiming to make AI models fully open source. They've got 1,229 stars on GitHub as of today (2 February 2025).
Oumi: Open Universal Machine Intelligence
Oumi is a fully open-source platform designed to simplify the lifecycle of foundation models, from data preparation and training to evaluation and deployment. Whether you're working on a small-scale project or training large models, Oumi offers the necessary tools and workflows with one consistent API.
About Oumi
Oumi was founded by former Google and Apple AI engineers, aiming to address the transparency and accessibility challenges in AI research. The company is backed by 13 leading research universities, including Princeton, Stanford, MIT, UC Berkeley, the University of Oxford, the University of Cambridge, the University of Waterloo, and Carnegie Mellon.
Unlike traditional AI development models, Oumi adopts a distributed approach by leveraging university clusters and cloud-based computing instead of relying on massive centralized data centers. With a $10 million seed round, the company is focused on building a truly open and collaborative AI ecosystem.
According to Manos Koukoumidis, Oumi's CEO and a former Google Cloud AI senior engineering manager, existing AI models like DeepSeek and Llama offer limited transparency. Oumi's goal is to remove these barriers by providing complete access to model architectures, training data, and methodologies.
Key Features
- 🚀 Scalability: Train models from 10M to 405B parameters with state-of-the-art techniques.
- 🤖 Model Variety: Supports Llama, DeepSeek, Qwen, Phi, and more.
- 🔄 Data Synthesis & Curation: Use LLM judges for dataset refinement.
- ⚡ Optimized Inference: Deploy with vLLM and SGLang engines.
- 📊 Evaluation: Benchmark models efficiently.
- 🌎 Cross-Platform Compatibility: Run on laptops, clusters, or cloud services like AWS, Azure, GCP, and Lambda.
- 🔌 API Integration: Works with OpenAI, Anthropic, Vertex AI, and more.
🚀 Getting Started
Installation
Oumi is easy to install and set up:
# Install the package (CPU & NPU only)
pip install oumi

# OR, with GPU support (requires an Nvidia or AMD GPU)
pip install oumi[gpu]

# Install the latest version from source
git clone https://github.com/oumi-ai/oumi.git
cd oumi
pip install .
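To confirm the install worked, a quick sanity check is usually enough. The commands below assume the package exposes a __version__ attribute and that the CLI follows the usual --help convention; neither is spelled out in the docs above, so treat this as a sketch.

# Sanity check (assumes __version__ exists and the CLI supports --help)
python -c "import oumi; print(oumi.__version__)"
oumi --help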
For additional installation options, check out the Installation Guide.
Basic Usage
Once installed, you can start training, evaluating, and running inference with Oumi using simple CLI commands.
🔧 Training a Model
oumi train -c configs/recipes/smollm/sft/135m/quickstart_train.yaml
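If you just want to experiment, you can usually tweak a few values without editing the YAML. The dotted-path overrides below are only a sketch of that pattern; the key names are assumptions, so verify them against the fields actually defined in quickstart_train.yaml.

# Sketch: override a couple of training settings from the command line
# (key names are assumed; check quickstart_train.yaml for the real fields)
oumi train -c configs/recipes/smollm/sft/135m/quickstart_train.yaml \
  --training.max_steps 10 \
  --training.output_dir output/smollm-135m-sft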
📋 Evaluating a Model
oumi evaluate -c configs/recipes/smollm/evaluation/135m/quickstart_eval.yaml
🤖 Running Inference
oumi infer -c configs/recipes/smollm/inference/135m_infer.yaml --interactive
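Putting the three commands together, a minimal end-to-end quickstart on the SmolLM 135M recipe looks like this (the config paths are the same ones used above):

# Train, evaluate, then chat with the resulting model
oumi train -c configs/recipes/smollm/sft/135m/quickstart_train.yaml
oumi evaluate -c configs/recipes/smollm/evaluation/135m/quickstart_eval.yaml
oumi infer -c configs/recipes/smollm/inference/135m_infer.yaml --interactive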
Running Jobs Remotely
Oumi supports remote training on cloud platforms like AWS, GCP, and Azure.
# Deploy on AWS
oumi launch up -c configs/recipes/smollm/sft/135m/quickstart_aws_job.yaml
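Once the job is launched, you will normally want to monitor it and shut the cluster down afterwards. The subcommands and the cluster name below are assumptions meant only as a sketch of that lifecycle; check oumi launch --help and the Cloud Deployment Guide for the exact options your version supports.

# Monitor and clean up (subcommand names and cluster name are assumed)
oumi launch status
oumi launch down --cluster my-oumi-cluster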
More details on cloud integrations can be found in the Cloud Deployment Guide.
🛠 Advanced Features
🏗 Fine-Tuning with LoRA
Oumi supports LoRA, QLoRA, and DPO for efficient fine-tuning of large models.
oumi train -c configs/recipes/llama3_1/sft/8b_lora/train.yaml
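An 8B LoRA run generally needs more than one GPU. As a rough sketch, the same recipe could be launched through standard PyTorch distributed tooling; the command below assumes a single node with 4 GPUs and that oumi train can be started as a Python module, so confirm the distributed launch command Oumi actually recommends in its docs.

# Multi-GPU LoRA sketch (assumes 1 node, 4 GPUs, and module-style launch via torchrun)
torchrun --standalone --nproc-per-node=4 -m oumi train \
  -c configs/recipes/llama3_1/sft/8b_lora/train.yaml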
⚡ Inference with vLLM
For efficient inference at scale:
oumi infer -c configs/recipes/llama3_1/inference/8b_rvllm_infer.yaml
📈 Evaluating Across Benchmarks
Oumi provides built-in evaluation tools for model benchmarking.
oumi evaluate -c configs/recipes/llama3_1/evaluation/8b_eval.yaml
🌟 Why Choose Oumi?
- Zero Boilerplate: Start quickly with pre-configured recipes.
- Enterprise-Grade: Designed for large-scale model training.
- Research-Friendly: Easily reproducible experiments.
- Broad Model Support: Works with small to massive models.
- Optimized Performance: Supports distributed training.
- Open Source: Community-driven, free to use.
Oumi's radical approach contrasts with AI giants like OpenAI, which invest billions in massive infrastructure projects. Oumi aims to prove that high-quality AI models can be built and trained without centralized data centers, using collaborative university-based computing.
For more details, visit the official Oumi Documentation.
Top comments (2)

I was very excited to learn about Oumi last week. It has the potential to become as important as HuggingFace in the coming months and years!! HuggingFace had sort of become the central hub for all collaboration between Open Source AI developers, and I think it's a good thing that there's now Oumi as well.

Yeah, let's see...