beam-cloud/beta9
Ultrafast serverless GPU inference, sandboxes, and background jobs
Beam is a fast, open-source runtime for serverless AI workloads. It gives you a Pythonic interface to deploy and scale AI applications with zero infrastructure overhead.
- Fast Image Builds: Launch containers in under a second using a custom container runtime
- Parallelization and Concurrency: Fan out workloads to 100s of containers
- First-Class Developer Experience: Hot-reloading, webhooks, and scheduled jobs
- Scale-to-Zero: Workloads are serverless by default
- Volume Storage: Mount distributed storage volumes
- GPU Support: Run on our cloud (4090s, H100s, and more) or bring your own GPUs
```shell
pip install beam-client
```
- Create an account here
- Follow our Getting Started Guide
Spin up isolated containers to run LLM-generated code:
```python
from beam import Image, Sandbox

sandbox = Sandbox(image=Image()).create()
response = sandbox.process.run_code("print('I am running remotely')")
print(response.result)
```
Create an autoscaling endpoint for your custom model:
```python
from beam import Image, endpoint
from beam import QueueDepthAutoscaler

@endpoint(
    image=Image(python_version="python3.11"),
    gpu="A10G",
    cpu=2,
    memory="16Gi",
    autoscaler=QueueDepthAutoscaler(max_containers=5, tasks_per_container=30),
)
def handler():
    return {"label": "cat", "confidence": 0.97}
```
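The autoscaler above scales on queue depth: with `tasks_per_container=30` and `max_containers=5`, container count tracks the number of pending tasks. The scaling rule is conceptually similar to this sketch (our own illustration of the math, not Beam's internal implementation):

```python
import math

def desired_containers(queue_depth: int, tasks_per_container: int, max_containers: int) -> int:
    """Illustrative queue-depth scaling rule: one container per
    `tasks_per_container` pending tasks, capped at `max_containers`.
    An empty queue scales to zero containers."""
    if queue_depth <= 0:
        return 0
    return min(max_containers, math.ceil(queue_depth / tasks_per_container))

# With the settings above (tasks_per_container=30, max_containers=5):
print(desired_containers(0, 30, 5))     # 0 (scale to zero)
print(desired_containers(45, 30, 5))    # 2
print(desired_containers(1000, 30, 5))  # 5 (capped at max_containers)
```

Because workloads are serverless by default, a queue depth of zero means zero running containers, which is what makes scale-to-zero free when idle.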
Schedule resilient background tasks (or replace your Celery queue) by adding a simple decorator:
```python
from beam import Image, TaskPolicy, schema, task_queue

class Input(schema.Schema):
    image_url = schema.String()

@task_queue(
    name="image-processor",
    image=Image(python_version="python3.11"),
    cpu=1,
    memory=1024,
    inputs=Input,
    task_policy=TaskPolicy(max_retries=3),
)
def my_background_task(input: Input, *, context):
    image_url = input.image_url
    print(f"Processing image: {image_url}")
    return {"image_url": image_url}

if __name__ == "__main__":
    # Invoke a background task from your app (without deploying it)
    my_background_task.put(image_url="https://example.com/image.jpg")

    # You can also deploy this behind a versioned endpoint with:
    # beam deploy app.py:my_background_task --name image-processor
```
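`TaskPolicy(max_retries=3)` means a task that raises an error is retried up to three more times before being marked failed. The retry behavior is conceptually like the loop below (a hand-rolled sketch for illustration, not Beam's actual scheduler):

```python
def run_with_retries(task, max_retries: int):
    """Illustrative retry loop: run the task once, then retry up to
    `max_retries` more times, re-raising the last error if all attempts fail."""
    last_error = None
    for attempt in range(1 + max_retries):
        try:
            return task()
        except Exception as exc:
            last_error = exc
            print(f"Attempt {attempt + 1} failed: {exc}")
    raise last_error

# A flaky task that succeeds on its third attempt:
attempts = {"count": 0}

def flaky():
    attempts["count"] += 1
    if attempts["count"] < 3:
        raise RuntimeError("transient failure")
    return "done"

print(run_with_retries(flaky, max_retries=3))  # done
```

This is the property that makes task queues a drop-in for a Celery-style setup: transient failures are absorbed by the policy rather than by hand-written retry code in every worker.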
Beta9 is the open-source engine powering Beam, our fully-managed cloud platform. You can self-host Beta9 for free or choose managed cloud hosting through Beam.
We welcome contributions big or small. These are the most helpful things for us:
- Submit a feature request or bug report
- Open a PR with a new feature or improvement