testdrivenio/fastapi-ml

Deploying an ML model to Heroku with FastAPI.
Check out the tutorial.
Build and tag the Docker image:
$ docker build -t fastapi-prophet .
Spin up the container:
$ docker run --name fastapi-ml -e PORT=8008 -p 8008:8008 -d fastapi-prophet:latest
Train the model:
$ docker exec -it fastapi-ml python

>>> from model import train, predict, convert
>>> train()
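From the same REPL you can also generate and inspect a forecast. A minimal sketch, assuming predict takes a ticker symbol and convert reshapes the raw Prophet output into a plain dict (check model.py for the exact signatures):

>>> prediction_list = predict("MSFT")  # assumed: returns Prophet forecast rows for the ticker
>>> convert(prediction_list)           # assumed: maps dates to predicted prices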
Test:
$ curl \
  --header "Content-Type: application/json" \
  --request POST \
  --data '{"ticker":"MSFT"}' \
  http://localhost:8008/predict
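The same request can be issued from Python instead of curl. A minimal sketch using the third-party requests library; the payload mirrors the curl call above, while the exact keys in the JSON response depend on the app's response model:

import requests

# POST the same JSON body the curl example sends
response = requests.post(
    "http://localhost:8008/predict",
    json={"ticker": "MSFT"},
)
print(response.status_code)
print(response.json())  # forecast payload; shape depends on convert()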
Create and activate a virtual environment:
$ python3 -m venv venv && source venv/bin/activate
Install the requirements:
(venv)$ pip install -r requirements.txt
Train the model:
(venv)$ python

>>> from model import train, predict, convert
>>> train()
Run the app:
(venv)$ uvicorn main:app --reload --workers 1 --host 0.0.0.0 --port 8008
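For orientation, the sketch below shows roughly how a main.py could wire the model helpers to the /predict route exercised in the test that follows. The StockIn/StockOut schema names and the error handling are assumptions for illustration, not necessarily the repository's exact code:

from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

from model import convert, predict

app = FastAPI()

class StockIn(BaseModel):
    ticker: str

class StockOut(StockIn):
    forecast: dict

@app.post("/predict", response_model=StockOut)
def get_prediction(payload: StockIn):
    prediction_list = predict(payload.ticker)
    # assumed: predict() returns a falsy value when no trained model exists yet
    if not prediction_list:
        raise HTTPException(status_code=400, detail="Model not found.")
    return {"ticker": payload.ticker, "forecast": convert(prediction_list)}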
Test:
$ curl \
  --header "Content-Type: application/json" \
  --request POST \
  --data '{"ticker":"MSFT"}' \
  http://localhost:8008/predict
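You can also exercise the route with FastAPI's built-in test client rather than curl. A minimal sketch, run after training the model in the step above; the asserted ticker field is an assumption about the response shape:

from fastapi.testclient import TestClient

from main import app

client = TestClient(app)

def test_predict():
    response = client.post("/predict", json={"ticker": "MSFT"})
    assert response.status_code == 200
    assert response.json()["ticker"] == "MSFT"  # assumed response field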