DEV Community

Daniel Azevedo
Lesson 11 – Getting Started with TensorFlow, Docker, and Kubernetes

Hi devs,

In this post, I'll guide you through the process of creating a simple machine learning app using TensorFlow, containerizing it with Docker, and deploying it on Kubernetes. Don't worry if you're new to these tools; I'll break everything down step by step.

Step 1: Building a Basic TensorFlow Model in Python

Let's kick things off by writing a simple TensorFlow app that trains a model using the MNIST dataset, which is a classic dataset for handwritten digit recognition.

Here’s the Python code to get started:

```python
# app.py
import tensorflow as tf
from tensorflow.keras import datasets, layers, models

# Load and preprocess the MNIST dataset
(train_images, train_labels), (test_images, test_labels) = datasets.mnist.load_data()
train_images, test_images = train_images / 255.0, test_images / 255.0

# Add a trailing channel dimension so the images match the
# Conv2D input shape of (28, 28, 1)
train_images = train_images[..., tf.newaxis]
test_images = test_images[..., tf.newaxis]

# Build a simple convolutional neural network
model = models.Sequential([
    layers.Conv2D(32, (3, 3), activation='relu', input_shape=(28, 28, 1)),
    layers.MaxPooling2D((2, 2)),
    layers.Flatten(),
    layers.Dense(64, activation='relu'),
    layers.Dense(10, activation='softmax')
])

# Compile and train the model
model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])
model.fit(train_images, train_labels, epochs=1)

# Evaluate the model
test_loss, test_acc = model.evaluate(test_images, test_labels)
print(f'Test accuracy: {test_acc}')
```

This script does a few things:

  1. Loads the MNIST dataset and normalizes it.
  2. Builds a simple convolutional neural network (CNN).
  3. Trains the model on the training data and evaluates it on the test data.

After running this code, you should see the model train for one epoch, and then it will print out the test accuracy.
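As a quick sanity check on the preprocessing step, here's a small standalone sketch (using random data in place of MNIST, so it runs without downloading anything) of what the normalization and reshaping do to the input:

```python
import numpy as np

# Fake batch of 4 grayscale 28x28 images with pixel values 0-255,
# standing in for the MNIST arrays (an assumption for illustration)
images = np.random.randint(0, 256, size=(4, 28, 28)).astype(np.float64)

# Scale pixel values into [0, 1], as app.py does
images = images / 255.0

# Add the trailing channel dimension expected by Conv2D
images = images[..., np.newaxis]

print(images.shape)  # (4, 28, 28, 1)
```

The same two transforms apply to both the training and test sets, which is why app.py performs them before building the model.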

Step 2: Dockerizing the TensorFlow App

Now, let's containerize this Python app using Docker. This ensures the app runs consistently across different environments.

First, create a Dockerfile in the same directory as your app.py:

```dockerfile
# Dockerfile
FROM tensorflow/tensorflow:2.7.0
WORKDIR /app
COPY . .
CMD ["python", "app.py"]
```

This Dockerfile is pretty straightforward:

  • It starts from an official TensorFlow image (tensorflow/tensorflow:2.7.0).
  • Sets /app as the working directory.
  • Copies all the files in the current directory into the image.
  • Runs app.py when the container starts.

To build and run the Docker container, use the following commands:

```shell
# Build the Docker image
docker build -t tensorflow-app .

# Run the Docker container
docker run tensorflow-app
```

Once the container starts, the TensorFlow app will run inside the container, and you'll see the output in your terminal.

Step 3: Deploying with Kubernetes

Now that our app is containerized, the next step is deploying it to a Kubernetes cluster. We’ll use a basic YAML file to describe the deployment.

Here’s a simple kubernetes.yaml for deploying the TensorFlow app:

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: tensorflow-app
spec:
  replicas: 1
  selector:
    matchLabels:
      app: tensorflow-app
  template:
    metadata:
      labels:
        app: tensorflow-app
    spec:
      containers:
      - name: tensorflow-app
        image: tensorflow-app:latest
        ports:
        - containerPort: 80
```

This configuration defines:

  • A Deployment named tensorflow-app with 1 replica.
  • The pod runs the Docker image tensorflow-app:latest.
  • containerPort 80 is declared for when the app serves traffic; note that our training script doesn't actually listen on a port yet, so this is a placeholder for a future serving API.
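One gotcha before deploying: the image tensorflow-app:latest only exists on the machine where you ran docker build, and a cluster will normally try (and fail) to pull it from a remote registry. If you're testing on a local cluster such as minikube or kind (an assumption about your setup), one common approach is to load the image into the cluster and tell Kubernetes not to pull it, by adjusting the container spec:

```yaml
# In the container spec of kubernetes.yaml, for local clusters only:
      containers:
      - name: tensorflow-app
        image: tensorflow-app:latest
        imagePullPolicy: Never   # use the locally loaded image; don't contact a registry
        ports:
        - containerPort: 80
```

With minikube you can load the image via `minikube image load tensorflow-app`. On a real cluster you'd instead push the image to a registry your nodes can pull from (Docker Hub, ECR, GCR, etc.) and reference that full image name in the manifest.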

To deploy this to your Kubernetes cluster, run the following:

```shell
kubectl apply -f kubernetes.yaml
```

This will create a deployment and run the container inside your cluster. To expose the app externally, you can create a service:

```shell
kubectl expose deployment tensorflow-app --type=LoadBalancer --port=80
```

Once the service is up, Kubernetes will assign an external IP (if you're using a cloud provider) to access your app.
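If you prefer to keep everything declarative, the same service can be written as a manifest. This is a sketch equivalent to the kubectl expose command above (the file name service.yaml is just a convention):

```yaml
# service.yaml
apiVersion: v1
kind: Service
metadata:
  name: tensorflow-app
spec:
  type: LoadBalancer
  selector:
    app: tensorflow-app
  ports:
  - port: 80
    targetPort: 80
```

Apply it with `kubectl apply -f service.yaml`, and check the assigned external IP with `kubectl get service tensorflow-app`.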

Conclusion

And that's it: we built a simple TensorFlow model, containerized it with Docker, and deployed it to Kubernetes. This workflow is essential for scaling machine learning models in production environments, allowing you to manage and deploy models efficiently.

With these tools in your arsenal, you're well on your way to tackling more complex ML workloads and scaling them across environments. Keep experimenting, and stay tuned for more advanced topics on scaling AI with Kubernetes!

Keep Coding :)
