Get started with a local deep learning container
This page describes how to create and set up a local deep learning container. This guide expects you to have basic familiarity with Docker.
Before you begin
Complete the following steps to set up a Google Cloud account, enable the required APIs, and install and activate the required software.
In the Google Cloud Console, go to the Manage resources page and select or create a project.
Note: If you don't plan to keep the resources you create in this tutorial, create a new project instead of selecting an existing project. After you finish, you can delete the project, removing all resources associated with the project and tutorial.

If you're using a Linux-based operating system, such as Ubuntu or Debian, add your username to the docker group so that you can run Docker without using sudo:

    sudo usermod -a -G docker ${USER}

Caution: The docker group is equivalent to the root user. See Docker's documentation for details on how this affects the security of your system. You may need to restart your system after adding yourself to the docker group.

Open Docker. To ensure that Docker is running, run the following Docker command, which returns the current time and date:

    docker run busybox date

Use gcloud as the credential helper for Docker:

    gcloud auth configure-docker

Optional: If you want to run the container using a GPU locally, install nvidia-docker.
Create your container
Follow these steps to create your container.
To view a list of the available containers, run:
    gcloud container images list \
      --repository="gcr.io/deeplearning-platform-release"

You may want to go to Choosing a container to help you select the container that you want.
If you don't need to use a GPU-enabled container, enter the following code example. Replace tf-cpu.1-13 with the name of the container that you want to use.
    docker run -d -p 8080:8080 -v /path/to/local/dir:/home/jupyter \
      gcr.io/deeplearning-platform-release/tf-cpu.1-13

If you want to use a GPU-enabled container, enter the following code example. Replace tf-gpu.1-13 with the name of the container that you want to use.

    docker run --runtime=nvidia -d -p 8080:8080 -v /path/to/local/dir:/home/jupyter \
      gcr.io/deeplearning-platform-release/tf-gpu.1-13
This command starts up the container in detached mode, mounts the local directory /path/to/local/dir to /home/jupyter in the container, and maps port 8080 on the container to port 8080 on your local machine. The container is preconfigured to start a JupyterLab server, which you can visit at http://localhost:8080.
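Because the container runs in detached mode, the JupyterLab server may take a few seconds to become reachable after docker run returns. As an illustrative sketch (not part of the official guide), you can poll the mapped port from Python until the server answers; the wait_for_jupyterlab helper name and the timeout values are assumptions.

```python
import time
import urllib.error
import urllib.request

def wait_for_jupyterlab(url="http://localhost:8080", timeout=60):
    """Poll `url` until an HTTP server answers, or give up after `timeout` seconds.

    Returns True once any non-5xx response arrives, False on timeout.
    """
    deadline = time.time() + timeout
    while time.time() < deadline:
        try:
            with urllib.request.urlopen(url, timeout=5) as resp:
                if resp.status < 500:
                    return True
        except (urllib.error.URLError, OSError):
            # Server not up yet (e.g. connection refused) -- retry shortly.
            time.sleep(2)
    return False
```

If the server never comes up, inspect the container's startup output with docker logs followed by the container ID that docker run printed.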
What's next
- Learn more about how to work with containers in the Docker documentation.
Last updated 2025-12-15 UTC.