Get started with a local deep learning container

This page describes how to create and set up a local deep learning container. This guide assumes you have basic familiarity with Docker.

Before you begin

Complete the following steps to set up a Google Cloud account, enable the required APIs, and install and activate the required software.

  1. In the Google Cloud Console, go to the Manage resources page and select or create a project.

    Note: If you don't plan to keep the resources you create in this tutorial, create a new project instead of selecting an existing project. After you finish, you can delete the project, which removes all resources associated with the project and tutorial.

    Go to Manage resources

  2. Install and initialize the gcloud CLI.

  3. Install Docker.

    If you're using a Linux-based operating system, such as Ubuntu or Debian, add your username to the docker group so that you can run Docker without using sudo:

    sudo usermod -a -G docker ${USER}
    Caution: The docker group is equivalent to the root user. See Docker's documentation for details on how this affects the security of your system.

    You may need to restart your system after adding yourself to the docker group.

  4. Open Docker. To ensure that Docker is running, run the following Docker command, which returns the current time and date:

    docker run busybox date
  5. Use gcloud as the credential helper for Docker:

    gcloud auth configure-docker
  6. Optional: If you want to run the container locally using a GPU, install nvidia-docker.
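Before moving on, you can sanity-check the setup with a short shell sketch (hypothetical, not part of this guide) that reports whether each required tool is on your PATH; nvidia-docker is only needed if you completed the optional GPU step.

```shell
#!/bin/sh
# Sketch: report whether the tools installed in the steps above are on PATH.
# nvidia-docker is only required for the optional GPU setup.
for tool in gcloud docker nvidia-docker; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "$tool: found"
  else
    echo "$tool: missing"
  fi
done
```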

Create your container

Follow these steps to create your container.

  1. To view a list of available containers:

    gcloud container images list \
      --repository="gcr.io/deeplearning-platform-release"

    See Choosing a container for help selecting the container that you want.

  2. If you don't need to use a GPU-enabled container, enter the following code example. Replace tf-cpu.1-13 with the name of the container that you want to use.

    docker run -d -p 8080:8080 -v /path/to/local/dir:/home/jupyter \
      gcr.io/deeplearning-platform-release/tf-cpu.1-13

    If you want to use a GPU-enabled container, enter the following code example. Replace tf-gpu.1-13 with the name of the container that you want to use.

    docker run --runtime=nvidia -d -p 8080:8080 -v /path/to/local/dir:/home/jupyter \
      gcr.io/deeplearning-platform-release/tf-gpu.1-13

This command starts the container in detached mode, mounts the local directory /path/to/local/dir to /home/jupyter in the container, and maps port 8080 on the container to port 8080 on your local machine. The container is preconfigured to start a JupyterLab server, which you can visit at http://localhost:8080.
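The two docker run variants above differ only in the image name and the --runtime=nvidia flag. A small wrapper script (hypothetical; IMAGE and LOCAL_DIR are placeholders you would replace with your own values) makes that explicit and prints the command it would run as a dry run:

```shell
#!/bin/sh
# Hypothetical wrapper for the docker run invocations above.
# IMAGE and LOCAL_DIR are placeholders; set GPU=1 to add the NVIDIA runtime flag.
IMAGE="gcr.io/deeplearning-platform-release/tf-cpu.1-13"
LOCAL_DIR="/path/to/local/dir"
GPU=0

RUNTIME=""
if [ "$GPU" -eq 1 ]; then
  RUNTIME="--runtime=nvidia "
fi

# -d: detached; -p: map container port 8080 to the host;
# -v: mount a local directory as the JupyterLab home directory.
CMD="docker run ${RUNTIME}-d -p 8080:8080 -v ${LOCAL_DIR}:/home/jupyter ${IMAGE}"
echo "$CMD"
```

Setting GPU=1 and a tf-gpu image name reproduces the GPU-enabled variant.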

What's next


Last updated 2025-12-15 UTC.