Running Django on Google Kubernetes Engine

Django apps that run on GKE scale dynamically according to traffic.

This tutorial assumes that you're familiar with Django web development. If you're new to Django development, it's a good idea to work through writing your first Django app before continuing.

While this tutorial demonstrates Django specifically, you can use this deployment process with other Django-based frameworks, such as Wagtail and Django CMS.

This tutorial uses Django 5, which requires at least Python 3.10.

You also need to have Docker installed.

Objectives

In this tutorial, you will:

  • Create and connect a Cloud SQL database.
  • Create and use Kubernetes secret values.
  • Deploy a Django app to Google Kubernetes Engine.

Costs

In this document, you use the following billable components of Google Cloud:

  • Google Kubernetes Engine
  • Cloud SQL
  • Cloud Storage

To generate a cost estimate based on your projected usage, use the pricing calculator.

New Google Cloud users might be eligible for a free trial.

Before you begin

  1. Sign in to your Google Cloud account. If you're new to Google Cloud, create an account to evaluate how our products perform in real-world scenarios. New customers also get $300 in free credits to run, test, and deploy workloads.
  2. In the Google Cloud console, on the project selector page, select or create a Google Cloud project.

    Roles required to select or create a project

    • Select a project: Selecting a project doesn't require a specific IAM role—you can select any project that you've been granted a role on.
    • Create a project: To create a project, you need the Project Creator role (roles/resourcemanager.projectCreator), which contains the resourcemanager.projects.create permission. Learn how to grant roles.
    Note: If you don't plan to keep the resources that you create in this procedure, create a project instead of selecting an existing project. After you finish these steps, you can delete the project, removing all resources associated with the project.

    Go to project selector

  3. Verify that billing is enabled for your Google Cloud project.

  4. Enable the Cloud SQL, GKE, and Compute Engine APIs.

    Roles required to enable APIs

    To enable APIs, you need the Service Usage Admin IAM role (roles/serviceusage.serviceUsageAdmin), which contains the serviceusage.services.enable permission. Learn how to grant roles.

    Enable the APIs

  5. Install the gcloud CLI.

  6. If you're using an external identity provider (IdP), you must first sign in to the gcloud CLI with your federated identity.

  7. To initialize the gcloud CLI, run the following command:

    gcloud init
    Note: You can run the gcloud CLI in the Google Cloud console without installing the gcloud CLI. To run the gcloud CLI in the Google Cloud console, use Cloud Shell.

Prepare your environment

Clone a sample app

The code for the Django sample app is in the GoogleCloudPlatform/python-docs-samples repository on GitHub.

  1. You can either download the sample as a ZIP file and extract it or clone the repository to your local machine:

    git clone https://github.com/GoogleCloudPlatform/python-docs-samples.git
  2. Go to the directory that contains the sample code:

    Linux/macOS

    cd python-docs-samples/kubernetes_engine/django_tutorial

    Windows

    cd python-docs-samples\kubernetes_engine\django_tutorial

Confirm your Python setup

This tutorial relies on Python to run the sample application on your machine. The sample code also requires installing dependencies.

For more details, refer to the Python development environment guide.

  1. Confirm that your Python version is at least 3.10.

    python -V

    You should see Python 3.10.0 or higher.

    Note: If you see a version number starting with "2", you may need to run python3 instead of python. If so, remember to reference your chosen Python installation when running python commands.
  2. Create a Python virtual environment and install dependencies:
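    The exact commands aren't captured above; on Linux or macOS, assuming the sample provides a requirements.txt file, a typical sequence looks like this (on Windows, activate with env\Scripts\activate instead):

    python -m venv env
    source env/bin/activate
    pip install -r requirements.txt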

Download Cloud SQL Auth Proxy to connect to Cloud SQL from your local machine

Note: If you are completing this tutorial from Cloud Shell, this section is not required.

When deployed, your app uses the Cloud SQL Auth Proxy that is built into the Google Kubernetes Engine environment to communicate with your Cloud SQL instance. However, to test your app locally, you must install and use a local copy of the proxy in your development environment. For more details, refer to the Cloud SQL Auth Proxy guide.

The Cloud SQL Auth Proxy uses the Cloud SQL API to interact with your SQL instance. To do this, it requires application authentication through the gcloud CLI.

  1. Authenticate and acquire credentials for the API:

    gcloud auth application-default login
  2. Download and install the Cloud SQL Auth Proxy to your local machine.

    Linux 64-bit

    1. Download the Cloud SQL Auth Proxy:
      curl -o cloud-sql-proxy https://storage.googleapis.com/cloud-sql-connectors/cloud-sql-proxy/v2.21.0/cloud-sql-proxy.linux.amd64
    2. Make the Cloud SQL Auth Proxy executable:
      chmod +x cloud-sql-proxy

    Linux 32-bit

    1. Download the Cloud SQL Auth Proxy:
      curl -o cloud-sql-proxy https://storage.googleapis.com/cloud-sql-connectors/cloud-sql-proxy/v2.21.0/cloud-sql-proxy.linux.386
    2. If the curl command is not found, run sudo apt install curl and repeat the download command.
    3. Make the Cloud SQL Auth Proxy executable:
      chmod +x cloud-sql-proxy

    macOS 64-bit

    1. Download the Cloud SQL Auth Proxy:
      curl -o cloud-sql-proxy https://storage.googleapis.com/cloud-sql-connectors/cloud-sql-proxy/v2.21.0/cloud-sql-proxy.darwin.amd64
    2. Make the Cloud SQL Auth Proxy executable:
      chmod +x cloud-sql-proxy

    Mac M1

    1. Download the Cloud SQL Auth Proxy:
      curl -o cloud-sql-proxy https://storage.googleapis.com/cloud-sql-connectors/cloud-sql-proxy/v2.21.0/cloud-sql-proxy.darwin.arm64
    2. Make the Cloud SQL Auth Proxy executable:
      chmod +x cloud-sql-proxy

    Windows 64-bit

    Right-click https://storage.googleapis.com/cloud-sql-connectors/cloud-sql-proxy/v2.21.0/cloud-sql-proxy.x64.exe and select Save Link As to download the Cloud SQL Auth Proxy. Rename the file to cloud-sql-proxy.exe.

    Windows 32-bit

    Right-click https://storage.googleapis.com/cloud-sql-connectors/cloud-sql-proxy/v2.21.0/cloud-sql-proxy.x86.exe and select Save Link As to download the Cloud SQL Auth Proxy. Rename the file to cloud-sql-proxy.exe.

    Cloud SQL Auth Proxy Docker image

    The Cloud SQL Auth Proxy has different container images, such as distroless, alpine, and buster. The default Cloud SQL Auth Proxy container image uses distroless, which contains no shell. If you need a shell or related tools, then download an image based on alpine or buster. For more information, see Cloud SQL Auth Proxy Container Images.

    You can pull the latest image to your local machine using Docker by using the following command:

    docker pull gcr.io/cloud-sql-connectors/cloud-sql-proxy:2.21.0

    Note: The Cloud SQL Auth Proxy uses a repository that supports the gcr.io domain but serves images from Artifact Registry. For more information, see Transition from Container Registry.

    Other OS

    For other operating systems not included here, you can compile the Cloud SQL Auth Proxy from source.

    You can choose to move the download to somewhere common, such as a location on your PATH, or your home directory. If you choose to do this, when you start the Cloud SQL Auth Proxy later on in the tutorial, remember to reference your chosen location when using cloud-sql-proxy commands.
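    For example, on Linux or macOS you could move the binary to a directory that is already on your PATH (the destination below is just one common choice, not a requirement of the tutorial):

      sudo mv cloud-sql-proxy /usr/local/bin/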

Create backing services

This tutorial uses several Google Cloud services to provide the database, media storage, and secret storage that support the deployed Django project. These services are deployed in a specific region. For efficiency between services, all services should be deployed in the same region. For more information about the closest region to you, see Products available by region.

Set up a Cloud SQL for PostgreSQL instance

Django officially supports multiple relational databases, but offers the most support for PostgreSQL. Cloud SQL supports PostgreSQL, so this tutorial uses a PostgreSQL database.

The following section describes the creation of a PostgreSQL instance, database, and database user for the app.

  1. Create the PostgreSQL instance:

    Console

    1. In the Google Cloud console, go to the Cloud SQL Instances page.

      Go to the Cloud SQL Instances page

    2. Click Create Instance.

    3. Click Choose PostgreSQL.

    4. For SQL Edition, choose "Enterprise".

    5. For Edition Preset, choose "Sandbox".

    6. In the Instance ID field, enter INSTANCE_NAME.

    7. Enter a password for the postgres user.

    8. Keep the default values for the other fields.

    9. Click Create Instance.

    It takes a few minutes for the instance to be ready for use.

    gcloud

    • Create the PostgreSQL instance:

      gcloud sql instances create INSTANCE_NAME \
        --project PROJECT_ID \
        --database-version POSTGRES_16 \
        --tier db-n1-standard-2 \
        --region REGION

    Replace the following:

    • INSTANCE_NAME: the Cloud SQL instance name
    • PROJECT_ID: the Google Cloud project ID
    • REGION: the Google Cloud region

    It takes a few minutes to create the instance and for it to be ready for use.
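    To check whether the instance is ready, you can poll its state; this verification step is an optional sketch and not part of the original instructions. The instance is available when the command prints RUNNABLE:

      gcloud sql instances describe INSTANCE_NAME --format "value(state)"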

  2. Within the created instance, create a database:

    Console

    1. Within your instance page, go to the Databases tab.
    2. Click Create database.
    3. In the Database Name dialog, enter DATABASE_NAME.
    4. Click Create.

    gcloud

    • Create the database within the recently created instance:

      gcloud sql databases create DATABASE_NAME \
        --instance INSTANCE_NAME

      Replace DATABASE_NAME with a name for the database inside the instance.

  3. Create a database user:

    Note: Users created this way get additional database rights. See Limit the database user privileges for an alternative method.

    Console

    1. Within your instance page, go to the Users tab.
    2. Click Add User Account.
    3. In the Choose how to authenticate dialog, under "Built-in Authentication":
    4. Enter the username DATABASE_USERNAME.
    5. Enter the password DATABASE_PASSWORD.
    6. Click Add.

    gcloud

    • Create the user within the recently created instance:

      gcloud sql users create DATABASE_USERNAME \
        --instance INSTANCE_NAME \
        --password DATABASE_PASSWORD

      Replace DATABASE_PASSWORD with a secure password.

Create a service account

The proxy requires a service account with Editor privileges for your Cloud SQL instance. For more information about service accounts, see the Google Cloud authentication overview.

If you are connecting from Compute Engine, make sure your VM has the proper scope to connect using the Cloud SQL Admin API.

Configure the service account to have either of the following access scopes:

  • https://www.googleapis.com/auth/sqlservice.admin
  • https://www.googleapis.com/auth/cloud-platform
  1. In the Google Cloud console, go to the Service accounts page.

    Go to Service accounts

  2. Select the project that contains your Cloud SQL instance.
  3. Click Create service account.
  4. In the Service account name field, enter a descriptive name for the service account.
  5. Change the Service account ID to a unique, recognizable value and then click Create and continue.
  6. Click the Select a role field and select a role that grants access to Cloud SQL (for example, Cloud SQL > Cloud SQL Client).
  7. Click Done to finish creating the service account.
  8. Click the action menu for your new service account and then select Manage keys.
  9. Click the Add key drop-down menu and then click Create new key.
  10. Confirm that the key type is JSON and then click Create.

    The private key file is downloaded to your machine. You can move it to another location. Keep the key file secure.

Configure the database settings

Use the following commands to set environment variables for database access. These environment variables are used for local testing.
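The original commands aren't reproduced above; a sketch for Linux or macOS, assuming the variable names that the sample's settings.py reads (see Database configuration later on this page) and substituting the database name, username, and password you created earlier, looks like this:

export DATABASE_NAME=DATABASE_NAME
export DATABASE_USER=DATABASE_USERNAME
export DATABASE_PASSWORD=DATABASE_PASSWORD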

Set up your GKE configuration

  1. This application is represented in a single Kubernetes configuration called polls. In polls.yaml, replace <your-project-id> with your Google Cloud project ID (PROJECT_ID).

  2. Run the following command and note the value ofconnectionName:

    gcloud sql instances describe INSTANCE_NAME --format "value(connectionName)"
  3. In the polls.yaml file, replace <your-cloudsql-connection-string> with the connectionName value.
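    If you prefer to make both polls.yaml substitutions from the command line instead of editing the file by hand, a sed sketch (GNU sed syntax; on macOS use sed -i '') would be:

    sed -i "s/<your-project-id>/PROJECT_ID/g; s/<your-cloudsql-connection-string>/CONNECTION_NAME/g" polls.yaml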

Run the app on your local computer

With the backing services configured, you can now run the app on your computer. This setup allows for local development, creating a superuser, and applying database migrations.

  1. In a separate terminal, start the Cloud SQL Auth Proxy:

    This step establishes a connection from your local computer to your Cloud SQL instance for local testing purposes. Keep the Cloud SQL Auth Proxy running the entire time you test your app locally. Running this process in a separate terminal allows you to keep working while this process runs.
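    The proxy invocation itself isn't shown above; assuming you downloaded the v2 binary into the current directory, a typical command looks like the following, where the connection string is the connectionName value you noted earlier:

    ./cloud-sql-proxy --port 5432 PROJECT_ID:REGION:INSTANCE_NAME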

  2. In the original terminal, set the Project ID locally:
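    The exact command isn't captured above; one common way to set the project for subsequent gcloud commands is:

    gcloud config set project PROJECT_ID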

  3. Run the Django migrations to set up your models and assets:

    python manage.py makemigrations
    python manage.py makemigrations polls
    python manage.py migrate
    python manage.py collectstatic
  4. Start the Django web server:

    python manage.py runserver 8080
  5. In your browser, go to http://localhost:8080.

    If you are in Cloud Shell, click the Web Preview button, and select Preview on port 8080.

    The page displays the following text: "Hello, world. You're at the polls index." The Django web server running on your computer delivers the sample app pages.

  6. Press Ctrl/Cmd+C to stop the local web server.

Use the Django admin console

To log in to Django's admin console, you need to create a superuser. Since you have a locally accessible connection to the database, you can run management commands:

  1. Create a superuser. You will be prompted to enter a username, email, and password.

    python manage.py createsuperuser
  2. Start a local web server:

    python manage.py runserver
  3. In your browser, go to http://localhost:8000/admin.

  4. Log in to the admin site using the username and password you used when you ran createsuperuser.

Note: Since you are connected to your Cloud SQL database, you will log into your deployed Django application with the same credentials.

Deploy the app to GKE

When the app is deployed to Google Cloud, it uses the Gunicorn server. Gunicorn doesn't serve static content, so the app uses Cloud Storage to serve static content.

Collect and upload static resources

  1. Create a Cloud Storage bucket and make it publicly readable.

    gcloud storage buckets create gs://PROJECT_ID_MEDIA_BUCKET
    gcloud storage buckets add-iam-policy-binding gs://PROJECT_ID_MEDIA_BUCKET --member=allUsers --role=roles/storage.legacyObjectReader
  2. Gather all the static content locally into one folder:

    python manage.py collectstatic
  3. Upload the static content to Cloud Storage:

    gcloud storage rsync ./static gs://PROJECT_ID_MEDIA_BUCKET/static --recursive
  4. In mysite/settings.py, set the value of STATIC_URL to the following URL, replacing PROJECT_ID_MEDIA_BUCKET with your bucket name:

    http://storage.googleapis.com/PROJECT_ID_MEDIA_BUCKET/static/

Set up GKE

  1. To initialize GKE, go to the Clusters page.

    Go to the Clusters page

    When you use GKE for the first time in a project, you need to wait for the "Kubernetes Engine is getting ready. This may take a minute or more" message to disappear.

  2. Create a GKE cluster:

    gcloud container clusters create polls \
      --scopes "https://www.googleapis.com/auth/userinfo.email","cloud-platform" \
      --num-nodes 4 --zone "us-central1-a"

    If an error message similar to Project is not fully initialized with the default service accounts appears, you might need to initialize Google Kubernetes Engine.

    Initialize GKE

    If you received an error, go to the Google Cloud console to initialize GKE in your project.

    Go to the Clusters page

    Wait for the "Kubernetes Engine is getting ready. This can take a minute ormore" message to disappear.

  3. After the cluster is created, use the kubectl command-line tool, which is integrated with the gcloud CLI, to interact with your GKE cluster. Because gcloud and kubectl are separate tools, make sure kubectl is configured to interact with the right cluster.

    gcloud container clusters get-credentials polls --zone "us-central1-a"
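    To confirm that kubectl now points at the new cluster, you can list its nodes (an optional check, not part of the original steps):

    kubectl get nodes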

Set up Cloud SQL

  1. You need several secrets to enable your GKE app to connect with your Cloud SQL instance. One is required for instance-level access (connection), while the other two are required for database access. For more information about the two levels of access control, see Instance access control.

    1. To create the secret for instance-level access, provide the location, PATH_TO_CREDENTIAL_FILE, of the JSON service account key that you downloaded when you created your service account (see Creating a service account):

      kubectl create secret generic cloudsql-oauth-credentials \
        --from-file=credentials.json=PATH_TO_CREDENTIAL_FILE
    2. To create the secrets for database access, use the SQL database, username, and password defined when you created backing services. See Set up a Cloud SQL for PostgreSQL instance:

      kubectl create secret generic cloudsql \
        --from-literal=database=DATABASE_NAME \
        --from-literal=username=DATABASE_USERNAME \
        --from-literal=password=DATABASE_PASSWORD
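      Before deploying, you can optionally confirm that both secrets exist:

      kubectl get secrets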
  2. Retrieve the public Docker image for the Cloud SQL proxy.

    docker pull b.gcr.io/cloudsql-docker/gce-proxy
  3. Build a Docker image, replacing PROJECT_ID with your project ID.

    docker build -t gcr.io/PROJECT_ID/polls .
  4. Configure Docker to use gcloud as a credential helper, so that you can push the image to Container Registry:

    gcloud auth configure-docker
  5. Push the Docker image, replacing PROJECT_ID with your project ID.

    docker push gcr.io/PROJECT_ID/polls
    Note: This command requires write access to Cloud Storage. If you run this tutorial on a Compute Engine instance, your access to Cloud Storage might be read-only. To get write access, create a service account and use the service account to authenticate on your instance.
  6. Create the GKE resource:

    kubectl create -f polls.yaml
    Note: If you used different names (other than cloudsql-oauth-credentials and cloudsql) when creating the secrets in the previous commands, then you need to update the polls.yaml file to match those new names.

Deploy the app to GKE

After the resources are created, there are three polls pods on the cluster. Check the status of your pods:

kubectl get pods

Wait a few minutes for the pod statuses to display as Running. If the pods aren't ready or if you see restarts, you can get the logs for a particular pod to figure out the issue. [YOUR_POD_ID] is a part of the output returned by the previous kubectl get pods command.

kubectl logs [YOUR_POD_ID]

See the app run in Google Cloud

After the pods are ready, you can get the external IP address of the load balancer:

kubectl get services polls

Note the EXTERNAL-IP address, and go to http://[EXTERNAL-IP] in your browser to see the Django polls landing page and access the administrator console.
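While the load balancer is being provisioned, the EXTERNAL-IP column might show pending for a short time. If you want, you can watch the service until an address is assigned:

kubectl get services polls --watch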

Understand the code

Sample application

The Django sample app was created using standard Django tooling. The following commands create the project and the polls app:

django-admin startproject mysite
python manage.py startapp polls

The base views, models, and route configurations were copied from Writing your first Django app (Part 1 and Part 2).

Database configuration

The settings.py file contains the configuration for your SQL database:

DATABASES={"default":{# If you are using Cloud SQL for MySQL rather than PostgreSQL, set# 'ENGINE': 'django.db.backends.mysql' instead of the following."ENGINE":"django.db.backends.postgresql","NAME":os.getenv("DATABASE_NAME"),"USER":os.getenv("DATABASE_USER"),"PASSWORD":os.getenv("DATABASE_PASSWORD"),"HOST":"127.0.0.1","PORT":"5432",}}

Kubernetes pod configurations

The polls.yaml file specifies two Kubernetes resources. The first is the Service, which defines a consistent name and internal IP address for the Django web app. The second is an HTTP load balancer with a public-facing external IP address.

# The polls service provides a load-balancing proxy over the polls app
# pods. By specifying the type as a 'LoadBalancer', Kubernetes Engine will
# create an external HTTP load balancer.
# For more information about Services see:
#   https://kubernetes.io/docs/concepts/services-networking/service/
# For more information about external HTTP load balancing see:
#   https://kubernetes.io/docs/tasks/access-application-cluster/create-external-load-balancer/
apiVersion: v1
kind: Service
metadata:
  name: polls
  labels:
    app: polls
spec:
  type: LoadBalancer
  ports:
  - port: 80
    targetPort: 8080
  selector:
    app: polls

The service provides a network name and IP address, and GKE pods run the app's code behind the service. The polls.yaml file specifies a deployment that provides declarative updates for GKE pods. The service directs traffic to the deployment by matching the service's selector to the deployment's label. In this case, the selector polls is matched to the label polls (a quick way to inspect this matching is shown after the manifest below).

apiVersion: apps/v1
kind: Deployment
metadata:
  name: polls
  labels:
    app: polls
spec:
  replicas: 3
  selector:
    matchLabels:
      app: polls
  template:
    metadata:
      labels:
        app: polls
    spec:
      containers:
      - name: polls-app
        # Replace <your-project-id> with your project ID or use `make template`
        image: gcr.io/<your-project-id>/polls
        # This setting makes nodes pull the docker image every time before
        # starting the pod. This is useful when debugging, but should be turned
        # off in production.
        imagePullPolicy: Always
        env:
        - name: DATABASE_NAME
          valueFrom:
            secretKeyRef:
              name: cloudsql
              key: database
        - name: DATABASE_USER
          valueFrom:
            secretKeyRef:
              name: cloudsql
              key: username
        - name: DATABASE_PASSWORD
          valueFrom:
            secretKeyRef:
              name: cloudsql
              key: password
        ports:
        - containerPort: 8080
      - image: gcr.io/cloudsql-docker/gce-proxy:1.16
        name: cloudsql-proxy
        command: ["/cloud_sql_proxy", "--dir=/cloudsql",
                  "-instances=<your-cloudsql-connection-string>=tcp:5432",
                  "-credential_file=/secrets/cloudsql/credentials.json"]
        volumeMounts:
        - name: cloudsql-oauth-credentials
          mountPath: /secrets/cloudsql
          readOnly: true
        - name: ssl-certs
          mountPath: /etc/ssl/certs
        - name: cloudsql
          mountPath: /cloudsql
      volumes:
      - name: cloudsql-oauth-credentials
        secret:
          secretName: cloudsql-oauth-credentials
      - name: ssl-certs
        hostPath:
          path: /etc/ssl/certs
      - name: cloudsql
        emptyDir: {}
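To see the selector-to-label matching described above in action, you can list the pods that carry the app: polls label; these are the pods that the service routes traffic to (an optional check, not part of the original tutorial):

kubectl get pods -l app=polls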

Clean up

To avoid incurring charges to your Google Cloud account for the resources used in this tutorial, either delete the project that contains the resources, or keep the project and delete the individual resources.

Delete the project

    Caution: Deleting a project has the following effects:
    • Everything in the project is deleted. If you used an existing project for the tasks in this document, when you delete it, you also delete any other work you've done in the project.
    • Custom project IDs are lost. When you created this project, you might have created a custom project ID that you want to use in the future. To preserve the URLs that use the project ID, such as an appspot.com URL, delete selected resources inside the project instead of deleting the whole project.

    If you plan to explore multiple architectures, tutorials, or quickstarts, reusing projects can help you avoid exceeding project quota limits.

  1. In the Google Cloud console, go to the Manage resources page.

    Go to Manage resources

  2. In the project list, select the project that you want to delete, and then click Delete.
  3. In the dialog, type the project ID, and then click Shut down to delete the project.

Delete the individual resources

If you don't want to delete the project, delete the individual resources.

  1. Delete the Google Kubernetes Engine cluster:

    gcloud container clusters delete polls
  2. Delete the Docker image that you pushed to Container Registry:

    gcloud container images delete gcr.io/PROJECT_ID/polls
  3. Delete the Cloud SQL instance:

    gcloud sql instances delete INSTANCE_NAME
