Lyamaa
⚖️Scaling Django⚖️

Why Scaling?

Scalability is the potential of your application to cope with an increasing number of users interacting with it simultaneously. Ultimately, you want it to grow and handle more and more requests per minute (RPM). A number of factors play a part in ensuring scalability, and it is worth taking each of them into consideration.

Contents

  • Requirements
  • Quickstart
  • Nginx Setup
  • Caching Products
  • Final words

Requirements:

Well, I am using Docker to wrap up all the necessary tools and Django apps in containers. Of course you can skip Docker, but then you have to install the required tools independently; it is up to you how you go about it.

I am not going to go through everything in much detail, so please help yourself along the way.

Quickstart

Feeling lazy?

$ python3 -m venv env          # create a virtual environment
$ source env/bin/activate
$ poetry install               # make sure you have poetry installed on your machine

OR


$ mkdir scale && cd scale
$ python3 -m venv env          # create a virtual environment
$ source env/bin/activate
$ poetry init                  # poetry initialization, generates the *.toml file
$ poetry add djangorestframework psycopg2-binary Faker django-redis gunicorn
$ django-admin startproject config .
$ python manage.py startapp products
$ touch Dockerfile
$ touch docker-compose.yml

Project structure:

─── scale
    ├── config
    │   ├── __init__.py
    │   ├── asgi.py
    │   ├── settings
    │   │   ├── __init__.py
    │   │   ├── base.py
    │   │   ├── dev.py
    │   │   └── prod.py
    │   ├── urls.py
    │   └── wsgi.py
    ├── manage.py
    ├── products
    ├── .env
    ├── docker-compose.yml
    └── Dockerfile

Note: in the structure above I have broken the settings down into base.py, dev.py, and prod.py. Split them yourself, or grab them from the boilerplate linked here.
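For reference, a minimal sketch of what that split can look like (the contents below are assumptions, not taken from the post): dev.py pulls everything in from base.py and overrides only the development bits, and DJANGO_SETTINGS_MODULE in manage.py/wsgi.py points at config.settings.dev or config.settings.prod.

# config/settings/base.py -- shared settings (sketch; the app list is assumed)
from pathlib import Path

BASE_DIR = Path(__file__).resolve().parent.parent  # points at the config/ package

INSTALLED_APPS = [
    "django.contrib.admin",
    "django.contrib.auth",
    "django.contrib.contenttypes",
    "django.contrib.sessions",
    "django.contrib.messages",
    "django.contrib.staticfiles",
    "rest_framework",
    "products",
]

# config/settings/dev.py -- development overrides (sketch)
from .base import *  # noqa: F401,F403

DEBUG = True
ALLOWED_HOSTS = ["*"]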

Let's start with Docker.

Dockerfile

FROM python:3.8.5-alpine

# prevents Python from generating .pyc files in the container
ENV PYTHONDONTWRITEBYTECODE 1
# turns off buffering for easier container logging
ENV PYTHONUNBUFFERED 1

RUN apk add --no-cache curl

# install psycopg2 dependencies
RUN apk update \
    && apk add postgresql-dev gcc python3-dev musl-dev

# install poetry
RUN pip install -U pip \
    && curl -sSL https://raw.githubusercontent.com/python-poetry/poetry/master/get-poetry.py | python
ENV PATH="${PATH}:/root/.poetry/bin"

RUN mkdir /code
RUN mkdir /code/staticfiles
RUN mkdir /code/mediafiles
WORKDIR /code
COPY . /code

RUN poetry config virtualenvs.create false \
    && poetry install --no-interaction --no-ansi

docker-compose.yaml

version: "3.9"services:  scale:    restart: always    build: .    command: python manage.py runserver 0.0.0.0    volumes:      - .:/code    ports:      - 8000:8000    env_file:      - ./.env    depends_on:      - db  db:    image: "postgres:11"    volumes:      - postgres_data:/var/lib/postgresql/data/    ports:      - 54322:5432    environment:      - POSTGRES_USER=scale      - POSTGRES_PASSWORD=scale      - POSTGRES_DB=scalevolumes:  postgres_data:
Enter fullscreen modeExit fullscreen mode

Above we created the Dockerfile and the docker-compose.yaml file.

  • we used an Alpine-based image
  • we installed the dependencies for postgres and set up poetry
  • we created two services named scale and db

Run the command:

docker-compose up

You will get a "database does not exist" error.

Let's create the database:

$ docker container ls
CONTAINER ID   IMAGE         COMMAND                  CREATED       STATUS          PORTS                                         NAMES
78ac4d15bcd8   postgres:11   "docker-entrypoint.s…"   2 hours ago   Up 31 seconds   0.0.0.0:54322->5432/tcp, :::54322->5432/tcp   scale_db_1

Copy the CONTAINER ID value.

$ docker exec -it 78ac4d15bcd8 bash
:/# psql --username=postgres
psql (11.12 (Debian 11.12-1.pgdg90+1))
Type "help" for help.

postgres=# CREATE DATABASE scale;
postgres=# CREATE USER scale WITH PASSWORD 'scale';
postgres=# ALTER ROLE scale SET client_encoding TO 'utf8';
postgres=# ALTER ROLE scale SET default_transaction_isolation TO 'read committed';
postgres=# ALTER ROLE scale SET timezone TO 'UTC';
postgres=# ALTER ROLE scale SUPERUSER;
postgres=# GRANT ALL PRIVILEGES ON DATABASE scale TO scale;
postgres=# \q

Make sure your settings/dev.py has a config like this (or your own credentials), and change the host from localhost to db:

from config.settings import BASE_DIR

DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.postgresql_psycopg2",
        "ATOMIC_REQUESTS": True,
        "NAME": "scale",
        "USER": "scale",
        "PASSWORD": "scale",
        "HOST": "db",
        "PORT": "5432",
    }
}

# REDIS CONFIG
CACHES = {
    "default": {
        "BACKEND": "django_redis.cache.RedisCache",
        "LOCATION": "redis://redis:6379/0",
        "OPTIONS": {"CLIENT_CLASS": "django_redis.client.DefaultClient"},
    }
}

# static/media URLs and roots match the nginx locations and volumes set up below
STATIC_URL = "/staticfiles/"
STATIC_ROOT = BASE_DIR.parent / "staticfiles"  # target for collectstatic
MEDIA_URL = "/mediafiles/"
MEDIA_ROOT = BASE_DIR.parent / "mediafiles"
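Once the redis service is added in the next section, you can sanity-check that this cache config is actually picked up with a quick round-trip from the Django shell; a small sketch:

# run inside the scale container: python manage.py shell
from django.core.cache import cache

cache.set("healthcheck", "ok", timeout=30)  # write a throwaway key to Redis
assert cache.get("healthcheck") == "ok"     # read it back; fails if Redis is unreachable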

Nginx Setup

What is Nginx?

Nginx is a high-performance web server; here we use it as a reverse proxy in front of Gunicorn and to serve our static and media files.

Next, we set up redis, nginx, and gunicorn in Docker:
docker-compose.yaml

version: "3.9"services:  scale:    restart: always    build: .    command: gunicorn config.wsgi:application --bind 0.0.0.0:8000    volumes:      - .:/code      - static_volume:/code/staticfiles      - media_volume:/code/mediafiles    expose:      - 8000    env_file:      - ./.env    depends_on:      - db      - redis  db:    image: "postgres:11"    volumes:      - postgres_data:/var/lib/postgresql/data/    ports:      - 54322:5432    environment:      - POSTGRES_USER=scale      - POSTGRES_PASSWORD=scale      - POSTGRES_DB=scale  redis:    image: redis    ports:      - 63799:6379    restart: on-failure  nginx:    build: ./nginx    restart: always    volumes:      - static_volume:/code/staticfiles      - media_volume:/code/mediafiles    ports:      - 2000:80    depends_on:      - scalevolumes:  postgres_data:  static_volume:  media_volume:
Enter fullscreen modeExit fullscreen mode

So, above we added two services, redis and nginx, and switched the command to gunicorn instead of our regular runserver command (see the optional gunicorn tuning sketch at the end of this section). Next, we create an nginx directory in the project root with a Dockerfile and an nginx.conf.

nginx/Dockerfile

FROM nginx:latest

RUN rm /etc/nginx/conf.d/default.conf
COPY nginx.conf /etc/nginx/conf.d

nginx/nginx.conf

upstream core {
    server scale:8000;
}

server {
    listen 80;

    location / {
        proxy_pass http://core;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header Host $host;
        proxy_redirect off;
        client_max_body_size 100M;
    }

    location /staticfiles/ {
        alias /code/staticfiles/;
    }

    location /mediafiles/ {
        alias /code/mediafiles/;
    }
}

Above, we created a Dockerfile which builds our nginx image, and an nginx.conf where we proxy requests to our app and serve the static and media files.

Let's run the docker-compose file.

docker-compose up --build

Navigate to http://localhost:2000/ in your browser.

Note: in the docker-compose.yaml above, the nginx service maps ports 2000:80, so our server will be reachable on port 2000.
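As an optional aside (not part of the original post): the compose file runs gunicorn with its defaults, which means a single synchronous worker. If you want to scale the worker count to the available CPUs, one option is a small gunicorn config file; the file name and values below are hypothetical, and you would point gunicorn at it with -c gunicorn.conf.py in the compose command.

# gunicorn.conf.py -- hypothetical tuning file, not in the original setup
import multiprocessing

bind = "0.0.0.0:8000"                          # same bind address as the compose command
workers = multiprocessing.cpu_count() * 2 + 1  # common (2 x cores) + 1 rule of thumb
timeout = 30                                   # seconds before a hung worker is restarted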


Caching Products

First, let's try it without caching.

Now, let's create a model for our products app.

products/models.py

from django.db import models
from django.utils.translation import gettext_lazy as _


class Category(models.Model):
    name = models.CharField(_("Category Name"), max_length=255, unique=True)
    description = models.TextField(null=True)

    class Meta:
        ordering = ("name",)
        verbose_name = _("Category")
        verbose_name_plural = _("Categories")

    def __str__(self) -> str:
        return self.name


class Product(models.Model):
    name = models.CharField(_("Product Name"), max_length=255)
    category = models.ForeignKey(Category, on_delete=models.DO_NOTHING)
    description = models.TextField()
    price = models.DecimalField(decimal_places=2, max_digits=10)
    quantity = models.IntegerField(default=0)
    discount = models.DecimalField(decimal_places=2, max_digits=10)
    image = models.URLField(max_length=255)

    class Meta:
        ordering = ("id",)
        verbose_name = _("Product")
        verbose_name_plural = _("Products")

    def __str__(self):
        return self.name

Moving forward, let's create some dummy data using custom management commands.
Create a management directory inside the products app.

── products
   ├── management
   │   ├── __init__.py
   │   └── commands
   │       ├── __init__.py
   │       ├── category_seed.py
   │       └── product_seed.py

category_seed.py

from django.core.management import BaseCommand
from faker import Faker

from products.models import Category


class Command(BaseCommand):
    def handle(self, *args, **kwargs):
        faker = Faker()
        for _ in range(30):
            Category.objects.create(
                name=faker.name(),
                description=faker.text(200),
            )

product_seed.py

from random import randint, randrange

from django.core.management import BaseCommand
from faker import Faker

from products.models import Category, Product


class Command(BaseCommand):
    def handle(self, *args, **kwargs):
        faker = Faker()
        for _ in range(5000):
            price = randrange(10, 100)
            quantity = randrange(1, 5)
            cat_id = randint(1, 30)
            category = Category.objects.get(id=cat_id)
            Product.objects.create(
                name=faker.name(),
                category=category,
                description=faker.text(200),
                price=price,
                discount=100,
                quantity=quantity,
                image=faker.image_url(),
            )

So, these commands will create 5000 products and 30 categories.

$ docker-compose exec scale sh
/code # python manage.py makemigrations
/code # python manage.py migrate
/code # python manage.py createsuperuser
/code # python manage.py collectstatic --no-input
/code # python manage.py category_seed
/code # python manage.py product_seed   # takes a while to create 5000 records
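The product_seed command inserts rows one at a time, which is why it takes a while. As an optional speed-up (not in the original post), the same data could be inserted in batches with bulk_create; a sketch, assuming the 30 seeded categories have ids 1 through 30:

from random import randint, randrange

from faker import Faker

from products.models import Product

faker = Faker()
products = [
    Product(
        name=faker.name(),
        category_id=randint(1, 30),  # assumes category ids 1..30 from category_seed
        description=faker.text(200),
        price=randrange(10, 100),
        discount=100,
        quantity=randrange(1, 5),
        image=faker.image_url(),
    )
    for _ in range(5000)
]
# one INSERT per batch of 500 instead of 5000 separate INSERTs
Product.objects.bulk_create(products, batch_size=500)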

You can check in pgAdmin or the Django admin dashboard whether the data has been loaded.
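If you go the admin dashboard route, the models need to be registered first; a minimal products/admin.py (assumed, it is not shown in the original post) would be:

from django.contrib import admin

from .models import Category, Product

# register both models so the seeded data shows up in the admin
admin.site.register(Category)
admin.site.register(Product)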

After creating the dummy data, let's create the serializers and views.

serializers.py

from rest_framework import serializers

from .models import Product, Category


class CategorySerializers(serializers.ModelSerializer):
    class Meta:
        model = Category
        fields = "__all__"


class CategoryRelatedField(serializers.StringRelatedField):
    def to_representation(self, value):
        return CategorySerializers(value).data

    def to_internal_value(self, data):
        return data


class ProductSerializers(serializers.ModelSerializer):
    class Meta:
        model = Product
        fields = "__all__"


class ReadProductSerializer(serializers.ModelSerializer):
    category = serializers.StringRelatedField(read_only=True)
    # category = CategoryRelatedField()
    # category = CategorySerializers()

    class Meta:
        model = Product
        fields = "__all__"

views.py

from rest_framework import viewsets, status
from rest_framework.response import Response

from products.models import Product
from .serializers import ProductSerializers, ReadProductSerializer


class ProductViewSet(viewsets.ViewSet):
    def list(self, request):
        serializer = ReadProductSerializer(Product.objects.all(), many=True)
        return Response(serializer.data)

    def create(self, request):
        serializer = ProductSerializers(data=request.data)
        serializer.is_valid(raise_exception=True)
        serializer.save()
        return Response(serializer.data, status=status.HTTP_201_CREATED)

    def retrieve(self, request, pk=None):
        products = Product.objects.get(id=pk)
        serializer = ReadProductSerializer(products)
        return Response(serializer.data)

    def update(self, request, pk=None):
        products = Product.objects.get(id=pk)
        serializer = ProductSerializers(
            instance=products, data=request.data, partial=True)
        serializer.is_valid(raise_exception=True)
        serializer.save()
        return Response(serializer.data, status=status.HTTP_202_ACCEPTED)

    def destroy(self, request, pk=None):
        products = Product.objects.get(id=pk)
        products.delete()
        return Response(status=status.HTTP_204_NO_CONTENT)

urls.py

from django.urls import path

from .views import ProductViewSet

urlpatterns = [
    path("product", ProductViewSet.as_view(
        {"get": "list", "post": "create"})),
    path(
        "product/<str:pk>",
        ProductViewSet.as_view(
            {"get": "retrieve", "put": "update", "delete": "destroy"}),
    ),
]

So, we created a view using DRF viewsets.

Let's try it with Postman, using the different serializers on the viewset to fetch the list of 5K products.

http://localhost:2000/api/v1/products

Serializer                                         | Time
ReadProductSerializer (StringRelatedField)         | 6.42 s
ReadProductSerializer (CategoryRelatedField)       | 7.05 s
ReadProductSerializer (nested CategorySerializers) | 6.49 s
ReadProductSerializer (PrimaryKeyRelatedField)     | 681 ms
ReadProductSerializer (no related field)           | 674 ms

Note: response times may vary depending on your system.
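The slow rows in the table are largely an N+1 query problem: the related-field serializers touch each product's category one by one. A quick way to confirm this yourself (a sketch; it requires DEBUG = True so Django records queries) is to count the queries issued while serializing:

# run in: python manage.py shell
from django.db import connection, reset_queries

from products.models import Product
from products.serializers import ReadProductSerializer

reset_queries()
data = ReadProductSerializer(Product.objects.all(), many=True).data
# with StringRelatedField this prints roughly one query per product,
# because each category is fetched separately
print(len(connection.queries))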

Now let's fetch the data using caching:

views.py

from django.core.cache import cache
from rest_framework import viewsets, status
from rest_framework.pagination import PageNumberPagination
from rest_framework.response import Response
from rest_framework.views import APIView

from products.models import Product
from .serializers import ProductSerializers, ReadProductSerializer


class ProductListApiView(APIView):
    def get(self, request):
        paginator = PageNumberPagination()
        paginator.page_size = 10
        # get the products from the cache if they exist
        products = cache.get('products_data')
        # if the products are not cached yet, build the cache
        if not products:
            products = list(Product.objects.select_related('category'))
            cache.set('products_data', products, timeout=60 * 60)
        # paginate the cached products
        result = paginator.paginate_queryset(products, request)
        serializer = ReadProductSerializer(result, many=True)
        return paginator.get_paginated_response(serializer.data)


class ProductViewSet(viewsets.ViewSet):
    def create(self, request):
        serializer = ProductSerializers(data=request.data)
        serializer.is_valid(raise_exception=True)
        serializer.save()
        # invalidate the cached product list so the new product shows up
        for key in cache.keys('*'):
            if 'products_data' in key:
                cache.delete(key)
        cache.delete("products_data")
        return Response(serializer.data, status=status.HTTP_201_CREATED)

    def retrieve(self, request, pk=None):
        products = Product.objects.get(id=pk)
        serializer = ReadProductSerializer(products)
        return Response(serializer.data)

    def update(self, request, pk=None):
        products = Product.objects.get(id=pk)
        serializer = ProductSerializers(
            instance=products, data=request.data, partial=True)
        serializer.is_valid(raise_exception=True)
        serializer.save()
        # invalidate the cached product list so the update is visible
        for key in cache.keys('*'):
            if 'products_data' in key:
                cache.delete(key)
        cache.delete("products_data")
        return Response(serializer.data, status=status.HTTP_202_ACCEPTED)

    def destroy(self, request, pk=None):
        products = Product.objects.get(id=pk)
        products.delete()
        # invalidate the cached product list so the deleted product disappears
        for key in cache.keys('*'):
            if 'products_data' in key:
                cache.delete(key)
        cache.delete("products_data")
        return Response(status=status.HTTP_204_NO_CONTENT)

So, I have created a separate APIView and removed the list function from the viewset. It fetches the data from the cache (building the cache on a miss) and returns a paginated response.
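Side note: instead of looping over cache.keys('*'), the django-redis backend configured earlier also exposes delete_pattern, which does the same wildcard invalidation in one call; a small sketch:

from django.core.cache import cache

# django-redis specific helper: drop every key matching the pattern at once
cache.delete_pattern("*products_data*")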
Change your products/urls.py:

from django.urls import path

from .views import ProductListApiView, ProductViewSet

urlpatterns = [
    path('products', ProductListApiView.as_view()),
    path("product", ProductViewSet.as_view(
        {"post": "create"})),
    path(
        "product/<str:pk>",
        ProductViewSet.as_view(
            {"get": "retrieve", "put": "update", "delete": "destroy"}),
    ),
]

So, try it again with Postman and the different serializers.
You will get results between 90 and 200 ms, depending on your machine.

Note: in the APIView above I have used select_related. Try removing it and running the request again with Postman; you will see different results.

To learn more about queryset optimizations (i.e. select_related and prefetch_related), follow this link: N+1 Queries Problem.
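As a tiny illustration of what select_related buys you here (a sketch, not from the original post):

from products.models import Product

# Without select_related: 1 query for the products, then 1 extra query per
# product the first time its category is accessed (the N+1 pattern).
for product in Product.objects.all():
    _ = product.category.name

# With select_related: a single JOINed query fetches products and their
# categories together, so the loop triggers no further queries.
for product in Product.objects.select_related("category"):
    _ = product.category.name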

Final words:

There is still plenty of room to improve; it all depends on how, where, for what, and for how many users you are scaling.

Hope you guys liked it... chao 👋👋

"I was struggling to scale my Django application, and this guide really helped me get things sorted! The combination of Nginx, Redis, and Docker was overwhelming at first, but breaking it down step by step made things much clearer. One of my biggest issues was setting up proper load balancing to distribute traffic efficiently, but I finally got it working. If anyone else is facing similar issues, I highly recommend checking out thisguide on setting up load balancing using Nginx. It helped me optimize my setup even further. Thanks again for this article—it's a lifesaver!"

Are you sure you want to hide this comment? It will become hidden in your post, but will still be visible via the comment'spermalink.

For further actions, you may consider blocking this person and/orreporting abuse
