Docker & Django app

Platform independence can ease the development process: developers usually work on different operating systems with custom configurations, and using containers to overcome the differences between working environments is a common solution nowadays. To see how container images work in practice, I tried Docker with my Django app.


Install Docker

Sign up to Docker Hub!

https://hub.docker.com/

Download & install Docker Desktop!

https://hub.docker.com/editions/community/docker-ce-desktop-windows

Once Docker is installed, you can check that it works correctly. Let's run the commands below!

Open your command-line interface or shell!

$ docker --version

$ docker run hello-world

$ docker info

Using Docker

Inside a container, you can run applications much as in a virtual environment, but with the added benefit of platform independence. Docker images are essentially lightweight, virtualized operating system environments for applications and databases. As you may have noticed, Python virtual environments contain only Python packages, not databases or other non-Python applications. If you want everything to live inside one context, Docker is a good choice. So, if you would like to containerize your Django app, you need two files: a Dockerfile and a docker-compose.yml file. See the details here: https://docs.docker.com/samples/django/
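Once the two files are in place, a typical project layout looks roughly like this (the project name "mysite" is only a placeholder; your layout may differ):

your-project-dir/
├── Dockerfile
├── docker-compose.yml
├── requirements.txt
├── manage.py
└── mysite/
    ├── settings.py
    ├── urls.py
    └── wsgi.py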

Create a file named Dockerfile, without any extension!

Dockerfile
# Pull base image (set your Python 3.x version)
FROM python:3

# Set environment variables
ENV PYTHONDONTWRITEBYTECODE 1
ENV PYTHONUNBUFFERED 1

# Set work directory
WORKDIR /code

# Install dependencies
COPY requirements.txt /code/
RUN pip install -r requirements.txt

# Copy project
COPY . /code/
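The Dockerfile copies and installs requirements.txt, so the project root needs one. A minimal sketch, assuming a PostgreSQL database as in the compose file below (the version pins are only illustrative; pin the versions your project actually uses):

Django>=4.0,<5.0
psycopg2-binary>=2.9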
docker-compose.yml

The docker-compose file contains two services in this case: one for the application itself (web) and one for the database (db). You can add more services if you want (e.g. a React frontend).

Create the docker-compose file with a .yml or .yaml extension!

The "web" service depends on the "db", so it will be the first to run. The SECRET_KEY environment variable must be exported from the .env file! If volume "postgres_data:/var/lib/postgresql/data/" is added to the "db" service, the data will not be lost when the container is stopped.

version: '3.9'

services:
  web:
    build: .
    container_name: my_container
    command: python manage.py runserver 0.0.0.0:8000
    volumes:
      - .:/code
    ports:
      - 8000:8000
    depends_on:
      - db
    environment:
      # generate new, if necessary!
      - "SECRET_KEY=${SECRET_KEY}"
      - "DEBUG=True"
  db:
    image: postgres:14
    container_name: pgdb
    volumes:
      - postgres_data:/var/lib/postgresql/data/
    environment:
      - "POSTGRES_HOST_AUTH_METHOD=trust"

volumes:
  postgres_data:
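The compose file expects SECRET_KEY to come from a .env file placed next to docker-compose.yml, and the web container reaches the database through the service name "db", not localhost. A hedged sketch of the two pieces (the database name and user below are the defaults of the official postgres image; adjust everything to your own settings module):

.env

SECRET_KEY=replace-me-with-a-freshly-generated-key

settings.py (excerpt)

import os

# Read secrets and flags from the environment set in docker-compose.yml
SECRET_KEY = os.environ.get("SECRET_KEY")
DEBUG = os.environ.get("DEBUG", "False") == "True"

DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.postgresql",
        "NAME": "postgres",      # default database of the postgres image
        "USER": "postgres",      # default user of the postgres image
        "PASSWORD": "",          # trust auth method, so no password needed
        "HOST": "db",            # the compose service name, not localhost
        "PORT": 5432,
    }
}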

$ cd /your-project-dir/

$ docker-compose build

Build your image.

$ docker-compose up -d

Run the containers/services. The -d flag stands for detached mode, which means the containers run in the background. With this flag you can keep running Docker commands in the same terminal while the containers are up.
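While the containers are running, you can execute management commands inside the "web" service, for example to apply migrations (the service name comes from the compose file above):

$ docker-compose exec web python manage.py migrate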

Test if your app is running on localhost!
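For example, open http://localhost:8000 in a browser (port 8000 is mapped in the compose file), or check it from the command line:

$ curl http://localhost:8000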

$ docker-compose down

Use this command if you want to stop and remove the containers/services.
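Note that the named volume postgres_data survives docker-compose down, so the database keeps its data. If you also want to delete it, add the -v flag:

$ docker-compose down -v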