Cannot set up Docker with Postgres and Django; it gives this error:

django.db.utils.OperationalError: could not translate host name "db" to address: Temporary failure in name resolution

My .env file:

APP_ENV=development

DEBUG=True

ALLOWED_HOSTS=localhost,127.0.0.1,0.0.0.0

DB_NAME=book

DB_USER=book

DB_PASSWORD=book

DB_PORT=5432

DB_HOST=db

SECRET_KEY=django-insecure-0pmsi9w(^$$d@csiau%knd#70v(oe0%lxp20cklaswbh%05s=@3

This is my docker-compose.yml file

version: '3'

services:
  postgres:
    image: postgres:14-alpine
    ports:
      - 5432:5432
    volumes:
      - database-data:/var/lib/postgresql/data
    environment:
      - POSTGRES_PASSWORD=book
      - POSTGRES_USER=book
      - POSTGRES_DB=book
  bot:
    build:
      context: .
      dockerfile: Dockerfile
    command: >
      bash -c "python3 manage.py makemigrations && python3 manage.py migrate && python3 manage.py runserver 0.0.0.0:8000"
    volumes:
      - ./:/app
    ports:
      - "8000:8000"
    env_file:
      - .env
    depends_on:
      - postgres

volumes:
  database-data:

and this is my DATABASES setting in settings.py:

DATABASES = {
    'default': {
        "ENGINE": "django.db.backends.postgresql_psycopg2",
        "NAME": env("DB_NAME"),
        "USER": env("DB_USER"),
        "PASSWORD": env("DB_PASSWORD"),
        "HOST": env("DB_HOST"),
        "PORT": env("DB_PORT"),
    }
}
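The `env()` calls above presumably come from a helper such as django-environ; for readers unfamiliar with it, here is a rough standard-library-only stand-in (the function and the `DB_SSLMODE` name are illustrative, not the actual library API):

```python
import os

def env(key, default=None):
    # Minimal stand-in for an env() helper: read a setting from the
    # process environment, falling back to an optional default.
    value = os.environ.get(key, default)
    if value is None:
        raise KeyError(f"missing required environment variable: {key}")
    return value

# Simulate the .env values being loaded into the environment:
os.environ["DB_HOST"] = "db"
os.environ.pop("DB_SSLMODE", None)   # ensure unset for the demo

print(env("DB_HOST"))                # -> db
print(env("DB_SSLMODE", "disable"))  # -> disable (default used)
```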

Can someone please help me fix this error? I am using Windows. I appreciate any help you can provide.

Your DB_HOST should be postgres, not db: Compose containers reach each other by service name, and your service is named postgres.

I changed it to postgres, but it still gives the same error:

/usr/local/lib/python3.11/site-packages/django/core/management/commands/makemigrations.py:160: RuntimeWarning: Got an error checking a consistent migration history performed for database connection 'default': could not translate host name "postgres" to address: Temporary failure in name resolution

This error is about checking migration history, so I think you should check your makemigrations and migrate commands, or even clean up your migrations folders (leaving __init__.py) and run the commands again.
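If a clean-up is ever needed, the generated migration files can be listed first before deleting anything (a sketch; it only prints, so review the list and swap the print for a delete yourself):

```python
import pathlib

# List every generated migration module under any app's migrations/
# folder, keeping each __init__.py so the packages stay importable.
for path in sorted(pathlib.Path(".").glob("**/migrations/*.py")):
    if path.name != "__init__.py":
        print(path)  # review first; replace with path.unlink() to delete
```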

You know bro, I do not have any migrations folder. I am new to this field, so I have faced many issues. If I ask simple questions, please don't get angry.

I pulled the initial project from GitHub and added the .env file. After this I ran docker compose build, and it seemed fine. Then I ran docker compose up and got the above error.

If I made any mistakes in running the project, let me know.

If that’s what you’ve pulled from some git repo, then the original author failed to push the migration files along with the code.

You’ll need to run manage.py makemigrations book general users to create the necessary migration files, then run manage.py migrate.

One possible issue you’re facing here is that your makemigrations command runs before postgres is fully up and running. You may want to introduce some kind of delay or “monitoring program” so this command waits until the database has been initialized.
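One way to do that without a custom script (an assumption: you are on Compose v2, and the official postgres image, which ships pg_isready) is a healthcheck on the database plus a conditional depends_on:

```yaml
  postgres:
    # ... existing image/ports/volumes/environment ...
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U book -d book"]
      interval: 2s
      timeout: 3s
      retries: 15

  bot:
    # ... existing build/command/volumes ...
    depends_on:
      postgres:
        condition: service_healthy
```

With this, Compose delays starting the bot service until pg_isready reports the database as accepting connections.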

Finally, you do not want to use runserver to run your application.

Quoting directly from the docs for runserver:

DO NOT USE THIS SERVER IN A PRODUCTION SETTING

You want to run your application using some other WSGI server such as gunicorn or uwsgi.
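For example, once gunicorn is added to requirements.txt, the compose command could become something like this (assuming the project's WSGI module lives at config.wsgi; adjust to the actual project package name):

```yaml
    command: gunicorn config.wsgi:application --bind 0.0.0.0:8000
```

Migrations would then be run as a separate step rather than baked into the server start command.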

Now it is working, but when I run docker compose build it does not install the modules from requirements.txt. For this reason I cannot create a superuser; when I try, it gives an import error.

To get this working in Docker I changed some files. I am providing the code below.

Dockerfile

FROM python:3.11

ENV PYTHONDONTWRITEBYTECODE 1

ENV PYTHONUNBUFFERED 1

WORKDIR /app

COPY requirements.txt /app/

RUN pip install --no-cache-dir -r requirements.txt

COPY . /app/

# Copy wait_for_db script

COPY wait_for_db.sh /app/wait_for_db.sh

RUN chmod +x /app/wait_for_db.sh

docker-compose.yml

version: '3'

services:
  postgres:
    image: postgres:15-alpine
    ports:
      - 5432:5432
    volumes:
      - database-data:/var/lib/postgresql/data
    environment:
      - POSTGRES_PASSWORD=book
      - POSTGRES_USER=book
      - POSTGRES_DB=book

  bot:
    build:
      context: .
      dockerfile: Dockerfile
    command: >
      bash -c "./wait_for_db.sh && python3 manage.py makemigrations book general users && python3 manage.py migrate && python3 manage.py runserver 0.0.0.0:8000"
    volumes:
      - ./:/app
    ports:
      - "8000:8000"
    env_file:
      - .env
    depends_on:
      - postgres

volumes:
  database-data:

And added a new script, wait_for_db.sh:

#!/bin/bash
# wait_for_db.sh

# Function to check if PostgreSQL is ready to accept connections
check_postgres() {
    python3 - <<END
import psycopg2
import sys
import time

def check():
    try:
        conn = psycopg2.connect(
            dbname="book",
            user="book",
            password="book",
            host="postgres",
            port="5432"
        )
        conn.close()
        return True
    except psycopg2.OperationalError as e:
        return False

if __name__ == "__main__":
    retries = 20
    while retries > 0:
        if check():
            sys.exit(0)
        else:
            retries -= 1
            time.sleep(1)
    sys.exit(1)
END
}

# Wait for PostgreSQL to be ready
check_postgres
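As a side note, the credentials hard-coded in the script above could instead reuse the same DB_* variables Django reads, since env_file already loads .env into the container (a sketch, with fallbacks matching the current hard-coded values):

```python
import os

# Build psycopg2 connection parameters from the same DB_* environment
# variables the Django settings use, instead of duplicating them here.
conn_params = {
    "dbname": os.environ.get("DB_NAME", "book"),
    "user": os.environ.get("DB_USER", "book"),
    "password": os.environ.get("DB_PASSWORD", "book"),
    "host": os.environ.get("DB_HOST", "postgres"),
    "port": os.environ.get("DB_PORT", "5432"),
}
# psycopg2.connect(**conn_params) would replace the hard-coded call.
print(conn_params["host"])
```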

[+] Building 0.0s (0/2)
[2024-04-05T16:04:05.298072500Z][docker-credential-desktop][W] Windows version might not be up-to-date: The system cannot find
[+] Building 1.2s (12/12) FINISHED
 => [bot internal] load build definition from dockerfile                                                                 0.0s
 => => transferring dockerfile: 318B                                                                                     0.0s
 => [bot internal] load .dockerignore                                                                                    0.0s
 => => transferring context: 2B                                                                                          0.0s
 => [bot internal] load metadata for docker.io/library/python:3.11                                                       1.0s
 => [bot 1/7] FROM docker.io/library/python:3.11@sha256:58ef3c375e20ca749f5bf4d1a62186a01e9d1d5756671037e391c832a5062d1  0.0s
 => [bot internal] load build context                                                                                    0.0s
 => => transferring context: 3.24kB                                                                                      0.0s
 => CACHED [bot 2/7] WORKDIR /app                                                                                        0.0s
 => CACHED [bot 3/7] COPY requirements.txt /app/                                                                         0.0s
 => CACHED [bot 4/7] RUN pip install --no-cache-dir -r requirements.txt                                                  0.0s
 => CACHED [bot 5/7] COPY . /app/                                                                                        0.0s
 => CACHED [bot 6/7] COPY wait_for_db.sh /app/wait_for_db.sh                                                             0.0s
 => CACHED [bot 7/7] RUN chmod +x /app/wait_for_db.sh                                                                    0.0s
 => [bot] exporting to image                                                                                             0.0s
 => => exporting layers                                                                                                  0.0s
 => => writing image sha256:ad50d10415cc3ed5ed450ae4c3db167b1971d66fefd5e33b51be3c56053c4c1f                             0.0s
 => => naming to docker.io/library/share-book-bot    

Correct, because you’re now trying to run manage.py from outside your container environment.

Additionally, by shutting down your containers, postgres is no longer running. So when you’re trying to run createsuperuser, you don’t have a database available.

Can you teach me what to do?

Is this your first deployment of a Django project?

Or is this just your first Docker-based deployment?

Please provide some more detail about your working environment and where and how you’re trying to deploy this.

So, for the first time, I am working on a project as a team. Previously I did projects by creating environments only through venv. I learned Docker for this project. My teammates prepared the initial part of the project and uploaded it to GitHub. I pulled the project from GitHub, and I needed to run it on my computer by creating an environment through Docker. Now I have used Docker, and it doesn’t get the required modules from the requirements.txt file. Currently the project is in its initial stage; we are implementing the bot. We plan to make the bot’s admin panel in Django. I don’t need to deploy the project anywhere yet, I just need to run it properly on my local system.

What does your Dockerfile look like for your bot? That’s where you would be creating the environment and installing modules.

Sorry, but I do not understand what you are asking.

For now, I need to run the project and be able to create a superuser within a working Docker container. Can you help me do this?

I need to see the Dockerfile that you are using to build the bot container.

I think you are asking about this one:

  bot:
    build:
      context: .
      dockerfile: Dockerfile
    command: >
      bash -c "./wait_for_db.sh && python3 manage.py makemigrations book general users && python3 manage.py migrate && python3 manage.py runserver 0.0.0.0:8000"
    volumes:
      - ./:/app
    ports:
      - "8000:8000"
    env_file:
      - .env
    depends_on:
      - postgres

No, that’s the docker-compose file.

Notice how it’s referencing a file named “Dockerfile”:

That’s the file I need to see next.

FROM python:3.11

ENV PYTHONDONTWRITEBYTECODE 1
ENV PYTHONUNBUFFERED 1

WORKDIR /app

COPY requirements.txt /app/
RUN pip install --no-cache-dir -r requirements.txt

COPY . /app/

# Copy wait_for_db script
COPY wait_for_db.sh /app/wait_for_db.sh
RUN chmod +x /app/wait_for_db.sh

So you’ve got a couple of options here:

  • Build your “bot” container such that you have a shell available that would allow you to execute commands from a shell within that container.

  • Build a separate container for running manage commands and use it for running those commands.

  • Configure a virtual environment in your host system with your application, and configure your installation to use the postgres docker container as the database. (This may or may not be easy, depending upon how postgres is configured in its container.)
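For the first option, with the stack already running via docker compose up, the commands would look something like this (assuming the service name bot from your compose file):

```shell
# Run manage.py inside the already-running bot container:
docker compose exec bot python3 manage.py createsuperuser

# Or start a one-off container (postgres is started via depends_on):
docker compose run --rm bot python3 manage.py createsuperuser
```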

You mean to run only postgres from the docker-compose.yml file, without running the bot container, and I have to start the server with a venv environment I created, right?