Best practices: Managing requirements.txt

Below is my requirements.txt. Notice the comparison operators. Whenever I push to my Heroku dyno, a buildpack parses requirements.txt and installs each package at the exact version specified. Most of these versions were "grandfathered in" when I first set up the project, meaning they are frozen in time. But package maintainers add features and patch vulnerabilities with each release, so I am missing out on those updates, putting my website at risk of exploitation. If I change the comparison operator from "equal" (==) to "greater than or equal to" (>=), then every time I deploy to my Heroku dyno the buildpack will grab the latest release of each package, and I will get all the latest security patches. But this approach raises two concerns:

  1. If I set Django to >=3.2, pip would upgrade to Django 4.0, which is not what I want. I'd rather have pip install the latest (highest) stable Django 3.2.x release, but nothing at or above Django 4.0. My question is: how do I express that constraint in requirements.txt?
  2. I figure another potential problem with this approach is that bleeding-edge releases (even those marked stable by their maintainers) could break backwards compatibility, creating cumbersome compatibility headaches down the road as I maintain my website.

What other security best practices are worth considering in this context?

When I Google "requirements.txt django" I find many outdated Stack Overflow threads, and searching the official Django docs for requirements.txt turns up nothing.

Here is my requirements.txt:

Django==3.2.5

asgiref==3.3.2
attrs==20.3.0
certifi==2020.12.5
chardet==4.0.0
dj-database-url==0.5.0
django-environ==0.4.5
django-heroku==0.3.1
gunicorn==20.0.4
heroku==0.1.4
idna==2.10
iniconfig==1.1.1
Pillow==8.3.2
pluggy==0.13.1
psycopg2==2.9.1
py==1.10.0
pytest==6.2.1
python-dateutil==1.5
python-decouple==3.4
python-dotenv==0.15.0
pytz==2021.1
requests==2.25.1
sqlparse==0.4.2
toml==0.10.2
urllib3==1.26.5
whitenoise==5.2.0

You can combine version specifiers in your requirements:

Django>=3.2,<4.0
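As a sanity check on what that range admits, you can evaluate candidate versions against the specifier using the `packaging` library (a sketch; `packaging` is a third-party package, though pip itself depends on it, so it is almost always available):

```python
from packaging.specifiers import SpecifierSet

# The same range used in the requirements line above.
spec = SpecifierSet(">=3.2,<4.0")

# Any 3.2.x release satisfies the range...
print("3.2.9" in spec)  # True
# ...but 4.0 and anything newer is excluded.
print("4.0" in spec)    # False
```

PEP 440 also defines the compatible-release operator, so `Django~=3.2.5` (meaning >=3.2.5 and ==3.2.*) expresses a similar constraint.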

The requirements.txt file is a Python / pip facility and is not part of Django.

Regarding the stability of upgrade combinations, you could have what we call a “staging” environment. It’s a non-production environment used as the final step of testing before deployment. A reasonably comprehensive suite of tests would give you more confidence that it’s going to work in your production environment, but it’s still not proof!

I recommend using pip-compile. This is a tool that takes a requirements.in file containing ranged specifiers and "compiles" it into a requirements.txt that uses exact pinned version numbers (==).

For example, you can write in your requirements.in just:

django>=3.2,<4.0

Then compile it with:

$ pip-compile
...

And your resulting requirements.txt will contain (with current version numbers):

#
# This file is autogenerated by pip-compile with python 3.9
# To update, run:
#
#    pip-compile
#
asgiref==3.4.1
    # via django
django==3.2.9
    # via -r requirements.in
pytz==2021.3
    # via django
sqlparse==0.4.2
    # via django

You can later upgrade everything with pip-compile -U (short for --upgrade).
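If you'd rather bump one dependency at a time (useful when vetting upgrades for the backwards-compatibility concerns above), pip-compile also accepts a per-package upgrade flag. A sketch, assuming pip-tools is installed:

```shell
# Recompile, allowing only Django to move to its newest release
# that still satisfies the range in requirements.in:
pip-compile --upgrade-package django

# Recompile, allowing every package to move:
pip-compile --upgrade
```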

This will make it a lot easier to keep your versions pinned, but updated. It also solves problems with unpinned/added/removed transitive dependencies (the dependencies of your dependencies).
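Once everything is pinned, you can also verify in CI or at deploy time that the running environment actually matches the compiled pins. A minimal standard-library sketch — the `parse_pins`/`check_pins` helpers and their strict `name==version` parsing are my own illustration, not part of pip-tools:

```python
from importlib import metadata

def parse_pins(text):
    """Parse pinned 'name==version' lines from a compiled requirements.txt,
    skipping blank lines and '#' comments (incl. pip-compile's '# via' notes)."""
    pins = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        name, _, version = line.partition("==")
        pins[name.strip().lower()] = version.strip()
    return pins

def check_pins(text):
    """Return (name, pinned, installed) tuples for every mismatch.
    installed is None when the package is missing entirely."""
    mismatches = []
    for name, pinned in parse_pins(text).items():
        try:
            installed = metadata.version(name)
        except metadata.PackageNotFoundError:
            installed = None
        if installed != pinned:
            mismatches.append((name, pinned, installed))
    return mismatches
```

Feed it `open("requirements.txt").read()`; an empty result means the environment matches the pins exactly.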

Hope that helps!

(I’m writing a whole section on pip-compile in my upcoming Django development experience book.)
