Best Practices: Deprecation policy for EOL Python versions

[This post was originally a Django Commons discussion and moved to the Django Forum]

Original author: @ulgens

Idea

When a Python version reaches end of life (EOL), support for it should be removed from all packages, either immediately or after a short, agreed-upon grace period: Status of Python versions

An automated rule/workflow that removes EOL Python versions from all packages in the organization may be the end goal there. If a package doesn’t want to follow the common deprecation policy, exceptions can be made.

  • “We don’t want to update our dependencies because people using us don’t update theirs” should not be a valid excuse for an exemption from the policy. Python EOL dates are announced multiple years in advance.
  • A debugging tool, or anything else that helps users move away from EOL or outdated versions, could be a reasonable exception.

Benefits

  • It may be obvious, but nobody should spend their energy supporting EOL dependencies. Instead, we can recommend that people update their Python version; I believe that will be more beneficial for all parties.
  • It’s hard to debug old things. In Python’s scope, being EOL is a good indicator that something is old. Debugging a problem on a supported version of Python is easier than on an EOL version.
  • It helps to keep things modern.

Rules

  1. The min Python requirement shouldn’t be an EOL version.
  2. Min Python requirement doesn’t have to be the oldest supported version. Projects can define a higher lower bound.
  3. The project’s linter shouldn’t include anything older than the min Python requirement.
  4. The project’s GitHub Actions shouldn’t include anything older than the min Python requirement.
  5. The project’s trove classifiers shouldn’t include anything older than the min Python requirement.
  6. The project’s code shouldn’t use features from anything older than the min Python requirement.
  7. The project’s code should be optimized and modernized for the min Python requirement.
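As a sketch of what rules 1, 3, 4, and 5 can look like in one place, here is a hypothetical pyproject.toml fragment (assuming Ruff as the linter and 3.10 as a placeholder minimum, not a recommendation):

```toml
[project]
# Rule 1: the lower bound is a non-EOL version.
requires-python = ">=3.10"
# Rule 5: no trove classifiers older than the minimum.
classifiers = [
    "Programming Language :: Python :: 3.10",
    "Programming Language :: Python :: 3.11",
    "Programming Language :: Python :: 3.12",
    "Programming Language :: Python :: 3.13",
]

# Rule 3: the linter targets the same minimum.
[tool.ruff]
target-version = "py310"
```

And a matching GitHub Actions matrix fragment for rule 4:

```yaml
# .github/workflows/test.yml (fragment)
jobs:
  test:
    strategy:
      matrix:
        python-version: ["3.10", "3.11", "3.12", "3.13"]  # nothing below the minimum
```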

Automation

I’ll make some assumptions and list a couple of prerequisites that we didn’t discuss before.

  • All projects should include [pyupgrade](https://github.com/asottile/pyupgrade)
    • It is the perfect tool for the job, but is there any alternative?
  • pyupgrade should be configured for the minimum supported version, i.e. run with the --py<min_supported_version>-plus flag
  • pyupgrade should be applied automatically both in the development environment and in GitHub Actions.
    • My personal preference for automatic quality checks in a development environment is pre-commit, which I then run via a thin wrapper in GitHub Actions. I guess tox is the more common choice for package development, but I think the same can be achieved with it.
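For reference, a minimal pre-commit configuration along those lines might look like this (the rev and minimum version below are placeholders, not recommendations):

```yaml
# .pre-commit-config.yaml
repos:
  - repo: https://github.com/asottile/pyupgrade
    rev: v3.19.0  # placeholder; pin to the latest release
    hooks:
      - id: pyupgrade
        args: [--py310-plus]  # match the project's minimum Python
```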

These steps will only handle rules 6 and 7. Please share your recommendations for the rest.
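For rule 5, one option is a small check in CI that compares the declared trove classifiers against the minimum version. This is only a hypothetical sketch (the helper name and the hard-coded minimum are mine, not an existing tool):

```python
import re

# Matches classifiers of the form "Programming Language :: Python :: X.Y"
_CLASSIFIER_RE = re.compile(r"Programming Language :: Python :: (\d+)\.(\d+)")

def classifiers_below_min(classifiers, min_version):
    """Return trove classifiers that advertise a Python version
    older than the declared minimum (rule 5)."""
    bad = []
    for classifier in classifiers:
        match = _CLASSIFIER_RE.fullmatch(classifier)
        if match and (int(match.group(1)), int(match.group(2))) < min_version:
            bad.append(classifier)
    return bad

print(classifiers_below_min(
    [
        "Programming Language :: Python :: 3.8",
        "Programming Language :: Python :: 3.12",
    ],
    (3, 10),
))
# -> ['Programming Language :: Python :: 3.8']
```

A CI job could read the classifiers and requires-python straight out of pyproject.toml and fail the build when this returns a non-empty list.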


Updates

Oct 31, 2024

  • Grammar & typo improvements
  • Rewrote the idea part.
  • Updated the title for clarity

@CodenameTim Thanks for moving this; I hope it attracts some interest.

For Django Commons, would we like to enrich the package list with data points similar to the ones in hugovk (Hugo van Kemenade)’s “maintained packages” table on GitHub? I believe this would help us understand the current state of the packages.

+1
Maintaining old Python versions is painful, and in many cases it makes developing new features a pain.


I personally don’t see the value in doing that on that particular list page of Django Commons’ documentation. I agree there’s value in understanding the state of a package, but I wouldn’t expect a person to need that information when looking at that page. I think having that information on djangopackages.org is more relevant, and more likely to align with the scenario where a user is actually looking for it.

Or having a separate page in the documentation that’s explicitly about surfacing that type of analysis. If we went that route, we’d want to find a way to programmatically calculate things, because I don’t think we should ask a package maintainer to update this page after making a release. But again, that feels like a djangopackages.org domain problem.

From my personal experience, it’s worth distinguishing between:

  • What the package docs state to set expectations with users and contributors.
  • The testing matrix / automation to QA for a given Python version.
  • The code using language constructs incompatible with given versions.

“We don’t want to update our dependencies because people using us don’t update theirs” should not be a valid excuse for an exemption from the policy. Python EOL dates are announced multiple years in advance.

The other side of that coin is if you’re too eager to drop support, you make it harder for devs who work in legacy / slow-moving environments to adopt your package. It’s a trade-off. From experience, it’s often felt worthwhile to do a bit more support than I’d planned to get this adoption.


I do have some thoughts on this, and I already shared some of them in the proposal, but funny enough, I just learned that we are doing the opposite of this with Django releases.

Following the release of Django 6.0, we suggest that third-party app authors drop support for all versions of Django prior to 5.2.

While we are talking about not dropping EOL Python versions, Django release notes recommend dropping support for an active LTS (4.2) version.

I’m confused about what the common take is here.

Are the two even that comparable to start with? For Django versions, since there are breaking changes, the workarounds needed to support multiple versions start to pile up; add enough supported versions and you really feel the pain. For Python versions, there’s really not that much cost in adding support for newer versions while retaining (some) support for older ones?

Having said that, the note in the release notes strikes me as too eager, considering 4.2 is supported for 5 more months. Unless there were big breakthroughs I could make by dropping support, I’d rather keep the package usable by a larger audience, and avoid awkward conversations with contributors and other key adopters who might be interested in helping but need the package to work on older versions.

Downloads by version over time / release adoption chart for ref:

As of November 2025, Django 4.2 sees 8M monthly downloads, about 30% of the downloads across supported versions (28M). Django 5.2 sees 15M downloads, or 57% of supported versions.


Thinking about this for my own packages: I work at a government lab that struggles to keep versions of things up to date. It is getting better, but running an EOL Python is not an unlikely restriction to run into, so I see great value in supporting as far into the past as is easy. Usually what triggers me to drop support at the tails is the following decision tree:

  1. Is the dep EOL? No → do not drop support (including for Django); yes → go to 2.
  2. Do I get to delete code if I drop the EOL’d thing? No → go to 3; yes → drop it.
  3. Is supporting the EOL’d thing preventing something I want to do that requires newer libs? No → go to 4; yes → drop it.
  4. Are my CI permutations getting ridiculous and slow? Yes → drop it; no → maybe don’t drop it yet.
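The decision tree above can be sketched as a tiny predicate (purely illustrative; the function and flag names are mine):

```python
def should_drop_support(dep_is_eol, deletes_code, blocks_new_work, ci_is_painful):
    """Walk the decision tree: never drop a non-EOL dependency;
    drop an EOL one only when there is a concrete payoff."""
    if not dep_is_eol:      # step 1: still supported -> keep it
        return False
    if deletes_code:        # step 2: dropping lets us delete code
        return True
    if blocks_new_work:     # step 3: it blocks something we want to do
        return True
    return ci_is_painful    # step 4: CI matrix pain is the tiebreaker

print(should_drop_support(True, False, False, True))
# -> True
```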

I’m honestly not sure I care that much about what other libs do with their support right around the EOL boundary. I care way more about their forward support. The goal for every package should be to have a release out that’s tested against the RCs of their major upstream dependencies by the time those dependencies release those versions. If you’ve got one thing in your dependency stack that’s lagging, it can delay upgrading your downstream software or require you to do weird workarounds.

Guidance

I’m happy if y’all want to offer guidance along the lines of “We suggest supporting non-EOL Pythons and the oldest to latest active Django LTS releases.” I think it’s ideal to cover the tails of the Django LTS releases, because a lot of orgs are slow to update, and if you can’t get a critical bug or security issue fixed in a dep without upgrading your whole stack, it might make decision-makers think twice about continuing with Django.

Supporting the oldest active LTS also doesn’t mean your latest trunk has to support it; it may just mean that you’d be willing to issue a bug-fix release off the last tag that supported the oldest LTS. It’s hard to signal this with badges/classifiers though…

Easy Buttons

Guidance is great, but easy buttons are better. I’ve long been annoyed by the amount of manual work it takes to keep my CI matrices up to date. Tools like nox offer tie-ins to GitHub Actions that let you sprinkle a little more automagic dust on your matrix resolution, but the way a lot of these tools work is to force-install matrix versions in place, which can create weird version combinations of ancillary dependencies in virtual environments that you wouldn’t find on production systems. I started work on a tool (PTM - Portable Test Matrix) to solve this for my own libs. The idea being:

  1. Use uv’s pip commands to build a realistic virtual environment for each test matrix realization
  2. Support lowest/highest resolution strategies
  3. Allow matrix to be defined in the toml and fed into GHA (or other CI runner)
  4. Allow strategy to be referenced to an external source (HTTP GET)
  5. Support dynamic strategies based on dependency
  6. Nicely visualize test matrices

The easy-button idea is to develop tooling that package maintainers could pull off the shelf to keep their package testing automatically up to date, and also to test against all the supported RDBMSs if need be. Alas, I got busy and don’t have this in a workable state yet.