Subject: Automating Django Contribution Workflow – Input Needed
Hi everyone,
I’m exploring ways to automate certain manual processes within Django’s contribution workflow. The goal is to reduce errors and free up time for contributors by automating tasks like:
- Identifying active PRs and surfacing important ones based on engagement and priority.
- Managing aging PRs that need attention.
- Other repetitive or time-consuming tasks within the workflow.
As mentioned in the project idea, part of this effort involves working with Fellows and the community to identify the most valuable tasks to automate.
I’d love to hear your thoughts:
- What specific processes do you think would benefit most from automation?
- Are there existing pain points that slow down contributions?
- How should this automation be implemented? Should it be handled via GitHub Actions/Python scripts that write to files and auto-post updates to Discord/the forum, or would a web interface/dashboard for managing these be more useful?
Looking forward to your insights!
Thanks!
I think we need help giving feedback on PRs, as the number of reviewers is quite imbalanced against the number of PRs we receive.
This could mean:
- checking test coverage of new/updated code
- checking that changed code has at least some tests written
- checking that versionchanged/versionadded notes are included (when needed)
- checking for release notes (when needed)
- checking the line length of the docs
- when deprecating a feature, checking that the warning is tested and the deprecation is documented in all appropriate places
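Some of these checks could plausibly be scripted. As a rough illustration (the path heuristics below are assumptions for the sketch, not Django policy), a bot could inspect a PR's changed file list and flag likely-missing checklist items for a human reviewer:

```python
# Sketch: flag likely-missing review checklist items from a PR's changed
# file list. The directory heuristics here are assumptions, not Django policy.

def review_flags(changed_files):
    """Return a list of human-readable warnings for a reviewer."""
    code_changed = any(
        f.startswith("django/") and f.endswith(".py") for f in changed_files
    )
    tests_changed = any(f.startswith("tests/") for f in changed_files)
    release_notes_changed = any(f.startswith("docs/releases/") for f in changed_files)

    flags = []
    if code_changed and not tests_changed:
        flags.append("Code changed but no tests were added or updated.")
    if code_changed and not release_notes_changed:
        flags.append("Consider whether a release note is needed.")
    return flags
```

Something like this could run from a GitHub Action and post its output as a PR comment; the point is only to surface hints, never to block merges on heuristics.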
I recommend you go through PRs that have been reviewed and our contribution guidelines to understand the process. You can also look at some live PR reviews here: https://www.youtube.com/@djangonautspace/streams
But I would also be interested in ways we can “find more reviewers”, so being able to identify SMEs based on who is contributing regularly to a component could be interesting.
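One way to sketch the SME idea: feed in (author, file path) pairs extracted from something like `git log --name-only`, map paths to components, and count. The component mapping below (second path segment under `django/`) is a simplifying assumption:

```python
# Sketch: rank frequent contributors per component from (author, path) pairs,
# e.g. extracted from `git log --name-only`. Mapping a path to a "component"
# via its second segment under django/ is a simplification for illustration.
from collections import Counter, defaultdict

def top_contributors(commits, per_component=3):
    """commits: iterable of (author, path) pairs."""
    counts = defaultdict(Counter)
    for author, path in commits:
        parts = path.split("/")
        if parts[0] == "django" and len(parts) > 1:
            counts[parts[1]][author] += 1
    return {
        component: [author for author, _ in counter.most_common(per_component)]
        for component, counter in counts.items()
    }
```

A real version would want to map paths onto the components defined in Trac rather than onto raw directories.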
Note that Rails has a contributor site: https://contributors.rubyonrails.org/
We have a dashboard: https://dashboard.djangoproject.com/
I would love our dashboard to be more useful and show us, per component defined in Trac, the top contributors (reviewers/committers/ticket commenters?), filterable over time (last month, year, all time).
I hope that’s at least a starting point.
Adding more to these, as Sarah suggested on Discord:
- We should try to improve our CI to only run when required, for example, not running the test suite on a docs-only change.
- There’s a Django “contributing make toast” tutorial. I would love to make it impossible to create these PRs.
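The docs-only CI idea boils down to a single predicate over the changed file list. A minimal sketch, assuming docs live entirely under `docs/` (GitHub Actions can also express this declaratively with `paths`/`paths-ignore` filters):

```python
# Sketch: decide in CI whether the full test suite can be skipped because a
# change touches only documentation. Treating everything under docs/ as
# documentation is an assumption, not Django's actual CI configuration.

def docs_only_change(changed_files):
    """True if every changed file is under docs/ (and the list is non-empty)."""
    return bool(changed_files) and all(f.startswith("docs/") for f in changed_files)
```

A workflow step could run this against the PR's file list and set an output that later jobs use to skip the test matrix.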
Also @sarahboyce, you said:
“We also have things like deprecations”
How do maintainers currently catch backwards-incompatible changes and missing deprecation messages?
Thanks!
It’s not about catching backwards-incompatible changes. Rather, if someone has added a RemovedInDjangoXXWarning, they need to have tested the warning, added it to the release notes, and documented it in docs/internals/deprecation.txt.
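For context, testing such a warning usually looks something like the generic unittest sketch below. The warning class and function here are hypothetical stand-ins; Django's own suite uses its real RemovedInDjangoXXWarning classes and helpers such as assertWarnsMessage:

```python
# Sketch: verifying that a deprecated code path emits its warning.
# RemovedInDjango60Warning and old_function are hypothetical stand-ins.
import unittest
import warnings


class RemovedInDjango60Warning(DeprecationWarning):
    pass


def old_function():
    warnings.warn(
        "old_function() is deprecated; use new_function() instead.",
        RemovedInDjango60Warning,
        stacklevel=2,
    )
    return 42


class DeprecationTests(unittest.TestCase):
    def test_old_function_warns(self):
        # The warning must fire and the old behaviour must still work.
        with self.assertWarnsRegex(RemovedInDjango60Warning, "use new_function"):
            self.assertEqual(old_function(), 42)
```

An automated check could then look for new RemovedInDjangoXXWarning usages in a diff and flag PRs that don't also touch the tests, release notes, and docs/internals/deprecation.txt.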
Hey @sarahboyce,
I wanted to clarify a couple of things regarding the custom linters:
- Custom linter for coding style – Django already uses flake8, which enforces line-length rules. Are you referring to additional checks beyond this, such as enforcing Django-specific import ordering, docstring formats, or other style conventions that are currently enforced manually?
- Custom linter for wrapping documentation – Do you mean ensuring .rst files follow specific line-length limits and proper formatting (headings, admonitions, etc.)? Would using rstcheck or doc8 in CI be a good approach here, or do you have something else in mind?
Just want to ensure I align with what would be most beneficial for the project! Thanks.
You can read through our coding style guide: Coding style | Django documentation | Django
Yes, we implement flake8, isort, and black, but we have other custom styles which need to be manually enforced.