Limiting PR creation to “verified” contributors/collaborators

Hi all, I’m posting to suggest a possible workflow change to help use maintainer review time more efficiently for the Django project.

Recently, I’ve been hearing more often about an increase in pull requests that appear to be AI-assisted and submitted without sufficient review—for example, changes proposed without adequate testing, without attempting to reproduce the issue, or with too little context. If this trend continues, it may reduce the capacity to review and merge genuinely helpful contributions.

As one operational idea, GitHub provides repository settings (such as temporary interaction limits) that let maintainers restrict who can interact with a project, depending on its needs. Building on that, could we consider a flow like the following?

  • Allow Pull Request creation only for verified users (e.g., collaborators).

  • Ask other contributors to start with an Issue / Forum post that includes the proposal, reproduction steps, and scope.

  • After maintainers confirm direction and scope, they can:

    • invite the contributor as a collaborator, or
    • ask for a PR only after agreement on the Issue (e.g., apply an accepted label and then request a PR).

I understand this introduces some initial friction for new contributors. However, for projects where PR review costs are high, it may help reduce unnecessary traffic and protect limited maintainer resources.

My intent is not to discourage new contributions, but to explore policy options that help keep the project sustainable and healthy. Feedback—especially from those who have tried similar approaches (pros/cons, criteria for “verification,” and any helpful templates or automations)—would be greatly appreciated.

4 Likes

I appreciate you thinking about this situation. It’s definitely a challenge we’ve been facing for years.

I think we should avoid incorporating the forum into the workflow. Ideally that conversation about the ticket and the solution should happen on Trac or GitHub. I think I’m generally against adding more to the workflow for code contributions. There’s enough complexity to keep people away.

I think I like the idea of having a verified state for people. I think this happens already, but it’s all mental. People who are recognized and known to have higher quality work are more likely to have their PRs reviewed because they are easier to approach. There’s less of a hurdle. I believe that’s what you’re trying to formalize.

The benefit of formalizing it is that we can define the criteria to be another way to communicate what it means to be a positive contributor. My concern would be: how could that be added while requiring the least effort to maintain?

2 Likes

I absolutely agree. We already have a Trac system, which mostly attracts serious contributors, and beginners who are learning will most likely be using AI to understand the workflow to some extent (because why not). Adding a more complex workflow can sometimes overwhelm the contributors who are here to learn!

Requiring “permission” to raise a PR is probably a bad idea. The low-quality submissions are a problem, but it’s not worth adding friction for legitimate contributors.

Improving automation around closing PRs from first-time contributors with no ticket may yield more benefit.

1 Like

I saw this repo the other day: mitchellh/vouch, a community trust management system based on explicit vouches to participate.

Possibly one for us as a community to watch and see how it evolves, if we desire more control.

3 Likes

Yes, that’s correct! What I mentioned was only an example, and as you pointed out, I also hope that the idea of a “verified state” could be formalized. However, at this moment, I don’t have a clear or concrete solution for how to implement it. I believe this is something we can continue to explore and figure out together through further discussion here.

That said, I think the approach suggested by theorangeone—applying a specific workflow only to new contributors and automatically closing submissions if certain conditions are not met—could also be a practical option. However, I don’t think it would be appropriate to consider someone fully trusted after just a single contribution. It might make more sense to define a certain threshold (for example, a number of meaningful contributions), and once that threshold is met, the additional workflow requirements would no longer apply.
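As a rough illustration of that threshold idea, the rule could be sketched as below. The function name, the collaborator exemption, and the threshold of 3 merged contributions are all hypothetical examples for discussion, not an agreed policy:

```python
# Hypothetical sketch: decide whether the extra new-contributor workflow
# (ticket-first, auto-close checks, etc.) applies. The threshold value of 3
# is an arbitrary example, not an agreed project policy.

TRUST_THRESHOLD = 3  # merged contributions before the extra steps are waived


def extra_workflow_required(merged_contributions: int, is_collaborator: bool) -> bool:
    """Return True if the contributor must go through the additional
    ticket-first workflow before opening a PR."""
    if is_collaborator:
        return False
    return merged_contributions < TRUST_THRESHOLD
```

Under this sketch, a first-time contributor would still go through the extra steps, while someone with a few merged contributions (or collaborator status) would not.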

I agree with that point as well. I think using AI as a tool—like you mentioned—and contributing after doing sufficient review is absolutely a positive approach. However, there seem to be more cases where people rely on AI without proper review and end up submitting low-quality code. Because of that, I feel that having some kind of safeguard or requirement in place is becoming increasingly necessary.

While contributors are certainly important, I believe we should also be mindful of the reviewers, who often invest even more time and effort in the process. Reviewing thoughtful and well-prepared contributions can be very rewarding. However, there have been instances in the past where a large number of low-quality or spam-like pull requests were submitted within a short period of time, which understandably increased the workload for maintainers.

Of course, it’s difficult to predict whether such situations will occur again, but occasional low-quality submissions still seem to appear from time to time. For that reason, I feel it may be worth considering some form of safeguard to help protect reviewer time and maintain sustainability.

Ultimately, I believe the perspectives of Django Fellows and active code reviewers are especially important in this discussion, as they are the ones most directly impacted by these challenges.

You make a great point about the benefits of a good review and the burden of a poor one. Sustainability is important. If the Fellows experience burnout, the entire project will struggle. Since we can’t foresee when the next ‘spam wave’ might arrive, creating a safeguard now, even if it’s not fully active, seems like a wise way to manage technical debt. I’d like to hear from the current maintainers: what types of PRs take up most of your time?

What if the DSF launched something like “Review First” as a community initiative? A clear, positive message: the best way to contribute to Django right now is to review a pull request. Put it on the website, talk about it at DjangoCon, make it part of how we welcome new contributors. Back it up with a visible review queue, lightweight onboarding for new reviewers, recognition for review work, and regular community review sprints. Not new rules or restrictions. Just a shift in what we celebrate and where we point people when they show up wanting to help. It’s also a great opportunity to embrace AI tools for code review.

4 Likes

Thank you everyone for this conversation! I came here after feeling overwhelmed, again, by low-quality submissions. Below are my thoughts as a Django Fellow:

Answering some points: while Django has faced this for years, the situation has gotten significantly worse recently (in the last year-ish, I’d say), and I think there are two drivers that negatively reinforce each other:

  1. Longer-standing issue: the rapid improvement of LLMs has lowered the barrier to producing plausible-looking patches. (Seemingly) a contributor no longer needs to understand the codebase to open a PR; they just need to paste a ticket description into an LLM interface and configure some GH tokens. The resulting submissions look superficially reasonable but are far from Django quality. (Tangent: this pain point is also affecting security reports, and it’s huge.)

  2. Seasonal issue and very acute right now: GSoC applications. We are currently seeing a flood of aspirants competing for a spot, and simple, well-defined tickets have become contested territory. It is not unusual to see two or three concurrent PRs open for the same ticket at the same time, all clearly LLM-generated, from contributors with no prior engagement with the project. The dynamic seems to be “I need a merged PR for my application” instead of “I want to contribute to Django”, therefore that incentive actively discourages coordination and care for the project.

On the verification idea, I agree with @CodenameTim: what he describes already happens informally, and formalizing it is IMHO the right direction. At this point, the situation is bad enough that I would rather have an imperfect solution in place than no solution at all. I would genuinely prefer to spend time manually updating an allowlist than continue dealing with the PR nightmare and the flood of low-quality conflicting submissions we have.

To address @theorangeone’s concern: the allowlist would be pre-populated with everyone who has already contributed to Django, so no existing contributor is blocked and no one’s flow is affected. New contributors would just need a small, defined step before they can open PRs, which IMHO is a fair ask.

@seanhelvey, I think your proposal is an amazing idea and I would most definitely support it. This feels like it deserves its own thread though, so the discussion can get the attention it warrants. I would also suggest looping in @ontowhee and @raffaellasuardini who have been working on incentivizing reviewers.

8 Likes

Some quick numbers on the PRs from this year that could be useful:
Of the 343 PRs opened, 96 are still open and 247 are closed. Of the closed ones, 115 are merged, so only 46% of the closed PRs were merged.

@nessita Thank you as always for taking the time to review, and for responding to this discussion. Your explanation was really clear, and it helped turn what felt like a blunt, vague set of options into something much more concrete and actionable — I genuinely appreciate it.

I was also wondering: has Django already been working on any concrete mitigations internally (like the allowlist approach you mentioned), or would this be something we’re only starting to explore now?

And relatedly, I think we could get a lot of leverage out of actually using an allowlist-style process via GitHub workflows. If any help is needed, I’d be very happy to contribute.

@frankwiles expressed interest in adding some automation to close PRs that don’t have accepted tickets or don’t fill out the PR checklist, with some exceptions for folks that have already committed.

Then I expect we could fine-tune from there.

1 Like

@jacobtylerwalls Thanks for the suggestion. That sounds like a very reasonable direction.

Automation around closing PRs without accepted tickets or an incomplete checklist could help filter out a large portion of low-context submissions while still keeping the contribution flow open.

Starting simple and then gradually refining the rules based on real-world usage also makes a lot of sense.
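As a sketch of what the core check of such automation might look like (the Trac ticket pattern and the checklist marker here are assumptions for illustration, not the real bot's rules):

```python
import re

# Hypothetical triage sketch: decide whether a PR from a first-time
# contributor should be auto-closed. The ticket pattern mirrors Django's
# "Fixed #12345 -- ..." commit convention; the checklist check is a
# simplified stand-in for the PR template.

TICKET_RE = re.compile(r"#(\d{4,5})\b")


def should_auto_close(title: str, body: str, is_first_time: bool) -> bool:
    """Close first-time PRs that reference no Trac ticket or that leave
    the PR checklist entirely unchecked."""
    if not is_first_time:
        return False
    has_ticket = bool(TICKET_RE.search(title) or TICKET_RE.search(body))
    has_checked_item = "- [x]" in body.lower()
    return not (has_ticket and has_checked_item)
```

Existing contributors pass through untouched, which matches the idea of only applying the stricter rules to first-time submissions.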

I’m thinking of opening a ticket related to this. What do you think?

A ticket wouldn’t hurt. It’s already advanced to the “trial first in a separate repo” stage here: frankwiles/pr-playground (automatically close bad PRs), feedback welcome!

2 Likes

If we do move toward a ‘verified contributor’ model, the most critical part will be the onboarding path. We need a transparent set of criteria that grants PR rights.

My concern is that ‘mental’ verification (as Tim mentioned) favors those already in the inner circle. If we formalize this, let’s not make it just a barrier to entry.