Upload parameters seem to have no effect in deployment

I have been struggling with this for a few days now and cannot find a solution. Uploading small files (< 2.5 MB) works fine, but when files get bigger I get a 502 Bad Gateway error. The issue is not a timeout: the browser's upload progress reaches 100% regardless of the size and time it takes, but then, after a few seconds of "waiting for the host", it returns a 502 Bad Gateway error.

I have noticed that this error happens before a specific function is called. I don't think my Gunicorn or nginx configuration is wrong, since neither of them logs any errors, which leads me to think it is a Django thing.

However, my Django log does not output anything related to this issue. I only have a tracker in the specific function, and it does not show up with bigger files, which tells me the function never gets called.

Now, I have seen the upload parameters in the Django documentation and have included them in my settings.py, but they seem to have no effect; still, I suspect they could be what throws things off. I have attached my views.py, forms.py and settings.py, in the hope that someone can see something that I don't!

import datetime
import logging
from urllib.parse import urlparse

from django import forms
from django.contrib.auth.mixins import LoginRequiredMixin
from django.views.generic.edit import FormView

logger = logging.getLogger(__name__)


class ETL(forms.Form):
    Historical = forms.FileField()
    Pre_processing = forms.FileField()
    Supplier = forms.FileField()
    parameters = forms.FileField()

    def process_data(self, url, *args, **kwargs):
        begin_time = datetime.datetime.now()
        t = urlparse(url).netloc


class Getfiles(LoginRequiredMixin, FormView):
    # LoginRequiredMixin already guards dispatch(), so the extra
    # @method_decorator(login_required, name='dispatch') was redundant.
    template_name = 'upload.html'
    form_class = ETL
    success_url = 'dash.html'  # note: FormView expects a URL here, not a template name

    def form_valid(self, form):
        logger.info('data acquired')  # moved inside the method; at class level it runs once at import time
        url = self.request.build_absolute_uri()
        logger.debug('data has been processed')
        return super().form_valid(form)

Maybe somebody has encountered the same problem and has a hint about how to debug this, or perhaps an alternative way to do the same thing. Thank you!

How large are these files that you’re uploading?

What is your client_max_body_size setting for nginx? And is it assigned in the right location?
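For reference, a minimal sketch of where that directive usually lives (server name and upstream address are placeholders, the 25M value is illustrative):

```nginx
server {
    listen 80;
    server_name example.com;  # placeholder

    location / {
        # must cover the sum of all files uploaded in one request
        client_max_body_size 25M;
        proxy_pass http://127.0.0.1:8000;  # assumed gunicorn bind
    }
}
```

The directive is also valid at the `http` and `server` levels; a value set in a more specific `location` block overrides the outer ones.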

You might want to increase the logging level to DEBUG at all points (Django, Gunicorn, nginx) to see if the more detailed output points you in the right direction.
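A minimal sketch of a DEBUG-level `LOGGING` setting for settings.py (handler and logger names here are illustrative, not required by Django):

```python
# settings.py (sketch): send everything at DEBUG and above to the console,
# so gunicorn's captured stdout shows Django's own diagnostics.
LOGGING = {
    "version": 1,
    "disable_existing_loggers": False,
    "handlers": {
        "console": {"class": "logging.StreamHandler"},
    },
    "root": {"handlers": ["console"], "level": "DEBUG"},
    "loggers": {
        # django.request logs unhandled exceptions raised during a request
        "django.request": {
            "handlers": ["console"],
            "level": "DEBUG",
            "propagate": False,
        },
    },
}
```

The `django.request` logger is the interesting one for a 502: it receives unhandled exceptions from the request/response cycle.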

Also, if you’re running this in a test / development environment, you might want to run these as foreground programs to see any console output that might be generated.
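For example (the WSGI module name is a placeholder), keeping both processes in the foreground:

```shell
# Run gunicorn in the foreground so tracebacks print straight to the terminal.
# "myproject.wsgi" is a placeholder for your actual WSGI module.
gunicorn myproject.wsgi:application \
    --bind 127.0.0.1:8000 \
    --log-level debug

# nginx can likewise be kept in the foreground while testing:
# nginx -g 'daemon off;'
```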

(In my experience, a 502 is returned when an uncaught exception is thrown, which could also explain why there’s nothing in the log about it.)


Hi! Each time, the user uploads 4 files totalling at most 20 MB. I have set client_max_body_size to 20M, and I think it is in the right place: when I lowered it to 1M I got an error, which shows me it is being applied.

I will try modifying my logging and see if anything shows up! There is also something that catches my attention. The Django documentation mentions that files smaller than 2.5 MB are stored in memory, while anything larger is written to a temporary folder. In that case, would that mean that nginx needs to look in two places?
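On that last point: the 2.5 MB threshold is Django's default `FILE_UPLOAD_MAX_MEMORY_SIZE` (2621440 bytes), and it only governs what Django does after it has received the body. nginx just proxies the request to Gunicorn and never reads either location. A quick sketch of the arithmetic (`chosen_handler` is an illustrative helper, not a Django API):

```python
# Django's default FILE_UPLOAD_MAX_MEMORY_SIZE is 2.5 MB:
DEFAULT_FILE_UPLOAD_MAX_MEMORY_SIZE = 2621440  # bytes
assert DEFAULT_FILE_UPLOAD_MAX_MEMORY_SIZE == int(2.5 * 1024 * 1024)

def chosen_handler(size_bytes, threshold=DEFAULT_FILE_UPLOAD_MAX_MEMORY_SIZE):
    """Illustrative helper: which side of the threshold a file falls on.

    Below it, Django keeps the upload in RAM (MemoryFileUploadHandler);
    above it, the upload is streamed to a temporary file
    (TemporaryFileUploadHandler, in FILE_UPLOAD_TEMP_DIR).
    """
    return "memory" if size_bytes <= threshold else "temporary file"

print(chosen_handler(1 * 1024 * 1024))   # a small upload
print(chosen_handler(20 * 1024 * 1024))  # a large upload
```

So nginx does not need to "look" anywhere: the temp directory is purely internal to the Django worker process.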

I have tried setting DATA_UPLOAD_MAX_MEMORY_SIZE to see what happens if everything is stored in memory, as well as setting DATA_UPLOAD_MAX_NUMBER_FIELDS to a very small number to see if any error would show up, but nothing. I am a little confused: wouldn't it be enough to have them in the settings file?
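For what it's worth, these settings only need to be module-level names in the settings module Django actually loads (worth checking `DJANGO_SETTINGS_MODULE` in the Gunicorn environment). A sketch with illustrative values; note that file uploads are not counted against `DATA_UPLOAD_MAX_MEMORY_SIZE`, which may be why changing it appeared to do nothing:

```python
# settings.py (sketch) - upload-related limits, values illustrative.
MB = 1024 * 1024

# Max request body Django will buffer for NON-file POST data.
# File uploads themselves are not counted against this limit.
DATA_UPLOAD_MAX_MEMORY_SIZE = 25 * MB

# Per-file threshold between in-memory and temp-file handling.
FILE_UPLOAD_MAX_MEMORY_SIZE = 25 * MB

# Exceeding this raises TooManyFieldsSent (a 400 response, not a 502).
DATA_UPLOAD_MAX_NUMBER_FIELDS = 1000
```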