Reading the docs, we are cautioned against using streams:
Performance considerations
Django is designed for short-lived requests. Streaming responses will tie a worker process
for the entire duration of the response. This may result in poor performance.
Generally speaking, you should perform expensive tasks outside of the request-response
cycle, rather than resorting to a streamed response.
What I want to do is not so much send a long response (e.g. the large CSV file mentioned in the same doc), but rather push frequent notifications to clients. Server-sent events would be great here, because I really just want to tell the client "hey buddy, this happened" and maybe send a reasonable amount of data with it. The Django docs do not specifically mention server-sent events, but this seems close enough to it.
The application doesn't involve a lot of clients, since it's on a local network, and the load is fairly predictable, since the events are based on a process that generates data at a predefined rate.
So I am thinking that, in spite of the docs, I am not going to hate my life later by opting for this approach?
The question becomes one of "how many of these responses are you expecting to have 'live' at any moment?" If it's only 1 or 2, you're probably going to be ok. More than that and you can expect to start to have problems, as each streaming response ties up a worker process.
Otherwise, using either SSE or websockets is going to involve adding Channels into the mix. The Django-Eventstream package may be about the easiest way to add SSE to your application. Or you could go with a full Channels / websockets application.
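For reference, wiring django-eventstream in is roughly this - the channel name "events" here is just an example, and the package also requires additions to INSTALLED_APPS and middleware, so check its README for the full setup:

```python
# urls.py -- expose an SSE endpoint backed by django-eventstream
import django_eventstream
from django.urls import include, path

urlpatterns = [
    path("events/", include(django_eventstream.urls), {"channels": ["events"]}),
]

# Elsewhere in your code, push a notification to connected clients:
# from django_eventstream import send_event
# send_event("events", "message", {"text": "hey buddy, this happened"})
```

Browsers can then subscribe to `/events/` with a plain `EventSource`.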
You could also implement a short-polling protocol from the clients. (This depends upon how soon after the event occurs you want the client to be notified.)
Long polling would reduce the latency of the notification, but it would have the same effect as the streaming response.
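A short-polling design can be as simple as buffering events server-side under a monotonically increasing id, with clients asking "anything after id N?" every few seconds. A framework-free sketch of that idea (class and method names are mine):

```python
import itertools


class EventBuffer:
    """Bounded buffer of (id, payload) pairs for a short-polling endpoint."""

    def __init__(self, maxlen: int = 1000):
        self._events: list[tuple[int, dict]] = []
        self._ids = itertools.count(1)
        self._maxlen = maxlen

    def push(self, payload: dict) -> int:
        """Record an event and return its id."""
        event_id = next(self._ids)
        self._events.append((event_id, payload))
        del self._events[:-self._maxlen]  # drop all but the newest maxlen
        return event_id

    def since(self, last_id: int) -> list[tuple[int, dict]]:
        """What a polling view would return for e.g. GET /poll?after=<last_id>."""
        return [(i, p) for (i, p) in self._events if i > last_id]


buf = EventBuffer()
buf.push({"text": "first"})
buf.push({"text": "second"})
print(buf.since(1))
```

Each response is an ordinary short-lived request, so this sidesteps the worker-pinning problem entirely, at the cost of notification latency up to the polling interval.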
Okay - I had hoped it would be simpler to implement. In NestJS, for instance, you can pretty much just import a framework package in your "view" (they don't call them views, but it's the rough equivalent), return the proper response type, and that's it.
No, I cannot guarantee it'd never have more than a few open connections (e.g. some random dude might open a few tabs and not close them). It seems indeed that django-eventstream would be the best option.
Or else go with Flask, which seems to offer an easier implementation (though it apparently adds a Redis dependency).
I don't think just making the view async will do it - I believe the entire stack needs to be async.
From the Warning at the docs you referenced:
You will only get the benefits of a fully-asynchronous request stack if you have no synchronous middleware loaded into your site.
So if all the middleware you're using is async, then yes, you might see the benefits with a streaming response. (Since much of the database layer is still synchronous, there may not be as much of a benefit as needed to support a large number of users.)
But I'm not sure either way - you'd have to try it and see.
EventStream provides API endpoints for your Django application that can push data to connected clients. Data is sent using the Server-Sent Events protocol (SSE), in which data is streamed over a never-ending HTTP response.
Yes, that sounds like possibly the best approach. I guess I was initially a bit annoyed at having 2 more dependencies for what, in my previous experience, came with the framework. But then it's really not that big a deal.