Server-sent events in Django?

Hey,

Reading the docs, we are cautioned against using streams:

Performance considerations

Django is designed for short-lived requests. Streaming responses will tie a worker process
for the entire duration of the response. This may result in poor performance.

Generally speaking, you should perform expensive tasks outside of the request-response 
cycle, rather than resorting to a streamed response.

What I want to do is not so much to send a long response (e.g. the large CSV file mentioned in the same doc), but rather to push frequent notifications to clients. Server-sent events would be great here, because I really just want to tell the client “hey buddy, this happened” and maybe send a reasonable amount of data with it. The Django docs don’t mention server-sent events specifically, but this seems close enough to it.
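Roughly what I’m picturing, as a sketch (the once-per-second timestamp payload is just a stand-in for however my data process actually produces events):

```python
# views.py - sketch only; the timestamp payload stands in for my real events
import json
import time

from django.http import StreamingHttpResponse


def event_stream():
    """Yield messages in the text/event-stream format, one per second."""
    while True:
        payload = {"happened_at": time.time()}  # stand-in for real event data
        yield f"data: {json.dumps(payload)}\n\n"
        time.sleep(1)


def events(request):
    response = StreamingHttpResponse(event_stream(), content_type="text/event-stream")
    response["Cache-Control"] = "no-cache"
    return response
```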

The application doesn’t involve a lot of clients since it’s a local network and the load is fairly predictable since the events are based on a process that generates data at a predefined rate.

So I am thinking that, in spite of the docs, I am not going to hate my life later by opting for this approach?


The question becomes one of “how many of these responses are you expecting to have ‘live’ at any moment?” If it’s only 1 or 2, you’re probably going to be ok. More than that and you can expect to start to have problems as each streaming response ties up a worker process.

Otherwise, using either SSE or websockets is going to involve adding Channels into the mix. The Django-Eventstream package may be about the easiest way to add SSE to your application. Or you could go with a full Channels / websockets application.
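If I remember the django-eventstream README correctly, mounting an SSE endpoint is roughly this (plus adding the app to INSTALLED_APPS and whatever middleware the README asks for):

```python
# urls.py - roughly as shown in the django-eventstream README
import django_eventstream
from django.urls import include, path

urlpatterns = [
    # clients connect to /events/ and receive anything sent to the "test" channel
    path("events/", include(django_eventstream.urls), {"channels": ["test"]}),
]
```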

You could also implement a short-polling protocol from the clients. (This depends upon how soon after the event occurs you want the client to be notified.)

Long polling would reduce the latency of the notification, but it would have the same effect as the streaming response.
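The short-polling version could be a plain JSON view the clients hit on a timer, something like this (the in-memory EVENTS list is just a stand-in for wherever your process records its events):

```python
# views.py - sketch of a short-polling endpoint; EVENTS stands in for
# wherever the data-generating process records events (table, cache, ...)
from django.http import JsonResponse

EVENTS = []  # e.g. [{"seq": 1, "data": {...}}, ...]


def poll_events(request):
    # client passes the last sequence number it has seen and gets anything newer
    since = int(request.GET.get("since", 0))
    newer = [e for e in EVENTS if e["seq"] > since]
    return JsonResponse({"events": newer})
```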


Okay - I had hoped it would be simpler to implement. In NestJS, for instance, you can pretty much just import a framework package in your “view” (they don’t call them views, but it’s the rough equivalent), return the proper response type, and that’s it.

No, I cannot guarantee it’d never have more than a few open connections (e.g. some random dude might open a few tabs and not close them). It seems that Django-Eventstream would indeed be the best option.

Or then go with Flask, which seems to offer an easier implementation (though it apparently adds a Redis dependency).

Thanks

Does the remark still hold if I were to make the view async?

I don’t think just making the view async will do it - I believe the entire stack needs to be async.

From the warning in the docs you referenced:

You will only get the benefits of a fully-asynchronous request stack if you have no synchronous middleware loaded into your site.

So if all the middleware you’re using is async, then yes, you might see the benefits with a streaming response. (Since much of the database layer is still synchronous, there may not be as much of a benefit as needed to support a large number of users.)

But I’m not sure either way - you’d have to try it and see.
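If you do try it, my understanding is that the async version of a streaming view would look roughly like this (Django 4.2+ accepts an async iterator for StreamingHttpResponse, and you’d run it under an ASGI server; the timestamp payload is just a placeholder):

```python
# views.py - sketch of an async streaming SSE view (Django 4.2+, ASGI server)
import asyncio
import json
import time

from django.http import StreamingHttpResponse


async def event_stream():
    while True:
        payload = {"happened_at": time.time()}  # stand-in for real event data
        yield f"data: {json.dumps(payload)}\n\n"
        await asyncio.sleep(1)


async def events(request):
    return StreamingHttpResponse(event_stream(), content_type="text/event-stream")
```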


EventStream provides API endpoints for your Django application that can push data to connected clients. Data is sent using the Server-Sent Events protocol (SSE), in which data is streamed over a never-ending HTTP response.
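Pushing data to connected clients from the server side is then a single call, going by the README (the channel name and payload here are just examples):

```python
# anywhere in your code, e.g. the process that generates the data
from django_eventstream import send_event

# arguments are channel name, event type, and a JSON-serializable payload
send_event("test", "message", {"text": "hey buddy, this happened"})
```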


Yes, that sounds like possibly the best approach. I guess I was initially a bit annoyed at having two more dependencies for what, in my previous experience, came with the framework. But then it’s really not that big a deal.