I’m new to async and websockets. I would like to stream a continuous feed from the server to the frontend, with the ability to pause/resume it from the frontend. I thought I would be able to accomplish this with async (to avoid using background workers), but I can’t seem to make it work. What would be the approach to do this with django-channels? At the moment the loop just blocks all communication; if I use a for loop instead, it blocks communication until the loop finishes and then the consumer acts normally. I could poll from the frontend every x seconds instead of streaming, but I feel like that defeats the purpose of websockets. Or is that the approach to take?
```python
import asyncio
import json

from channels.generic.websocket import AsyncWebsocketConsumer


class ChatConsumer(AsyncWebsocketConsumer):
    pause = False

    async def connect(self):
        await self.accept()
        await self._main_loop()

    async def disconnect(self, close_code):
        pass

    async def receive(self, text_data):
        text_data_json = json.loads(text_data)
        print(text_data_json)
        pause = text_data_json["pause"]
        self.pause = bool(pause)

    async def _main_loop(self):
        while True:
            if not self.pause:
                await self.send(text_data=json.dumps({"message": "play"}))
            else:
                await self.send(text_data=json.dumps({"message": "pause"}))
            await asyncio.sleep(1)
```
Another attempt, with the loop started from the consumer’s receive handler instead. Still no luck:
```python
class ChatConsumer(AsyncWebsocketConsumer):
    pause = False
    main_loop_on = False

    async def connect(self):
        await self.accept()

    async def disconnect(self, close_code):
        pass

    async def receive(self, text_data):
        text_data_json = json.loads(text_data)
        print(text_data_json)
        pause = text_data_json["pause"]
        self.pause = bool(pause)
        if not self.main_loop_on:
            self.main_loop_on = True
            await self._main_loop()

    async def _main_loop(self):
        while True:
            if not self.pause:
                await self.send(text_data=json.dumps({"message": "play"}))
            else:
                await self.send(text_data=json.dumps({"message": "pause"}))
            await asyncio.sleep(1)
```
Are you talking about continuous streaming such as audio or video?
If so, you might want to see the other discussions here in the forum about streaming data.
(Summary: Django isn’t really suited for doing this.)
If you’re talking about streaming chunks of data, such as JSON data feeds on a continual basis, that’s a different issue - and something that Django / Channels handles quite well.
But regardless of what you’re trying to attempt, you are going to need some background worker processes that are independent of the Django infrastructure managing the connections.
@KenWhitesell
Yes, the end goal is to stream chunks of data, not video/sound. It’ll be the log output of an application. I was hoping that using async would avoid the need for background workers other than the one that is running the application. I would like to implement something like the output of GitHub Actions, where you see the stdout of the CI workflow.
My current plan B, if I can’t get the server to initiate the sending on its own loop, is to have the client send an update request every x seconds through the socket to get the updated log status.
No, native Django async is still tied to the request-response cycle. It does not allow for long-running tasks without tying up a process thread.
Our standard Channels deployment consists of a number of individual processes - nginx, uwsgi, daphne, redis, postgres, and multiple (as needed) workers. It’s actually easier to manage and debug things this way rather than trying to do everything in a single process.
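For context, the glue between those separate processes is the channel layer. Here is a minimal sketch of the settings, assuming the channels_redis package and a local Redis instance; the host and port are placeholders:

```python
# settings.py (sketch) -- a Redis-backed channel layer lets separate worker
# processes and the daphne/Channels consumers exchange messages.
CHANNEL_LAYERS = {
    "default": {
        "BACKEND": "channels_redis.core.RedisChannelLayer",
        "CONFIG": {
            "hosts": [("127.0.0.1", 6379)],  # placeholder host/port
        },
    },
}
```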
Thank you,
I’ll pivot towards that I guess!
As a side note, our use-case is quite similar to what you describe as yours. We collect data from other devices, reformat it, and render it dynamically. There’s a worker process for each device responsible for collecting the data, and sending the reformatted data through channels to the consumers, which then send it out to the browsers.
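The shape of that worker-to-consumer hand-off, as a hedged sketch rather than the actual code (the group name and the "device.update" message type are made up for the example, and it assumes Django settings are already configured in the worker process):

```python
# worker sketch -- a per-device collector process handing reformatted data
# to the channel layer; subscribed consumers relay it on to the browsers.
from asgiref.sync import async_to_sync
from channels.layers import get_channel_layer

channel_layer = get_channel_layer()


def publish(device_id, payload):
    # "device.update" is dispatched to a device_update() handler on any
    # consumer that has joined this group; both names are illustrative.
    async_to_sync(channel_layer.group_send)(
        f"device_{device_id}",
        {"type": "device.update", "data": payload},
    )
```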
So, a rough sketch of the design (a code sketch of the forwarding step follows the list):

1. Client requests an application run; idle animation on the frontend.
2. Django puts a task in a task queue.
3. When a background worker is ready, Django creates a channel and sends its id to the background worker.
4. The background worker kicks off the application, starts a file watcher, and sends updates to the Django channel.
5. Django forwards the updates to the client.
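The "Django forwards to the client" step could look roughly like this on the consumer side. A minimal sketch, assuming a run_id URL kwarg and a "log.line" message type coming from the worker; both are placeholders, not anything the thread has pinned down:

```python
# consumers.py (sketch) -- joins the group for one run and relays whatever
# the background worker publishes to that group out to the browser.
import json

from channels.generic.websocket import AsyncWebsocketConsumer


class RunLogConsumer(AsyncWebsocketConsumer):
    async def connect(self):
        run_id = self.scope["url_route"]["kwargs"]["run_id"]  # assumed route kwarg
        self.group_name = f"run_{run_id}"
        await self.channel_layer.group_add(self.group_name, self.channel_name)
        await self.accept()

    async def disconnect(self, close_code):
        await self.channel_layer.group_discard(self.group_name, self.channel_name)

    async def log_line(self, event):
        # Invoked for group_send messages of {"type": "log.line", "line": ...}.
        await self.send(text_data=json.dumps({"line": event["line"]}))
```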
I’d probably phrase this a bit differently - but it depends upon a few factors:

1. How many people are going to listen to an individual file? (Can 10 people all monitor the same file?)
2. How many different files is one person going to listen to at once? (Is one person going to monitor 10 files at the same time?)
3. How is all this data going to be presented to the user?
   - Is it just going to add text to the screen?
   - Are you going to create separate areas for different files?
These questions would all affect how I would architect this.
For example, if the total number of files is small enough - or more precisely, if the total amount of updates is small enough - I would consider listening to all files being monitored in one process, as a single async-based task. If those files aren’t producing a lot of data, there’d be little need to start up a separate process for each file.
Likewise, rather than tying a listener to an individual channel, I’d be looking to leverage the groups facility. I’d create a group for each file being monitored, and allow the users to connect to whatever groups represent the files they wish to monitor.
(Yes, you can also apply the permissions system to restrict what Channels groups a user can connect to.)
The key point here is that if 8 people are all monitoring the same file, there’s no need to start up 8 different monitors for that file.
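To make that concrete, here is a hedged sketch of a single monitor process that tails several files with one asyncio task each and publishes every new line to that file’s group, so any number of browsers can share one monitor per file. The module and settings names, the slugs, and the "log.line" type are assumptions for the example, and the blocking readline() is acceptable only for a sketch:

```python
# monitor.py (sketch) -- one process, one asyncio task per monitored file,
# each publishing new lines to a per-file Channels group.
import asyncio
import os

import django

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "myproject.settings")  # placeholder
django.setup()

from channels.layers import get_channel_layer


async def tail_file(path, group):
    channel_layer = get_channel_layer()
    with open(path) as f:
        f.seek(0, os.SEEK_END)  # skip existing content, only forward new output
        while True:
            line = f.readline()  # blocking, but cheap enough for a sketch
            if line:
                await channel_layer.group_send(
                    group, {"type": "log.line", "line": line.rstrip("\n")}
                )
            else:
                await asyncio.sleep(0.5)  # no new data yet; back off briefly


async def main(files):
    # files: mapping of group-safe slug -> path, e.g. {"app1": "/var/log/app1.log"}
    # (group names must stick to letters, digits, hyphens, underscores, periods).
    await asyncio.gather(
        *(tail_file(path, f"file_{slug}") for slug, path in files.items())
    )


if __name__ == "__main__":
    asyncio.run(main({"app1": "/var/log/app1.log"}))  # placeholder file
```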