Django Channels with Redis PubSub

Hello,

I want to stream stock price data over a WebSocket connection to my clients. The price data comes in over a Redis server. So my plan is that a client connects to my site and subscribes to a specific topic, which would be a stock in this case. Every time a new price arrives in Redis, it gets sent over the WebSocket. I found the suggestion to write a management command that constantly listens to Redis and publishes any new price to a consumer group, but that doesn’t seem clean and scalable to me. Does Django Channels have a cleaner way to do that?

If there is no other way… what is the best way to run the management command, given that it would run forever?

Thanks for the help

Realistically, what do you define as “scalable”?
What constitutes “clean” from your perspective?
These are not binary questions - those terms don’t have yes/no definitions; it’s all a matter of degree.

What do you envision as the ideal solution?

There is no one “best way to run the management command” in the absence of an understanding of the larger context.
You can start it using systemd or a classic init.d script. You have other options such as runit or supervisord. You could start it as the root process of a Docker container. Take your pick. Any of these process managers gives you a variety of ways to run it as a persistent task.

Excuse me, I am just starting with Django. I want to run Django in a Docker container, and it would be great if I could implement the solution directly in Django. What would be your recommendation?
Should I just write a management command and run it after I have started the server, or is there another way?

Regards

What I’ll refer to as “core” Django is not suited for this specific task. That’s why you’re getting recommendations for other implementations.

Whatever process, method, or implementation you choose, it will be something running outside the Django context. (Even management commands run independently of Django itself. You don’t run commands like makemigrations or migrate from within your Django process.)
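
For illustration, the bare skeleton of such a command looks like this (the app path and command name are just placeholders):

```python
# myapp/management/commands/stream_prices.py  (names are placeholders)
from django.core.management.base import BaseCommand


class Command(BaseCommand):
    help = "Long-running process that forwards Redis price updates to Channels."

    def handle(self, *args, **options):
        # Started with: python manage.py stream_prices
        # Django settings and apps are loaded, but the web server
        # process is not involved at all.
        self.stdout.write("starting price stream ...")
```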

Ok, so what would be your way to go to subscribe to Redis and send the data to a channel layer?

How much do you already know about working with Redis and Channels?

I know that Redis works as an in-memory database. So when I write a key-value pair into it, there is the possibility to subscribe to a specific stream of keys.
Channels is a package that lets Django accept more protocols than HTTP, for example WebSockets. When a user connects to Django over WebSockets, there is a consumer which handles that connection. I can create multiple consumer groups to send messages simultaneously to all clients who are connected to that group. My idea was to send the Redis entries to one of these groups.

I hope that is enough

Yes, that basic idea sounds about right. Superficially, it’s not that difficult a task. It’s just an issue of understanding how the pieces fit together.
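
For example, the WebSocket side could be a consumer that puts each connecting client into a group for the stock it asked for. A rough, untested sketch - the URL route and the price.<symbol> group-naming scheme are conventions I’m inventing here, not anything Channels prescribes:

```python
# consumers.py
from channels.generic.websocket import AsyncJsonWebsocketConsumer


class PriceConsumer(AsyncJsonWebsocketConsumer):
    async def connect(self):
        # Assumes a route like: path("ws/prices/<symbol>/", PriceConsumer.as_asgi())
        self.symbol = self.scope["url_route"]["kwargs"]["symbol"].upper()
        self.group = f"price.{self.symbol}"
        await self.channel_layer.group_add(self.group, self.channel_name)
        await self.accept()

    async def disconnect(self, code):
        await self.channel_layer.group_discard(self.group, self.channel_name)

    async def price_update(self, event):
        # Invoked for group messages sent with {"type": "price.update", ...}
        await self.send_json({"symbol": self.symbol, "price": event["price"]})
```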

Thanks for your patience. So correct me please 🙂

I don’t think there’s anything to correct from what you’ve said. You’ve got the basic idea correct. Now it’s just an issue of implementing it.

If I implement it as a Django management command, which would run outside of Django, can I just run it after I have started the web server?

Before or after - it doesn’t matter.

That’s why I made the point about it being an external process. It’s not related to the web server process, it’s something completely separate.
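
To make that concrete, here is an untested sketch of what that external process could look like. It assumes, purely as an example, that prices are published to Redis pub/sub channels named prices.<SYMBOL> and that the consumers have joined channel-layer groups named price.<SYMBOL>:

```python
# management/commands/stream_prices.py -- untested sketch
import redis  # redis-py

from asgiref.sync import async_to_sync
from channels.layers import get_channel_layer
from django.core.management.base import BaseCommand


class Command(BaseCommand):
    help = "Forward Redis price updates into the channel layer."

    def handle(self, *args, **options):
        layer = get_channel_layer()
        r = redis.Redis(host="localhost", port=6379)  # adjust to your setup
        pubsub = r.pubsub()
        pubsub.psubscribe("prices.*")  # pattern-subscribe to every stock
        for message in pubsub.listen():  # blocks forever
            if message["type"] != "pmessage":
                continue
            symbol = message["channel"].decode().split(".", 1)[1]
            async_to_sync(layer.group_send)(
                f"price.{symbol}",
                {
                    "type": "price.update",
                    "symbol": symbol,
                    "price": float(message["data"]),
                },
            )
```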

Ah ok thanks, so I can just run it in my Dockerfile at any point.

Thanks for your help and patience
Regards, Nick Guber

As a separate docker container, yes. It becomes the main process for that container (as opposed to your docker container running your Django service).

Why is it not possible to run both in the same container? If I have to separate them, I would need a full copy of Django in that container, because otherwise I can’t access Django Channels.

Yea? So?

Our docker-compose file for one of our systems consists of separate and independent containers for uwsgi (to run Django), redis, a celery worker, a channels worker, a celery beat instance, a cas instance, a memcache instance, and an nginx instance.
Docker is architected under the principle of one process per container. Yes, you can build a docker container to run multiple processes (see Run multiple processes in a container | Docker Docs), but then it’s up to you to manage those individual processes. Why create extra and unnecessary work for yourself?

Ok thanks for your help. I will try to solve my problem with that new knowledge.

Regards 🙂

How does the container that pushes the data from Redis into the consumer group know that it has to talk to the other container?

The Channels Workers communicate to the Consumers through the channel layer - which is typically a Redis instance. So Redis is the connection point between the processes.
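
Concretely, both containers just need to point their channel layer at the same Redis instance, e.g. in settings.py (the host name redis assumes a docker-compose service of that name):

```python
# settings.py -- shared by the web container and the listener container
CHANNEL_LAYERS = {
    "default": {
        "BACKEND": "channels_redis.core.RedisChannelLayer",
        "CONFIG": {
            # "redis" assumes a docker-compose service named redis
            "hosts": [("redis", 6379)],
        },
    },
}
```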

My last question on this topic is the following: I want to be able to stream around 12,000 different stocks over my WebSocket. So, for example, I want to say on my frontend → ‘Show Apple stock’ → and the WebSocket then sends me the price for that stock in real time (every time I push an update through the channel layer). How can I manage that with multiple users? Is it smart to generate a group for every stock, which would result in over 12,000 channel groups? Or should I skip groups and instead implement the logic in my Redis publisher, which sends the price data into the channel layer?

Regards
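
For what it’s worth, one common pattern for the group-per-stock idea is to keep a single socket per client and let the client subscribe and unsubscribe dynamically. An untested sketch, assuming a made-up message format like {"action": "subscribe", "symbol": "AAPL"}:

```python
# consumers.py -- untested sketch of dynamic per-stock subscriptions
from channels.generic.websocket import AsyncJsonWebsocketConsumer


class StockConsumer(AsyncJsonWebsocketConsumer):
    async def connect(self):
        self.symbols = set()
        await self.accept()

    async def receive_json(self, content, **kwargs):
        symbol = str(content.get("symbol", "")).upper()
        if not symbol:
            return
        if content.get("action") == "subscribe":
            self.symbols.add(symbol)
            await self.channel_layer.group_add(f"price.{symbol}", self.channel_name)
        elif content.get("action") == "unsubscribe":
            self.symbols.discard(symbol)
            await self.channel_layer.group_discard(f"price.{symbol}", self.channel_name)

    async def disconnect(self, code):
        # Clean up all remaining group memberships for this client.
        for symbol in self.symbols:
            await self.channel_layer.group_discard(f"price.{symbol}", self.channel_name)

    async def price_update(self, event):
        await self.send_json({"symbol": event["symbol"], "price": event["price"]})
```

As far as I know, channel-layer groups only exist while they have members, so 12,000 possible group names don’t cost anything by themselves; only the groups that clients are actually subscribed to carry any state.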