Hello all,
I have a 3rd-party server that can send me data over a WebSocket. They publish a Python SDK that uses Twisted for the connection; it does not depend on Django.
I want to build a UI where users can subscribe to selected data from that 3rd party and act on it. How do I go about it?
Right now I have successfully integrated Channels to handle the browser <–> Django WebSocket part. However, I am out of ideas on where to run the 3rd-party piece.
I am wondering whether I should run it separately and somehow publish the data through Channels, but this 3rd-party script doesn’t play well inside Django, so no Channels there.
Any ideas?
Hello. Could you write the data from the 3rd party to Redis (as a first thought)? Then read it from there and send it out in your Channels consumer.
Hi. I want to try your suggestion out. How would I proceed?
I have callback methods which are called by 3rd party when data is available.
Say, def on_new_data(socket, data):
What would I write there to make sure the consumer gets it?
Unless you are saying to just write raw key-value data into Redis! If so, how would I be notified asynchronously in my Django consumer when new data arrives?
Well, you’re going to have to write it yourself.
But Redis provides pub/sub, which would address the last bit: Pub/Sub – Redis
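To make that concrete, here’s a minimal sketch of the pub/sub route. The channel name, helper names, and the assumption that redis-py is installed are all mine, not from the SDK, but the publish/subscribe calls are standard redis-py API.

```python
import json

# Hypothetical Redis pub/sub channel name.
CHANNEL = "thirdparty:data"

def encode(data):
    """Serialize an SDK payload for the wire."""
    return json.dumps(data)

def decode(raw):
    """Deserialize a payload read back from Redis."""
    return json.loads(raw)

def on_new_data(socket, data):
    # Callback invoked by the 3rd-party SDK whenever data arrives.
    # Publishing to a pub/sub channel notifies every subscriber
    # immediately - no polling required.
    import redis  # redis-py; assumed installed
    redis.Redis().publish(CHANNEL, encode(data))

def listen_forever():
    # Run this wherever you want to react to new data, e.g. a small
    # process that forwards each payload on to your consumers.
    import redis
    pubsub = redis.Redis().pubsub()
    pubsub.subscribe(CHANNEL)
    for message in pubsub.listen():
        if message["type"] == "message":
            data = decode(message["data"])
            # ...hand `data` on to your Channels side here...
```

The subscriber blocks on `pubsub.listen()`, so it wakes up as soon as something is published - that answers the “how would I be notified asynchronously” part.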
You’ve got a couple of different options here in addition to what Carlton has suggested.
Daphne is built on Twisted - I’d be really surprised if their SDK couldn’t be adapted to work as an “external worker” within your Channels environment.
Or, if you’re already using Redis as your channel layer, you could have your external script write directly to those channels.
About approach 1 (Channels external worker) -
How would I achieve that? Run the Twisted connection in a Channels worker? Something like:

    class TwistedWorker(SyncConsumer):
        def run_stuff(self):
            twisted_worker.connect()

??
I wouldn’t know. You haven’t provided any specifics about this 3rd-party API, its SDK, or how you need to use it.
For example, is it required that you open up a separate connection for each client? Or is it possible to multiplex data from multiple users across a single connection?
This idea isn’t going to be “plug and play” - none of these are. If you want this level of integration, you’ll need to dig into their source code, understand what they’re doing with their objects, and probably override portions of them.
You will have work to do regardless of the method you choose - it really comes down to where you want to spend your time.
About 3rd party lib -
Basic example goes like this:

    ...imports...

    def on_data(ws, data):
        # do something with data
        pass

    def on_disconnect(ws):
        ws.stop()

    def on_connect(ws, response):
        # do something to send data to the 3rd party
        pass

    # Callbacks must be defined before they are assigned
    client = DataSocket(api_key='xxx', access_token='xxx')
    client.on_data = on_data
    client.on_disconnect = on_disconnect
    client.on_connect = on_connect

    # Main event loop
    client.connect()
So the data it brings in isn’t tied to individual users - one connection is all I need. Data keeps coming 24x7, so this needs to run 24x7.
The runworker command would allow you to do that. It’s a persistent process that runs externally to your main Django / Channels application(s), communicating through the channel layer. Unfortunately, I don’t have any relevant code I can share, so implementing this will probably require some experimentation on your part.
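For what it’s worth, a rough sketch of how such a worker might look, assuming Channels 3’s background-worker pattern. All the names here (ExternalFeedConsumer, "external-feed", build_client) are made up for illustration:

```python
try:
    from channels.consumer import SyncConsumer
except ImportError:  # stub so the sketch can be read without Channels installed
    class SyncConsumer:
        pass

def build_client():
    # Placeholder: construct and return the configured DataSocket here.
    raise NotImplementedError

class ExternalFeedConsumer(SyncConsumer):
    def feed_connect(self, message):
        # Handles {"type": "feed.connect"} sent to the "external-feed"
        # channel. client.connect() blocks, which is fine here: this
        # worker's whole job is to keep the 24x7 connection alive.
        client = build_client()
        client.connect()

# In asgi.py, route the channel name to the consumer:
#
#   "channel": ChannelNameRouter({
#       "external-feed": ExternalFeedConsumer.as_asgi(),
#   }),
#
# run the worker process with:
#
#   python manage.py runworker external-feed
#
# and kick it off once from anywhere in your app:
#
#   async_to_sync(get_channel_layer().send)(
#       "external-feed", {"type": "feed.connect"})
```

Inside `feed_connect` you’d wire the SDK’s callbacks to `group_send` calls, so incoming data fans out to your WebSocket consumers through the channel layer.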
Any pointers on your suggested approach 2?
What needs to be sent to Redis so that consumers get the message correctly?
The messages themselves are just dicts, serialized to be passed through Redis. The harder part is probably working out the right channel name for each recipient.
See the docs for the channel layer specs, Channel Layer Specification — Channels 3.0.3 documentation.
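A minimal sketch of both ends, assuming channels_redis. The group name "feed" and the event type "feed.message" are made up, but the dict shape follows the channel layer spec: a plain dict whose "type" key names the consumer handler ("feed.message" dispatches to feed_message()).

```python
GROUP = "feed"  # hypothetical group all interested consumers join

def build_event(data):
    # The message is just a dict; "type" selects the handler method.
    return {"type": "feed.message", "data": data}

def on_data(ws, data):
    # 3rd-party callback in the external script: fan the payload out
    # to every consumer that joined GROUP.
    from asgiref.sync import async_to_sync             # ships with Channels
    from channels_redis.core import RedisChannelLayer  # the Redis layer itself
    # In practice you'd build the layer once at startup, not per message.
    layer = RedisChannelLayer(hosts=[("localhost", 6379)])
    async_to_sync(layer.group_send)(GROUP, build_event(data))

# Consumer side, inside your Django project:
#
#   class FeedConsumer(JsonWebsocketConsumer):
#       def connect(self):
#           async_to_sync(self.channel_layer.group_add)(GROUP, self.channel_name)
#           self.accept()
#
#       def feed_message(self, event):  # receives build_event() dicts
#           self.send_json(event["data"])
```

Using a group rather than individual channel names sidesteps the “right channel name for each recipient” problem: each consumer registers itself in `connect()`, and the external script only needs to know the group.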