Hi, I've used Django for a couple of projects, mainly with Django REST Framework; I'm not a seasoned developer by any means.
Right now I basically need to call an external API to get some data, but I need to do it with a polling approach: I'll set up a task that repeats roughly every 5 seconds, makes a request to the external API, filters some fields from the response body, and stores that data (in a global Python variable or some form of cache, I guess). That way, when a client makes a request to my DRF endpoint, I can return the data saved from the previous request to the external API. Right now I only need one task for a single specific endpoint of the external API, but this could grow to around 4 or 5 tasks doing something similar for different endpoints of the same API.
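For what it's worth, here's a minimal plain-Python sketch of that pattern (all names are made up for illustration): one function polls and filters, a shared store holds the latest result, and the view-side code just reads from the store. Note that a plain global only works inside a single process; with multiple web server workers you'd want an external cache like Redis instead.

```python
# Sketch of the poll-filter-cache pattern. `_store` is a stand-in
# for a real cache: a module-level global is NOT shared across
# multiple web server processes, so in production use Redis or
# Django's cache framework instead.
import threading
import time

_store = {}          # stand-in for a real cache backend
_store_lock = threading.Lock()

def poll_once(fetch, key="external_data"):
    """Fetch, filter, and cache one round of data.
    `fetch` is any callable returning the parsed response body."""
    body = fetch()
    filtered = {"items": body.get("items", [])}  # example filter
    with _store_lock:
        _store[key] = filtered
    return filtered

def get_cached(key="external_data"):
    """What the DRF view would return to the client."""
    with _store_lock:
        return _store.get(key)

def start_polling(fetch, interval=5.0):
    """Run poll_once every `interval` seconds in a daemon thread."""
    def loop():
        while True:
            try:
                poll_once(fetch)
            except Exception:
                pass  # keep serving the last good data on failure
            time.sleep(interval)
    t = threading.Thread(target=loop, daemon=True)
    t.start()
    return t
```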
Searching online, Celery with Redis came up a lot. If I go down this route, Redis would also be the cache where I store the data I get from the requests to the external API, but I'd probably also have to set up a separate machine (a cloud VM/server) to run the Celery workers, or at least I think that would be the better option.
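If it helps anyone answer, this is roughly how I understand the Celery + Redis version would look (URLs, keys, and module names are placeholders): Redis acts as the broker and as the cache, and Celery beat fires the poll every 5 seconds.

```python
# myproject/celery.py -- config sketch, names are placeholders
from celery import Celery

app = Celery("myproject", broker="redis://localhost:6379/0")

# Celery beat triggers the task every 5 seconds.
app.conf.beat_schedule = {
    "poll-external-api": {
        "task": "myapp.tasks.poll_external_api",
        "schedule": 5.0,  # seconds
    },
}

# myapp/tasks.py
import requests
from django.core.cache import cache

@app.task
def poll_external_api():
    resp = requests.get("https://api.example.com/data", timeout=5)
    resp.raise_for_status()
    filtered = {"items": resp.json().get("items", [])}
    # TTL a bit longer than the poll interval, so the DRF view
    # never reads an empty cache between runs.
    cache.set("external_api:data", filtered, timeout=15)
```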
I also saw that there are other tools to set up recurring background tasks in Django, for example Huey, django-rq and async_rq. My question is: from your experience, what do you think would be the best way to approach this problem?
I also get that going the Celery + Redis route is going to be more expensive money-wise for the deployment; do you think it's worth it?
Thanks a lot for your help, guys.