need input for async redis backend

hi, so ticket #33573 has an owner, but it seems inactive, so i wanted to take it over

but there seems to have been some unresolved conversation about how the backend should be implemented, and since there doesn’t seem to be any activity i’m raising the question here:
how should the async backend for redis be implemented?

my opinion is to add a new class that uses the native async support provided by the redis package (redis.asyncio):
this way we’re actually using async code
and i think it would make it easier to solve the other ticket, #36047 (rough sketch below)

on the other hand, it means more code and more maintenance work

tho since django’s redis backend is rather small i don’t think it’s that big of a problem
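to make the idea concrete, here’s a very rough sketch of what such a class could look like on top of redis-py’s redis.asyncio client (the class name and details are my assumptions, not a settled design; serialization, default timeouts etc. are left out):

import redis.asyncio as redis

from django.core.cache.backends.base import BaseCache


class AsyncRedisCache(BaseCache):
    """hypothetical async-native backend built on redis.asyncio"""

    def __init__(self, server, params):
        super().__init__(params)
        self._client = redis.Redis.from_url(server)

    async def aget(self, key, default=None, version=None):
        key = self.make_and_validate_key(key, version=version)
        value = await self._client.get(key)
        return default if value is None else value

    async def aset(self, key, value, timeout=None, version=None):
        key = self.make_and_validate_key(key, version=version)
        await self._client.set(key, value, ex=timeout)

    async def aclose(self, **kwargs):
        # redis-py 5+ uses aclose(); older versions expose an async close()
        await self._client.aclose()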

what do you think? how should this be done?

cc: @carltongibson @Andrew-Chen-Wang @sarahboyce and anyone interested in this topic

hi, i’m sorry to ping :folded_hands:

with valkey-glide being worked on, and since it’s an async-only tool, i’m wondering if i should plan to support it in django-valkey
but since #33573 and #36047 are inconclusive, i don’t want to code something that can never be used
(i’ve also discovered that the cache backend for sessions would probably need work as well)

so i’m wondering what django’s stance is here: will there be async support for caching?

might be worth noting that valkey-glide works with redis too, at least for now

Hi :slight_smile: I’m a Glide maintainer, and it’s great to see the project mentioned here. I’m excited to support the integration. The library is stable and, in many key aspects, more ready than other veteran libs, thanks to design improvements based on user feedback about the pain points they’ve hit in those exact libs.

Key features include fault tolerance, reliability, topology change detection, sharded pub/sub with resubscription on disconnections, and the ability to quickly address issues thanks to the project’s design around a stable and safe Rust core.
Plus, Glide has a strong community and a solid relationship with the Valkey community and core team, which helps us collaborate effectively.
We effectively take part in their core meetings, and they join ours, so we can shape the product users get as a whole.

Currently, we’re working on introducing a synchronous API alongside the asynchronous one, and we might add AnyIO support. While I’m not aware of Django’s plans for async cache, regarding the linked issue, Glide functions as a multiplexer and doesn’t require closing all connections; it operates, from the user’s perspective, as a single connection, and the close function isn’t asynchronous since there’s no I/O involved.

I’m not familiar with the broader issues related to async cache, but I’m happy to help once I have the necessary information. Feel free to reach out to me on Valkey Slack about this issue or anything else you need assistance with. Just chatting is also acceptable :slight_smile:

hi @avifenesh :waving_hand: glad to see you here

these are all great points, thanks for putting in the time to write this :raising_hands:

i don’t currently have slack, I’ll see if i can join, i do have questions :grinning_face_with_smiling_eyes:

i don’t think it’s feasible to implement an async backend without the community’s support.
(i know I’ve done it before, but i was young and foolish :melting_face:, also it’s not official)

django is opinionated about its API, so if i publish an official async client and django later decides to do the same, django’s version could have a different API than mine and force a backwards-incompatible change on me.

there’s also the question of demand: do django users want this? cause if it’s not needed and not wanted, why make it?

so, even though I’d like for this to happen, I’m gonna have to wait for the community to decide on it

i do look forward to a sync API tho
i want to test things out, and perhaps we’ll have glide support in django through that

Regarding the opinionated API, does Django usually decide on one library to use as the backend and offer this functionality solely through it?

For example, if Django goes with redis-py, does that force you to use valkey-py to stay aligned?

After all, different libraries offer different features, and you want to give your users the opportunity to use the one they need and want.

As for whether it is needed, it is feasible to find out; let’s use the almighty power of the internet.
I open posts in a variety of places busy with devs, and accumulate data for my decisions in Glide.

For example, while the .NET client is under development, to decide whether we should support 6 or 8 as the minimum version, I downloaded the Stack Overflow survey data and crunched it, and had a Reddit post with 160 devs telling me what versions their companies use, and what they use for personal things.

We can do something similar.
What is the question you want to ask exactly?

i might have spoken a bit vaguely
let me rephrase a bit

django’s caching system is pluggable, you’re not forced to use a specific library

but there are internal calls to it;
one instance is the close() method that is mentioned here
there are other places as well, e.g. the caching middleware and the session framework

so let’s say i implement a sync method called close(), and then django implements an async backend that awaits close(); that’s a backwards-incompatible change for me
(and to be accurate, django should await close() on async backends, since other libraries demand it)
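to make that concrete, one defensive option (hypothetical names, nothing django has decided on) is to ship both a sync close() and an async aclose(), so that whichever of the two ends up being called keeps working:

class AsyncValkeyCache:
    """hypothetical third-party async backend, sketch only"""

    is_async = True  # marker attribute a custom signal receiver could check

    def __init__(self, client):
        self._client = client

    async def aclose(self, **kwargs):
        # the real async cleanup lives here
        await self._client.aclose()

    def close(self, **kwargs):
        # no-op fallback so a sync caller never ends up holding an
        # un-awaited coroutine
        pass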

about users demanding an async backend, it’s a good suggestion to go on Reddit

but still, the main crowd I’m after is people who are more active in the community, people who contribute to django

I’ll try to reach out to people in a few days, see what they think
and I’ll do a reddit post around the same time

thanks for following up on this, i feel like we can get something out of this

i wrote this comment on the related ticket to explain the things i know about this: #33573 (Add native async support to redis cache backend) – Django
if anyone’s interested, i’d be glad to hear your feedback

Hey @amirreza8002 — Thanks for posting.

I guess, at this point, what I’d like to see is an async (first) backend in a third-party package. That would allow pulling it in, configuring, using, … and then we can work out the remaining integration difficulties. I need to check out the django-valkey implementation. (Also, an overview of the existing options here is probably useful. Having not been paying attention, I can’t say off-hand what the state of play is, so likely others can’t either.)

Ref whether we need a separate sync backend (or just an async_to_sync adaptor, or even code generation for a sync version): I think we just have to wait and see. We already have the sync backends. It’s a long road to remove/update them. I don’t think we need to legislate in advance.

Rather, having async backends available would mean we could start adopting in earnest. I think usage will drive the direction.
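For illustration only, here’s roughly what the async_to_sync adaptor option could look like (toy class names, a dict standing in for a real client, nothing Django has committed to):

from asgiref.sync import async_to_sync


class AsyncDictCache:
    """Toy async-first backend; a dict stands in for a real client."""

    def __init__(self):
        self._data = {}

    async def aget(self, key, default=None):
        return self._data.get(key, default)

    async def aset(self, key, value):
        self._data[key] = value

    async def aclose(self, **kwargs):
        self._data.clear()


class DictCache(AsyncDictCache):
    """Sync facade: each sync method just runs its async counterpart."""

    def get(self, key, default=None):
        return async_to_sync(self.aget)(key, default)

    def set(self, key, value):
        return async_to_sync(self.aset)(key, value)

    def close(self, **kwargs):
        return async_to_sync(self.aclose)(**kwargs)

One caveat with that shape is that async_to_sync can’t be called from a thread that’s already running an event loop, which is another reason to let real usage drive the direction.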

hi @carltongibson :raising_hands:, thanks for getting back to me, i appreciate the input :folded_hands:

i agree that 3rd-party is the best approach
but the real issue is the signal receiver that I’ve discussed in #36047

since that doesn’t support async backends, it’s hard to give out an official async backend

even in django-valkey I’ve documented that the async backend is not a guarantee

granted, this is probably not a big issue and it’ll probably just throw some “coroutine not awaited” warnings, but I’m not a full-time cache maintainer so i don’t want to take the chance there

so in reality that receiver is all i care about, i don’t even use redis :upside_down_face:, but in the ticket people mentioned that to fix this bug django needs an async backend first
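for context, the receiver in question (paraphrasing django.core.cache’s close_caches here, so don’t quote me on the exact code) is synchronous, so on an async-only backend the call would just build a coroutine that never gets awaited:

from django.core.cache import caches


# roughly what django connects to request_finished today (paraphrased)
def close_caches(**kwargs):
    for cache in caches.all(initialized_only=True):
        # if close() is an `async def`, this line only creates a coroutine
        # object; you get a "coroutine was never awaited" RuntimeWarning and
        # the connection is never actually closed
        cache.close()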

anyhow, even if the backend is not accepted, i do ask for the receiver to be fixed; I’ve made a proposal about this in the ticket

thanks again, sorry the text got a bit long

p.s: sorry for the ping in the now deleted message, I’m on phone and the phone ui is a bit weird :melting_face:

tldr:
if django allows for async backends, I’ll do the rest in a 3rd-party

I think we’re at an impasse there, looking at the triage state.

I would work around the signal, registering an async listener for my backend on the request_finished and setting_changed signals. (That should at least allow you to progress.)

hmm

for that to work I’d have to import the caches object and implement the receiver in a file that is ensured to be read by django (rough sketch of one option below)

i have to try but my initial instinct tells me it’s not gonna end very well :grinning_face_with_smiling_eyes:

I’ll report back when i have more info

i hope you don’t mind me pinging you, if it’s a bother please do let me know
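one option for the “file that is ensured to be read” part is an AppConfig.ready() hook, assuming the package is listed in INSTALLED_APPS, which is itself an extra ask for a cache backend (the module and class names below are made up):

# apps.py of the (hypothetical) backend package
from django.apps import AppConfig


class DjangoValkeyConfig(AppConfig):
    name = "django_valkey"

    def ready(self):
        # imported only for its side effect: connecting the async-aware
        # receiver and disconnecting django's default close_caches
        from . import signal_handlers  # noqa: F401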

from django.core import signals
from django.core.cache import caches, close_caches


async def close_async_caches(**kwargs):
    # close every initialized cache at the end of the request, awaiting the
    # async backends and falling back to a plain close() for the sync ones
    for conn in caches.all(initialized_only=True):
        if getattr(conn, "is_async", False):
            await conn.aclose()
        else:
            conn.close()


# swap django's default sync receiver for the async-aware one
signals.request_finished.connect(close_async_caches)
signals.request_finished.disconnect(close_caches)

i added something like this, and added an is_async attr to my async backend
it works, tho i should do more tests

if everything goes well, i’ll make django-valkey’s support for async cache backends official

and hopefully some day in django

new discovery

if the close_caches receiver or its signal wiring changes in django, this might need to be adjusted as well

hi again :raising_hands:

i opened this PR to fix the receiver problem: adjust cache signal receiver to work in async context by amirreza8002 · Pull Request #28 · django-utils/django-valkey · GitHub

it does not fix the testing issue i mentioned above

if anyone is interested, take a look, i would appreciate any feedback :folded_hands: