I have a boilerplate app and was playing around with async Django (planning to add some GraphQL + subscriptions), and I ran a benchmark that shocked me:
Gunicorn with the async uvicorn worker is much slower than gthread.
Async code:
async def my_async_view(request):
    return JsonResponse(
        {"async accounts": "Test"},
        status=200,
    )
Gunicorn Async command:
gunicorn --worker-class uvicorn.workers.UvicornWorker banking_api.asgi --workers=8 --threads=2
Sync code:
def my_sync_view(request):
    return JsonResponse(
        {"sync accounts": "Test"},
        status=200,
    )
Gunicorn Sync command:
gunicorn banking_api.wsgi --workers=8 --threads=2
Benchmark results:
Async:
Running 30s test @ http://localhost:8000/test/
  12 threads and 50 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency    57.72ms   75.82ms    1.08s   97.25%
    Req/Sec     81.90     44.42   232.00    81.61%
  29007 requests in 30.05s, 8.08MB read
Requests/sec:    965.41
Sync:
Running 30s test @ http://localhost:8000/test/
  12 threads and 50 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency    24.26ms   44.37ms  865.76ms  92.37%
    Req/Sec    281.56    146.56     0.89k   66.32%
  100051 requests in 30.08s, 30.25MB read
Requests/sec:    3326.37
What is happening? Why the gap of 3326.37 requests/sec (sync) vs 965.41 requests/sec (async)?
Is this expected or am I doing something wrong here?
I know synthetic benchmarks aren't very meaningful. The only thing I want to make sure of is that if I go async, with all the hassle of writing sync_to_async/async_to_sync wrappers in my code, I at least don't take a performance hit.
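For context on what those wrappers cost: asgiref's sync_to_async hands the blocking call off to a thread so the event loop isn't blocked (the real implementation also deals with thread-sensitivity, which this ignores). Here is a stdlib-only sketch of the idea, using run_in_executor; the names are mine, not asgiref's:

```python
import asyncio
import functools

def sync_to_async_sketch(fn):
    """Illustrative stand-in for asgiref.sync.sync_to_async:
    wrap a blocking function so it runs in the default thread-pool
    executor and can be awaited without blocking the event loop."""
    @functools.wraps(fn)
    async def wrapper(*args, **kwargs):
        loop = asyncio.get_running_loop()
        # hand the sync call to a worker thread and await its result
        return await loop.run_in_executor(
            None, functools.partial(fn, *args, **kwargs))
    return wrapper

def blocking_db_call(x):
    # stands in for a synchronous ORM query
    return x + 1

async def main():
    wrapped = sync_to_async_sketch(blocking_db_call)
    return await wrapped(41)

print(asyncio.run(main()))  # 42
```

The thread hop is exactly the kind of per-request overhead I'm worried about accumulating.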
Actually, the reason I need this is that I am writing a GraphQL API and I need DataLoaders for my resources (using graphene-django), and those are async (the latest graphql-core removed the Promise-based DataLoaders).
I did find an alternative for a Sync Dataloader that might work for me.
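To make concrete what I mean by an async DataLoader: individual load() calls made in the same event-loop tick get collected and resolved by one batched fetch. A minimal illustrative sketch (my own toy version, not the graphql-core or aiodataloader implementation, and without caching or error handling) might look like:

```python
import asyncio

class ToyDataLoader:
    """Minimal DataLoader-style batcher: load() calls queued in the
    same event-loop tick are dispatched to batch_load_fn together."""

    def __init__(self, batch_load_fn):
        self.batch_load_fn = batch_load_fn
        self._queue = []  # pending (key, future) pairs
        self._dispatch_scheduled = False

    def load(self, key):
        loop = asyncio.get_running_loop()
        fut = loop.create_future()
        self._queue.append((key, fut))
        if not self._dispatch_scheduled:
            self._dispatch_scheduled = True
            # flush the queue once the current tick finishes
            loop.call_soon(lambda: asyncio.ensure_future(self._dispatch()))
        return fut

    async def _dispatch(self):
        self._dispatch_scheduled = False
        queue, self._queue = self._queue, []
        keys = [k for k, _ in queue]
        values = await self.batch_load_fn(keys)
        for (_, fut), value in zip(queue, values):
            fut.set_result(value)

async def batch_load(keys):
    # one "query" for all keys instead of one query per key
    return [k * 10 for k in keys]

async def main():
    loader = ToyDataLoader(batch_load)
    # three loads in the same tick -> one batch_load call
    return await asyncio.gather(
        loader.load(1), loader.load(2), loader.load(3))

print(asyncio.run(main()))  # [10, 20, 30]
```

The batching only works because everything shares one event loop, which is why the loaders ended up async-only and why I'm stuck caring about async performance.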
Long story short, I also want to embrace the async mindset and start writing some async Django to get used to it and learn the caveats, but taking a performance hit is clearly not the way to start. That's why it's important for me to understand what is happening.