Hi,
I’m using Django Channels for WebSocket connections and I’m seeing what looks like
a CPU bottleneck in the event loop / Channels dispatch layer, even when the
connections are basically idle.
Environment
- Python: 3.13.5
- Django: 5.2.8
- channels: 4.3.2
- channels-redis: 4.3.0
- ASGI server: uvicorn 0.38.0
- Channel layer: Redis
Symptom
When I just keep WebSocket connections open (no heavy messages being sent),
the CPU usage of a single worker process keeps growing over time.
Roughly speaking, CPU usage increases by about 0.1% per second for that worker,
even though the app is not doing any significant work at the application level.
To understand what’s going on, I profiled the process and got this summary
(only the top part is shown here):
```
GIL: 92.00%, Active: 104.00%, Threads: 9
%Own  %Total  OwnTime  TotalTime  Function (filename)
9.00%  92.00%  10.05s  85.75s  _run_once (asyncio/base_events.py)
9.00%  80.00%   9.23s  73.42s  _run (asyncio/events.py)
12.00% 13.00%   7.88s   9.60s  _wait (asyncio/tasks.py)
7.00%  59.00%   6.07s  54.17s  await_many_dispatch (channels/utils.py)
5.00%  11.00%   5.31s   9.13s  create_task (asyncio/base_events.py)
6.00%  28.00%   4.61s  21.55s  wait (asyncio/tasks.py)
2.00%   8.00%   3.91s   9.97s  dispatch (channels/consumer.py)
2.00%  10.00%   3.84s  13.81s  dispatch (websocket_django_common/consumers.py)
...
0.00%  92.00%   0.84s  86.59s  run_forever (asyncio/base_events.py)
0.00%  61.00%   0.07s  55.74s  run_asgi (uvicorn/protocols/websockets/websockets_impl.py)
0.00%  61.00%   0.30s  55.59s  __call__ (channels/routing.py)
0.00%  61.00%   0.17s  55.41s  __call__ (websocket_django_common/middlewares.py)
0.00%  59.00%   0.16s  54.41s  app (channels/consumer.py)
```
This just looks like the asyncio event loop running, no?
await_many_dispatch controls the tasks for each active connection's consumer instance. If those are just sitting idle, they'll be waiting for ASGI events to come in, but the event loop will still go over them each tick to see whether they're ready to progress.
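To make the idle behaviour concrete, here is a simplified sketch of what a dispatch loop like channels.utils.await_many_dispatch does (this is an illustrative reduction, not the actual Channels code): one receive task per source, wake only when something completes, restart that task, repeat. An idle connection just sits in asyncio.wait() and costs essentially no CPU.

```python
import asyncio

async def await_many_dispatch(receive_callables, dispatch):
    """Simplified sketch of a Channels-style dispatch loop.

    Starts one receive task per source, dispatches whichever message
    arrives first, then restarts that source's task. Idle sources
    just park inside asyncio.wait().
    """
    tasks = [asyncio.ensure_future(receive()) for receive in receive_callables]
    try:
        while True:
            done, _ = await asyncio.wait(tasks, return_when=asyncio.FIRST_COMPLETED)
            for task in done:
                message = task.result()
                # Restart the receive task for the source that fired.
                i = tasks.index(task)
                tasks[i] = asyncio.ensure_future(receive_callables[i]())
                await dispatch(message)
    finally:
        # The loop only exits via an exception (e.g. StopConsumer in
        # real Channels); clean up the outstanding receive tasks.
        for task in tasks:
            task.cancel()
```

The key point for this thread: the loop terminates only when dispatch raises, which is exactly the mechanism that went wrong below.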

@carltongibson Thanks for the quick response.
In my case, the worker doesn’t just stay around ~75% CPU — the usage actually increases by about 1% every second and eventually reaches close to 100%, even when the WebSocket connections are idle.
Could you advise which parts of Channels or the asyncio event loop I should focus on to understand this behavior?
For example, should I look into the number of active consumer tasks or the polling frequency of the Redis channel layer?
I’m also attaching the profile.svg in case it helps.

Well, you can look at await_many_dispatch directly, and put some logging in there, as well as in your consumer’s dispatch method to see when the ASGI events are being processed.
If I run my example project locally, with multiple WebSocket connections open but idling, the process uses a small amount of CPU time, mostly when I send a message, but the % usage remains constant and low (≈2%). That's barebones Channels. If you're seeing anything else, it's your app doing something somewhere, but I'm not sure how to help you pin that down.
@carltongibson Thanks for the suggestion! I added logging to await_many_dispatch and dispatch, and found the root cause.
The issue was that our custom dispatch method was catching the StopConsumer exception:
```python
async def dispatch(self, message):
    try:
        return await super().dispatch(message)
    except Exception as exc:
        if message.get("type") == "websocket.disconnect":
            return  # This was swallowing StopConsumer!
        await self.exception_handler(exc)
```
When websocket_disconnect raises StopConsumer, the except Exception clause was catching it and returning silently. This prevented the await_many_dispatch loop from terminating, so the ASGI server kept delivering websocket.disconnect events in an infinite loop → 100% CPU.
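The failure mode can be demonstrated with a minimal, self-contained model (the names StopConsumer and consume here are stand-ins for the real Channels machinery, and the iteration cap exists only so the buggy case terminates):

```python
import asyncio

class StopConsumer(Exception):
    """Stand-in for channels.exceptions.StopConsumer."""

async def consume(swallow_stop, max_iterations=1000):
    """Model of the bug: the server re-delivers websocket.disconnect
    until the consumer loop terminates. If StopConsumer is swallowed,
    the loop never terminates (capped here so the demo finishes)."""
    iterations = 0

    async def handler(message):
        # What websocket_disconnect does: signal the loop to stop.
        raise StopConsumer

    async def dispatch(message):
        try:
            await handler(message)
        except StopConsumer:
            if swallow_stop:
                return  # The bug: the loop never sees the stop signal.
            raise

    while iterations < max_iterations:
        iterations += 1
        try:
            await dispatch({"type": "websocket.disconnect"})
        except StopConsumer:
            break  # Correct behaviour: the loop exits on disconnect.
    return iterations
```

With re-raising, the loop exits after one disconnect event; with swallowing, it spins until the cap, which is the busy loop seen in the profile.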
The fix was simple:
```python
async def dispatch(self, message):
    try:
        return await super().dispatch(message)
    except StopConsumer:
        raise  # Must re-raise for await_many_dispatch to exit
    except Exception as exc:
        if message.get("type") == "websocket.disconnect":
            return
        await self.exception_handler(exc)
```
Thanks for pointing me in the right direction!
Yep, that’ll do it. Glad you got it worked out. 