SynchronousOnlyOperation error in production (Heroku) but not local development

I am training a stable_baselines3 model using PyTorch inside a celery task. Locally everything works as expected, but once deployed to Heroku I receive the following error:

app[worker.1]: Traceback (most recent call last):
app[worker.1]: File "/app/.heroku/python/lib/python3.8/site-packages/celery/app/", line 385, in trace_task
app[worker.1]: R = retval = fun(*args, **kwargs)
app[worker.1]: File "/app/.heroku/python/lib/python3.8/site-packages/celery/app/", line 650, in __protected_call__
app[worker.1]: return self.run(*args, **kwargs)
app[worker.1]: File "/app/manekineko/agents/", line 96, in train_model
app[worker.1]: model.model.learn(total_timesteps=model.timesteps)
app[worker.1]: File "/app/.heroku/python/lib/python3.8/site-packages/stable_baselines3/ppo/", line 307, in learn
app[worker.1]: total_timesteps, callback = self._setup_learn(total_timesteps, eval_env, callback, eval_freq,
app[worker.1]: File "/app/.heroku/python/lib/python3.8/site-packages/stable_baselines3/common/", line 536, in _setup_learn
app[worker.1]: self._last_obs = self.env.reset()
app[worker.1]: File "/app/.heroku/python/lib/python3.8/site-packages/stable_baselines3/common/vec_env/", line 61, in reset
app[worker.1]: obs = self.envs[env_idx].reset()
app[worker.1]: File "/app/.heroku/src/gym-stock-trading/gym_stock_trading/envs/", line 527, in reset
app[worker.1]: self._initialize_data()
app[worker.1]: File "/app/.heroku/src/gym-stock-trading/gym_stock_trading/envs/", line 334, in _initialize_data
app[worker.1]: self.asset_data, self.previous_close = next(self.market_data)
app[worker.1]: File "/app/manekineko/agents/", line 61, in _yield_market_data
app[worker.1]: for stock_data in stock_data_queyset:
app[worker.1]: File "/app/.heroku/python/lib/python3.8/site-packages/django/db/models/", line 276, in __iter__
app[worker.1]: self._fetch_all()
app[worker.1]: File "/app/.heroku/python/lib/python3.8/site-packages/django/db/models/", line 1261, in _fetch_all
app[worker.1]: self._result_cache = list(self._iterable_class(self))
app[worker.1]: File "/app/.heroku/python/lib/python3.8/site-packages/django/db/models/", line 57, in __iter__
app[worker.1]: results = compiler.execute_sql(chunked_fetch=self.chunked_fetch, chunk_size=self.chunk_size)
app[worker.1]: File "/app/.heroku/python/lib/python3.8/site-packages/django/db/models/sql/", line 1149, in execute_sql
app[worker.1]: cursor = self.connection.cursor()
app[worker.1]: File "/app/.heroku/python/lib/python3.8/site-packages/django/utils/", line 24, in inner
app[worker.1]: raise SynchronousOnlyOperation(message)
app[worker.1]: django.core.exceptions.SynchronousOnlyOperation: You cannot call this from an async context - use a thread or sync_to_async.

Essentially, the function queries the database and uses a generator to feed the data to the model as needed. I’ve read through the docs and am still struggling to understand the fundamentals of what is causing the problem, especially since it works locally.

I appreciate any help, especially with an explanation of the role async plays here.
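For anyone puzzling over the same error: Django raises SynchronousOnlyOperation whenever a blocking ORM call (like the `connection.cursor()` in the traceback) happens while an asyncio event loop is running in the current thread. The sketch below is a minimal, hypothetical stand-in for that guard (Django applies it via its `async_unsafe` decorator, the last frame in the traceback); `orm_call` is an illustrative name, not Django's actual API.

```python
import asyncio

def orm_call():
    # Django's guard, roughly: refuse blocking work if an asyncio
    # event loop is already running in this thread.
    try:
        asyncio.get_running_loop()
    except RuntimeError:
        return "query executed"  # no loop: normal sync code path
    # A loop is running, so Django would refuse the blocking call
    raise RuntimeError(
        "You cannot call this from an async context - "
        "use a thread or sync_to_async."
    )
```

Called from plain synchronous code it succeeds; called from inside a coroutine it raises, which is what the worker hit on Heroku if something in its stack started an event loop that the local setup didn't.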

The function I believe is causing the error:

def _yield_market_data(model, mode):
    while True:
        if mode == 'train':
            # The filter() arguments were truncated in the original
            # post; they bound the queryset to the training window.
            stock_data_queyset = model.agent.asset.stockdata_set.filter(
                # ... - timedelta(model.evaluation_days + 1),
                # ... - timedelta(model.evaluation_days + model.training_days),
            )
        elif mode == 'evaluate':
            # Truncated as above; bounds the queryset to the
            # evaluation window.
            stock_data_queyset = model.agent.asset.stockdata_set.filter(
                # ... - timedelta(model.evaluation_days),
            )

        for stock_data in stock_data_queyset:
            asset_data = pd.read_json(stock_data.json_data)
            previous_close = stock_data.previous_close

            yield asset_data, previous_close

In case anyone comes across this: I ended up just placing the function call in a thread, and that resolved the issue. I’m still not sure I understand why this is necessary, or why it works locally but not in production, but it’s working now.
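The thread workaround works because a freshly started thread has no running asyncio event loop, so Django's async-context check passes and the ORM queries inside the generator succeed. A hedged sketch of that fix, assuming a generic helper (`run_training` and the callables passed to it are illustrative names, not the exact code from the task):

```python
import threading

def run_training(fn, *args, **kwargs):
    """Run fn in a plain thread and return its result.

    Inside the new thread there is no running event loop, so
    Django's SynchronousOnlyOperation guard does not fire.
    """
    result, error = {}, {}

    def _target():
        try:
            result["value"] = fn(*args, **kwargs)
        except Exception as exc:  # surface exceptions to the caller
            error["exc"] = exc

    t = threading.Thread(target=_target)
    t.start()
    t.join()
    if "exc" in error:
        raise error["exc"]
    return result.get("value")
```

In the celery task this would look like `run_training(model.model.learn, total_timesteps=model.timesteps)`. An alternative named in the error message itself is `asgiref.sync.sync_to_async`, which pushes the blocking call to a worker thread while staying inside async code.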


Maybe it’s related to processes and subprocesses.