Lost connection to MySQL during query

My script is running as a service:

#!/usr/bin/env python3
import os
import django

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'LPS.settings')
django.setup()

from abonament.models import Abonament_info
from datetime import datetime, timedelta
import schedule
import time
from django.utils import timezone


def change_day():
    now = timezone.now()
    abo_to_change = Abonament_info.objects.filter(user_id__notatka__startswith="@",
                                                  end__gt=now,
                                                  begin__lt=now)
    for item in abo_to_change:
        new_date = now.replace(hour=item.begin.hour, minute=item.begin.minute, second=0)
        item.begin = new_date
        item.save()


# change_day()

schedule.every().day.at("08:01").do(change_day)

while 1:
    schedule.run_pending()
    time.sleep(1)

The first day after starting it, the script works OK, but the next day I found this in the logs:

MySQLdb.OperationalError: (2013, 'Lost connection to MySQL server during query')

Running it manually works OK.
There are only 4-5 rows to change each day, out of about 120 rows in the table, so there is not that much data to manage.
Is there maybe some command in the ORM to keep the connection alive in this particular situation?
I found the CONN_MAX_AGE setting, but is this the right way? Won't it lock up the database for other operations? It is a setting for the entire project.
I don't have this problem in other parts of the project, e.g. with forms on the website.
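
For reference, this is how I understand CONN_MAX_AGE would be set (just a sketch of the setting I found; the database name and value here are only examples):

# settings.py - CONN_MAX_AGE is set per database and applies project-wide
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.mysql',
        'NAME': 'lps_db',        # example name only
        'CONN_MAX_AGE': 300,     # reuse connections for up to 5 minutes
    }
}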

Set this up as a cron job to run once/day rather than keeping this process persistent as a service. That’s going to be easier than just about any other method to keep the connection active.

Trying to keep the connection active also isn’t going to help if there’s any other cause for the connection being lost, such as the database engine being restarted.

This means that you’d want to establish a new connection every day anyway, which then means you’re just as well off running a new instance of the process each day.
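
If you do go the cron route, the usual Django pattern is a management command. A rough sketch, assuming your app is abonament and picking change_day as the (hypothetical) command name:

# abonament/management/commands/change_day.py  (hypothetical path)
from django.core.management.base import BaseCommand
from django.utils import timezone

from abonament.models import Abonament_info


class Command(BaseCommand):
    help = "Move the begin time of currently active abonaments to today"

    def handle(self, *args, **options):
        now = timezone.now()
        abo_to_change = Abonament_info.objects.filter(
            user_id__notatka__startswith="@",
            end__gt=now,
            begin__lt=now,
        )
        for item in abo_to_change:
            item.begin = now.replace(hour=item.begin.hour,
                                     minute=item.begin.minute,
                                     second=0)
            item.save()

A crontab entry along the lines of 1 8 * * * /path/to/venv/bin/python /path/to/project/manage.py change_day (the paths are placeholders) then runs it once a day at 08:01, with a fresh database connection each time.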

I don't like cron :slight_smile: If I run this as a service, it's easier to manage, in my opinion.
If I want to stick with my solution, should I assume the problem is the time it takes to perform the operation? I don't see any troubling MySQL-related issues in the logs. For example, would replacing the ORM with a database driver and raw SQL be a solution, or is it better to check whether the execution succeeded and retry the task?

This really isn’t an issue of “like” or “dislike”. It’s a question of what’s the most appropriate tool for a task.

Yes, you can build your own task manager, and you can create your own service to do exactly what cron is doing.

Or you can use cron, which is well-tested and known to be a reliable mechanism for initiating tasks on a periodic basis.

Or, if you’re already using Celery & Beat, you could configure this as a Celery task.
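
For completeness, a rough sketch of what that could look like with Celery Beat, assuming you already have a Celery app wired into the project with the usual CELERY_ settings namespace (the task module and schedule key names are only illustrative):

# abonament/tasks.py  (hypothetical module)
from celery import shared_task
from django.utils import timezone

from abonament.models import Abonament_info


@shared_task
def change_day():
    now = timezone.now()
    for item in Abonament_info.objects.filter(user_id__notatka__startswith="@",
                                              end__gt=now, begin__lt=now):
        item.begin = now.replace(hour=item.begin.hour,
                                 minute=item.begin.minute, second=0)
        item.save()


# settings.py
from celery.schedules import crontab

CELERY_BEAT_SCHEDULE = {
    "change-day-every-morning": {
        "task": "abonament.tasks.change_day",
        "schedule": crontab(hour=8, minute=1),
    },
}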

@kessero

The only part missing here is that you should call django.db.close_old_connections first thing in change_day.

The alternative would be to explicitly close the connection used in the function at the end of it.
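
In the script above that would look roughly like this (django.db.close_old_connections and django.db.connection are the standard Django helpers; only the body of change_day changes):

from django.db import close_old_connections, connection

def change_day():
    # Drop any connection that the MySQL server has already timed out,
    # so the ORM opens a fresh one for the queries below.
    close_old_connections()
    now = timezone.now()
    abo_to_change = Abonament_info.objects.filter(user_id__notatka__startswith="@",
                                                  end__gt=now,
                                                  begin__lt=now)
    for item in abo_to_change:
        item.begin = now.replace(hour=item.begin.hour, minute=item.begin.minute, second=0)
        item.save()
    # Alternative: close the connection explicitly once the work is done.
    # connection.close()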

There's a long-standing issue about documenting close_old_connections and when it should be called in long-running scripts/management commands.

I'm not familiar with MySQL at this level.
It looks like this is the problem. Yesterday the script crashed, I didn't do anything, and today at 8:01 it executed OK, so tomorrow it should crash again.
Will close_old_connections close all connections (the Django website, scripts), all connections for this specific user, or only the connection from that script?

Only the connection from that script.