Simple backup for important files on Ubuntu VPS

Hello,

I am running basically all my stuff on VPSes with various versions of Ubuntu, and I am wondering if anyone has a good “backup story” for similar needs.

The project I want to back up uses SQLite for its DB and has a few media files in the media directory. So ideally I would want a “snapshot” of both of these taken each day and uploaded somewhere else.

DigitalOcean offers backups of the entire VPS but these aren’t daily.

Since I have a paid Dropbox plan, I was thinking that creating an archive every day and uploading it to a dedicated folder might work, but I have only found homemade solutions with custom scripts…

Or maybe I am missing some super simple approach.

I think your best options depend upon whether you’re only looking for a “current state” backup or whether you want backups-by-date where you can go back to any specific previous date.

But in short, “tar & scripts” really is the best way to go, and is the “super simple” approach. I’ve been relying on that basic premise since it was taught to me in 1998 and it has never failed me.

(If you’re only looking for a “current state” snapshot, then rsync works really well for that.)
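For example, a dated archive plus a current-state mirror can be as simple as the two lines below. (The directory names and the “backupbox” host are just placeholders, not your actual paths.)

tar czf /backups/site-`date +%Y%m%d`.tar.gz /srv/mysite
rsync -a /srv/mysite/ backupbox:/backups/mysite-current/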

Yeah, I think current state is mostly what I want, although I am not sure if there is a risk of the SQLite database ending up in a corrupted state; in that case, having multiple backups from different days would probably be the better solution.

Currently, losing one or two days of data would be a nuisance but not catastrophic; losing more than that would be quite a bit more serious.

Do you have a tested workflow that I can maybe take inspiration from?

Perhaps if I had cron back up the database at a regular interval into a folder that is either connected to Dropbox or “rsynced” somewhere, that could work, right?

That’s precisely the risk of a current-state-only backup.

Not really a workflow, but here are samples from a couple of scripts.

This is a cron job that runs daily.

# Dated working directory for today's dumps
export TDATE=/tmp/pg-`date +\%Y\%m\%d`
mkdir $TDATE
echo $TDATE
# Full dump, plus separate schema-only and data-only dumps
sudo -u postgres pg_dumpall | gzip >$TDATE/FULL.sql.gz
sudo -u postgres pg_dumpall -s | gzip >$TDATE/SCHEMA.sql.gz
sudo -u postgres pg_dumpall -a | gzip >$TDATE/DATA.sql.gz

Want to keep the most recent 10 backups?

ls -t | tail -n +11 | xargs rm -r -f

(This is run from within the directory where the backups are stored, and nothing else is kept in that directory.)

Absolutely - that’s all it takes. A handful of scripts and you’re all set.

Wanted to share the solution I arrived at…

I decided to back up with Duplicacy so I can have “snapshots” and not just the current state, and so I have something smarter than backing up full SQLite copies, which would eat space.

I initially wanted to store the backups on Dropbox, but there were some configuration issues related to refresh tokens, and the whole setup felt like something neither Duplicacy nor Dropbox really “recommends”.

So I ended up buying S3-compatible storage from DigitalOcean and storing my backups there. To be extra safe, I connected my Synology NAS so it downloads everything from there at periodic intervals.
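For reference, the Duplicacy setup against the DO bucket boils down to something like the lines below (the bucket name, region, and “mysite” snapshot ID are placeholders; double-check the exact S3 storage URL format in the Duplicacy docs, and the init step should prompt for the Spaces access key and secret):

cd /home/deploy/backup
duplicacy init mysite s3://nyc3@nyc3.digitaloceanspaces.com/my-backup-bucket/mysite
duplicacy backup -stats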

On my Mac I have the Duplicacy Web version connected to the same S3 storage, and when I need to, I can restore the SQLite database and media folder in a couple of minutes.

The backup on the server is done with a bash script that uses the sqlite3 command to create a DB backup in a directory outside the project; that backup directory is what Duplicacy then backs up. I am also using rsync to “mirror” the media folder into the backup folder.
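A minimal sketch of that script (all paths here are placeholders):

#!/usr/bin/env bash
set -euo pipefail

PROJECT=/home/deploy/mysite       # placeholder project location
BACKUP_DIR=/home/deploy/backup    # directory that Duplicacy backs up

mkdir -p "$BACKUP_DIR"

# SQLite's online backup gives a consistent copy even if the app is writing
sqlite3 "$PROJECT/db.sqlite3" ".backup '$BACKUP_DIR/db.sqlite3'"

# Mirror the media folder; --delete keeps it an exact mirror
rsync -a --delete "$PROJECT/media/" "$BACKUP_DIR/media/"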

Cron then runs my backup script every day, and I have Sentry Cron Monitoring as an additional measure to know that it is indeed running.
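The crontab entry itself is just a single daily line (time and path are placeholders):

# run the backup script every day at 03:15
15 3 * * * /home/deploy/bin/backup.sh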

And that’s it.

This DO article was quite helpful: How To Manage Backups to the Cloud with Duplicacy | DigitalOcean
