
Automating Backups on a Linux VPS

Automated backups are the baseline for running any production workload on a Linux VPS. Even a stable server can suffer from accidental deletion, filesystem corruption, failed updates, or a compromised account.

This guide shows a practical backup setup: what to include, how to create daily archives, how to keep offsite copies, and how to verify restores. If you’re hosting real projects, start with a reliable Linux VPS with enough storage and consistent disk performance to run backups without impacting users.

Backup Strategy for a Linux VPS

Before writing any scripts, define a simple strategy. This improves reliability and also makes troubleshooting much easier later.

  • RPO (Recovery Point Objective): how much data you can afford to lose (e.g., last 24 hours).
  • RTO (Recovery Time Objective): how quickly you must restore service (minutes vs hours).
  • Retention: how many days/weeks of backups you keep (e.g., 7 daily + 4 weekly).
  • Offsite copy: never store backups only on the same VPS.
  • Restore testing: a backup you never restore is just a file, not a plan.
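The retention part of this strategy is easy to prototype before touching real data. A quick sketch in a throwaway directory (assuming GNU `touch` and `find`; all paths are illustrative):

```shell
# Sketch of time-based retention in a scratch directory (not /backup)
DEMO=/tmp/retention-demo
mkdir -p "$DEMO"

# Simulate a fresh archive and one that is 10 days old
touch "$DEMO/backup-new.tar.gz"
touch -d "10 days ago" "$DEMO/backup-old.tar.gz"

# Prune anything older than 7 days -- the same pattern a daily job would use
find "$DEMO" -type f -name "backup-*.tar.gz" -mtime +7 -delete

ls "$DEMO"
```

After the `find`, only backup-new.tar.gz remains. The same `-mtime +N -delete` pattern appears in the retention step of the backup script later in this guide.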

What You Should Back Up

For most websites and applications, the “must-have” backup set looks like this:

  • Website/app files (commonly /var/www/)
  • System and service configs (/etc/)
  • User data (/home/)
  • Databases (MySQL/MariaDB/PostgreSQL dumps)
  • SSL certificates (often under /etc/letsencrypt/ or your web server config paths)
  • App secrets and environment files (where your application stores them)

Tip: If your VPS is used for containers (Docker), you usually want to back up persistent volumes and app configs, not container layers.

Create a Secure Backup Directory

Create a dedicated backup folder with strict permissions. This prevents other users/processes from reading sensitive backups.

sudo mkdir -p /backup
sudo chown root:root /backup
sudo chmod 700 /backup

Important: Make sure your backup path is not inside your website directory (for example, not under /var/www), otherwise you may accidentally expose archives via HTTP.

Option 1: Create a Simple Archive Backup with tar

A daily compressed archive is a good baseline. Use exclusions to avoid backing up virtual filesystems and to prevent recursive self-backups.

sudo tar -czf /backup/backup-$(date +%F).tar.gz \
  --one-file-system \
  --exclude=/backup \
  --exclude=/proc --exclude=/sys --exclude=/dev --exclude=/run \
  /var/www /etc /home

If your VPS is busy, you can reduce the backup's impact by running tar at low CPU and I/O priority:

sudo nice -n 19 ionice -c3 tar -czf /backup/backup-$(date +%F).tar.gz \
  --one-file-system \
  --exclude=/backup \
  --exclude=/proc --exclude=/sys --exclude=/dev --exclude=/run \
  /var/www /etc /home

What the tar options mean

  • -c — create archive
  • -z — gzip compression
  • -f — output file
  • --one-file-system — don’t cross mount points (helps avoid surprises)

Database Backups

Files alone are not enough for most production servers. Add database dumps to your backup routine (preferably stored alongside your archives).

MySQL / MariaDB (dump all databases)

sudo mysqldump --all-databases --single-transaction --routines --events \
  | gzip > /backup/mysql-all-$(date +%F).sql.gz

Tip: For automation, configure credentials safely (for example in /root/.my.cnf) instead of putting passwords in cron.
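For example, a root-only /root/.my.cnf (chmod 600) lets mysqldump authenticate without putting a password on the command line. The credentials below are placeholders:

```ini
# /root/.my.cnf -- must be readable by root only (chmod 600)
[client]
user=root
password=REPLACE_WITH_YOUR_PASSWORD
```

mysqldump and the mysql client read this file automatically when run as root, so the cron job and backup script need no password arguments.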

PostgreSQL (dump all databases)

sudo -u postgres pg_dumpall | gzip > /backup/postgres-all-$(date +%F).sql.gz

Verify Backups

Always verify that files exist and are readable. A zero-byte archive or a broken dump can happen silently if you never check.

ls -lh /backup
tar -tzf /backup/backup-$(date +%F).tar.gz | head

For extra confidence, generate checksums (useful for offsite transfers):

cd /backup
sha256sum backup-$(date +%F).tar.gz > backup-$(date +%F).tar.gz.sha256

Create a Backup Script (Recommended)

Instead of complex one-liners in cron, put your logic into a script. This makes logging, retention, and offsite sync much cleaner.

sudo nano /usr/local/sbin/backup-vps.sh

Example script (edit paths to match your server):

#!/usr/bin/env bash
set -euo pipefail

BACKUP_DIR="/backup"
DATE="$(date +%F)"
HOST="$(hostname -s)"
RETENTION_DAYS="7"

ARCHIVE="${BACKUP_DIR}/${HOST}-files-${DATE}.tar.gz"
MYSQL_DUMP="${BACKUP_DIR}/${HOST}-mysql-${DATE}.sql.gz"
PG_DUMP="${BACKUP_DIR}/${HOST}-postgres-${DATE}.sql.gz"

mkdir -p "${BACKUP_DIR}"
chmod 700 "${BACKUP_DIR}"

# 1) Files archive (adjust folders if needed)
nice -n 19 ionice -c3 tar -czf "${ARCHIVE}" \
  --one-file-system \
  --exclude=/backup \
  --exclude=/proc --exclude=/sys --exclude=/dev --exclude=/run \
  /var/www /etc /home

# 2) MySQL/MariaDB dump (optional: remove if you don't use MySQL)
if command -v mysqldump >/dev/null 2>&1; then
  # "|| true" keeps a failed dump from aborting the rest of the backup run;
  # watch the log and dump file sizes so failures don't go unnoticed.
  mysqldump --all-databases --single-transaction --routines --events \
    | gzip > "${MYSQL_DUMP}" || true
fi

# 3) PostgreSQL dump (optional: remove if you don't use PostgreSQL)
if command -v pg_dumpall >/dev/null 2>&1 && id postgres >/dev/null 2>&1; then
  sudo -u postgres pg_dumpall | gzip > "${PG_DUMP}" || true
fi

# 4) Checksums (helpful for integrity checks after transfers)
cd "${BACKUP_DIR}"
sha256sum "$(basename "${ARCHIVE}")" > "$(basename "${ARCHIVE}").sha256"

# 5) Retention cleanup
find "${BACKUP_DIR}" -type f -name "${HOST}-files-*.tar.gz" -mtime +"${RETENTION_DAYS}" -delete
find "${BACKUP_DIR}" -type f -name "${HOST}-mysql-*.sql.gz" -mtime +"${RETENTION_DAYS}" -delete
find "${BACKUP_DIR}" -type f -name "${HOST}-postgres-*.sql.gz" -mtime +"${RETENTION_DAYS}" -delete
find "${BACKUP_DIR}" -type f -name "${HOST}-files-*.tar.gz.sha256" -mtime +"${RETENTION_DAYS}" -delete

echo "Backup completed: ${DATE}"

Make it executable:

sudo chmod +x /usr/local/sbin/backup-vps.sh

Automate Backups with Cron

Schedule the script to run daily (for example, at 03:00) by editing the root crontab:

sudo crontab -e

Add the line below:

0 3 * * * /usr/local/sbin/backup-vps.sh >> /var/log/backup-vps.log 2>&1

Tip: Check logs after the first run: tail -n 50 /var/log/backup-vps.log

Transfer Backups Offsite (rsync)

Do not keep backups only on the same VPS. A basic and reliable approach is copying backups to a second server using rsync over SSH.

1) Create an SSH key (on the VPS)

sudo ssh-keygen -t ed25519 -a 64 -f /root/.ssh/id_ed25519 -N ""

2) Copy the key to the remote backup server

sudo ssh-copy-id backupuser@remote-server

3) Sync the backup folder

sudo rsync -az /backup/ backupuser@remote-server:/remote-backup/$(hostname -s)/

If you want this automated, schedule it after the backup (example: 03:30):

30 3 * * * rsync -az /backup/ backupuser@remote-server:/remote-backup/$(hostname -s)/ >> /var/log/backup-vps-rsync.log 2>&1

Test Restore (Quick Sanity Check)

At least occasionally, do a test restore into a temporary directory. This is the fastest way to catch permission issues, missing paths, or broken archives.

sudo mkdir -p /tmp/restore-test
sudo tar -xzf /backup/$(hostname -s)-files-$(date +%F).tar.gz -C /tmp/restore-test
ls -la /tmp/restore-test | head

And verify checksums if you use them:

cd /backup
sha256sum -c $(hostname -s)-files-$(date +%F).tar.gz.sha256
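Beyond a full extraction, you can validate an archive without unpacking it. A self-contained sketch using a small demo archive (stand-in paths, not your real backups):

```shell
# Build a small demo archive, then validate it two ways
DEMO_SRC=/tmp/restore-demo-src
ARCHIVE=/tmp/restore-demo.tar.gz
mkdir -p "$DEMO_SRC"
echo "hello" > "$DEMO_SRC/index.html"
tar -czf "$ARCHIVE" -C /tmp restore-demo-src

# 1) gzip -t checks the compressed stream for corruption
gzip -t "$ARCHIVE" && echo "gzip stream OK"

# 2) tar -tzf checks that the archive index is readable
tar -tzf "$ARCHIVE" >/dev/null && echo "archive listing OK"
```

Running the same two checks against your real archives in /backup (for example after an rsync transfer) catches truncated or corrupted files cheaply.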

Security Best Practices for VPS Backups

  • Keep /backup private: chmod 700 and root-owned.
  • Use SSH keys for offsite transfers (disable password auth on the backup server if possible).
  • Consider encrypting archives before offsite storage if backups contain sensitive data.
  • Store secrets safely (don’t hardcode DB passwords into cron).
  • Monitor backups: check log files, file sizes, and last modification time.
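If archives contain sensitive data, encrypt them before they leave the server. A minimal sketch using GnuPG symmetric encryption; the demo file and inline passphrase are placeholders, and in production you would read the passphrase from a root-only file rather than the command line:

```shell
# Create a small demo file to encrypt (stand-in for a real backup archive)
ARCHIVE=/tmp/encrypt-demo.tar.gz
echo "sensitive data" | gzip > "$ARCHIVE"

# Symmetric AES256 encryption; produces ${ARCHIVE}.gpg alongside the original
gpg --batch --yes --pinentry-mode loopback \
    --passphrase "CHANGE_ME" \
    --symmetric --cipher-algo AES256 "$ARCHIVE"

ls -l "${ARCHIVE}.gpg"
```

Transfer only the .gpg file offsite; restore with gpg --decrypt. Keep the passphrase somewhere other than the backup server itself, or the encryption protects nothing in a full-compromise scenario.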

When to Upgrade Your VPS

If backups take too long, CPU usage spikes during compression, or disk I/O becomes a bottleneck, you’ll get more predictable automation on a stronger Linux VPS with better storage performance and additional resources—especially if you run heavy databases or multiple websites.

FAQ

How often should I run backups?

For most sites, daily backups are the minimum. If data changes frequently (orders, messages, uploads), consider more frequent database dumps (e.g., every 1–6 hours) with a longer retention policy.
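Assuming the MySQL setup from earlier in this guide, a separate hourly dump job in the root crontab could look like this (note that % must be escaped as \% inside crontab command fields):

```shell
# Root crontab: hourly MySQL dump, overwritten on a rolling 24-hour cycle
15 * * * * ( mysqldump --all-databases --single-transaction | gzip > /backup/mysql-hourly-$(date +\%H).sql.gz ) 2>> /var/log/backup-hourly.log
```

Using the hour in the filename keeps a rolling 24-hour window of dumps without any extra cleanup logic.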

Are VPS snapshots the same as backups?

No. Snapshots can be useful, but they’re not a full backup strategy. You still want independent, offsite copies and restore testing.

How long should I keep backups?

A simple and effective approach is 7 daily backups plus weekly/monthly archives for longer retention, depending on your business needs and storage budget.

What’s the biggest mistake people make?

Keeping backups on the same VPS only, and never testing restores. Offsite + restore testing is what turns “backups” into real recovery.

Deploy a Linux VPS Ready for Automated Backups

Want predictable backup jobs, stable performance, and enough headroom for archives and database dumps? Start with a reliable Linux VPS and automate daily backups + offsite sync from day one.
