Automating backups on a Linux VPS is a basic data protection measure that saves you from mistakes, crashes, and failed updates. Even if the server is running smoothly, any incident—from a damaged file system to accidental deletion—can lead to serious losses. A properly configured backup system solves this problem once and for all: copies are created automatically, stored in an orderly manner, and quickly restored when needed.
A well-thought-out backup structure
Before automating the process, it is important to determine what exactly needs to be stored. Usually, this includes:
- the directory with the website or application (/var/www/),
- service configurations,
- user data,
- directories with logs or static content.
Create a directory for backups:
sudo mkdir /backup
sudo chmod 700 /backup
Mode 700 restricts access to the directory's owner only, which is important for security.
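A quick way to confirm the mode took effect. The snippet below runs on a scratch directory so it works anywhere; on the server itself you would run `stat -c '%a %U' /backup` instead:

```shell
# Illustrative check on a scratch directory; on the VPS, inspect /backup directly.
DIR=$(mktemp -d)
chmod 700 "$DIR"
stat -c '%a' "$DIR"   # prints: 700
```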
Creating an archive backup
The main archiving tool in Linux is tar. It is stable, fast, and supported by all systems.
Example of creating a backup:
sudo tar -czvf /backup/site-$(date +%F).tar.gz /var/www/
What is happening here:
- -c — create an archive,
- -z — enable gzip compression,
- -v — show the process,
- -f — path to the final file.
Date-stamped names make it easy to find the right archive by creation time.
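Since the cron job shown later discards all output, it is worth knowing how to wrap the tar call so a failure is at least reported. A minimal sketch, using temporary directories as stand-ins for /var/www and /backup:

```shell
# Sketch of the archive step with an explicit success check.
# SRC and BACKUP_DIR are temp stand-ins for /var/www and /backup.
SRC=$(mktemp -d)
BACKUP_DIR=$(mktemp -d)
echo '<html></html>' > "$SRC/index.html"

DEST="$BACKUP_DIR/site-$(date +%F).tar.gz"
if tar -czf "$DEST" -C "$SRC" .; then
    echo "backup created: $DEST"
else
    echo "backup FAILED" >&2
fi
```

On a real server the `echo` in the failure branch could be replaced with a mail or logger call, so a broken backup does not go unnoticed.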
Checking that the backup was created correctly
List of created archives:
ls -lh /backup
If the file appears and its size looks realistic, you can add one more check:
tar -tf /backup/site-YYYY-MM-DD.tar.gz
This command displays the contents of the archive. If it is readable, the archive is in order.
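Besides listing the contents, `gzip -t` verifies the integrity of the compressed stream itself. A self-contained demonstration on a freshly made archive:

```shell
# gzip -t tests the compressed data without extracting anything.
SRC=$(mktemp -d)
echo data > "$SRC/file.txt"
ARCHIVE="$SRC/demo.tar.gz"
tar -czf "$ARCHIVE" -C "$SRC" file.txt

if gzip -t "$ARCHIVE"; then
    echo "archive OK"
fi
```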
Configuring automation via cron
Now for the most important part: making sure backups are created automatically.
Open the root crontab:
sudo crontab -e
Add the line:
0 3 * * * tar -czf /backup/site-$(date +\%F).tar.gz /var/www/ >/dev/null 2>&1
Breakdown:
- 0 3 * * * — run daily at 03:00,
- \%F — the percent sign must be escaped in a crontab, because cron otherwise treats % as a newline,
- >/dev/null 2>&1 — discard output so cron does not mail it to root after every run.
In the morning, there will be a fresh archive in /backup.
Transferring backups to another server
Storing all backups on the same VPS is risky. The best option is to transfer them to a remote server or dedicated storage.
Command for manual sending:
rsync -avz /backup/ user@IP:/remote-backups/
Advantages of rsync:
- transfers only changed parts of files,
- works via SSH,
- suitable for automation.
Adding to cron:
30 3 * * * rsync -avz /backup/ user@IP:/remote-backups/ >/dev/null 2>&1
Now copying happens automatically, right after the archive is created.
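For the cron job to run unattended, rsync needs passwordless SSH access to the remote host. A sketch using a dedicated key; `user@IP` and the remote path follow the article's placeholders, and the remote-side steps are shown as comments because they need a reachable server:

```shell
# Generate a dedicated passphrase-less key for unattended transfers.
KEYFILE="$(mktemp -d)/backup_key"
ssh-keygen -t ed25519 -f "$KEYFILE" -N "" -q

# Remote-side steps (require a reachable server, hence commented out):
#   ssh-copy-id -i "$KEYFILE.pub" user@IP
#   rsync -avz -e "ssh -i $KEYFILE" /backup/ user@IP:/remote-backups/
echo "key pair created: $KEYFILE"
```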
Automatic cleanup of old backups
To keep the /backup directory from getting too full:
find /backup -type f -mtime +7 -delete
This deletes archives older than seven days; the retention period can be adjusted to suit your needs.
Command for cron:
0 4 * * * find /backup -type f -mtime +7 -delete
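The retention rule can be demonstrated safely on a scratch directory: `touch -d` (GNU coreutils) backdates one file so `find -mtime +7` has something to delete.

```shell
# Files older than 7 days are deleted, recent ones are kept.
DIR=$(mktemp -d)
touch "$DIR/new.tar.gz"
touch -d "10 days ago" "$DIR/old.tar.gz"   # backdate (GNU touch)

find "$DIR" -type f -mtime +7 -delete
ls "$DIR"   # prints: new.tar.gz
```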
Final cron configuration
It is convenient to collect the entire plan in one place:
# Creating a backup
0 3 * * * tar -czf /backup/site-$(date +\%F).tar.gz /var/www/
# Transfer to an external server
30 3 * * * rsync -avz /backup/ user@IP:/remote-backups/
# Cleaning up old archives
0 4 * * * find /backup -type f -mtime +7 -delete
This set is sufficient for full automation.
When to upgrade your VPS
If archives take too long to create, the disk subsystem is overloaded, or the server freezes during backup, this is a clear sign that you don’t have enough resources. Sometimes it is easier to switch to a VPS with higher performance than to try to optimize the minimum configuration.
Conclusion
Backup automation on Linux VPS is based on a simple but reliable scheme: daily creation of archives, their transfer to an external server, and regular cleaning of old data. All this takes a few minutes to set up, but ensures the smooth operation of the project for many years to come. A properly organized backup system frees the administrator from routine tasks and ensures that data can be restored at any time.