Build a Simple Network Backup Service on Linux
In its simplest form, a backup safeguards data by keeping an ordered, redundant copy of it somewhere else. This post presents a small script, scheduled with crontab, that backs up computers and shares, both Windows and Linux based.
The files are copied to a mount point that is only mounted when needed. A new folder is created on the mount point each month, so a monthly history of the files is kept. It is not important here, but this mount point might be a small NAS device in an office next door or a cloud-based share.
This is a great beginner exercise for a scheduled BASH script.
0 . Latest updates.
2026.01.01 – Updated with Ubuntu 24.04
Configuring a mount point is not included in this post. The assumption is the server can write to the locations involved using mount points.
1 . Prerequisites.
This post assumes you have a running Ubuntu-like server, though almost any other Linux distribution will work. The destination on which to place a copy of your files should be mountable. The earlier posts on mounting Windows shares and on logging in unchallenged with SSH keys are precursors to this one.
The user that the backup runs as must have read and write permissions where required.
The destination must be mounted in order to be written to. Consider, however, whether it should remain mounted at all times, since a permanently mounted destination could, for example, be accidentally deleted. In the script below, the destination is mounted only when required and unmounted when the backup completes; this keeps the data safe if a problem on the server would otherwise leak into the destination. You can change this behaviour simply by commenting out the mount and umount commands in the script.
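Before trusting a scheduled run, it is worth checking the destination really is an active mount point, because writing to an unmounted /mnt/backups would silently fill the local disk instead. A minimal sketch of that check, using the same findmnt command the backup script relies on (the is_mounted helper name is my own; "/" is used in the demonstration only because it is always mounted):

```shell
#!/bin/bash
# Return success only if the given path is an active mount point.
is_mounted() {
    findmnt -M "$1" > /dev/null
}

# Demonstration against "/", which is always mounted:
if is_mounted "/"; then
    echo "mounted"
else
    echo "not mounted"
fi
```

The same helper can guard the rsync loop in the backup script: mount, check with is_mounted, and only copy files when the check succeeds.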
2 . Create a backup script.
Now to create the script:
sudo mkdir -p /opt/backups
sudo nano /opt/backups/backups.sh
The backup script can be as simple or as complex as it needs to be. The basic requirement is to copy all the important things to another place. A good script might look like this:
#!/bin/bash
# Mount the backup point here if you chose to leave it unmounted:
mount //servername/backups /mnt/backups -t cifs -o credentials=/etc/creds.file,uid=5000,gid=5000
# Check the backups share is mounted before writing anything.
if findmnt -m "/mnt/backups" > /dev/null; then
    echo "Backups mounted"
    month=$(date "+%Y.%m")
    options="-avxz --safe-links --exclude swap.file --delete"
    # Servers to back up: replace servername1 servername2 etc. with actual resolvable names.
    for server in servername1 servername2 servername3
    do
        # Folders to back up: replace with the folders you want to copy from each host.
        for folder in "/var/backup" "/home" "/var/www" "/var/vmail" "/etc" "/opt" "/root" "/mnt/data"
        do
            source="root@$server:$folder/"
            destination="/mnt/backups/$month.$server/root$folder"
            mkdir -p "$destination"
            rsync $options "$source" "$destination"
        done
    done
    # Unmount the share here if you want to leave the location unreachable
    # outside of backup operations (which is more secure):
    umount /mnt/backups
else
    echo "Backups not mounted"
fi
The backup location now receives a copy of the files in a folder named by month. Each month a new folder is created and then synchronised as often as the script runs, giving a basic history. If you are worried about space on the share, you have to monitor it or change the script to delete old backups.
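One way to change the script to delete old backups is a simple retention rule: keep only the newest few month folders and remove the rest. A sketch of that idea, which works because the YYYY.MM folder names sort chronologically; here backup_root points at a temporary directory with example folders purely for demonstration, and you would point it at your real share (e.g. /mnt/backups) instead:

```shell
#!/bin/bash
# Keep only the newest $keep month folders under $backup_root.
backup_root=$(mktemp -d)
keep=3

# Create some example month.server folders to prune (demonstration only).
for m in 2025.01 2025.02 2025.03 2025.04 2025.05; do
    mkdir -p "$backup_root/$m.servername1"
done

# A reverse name sort lists the newest month first; everything after
# the first $keep entries is deleted.
ls -1 "$backup_root" | grep -P '^\d{4}\.\d{2}\.' | sort -r | tail -n +$((keep + 1)) |
while read -r old; do
    rm -rf "$backup_root/$old"
done

ls -1 "$backup_root"
```

Running this leaves only the three newest example folders in place. Dropped into the backup script after the rsync loop, it caps how much history the share accumulates.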
Don’t forget to set the script to executable:
sudo chmod +x /opt/backups/backups.sh
3 . Automate the backup.
The script can be set to run once a week with crontab:
crontab -e
Adding the following line will run the script each Saturday morning at 03:00:
0 3 * * 6 /opt/backups/backups.sh
The backup now runs once a week. It is the simplest of backups, but functional. If you check the backup location, you should see your files there, easily retrieved or read back.
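By default cron discards the script's output. Appending a redirect to the crontab line keeps a log you can inspect after each run; the path /var/log/backups.log below is only an example, any writable location will do:

```shell
0 3 * * 6 /opt/backups/backups.sh >> /var/log/backups.log 2>&1
```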
What happens if you run out of storage? The backup has to keep running, so a warning message can be sent using a webhook, to Discord for example.
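For reference, a Discord webhook is just an HTTP POST of a JSON body with a "content" field. A minimal sketch with curl follows; the webhook URL is a placeholder you would replace with the one generated in your channel's settings, and payload/notify are my own helper names:

```shell
#!/bin/bash
# Placeholder: create a webhook in your Discord channel settings
# and paste its real URL here.
webhook_url="https://discord.com/api/webhooks/CHANGE/ME"

# Build the JSON body Discord expects: an object with a "content" field.
payload() {
    printf '{"content": "%s"}' "$1"
}

# POST the message to the webhook URL.
notify() {
    curl -s -H "Content-Type: application/json" \
         -d "$(payload "$1")" \
         "$webhook_url" > /dev/null
}

# Example call (uncomment once webhook_url is real):
# notify "Backup storage on server03 is running low"
payload "test message"
```

Note the sketch does not escape quotes or backslashes in the message, so keep the text simple or escape it before passing it in.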
This snippet checks whether the mount point has free space equal to at least 200% of the most recent backup's size; if not, it recommends deleting the oldest folder:
# Determine the size of the most recent backup set, doubled as headroom.
recent=$(ls -t /mnt/backups | grep -P '\d{4}\.\d{2}' | head -n 1)
recentsize=$(($(du -s -BG "/mnt/backups/$recent" | awk '{print $1}' | sed -e 's/G$//') * 2))
# Determine how much space is available on the mount point.
currentspace=$(df -BG | grep backups | awk '{print $4}' | sed -e 's/G$//')
# Find the oldest folder by name and modification time, the candidate for deletion.
oldest=$(ls -tr /mnt/backups | grep -P '\d{4}\.\d{2}' | head -n 1)
if ((recentsize > currentspace)); then
    echo "it is recommended to rm /mnt/backups/$oldest"
else
    echo "2 x recent backup size fits in the free space"
fi
This script can be run as a crontab job and give its output directly to a Discord channel, set up beforehand, by replacing the echo lines with webhook calls:
# Determine the size of the most recent backup set, doubled as headroom.
recent=$(ls -t /mnt/backups | grep -P '\d{4}\.\d{2}' | head -n 1)
recentsize=$(($(du -s -BG "/mnt/backups/$recent" | awk '{print $1}' | sed -e 's/G$//') * 2))
# Determine how much space is available on the mount point.
currentspace=$(df -BG | grep backups | awk '{print $4}' | sed -e 's/G$//')
# Find the oldest folder by name and modification time, the candidate for deletion.
oldest=$(ls -tr /mnt/backups | grep -P '\d{4}\.\d{2}' | head -n 1)
if ((recentsize > currentspace)); then
    /opt/discord/discord.sh "server03" "it is recommended to rm /mnt/backups/$oldest"
else
    /opt/discord/discord.sh "server03" "2 x recent backup size fits in the free space"
fi
4 . Supporting this blog.
This type of content takes a lot of effort to write. Each post is drafted, researched, then tested multiple times; even a simple step or detail might take more than a few hours to go from idea to published blog post. Did you notice there are no adverts on this blog?
If you feel I have saved you some time, please choose to host with Digital Ocean, like I do – https://m.do.co/c/0fa838487fa8