Bobbie Smulders Creative Software Developer

Automated backups from DirectAdmin to Synology NAS

(Screenshot: Synology Task Scheduler)

For the past couple of years I’ve been a customer of Antagonist. I have a shared hosting package with them that runs on DirectAdmin. DirectAdmin has an option for manual backups, but since that takes a lot of time and manual effort, it would be better to automate it. To do so, I came up with a three-step process:

1. Create a full backup from DirectAdmin and store it locally

DirectAdmin has released a script on their website to automatically create backups. The script works on accounts that do not have admin rights (typical on shared hosting servers). Follow the instructions up to step 5. I have set up the cron job to create the backup at 04:00. This is what my cron job looks like in DirectAdmin:

00 04 * * * /home/USERACCOUNT/backup.php

2. Transfer the backup from the shared hosting server to a Synology NAS

Transferring the backup from the shared hosting server to the Synology NAS can be accomplished in two ways: letting DirectAdmin upload it to the Synology NAS, or letting the Synology NAS download it from the shared hosting server. My shared hosting server doesn’t allow scripts to upload over FTP (ncftpput isn’t installed), so I let my Synology NAS download the backups. A couple of steps are needed for this.

1) Create an FTP account in DirectAdmin

In DirectAdmin, under “FTP Management”, create a new FTP account with an identifiable username and strong randomly generated password. Use the custom directory field and enter the directory where the DirectAdmin script from step 1 stores its backups. This ensures that the account is only capable of downloading the backups.

2) Create a directory on your Synology NAS

Create a directory anywhere on your Synology NAS. It does not matter where you store the backups.

3) Schedule a download script

In the Synology control panel, under “Task Scheduler”, create a task that runs the following script:

cd /VOLUME/BACKUP_DIRECTORY/
find /VOLUME/BACKUP_DIRECTORY/ -type f -mtime +10 -exec rm {} \;
curl "ftp://FTP.DOMAIN.COM/backup-`date +%b-%-e-%Y`-1.tar.gz" -u USERNAME:PASSWORD --ftp-ssl -O

This script is a three-part process. The first line changes into the backup directory, so that cURL’s -O option saves the download there. The second line deletes all previous backups older than 10 days; you may change the “10” parameter to suit your backup needs (the -type f test makes sure only files, never the directory itself, are removed). The third line downloads the recently generated backup. Replace the “VOLUME”, “BACKUP_DIRECTORY”, “DOMAIN.COM”, “USERNAME” and “PASSWORD” placeholders to fit your website and credentials.
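The backquoted date expression in the cURL URL mirrors DirectAdmin’s backup naming scheme (backup-Mon-D-YYYY-1.tar.gz). A quick way to check what it expands to on your machine:

```shell
# Prints the month-day-year part of the backup filename,
# e.g. "Apr-1-2015" on 1 April 2015 (the "-" in %-e strips
# the day's leading padding on GNU date).
date +%b-%-e-%Y
```

If the output doesn’t match the filenames your DirectAdmin script produces, adjust the format string accordingly.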

I have scheduled this task to run at 05:00, which gives DirectAdmin exactly one hour to create a backup. If your server is incapable of creating a backup within an hour, you may need to schedule this to a later time.

For the sake of simplicity I have hardcoded the username and password inside the cURL command. I highly recommend using --config to store your credentials in an external file if you can. If your server doesn’t have a valid SSL certificate you can add the --insecure parameter to ignore the certificate warning.
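As a sketch of that --config approach: curl reads one option per line from the file, using the long option names without the leading dashes, so a credentials file (the path below is hypothetical, pick your own location on the NAS) could look like this:

```
# /VOLUME/backup-credentials.conf (hypothetical path)
# Keep this file private, e.g. chmod 600.
user = "USERNAME:PASSWORD"
ftp-ssl
```

The download line in the Task Scheduler script then becomes: curl --config /VOLUME/backup-credentials.conf "ftp://FTP.DOMAIN.COM/backup-`date +%b-%-e-%Y`-1.tar.gz" -O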

3. Delete the backup from DirectAdmin

After the backup has been downloaded, DirectAdmin needs to delete it. Schedule a cron job in DirectAdmin to delete the backup at 06:00. This is what my cron job looks like in DirectAdmin:

00 06 * * * rm -rf /home/USERACCOUNT/backups/*
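If the backups directory ever contains anything besides the generated archives, rm -rf on its whole contents is a blunt instrument. A slightly more defensive sketch (the helper name and its location are my own, not part of DirectAdmin) deletes only the .tar.gz archives:

```shell
#!/bin/sh
# Hypothetical helper: delete only DirectAdmin's generated .tar.gz
# archives from the backups directory, leaving other files alone.
prune_backups() {
    # $1: the backups directory, e.g. /home/USERACCOUNT/backups
    find "$1" -maxdepth 1 -type f -name '*.tar.gz' -delete
}
```

Saved as e.g. /home/USERACCOUNT/prune.sh with a line calling prune_backups on your backups directory, the 06:00 cron job can run that script instead of rm -rf.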


Please note that this process may not apply to your particular situation. Also be aware that a compromised Synology NAS could leave your DirectAdmin account vulnerable, since the NAS stores the FTP credentials. Security-wise, the best solution is to create a write-only FTP account on your Synology NAS and let DirectAdmin upload the backup files there. Unfortunately, my hosting provider doesn’t allow me to do that.