DSM-G600, DNS-3xx and NSA-220 Hack Forum



#1 2007-10-14 19:52:49

dannystaple
Member
Registered: 2007-08-29
Posts: 9

Automated Server Backup Scripts

I have, over a while, built some bash scripts that take information from my site orionrobots along with a DB dump, and deposit it somewhere for another machine to pick up and store in a backup location. After buying the DSM-G600 I decided this would be a perfect function for it, and have adapted the system to work on it in ash.

Requirements:
* a little knowledge of fun_plug
* telnet installed
* busybox installed
* dropbear installed

dropbear seems to want dbclient to live in /usr/bin, and I wanted to have plenty of other tools in there too. Most of my custom tools are installed in a bin directory at the root of HD_a2. A /usr/bin already exists on the box, but it contains a single file: a symlink for passwd. I recreated this symlink in my bin directory, and modified fun_plug to remove /usr/bin and symlink it back to my bin directory.

Perform the following via telnet:

Put the same passwd symlink in the custom binary directory

Code:

ln -s /bin/tinylogin /mnt/HD_a2/bin/passwd

Add to fun_plug

Code:

rm -r /usr/bin
ln -s /mnt/HD_a2/bin /usr/bin

When you reboot, your bin directory will be there in place of the default one. Bear in mind that by running the fun_plug commands directly, or sourcing fun_plug, you can try things out without rebooting.
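
The symlink trick itself can be tried safely in a scratch directory first. A minimal sketch with hypothetical paths (/bin/sh is used only because it exists on any POSIX box):

```shell
# Build a throwaway bin directory and symlink a known binary into it
mkdir -p /tmp/demo/bin
ln -sf /bin/sh /tmp/demo/bin/mysh
# The symlink behaves exactly like the original binary
/tmp/demo/bin/mysh -c 'echo hello'
# prints: hello
```

Replacing /usr/bin with a symlink to your own directory is just a larger version of the same pattern.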

Next you should establish keys for ssh. Since I already had keys generated, I used dropbearconvert to create dropbear-compatible keys from them. This thread helped me get the dropbear ssh client working in a script without asking for a password. You also need to prime the machine by logging into the remote host at least once and accepting its key, so that an entry is made in known_hosts.

However, before converting keys and accepting hosts, note that the directory used for ssh preferences is currently /.ssh. This is on the ram disk, and will be blitzed each time the box is rebooted. The solution is to create a permanent store on HD_a2, plus an entry in fun_plug to symlink it at boot. You do not have to copy my choice of path on HD_a2:

Create ssh preferences dir

Code:

mkdir -p /mnt/HD_a2/home/root/.ssh

Add to fun_plug

Code:

ln -s /mnt/HD_a2/home/root/.ssh /.ssh

Again run the fun_plug command manually or reboot to see the change.

At this point I copied my original keys to the .ssh directory and ran dropbearconvert:

Code:

cd /.ssh
dropbearconvert openssh dropbear id_rsa db_rsa_key

Then ensure that dropbear has the host cached in known_hosts, and test the key:

Code:

ssh -i /.ssh/db_rsa_key <theotherhost>

OK, assuming that like me you already had the server set up with an authorized_keys file, that should work. Once logged in, you can exit that session; now you want to get the backup scripts running.

The original scripts needed (luckily) only two changes to run in this environment. The first was that they used /bin/bash as their interpreter, and the second was that they used the batch-mode flag for the scp (remote copy) operation. I quickly adapted them by substituting /usr/bin/ash (which is broadly compatible with bash scripts) for the former, and the snippet "-i /.ssh/db_rsa_key" for the -B (batch mode) option.
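
Those two substitutions can be sketched mechanically with sed. The file name and its original contents below are hypothetical, and the in-place edit assumes GNU sed's -i flag:

```shell
# Create a hypothetical bash-flavoured script to adapt
printf '#!/bin/bash\nscp -B user@host:file dest\n' > /tmp/daily_backup
# Swap the interpreter, and replace scp's batch flag with the identity-key flag
sed -i 's|#!/bin/bash|#!/usr/bin/ash|; s|-B |-i /.ssh/db_rsa_key |' /tmp/daily_backup
cat /tmp/daily_backup
```

Doing it by hand in an editor is just as quick for two scripts; the sed form only matters if you have many to convert.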

The scripts themselves are the following:
* A script on the remote server that makes a DB dump and tarballs it up along with the site's code, leaving it in the correct place.
* A daily backup script that uses scp (secure copy) to pull the remote tarball and place it in a directory of daily rotated backup entries.
* A weekly backup script that copies one of the daily backups into a per-week directory. It produces up to 15 weeks, then starts rotating.

All the scripts log as they go, perhaps with the daily backup logging slightly less than the others. They are all triggered by cron jobs, but I will get to those later.

The first script is simple enough, and needs to be run from the crontab of the remote site - likely to be some simple db + php host:

backup_site_process.sh

Code:

#!/bin/bash
#This is a regular backup preparation script.
#It does not perform the backup, but prepares a file for backing
#up offsite.
#remove old files
echo "`date +%D-%R` : Performing backup"
rm -f site_db.dmp
rm -f currentsite.tar currentsite.tar.gz
mysqldump site_db --user=db_user --password=db_password >site_db.dmp
tar -c -X tar_exclude -f currentsite.tar public_html/* public_html/.htaccess tar_exclude  backup_site_process.sh site_db.dmp
gzip currentsite.tar

Note that the script tarballs itself up as well. It also references a file, tar_exclude, which lists files to exclude from the dump, such as temporary and working files. I won't share mine as it is highly site-specific.
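
For reference, a tar_exclude file is just a list of patterns, one per line, passed to tar via -X. The entries below are hypothetical examples, not my actual list:

```
public_html/cache/*
public_html/tmp/*
*.bak
```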

For the scripts running on the G600, I created some directories for them to work in. I placed the scripts in /mnt/HD_a2/backups/scripts and made /mnt/HD_a2/backups/sitename/site_backups the backup scripts' working directory.

The second script is the daily backup script:

/mnt/HD_a2/backups/scripts/daily_backup

Code:

#!/usr/bin/ash
#Daily backup script
echo "`date +%D-%R`: Starting daily backup"
#Get the day of week
dow=`date +%A`
siteaddress="yoursite.com"
#You must have an ssh id in your .ssh directory setup for this to work
user="yoursiteuser"
#The backup file is expected to be in this path, relative to login path
remote_path="./"
#This is the directory at the top of the backup set
backup_local_root="/mnt/HD_a2/backups/sitename/site_backups"

backup_file_name=currentsite.tar.gz
#error values
E_REMOTE_COPY=60
E_MKDIR=61
#Make the path if needed
echo "`date +%D-%R` : Making dir ${backup_local_root}/${dow}"
mkdir -p ${backup_local_root}/${dow} || {
  echo "Unable to make directory"
  exit $E_MKDIR;
}
local_full_file_path=${backup_local_root}/${dow}/${backup_file_name}
#Clear an old file
echo "`date +%D-%R` : Cleaning file ${local_full_file_path}"
rm -f ${local_full_file_path}
#Put this one on top
echo "`date +%D-%R` : Connecting to ${user}@${siteaddress}"
echo "`date +%D-%R` : Copying ${remote_path}${backup_file_name} to ${local_full_file_path}"
scp -i /.ssh/db_rsa_key ${user}@${siteaddress}:${remote_path}${backup_file_name} ${local_full_file_path} || {
  echo "`date +%D-%R` : Unable to copy from remote system"
  exit $E_REMOTE_COPY;
}

echo "`date +%D-%R` : Finished backup"
#Exit and indicate success
exit 0
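
Since the daily script names its target directory after the output of date +%A, the rotation has at most seven slots, one per weekday. A quick check you can run anywhere:

```shell
# date +%A prints the full weekday name, which becomes the directory name
dow=`date +%A`
echo "today's backup slot: ${dow}"
```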

Finally the weekly script:
/mnt/HD_a2/backups/scripts/weekly_backup

Code:

#!/usr/bin/ash
#Weekly backup script
#This is simpler - we would have already run a daily backup
#So we dont need to connect remotely

#You can either use date +%A for today, or override with a day name
echo "`date +%D-%R`: Starting weekly backup"
day_to_backup=`date +%A`
working_root="/mnt/HD_a2/backups/sitename/"
backup_local_root="${working_root}/site_backups"

weeknum_filename="${working_root}/WEEKNUM"
backup_file_name=currentsite.tar.gz

E_CANNOT_MKDIR=60
E_CANNOT_COPY=61
E_CANNOT_WRITE_WEEK_FILE=62
#Read week number from file
echo "`date +%D-%R`: Getting week number from ${weeknum_filename}"

if [ -e ${weeknum_filename} ]; then
  #read week number from file
  let weeknum=`cat <${weeknum_filename}`
  echo "Got ${weeknum}"
else
  let weeknum=0
fi
#write the next one back out
echo "`date +%D-%R`: Writing the next week number out to ${weeknum_filename}"
let next_num=(${weeknum} + 1)%15
echo $next_num >${weeknum_filename} || {
  echo "`date +%D-%R`: Unable to write the next week number out"
  exit $E_CANNOT_WRITE_WEEK_FILE;
}
#Make sure a week directory exists
week_dir="${backup_local_root}/week_${weeknum}"
mkdir -p ${week_dir} || {
  echo "`date +%D-%R`: Unable to create directory"
  exit $E_CANNOT_MKDIR;
}

echo "`date +%D-%R`: Copying ${backup_local_root}/${day_to_backup}/${backup_file_name} to ${week_dir}/${backup_file_name}"
#Copy the files
cp ${backup_local_root}/${day_to_backup}/${backup_file_name} ${week_dir}/${backup_file_name} || {
  echo "`date +%D-%R`: Unable to copy files"
  exit $E_CANNOT_COPY;
}

echo "`date +%D-%R`: Done"

exit 0

The WEEKNUM file tells the script which week it is in. Do not delete or change it once generated, as it is important for cycling the backups. I will at some point break the scripts down and explain them further; the next part, though, is the cron jobs for them.
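
The week rotation is plain modulo arithmetic: the number stored in WEEKNUM advances by one each run and wraps after 15. A sketch in POSIX shell (runs in ash or bash):

```shell
# After week 14 the counter wraps back to 0, reusing the oldest directory
weeknum=14
next_num=$(( (weeknum + 1) % 15 ))
echo $next_num
# prints: 0
```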

The cron job on the remote server is easy:
crontab entry

Code:

0 2 * * * ~/backup_site_process.sh >>~/backup.log

This can be added via "crontab -e" as normal. However, things are a bit more interesting for the G600: its crontab again sits on the ramdisk. I have opted for a different method here for now; instead of moving the file and symlinking it, I am using fun_plug to append the entries to it. The root crontab is /var/spool/cron/crontabs/root, and fun_plug should make the changes there:

Add to fun_plug

Code:

#make backup job cron entries
echo >>/var/spool/cron/crontabs/root "0 23 * * * /mnt/HD_a2/backups/scripts/daily_backup >>/mnt/HD_a2/backups/sitename/backup.log"
echo >>/var/spool/cron/crontabs/root "30 23 * * 1 /mnt/HD_a2/backups/scripts/weekly_backup >>/mnt/HD_a2/backups/sitename/backup.log"

echo with >> is a simple way to append to a file. Note that each job has its output redirected, appending to the same backup log.
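
The append behaviour is easy to verify with a throwaway file (name hypothetical):

```shell
# > truncates and creates; >> appends without clobbering
echo "first entry" > /tmp/demo.log
echo "second entry" >> /tmp/demo.log
cat /tmp/demo.log
# prints both lines, in order
```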

You should probably run each of the scripts manually to ensure they all work as expected.

So there you have it: fully automated, rotating, disk-to-disk remote backup on a G600. Feel free to adapt it to your needs.

My next job is to figure out how, and whether, this box can be used as an SVN and Trac server.


#2 2007-10-30 22:45:44

dannystaple
Member
Registered: 2007-08-29
Posts: 9

Re: Automated Server Backup Scripts

I have a correction to make to this: the weekly script was not working, and I did not notice. You see, these were adapted from bash scripts, and ash has a slightly different syntax for arithmetic operations. You need to wrap them in $(( )); for example, "echo $(( 2 + 2 ))" will display 4.

Also, I had ended up with an extra slash in the path; although that does not seem to affect much, it makes the log files read badly.

So in the second script, at the line which reads:

Code:

working_root="/mnt/HD_a2/backups/sitename/"

Delete the final slash:

Code:

working_root="/mnt/HD_a2/backups/sitename"

And the line which reads:

Code:

let next_num=(${weeknum} + 1)%15

Should become:

Code:

let next_num=$(( (${weeknum} + 1)%15 ))
