After following this great tutorial on backups and snapshots for the DNS-323, I've struggled for the past week to get mine to work. The problem is that each time the job runs (whether via cron or manually) it makes a complete copy of all the files in my source directory instead of only copying changes and hard-linking unchanged files. I've read many posts on this forum about adding the PATH for wc and removing the if/fi statement, and nothing seems to work.
I am running FW 1.07 and ffp 0.5. To save on test time I have scaled my script back to back up a small 400 MB directory instead of my whole 400 GB drive; the first full backup of 400 GB took over 24 hours.
Thanks for any help or guidance you can give me. Below is the script.
snapshot.sh
#! /bin/sh
# Source trees to back up and the snapshot destination.
srcpath='/mnt/HD_a2/stuff /mnt/HD_a2/mm'
dstpath=/mnt/HD_b2/backup
ffppath=/ffp
date=`date "+%Y%m%d_%H%M%S"`
mkdir $dstpath/$date
# Files unchanged since the "current" snapshot should be hard-linked, not copied.
$ffppath/bin/rsync -aivx --link-dest=$dstpath/current $srcpath $dstpath/$date > $ffppath/log/snapshot-$date.log 2>&1
# Repoint the "current" symlink at the snapshot just taken.
rm $dstpath/current
ln -s $date $dstpath/current
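A nightly cron entry for it would look something like this (assuming the script is saved as /ffp/bin/snapshot.sh; adjust the path and time to taste):

30 3 * * * /ffp/bin/snapshot.sh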
I think you have just one small mistake, in the "ln" statement at the end:
ln -s $dstpath/$date $dstpath/current
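Either way, you can check whether the hard links are actually being created by comparing inode numbers across two snapshots; if the first column of the listing matches, the two entries share the same data blocks (the snapshot and file names below are placeholders):

ls -li /mnt/HD_b2/backup/<date1>/stuff/somefile
ls -li /mnt/HD_b2/backup/<date2>/stuff/somefile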
There is a nice utility called "ccollect" that does all this and much more (garbage collection, keeping hourly/daily/weekly/monthly backups, etc.). It is available as an ipkg (ipkg install ccollect) or from ccollect's website (http://www.nico.schottelius.org/software/ccollect/).
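From memory, a minimal ccollect setup looks roughly like this -- treat it as a sketch and double-check the file layout against the documentation on the site above:

mkdir -p /etc/ccollect/defaults/intervals /etc/ccollect/sources/stuff
echo 7 > /etc/ccollect/defaults/intervals/daily    # keep 7 daily snapshots
echo /mnt/HD_a2/stuff > /etc/ccollect/sources/stuff/source
ln -s /mnt/HD_b2/backup /etc/ccollect/sources/stuff/destination
ccollect.sh daily stuff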
-Patrick
I suggest looking into rsnapshot, a Perl wrapper around rsync. Nice tool.
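A minimal rsnapshot.conf for this setup might look like the following; the fields must be separated by tabs, the paths are just examples, and older versions spell "retain" as "interval":

config_version	1.2
snapshot_root	/mnt/HD_b2/backup/
retain	daily	7
retain	weekly	4
backup	/mnt/HD_a2/stuff/	localhost/

You would then run "rsnapshot daily" (and "rsnapshot weekly") from cron.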
Hey guys,
Thanks again for providing such great info. It turns out everything was working as it was supposed to; the real problem was my lack of understanding of how rsync and hard links work. I was checking the folder sizes from Windows after the backups ran. The numbers did not add up because Samba and Windows don't differentiate hard links from the actual files, so I appeared to have 1.5 TB worth of data on a 750 GB drive.
When I did a du in Linux it made more sense:
root@NAS:/mnt/HD_b2/backup# du -s -h -d 1
419.5G  ./20091217_131001
1.4G    ./20091218_173007
420.9G  .
So this makes a lot more sense: between the backups I deleted a large video file and renamed another (hence the 1.4G difference).
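You can see the same effect on a small scale with a throwaway test (the paths are just scratch examples):

mkdir -p /tmp/test/a /tmp/test/b
dd if=/dev/zero of=/tmp/test/a/big bs=1M count=10
ln /tmp/test/a/big /tmp/test/b/big    # hard link: no extra data blocks
du -sh /tmp/test/a    # ~10M
du -sh /tmp/test/b    # ~10M on its own...
du -sh /tmp/test      # ...but both directories together are still ~10M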
Happy Holidays,
Jen
Hard links are misleading when it comes to disk space, even under Linux.
I wrote a small script (attached) to compute the real disk usage, ignoring hard-linked files unless all of their links are under the specified directory.
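In rough outline it does something like this -- a sketch, assuming GNU find is available for -printf: count each inode once, and only add a file's size when every one of its hard links lives under the directory being measured:

#! /bin/sh
dir=${1:-.}
find "$dir" -type f -printf '%i %n %s\n' | awk '
    { seen[$1]++; links[$1] = $2; size[$1] = $3 }
    END {
        for (i in seen)
            if (seen[i] == links[i])   # every link is inside the directory
                total += size[i]
        printf "%.1f MB\n", total / 1024 / 1024
    }'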
-Patrick