Chris (Reddog) on 2 Dec 2009 09:12:00 -0800



Re: [PLUG] Self-hosted online backups?


Take a look at Backup Manager ( http://www.backup-manager.org/about/ );
it's what we use at work. It's highly configurable: you pick the
compression, what's backed up, and how it's backed up, all through a
very straightforward config file. Very easy to use, unless you're
looking for a GUI approach.
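
For a taste, the config file is just shell variables that get sourced;
a minimal sketch (variable names from memory, so double-check the
sample backup-manager.conf that ships with it):

# /etc/backup-manager.conf
export BM_REPOSITORY_ROOT="/var/archives"   # where archives are written
export BM_ARCHIVE_METHOD="tarball"          # how it's backed up
export BM_TARBALL_FILETYPE="tar.gz"         # you pick the compression
export BM_TARBALL_DIRECTORIES="/etc /home"  # what's backed up
export BM_ARCHIVE_TTL="5"                   # days to keep old archives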

Chris Callie
------------------------
Mail: reddog176@gmail.com
Web: http://www.google.com/profiles/reddog176



On Wed, Dec 2, 2009 at 11:32 AM, Richard Freeman
<r-plug@thefreemanclan.net> wrote:
> JP Vossen wrote:
>> Obviously, I could script something using tar, GPG, rsync, and/or other
>> tools, but I can't be the only person out there who wants this, and why
>> reinvent the wheel?
>>
>
> At various points I've used a few different approaches.
>
> I don't have servers at multiple geographic locations, which limits my
> options if I want offsite backup.
>
> Right now I'm using this cron script and AWS:
>
> #!/bin/sh
> # run weekly backup
> BACKDIR='/sstorage3/sarab-back'
> UPDIR='/sstorage3/sarab-up'
> BUCKET='s3://<bucket-name>'
>
> <insert command here to export your databases to a file in the backup path>
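> # e.g. for MySQL (hypothetical path, adapt to whatever sarab archives):
> # mysqldump --all-databases | gzip > /sstorage3/db-dump.sql.gz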
>
> # run the actual backup at low CPU and IO priority
> ionice -n 7 nice -n 20 /usr/bin/sarab.sh
>
> # pick the most recently written backup set
> cd "$BACKDIR" || exit
> CURBACK=`ls --sort=time -1 | head -n 1`
> cd "$BACKDIR/$CURBACK" || exit
> rm -rf "$UPDIR/$CURBACK"
> mkdir "$UPDIR/$CURBACK"
>
> # encrypt each file in the set to the upload dir with public key <keyid>
> for encfile in * ; do nice -n 20 gpg -o "$UPDIR/$CURBACK/$encfile" -r <keyid> --encrypt "$encfile" ; done
>
> cd "$UPDIR"
>
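> # Upload first without deleting anything remote; only run the pruning
> # pass once the upload has succeeded, so a failed sync never deletes
> # the remote copies.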
> ionice -c 3 nice -n 20 s3cmd --no-delete-removed sync . "$BUCKET/" || exit
> ionice -c 3 nice -n 20 s3cmd --delete-removed sync . "$BUCKET/"
>
> So, sarab is doing most of the work, and then I encrypt the output and
> ship it to S3.  Since everything is encrypted before it leaves my
> machine, I don't have to trust Amazon with the contents.  All the
> tools in this script can be found via google and are available in
> Gentoo portage.
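>
> Restoring is just the reverse; a minimal sketch (the backup-set name
> is a placeholder, and check that your s3cmd version supports a
> recursive get):
>
> # pull one backup set back down and decrypt each file
> s3cmd get --recursive "$BUCKET/<backup-set>/" restore/
> cd restore
> for encfile in * ; do gpg -o "plain-$encfile" --decrypt "$encfile" ; done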
>
> Previously I used the backuppc package, which might work for you.  It
> runs on a server and connects to a series of hosts (with rules defined
> at various levels, from defaults down to per-host), and the backups
> are analyzed and stored sparsely using hard links.  To get it to work
> you'd probably need a VPN connection to all of your hosts.
>
> Backuppc also has the advantage of a nice web-based console, and you
> can even give users access to do their own restores (either by
> downloading any saved version of a file via the web interface, or by
> having the server put the file back on the host it came from).
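>
> BackupPC's pooling is its own implementation, but the underlying
> hard-link idea is the same one rsync exposes via --link-dest; a
> minimal sketch with hypothetical paths:
>
> # unchanged files become hard links into the previous snapshot,
> # so each identical file is stored on disk only once
> rsync -a --link-dest=/backups/host1/2009-12-01 \
>     host1:/home/ /backups/host1/2009-12-02/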
>
> As others have indicated, there are a ton of other solutions as well.
> The S3-based solution is pretty cheap if you aren't imaging your
> entire OS and don't need to back up multiple systems with
> de-duplicated storage.  I pay a few dollars a month for it, and that
> is with all of my photos being protected (though I burn them to DVD as
> they accumulate, store the discs offsite, and exclude them from
> backups as that happens, which greatly reduces my S3 bandwidth and
> space use).
___________________________________________________________________________
Philadelphia Linux Users Group         --        http://www.phillylinux.org
Announcements - http://lists.phillylinux.org/mailman/listinfo/plug-announce
General Discussion  --   http://lists.phillylinux.org/mailman/listinfo/plug