Rich Freeman on 9 Mar 2014 11:52:27 -0700



Re: [PLUG] Encrypting Sensitive Personal Information In the Cloud?


On Sun, Mar 9, 2014 at 1:32 PM, Louis Kratz <louis.kratz@gmail.com> wrote:
> Does anyone do this now, and if so, what kind of services/encryption
> software do you use? I'd like something with version control, if possible,
> so I can add new year's data without the risk of corrupting old ones. I am
> looking for something a little more user-friendly than gpging a file and
> dumping it on s3.

Honestly, I use a script to encrypt everything with gpg and upload it
to s3, and I have a lifecycle rule on the s3 bucket to move it to glacier.
That is for stuff that I want to keep a backup of in general, not for
taxes in particular.  I'm not worried about Amazon losing it since in
this case I have a local copy as well - there is only a disaster if my
house and Amazon's datacenter have a fire on the same day.
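The script itself is nothing fancy - roughly along these lines (the key
id, bucket, and paths here are just placeholders, and the glacier part
is a lifecycle rule you configure on the bucket separately, via the s3
console or the aws cli):

    #!/bin/bash
    # Rough sketch: tar a directory, gpg-encrypt it, push it to s3.
    # Assumes the gpg public key is already imported and awscli is configured.
    KEYID="0xDEADBEEF"                 # placeholder gpg key id
    BUCKET="s3://my-backup-bucket"     # placeholder bucket
    SRC="/home/rich/important"         # placeholder source directory
    STAMP=$(date +%Y%m%d)
    OUT="/tmp/backup-$STAMP.tar.gz.gpg"

    # tar + compress the directory and encrypt it to the backup key in one pass
    tar czf - "$SRC" | gpg --encrypt --recipient "$KEYID" --output "$OUT"

    # upload the encrypted archive; a lifecycle rule on the bucket then
    # transitions objects under backup/ to glacier after some number of days
    aws s3 cp "$OUT" "$BUCKET/backup/" && rm "$OUT"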

Some thoughts regarding a few ideas that came up on this list:
1.  An encrypted filesystem using LUKS is convenient, but doesn't
actually move anything offsite.  You'll still want some kind of
offsite encrypted backup solution on top of that if you care about
security.  I might suggest duplicity, which can gpg encrypt and dump
files on s3 automatically - I'm doing that for my home backups (there
is a sketch of what a run looks like after these two points).

2.  A risk with an encrypted filesystem using LUKS is that recovery
may be more difficult if something does go wrong.  LUKS is just a
block-level solution, so I don't think that is a huge risk, but if the
encryption layer itself is damaged (a corrupted LUKS header, say) you
may be hosed.  If this is just a local backup and you can verify the
data was written cleanly before moving it offsite, that isn't a
problem.  If you're going to work directly on a LUKS drive, that is
just one more reason to have a backup somewhere else.
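To give a sense of what I mean about duplicity doing the gpg and s3
work for you, a backup run looks roughly like this (placeholder key id
and bucket; the exact s3 url form depends on your duplicity/boto
versions, so check the man page):

    # credentials for the s3 backend go in the environment
    export AWS_ACCESS_KEY_ID="..."
    export AWS_SECRET_ACCESS_KEY="..."

    # encrypt to a gpg key and back up a directory straight to s3;
    # subsequent runs against the same target are incremental
    duplicity --encrypt-key 0xDEADBEEF \
        /home/rich/important s3+http://my-backup-bucket/important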

Honestly, you might want to look at duplicity just as an archiving
solution - run it once on a directory and point it at s3 and you'll
get an encrypted backup.  I think it is indexed so that you can do
partial restores.  However, I don't think it can do partial transfers
within an individual file, so if you need one file in the middle of a
10GB backup image file you may have to download 10GB of data from
Amazon, which costs money (especially if using Glacier).  I don't know
how partial restores work across file splits - if you have 10GB worth
of 2MB backup files I don't know if it is smart enough to identify
just the files it needs and download only those.  For me the use case is
catastrophic loss of all my home backups, so I'll end up restoring
everything I have on Glacier anyway.
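If you want to experiment with the restore side, it is roughly this
(same placeholder bucket, and the file path is made up; --file-to-restore
is the partial-restore option, though as I said I haven't checked how
much it actually downloads to satisfy it):

    # full restore of the most recent backup into a new directory
    duplicity restore s3+http://my-backup-bucket/important /tmp/restored

    # pull back a single file, named by its path relative to the backup root
    duplicity restore --file-to-restore taxes/2013.pdf \
        s3+http://my-backup-bucket/important /tmp/restored-file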

Rich
___________________________________________________________________________
Philadelphia Linux Users Group         --        http://www.phillylinux.org
Announcements - http://lists.phillylinux.org/mailman/listinfo/plug-announce
General Discussion  --   http://lists.phillylinux.org/mailman/listinfo/plug