
Large remote backup, Best practice?

From:
Petter Gunnerud
Date:
2014-12-07 @ 03:24
Summary:

What is the preferred way to do offsite backup using attic, and how would 
you do the initial (large) backup?
- Options: archive over NFS, a local archive rsynced to a remote server, or
SSH? Any other options?


Details:

I'm starting to look at attic for offsite backup of virtual machine disk 
files. This will make the initial backup quite large.

The source files are sparse.

The virtual machines are powered off while the backup is performed.
Encryption is required.


My initial idea was to mount a USB disk at sourcepc:/mnt/nfs, create an
attic archive of the source files in sourcepc:/mnt/nfs/attic, then bring
the drive to the backup PC, copy all files into backuppc:/backup/attic,
and export backuppc:/backup over NFS (over IPsec).
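
In attic terms I imagine that looks roughly like this (the device name, the
VM image path and the mount point on the backup PC are made up for
illustration; the repo paths are from my setup above):

$ mount /dev/sdX1 /mnt/nfs                          # the USB disk
$ attic init --encryption=passphrase /mnt/nfs/attic
$ attic create /mnt/nfs/attic::initial /vm/images   # /vm/images is hypothetical
  ... carry the drive to the backup PC, mount it, then ...
$ cp -a /mnt/usb/attic /backup/attic                # mount point hypothetical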

This kind of works, except that updating the archive over NFS seems to
require quite a lot of data transfer. About 5GB of data changed on the
source files between the initial backup and the update, yet backing up the
changes has so far caused 7GB of data transfer. According to the verbose
output it has only looked at about 30% of the image files, and disk usage
on the backup PC has increased by less than 500MB.


So what's the best way for large remote backups?
- NFS seems not so good, judging from the experience above. I read in the
mailing list archive that some had experienced issues with attic on NFS
storage. (The mail didn't say what the issue was.)

- In another mail someone suggested running a local attic archive and
rsyncing it to a remote server. This would also make it possible to do the
initial backup using a USB drive. But there is an index.xxxx file that is
3.4GB, and both its content and its name change frequently; rsync isn't
very suitable for a file that large, and since the name changes, it can't
delta against the previous copy anyway.
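
For concreteness, what I had in mind there was something like this (paths
are hypothetical, and the index problem above would still apply):

$ attic create /backup/attic::$(date +%Y-%m-%d) /vm/images
$ rsync -a --delete /backup/attic/ backuppc:/backup/attic/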


- Or maybe SSH would be the preferred way. But how would I go about making
the initial backup using a USB disk when the destination is SSH? Could I
convert the current NFS archive to an SSH archive?

Re: [attic] Large remote backup, Best practice?

From:
Wayne Scott
Date:
2014-12-07 @ 12:28
Just create the archive locally and then move it to the remote machine,
via rsync/NFS/USB disk, whatever.

Then change future backups to use SSH to the remote machine talking to
attic on that machine.  It appears to identify archives by a unique
identifier and that number doesn't change when the archive gets moved.

I just did a quick test locally, and no data was moved over the network on
the second backup of the same data after I moved the archive.
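
In other words, something like this (paths made up for illustration):

$ attic create /backup/repo.attic::first /data              # seed locally
  ... move /backup/repo.attic to the remote machine ...
$ attic create user@remote:/backup/repo.attic::second /data # later runs over SSH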

-Wayne


Re: [attic] Large remote backup, Best practice?

From:
Petter Gunnerud
Date:
2014-12-07 @ 14:43
Thanks. Switching to SSH seems to be working. I was expecting the cache
and key to be bound to the mount point; it turns out they aren't.

I aborted the NFS update after 13 hours. By then 35GB had been transferred
for the 5GB change. The SSH run has now been going for one hour and has
caught up with where the NFS run was aborted, with only 36MB transferred
so far. I hope that indicates it will transfer less data, not just
transfer it more slowly.

Along the way I noticed an issue with the documentation. The docs say:
$ attic init --encryption=passphrase user@hostname:/arch.attic

When using a custom port I'd expect the following syntax:
$ attic init --encryption=passphrase user@hostname:9999/arch.attic
But this syntax keeps trying the default port.

I discovered that this syntax worked as expected:
$ attic init --encryption=passphrase ssh://user@hostname:9999/arch.attic
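
Subsequent backups accept the same URL form, e.g. (archive name and source
path made up):

$ attic create ssh://user@hostname:9999/arch.attic::2014-12-07 /vm/images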




