
Managing large backups over a network

JMCC
Level 2

Our current backup scheme is to perform a full backup on weekends and incremental backups during the week. I would like to store these backups off-site, but the full backups are several terabytes in size, and transferring them over the network every week would take too long. I thought I could perform one initial full backup and then just top it off with small incremental backups thereafter. However, the documentation warns against this: in the event that we had to restore everything, we would have to restore the full backup and then every single incremental backup after it, in order. For a disaster-recovery backup, that would get unwieldy very quickly.

Is there some sort of backup scheme that's more bandwidth efficient than performing full backups every week?

5 Replies

pkh
Moderator
Yes. The recommended way is to have media servers with the dedup option enabled on both ends. You back up to the local dedup folder and then duplicate the backup set to the dedup folder on the other end. This is called optimized duplication, and only the new data blocks are sent over the link. In addition to the dedup option, you will need CASO (the Central Admin Server Option), which is part of the ESO license.
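To illustrate the idea (this is a conceptual sketch, not Backup Exec's actual wire protocol): the remote dedup folder keeps fingerprints of every block it already stores, so a duplication job only has to ship blocks whose fingerprints the remote side has not seen before. A minimal Python sketch, with a made-up block size and an in-memory set standing in for the remote dedup store:

```python
import hashlib

BLOCK_SIZE = 64 * 1024  # hypothetical fixed block size; real dedup engines vary


def block_hashes(data: bytes):
    """Split a backup image into fixed-size blocks and fingerprint each one."""
    return [hashlib.sha256(data[i:i + BLOCK_SIZE]).hexdigest()
            for i in range(0, len(data), BLOCK_SIZE)]


def optimized_duplicate(local_backup: bytes, remote_store: set) -> int:
    """Send only the blocks the remote dedup store has not seen; return how many."""
    sent = 0
    for h in block_hashes(local_backup):
        if h not in remote_store:
            remote_store.add(h)  # stands in for transferring the block
            sent += 1
    return sent
```

The first duplication seeds the remote store with every block; duplicating a later backup that shares most of its blocks only sends the few that changed, which is where the bandwidth saving comes from.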

lmosla
Level 6

In addition to pkh's answer, here is some more information on Backup Exec Optimized Duplication: http://www.slideshare.net/symantec/white-paper-46592309

JMCC
Level 2

Thanks. This seems like a great starting point.

JMCC
Level 2

So, if I'm understanding you correctly, we will perform a local backup and then copy the changes to that backup (via deduplication) to a remote copy over the network? And we can just do deduplication copies from that point forward without worrying about negatively affecting our ability to do full restores?

Sorry if I didn't follow you 100%, but it sounds like deduplication will allow me to accomplish what I'm after. I'll research that option. Thanks for the reply.


EDIT: Also, you're saying the remote copy of the backup has to go to a server that is also running Backup Exec? In other words, I can't just copy to some random server space; it has to be a second server running its own copy of Backup Exec, and I have to use the Central Admin Server Option so my local server can properly run the deduplication copy with the remote server.

pkh
Moderator

When you use optimised duplication, the entire backup set is duplicated to the dedup folder on the other media server.  However, since the backup set is dedup'ed, only the new data blocks are sent across the link.  The data blocks which already exist in the other dedup folder will not be sent over the link, thus saving bandwidth.  You can even throttle the bandwidth for the optimised duplication process if need be.

Using optimised duplication does not mean that you can do an incremental forever backup.  You would still need to do periodic full backups, but when you do optimised duplication on these full backups, only the new data blocks are sent over the link.
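Some back-of-the-envelope math shows why this still helps even with weekly fulls. The numbers below are purely illustrative assumptions (a 4 TB full and a 2% weekly block-change rate are made up, not from this thread): the first full seeds the remote dedup folder, and each later full only ships the blocks that changed since the previous one.

```python
def weekly_transfer_tb(full_size_tb: float, weekly_change_rate: float, weeks: int):
    """Rough estimate: raw weekly fulls vs optimised-duplication fulls.

    Assumes the first week transfers the entire full backup to seed the
    remote dedup folder, and each later full shares all but
    `weekly_change_rate` of its blocks with the previous one.
    """
    raw = full_size_tb * weeks
    deduped = full_size_tb + full_size_tb * weekly_change_rate * (weeks - 1)
    return raw, deduped
```

With a 4 TB full, 2% weekly change, and 12 weeks, shipping raw fulls would move 48 TB, while optimised duplication would move roughly 4 + 4 × 0.02 × 11 ≈ 4.9 TB, yet each week's duplicated backup set restores as a single full, with no incremental chain to replay.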

Yes, that is correct. The other server needs to be running BE so that you can define a dedup folder at the other site. You can only have one dedup folder per media server. CASO allows you to share the dedup folder with other media servers.