Backup Exec 2010: Offsite Backup

Natanael_Berca
Level 2
Partner Accredited

Everyone,

Please help, I have a case as below:

- My customer has file servers containing around 40 GB of important data at several offsite locations, with 384 Kbps of bandwidth each

- Currently they are doing tape backups at each location

- In order to reduce the operating expense of transporting the backup tapes to HQ, they want to use BE 2010.

My questions:

- Is it possible for the customer to do a full backup to tape offsite (let's say every week) and then an incremental backup to HQ via WAN every day, because of the bandwidth limitation?

- What are the licensing options?

- Any other opinion on how to solve this case? (e.g. using dedupe or a deduplicated set copy)
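For context on the bandwidth constraint mentioned above, a quick back-of-the-envelope calculation (a sketch assuming the full 40 GB had to cross the 384 Kbps link, with no protocol overhead or retransmits) shows why a full backup over the WAN is impractical and a local full plus WAN incrementals is the natural split:

```python
# Rough transfer-time estimate: 40 GB over a 384 Kbps WAN link.
# Assumes ideal conditions (no protocol overhead, no retransmits),
# so real-world times would be even longer.
data_bytes = 40 * 10**9          # 40 GB of file-server data
link_bps = 384 * 10**3           # 384 Kbps link speed

seconds = data_bytes * 8 / link_bps
days = seconds / 86400
print(f"Full backup over WAN: ~{days:.1f} days")
```

At roughly nine and a half days per full backup, weekly fulls over this link are clearly out of the question.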

 

6 REPLIES

teiva-boy
Level 6

Dedupe can probably do it.  You'll want to take the tapes to HQ and duplicate each of them into the dedupe folder.  That way BE is aware of what you are trying to back up, and will compare it against what you've just duplicated into the dedupe store.

 

Then setup client side dedupe on the remote locations, and back them up over the WAN.

 

Natanael_Berca
Level 2
Partner Accredited

Hi teiva-boy, thanks....

But the problem is that the file server is currently Windows 2003 32-bit, so it is not supported for client dedupe. Could we propose another media server at the remote location, just for dedupe purposes?

What about the incremental option? Can we use the same method: bring the tape to HQ, then continue the backups with incrementals?

 

teiva-boy
Level 6

A 32-bit client can do client-side dedupe.  A 32-bit server cannot be a media server with dedupe enabled.

So your file server, though 32-bit, can perform client-side dedupe just fine.

You'll need a 64-bit server as your central backup server with BE 2010 R2 installed.

From there, you can still do your normal full/incremental strategy as before.

Hywel_Mallett
Level 6
Certified

As teiva-boy says, dedupe might help you in this, but the thing that would concern me is how much of your data is changing each day. You don't have a whole lot of bandwidth, and if you've got a fair amount of data changing each day, you might be limited by the speed of your site links.

If you can get an idea of how much data is changing each day, then that will help you to determine if backup to HQ is feasible.
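To make that concrete, here is a small sketch of how long a nightly incremental would take at 384 Kbps. The daily change rates and the 10:1 dedupe ratio are illustrative assumptions, not measured values; plug in the real change rate once you've measured it:

```python
# Nightly incremental feasibility at 384 Kbps, for a few assumed
# daily change rates, with and without an assumed 10:1 dedupe ratio.
LINK_BPS = 384 * 10**3

def transfer_hours(changed_gb: float, dedupe_ratio: float = 1.0) -> float:
    """Hours needed to push `changed_gb` of changed data after dedupe."""
    bits = changed_gb * 10**9 * 8 / dedupe_ratio
    return bits / LINK_BPS / 3600

for changed_gb in (0.5, 1.0, 2.0):
    raw = transfer_hours(changed_gb)
    deduped = transfer_hours(changed_gb, dedupe_ratio=10.0)
    print(f"{changed_gb} GB/day: {raw:.1f} h raw, {deduped:.1f} h deduped")
```

Even 1 GB of changed data per day takes close to six hours raw, so the nightly window only works if the change rate stays low or dedupe/compression cuts the transfer down substantially.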

Ken_Putnam
Level 6

Another thing to remember -

The RAWS is notoriously fragile when used across WAN/VPN links

teiva-boy
Level 6

Agreed, it is!  However, one can also assume that if you are doing source dedupe and get a 10X reduction in data sent over the wire, you are also 10X less likely to have an interruption in the transfer...  Or something like that..

That said, bandwidth is not your problem; it never is.  It's the latency and link quality.  Dropped packets and high latency will cause a job to fail or run longer, even on a 1000 Mb link.

The best you can do is optimize the data transfer as much as possible, e.g. dedupe on the client, turn on compression in pd.conf, and set up a retry in case the job fails.
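For reference, in the PureDisk-based dedupe engine the client-side settings live in pd.conf. A minimal sketch of the compression setting mentioned above (the file location and exact parameter names vary by version, so verify them against the BE 2010 R2 Deduplication Option documentation before relying on this):

```ini
# pd.conf fragment (illustrative; verify parameter names for your version).
# Compressing backup data on the client before it crosses the WAN trades
# CPU on the file server for less traffic on the 384 Kbps link.
COMPRESSION = 1
```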