11-04-2010 07:49 AM
Please help to solve.
We have:
A tape library, connected through a SAN, is used for data backup. The network has low bandwidth and the amount of backup data is huge.
Question: which Backup Exec options can help reduce backup time over these poor network connections?
I was thinking of the Deduplication option, used on the host. What do you think?
UPDATE:
Found additional answers in BE 2010 R2 Guide. Available here: http://www.symantec.com/docs/DOC2211
11-04-2010 09:16 AM
Which part of the network is low-bandwidth? All of it (everything plugged into a 10 base-T hub), or are we talking about things spread between sites?
You want to either reduce the amount of data you're sending over the network, or move (a copy of) the data closer to the backup server.
Deduplication should reduce the amount of data being sent over the network, but the results depend on how much data is duplicated and how much unique content is being generated. As I don't use de-dupe, I'm not sure what sort of data you can apply it to.
What sort of bandwidth, and what sort of quantity of data are we talking about?
11-04-2010 09:33 AM
Client-side deduplication should reduce the amount of data transferred over the network, that's right. But as Hywel said, it depends heavily on your environment and where exactly the bottleneck is. For example, on a high-latency network even deduplication may not speed up the backup significantly.
General Ideas:
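To illustrate the deduplication principle discussed above, here is a toy Python sketch (not how Backup Exec's dedupe actually works internally; the fixed 4 KiB chunk size is an assumption for illustration). The point is that only unique chunks ever need to cross the wire, while the full stream can be rebuilt from a small manifest of chunk hashes:

```python
import hashlib

def dedupe_chunks(data: bytes, chunk_size: int = 4096):
    """Split data into fixed-size chunks and keep only the unique ones.

    Returns (store, manifest): store maps hash -> chunk (this is all that
    would need to cross the network), and manifest is the ordered list of
    hashes needed to reassemble the original stream.
    """
    store = {}     # unique chunks, keyed by content hash
    manifest = []  # ordered references; tiny compared to the data itself
    for i in range(0, len(data), chunk_size):
        chunk = data[i:i + chunk_size]
        digest = hashlib.sha256(chunk).hexdigest()
        if digest not in store:
            store[digest] = chunk
        manifest.append(digest)
    return store, manifest

# Highly repetitive data: 100 chunks, but only 1 unique chunk to transfer.
store, manifest = dedupe_chunks(b"A" * 4096 * 100)
print(len(manifest), len(store))  # 100 1
```

Real products use variable-size, content-defined chunking and persist the chunk store across backup runs, which is where the big savings on repeated full backups come from.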
11-05-2010 01:21 AM
The Shared Storage Option means you can access the storage/tape library you are using as a backup target from different BE media servers. Again, it depends on your network environment, but if general network throughput is the problem, this won't help you much.
11-05-2010 02:35 AM
Hywel, Simon, thank you.
I'm analyzing the environment and will be back with answers shortly.
I have an additional question: under these circumstances, would it be effective to implement the Backup Exec SAN Shared Storage Option?
11-05-2010 03:27 AM
So, when I have one tape library with low throughput, then no matter how many servers perform the backup, the bottleneck will remain the network capacity? But what if I upgrade the hardware and make the network quicker: will the Shared Storage Option then increase backup effectiveness dramatically?
And the last question: can BE perform a backup directly to the tape library, without sending the data through the media server? (weird, I know)
11-05-2010 04:45 AM
You have a fiber-attached tape library, so that's one of the prerequisites for SSO.
If you want to use SSO on the fileserver, Exchange server and SQL server, then you'll need each to be a Backup Exec media server with a SSO license for each, so there's a reasonable cost involved.
When you say your network is slow: a gigabit network isn't that slow, it's just that you have fairly large amounts of data to move across it.
I suppose the next question is "where is the bottleneck?". With a gigabit network, I'm wondering if the network is saturated while you back up. Are you running multiple simultaneous jobs? (You don't say how many drives you have in your autoloader.)
Personally, given your data volumes, I'm now thinking that deduplication for the file data may be the way to go. A dedupe license is cheaper than 3 SSO licenses too!
11-05-2010 04:54 AM
Hywel,
You're right, the bottleneck isn't the gigabit network itself. It's the large amount of data, and that is what makes the backup window so long.
Good point about the license costs for SSO vs. dedupe.
I'll work out the deduplication option. With the biggest amount of data being on a file server, it might work.
11-05-2010 04:58 AM
...what if I change the hardware and make the network quicker - will then Shared storage option increase backup effectiveness dramatically?
With the Shared Storage Option the server attached to your data is a Backup Exec media server, and the network is not used for data transfer. Suppose your tape library and disk array are attached to the server with 4Gb Fiber Channel, then your data should be able to flow at up to 4Gb/s. Whether your tape drive can keep up is another matter...
can BE perform backup directly to tape library omitting sending data to the media server?
Once upon a time (I think it was last seen with Backup Exec 10d) there was a ServerFree Option, which was supposed to do precisely this. I never used it, and I don't know of anyone else who used it either, but it disappeared from the options list.
There is another option available to you, which is the Advanced Disk-based Backup Option. One of the features of this is off-host backup. The requirements for this are quite specific (you need to have transportable snapshots), but in a nutshell what it allows you to do is have your SAN take a snapshot (so this is only valid for the fileserver), then have the Backup Exec media server mount this snapshot and back it up. This means that the backup data doesn't traverse the network. ADBO is also a fairly cheap option.