04-17-2013 09:44 AM
I would like to start running one backup a week of a couple of our servers off site. Is there going to be a problem running a backup through a VPN? Both sites have a cable connection with 100 Mbps down and 10 Mbps up. We are eventually going to change the connection at the server site to fiber with at least 30 Mbps down and up. Thanks.
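For a sense of scale, the 10 Mbps uplink is the constraint here. A quick back-of-envelope sketch in Python, assuming a hypothetical 500 GB data set and roughly 80% effective throughput over the VPN (both numbers are illustrative assumptions, not figures from this thread):

def transfer_time_hours(data_gb, link_mbps, efficiency=0.8):
    """Rough hours to push data_gb over a link_mbps uplink."""
    bits = data_gb * 8 * 1000**3                    # decimal GB -> bits
    seconds = bits / (link_mbps * 10**6 * efficiency)
    return seconds / 3600

print(f"{transfer_time_hours(500, 10):.0f} h")      # ~139 h, nearly six days

Even the planned 30 Mbps fiber would only bring that same seed down to roughly two days, which is why the replies below focus on sending less data rather than on raw bandwidth.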
04-17-2013 10:11 AM
I would recommend you use the deduplication function and enable client-side deduplication. In fact, it will be pretty much required for what you want to do, unless your latency is under 5 ms with zero packet loss.
That said, Backup Exec doesn't like latency and will not tolerate a single dropped packet; the job will fail immediately. If you use deduplication and minimize what is actually sent over the wire, you reduce that risk and the number of failed jobs.
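To make the "minimize what is sent over the wire" point concrete, here is a minimal sketch of the idea behind client-side deduplication: the client fingerprints each block and transmits only blocks the server has not seen before. Backup Exec's real implementation is far more sophisticated than this; the sketch just illustrates the principle, with all names and sizes chosen for the example.

import hashlib, os

server_store = {}          # fingerprint -> block, lives at the remote site

def backup(data: bytes, block_size: int = 4096) -> int:
    """Return the number of bytes actually shipped over the 'wire'."""
    sent = 0
    for i in range(0, len(data), block_size):
        block = data[i:i + block_size]
        fp = hashlib.sha256(block).hexdigest()
        if fp not in server_store:       # only unseen blocks cross the VPN
            server_store[fp] = block
            sent += len(block)
    return sent

payload = os.urandom(1_000_000)
print(backup(payload))                       # first full: ~1 MB sent
print(backup(payload + os.urandom(4096)))    # next full: only the changed tail, ~4.7 KB

Because the second "full" transmits only the blocks that changed, a flaky VPN has far fewer bytes in flight and far fewer chances to kill the job.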
04-17-2013 11:23 AM
I am not familiar with the deduplication process; what is it and how does it work? Also, what do you mean by "minimize what is actually sent over the wire"? If I use the deduplication function, do you think I will have over a 50% success rate? Can you recommend the best method for accomplishing off-site backups?
04-17-2013 11:39 AM
Start here:
http://www.symantec.com/theme.jsp?themeid=backupexec-deduplication
Basically, it's a way of sending dramatically less data over the wire. In doing that, you minimize the failures you may have over poor-quality links.
04-17-2013 01:55 PM
Sorry, I kind of need this spoon-fed to me... Do we need a media server at the other site for deduplication to work? Also, storage is a concern for me: backing up all of our servers for a month takes about 8 TB, and I only have 3 TB that I can dedicate to the remote office. I was only planning on keeping a week's worth of backups off site.
04-17-2013 02:08 PM
So you'll have a Backup Exec server with some disk attached to it as a drive letter. This disk must be decently quick; no USB, please.
You will have licensed the deduplication option, and the RAWS agents for each client you want to back up.
The initial backup will take the longest. You'll get 2:1 compression, possibly more; it's hard to say. But let's say you have 2 TB total to back up. The initial backup will consume up to, but no more than, 2 TB. The next time a FULL is run, it will scan through all 2 TB of data but most likely store only a few hundred MB of new data. Yet it's still a FULL backup. So at the end of a month, you can probably store those 8 TB of backups in just over 2 TB of space.
You will of course still have incrementals or differentials. If it's just a file backup, you could even do synthetic backups for even greater bandwidth savings, though that may require an extra license (it did in older versions, but IMO it should be included!).
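Putting rough numbers on that claim (the ~1% weekly change rate is an assumption for illustration, not a figure from this thread):

seed_tb = 2.0
weekly_change = 0.01        # assumed fraction of blocks that change per week
weekly_fulls = 4            # roughly a month of weekly FULL backups

# the first full stores everything; each later full adds only changed blocks
stored = seed_tb + (weekly_fulls - 1) * seed_tb * weekly_change
logical = seed_tb * weekly_fulls         # the ~8 TB of fulls mentioned above
print(f"{stored:.2f} TB on disk vs {logical:.0f} TB of logical fulls")
# -> 2.06 TB on disk vs 8 TB of logical fulls

At a change rate anywhere near that, the 3 TB available at the remote office should comfortably hold a month of deduplicated fulls, not just the single week originally planned.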