Forum Discussion

Bitzan01
9 years ago

Backup to Cloud Solution

I have been tasked to look at different options on how to remove tape backups, I am currently looking for any advice anyone has or can share with me.

I’m currently only backing up File Services and Exchange DAG

My monthly full backup sets are: File Services 4 TB and Exchange 1 TB.

My boss doesn’t want to do full monthly backups, as the cost would get expensive and compound month over month.

So he was thinking of just running an incremental backup job and keeping it going indefinitely. However, if I ever needed to restore a file from a long time ago, would I have to load all of that data?

Would a reverse incremental job even work with cloud providers? (I’m currently looking at Amazon S3 – Glacier services for my storage provider)

Can anyone recommend a cloud provider, and maybe a different backup solution to help me accomplish this goal?

7 Replies

  • I’m currently looking at Amazon S3 – Glacier services for my storage provider

    Please make sure that whatever cloud solution you pick is on the HCL. You will find more choices with a newer version of BE. There are multiple providers and multiple methods; open the HCL and look under the "Cloud Storage" section.

  • Incremental forever is not usually recommended with Backup Exec - especially with anything that uses GRT.

    Basically, if you lost or corrupted any part of the chain of incrementals all the way back to (and including) the first full, and GRT is involved, you would not be able to restore anything at all. For a non-GRT (basic file system) backup you would still be able to restore some data if you lost or corrupted an earlier set, although the exact effect on your data would of course depend on what was backed up in the missing sets. You would also potentially have to do multiple restores to get as much back as possible, instead of just one.
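    To make the chain dependency concrete, here is a small Python sketch (file names and versions are made up, and this models generic incremental behaviour, not Backup Exec internals) showing why a non-GRT restore can survive a lost incremental while losing only the changes that set contained:

```python
# Illustrative only: models each backup set as a file -> version snapshot.
# A full captures everything; each incremental captures only changed files.

def restore(chain):
    """Replay a chain of backup sets (oldest first), skipping lost sets."""
    state = {}
    for backup_set in chain:
        if backup_set is None:      # set lost or corrupted
            continue
        state.update(backup_set)    # later sets overwrite earlier versions
    return state

full = {"a.doc": "v1", "b.xls": "v1"}
inc1 = {"a.doc": "v2"}              # a.doc changed on day 1
inc2 = {"c.ppt": "v1"}              # c.ppt created on day 2

# Healthy chain: everything comes back at its latest version.
print(restore([full, inc1, inc2]))
# With inc1 lost, a.doc falls back to the older copy from the full,
# but the rest of the data is still restorable (non-GRT behaviour).
print(restore([full, None, inc2]))
```

    A GRT backup is different: the granular restore depends on the whole chain being intact, which is why a single lost set can make everything unrestorable.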

    You could look into synthetic backups. However, this would not help you with the cloud: each synthetic full is generated from the previous synthetic full plus the incremental sets, so you would still be writing the equivalent of a full backup each time. Other than the backup window itself, you might as well just write the full backup into the cloud in the first place.
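    Some back-of-the-envelope arithmetic on that point, using the poster's 5 TB total and an assumed ~1% daily change rate (purely illustrative numbers):

```python
# Rough monthly cloud upload volume, in TB.
FULL_TB = 5.0        # poster's 4 TB file services + 1 TB Exchange
DAILY_INC_TB = 0.05  # assumed ~1% daily change rate (hypothetical)

def monthly_upload_tb(full_uploads_per_month, inc_days):
    """Total TB uploaded per month for a given schedule."""
    return full_uploads_per_month * FULL_TB + inc_days * DAILY_INC_TB

# Classic monthly full + daily incrementals:
classic = monthly_upload_tb(1, 30)        # 6.5 TB/month
# Weekly synthetic fulls: each synthetic is still ~5 TB on the wire,
# so generating it from incrementals saves nothing on cloud upload.
synthetic = monthly_upload_tb(4, 30)      # 21.5 TB/month
print(classic, synthetic)
```

    The takeaway is that a synthetic full costs the same cloud upload as a real full; its benefit is the shorter backup window on the source, not the transfer.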

    I'd suggest you look into exactly how an "incremental forever" style of operation would work with different backup products, and once you understand this, then look at what effect writing into the cloud has on that process.

  • We have a client that runs Backup Exec to a Data Domain appliance from EMC. Deduplication is somewhere around 27:1, so I don't think your boss will need to worry about full backups.

    All I can tell you is that it works. However, you might have to back up to disk before backing up to the Data Domain.

    I don't know the exact costs, but tape is way cheaper.

  • For starters: I fully agree with Colin that incremental forever is a bad idea. You would have to keep all backup data indefinitely and your storage bill will grow accordingly.

    Backing up to the cloud is one thing, restoring another. When rebuilding a system time is of the essence, and your internet connection may well be a limiting factor (and may not even be available when you need it most). Also, whereas sending data to cloud storage is usually free, getting it back is definitely not.
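    To put numbers on the connection being the limiting factor: pulling the poster's 5 TB back over a typical business line takes days, not hours. A quick sketch (the link speed and efficiency factor are assumptions):

```python
def restore_days(data_tb, link_mbps, efficiency=0.8):
    """Days needed to pull data_tb terabytes over a link_mbps line.
    efficiency accounts for protocol overhead and link contention."""
    bits = data_tb * 1e12 * 8
    seconds = bits / (link_mbps * 1e6 * efficiency)
    return seconds / 86400

# 5 TB (the poster's full set) over a 100 Mbit/s line:
print(round(restore_days(5, 100), 1))  # roughly 5.8 days
```

    That is why keeping a local copy for routine restores, and treating the cloud copy as the disaster fallback, makes sense.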

    The following is how I set it up for our company (SMB, 15 people). Our environment consists of an enterprise-class server running Microsoft Hyper-V and 5 production VMs running a mix of Microsoft Windows and Linux. Uncompressed data size is approximately 700 GB.

    Our requirements were

    • Have a local copy of the backup data for fast restore with minimum dependencies (like internet connections)
    • Have a cloud copy of the backup data for fall back in case the local copy is lost
    • Minimum cost
    • Strong encryption (at least in the cloud)
    • Weekly full backups, daily incremental backups

    And this is how I set it up

    • Backup Exec V-Ray edition (initially BE 14, now BE 15)
    • Backup to Disk using an inexpensive RAID5 local storage unit
    • Backup Exec AES encryption (make sure you have a copy of the key in a safe place!)
    • Weekly full backups, daily incremental backups, no deduplication, no GRT
    • When backups are completed, mirror backup data from RAID5 storage to cloud (Syncback Pro task)

    Syncback Pro (from 2BrightSparks) supports a variety of cloud storage services which makes the solution very flexible. You can change cloud storage providers in minutes if you want.
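    For anyone who would rather script the mirror step than use Syncback Pro, the core of a mirror job is comparing the local and remote listings and deciding what to push or prune. A minimal, provider-agnostic sketch of that decision logic (pure Python, no actual cloud calls; the file names and sizes are hypothetical, and real code would also compare checksums or timestamps):

```python
def plan_mirror(local, remote):
    """Decide which files to upload or delete so remote mirrors local.
    local/remote map file name -> size in bytes (a stand-in for a
    real directory/bucket listing)."""
    upload = [name for name, size in local.items()
              if remote.get(name) != size]
    delete = [name for name in remote if name not in local]
    return sorted(upload), sorted(delete)

local = {"full_001.bkf": 1_000_000_000, "inc_002.bkf": 50_000_000}
remote = {"full_001.bkf": 1_000_000_000, "stale.bkf": 10}
print(plan_mirror(local, remote))  # upload inc_002.bkf, delete stale.bkf
```

    A tool like Syncback Pro is doing essentially this, plus the actual transfers, retries, and provider-specific APIs.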

    Your profile tags indicate that you are running BE 2012. Our approach should work for you. Backup Exec 15 has much more extensive cloud support. I have not bothered to explore this, though, as we are very satisfied with our solution.

    A few more words about backing up to the cloud. Total transfer time is not only determined by the amount of data and the bandwidth. Transaction overhead per file can become the dominating factor if each file is stored separately in the cloud (and it can add heavily to the bill too due to transaction fees if you have millions of small files). When I was evaluating solutions, some would fail horribly on cloud transfer time and cloud storage cost. In our solution Backup Exec creates 1 GB backup files (.bkf) on the RAID5 unit and Syncback Pro pushes these files to the cloud storage. The storage bill is kept low because we transfer almost zero data out of the cloud (we do restores from the RAID5 unit).
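    The per-file transaction overhead is easy to quantify. A rough cost sketch in the S3 style (the per-request fee and per-GB-month price below are placeholder values, not current pricing; check your provider):

```python
def upload_cost_usd(file_count, total_gb,
                    put_fee=0.000005,         # assumed $ per PUT request
                    storage_gb_month=0.023):  # assumed $ per GB-month
    """Split monthly cost into request fees and storage fees."""
    requests = file_count * put_fee
    storage = total_gb * storage_gb_month
    return requests, storage

# The same 700 GB pushed as 2 million small files vs ~700 one-GB .bkf files:
small_files = upload_cost_usd(2_000_000, 700)
large_files = upload_cost_usd(700, 700)
print(small_files, large_files)
```

    Storage cost is identical either way; the request fees (and, worse, the per-request latency) are what blow up with millions of small files, which is the reason pushing a few hundred 1 GB .bkf containers works so well.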

    Hope this helps.


  • Whilst I think your real-world scenario is good, I do have a word of caution: replicating/copying the BKF files is also not usually recommended. For the reasons why, see this article:


    Admittedly, as you are on BE 15, the risks of some of the concepts in that article are lessened, primarily because appends into BKF files are no longer allowed, but you do still need to understand where inconsistencies might be introduced between your actual media, the BEDB content, and the catalogs.


    BE 15 does, however, introduce another problem: if you bring back a copy of a BKF file (for a restore) that has already been expired from the server, and you do not disable DLM reclaims while working on the restore, Backup Exec could delete the file before you get a chance to complete the restore.

  • I agree with the word of caution. But please keep in mind that the cloud copy is not intended for regular restores (you will use the local copy for that). The cloud copy is a data source of last resort (e.g. when recovering from a destructive event like a fire). A likely scenario would be a system rebuilt from scratch, backup data brought in, inventoried and cataloged, and then a restore. This of course requires proper planning, but planning for restore is part of planning for backup, so the restore scenario for this case should already have been written and tested.