05-31-2017 04:27 PM
We have a large data set that doesn't change much. What we want, conceptually, is one full backup followed by infinite incrementals. We were told that synthetic backups were the answer. However, what we have found is that each synthetic backup writes an entirely new full backup, which means our required backup storage is roughly 2X our data set just to generate the synthetics.
So, my question is: is there a more efficient way to back up a large data set like this, one that mostly grows through added files with only a small number of modified files?
Thanks in advance.
05-31-2017 07:09 PM
What you can do is archive the existing files on a regular basis: move the files in that folder to another folder, back up that archive folder once, and retain that backup for a long time. In the meantime, run your regular full and incremental backups on the original folder. This way the folder you back up regularly stays small.
Unfortunately, BE no longer has the archive option, so you have to do this manually.
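To illustrate the manual move step, here is a minimal Python sketch. The paths, the 180-day age threshold, and the "last modified time" criterion are all assumptions you would adjust to your environment; the long-retention backup of the archive folder still has to be set up as a separate job in BE.

import shutil
import time
from pathlib import Path

# Hypothetical paths -- adjust to your environment.
SOURCE = Path(r"D:\Data\Active")
ARCHIVE = Path(r"D:\Data\Archive")
AGE_DAYS = 180  # assumed threshold: files untouched this long get archived

cutoff = time.time() - AGE_DAYS * 86400

for f in SOURCE.rglob("*"):
    if f.is_file() and f.stat().st_mtime < cutoff:
        dest = ARCHIVE / f.relative_to(SOURCE)
        dest.parent.mkdir(parents=True, exist_ok=True)
        shutil.move(str(f), str(dest))  # preserves the folder layout under ARCHIVE

Run this on a schedule (Task Scheduler, cron, etc.) before your incremental window so the active folder keeps shrinking as data ages out.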
06-01-2017 12:30 AM
If the data that does not change very often also does not get accessed very often, then you might want to look at Enterprise Vault (a different product suite) and its File System Archiving capability.