Well, I got nowhere a couple of years ago when I tried implementing this feature. It worked for a couple of smaller shares, but the majority of our shares are over 1 TB.
I am in the process of upgrading to BE 2010 and thought it might work better now that I am on a 64-bit machine, but I am still having issues.
I am trying to get the synthetic policy to work on our user home share. It is about 2 TB with about 7 million files. The baseline backup takes days but eventually finishes, though with many errors on directories that couldn't be backed up because they didn't exist. I assumed these were just folders that were present during the initial scan and removed before they actually got backed up, and I was hoping the future incrementals would take care of anything missed. Instead, the incrementals continue to back up almost everything again, and each takes a couple of days to run as well. I let it go for over a week just in case it got itself together, but it did not. I had to put all those jobs on hold and go back to the daily differentials.
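For anyone unfamiliar with the feature being discussed: conceptually, a synthetic full consolidates the baseline catalog with each incremental's changes so a new "full" is produced without rereading the source volume. This toy sketch is purely illustrative of that idea (it is not how Backup Exec implements it internally):

```python
# Toy illustration (NOT Backup Exec internals): a synthetic full
# merges the baseline catalog with each incremental's change set,
# so no data is reread from the source volume.

def synthesize_full(baseline, incrementals):
    """baseline: dict {path: version}; incrementals: list of
    (changed: dict, deleted: set) tuples, oldest first."""
    full = dict(baseline)
    for changed, deleted in incrementals:
        full.update(changed)          # newer file versions win
        for path in deleted:
            full.pop(path, None)      # drop files removed since baseline
    return full

base = {"/home/a/doc.txt": 1, "/home/b/tmp.log": 1}
incs = [({"/home/a/doc.txt": 2}, {"/home/b/tmp.log"})]
print(synthesize_full(base, incs))   # {'/home/a/doc.txt': 2}
```

The point of the illustration: if the incrementals are effectively re-backing-up everything (as described above), the "merge" step degenerates into another full pass, which is why the jobs keep taking days.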
I am backing up everything to Data Domain boxes.
Does anyone have any suggestions, or know of any limits on how much data the synthetic and True Image backups can handle? Honestly, I am more interested in the True Image restores than the synthetic backups, but both would be nice since we paid for the add-on.
I think this relates to the quantity of files being over 7 million, not the size of the data being protected. As such, other than splitting the selection lists into multiple jobs with far fewer files per job, I am not sure we have a solution that will be effective for that number of files (with Backup Exec).
You really want to consider a more enterprise-grade, reliable solution than Backup Exec in this case. Here is what's out there that is known to work in these high-file-count situations:
NetBackup with an Enterprise Client
NetWorker with SnapImage
Storage Foundation FlashSnap
Sorry, I can't give an exact byte count, since it partly depends on environmental performance, although I believe that 7-10 million files is well above any level that would respond with acceptable speeds.
Well, we are also using a Data Domain box, and that is what caught my attention in your post.
For the simple sake of the amount of time it takes, I created four selection lists: 1-F, G-L, M-R, and S-Z. I checked the first two lists: one is doing 2.5 million files while the other is 1.3 million. Your issue is not with BE.
I have had the NetBackup argument with people for years, and I can simply make BE do the same job for a lot less money and effort.
So, first and foremost, split your backup selection lists down to something you can manage.
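In Backup Exec the split itself is done in the GUI, but here is a hypothetical planning sketch of the idea: bucket the top-level home directories into alphabetical ranges so each selection list stays at a manageable file count. The ranges and directory names are illustrative, not anything BE generates:

```python
# Hypothetical planning helper (not a Backup Exec tool): group
# top-level home directories into alphabetical ranges, one range
# per selection list, similar to the split described above.

RANGES = [("A", "F"), ("G", "L"), ("M", "R"), ("S", "Z")]

def bucket(dirnames):
    lists = {f"{lo}-{hi}": [] for lo, hi in RANGES}
    for name in sorted(dirnames):
        first = name[:1].upper()
        for lo, hi in RANGES:
            if lo <= first <= hi:
                lists[f"{lo}-{hi}"].append(name)
                break
        else:
            # digits and other characters: lump with the first list
            lists["A-F"].append(name)
    return lists

print(bucket(["alice", "greg", "maria", "zoe", "bob"]))
# {'A-F': ['alice', 'bob'], 'G-L': ['greg'], 'M-R': ['maria'], 'S-Z': ['zoe']}
```

If one range still carries millions of files (like the 2.5 million example above), split that range again; the goal is per-job file counts the catalog can chew through inside your backup window.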
Second, do you have the Symantec dedupe option installed? You need it for the Data Domain.
Third, have patience with the Data Domain box! I had to let it run backup jobs for 2-3 weeks before I started understanding how wonderful this thing really is.
Fourth, look at your logs and make sure it is not tmp files or something of that nature disappearing on you. I filter mine out with the Global Excludes selection list (Edit, Manage Selection Lists, Excludes).
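To show what that global exclude accomplishes, here is a small sketch that filters volatile files out of a selection before backup, so files that vanish mid-job don't generate "directory does not exist" errors. The patterns are illustrative examples, not BE's actual exclude syntax:

```python
# Sketch of what a global exclude does: drop volatile files
# (temp files, Office lock files, browser partials) from the
# selection before the backup walks it. Patterns are examples,
# not Backup Exec's actual exclude syntax.
import fnmatch

EXCLUDES = ["*.tmp", "~$*", "Thumbs.db", "*.crdownload"]

def keep(path):
    name = path.rsplit("/", 1)[-1]          # match on the filename only
    return not any(fnmatch.fnmatch(name, pat) for pat in EXCLUDES)

files = ["/home/a/report.docx", "/home/a/~$report.docx", "/home/b/x.tmp"]
print([f for f in files if keep(f)])        # ['/home/a/report.docx']
```

Fewer churning files in the selection also means smaller incrementals, which is exactly what the synthetic policy above needs to stay quick.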
Set up a GFS backup scheme and let it roll. Our DD box has been up for 208 days and holds right at 100 TB of restorable data. The DD box itself is 30 GB, and it is screaming fast.
The dedupe option is NOT required for using a Data Domain appliance. That is a NetBackup restriction, where you need Enterprise Disk licensed. If you want to leverage some of the replication/OST integration, then yes, there are some BE and DD licenses needed.
Breaking out selection lists only takes you so far. It's a band-aid for a problem that needs to be fixed with other block-level technologies. NetBackup can do this better: with the Enterprise Client, you can take advantage of a block-level snapshot of the volume rather than having to walk the file system like Backup Exec does. Backup Exec should not be compared apples to apples with NetBackup.
There are clear distinctions where you have to choose one over the other, and FlashBackup is one of them here!
After a couple of failed implementation attempts, I called DD tech support and Symantec. If you are going to use a DD device with OST, you need the dedupe option.
Breaking out selection lists - we are a hospital and have stringent windows in which to get backup jobs done. If you have only 4 hours to back up 6 million files contained in 4 TB of disk space, you aren't going to get the job done with one selection list over Fibre Channel from a SAN to an LTO-5 tape drive. Be it a file-system or image backup, taking longer than 4 hours is unacceptable and, according to the CEO, affects patient care. Been there, done that, got the T-shirts from multiple products.
What is OST good for? It's for controlled replication from within the backup app and catalog awareness of the replicated copy.
It's an option, not a requirement. I'll challenge anyone from Symantec and EMC to say otherwise. I know both products very intimately because of my background with them.
You were misled that it's required.
You can still do replication between devices without the catalog awareness that OST brings.
You can still back up to Data Domain via CIFS, VTL, or NFS without the OST protocol in play.
If you need or want the OST benefits, then yes, you need some Symantec licenses. But it is not required, and that is my point.
Because you are in healthcare, you should know what other hospitals are implementing, as I deal with them on a daily basis. It's mainly TSM and Avamar. Both products are better suited to the healthcare vertical and its unique needs (e.g. EPIC, Cerner, Meditech, etc.).