Using the NDMP option to back up very large NetApp volumes. I need a way to reduce my backup windows so I do not cause weekday performance issues.
We are currently running BE 2010 to back up our NetApp data filers using the NDMP agent. Our volumes and aggregates keep growing bigger and bigger, causing my backups to take longer and longer. An example would be the job below: it takes nearly 3 days to back up 10.8 TB, which eats into most of Monday even when the backup job is started near the end of the day on Friday.
Down the road I see these volumes getting bigger and bigger, since the next version of ONTAP allows for larger aggregates. When I have a 16 TB volume/aggregate, my job will take about 4.3 days to complete. This will definitely cause performance issues for several days of the week, which will be unacceptable. Deduplication will not benefit us at all because we would like a full copy every week; we are working with seismic data that is all unique.
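A quick back-of-the-envelope sketch of those numbers (using only the figures quoted above: 10.8 TB in roughly 3 days, projected to a 16 TB volume):

```python
# Back-of-the-envelope: effective NDMP throughput and projected backup window.
# Input figures are taken from the job described above.
current_tb = 10.8          # current job size, TB
current_days = 3.0         # current wall-clock window, ~3 days
future_tb = 16.0           # projected volume/aggregate size, TB

rate_tb_per_day = current_tb / current_days            # ~3.6 TB/day
mb_per_s = current_tb * 1e6 / (current_days * 86400)   # ~42 MB/s sustained
future_days = future_tb / rate_tb_per_day              # ~4.4 days

print(f"sustained rate: {mb_per_s:.0f} MB/s")
print(f"16 TB window:   {future_days:.1f} days")
```

That projection (~4.4 days) lines up with the roughly 4.3 days quoted above, and the sustained rate it implies is the number that matters for the hardware discussion later in the thread.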
The library is a dual-drive LTO-4 library that is fibre-connected to the NetApp, so the network should not be the bottleneck. My best suggestion is to create a way for me to stream the backup from one job to multiple tape drives. I think this capability would cut my time in half because I would be writing the data to two drives at the same time. I cannot create two separate jobs because Backup Exec will only allow me to back up a whole NDMP volume; I am unable to split a volume into two separate jobs.
Please let me know what you suggest.
Doug
You, sir, have run into the limitation of NDMP and large data sets. There is not much you can do with BE to solve this; no option or add-on from a BE perspective is going to change that.
You can go to LTO-5, but the odds are that the filer cannot run its UFS_DUMP fast enough to feed those LTO-5 drives.
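To illustrate why faster tape probably won't help: the effective rate implied by the 10.8 TB / ~3 day job above is well under the published native (uncompressed) transfer speeds of either drive generation, so the dump stream, not the drive, is the limit. A rough sketch:

```python
# The filer's dump stream, not the tape drive, is the bottleneck.
# Effective rate implied by the job above: 10.8 TB in ~3 days.
effective_mb_s = 10.8e6 / (3 * 86400)   # ~42 MB/s from the filer

lto4_native = 120   # MB/s, LTO-4 native (uncompressed) transfer rate
lto5_native = 140   # MB/s, LTO-5 native (uncompressed) transfer rate

# The drives are already starved at LTO-4; LTO-5 would just be starved more.
print(f"filer feed:        {effective_mb_s:.0f} MB/s")
print(f"LTO-4 utilization: {effective_mb_s / lto4_native:.0%}")
print(f"LTO-5 utilization: {effective_mb_s / lto5_native:.0%}")
```

At roughly a third of a single drive's native speed, swapping drive generations does nothing unless the filer can push the dump faster.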
You can go to NetBackup, which with 7.01 now allows you to multiplex NDMP backups.
Or perhaps look at NetBackup using synthetic backups with an incremental-forever policy.
Lastly, is all of this data active data? Perhaps the easiest approach would be to start pruning the inactive data, say anything more than 6 months or 1 year old, off the filer using Enterprise Vault, then move it to cheaper DAS-based storage or similar. This would stretch how much expensive NetApp storage you need to buy, make backups faster, and reduce your backup windows. End users will not know anything has changed, as Enterprise Vault can archive off CIFS and NetApp with no problem.