
Using the NDMP option to back up very large NetApp volumes. I need a way to reduce my backup windows so I do not cause weekday performance issues.

dfiore
Level 3

We are currently running BE 2010 to back up our NetApp data filers using the NDMP agent.  Our volumes and aggregates keep growing bigger and bigger, causing my backups to take longer and longer.  An example would be the job below.  It takes me nearly 3 days to back up 10.8 TB, which includes most of Monday if the backup job is started near the end of the day on Friday.

Down the road I see these volumes getting bigger and bigger, since the next version of ONTAP allows for larger aggregates.  When I have a 16 TB volume/aggregate, my job will take about 4.4 days to complete.  This will definitely cause performance issues for several days of the week, which will be unacceptable.  Deduplication will not benefit us at all, because we would like a full copy every week, and we are working with seismic data that is all unique.
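To put rough numbers on that projection (simple arithmetic only; the 10.8 TB and 3-day figures are from the job above, everything else follows from them):

```python
# Back-of-the-envelope math for the backup window, based on the job above:
# 10.8 TB in ~3 days, scaled up to a 16 TB volume.

TB_TO_MB = 1024 * 1024  # MB per TB (binary units)

current_size_tb = 10.8
current_days = 3.0

# Effective end-to-end throughput the filer and library are sustaining.
throughput_mb_min = current_size_tb * TB_TO_MB / (current_days * 24 * 60)
print(f"Effective throughput: {throughput_mb_min:,.0f} MB/min")  # ~2,600 MB/min

# Projected window for a 16 TB volume at the same rate.
projected_days = 16.0 / current_size_tb * current_days
print(f"Projected window for 16 TB: {projected_days:.1f} days")  # ~4.4 days
```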

The library is a dual-drive LTO-4 library that is fibre-connected to the NetApp, so the network should not be the bottleneck.  My best suggestion would be a way for me to stream one backup job to multiple tape drives.  I think this capability would cut my time in half, because I would be able to write the data to two drives at the same time.  I cannot create two separate jobs, because Backup Exec will only back up a whole NDMP volume; I am unable to split a volume into two separate jobs.
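Conceptually, what I am asking for looks something like the sketch below. This is purely hypothetical (nothing in Backup Exec exposes it, and the directory names and sizes are made up): split a volume's top-level directories into two roughly equal streams, one per drive.

```python
# Hypothetical sketch: greedily partition a volume's top-level directories
# into two roughly equal-sized streams, one per LTO-4 drive. Backup Exec's
# NDMP agent cannot do this today; paths and sizes below are made up.

def split_into_two_streams(dirs):
    """dirs: list of (path, size_bytes) pairs. Largest-first greedy split."""
    stream_a, stream_b = [], []
    size_a = size_b = 0
    for path, size in sorted(dirs, key=lambda d: d[1], reverse=True):
        if size_a <= size_b:
            stream_a.append(path)
            size_a += size
        else:
            stream_b.append(path)
            size_b += size
    return (stream_a, size_a), (stream_b, size_b)

volume = [("/vol/seismic/projA", 3.1e12), ("/vol/seismic/projB", 2.7e12),
          ("/vol/seismic/projC", 2.5e12), ("/vol/seismic/projD", 2.5e12)]
(a, size_a), (b, size_b) = split_into_two_streams(volume)
print("drive 1:", a, f"{size_a / 1e12:.1f} TB")
print("drive 2:", b, f"{size_b / 1e12:.1f} TB")
```

With two drives each writing half the volume in parallel, the ~4.4-day window would drop to roughly half, assuming the filer can feed both streams.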



Please let me know what you suggest. 


Doug


 


9 Replies

RahulG
Level 6
Employee

Well, Backup Exec does not support multi-streaming, so the other things you can try are:

1. Running synthetic backups, which require the ADBO option (though I am not sure whether synthetic backups work in combination with NDMP); there is a toy illustration of the idea after this list.

2. Running backups via a separate network card dedicated to backups.

3. Implementing a full + incremental backup strategy.
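To show what a synthetic backup buys you (a toy sketch only; real products also track deletions and metadata, which this ignores): only the incrementals are read from the filer, and the "full" is assembled on the backup side by merging them over the last real full.

```python
# Toy illustration of a synthetic full: merge the last real full with the
# incrementals taken since, with the newest version of each file winning.
# Real implementations also handle deletions and metadata; this does not.

baseline = {"a.seg": "v1", "b.seg": "v1", "c.seg": "v1"}      # last real full
incrementals = [
    {"b.seg": "v2"},                  # Monday: b changed
    {"c.seg": "v2", "d.seg": "v1"},   # Tuesday: c changed, d added
]

synthetic_full = dict(baseline)
for incr in incrementals:
    synthetic_full.update(incr)       # later versions overwrite earlier ones

print(synthetic_full)
# {'a.seg': 'v1', 'b.seg': 'v2', 'c.seg': 'v2', 'd.seg': 'v1'}
```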

dfiore
Level 3

1. I am unsure about the ADBO option.  Is that the Advanced Disk-based Backup Option?  Wouldn't I need 11 TB free to make a copy of this data?

2. The tape library is fibre-connected to the NDMP filer head; there is no networking equipment involved.  My backup jobs are streaming at 2,000 to 4,000 MB/min, which is significantly faster than my Windows backups running over the network.

3. I do run daily incremental backups, but we like to keep 5 weeks of full weekly backups for DR.  If we ever have to do a full restore, I do not want to have to insert my full set and then 20 to 30 incremental tapes.
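For a sense of the tape handling involved (rough math; 800 GB is standard LTO-4 native capacity, but the average incremental size is an assumption, not a measured figure):

```python
import math

# Rough tape-count math for a full restore. No compression assumed;
# seismic data may not compress much anyway.
LTO4_NATIVE_TB = 0.8      # 800 GB native per cartridge
volume_tb = 10.8
daily_incr_tb = 0.2       # assumed average incremental size

full_tapes = math.ceil(volume_tb / LTO4_NATIVE_TB)
print(f"Tapes per full backup: {full_tapes}")            # 14 cartridges

# With weekly fulls, a worst-case restore adds at most six incrementals:
incr_tapes = math.ceil(6 * daily_incr_tb / LTO4_NATIVE_TB)
print(f"Extra tapes for incrementals: {incr_tapes}")     # ~2 cartridges
```

Stretching the interval between fulls is where the 20-to-30-tape restore scenario comes from.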

teiva-boy
Level 6

You, sir, have run into the limitations of NDMP and large data sets.  There is not much you can do with BE to solve this; no option or add-on from a BE perspective is going to fix it.

You can go to LTO-5, but the odds are that the filer cannot run its UFS_DUMP fast enough to feed those LTO-5 drives.

You can go to NetBackup, which as of 7.0.1 allows you to multiplex NDMP backups.

Or perhaps look at NetBackup with synthetic backups and an incremental-forever policy.

Lastly, is all of this data active data?  Perhaps the easiest approach would be to start pruning the inactive data, say anything more than 6 months or 1 year old, off the filer using Enterprise Vault, and move it to cheaper DAS-based storage or similar.  This would stretch how much expensive NetApp storage you need to buy, make backups faster, and reduce your backup windows.  End users will not notice anything has changed, as Enterprise Vault can archive off CIFS and NetApp without a problem.
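To see whether that would pay off, something like the sketch below could size the opportunity first (a minimal sketch; the mount point is hypothetical, and modification time is only a rough proxy for "inactive"):

```python
import os
import time

# Minimal sketch: walk a filer share mounted locally and total up data that
# has not been modified in a year -- candidates for archiving off the NetApp.
MOUNT = "/mnt/filer/vol0"            # hypothetical CIFS/NFS mount point
CUTOFF = time.time() - 365 * 86400   # one year ago

stale_bytes = total_bytes = 0
for root, _dirs, files in os.walk(MOUNT):
    for name in files:
        try:
            st = os.stat(os.path.join(root, name))
        except OSError:
            continue                 # vanished or permission denied; skip
        total_bytes += st.st_size
        if st.st_mtime < CUTOFF:
            stale_bytes += st.st_size

if total_bytes:
    print(f"{stale_bytes / total_bytes:.0%} of {total_bytes / 1e12:.1f} TB "
          "untouched for a year")
```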

teiva-boy
Level 6

If I could add one more thing to think about: get the unstructured data off the NetApp.  NetApps make terrible storage for unstructured data because of the cost; there are better solutions.  Use NetApp where it shines, application data (Exchange, VMware, SQL, etc.); its app integration and snapshots are awesome.  But for file data it is wasted money.  There are better-performing solutions at 60% of the cost of anything NetApp has to offer.

dfiore
Level 3

We are urging our departments to clean up their data so we don't have to spend money backing up data that should be archived.

Ken_Putnam
Level 6

Since Teiva-boy was the one to suggest this, I gave the Solution to him

teiva-boy
Level 6

Good idea, but where are they going to put it?  You should really look at proactive archiving with something like Enterprise Vault.  It can archive for you, based on policy, right off the filer to cheaper storage.  That would make your backups faster, since you would have less data to back up.  In most cases I see a 30-50% reduction for file data with Enterprise Vault, and closer to 70% with Exchange data.

dfiore
Level 3

 

We already vault all email over 30 days old, but we just don't trust Vault or any other deduplication product with the files we are working with.

The scenario is as follows.  Let's say we have 100 5 GB files that are 95% the same.  At $5 million a file, can you guarantee Enterprise Vault will restore them correctly?  We don't even use NetApp's dedupe, for the same reason.  As far as home drives and general shared network drives go, they do not take up very much space relative to our larger volumes.

As for what we do with the data that is being archived: we do a straight copy to tape, twice.  One set stays onsite; the other is sent to Iron Mountain.
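For what it's worth, that guarantee is something we can check mechanically rather than take on faith (a minimal sketch with illustrative paths): hash every file before it is archived and compare after a test restore.

```python
import hashlib

def sha256_of(path, chunk_size=1024 * 1024):
    """Stream a file through SHA-256 so multi-GB files fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while block := f.read(chunk_size):
            h.update(block)
    return h.hexdigest()

# Record digests before archiving (paths are illustrative).
manifest = {p: sha256_of(p) for p in ["/data/survey01.segy"]}

# ... archive, then test-restore into /restore/ ...
for path, digest in manifest.items():
    restored = "/restore/" + path.rsplit("/", 1)[-1]
    assert sha256_of(restored) == digest, f"corrupt restore: {path}"
```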



   

teiva-boy
Level 6

Oh, I'm not talking about dedupe at all.  I'm talking about doing exactly what you do with email archiving after 30 days, but applying it to file data over, say, 3 months or even a year old.

Enterprise Vault is 10+ years old and the most widely used product for archiving; it supports companies much, much larger than yours, with more revenue than yours.  It has been trusted and safe in almost every case where I've implemented it.

The point is that there are affordable, stable, and trusted solutions to help manage your data, Enterprise Vault being one of them.  It can help free up space, move files to cheaper storage, and let you manage it all more effectively, ultimately saving you expensive NetApp storage upgrades and giving you faster backups.