Forum Discussion

Twinkle_Sapra
12 years ago

Media Server Dedupe Rate

Hi Folks

I have to implement a solution based on the scenario below.

Remote location: 5 clients with 200 GB each (a total of 1 TB). The data is mostly text files.

I want to deploy one media server (7.1.0.2) with MSDP at the remote location, and using an SLP the data will replicate to a central master/media server (7.1.0.2) with MSDP. There is no issue with network bandwidth between the remote media server and the central master server.

Is there any case study to calculate the de-dupe rate based on the requirements below?

What dedupe rate can I expect from client-side deduplication to the remote media server MSDP?

What optimized dedupe rate can I expect from the remote MSDP to the central master/media server MSDP?

Thanks in Advance !!!

 

  • Always difficult to say for this type of enquiry, and compression set on the client and/or media server can make a difference too. The first run of 200 GB you would hope to give about a 20% de-dupe rate, as long as it is not encrypted data or the like. After that first run you could be getting into the 90% range for de-dupe. You could use client-side de-dupe straight back to the remote site, and if you have backed up plenty of clients already in the data centre then the de-dupe rate could be quite good - Accelerator really comes into play here too! If you have a remote MSDP server then again it all comes down to how much has already been backed up on the main site as to the ratio you will get - but they are all pretty much the same ratios, as they just query the current fingerprint databases. As an example, a remote client I did took a hefty 26 hours for its first backup, but using Accelerator the next full backup took 37 minutes. So sometimes you either need to seed the data (put the MSDP local to the client for the first backup) or just accept the hit for that very first backup and then enjoy the subsequent ones! A ballpark is to expect the 20% range for the first backup, then 60 to 90% after that - but it all depends on the data types and how much you have already backed up to the MSDP pool to give the maximum set of fingerprints. Hope this helps
  • As Mark mentioned, it's difficult to say. But I usually use a rule of thumb of a 20X reduction rate. It's a conservative estimate, but better hitting with a cannon than not hitting at all...

  • I'd hardly call 20x conservative. That's a greater-than-95% dedupe rate, and IMO a best-case scenario.

    If you want to quote a conservative figure, quote 10x, or about 90%.

  • Nevertheless, it's what I use, and until now it has been working for me smiley

    But let's meet halfway and say 15X wink

  • Is your question answered? If so, don't forget to close off the thread using the "Mark as solution" option. Thanks
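
For anyone sizing this later: the reduction ratios quoted in the replies above (20x, 15x, 10x) convert to dedupe percentages with simple arithmetic. The sketch below is a hypothetical helper, not anything NetBackup reports itself; it just applies the thread's numbers to the 1 TB (5 × 200 GB) scenario in the question.

```python
# Hypothetical helpers for converting a deduplication "reduction ratio"
# (e.g. 20x) into a dedupe percentage and an estimated on-disk size.
# These are back-of-the-envelope figures only; real MSDP results depend
# on data type, compression, and how much is already in the pool.

def ratio_to_percent(ratio: float) -> float:
    """A 20x reduction stores only 1/20 of the data -> 95% dedupe."""
    return (1 - 1 / ratio) * 100

def stored_gb(total_gb: float, ratio: float) -> float:
    """Estimated size on disk after deduplication."""
    return total_gb / ratio

total = 5 * 200  # five remote clients at 200 GB each = 1 TB

for r in (20, 15, 10):
    print(f"{r}x -> {ratio_to_percent(r):.1f}% dedupe, "
          f"~{stored_gb(total, r):.0f} GB stored")
# 20x -> 95.0% dedupe, ~50 GB stored
# 15x -> 93.3% dedupe, ~67 GB stored
# 10x -> 90.0% dedupe, ~100 GB stored
```

This also shows why "20x" is an aggressive quote: the jump from 10x to 20x only moves the dedupe rate from 90% to 95%, but it halves the estimated storage.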