
How to find the NetBackup deduplication rate for multiple clients?

challa_007
Level 3
Certified

 

Hi,

We are using NetBackup 7.7.3.

Master: Windows 2012.

Media servers: 5230 appliances.

We have two data centres, primary and secondary.

Data is duplicated between the sites (primary to secondary, and secondary to primary).

Is there any way to find the deduplication rate at each site? I have around 1,500 servers.

What is the best approach, please?
3 REPLIES

andrew_mcc1
Level 6
VIP

Questions like this, and how much back-end storage specific clients and policy types consume, do come up, especially when more back-end storage is used than expected. Unfortunately, I don't think there are any easy answers, but an approach involving OpsCenter that I've used in the past, and which did find the storage hogs, is:

i) Use the NetBackup "Images On Disk" report to find the number of backup images in a given disk pool

ii) Run an OpsCenter custom report for the disk pool concerned with the following high-level parameters:

  • Custom Backup/Recovery Report - Subcategory = Job/Image/Media/Disk
  • Filter by: Master Server/Dedupe Pool/Job Types = Backup, Catalog Backup, Import Image
  • Report Fields: Client Name, Policy Type, Job Protected Size, Job Size, Job Start Time, Image Expiration Time etc.

Once you are sure OpsCenter has found all the backup images for the disk pool (matching the count from "Images On Disk"), the report can be exported as CSV and manipulated in Excel to show dedupe rates (from "Job Protected Size" and "Job Size") and retentions (from "Job Start Time" and "Image Expiration Time"). From there you can use Excel to filter by client, policy type, etc. and see the total storage used for the filter set (SUBTOTAL on "Job Size"). You can also check for unexpectedly long retentions, which will also consume storage.
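If Excel gets unwieldy at around 1,500 clients, the dedupe-rate step can be scripted instead. Below is a minimal Python sketch of the same calculation, assuming the CSV export keeps the report field names above ("Client Name", "Job Protected Size", "Job Size") and that the two size columns are plain numbers in the same unit; a real export may need the column names and parsing adjusted.

```python
#!/usr/bin/env python3
# Sketch: per-client dedupe rates from an OpsCenter custom-report CSV export.
# Assumed columns: "Client Name", "Job Protected Size" (front-end size) and
# "Job Size" (size written after dedupe), both numeric and in the same unit.
import csv
import sys
from collections import defaultdict

def summarise(csv_path):
    protected = defaultdict(float)  # total front-end (protected) size per client
    stored = defaultdict(float)     # total back-end (post-dedupe) size per client
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            try:
                p = float(row["Job Protected Size"])
                s = float(row["Job Size"])
            except (KeyError, ValueError):
                continue  # skip rows with missing or non-numeric sizes
            client = row.get("Client Name", "UNKNOWN")
            protected[client] += p
            stored[client] += s

    # Dedupe rate at backup time: (protected - stored) / protected, as a percentage.
    # Sorting by stored size puts the likely storage hogs at the top.
    for client in sorted(stored, key=stored.get, reverse=True):
        p, s = protected[client], stored[client]
        rate = 100.0 * (1.0 - s / p) if p else 0.0
        print(f"{client}: protected={p:.0f} stored={s:.0f} dedupe_rate={rate:.1f}%")

if __name__ == "__main__":
    summarise(sys.argv[1])
```

Sorting on the post-dedupe total rather than the dedupe percentage is deliberate: a client with a mediocre rate but a huge footprint matters more than a small client with a poor rate.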

Two caveats are:

  • The calculated deduplication rate is at backup time only; as a backup image ages, its deduplication rate will decrease if other images it shares blocks with are expired (e.g. this will happen for weekly fulls with a longer retention than the daily incrementals)
  • It is possible that orphaned fragments or blocks exist in the pool and consume storage. I've not definitely seen this, but there are a number of VOX posts about it, and it would also account for unexplained back-end storage consumption

Best regards, Andrew

t_jadliwala
Level 4
Partner Accredited

If you have OpsCenter, there is an SLP report that will tell you exactly how much data is replicated across the sites. There is also a pre- and post-deduplication report that will help you find the deduplication ratio by policy type, and you can further segregate it by client.

andrew_mcc1
Level 6
VIP

Yes, OpsCenter does have several Deduplication Rates reports, but I found them all pretty useless for finding whether a specific client is hogging deduplication pool storage in a large environment.

Unless I'm mistaken, you can report on a specific set of clients, but you can't get details for each client, just the totals; for a large number of clients this is a major problem. Also, it doesn't explicitly let you specify a particular deduplication pool (though a specific media server is possible and _MAY_ be enough), and there is no check that it is reporting on all images in a pool.

Best regards, Andrew

PS: My suggestion earlier relies on all images being in OpsCenter, so reporting will need to have been started soon after the pool was set up. It also needs OpsCenter Analytics to be licensed, so that Custom Reports are enabled and reporting is not limited to 60 days.