We are using NetBackup 7.7.3.
Master server – Windows 2012.
Media servers – 5230 appliances.
We have two data centres, Primary and Secondary.
Data is duplicated between the sites (Primary to Secondary and Secondary to Primary).
How can I find the deduplication rate at each site? Is there any way to find this? I have around 1500 servers.
What is the best approach, please?
Questions like this, and how much back-end storage specific clients and policy types consume, do come up, especially when more back-end storage is used than expected. Unfortunately I don't think there are any easy answers; however, an approach involving OpsCenter that I've used in the past to find storage hogs was:
i) Use the NetBackup "Images On Disk" report to find the number of backup images in a given disk pool
ii) Run an OpsCenter custom report for the disk pool concerned with the following high-level parameters:
Once you are sure OpsCenter has found all the backup images for the disk pool (matching the count from "Images On Disk"), the report can be exported as CSV and manipulated in Excel to show dedupe rates (from "Job Protected Size" and "Job Size") and retentions (from "Job Start Time" and "Image Expiration Time"). From there you can use Excel to filter by client, policy type, etc. to see the total storage used for the filter set (SUBTOTAL on "Job Size"). You can also check for unexpectedly long retentions, which will also consume storage.
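The Excel step above can also be scripted. A minimal Python sketch, assuming the exported CSV has columns named exactly "Client", "Job Size" and "Job Protected Size" (column names and size units may differ in your export, so adjust to match):

```python
import csv
import sys
from collections import defaultdict

def storage_by_client(csv_path):
    """Sum 'Job Size' (post-dedupe, stored) and 'Job Protected Size'
    (pre-dedupe, front-end) per client from an OpsCenter CSV export."""
    totals = defaultdict(lambda: {"size": 0.0, "protected": 0.0})
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            t = totals[row["Client"]]
            t["size"] += float(row["Job Size"])
            t["protected"] += float(row["Job Protected Size"])
    return totals

def dedupe_rate(protected, size):
    """Percentage of front-end data removed by deduplication."""
    return 100.0 * (1 - size / protected) if protected else 0.0

if __name__ == "__main__" and len(sys.argv) > 1:
    # List clients by stored size, biggest storage hogs first.
    for client, t in sorted(storage_by_client(sys.argv[1]).items(),
                            key=lambda kv: kv[1]["size"], reverse=True):
        print(f"{client}: stored={t['size']:.0f} "
              f"dedupe={dedupe_rate(t['protected'], t['size']):.1f}%")
```

Sorting by stored ("Job Size") totals rather than protected size is what surfaces the clients actually consuming pool capacity.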
Two caveats are:
Best regards, Andrew
If you have OpsCenter then there is an SLP report which will tell you exactly how much data is replicated across the sites, and there is another report, pre- and post-deduplication, which will help you find the deduplication ratio by policy type; you can further segregate it by client.
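For interpreting the pre/post deduplication figures, the ratio is just front-end (pre-dedupe) size divided by stored (post-dedupe) size. A quick sketch with made-up numbers (the 5000/500 GB figures are purely illustrative):

```python
def dedupe_ratio(pre_gb, post_gb):
    """Deduplication ratio, e.g. 10.0 means 10:1."""
    return pre_gb / post_gb

def space_saved_pct(pre_gb, post_gb):
    """Percent of front-end data not written to the pool."""
    return 100.0 * (1 - post_gb / pre_gb)

# Example: 5000 GB protected, 500 GB stored -> 10:1 ratio, 90% saved.
print(dedupe_ratio(5000, 500), space_saved_pct(5000, 500))
```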
Yes, OpsCenter does have several Deduplication Rates reports, but I found them all pretty useless when trying to find whether a specific client is hogging deduplication pool storage in a large environment.
Unless I'm mistaken, you can report on a specific set of clients but you can't get details for each client, just the totals; for a large number of clients this is a major problem. Also, it doesn't explicitly let you specify a particular deduplication pool (though a specific media server is possible and _MAY_ be enough), and there is no check that it is reporting on all images in a pool.
Best regards, Andrew
PS: My suggestion earlier relies on all images being in OpsCenter, so reporting will need to have been started soon after the pool was set up. It also needs OpsCenter Analytics to be licensed so that Custom Reports are enabled and reporting is not limited to 60 days.