03-27-2020 10:08 AM
So, a client of mine asked for help creating a report to track how his backed-up data has been growing and how much of his storage is being used, per policy per month.
I used Total Job Size to track storage usage, since it only counts what is actually stored, and Maximum Job Protected Size to estimate how much data the policy is backing up (it should pick up his last full backup and present that as "how much he's backing up").
Running a couple of tests, I noticed that even if the backed-up data doesn't change at all (i.e. a client that hasn't been touched), the amount of written data still varies in a way that seems almost random. The only explanation I can think of is that running manual backups for testing writes something to storage even when nothing has changed since the last image (accounting for deduplication).
Is there something wrong with my logic behind the report? Or is it something I'm missing about how NBU works?
thanks in advance
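For what it's worth, the report logic I had in mind can be sketched roughly like this. This is only a toy illustration of the aggregation, not real NetBackup output: the job records, field names, and numbers are all made up for the example.

```python
# Hypothetical sketch: aggregate backup job records per policy per month.
# The tuples below stand in for NetBackup job data; the field layout
# (policy, end date, KB written, KB protected) is an assumption.
from collections import defaultdict
from datetime import date

jobs = [
    # (policy, end date, KB written to storage, KB protected)
    ("Policy-A", date(2020, 2, 10), 1_200, 500_000),
    ("Policy-A", date(2020, 2, 24),   900, 510_000),
    ("Policy-A", date(2020, 3, 9),  1_500, 520_000),
    ("Policy-B", date(2020, 3, 15), 3_000, 250_000),
]

report = defaultdict(lambda: {"written_kb": 0, "max_protected_kb": 0})
for policy, end, written, protected in jobs:
    key = (policy, end.strftime("%Y-%m"))
    # Sum of written data ~ Total Job Size: storage actually consumed.
    report[key]["written_kb"] += written
    # Max protected size ~ Maximum Job Protected Size: the last full backup.
    report[key]["max_protected_kb"] = max(
        report[key]["max_protected_kb"], protected)

for (policy, month), row in sorted(report.items()):
    print(f"{policy} {month}: wrote {row['written_kb']} KB, "
          f"protects ~{row['max_protected_kb']} KB")
```

The idea is that the summed written column shows storage-usage growth per month, while the per-month maximum of protected size estimates how much data the policy covers.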
03-27-2020 10:12 AM
The example I attached might have something to do with the file being too small. Since the default deduplication segment is very large compared to the file, a lot of data is written without matching an existing fingerprint, so every time a job runs, that part is rewritten and added to the total.
Could that be it?
03-28-2020 03:24 AM
The OS logs, other application logs, and the NetBackup client's own logs could all have new entries. Sometimes a large old log file gets touched, sometimes not, so client backup sizes can appear to fluctuate by a small amount.
What percentages and across what sizes are you seeing differences?
03-30-2020 01:30 AM
I would say the footprint size is too small - the variation you have is between 15 and 80 megabytes.