11-27-2015 06:57 AM
I'm trying to get my head round Per Terabyte licensing and image retention.
Last year we backed up 600GB of data (all drives), verified by the nbdeployutil tool, so a 1TB licence was required, no problem.
Then we purchased a dedupe backup device, and have increased our retention time from 1 month to 3 months (full backup every month).
This year we back up 600GB of data, but the nbdeployutil log states there are 3 images over 3 months, each around 600GB (the sizes differ slightly as the data changes). In the log, it has added the sizes (in bytes) of 2 of these images and reported that as the total amount of data backed up. So supposedly we have 1.2TB protected and need a 2TB licence.
From the log:-
base servername_123456789 size 2 != image size 596960098
base servername_132547698 size 1 != image size 603646970
base newest streams calculation = 1200607068
Is this correct? Are we being charged extra because of the increased retention period? If so, why does it add up only 2 of the 3 images in the logs? And what can we do to alleviate this: change the retention time back to 1 month and expire the older 2 images?
Thank you.
11-27-2015 07:17 AM
Hello,
No, it is not correct. No matter which frequencies and retentions you use, only the source data volume matters.
nbdeployutil takes into account largest volume backed up by every client/policy combination.
Did you not change the policy, a disk label on the client, or its hostname within the last 3 months?
regards
Michal
11-27-2015 07:24 AM
Simply put, if you back up a client with 1TB on it every day from now until the end of time, and keep those images for infinity, you need a 1TB Platform licence.
Assuming no data growth :p
11-27-2015 07:32 AM
Gentlemen,
Thank you for your replies. Can you explain why the log is adding 2 image sizes together and the report is basing the chargeable data on this amount?
Cheers
11-27-2015 07:40 AM
Hi Dave
It's just a tool; it's automated, and as with many automated things it only caters for certain scenarios. If you're backing up a client twice a day it would show double the TB, but it would tag somewhere that it found potential duplicates (as far as I recall).
You should be reviewing the results, and if you know it's 1TB, then only buy 1TB.
Let me add, though, that I'm not dissing this tool; I know it is extremely difficult to calculate this using a script/automation, especially in large environments where there are multiple schedules, log backups, etc.
11-27-2015 10:22 AM
There must be something different about the images which causes them to be added together, instead of the utility taking max(image1, image2).
You could try reducing the time window of nbdeployutil from 90 days to just cover a period which encompasses the most recent full backup of all clients; this would restrict nbdeployutil to using only the most recent full size. You'll have to ask yourself whether this is "fair" if you have used NetBackup to back something up over a month ago, but it might help in eradicating the unwarranted 'addition' of image sizes.
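If you try this, the gather step accepts a date range. A hedged sketch of the commands (option names and syntax vary by NetBackup version, and the gather directory below is a placeholder, so check `nbdeployutil --help` and the licensing guide for your release before relying on this):

```shell
# Restrict the gather window to roughly the last month so that only the
# most recent full backup of each client falls inside the report period.
nbdeployutil --gather --start 10/27/2015 --end 11/27/2015

# Then generate the capacity report from the directory the gather step
# created (nbdeployutil prints its location when the gather completes).
nbdeployutil --report --capacity <gather-directory>
```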
01-26-2016 06:37 AM
Update for future reference. We just had Symantec in to do an audit, and it is the logs themselves that are reporting wrongly. So from my first post, the total protected data should be the highest of the 2 figures, i.e. 603,646,970, not the 1,200,607,068 total. You can have as many backups of the same data as you like; Symantec look at the largest figure going back over the last 3 months, regardless of retention times.
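The audited calculation can be sketched in a few lines (illustrative Python only, not the tool's actual logic), using the two image sizes from the log excerpt in the first post:

```python
# Illustrative sketch, not Veritas code. Per the audit, front-end TB should
# be the LARGEST single image per client over the reporting window, not the
# sum of images kept under a longer retention. Sizes are bytes from the log.

images_by_client = {
    "servername": [596960098, 603646970],  # two monthly full backups
}

# Correct chargeable figure: the biggest single image per client, summed
# across clients.
correct = sum(max(sizes) for sizes in images_by_client.values())

# What the log appeared to do instead: add the images together.
logged = sum(sum(sizes) for sizes in images_by_client.values())

print(correct)  # 603646970  (the audited figure)
print(logged)   # 1200607068 (the "newest streams calculation" in the log)
```

With one client the "correct" figure is simply the larger of the two fulls; the summed figure reproduces the inflated total from the log.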
01-26-2016 07:36 AM
Yes, they look at the largest figure, but remember that the tool just looks at the image size per day: if you backed up 600GB twice in one day (for whatever reason, audit, testing) it will log 1.2TB. This is, however, not what your FETB is.
01-26-2016 08:07 AM
The 2 images were backed up one month apart, yet the tool still added them together to give an incorrect FETB figure. Something we'll have to monitor carefully in future!
01-26-2016 09:30 AM
Dave, did you ever discover the true root cause of the double count? i.e. where was the logic in the script failing to handle your environment?