nbdeployutil -capacity as a SQL statement

PeteinSKy
Level 3

Hi,

I am looking to extract the nbdeployutil -capacity output as a SQL statement, so it can be tracked automatically by an external application.

 

The key data is the Calculated Capacity Totals by Master Server and the Capacity by Policy Type.

 

Can it be done?

 

Thanks


RiaanBadenhorst
Moderator
Partner    VIP    Accredited Certified

Have you looked at the report templates > client reports > jobs by application (set it to 1 month)?

 

It gives each client's usage per policy type.

 

If you want something more specific, let me know and I'll try to put it together.

PeteinSKy
Level 3

We want to get the information directly from the DB without using the interface: an unattended, automatic extraction that would let us see the total Front End Terabytes backed up per device, so we can forecast growth and thus the possible licence cost increases.

 

The SQL query would be run weekly by a third-party app and the results stored externally.

 

We are not looking at the monthly/daily amount being backed up.

We are looking at what the total amount backed up is.

 

At the moment nbdeployutil -capacity does this, but only as a report.

 

 

RiaanBadenhorst
Moderator
Partner    VIP    Accredited Certified

Ok,

 

Give this a try.

 

select sum (domain_JobArchive.filesBackedUp) as "Files",
COALESCE(CAST(sum (domain_JobArchive.preSISSize)/1073741824  AS NUMERIC(20,2)),0) as "Protected GB",
COALESCE(CAST(sum (domain_JobArchive.bytesWritten)/1073741824  AS NUMERIC(20,2)),0) as "Written GB",
COALESCE(CAST((sum(domain_JobArchive.preSISSize) - sum (domain_JobArchive.bytesWritten))/sum(domain_JobArchive.preSISSize)*100 AS NUMERIC(20,2)),0) as "Dedupe %",
domain_JobArchive.masterServerId as "Master",
domain_JobArchive.policyType as "Policy Type"
from domain_JobArchive
WHERE ( ( (domain_JobArchive.isValid = '1') AND (domain_JobArchive.filesBackedUp NOT IN (0 ) AND (domain_JobArchive.policyType NOT IN (-1, 30, 31, 32, 33, 34, 35, 36, 37, 38 )) )  ) )
AND DATEDIFF(day,UTCBigIntToNomTime(domain_JobArchive.endTime), GETDATE()) <= 1
GROUP BY "Master", "Policy Type"

RiaanBadenhorst
Moderator
Partner    VIP    Accredited Certified

And for connecting to the OpsCenter database:

https://support.symantec.com/en_US/article.TECH141827.html
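
If the third-party app can drive the extraction itself, a small script is probably the easiest way to do the weekly pull. Below is a rough sketch in Python using pyodbc against an ODBC data source pointing at the OpsCenter (SQL Anywhere) database; the DSN name, credentials, output path and driver setup are placeholders/assumptions, so adjust them to whatever you configure per the article above.

# Rough sketch only: run the capacity query against the OpsCenter database
# and dump the result to a CSV that the external app can pick up.
# DSN name, user, password and output path are placeholders, not real
# OpsCenter defaults - set them up per the TECH141827 article.
import csv
import pyodbc

CAPACITY_QUERY = """
select sum(domain_JobArchive.filesBackedUp) as "Files",
COALESCE(CAST(sum(domain_JobArchive.preSISSize)/1073741824 AS NUMERIC(20,2)),0) as "Protected GB",
COALESCE(CAST(sum(domain_JobArchive.bytesWritten)/1073741824 AS NUMERIC(20,2)),0) as "Written GB",
COALESCE(CAST((sum(domain_JobArchive.preSISSize) - sum(domain_JobArchive.bytesWritten))/sum(domain_JobArchive.preSISSize)*100 AS NUMERIC(20,2)),0) as "Dedupe %",
domain_JobArchive.masterServerId as "Master",
domain_JobArchive.policyType as "Policy Type"
from domain_JobArchive
WHERE domain_JobArchive.isValid = '1'
AND domain_JobArchive.filesBackedUp NOT IN (0)
AND domain_JobArchive.policyType NOT IN (-1, 30, 31, 32, 33, 34, 35, 36, 37, 38)
AND DATEDIFF(day, UTCBigIntToNomTime(domain_JobArchive.endTime), GETDATE()) <= 1
GROUP BY "Master", "Policy Type"
"""

def export_capacity(dsn="OpsCenterDSN", user="report_user", password="secret",
                    out_path="capacity_by_master.csv"):
    # Connect through an ODBC DSN that points at the OpsCenter database.
    conn = pyodbc.connect("DSN={};UID={};PWD={}".format(dsn, user, password))
    try:
        cur = conn.cursor()
        cur.execute(CAPACITY_QUERY)
        with open(out_path, "w", newline="") as fh:
            writer = csv.writer(fh)
            writer.writerow([col[0] for col in cur.description])  # header row
            writer.writerows(cur.fetchall())
    finally:
        conn.close()

if __name__ == "__main__":
    export_capacity()

Scheduled weekly with cron or Task Scheduler on a host that has the SQL Anywhere ODBC driver, that gives the external app a file to pick up; it could just as easily write straight into the app's own database instead.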

PeteinSKy
Level 3

Amazing, thank you. One tiny last thing (which I will work on, but am not confident I can find):

I need to change the Master and Policy Type ID references to names.

 

Apart from that it looks great :)

 

 

PeteinSKy
Level 3

OK... I've run into a problem.

 

I know for a fact that one of the media servers under one of the master servers is only running one policy, and the size on the PureDisk storage is 11.77 TB (this came straight from the OpsCenter GUI).

 

When we run the SQL for that master, the value for the policy only comes to 676 GB front end and 438 GB written storage.

 

As far as I can tell it should be 11.77 TB, as this is the only thing being backed up to this media server.

 

Where am I going wrong?

RiaanBadenhorst
Moderator
Partner    VIP    Accredited Certified

Can you run bpimagelist -client "client name" -d 01/01/1970 and verify that it reports 11TB too?

 

bpimagelist  -client "client" -l -d 01/01/1970 | awk '{ if ( $1 == "IMAGE" ) print $19 }' | awk '{ sum +=$1 } END { print sum }'
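
Not the prettiest, but if it's easier to drive from a script, here's a rough Python equivalent of that one-liner that also converts the total to TB for comparison with the OpsCenter figure. It assumes field 19 of each IMAGE line is the image size in kilobytes (the same field the awk sums), and the client name is just a placeholder:

# Rough sketch: sum the size field of every IMAGE line from bpimagelist
# and report it in TB. Assumes field 19 is the image size in kilobytes;
# replace "client" with the real client name.
import subprocess

out = subprocess.run(
    ["bpimagelist", "-client", "client", "-l", "-d", "01/01/1970"],
    capture_output=True, text=True, check=True).stdout

total_kb = sum(int(line.split()[18])          # field 19, zero-based index 18
               for line in out.splitlines()
               if line.startswith("IMAGE"))

print("Total: %.2f TB" % (total_kb / 1024.0 / 1024.0 / 1024.0))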

 

I'll work on the conversion of the policy types.

RiaanBadenhorst
Moderator
Partner    VIP    Accredited Certified

Master and policy type names added:

 

select sum (domain_JobArchive.filesBackedUp) as "Files",
COALESCE(CAST(sum (domain_JobArchive.preSISSize)/1073741824  AS NUMERIC(20,2)),0) as "Protected GB",
COALESCE(CAST(sum (domain_JobArchive.bytesWritten)/1073741824  AS NUMERIC(20,2)),0) as "Written GB",
COALESCE(CAST((sum(domain_JobArchive.preSISSize) - sum (domain_JobArchive.bytesWritten))/sum(domain_JobArchive.preSISSize)*100 AS NUMERIC(20,2)),0) as "Dedupe %",
domain_masterServer.networkname as "Master",
lookup_policyType.name as "Policy Type"
from domain_JobArchive, domain_MasterServer, lookup_policytype
WHERE ( ( (domain_JobArchive.isValid = '1') AND (domain_JobArchive.filesBackedUp NOT IN (0 ) AND (domain_JobArchive.policyType NOT IN (-1, 30, 31, 32, 33, 34, 35, 36, 37, 38 )) )  ) )
AND
domain_masterServer.id=domain_jobarchive.masterserverid
AND
lookup_policytype.id=domain_JobArchive.policyType
AND DATEDIFF(day,UTCBigIntToNomTime(domain_JobArchive.endTime), GETDATE()) <= 1
GROUP BY "Master", "Policy Type"

PeteinSKy
Level 3

Hi..

 

OK... we had real trouble running that command, but we ran ./bpimagelist -d 10/01/2002 -pt 'MS-SQL-Server' to have a look at the policy.

 

Obviously way too much data, but it did indicate that there was more likely to be 11 TB on there rather than 600 GB.

 

The idea is to see, from a licence pricing standpoint, how much raw data is being backed up and what its size is after deduplication.

 

One thought is that, since we are storing two months' worth of data, we change

AND DATEDIFF(day,UTCBigIntToNomTime(domain_JobArchive.endTime), GETDATE()) <= 1

to

AND DATEDIFF(day,UTCBigIntToNomTime(domain_JobArchive.endTime), GETDATE()) <= 61

Unfortunately this gives us too much data being backed up...

RiaanBadenhorst
Moderator
Partner    VIP    Accredited Certified

Yeah, sorry, I was being daft with the bpimagelist command. It should not be totaled; it needs to be averaged. I actually use bpimagelist -hoursago 800 when I'm doing FETB calculations. But I don't just average it: I filter the information per day, and then further per policy type. I then take that information and import it into Excel to give me a per-day figure per client per policy type. I then use MAX and AVERAGE on each policy type to show me any erratic behavior and decide what the FETB is based on that.

 

With that in mind I've rewritten the query to take a daily total per policy type per client. Maybe this will give you a better idea of what is being backed up.

 

select
COALESCE(CAST(sum(domain_JobArchive.preSISSize)/1024.0/1024.0/1024.0 AS NUMERIC (20,2)), 0) AS 'Pre Dedup Size (GB)',
DATEFORMAT(cast(UTCBigintToNOMTime(domain_JobArchive.endTime) as DATETIME), 'yyyy-mm-dd') as myDateTime,
domain_JobArchive.clientname as "Client",
domain_JobArchive.policytype as "Policy Type"
from domain_JobArchive
WHERE
domain_JobArchive.Type IN (0 )
AND DATEDIFF(day,UTCBigIntToNomTime(domain_JobArchive.endTime), GETDATE()) <= 100
GROUP BY "Client", "Policy Type", myDateTime ORDER BY "Client", myDateTime

 

 

PeteinSKy
Level 3

OK, one question:

Why have you set

domain_JobArchive.Type IN (0 )

RiaanBadenhorst
Moderator
Partner    VIP    Accredited Certified

To only look at backup jobs, not snapshots, duplicates, restores, etc.

 

lookup_JobType
Reference table for Job Type (like Backup, Archive, Restore, Verify, Duplicate, Image Import)

 

-1 Undefined
0 Backup
1 Archive
2 Restore
3 Verify
4 Duplication
5 Image Import
6 Catalog Backup

 

And so forth, there are more.

PeteinSKy
Level 3

In your comments you say: "I then use MAX and AVERAGE on each policy type to show me any erratic behavior and make a decision what the FETB is based on that."

How do you get this to match the data produced by the Traditional Licensing Report that OpsCenter already creates?

 

I am miles away.

 

 

RiaanBadenhorst
Moderator
Partner    VIP    Accredited Certified

I don't use the nbdeployutil report from OpsCenter for calculating FETB. The reason is that there is no way an automated script can tell you how much data is actually on your clients. It has to assume that you perform backups regularly and that there are no reruns. What if you get 99% of a 5TB Oracle DB and then have to run it again? Images are not always expired. What if you need to run something twice because of an internal requirement? There are many examples.

The method I use displays each client's backups per day and per application type, so it's easy to verify whether the figures are correct, and it does need to be verified by an administrator. The MAX and AVERAGE functions in Excel highlight inconsistencies. If you have a 30-day period with 100GB backed up daily, the AVERAGE would be 100GB and the MAX (highest value) would be 100GB. Run a backup twice one day and you'll get an instance of 200GB. This needs to be visible so it is not included in the FETB calc.
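
If you end up scripting it rather than using Excel, the MAX/AVERAGE check is only a couple of lines on top of the per-day pivot from the earlier sketch; the 1.5x threshold is just an arbitrary example for flagging likely reruns, not anything official:

# Rough sketch: MAX vs AVERAGE per (client, policy type) over the per-day
# pivot, flagging combinations where the busiest day is well above the
# average (usually a rerun). The 1.5x threshold is arbitrary - tune it.
import pandas as pd

df = pd.read_csv("daily_presis.csv", parse_dates=["myDateTime"])
daily = df.pivot_table(index="myDateTime",
                       columns=["Client", "Policy Type"],
                       values="Pre Dedup Size (GB)",
                       aggfunc="sum").fillna(0)

stats = pd.DataFrame({"avg_gb": daily.mean(), "max_gb": daily.max()})
stats["possible_rerun"] = stats["max_gb"] > 1.5 * stats["avg_gb"]

print(stats.sort_values("max_gb", ascending=False))

In your 100GB example the average works out to roughly 103GB over the 30 days, so the 200GB rerun day stands out and can be left out of the FETB figure.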

 

If you PM me your email address I can send you the sheet to show you what I'm talking about.

 

The same could probably be pulled out of OpsCenter directly, but I'm still a SQL query newbie :)