
see consumption spikes?

VersEV1
Level 4

Is there a way to create a scheduled DQL report, or some other way, to show consumption spikes, say in the last 24 hours? To see whether a user moved or saved files and a share grew? So if someone is downloading or saving lots of files, you can stop it the next day.

Say a share grew 10 GB or 5 GB from yesterday. Can you be alerted to that?

2 REPLIES

VirgilDobos
Moderator
Partner    VIP    Accredited Certified

I am also interested in this feature. Maybe someone from Veritas can advise?

--Virgil

Rod_p1
Level 6
Employee Accredited Certified

Let's consider that you are seeking something like a capacity report with built-in trending, one that somehow understands the concept of time. I am sure Veritas Professional Services / Consulting could work something up specific to your needs and environment if you are focused on DQL, but have you considered the other options?

If we viewed the baseline at the filer level, we could use:

filergrowth.jpg

Example:

Created By,Report Type,Report Name,
Veritas Data Insight 6.1,Filer Growth Trend,Sample,

Summary Table
File Server,Capacity At Start of Period (GB),Free Space At Start of Period (GB),Utilization At Start of Period (GB),Capacity At End of Period (GB),Free Space At End of Period (GB),Utilization At End of Period (GB),Capacity Growth (%),Usage Growth (%),Free Space Change (%),
AUTOMATION1,5.95,5.88,0.07,5.95,5.86,0.09,0.0,37.85,-0.43,
TESTCMODE831,119.07,118.18,0.89,119.07,118.01,1.06,0.0,19.02,-0.14,


Details Table
10.209.109.50,99.66,99.66,89.87,89.87,Sun Oct 16 05:30:00 IST 2016,9.78,
AUTOMATION1,5.95,5.95,5.88,5.88,Fri Oct 07 05:30:00 IST 2016,0.07,
AUTOMATION1,5.95,5.95,5.88,5.88,Sat Oct 08 05:30:00 IST 2016,0.07,
AUTOMATION1,5.95,5.95,5.87,5.87,Sun Oct 09 05:30:00 IST 2016,0.08,
AUTOMATION1,5.95,5.95,5.85,5.85,Tue Oct 11 05:30:00 IST 2016,0.1,
AUTOMATION1,5.95,5.95,5.86,5.86,Wed Oct 12 05:30:00 IST 2016,0.09,
AUTOMATION1,5.95,5.95,5.86,5.86,Thu Oct 13 05:30:00 IST 2016,0.09,
AUTOMATION1,5.95,5.95,5.84,5.84,Fri Oct 14 05:30:00 IST 2016,0.11,
AUTOMATION1,5.95,5.95,5.84,5.84,Sat Oct 15 05:30:00 IST 2016,0.11,
AUTOMATION1,5.95,5.95,5.86,5.86,Sun Oct 16 05:30:00 IST 2016,0.09,
TESTCMODE831,119.07,119.07,118.18,118.18,Fri Oct 07 05:30:00 IST 2016,0.89,
TESTCMODE831,119.07,119.07,118.12,118.12,Sat Oct 08 05:30:00 IST 2016,0.95,
TESTCMODE831,119.07,119.07,118.04,118.04,Sun Oct 09 05:30:00 IST 2016,1.03,
TESTCMODE831,119.07,119.07,117.99,117.99,Tue Oct 11 05:30:00 IST 2016,1.08,
TESTCMODE831,119.07,119.07,118.05,118.05,Wed Oct 12 05:30:00 IST 2016,1.02,
TESTCMODE831,119.07,119.07,118.07,118.07,Thu Oct 13 05:30:00 IST 2016,1.0,
TESTCMODE831,119.07,119.07,118.03,118.03,Fri Oct 14 05:30:00 IST 2016,1.04,
TESTCMODE831,119.07,119.07,118.0,118.0,Sat Oct 15 05:30:00 IST 2016,1.07,
TESTCMODE831,119.07,119.07,118.01,118.01,Sun Oct 16 05:30:00 IST 2016,1.06,
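
Since the detail rows above give one utilization sample per day per filer, a small script could flag day-over-day jumps without waiting on a trend report. Here is a minimal Python sketch, assuming the export keeps the layout shown (filer name in the first field, the sample date in the sixth, utilization in GB in the seventh) and is chronological per filer; the 5 GB threshold is only a placeholder:

import csv
import sys
from collections import defaultdict

GROWTH_THRESHOLD_GB = 5.0   # placeholder: alert when utilization grows more than this between samples

def flag_spikes(detail_csv_path):
    previous = {}                # filer -> utilization (GB) from the prior sample
    spikes = defaultdict(list)   # filer -> [(sample date, growth in GB), ...]
    with open(detail_csv_path, newline="") as handle:
        for row in csv.reader(handle):
            if len(row) < 7 or not row[0].strip():
                continue         # skip header, blank and malformed rows
            filer, sample_date, util_text = row[0], row[5], row[6]
            try:
                utilization = float(util_text)
            except ValueError:
                continue         # non-numeric row (e.g. a table title)
            if filer in previous and utilization - previous[filer] > GROWTH_THRESHOLD_GB:
                spikes[filer].append((sample_date, utilization - previous[filer]))
            previous[filer] = utilization
    return spikes

if __name__ == "__main__":
    for filer, events in flag_spikes(sys.argv[1]).items():
        for sample_date, growth in events:
            print(f"{filer}: grew {growth:.2f} GB by {sample_date}")

Scheduled daily against the latest export, it prints each filer whose utilization jumped more than the threshold since its previous sample.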


This would be at the filer level and may not be granular enough, but we could go down to the folder level:

foldergrowth.jpg

Example:

Header Table
Created By,Report Type,Report Name,
Veritas Data Insight 6.1,Consumption by Folders,Sample,

Directory Summary Table
File Server \ Web Application,Share Name,Directory,Total Size (GB),Total File Count,Active Size (GB),Active File Count,DFS Server,DFS Share,DFS Path,Filer Level Size,Filer Level Count,Filer Level Active Size,Filer Level Active Count,Active On Disk Size (GB),Total On Disk Size (GB),Filer Level On Disk Size,Filer Level On Disk Size,
10.209.109.50,home,\\10.209.109.50\home\,0.023453239351511,38,0.023453239351511,38,DFS_10.209.109.50,test1/home,\\DFS_10.209.109.50\test1\home\,0.023453239351511,38.0,0.023453239351511,38.0,0.0235595703125,0.0235595703125,0.0235595703125,0.0235595703125,
10.209.111.243,sharefortest,\\10.209.111.243\sharefortest\,0.01,2,0.01,1,,,,0.14148632179945708,7420.0,0.11799252707511186,107.0,0.01,0.01,0.11805606842041015,0.14504219055175782,
10.209.111.243,test,\\10.209.111.243\test\,0.13148632179945707,7418,0.10799252707511187,106,,,,0.14148632179945708,7420.0,0.11799252707511186,107.0,0.10805606842041016,0.1350421905517578,0.11805606842041015,0.14504219055175782,

File Group Details_20
File Group Name,File Count,Total Size (GB),
Backup and Archive Files,1,0.01,
Email Files,1,0.01,
All Files,18,0.01,

This may be much too granular, and as with most reporting it would require the audience to know, or to script, the previous values so that the comparison can be made. That is easily resolved by keeping multiple copies of the report's output exported to a separate location, using Excel-type macros on the output CSV files, scripting the comparison (see the sketch below), and so on.
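
For instance, a small script run after each scheduled export could diff yesterday's and today's Directory Summary Table and raise an alert when a share grows beyond a threshold. This is a rough sketch, assuming both files are CSVs that contain at least the "Directory" and "Total Size (GB)" columns shown above; the 10 GB threshold and the file names are placeholders for your own environment:

import csv
import sys

GROWTH_THRESHOLD_GB = 10.0   # placeholder threshold from the question (e.g. 10 GB or 5 GB)

def load_sizes(csv_path):
    # returns {directory: total size in GB} from one export
    sizes = {}
    with open(csv_path, newline="") as handle:
        for row in csv.DictReader(handle):
            directory = (row.get("Directory") or "").strip()
            size_text = row.get("Total Size (GB)")
            if directory and size_text:
                try:
                    sizes[directory] = float(size_text)
                except ValueError:
                    pass   # ignore malformed rows
    return sizes

def report_growth(yesterday_csv, today_csv):
    old, new = load_sizes(yesterday_csv), load_sizes(today_csv)
    for directory, size_now in sorted(new.items()):
        growth = size_now - old.get(directory, 0.0)
        if growth >= GROWTH_THRESHOLD_GB:
            print(f"ALERT: {directory} grew {growth:.2f} GB since the previous export")

if __name__ == "__main__":
    # e.g. python compare_growth.py yesterday.csv today.csv
    report_growth(sys.argv[1], sys.argv[2])

Feeding the alert lines into an email or a ticket is left to whatever tooling you already have in place.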

Now that we have investigated some of the canned reports, we can see that for the data we seek, DQL may not offer much advantage beyond scoping down to the share level, and even DQL is not an easy fix for the comparison component.

If we break your logic down into parts: "I need a capacity report" means we can view the templates (caveat: specific to version) and then add in a time scope.

In DQL there are starting points that we refer to as templates. The Data Management templates are probably the best fit for our purposes.

DQL_Template.jpg

This is where we can get an idea or two. While you cannot combine the templates, you can certainly copy the queries out to your own workspace to put together whatever you wish to experiment with. Let us take an example for you to work with, like:
DQL_Template_SUMM.jpg

Here we can see that our output will contain size and will be scoped to a share (change the "share1" in the list to one of your own, as it likely will NOT match your environment), and it will show you the filer, then share, then path in extension-cumulative order. Of course, you would likely wish to change the ordering to size by folder (type is dir) and add a time scope covering your last 24 hours, by also pulling the create and accessed times and constraining them to less than one day before the time the report is scheduled to kick off, so expressed in days or hours rather than an actual date.

Something akin to adding a GET for:

formatdate(created_on,'YYYY-MM-DD HH:mm') AS created_date,
 formatdate(last_accessed,'YYYY-MM-DD HH:mm') AS last_accessed_date,

into the mix, and a condition for, say, less than one day, like:

last_accessed > duration('1 days') AND
created_on < duration('1 days') AND

Unless you were only concerned with newly created files / folders, in which case modify those parameters accordingly.
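
If you route the DQL output to CSV on a schedule, a short post-processing step can then do the per-share roll-up that the query itself does not handle well. Purely as an illustration, the Python sketch below assumes the export carries the created_date / last_accessed_date aliases from above (formatted 'YYYY-MM-DD HH:mm') plus hypothetical "share" and "size" (bytes) columns; rename them to match whatever your query actually returns:

import csv
import sys
from collections import defaultdict
from datetime import datetime, timedelta

def recent_size_by_share(csv_path, hours=24):
    cutoff = datetime.now() - timedelta(hours=hours)
    totals = defaultdict(float)   # share -> bytes created or accessed inside the window
    with open(csv_path, newline="") as handle:
        for row in csv.DictReader(handle):
            touched = False
            for stamp in (row.get("created_date"), row.get("last_accessed_date")):
                try:
                    if stamp and datetime.strptime(stamp, "%Y-%m-%d %H:%M") >= cutoff:
                        touched = True
                except ValueError:
                    pass   # ignore unparseable timestamps
            if touched:
                totals[row.get("share", "unknown")] += float(row.get("size") or 0)
    return totals

if __name__ == "__main__":
    for share, size_bytes in sorted(recent_size_by_share(sys.argv[1]).items()):
        print(f"{share}: {size_bytes / 1024 ** 3:.2f} GB created or touched in the last 24 hours")

Run the day after the report fires, it tells you which shares had the most data created or touched in the previous 24 hours.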

Should you wish to experiment a bit on your own first, or instead of using our consulting services, keep in mind that Data Insight Query Language is a limited tool that allows querying of the internal databases based on static values which are updated regularly.

If you would like to seek our Professional Services assistance, let me know and I'll ensure you make it up the chain; otherwise, happy plugging.


Rod