
Question on 'splitting' up Journal Store

GertjanA
Moderator
Partner    VIP    Accredited Certified

Hello all,

 

Think with me..

We currently have journal partitions that are 1 TB in size, with expiry set to 6 months. Average usage per partition is 85–90%, which is obviously a problem for backing them up. Talking with the backup team, they advised they could back these up more easily (and more quickly) if I used partitions of 250–300 GB, so I'm thinking of reconfiguring so that each journal partition is 250 GB. Disk-wise, storage will provide/present 3 or 4 disks to the journal archiving servers, so I can easily close a partition and open a new one on a new disk. Obviously, the expiry scheme stays the same, and the indexes are no problem.

 

I need to figure out how long it takes to fill 250 GB. Does anyone know of a SQL query I can run for that? Any other ideas for keeping this at a manageable size? I know EV8 has automatic rollover based on size/date, but we're not upgrading soon.

 

Thanks for thinking with me.

Gertjan

 

Regards. Gertjan
1 ACCEPTED SOLUTION


TonySterling
Moderator
Partner    VIP    Accredited Certified
This might help you out. Just put in your date range and it will give you the daily rate with the total size:

-- This one gives a daily rate
-- Runs against the Vault Store database
select "Archived Date" = left(convert(varchar, archiveddate, 20), 10),
       "Daily Rate"    = count(*),
       "Av Size"       = sum(itemsize) / count(*),
       "Total Size"    = sum(itemsize)
from saveset
where archiveddate between '2004-05-01' and '2009-05-31'
group by left(convert(varchar, archiveddate, 20), 10)
order by "Archived Date" desc
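
If you want to turn that daily rate into a rough time-to-fill estimate for a 250 GB partition, a sketch along the same lines is below (not from the thread). It assumes itemsize in the saveset table is recorded in KB and that the last 30 days are representative of future ingest; verify the unit and adjust the window for your environment, and note that @TargetKB is just an illustrative variable name.

-- Rough projection, assuming itemsize is stored in KB (verify in your environment):
-- average the daily ingest over the last 30 days and divide the 250 GB target by it.
declare @TargetKB bigint
set @TargetKB = cast(250 as bigint) * 1024 * 1024   -- 250 GB expressed in KB

select "Avg Daily KB"   = sum(cast(itemsize as bigint))
                          / count(distinct left(convert(varchar, archiveddate, 20), 10)),
       "Days To 250 GB" = @TargetKB
                          / (sum(cast(itemsize as bigint))
                             / count(distinct left(convert(varchar, archiveddate, 20), 10)))
from saveset
where archiveddate >= dateadd(day, -30, getdate())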


4 REPLIES


MirrorSphere
Level 5
Partner Accredited

You could also look at your backups for the last month, see the relative growth each day, and extrapolate from those figures.

 

I think the script gives the size of each item before archiving, so it is not a true reflection of the size of the actual vault store partition. You might therefore conclude that you will hit 250 GB more quickly than you really will.

That being said, I still like the script idea, as we are not talking precise facts and figures here!

TonySterling
Moderator
Partner    VIP    Accredited Certified
That is right, the item size is the original size before compression. Sorry, I forgot to mention that earlier. :)
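
One rough way to account for that, sketched below rather than taken from the thread: derive a storage ratio by comparing the script's total for a date range covered by an already-closed partition against that partition's actual size on disk, then scale the daily totals by it. The 0.6 used here is purely a placeholder, not a measured value, and itemsize is again assumed to be in KB.

-- Scale the raw daily totals by an observed storage ratio to approximate on-disk growth.
-- Set @StorageRatio to (actual closed-partition size on disk / sum of itemsize for the
-- same date range); 0.6 is just a placeholder.
declare @StorageRatio float
set @StorageRatio = 0.6

select "Archived Date"  = left(convert(varchar, archiveddate, 20), 10),
       "Est. Stored KB" = sum(cast(itemsize as bigint)) * @StorageRatio
from saveset
where archiveddate >= dateadd(day, -30, getdate())
group by left(convert(varchar, archiveddate, 20), 10)
order by "Archived Date" desc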

GertjanA
Moderator
Partner    VIP    Accredited Certified

Ahh, I was not clear again: it is not being backed up, so I cannot check that, sorry. But the script will do! I'm trying to get an estimate, and using the script I can at least roughly calculate daily/weekly/monthly additions.
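
For the weekly/monthly additions, one option (a sketch reusing the same saveset columns as the daily-rate query, with illustrative aliases) is to group on the year-month prefix of the archived date:

-- Monthly rollup: group on the 'yyyy-mm' prefix of the archived date.
-- Runs against the Vault Store database, like the daily-rate query above.
select "Archived Month" = left(convert(varchar, archiveddate, 20), 7),
       "Items"          = count(*),
       "Total Size"     = sum(cast(itemsize as bigint))
from saveset
group by left(convert(varchar, archiveddate, 20), 7)
order by "Archived Month" desc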

 

Thanks guys.

Regards. Gertjan