Backup Exec backup metrics/statistics

Gary_Li
Level 3
Hi all,

Can someone advise where I can download some recent metrics/statistics on the Backup Exec 11 for Windows software?  I want to find out the standard time it takes to back up a given amount of data, both full and incremental, so I can see how our numbers match up against the benchmark.  If someone knows where I can download this information, please advise.
8 REPLIES

pkh
Moderator
   VIP    Certified
I don't think such statistics exist.  The amount of time required to back up a given amount of data is highly dependent on the particular computing environment.  If you are using a faster server, it backs up faster.  Likewise, a faster tape drive will help.  Even when the hardware environment is the same, the workload on the server at the time of the backup has an impact on the backup time.  Given the high variability underlying such statistics, collecting them would not be meaningful; it would be like comparing apples and oranges.

Gary_Li
Level 3
One of our incremental backups last night took 1.5 hours to process 1.7 million files.  It didn't back up anything, because nothing new had been added.  Now, is 1.5 hours for 1.7 million files a normal figure?

CraigV
Moderator
Partner    VIP    Accredited
1.7 million files is a lot of files, and depending on the size of each, BEWS needs to scan every single file to check whether it has changed.
So theoretically, it could take that long. Factors like when the backup runs (e.g. during application maintenance), whether it runs across a LAN/WAN, the type of HDDs used, etc. need to be factored in too...
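
For a rough sense of scale (a back-of-envelope calculation from the figures above, not a Backup Exec benchmark), 1.7 million files in 1.5 hours works out to roughly 315 files checked per second:

```python
# Back-of-envelope scan rate for the incremental run described above.
files = 1_700_000
hours = 1.5

print(f"{files / (hours * 3600):.0f} files/s")  # ~315 files/s just to check change state
```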

Gary_Li
Level 3
I tested a couple of backups, and the results led me to ask about these metrics.  Backup #1 covered 68 GB of data (800 K files): a full backup completed in 1.5 hours, and an incremental run (no new data to back up) completed in 10 minutes.  Backup #2 covered 128 GB of data (1.7 million files): a full backup completed in 10 hours, and an incremental run (no new data to back up) completed in 1.5 hours.  The files are all .tiff images.  My question is: why such disparity in backup time when the data merely doubled?
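
Putting rough numbers on those two runs (a quick calculation from the figures reported above, using 1 GB = 1024 MB):

```python
# Effective rates for the two full backups described above.
runs = [
    ("Backup #1", 68 * 1024, 800_000, 1.5),     # MB, files, hours
    ("Backup #2", 128 * 1024, 1_700_000, 10.0),
]
for name, mb, files, hours in runs:
    secs = hours * 3600
    print(f"{name}: {mb / secs:.1f} MB/s, {files / secs:.0f} files/s")
# Backup #1: 12.9 MB/s, 148 files/s
# Backup #2: 3.6 MB/s, 47 files/s
```

Both the byte rate and the file rate drop on the larger job, which points at per-file overhead rather than raw data volume.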

teiva-boy
Level 6
Well, that depends on whether you are using the "archive bit" or "modified time" with the NTFS change journal (preferred).  One is much faster than the other because it avoids scanning the entire volume at the file level.
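
As a rough illustration of the two approaches (a minimal sketch, not Backup Exec's actual implementation; the cutoff timestamp is a made-up placeholder):

```python
# Sketch: two ways to decide whether a file belongs in an incremental backup.
import os
import stat

LAST_BACKUP = 1_700_000_000  # hypothetical time of the previous backup (epoch seconds)

def needs_backup_archive_bit(path: str) -> bool:
    # Windows-only: the archive attribute is set when a file is modified,
    # and a full backup typically clears it again.
    return bool(os.stat(path).st_file_attributes & stat.FILE_ATTRIBUTE_ARCHIVE)

def needs_backup_mtime(path: str) -> bool:
    # Compare the last-modified time against the previous backup time.
    return os.stat(path).st_mtime > LAST_BACKUP
```

Either check still stats every file if you walk the volume; the advantage of the NTFS change journal is that the filesystem records changes as they happen, so the job reads a compact journal instead of touching all 1.7 million files.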

teiva-boy
Level 6
It's not so much the size of the data as the file count.  RAM, disk speeds, etc. all contribute to making backup speeds inconsistent throughout the backup window.

It's a pain, really, as you can't nail down a solid number to predict backup speeds.
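
One way to picture it is a simple two-term cost model (illustrative only, with assumed rates rather than measured Backup Exec figures): total time is roughly bytes divided by streaming throughput, plus file count times a fixed per-file overhead.

```python
# Illustrative model: t = bytes / throughput + files * per-file overhead.
# The rates below are assumptions for the example, not measured values.
def estimate_seconds(total_mb: float, file_count: int,
                     mb_per_sec: float = 30.0,       # assumed streaming rate
                     per_file_sec: float = 0.004) -> float:  # assumed per-file cost
    return total_mb / mb_per_sec + file_count * per_file_sec

# Same 68 GB, very different file counts: the per-file term dominates.
print(estimate_seconds(68 * 1024, 10_000))   # ~2361 s with a few big files
print(estimate_seconds(68 * 1024, 800_000))  # ~5521 s with many small files
```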

Gary_Li
Level 3
You are certainly right about the file count.  But I didn't expect that doubling the file count would make the full backup take 6-7 times longer and the incremental 7-8 times longer.

teiva-boy
Level 6
You could try short-stroking your HDDs, provided you can predict your growth accurately so as not to run out of space.  That would keep your files contained on the outer edges of the platters, giving you maximum disk throughput.

For your incrementals/differentials, use modified time with the change journal; that should be faster, provided we're talking about NTFS.
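
A minimal sketch of modified-time selection (the root path and cutoff are hypothetical; a real change-journal-based job reads the NTFS USN journal through the Windows API instead of walking every directory like this):

```python
# Sketch: select files for an incremental backup by modified time.
import os

ROOT = r"D:\images"            # hypothetical volume to protect
LAST_BACKUP = 1_700_000_000.0  # hypothetical previous-backup timestamp

def changed_files(root: str, since: float):
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            if os.path.getmtime(path) > since:
                yield path

for path in changed_files(ROOT, LAST_BACKUP):
    print(path)
```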