Backup Exec backup metrics/statistics

Hi all,

Can someone advise where I can download some recent metrics/statistics on the Backup Exec 11 for Windows software?  I want to find out the standard time it takes to back up a certain amount of data, both full and incremental.  That way, I can see how our numbers match up against the benchmark.  If someone knows where I can download this information, please advise.
8 Replies
I don't think that such statistics exist.  The amount of time required to back up a certain amount of data is highly dependent on the particular computing environment.  If you are using a faster server, it backs up faster.  Likewise, a faster tape drive will help things.  Even when the hardware environment is the same, the workload of the server at the time of the backup has an impact on the backup time.  Given the high variability of the bases for such statistics, it would not be meaningful to collect them.  It would be like comparing apples and oranges.
One of our incremental backups last night took 1.5 hours to process 1.7 million files.  It didn't back up anything because nothing new was added.  Now, is 1.5 hours for 1.7 million files a normal statistic?
1.7 million files is a lot of files, and BEWS needs to scan each and every one of them to check whether it has changed.
So theoretically, it could take that long. Factors like when the backup runs (e.g. during application maintenance), whether it runs across a LAN/WAN, the type of HDDs used, etc. need to be factored in too...
I tested a couple of backups, and the results led me to ask about these metrics.  Backup #1 was 68 GB of data (800k files).  A full backup completed in 1.5 hours, and an incremental run (no new data to back up) completed in 10 minutes.  Backup #2 was 128 GB of data (1.7 million files).  A full backup completed in 10 hours, and an incremental run (no new data to back up) completed in 1.5 hours.  The files are all .tiff images.  My question is: why such a disparity in backup time when the data merely doubled?
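For what it's worth, the throughput implied by those runs can be worked out with plain arithmetic on the figures above (nothing Backup Exec specific here, just the posted numbers):

```python
# Throughput implied by the two test backups described above.
# All figures come straight from the post; nothing is measured here.

def rates(data_gb, n_files, hours):
    """Return (GB/hour, files/second) for a backup run."""
    seconds = hours * 3600
    return data_gb / hours, n_files / seconds

# Backup #1: 68 GB, 800k files, full in 1.5 h, incremental scan in 10 min
full1_gbh, full1_fps = rates(68, 800_000, 1.5)
_, incr1_fps = rates(0, 800_000, 10 / 60)

# Backup #2: 128 GB, 1.7M files, full in 10 h, incremental scan in 1.5 h
full2_gbh, full2_fps = rates(128, 1_700_000, 10)
_, incr2_fps = rates(0, 1_700_000, 1.5)

print(f"Full #1: {full1_gbh:.1f} GB/h, {full1_fps:.0f} files/s")
print(f"Full #2: {full2_gbh:.1f} GB/h, {full2_fps:.0f} files/s")
print(f"Incr scan #1: {incr1_fps:.0f} files/s; #2: {incr2_fps:.0f} files/s")
```

Note that even the pure incremental scan rate dropped by roughly 4x between the two jobs, which points at per-file overhead rather than data volume.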
Well, that depends on whether you are using the "archive bit" or "modified time" with the NTFS change journal (preferred).  The latter is much faster because it doesn't have to scan the entire volume at the file level.
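To illustrate the difference with a toy simulation (this is not Backup Exec's actual implementation, just the shape of the cost difference): an archive-bit pass has to examine every file on the volume, while a change-journal read only touches records for files that actually changed.

```python
# Toy model of the two incremental-selection strategies.
# "volume" stands in for all files on disk; "journal" for the NTFS USN
# change journal. Illustrative only -- not real Backup Exec code.

def archive_bit_scan(files):
    """Examine every file; select those with the archive bit set."""
    examined = len(files)                      # cost is O(total files)
    selected = [f for f, archive_bit in files.items() if archive_bit]
    return examined, selected

def journal_read(journal):
    """Read only the change records logged since the last backup."""
    examined = len(journal)                    # cost is O(changed files)
    return examined, list(journal)

volume = {f"file{i}.tiff": False for i in range(100_000)}
volume["file42.tiff"] = True                   # one modified file
journal = ["file42.tiff"]

scan_cost, scan_sel = archive_bit_scan(volume)
jrnl_cost, jrnl_sel = journal_read(journal)
print(scan_cost, jrnl_cost)                    # 100000 vs 1
```

Same one file selected either way, but the archive-bit pass had to touch every file to find it.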
It's not so much just the size of the data, but also the file count.  RAM, disk speeds, etc. all contribute to making backup speeds inconsistent throughout the backup window.

It's a pain really, as you can't really nail down a solid number to predict backup speeds.
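One rough way to think about it is that total time has two components: a data term and a per-file term. A back-of-envelope sketch (the throughput and per-file overhead values below are made-up placeholders, not measured numbers):

```python
# Back-of-envelope model: time = data / throughput + files * per-file overhead.
# throughput_gb_per_h and per_file_s are illustrative assumptions, not benchmarks.

def estimate_hours(data_gb, n_files, throughput_gb_per_h=60.0, per_file_s=0.02):
    data_term = data_gb / throughput_gb_per_h          # streaming the bytes
    file_term = n_files * per_file_s / 3600            # open/attrs/close per file
    return data_term + file_term

# With small files, doubling data AND file count grows the file term's
# share of the total, so the job slows down more than the data alone suggests:
print(f"{estimate_hours(68, 800_000):.1f} h")      # smaller job
print(f"{estimate_hours(128, 1_700_000):.1f} h")   # larger job
```

With millions of small files the per-file term dominates, which is why runtimes don't scale linearly with gigabytes.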
You are certainly right on the file count.  But I didn't expect that by doubling the file count, the full backup time would go up 6-7 times and the incremental 7-8 times.
You could try short stroking your HDDs, provided you can predict your growth accurately so as to not run out of space.  That would keep your files contained on only the outer edge of the platters, giving you maximum disk throughput.

For your incr/diffs, use modified time with the change journal; that should be faster, provided we're talking NTFS.