Millions of Files - Slow Performance
Hi there,
We are currently running NetBackup 7.6.0.1 with a single combined master/media server (only 30 clients for now).
File servers with 1+ million files take forever to back up, with throughput of only 5 - 9 MB/sec. The backups go directly to basic disk, and even running just one job shows the same slow throughput. One client has been backing up since yesterday and has only gotten through 3 million files; the file server holds 6 million in total. Other backups (such as Oracle and smaller servers) appear fine and can achieve throughput of 20 MB/sec or more.
I can log into the client and copy a 500 MB file directly to the NetBackup disk staging unit (E:\) at 25 MB/sec. My question is: why is throughput so slow when backing up large file servers, and is there any way to increase it? The file server has an OS drive (C:\) and a data drive (D:\). Multistreaming doesn't seem to improve performance in this case, so it's turned off.
My settings in the two buffer touch files are as follows:
NUMBER_DATA_BUFFERS_DISK = 64
SIZE_DATA_BUFFERS_DISK = 1048576
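For reference, these tuning values are read from one-line touch files in the NetBackup config directory (/usr/openv/netbackup/db/config on a Unix media server; <install_path>\NetBackup\db\config on Windows). A minimal sketch of creating them, using a local stand-in directory (./nb_config_demo, an assumption here) so the snippet can run anywhere:

```shell
# Buffer tuning values are plain one-line "touch files" read by bptm.
# Real path on a Unix media server: /usr/openv/netbackup/db/config
# ./nb_config_demo is a local stand-in directory for illustration only.
CFG_DIR=./nb_config_demo
mkdir -p "$CFG_DIR"

# Note: the correct file name is NUMBER_DATA_BUFFERS_DISK (no "S" on NUMBER);
# a misnamed file is silently ignored and the defaults are used instead.
echo 64      > "$CFG_DIR/NUMBER_DATA_BUFFERS_DISK"
echo 1048576 > "$CFG_DIR/SIZE_DATA_BUFFERS_DISK"

cat "$CFG_DIR/NUMBER_DATA_BUFFERS_DISK" "$CFG_DIR/SIZE_DATA_BUFFERS_DISK"
```

Double-check the exact file names on the media server; if the NUMBER file is actually named with the extra "S", it will have no effect.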
I'm considering NetBackup Snapshot Client, but I need to research it and determine whether a separate license is required.
Any advice or assistance is appreciated.
EDIT: I forgot to mention that differential backups are fine and don't take long, since I'm using journaling. This issue only affects the weekly full backups.
This is a known dilemma for systems with many small files. The time spent opening, reading, and closing each file is small, but it adds up when repeated millions of times.
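A quick back-of-envelope calculation makes this concrete. The 5 ms per-file cost below is an assumed figure for illustration, not a measurement, but it shows why the fixed per-file overhead swamps the job regardless of raw disk throughput:

```shell
# Why small files dominate: fixed per-file cost x file count.
# Assumed (hypothetical) numbers: 6,000,000 files, 5 ms open/read/close each.
FILES=6000000
OVERHEAD_MS=5

# Total metadata time in seconds, then whole hours (integer arithmetic).
SECONDS_TOTAL=$(( FILES * OVERHEAD_MS / 1000 ))
HOURS=$(( SECONDS_TOTAL / 3600 ))

echo "Metadata overhead alone: ${SECONDS_TOTAL}s (~${HOURS}h)"
# prints "Metadata overhead alone: 30000s (~8h)"
```

That 8+ hours is spent before a single megabyte of payload counts toward throughput, which is why the same disks can hit 25 MB/sec on one big file and crawl on millions of small ones.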
The best weapon is NetBackup deduplication (MSDP) combined with Accelerator.
As far as I recall, the 7.6 license should include MSDP, but I may be wrong.
Abdul's blog posts about Accelerator:
https://www-secure.symantec.com/connect/blogs/frequently-asked-questions-netbackup-accelerator
https://www-secure.symantec.com/connect/blogs/frequently-asked-questions-netbackup-accelerator-part-ii