I created a backup-to-disk folder that points to "E:\", which is a dedicated 1.3 TB iSCSI array (one of three). Media is set to no overwrite for 2 weeks and an infinite append period. I created a job policy consisting of four backup jobs and started all four at the same time. Two were 14 GB and finished in about an hour; the other two were only 300 GB complete almost 20 hours later??
What's worse, when I woke up and logged on to check them, the backup server was down to 150 MB/min, and bengine.exe and the System process together were taking 100% of my CPU.
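To put numbers on the slowdown, here is a rough throughput sketch using only the figures from this post (14 GB in ~1 hour, 300 GB in ~20 hours, 150 MB/min now, 400 GB cap); the "remaining" estimate assumes the slow jobs run all the way to the 400 GB maximum file size, which may not be the case:

```python
# Rough throughput math from the figures quoted above.
GB = 1024  # MB per GB

fast_rate = 14 * GB / 60          # 14 GB in ~60 min  -> ~239 MB/min
slow_rate = 300 * GB / (20 * 60)  # 300 GB in ~20 hrs -> ~256 MB/min average
current_rate = 150                # MB/min observed after the slowdown

# Assumption: the slow jobs continue to the 400 GB maximum file size.
remaining_gb = 400 - 300
eta_hours = remaining_gb * GB / current_rate / 60

print(f"fast jobs averaged ~{fast_rate:.0f} MB/min")
print(f"slow jobs averaged ~{slow_rate:.0f} MB/min")
print(f"~{eta_hours:.1f} more hours left at {current_rate} MB/min")
```

So the slow jobs actually averaged a reasonable rate over the full 20 hours; it is the current 150 MB/min that would add roughly another 11 hours for the last 100 GB.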
I set the maximum backup file size to 400 GB. I had been collecting performance stats from the box during the backups and noticed no changes in the counters. The only thing I can think of is that 400 GB is too large for a backup-to-disk file, but I cannot find any documentation or guidance on sizing backup-to-disk files.
Also, the resulting files were two 14 GB files and two 300+ GB files. I assume that even though the 14 GB files were already created, they would have been appended to on the next job?
Sadly, I called tech support, which turned out to be pre-sales tech support, and no one would help me without a support contract.
Thanks!