Forum Discussion
Interested to know how this goes.
I have my CASO at the remote (DR) site across a 10Mbps LES circuit.
I find that optimised duplication often fails and is very slow to complete. In testing, I ran a backup followed by an optimised duplication, then repeated both, and found that the second optimised duplication took just as long as the first despite there being no changed data.
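If the second pass takes as long as the first with no new data, it suggests the job is effectively re-sending the full data set rather than just unique blocks. A rough back-of-the-envelope sketch (my own illustrative numbers and efficiency assumption, not figures from this thread) shows why that hurts so much on a 10 Mbps circuit:

```python
def transfer_hours(data_gb: float, link_mbps: float, efficiency: float = 0.7) -> float:
    """Estimate hours to push data_gb (decimal GB) over a link_mbps link.

    efficiency is an assumed factor for protocol overhead and contention.
    """
    bits = data_gb * 8 * 1000**3            # decimal GB -> bits
    seconds = bits / (link_mbps * 1e6 * efficiency)
    return seconds / 3600

# e.g. 100 GB sent in full over a 10 Mbps LES circuit at ~70% efficiency:
print(round(transfer_hours(100, 10), 1))   # roughly 31.7 hours
```

So a duplicate job that should finish in minutes (sending only changed blocks) can stretch past a backup window if dedupe isn't actually reducing what crosses the wire.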
I'm not convinced that all the issues with dedupe have been fixed in R2 - even with the patches a few weeks ago.
I often get jobs failing with errors like:
Source backup set had completed with following error/exceptions.
V-79-57344-33329 - Library - cleaning media was mounted.
or
V-79-57344-1543 - Backup Exec cannot copy the deduplicated data from the source device to the destination device. The maximum image size at which Backup Exec splits the data stream on the destination device is smaller than the image on the source device.
You can increase the size at which Backup Exec splits the data stream and spans to a new image, and then try the job again. To edit this option, click Devices and then right-click the destination device and select Properties. Then on the properties dialog, click the Advanced tab.
Not wanting to hijack the thread, but I have one of the top Symantec engineers due to look at this and would be happy to raise the speed issue. It's a great opportunity to raise any dedupe issues, so if anyone wants me to ask anything, please let me know.