When a backup job nears 100 GB, my tape drive ejects the tape and asks me to load a different one, even though the tape's capacity is not close to being reached. This started happening out of the blue. I have even gone as far as having the tape drive exchanged and reinstalling Backup Exec.
I am using a Quantum VS160 with Quantum tapes. The tape drive has been cleaned.
The VS160 only has 80 GB native capacity. If you are getting 100 GB on a tape, you are getting about 1.25:1 compression. I usually use 1.3:1 as a rule of thumb on new servers.
Remember that 2:1 compression is a marketing figure, not an engineering one; it is rarely reached and seldom even approached in the real world.
See http://seer.support.veritas.com/docs/199542.htm
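To make the arithmetic explicit, here is a quick sketch (the 80 GB figure is the VS160's published native capacity; the 100 GB is the job size reported above):

```python
NATIVE_CAPACITY_GB = 80      # VS160 native (uncompressed) capacity
observed_on_tape_gb = 100    # job size when the drive asks for a new tape

# Effective compression ratio actually achieved on this data
ratio = observed_on_tape_gb / NATIVE_CAPACITY_GB
print(f"effective compression: {ratio:.2f}:1")   # 1.25:1

# What the often-quoted 2:1 marketing figure would imply per tape
print(f"2:1 would imply {NATIVE_CAPACITY_GB * 2} GB per tape")
```

So a tape change at ~100 GB is entirely consistent with the drive behaving normally on data that only compresses about 1.25:1.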
Nope. The compression ratio is determined by the compression algorithm and the data being compressed.
How often do you clean your drive(s)? Clear the stats for a tape before use and check after the job. How many soft and hard write errors are you getting?
I have cleaned my drive several times since this problem started. If I understood the link in your previous post correctly, I should be using hardware compression if available, otherwise software, correct?
I have also used the tape diagnostic tool provided by Quantum, and it reported that everything looks good with regard to soft/hard errors.
You can use either SW or HW compression. The results won't be exactly the same, but they will be very close, and with today's computers you will see little if any reduction in throughput using SW.
You are not backing up more non-compressible files now than you were before, are you?
I am assuming the number of files that cannot be compressed has increased. I excluded one directory that contained most of the JPEG and PDF files, and the backup was successful. I'm assuming this was the problem.
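That conclusion matches how compression behaves: JPEG and PDF payloads are already compressed, so a second pass gains nothing and can even add a little overhead. A small illustrative sketch (using zlib and random bytes as a stand-in for already-compressed file contents):

```python
import os
import zlib

# Repetitive text compresses very well
text_like = b"backup exec tape drive quantum vs160 " * 5000

# Random bytes behave like already-compressed JPEG/PDF data:
# there is no redundancy left for the compressor to remove
compressed_like = os.urandom(len(text_like))

def ratio(data: bytes) -> float:
    """Return the achieved compression ratio (original : compressed)."""
    return len(data) / len(zlib.compress(data))

print(f"text-like data:     {ratio(text_like):.2f}:1")   # well above 2:1
print(f"JPEG/PDF-like data: {ratio(compressed_like):.2f}:1")  # roughly 1:1 or worse
```

So a backup set that is mostly JPEG/PDF will land near the 80 GB native capacity rather than the compressed figure, which explains the early tape-change prompts.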