Throughput Benchmarking
We are planning some upgrades in my organization: swapping out our LTO 4 drives for LTO 6s, and moving off an older VNX device that hosts our disk pools onto something newer and faster. Before that happens I want to do some benchmarking. Is there something I can run at the command line that does a test backup/restore of files of differing sizes? Basically I want to back up and restore a 1 TB, 5 TB, and 10 TB file, both to disk and to tape, and capture our current throughput as the environment sits right now, so that when we do demos with vendors in the coming weeks I know what numbers I am looking to beat.
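One relatively simple approach, assuming you have (or can create) a policy with a user backup schedule, is to stage a large test file with dd and then time user-directed bpbackup/bprestore jobs against it. The sketch below is only an outline: the policy and schedule names, the staging path, and the sizes are placeholders, and the exact options are worth confirming against the bpbackup and bprestore man pages for your NetBackup version.

```sh
#!/bin/sh
# Rough single-stream benchmark sketch. The policy/schedule names, staging
# path, and sizes are placeholders (assumptions, not your real config) --
# confirm the options against the bpbackup/bprestore man pages.

SIZE_GB=1024                                   # 1 TB; rerun with 5120 and 10240
TESTFILE=/bench/testfile_${SIZE_GB}G

# Stage the test file. /dev/urandom avoids data that compresses or dedupes
# unrealistically well; /dev/zero is much faster to generate but will skew
# results if any compression or dedup sits in the data path.
dd if=/dev/urandom of="$TESTFILE" bs=1M count=$((SIZE_GB * 1024))

# User-directed backup and restore, timed. -w waits for the job to finish,
# so the elapsed time divided into the file size gives effective MB/s.
time /usr/openv/netbackup/bin/bpbackup -w -p BENCH_POLICY -s USER_SCHED "$TESTFILE"
time /usr/openv/netbackup/bin/bprestore -w "$TESTFILE"
```

Running each size once against a disk-pool-backed storage unit and once against a tape storage unit (by pointing the test policy at each in turn) gives you a per-target baseline to compare the new kit against.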
Also, if anyone has experience with the Oracle ZFS Storage ZS4-4 appliance, I would be interested in hearing about that as well.
GEN_DATA Concept:
A need was identified to provide a means of generating test data to process through NetBackup. This data should be:
- Repeatable and controllable.
- As 'light-weight' as possible during generation.
- Indistinguishable from regular data, to allow for further processing, such as duplications, verifies, restores, etc.
Documentation: How to use the GEN_DATA file list directives with NetBackup for UNIX/Linux Clients for Performance Tuning
https://www.veritas.com/support/en_US/article.000091135
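Alongside GEN_DATA (which takes generating the data stream out of the client's file system), it can also be useful to baseline the storage itself with NetBackup out of the path entirely. A plain dd run against the VNX-backed disk pool mount and against a tape drive's device node shows what the hardware sustains on its own. In the sketch below the mount point and device node are assumptions, so substitute your own, and make sure the drive is downed in NetBackup and loaded with a scratch tape before writing to it directly.

```sh
#!/bin/sh
# Raw-throughput baseline, independent of NetBackup. The mount point and
# tape device node are assumptions -- substitute the real paths on your
# media server. GNU dd reports MB/s when each run completes.

BLOCK=256k
COUNT=40960                         # ~10 GB at 256 KB blocks; scale up as needed

# Disk pool write then read; oflag=direct/iflag=direct bypass the page cache.
dd if=/dev/zero of=/diskpool/ddtest bs=$BLOCK count=$COUNT oflag=direct
dd if=/diskpool/ddtest of=/dev/null bs=$BLOCK iflag=direct

# Tape write to the non-rewinding device on a scratch tape (down the drive in
# NetBackup first so the two don't contend for it). Note that zeroes compress
# very well in the drive's hardware compression, so treat this as a best case.
mt -f /dev/nst0 rewind
dd if=/dev/zero of=/dev/nst0 bs=$BLOCK count=$COUNT
mt -f /dev/nst0 rewind
```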