Forum Discussion

adivya1
Level 3
10 years ago

Regarding Netbackup Readiness

Hi All,

 

Is there any way we can check what the preferred bottlenecks are in an environment?

Is there any kind of script that can give you the desired output from the current NetBackup configuration?

Also, while doing hardware sizing for a proposed NetBackup solution, is there any tool available we can use?

I need your support on this

Regards

Adivya Singh

 

  • You will never see a script telling you that you can do X megabytes per second. Performance does not work that way. More likely there is a sweet spot where the most backup throughput can be produced.

    It's like a car going up a mountainside while slowly adding weight. At some point the optimal ratio between horsepower and weight has been reached; you can add more weight, but the car will slow down, and at some point it may even stop completely.

    So it's a question of finding the sweet spot - and for backup that's a tricky one, because you have a lot of variables, e.g. disk, network, file sizes, operating system, storage medium (tape, disk, OST, etc.).

    If you happen to have a Linux/UNIX media server, you can use the GEN_DATA file directive to determine current performance (see the example below).

    http://www.symantec.com/docs/TECH75213

    Otherwise, tools like iperf (network bandwidth) and IOmeter (disk speed) are good choices as well; see the sketch below.
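
    For reference, a GEN_DATA test is set up in the policy's Backup Selections list on a UNIX/Linux client. The data is generated in memory, so client disk speed is taken out of the equation and you see what the network, media server and storage can actually sustain. The directive names and values below are quoted from memory; check TECH75213 for the exact names, defaults and semantics before using them:

        GEN_DATA
        GEN_KBSIZE=1048576
        GEN_MAXFILES=1024

    Run a full backup of that policy and read the KB/sec figure from the job details in the Activity Monitor.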

     
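    As a rough sketch of the raw-component tests above (the hostname is a placeholder, and the dd line is only a quick Linux substitute for cases where IOmeter is not practical):

        # on the media server
        iperf -s

        # on the client, 60 second test against the media server
        iperf -c mediaserver01 -t 60

        # crude sequential disk read test on Linux (point it at a large existing file)
        dd if=/backup_data/largefile of=/dev/null bs=1M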

  • I have never heard of a 'preferred bottleneck'. You want to identify and eliminate bottlenecks, as these are what cause slow backups. This document will help with identifying bottlenecks, with performance tuning and with planning a new environment: https://www-secure.symantec.com/connect/forums/updated-netbackup-backup-planning-and-performance-tuning-guide-release-75-and-release-76 To get an overview of an existing installation and configuration, use the nbsu script. It creates a set of output files containing the OS and NBU configuration (see the example below).
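
    To give an idea of how nbsu is typically run (the path below is the default UNIX/Linux install location; verify it for your platform and release):

        # run as root on the master or media server
        /usr/openv/netbackup/bin/support/nbsu

    With no options it collects a default set of OS and NetBackup configuration output files and reports where it wrote them when it finishes.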