Forum Discussion

Toddman214
Level 6
10 years ago

Dealing with catalog growth

Windows 2008r2 master and media servers running 7.5.0.4

Hello all,

I read some other discussions about dealing with large catalogs, but they were older, and the final recommendation was to expand the volume that contains it. My issue isn't so much that: my catalog has reached 1.2TB in size, and the last full backup took over 30 hours to complete. That backup didn't run particularly fast (11MB/s or so), but that's actually fairly quick for us; generally it's been taking closer to 40 hours. I run a couple of incremental backups against it per day as well.

Aside from compression, archiving, etc., are there other options you all use for backing up larger catalogs? I'd certainly prefer not to set up a second master to break up the catalog (if that's even possible), but I'll need to do something soon. I appreciate any and all suggestions.

Thank you,

Todd

7 Replies

  • That's very slow... you'll need to describe your environment, Todd: the infrastructure, what lies where, what lies in between, server details, network details, and the disk layout of where the catalog lives.

    It goes without saying that the more direct the path from the server to the tape (you are using tape?), the faster things get. I get 45MB/s, so 4x what you are seeing, with some unspectacular kit.

    Your statement about that being fairly quick suggests an underlying issue: if you're using tape, have you updated the default blocking factor? It will make a dramatic difference.

    Jim. 
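    To illustrate the blocking-factor suggestion above: on a NetBackup media server, the tape block size and buffer count come from touch files in the db/config directory. A minimal sketch, with assumptions: CONFIG_DIR is a stand-in for the real install path (on Windows, typically under the NetBackup install directory), and 262144/64 are common tuning starting points, not official recommendations.

```shell
# Sketch: create the NetBackup data-buffer touch files (SIZE_DATA_BUFFERS /
# NUMBER_DATA_BUFFERS) on a media server. CONFIG_DIR is a placeholder for
# the real <install_path>\NetBackup\db\config directory.
CONFIG_DIR="${CONFIG_DIR:-./db/config}"
mkdir -p "$CONFIG_DIR"
echo 262144 > "$CONFIG_DIR/SIZE_DATA_BUFFERS"    # 256 KB tape block size (assumed starting point)
echo 64     > "$CONFIG_DIR/NUMBER_DATA_BUFFERS"  # number of in-memory data buffers (assumed)
```

    The values are read when a job starts, so run a test backup afterwards and compare throughput; the job's bptm log should show which buffer values were actually used.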

  • Look at the retention periods. Are you retaining backups for longer than you should? Ask the owners of the data! Ensure you are also not backing up junk data (OS files, core dumps) when you don't need to.

    You could also back up to MSDP and take advantage of deduplication, which should make your backups faster.
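    One way to act on the retention review suggested above is to dump image metadata with bpimagelist on the master and count images per retention level. A hedged sketch: the IMAGE-line field position ($3) and the sample lines below are assumptions standing in for real `bpimagelist -l` output so the pipeline runs stand-alone; verify the field layout against the documentation for your release.

```shell
# On a real master you would capture the image list with something like:
#   bpimagelist -l -d 01/01/2000 00:00:00 > images.txt
# Here, fabricated sample lines stand in for that output; the retention
# level is assumed to be field 3 of each IMAGE line for this sketch.
printf 'IMAGE clientA 9\nIMAGE clientB 9\nIMAGE clientA 3\n' > images.txt

# Count images per retention level.
awk '/^IMAGE/ { count[$3]++ }
     END { for (r in count) print "retention level " r ": " count[r] " images" }' images.txt
```

    Levels holding unexpectedly many long-retention images are the first place to ask the data owners whether that retention is really required.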

  • 1.2TB is pretty big.

    I would deal with retention periods first: make sure all images have the required retention periods and that nothing is kept longer than needed. You can also tune dev and test servers to use shorter retentions.

    Once retention tuning is done, if that still doesn't help, I would prefer to archive the catalog entries for the older, longer-retention images, given the size involved.

    Keeping 1.2TB on a live system is not a good idea.
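    On the catalog-archiving idea above: NetBackup provides a command pipeline for this (bpcatlist | bpcatarc | bpcatrm) that backs up and then removes old image .f files from the live catalog, with bpcatres to bring them back when a restore needs them. The cutoff date and exact flags below are illustrative, so this sketch only echoes the pipeline rather than running it; check the commands reference for your release first.

```shell
# Dry-run sketch of NetBackup catalog archiving (do NOT run blindly):
#   bpcatlist selects image .f files matching criteria (here, older than a date),
#   bpcatarc backs them up, and bpcatrm removes the archived .f files from disk;
#   bpcatlist ... | bpcatres restores them later if needed.
# The cutoff date below is illustrative only.
CMD='bpcatlist -before Jan 1 2014 | bpcatarc | bpcatrm'
echo "would run on the master: $CMD"
```

    The image metadata stays searchable after archiving; only the file-level detail has to be restored (via bpcatres) before a file restore from an archived image.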

  • We are addressing two issues here: mine is the data throughput; Ram and Rev are addressing the volume of data. Sort both out and you will be very pleased.

  • With a Windows server and a catalog that big, you should also look at running a disk defrag. In our environment (at the time) we had a 35GB catalog; just by defragging the disk, we went from 80 minutes to 45 minutes.
  • No reply in over a month and no response to PMs....

    1.2 TB is way more than the recommended maximum catalog size of 1TB.

    Time to set up another master server, or else have a good look at retention levels.
    See:

    NetBackup 7.6 Best Practices: Optimizing Performance

    Updated NetBackup Backup Planning and Performance Tuning Guide for Release 7.5 and Release 7.6

    Upgrading to 7.6 may also help as one of the big improvements in 7.6 is Catalog Backup performance.

  • Sorry, I saw the PM but wasn't sure how to respond, and it fell off my radar. I have been given and have researched various options; I just need to determine which will be best for us, whether it's compression, retention, a second master, etc. Right now, I'm leaning toward a second master.