
Max catalog size

Skywalker1957
Level 2

Reference: https://www-secure.symantec.com/connect/forums/netbackup-scalibility-maximum-number-storage-capacity...

That's a really good question, actually. I have a master running NBU 7.0.1 on Solaris 10; the catalog is around 2.1 TB and growing about 1% weekly. I have a lot of NDMP jobs that back up a ton of tiny files, so the metadata entries in the catalog are many. I only keep a 90-day retention on that data, too.

If anyone knows of a high water mark, I'd appreciate it. I'm looking into standing up another master and media server exclusively for my NDMP backups, going to a Quantum robot with LTO5 drives. The NDMP data accounts for about 75% of my catalog size. My instinct is to break that out from the standard backups.

Any thoughts?

Thanks!

Luke

6 REPLIES

teiva-boy
Level 6

I was always told 1 TB is the recommended maximum. Once you get into the 750 GB range, you should split the domain and stand up a new master. This comes from one of Symantec's senior architects, now a PM. I think he had a hand in the NBU tuning guides too; you may want to look into those as well for some answers.

Boomchi
Level 4
Partner Accredited

Hi,

First of all, I would recommend reading the Planning and Performance Tuning Guide, pages 30 and 31.

http://www.symantec.com/business/support/index?page=content&id=DOC4483

Also, please read pages 764-769 for the different catalog sizing and resizing options:

http://www.symantec.com/business/support/index?page=content&id=DOC3650

- Kshitij

David_McMullin
Level 6

My catalog is 200 GB and takes 3 hours to back up. How long does your catalog backup take for 2 TB?

The document referenced above suggests archiving at 750GB - "Online catalog should not contain more than 750 GB of data"
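For anyone wanting to check where they stand against that 750 GB guideline, the image catalog normally lives under the installation's db/images directory on a UNIX master, so a quick size check is just a disk-usage query. A minimal sketch, assuming the default install path (adjust if your catalog has been relocated):

```shell
# Default image-catalog location on a Solaris/UNIX master.
# The images directory usually accounts for the bulk of the catalog size.
du -sh /usr/openv/netbackup/db/images
```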

Skywalker1957
Level 2

It takes about 53 hours to do a catalog backup. I know, far too long. That's why I want to break this beast up into smaller chunks.

Boomchi
Level 4
Partner Accredited

Well, the best way in your case is to archive, as 2 TB is way too high. Also, if you are thinking of splitting the catalog across two master servers, you may need help from Symantec consulting partners, which will be quite expensive.

Also remember, with a 2 TB catalog you run the risk of being unable to browse the catalog for restores.

Regards,

Boomchi

Mark_Solutions
Level 6
Partner Accredited Certified

As Boomchi says, your catalog is way too big, and if the system falls over or any corruption creeps in, you would be in real trouble.

Archiving would be the immediate way to deal with it.
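For reference, catalog archiving is driven from the command line by chaining three commands. A rough sketch, with a placeholder cutoff date; this assumes you have already created the special catalog-archive policy (conventionally named "catarc") that bpcatarc uses to back up the .f files before they are removed:

```shell
# Preview which online image .f files are older than the cutoff
# and would therefore be candidates for archiving.
bpcatlist -online -before Jan 1 2011

# Archive those .f files via the catalog-archive policy, then remove
# from the online catalog only the entries bpcatarc archived successfully.
bpcatlist -online -before Jan 1 2011 | bpcatarc | bpcatrm
```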

I guess the question is also what retention you have on your backups, and whether it is actually required. As backups expire, the catalog space is freed up again.

If you do decide to split, it would be a major job due to the catalog's size: all inconsistencies would need to be removed first, and just running the consistency check would take a vast amount of time.

We could do the split for you (we are approved), but if you are thinking of going that way then check back with me first, as catalog archiving may have an effect on this and I would need to check that out.

Another solution would be to run an images-on-media report so that you know everything that is on every tape, and then expire everything going back to a certain point in time. This is a rough way to treat things, but you do appear to be in quite a situation! You would, of course, then need to import anything required prior to that point in time in order to do restores.
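Both steps above are CLI operations. A minimal sketch; the client name and backup ID below are placeholders, and expiry is destructive (the image leaves the catalog and must be imported again for restores):

```shell
# Report every image on every piece of media, for review before expiring.
bpimmedia -U > /tmp/images_on_media.txt

# Expire a single image immediately by its backup ID
# (repeat, or script over a bpimagelist listing, for a range of images).
bpexpdate -d 0 -backupid somehost_1234567890
```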

Perhaps you just need to review your retention periods?
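Reviewing retention is also quick from the command line; each schedule's retention level appears in the policy listing, and (on recent versions) bpretlevel shows what period each level maps to:

```shell
# List every policy with its schedules and their retention levels.
bppllist -allpolicies -U

# Show the retention period each level corresponds to, if available
# on your NBU version (e.g. level 1 = 2 weeks by default).
bpretlevel
```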

Get back to me if I can help further.