
Small file backup on Sun Cluster

mahesh_Wijenaya
Level 3
Partner Accredited Certified
I have been trying to back up a file system of about 200 GB containing roughly 25 million files. The system is a cluster node running Solaris Cluster 3.2. The backup job starts, runs for about a day, and then hangs.

Can someone help me with this issue?
4 REPLIES

Sriram
Level 6

I was facing a similar kind of problem, but with a normal UNIX filesystem backup.

I had a 650 GB filesystem with millions of small files inside. The backup ran for 2 days and never completed, so we had no backup of that server. As a workaround I split the filesystem backup up by folder, and now I have a working backup of the server.
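For anyone trying the same workaround: one way to split a single policy is NetBackup's NEW_STREAM directive in the Backup Selections list (the policy needs "Allow multiple data streams" enabled in its attributes). The paths below are only placeholders for your own top-level folders:

NEW_STREAM
/export/data/dir01
/export/data/dir02
NEW_STREAM
/export/data/dir03
/export/data/dir04

Each NEW_STREAM block becomes its own backup stream, so the directory walks run in parallel instead of one multi-day crawl.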

Srikanth_Gubbal
Level 6
Certified
Try FlashBackup, but you require an Enterprise Client license for it. Let me know which version of NetBackup you are on, so that I can tell you the procedure to configure it.
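In broad strokes it looks like the sketch below; the exact steps differ between 5.x and 6.x (which is why the version matters), and the device path is just a placeholder for your own raw partition:

Policy type:        FlashBackup
Attributes:         Perform snapshot backups (snapshot method nbu_snap on Solaris)
Backup Selections:  /dev/rdsk/c0t0d0s6

FlashBackup reads the raw device and builds the file catalog from the on-disk inodes, so single-file restores still work but the client never has to walk the 25 million files one by one.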

Sriram
Level 6

We tried this too, but we had no license.

Stumpr2
Level 6
NetBackup uses filenames to determine what needs to be backed up: there is one line entry in the NetBackup catalog for each file on a server, so 25 million catalog entries would be required for a single full backup. When an incremental backup runs, NetBackup has to examine each of those 25 million files to see if it is a candidate for the incremental, and that kills NetBackup by tying up its resources.

FlashBackup was created to back up millions of small files, and here it is necessary not only for backing up this server but also to keep ALL of the other servers current in their backups. It is simply too much for NetBackup to handle both this server, with its millions of files, and the rest of the enterprise.
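To put rough numbers on that (assuming, say, 150 bytes of catalog metadata per file, which is only a ballpark figure): 25 million files × 150 bytes ≈ 3.75 GB of catalog data for every single full backup, and every incremental still has to stat() all 25 million inodes even when almost nothing has changed.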