
Best practices to back up file servers

liuyang
Level 6
Partner Accredited

Hi, our master and media servers are all NBU 7.1 on Windows 2008 R2. We have 10+ file servers (all Windows 2003, with the NBU 7.1 client). The data on these file servers grows very fast and now totals 10+ TB, so every full backup of them takes a very long time to complete. My questions: is there any way to improve the backup performance, and what are the best practices for backing up file servers? One possible approach may be to consolidate the file servers onto a smaller number of servers and then convert them to SAN media servers. Any other suggestions/recommendations? Thanks in advance.


7 REPLIES

Mark_Solutions
Level 6
Partner Accredited Certified

Several things:

1. Proper tuning using the NUMBER_DATA_BUFFERS and SIZE_DATA_BUFFERS touch files on the media servers, as well as memory and TCP/IP tuning (see the sketch after this list)

2. Use multiple streams where possible for the file servers so that you maximise the total throughput for each (I have customers running up to 50 streams per server where required)

3. Good networks - 10GbE if possible, but teamed 1GbE if not

4. Defragment the file servers

5. Exclude NBU processes from anti-virus scanning

6. Enough media servers to do the job

7. Backup to disk

These are just a few pointers, but 10TB shouldn't be too bad to move. Making them SAN media servers may just complicate your installation, and as networks are often as fast as a SAN these days, tuning based on the above should get good results.
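
On point 1, the buffer settings are plain-text 'touch files' under install_path\NetBackup\db\config on each media server. A minimal sketch, assuming the default install path; the values below (256 KB buffers, 64 of them, and a matching network buffer) are common starting points only, not recommendations - test against your own drives and memory:

rem Example values only - tune and test per environment
echo 262144> "C:\Program Files\Veritas\NetBackup\db\config\SIZE_DATA_BUFFERS"
echo 64> "C:\Program Files\Veritas\NetBackup\db\config\NUMBER_DATA_BUFFERS"
rem The network (TCP/IP) buffer file lives one level up, in install_path\NetBackup
echo 262144> "C:\Program Files\Veritas\NetBackup\NET_BUFFER_SZ"

New values take effect on the next job; no service restart is needed. SIZE_DATA_BUFFERS must stay within what your tape drives and HBAs support.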

Omar_Villa
Level 6
Employee

I would suggest using NBU AdvancedDisk or a snapshot tool to back up your file servers. NBU is really bad at backing up thousands of small files, because it checks each file's attributes, and that slows down the backup drastically. If you run your backups through a snapshot tool such as AdvancedDisk or even PureDisk, it will improve the speed, because the backup runs at block level and not file level.
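
One concrete block-level option in NetBackup is the FlashBackup-Windows policy type: its Backup Selections list takes raw drive specifications instead of directory paths, e.g. (the drive letter is purely illustrative):

\\.\E:

Restores from a FlashBackup image are still file-level, so this mainly speeds up the backup side.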

 

Hope this helps.

regards.

V4
Level 6
Partner Accredited

Don't forget to use the checkpoint restart feature, to avoid backups restarting from scratch after a network or drive failure.
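
Checkpoint restart is enabled per policy via 'Take checkpoints every __ minutes' in the policy attributes. The same change from the CLI might look like this - a sketch, assuming a hypothetical policy named FS_PROD and that your release supports bpplinfo's -chkpt / -chkpt_intrvl options:

rem FS_PROD is a hypothetical policy name; the 15-minute interval is only an example
bpplinfo FS_PROD -modify -chkpt 1 -chkpt_intrvl 15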

Mark_Solutions
Level 6
Partner Accredited Certified

Are you happy with the answers you have been given?

Perhaps you can close off the thread by marking a solution?

Marianne
Level 6
Partner    VIP    Accredited Certified
(Accepted Solution)

This post is quite old but still unsolved.

I have voted for Mark's initial response.

Another 'quick win' is to break up big filesystems into multiple streams.

Ensure 'Allow multiple data streams' is enabled in the policy attributes. Check the directory structure on the large user drive, then break up the Backup Selection into 4-5 streams, e.g.:

NEW_STREAM
x:\users\a*
x:\users\b*
x:\users\c*
......
.....
x:\users\h*
NEW_STREAM
x:\users\i*
x:\users\j*
x:\users\k*
....
...
....
NEW_STREAM
....
.....
......
NEW_STREAM
....
.....

 

Make this change right before the next full backup, to prevent incrementals from running as fulls (each new stream is backed up in full the first time it runs).
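
If you would rather script the selection list than edit it in the GUI, bpplinclude can add the same entries. A sketch, assuming a hypothetical policy named FS_PROD:

rem FS_PROD is a hypothetical policy name
bpplinclude FS_PROD -add NEW_STREAM
bpplinclude FS_PROD -add "x:\users\a*"
bpplinclude FS_PROD -add "x:\users\b*"
rem Verify the resulting selection list
bpplinclude FS_PROD -L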
 

Omar_Villa
Level 6
Employee

If you are on NetBackup 7 and have the infrastructure, you can try the new deduplication feature configured at the client side. This will improve your performance: deduplicating at client level impacts the client's performance a bit, but you back up data at block level and only send the differences. It takes more than just installing a client - you will also need to configure a PDDE (PureDisk Deduplication Engine) and an SPA (Storage Pool Authority). But if it is a business need and you are really having trouble, then for boxes with millions of small files where FlashBackup or snapshots are not an option, I think this can work pretty well.

 

regards.

liuyang
Level 6
Partner Accredited

Thanks a lot for all your advice. For the moment, we are backing up the file servers using the methods in Marianne's reply above. In the future, we may look into deduplication to disk.