
client with millions of small files

mohanl
Level 3
Partner

A client with more than 2 million files, averaging only 370 KB each, is taking 24 hours to back up during the weekend full job. Is this because each file must be opened and closed? If yes, how can this problem be overcome?
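For a rough sense of scale, here is a back-of-envelope sketch of those numbers (the sequential throughput and per-file cost used below are assumptions for illustration, not measurements from this environment):

# Back-of-envelope estimate: why 2 million small files can take roughly 24 hours.
# The streaming rate and per-file overhead below are illustrative assumptions.

n_files = 2_000_000          # number of files in the full job
avg_size_kb = 370            # average file size in KB
total_gb = n_files * avg_size_kb / 1024 / 1024
print(f"Total data: ~{total_gb:.0f} GB")                     # ~706 GB

job_hours = 24
effective_mb_s = (total_gb * 1024) / (job_hours * 3600)
print(f"Effective throughput: ~{effective_mb_s:.1f} MB/s")   # ~8.4 MB/s

# Simple model: total time = streaming time + per-file overhead
stream_mb_s = 60             # assumed sequential throughput of the backup path
per_file_overhead_s = 0.04   # assumed cost to open, close and catalog each file
stream_hours = (total_gb * 1024) / stream_mb_s / 3600
overhead_hours = n_files * per_file_overhead_s / 3600
print(f"Streaming: ~{stream_hours:.1f} h, per-file overhead: ~{overhead_hours:.1f} h")

With values in that range, the per-file open/close/catalog cost, not the raw data volume, accounts for most of the 24-hour window, which is consistent with the suspicion above.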


4 REPLIES

ZeRoC00L
Level 6
Partner Accredited

Yes, backing up large numbers of small files is slow.

Depending on your environment, you can use a different backup method. For example:

- If the data is on a NAS (like EMC or NetApp), you can look into the NDMP option.

- If the data is on a virtual machine, you can look into the VMware agent to back up the VMDK directly.

AmolB
Moderator
Employee Accredited Certified

As per  http://www.symantec.com/docs/TECH49521

The total number of files on a disk and the relative size of each file impacts backup performance. Fastest backups occur when the disk contains fewer large size files. Slowest backups occur when the disk contains thousands of small files. A large number of files located in the same directory path back up more efficiently than backing them up from multiple directory locations.

Also refer to the BE 2012 Performance Tuning Guide, quoted on page 39:

http://www.symantec.com/docs/DOC5481

Although the guide is for BE 2012, the concept is the same for all versions of BE.
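As a rough, hands-on illustration of that point (a minimal sketch with arbitrary file counts and sizes, not a Backup Exec measurement), reading the same total amount of data as thousands of small files is noticeably slower than reading it as one large file, because every file adds its own open/close and metadata overhead:

# Minimal sketch: compare reading the same total data as many small files
# versus one large file. File count and size are arbitrary illustrative values.
import os
import tempfile
import time

def write_small_files(directory, count, size):
    for i in range(count):
        with open(os.path.join(directory, f"f{i:05d}.dat"), "wb") as f:
            f.write(os.urandom(size))

def read_all_files(directory):
    total = 0
    for name in os.listdir(directory):
        with open(os.path.join(directory, name), "rb") as f:
            total += len(f.read())
    return total

with tempfile.TemporaryDirectory() as small_dir, tempfile.TemporaryDirectory() as big_dir:
    count, size = 5_000, 16 * 1024                 # 5,000 files x 16 KB each
    write_small_files(small_dir, count, size)

    big_path = os.path.join(big_dir, "big.dat")
    with open(big_path, "wb") as f:                # one file of the same total size
        f.write(os.urandom(count * size))

    start = time.perf_counter()
    read_all_files(small_dir)
    t_small = time.perf_counter() - start

    start = time.perf_counter()
    with open(big_path, "rb") as f:
        f.read()
    t_big = time.perf_counter() - start

    print(f"{count} small files: {t_small:.3f}s, one large file: {t_big:.3f}s")

On most systems the per-file overhead dominates the first timing even though both runs read the same number of bytes, which is the same effect, scaled down, as a 2-million-file full backup.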

 

 

Sush---
Level 6
Employee Accredited Certified

Hello Mohan,

Check the following:

Size and Number of Files

The total number of files on a disk and the relative size of each file can either speed up backup or slow it down. The fastest backups occur when the disk contains a few large size files. The slowest backups occur when the disk contains thousands of small files. A large number of files located in the same directory path will back up more efficiently compared to backing them up from multiple locations.

 

This is from the technote http://www.symantec.com/docs/TECH8326, which covers "Reasons why the data throughput rate can be slower than the theoretical maximum when backing up to or restoring from tape media or disk (B2D) and how to troubleshoot or improve backup performance".

 

Thanks,

-Sush...

 

BanksyMJ
Level 4
Employee Accredited

Hi ZeRoC00L

When backing up millions of small files, the only quick way to back them up is to use some form of snapshot. Symantec provides a couple of solutions that can perform snapshot backups:

1. NetBackup - This is a backup product aimed at the enterprise market. It includes the functionality to perform snapshot backups to both disk and tape...

2. ...if NetBackup is too 'big' a solution, then you could look at Symantec System Recovery Server (SSR). SSR ignores the individual files on a server and just backs up at the block level. The beauty, though, is that you can still restore individual files if required. The downside is that it only supports backups to disk (a rough sketch of the block-level idea follows below).
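To make the block-level idea concrete in general terms (a conceptual sketch only, not how NetBackup or SSR are actually implemented; the device path, destination, and block size are assumed for illustration), a block-level copy reads the volume as one sequential stream of large blocks instead of opening every file:

# Conceptual sketch of a block-level copy: read a whole volume (or disk image)
# as one stream of fixed-size blocks instead of opening every file.
# The source path, target path, and block size are illustrative assumptions,
# not NetBackup or SSR internals.

SOURCE = "/dev/sdb1"           # hypothetical block device or disk image file
TARGET = "/backup/volume.img"  # hypothetical backup destination on disk
BLOCK_SIZE = 4 * 1024 * 1024   # 4 MB reads keep the device streaming sequentially

def block_level_copy(source, target, block_size=BLOCK_SIZE):
    copied = 0
    with open(source, "rb") as src, open(target, "wb") as dst:
        while True:
            block = src.read(block_size)   # one large read per block, no per-file open/close
            if not block:
                break
            dst.write(block)
            copied += len(block)
    return copied

if __name__ == "__main__":
    total = block_level_copy(SOURCE, TARGET)
    print(f"Copied {total / (1024 ** 3):.1f} GiB at the block level")

Because there is a single open/close for the whole volume and the reads are large and sequential, the cost no longer scales with the number of files; that is the property the snapshot-based products exploit, while still cataloguing the file system so that individual files remain restorable.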

Hope that helps