We have a directory structure that a program creates as an index of sorts. It has grown to over 2.6 million directories containing around 5.3 million files. The files are all very small, and tons of the folders are actually empty. Total data is only around 1 GB.
Both the backup server and the server holding the data are hefty machines, so this is not a hardware performance issue. Backing up these folders crawls along at around 27 MB/min and takes over 17 hours to complete, and it is only going to keep growing. When we back up the same servers with those folders excluded, we get around 1,650 MB/min.
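For illustration, one common workaround for this kind of tree would be to roll it all into a single archive on local disk ahead of the backup window, so the backup agent only has to move one big file instead of millions of tiny ones. A rough sketch of that idea (assuming Python is available on the file server; the paths here are made up and I have not tested this at our scale):

import tarfile

# Hypothetical paths for illustration only
SOURCE_DIR = r"D:\app\index"          # the directory tree the program builds
ARCHIVE = r"D:\staging\index.tar.gz"  # single file the backup job would pick up instead

with tarfile.open(ARCHIVE, "w:gz") as tar:
    # arcname keeps paths inside the archive relative to the index root
    tar.add(SOURCE_DIR, arcname="index")

I am not sure whether pre-archiving like this is the right direction, or if there is a smarter way to handle it on the backup side.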
So, does anyone else out there have to deal with something like this and know of a good way to back up this data?
Any suggestions at all are appreciated.
Thanks