Hi, I'd like to get advice on this. I want to test the new forever incremental backup feature supported in BE 20.6, and I'm wondering what the best way would be to generate the daily incremental data. Does anybody have any advice on this? Obviously, simply copying files manually or creating only a few large files wouldn't be representative of a real-life situation, so I'm thinking of using some kind of tool to generate a massive number of small files (in the thousands), but I'm not quite sure if this is the right approach. I'd appreciate any feedback or experience on this. Thanks.
You can use fsutil to generate a lot of files. The syntax is:
fsutil file createnew filename filesize
Note that fsutil creates zero-filled files, so the data will be highly compressible; that may or may not matter for your test.
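If you want file contents that aren't all zeros (closer to real data if compression or dedupe is in play), a small script is an alternative to fsutil. Here's a rough, hedged Python sketch; the directory name, file count, and size range are just placeholders to tune for your environment:

```python
import os
import random

def make_dataset(root, n_files=2000, min_size=1024, max_size=64 * 1024, seed=42):
    """Create n_files small files of random sizes filled with random bytes.

    Unlike fsutil's zero-filled files, these won't compress to nothing,
    which makes backup throughput numbers a bit more realistic.
    """
    rng = random.Random(seed)
    os.makedirs(root, exist_ok=True)
    for i in range(n_files):
        size = rng.randint(min_size, max_size)
        path = os.path.join(root, f"file_{i:05d}.bin")
        with open(path, "wb") as f:
            f.write(rng.randbytes(size))

# e.g. a test set of 2000 small files between 1 KB and 64 KB
make_dataset("backup_testdata", n_files=2000)
```

With a fixed seed the dataset is reproducible, so you can rebuild the same full-backup source on another machine if you want to compare runs.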
Thanks for the reply. Yes, this is one of the tools I was thinking of using, so I'll take it into consideration. I'm curious, though, how to determine the number of files, the size of each, and the size of the incremental itself. Obviously, depending on the choices you make, performance would vary. Any advice on this? E.g., a typical incremental might be 3% of the full backup size... that sort of thing.
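One way to drive the daily incrementals once you have a full-backup source set is to script the churn: modify roughly N% of the files each day so the incremental picks them up. A hedged Python sketch, where the 3% ratio is just the example figure from the question and everything else (directory names, sizes) is a placeholder:

```python
import os
import random

def simulate_daily_churn(root, change_ratio=0.03, seed=None):
    """Append a little data to roughly change_ratio of the files under root.

    Appending changes both mtime and size, so a changed-file-based
    incremental should catch every touched file. Returns the number
    of files modified.
    """
    rng = random.Random(seed)
    files = sorted(os.listdir(root))
    n_changed = max(1, int(len(files) * change_ratio))
    for name in rng.sample(files, n_changed):
        with open(os.path.join(root, name), "ab") as f:
            f.write(rng.randbytes(512))
    return n_changed

# demo on a throwaway directory of 200 one-KB files
os.makedirs("churn_demo", exist_ok=True)
for i in range(200):
    with open(os.path.join("churn_demo", f"f{i:04d}.bin"), "wb") as f:
        f.write(b"\0" * 1024)

changed = simulate_daily_churn("churn_demo", change_ratio=0.03, seed=1)
```

Running it once per simulated "day" before each incremental gives you a steady, controllable change rate; you could also extend it to create and delete a few files per run if you want the incrementals to exercise adds and removals too.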