Backup files modified since last x days

Kris87
Level 3
Accredited
Hello All, I am looking for an option to customise my backup selection so that it backs up only files modified in the last x days. As I see there is no direct option in the GUI, is there any query like modified time > date that can be used with the bpplinclude command? Has anyone tried this before? For reference, we have this feature in Backup Exec (BE). Our NetBackup version is 7.6.
1 ACCEPTED SOLUTION


Nicolai
Moderator
Partner    VIP   

It's not fair to have to protect 64TB with 3 LTO tapes :crying:

You could do this indirectly.

On the client, generate a list of files changed within X days and write the list to a file, e.g. c:\temp\files_to_backup.

Hint : 

http://www.windows-commandline.com/find-files-based-on-modified-time/

Then, from the client, call the NetBackup command: bpbackup -p {NAME_OF_POLICY} -s {SCHEDULE_NAME} -f c:\temp\files_to_backup (note that lowercase -s selects the schedule; uppercase -S specifies the master server).

This command will then back up all files specified in the file "files_to_backup".

http://www.veritas.com/docs/000093112

You need to replace NAME_OF_POLICY and SCHEDULE_NAME with the policy and schedule names from your own NetBackup installation.
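The two steps above can be sketched in Python. This is only a sketch: the root path, the X-day threshold, and the policy/schedule names are placeholders you would replace with your own values, and the bpbackup flags follow the command quoted above (lowercase -s for the schedule).

```python
import os
import subprocess
import time

def files_modified_within(root, days):
    """Walk `root` and return paths whose modification time falls
    within the last `days` days."""
    cutoff = time.time() - days * 86400
    matches = []
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            if os.path.getmtime(path) > cutoff:
                matches.append(path)
    return matches

def backup_recent(root, days, list_file, policy, schedule):
    """Write the file list, then hand it to NetBackup via bpbackup."""
    with open(list_file, "w") as out:
        for path in files_modified_within(root, days):
            out.write(path + "\n")
    # Lowercase -s selects the schedule; -f points at the file list.
    subprocess.run(["bpbackup", "-p", policy, "-s", schedule,
                    "-f", list_file], check=True)
```

Calling, say, backup_recent(r"C:\data", 30, r"C:\temp\files_to_backup", "NAME_OF_POLICY", "SCHEDULE_NAME") would then back up everything under C:\data changed in the last 30 days.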

Best Regards

Nicolai

 

 


8 REPLIES

Marianne
Level 6
Partner    VIP    Accredited Certified

The problem with such a manual method is that you need to use a command or script on the Client to generate the file list, then transfer it to the master server where it can be added to the Backup Selection.

If I may ask - what is the reason for this request?

Why are the default Full, Differential Incremental, or Cumulative Incremental schedules not working for you?

Nicolai
Moderator
Partner    VIP   

There is no such option in NetBackup.

I also don't think such an option is required:

A differential incremental backup will back up files changed since the last full or the previous differential incremental, whereas a cumulative incremental will back up all files changed since the last full.

Alternatively, use NetBackup Accelerator (an MSDP pool is required). Accelerator backs up only files that have changed, in an "incremental forever" style.

Can you be more specific about why you want this behaviour?

Do you have problems meeting your backup windows?

 

 

Kris87
Level 3
Accredited
Thanks for the response. This feature was handy when we had to back up 64TB of files with 3 LTO4 tape drives. There weren't any successful backups due to the long backup window. The client could not provide us with a list of critical folders to protect, but told us to first protect all files that had changed within the last year. I have a similar scenario now with NetBackup and was looking for such an option.


revarooo
Level 6
Employee

As Nicolai says, it's unfair to expect to back up 64TB with 3 tape drives.

Spread over a full day, each drive working constantly would have to sustain roughly 247 MB/s to complete a full backup (64 TB / 3 drives / 86,400 seconds).

If there is not a lot of change to the files, then incrementals *might* be able to cope, but personally I would add more drives, do a full, and run incrementals every day.
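The arithmetic behind that per-drive figure can be checked in a few lines of Python (the 64 TB, 3-drive, one-day-window numbers are the ones from this thread):

```python
# Sustained throughput each drive must deliver for a one-day full backup.
total_bytes = 64 * 10**12       # 64 TB data set
drives = 3                      # available tape drives
window_seconds = 24 * 60 * 60   # one-day backup window

per_drive_mb_s = total_bytes / drives / window_seconds / 10**6
print(round(per_drive_mb_s))    # ~247 MB/s per drive
```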

 

 

 

Marianne
Level 6
Partner    VIP    Accredited Certified
Your customer seems to be in need of an archiving solution. If they don't need files older than 1 year to be backed up, they should move those files off the primary/production servers and storage using an archiving product such as Veritas Enterprise Vault.

sdo
Moderator
Partner    VIP    Certified

What a 'back up only files changed in the last n days' approach does not give you is... a complete recovery point.

All I can ask is... what about files that get backed up once and never change again? How would those be restored? You must be keeping tapes forever? No?

.

Assuming that you had all of your tapes, and that they are all still good, how long would it actually take to restore 64TB, especially when you factor in all the manual media changing? Why then even bother with backups for a 64TB data set? You would be better off with some kind of archiving as Marianne suggests, and/or with some form of mirroring or replication at the file system or volume layer; whilst you won't have any point-in-time copies (which you do get with backups), you'll at least have an alternate copy. Maybe you could implement some kind of snapshot at the storage layer and mirror/replicate those snapshots, which is very easy with NetApp, but even this has subtleties and niceties to be aware of.

.

What you could try is a highly structured approach of multiple backup policies...

1) Work out how big your various sub-folders and trees are, and try to devise 12 distinct lists of folder names, each roughly equal in size, i.e. each list contains the names of one twelfth of the total 64TB, so each list of volumes/drives/mount-points/folders/sub-folders would cover about 5.3TB.

2) Set up twelve different backup policies, one for each list.

3) Schedule an "Annual_Full" schedule in each of the twelve backup policies, each to run only once a year, but each during a different month from the other eleven "Annual_Full" schedules in the other eleven policies.

4) In each of the twelve backup policies, schedule one "Monthly_Cinc" cumulative incremental schedule to run monthly (in the eleven months without a full), and you could even space these out so that out of every four weeks of the month, only three will trigger.

5) In each of the twelve backup policies, schedule one "Daily_Diff" differential incremental schedule to run daily (but not on the days that the Monthly_Cinc or Annual_Full schedules run).

.

Then, for any given restore of any one of the twelve lists, you would need to restore from one full + the most recent cumulative incremental + up to roughly 24 daily differentials taken since that cumulative (assuming working days, Mon-Fri); since a cumulative incremental covers everything since the last full, only the latest one is needed. You won't always have to restore from that many backups, but do you see how the number of backups you would have to restore from grows through the month and through the year?

.

I'm sure that with a spreadsheet you could easily model this, so you could then know, on any given day, how many backup images you would need, and maybe even calculate how many tapes you might need.
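The spreadsheet model suggested above can be sketched in a few lines of Python. Two simplifying assumptions (mine, not from the thread): one differential runs per day, and a restore needs only the most recent cumulative incremental, since each cumulative covers all changes since the annual full:

```python
def images_needed(months_since_full, days_since_cumulative):
    """Backup images required to restore one of the twelve lists:
    the annual full, plus the latest monthly cumulative (once one has
    run since the full), plus every daily differential taken after
    that cumulative (or after the full, in the first month)."""
    full = 1
    cumulative = 1 if months_since_full >= 1 else 0
    diffs = days_since_cumulative  # one differential per day, by assumption
    return full + cumulative + diffs

print(images_needed(0, 5))    # early in the first month: 6 images
print(images_needed(11, 26))  # worst case late in the year: 28 images
```

This makes the growth through the month and year easy to see: the restore cost resets each time a cumulative (or the annual full) runs, then climbs by one image per day.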

sdo
Moderator
Partner    VIP    Certified

It all depends upon the nature of the birth (location, naming), life (change rate), and death (or old-age uselessness) of the files in the 64TB data set. Is it in the form of new dated folders each day (if so, then my approach above is not relevant)? In fact, is it even possible to apportion the "newness rate" or "change rate" across 12 different "lists"? Is the data for user home drives? Is it a data set that lives on some kind of super-scale-out "big data" file system; if so, could you leverage any features of that storage platform for snapshots which capture the "changes", which you could then back up (with NetApp you can)? Is it already abstracted behind DFS?

The more I think about it, the more Marianne is right.

But then again, it might be a data set where all of it always has 100% value, never lessens in value as it ages, and all of it needs to be available for immediate recall; if so, replication/mirroring might be the only way to protect the data and achieve your RPO/RTO.

I would love to know what your RPO/RTO requirement actually is, or maybe what RPO/RTO you have confirmed you will provide to your customer, for a data set of this size?