
Managing the Disaster Recovery file

Foxtrot_Lima
Level 3

Hi,

I'm new to both NetBackup and Linux, so I thought someone must have done this before. I'm running SLES 9 and NetBackup 6.5.1.

 

What I'm looking for is a way to send the Disaster Recovery file from the master server to another server, and to delete DR files that are more than a couple of days old.

 

How are you managing your Disaster Recovery files? Is there a simple way to do this?

 

Regards

Fredrik

8 REPLIES

Andy_Welburn
Level 6

We have a Solaris 9 Master Server running 6.5.1.

 

Our DR file is actually written to an NFS mount on the Master that points to an area on one of our filers.
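
For anyone wanting to do the same, the mount is just a normal NFS entry; something along these lines in /etc/fstab on the Master (the filer name, export and mount point here are made up, not our actual setup):

# /etc/fstab entry on the Master - filer name and paths are examples only
filer01:/vol/nbu_dr   /dr_files   nfs   rw,hard,intr   0 0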

 

We have yet to delete any old DR files (they currently go back to Sept 2007; just checked & there are 413 of them, but they only take up 1.6MB), but it shouldn't be an issue to set up a cron job to delete any over a certain mtime.

 

 

Foxtrot_Lima
Level 3

What I want is to have the DR file on two separate servers in case of a hardware failure. Are your DR files in any way located in two places when you use an NFS mount?

 

I've been trying to get a shell script working to remove the old files, but I'm having some difficulty getting the date handling sorted. I've played around with the find command.

Andy_Welburn
Level 6

@Foxtrot_Lima wrote:

What I want is to have the DR file on two separate servers in case of a hardware failure. Are your DR files in any way located in two places when you use an NFS mount?

 


Could always copy them after the event ...

 


@Foxtrot_Lima wrote:
I've been trying to get a shell script working to remove the old files, but I'm having some difficulty getting the date handling sorted. I've played around with the find command.

For older than 30 days (ish):

# find <directory for DR files> -type f -mtime +30 -exec ls -l  {}  \;

-r--------   1 root     other       1657 Sep 14  2007 ./Catalog_Backup_1189752750_FULL
.........etc 

-r--------   1 root     other       1491 Oct  5 03:40 ./Catalog_Backup_1223168654_FULL

 

I would certainly try this first to ensure you are getting the files you want before doing the following.

 

To remove these files it would be:

 # find <directory for DR files> -type f -mtime +30 -exec rm  {}  \;
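
If you want that on a schedule, a root crontab entry along these lines would do it (the time and the 30-day retention are only examples; keep your own DR directory in place of the placeholder):

# crontab entry - runs daily at 06:00, removes DR files older than 30 days
0 6 * * * find <directory for DR files> -type f -mtime +30 -exec rm {} \;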

 

 

Foxtrot_Lima
Level 3
Thanks, will try that one.

 

Since you seem to know a bit about scripting, I'd like to hear what you think of this automated method of copying the files (I'll set up a user account that can make an SSH connection without a password, as sketched below):

 

search=`find <file dir> -type f -mmin -120`

scp $search <user>@<host>:/good_place
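
For the passwordless part I was planning the usual key-based SSH setup, something like this (user and host are placeholders; if ssh-copy-id isn't on the box I'll just append the public key to ~/.ssh/authorized_keys on the far side by hand):

# on the master, as the account that will run the copy
ssh-keygen -t rsa              # accept the defaults, leave the passphrase empty
ssh-copy-id <user>@<host>      # adds the public key to ~/.ssh/authorized_keys on <host>
ssh <user>@<host> hostname     # quick test - should print the hostname with no password prompt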

 

Cheers


Andy_Welburn
Level 6

Looks ok.

 

My initial concern was whether $search could include more than one entry, but it shouldn't unless you run one catalog backup directly after another!

 

Another concern: would it just fail or hang if the find command didn't find anything, e.g. if the catalog backup didn't run or failed and therefore no DR file was produced? You'd end up with:

 

scp "nothing" <user>@<host>:/good_place

 

So maybe wrap it in an 'if' statement to check whether $search is empty ($search="").

 

Or even use ls -1rt (that is a one) and pipe it through tail -1 (another one!) to get the last entry in the directory (if the catalog backup failed it will only copy the previous day's file anyway), so:

 

# ls -1rt|tail -1
Catalog_Backup_1225852549_FULL
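
Putting those pieces together, a rough, untested sketch of the copy script (directory, user & host are placeholders):

#!/bin/sh
# copy the newest DR file off the master - sketch only, placeholders throughout
DR_DIR=<directory for DR files>
LATEST=`ls -1rt "$DR_DIR" | tail -1`

if [ -n "$LATEST" ]; then
        scp "$DR_DIR/$LATEST" <user>@<host>:/good_place
else
        echo "No DR file found in $DR_DIR - check the catalog backup"
fi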

 

"The possibilities are endless!"

 

 

 

Foxtrot_Lima
Level 3

I love your way of picking out the latest file in a directory, and putting it in an 'if' statement sounds like a really good idea.

 

Thanks a lot for your time and help. I'll play around with the scripts on a test machine now before going live :p

 

Regards

Fredrik

J_H_Is_gone
Level 6

I have 2 AIX masters.

 

I created a dir for the 'other' master on each.

 

Then I use rdist in cron (I have it run at about 9 am, after the catalog backup is done).

rdist copies anything 'new' up to the other server.

 

I am looking to set up a cron to remove old catinfo files that I don't need any more.

Once this is in place...

rdist will also remove from the other server whatever is no longer on this server.

rdist is so easy to set up that I didn't have to do a bunch of scripting to get it to work.
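
For anyone who hasn't used it, a minimal Distfile for this kind of copy would look something like the following (directory and hostname are placeholders, not my actual config; the -R on install is what removes files from the other master once they're gone here), plus a cron entry to kick it off:

# /usr/local/etc/dr.distfile - paths and hostname are examples only
HOSTS = ( <other master> )
FILES = ( <directory for DR files> )

${FILES} -> ${HOSTS}
        install -R ;

# crontab entry - about 9 am, after the catalog backup has finished
0 9 * * * rdist -f /usr/local/etc/dr.distfile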

Andy_Welburn
Level 6
Nice one! Checking the man page now!