
Backup Exec failures

matt6502
Level 2

I have been trying to evaluate Backup Exec 2010 for several weeks, but I am having a lot of trouble getting my jobs to consistently complete successfully. All the jobs I have created have run fine at least once, but more often than not they fail. I have actually had to do a restore at this point, so I know the backups are good when the jobs do work, but the inconsistency is making the product virtually unusable for me.

The latest issue I am having is related to an error I received from a job this morning: "The directory is invalid" (which is one of the most common errors I get). I believe this happens because a directory that was deleted is still contained in the selection list for the job, but I don't think that should fail the job. I have seen a ton of posts about this issue, but the "solutions" are not acceptable in my opinion. The solutions I have seen are: 1) remove the failed directories from the selection list, or 2) recreate the job and run it again.
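In case it helps show what I mean, here is a rough pre-check sketch (Python, purely illustrative; the selection_list.txt file name and its one-entry-per-line format are my own assumptions, not something Backup Exec exports by itself) that reads the job's selections and reports any directory that no longer exists on disk:

import os

# Hypothetical plain-text export of the job's selections, one entry per line,
# e.g.  \\server\share\folder\*.* /SUBDIR  -- Backup Exec does not produce
# this file itself; it is only here to illustrate the pre-check.
SELECTION_FILE = "selection_list.txt"

def missing_selections(list_file):
    """Return selection entries whose directory no longer exists on disk."""
    missing = []
    with open(list_file) as fh:
        for line in fh:
            entry = line.strip()
            if not entry:
                continue
            # Keep only the directory portion, dropping the "*.* /SUBDIR" suffix.
            directory = entry.split("*.*")[0]
            if not os.path.isdir(directory):
                missing.append(entry)
    return missing

if __name__ == "__main__":
    for entry in missing_selections(SELECTION_FILE):
        print("Selection no longer exists on disk:", entry)

Running something like this before the job at least tells me which selections point at deleted folders, instead of finding out from the job failure.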

Can someone please help me get this working?

4 REPLIES

JoaoMatos
Level 6
Partner

Hi,

A good practice when creating the job is to select the entire server and then remove the directories and files you do not want to back up.

matt6502
Level 2

JoaoMatos,

Thanks for your input. Performance is one of my concerns with that approach. I would have to schedule the full backup of the file server to run over the weekend and hope it completes before Monday morning (which it may not, based on the size of the file server). If it doesn't complete, my users are going to be screaming at me on Monday morning, because performance of the file server is not the best while the backups are running (which I know is another issue I have to deal with).

My other issue with this approach (and more important than the performance) is that I don't see how it is going to solve the problem I am having. Right now I have multiple jobs that selectively target the root shares on the file server, and they fail individually. I don't see how increasing the size of the selection list for a job is going to make the process more stable... I would actually expect the opposite effect and would see the success rate drop even further. I will go ahead and try your suggestion just as a test, but I have reservations about it working. Please let me know if I am missing something.

Thanks,

Matt

JoaoMatos
Level 6
Partner

Matt,

From what I'm noticing, what is happening is that when a folder is deleted, Backup Exec does not remove it from the backup selection, contrary to what happens when we add more files.

The idea is for the first lines under View Selection Details to be:

C:\*.* /SUBDIR

e:\*.* /SUBDIR

matt6502
Level 2

I agree...I would think that the wildcard would take care of the deletes as well.

This is the configuration for a job that routinely fails (although it has been successful a few times).

\\servername.domain\departments\production-services\*.* /SUBDIR

 

And this is the configuration for a job that typically runs successfully (although does get the same error from time to time).

\\servername.domain\departments\legal\*.* /SUBDIR

 

\\servername.domain was redacted for security, but it is identical in both configurations and simply points to the fully qualified domain name of my file server.
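For what it's worth, here is a quick sanity-check sketch (Python, nothing Backup Exec-specific; I would run it from the media server under the same account the backup jobs use, and the share paths below are just the redacted placeholders from above) that confirms each share root is reachable and listable before the job kicks off:

import os

# Redacted placeholders from above; substitute the real file server FQDN.
SHARES = [
    r"\\servername.domain\departments\production-services",
    r"\\servername.domain\departments\legal",
]

def check_shares(shares):
    """Report whether each UNC share root is reachable and listable."""
    for share in shares:
        if not os.path.isdir(share):
            print("NOT REACHABLE:", share)
            continue
        try:
            entries = os.listdir(share)
            print("OK:", share, "(%d top-level entries)" % len(entries))
        except OSError as err:
            print("ACCESS ERROR:", share, "-", err)

if __name__ == "__main__":
    check_shares(SHARES)

If either share showed up as unreachable or threw an access error, that would at least point toward a connectivity or permissions problem rather than the deleted-folder issue.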

 

Thanks again for your help!