Recent Discussions
Backup Exec Job got canceled
Hi, I'm new to Backup Exec. I ran a sample backup job for a folder on my Windows file server. The backup job appeared to be successful until the verification step failed. The job status is: "Canceled, timed out. The job was automatically canceled because it exceeded the job's maximum configured run time." Is there a way to verify that the backup job is still valid? It took almost a week to run, and I don't want to re-run it if possible. Please help. Thanks.

BackupExec for Oracle 23ai?
Does anyone know when Backup Exec will offer support for Oracle Database 23ai?

Veritas Backup Exec version upgrade & migrate to another new Server
Hello, we need to upgrade Veritas Backup Exec from version 16 to the latest version compatible with our existing server hardware, which runs Windows Server 2008 R2 Standard, and then migrate the installation to new server hardware running Windows Server 2022 Standard. Please help us to do the same. Thanks & Regards, Soumabrata Bhaumik

Veritas Backup Exec 23 - Slow Backup after Windows Server 2019 upgrade
Hello, I recently upgraded three Windows servers from Windows 2012 R2 to Windows 2019 Standard: two physical servers and one virtual server running under VMware ESXi. The Veritas server (version 23.0.1250.0) is a separate physical machine running Windows 2016 Standard with no other applications on it. The upgraded servers have the Veritas client (version 23.0.1250.0) installed, along with Bitdefender antivirus, which does not report any problems.

The issue is that the backup time of the Windows 2019 servers has **doubled or tripled** after the upgrade. There is no change in the backup job configuration or settings, and the upgrade itself was smooth, without errors. There is no other change in network configuration, traffic, hardware, or software, and the servers were rebooted as required by the upgrade process. The data backed up is a mix of MS Office documents, PDF files, SQL databases, etc., and its size has not changed.

The backups do not show any errors and all complete successfully, BUT the backup rate in megabytes per minute has decreased by a factor of 2 or 3. Backups that used to take 70-80 minutes now require 3 or more hours. I have not been able to find any references to this problem. I have run a number of troubleshooting tests on the network settings of the backup jobs, open file locks, etc., without success. I have also disabled real-time protection in the antivirus, but the problem remains. I would appreciate any insight you may have. Thank you.

Manually copying .bkf files
We back up our files every night with Backup Exec to a dedicated backup hard drive in the same server. We feel pretty good about being protected from a hard drive failure, but we are not protected from a fire that destroys the whole server. I would like to copy the .bkf files to an offsite storage medium. Will this work?

Doing some research, I saw suggestions that it will not: that Backup Exec needs to write the .bkf files to the backup medium directly, or it will throw an error when you try to restore from a second-generation copy of a .bkf. This seems a little strange to me. Say the C: drive fails completely and the whole server OS (including the Backup Exec installation) is gone, and you have to do a bare-metal restore. Does that mean you can't just install a fresh OS and a fresh copy of Backup Exec and then restore from the .bkf files, because the fresh Backup Exec software won't know what to do with the existing .bkf files?

P.S. We do not use encryption because our files are only valuable to us; no one else would find them useful or valuable.

Get-BEBackupDefinition not showing all results
Giving BEMCLI a test with a view to automating switching duplicate jobs to new S3 buckets every 3 months. Get-BEBackupDefinition in conjunction with Set-BEDuplicateStageBackupTask seems to be exactly what is required, going by the examples in the help guide:

Get-BEBackupDefinition "Backup Definition 01" | Set-BEDuplicateStageBackupTask -Name "Duplicate 2" -Storage "Any disk storage" | Save-BEBackupDefinition

However, running Get-BEBackupDefinition only returns a few results (from what I can see, only agent-based jobs). None of the VM-based jobs show up. Running Get-BEJob shows everything as expected. Any pointers on how to use BEMCLI/PowerShell to automate changing jobs to use the new S3 bucket?

Back up to Local Disk Storage and then Duplicate to Cloud Deduplication Storage
We would like to have a local backup of our servers to a normal disk storage device in Backup Exec, which allows for fast restore times, but we would also like the ransomware protection that cloud deduplication with immutable storage provides. So we created a job that backs up to the local disk storage device and then runs a Duplicate job with the cloud deduplication storage device as the destination. The job configured this way produces no errors, and we verified that the retention lock is being enabled properly on the immutable cloud storage. The problem is that the Duplicate job log shows this:

Deduplication stats: scanned: 0 KB, CR sent: 0 KB, CR sent over FC: 0 KB, dedup: 0.0%, cache hits: 0, rebased: 0, where dedup space saving: 0.0%, compression space saving: 0.0%

It seems that with this method we are getting immutable backups but not any deduplicated data. Is the log incorrect, or does this method really not deduplicate anything? I don't know if it makes a difference, but the cloud storage is in Azure. We properly created the local deduplication volume and the cloud deduplication device with immutability support, so I'm not asking for help in setting that up, and I have verified that deduplication works if we back up straight from the server to the cloud deduplication device, as shown here:

Deduplication stats: scanned: 1129257857 KB, CR sent: 11590580 KB, CR sent over FC: 0 KB, dedup: 98.0%, cache hits: 8887711, rebased: 2994, where dedup space saving: 98.0%, compression space saving: 0.0%

Backup Exec server not available
Greetings, after configuring numerous backup jobs, each with an additional duplicate-to-tape job (practically identical other than the backup target), two of the jobs show the status "Ready; Backup Exec server not available". They remain in this state until the job fails with a status of "Missed". Meanwhile, all other jobs run normally. We have only one backup server. Has anyone dealt with this before? I'm not finding anything via web search. Thanks, Pootytang

How do I delete IMG folders and contents?
It appears that the creation of an infinite number (up to the point of "drive full") of IMG folders when using B2D to back up Exchange Server is a known issue. Unfortunately, I have learned this only after creating 23 IMG folders of approximately 11 GB each. I cannot find a method within Backup Exec for deleting these IMG folders, and when I attempt a manual deletion using Windows Explorer I get the error message "Error Deleting File or Folder - cannot delete ese.dll: Access is denied." I must get rid of these blankety-blank things before I run out of disk space. Based on the number of related posts, I'm guessing somebody must have figured out an answer to this problem; I just can't seem to find it posted here. Thanks in advance.
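In case it helps anyone hitting the same "Access is denied" error on the IMG folders: the usual cause is that the Backup Exec services still hold those folders open, so deletion from Windows Explorer fails. Below is a minimal cleanup sketch, assuming the B2D device has already been removed or paused in the Backup Exec console and that `D:\B2D` is the backup-to-disk path (both the path and the `IMG*` name filter are example values; adjust to your environment before running):

```powershell
# Run from an elevated PowerShell prompt on the Backup Exec server.

# 1. Stop the Backup Exec services so nothing holds a lock on the B2D folder.
Get-Service -DisplayName "Backup Exec*" | Stop-Service -Force

# 2. Take ownership of each IMG folder and grant Administrators full control,
#    which clears the "Access is denied" on files such as ese.dll.
Get-ChildItem "D:\B2D" -Directory -Filter "IMG*" | ForEach-Object {
    takeown /F $_.FullName /R /D Y | Out-Null
    icacls  $_.FullName /grant Administrators:F /T | Out-Null

    # 3. Delete the folder and its contents.
    Remove-Item $_.FullName -Recurse -Force
}

# 4. Restart the Backup Exec services.
Get-Service -DisplayName "Backup Exec*" | Start-Service
```

Afterwards it is worth running an inventory/catalog of the B2D folder from the Backup Exec console so its view of the device matches what is actually on disk; otherwise the deleted media may still appear in the catalog.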