BackupExec + S3 [A backup storage read/write error has occurred]
Hi,

We have Backup Exec 20.4 with an on-premises StoreOnce appliance and use Amazon S3 with Storage Gateway as a Virtual Tape Library (VTL). My jobs back up onsite to StoreOnce and are then pushed to the cloud via AWS S3 with a duplicate job. I only get this error from time to time, and I have already checked with my ISP and VPN network team and opened a ticket with AWS. Can anyone help me with these failures?

Job ended: Friday, 19 June 2020 at 02:27:11
Completed status: Failed
Final error: 0xe00084c7 - A backup storage read/write error has occurred.
If the storage is tape based, this is usually caused by dirty read/write heads in the tape drive. Clean the tape drive, and then try the job again. If the problem persists, try a different tape. You may also need to check for problems with cables, termination, or other hardware issues.
If the storage is disk based, check that the storage subsystem is functioning properly. Review any system logs or vendor-specific logs associated with the storage to help determine the source of the problem. You may also want to check any vendor-specific documentation for troubleshooting recommendations.
If the storage is cloud based, check for network connection problems. Run the CloudConnect Optimizer to obtain a value for write connections that is suitable for your environment and use this value to run the failed backup job. Review cloud provider specific documentation to help determine the source of the problem. If the problem still persists, contact the cloud provider for further assistance.
Final error category: Backup Media Errors

Duplicate - VMVCB::\\XXXXX\VCGuestVm\(DC)XXXX(DC)\vm\XXXX
An unknown error occurred on device "HPE StoreOnce:3".
V-79-57344-33991 - A backup storage read/write error has occurred. (The same tape/disk/cloud guidance text is repeated here.)
V-79-57344-65072 - The connection to target system has been lost. Backup set canceled.

I can't try the CloudConnect Optimizer because it's an iSCSI connection. Any help would be great.

Thank you,
Federico Pieraccini
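Since the CloudConnect Optimizer can't be used over iSCSI, one way to narrow down intermittent network drops toward the Storage Gateway VTL is a repeated TCP connect test against the gateway's iSCSI portal (standard port 3260). Below is a minimal sketch; `gateway.example.local` is a placeholder for your actual gateway address, and occasional failures here would point at the network path rather than Backup Exec or the StoreOnce device.

```python
import socket
import time


def probe_portal(host, port, attempts=5, timeout=3.0, delay=0.0):
    """Try repeated TCP connects to host:port and return the number of failures."""
    failures = 0
    for _ in range(attempts):
        try:
            # create_connection resolves the name and completes the TCP handshake
            with socket.create_connection((host, port), timeout=timeout):
                pass
        except OSError:
            # DNS failures, timeouts, and refused/reset connections all land here
            failures += 1
        time.sleep(delay)
    return failures


if __name__ == "__main__":
    # 3260 is the standard iSCSI portal port; substitute your VTL gateway host.
    print(probe_portal("gateway.example.local", 3260, attempts=10, delay=1.0))
```

Running this in a loop during the backup window (e.g. from Task Scheduler) can correlate connect failures with the times the duplicate job reports V-79-57344-65072.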
NetBackup 10.1 - New PaaS Workload Protection

Starting with NetBackup 10, Veritas began expanding support for PaaS workloads. In NetBackup 10.1, Veritas built an extensive framework designed to accelerate adoption of PaaS workload protection. As a testament to that framework, NetBackup 10.1 adds support for the following 13 new PaaS workloads:

Azure workloads: Azure PostgreSQL, Azure MySQL, Azure Managed SQL, Azure SQL, Azure MariaDB
AWS workloads: Amazon RDS Postgres, Amazon RDS MySQL, Amazon RDS MariaDB, Amazon Aurora SQL, Amazon Aurora PostgreSQL, Amazon DynamoDB
GCP workloads: Google MySQL, Google PostgreSQL

Protecting and recovering PaaS workloads is easy and streamlined via the NetBackup Web UI. NetBackup Snapshot Manager must be configured to facilitate the discovery of the supported PaaS workloads, and a media server with an MSDP Universal Share configuration is also required. After NetBackup Snapshot Manager and the cloud provider credentials are configured, the discovery process is triggered automatically or can be started manually. Once discovery runs successfully, the supported workloads are populated on the Web UI PaaS tab.

Add PaaS credentials as required for the workloads to be protected. Credentials can be created in advance and applied to workloads later, or created as new during configuration. In this example, the credential is created in advance using the Credential Management tab.

Add the credential to the PaaS workloads to be protected. Note that the "validation host" is the media server hostname that will be used to communicate with the cloud provider and the PaaS workload; the media server must be able to resolve the PaaS service endpoints to validate credentials.

After that, it is just a matter of creating a Protection Plan as usual. The following two prompts are specific to PaaS workloads:

1) The Protection Plan is for Cloud, the same type used to protect virtual machines in the cloud, for example. Check "Protect PaaS assets only" to invoke the correct workflow and framework for PaaS.
2) In step 4 (Backup options), the storage path is the previously configured Universal Share mount point.

Complete the Protection Plan workflow and that's it! The Protection Plan runs according to the schedule configuration, and recoveries are fully managed through the NetBackup Web UI as well.

Veritas NetBackup 10.1 now makes it easier to protect PaaS workloads, with a streamlined process guided by the Web UI that leverages the NetBackup deduplication service (MSDP) and role-based access control (RBAC) to empower workload owners and administrators as needed.

Here are some good references for more information about PaaS workload protection with NetBackup 10.1:
NetBackup 10.1 Web UI Cloud Administrator's Guide - Protecting PaaS objects
NetBackup 10.1 Web UI Cloud Administrator's Guide - Recovering PaaS assets
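The validation-host requirement above (the media server must be able to resolve the PaaS service endpoints) can be sanity-checked from the media server before attempting credential validation. A minimal sketch follows; the Azure PostgreSQL endpoint shown is a hypothetical example and should be replaced with the endpoints of the workloads you intend to protect.

```python
import socket


def can_resolve(hostname):
    """Return True if the hostname resolves in DNS from this host, else False."""
    try:
        socket.getaddrinfo(hostname, 443)
        return True
    except socket.gaierror:
        return False


if __name__ == "__main__":
    # Hypothetical PaaS endpoints; substitute the servers you plan to protect.
    endpoints = ["myserver.postgres.database.azure.com"]
    for ep in endpoints:
        print(ep, "->", "resolves" if can_resolve(ep) else "DOES NOT resolve")
```

If an endpoint does not resolve, fix DNS (or the media server's resolver configuration) before adding the credential, since validation will otherwise fail.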
Backup Exec 16 - AWS S3 or Tape Gateway

I currently have the AWS Tape Gateway set up to do weekly backups for our servers. It's working well, minus the auto-ejecting after the job is over (we'll see if this issue still happens with Backup Exec 16; we just upgraded yesterday). I noticed Backup Exec 16 has the ability to add AWS S3 as storage. Can someone please explain the advantages/disadvantages of S3 versus Tape Gateway? Thanks
Backup Exec 15 to AWS Tapes

Hey all, I've set up Backup Exec to back up our two weekly jobs to AWS tapes. It's been working fine the last few weeks, with each backup going to an individual tape. This past weekend, both jobs went to the same tape. I would like to set it up so that each job uses its own tape, so that if I need to recover in the future I don't have to pull down the entire week's data, and can save on download costs. Can you please explain how I can set this up? Thanks
Backup Exec: EC2 and On-Premises CAS/MBES configuration

We are working with a client who wants to move all of his on-premises infrastructure to the AWS cloud. There is a limitation with his WAN bandwidth, which is 1 Gbps, and their total data size is 30 TB. Since the data size is large and the WAN bandwidth is limited, they are going with a transportable disk, which will take the initial full backup to the cloud; after that, only incremental backups will be sent to the cloud. My query is: has anyone done a configuration where there is a CAS and an MBES on EC2 virtual machines, plus another MBES on premises? Can we have some pointers on how to achieve this setup? How will the communication between the CAS Backup Exec server on EC2 and the on-premises MBES server happen? What VPN configuration needs to be done to get this working? I believe that if the communication between all the servers is working, then this setup should work and the customer can achieve what he is looking for. Any pointers regarding this will help a lot. Thanks in advance!
set up a Veritas NetBackup server on Amazon cloud

We are planning to set up a Veritas NetBackup server on the Amazon cloud to back up remote sites that we may or may not be able to reach over the network from our central data center. We have a primary Veritas domain up and running in the data center, but we have difficulty backing up data from our newly acquired remote offices. Does anyone have experience running a Veritas master or media server on the Amazon cloud?
S3 Cloud Storage setup across AWS accounts

We are currently using NetBackup to back up servers in a single AWS account to S3 using an IAM user and access keys. This works fine for backing up to S3 buckets in that account, but there is another AWS account, to which the NetBackup IAM user has cross-account access, whose backups we want our master server to manage. What I'm trying to figure out is: how do I configure NetBackup cloud storage to use the IAM user in Account #1 to access and back up to the S3 buckets in Account #2? The IAM user works just fine, and I can access and manage the S3 buckets in Account #2 through the CLI from the master in Account #1, but when configuring cloud storage there is no option to use Account #2's S3 bucket. It only loads the buckets in Account #1, or gives me an Add Volume option, which seems to try to create the bucket in Account #1. Is it possible to point it directly at an existing bucket in a different account when the IAM user has cross-account permissions, or is an IAM user needed for each AWS account being accessed? Thanks
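Independently of how the bucket picker behaves, cross-account access itself is normally granted by a bucket policy in Account #2 that names the IAM user from Account #1 as the principal. A minimal sketch of such a policy is below; the account ID, user name, and bucket name are placeholders, and the exact set of S3 actions NetBackup requires may differ from this illustrative set.

```python
import json

# Placeholder identifiers for illustration only.
ACCOUNT_1_USER_ARN = "arn:aws:iam::111111111111:user/netbackup"
BUCKET = "account2-netbackup-bucket"

# Bucket policy to attach to the bucket in Account #2 so the NetBackup
# IAM user in Account #1 can list the bucket and read/write/delete objects.
policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "NetBackupCrossAccount",
            "Effect": "Allow",
            "Principal": {"AWS": ACCOUNT_1_USER_ARN},
            "Action": [
                "s3:ListBucket",
                "s3:GetBucketLocation",
                "s3:GetObject",
                "s3:PutObject",
                "s3:DeleteObject",
            ],
            "Resource": [
                # Bucket-level ARN for ListBucket, object-level ARN for the rest.
                f"arn:aws:s3:::{BUCKET}",
                f"arn:aws:s3:::{BUCKET}/*",
            ],
        }
    ],
}

if __name__ == "__main__":
    print(json.dumps(policy, indent=2))
```

Note that the bucket list shown in a cloud-storage configuration dialog typically reflects only the buckets owned by the credential's own account, so even with this policy in place the console may not enumerate the foreign bucket; whether a bucket name can be entered manually varies by product version.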