Veritas System Recovery adding cloud storage issue
Hi all, I would like to add S3-compatible cloud storage as my backup destination. First, I created a cloud instance as per https://www.veritas.com/support/en_US/doc/38007533-136670227-0/v132418412-136670227 Since I don't have a Certificate Authority (CA)-signed certificate, I have to use SSL: 0 (disabled).

However, when I try to add the cloud backup destination, the packet received from the cloud library contains this message:

The authorization mechanism you have provided is not supported. Please use AWS4-HMAC-SHA256

Unfortunately, it is not possible to change this configuration on the cloud library (NetApp). Do you think there is any workaround in such a situation? Thanks for your opinions.
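The error suggests the client is signing requests with the older Signature Version 2 scheme, while the endpoint only accepts Signature Version 4 (AWS4-HMAC-SHA256). For background, here is a minimal sketch of AWS's published SigV4 signing-key derivation, using only the Python standard library; the function name is illustrative and any key string passed in should be a placeholder, never a real credential:

```python
import hashlib
import hmac

def sigv4_signing_key(secret_key: str, date: str, region: str, service: str) -> bytes:
    """Derive an AWS Signature Version 4 signing key.

    Follows the published derivation: HMAC-SHA256 chained over the
    date stamp (YYYYMMDD), region, service, and the literal string
    "aws4_request", seeded with "AWS4" + the secret key.
    """
    def _hmac(key: bytes, msg: str) -> bytes:
        return hmac.new(key, msg.encode("utf-8"), hashlib.sha256).digest()

    k_date = _hmac(("AWS4" + secret_key).encode("utf-8"), date)
    k_region = _hmac(k_date, region)
    k_service = _hmac(k_region, service)
    return _hmac(k_service, "aws4_request")
```

The derived key (32 bytes) is then used to HMAC the string-to-sign for each request; a client that instead sends a SigV2-style `Authorization` header gets exactly the rejection quoted above from a SigV4-only endpoint.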
Use Amazon Web Services to Store Your Backups
Moving your backups to AWS could present an excellent alternative to tape, not only from a cost perspective but also from an agility perspective, leapfrogging your data protection from the "dinosaur era" to today's "instant era." Don't be a dinosaur!
BackupExec + S3 [A backup storage read/write error has occurred]
Hi, we have Backup Exec 20.4 with StoreOnce on premises and use Amazon S3 with Storage Gateway for a Virtual Tape Library (VTL). My jobs back up onsite to StoreOnce and are then pushed to the cloud via AWS S3 with a duplicate job. I only get this error from time to time and have already checked with my ISP and VPN network team and opened a ticket with AWS. Can anyone help me out with these failures?

Job ended: Friday, 19 June 2020 at 02:27:11
Completed status: Failed
Final error: 0xe00084c7 - A backup storage read/write error has occurred. If the storage is tape based, this is usually caused by dirty read/write heads in the tape drive. Clean the tape drive, and then try the job again. If the problem persists, try a different tape. You may also need to check for problems with cables, termination, or other hardware issues. If the storage is disk based, check that the storage subsystem is functioning properly. Review any system logs or vendor-specific logs associated with the storage to help determine the source of the problem. You may also want to check any vendor-specific documentation for troubleshooting recommendations. If the storage is cloud based, check for network connection problems. Run the CloudConnect Optimizer to obtain a value for write connections that is suitable for your environment and use this value to run the failed backup job. Review cloud-provider documentation to help determine the source of the problem. If the problem still persists, contact the cloud provider for further assistance.
Final error category: Backup Media Errors

Duplicate- VMVCB::\\XXXXX\VCGuestVm\(DC)XXXX(DC)\vm\XXXX
An unknown error occurred on device "HPE StoreOnce:3".
V-79-57344-33991 - A backup storage read/write error has occurred. (The same tape/disk/cloud troubleshooting guidance as above is repeated here.)
V-79-57344-65072 - The connection to target system has been lost. Backup set canceled.

I can't try the CloudConnect Optimizer because it's an iSCSI connection. Any help would be great. Thank you, Federico Pieraccini
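Intermittent read/write errors on duplicate-to-cloud jobs like this often trace back to transient network drops between the gateway and S3. Independent of Backup Exec's own job retry settings, the general remedy for transient cloud-write failures is retry with exponential backoff and jitter; a minimal sketch follows (the function and parameter names are illustrative, not part of any Backup Exec or AWS API):

```python
import random
import time

def with_retries(operation, max_attempts=5, base_delay=1.0,
                 retriable=(ConnectionError, TimeoutError)):
    """Run `operation`, retrying transient failures.

    Waits base_delay * 2^(attempt-1) seconds plus a small random jitter
    between attempts; re-raises the last error once attempts are exhausted.
    """
    for attempt in range(1, max_attempts + 1):
        try:
            return operation()
        except retriable:
            if attempt == max_attempts:
                raise
            delay = base_delay * (2 ** (attempt - 1)) + random.uniform(0, 0.1)
            time.sleep(delay)
```

The same pattern is what tools like the CloudConnect Optimizer tune indirectly: fewer, better-paced write connections so that a brief network hiccup does not fail the whole duplicate job.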
VERITAS NETBACKUP WITH RED HAT CEPH STORAGE
We are using NetBackup with Ceph storage, where Ceph is presented to the media servers as an MSDP. Since MSDP is limited to one pool per media server and a sizing limit of 96 TB, we are planning to use Ceph as an S3 backup target for the new media servers we have to configure. I want to know: if we configure Ceph as S3 cloud storage, can NetBackup perform deduplication? Will all dedup happen only on the client side using Accelerator, or can we have target-side dedup by NetBackup as well? Or will NetBackup send all backup data it receives to Ceph without any dedup, leaving all data reduction to be handled by Ceph? Can Ceph as cloud storage perform dedup on its own? With Ceph 4 we will have erasure coding, so the total storage used will be less, but in terms of dedup, what advantages can I have by using Ceph as cloud storage instead of an MSDP? If anyone is using NetBackup with Ceph, could you share your approach please?
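As background for the dedup question: MSDP-style target deduplication fingerprints data blocks and stores each unique block only once, while a plain cloud storage target generally receives whole images as written. The core idea can be illustrated with a toy fingerprint store (fixed-size chunks and SHA-256 fingerprints here for simplicity; real dedup engines such as MSDP use variable-size chunking and persistent fingerprint databases):

```python
import hashlib

def dedupe_chunks(data: bytes, chunk_size: int = 4096):
    """Split data into fixed-size chunks, storing each unique chunk once.

    Returns (store, recipe): `store` maps SHA-256 fingerprint -> chunk,
    and `recipe` is the ordered list of fingerprints needed to rebuild
    the original stream.
    """
    store = {}
    recipe = []
    for off in range(0, len(data), chunk_size):
        chunk = data[off:off + chunk_size]
        fp = hashlib.sha256(chunk).hexdigest()
        store.setdefault(fp, chunk)   # duplicate chunks cost no extra space
        recipe.append(fp)
    return store, recipe

def reassemble(store, recipe):
    """Rebuild the original stream from the fingerprint recipe."""
    return b"".join(store[fp] for fp in recipe)
```

Whether any of this happens before data reaches Ceph depends on where the dedup engine sits (client-side with Accelerator, media-server MSDP, or not at all for a plain S3 target); erasure coding in Ceph reduces protection overhead but does not deduplicate anything.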
EP70: VMworld: Demystifying the cloud with the AWS and Veritas partnership
Veritas and AWS are integrated across multiple dimensions, including NetBackup, Backup Exec, and Veritas Resiliency Platform. No matter the use case, Veritas and AWS have you covered. Tune in to learn more.
Backup Exec 16 - AWS S3 or Tape Gateway
I currently have the AWS Tape Gateway set up to do weekly backups for our servers. It's working well, apart from the auto-eject after the job is over (I'll see whether this issue still happens with Backup Exec 16; I just upgraded yesterday). I noticed Backup Exec 16 has the ability to add AWS S3 as a storage device. Can someone please explain the advantages and disadvantages of S3 versus Tape Gateway? Thanks
Netbackup integration with S3 ...
Hi all, I am evaluating a scenario with NetBackup 8.0. I have integrated NetBackup with AWS S3 as a backup target. In S3, I have applied a lifecycle rule where data moves to Glacier after 1 day (NetBackup writes data to S3; after 1 day, AWS automatically transitions the data to the Glacier storage class). Now, while restoring from NetBackup, I am unable to get that data, since it is stored in Glacier and NetBackup cannot talk to Glacier directly. I just want to verify whether anyone has tested such a scenario.
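For reference, the lifecycle rule described above would look roughly like the configuration built below (a sketch of the dict one might pass to boto3's `put_bucket_lifecycle_configuration`; the function name and rule ID are illustrative). The catch is that once objects transition to GLACIER they are no longer immediately readable: they must first be brought back to a readable tier via an S3 restore request (the RestoreObject API) before anything, NetBackup included, can read them:

```python
def glacier_after_one_day(prefix: str = "") -> dict:
    """Build an S3 lifecycle configuration that transitions objects to
    the GLACIER storage class one day after creation -- the rule the
    post describes, which makes backup images unreadable to a direct
    restore until an S3 restore request completes."""
    return {
        "Rules": [
            {
                "ID": "to-glacier-after-1-day",
                "Status": "Enabled",
                "Filter": {"Prefix": prefix},
                "Transitions": [
                    {"Days": 1, "StorageClass": "GLACIER"},
                ],
            }
        ]
    }
```

A safer pattern is to either lengthen the transition window beyond the images' expected restore horizon, or keep lifecycle transitions off buckets that the backup application expects to read directly.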
Configuring cloud storage using CLI
Has anyone successfully configured S3 cloud storage (from start to finish) using the CLI? The csconfig documentation (from the reference guide) is not that helpful. It's unclear how the access key and secret are entered, how the disk pool is configured (will nbdevconfig work with cloud storage?), and so on. Almost every document references the GUI, but while there are statements indicating that the csconfig command can be used, no decent examples are given.