When it comes to SECRETS, how secure is your application?
Introduction

Enterprises running heterogeneous workloads, ranging from on-premises applications to applications spread across multiple cloud service providers, often struggle to manage credentials securely. We have seen plenty of technical debates about finding the right balance between security and flexibility, but there is no de facto standard that fits every case. Opinions on "the right way" to manage secrets differ, sometimes radically: "You should always use a vault", "You should encrypt credentials", and the list never ends. To address these challenges, Veritas introduces Alta Recovery Vault short-lived token-based authentication. Your data's security is paramount to us.

Before short-lived tokens, Veritas provided the ability to connect to Alta Recovery Vault with standard credentials (access and secret keys), as shown below:

Diagram 1: Creating a credential with the storage account and traditional credentials (access key and secret) supplied by Veritas

Disadvantages of using standard credentials in Recovery Vault

Standard credentials are long-lived by nature. If compromised, they give attackers ample time to exploit the application, and if they are stolen it becomes a nightmare to discern which operations are legitimate. The only fail-safe choice is to cumbersomely rotate the keys and redistribute them to customers, an action that is often overlooked and adds extra pain for DevOps. (P.S.: it is not as cheerful a task as the adjacent picture makes it look.)

Solution

To help alleviate these risks, Veritas has enhanced security by introducing short-lived token-based authentication. Beginning with NetBackup 10.2 for Azure and NetBackup 10.4 for AWS (GCP support is in progress), users receive a cloud storage account and a short-lived refresh token to connect securely to the Alta Recovery Vault storage. These new secrets are added as credentials in NetBackup Credential Management (as shown in diagrams 2a and 2b). Once the initial connection is established, the Veritas Credential Management API is solely responsible for renewing, refreshing, accessing, and sharing the access signature; a minimal sketch of this renewal pattern follows the diagram captions below. No more pain of rotating keys and redistributing them! (The cyber security team looks happier and overjoyed.)

Diagram 2a: Creating a credential with the storage account and refresh token supplied by Veritas for Azure

Diagram 2b: Creating a credential with the refresh token supplied by Veritas for AWS
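To make the renewal pattern concrete, here is a minimal, hypothetical sketch of a credential manager that exchanges a long-lived refresh token for short-lived access tokens and renews them automatically. The endpoint URL, field names, and lifetimes are illustrative assumptions and do not describe the actual Veritas Credential Management API.

```python
import time
import requests  # assumed available; any HTTP client would do

# Hypothetical token endpoint -- NOT the real Alta Recovery Vault API.
TOKEN_ENDPOINT = "https://vault.example.com/oauth/token"


class ShortLivedTokenManager:
    """Exchanges a long-lived refresh token for short-lived access tokens
    and transparently renews them shortly before they expire."""

    def __init__(self, refresh_token: str, renew_margin: int = 60):
        self._refresh_token = refresh_token
        self._renew_margin = renew_margin   # renew this many seconds early
        self._access_token = None
        self._expires_at = 0.0              # epoch seconds

    def _renew(self) -> None:
        # Exchange the refresh token for a new short-lived access token.
        resp = requests.post(
            TOKEN_ENDPOINT,
            json={"grant_type": "refresh_token",
                  "refresh_token": self._refresh_token},
            timeout=10,
        )
        resp.raise_for_status()
        payload = resp.json()
        self._access_token = payload["access_token"]             # assumed field
        self._expires_at = time.time() + payload["expires_in"]   # assumed field

    def get_token(self) -> str:
        # Callers always receive a valid token; renewal happens here,
        # so nobody has to rotate or redistribute keys manually.
        if self._access_token is None or time.time() >= self._expires_at - self._renew_margin:
            self._renew()
        return self._access_token


# Example usage: every request simply asks the manager for a current token.
# manager = ShortLivedTokenManager(refresh_token="<token supplied by Veritas>")
# headers = {"Authorization": f"Bearer {manager.get_token()}"}
```

The key point is that callers never handle or rotate the long-lived secret themselves; the manager hands out tokens that expire on their own.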
Solution Benefits

Enhanced security: Short-lived tokens have a limited lifespan, which reduces the exposure window for potential attacks. If a token is compromised, its validity period is short, minimizing the risk of unauthorized access, and regular token expiration forces re-authentication.

Mitigating token abuse: Tokens are often used to authorize access to resources. Making them short-lived limits the time an attacker has to abuse a stolen token, shrinking the risk window significantly.

Better management of permissions: When permissions change (for example, user roles or access levels), short-lived tokens automatically reflect the updates upon renewal. Long-lived tokens may retain outdated permissions, leading to security risks.

Conclusion

The introduction of Alta Recovery Vault short-lived token authentication adds another layer of ransomware protection, making applications more secure than ever before. At Veritas, your data's security is paramount to us, and this blog is just one simple example of the challenges Veritas short-lived tokens can help solve. Veritas is always looking for, and working on, better ways to secure your data.
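As one more illustration of what "short-lived" means in practice, the sketch below uses the Azure SDK for Python to mint a shared access signature that expires after an hour, signed with a user delegation key rather than a long-lived account key. This is a general Azure example under assumed account and container names; it is not how Alta Recovery Vault issues its tokens, which the Veritas Credential Management API handles for you.

```python
from datetime import datetime, timedelta, timezone

from azure.identity import DefaultAzureCredential
from azure.storage.blob import (
    BlobServiceClient,
    ContainerSasPermissions,
    generate_container_sas,
)

# Assumed (hypothetical) account and container names, for illustration only.
ACCOUNT_NAME = "examplestorageaccount"
CONTAINER_NAME = "backups"

service = BlobServiceClient(
    account_url=f"https://{ACCOUNT_NAME}.blob.core.windows.net",
    credential=DefaultAzureCredential(),
)

# A user delegation key is itself short-lived; no account key is involved.
start = datetime.now(timezone.utc)
expiry = start + timedelta(hours=1)
delegation_key = service.get_user_delegation_key(
    key_start_time=start, key_expiry_time=expiry
)

# The resulting SAS grants scoped access that expires automatically.
sas_token = generate_container_sas(
    account_name=ACCOUNT_NAME,
    container_name=CONTAINER_NAME,
    user_delegation_key=delegation_key,
    permission=ContainerSasPermissions(read=True, write=True, list=True),
    expiry=expiry,
)
print(f"SAS valid until {expiry.isoformat()}: {sas_token[:40]}...")
```

Because the signature expires on its own, there is nothing to revoke or rotate once the hour is over.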
Here are some additional helpful links:

Veritas Alta Recovery Vault Technical White Paper
Veritas Alta Recovery Vault Security Guide
Veritas Alta Recovery Vault Azure ExpressRoute Overview Guide
Veritas Alta™ Recovery Vault AWS Direct Connect Overview Guide

Please feel free to give feedback, and we are happy to answer any queries! We appreciate everyone's time :)

NBU in Azure - Snapshot Manager

Hello, I would like to know whether users have tested and installed the Snapshot Manager solution in the cloud (Azure or another provider), and what their feedback is on this solution. Did you encounter any problems with the configuration? Do you manage to back up many workloads (VMs)? Roughly how is your environment configured? Personally, I have installed a complete environment (Primary, Media server with MSDP-C, and Snapshot Manager) and I have a lot of stability problems with the backups... This topic is mainly meant to discuss the product. Thanks for your feedback!

Troubleshoot Snapshot Manager Installation on Azure
Hi community,

Trying to install NetBackup on Azure, everything goes smoothly (Primary, Media server) except Snapshot Manager. During installation I receive this error:

VM has reported a failure when processing extension 'ExtensionForConfiguringCPscale'. Error message: \"Enable failed: failed to execute command: command terminated with exit status=1\n[stdout]\n_fqdn=nbsnap1011-scale000000.dclfbw4m3ymepm1gz40pmrj30f.frax.internal.cloudapp.net\

What could it be, and how can I resolve it? Thank you

NetBackup 10.1 - New PaaS Workload Protection
Starting with NetBackup 10, Veritas began expanding support for PaaS workloads. In NetBackup 10.1, Veritas built an extensive framework designed to accelerate the adoption of PaaS workload protection. As a testament to that framework, NetBackup 10.1 adds support for the following 13 new PaaS workloads:

Azure workloads: Azure PostgreSQL, Azure MySQL, Azure Managed SQL, Azure SQL, Azure MariaDB
AWS workloads: Amazon RDS Postgres, Amazon RDS MySQL, Amazon RDS MariaDB, Amazon Aurora SQL, Amazon Aurora PostgreSQL, Amazon DynamoDB
GCP workloads: Google MySQL, Google PostgreSQL

The process of protecting and recovering PaaS workloads is easy and streamlined via the NetBackup Web UI. NetBackup Snapshot Manager needs to be configured to facilitate discovery of the supported PaaS workloads, and a Media server with an MSDP Universal Share configuration is also a requirement. After NetBackup Snapshot Manager and the cloud provider credentials are configured, the discovery process is triggered automatically or can be started manually. Once discovery runs successfully, the supported workloads are populated on the Web UI PaaS tab.

Add PaaS credentials as required for the workloads to be protected. Credentials can be created in advance and used later for the workload to be protected, or created as new during configuration. In this example, the credential is created in advance using the Credential Management tab.

Add the credential to the PaaS workloads to be protected. Please note that the "validation host" is the Media server hostname that will be used to communicate with the cloud provider and the PaaS workload. The Media server needs to be able to resolve the PaaS services in order to validate credentials (a small reachability check is sketched at the end of this post).

After that, it is just a matter of creating a Protection Plan as usual. The following two prompts are specific to PaaS workloads:

1) The Protection Plan is for Cloud, the same type used to protect virtual machines in the cloud, for example. Check "Protect PaaS assets only" to invoke the correct workflow and framework for PaaS.

2) On step 4 (Backup options), the storage path is the previously configured Universal Share mount point.

Just complete the Protection Plan workflow and that's it! The Protection Plan will run according to the schedule configuration, and recoveries are fully managed through the NetBackup Web UI as well.

NetBackup 10.1 makes it easier to protect PaaS workloads, with a streamlined process guided by the Web UI that leverages the benefits of the NetBackup deduplication service (MSDP) and RBAC (role-based access control) to empower workload owners and administrators as needed. Here are some good references for more information about PaaS workload protection with NetBackup 10.1:

NetBackup 10.1 Web UI Cloud Administrator's Guide - Protecting PaaS objects
NetBackup 10.1 Web UI Cloud Administrator's Guide - Recovering PaaS assets
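As referenced above, credential validation for a PaaS asset depends on the validation host (the Media server) being able to resolve and reach the PaaS service endpoint. The sketch below is a small, generic connectivity check one could run on that host; the hostname and port are hypothetical placeholders, not values tied to any particular NetBackup configuration.

```python
import socket

# Hypothetical PaaS endpoint -- substitute your real server name and port.
PAAS_HOST = "mydb.postgres.database.azure.com"
PAAS_PORT = 5432


def check_paas_reachability(host: str, port: int, timeout: float = 5.0) -> None:
    """Verify the validation host can resolve and reach a PaaS endpoint."""
    # 1. DNS resolution -- credential validation cannot work if this fails.
    try:
        addresses = {info[4][0] for info in socket.getaddrinfo(host, port)}
    except socket.gaierror as exc:
        raise SystemExit(f"DNS resolution failed for {host}: {exc}")
    print(f"{host} resolves to: {', '.join(sorted(addresses))}")

    # 2. TCP connectivity to the service port.
    try:
        with socket.create_connection((host, port), timeout=timeout):
            print(f"TCP connection to {host}:{port} succeeded")
    except OSError as exc:
        raise SystemExit(f"Cannot reach {host}:{port}: {exc}")


if __name__ == "__main__":
    check_paas_reachability(PAAS_HOST, PAAS_PORT)
```

If DNS resolution or the TCP connection fails here, credential validation in the Web UI will typically fail for the same underlying reason.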
Cloud Upload Throttle not Working

I have a Cloud Catalyst BYOD Media server on NetBackup 8.3. We upgraded from 8.2 to fix cloud upload issues, which were consistently failing. After the upgrade to 8.3, all upload issues went away and uploads are working fine, but it is no longer honoring the Network Throttle and is running at full utilisation. How can we fix this? The sampling interval is 5 seconds, and we have rebooted the server.