NetBackup licenses - capacity
We're trying to confirm the number of TB we should be licensed for under the capacity model. We ran nbdeployutil, but the report shows several figures, and we're not sure which one is the amount that should be licensed:

- Flagged Capacity Figures (TB): Master Server, Confirmed (TB)
- Capacity by Policy Type: Policy Type, Total (TB)
- Capacity Totals (TB): Platform Base (TB), Master Server, Calculated, Charged
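To cross-check the report ourselves, here is a rough sketch of how we are tallying the per-policy totals (this assumes the "Capacity by Policy Type" table has been exported to a CSV; the file name and column names below are guesses on our part, not the actual nbdeployutil output format):

```python
# Rough cross-check: sum the per-policy-type totals and compare the result with
# the summary figure. Assumes the "Capacity by Policy Type" table was exported
# to a CSV with columns "Policy Type" and "Total (TB)" -- these names are
# guesses, not the actual nbdeployutil output format.
import csv

def sum_policy_totals(path):
    total_tb = 0.0
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            total_tb += float(row["Total (TB)"])
    return total_tb

if __name__ == "__main__":
    per_policy = sum_policy_totals("capacity_by_policy_type.csv")
    print(f"Sum of per-policy totals: {per_policy:.2f} TB")
    # Compare this against the Calculated / Charged figure on the summary page.
```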

Tape Capacity Exceeded When It Shouldn't Be

Hi, I'm having an issue with Backup Exec 15 where the backup job exceeds tape capacity, and I don't know why. Each week we run a full backup of our data and system states. As shown in the attachments, this comes to about 4.85 TB of space required. I have 4 LTO5 tapes allocated in the unit, with a base capacity of 5.52 TB and no compression. Yet when the job runs it completely fills those tapes, and I routinely have to add another spare tape into the mix, an LTO4; even that almost gets filled up. Even without compression (which also isn't happening, and I can't explain why) I should have more than enough space on the 4 x LTO5 tapes, yet the job fills them up and still needs more. I should also mention that the tapes are put into scratch state before the job starts; there is no appending, only overwriting of blank tapes. Can anyone suggest why this might be happening or what I can do to troubleshoot? I receive no relevant errors or warnings that would explain this. Thank you.
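For reference, here is the capacity arithmetic with the figures above converted both ways (a rough sanity-check sketch only; whether the reported sizes are decimal TB or binary TiB is an assumption to verify, not a diagnosis):

```python
# Headroom check using the figures from the post, converted both ways.
# This is only arithmetic to rule out a TB/TiB mix-up -- not a diagnosis.

TB = 10**12   # decimal terabyte
TIB = 2**40   # binary tebibyte

pool_tb = 5.52   # 4 x LTO-5 native capacity, as shown in Backup Exec
job_tb = 4.85    # weekly full backup size, as shown in the job summary

print(f"Headroom if both figures use the same unit: {pool_tb - job_tb:.2f} TB")

# If the job size were actually binary (TiB) while the pool figure is decimal TB:
job_if_tib = job_tb * TIB / TB
print(f"Job size if 4.85 is really TiB: {job_if_tib:.2f} TB "
      f"-> headroom {pool_tb - job_if_tib:.2f} TB")
```

Even a small unit mismatch shrinks the apparent headroom to a couple of hundred GB, which per-tape overhead could then plausibly consume.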

Backup Exec 15 does not consider overwritable media in storage capacity

Hi, I'm having a small issue with the Backup Exec storage screen. As the first screenshot shows, the capacity of the tape library is reported as almost full. However, as the second screenshot shows, the media in the tape library is now outside its protection period and is marked as overwritable. Should Backup Exec not see this overwritable media as available capacity? I should also mention that backups are running to tape correctly; it would just be nice to see usable capacity accurately on the storage screen, rather than having to drill down into the 'Slots' screen.
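To illustrate what I mean by usable capacity, here is a rough sketch with made-up media records (the field names and states below are mine, not anything taken from Backup Exec or its interfaces):

```python
# Illustration only: a "used vs. total" view can make the library look full even
# when overwritable and scratch media are effectively free space.
# The media records below are invented; the field names are not from Backup Exec.

media = [
    {"label": "000001", "capacity_tb": 1.38, "used_tb": 1.38, "state": "overwritable"},
    {"label": "000002", "capacity_tb": 1.38, "used_tb": 1.38, "state": "overwritable"},
    {"label": "000003", "capacity_tb": 1.38, "used_tb": 0.90, "state": "allocated"},
    {"label": "000004", "capacity_tb": 1.38, "used_tb": 0.00, "state": "scratch"},
]

total = sum(m["capacity_tb"] for m in media)
used_naive = sum(m["used_tb"] for m in media)
usable = sum(
    m["capacity_tb"] if m["state"] in ("overwritable", "scratch")
    else m["capacity_tb"] - m["used_tb"]
    for m in media
)

print(f"Used/total view:                       {used_naive:.2f} / {total:.2f} TB")
print(f"Usable if overwritable counts as free: {usable:.2f} / {total:.2f} TB")
```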

Determining if you are compliant with your capacity license for FSA/SharePoint

Recently I have been asked by several customers how to determine the total number of TB used under their capacity license. SharePoint and File System Archiving are sold on a per-TB basis within the EV Archiving Per TB solution. This blog provides details on how to determine your current usage and whether or not you need to purchase additional TBs.
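As a rough illustration of the arithmetic involved (the target names and figures below are invented and do not come from Enterprise Vault reporting; they only show how usage compares against the licensed amount):

```python
# Licensing arithmetic only: sum the archived size per FSA/SharePoint target and
# compare it against the licensed capacity. All names and figures are invented.

archived_gb = {
    "FSA - fileserver01": 3200,
    "FSA - fileserver02": 1850,
    "SharePoint - intranet": 940,
}

licensed_tb = 8
used_tb = sum(archived_gb.values()) / 1024  # GB -> TB, binary convention assumed

print(f"Archived: {used_tb:.2f} TB of {licensed_tb} TB licensed")
if used_tb > licensed_tb:
    print("Additional per-TB licenses would be needed.")
```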

Storage Management Essentials for Business

Storage capacity requirements are growing at an explosive rate, complicating data and storage management in mission-critical and compliance-driven environments. Enterprises need to securely store more information and more information types. Data must be safely secured and available for rapid recovery in the near term, while also meeting long-term archival and compliance regulations. These complex issues have created a variety of manageability, storage availability and price-performance challenges, ranging from missed service levels to operational risks.

Recent industry trend reports by analysts show that IT budgets are growing at six percent a year, but data under management is growing at 50 to 70 percent or more. Keeping up with data growth while reducing the cost of data management requires deep analysis and an understanding of the underlying storage delivery infrastructure. To realize the financial benefits associated with storage consolidation and data management initiatives, CIOs and storage managers need to collaborate with line of business (LOB) management. That collaboration should begin with developing storage cost management baselines for the enterprise, identifying and measuring parameters such as LOB, geographical location, tier (critical, operations, archival, disaster recovery, etc.), vendor, storage attachment (SAN, NAS, DAS, RAID level, etc.), data classification / file type (structured, unstructured and semi-structured), and network storage devices.

Companies today are aware of the high costs associated with managing stored data and keeping this data available to business-critical applications. These management costs are escalating at a time when corporate IT organizations are looking to streamline operations, ensure that their significant infrastructure investments lead to increased productivity and profitability, and manage more infrastructure resources with fewer personnel. Highly centralized data centers serving the needs of both internal departments and external customers are now regarded as the path to addressing these problems by centralizing procurement and administration of complex systems. These data centers inherently contain massive amounts of storage and a heterogeneous set of server platforms suited to the needs of each application or department.

Until recently, storage management tools have been compared on equal footing with the point tools provided by hardware vendors. These hardware management tools typically did one or two things well, but had to be used alongside other point tools on other hardware to perform a complete storage management function. Changing storage demands require an agile, adaptive infrastructure environment to reduce complexity and stay ahead of change. By adapting flexibly to changing requirements, businesses can better manage risk, increase performance, reduce costs and deliver better IT value to the business.

- Mandar Bhide, senior product manager
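A sketch of what such a cost baseline might look like in practice (the breakdown dimensions mirror those listed above; every name and figure is invented):

```python
# Sketch of a storage cost baseline broken down by the dimensions mentioned in
# the article (line of business, tier, storage attachment). Figures are invented.
from collections import defaultdict

inventory = [
    {"lob": "Finance",   "tier": "critical",   "attach": "SAN", "tb": 120, "cost_per_tb": 900},
    {"lob": "Finance",   "tier": "archival",   "attach": "NAS", "tb": 400, "cost_per_tb": 150},
    {"lob": "Marketing", "tier": "operations", "attach": "NAS", "tb": 80,  "cost_per_tb": 300},
    {"lob": "R&D",       "tier": "critical",   "attach": "DAS", "tb": 60,  "cost_per_tb": 700},
]

def baseline(dimension):
    """Total storage cost grouped by one breakdown dimension."""
    totals = defaultdict(float)
    for item in inventory:
        totals[item[dimension]] += item["tb"] * item["cost_per_tb"]
    return dict(totals)

for dimension in ("lob", "tier", "attach"):
    print(dimension, baseline(dimension))
```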