One of the things I always found challenging as a CIO was keeping up with new terms introduced to the market. I sat in numerous meetings where vendors described their products as “hyper converged,” “digital,” or “AI-enabled.” These terms felt like buzzwords without practical implications for my IT organizations. Cutting through the marketing to the meat was never easy.
Today, I find myself on the other side of the conversation, meeting with customers to discuss “software-defined” approaches. Unlike some ideas of the past, I believe this concept has direct applicability to IT organizations and will lead to a new era, which I call “software-defined modernization.”
Let me explain why I think this is different.
The basic idea behind software-defined is that the product capabilities are driven by the software, not the hardware. IDC describes it as services delivered “via a software stack that uses (but is not dependent on) commodity hardware built with off-the-shelf components.”
Once a differentiator, hardware is no longer key to a successful IT operation. Current thinking is that standard hardware components, with the right software, can offer better solutions than proprietary hardware solutions. As we modernize, we have a unique opportunity to change not just how we buy, but what we buy.
Scalability is a big reason that software-defined approaches are growing in popularity.
Scaling up your IT operations is much easier when you are not reliant on buying expensive, proprietary hardware. A software-defined model improves scalability because hardware controls are abstracted into the software, integrated with commodity hardware, and automated through policy. This minimizes the human and financial resources needed to support today’s modern workloads.
Let’s look at a few types of software-defined technologies.
Effective and efficient storage management is not easy, but has a huge impact on IT operations. Unfortunately, with applications and data in on-premises and multi-cloud environments, this is more difficult than ever. Further, Gartner estimates that 80% of data is unstructured. These challenges set the stage for new technologies, like software-defined storage (SDS), to take hold as agencies modernize.
A good way to think about SDS is to compare it to how you watch movies or listen to music. In the olden days (like 10-15 years ago), it was common to buy movies and CDs. I still have stacks of them in my house. Now, however, when I want to watch or listen to something, I stream it. Software-defined storage is similar in the sense that your costs are more in line with your consumption, rather than upfront expenditures that lose value over time.
Further, traditional storage systems typically require that hardware and software upgrades occur together. This can be costly and add operational risk. SDS supports independent software and hardware upgrades, minimizing cost and risk.
As highlighted in the Federal Data Strategy, there is a huge opportunity to mine and analyze data to improve program performance and digital experiences for citizens. A huge impediment to this comes from cost-effectively addressing data storage for transactional and analytic workloads. Each has different performance and capacity needs. SDS mitigates this challenge by pooling existing storage resources for specific processing requirements and extending storage tiers to the cloud to optimize for cost and performance.
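To make the tiering idea concrete, here is a minimal sketch of policy-driven placement. The tier names, thresholds, and dataset records are all invented for illustration; real SDS products express these policies through their own management interfaces.

```python
# Toy policy engine for software-defined storage tiering.
# Tier names and thresholds are hypothetical, not from any SDS product.

def assign_tier(accesses_per_day: float, latency_sensitive: bool) -> str:
    """Pick a storage tier from simple, declarative policy rules."""
    if latency_sensitive or accesses_per_day >= 100:
        return "hot-flash"      # transactional workloads: low latency
    if accesses_per_day >= 1:
        return "warm-disk"      # analytic workloads: capacity over speed
    return "cloud-archive"      # cold data tiered out to cloud object storage

datasets = [
    {"name": "orders-db", "accesses_per_day": 5000, "latency_sensitive": True},
    {"name": "quarterly-reports", "accesses_per_day": 3, "latency_sensitive": False},
    {"name": "2015-audit-logs", "accesses_per_day": 0.01, "latency_sensitive": False},
]

placement = {d["name"]: assign_tier(d["accesses_per_day"], d["latency_sensitive"])
             for d in datasets}
print(placement)
```

The point of the sketch is that placement decisions live in software policy, so adding a new tier (or moving cold data to the cloud) means changing a rule, not buying a new array.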
Google and Facebook pioneered software-defined networking (SDN) with similar efficiencies in mind. These organizations have vast networks and volumes of data, so they desperately need dynamic, scalable, and easily managed networks.
Similar to software-defined storage, software-defined networking decouples basic network functions. The technology disassociates routing (network control) from packet forwarding. This makes network devices programmable, via custom SDN programs, not proprietary software. As a result, network administrators can dynamically adjust traffic flow to meet changing needs. Further, this vendor-neutral approach replaces traditionally complex, decentralized architectures with networks that offer flexibility and centralized control.
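The control/data-plane split described above can be sketched in a few lines. This is a toy model, not a real controller API: the class names, rule format, and “drop” default are assumptions made for illustration only.

```python
# Toy illustration of SDN's separation of network control from packet
# forwarding. All names here are invented for this sketch.

class Controller:
    """Centralized control plane: holds policy and makes routing decisions."""
    def __init__(self):
        self.policy = {}  # (src, dst) -> output port

    def set_policy(self, src, dst, out_port):
        self.policy[(src, dst)] = out_port

    def decide(self, src, dst):
        return self.policy.get((src, dst), "drop")

class Switch:
    """Data plane: forwards from a flow table; asks the controller on a miss."""
    def __init__(self, controller):
        self.controller = controller
        self.flow_table = {}

    def forward(self, src, dst):
        key = (src, dst)
        if key not in self.flow_table:          # flow-table miss
            self.flow_table[key] = self.controller.decide(src, dst)
        return self.flow_table[key]

ctrl = Controller()
ctrl.set_policy("10.0.0.1", "10.0.0.2", "port-2")
sw = Switch(ctrl)
print(sw.forward("10.0.0.1", "10.0.0.2"))  # installed on first miss
print(sw.forward("10.0.0.1", "10.0.0.9"))  # no matching policy: dropped
```

What the sketch shows: rerouting traffic means updating policy once at the controller, rather than logging into each device, which is the source of the flexibility and centralized control described above.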
Software-defined networks are available to agencies today. GSA’s recently awarded Enterprise Infrastructure Solutions (EIS) contract offers a number of SDN solutions. I expect SDN to be an important part of agency cloud adoption plans and a key driver of security enhancements described in the Cloud Smart strategy and the Report to the President on IT Modernization.
The introduction of new technologies is never easy. In my experience, however, the biggest challenge normally isn’t the technology. I don’t expect software-defined to be different.
For example, as software-defined approaches take hold, look for machine learning (ML) to enter the conversation. ML supports intelligent automation through the integration of operational data and programmable storage and networks. As a result, we’ll eventually rely on machines, not humans, to manage storage and networks. A change like this goes well beyond technology and requires comprehensive change management to succeed.
It’s true that all hardware products rely on software. What makes the current software-defined movement different, and not just a buzzword, is the decoupling of hardware and software.
This flexible, vendor-neutral approach has the potential to upend current IT procurement and implementation practices. It also offers an unprecedented level of control as agencies modernize and define the future of their systems and data.
This article was originally published by GovLoop.com on November 5th, 2018.