It seems every year we’re introduced to new terms that describe innovative technologies or industry trends. Often, these words are coined by major vendors or industry analysts and then adopted by the larger market. But when everyone starts to jump on board and companies use the term to catch attention or sway potential customers, it can be confusing to understand exactly what those terms mean.
Lately, that overused (and often misused) term is “software defined,” which VMware first unveiled to describe its vision for a data center in which resources are pooled and managed by software. In just the past few weeks I’ve seen network switches, traditional storage arrays, capacity planning software and even cloud services labeled as “software-defined” products.
This month, we’re asking our Advisory Board members to help cut through the marketing hype and explain what they think makes a product “software defined,” and whether the term itself is even useful.
Brian Kirsch, Milwaukee Area Technical College
Software defined is one of the biggest buzzwords since “cloud,” and yet the concept has existed for many years. Was a PC truly a dedicated hardware platform purchased for a specific purpose, or did Microsoft Windows allow the use of multiple applications? History tells us that you could use the desktop for a variety of applications, from business to personal. However, PC manufacturers did not advertise desktops as “software defined.”
Software has often been the unique piece in the computer world that differentiated one vendor’s hardware platform from another’s. As technology advances, most computer-related products have fewer differences in hardware, making the features and functions enabled through software the selling points. This does not mean we should stop using the term software-defined. Rather, we need to understand that the term has been around far longer than we realize, and that it is becoming more important.
Software-defined represents a new class of products in which the software, rather than the hardware, is the focus and provides the solution. In years past, data center growth was often a hardware path, and software was used to support that function. With more vendors joining this shift toward a software-defined focus, many have increased product functionality and advertised their products as software defined. The question is whether they are truly software-defined products or merely software-feature enhanced.
Data centers are no longer rooms that organizations show off to potential clients. Instead, they are becoming a collection of servers running hypervisors. The real highlights that companies are showcasing are the agility, scalability and redundancy that software-defined products give them. Just as the data center has transformed, so has the vendor infrastructure. When a collection of products becomes a commodity and software abstracts and defines its form and function, that is when you have a product that is truly software defined. Until then, it might simply be a collection of products with new software features.
Rob McShinsky, Dartmouth Hitchcock Medical Center
Not so long ago, “virtual” was the word everywhere in the industry. Now “software-defined <fill in the blank>” has taken the lead as the latest catchphrase vendors use to sound modern and hip. But what does it mean? Here is my take on what software defined means:
- Abstraction of physical resources
- Automation of actions
- Predictive configuration and control of workloads, extending beyond administrator-defined rule sets for resources.
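The three properties above can be sketched in miniature. This is a hypothetical illustration, not a real product API: the `ResourcePool` and `Host` names and the “most free space” placement rule are invented, and the simple rule stands in for the predictive logic a real software-defined product would use.

```python
# Hypothetical sketch of the three properties: abstraction of physical
# resources, automated actions, and software-driven workload placement.
# All class and method names here are invented for illustration.
from dataclasses import dataclass, field

@dataclass
class Host:
    name: str
    capacity_gb: int
    used_gb: int = 0

    @property
    def free_gb(self) -> int:
        return self.capacity_gb - self.used_gb

@dataclass
class ResourcePool:
    """Abstraction: consumers see one pool, not individual hosts."""
    hosts: list = field(default_factory=list)

    def place_workload(self, name: str, size_gb: int) -> str:
        """Automation: software, not an administrator, picks the host.
        A naive 'most free space' rule stands in for a predictive policy."""
        candidates = [h for h in self.hosts if h.free_gb >= size_gb]
        if not candidates:
            raise RuntimeError(f"no capacity available for {name}")
        target = max(candidates, key=lambda h: h.free_gb)
        target.used_gb += size_gb
        return target.name

pool = ResourcePool([Host("esx-01", 100), Host("esx-02", 100, used_gb=40)])
print(pool.place_workload("vm-web", 30))  # lands on the emptier host, esx-01
```

The point of the sketch is that the consumer asks the pool, not a specific box, and a rule set decides where the workload lands.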
Let’s look at a few examples of what has been described as software defined and the features they have in common.
Software defined storage: I/O balancing across an array or between arrays, or in hyper-converged storage systems. Reactive balancing should be expected in such a solution; predictive I/O resource balancing based on workload type, data signatures, or time of day should be the trend.
Software defined networking: Think of the cloud computing scenario where VM workloads are moved among data centers or even among regions. This transition of workloads needs network configurations applied just in time before the move is performed, and torn down just as quickly as workloads transition back out.
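The just-in-time apply-then-tear-down pattern described above can be sketched as a scoped operation. This is a toy model, assuming an invented `SDNController` with `apply` and `tear_down` methods; it is not any real controller’s API.

```python
# Hypothetical sketch: network configuration is applied just before a
# workload moves in and removed when it moves out. SDNController and its
# methods are invented for illustration only.
import contextlib

class SDNController:
    def __init__(self):
        self.active = set()  # (site, vlan) pairs currently programmed

    def apply(self, site: str, vlan: int) -> None:
        self.active.add((site, vlan))

    def tear_down(self, site: str, vlan: int) -> None:
        self.active.discard((site, vlan))

@contextlib.contextmanager
def network_for(controller: SDNController, site: str, vlan: int):
    """Apply config just in time; guarantee teardown when the workload leaves."""
    controller.apply(site, vlan)
    try:
        yield
    finally:
        controller.tear_down(site, vlan)

ctrl = SDNController()
with network_for(ctrl, "dc-east", 210):
    pass  # migrate and run the workload here
assert ctrl.active == set()  # nothing left behind after the workload exits
```

Wrapping the configuration in a scope guarantees the teardown happens even if the migration fails partway through.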
Software defined data centers: This goes beyond the base resources necessary to run workloads. It should include predictive models of potential risks, such as overall load in the current data center, potential power outages, and even global risk or weather events. For the health of the business, power and cooling costs at certain times of year, month, day or hour should be part of this predictive quality, dynamically moving workloads to optimize not only performance but also business costs and service-level agreements for any interval of time. This applies to squeezing the pennies out of multi-million dollar data centers all the way down to a smaller business using local and cloud resources interchangeably to maximize availability and minimize costs. At scale, administrators can no longer handle the task of predicting what is best for the data center or for the business. I am not saying we are at the software-defined level of Skynet, but self-aware computing might just be the next phrase to catch on.
Keith Townsend, IT management consultant
Software-defined has become the new marketing catchphrase. Before software-defined, we had cloud. Marketers placed the cloud label on any Internet-based service, and even enterprises embraced “cloud washing” by calling their virtualized environments clouds. Now, everything seems to be software-defined. A storage array with an OpenStack plug-in? Welcome to software-defined storage. Do you provide an API for provisioning VLANs on your switch? You now have software-defined networking.
So, what is software-defined? Software-defined is the ability to abstract the management and administrative capabilities of a technology. For example, with SDN, it’s the ability to control the provisioning of network devices, VLANs, firewall rules and so on. It’s also the ability to control the flow of data. Products such as VMware NSX go one step further and abstract the data plane in addition to the control plane. However, virtualization of the data plane isn’t required to qualify as software defined.
Storage is a great example of an area where virtualization of the data plane isn’t needed to qualify as software-defined. EMC’s ViPR is an ambitious attempt to abstract the control plane of enterprise storage arrays. In the case of ViPR, the objective is to give a cloud management platform, such as OpenStack or vCloud, the ability to provision storage on virtually any enterprise-class storage array. ViPR doesn’t provide the storage, but provisions the storage. In theory, it doesn’t matter what the underlying storage technology is.
When the control plane is abstracted, policy-based rules can be set for the resources. In the case of SDN, flows can be adjusted based on application performance. Another use case is policies that spin up virtual firewalls as compute capacity is increased to accommodate demand.
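The firewall use case above can be sketched as a simple policy evaluated by software. This is a contrived example under invented assumptions: the “one firewall per 10 VMs” rule and every name here are hypothetical, standing in for whatever rule set a real platform would evaluate.

```python
# Hypothetical sketch of policy-driven control: a rule, not an
# administrator, decides when to create virtual firewalls as compute
# scales out. The ratio and all names are invented for illustration.
import math

actions = []  # record of what the policy engine did

def scale_out_policy(vm_count: int, firewalls: int) -> int:
    """Policy: maintain one virtual firewall per 10 VMs (hypothetical rule)."""
    needed = math.ceil(vm_count / 10)
    while firewalls < needed:
        actions.append("create virtual firewall")
        firewalls += 1
    return firewalls

# Scaling to 25 VMs with 1 existing firewall triggers two create actions.
fw = scale_out_policy(vm_count=25, firewalls=1)
```

The distinction the policy litmus test draws is exactly this: the action is a consequence of a stated rule, not a one-off administrator task.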
At the end of the day, it’s the policy litmus test that determines whether a product is software-defined. If the touted functionality doesn’t include the ability to control resources based on policy, chances are you are looking at software-defined washing.