

Building a foundation with software-defined architecture

Like any good orchestra, all of the elements of your data center must work in harmony. Get your infrastructure back in tune by creating the ideal software-defined data center.

Imagine an orchestra where the conductor has most of the players under control, but the trumpets and the drums are off doing whatever they please. That's where most of us are today when it comes to running a data center. We can easily orchestrate servers into automatically providing endless VMs or containers, but networks are difficult to handle and storage is composed of an unruly bunch of diverse products.

The main reason we have this unstable view of the data center is that servers standardized on the commercial off-the-shelf model years ago, while storage and networks have generally been proprietary products: a vertically integrated meld of hardware and software. Making this gear fit the orchestration model is challenging at best, as each piece of gear needs a unique approach. That's where software-defined architecture comes in.

Finding a new approach for standardization

We've reached a tipping point on the issue of standardization, mainly as a result of large cloud service providers (CSPs) and their influence on industry trends. With the massive growth of the cloud, the way CSPs do business is becoming the mainstream for the rest of the IT industry. The sheer scale of CSP operations makes it impossible for them to manage their data centers manually, as many of us do today. As a result, CSPs have developed ways to control networks and storage and, in the process, have changed the underlying hardware to reduce costs.

The approach these CSPs take is "institutionalized" as software-defined architecture. The concept is simple: Take stripped-down, bare-bones hardware designs using off-the-shelf components and build data services around them as applications in VMs or containers.

The advantage of this method is that it unlocks the software from the hardware, which allows for more flexibility and provides scalable solutions that can adjust to workload demand easily. This brings down hardware costs considerably -- by as much as 50% or more -- and increases competition, which lowers the cost of new software on the market.


All of these are fundamentally economic benefits. Private clouds are much cheaper and easier to justify with the software-defined architecture approach, but there are also some strong operational benefits. Moving to orchestration implies automating many of the decisions that come with controlling your cloud. In the storage space, this means moving from an operational model that requires formal requests and justifications to a self-service or automated provisioning model. This approach raises the question of who owns the decision. Does central IT even need to get involved? Most of the time, users at the department level make the decision to rent storage and take on the responsibility of ownership that rental confers. Central IT's role is to set up the scripts that allow the rental of space and to ensure the data is handled according to governance guidelines, a process that can be built into the scripts themselves.
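To make the self-service model concrete, here is a minimal sketch of a provisioning script with governance checks built in. All names, quotas and tiers here are hypothetical; a real deployment would call the storage platform's own API rather than return an in-memory record.

```python
# Hypothetical governance policy that central IT maintains; the numbers
# and tier names are illustrative, not from any real product.
GOVERNANCE_POLICY = {
    "max_gb_per_request": 500,
    "allowed_tiers": {"standard", "archive"},
}

def provision_storage(department, size_gb, tier="standard"):
    """Allocate storage only if the request passes governance checks."""
    if size_gb > GOVERNANCE_POLICY["max_gb_per_request"]:
        raise ValueError("Request exceeds per-request quota; escalate to IT.")
    if tier not in GOVERNANCE_POLICY["allowed_tiers"]:
        raise ValueError(f"Tier '{tier}' is not permitted by policy.")
    # Ownership stays with the requesting department; central IT only
    # maintains this script and the policy table above.
    return {"owner": department, "size_gb": size_gb, "tier": tier}

volume = provision_storage("marketing", 200)
print(volume["owner"])  # marketing
```

The point of the sketch is the division of labor: the department self-serves, while governance lives in code that central IT curates.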

The same is true of network setup. With software-defined networking, it is possible to use the scripting approach to build virtual LANs and reconfigure them for new members as needed. In both the storage and the network cases, the administrative efforts of central IT narrow to delivering and maintaining robust scripts in a library and listening to users about new needs and problems. This frees up substantial bandwidth to address new opportunities across the IT spectrum.
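The VLAN case can be sketched the same way. The in-memory dictionary below stands in for a hypothetical SDN controller; in practice these functions would issue calls to the controller's API, but the script-driven shape of the workflow is the same.

```python
# Stand-in for SDN controller state; a real script would talk to the
# controller over its API instead of mutating a local dict.
vlans = {}

def create_vlan(vlan_id, name):
    """Declare a new VLAN with no members yet."""
    vlans[vlan_id] = {"name": name, "members": set()}

def add_member(vlan_id, port):
    """Attach a port to a VLAN."""
    vlans[vlan_id]["members"].add(port)

def reassign_member(port, old_vlan, new_vlan):
    """Move a port between VLANs -- a script call, not a site visit."""
    vlans[old_vlan]["members"].discard(port)
    vlans[new_vlan]["members"].add(port)

create_vlan(10, "finance")
create_vlan(20, "engineering")
add_member(10, "port-1")
reassign_member("port-1", 10, 20)
```

Because membership changes are ordinary function calls, they can sit in the same script library that handles storage requests.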

Software-defined architecture saves time

Software-defined architecture also solves the issue of integrating new gear and removing old or damaged gear as this process becomes automated. The gear will operate to standard application programming interfaces rather than proprietary setup applications, simplifying recognition and configuration. Again, the result is a lot of time saved.
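A short sketch shows what onboarding against a standard API looks like. The `Device` class below is hypothetical: it stands in for any piece of gear that answers a common describe/configure interface, so orchestration can recognize and configure it without a vendor-specific setup tool.

```python
# Hypothetical device speaking a common discovery/configure interface;
# the class and method names are illustrative, not a real product API.
class Device:
    def __init__(self, device_id, kind):
        self.device_id = device_id
        self.kind = kind          # e.g. "switch" or "storage"
        self.configured = False

    def describe(self):
        # A standard API lets orchestration identify gear generically.
        return {"id": self.device_id, "kind": self.kind}

    def apply_config(self, config):
        # In real gear this would push the config; here we just flag it.
        self.configured = True

def onboard(devices, configs_by_kind):
    """Recognize each device and push the standard config for its kind."""
    for dev in devices:
        info = dev.describe()
        dev.apply_config(configs_by_kind[info["kind"]])
```

The same loop handles removal: gear that stops answering `describe()` simply drops out of the next pass, with no per-vendor teardown procedure.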

Automation also reduces finger trouble. The scripting approach, with a template library and a "fill-in-the-blanks" model, should reduce human errors. Events such as server updates will also be streamlined and handled within orchestration.
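The fill-in-the-blanks model is easy to picture with Python's standard `string.Template`. The template text below is invented for illustration; a real library would hold vetted provisioning scripts, but the error-catching behavior is the point.

```python
from string import Template

# Illustrative template library; real entries would be vetted scripts.
TEMPLATES = {
    "new_volume": Template("create volume $name size ${size_gb}GB tier $tier"),
}

def render(template_name, **fields):
    # substitute() raises KeyError on a missing blank, surfacing the
    # human error immediately instead of emitting a broken command.
    return TEMPLATES[template_name].substitute(**fields)

print(render("new_volume", name="dept-vol1", size_gb=200, tier="standard"))
# create volume dept-vol1 size 200GB tier standard
```

Leaving a blank unfilled fails loudly at render time, which is exactly the kind of finger trouble the template library is meant to catch.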

All of these time savings will lead the CIO to ask what to do with the team. Clearly, humdrum administration tasks are going to shrink, but there is a new challenge: system tuning, which has likely been a back-burner issue under the old regime of manual changes to infrastructure. All this new storage and networking gear and the associated software will create opportunities to build smarter and faster solutions, with faster job completion and less gear needed in the data center as the payoff.

A strong technical base on which to build a robust hybrid cloud is another benefit of software-defined architecture. Coupled with the relief from daily operational pressure that software-defined architecture brings, this should ease the task of balancing workloads between a private cloud operation and a variety of public clouds. The automation process should be used to enshrine good security and data integrity practices, which in turn should lead to a much better hybrid cloud experience.

We are out of the hype stage on software-defined architecture. Networking is moving faster than storage, reaching the point where companies such as Dell have begun to offer products based on purely open hardware and virtualized software. The technology of software-defined architecture is still evolving, but there's enough there already to make a "software-defined sandbox" a worthwhile investment.
