Article

Microsoft strategist explains 'integrated' virtualization game plan

Alex Barrett
In part one of this interview, David Greschler, Microsoft's director of integrated virtualization strategy, discussed the company's virtualization strategy and plans.


Here in part two, Greschler discusses Microsoft's vision of tighter integration among server virtualization, desktop virtualization and cloud computing.

For more on Microsoft's virtualization strategy:
A podcast of the David Greschler interview is available on SearchServerVirtualization.com
Your title is director of integrated virtualization strategy. What does 'integrated' refer to?
D.G.: When I first started four years ago, it was really about tying together the core pieces -- management, Hyper-V, application virtualization, and Terminal Services, now called Remote Desktop Services. Over time, people have begun to understand how the different layers of isolation that virtualization provides lead us to cloud computing. My job changes as the technology is embraced, and as we innovate at Microsoft and think about where our products are headed.

In the field, those doing server virtualization, desktop virtualization and cloud are three distinct groups -- data center folks, desktop support operations, and developers. So how do we get to this integration Microsoft is talking about?
D.G.: It takes time. Take the desktop. The desktop team has always been thinking about ways to make desktops more available, make them more agile and reduce the break/fix problems. They're being pressured by the server people toward VDI [virtual desktop infrastructure].

Why not run desktop workloads on a hypervisor? Of course, the desktop problem is much more complex than just moving a desktop from an installed machine over to a server. Initially they adopted technologies like Terminal Services and application virtualization, and now they're looking at VDI. There's a slow recognition among desktop administrators that, yes, virtualization can help.

Microsoft is the leader in the desktop management space with System Center Configuration Manager, and the next version -- a beta is coming out in Q2 -- will leverage virtualization. It includes a user device affinity feature: the idea that administrators can create rules and deliver desktops and applications to people depending on what device they're connecting from. So if you're at home and need to access an application, [SCCM] will recognize that it's not your primary machine and will deliver the application to you virtually, using Terminal Services. The flexibility that virtualization provides is something that we're embracing and putting into our products.
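To make that rule concrete, here is a minimal, hypothetical sketch of the decision Greschler describes -- the names below are invented for illustration and are not Configuration Manager's actual API. If the device a user connects from isn't their primary machine, the application is delivered virtually rather than installed locally.

```python
# Hypothetical illustration of a device-affinity delivery rule.
# None of these names come from System Center Configuration Manager;
# they only sketch the decision described above.

from dataclasses import dataclass


@dataclass
class Device:
    name: str
    is_primary_for_user: bool


def choose_delivery(device: Device) -> str:
    """Return how an application should reach the user on this device."""
    if device.is_primary_for_user:
        # Primary machine: install the application locally.
        return "local install"
    # Any other device (e.g., a home PC): deliver it virtually,
    # via a remote session or application virtualization.
    return "virtual delivery (Terminal Services / App-V)"


print(choose_delivery(Device("alice-home-pc", is_primary_for_user=False)))
# -> virtual delivery (Terminal Services / App-V)
```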

What about technologies designed for the desktop bubbling up to the server world?
D.G.: We developed application virtualization for the desktop because it solved a lot of issues there, and now we're putting it into our server offering. The next version of Virtual Machine Manager will include the ability to virtualize an application. That's different than wrapping a virtual machine around an application. It means taking that application and all its configurations and dependent applications (server applications are usually tied to databases and other applications) and encapsulating them and moving them around without actually installing them to the operating system. We call that Application Virtualization for Servers.
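As a rough illustration of what "encapsulating" means here, the sketch below shows an application, its configuration and its dependencies captured as a single movable package that is streamed to a host rather than installed on it. The package structure and field names are invented for this example; they are not App-V's actual format.

```python
# Hypothetical sketch of a virtualized server-application package.
# The structure is illustrative only; it is not App-V's real schema.

app_package = {
    "name": "OrderService",
    "binaries": ["orderservice.exe"],
    "configuration": {"connection_string": "Server=db01;Database=orders"},
    "dependencies": ["dotnet-runtime", "reporting-agent"],
}


def deploy(package: dict, host: str) -> None:
    """Move the encapsulated package to a host and run it in isolation,
    without writing anything into the host operating system."""
    print(f"Streaming {package['name']} and its dependencies to {host} "
          f"(no local installation performed)")


# The same package can be moved between hosts because nothing was installed.
deploy(app_package, "vmhost-01")
deploy(app_package, "vmhost-02")
```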

Any parting thoughts?
D.G.: People tend to underestimate Microsoft's commitment, investment and innovation in this space. But … beyond all the hype and the he-said, she-said, Microsoft is grabbing hold of what virtualization offers.
On the server side, with dynamic memory, we're going to be there soon enough, and in many ways we're on par. On the desktop side, we're ahead. We've also innovated dramatically in cloud computing. At the Microsoft Management Summit, Bob Muglia demonstrated how Operations Manager will, from a single pane of glass, display a workload in your on-premises data center, workloads in a hosted environment, and workloads running in Azure. The ability to move between these different clouds seamlessly is something no other vendor can offer.
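In spirit, the "single pane of glass" demo amounts to pulling workload status from several environments into one consolidated view. The sketch below is a hypothetical illustration of that idea only; the environment names and status fields are invented and do not reflect Operations Manager's actual interfaces.

```python
# Hypothetical illustration of a single-pane-of-glass view across clouds.
# The sources and status fields are invented for this sketch.

workload_sources = {
    "on-premises": [{"name": "payroll-vm", "status": "running"}],
    "hosted":      [{"name": "web-frontend", "status": "running"}],
    "Azure":       [{"name": "batch-worker", "status": "stopped"}],
}


def single_pane_of_glass(sources: dict) -> None:
    """Print every workload, wherever it runs, in one consolidated view."""
    for environment, workloads in sources.items():
        for workload in workloads:
            print(f"[{environment:12}] {workload['name']:14} {workload['status']}")


single_pane_of_glass(workload_sources)
```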

The underlying thought behind all of this is IT as a Service. We're moving away from a world where IT is something you have to order hardware for, set up, install, test, and eventually deliver two or four months later. Rather, IT is something that when you need it, you get it. It's turned on and off like a lightbulb. That applies whether it's delivering desktops as users move from one machine or device to another, or on the server side, where increased demand for a workload means running it not just in your in-house data center but in some form of cloud or another. It's really taking a new view of IT: thinking of it as instantly available, and instantly turned off when necessary. That's what's underneath all these innovations and products: treating IT as a service.

Let us know what you think about the story; email Alex Barrett, News Director, at abarrett@techtarget.com, or follow @aebarrett on Twitter.

