This article is part four of a six-part series exploring virtual desktops in today's IT environment.
The previous articles in this series compared and contrasted the merits of virtual desktop infrastructure (VDI) solutions to server-based computing solutions and traditional local desktop environments. Now that we've explored the pros and cons of each technology, let's look at where VDI makes sense in today's world.
As I wrote previously, VDI technology will never replace local desktop computing entirely. But consider that while many of us moved most of our mundane applications off the desktop and into the data center via server-based computing (SBC) technologies, that approach only got us so far. (Maybe 80 percent?)
So, why only 80 percent? Why can't you bring the rest of your applications into the data center? Possible reasons include:
- Users need offline access (traveling laptops, etc.)
- The applications are not terminal-server compatible.
- The applications are resource hogs that "kill" a terminal server.
- The applications are graphics-intensive and don't work well over a thin-client remote display protocol like Remote Desktop Protocol (RDP) or Independent Computing Architecture (ICA).
- The effort to make the apps work in the SBC environment isn't worth the benefit.
As I said, VDI is not the be-all and end-all of application delivery. SBC is a good foundation. From there, look at the list above and think about VDI. Reasons two, three and five can be solved with VDI solutions.
(And just to reiterate, why would you "want" to solve items two, three and five from that list? Refer back to article two in this series.)
It should be obvious by now that any environment will benefit from a blended approach of SBC, VDI and traditional local desktops. Just as it makes sense to build a comprehensive application solution that involves SBC, traditionally installed applications, and application streaming, you should think about the desktop as "just another application" that can be delivered in many ways depending on the situation.
The over-hyped example of remote software developers is always trotted out to answer the question, "Why would someone need VDI?" The idea is that each remote developer can have his or her own virtual machine (VM) desktop and do whatever they want to it without affecting other users.
While that use case is certainly a good example, VDI is also useful in many other ways. My fear is that always leading with the developer example will make people think they don't need VDI if they don't have any remote developers.
The reality is that VDI technology is useful in any scenario where you have power users or users who need strange, non-terminal-server-compatible applications, but where the users still need the flexibility associated with traditional SBC environments. (Examples include connecting to applications from anywhere, over slow connections, etc.)
I think VDI will be useful just about everywhere, but in a limited way. It will be just one of multiple methods used to provide a desktop to a user. VDI can play a role in nearly 100% of companies out there, but probably for only 2% to 4% of the users at those companies. So, yes, it's useful, but no, no one is throwing out their SBC environments or desktop computers.
The next article in this series will discuss what technology makes VDI possible.
About the author: Brian Madden is an independent technology analyst, author, and thinker based in Washington, DC. He's written several books and hundreds of articles about Citrix and thin-client computing technology. Brian is a three-time Microsoft MVP, a Citrix Technology Professional (CTP), and he currently speaks and teaches throughout the world.