How to plan for desktop virtualization

Tackle virtualization deployment in phases—not as an all-or-nothing proposition.

The move to desktop virtualization is not easy. Proper planning and testing are essential to ensure that the technologies selected suit the business use cases and that the computing infrastructure can keep up with the resulting demands. For many data centers, it’s time to review the planning philosophies behind desktop virtualization, along with some basic considerations for growing an existing deployment.

Desktop virtualization vendors extol the virtues of their technologies, but the benefits can easily be lost in poorly planned deployments or inappropriate use cases. So the first consideration in any desktop virtualization plan is to decide where and when it should be used.

In general, today’s desktop virtualization deployments incorporate a mix of virtual desktop infrastructure (VDI) and application virtualization, first provisioning and serving a basic operating system image and then providing access to a selection of virtualized applications based on each user’s Active Directory setup. For many organizations, it’s an approach that works well, offering good performance with minimal storage demands. But IT pros must evaluate and choose their own best approaches.
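To make that approach concrete, here is a minimal Python sketch of how group membership might map to a set of virtualized applications. The group and application names are hypothetical, and a real deployment would rely on the vendor’s entitlement tooling rather than hand-rolled code:

```python
# Minimal sketch: mapping Active Directory group membership to sets of
# virtualized applications. Group and application names are hypothetical.

APP_ENTITLEMENTS = {
    "Sales":       {"CRM Client", "Office Suite"},
    "Engineering": {"IDE", "Office Suite", "Diagram Tool"},
    "Finance":     {"Accounting Package", "Office Suite"},
}

def apps_for_user(ad_groups):
    """Return the union of app sets for every AD group the user belongs to."""
    apps = set()
    for group in ad_groups:
        apps |= APP_ENTITLEMENTS.get(group, set())
    return sorted(apps)

# A user in both Sales and Finance sees the union of both entitlements.
print(apps_for_user(["Sales", "Finance"]))
# -> ['Accounting Package', 'CRM Client', 'Office Suite']
```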

For example, desktop virtualization is appealing for a mobile workforce or an organization with many remote locations or offices. IT support can be eliminated at those remote locations: if an endpoint fails, just plug in a new one, or log in from another endpoint to reach the same desktop. Similarly, user groups with identical application needs, such as sales reps entering orders or physicians making rounds, can often benefit from desktop virtualization.

Some use cases may not perform well

But some use cases may not map well to desktop virtualization. Custom applications or demanding graphics and media-creation software, such as Autodesk Inc.’s AutoCAD or Adobe Systems Inc.’s Adobe Creative Suite, may not perform well in a client/server desktop paradigm.

This is partly because graphics-intensive software running on a server does not have access to the high-performance GPU resources often available in PC-based video cards. And even though user profiles allow a limited amount of personalization, users who demand high levels of personalization or completely unique desktop instances incur high storage and maintenance overhead, erasing much of virtualization’s benefit.

“A bad path to follow is simply taking your current physical environment that’s already inefficient and ineffective and expensive, and adding complexity to it by virtualizing it,” said Dustin Fennell, vice president and CIO at Scottsdale Community College in Scottsdale, Ariz. “That’s probably one of the most expensive strategies when leveraging this type of technology.” IT professionals who are using desktop virtualization say that planning is essential for a successful deployment.

A deployment plan can be broken into the following five areas:

  • Analysis
  • Assessment
  • Design
  • Implementation
  • Management

Generally, an analysis is the business case that helps to set expectations for the project. “This is about right-sizing the expectation—understanding what desktop virtualization could really bring to the organization,” said Ian Song, senior research analyst at IDC. “A lot of organizations have overarching expectations, thinking they can cut their desktop hardware or immediately eliminate all the desktop problems that they’re having in the environment, which certainly isn’t the case.”

Assessing assets and requirements

Next is an assessment, which takes stock of the available assets and the project’s requirements. For example, a careful study of users’ needs, including the way in which they use their systems and applications, should reveal the best use cases. Each application must be thoroughly tested to verify that it will perform adequately in a virtualized environment.

Licensing costs for each operating system and application instance must be evaluated and understood. Weigh the available data center resources, and identify any server, network or storage limitations that should be addressed prior to deployment. In addition, look at the IT skill set, and determine whether there is enough expertise to deploy and manage desktop virtualization.
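As a rough illustration of this kind of assessment arithmetic, the short Python sketch below totals assumed licensing costs and estimates the number of hosts from a pilot consolidation ratio. Every figure is an assumption chosen for illustration, not vendor pricing or sizing guidance:

```python
# Back-of-the-envelope assessment sketch. Every figure here is an
# illustrative assumption, not vendor pricing or sizing guidance.

USERS = 500
OS_LICENSE = 50          # per virtual desktop, per year (assumed)
APP_LICENSE = 120        # average per user across virtualized apps (assumed)
DESKTOPS_PER_HOST = 60   # consolidation ratio from a pilot (assumed)
HOST_COST = 12_000       # fully configured server (assumed)

annual_licensing = USERS * (OS_LICENSE + APP_LICENSE)
hosts_needed = -(-USERS // DESKTOPS_PER_HOST)   # ceiling division
server_capex = hosts_needed * HOST_COST

print(f"Annual licensing: ${annual_licensing:,}")   # $85,000
print(f"Hosts needed:     {hosts_needed}")          # 9
print(f"Server capex:     ${server_capex:,}")       # $108,000
```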

The design phase involves evaluating and selecting virtualization technologies and then developing a proof of concept around the expected user base. This is where the IT team can review server, network and storage infrastructure performance, tweak and tune the approach, and predict potential bottlenecks as the project scales over time. The design phase is also often where an organization starts to see the early value of desktop virtualization.

Implementation is the actual rollout of an initial desktop virtualization infrastructure. This often includes some amount of scaling as additional user groups are added and integrated over time.

The important idea here is that implementation should always be approached in a systematic manner. “Virtualize a type of user or business unit first that can benefit the most,” Song said. “Then, depending on the success of that, the organization will have the ability to scale the project up or down.”

Management requires IT expertise

The final part of the plan should include management, which is critical if data centers are going to get any real value from desktop virtualization, Song said. Managing a virtual desktop environment requires a level of IT expertise to ensure that continuity is maintained.

For example, IT administrators should deploy new operating system patches or application updates quickly and efficiently. They should identify performance problems, address them decisively and provision new desktops as needed. If there is not enough IT support, it may be necessary to increase staffing or involve outside service providers. Otherwise, the user experience—and resulting productivity—may suffer and jeopardize the entire project.

Growing the desktop virtualization plan

Desktop virtualization can typically draw from VDI and application virtualization technologies to accomplish an organization’s goals. These technologies can be used independently, but they can also be mixed to provide a more efficient use of enterprise storage.

VDI, such as VMware View or Citrix XenDesktop, is often used to provision the core desktop instance with the operating system. Because most users run the same OS, this single image can be instanced for a large number of endpoints. Even when several different operating system versions are in use, only a small number of “golden images” needs to be instanced.

Once the basic desktop is provisioned, an organization can use application virtualization or presentation technology such as VMware ThinApp or Citrix XenApp to make applications available to end users. It is possible to incorporate applications into the VDI desktop image, but doing so increases the size and load time of each instance. It also results in a greater proliferation of golden images as each variation is patched and updated over time.
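The storage argument for golden images is easy to quantify. The Python sketch below compares full per-user images against linked clones that share a few golden images plus a small per-user delta; all sizes and counts are assumed values meant only to show the arithmetic:

```python
# Illustrative storage comparison: full per-user images versus linked
# clones sharing a few golden images. All sizes are assumed figures.

USERS = 500
FULL_IMAGE_GB = 40      # complete OS-plus-apps image per user (assumed)
GOLDEN_IMAGES = 3       # e.g., one per OS version in use (assumed)
GOLDEN_IMAGE_GB = 40    # size of each shared master image (assumed)
DELTA_GB = 3            # per-user writable delta and profile (assumed)

full_clone_storage = USERS * FULL_IMAGE_GB
linked_clone_storage = GOLDEN_IMAGES * GOLDEN_IMAGE_GB + USERS * DELTA_GB

print(f"Full clones:   {full_clone_storage:,} GB")    # 20,000 GB
print(f"Linked clones: {linked_clone_storage:,} GB")  # 1,620 GB
```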

Keeping the desktop and application access separate will usually increase efficiency. But ultimately the choice of what to deliver by VDI and what to deliver through application virtualization will depend on the needs of the organization as defined in an initial analysis or assessment.

“When we use VDI, it’s really just the base operating system,” Fennell said. “All of our end users have a consistent experience no matter how they’re accessing our resources,” he said, adding that the experience is the same whether it’s on a college-owned computer, their home computers or the laptops they bring with them on campus.

Factors that limit size or scope

Each organization should also consider other factors that limit the size or scope of a virtual desktop deployment. Today’s major vendors have refined their technologies extensively, and the ability to scale a VDI is rarely limited by individual server, network or storage technologies. Storage I/O performance and server resources certainly matter, but proper resource monitoring and capacity planning can head off unexpected resource shortages.
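As a simple illustration of that kind of capacity planning, the Python sketch below flags any resource that crosses an assumed utilization threshold. The resource names, the numbers and the 80% policy are all hypothetical:

```python
# Minimal capacity-planning check: flag any resource approaching a
# utilization threshold before it becomes a shortage. The resource
# names and numbers are hypothetical observations.

THRESHOLD = 0.80   # alert at 80% of capacity (assumed policy)

observations = [
    ("host CPU (vCPUs allocated)",     410,    480),
    ("storage (steady-state IOPS)",  9_500, 12_000),
    ("network uplink (Mbit/s)",        700,  2_000),
]

for name, used, capacity in observations:
    utilization = used / capacity
    status = "PLAN UPGRADE" if utilization >= THRESHOLD else "ok"
    print(f"{name:30s} {utilization:5.0%}  {status}")
```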

In most cases, though, it’s a lack of management that imposes the most limitations. “Often I see organizations use VDI or other desktop virtualization technology as a patch to plug existing holes in their desktop infrastructure,” Song said. “When a project is used in this kind of tactical manner, it will never grow beyond what it already fixes.” Song underscored the importance of strategic, long-term planning to provide a growth path for VDI projects.

Another limitation involves licensing. Unplanned or enterprise-wide licensing costs can stall a desktop virtualization project. For example, many third-party software vendors don’t offer licensing that’s appropriate for delivering applications in a virtual environment.

Fennell described a particular licensing problem involving Adobe Creative Suite that limits the locations where users can access the software. “If I licensed Adobe how they would want, I would end up spending more money on the Adobe license than I do on my entire virtual environment every year,” Fennell said.

Which applications work well?

In terms of application compatibility, popular “canned” applications such as Microsoft Office, Adobe Acrobat and other mainstream software seem to work quite well when delivered through application virtualization. By contrast, it’s the specialized applications and those developed internally that most often encounter problems with desktop virtualization.

There is no substitute for testing each application and evaluating its performance carefully among selected users before rolling it out to the entire enterprise. When a troublesome application is identified, it will have to be recoded to support the virtual environment or replaced with another vendor’s product.

In the end, a desktop virtualization plan should be approached in phases, not rolled out as an all-or-nothing proposition. Organizations might implement application virtualization first to prove out core applications, then deploy VDI later on. In the meantime, they may choose a hybrid model where applications are available both locally and virtually, so that users can access the software in both modes and gain confidence with it.


About the Author

Stephen J. Bigelow, a senior technology editor in the Data Center and Virtualization Media Group at TechTarget Inc., has more than 20 years of technical writing experience in the PC/technology industry. He holds a bachelor of science in electrical engineering, along with CompTIA A+, Network+, Security+ and Server+ certifications and has written hundreds of articles and more than 15 feature books on computer troubleshooting, including Bigelow’s PC Hardware Desk Reference and Bigelow’s PC Hardware Annoyances. Contact him at sbigelow@techtarget.com.

This was first published in October 2011
