Hybrid Desktop Virtualization Aligns IT’s Needs with End User Requirements

This series has perhaps unfairly pointed a finger at VDI for being insufficient across many business use cases. The reality is that both server‐based and client‐based desktop virtualization have their detractors. In the server‐based (VDI) model, consolidating desktops into the data center moves the processing off perfectly good hardware and onto perfectly new servers. Great idea, bad execution. In a straight comparison, a server is always more expensive than a desktop. So unless you have those servers lying around, you're in for a capital expense.

What many potential VDI adopters often don't realize is that even after relocating that processing, there still must be some device at the endpoint to receive and display the virtual desktop. If that device is the same old desktop computer, you're now running double‐desktops. We've already agreed that's a bad idea. However, if you switch to thin clients, you'll find yourself with yet another capital expenditure.

The client‐based approach isn't that much better. If a Type‐1 hypervisor were available that supported every possible hardware configuration, the client‐based approach would absolutely be an attractive option. But that world doesn't yet exist. Type‐2 solutions are available, yet they create the same unexciting double‐desktops situation and have been notorious for poor overall performance.

The everything‐for‐everyone end state promised at conventional wisdom's VDI decibel level gets muddier the more you analyze desktop virtualization's potential users and use cases. Dig deeper into your own classes of users and you'll find that the best solution is probably a combination of both.

Yet what is still lacking in all the options discussed so far is a priority on preserving the user experience. Users demand the ability to customize their workspace. They also, often without realizing it, demand local application processing in the circumstances discussed in the previous article. At the same time, IT must centralize to succeed in managing a growing environment that only gets more complex over time. Merging these two requirements is the goal of desktop virtualization in general.

Hybrid Desktop Virtualization and "Layering the OS"

Seek solutions in the middle. The correct one leverages all the centralization tactics proffered by VDI while retaining the local execution users demand. One way to get there is through a focus on desktop synchronization, also called hybrid desktop virtualization. In this architecture, it becomes possible to achieve the goals of IT while giving users what they want.

The hybrid model abandons the server‐based model's relocation of desktop processing into the data center. At the same time, it eschews the client‐based notion of delivering a very large virtual machine to local hardware only to run atop something else.

In place of these two opposing options, the hybrid approach works with the operating system (OS), applications, and data that are installed directly on the user's computer. Beginning with an OS template, the hybrid model deconstructs the monolithic notion of "the user's computer" into a series of stacked layers, each of which can then be individually managed.

Although not specifically focused on the hybrid model's deconstruction, I introduced the notion of OS layering in a 2009 TechNet Magazine column called A Case for a Layered Approach to Deploying Windows Desktops (http://technet.microsoft.com/en-us/magazine/ee835710.aspx). In that article, I suggested that:

…your average Windows desktop is a lot like the layers of an onion. At its core is an operating system with default and out‐of‐the‐box settings. Installed atop that core are the necessary drivers as well as required OS updates and patches. Next up are the user's needed applications, along with individual configuration settings that define your custom desktop environment and modify application settings. Finally, the top layer includes the user's specific personality data such as bookmarks, desktop shortcuts and printers.

All of these individual layers combine to create what ultimately becomes the user's working environment. If you peel back each layer, you eventually make your way back to the core operating system itself.

Later in the article, I reformatted the text in that quote into a graphical format. I've reprinted that image here (see Figure 1) to visually separate each layer from those that sit above and below it. Although the image (and the article) reflects Microsoft's free tools for managing the content within each layer, you can easily imagine the range of third‐party solutions that handle the same behaviors.

Figure 1: Layers of the Windows desktop.
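To make the layering idea more concrete, here is a minimal, purely illustrative Python sketch of a desktop modeled as a stack of independently managed layers. The class names, fields, and sample values are hypothetical; they mirror the layers in Figure 1, not any vendor's actual product or API.

from dataclasses import dataclass, field

@dataclass
class Layer:
    """One independently managed slice of the desktop stack."""
    name: str                                     # for example, "Core OS" or "User Personality"
    contents: dict = field(default_factory=dict)  # whatever that layer contributes

@dataclass
class Desktop:
    """A user's desktop, deconstructed into ordered layers (bottom to top)."""
    layers: list

    def rebuild(self):
        """Applying the layers bottom-up yields the user's working environment."""
        environment = {}
        for layer in self.layers:  # core OS first, personality data last
            environment.update(layer.contents)
        return environment

# The four layers from Figure 1, bottom to top.
desktop = Desktop(layers=[
    Layer("Core OS", {"os": "Windows with out-of-the-box settings"}),
    Layer("Drivers and Patches", {"patch_level": "current updates"}),
    Layer("Applications and Configuration", {"apps": ["Office", "Reader"]}),
    Layer("User Personality", {"bookmarks": ["intranet"], "printers": ["Floor 3 laser"]}),
])

print(desktop.rebuild())  # the assembled working environment

Managing each layer as its own object is the whole point: a layer can be serviced or replaced without rebuilding the monolith beneath it.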

The significant difference between my original layering concept and the hybrid desktop virtualization architecture is the addition of a synchronization function. Imagine, if you will, that your desktop has been deployed using a solution that follows the hybrid model. A base image has been laid down on disk, followed by a series of applications and specialized configurations. After that come your user personality elements.

Unlike a typical Windows installation, this deployment is provisioned via synchronization from a centralized server and template image. Once deployed, your subsequent changes to that image are themselves synchronized back to the central database. Anything you do that affects one of the layers depicted in Figure 1 is automatically and transparently replicated back to that central store.
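As a rough illustration of that provisioning-and-sync-back flow, consider the following hypothetical Python sketch. The CentralStore class and its methods are inventions for this article; real hybrid products add block-level deduplication, scheduling, and conflict handling that this sketch ignores.

import copy

class CentralStore:
    """Server-side store holding the golden template plus each user's layers."""
    def __init__(self, template):
        self.template = template      # centrally managed OS/application template
        self.user_layers = {}         # per-user changes, keyed by username

    def provision(self, username):
        """Initial deployment: the endpoint receives a copy of the template."""
        self.user_layers.setdefault(username, {})
        return copy.deepcopy(self.template)

    def sync_up(self, username, local_changes):
        """Changes made on the endpoint replicate back to the central store."""
        self.user_layers[username].update(local_changes)

    def sync_down(self, username):
        """Template plus the user's own layers, ready to resync to any hardware."""
        image = copy.deepcopy(self.template)
        image.update(self.user_layers.get(username, {}))
        return image

store = CentralStore(template={"os": "Windows 7", "apps": ["Office"]})
laptop = store.provision("jsmith")                   # first deployment to the endpoint
laptop["wallpaper"] = "beach.jpg"                    # the user personalizes locally
store.sync_up("jsmith", {"wallpaper": "beach.jpg"})  # the change flows back up
replacement = store.sync_down("jsmith")              # lost hardware? resync elsewhere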

The operational result of this bidirectional synchronization completely changes how IT manages its computers. Your personal desktop synchronizes to a server, so losing it or hitting a non‐recoverable problem is resolved by merely resyncing a previous version back to your hardware. Lose the hardware entirely, and you need only resync the instance to a different device.

Layered applications and personality follow you around as well. Leave your desktop for a conference room, and you can fully expect your personal settings to follow you there. Shift from a desktop to a laptop or vice versa, and the same scenario applies.

IT gains incredible efficiencies when the synchronization goes in the other direction. When IT needs to deploy an update, it need only apply that update to the OS template on the server. Applications work in much the same way: they can be fanned out in an automated fashion or installed directly by hand. Each use case benefits.
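A sketch of that fan-out, again using hypothetical names rather than any real management tool's interface, might look like this:

# IT applies a change once, to the server-side template; synchronized
# endpoints pick it up on their next sync. Names and values are illustrative.

template = {"os": "Windows 7", "patch_level": "2010-05"}

# Each endpoint carries a local copy of the template plus its user's layers.
endpoints = {
    "desktop-042": {"user": "jsmith", "image": dict(template)},
    "laptop-017":  {"user": "alee",   "image": dict(template)},
}

def update_template(change):
    """The single action IT performs: update the central template."""
    template.update(change)

def sync_all():
    """On the next synchronization pass, every endpoint receives the change."""
    for name, endpoint in endpoints.items():
        endpoint["image"].update(template)
        print(name, "is now at patch level", endpoint["image"]["patch_level"])

update_template({"patch_level": "2010-06"})  # one change on the server...
sync_all()                                   # ...reaches every synchronized endpoint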

Seeking the middle needn't require abandoning the options at either end. The hybrid approach's greatest potential lies in its ability to pair with existing server‐based desktop virtualization solutions. Just like VDI, the hybrid approach won't necessarily work well across every use case. That said, its layering does present the possibility of server‐based processing when server‐class equipment (or data center‐grade security) is required. Synchronizing that instance down to local hardware needn't involve long delays, because much of the OS instance is already cached locally.

Hybrid Desktop Virtualization Bridges VDI and Client

VDI may not be the answer, and the answer may not be VDI. But its technologies are important for those special cases. In others, the client approach works well. For those in the middle, hybrid desktop virtualization bridges the gaps in VDI's coverage. At the end of the day, your goal must be to maintain the user experience while ensuring the safety and security of the environment. Getting there sometimes requires taking the middle road.