Application virtualization simplifies the deployment and administration of software by isolating the application from the operating system and thereby eliminating most application conflicts. By encapsulating the applications and separating them from each other and the operating system, application virtualization makes it much easier to deliver applications to users, promoting flexibility and reducing help desk costs.
The application delivery process for virtualized applications is dramatically streamlined and improved. Instead of complex installation processes performed at each system, the application is packaged and delivered as a ready-to-run element. Since the application executes in its own isolated sandbox, the core desktop remains clean and efficient. Application use is easier to control, and troubleshooting is minimized.
While application virtualization technology may be used effectively on both physical and virtual desktops, it is a critical technology for achieving desktop pooling in a desktop virtualization plan. The quantity and diversity of applications in today's market, and the corresponding variance in their use, is very large. As a result, desktop pool designs face a tradeoff between concurrency (which is driven by larger pools) and minimization of application license waste (which occurs when users are issued licenses for applications that they don't use). The larger the desktop pool, the larger the set of installed applications must be; this, in turn, leads to greater waste. A greater number of smaller pools minimizes waste, but reduces concurrency.
Application virtualization solves this problem. Since users can be entitled to just the applications that they need (so that licenses are not issued to users who neither need nor receive access to the software), the shared desktop pool base image becomes more flexible. A wider variety of users can share the same underlying image, which translates into fewer desktop pools and greater concurrency. Nearly all pooled desktop environments benefit from larger pool sizes, enhancing the base benefits of deployment simplification and application isolation inherent in application virtualization.
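The license-waste arithmetic behind this tradeoff can be sketched in a few lines. The numbers and application names below are invented for illustration; the comparison is between baking every application into a shared pool image (every pool member consumes a license for every installed application) and entitling virtualized applications per user.

```python
# Hypothetical illustration of the pool-size / license-waste tradeoff.
# All counts and application names are invented for the example.

def licenses_needed_installed(pool_users, app_users):
    """Apps installed in the shared image: every pool member
    effectively consumes a license for every application."""
    return {app: pool_users for app in app_users}

def licenses_needed_virtualized(app_users):
    """Virtualized apps entitled per user: only actual users
    consume a license."""
    return dict(app_users)

pool_users = 1000                                  # one large shared pool
app_users = {"cad": 50, "erp": 300, "office": 1000}  # actual users per app

installed = licenses_needed_installed(pool_users, app_users)
virtualized = licenses_needed_virtualized(app_users)
waste = {app: installed[app] - virtualized[app] for app in app_users}
print(waste)  # licenses issued to users who never run the app
# -> {'cad': 950, 'erp': 700, 'office': 0}
```

With per-user entitlement the pool can stay at 1,000 users without wasting 950 CAD and 700 ERP licenses; without it, avoiding that waste would require splitting into smaller, less concurrent pools.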
To prepare an application for use as a virtualized software application, a preparation process is required during which the software is "packaged" for distribution and use. Typically, the state of a test system is captured before and after installation to determine the changes made by the installation process. Additionally, the memory image of the software is observed as the user first begins execution, in order to determine which portions of the package are required to start the application.
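The before/after comparison at the heart of packaging can be sketched as a simple state diff. Real packaging tools snapshot the file system and registry; in this hedged sketch, a dict mapping paths to content hashes stands in for that captured state, and all paths and hashes are invented.

```python
# Minimal sketch of the before/after snapshot comparison used when
# packaging an application. A dict of path -> content hash stands in
# for the real file-system/registry snapshot.

def snapshot_diff(before, after):
    """Return the additions, removals and modifications an installer
    made between two captured states."""
    added = {k: v for k, v in after.items() if k not in before}
    removed = {k: v for k, v in before.items() if k not in after}
    modified = {k: after[k] for k in before.keys() & after.keys()
                if before[k] != after[k]}
    return added, removed, modified

before = {r"C:\Windows\system.ini": "aa11", r"C:\app\old.dll": "bb22"}
after = {r"C:\Windows\system.ini": "cc33",              # modified
         r"C:\Program Files\App\app.exe": "dd44"}       # added

added, removed, modified = snapshot_diff(before, after)
```

The union of these three change sets is, in essence, what the packaging process captures as the application's footprint.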
The preparation process is a complex undertaking, as applications may be complex and diverse. While some applications are delivered with little "baggage", typical commercial software may contain features that integrate with the operating system, with other applications, with systems and applications located on other computers, with commonly used applications such as Microsoft Office and Internet Explorer, and with various installed hardware, among others. Such special features complicate packaging and delivery of the virtualized application, because these interfaces may need to be physical or virtual, integrated or isolated, or otherwise specially handled.
In the language of application virtualization, an application has been successfully "packaged" if the packaging process has been completed in a way that preserves all of the functionality required by the user base. Most applications, typically 55-75% of the software estate, can be packaged successfully without special handling, because they do not require any "special" integration. Of the remaining software, some can be packaged through special configuration or handling of the required integrations; some can be packaged with minor functionality loss, which may or may not matter to the user base; and a small subset cannot be packaged successfully because their behavior cannot be sufficiently supported in the application virtualization environment.
In order to prepare for application virtualization use in a production environment, plans must be made to conduct the packaging process. In order to determine whether applications are to be delivered as virtualized, it is desirable to conduct an assessment of the application estate from the perspective of application virtualization.
The goal of an application virtualization assessment is to determine, for each and every software package in the estate, whether the package can be virtualized, which concerns complicate its packaging, and what special configuration or handling may be required.
Effective assessments can be a major undertaking. Software use in most commercial accounts is diverse, often with thousands of installed software packages. A lack of rigid policies, the complications of mobility and the sheer number of systems nearly always mean imperfect version control for most organizations. Users often install their own software as well, and some software may arrive pre-installed on base computer systems or be downloaded directly from the software vendor.
Incomplete application virtualization assessments can be very troublesome. Often, image plans such as desktop pools depend heavily on proper delivery of a set of applications, and the required configuration and installation may be directly influenced by the application virtualization compatibility of the application mix. An incorrect or incomplete assessment nearly always results in extra costs, lost optimization opportunities and project delays.
When a software package has a characteristic or integration that complicates or precludes the packaging process, that software package is said to have an application virtualization "concern". We are careful to avoid saying that such packages "cannot be virtualized", as this is rarely true. Concerns are an indication that a decision must be made as to whether the associated feature is important (since many concerns are found to have no impact on the user base), whether the packaging complication has a specialized configuration or other workaround associated with it, or whether packaging is not possible or not feasible in light of the concern.
In light of this understanding of application virtualization concerns, a primary focus of the application virtualization assessment process is the proper identification of packaging concerns of each software package. Below, we highlight some of the most commonly found packaging concerns. SysTrack identifies all of these application virtualization concerns.
Device drivers are programs designed to operate as part of the operating system, often interfacing directly with hardware or other device drivers. Software packages that install device drivers may be problematic, because the application virtualization software precludes installation of these kernel mode components into the operating system.
Services are special programs that operate in user mode and provide system-level functionality. Services generally do not interact with the user through a user interface, but may be critical to the operation of certain programs. In many application virtualization solutions, some types of services can be virtualized, though specialized configuration may be needed to support their operation.
Some software still includes 16-bit components that run in the Windows-on-Windows (WOW) subsystem, an emulator for the deprecated 16-bit Windows platform that provides backward compatibility under 32-bit versions of Windows. Some of these applications also make certain assumptions about how different 16-bit programs may communicate when running in a shared WOW environment.
Some software includes 64-bit components, which require a 64-bit host system for support. Some application virtualization solutions do not yet fully accommodate these programs. Further complication may arise in the attempted use of these programs in a 32-bit virtual machine, where the 64-bit interface is unavailable; this may impact which client computers can execute the packaged software.
Shell extensions provide additional linkage and functionality that allow the software to be called directly by the Windows shell upon the occurrence of relevant events. Some such extensions are convenient, or even critical, to the user's successful operation of the program. In other cases, shell extensions may be rarely used components that can easily be sacrificed. Proper identification of the shell extension generally informs the decision-making process.
Microsoft Office is widely used, and may be extended through a defined interface through which software programs implement add-ins that are tightly integrated with Office functionality. The functionality delivered via Office add-ins is often very important in the day-to-day usage patterns of the user community, although some add-ins see little use. Identification of the add-in component generally informs the decision-making process.
Some software programs implement extensions to Internet Explorer that deliver functionality directly integrated with this popular web browser, while others enhance the user experience within the browser.
Internet Explorer supports the definition and use of Toolbar features that are provided by third party software programs. These integrations are often highly user visible.
Windows supports a deprecated mechanism that allows software programs to intercept and potentially filter certain messages and events that may occur on the desktop. Although these mechanisms are no longer recommended for use by developers, there still exists a community of software programs that rely on them. SysTrack does not currently report on this concern, although it may be extended to do so in a future release.
Certain DCOM configurations may be restricted by application virtualization technologies. A discussion of these matters is beyond the scope of this paper.
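The concern types above lend themselves to a rule-based classification. The sketch below is a hypothetical illustration of that idea, not SysTrack's actual schema: the attribute names, rule table and package record are invented, and a real inspector would derive these attributes from the installed software rather than receive them as flags.

```python
# Sketch of a rule-based concern classifier covering the concern types
# discussed above. Attribute names and descriptions are illustrative only.

CONCERN_RULES = {
    "installs_driver":  "device driver (kernel-mode component)",
    "installs_service": "Windows service",
    "has_16bit_code":   "16-bit (WOW) component",
    "has_64bit_code":   "64-bit component",
    "shell_extension":  "shell extension",
    "office_addin":     "Microsoft Office add-in",
    "ie_extension":     "Internet Explorer extension/toolbar",
    "uses_hooks":       "deprecated message/event hook",
    "dcom_config":      "restricted DCOM configuration",
}

def identify_concerns(package):
    """Return the list of packaging concerns flagged for a package record."""
    return [desc for attr, desc in CONCERN_RULES.items() if package.get(attr)]

pkg = {"name": "ExampleApp", "installs_service": True, "office_addin": True}
print(identify_concerns(pkg))
# -> ['Windows service', 'Microsoft Office add-in']
```

Keeping the rules in a table like this makes it easy to add new concern types as the virtualization products evolve, without changing the classification logic.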
Application virtualization assessment may be conducted in a number of different ways. Any successful solution must automate discovery; if the user were required to enter a list of software programs to be discovered, the labor required for even simple environments would be onerous.
Software is installed on systems in three primary ways: through the Microsoft-defined .msi process, through a variety of third-party installation assistance programs and through ad-hoc mechanisms. Most software distribution systems (such as Microsoft SCCM/SMS, IBM Tivoli, HP Radia, Symantec Altiris and others) are based on the use of .msi or equivalent processes. Successful application virtualization assessments provide mechanisms to deal with all three installation types.
There are three common approaches used to identify concerns for discovered software: a known-software database, install package examination and live inspection of the software. Below, we review and compare these methods.
In the known-software database method, a large database is maintained, often manually, that identifies a wide variety of common software. Each software package must be identified by at least name and version, since different versions of the same software package often have differing concerns. Such databases may be hosted where they can be accessed over the Internet, or shipped with the toolset and installed locally. The former approach is more easily updated, but is less easily configured and controlled by the end user and requires sharing information over the Internet.
To implement the known-software database design, the assessment toolset acquires the name and version of each software package; some primitive assessment tools are not version aware and may produce limited and misleading results. The name and version are then checked against the database, which returns the associated concerns list.
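A toy version of this lookup makes the version-awareness point concrete. The database entries below are invented; the contrast is between a version-aware lookup and what a version-blind tool effectively does.

```python
# Toy known-software database keyed by (name, version). Entries are
# invented for illustration; a real database would be far larger.

KNOWN_DB = {
    ("ExampleApp", "1.0"): ["installs a device driver"],
    ("ExampleApp", "2.0"): [],  # driver removed in the 2.0 release
}

def lookup_concerns(name, version):
    """Version-aware lookup; None signals the package is unknown."""
    return KNOWN_DB.get((name, version))

def lookup_version_blind(name):
    """What a version-unaware tool effectively does: return whichever
    entry for the name it finds first, which may mislead."""
    for (n, _version), concerns in KNOWN_DB.items():
        if n == name:
            return concerns
    return None

# A user running 2.0 is wrongly flagged by the version-blind lookup:
print(lookup_version_blind("ExampleApp"))  # -> ['installs a device driver']
print(lookup_concerns("ExampleApp", "2.0"))  # -> []
```

The `None` result for an unknown (name, version) pair is exactly the incomplete-database failure mode discussed below: the tool simply has nothing to say about that package.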
This approach has a number of obvious limitations. First, in order to be effective, the database must contain an extraordinarily large collection of software package data, which must be entered carefully through an often manual process. Because the variety of software in the world is enormous, database maintenance is arduous, and incomplete databases result in incomplete assessment data. Further complicating matters, each software package may see many releases each year, and each version may differ from its predecessor, making the maintenance process even more complex, perhaps intractable.
Another problem that may be less obvious is that the concerns for a given software package may vary with the installation environment and configuration. Often, a single package is installable on different operating systems, with different feature sets, a variety of options and various user selections. Different customers may find different concerns based on the installation. In the known-database approach, if the data were derived from the wrong platform, it is likely that the results will also be wrong. In other cases, potential concerns may be overstated, encompassing the universe of all possible concerns in all possible configurations. While certain environmental data might be used to subdivide applications further within the database, this makes the database even more complex to maintain and more prone to error. Moreover, the configuration details that affect the concerns may be application-specific.
Finally, the database approach is ill-equipped to deal with the custom software world. Most customers have at least some software that is constructed in-house, or was built under contract for that specific customer. In many environments, a significant part of the software estate is made up from such packages. The known database approach cannot accommodate assessment of such software packages.
As a result of the above limitations, the industry has generally abandoned the known database approach in favor of processes that are more adaptable. More flexible and tractable techniques are dynamic, in the sense that they can examine software packages directly, without the use of a preexisting known database. As a result, they are always "up to date", handle versioning well and can accommodate custom software packages.
To accomplish this type of assessment, the user loads the installation package into the toolset. In some cases, the toolset restricts inspectable packages to those constructed as .msi packages by the software developer. By examining the package, some types of concerns can be discerned, and recommendations and analysis can then be provided.
While this approach overcomes many problems associated with the known-database method, it still has some important limitations. First, packages may be installed using techniques other than .msi; this may inherently limit the types of packages that can be reviewed. In many cases, the installation package may be specific to the application, making examination impossible or nearly so.
Second, this approach fails to solve the problem of configurations that vary according to the environment, configuration, options or user settings. In this regard, this approach mirrors the limitations described above in the known database method.
Next, this discovery method may be very labor intensive for large environments. Often, packages must be acquired and loaded by hand, and results must be compiled. Discovered software must be associated with the corresponding installation libraries (a step some toolsets do not automate), and analysis must be initiated for each loaded package. If the tool's rule set is improved over time, the entire analysis process must be repeated from the start to take advantage of the better intelligence.
Finally, tools such as these often deal with application virtualization compatibility in a vacuum, separated from other aspects of the environment that may influence decisions regarding application delivery. While application virtualization is an important part of the overall planning process, resource loading, storage, power and cooling, user behaviors, faults and health considerations and security matters are all equally critical in proper planning. Successful toolsets must be integrated into a larger process to succeed.
SysTrack implements the most flexible of the approaches, Live Inspection. The Live Inspection approach avoids the pitfalls of the other two approaches, and additionally minimizes the effort required to conduct the assessment.
Under Live Inspection, SysTrack analyzes already installed software packages as they exist and are used by members of the user community. The discovery and assessment phases are fully integrated, since both can occur simultaneously. There is no need for user involvement in enumerating software packages, since packages can be discovered in the installed environment. The user does not need to perform any special loading step to initiate the assessment process, as the estate can be analyzed "in place". This greatly reduces the labor required in the process.
Another benefit of Live Inspection is that it is effective on packages installed with a variety of methods and mechanisms. While support of packages installed with .msi is the most important, many packages are installed using other techniques. Use of Live Inspection broadens the set of assessable applications to include a wide variety of software.
Availability of the installed instance of each software package also allows examination of the real concerns for the package as impacted by the environmental setup. If the concern list varies from instance to instance based on hosting operating system, configuration, options or user settings, the assessment is done on the software as actually installed for each user. This makes the assessment more relevant and accurate. In many cases, this has the effect of eliminating false positives on the concern set.
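The overall Live Inspection flow can be sketched as a loop over systems and their installed packages, with each instance assessed as configured on that system. This is a hedged illustration, not SysTrack's implementation: the host names, data shapes and the single instance-level rule are all invented.

```python
# Sketch of the Live Inspection flow: discovery and assessment run over
# software as actually installed on each system, so instance-specific
# configuration is reflected in the result. Data shapes are invented.

def live_inspect(systems, assess):
    """Enumerate installed packages per system and assess each instance
    in place; no manual enumeration or package loading is required."""
    report = {}
    for host, packages in systems.items():
        report[host] = {pkg["name"]: assess(pkg) for pkg in packages}
    return report

def assess(pkg):
    # Instance-level rule: the concern depends on how this copy is
    # actually configured, not on the package name alone.
    return ["Windows service"] if pkg.get("service_enabled") else []

systems = {
    "desktop-01": [{"name": "ExampleApp", "service_enabled": True}],
    "desktop-02": [{"name": "ExampleApp", "service_enabled": False}],
}
print(live_inspect(systems, assess))
```

Note how the same package yields a concern on one system and none on another: assessing the installed instance, rather than the package in the abstract, is what eliminates the false positives described above.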
Finally, Live Inspection is inherently integrated with other assessment tools that are very helpful in the planning process. In addition to application virtualization assessment, SysTrack tools review resource loading, storage, power and cooling, user behaviors, latency, application faults, health considerations and security matters, all as valuable information acquired during the assessment. The integration of these disciplines yields a full-featured design and plan, minimizing labor and correlating otherwise disjoint data into a cohesive plan.
As experience with application virtualization expands in today's marketplace, there are two forces of change that drive SysTrack improvements: the need to keep pace with changes in the application virtualization technology and assessment improvements based on real world experience.
Application virtualization assessments provided by SysTrack application offer the state of the art in maximizing the benefits available to users. While the results obtained are broad and effective, the application virtualization products available change rapidly, delivering new capabilities and requiring improved assessments. The SysTrack assessment technology is similarly under constant improvement, delivering higher quality results with each release. The Live Inspection approach taken by SysTrack is highly adaptive, allowing customers to easily benefit from improvements in the toolset.
With each improved toolset, no additional labor is required to take advantage of new features beyond simply installing the revised software. Since the assessment is performed in place, the new tools may be immediately applied to the existing software estate, without the need for reloading or reconfiguration. Results are immediately updated based on new SysTrack features.
Second, when limitations of application virtualization use are discovered in the field for unusual software packages, this information can be directly used to enhance the SysTrack toolset. SysTrack's deep inspection offers a wide variety of information about the behavior of each installed software package, and new rules and techniques may often be applied directly to this rich data mine. Thus, as assessment limitations are discovered in the field, the process of toolset enhancement and field updates to leverage the improvements can be accomplished in a short timeframe.
Organizations that successfully harness the capabilities of application virtualization will greatly reduce administrative costs associated with their application infrastructure, improve flexibility and enhance the user experience. Through SysTrack application virtualization assessment capabilities, the steps necessary to take advantage of this technology are straightforward, reliable and require minimal effort. We encourage you to start assessing your infrastructure today!
To begin, contact your systems integrator or technology provider to arrange for installation of SysTrack virtualization tools. Installation is a fast and easy process, and you'll begin building your data mine immediately. Using SysTrack automation, you'll gain invaluable insight into your environment and detailed technical data to facilitate your use of application virtualization technology.