The phrase "you cannot manage what you do not measure" sums up the importance of a system of record for comprehensive data collection. Without meaningful, quantitative analysis it is impossible to understand or articulate the state of an environment and the satisfaction of its users. Forward-thinking IT leaders have understood this for years, and it is now accepted that this data collection is a requirement, not an option. Implicit in this is another area of key value: understanding an environment in the context of its peers through comparative analytics. Today's IT transformation is both a sprint and a marathon: what matters is not only how much IT improves the business, but how its rate of improvement compares with that of others.
This represents an expansion of the concept of IT analytics – the art and science of gathering detailed data from IT and end user systems and turning that data into useful information. Thus far, however, organizations have lacked the ability to place their own data in a broader context: comparing it against peer organizations in the industry to determine how they are doing relative to others.
This whitepaper explores the concepts and benefits of comparative analytics and introduces the Lakeside SysTrack Community program, which allows IT leaders unprecedented access to additional data from sources outside of their own organization.
In a competitive and increasingly transparent market, IT leaders simply cannot deliver productive IT services if they are unable to track progress and assess their rate of improvement. By properly leveraging comparative analytics, IT leaders gain the ability to respond to the dynamic state of their organizations in real time.
Business Intelligence (BI) solutions use technology that enables an organization to collect, maintain and organize specific knowledge, such as key performance indicators. Traditional BI solutions, however, are primarily focused on an organization's internal data, and generally do not allow the comparison of benchmarks against peers. Common uses include assessing internal performance metrics, measuring problem resolution turnaround rates, evaluating cost measures, analyzing total cost of ownership (TCO) or Return on Investment (ROI), reviewing end user experience, as well as monitoring compliance.
In contrast, comparative analytics solutions leverage the concepts of BI, but add the vital component of contextually comparing that data against peers and other benchmarks. Common uses include user productivity statistics, measuring hardware and software performance, software procurement rationalization, and comparing incident response rates.
Next-generation comparative analytics capabilities can even provide visibility into license utilization (concurrent vs. per user/device) and into hardware and software segmentation by user class, business unit, organizational size, or geographic location. More timely data allows for more accurate comparisons that reflect what is happening now, so issues can be acted on immediately, before any financial impact is realized.
Business intelligence solutions enable organizations to measure their internal performance and identify best practices. However, how do organizations know whether or not their own best practices are actually "best"? What if an organization's internal best practices are still poor when compared to peers? Comparative analytics enable organizations to make that distinction by providing the necessary context — via benchmarking against peers — so they can focus their efforts on improving the areas that will deliver the most meaningful results.
Organizations need to collect, measure and manage key internal business metrics, but they also need to compare them against peers to gain meaningful performance insights. Comparative analytics provides the contextual insight needed to manage enterprises in today's challenging business technology landscape. In addition, comparative analytics increases transparency and accountability at the level of the organization, facility, department, and individual staff member. It allows organizations to pinpoint the areas where their improvement efforts can have the greatest impact. With resources scarce and margins getting thinner, determining the most efficient way to implement improvements is vital to the success of today's IT organizations.
Key factors for successfully using comparative analytics to improve IT performance include: engaging senior leaders in the organization and its business units to gain ground-level insights; communicating and learning with peers; being transparent about what the data show, how they are used, and what value they have delivered; and, most importantly, incorporating end user feedback to continually improve productivity.
A comparative data set is, of course, only as good as the data it contains. SysTrack is used by thousands of organizations for the initial assessment at the beginning of IT transformation projects and is the tool of choice of the major desktop and application virtualization vendors (Citrix, VMware) and Microsoft. SysTrack therefore starts with the broadest footprint in the industry, and many organizations that leverage SysTrack for their initial user and application assessment are joining the community to gain additional comparative insights.
Most organizations using SysTrack for the first time deploy the data collection agents to their systems with the factory default settings for system health gauges, alarm thresholds, and so on. Because they do not yet know quite what to expect, they typically let the system collect data for a few weeks to establish a baseline, and then determine in which areas they can implement changes to positively impact the end user experience.
When looking at an individual system in the SysTrack Resolve tool, it is possible to compare and contrast that system with a peer group of similar systems in the deployment. This is done via the SysTrack grouping mechanism and can be critical for IT support staff in determining whether they are chasing a single unhappy user or whether the issue is consistent across an entire group of systems, and therefore better addressed at the group level than at the individual user or workstation.
Benchmarking takes this concept a step further. Organizations can now take their own data on system and user health scores, application misbehavior, boot and login times, and so on, and benchmark it against that of an industry-wide peer group. This is a critical step toward determining whether something is truly amiss and warrants an investment of time and resources to remediate, whether the situation is common enough to be considered less important, or whether a remediation is already known in the IT community.
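The benchmarking idea described above can be illustrated with a minimal sketch. All figures, field names, and the peer sample below are hypothetical stand-ins; SysTrack exposes such comparisons through its own reporting tools, not through this code.

```python
from statistics import mean

# Hypothetical average login times (seconds) reported by an industry peer group.
peer_login_times = [18.2, 22.5, 25.1, 19.8, 31.4, 24.0, 27.3, 21.6]

# Our organization's average login time from internal data collection.
our_login_time = 29.5

# Percentile rank: share of peers with a login time at or below ours.
# On a "lower is better" metric, a high rank flags a remediation candidate;
# a middling rank suggests the situation is common across the industry.
rank = sum(t <= our_login_time for t in peer_login_times) / len(peer_login_times)

print(f"Peer average: {mean(peer_login_times):.1f}s")
print(f"Our average:  {our_login_time:.1f}s (worse than {rank:.0%} of peers)")
```

The key design point is that the same internal metric means different things depending on where it falls in the peer distribution, which is exactly the context a purely internal BI view cannot supply.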
A great example of the value of industry insight is the selection of software solutions for the business. Assume an organization sells insurance and other financial services, and its customer care department is pressing IT for a complete overhaul and upgrade of the telephony integration and customer relationship management software. An internal battle is raging: better customer service and higher call volumes are the promised business benefits, but the complexity of the implementation and integration is cost prohibitive for IT. "But everybody in our industry is using this system!" is a commonly heard exclamation of frustration among the customer care leadership. IT is in a bind. Is that true, and are the organizations using the proposed system experiencing better system and user health scores?
Previously, there was no way to find out. With SysTrack Community, organizations can quickly run reports to compare their own data against that of other companies in the same vertical. The most commonly used applications are clearly listed, and it is now easy to gain insight into the software portfolio of the industry peer group, which may include competitors, and to make the case to the CIO and CFO.
For many IT organizations, SysTrack provides an unprecedented amount of detailed system and application data, along with an equally detailed historical record. Yet especially in the field of end user computing (as opposed to network or datacenter instrumentation), IT organizations often lack the ability to interpret and correlate the data and to decide on the appropriate course of action in response to particular data points and trends.
Because deep analytics in the end user computing space is a relatively new discipline, there are few published resources available to help organizations with this dilemma.
For example, an organization may see a sudden increase in crashes and faults in a specific application. The question then is: is everybody else seeing this behavior, or could a recently changed dependency on another application or process within the organization be responsible for the observed increase in application faults? Prior to the SysTrack Community's decision support capabilities, organizations had to rely on an arduous trial-and-error process to find out. By checking what other organizations are seeing with respect to the application, the decision on next steps can be made much faster and with greater confidence. This represents a truly evidence-based decision-making process that was previously unavailable to IT organizations.
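The decision logic just described — local issue versus community-wide behavior — can be sketched as a simple comparison. The numbers, the threshold, and the community rate are hypothetical; in practice the peer figure would come from SysTrack Community reports rather than a hardcoded value.

```python
def faults_look_local(our_rate: float, community_rate: float,
                      tolerance: float = 1.5) -> bool:
    """Flag the issue as local when our fault rate (faults per 100 systems
    per week) exceeds the community rate by more than the given factor."""
    return our_rate > community_rate * tolerance

# Hypothetical data for one application.
our_rate = 12.0        # our measured spike
community_rate = 4.5   # what peer organizations report

if faults_look_local(our_rate, community_rate):
    print("Investigate locally: check recently changed dependent apps/processes.")
else:
    print("Community-wide behavior: likely a vendor issue; watch for a fix.")
```

Even this crude rule replaces the trial-and-error approach with an evidence-based first step: it tells support staff where to look before they start changing anything.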
The SysTrack Community gives business decision makers a new set of tools to focus their IT investment on areas that actually drive improvements. The data can reveal areas in which peers are doing better than one's own organization, and those are good candidates for investment, rather than areas in which the organization is already doing much better than its peer groups.