Archive for the ‘Governance’ Category

 

More Baseline Blogging – 24. December, 2010

As promised, there are more blogs to read at Baseline Consulting should you need something to do over the holidays…

Keep It On Track

Do You Know What Your Reports Are Doing?

Also, my recent post Making It Fit made it onto the B-Eye Network! Huzzah!

Happy Holidays!

Publication Notice, with a Twist – 22. February, 2010

A real live press release!

The Data Governance eBook

Publication Again – 17. July, 2009

Ever grateful for the opportunity provided by my company to write, I give you the latest installment…

Process is Half the Story

The Case for the IT Actuary, Part 3 – 24. November, 2008

In Part 2 of this article, we examined the requirements for introducing actuarial concepts into IT project management. We conclude by describing how you might go about implementing these concepts.

——–

The Solution

It is clear that better measurement of the costs associated with large IT initiatives is needed to manage these projects more effectively. It is also clear that there is little incentive to capture the data necessary for the statistical analysis that effective risk management requires at an operational level. This contradiction contributes to information technology’s poor reputation for delivering solutions that satisfy business requirements.

Can this situation be remedied? I believe so, but it will take some new technology and a different perspective from both the people doing the work and the people managing it. The new technology is being developed right now – a great example is the ability of TiVo to “predict” the outcome of the weekly voting on the “American Idol” television show by recording the usage patterns of its subscribers. Capture techniques such as this will be needed to collect the data required for detailed risk assessments and for tracking the usage patterns of spreadmarts within an enterprise.

Management personnel will need to move risk management topics to the forefront of their philosophies, shifting risk management from an anecdotal focus to one of metric analysis. Part of this shift will include acquiring and retaining actuarial skills for the statistical assessment of risk parameters, reducing the uncertainty in the current level of analysis. This will make decisions more effective and save the organization resources and money.
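A standard statistical fact illustrates why measurement reduces the uncertainty described above: the standard error of an estimated mean shrinks with the square root of the number of observations. The sketch below (all figures invented for illustration) shows how collecting more cost measurements narrows the estimate:

```python
# Standard error of the mean shrinks as 1/sqrt(n): the more incidents an
# organization actually measures, the tighter its cost estimates become.
# The standard deviation below is a hypothetical figure, not real data.
import math

cost_stddev = 500.0  # assumed std. dev. of per-incident cost, in dollars

for n in (10, 100, 1000):
    se = cost_stddev / math.sqrt(n)
    print(f"n={n:5d}: standard error of mean cost = ${se:,.1f}")
```

An anecdote is, in effect, a sample of one; even a modest measurement program moves the organization down this curve.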

There are examples of corporate executives requiring metrics-based analysis in support of decisions, as described in Jessica Tsai’s article on predictive analytics. Unfortunately, such examples are rare enough to be called out in vendor presentations – a poor commentary on how widespread these measures really are. The presence of an actuarial staff would improve this situation greatly.

Operational personnel will need to understand their role in reducing overall risk to the organization. The most direct way to accomplish this is to demonstrate the cost of activities such as spreadmarts and impose appropriate sanctions for their continued use. Often, the easiest way to correct behavior is to associate the activity with a tangible financial cost. Once the habit is broken, it is unlikely to reassert itself in the organization, finally realizing the desired cost savings.

Organizational change is rarely quick or easy, but the movement toward a better way of analyzing costs will bear many benefits over the long term for enterprises looking for more effective information technology management techniques.

The Case for the IT Actuary, Part 2 – 29. September, 2008

In Part 1 of this article, we explored the issues surrounding risk management in IT organizations. Part 2 of the article describes the role of the Actuary in business and how this concept could apply to issues facing IT.

—–

The Actuary
The dictionary defines an actuary as “someone versed in the collection and interpretation of numerical data (especially someone who uses statistics to calculate insurance premiums).” The actuarial function is very important to the insurance industry, since the accurate prediction of risk allows the accurate calculation of premium rates and loss exposure, which demonstrates the soundness of the business. Actuaries have access to decades of “experience” data on which to base their calculations, and a well-established certification process to ensure competence.

The application of this skill set would pay great dividends to an IT department interested in quantifying the risk associated with the continuance of “unapproved” behavior by business users, such as shadow data analysis systems. For example, statistical analysis of the hours business staff spend gathering data for these systems, and of the reconciliation of numbers produced by similar systems, could predict the financial effect of continuing the practice. From this data, decision-makers can weigh the costs of lost productivity against the benefits of the more local control and flexibility that these systems provide, and make an intelligent decision on the continuance of the practice, as opposed to today’s anecdotal decision-making.
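The kind of calculation described above can be sketched very simply. The figures below – staff counts, hours, and rates – are all invented for illustration; a real analysis would substitute measured utilization data:

```python
# Hypothetical estimate of the annual cost of spreadmart upkeep.
# All inputs are invented for illustration, not measured data.

HOURLY_RATE = 65.0       # assumed fully loaded cost per analyst hour
ANALYSTS = 40            # staff members maintaining spreadmarts
HOURS_GATHERING = 4.0    # hours/week per analyst gathering data
HOURS_RECONCILING = 2.5  # hours/week per analyst reconciling numbers
WEEKS_PER_YEAR = 48

weekly_hours = ANALYSTS * (HOURS_GATHERING + HOURS_RECONCILING)
annual_cost = weekly_hours * WEEKS_PER_YEAR * HOURLY_RATE

print(f"Weekly spreadmart upkeep: {weekly_hours:.0f} hours")
print(f"Estimated annual cost:    ${annual_cost:,.0f}")
```

Even this back-of-the-envelope form gives decision-makers a dollar figure to weigh against the flexibility the spreadmarts provide, which is precisely what anecdotes cannot do.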

The Requirements
Why haven’t more enterprises embraced this sort of detailed analysis in their IT departments? Why have initiatives such as Enterprise Risk Management failed to gain much traction outside of regulatory compliance? The answer lies in the data itself. Actuaries have decades of experiential information at their disposal to do their jobs, drawn mainly from external sources such as United States Census data, hospital records, accident records, and the like. This data is collected and made available as part of the normal operation of organizations external to the enterprise’s actuarial team, so the business incurs little if any cost in obtaining it.

Business users and IT departments do not generally track staff utilization to the point where it would be useful for statistical analysis, and they would need to establish policies and infrastructure to collect the data within the organization. The cost of obtaining this infrastructure and training staff in its use would need to be added to any project plan implementing a risk management program. The fact that the cost would be borne entirely by the enterprise is a strong disincentive to undertake such an effort.

There is a more significant hurdle to cross than cost, however – the definition of the costs themselves. How difficult would it be to collect this data? Here are some examples:

  • When a business user creates a “spreadmart*,” how much time is devoted to obtaining
    and reconciling data, and how much time is devoted to the analysis itself?

  • What is the cost of retrieving quality data in a spreadmart and in the enterprise data
    warehouse?

  • What is the cost in time and materials for a local hard drive crash where spreadmart data
    is lost?

  • What is the reconciliation cost of data calculated by formulas that deviate from the
    corporate standard in a spreadmart?

  • What is the potential cost to the enterprise if a spreadmart metric is incorrect?

Very few, if any, enterprises collect data on staff utilization and costs at this level of detail, mainly because there are few automated methods for data collection, and the testimony of staff members in status reports or time sheets is fairly unreliable for statistical purposes. Yet, it is precisely this level of detail that is required to perform the analysis necessary to quantify risk.
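The questions above feed a calculation that is the actuarial bread and butter: expected annual loss as frequency times severity, summed over events. A minimal sketch, with entirely hypothetical frequencies and costs, looks like this:

```python
# Minimal expected-loss sketch: expected annual loss = frequency x severity.
# The event list and every number in it are hypothetical examples of the
# spreadmart costs discussed in the bullet list above.

spreadmart_risks = [
    # (event, expected events per year, average cost per event in dollars)
    ("local drive crash loses spreadmart data", 3.0, 12_000),
    ("non-standard formula requires reconciliation", 25.0, 1_500),
    ("incorrect spreadmart metric reaches a decision", 0.5, 200_000),
]

for event, freq, severity in spreadmart_risks:
    print(f"{event}: ${freq * severity:,.0f}/year expected loss")

total = sum(freq * sev for _, freq, sev in spreadmart_risks)
print(f"Total expected annual loss: ${total:,.0f}")
```

The arithmetic is trivial; the hard part, as the text argues, is collecting frequencies and severities that are trustworthy enough to put in the table.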

Another requirement for actuarial analysis is scholarship regarding risk management standards in the IT profession. Here, insurance actuaries have a huge advantage due to the relative ages of the professions: information technology has been practiced for just over 50 years, whereas insurance has been practiced for hundreds of years, and entire university curricula are devoted to the subject.

However, this should not be a great impediment to applying actuarial philosophy to information technology, provided that IT is viewed as simply another business process. The statistical measures of risk should be comparable, although the specific situations may differ, and the analytical techniques should be similar enough to encourage adoption.

—–

In the final installment of this article, we will bring the issues and definitions together to propose a potential solution to risk management issues in IT organizations.

* The term “Spreadmart” was coined by Wayne Eckerson, Director of Research for TDWI, to
describe the primary implementation of a shadow data analysis system as a spreadsheet with
data obtained from enterprise systems that functions as a data mart.

The Case for the “IT Actuary”, Part 1 – 18. September, 2008

In Part 1 of this paper, we’ll take a look at the issue of Risk Management in the Information Technology field…

The Case for the “IT Actuary”

The success rate of business intelligence system implementation has been poor over the last twenty years. Millions of dollars are spent routinely on projects that ultimately fail to fulfill user expectations. Business users develop their own systems to operate their business analysis processes because the Information Technology group is slow to deploy solutions or the tools provided are not sufficient for managing business operations.

The common thread running through these and other major IT management issues is the inability of the staff to properly identify and quantify risk to the enterprise. This deficiency is generally due to the lack of a risk management skill set within the IT management hierarchy.  This article will use the example of shadow decision support systems to describe how risk management personnel – specifically, the concept of an actuary – can assist IT management in quantifying the risk inherent in major IT decisions and help senior business management enact and enforce more efficient management policies regarding major IT initiatives. This article will also discuss requirements for setting up such a practice, and the hurdles standing in the way of successful implementation.

The Issue

The proliferation of “shadow systems” and the IT response to managing these systems is addressed in the March 2008 Best Practices Report published by The Data Warehousing Institute (TDWI), entitled “Strategies for Managing Spreadmarts.” This report is the latest in a series of articles, published by TDWI and others, describing the characteristics of non-IT business support systems and the issues they cause. Shadow systems lead to inconsistent reports from different organizations within the enterprise, compromise data quality, and consume valuable personnel resources that could be used in other activities.

All of these articles and reports do a good job of describing the issues and risks that unmanaged data stores pose to the enterprise, but they do so in a general manner that is not persuasive to the senior management personnel ultimately responsible for implementing controls over these shadow systems. The irony of the situation is that one of the main functions of a data warehousing organization is to provide accurate, functional measurements of business activities, yet the data warehousing group often falls short of providing detailed supporting information for its claims of risk to the enterprise in many areas, including the proliferation of shadow data systems.

The lack of detailed risk analysis from IT groups does not reflect a disregard for the seriousness of the issues involved – the risks are well known in an anecdotal sense, but the skill set for quantifying them is not present in most IT departments. Fortunately, a discipline exists for quantifying and detailing risk in business operations, but it is not generally applied to the field of information technology.

Next Time

Next time, we’ll look at the definition of “actuary,” and discuss some of the issues we come across that can be solved by actuarial concepts.