Archive for September, 2008

 

Architect as Jiujitsu Master – 30. September, 2008

Mike Walker shares his observations on Day 2 of the Gartner Enterprise Architecture Summit:

Gartner EA Summit – Day 2

I’m particularly interested in the presentation by Brian Burke, and I’m sorry I missed it. It seems that the consensus in the field is settling on the idea that the most successful enterprise architects are more like masters of Jiujitsu and less like aggressive, type-A personalities. This is really encouraging for someone like me, who embraces Jiujitsu’s philosophy of picking one’s battles and exerting passive influence, if not its physical aspects.

Perhaps there’s a future for me here after all…

Posted in Architecture

The Case for the IT Actuary, Part 2 – 29. September, 2008

In Part 1 of this article, we explored the issues surrounding risk management in IT organizations. Part 2 of the article describes the role of the Actuary in business and how this concept could apply to issues facing IT.

—–

The Actuary
The dictionary defines an actuary as “someone versed in the collection and interpretation of numerical data (especially someone who uses statistics to calculate insurance premiums).” The actuarial function is vital to the insurance industry, since the accurate prediction of risk allows the accurate calculation of premium rates and loss exposure, which in turn demonstrates the soundness of the business. Actuaries have decades of “experience” data on which to base their calculations, and a well-established certification process to ensure competence.
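
To make the mechanics concrete, here is a minimal sketch of how claim frequency and severity estimates roll up into a premium, using invented experience figures (real actuarial work adds credibility weighting, trending, and far larger data sets; this is only the basic arithmetic):

```python
# Minimal sketch of the actuarial calculation described above.
# All experience figures are invented for illustration.

years_of_experience = {
    # year: (policies in force, claims filed, total losses paid)
    2005: (10_000, 450, 1_350_000.00),
    2006: (10_500, 470, 1_480_500.00),
    2007: (11_000, 500, 1_600_000.00),
}

total_policies = sum(p for p, _, _ in years_of_experience.values())
total_claims = sum(c for _, c, _ in years_of_experience.values())
total_losses = sum(l for _, _, l in years_of_experience.values())

frequency = total_claims / total_policies   # claims per policy
severity = total_losses / total_claims      # average cost per claim
pure_premium = frequency * severity         # expected loss per policy

expense_loading = 0.30                      # expenses plus profit margin
gross_premium = pure_premium / (1 - expense_loading)

print(f"Expected loss per policy: ${pure_premium:,.2f}")
print(f"Indicated premium:        ${gross_premium:,.2f}")
```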

The application of this skill set would pay great dividends to an IT department interested in quantifying the risk associated with the continuance of “unapproved” behavior by business users, such as shadow data analysis systems. For example, statistical analysis of the hours business staff spend gathering data for these systems, and reconciling the numbers produced by similar systems, could predict the financial effect of continuing the practice. From this data, decision-makers can weigh the costs of lost productivity against the benefits of the local control and flexibility these systems provide, and make an informed decision on whether the practice should continue, rather than deciding anecdotally as most do today.
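
As a rough illustration of the kind of calculation being suggested, the sketch below estimates the annual productivity cost of maintaining spreadmarts and nets it against an assumed value for the flexibility they provide. Every figure is a placeholder, not measured data:

```python
# Hypothetical back-of-the-envelope model of spreadmart productivity cost.
# Every number below is an assumption for illustration only.

analysts = 12                 # business staff maintaining spreadmarts
hours_per_week_each = 6.0     # hours spent gathering and reconciling data
loaded_hourly_rate = 85.00    # salary + benefits + overhead, per hour
working_weeks = 48

annual_maintenance_cost = (
    analysts * hours_per_week_each * loaded_hourly_rate * working_weeks
)

# Assumed annual value of the local control and flexibility the
# spreadmarts provide (e.g. faster turnaround than the IT request queue).
annual_flexibility_benefit = 180_000.00

net_cost = annual_maintenance_cost - annual_flexibility_benefit
print(f"Annual maintenance cost:  ${annual_maintenance_cost:,.2f}")
print(f"Net cost of the practice: ${net_cost:,.2f}")
```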

The Requirements
Why haven’t more enterprises embraced this sort of detailed analysis in their IT departments? Why have initiatives such as Enterprise Risk Management failed to gain much traction outside of regulatory compliance? The answer lies in the data itself. Actuaries have decades of experiential information at their disposal, drawn mainly from public sources such as the United States Census, hospital records, accident records, and the like. This data is collected and made available as part of the normal operation of organizations external to the enterprise’s actuarial team, so the business incurs little, if any, cost in obtaining it.

Business users and IT departments do not generally track staff utilization to the point where it would be useful for statistical analysis, and they would need to establish policies and infrastructure to collect the data within the organization. The cost of obtaining this infrastructure and training staff in its use would need to be added to any project plan implementing a risk management program. The fact that the cost would be borne entirely by the enterprise is a strong disincentive to undertake such an effort.

There is a more significant hurdle to cross than cost, however – the definition of the costs themselves. How difficult would it be to collect this data? Here are some examples:

  • When a business user creates a “spreadmart*,” how much time is devoted to obtaining
    and reconciling data, and how much time is devoted to the analysis itself?

  • What is the cost of retrieving quality data in a spreadmart and in the enterprise data
    warehouse?

  • What is the cost in time and materials for a local hard drive crash where spreadmart data
    is lost?

  • What is the reconciliation cost of data calculated by formulas that deviate from the
    corporate standard in a spreadmart?

  • What is the potential cost to the enterprise if a spreadmart metric is incorrect?

Very few, if any, enterprises collect data on staff utilization and costs at this level of detail, mainly because there are few automated methods for data collection, and the testimony of staff members in status reports or time sheets is fairly unreliable for statistical purposes. Yet, it is precisely this level of detail that is required to perform the analysis necessary to quantify risk.
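
To show what the analysis could look like if data at this level of detail were available, here is a hypothetical sketch that folds the recurring costs and the probability-weighted risk events from the questions above into a single expected annual loss figure for one spreadmart. All probabilities and dollar amounts are invented placeholders; the structure is the point: certain costs are summed directly, while uncertain events contribute probability times impact.

```python
# Hypothetical expected-loss model for a single spreadmart.
# All figures are invented placeholders.

recurring_costs = {
    "data gathering and reconciliation": 30_000.00,  # per year
    "reconciling non-standard formulas": 8_000.00,
}

risk_events = {
    # event: (annual probability, cost if it occurs)
    "local drive crash, spreadmart data lost": (0.10, 15_000.00),
    "incorrect metric reaches decision-makers": (0.05, 250_000.00),
}

expected_annual_loss = sum(recurring_costs.values()) + sum(
    probability * impact for probability, impact in risk_events.values()
)

print(f"Expected annual loss for this spreadmart: ${expected_annual_loss:,.2f}")
```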

Another requirement for actuarial analysis is a body of scholarship regarding risk management standards in the IT profession. Here, insurance actuaries have a huge advantage due to the relative age of the professions: information technology has been practiced for just over 50 years, whereas insurance has been practiced for hundreds of years and has entire university curricula devoted to the subject.

However, this should not be a great impediment to applying actuarial philosophy to information technology, provided that IT is viewed as simply another business process. The statistical measures of risk should carry over, even though the specific situations may differ, and the analytical techniques should be similar enough to encourage adoption.

—–

In the final installment of this article, we will bring the issues and definitions together to propose a potential solution to risk management issues in IT organizations.

* The term “Spreadmart” was coined by Wayne Eckerson, Director of Research for TDWI, to describe the most common form of shadow data analysis system: a spreadsheet, populated with data obtained from enterprise systems, that functions as a data mart.

And So It Begins – 29. September, 2008

Time out for sports…

Terrell Owens: I didn’t get the ball enough

From an ESPN blog:

After the game, a Cowboys starter on offense said he thought the team tried too hard to involve T.O. in the second half. It’s not good when a player senses that coaches are calling plays in order to keep a teammate happy. It’s not time to panic if you’re a Cowboys fan, but I’d certainly keep your eye on that situation. It’s a slap in the face to Witten, Patrick Crayton, Miles Austin — and especially rookie Felix Jones to freeze them out in order to please T.O.

As anyone who lives in Northern California can tell you, just like anyone in Philadelphia, as long as you’re winning, everything is roses with Owens. But once you start losing, it’s everyone else’s fault.

It’s really getting old, Terrell…

Posted in Sports

RIP Paul Newman – 27. September, 2008

Reg Dunlop

Posted in Uncategorized

Kabuki Theater in Washington – 26. September, 2008

I’ve really tried to resist commenting on the ridiculous spectacle of the “financial bailout” going on in Washington this week, but I can’t resist any longer. Watching the staid capitalists and politicians haggle over how many hundreds of billions we need to give to already-rich people brings this scene to mind, and I can’t get it out…

All I can say is, “sheesh”…

Posted in Uncategorized

The Case for the “IT Actuary”, Part 1 – 18. September, 2008

In Part 1 of this paper, we’ll take a look at the issue of Risk Management in the Information Technology field…

The Case for the “IT Actuary”

The success rate of business intelligence system implementations has been poor over the last twenty years. Millions of dollars are routinely spent on projects that ultimately fail to fulfill user expectations. Business users develop their own systems to run their business analysis processes because the Information Technology group is slow to deploy solutions, or because the tools provided are not sufficient for managing business operations.

The common thread running through these and other major IT management issues is the inability of the staff to properly identify and quantify risk to the enterprise. This deficiency is generally due to the lack of a risk management skill set within the IT management hierarchy.  This article will use the example of shadow decision support systems to describe how risk management personnel – specifically, the concept of an actuary – can assist IT management in quantifying the risk inherent in major IT decisions and help senior business management enact and enforce more efficient management policies regarding major IT initiatives. This article will also discuss requirements for setting up such a practice, and the hurdles standing in the way of successful implementation.

The Issue

The proliferation of “shadow systems” and the IT response for managing them are addressed in the March 2008 Best Practices Report published by The Data Warehousing Institute (TDWI), entitled “Strategies for Managing Spreadmarts.” This report is the latest in a series of articles, published by TDWI and others, describing the characteristics of non-IT business support systems and the issues they cause. Shadow systems produce inconsistent reports from different organizations within the enterprise, compromise data quality, and consume valuable personnel resources that could be used in other activities.

All of these articles and reports do a good job of describing the issues and risks that unmanaged data stores pose to the enterprise, but they do so in a general manner that is not useful or persuasive to the senior management personnel who are ultimately responsible for implementing controls over these shadow systems. The irony of the situation is that one of the main functions of a data warehousing organization is to provide accurate, functional measurements of business activities, yet the data warehousing group often falls short of providing detailed supporting information for its claims of risk to the enterprise in many areas, including the proliferation of shadow data systems.

The lack of detailed risk analysis from IT groups is not a reflection of a disregard for the seriousness of the issues involved – the risks are well known in an anecdotal sense, but the skill set for quantifying them is not present in most IT departments. Fortunately, a discipline exists for quantifying and detailing risk in business operations; it is simply not generally applied to the field of information technology.

Next Time

Next time, we’ll look at the definition of “actuary,” and discuss some of the issues we come across that can be solved by actuarial concepts.

CDI and MDM Are Broken, Part II – 5. September, 2008

My latest article on CDI and MDM technology has been published by TDAN.com:

CDI and MDM Are Broken, Part 2

The synopsis of the article:

Part 2 of this article explains the application of semantic technologies to master data management applications, including customer data integration. It examines the two extremes of MDM implementation and proposes a middle ground.

How do you feel? Is your organization ready for MDM and/or semantics? Please share your experiences…

Lemon Curry Solutions Relaunch – 2. September, 2008

After many false starts, the new Lemon Curry Solutions website is operational. Please check it out when you get a free moment!

Posted in Uncategorized