
Anything Missing When Measuring Corporate IT Performance?

Let me provide some reassurance about corporate IT: all the accountabilities that are linked to quantitatively gauged measures of performance are rigorously managed and never neglected.

The two broad categories of clearly defined and clearly measured performance objectives are KTLO and OTOB, acronyms for Keep The Lights On and On-Time On-Budget, respectively.

The first category relates to IT operations. Corporate IT’s first and foremost responsibility is to make sure that what has been purchased, leased, built, installed, and proven to work the first time actually continues to work, continuously and for as long as your business runs. IT operations are less glamorous from an innovation point of view. IT Ops – as it is often called – doesn’t invent new customer experiences, nor does it re-architect your organization through radical business design.

But Ops is by far the most critical information technology function, because its failure directly threatens the survival of your business in the very short term. If your organization cannot deliver its services to your customers and partners, it literally ceases to exist. As such, IT operations should be taken very seriously; everything IT does or manages is monitored and measured quantitatively, down to fractions of a percentage point. Expectations on the quality, stability, and performance of operations are quantitatively defined up front. Failure happens, but if the frequency or duration of missteps exceeds the agreed-upon performance levels, some people will get seriously nervous about their jobs.

“With the quantitatively measured performance objectives of IT Operations, if failure happens too often, people get nervous about their careers.”
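To make those fractions of a percentage point concrete, here is a minimal sketch of how an operational availability target could be checked against an agreed-upon level. The 99.95% target, the 30-day window, and the downtime figure are assumptions made for illustration, not numbers from this text.

```python
# Purely illustrative sketch: the 99.95% target and the downtime figure are
# assumptions made for this example, not numbers from the text.

MINUTES_PER_MONTH = 30 * 24 * 60  # simple 30-day measurement window

def availability(downtime_minutes: float, total_minutes: int = MINUTES_PER_MONTH) -> float:
    """Availability achieved over the window, as a percentage."""
    return 100.0 * (total_minutes - downtime_minutes) / total_minutes

SLA_TARGET = 99.95          # agreed-upon performance level, defined up front
monthly_downtime = 35.0     # minutes of unplanned outage this month (example data)

achieved = availability(monthly_downtime)
print(f"Achieved availability: {achieved:.3f}% (target {SLA_TARGET}%)")
if achieved < SLA_TARGET:
    print("Below the agreed-upon level: expect uncomfortable conversations.")
```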

The second category, OTOB, relates to the execution of business change endeavors. Over the past few decades, there have been many scholarly and trade discussions about the measurement of project performance, and how adequate – or not – the traditional triple evaluation scheme of cost-schedule-scope actually is. The model may have its limitations for those who are intimately involved in executing the endeavors that result in business change, but for those who commission the change, assume the risks, and reap the benefits – that is, you, the paying customer – this performance measurement triad makes a lot of sense. The cost is how much money you need to spend to get what you want or need. The schedule is the time required to get it. And the scope is the extent of what you get for your money.
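To show how the cost and schedule legs of the triad turn into an on-time-on-budget verdict, here is a minimal sketch; the planned and actual figures are invented for illustration.

```python
# Purely illustrative sketch: the planned and actual figures are invented.
from datetime import date

planned_budget = 1_200_000          # approved cost of the change project
actual_cost = 1_350_000             # what was actually spent

planned_finish = date(2024, 6, 30)  # committed delivery date
actual_finish = date(2024, 8, 15)   # actual delivery date

on_budget = actual_cost <= planned_budget
on_time = actual_finish <= planned_finish

cost_variance_pct = 100.0 * (actual_cost - planned_budget) / planned_budget
schedule_slip_days = (actual_finish - planned_finish).days

print(f"On budget: {on_budget} (cost variance {cost_variance_pct:+.1f}%)")
print(f"On time:   {on_time} (schedule slip: {schedule_slip_days} days)")
# Scope, the third leg of the triad, has no universal unit of measure,
# which is exactly the gap discussed below.
```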

Scope can be subject to much discussion, since the knowledge of what you want and what you really need may differ quite a bit between the pre-project and end-of-project phases. To further complicate matters, there are as yet no universal units of measure for the scope of IT change projects. This imprecision contrasts with the universally understood measurement of cost and schedule.

That’s why many business people fall back on on-time-on-budget alone as a comprehensive tool for assessing how well IT delivers change, assuming that what is delivered (the scope) is roughly what it ought to be for some business value stream to transition to its new state.

“The scope of what is delivered by digital change projects is hard to measure and compare. That’s why most business people fall back on what they can grasp: on time and on budget.”

Managing change is not as acute a necessity as running IT operations. Failing to be on time or on budget doesn’t have the same impact on personal and team performance evaluations, but performance is gauged nonetheless and delivery dates are actively managed.

So What’s Missing?

The major issue is that there are very few other quantitatively measured signs of excellence. The rest of IT is either subject to non-standard and qualitative evaluations or simply not measured at all. Non-quantified evaluations are debatable and easy to challenge on contextual differences. Non-standardized gauges are hard to compare.

In the end, IT measures only a portion of what it does, focusing on improving what literally counts: the areas with unchallengeable numbers and universally understood units of measure. The rest is left to good intentions, or to whatever is believed to positively impact OTOB or KTLO.

Notice that both KTLO and OTOB are measures of either immediate (KTLO) or short-term (OTOB) performance. ‘Keeping the lights on’ means continuous operations, or short transactional tasks. Change projects are by definition temporary endeavors with a beginning and an end. What happens after the project is finished is irrelevant to these measures. Even the major transformation programs are split into manageable chunks that often fit within a calendar year.

The IT management repercussion of this short-termism is that the lasting impact of IT’s work on your organization is veiled by short-range priorities.

The IT aspects that take the hit from short-term measures are quality and assets. More precisely, it is the quality of the work done that suffers, which in turn degrades the quality of the assets you get as a result.

Your organization’s capacity to adapt or respond quickly to changes in its environment depends heavily on the quality of those assets. Asset readiness for change will suffer from lower-quality work done in previous projects.

Get the bigger picture in this book about the things executives need to know about IT – it will help you understand how most IT teams are evaluated today. These typical metrics have a direct impact on what gets improved, but also on what isn’t being taken care of. Enjoy!