28 March 2006

Fix the system, not the results

By Andrew Clifford

I have an admission to make. I tried to fix a system assessment so that I would look good. But in the end I had to accept and act on the recommendations.

Our main product, Metrici Advisor, provides a framework for surveying and measuring IT systems. It uses an expert system to analyse the results and recommend improvements.

Obviously we want to show off our products, and to show what open and honest people we are. So we thought we would use our system to analyse itself, and publish the results on our web site. Giving an honest view of ourselves would show visitors that we can give them an honest view of their systems, too.

But we found our own system a bit too good at finding faults. And the main issue it identified with itself was that it is provided by a new, small company.

I'm not trying to hide that we are a new, small company. I would argue that innovative solutions often come from startups. But I couldn't stomach highlighting this as the main issue with our system.

So I tried to fix the results. I wasn't going to lie. But I looked at why this issue had come up and realised that I could easily remove it.

I realised that I had assessed the system partly from our point of view (we use the system to provide a service), and partly from the point of view of our customers (who consume the service). To our customers, the system is provided by a new, small company. But from our point of view, our system is a custom system, which does not trigger the same issues.

If I assessed the system consistently from our point of view, the issue would disappear. Although my real motive was to make our results better, I also felt I was doing the right thing. The assessment would more clearly show our own management priorities for the system, rather than a muddle of our priorities and those of our customers.

So I repeated the assessment, fully expecting a positive result with no serious issues.

How wrong I was.

As I assessed the system more consistently from our point of view, a new picture emerged. Our customers only use the system occasionally, and not as part of their core business processes. But we use the system constantly to support our core business process of delivering service to our customers.

The system was now assessed as business critical (because, to us, it is), and this triggered all sorts of issues and recommendations. Top of the list was that our system recovery plans had never been formalised and tested. I certainly did not want to advertise that.

This time, I had no excuse because I knew I had assessed the system properly. I had to accept the findings and act on the recommendations. (And we do now have a documented and tested system recovery procedure.)

Our own painful experience shows how an automated system can help identify improvements. You need to be clear from what point of view you are assessing the system. But more importantly, you have to be prepared to accept and act on the results of the analysis. You have to fix the system, not the results.

Next: Is system governance agile?

