1 July 2008
Effective technical evaluation
By Andrew Clifford
Most technical evaluations are unnecessarily difficult and do not deliver good results. Making evaluation easier and more effective is mostly a matter of common sense.

We do a lot of technical evaluation in IT. We evaluate software before we buy it. We evaluate as part of quality assurance, to understand compliance, and as part of general fact-finding. I have had many frustrating experiences with evaluation.
- When PCs were new, I was involved in evaluating PC software. We used simple check lists and awarded one point for meeting each requirement. Unfortunately, three-quarters of our requirements were user interface standards, so we would buy anything as long as it looked nice.
- In one selection, one person's outspoken (but unexplained) dislike of a particular web server was allowed to override all other issues, and we were driven to adopt a solution with many worse problems, including the fact that the software had not yet been developed.
- Nearly every evaluation asks stupid questions, such as asking a software vendor "Is the system included in the disaster recovery plan?" That's a reasonable question for the purchasing company to ask itself, but not one for the vendor.
- Getting sales brochures instead of answers. If I have taken the trouble to word my questions carefully, I expect carefully worded answers, not brochures.
- Letting technical evaluation lead business evaluation. On the few occasions I have been involved in well-run technical evaluations, the clarity of the technical conclusions has overridden business decisions.
These problems can disappear with a little common sense.
- Start by thinking about how you will make a decision. Agree what is important, what is good, and what is bad before you write the questions.
- Spend time writing good questions. Phrase every question in the imperative. Do not ask "Is the system secure?", ask "Describe the security mechanisms within the system and explain how they ensure that the system is secure."
- Check the questions. Think through what a good answer would look like, an acceptable answer, an unacceptable answer, and any special cases. Check that every question will make sense to the vendor.
- If you have a scoring scheme, break it down into groups before you weight individual questions. "One point per question" means that you too might just be picking pretty software.
- Think how you are going to analyse the answers. I use a mixture of a weighted scoring scheme, and looking for specific issues. This lets me say where solutions are generally weak or strong, describe the issues that would need special management attention, and identify "show stoppers" that would rule out an option. A sketch of this approach appears after this list.
- Make it clear to vendors that they will be ignored if they do not answer the questions.
- Separate technical evaluation from business evaluation. Technical evaluation is about understanding the issues and costs of a solution. Business evaluation is about opportunity and value, which is totally different. Focus business evaluation on opportunity, effective support of business processes, and commercial issues. Be cautious of scoring software against long lists of features, as it diverts attention from overall business fit.
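To make the scoring approach concrete, here is a minimal sketch in Python. The groups, weights, scores and vendor names are all illustrative assumptions, not taken from any real evaluation; the point is simply that averaging within weighted groups stops one large group of questions dominating (the "pretty software" trap above), and that show stoppers are tracked separately from the score.

```python
# Illustrative group weights -- these are assumptions, not a recommendation.
GROUP_WEIGHTS = {
    "security": 0.4,
    "operability": 0.35,
    "user interface": 0.25,
}

# Each answer is scored 0 (unacceptable) to 3 (good); show_stopper marks
# an answer that rules the option out regardless of its score.
answers = {
    "Vendor A": [
        {"group": "security", "score": 3, "show_stopper": False},
        {"group": "operability", "score": 1, "show_stopper": False},
        {"group": "user interface", "score": 3, "show_stopper": False},
    ],
    "Vendor B": [
        {"group": "security", "score": 0, "show_stopper": True},
        {"group": "operability", "score": 3, "show_stopper": False},
        {"group": "user interface", "score": 2, "show_stopper": False},
    ],
}

def evaluate(answer_sheet):
    """Return (weighted score out of 3, list of show-stopper answers)."""
    show_stoppers = [a for a in answer_sheet if a["show_stopper"]]
    total = 0.0
    for group, weight in GROUP_WEIGHTS.items():
        scores = [a["score"] for a in answer_sheet if a["group"] == group]
        if scores:
            # Average within the group first, so a group with many
            # questions cannot swamp the others by sheer quantity.
            total += weight * (sum(scores) / len(scores))
    return total, show_stoppers

for vendor, sheet in answers.items():
    score, stoppers = evaluate(sheet)
    verdict = "ruled out by show stopper" if stoppers else f"score {score:.2f} / 3"
    print(f"{vendor}: {verdict}")
```

Run as written, this scores Vendor A at 2.30 out of 3 and rules Vendor B out on its security show stopper, however well it scores elsewhere.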
All of these are common sense. The small effort involved in structuring an evaluation properly is repaid many times over in the ease and effectiveness of the decision-making process.
Next: The new umbrella