
Usable Metrics Are More than Just Numbers

Roy Eldar


We live in an age of measurements and analytics, where organizations like to calculate how well they are doing, and how well (or badly) their employees are performing. This applies to IT service management (ITSM) where metrics, business dashboards, key performance indicators, and similar terms are everyday ITSM speak.

So…we have lots of numbers that represent what we do, and we hear that "the numbers do not lie," but too many organizations don't use these numbers as they should; they don't derive the right information, or follow up with the right actions, based on the outcomes.

One of the reasons service desk performance metrics are so under-used is the apparent belief that the numbers mean one specific thing when, actually, they often mean very little without context.

Trends and Contexts Matter More than Values

If you tell me you weigh 70 kilos, it tells me what you weigh. If you also tell me that you were 71 kilos last week and 70 the week before that, then I know more about you, and can surmise something about your lifestyle. If I then discover that you have just taken up weightlifting and are building up your size and strength for a competition, then I'd no doubt be more impressed than I would have been without that knowledge.

The same applies to ITSM. Many organizations measure their user or customer satisfaction on a scale from 1 to 5. Telling me you scored 3.9 might sound good, but less so if you scored 4.1 last month and 4.5 at this time last year. And even if satisfaction has dropped over the last 12 months, what is the context? Has your workload doubled; did your company launch several new products?

Understanding the context can help with proper prioritization and service improvement. Just taking numbers at face value without context means you are addressing symptoms only, instead of the underlying cause. Take that falling satisfaction metric, for example – retraining service desk staff or even employing more of them won’t help if the underlying cause is new services that don’t work properly!

Dig a Little Deeper

Let's take the example of the user satisfaction score one step further. If your score was 3.0, and it's been around that figure steadily for the last year or so, and there have been no obvious external factors and no major changes going on – what could that mean? That there's room for improvement and you should get started on new training for the whole service desk team, perhaps?

But wait a moment. What kind of 3.0 is that? With a scale of 1 to 5, there are multiple ways of getting an average of 3, and finding out why you scored 3 might make an important difference to how you set about improving things. Here are some scenarios that will produce a score of 3 in your user satisfaction metric:

  • Just about everyone rates you as a 3. Could be you are average, or could be that folks are too busy to take the survey seriously and just go for the middle box.
  • You get an equal distribution of scores at 1, 2, 3, 4, and 5. The entire range of different levels of satisfaction averages to 3.
  • Half of your users rate you as 1, the other half as 5. Some real extremes here.
  • Half rate you as 2, the other half as 4. A mix of good and bad, but less extreme.

Of course, real life is rarely as neat as any of these, but the point is that depending on how that score of 3 is obtained, you would set about improving things in different ways. In fact, depending on which option you were nearest, you might feel the need to go look for a range of other information.
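The point that one average can hide very different distributions is easy to demonstrate. Here is a minimal Python sketch of the four scenarios above (the score lists are illustrative, not real survey data); the standard deviation is one simple way to tell them apart:

```python
from statistics import mean, stdev

# Four hypothetical rating distributions that all average 3.0
# (illustrative data only, not real survey results).
scenarios = {
    "everyone picks 3":       [3] * 10,
    "even spread 1-5":        [1, 2, 3, 4, 5] * 2,
    "love/hate (1s and 5s)":  [1] * 5 + [5] * 5,
    "mild split (2s and 4s)": [2] * 5 + [4] * 5,
}

for name, scores in scenarios.items():
    # Same mean every time, but a very different spread.
    print(f"{name:>24}: mean={mean(scores):.1f}, stdev={stdev(scores):.2f}")
```

All four print a mean of 3.0, while the standard deviation runs from 0 (everyone picked 3) up to over 2 (the love/hate split) – which is exactly why reporting the mean alone is not enough.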

Find the Pattern

The extreme example above is sometimes referred to as the 'Marmite factor' (Marmite is a yeast extract spread that people seem to either love or hate, with very few being indifferent to its taste). Expressing this set of results only by its mean average of 3 is very misleading. If you get this result in a service desk situation, it raises two important questions:

  • Why are the happy people happy? What are we doing right?
  • Why are the others unhappy? What are we doing wrong to them?

Of course, in order to answer that, you need to find out which users are which. Is there a recognizable pattern? Do the people on one site love the service, and those on another hate it? Do your younger users like it and the older ones not? Is it language based? The list goes on and there is no point guessing; you need to do some analyzing to find out. But what you shouldn’t do is react immediately to that simple average without further investigation. You don’t…do you?
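That analysis can start very simply: group the survey responses by whatever attributes you hold (site, team, language, and so on) and compare the group averages. A minimal sketch, with hypothetical site names and made-up scores purely for illustration:

```python
from collections import defaultdict
from statistics import mean

# Hypothetical survey rows: (site, score). Illustrative data only –
# in practice these would come from your survey tool's export.
responses = [
    ("London", 5), ("London", 4), ("London", 5),
    ("Berlin", 1), ("Berlin", 2), ("Berlin", 1),
]

# Group scores by site.
by_site = defaultdict(list)
for site, score in responses:
    by_site[site].append(score)

# The overall mean looks unremarkable...
print(f"overall: mean={mean(score for _, score in responses):.1f}")

# ...but the per-site breakdown reveals the love/hate pattern.
for site, scores in sorted(by_site.items()):
    print(f"{site}: mean={mean(scores):.1f} (n={len(scores)})")
```

In this toy data the overall mean is a bland 3.0, while one site averages near 5 and the other near 1 – the kind of pattern a single headline number would never reveal. The same grouping can be repeated for any other attribute until a pattern emerges.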

If you're interested in learning more about the right service desk metrics, I highly recommend you watch this webinar presented by Rob England, the IT Skeptic.



About the Author

Roy Eldar

Former SysAid VP Support, with over 15 years’ experience in IT and large-scale production operations, Roy was responsible for the cloud infrastructure that powers SysAid and for our signature customer-centric support. Outside of work, Roy is an avid photographer, who also loves road cycling.
