I’ve written many blog posts for Plan-Net, a central London managed services provider. The tone of voice is professional and designed to convey peerless expertise while keeping the content accessible and engaging. This post is about measuring user satisfaction with the Service Desk offer. *If no image of the finished project is available, my .pdf copy visual or .docx copy sheet is shown in its place.
Measuring how happy users are with your service desk.
Happiness. It’s such an abstract quality.
Ask just about anyone whether they are happy about this, about that or about the other, and their response is likely to be shaded. They are not unhappy. They are quite happy. They are mainly happy. Usually happy. Relatively happy.
You won’t generally have to ask too many people before someone inclined to more philosophical musings answers, “That depends on what you mean by happy.”
When it comes to how well a service desk is doing, however, it can be quite another matter. Just about every end user has a view on how happy they are with the service provided for them, and it usually falls to the service desk leader to monitor and report such satisfaction data so that its lessons can be mined to improve performance and efficiency.
Choosing your moment to gauge Service Desk happiness.
As a company that supplies, augments, manages, advises on and audits the performance of service desks, we at Plan-Net believe we are in as good a position as anyone to offer some guidance on how and when to ask end users how happy they are with how the desk is doing.
And while it might be tempting to ask only friendly users on a day when great service has just been delivered with uncommon speed, there are conventionally four situations in which it’s prudent to survey users.
The first and most obvious of these is post-transaction. Deal with the incident. Close the incident. Ask the user how well the desk performed.
Next comes the periodic survey. From time to time, typically every year or so, all users are surveyed on how they feel the service desk performs in general.
Then there’s the post-incident-aversion survey. This applies when the self-rectifying tools within your ITSM solution have headed off what could have been a major incident, but some users may still have experienced a drop in performance or availability.
Lastly comes a one-off survey, usually when you’ve just launched a new service or completed a rollout of some kind.
What questions to ask about Service Desk satisfaction.
As most of us now appreciate, how a question is expressed can be as important as asking the question at all.
When you’re surveying service desk performance, there are a number of questions we’ve found it beneficial to pose, in both post-incident and periodic surveys.
Post-incident, our managers find it worthwhile to ask about the user’s general experience of dealing with the desk while the issue was being resolved. Was this a comfortable or a disquieting experience for the user? It’s also useful to ask the obvious and straightforward question: “Were you happy with the resolution?”
Aside from this, try enquiring whether the user felt the service desk agent who dealt with their issue had the right skills and whether they were courteous and helpful. Include the person who took the call in this, too.
Lastly, and this is the question beloved of so many consumer user satisfaction surveys, “Would you recommend the Service Desk service/tools you used to other people?”
If you’re conducting a periodic survey, try to measure overall satisfaction with various aspects of your desk’s service. This will help you to spot and address any areas of latent concern. It’s also a good idea to ask questions which will help you to gauge the competence of the desk and its engineers, enabling you to plan future training.
The things you need from your measurement tools.
There is a whole range of options available to you in choosing which tools to use to survey and report satisfaction with your service desk. You should start off by checking the capabilities of the survey tools that are embedded in your ITSM tool, then consider what other functionality you need or would like to have.
According to Ben Whitehead, Plan-Net’s Head of Support Services, there are 8 key capabilities to consider when deciding what to build your survey and reporting capability around.
As your most frequent surveys will be those linked to incident closure, you need to be sure that your survey tool can be configured to survey end users automatically when an incident is closed.
Whatever tool you adopt needs to be capable of sending out one-off surveys as and when you wish – either as part of a periodic check or in response to some particular need to gain understanding.
It’s very useful indeed to choose a tool that enables branch and skip logic in your questioning. This lets you direct users down different follow-on paths dependent on their answers to earlier questions.
Just as branch and skip logic can be useful in designing telling surveys, piping (populating later questions with data from earlier responses) can also be extremely useful.
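Under the hood, branch, skip and piping rules are simply conditions applied to earlier answers. As a purely illustrative sketch (the function and question wording below are hypothetical and not tied to any particular survey tool), the two techniques might look like this:

```python
# Hypothetical sketch of branch/skip logic and piping in a survey flow.
# Question text and data structures are illustrative only.

def run_survey(answers):
    """Return the follow-on questions a user would see, given earlier answers."""
    questions = ["Were you happy with the resolution?"]
    satisfied = answers["Were you happy with the resolution?"]

    if satisfied == "no":
        # Branch: unhappy users get a follow-up question;
        # happy users skip it entirely.
        questions.append("What could we have done better?")
    else:
        # Piping: reuse the agent's name captured earlier in the survey
        # to personalise a later question.
        agent = answers.get("agent_name", "the agent")
        questions.append(f"Would you say {agent} had the right skills?")

    return questions
```

A happy user who dealt with an agent named Sam would skip straight to a personalised skills question, while an unhappy user would be routed to an open-ended follow-up instead.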