Do you have any best practice use cases on how to check quality of support?
Our community puts response time first when it comes to measuring support responsiveness, but we'd like also to check how users perceive our responses.
You could use one or both of the following; I actually use both methods in my own support community:
1. Surveys:
You could send any users you help a survey with satisfaction and NPS-type questions to understand how happy they were with the quality of support provided. No dates yet, but I believe Lithium is working on a survey product that will enable you to do this and also target it at certain users. Currently I have a basic survey hosted on my support site (you could use SurveyMonkey) that we push users to. It helps us understand how happy users are, whether we have deflected them from calling, and what they think of the community.
2. Random Quality Checks:
I'm lucky in that we have a team who complete quality checks on all the calls our entire business gets. We adapted their scorecard and now have them reviewing X number of topics per support agent. They score them on things like: tone of voice, adherence to policy, whether they did all they could, etc.
The second option is the best way to get a real idea of the true quality, but it does take a lot of someone's time; before I had a central quality team, I would spend 4 days a month just reviewing and scoring topics. I'd also suggest linking the quality score to the agents' bonuses/objectives so that you can drive top quality!
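A per-agent quality review like the one described above could be tallied along these lines. This is only a minimal sketch: the criteria names, the 0-5 marking scale, and the 0-100 normalization are illustrative assumptions, not the actual scorecard.

```python
# Hypothetical scorecard: each reviewed topic gets a 0-5 mark per criterion,
# and an agent's quality score is the average across their reviewed topics.
CRITERIA = ("tone_of_voice", "policy_adherence", "went_the_extra_mile")

def topic_score(marks):
    """Average the 0-5 marks for one reviewed topic into a 0-100 score."""
    return 100 * sum(marks[c] for c in CRITERIA) / (5 * len(CRITERIA))

def agent_score(reviews):
    """Average score across all of an agent's reviewed topics."""
    return sum(topic_score(m) for m in reviews) / len(reviews)

reviews = [
    {"tone_of_voice": 5, "policy_adherence": 4, "went_the_extra_mile": 3},
    {"tone_of_voice": 4, "policy_adherence": 5, "went_the_extra_mile": 5},
]
print(round(agent_score(reviews)))  # → 87
```

A rolled-up number like this is also what you would compare against a bonus/objective threshold if you link quality to pay.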
Hope this helps
Like @Fellsteruk, I would recommend you survey your customers. Some companies send their customers an NPS survey after every support interaction, regardless of the channel (phone, live chat, email, etc.). If you have your community linked to your CRM system, this will be a lot easier. But hopefully Lithium adds its built-in survey offering very soon - it looks promising and should meet your needs.
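For anyone tallying such a survey by hand, the standard NPS arithmetic is: percentage of promoters (9-10) minus percentage of detractors (0-6), with passives (7-8) counted only in the denominator. A minimal sketch, with made-up ratings:

```python
# Net Promoter Score from a list of 0-10 ratings.
# Promoters score 9-10, detractors 0-6, passives 7-8.

def nps_score(ratings):
    """Return the NPS (-100 to 100) for a non-empty list of 0-10 ratings."""
    if not ratings:
        raise ValueError("no ratings collected yet")
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return round(100 * (promoters - detractors) / len(ratings))

# Example: 5 promoters, 3 passives, 2 detractors out of 10 responses
print(nps_score([10, 9, 9, 10, 9, 7, 8, 7, 3, 6]))  # → 30
```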
Thanks for the great answers.
We've run a survey in our community as well, but noticed it ends up being a once-a-year exercise rather than a regular quality check. We are also looking for something specific: a way for our support team to get feedback from users whenever they provide a solution. It's probably similar to what Lithium does whenever a support case is closed. I'm not sure how it would work for forum discussions where many users are involved, though, or whether the feature is available to us.
That was always the quandary we faced at my previous community. Having a survey triggered when a user accepts a solution seems the most appropriate trigger, but the results may not be representative - they could be artificially high because you are only surveying those who got a solution they were happy with. What about the users who didn't? They are unlikely to be as happy with your service.
It would be better to send the survey to all users who created a new thread in a certain period, or to a random selection of them. Unfortunately, that means a manual process of harvesting users. There's also the need to wash the list against the list of users who have opted out of marketing communications - you do not want to be spamming them. We found that a reasonably arduous process.
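The harvest-and-wash step above could be scripted roughly like this. It's only a sketch: the thread records, field names, usernames, and the sample size of 2 are all invented for illustration, and in practice the thread list would come from an export or API rather than a hard-coded list.

```python
# Hypothetical "harvest and wash": pick a random sample of thread creators
# from a date window, then drop anyone on the marketing opt-out list.
import random
from datetime import date

threads = [
    {"author": "alice", "created": date(2024, 5, 2)},
    {"author": "bob",   "created": date(2024, 5, 9)},
    {"author": "carol", "created": date(2024, 4, 1)},  # outside the window
    {"author": "dave",  "created": date(2024, 5, 20)},
]
opted_out = {"bob"}  # users who declined marketing contact

start, end = date(2024, 5, 1), date(2024, 5, 31)
authors = {t["author"] for t in threads if start <= t["created"] <= end}
eligible = sorted(authors - opted_out)  # wash against the opt-out list
sample = random.sample(eligible, k=min(2, len(eligible)))  # random selection
print(sample)
```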
We have an on-page, 1-question survey with 1 follow-up question if they respond "no", similar to what Aruba does (http://community.arubanetworks.com/t5/Controller-less-WLANs/When-logged-into-Aruba-Central-how-do-I-...).
It is also role-based, so only members who have access to the support site can respond, which is also why I can't send a link. We cheated a little bit: the best solution we found was an Eloqua form that captures the page URL of the response. However, it takes some manual munging to match the URL to a specific support responder.
One gem we found when analyzing our survey results is timing. If you pop the survey too soon, users will ding you for it. So some things to check before showing it:
1. Is this a first-time visitor?
2. How many pages have they visited?
3. Did they just post, or do they have multiple posts?
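The three checks above amount to a small gating function like the one below. The thresholds (3 pages, 2 posts) and parameter names are illustrative assumptions; the right cut-offs would come from your own survey-response data.

```python
# Hypothetical gate deciding whether to pop the on-page survey,
# so engaged returning users are surveyed and first-timers aren't dinged.

def should_show_survey(visit_count, pages_viewed, post_count):
    if visit_count <= 1:   # 1. first-time visitor: too soon
        return False
    if pages_viewed < 3:   # 2. hasn't looked around enough yet
        return False
    if post_count < 2:     # 3. only a single post so far
        return False
    return True

print(should_show_survey(visit_count=1, pages_viewed=10, post_count=5))  # → False
print(should_show_survey(visit_count=4, pages_viewed=6, post_count=3))   # → True
```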
Those are just suggestions; we are still trying to figure all this out as well.