Question to those in a more mature community (although new CMs are also welcome to join in). I'm looking for some benchmarks on avg CSAT scores and helpfulness scores ("Did you find what you're looking for — actually, "yes/no") across other communities.
We're just a bit over 4 months in with Lithium and I'm still not seeing much of an increase since the initial launch. If anyone has some metrics to share, that'd be great!
Sorry about the slow response @edgiansante.
I used to manage a support-based community. I would suggest that CSAT scores are not something most businesses share widely. It's probably also not that useful to compare across businesses. It's much better to set your own goals based on your own circumstances and business objectives.
It's good that you are being ambitious and want to lift your CSAT, but I would be cautious about reading too much into results over just four months. It can take a while to change customer behaviour around using the community as a support channel, and even longer to change average customer satisfaction levels — particularly in the upward direction!
If you'd like to share some of the ways in which you are hoping to improve CSAT I'm sure you would get some helpful feedback from community members here. There is lots of wisdom, expertise and experience in the group.
Thanks for sharing more information. It was interesting to read, as we've also discussed surveying users after they receive a response — both to make the results more comparable to other channels and their survey methods, and to get more insight into areas we can actually influence, which makes the feedback more actionable.
When looking at the community alone, you may find that result skewed. However, when comparing it to an overall 'support experience' (which tends to be: contact support staff -> receive information -> ticket is closed -> receive feedback survey), that's a much closer comparable. I find the first, pop-up survey is more comparable to an NPS / pulse-of-the-community measure.
I am not sure I follow you there — perhaps we are saying the same thing. A broad survey will measure the overall experience, not the specific interaction and experience on the forum.
On the other hand, people looking for solutions — browsing, searching and navigating — have also had an experience with the forum. They should not be neglected in that sense, but I think there are ways to work around that: perhaps have one survey designed for those who browse (it could run periodically) and another for those who post and receive a response from the community.
We use Qualtrics to collect customer feedback and have an algorithm and rules for when and where the survey pops up.
The response rate is quite good, especially for our largest language, English.
Re: overall increase/decrease — do you mean the actual result or the response rate? We see that the numbers don't really fluctuate unless we make drastic changes to the methodology. Since the feedback is very broad and mostly about the brand & products, that's in line with my expectations.
Hope all is well. Quick question for you, since you're using Qualtrics too.
I then ran a second test: inviting people to fill out a survey (built in Qualtrics) with the exact same questions as the OOB one, but only after they received a response from a member, super user or moderator.
How did you actually do this? Did you send users an email with a link to the survey after they received a response? Or are you making it part of the "reply received" notification? If you were able to pop up the survey ONLY for users that got a reply, I am very interested to learn how, and promise to bring you stroopwafels should we ever meet (the stroopwafel bribe has never failed me).