There are lots of different ways to collect customer feedback, but which ones work best? We asked our panel of experts for their opinions.
Close the loop
‘Want to win an iPad? Just fill out this survey.’ We’ve all seen enticing messages like this pop up after engaging with an organisation. The problem with customer feedback surveys is the perception that the feedback is never acted on or taken into account when changes are made to a business or service. As a result, surveys are driven predominantly by the thought of winning something and are rarely a valued form of communication between customer and company. The only chance you have of me filling in a survey is if I can win an iPad, which leaves most surveys with half-completed forms and half-hearted comments, completely undermining the time and resources taken to prepare the survey in the first place.
The solution lies in closing the loop: proving that something will be done with the feedback, that the customer’s time is valued, and that the information gained will go towards improvement. To do this, customers need to see how their feedback will be treated, and you need to make the outcome evident.
Cloud-based customer experience solutions enable this loop to be closed through real-time processing of feedback. The software automatically associates each response with the contact who gave it in the CRM system. Depending on the information given, an internal case can be raised and escalated to the right people – not just a quality department – so that a follow-up call can be scheduled to address any issues.
It is important that the customer who left the feedback receives a personal follow-up call, regardless of the feedback. This improves the experience and highlights that the customer is valued.
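As a rough sketch of what this closed-loop flow can look like in software, the example below ties an incoming survey response back to a CRM contact, raises a case when the score is low, and schedules a personal follow-up call. The crm object and its method names are hypothetical placeholders rather than any particular vendor’s API.

```python
from dataclasses import dataclass


@dataclass
class Feedback:
    contact_id: str   # CRM contact the survey response belongs to
    score: int        # e.g. 1 (very dissatisfied) to 5 (very satisfied)
    comment: str


def handle_feedback(feedback: Feedback, crm) -> None:
    """Close the loop on a single piece of feedback.

    'crm' stands in for whatever CRM / customer experience API is in use;
    the method names below are illustrative, not a real library.
    """
    # 1. Attach the feedback to the customer's contact record.
    crm.attach_feedback(feedback.contact_id, feedback.score, feedback.comment)

    # 2. Low scores raise a case and route it to the team that owns the issue,
    #    not just a quality department.
    if feedback.score <= 2:
        case_id = crm.raise_case(
            contact_id=feedback.contact_id,
            summary=feedback.comment,
            priority="high",
        )
        crm.assign_case(case_id, team="account_owner")

    # 3. Every respondent gets a personal follow-up call scheduled,
    #    regardless of whether the feedback was positive or negative.
    crm.schedule_callback(feedback.contact_id, reason="feedback follow-up")
```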
Make it easy to leave feedback
Another point that many surveys overlook is making it easy to leave feedback. Keep the process simple and ask for as little of the participant’s time as possible, so the customer doesn’t feel hassled. I personally find that if I get too many emails to process, or if I reply to a text and then get sent another, and another, I am going to give up. However, if at the end of a call you ask me to stay on the line for just 30 seconds longer to give feedback, I am much more likely to do that. Finally, if a customer has given negative feedback, make sure it is highlighted and they are treated as a priority: route them to the front of the queue and to your best agent. If this doesn’t improve the quality of feedback, I’m sure the iPad 3 will help.
By Paul Turner, Chief Operating Officer, NewVoiceMedia (www.newvoicemedia.com)
5 key criteria for success
After 12 years and millions of CSAT/NPS surveys, Bright sees these 5 key criteria for customer feedback:
1. Break it down
Forget about statistically secure samples at company level; to drive change, you need to measure at staff level, where the satisfaction or dissatisfaction is created. We recommend 20 surveys per agent per month. For a large centre this adds up, so the method needs to be cost-efficient: IVR, email or text surveying. We prefer IVR surveys, as you can record customer feedback and use it to great effect in one-to-ones.
2. Do it immediately
If you wait two weeks, customers will have either forgotten the interaction or mixed you up with other suppliers. Surveying 10 seconds afterwards has another great bonus: customers have not yet started doing anything else, so it is less intrusive and your take-up rate will be greater.
3. Feed it back to agents
By allowing agents to see their own results and the reasons behind them, you create a self-correcting solution.
4. Team leaders and correlations
Let team leaders see what really drives CSAT and which agents they need to focus on. If they manage to move every agent flagged as ‘red’ into the ‘green’, the impact on NPS will be massive. You will also be able to find breaking points, e.g. how long are your customers prepared to wait before they start scoring overall CSAT lower? (normally 2 minutes). See the sketch after this list for a rough illustration.
5. Feedback to other departments
By passing on segmented customer verbatim to other departments you will gain two things: A. They will take notice and stop doing stupid things to your customers. B. Your status will increase as you will be seen as the ones with the finger on the pulse.
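As a rough illustration of point 4, the sketch below computes the correlation between queue wait time and CSAT, then finds the point at which scores start to dip. The figures and the 4.0 threshold are invented purely for illustration (statistics.correlation requires Python 3.10+).

```python
import statistics

# Invented per-call records: (queue wait in seconds, CSAT score out of 5)
calls = [(30, 4.8), (75, 4.6), (110, 4.5), (150, 3.9), (200, 3.2), (260, 2.9)]

waits = [wait for wait, _ in calls]
scores = [score for _, score in calls]

# Pearson correlation between wait time and CSAT (expected to be negative).
corr = statistics.correlation(waits, scores)
print(f"Wait time vs CSAT correlation: {corr:.2f}")

# Crude "breaking point": the first wait time at which CSAT drops below an
# acceptable level. The 4.0 threshold is purely illustrative.
THRESHOLD = 4.0
breaking_point = next((wait for wait, score in calls if score < THRESHOLD), None)
print(f"CSAT first dips below {THRESHOLD} at around {breaking_point} seconds of waiting")
```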
Mats Rennstam, Managing Director, Bright (www.brightindex.co.uk)
“WOCAS” (What Our Customers Are Saying)
Last year we started our “WOCAS” (What Our Customers Are Saying) initiative, an employee focus group in which our team can pass on feedback on issues that are affecting our customers. Through WOCAS, we have identified various processes that needed to be improved, along with smaller changes ranging from how a certain promotion is worded on the website to a problem with our SMS messaging.
We have made it very easy for our customers to contact us – we have a free phone number and a new call-back option within the IVR, so if there is a 60-second delay in answering a call, the customer can leave their details for us to call them back.
What effect has it had?
We also proactively survey our customers by phone, not only to find out how we can improve our service but also to get their opinions on any projects or changes to our proposition that we are thinking of making, before we actually bring them in. Our customers help us to make the decisions! We have also engaged customers via social media, by surveying on Facebook and Twitter.
We send out an NPS survey to every customer, and our Quality Team pick up the responses and contact EVERY detractor to find out more about their journey. We also contact a sample of promoters for feedback purposes, to see what we have done well, and our current NPS score averages +80%!
Post-call IVR surveys
We have recently implemented a new telephony platform that has again improved the service we offer – we now have a post-call IVR survey available for customers to opt in to, and we ask them to rate the service they have received from our advisor. This allows us to measure our First Contact Resolution rate. We are currently averaging 4.85 out of 5 for advisor performance, and 90% for FCR!
I think the results speak for themselves!
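To put figures like those above in context, here is a rough sketch of how an NPS score, a detractor follow-up list and an FCR rate might be derived from raw survey responses. The response data is invented purely for illustration and is not DRL’s.

```python
# Invented survey responses, purely to illustrate the arithmetic.
nps_scores = [10, 9, 9, 10, 8, 10, 9, 7, 10, 3]   # 0-10 "likelihood to recommend"
resolved_first_time = [True, True, True, False, True, True, True, True, True, False]

promoters = sum(1 for s in nps_scores if s >= 9)
detractors = sum(1 for s in nps_scores if s <= 6)
nps = 100 * (promoters - detractors) / len(nps_scores)

# Indices of detractors, i.e. the customers a quality team would call back.
detractor_followups = [i for i, s in enumerate(nps_scores) if s <= 6]

fcr = 100 * sum(resolved_first_time) / len(resolved_first_time)

print(f"NPS: {nps:+.0f}")                              # +60 for this invented sample
print(f"Detractors to contact: {len(detractor_followups)}")
print(f"First Contact Resolution: {fcr:.0f}%")
```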
Emma Tickle, Quality Manager, DRL (www.drllimited.co.uk)
Voice of the Customer
When looking at customer feedback there are five golden rules.
1. Engage customers at the point of experience: Timing is everything, and to gain the best quality responses, ensure that you time the request for feedback as close to the point of the most recent interaction in the customer journey as possible.
2. Comprehensive: Taking a small sample of responses will typically lead to results that are not representative.
3. Cross-channel: The feedback request should be made via the customer’s preferred channel of communication, as this will also improve the quality and volume of responses. However, it is worth noting that SMS messages have demonstrated a higher standard of response compared to the telephone, as responders cannot be influenced by interviewer bias and have more time to provide a considered response.
4. Open-ended questions and unstructured responses: It is important to avoid questions that steer the customer towards a particular response, and giving the opportunity to provide an unstructured response will reveal more insight. For example, I recently heard the story of a hotel chain that circulated a paper-based survey to its customers. One of the many questions asked whether they liked or disliked the carpet. The resounding response was that people did not like it, so huge sums were spent ripping up and replacing the old flooring. Despite this change, the customer satisfaction rating did not improve. An open-ended question would have revealed what actually mattered to guests.
5. Avoid incentives: Some people will complete the survey for the reward rather than to help improve the service, and this can skew results.
Tom Lynam, NICE Systems (www.nice.com)
Net Promoter Score (NPS)
There are a number of techniques that organisations can employ to understand their customers’ needs better, one being a net-promoter programme. A net-promoter programme can be a great, quick and easy way to understand how loyal customers are to your brand, and how likely they are to recommend it to a friend.
However, although this is an effective method for understanding loyalty, it does not provide detailed insight into specific customer needs and expectations.
IVR or online surveys
For more detailed knowledge, an IVR (interactive voice response) survey or an online survey can be very informative, and both are cost-effective ways to gather customer data.
These methods are less intrusive than some others, such as email, SMS and white mail, as the customers have pre-agreed to take the survey. An IVR or online survey also tends to be much more representative, as it takes place at the time of, or just after, the interaction with an organisation. Customers’ thoughts and opinions are therefore much fresher and more likely to reflect their true feelings.
Mark King, Senior VP, Aspect Europe and Africa (www.aspect.com)
Multi-channel analytics
A multi-channel approach to collecting customer feedback is what most contact centres should be aiming for, as the way customers wish to feed back on their experience differs depending on their demographics.
There are now technologies that can provide event-driven, conversational feedback which is relevant, timely and a true representation of customer feelings during or immediately after the customer experience, and which can be captured from any number of communication channels.
Natural language processing technology can quite quickly identify themes in responses received via text, for example, and trigger alerts back into the business immediately after a customer transaction.
This provides a level of immediacy never really achieved before, which can prove important for customer retention strategies.
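The sketch below is a heavily simplified stand-in for that kind of technology: it uses plain keyword matching rather than real natural language processing, but it shows the general shape of tagging free-text feedback with themes and raising an alert when certain themes appear. All the theme names, keywords and comments are invented.

```python
from collections import Counter

# Heavily simplified stand-in for an NLP engine: keyword lists per theme.
THEMES = {
    "billing": {"bill", "charge", "charged", "refund", "invoice"},
    "waiting": {"wait", "waiting", "hold", "queue"},
    "agent":   {"rude", "helpful", "friendly", "unhelpful"},
}

ALERT_THEMES = {"billing", "waiting"}  # themes that should alert the business


def detect_themes(comment: str) -> set[str]:
    """Return the set of themes whose keywords appear in the comment."""
    words = set(comment.lower().split())
    return {theme for theme, keywords in THEMES.items() if words & keywords}


def process_feedback(comments: list[str]) -> Counter:
    """Count themes across comments and flag any that need an immediate alert."""
    theme_counts: Counter = Counter()
    for comment in comments:
        themes = detect_themes(comment)
        theme_counts.update(themes)
        if themes & ALERT_THEMES:
            # In a real system this would raise a ticket or notify a team
            # straight after the transaction; here we simply print.
            print(f"ALERT {sorted(themes & ALERT_THEMES)}: {comment!r}")
    return theme_counts


print(process_feedback([
    "I was on hold for 40 minutes",
    "The agent was really helpful",
    "Still waiting for my refund after being charged twice",
]))
```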
Tracy McAvoy, Marketing Manager, Business Systems (UK) Ltd (bslgroup.com)
We tend to provide feedback after a poor experience
There are many ways to collect customer feedback: IVR, web forms, text, email and analytics. But all these methods share two fundamental issues.
Firstly, we tend to provide feedback when we’ve had a poor experience. So any sample of feedback data, however it is obtained, does not give a fair representation; it reflects the extremes of satisfaction.
Secondly, a client has multiple touch points with your organisation and although you may be measuring feedback on one area, the respondent’s perception may have been influenced by a poor experience from contact with a different department.
That’s not to say we shouldn’t employ these methods of collating feedback, only that we need to ensure the feedback is used appropriately. Qualitative feedback can be invaluable in improving the customer experience, as it highlights common process or product issues that are not being dealt with properly and can often be resolved through FAQs and scripting. However, analytics, texts, emails and IVR cannot reflect the tone or context of the response, and so the best way to collect feedback and measure customer advocacy is through focus groups or outbound calling.
Siobhain Goodall, mplcontact
Author: Jo Robinson
Published On: 26th Sep 2012 - Last modified: 4th Jan 2023
Great piece and all useful insights. WOCAS, as Emma describes, started at Amazon when my colleague Bill Price was asked by Jeff Bezos: ‘What would I know, that I don’t know, if I worked on the front line each day?’ Since implementing WOCAS in many situations as one part of the wider Amazon approach, we’ve learned a few things:
a) Someone in the company knows everything the company learns – it’s not about shiny new stuff
b) It’s about who learns what and do they learn it in context….
c) and do they learn it with business data and customer data attached
d) and do they see that learning as an ongoing process not an analysis to do a fix
e) and do they design a “decisionflow” which uses that learning continually to prioritise and drive action at multiple levels
I liked reading Paul’s post. I agree that most of the time, when I spend time filling in a survey, it’s to WIN something! I think as survey respondents we need to look to an attitudinal change. For my academic work, I use SoGoSurvey because the features it has are just awesome!