25 Ways to Improve Your Customer Satisfaction Surveys


Rachael Royds of CallMiner shares the best practices and tips for creating the ideal customer satisfaction (CSat) survey.

There are hundreds of potential questions you might ask on a customer satisfaction survey, but the best questions for any particular survey depend on your goals, your audience, your industry, and other factors.

In other words, there are many things to consider when developing a customer satisfaction survey. For instance:

  • Customers may get fatigued and quit long surveys before completion.
  • Customers may become confused or uncomfortable if the wrong questions are asked.
  • The wrong questions – and questions asked in the wrong format – can result in incomplete or inaccurate data.

The good news is that you’re not limited to using only one type of survey. In fact, using a mix of Net Promoter Score (NPS) surveys, CX scorecards, Customer Effort Score (CES) surveys and other tools such as engagement analytics can provide you with a broader, deeper data set – one that reveals how satisfied your customers truly are and points to concrete ways of improving that satisfaction.

In other words, customer satisfaction surveys are just one tool in your customer satisfaction toolkit, but they are a crucially important one.

Below, you’ll find 25 expert tips on developing effective customer satisfaction surveys: the types of questions to ask, the best formats to use, and how to leverage your surveys (and the resulting data) for the best results.

1. Ask useful, actionable questions that measure a process or behaviour

Too often, survey questionnaires are long lists of 10-point ratings on every potential aspect of a customer’s experience.

One of the best ways to improve the quality of your direct feedback is to stop and look at what it is that you’re actually asking your customers.

To do this, focus on questions that are useful, actionable, and measure process or behaviour.

For example, by rephrasing ‘How do you rate our room service?’ as ‘How satisfied were you with the speed of room service?’ you are asking for valuable feedback that offers a clear option for direct action. That way, the insights you gather are far more likely to be actionable.

Geoffrey Ryskamp, Medallia

2. Use CSAT surveys for in-the-moment feedback

A Customer Satisfaction Score (CSAT) survey is the simplest way of gauging what your customers think about any interaction, across any channel, they have with your organization.

Whether it’s speaking to a customer service representative at the contact centre, navigating the website or using an app, a CSAT survey asks customers to rate the experience in question on a three-point, five-point or seven-point scale.

The fact that it’s a single-question survey and can be answered by checking a box, typing a single number or selecting an emoji means that it can be sent via any channel.

However, with CSAT, speed is of the essence – it should be sent as quickly as possible to measure the moment while that moment is still fresh in the customer’s mind.

-Sitel Group
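To make the roll-up concrete, here is a minimal Python sketch of one common convention for turning CSAT responses into a score: the share of respondents choosing the top ratings on the scale. The five-point scale and the sample ratings are purely illustrative.

```python
# Minimal sketch: rolling five-point CSAT responses up into a score.
# A common convention is CSAT = (number of 4s and 5s) / total responses * 100.

def csat_score(responses):
    """Return the CSAT score for a list of 1-5 ratings."""
    if not responses:
        return None
    satisfied = sum(1 for rating in responses if rating >= 4)
    return round(100 * satisfied / len(responses), 1)

print(csat_score([5, 4, 3, 5, 2, 4, 5]))  # 71.4
```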

3. Scale questions produce more data you can analyse compared to simple yes–no questions

The benefit of asking scale questions is that you will be provided with more data than a simple ‘Yes’ or ‘No’, and this data can then be used to calculate scores based on the responses.

Again, this is an easy enough question for the respondent to answer and you may find that this kind of survey question has a better response rate than some of the others on our list.

Using scale questions means you can aggregate scores to see how satisfied a wide range of customers are, and they can be used for practically any aspect of the customer journey.

Three examples of scale questions:

  • How likely are you to recommend us on a scale of 1 to 10?
  • How satisfied were you with your experience today on a scale of 1 to 10?
  • How easy was it to find what you were looking for? Answers ranging from very difficult to very easy.

-CustomerThermometer
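As a quick illustration of how scale responses can be aggregated into scores, here is a small Python sketch that turns 1-to-10 ratings for the three example questions above into average scores. The question names and ratings are made up.

```python
from statistics import mean

# Illustrative 1-10 ratings keyed by the example scale questions listed above.
responses = {
    "likelihood_to_recommend": [9, 7, 10, 8, 6],
    "satisfaction_today":      [8, 9, 7, 10, 9],
    "ease_of_finding":         [6, 7, 5, 8, 7],
}

# Aggregate each scale question into an average score you can track over time.
for question, ratings in responses.items():
    print(f"{question}: {mean(ratings):.1f} average (n={len(ratings)})")
```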

4. Ask unbiased questions for the most accurate results

It’s hard to be objective when you think your customer service is outstanding. Take a step back from what you think you know and let your shoppers do the talking. Avoid embellishing your questions with superlatives.

Take the following prompt: “How would you describe our friendly customer service representatives?”
This is a leading question as it describes the reps as being ‘friendly.’ As a result, it isn’t likely to provide accurate results.

Instead, ask a focused question about an aspect of your customer service, such as: “How responsive or unresponsive were our customer service representatives?”

-SurveyMonkey

5. Avoid double-barrel questions

These questions touch on more than one issue, but only allow for one response. They are confusing for the respondent and you’ll get skewed data because you don’t know which question the respondent is answering.

-Qualtrics

6. Ask about product usage

When it comes to customer success and satisfaction, it’s critical that your business collects feedback about your product or service. If you don’t, then it’s more difficult to assess customer needs and provide effective solutions.

Finding out how satisfied your users are with your offer provides your marketing and product teams with valuable information that can be used to improve customer retention.

Some questions that you could ask in this section are:

  • How often do you use the product or service?
  • Does the product help you achieve your goals?
  • What is your favourite tool or part of the product or service?
  • What would you improve if you could?

-Ruchika Sharma, Hubspot

7. Semantic differential questions are a useful alternative to Likert-scale questions, offering more guidance to respondents for better ratings consistency

Semantic differential questions are similar to Likert scale questions in that they both use a 5- or 7-point scale.

What makes semantic differential scale questions unique is that they are more descriptive and ask the respondent to choose the option that best represents their opinion or attitude on a given subject instead of asking them to simply agree or disagree.

“How helpful do you find our video tutorials?”

1 – Not helpful at all
2 – Barely helpful
3 – Neither helpful nor unhelpful
4 – Somewhat helpful
5 – Very helpful

-Michael Redbord, Hotjar

8. Ask questions designed to measure customer loyalty

Although you can try to determine customer loyalty based on customer retention rates, customer loyalty is truly a different concept.

The questions below will help establish how your customers truly feel about doing business with you, now and in the future:

  1. Would you recommend our services/products to a friend? If not, why?
  2. Will you use our services/products to meet your X needs in the future?
  3. Will you buy more/similar products from our company?
  4. Do you identify as a loyal customer of our brand?
  5. Would you like to receive information regarding our new releases/future sales?

-Carla Jerez, Comm100

9. Include an overall satisfaction question

It is a good idea to include a general satisfaction question, which will serve as an overall measure of how well your company is pleasing customers across all aspects of the business (product, brand, service, communication, etc.). For example: What is your overall satisfaction with (insert company name)?

The results of this question will provide a useful baseline to use in measuring your company’s progress over time.  Since this is a very common question asked by many companies, there are various industry benchmarks to also measure yourself against.

The most common way to analyse responses to this question is to look at the percentage of respondents who are either somewhat or very satisfied.  Most companies will find that this number usually lies somewhere in the 75%–85% range, although it will vary by product type and industry.

One downside to this question, however, is that it is not very actionable.  It might tell you that customers are relatively happy or unhappy overall, but it doesn’t tell you why they feel the way they do.

Knowing that 75% of your customers are satisfied isn’t very useful on its own, but I do think it is helpful to understand if that percentage is rising or falling from quarter to quarter or year to year.

Market Research Guy, My Market Research Methods
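Here is a minimal sketch of the analysis described above: the percentage of respondents who are somewhat or very satisfied, tracked from one period to the next. The quarter labels and response counts are purely illustrative.

```python
# "Top-two-box" analysis: share of respondents answering "somewhat satisfied"
# or "very satisfied", tracked quarter over quarter. Counts are illustrative.

quarterly_responses = {
    "Q3": {"very satisfied": 120, "somewhat satisfied": 180, "neutral": 60, "dissatisfied": 40},
    "Q4": {"very satisfied": 150, "somewhat satisfied": 175, "neutral": 50, "dissatisfied": 25},
}

for quarter, counts in quarterly_responses.items():
    total = sum(counts.values())
    top_two = counts["very satisfied"] + counts["somewhat satisfied"]
    print(f"{quarter}: {100 * top_two / total:.1f}% satisfied (n={total})")
```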

10. Ask competitor-based questions

In your satisfaction survey, you might want to know how your business, products, and services stack up to your competitors’.

Depending on what industry you’re in and what you sell, you might consider asking questions that allow your customers to compare you against other options they were considering before choosing you.

Competitor-Based Questions to Ask:

  • Why did you choose our product/service over a competitor’s? (open text)
  • Before buying from us, what other options did you consider? (open text)
  • Compared to similar products/services you purchased, is our product/service better, worse, or the same? (multiple choice or open text)

-Baylor Cherry, Blueleadz

11. Use multiple-choice questions to identify words customers associate with your product

“Which of the following words would you use to describe our product?”

Why should you care how people would describe your product? Because it matters when your product gets described as ‘buggy’ instead of ‘life-saving’.

If you don’t want to give your clients any suggestions, you can use the open-ended variation of this customer survey question and ask ‘How would you describe our product?’

Answers to any of those will show you how well you communicate your value proposition and product vision to users. If your goal is to provide them with the world’s most sophisticated graphics software and people describe it as ‘Paint with 1 cool feature’, it means that the communication is broken.

Hint: Check which positive words are most commonly used to describe your product. If the same words come up frequently, it may indicate that this is what your customers are looking for in your product, and you can use those words in your future marketing campaigns.

-Lucjan Kierczak, Survicate

12. Limit your customer satisfaction surveys to no more than five questions

Customers don’t even complain when surveys are too long. They just give up, and you never find out how they feel, or whether they’re at risk. If you’re asking more than five scored questions, it’s probably too many.

We looked at our data – millions of survey requests from hundreds of companies worldwide – to see how the number of questions on a survey affects abandon rate.

People run for the hills if your survey is longer than nine questions total, but around six is the ‘sweet spot’. Asking no more than five scored questions gives you a bit of breathing room.

-Customer Sure
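For illustration only, here is a rough Python sketch of how abandon rate by question count could be computed from survey request data; the record fields and numbers are hypothetical, not Customer Sure’s actual method or data.

```python
from collections import defaultdict

# Hypothetical survey request records: "questions" is the survey length,
# "completed" is whether the respondent finished it.
survey_requests = [
    {"questions": 3, "completed": True},
    {"questions": 3, "completed": True},
    {"questions": 8, "completed": True},
    {"questions": 8, "completed": False},
    {"questions": 12, "completed": False},
]

stats = defaultdict(lambda: {"sent": 0, "abandoned": 0})
for request in survey_requests:
    bucket = stats[request["questions"]]
    bucket["sent"] += 1
    if not request["completed"]:
        bucket["abandoned"] += 1

# Abandon rate grouped by the number of questions in the survey.
for n_questions, bucket in sorted(stats.items()):
    rate = 100 * bucket["abandoned"] / bucket["sent"]
    print(f"{n_questions} questions: {rate:.0f}% abandon rate (n={bucket['sent']})")
```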

13. Ask whether your customer service representatives act in your customers’ best interests

‘You’re not following our process.’ Drew J. told Nasser that one of his CSRs had said this to a customer. ‘This was a wake-up call for sure.’

If you simply train agents on a rigidly scripted customer encounter process, they can over-rely on these procedures and lack the ability to adapt to actual customer situations.

More important than any policy, procedure, or script are the reasons for these resources. Your focus should be on making the customer happy, and your agents and their tools should work for the customer, not the other way around.

Asking whether CSRs act in the best interest of your customers is a more meaningful way of getting at whether your service is good or bad.

Before you can solve a customer’s problem, you need to show them you care about their goals and concerns. If customers don’t believe CSRs are working in their best interest, nothing else will matter.

Cathy Reisenwitz, Capterra

14. Include one or two open-ended questions

Open-ended questions are questions that require a respondent to write a comment, an essay, or another type of free-form response.

You don’t need to make all the questions open-ended; rather, include only one or two such questions.

This will unfetter your respondents and give them the freedom to say what’s on their minds without holding back.

Below are two smart open-ended questions culled from a recent survey by an e-commerce company:

  • What other products would you like to see in our online store? Please tell us other possible products you wish we were selling on our eCommerce platform.
  • What else would you like us to know? Any other information you’d like to share with us? We’d be happy to hear from you.

When asking an open-ended question, you need to be careful, as some respondents may get carried away and provide you with long compositions.

To prevent this, provide a text box where they can write their answers, but limit the number of characters the text box can accept.

That way, they’ll provide you with only the most important things they have to say.

– Zoe Uwem, NeilPatel.com

15. Begin the survey design process with your survey goals in mind

Vague surveys lead to vague results. Beginning by defining your goal helps you create an organized, focused survey. That means you won’t leave out what may be your most relevant questions.

Grabbing a generic survey template or subscribing to an online survey platform is quick and easy, but you won’t get maximum effectiveness by taking the easy way out. It’s essential that you consider your particular business situation and goals when designing the survey.

Your clearly defined goal will lead to questions that are easier to answer and simpler for you to analyse and act upon later.

David Hoos, The Good

16. Customer satisfaction surveys are an important part of proactive engagement

To truly build long-term customer loyalty, companies need to move from transactional interactions with their customers to building company–customer relationships.

The first step to building these relationships is engaging with customers beyond basic one-way dialogs.

Customers don’t feel valued when they have to take the time to contact the companies they do business with about issues that the company should already know about, such as service outages.

At the same time, sending out mass text messages with no way to respond doesn’t give your customers warm fuzzy feelings either.

When proactive outreach is done right it can help organizations maximize productivity, customer satisfaction and contributions to the bottom line.

Examples of proactive outbound campaigns include collections, surveys, reminders, notifications, and confirmations. These interactions also lend themselves to effective delivery via self-service.

In addition, when done within an omnichannel experience, they can provide a more interactive communication channel resulting in higher completion rates and more customer comfort.

-Bernhard Santjer, Aspect

17. Focus on the right KPIs

The most effective CX-related KPIs focus directly on the real desired result – delivering an exceptional customer experience where the customer leaves the engagement not only satisfied, but positive about the brand.

These include:

First call resolution – FCR has long been used as a measure of contact centre effectiveness. However, first call resolution is only meaningful if the customer problem is actually resolved on the first call.

If, for example, a customer calls back later in the day or a few days later, more than likely the problem wasn’t truly resolved.
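As a hedged illustration of the point above, here is one simple way FCR could be approximated from call records: treat a first call as resolved only if the same customer does not call back within a follow-up window. The field names and the seven-day window are assumptions, not a standard definition.

```python
from collections import defaultdict
from datetime import datetime, timedelta

# Hypothetical call records; a call-back inside the window suggests the
# first call did not truly resolve the problem.
WINDOW = timedelta(days=7)

calls = [
    {"customer": "A", "time": datetime(2024, 1, 2, 9, 0)},
    {"customer": "A", "time": datetime(2024, 1, 3, 16, 0)},  # call-back -> not resolved first time
    {"customer": "B", "time": datetime(2024, 1, 2, 11, 0)},
]

times_by_customer = defaultdict(list)
for call in calls:
    times_by_customer[call["customer"]].append(call["time"])

first_calls = resolved = 0
for times in times_by_customer.values():
    times.sort()
    first_calls += 1
    # Resolved on the first call if there is no second call inside the follow-up window.
    if len(times) == 1 or times[1] - times[0] > WINDOW:
        resolved += 1

print(f"FCR: {100 * resolved / first_calls:.0f}%")  # 50% for this toy sample
```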

Primary CX KPIs

  • Customer Satisfaction
  • Net Promoter Score (NPS) – both relational and transactional
  • Customer Effort Score

Post-call surveys – Surveys are a good way to obtain the voice of the customer, but they should be timely, quick and easy to respond to.

Companies should utilize this feedback to adjust contact centre procedures and improve the customer experience. Don’t let these results sit idle – make effective change.

– Scott Kolman, Five9

18. The right metrics may vary for different types of businesses

The “right” metrics will vary with the type of business (B2B or B2C, for example), but generally companies should be gauging how easy it is to do business with them (Customer Effort Score) and how likely customers are to buy again (Net Promoter Score – but more on this follows).

Online sentiment analysis – how people talk about your company and products – can be very valuable in showing you where your weaknesses are and what market segments might find you more appealing if you did something differently from the way you do it now.

Personally, I don’t think Net Promoter is as informative as it could be. If you take it to the next level and seek out why the Passives identified in NPS are passive and what it would take to turn them into Promoters, that would be an improvement.

Of course, you need to pay attention to the big things like gross and net profit margins, cost of goods and services, and so on, always looking for both strengths and weaknesses.

-Roy Atkinson, as told to Robert Morrissey, RingCentral
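The standard Net Promoter Score calculation referred to here is simple enough to show directly: respondents scoring 9–10 are promoters, 0–6 are detractors, 7–8 are passives, and NPS is the percentage of promoters minus the percentage of detractors. The sample scores below are illustrative.

```python
# Net Promoter Score from 0-10 "likelihood to recommend" responses.
# Passives (7-8) count towards the total but not towards either group.

def nps(scores):
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores))

print(nps([10, 9, 8, 7, 6, 10, 4, 9]))  # 4 promoters, 2 detractors, 8 total -> 25
```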

19. Eliminate silos when measuring and supporting customer interactions

As the number of digital touchpoints increases, the ability of a contact centre to effectively address customer interactions will need to be even more closely managed to assure high satisfaction and resolution on whatever channel the customer chooses.

The reason for this is best explained via a McKinsey & Company concept called the ‘Customer Journey Experience’.

Essentially, contact centres today generally focus on measuring and supporting customer interactions in a siloed manner.

The issue is that from the customer perspective, customers don’t look at these stops on their journey as silos.

Customers view them as interconnected, and as their journey unfolds, the aggregate of all the interactions across touchpoints forms the basis for the overall experience the customer has with the company.

So, customer satisfaction with the overall journey experience is an important metric to focus on.

Taken individually, satisfaction with each digital channel may look reasonable from a standalone perspective.

As the number and usage of digital channels increases – and with it the complexity of possible channel permutations – contact centres will need the infrastructure not only to manage these channels, but to manage each one well, so they can deliver a high end-to-end satisfaction level across the entire customer digital journey, regardless of which channels the consumer chooses, or in what order.

-James Mastan, Nuance

20. Tie customer satisfaction feedback to agent performance and campaign activity

Most survey systems operate in a silo, separate from the contact centre ecosystem. This disconnection often results in surveys taking place days/weeks after the actual interaction.

A delayed survey not only minimizes the opportunity for accurate feedback, but also provides feedback without context.

Survey results are essentially provided without being tied to what, why, and how the interaction happened.

In essence, surveys often leave business leaders struggling to:

  • Determine agent-level performance: Delayed surveys without direct links to campaign activity leave business managers struggling to pinpoint opportunities to improve agent performance, or to be completely confident in the accuracy of the feedback itself.
  • Make timely decisions: In today’s fast-paced consumer environment, any time lost between the actual interaction and survey feedback can lead to lost opportunities in creating up-sells, addressing consumer complaints, or gathering product insight.

-LiveVox
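Here is a minimal sketch of what tying feedback to agent and campaign activity can look like in practice: joining each survey result back to the interaction that triggered it, so scores can be broken out by agent or campaign. The identifiers and field names are illustrative, not any vendor’s actual implementation.

```python
from collections import defaultdict
from statistics import mean

# Illustrative interaction records keyed by interaction ID.
interactions = {
    "int-001": {"agent": "agent_7", "campaign": "renewals"},
    "int-002": {"agent": "agent_3", "campaign": "renewals"},
    "int-003": {"agent": "agent_7", "campaign": "support"},
}

# Survey results linked back to the interaction that triggered them.
survey_results = [
    {"interaction_id": "int-001", "csat": 5},
    {"interaction_id": "int-002", "csat": 2},
    {"interaction_id": "int-003", "csat": 4},
]

by_agent = defaultdict(list)
for result in survey_results:
    context = interactions[result["interaction_id"]]
    by_agent[context["agent"]].append(result["csat"])

for agent, scores in by_agent.items():
    print(f"{agent}: average CSAT {mean(scores):.1f} over {len(scores)} surveyed interactions")
```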

21. Follow great survey design principles

When designing your customer satisfaction survey, all aspects of great survey design are important. But pay special attention to these three principles:

  • Remove bias – It’s your business; we know it’s difficult not to be biased. But when it comes to asking for customer feedback, it’s important to keep your own opinion out of the questions. You want customers to respond genuinely and give you the greatest insight into their experiences, even if that insight might be harsh.
  • Be concrete – Your questions should address specific customer interactions and be worded in a way that avoids miscommunication. Avoid or define technical and industry-insider terms whenever possible.
  • Stay focused – Your survey should address one aspect of the customer experience. It’s best to view these surveys as a quick check-in with your customers, with each question focused on an actionable takeaway. The survey should take the average respondent two minutes or less to complete.

-Liz Millikin, SurveyGizmo

22. Conduct surveys at the right times

Here are the three major client survey types that, combined, will give you the most accurate picture of your customers’ satisfaction:

  • Survey right after purchase – to reveal customer satisfaction at that particular moment and thus lay the foundation for a good relationship with those customers.
  • Periodic surveys – to discover how specific segments of your customers are experiencing your products at certain moments in time.
  • Ongoing surveys – so you can continuously take the pulse of your customers’ satisfaction and make sure you are maintaining high-quality standards. The most common type of survey here is the NPS survey.

-Omniconvert

23. Optimize your survey design to improve response rates

A Zendesk study found that 21% of respondents feel that they’re usually too busy to take a survey, while 16% claimed that surveys ask far too many personal questions.

A further 12% believe that surveys are often too long, making them far too time-consuming to fill out.

These figures stand as testament to the importance of optimizing your surveys to improve response rates.

A survey with low response rates won’t accurately capture how your customer base perceives your product and will ultimately fail to produce statistically significant insights.

Vrinda Singh, Paperform
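To make the “statistically significant” point concrete, here is a rough sketch of the margin of error on a satisfaction percentage at different response counts, using the standard normal-approximation formula with the worst-case p = 0.5 and a 95% confidence level. The response counts are illustrative.

```python
import math

# Margin of error on a survey percentage shrinks as completed responses grow.
def margin_of_error(n_responses, p=0.5, z=1.96):
    """95% margin of error (as a fraction) for a proportion estimated from n responses."""
    return z * math.sqrt(p * (1 - p) / n_responses)

for n in (50, 200, 1000):
    print(f"n={n}: +/-{100 * margin_of_error(n):.1f} percentage points")
```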

24. Choose the right medium for conducting customer satisfaction surveys

After determining your need and choosing a survey type, you need to pick a medium.

Will you be sending your survey over email, text, in-app, on paper, or in person?

Most surveys are collected over email, but some product development surveys could be done in person.

Surveys that measure a website’s ease of use can be done as pop-up surveys on the site itself.

-Meaghan Brophy, FitSmallBusiness

25. Be sure you’re sending your customer satisfaction surveys to the right audience

Another reason some companies achieve below-average response rates is that they end up sending their survey to the wrong audience. It may seem obvious to send your survey to your customers, but it’s not that simple.

For example, your CRM might have several contacts for each account, so clicking ‘select all’ makes no sense.

While you do business with some of them, there is a high chance that some of those contacts work in administration or legal departments, and sending your survey to them probably doesn’t serve its purpose.

What’s more, depending on the survey goal you defined, you need to segment your customers.

By dividing your customers into different categories, you can deliver your key messages (and surveys) more precisely, thus better connecting with your audience and developing long-lasting engagement.

Jerome Collomb, MyFeelBack
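As an illustration of this audience-selection step, here is a small Python sketch that filters out contacts unlikely to be relevant (such as legal or administration roles) and groups the remaining recipients by segment before sending. The roles, segment names, and addresses are made up.

```python
# Illustrative CRM contacts; filter out roles that never use the product,
# then group recipients by segment so each segment can get a tailored survey.
contacts = [
    {"email": "a@example.com", "role": "end user", "segment": "enterprise"},
    {"email": "b@example.com", "role": "legal",    "segment": "enterprise"},
    {"email": "c@example.com", "role": "end user", "segment": "smb"},
]

EXCLUDED_ROLES = {"legal", "administration"}

recipients_by_segment = {}
for contact in contacts:
    if contact["role"] in EXCLUDED_ROLES:
        continue
    recipients_by_segment.setdefault(contact["segment"], []).append(contact["email"])

print(recipients_by_segment)  # {'enterprise': ['a@example.com'], 'smb': ['c@example.com']}
```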

This blog post has been re-published by kind permission of CallMiner – View the Original Article


About CallMiner

CallMiner is the leading cloud-based customer interaction analytics solution for extracting business intelligence and improving agent performance across all contact channels.


Author: CallMiner

Published On: 10th Sep 2019 - Last modified: 17th Apr 2024

