Customer Service Surveys – Bringing Sanity to the Survey

Dave Salisbury draws on a recent experience to highlight some key considerations and methods for building the ideal customer survey.

The Experience

Nissan sent me a post-sale survey after I bought a car. I answered the questions honestly: the sales rep did a great job, yet those servicing and supporting him did some things that I was not particularly pleased with.

I was specific in the comment section, praising my sales rep, and particular about who dropped the ball, where, and the issues it created.

Not 30 minutes after completing the survey, I received a call from a senior director at the Nissan dealership, who said that my sales rep was going to be fired and lose all his commissions for the month because the survey is solely his responsibility.

The senior director went on to say: “This is an industry-wide practice and cannot be changed.”

I Disagree!

As a business consultant, I have long fought against “Voice of the Customer” surveys that measure things a customer service person, salesperson, or other frontline customer-facing employee does not control.

If the customer-facing employee does not control all the facets that create a problem, then the survey should only be measuring what can be controlled.

A car salesperson has a challenging job and relies upon a team to help close the deal, including a service department, a finance department, sales managers, and more. Yet when problems arise in the back office, the blame all falls upon the sales rep.

What Does This All Mean for Our Contact Centre Surveys?

Quantitative data is useful but means nothing without proper context, support, purpose, and a properly designed survey analysis procedure.

Even with all of those tools in place, quantitative data can still be misconstrued, confused, and convoluted by the researcher, the organization paying for the research, or the biases of those reading the research report.

Qualitative data is useful, but the researcher’s bias plays a more active role in qualitative research. Qualitative research suffers the same problems as quantitative research for many of the same reasons.

Regardless, neither quantitative nor qualitative data proves anything on its own; data can only support a conclusion. Hence, the human element remains the preeminent hinge upon which the data swings.

This leads to some questions that every business sending out a “Voice of the Customer” survey instrument needs to investigate and answer continuously…

5 Questions to Ask Yourself When Creating Customer Surveys

  1. Is the data being captured relevant, timely, and accurate?
  2. What is being measured by the survey instrument, and why?
  3. How is the information being used to improve upon that which is measured?
  4. Who benefits from the survey, and why?
  5. Who is harmed by the survey, and why?

Even with all this taken into consideration, business leaders making decisions about “Voice of the Customer” survey data need to understand that one person can make or break the service/sales chain, but it takes a team to support the customer-facing employee.

As Joseph M. Juran once remarked: “When a problem exists, 90% of the time the solution is found in the processes, not the people.”

Hence, when bad surveys come in, defend your people and check how your business is doing business, i.e. the processes.

The dynamics of “Voice of the Customer” survey instruments require one more consideration: delivery.

AT&T recently sent me a “Voice of the Customer” survey via text message collecting the barest of numerical (quantitative) data, three text messages, three data pieces, none of which gets to the heart of the customer issue.  Barely rating the salesperson in the AT&T store I had previously visited.

Recently, I received a call from Sprint, where the telemarketer wanted to know if I wanted to switch back to Sprint and why.

The nasal voice, the rushed manner, and the disconnected mannerisms of the telemarketer left me with strong negative impressions, not about the telemarketer, but about Sprint.

Nissan sends emails, and while the data collected has aspects of the customer’s voice (qualitative) and numerical rankings (quantitative), my impressions of Nissan have sunk over the use of the survey to fire hard-working sales professionals.

My previous bank, Washington Mutual, had a good, not great, “Voice of the Customer” survey process, but the customer service industry continues to make the same mistakes in survey delivery and application.

The how and why in “Voice of the Customer” surveys, or the delivery and use of the survey data, leaves a longer-lasting impression upon the customer than the actual survey.

So, if you are a business leader who has purchased an off-the-shelf “Voice of the Customer” survey analytics package, and you cannot explain the how, why, when, where, what, and who questions in an elevator speech, the problem is not with the customer-facing employees doing your bidding.

If your back-office people supporting the customer-facing people are not being measured and held accountable, then the survey is disingenuous at best, and unethical at worst.

7 Methods for Improving Your Survey Process

I recommend the following as methods to improve the “Voice of the Customer” survey process:

  1. All business leaders using the customer service survey data must be able to answer the why, what, when, who, and how questions clearly to all who ask about the survey tool.
  2. If your survey mixes sections about the customer-facing employee with questions about the wider process, the customer-facing employee’s responsibility begins and ends with the questions specifically about their performance (see the sketch after this list). You cannot hold a customer-facing employee responsible for broken back-office processes!
  3. Revise, review, and research the data collected. Ask hard questions of those designing the survey. Know the answers and practise responding to those asking questions.
  4. Get the customer-facing employees involved in shaping what needs to be measured in a survey, and have them inform and answer the why. You will be surprised at what is discovered.
  5. Use the data to build, teach, train, coach, and mentor, not fire, customer-facing employees. Support your people with data, don’t destroy them.
  6. If you cannot explain a process or procedure in an elevator speech, the process is too complicated. No matter what the process is, use the elevator as a tool to simplify your business organization, processes, procedures, and tools.
  7. Be a customer! As a customer, ask tough questions, drive for answers that work, and if the customer-facing employee is struggling, train through the chain of command!
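
To make Method 2 above concrete, here is a minimal sketch of how survey responses could be scored separately for what the customer-facing employee controls and what the back office controls, so a broken process never drags down the rep’s rating. The question IDs, groupings, and the 1–5 scale are illustrative assumptions, not part of any particular survey package.

    # Minimal sketch (Python): score a survey response per responsible area.
    # The question IDs, groupings, and 1-5 scale below are assumptions for illustration.

    # Hypothetical mapping of question IDs to the area actually responsible for them
    QUESTION_OWNERS = {
        "q1_rep_courtesy": "sales_rep",
        "q2_rep_knowledge": "sales_rep",
        "q3_finance_wait_time": "back_office",
        "q4_service_handover": "back_office",
    }

    def split_scores(response):
        """Average the 1-5 ratings per responsible area, not per survey."""
        buckets = {}
        for question_id, rating in response.items():
            owner = QUESTION_OWNERS.get(question_id)
            if owner is None:
                continue  # skip questions with no clear owner
            buckets.setdefault(owner, []).append(rating)
        return {owner: round(sum(ratings) / len(ratings), 2)
                for owner, ratings in buckets.items()}

    # Example: a glowing rep score alongside a poor back-office score
    print(split_scores({
        "q1_rep_courtesy": 5,
        "q2_rep_knowledge": 5,
        "q3_finance_wait_time": 2,
        "q4_service_handover": 1,
    }))
    # Prints: {'sales_rep': 5.0, 'back_office': 1.5}

Reporting the two averages separately is what lets leaders defend their people while still fixing the process.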

Never forget, the value of the “Voice of the Customer” survey is found in actionable data: improving cohesion between the front and back office, providing training talking points, and bringing customers back to your business.

Anything else promised is, in my opinion, smoke and mirrors, a fake, a fallacy, and a sales ruse.

Thanks to Dave Salisbury, an Operations and Customer Relations Specialist, for putting together this article.

Author: Dave Salisbury
Reviewed by: Jonty Pearce

Published On: 2nd Sep 2019 - Last modified: 16th May 2024
Read more about - Customer Service Strategy
