2nd September 2019
Dave Salisbury draws on a recent experience to highlight some key considerations and methods for building the ideal customer survey.
Nissan sent me a post-sale survey after I bought a car. I answered the questions honestly, and the sales rep did a great job. Yet the people servicing and supporting the sales rep did some things I was not pleased with.
I was specific in the comment section: I praised my sales rep and was particular about who dropped the ball, where, and the issues it created.
Not 30 minutes after completing the survey, I received a call from a senior director at the Nissan dealership, who said that my sales rep was going to be fired and would lose all his commissions for the month, because the survey is solely his responsibility.
The senior director went on to say: “This is an industry-wide practice and cannot be changed.”
As a business consultant, I have long fought against "Voice of the Customer" surveys that measure things a customer service person, salesperson, or other frontline customer-facing employee does not control.
If the customer-facing employee does not control all the facets that create a problem, then the survey should only be measuring what can be controlled.
A car salesperson has a challenging job and relies upon a team to help close the deal, including a service department, a finance department, sales managers, and more. Yet all the blame for problems in the back office falls upon the sales rep.
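The principle above, that a survey should only measure what each person or team actually controls, can be sketched in code. The following is a minimal, hypothetical illustration (all question texts, team names, and the 1-5 rating scale are my own assumptions, not Nissan's or anyone's real survey):

```python
# Hypothetical sketch: map each survey question to the team that actually
# controls that outcome, so credit and blame land on the right desk
# instead of falling entirely on the sales rep.

SURVEY_ITEMS = {
    "Was the sales rep knowledgeable and courteous?": "sales_rep",
    "Was your financing paperwork handled promptly?": "finance",
    "Was the vehicle clean and ready at delivery?": "service",
    "Did the sales manager approve your offer quickly?": "sales_management",
}

def scores_by_team(responses):
    """Aggregate 1-5 ratings per owning team, not per salesperson."""
    totals, counts = {}, {}
    for question, rating in responses.items():
        team = SURVEY_ITEMS[question]
        totals[team] = totals.get(team, 0) + rating
        counts[team] = counts.get(team, 0) + 1
    return {team: totals[team] / counts[team] for team in totals}

responses = {
    "Was the sales rep knowledgeable and courteous?": 5,
    "Was your financing paperwork handled promptly?": 2,
    "Was the vehicle clean and ready at delivery?": 3,
    "Did the sales manager approve your offer quickly?": 4,
}
print(scores_by_team(responses))
```

In this sketch, a low finance score points at the back office rather than the sales rep, which is exactly the attribution the dealership in the story above failed to make.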
Quantitative data is useful but means nothing without proper context, support, purpose, and a properly designed survey analysis procedure.
Even with all those tools in place, quantitative data can at best be misconstrued, confused, and convoluted by the researcher, the organization paying for the research, or the bias of those reading the research report.
Qualitative data is useful, but the researcher’s bias plays a more active role in qualitative research. Qualitative research suffers the same problems as quantitative research for many of the same reasons.
Regardless, quantitative and qualitative data does not prove anything. The only thing qualitative and quantitative data does is support a conclusion. Hence, the human element remains the preeminent hinge upon which the data swings.
This leads to some questions that every business sending out a "Voice of the Customer" survey instrument needs to investigate and answer continuously…
Even with all this taken into consideration, business leaders making decisions about "Voice of the Customer" survey data need to understand that one person can make or break the service/sales chain, but it requires a team to support the customer-facing employee.
As Joseph M. Juran once remarked: “When a problem exists, 90% of the time the solution is found in the processes, not the people.”
Hence, when bad surveys come in, defend your people and check how your business is doing business, i.e., the processes.
The dynamics of "Voice of the Customer" survey instruments require one more consideration: delivery.
AT&T recently sent me a "Voice of the Customer" survey via text message, collecting only the barest numerical (quantitative) data: three text messages, three data points, none of which got to the heart of the customer issue, barely rating the salesperson in the AT&T store I had previously visited.
Recently, I received a call from Sprint, where the telemarketer wanted to know if I wanted to switch back to Sprint and why.
The nasal voice, the rushed manner, and the disconnected mannerisms of the telemarketer left me with strong negative impressions, not about the telemarketer, but about Sprint.
Nissan sends emails, and while the data collected has aspects of the customer’s voice (qualitative) and numerical rankings (quantitative), my impressions of Nissan have sunk over the use of the survey to fire hard-working sales professionals.
My previous bank, Washington Mutual, had a good, not great, “Voice of the Customer” survey process, but the customer service industry continues to make the same mistakes in survey delivery and application.
The how and why in “Voice of the Customer” surveys, or the delivery and use of the survey data, leaves a longer-lasting impression upon the customer than the actual survey.
So, if you are a business leader who has purchased an off-the-shelf "Voice of the Customer" survey analytics package, and you cannot explain the how, why, when, where, what, and who of it in an elevator pitch, the problem is not with the customer-facing employees doing your bidding.
If your back-office people supporting the customer-facing people are not being measured and held accountable, then the survey is disingenuous at best, and unethical at worst.
I recommend the following as methods to improve the “Voice of the Customer” survey process:
Never forget: the value of the "Voice of the Customer" survey is found in actionable data, in improved cohesion between the front and back office, in training talking points, and in the power to return a customer to your business.
Anything else promised is, in my opinion, smoke and mirrors, a fake, a fallacy, and a sales ruse.
Thanks to Dave Salisbury, an Operations and Customer Relations Specialist, for putting together this article.
Reviewed by: Jonty Pearce