So far in this guide we’ve covered the basics – what is Net Promoter and how do you calculate it?
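As a quick refresher on that calculation: the score is the percentage of promoters (customers scoring 9 or 10) minus the percentage of detractors (those scoring 0 to 6). Here’s a minimal sketch in Python – the function name and sample scores are purely illustrative, not part of this guide:

```python
# Minimal sketch of the standard NPS calculation:
# promoters score 9-10, detractors score 0-6, passives (7-8) don't affect the score.
def nps(scores):
    """Return the Net Promoter Score for a list of 0-10 ratings."""
    if not scores:
        raise ValueError("need at least one response")
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100 * (promoters - detractors) / len(scores)

print(round(nps([10, 9, 9, 8, 7, 6, 3])))  # 3 promoters, 2 detractors, 7 responses -> 14
```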
But you can’t calculate your score without getting some responses, so we’ll show you, with examples, how to put together an NPS survey that will give you actionable data and a fantastic response rate, and how to send it out to your customers.
This might seem painfully obvious – but when we help people with their NPS projects, this is where we start, because making mistakes here will cause problems that undermine everything else.
Think of it as a house – if you don’t take time to build good, solid foundations, it doesn’t matter how you build the roof, what wallpaper you pick, or how many bathrooms you have – the house is going to fall down.
NPS was designed as a way of measuring loyalty. But, surprisingly, the biggest mistake we see people make is… Designing NPS surveys to… measure loyalty.
That’s not a typo.
You take your car to a mechanic, and they use a machine to measure its emissions. You go to the doctor, and they use a gizmo to measure your pulse. And you use a thermometer to measure the temperature.
But there’s a huge difference between a satisfaction survey and a thermometer: a thermometer (sadly) can’t make you warmer, but a feedback form can make or break customer experience in two ways – through the experience of filling it in, and through what you do with the answers afterwards.
No customer, anywhere, wakes up in the morning feeling a burning need to have their loyalty measured.
So if the primary objective of your NPS survey is measuring loyalty, you’re not going to get great results. However much you try to hide it, all your decisions will give priority to ‘measuring loyalty’ ahead of anything else (‘delighting customers’, ‘solving problems’, etc). Your customers will see through you.
Instead, the objective of your NPS survey should be to provide a brilliant customer feedback experience. It’s fine to measure loyalty as well, but it’s counterproductive to measure loyalty in a way that might decrease it.
You can’t improve the overall experience that customers have of your organisation if your customer experience improvement projects deliver a poor experience.
So…
Let’s start with the basics. It’s not an NPS survey without the NPS question: ‘How likely are you to recommend us to a friend or colleague?’, scored from 0 to 10.
But as we’ve just discussed, simply measuring loyalty isn’t a great idea. You need to set out to improve it. And to improve it, you need to get inside your customers’ heads.
The best way to do this is by asking a single, open-ended question. There are lots of different ways to ask it, so pick one that you feel works best for your circumstances.
But whatever you do, don’t overcomplicate things. Resist the temptation to ask twenty different open-ended questions: ‘How are you feeling about our prices today?’, ‘How do you feel about the agent you spoke to?’, ‘How would you characterise our hold music?’ and ‘Please describe the warm, fuzzy feeling you have after contacting us.’
Don’t make them think, and trust that if you give them that one textbox, they will tell you everything you need to know to retain their business.
Then, it’s up to you and your team to keep them happy by acting on what they tell you.
Conventional wisdom holds that NPS is a 2-question survey. We’re here to tell you that it’s OK to ignore conventional wisdom, and we’ve got the data to prove it.
At CustomerSure, we help the teams who use our platform put themselves in their customers’ shoes and build feedback processes that those customers love being part of.
When we help people put their surveys together, we recommend adding a small number of ‘diagnostic’ scored questions, in addition to the ‘overall satisfaction’ question.
So, for example, alongside the NPS question you might ask for a quick score on specific aspects of your service, such as speed of response.
It’s good to ask questions like these because they’re aligned with what your customers want from you. So customers don’t mind taking a fraction of a second out of their lives to give you a score.
There’s a “rule” from usability research which says that fewer questions on a form is always better – that every extra question will annoy the busy person completing your form and hurt your response rate. But this isn’t quite true.
As with most “rules”, reality isn’t so black-and-white. Yes, overwhelming a form with pointless questions that customers have no interest in answering will hurt your response rate. But when you add a few carefully-chosen questions that your customers have a stake in answering, here’s what happens:
If ‘speed of response’ is important to a customer, they trust that you’re listening, and they can tell you in under a second how good your speed of response is, they will.
So it’s OK to ask these extra questions – they aren’t going to upset your customers or kill your response rate.
In exchange you get a constant ‘temperature check’ on the aspects of your service which are most important to your customers, and the ability to quickly put things right if you notice standards slipping.
We’ve already touched upon the ingredients of a great survey email elsewhere in this guide, but let’s bring them together into a short, easy-to-follow recipe.
If you want a more comprehensive guide to getting the right response rate for your surveys, we have one of those too. And if you’d like to talk through any of the issues in this guide, or any of our other guides, with an expert, get in touch.