What’s the problem?
I see the same mistakes – as well as some new ones – being made over and over again in surveys and questionnaires. Most businesses and other organisations depend on surveys to a greater or lesser extent. They use them to find out what customers think of them, what products they should be developing, what issues need fixing, and so on. Yet often those organisations are not getting accurate information. If survey questions are confusing or ambiguous, or constrain answer choices, the organisation gets a skewed view of responses. It’s like a political poll asking, say, ‘Which candidate do you like?’ rather than ‘Who do you intend to vote for?’ What do you actually want to know?
My advice is that if you are responsible for a survey of any sort, spend some time getting yourself up to speed with what makes for a good survey, and what some of the pitfalls are. It’s easy when you’re familiar with a topic to ask questions that respondents won’t necessarily understand, and it takes some self-discipline and customer knowledge to avoid the problem.
You can’t always entirely trust the ‘experts’ either. I’ve had many a debate with professional purveyors of surveys about their proposed wording for questions, as I’ve felt that they were
- unclear to my particular customers
- too similar to other questions
- not offering adequate response choices
You need at least to be able to judge whether the professional advising you really knows their stuff.
It pays to test a survey on a small sample before general release – and that means talking to people, and understanding how they interpret the questions, and whether it’s the interpretation that you intended.
There are many books on how to write questionnaires and surveys. One that I’ve read and can recommend is
Don’t expect a riveting read, it’s a textbook. But it does cover the ground with good examples.
Now I’ll describe how I decided to approach a survey question at British Airways that was more complicated than it seemed, and then I’ll give some examples from recent surveys that I’ve filled in.
The British Airways question
I was working on the wording for the ba.com site survey, and ended up with some convoluted logic. It wasn’t convoluted to the people filling it in (hopefully), as the sequence would make sense to them. It didn’t make a lot of sense though to colleagues and others who reviewed the questions, and I had to defend the structure many times.
When customers filled in the feedback survey on ba.com, we wanted to know
- whether they were a member of the Executive Club (the frequent flyer scheme, abbreviated to EC)
- if so, which tier they were on (Blue, Bronze, Silver or Gold)
- if not, whether they were registered with a site login
- or whether they weren’t registered at all
We could have gone with this:-
- An Executive Club (EC) Member
- Registered on ba.com
- Don’t know
The problem with this is that some people don’t know if they are EC members. Those who are members generally know it, as they’ve been through the process of joining, but others could well ask, “How do I tell if I am?”. They might think that just registering on the site, or buying a plane ticket, would make them members. Within BA it came as a surprise to some that there could be this confusion.
It would be a little better to have
- An Executive Club Member
- Registered on ba.com (but not an Executive Club member)
- Don’t know
The problem would still remain, though, that if you were either an Executive Club member or registered, but weren’t sure which, you would answer ‘don’t know’ – and then we wouldn’t know whether you were registered at all.
It could also confuse some Executive Club members, who might reasonably think that they are both a member and registered.
What we went with was this.
<Do you have a login for ba.com?>
If the customer said no, we skipped to the next question; if they said yes, we then asked
<Are you an Executive Club Member?>
If they said no, then they were registered but not an EC member. If they said yes, we asked which tier they were.
It’s still not perfect, but it does at least mean that we got better quality results on whether people were registered or not (without having to interpret what ‘registered’ means).
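The branching above boils down to a simple decision tree. Here’s a minimal Python sketch of it – the function and category names are my own illustration, not anything from BA’s actual implementation:

```python
def classify_respondent(has_login, is_ec_member=False, tier=None):
    """Classify a survey respondent using the branching described above.

    Asking about the login first avoids the ambiguous choice between
    'registered' and 'member': everyone can answer yes/no to having a login.
    """
    if not has_login:
        return "not registered"
    if not is_ec_member:
        return "registered, not an Executive Club member"
    return f"Executive Club member, {tier} tier"
```

So `classify_respondent(True, True, "Silver")` gives “Executive Club member, Silver tier”, while someone with no login is classified without ever being asked what ‘registered’ means.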
One of the most frustrating things I see in a survey is when none of the answers apply. I’ve lost track of the number of times I’ve told people that they need an ‘other’ option. Admittedly, sometimes there is a genuinely closed set of potential answers – either you bought something on this visit or you didn’t – but such cases are rarer than survey authors tend to assume.
I filled in a survey after having attended a Rock weekend at Butlins (it was great). I answered a question saying that the experience could have been better, and then I was presented with this question asking what could be improved.
The problem is that my reason wasn’t any of these things, yet the only way to progress was to pick one. So I did. If they had offered an ‘other’ option with a text box, I would have let them know that the behaviour of some other guests who’d had too much to drink had been annoying. But they’ll never know.
Butlins also asked this –
The drop-downs were the same for each, and showed this –
I would expect Butlins to have a good handle on what matters to their customers in general, but for me, Internet access can be a deal breaker for a holiday, and it’s not in the list. My wife always wants to know if there is a hairdryer in the room. These may not be our number one issues, but if you’re asking for choices as far down as number five, then you risk missing things that matter.
First Great Western (FGW) ask about reason for travel.
I think it’s reasonable to assume that business, commuting and leisure account for the majority of train journeys. But what if you’re travelling to a funeral, or for some other reason? It may be a small enough proportion that FGW think it’s not worth making the survey more complex by adding an ‘other’, which they are entitled to decide. But each time a respondent has to think harder about a question, it’s another point at which they are likely to drop out.
It’s fairly common on a site survey to ask what the purpose of visiting was (again, it can be problematic to assume you know all the answers), and then to ask whether you were successful.
Maplin and First Great Western both use Foresee to serve their surveys, yet each takes a different approach.
You can see that Maplin offer a ‘partial success’ option, which FGW don’t. It’s likely that on many sites a significant proportion of customers will be partially successful. With FGW, I might have come to buy a train ticket, and did so, but not at the price or the time that I wanted. I count that as a partial success. By offering only the binary choice, the survey forces customers to make a qualitative judgement about which way to vote. When that happens, the survey owner loses useful information – especially if the customer votes for ‘success’, as then you don’t even know there was an issue. You can still ask ‘was there anything else that would have improved your experience today?’, but then you have a pile of verbatims, and the issues are lost from the headline reporting of the success question.
I filled in a Which? survey about my car. This was one of the questions, asking how I financed the purchase.
As with any web text, survey respondents don’t necessarily read the detail of each question. They will scan, and stop at the first answer they think applies to them. In this list, the choices for ‘Personal Contract Hire’ (the first option) and ‘Personal Contract Purchase’ (further down) are quite similar, and unless you are a wizard on car finance you have to read the detail to understand the difference. I suspect that Which? are going to get more responses to the first option than actually apply. You’ll get people like myself, who have ‘Personal Contract Purchase’, reading just enough to decide that the first (and wrong) choice applies to them.
In such cases, the two options should be next to each other. It doesn’t entirely solve the problem (it would further help if the order was randomised), but there’s more chance that people will spot the alternative, rather than just going with something that looks close enough.
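Randomising answer order is straightforward to implement. Here’s a small Python sketch – my own illustration, not any particular survey tool’s code – which shuffles the substantive options while pinning catch-all choices like ‘Other’ to the end, where respondents expect to find them:

```python
import random

def randomised_options(options, pinned_last=("Other",)):
    """Return answer options in random order, keeping catch-all options
    (e.g. 'Other') at the end so respondents can still find them."""
    to_shuffle = [o for o in options if o not in pinned_last]
    pinned = [o for o in options if o in pinned_last]
    random.shuffle(to_shuffle)  # shuffles the list in place
    return to_shuffle + pinned
```

Each respondent then sees the similar-looking options in a different order, so no single option systematically benefits from being scanned first.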
Time and again I find myself filling in a survey and thinking ‘what do they mean by that?’. Often these will be technical questions, or ones requiring a subjective judgement where no guidance is offered.
On Google Maps I often answer questions about places I’ve visited. It seems that Google has a standard set of questions, some of which do puzzle me.
What you and I consider to be ‘trendy’ can vary. Google may be ok with this, but I usually just go for ‘not sure’.
I always struggle with this one. Is a more expensive pub ‘upscale’? There are probably venues that clearly are, like the Ritz, and those that clearly aren’t, like McDonald’s, but where’s the line?
Shouldn’t the question be ‘Is this place popular with travelers?’ Each time I see this I have to stop and think about what it means. And anyway, how can I tell who is a traveler? Does it mean tourists? People just in transit?
Google seem to be experimenting with images as well. Here’s one question I was asked.
I can well imagine that Google could be experimenting with automating image choices. But while I’m being asked which image is more ‘helpful’ (what does that mean? Should it be ‘representative’?), the picture on the left is of Windsor rather than Slough, and the picture on the right is of some offices just outside Slough. I don’t think either is ‘helpful’, although the one on the right is at least of Slough.
There could be some rhyme and reason to all this. All I’m doing here is pointing out some of the confusion these things cause to me, and readers can decide for themselves whether it’s useful or relevant.
I filled in a Which? survey about pet insurance. We have a cat. The survey asks what type of cat it is.
As you can see, the response is selected with a checkbox – but unlike radio buttons, checkboxes are not mutually exclusive. This doesn’t make sense, as the cat can only be of one type, and if you select more than one type you get an error message.
This could easily be avoided by using radio buttons. Most people aren’t going to pick multiples, but if you have a cross-breed you might tick two boxes, or you might tick one and then spot a more accurate description and go for that. The question also doesn’t say that only one choice can be made.
Viking sell office supplies. They also use Foresee to serve their survey. At the end of the survey this is what you see.
It’s good that it says thanks, but where do you think ‘Contact Us’ links to? I’d assumed it would allow me to contact Viking, as I was answering their survey, but it actually takes you to the Foresee site.
Many years ago we discovered that some of our customers were contacting our survey supplier under the mistaken impression that they were contacting us. Worse, the supplier was responding directly, rather than passing the messages back to us. It needs to be clear who the contact is with. Own your own survey.
A couple of positives
I’ll finish off by pointing out a couple of positive things I’ve seen.
This is a good(ish) sign-off from Butlins, thanking the customer for taking the time. It’s a shame that the message about entry into a prize draw is so small it’s barely readable. More could be made of it – and a happy picture would add to the experience.
Customers who respond to a survey may be inclined to help out with further research. FGW ask if customers are willing to do so, and it’s possible to build up quite a database of willing participants that can be segmented by their survey responses. The wording could be tightened up and made a bit more visually appealing, though.
Finally, from the Which? survey on pet insurance, there’s a question about the age of the cat. It’s good that there is encouragement to answer approximately if you’re not sure. It gives that bit of permission not to sit and agonise about being precise.
This question reminds me of applying for car insurance years ago. Many insurers asked for the date when your licence was issued. In fact, all they were interested in was whether it was issued more than a certain number of years ago. It would have made my life easier if they had just asked that.