One of the most challenging questions I've ever had put to me came when I was rung up by the BBC World Service. "I'm the producer of the Africa Programme, and we have a `Letter of the Week' feature with a question from a listener in Nigeria that should interest you. I'll read it to you: `Your programme talks about opinion polls frequently. Tell me, what is an opinion poll?'"
Now that was a challenge. I thought long and hard about how I would describe public opinion to the listener. Public opinion, I finally decided, I could define as "the collective view of a defined population", and so tried to encapsulate in seven words the many nuances that the editors of the Oxford Dictionary took 842 words to convey. In its essence, public opinion polling (and market research uses the same techniques) I define as "the collective view of a [sample of a] defined population".
Public opinion is important. Why do I think this? Because the public does, and, even more importantly, acts on its beliefs. When I first got into this business of market research I asked a sample of British adults the extent to which they agreed or disagreed that "A company that has a good reputation would not sell poor products", and was astonished to have three people in four say they agreed. Now that obviously wasn't entirely true; all smart companies do test marketing, all companies `try it on the dog'.
Even more astonishing, over a third, 37%, said "I never buy products made by companies I've never heard of." Of course that was nonsense, but that is what they perceived!
So it's important to have good products and services, it's important to be price/quality competitive in the marketplace, whether running a fruit and vegetable stall or a bank (unless, of course, a monopoly). And it is important to know what your customers (and prospective customers) think.
That's where market research comes in.
I describe market research as the `marriage of the art of asking questions and the science of sampling'. It's a very simple business: all you have to do is ask the right questions, of the right sample, and add up the figures correctly.
The Art of Asking Questions
My favourite question to give students to critique was put to the British public in 1938: "Are you in favour of direct retaliatory action against Franco's piracy?" In just eleven words, five rules of good question construction were broken:
- Ask balanced questions: "Do you favour or oppose...?"
- Define your terms: one man's direct retaliatory action is a punch on the nose; another's is nuclear bombs.
- Use language in common usage: "retaliatory" would likely be misunderstood by many people, especially in 1938.
- Explain who's who: one wonders how many didn't know who Franco was.
- Eschew pejoratives: "Piracy"? What was surprising was that 22% were against taking action even with a loaded question like that one!
Here are some other tips on what to look for in questions:
- Are they clear? Read the question aloud. If you've forgotten the point by the time you get to the end, or if you stumble over the words, chances are others will as well.
- Does the question ask for a dual response? We call those `double-barrelled questions' (or even `triple'). All too often a question will be drafted to which a respondent can perfectly properly give two, or even three or more, answers.
- Is the question precise? A good survey question says precisely what object the item refers to, leaving no room for ambiguity. Here's an example of a problem question sometimes asked by researchers: "When did you buy your watch?" The question is incomplete in that it fails to tell the respondent which watch, if the respondent has more than one, and it may be that the watch in question was a gift.
- Is the time period defined? The period to which the question relates can be crucial to the respondent's answer. Time is a difficult concept for many people. For example: "Did you go abroad last year?" As well as being possibly imprecise as to whether the trip was on business or on holiday, does "last year" refer to the previous calendar year, the twelve months before the question was asked, or even, for those at or with children at school, the school year?
- Is the question loaded? Reputable market researchers have too much at stake in their work to be caught intentionally biasing a question. Special interest groups, however, sometimes have a vested interest in loading a question to get a certain result.
- Does the question assume knowledge on the part of the respondent which he or she might not have? Another common error in survey questions is assuming the respondents know something about what the question asks, with a resulting distortion of the extent and direction of public opinion. Questions about opinion towards advertising can provide excellent examples. For instance: "Are you more or less likely to buy the watch as a result of seeing the ad?" Those who had no intention of buying any watch would likely say less, which has nothing to do with seeing, or not seeing, the ad. Any question asking `more or less' can fall into this trap, yet you frequently see polls asking questions that empirical research has shown give nonsense answers.
- Does the question ask for a comparison that is meaningful to the respondent? If a question asks the respondent to compare something he or she knows with something unknown, the resulting answer will be meaningless. For instance: "Do you think that the price you are being asked to pay (for this or that TV) is fair in light of what other TVs are costing?" This question assumes a lot: not only that respondents know the level of their own local prices but that they have a basis of comparison that is meaningful. It may well be that the respondent has a view on this related to what she paid last year for her television, but has no idea of how it compares with other TVs on the market now.
- Is the question's meaning obscured by asking about a very complicated behaviour in simplistic terms? For example: "Where do you usually get most of your news about what's going on in the world today, from the newspapers or radio or television or talking to people or where?" Without knowing what type of news, it's hard to argue that this question has much meaning. Fashion news for the modern professional or business woman may come from a woman's magazine, while her business news comes from the Financial Times and the Economist, and her main source of national news is watching the ITN News at Ten.
- Does the question use a balanced scale? Years ago the market research manager of the British Post Office said to me that seven in ten people in Britain were satisfied with the postal service. When I saw the questionnaire, I saw why. He'd asked: "Are you very satisfied, satisfied, or dissatisfied?" It may have made his bosses happy, but it certainly gave them bad research.
- Is it a "yes/no" question asking for an attitude? We virtually never ask "yes/no" questions other than factual or behavioural ones such as "Do you normally wear glasses for reading?" In general, any question that has a yes/no answer is likely to inflate the favourable response to an item, regardless of whether the question itself is loaded. A more subtle form of loading combines prestige attachment or social desirability with a tendency toward `yea-saying' by some respondents.
- Are appropriate filter questions asked? There's little point in asking detailed political questions of someone who has just told you they are certain they would not vote in the general election and take no interest in politics. To do so invites muddy waters when the data are analysed.
- Are questions designed to get at intensity of feeling? Typically, opinion questionnaires are quite thorough in getting at direction of feeling (pro, con, or neutral). Often, though, effective public opinion, that is, the version of public opinion likely to be acted upon, is more accurately reflected in how strongly, or intensely, people feel about the issues.
The way conscientious researchers test questions is to try them out on their colleagues first, then on members of the public in a pilot test: asking respondents the question first, listening carefully for the response or any questions arising, then asking the respondent what he or she meant by their answer, and then their understanding of the question.
The Science of Sampling
The street corner survey that has been the basis of thousands of research reports typically uses sampling. It may even be erroneously called a "random" survey. People commonly use the word "random" to mean haphazard; the statistician uses it with precision to mean "having an equal probability of selection". Another important factor in survey sampling is to be certain the precise community, group, or class being talked about in the results is carefully defined. Internet surveys are becoming more popular, although those who are on the internet, and thus can be reached, are not `representative' of the population, being more middle class, more middle-aged, more educated and more white. If a researcher presents data from a telephone or an internet survey, good questions to ask are:
- "What steps were taken to deal with unlisted telephones, multiple email addresses, mobile phones, engaged or `dud' email addresses, etc.?"
- "What is the bias inherent in leaving people who are not on the telephone/internet out of the sample?"
- "How was the effect of this bias dealt with?"
- "What was the refusal rate, by subgroup of the sample?"
- "How was the effect of differential refusal dealt with?"
- "Were the questions suitable for asking over the telephone/internet?"
- "What is the record of the market research firm? You get what you pay for."
- "How many questions were asked, and is there an `order' bias whereby questions asked earlier influence the result of questions asked later in the questionnaire (sometimes called `position bias')?"
- "What techniques were used to get around the inability to use show cards, illustrations or other visual prompts (on the telephone), or position bias (on the internet), that is, the effect of respondents seeing, where this is not provided for in the `script', the questions coming next?"
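The statistician's precise sense of "random", every member of the defined population having an equal probability of selection, can be illustrated with a short sketch. The population and sample size here are hypothetical, purely for illustration:

```python
import random

# A defined population: hypothetical ID numbers standing in for the
# members of "a defined population" (illustrative only).
population = list(range(10000))
sample_size = 1000

# random.sample draws without replacement, giving every member an equal
# probability of selection (1,000 in 10,000, or 1 in 10). This is the
# statistician's meaning of "random", as opposed to haphazard selection
# on a street corner, where passers-by at that time and place have a far
# higher chance of selection than everyone else, and most people none.
sample = random.sample(population, sample_size)

print(len(sample))       # 1000 respondents selected
print(len(set(sample)))  # 1000: no member selected twice
```

A street corner interviewer, by contrast, cannot state any member's probability of selection, which is why such a survey should not be called "random" however haphazard the stopping of passers-by may feel.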
Survey research is widely misunderstood. It can provide understanding, analysis and tracking of the behaviour, knowledge, opinions, attitudes and values of the public. By measuring these, within the limits of the science of sampling and the art of asking questions, surveys can determine what people do and what they think.
Sir Robert Worcester is the Founder of MORI.