Poll Findings And How To Report Them

The BBC censor political poll findings. They say they don't, but their journalists and editors complain privately that they do, and the evidence is there, from the Today programme to What the Papers Say to the news broadcasts. They have thrown the baby out with the bath water, ignoring the only systematic and objective measure of British public opinion and replacing it with vox pops, phone-in ('voodoo') polls, interviews with party spokesmen and their own spin.

Some papers, notably Andreas Whittam Smith's Independent, did away with polls after the last election. But polls are still news, and will be covered by most news outlets, commented on even by the BBC, and taken seriously by the politicians -- whatever they say about poll findings being 'absolute rubbish', as the Prime Minister did recently when he compared our poll for The Times to the party's private polling, which Central Office then refused to quantify. The Independent has also, within the last month, started commissioning polls again (from Harris Research).

Journalists often believe the cliché that 'people lie to pollsters'; they didn't at Wirral South. MORI found 35% said they intended to vote Tory: 34.4% did on the day.

The polls in the 1992 British General Election have a lot to answer for. Apart from the most exciting election night in a long time, the five major pollsters, Gallup, Harris, ICM, MORI and NOP, were more wrong, by a wider margin, than ever before.

The press and the public expected better of them: after all, MORI has been within one percent of the share for each party in five of the last six general elections. But not in 1992. To put the error in perspective, however, if one Tory voter in 200 had voted for the second party in his or her constituency, or if one Tory voter in 100 had not voted at all, the result would have been a hung Parliament. An exhaustive two-year Inquiry by a committee set up by the Market Research Society, consisting of pollsters, academics and other experienced market researchers not involved in opinion polling, reported in July 1994. I have discussed their conclusions and recommendations before in BPO, but I hope regular readers will forgive me if I rehearse them again, as they hold a great many lessons for both pollsters and poll readers:

  • It was not the use of quota sampling or the method of interviewing
  • It was not sample size
  • It was not the overseas vote
  • It was not question wording
  • There was no evidence of widespread lying

The MRS Inquiry isolated three root causes of the discrepancy between the poll findings and the final result. By tackling these causes, we believe we can prevent the problems recurring and that the polling industry can accurately measure the 1997 election and justify the media in reporting our findings seriously.

The first of these causes was late swing. Polls are snapshots at a point in time, and that point is when the fieldwork was done, not when the results were published. If, after that, voters change their minds, if the 'don't knows' decide to vote after all, or if one party's supporters become so apathetic that they stay at home, the polls will "get it wrong".

Late swing wasn't the only problem (or the exit polls would have been spot on), but no-one at the time doubted that it was happening. Forgotten now are the headlines on election day: "LATE SURGE BY TORIES CLOSES GAP ON LABOUR IN FINAL HOURS OF CAMPAIGN" was The Times' banner; "TORY HOPES RISE AFTER LATE SURGE" was the headline over the "splash" in the Guardian. In The Daily Telegraph it was "TORIES NARROW GAP", and in the Financial Times the banner read "OPINION POLLS INDICATE LAST-MINUTE SWING FROM LABOUR TO TORIES", while the Daily Express trumpeted "TORY SURGE: POLLS SHOW LATE BOOST FOR MAJOR".

Labour's peak came on "Sheffield Wednesday", eight days before polling day, with published leads of 4, 6 and 7 points: sufficient to give them an overall majority. But Labour's triumphant Sheffield Rally proved the beginning of the end for Labour and its leader Neil Kinnock. From that point on, it seems to have been downhill. The Conservatives spent nearly all of their advertising money in the final three days (at a weight greater, annualised, than the spending of Procter & Gamble or Unilever on soap powder), levelling all of their guns at the danger of Liberal Democrat voters "letting Labour in". The testimony of the Liberal Democrats' campaign manager was that this did great damage to their support in the final hours of the campaign. The Tory tabloids did all they could on their front pages, as well as in their leader columns, to ensure the Conservatives were returned to power. Right to the end of the election, the proportion of "floating voters" was higher than ever before.

Of course, we can't stop voters changing their minds, so the possibility of late swing will always be a hurdle that pollsters must clear; but by polling as late as possible, by bearing the possibility of late swing in mind, and by asking questions intended to detect factors such as likely differential turnout, we can hope it will not normally catch us out.

Investigation has also made it plain that there was differential refusal: Conservative supporters were less likely to reveal their loyalties than Labour supporters. This certainly operated through the reluctance of some of those interviewed to reveal their voting intentions, both by outright refusal and by answering "don't know". A similar but numerically more significant effect probably operated through the refusal of some to be interviewed at all, although there is no solid evidence to support this. Consequently the samples interviewed were tilted towards Labour, and Conservative support was under-estimated. This probably arose through the operation of what has been described as "the spiral of silence" (Noelle-Neumann, 1984): Conservatives, feeling their party was unfashionable and that they were outnumbered, were more reluctant to disclose their loyalties.

The third cause that appears to have contributed to the error was inadequacies in the sampling. This arose partly because the quotas set for interviewers and the weights applied after interviewing did not reflect sufficiently accurately the social profile of the electorate at the time of the election; we are now using more accurate, up-to-date sources for setting quotas and weighting. It also arose partly because the variables used in setting quotas and weights were not correlated closely enough with voting behaviour to ensure that the polls reflected the distribution of political support among the electorate.
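The weighting step described above can be illustrated with a minimal cell-weighting (post-stratification) sketch: each demographic cell in the achieved sample is scaled up or down so that its weighted share matches its share of the electorate. The cells and all the figures below are invented for illustration, not real 1992 quota data.

```python
# Minimal cell-weighting (post-stratification) sketch.
# All figures are hypothetical, not real 1992 polling data.

# Target profile: each cell's share of the electorate
population_profile = {"ABC1": 0.47, "C2": 0.27, "DE": 0.26}

# Achieved profile: each cell's share of the interviews actually obtained
sample_profile = {"ABC1": 0.42, "C2": 0.29, "DE": 0.29}

# Weight per respondent in a cell = target share / achieved share,
# so under-represented cells are scaled up and over-represented cells down
weights = {cell: population_profile[cell] / sample_profile[cell]
           for cell in population_profile}

for cell, w in sorted(weights.items()):
    print(f"{cell}: weight {w:.3f}")
```

The catch the Inquiry identified is visible even in this toy version: weighting can only correct on the variables used to set it, so if those variables are weakly correlated with vote intention, a politically skewed sample stays skewed after weighting.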

MORI has put considerable effort since 1992 into correcting this fault, and has introduced important improvements in its Omnibus polls, including the political poll series for The Times, reported in BPO.

How poll findings ought to be reported

Those who report poll results have a responsibility to their readers and viewers, and to their own profession of journalism -- as well as to the pollsters who carried out the survey. This means accuracy in reporting the findings; completeness, so that the information reported is not so divorced from other information that it is misleading; giving the basic information of the precise question wording, sample size, fieldwork dates and so on, so that the reader can be confident the poll was carried out according to proper procedures; and, to the best of the journalists' ability, relating the poll results to other known information about the subject of the survey. The basic points are:

  • Report the shares, not the gap. (Most of the fluctuations are between Labour and the Lib Dems; the Tories have been becalmed in the high 20s and low 30s for two years -- if the Tory share doesn't get above 40, they've had it)
  • Report the fieldwork dates (what is important is when the people were interviewed, not when the poll was published)
  • Report the sample size (the more the better, and the more sampling points the better)
  • Report the full question wording (language is important; pollsters work hard to ask unbiased questions)
  • Report in integers (polls are never accurate to decimal points)
  • Report the pollster (watch the brand names you know; the others may be fronting for a party or pressure group)
  • Ignore the 'voodoo' polls (they are inevitably biased and usually 'fixed')
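The advice to report the sample size and to round to integers both follow from simple sampling-error arithmetic. A sketch, using the standard 95% margin-of-error formula for a proportion from a random sample (quota interviewing and weighting in practice add further error on top of this):

```python
import math

def margin_of_error(p, n, z=1.96):
    """Approximate 95% margin of error for a proportion p from n interviews."""
    return z * math.sqrt(p * (1 - p) / n)

# A party share of 40% from a typical poll of about 1,000 interviews
moe = margin_of_error(0.40, 1000)
print(f"+/- {moe * 100:.1f} points")  # roughly +/- 3 points
```

With errors of around three points on a 1,000-person sample, quoting shares to a decimal place implies a precision the sample cannot support -- hence "report in integers".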

Sincerely

Sir Robert Worcester

This letter is adapted from an article published in UK Press Gazette on 7th March 1997. Sir Robert Worcester's monograph, 'A Journalist's Guide to Understanding and Reporting Public Opinion' is available free from MORI.
