from British Public Opinion, April 1999
30 April 1999
Forgive me if I return to the subject of Voodoo Polls. Normally I wouldn't, but these wretched things that some people describe as 'just a bit of fun' can be serious when there's a war on. As a presenter on BBC TV said last weekend, the war for public opinion on the war in Kosovo may turn out to be as important as the war itself. The first Sunday of the war there were three opinion polls published in three Sunday newspapers, carried out by three different polling organisations, asking three different question formulations about the use of British bombers in the NATO strike force, using three different sets of telephone interviewers, using three different but all legitimate sampling methods, and all coming up with the same result: 2:1 public support for the use of British airplanes to bomb Kosovo.
Just what you'd expect? Well, not Sunday Business, which did not commission its own poll, but wrote a leader about how public opinion was against the war, based on a phone-in poll on Talk Radio in which 84% of those taking part opposed the war! That followed parallel pieces the day before in The Times and the Guardian, which, without benefit of proper poll findings, also shot from the hip, The Times's headline reading 'Backlash in Britain against Bombing', largely based on letters to local newspapers and interviews with local newspaper editors. So far The Times has not reported to its Monday readers the poll findings from the Sunday newspapers that commissioned them.
Paul Routledge, writing in the Mirror on Saturday, April 10, was also taken in. 'Public opinion is divided about Britain's war in the Balkans, and may be turning against the NATO bombing. That is the message coming through from your letters to me over the last two weeks.' Reminds me of Tony Benn announcing in the House of Commons during the first weeks of the Falklands War in April 1982 that 'public opinion is swinging massively against the war', waving a sheaf of letters that had been sent to him. The following day we published a poll in the Economist showing that 78% of the British public supported sending the task force to the Falklands.
What is it about some journalists and politicians that leads them to set up polls as some sort of straw men, to be toppled by letters to newspapers or phony phone-ins, no matter how often the validity of these alternatives has been tested and shown to be wildly out of line with reality?
The usually reliable Economist recently published an article setting national poll findings against local election and by-election results, suggesting as one explanation that 'it could be that the national polls are just wrong', as if any pollster would claim that polls asking how people would vote in a General Election were any predictor of either local elections or by-elections. Actually, if we want to measure public opinion in local elections, we ask people who live in the areas where local elections are being held how they intend to vote in those local elections. The same goes for by-elections.
Further, the writer went on to state that 'at the 1997 general election, their average error was 4.4%', referring not to the error on each party's share of the vote but to the error on the gap between the parties, thus doubling the true figure. The average error on the share was 2.2%, well within the statistically acknowledged margin of plus or minus 3% that sampling theory says should contain the true figure 95 times in a hundred, on average.
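The arithmetic behind this complaint is simple: if a poll overstates one party's share by a couple of points and understates the other's by the same amount, each share is off by that amount, but the gap between them is off by twice as much. A minimal sketch, using illustrative numbers rather than the actual 1997 figures:

```python
# Sketch of share error vs. gap error (illustrative numbers only,
# not the actual 1997 poll or election figures).

def share_errors(poll, result):
    """Absolute error on each party's share, in percentage points."""
    return {party: abs(poll[party] - result[party]) for party in result}

def gap_error(poll, result, a="Labour", b="Conservative"):
    """Absolute error on the gap (lead) between two parties."""
    return abs((poll[a] - poll[b]) - (result[a] - result[b]))

result = {"Labour": 43.0, "Conservative": 31.0}  # hypothetical outcome
poll   = {"Labour": 45.2, "Conservative": 28.8}  # each share off by 2.2 points

# Each share is off by 2.2 points, within a +/-3 point sampling margin,
# yet the gap is off by 4.4 points: the two share errors add up.
errors_on_shares = share_errors(poll, result)
error_on_gap = gap_error(poll, result)
```

Judging the poll by the 4.4-point gap error, rather than the 2.2-point share errors, is what makes a result well within normal sampling tolerance look like a failure.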
Throughout the last eight elections I have pleaded with the BBC and the papers to watch the share, not the gap, yet even such experienced political journalists as the Economist's still insist on reporting this potentially misleading measure. The danger was well and truly exposed at the last election, when reports of the closing of the gap led the public to think the Conservatives were catching up, whereas the closing was in fact the result of an improvement in the Liberal Democrats' share. This was predictable, and yet those who followed the gap were surprised by Labour's landslide, and by the doubling of Lib Dem seats in the House of Commons.
The Economist article went on to talk about academics who 'comprehensively' survey, and 'leading psephologists' who compare voting in secondary elections and who find, surprise, surprise, that national opinion poll results are not much of a guide to the outcome of by-elections. They have never claimed to be.
Even worse are the phone, fax and now internet scams, asking people to 'vote' in phone, fax or internet 'polls' which charge up to £2 a minute, or a page, to take part in a 'poll' whose results will then be reported, or sent to No. 10 or the White House, as if anyone in either would be taken in by them.
There is one Julian White who has a web site which invites people to take part in 'opinion polls', using the term very loosely. Recent findings from another, 'The Nation's Barometer', include the claims that 'Teletext viewers' believe MPs don't deserve a pay rise (99%), that parents must have the right to smack their children (95%), and that the death penalty should be re-introduced (85%). They may or may not be right, but they are certainly not representative! They claim that 'over two million viewers have registered their votes on subjects from sex to Sunday shopping', conceding that they don't know what impact these have made, but noting that their findings have 'even been used by MP John Redwood in seeking election for the Tory party leadership'. So there.
Finally, I was amused to see the 'To Our Readers' in TIME the week before last, where Chris Redman had a moan about the Turkish students who programmed their computers to 'bot', or swamp, TIME's website 'poll' with thousands of 'votes', and who were then countered by Greek students who organised a 'counter-bot'. As a result, the TIME site was overwhelmed, which led to thousands of faxes and 'tons' of letters of complaint. Not learning his lesson, Chris says they are still going to 'continue to poll our readers because we value their opinions'. Stick to readers' letters, Chris; voodoo polls aren't worth the hassle.
Sincerely, Sir Robert Worcester