The Listener

For whom the caller polls

Research shows that the Colmar Brunton One News poll consistently gives a higher estimate of National's support than other polls do. Does it (a) matter, (b) matter a little or (c) not matter at all?

It comes at dinnertime, of course. Just as the cat is wailing, the kids are fighting, and the beans are boiling over, the phone rings.

"Good evening. I'm Jason from Such and Such Research Company. Could I please speak to the person in your household who is over 18 and has the next birthday?"

You know what comes next. Questions on who you voted for at the last election, which party you would vote for if an election were held tomorrow and whom you would prefer as Prime Minister. They never ask the question you might like to answer. "Would you like Helen Clark more if her jackets were better fitting?" or "Would you consider voting for the Left if Jim Anderton promised to retire?"

No, you're stuck with what the pollsters ask, which, in some cases, can be a lengthy questionnaire that quizzes you, on behalf of a number of their clients, about anything from credit cards to soap powder.

Colmar Brunton does not mix its political poll with other questions. Group account director Jeremy Todd says the poll the company conducts on behalf of One News is a stand-alone one that takes only about five minutes for each respondent to complete. And he says there is a status attached to it that he believes makes people more inclined to say yes to taking part. Even so, about 40% of people who are rung say no. It can take a lot of calls to get the 1000 respondents Colmar Brunton needs for each poll.

New research by UCLA PhD student Rob Salmond has compared the first 26 monthly polls run by Colmar Brunton since the 2002 election with the 14 done over the same period by TNS on behalf of TV3 and the 28 by UMR Research on behalf of the National Business Review. Salmond found the Colmar Brunton poll "provides consistently and significantly higher estimates of the National Party's support than either of the other two polls". Comparisons showed the Colmar Brunton poll had estimated the highest or equal-highest level of National support in each of the 26 polling months after the election, with the difference once reaching as high as 9.5%.

Salmond's analysis suggests the difference is due to "deficiencies" in Colmar Brunton's polling method, namely that the company polls only from Monday to Thursday evenings, whereas TNS and UMR poll through the weekends. Salmond says this matters because the set of people most likely to be home on weeknights is a biased one.

"Weekday polling biases the sample in favour of wealthy citizens because those people whose jobs require them to work, without access to a telephone, outside of normal weekday working hours tend to have medium or low incomes."

Although neither Colmar Brunton nor TVNZ disputes Salmond's figures, both downplay their significance.

"The swings and roundabouts of polling tend to even themselves out in the public's mind and as far as we're concerned, we get a very professional job," TVNZ spokesman Richard Griffin says.

Todd says that Salmond's reasoning could be correct, but the company believes the best chance of getting people at home is Monday to Thursday weeknights.

"A lot of people are home at least once between Monday and Thursday weeknights.

"And I believe the TNS TV3 poll is more volatile than ours. They move around quite a lot compared with ours, which is much more stable and which is probably more reflective of gradual shifts in public opinion."

Unsurprisingly, TNS business director Steve Kirk disagrees, saying that it is noticeable that the Colmar Brunton poll "tends towards National consistently". He says that by interviewing over a full week "we ensure we get a true representation of the New Zealand population, including busy and socially active people and our poll seems much more sensitive in picking up changing trends".

Research companies watch each other's polls avidly. Todd admits that his heart has been in his mouth a couple of times when the Colmar Brunton poll has shown a sudden swing "particularly last year when the National Party jumped 17 percentage points, an unprecedented swing, after the Orewa speech and it was a bit of a relief to see their [TNS's] poll come into line with ours after that".

Such a swing is something that would make even Massey University professor of statistics Stephen Haslett take note, and he disregards much of what is claimed in post-poll commentary as shifts in party support.

"Most of the polls have margins of error that are too large to discuss the sorts of movements that turn into news," he says.

Todd disagrees, saying that small movements can be important.

"You can move down one percentage point every single month and you've moved within the margin for error, but, over several months, if you keep doing that, it's a trend and you've got to report those shifts."

Todd also points out that the polling companies accurately captured the election results in the only two MMP elections so far, in the polls taken in the week leading up to the 1999 and 2002 election days.

Haslett says that's a sample size of two; far too small to base claims of accuracy on.

His concerns go back to the basic presumption of research companies that their surveys are based on a random cross-section of the population. Haslett says some groups, like young men generally, and young Polynesian men in particular, are much less likely to be represented, and people who live in a household with one or two adults have a much higher chance of being polled than those who live with many adults.
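The household-size bias Haslett describes is well understood: under a "next birthday" selection rule, an adult in a four-adult household has a quarter the chance of being picked that a lone adult has, which is why pollsters commonly apply design weights proportional to household size. A toy illustration — the respondents and the simple weighting scheme here are invented for the example, not drawn from any of the companies mentioned:

```python
# Each respondent's chance of selection is 1 / (adults in the household),
# so weighting each answer by household size undoes that bias.
respondents = [
    {"party": "A", "adults_in_household": 1},
    {"party": "A", "adults_in_household": 1},
    {"party": "B", "adults_in_household": 4},
]

def weighted_share(party: str, sample: list) -> float:
    """Design-weighted support for a party, weight = adults in household."""
    total = sum(r["adults_in_household"] for r in sample)
    support = sum(r["adults_in_household"] for r in sample if r["party"] == party)
    return support / total

print(f"raw share of B:      {1 / 3:.2f}")                              # 0.33 unweighted
print(f"weighted share of B: {weighted_share('B', respondents):.2f}")   # 0.67 weighted
```

The raw sample says party B has a third of the vote; the weighted estimate says two-thirds, because the one respondent from the big household stands in for four adults. Haslett's caution is that such adjustments can only rebalance the people who answered, not conjure up the groups who never do.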

Also, there is anecdotal evidence that professional people are less likely to donate precious minutes of their evening to a polling company.

Haslett says that although market research companies can and do make adjustments to try to balance their polls, these cannot adequately compensate for the groups that may be under-represented.

And he says the problem of under-represented groups is only likely to be exacerbated because in addition to the estimated 15% of people who have an unlisted number, more people are replacing their landline with a cellphone.

"And when it comes to elections, 50 or 60% of people vote. When it comes to polls, 50% might respond. It's possible, though highly unlikely, that the 50% who respond to the polls are the same 50% who don't vote. The people you're polling, even if everything else were perfect, may not be the people who are voting. This is quite separate from whether people who are polled even answer the question honestly. These are very hard things to tie down."

Haslett does not say that is a reason not to poll at all, "but let's admit it's a difficult problem and not get into the situation where we start saying this party has 48.1% support and this one has 48.2 so this one's ahead. From poll information, you wouldn't know. You'd need a difference more in the order of 5-6% to say they were apart."
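Haslett's 5-6% figure is not plucked from the air. When two parties are measured in the same poll, one's gain is usually the other's loss, so the two estimates are negatively correlated and the margin of error on the gap between them is roughly double the margin on either figure alone. A sketch of the standard calculation, again under the simple-random-sample assumption:

```python
import math

def diff_margin(p1: float, p2: float, n: int, z: float = 1.96) -> float:
    """95% margin of error on the gap between two parties measured in the
    SAME poll of n people. The negative correlation between the two
    multinomial proportions widens the margin on their difference."""
    var = (p1 * (1 - p1) + p2 * (1 - p2) + 2 * p1 * p2) / n
    return z * math.sqrt(var)

gap_moe = diff_margin(0.481, 0.482, 1000)
print(f"gap margin: +/- {gap_moe * 100:.1f} points")  # about +/- 6.1 points
```

So two parties on 48.1% and 48.2% are, as Haslett says, statistically indistinguishable: the 0.1-point gap sits inside a margin of about six points.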

Victoria University political science lecturer Jon Johansson is also a poll sceptic, but reserves most of his criticism for devices like the worm used on TVNZ's political leaders' debates in recent elections and, even more so, for "the ultimate lunacy and utterly ridiculous polls like the Holmes poll".

He says those types of polls might follow an item in which a poor elderly person has been the victim of an egregious crime and then a poll might be run with a question like "should the death penalty be restored", "and you're going to get 80-90% of people baying for blood".

"Those polls have no merit whatsoever. You tend to attract people with strong feelings, rather than the people in the middle who realise it's more complex."

Polls concern him because of the effect they have.

United Future's support suddenly swelled after the worm, manipulated up or down "by an audience of so-called swinging voters", rated leader Peter Dunne well in the 2002 leaders' debate.

Johansson says a politician who used words like "tolerance" and "working together to solve our country's problems" could make the worm rise, while anything negative made it drop.

"It mitigates against talking reality."

Griffin says TVNZ will not use any gimmicks in the coming election, and has dumped the worm.

"It was a gimmick that caused more anger and hurt than it did edification."

Research companies would reject any suggestion that polls are gimmicks, too. But it is the importance that journalists, commentators and sometimes politicians give to polls that risks vesting them with more merit than they warrant.

Todd says polling companies do not aim to predict an election, "it's a barometer at that particular time".

That may be all a poll is, but One News regularly translates its Colmar Brunton poll, via a coloured graphic, into seats in Parliament, with accompanying commentary on whether a particular party could govern alone on that result. Polls are regularly reported as indicators of election outcomes, even if they lack the finesse to be reliably used that way.
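Translating a poll percentage into seats is itself a mechanical exercise: New Zealand allocates seats by the Sainte-Laguë divisor formula. A bare-bones version of that arithmetic shows how such a graphic could be produced; this sketch ignores the 5% threshold, electorate seats and overhang, and the example percentages are invented, not taken from any actual poll:

```python
def sainte_lague(votes: dict, seats: int) -> dict:
    """Allocate seats by the Sainte-Lague divisor method (the formula used
    in New Zealand's MMP system), simplified: no threshold, no electorate
    seats, no overhang. Illustrative only."""
    alloc = {party: 0 for party in votes}
    for _ in range(seats):
        # The next seat goes to the party with the highest quotient v / (2s + 1).
        winner = max(votes, key=lambda p: votes[p] / (2 * alloc[p] + 1))
        alloc[winner] += 1
    return alloc

print(sainte_lague({"National": 45, "Labour": 38, "Greens": 7, "NZ First": 6}, 120))
```

Even this simplified version makes the point that a one-point poll movement, well inside the margin of error, can shuffle a seat or two in the on-screen Parliament.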

Haslett urges more cautious treatment of poll results.

"Polls are useful, they give a guide, but you don't want to read too much into them and small fluctuations, in particular, in the order of 2-3% are best ignored.

"If you take that rule and then look at most of the stuff written, you'll find there's a lot of discussion about things that you really can't see properly."


When Wellingtonian Eddy Saul was polled by UMR Research one February evening, he thought that he was taking part in a regular political poll. As the questions went on, he became concerned that they had been framed to make him give answers favourable to Opposition Leader Don Brash. Finally, after he was asked whether Brash was likely to take a strong stand on welfare dependency, Saul objected and hung up.

Although UMR is reluctant to divulge its questions, it appears that Saul was randomly selected for the company's omnibus poll in February. The first questions, about political preferences, would have been used for the National Business Review's political poll. The other questions are likely to have been on behalf of one of UMR's major clients, the Labour Party.

UMR research manager Gavin White declined to reveal on whose behalf the questions were being asked, but says the company did ask questions in February on reaction to Brash's Orewa speech.

"It's called argument testing. We ask things that are both sides of the issues - in ways that might hurt your client, and ways that might advantage your client. We will ask a whole range of different statements asking, 'Do you agree or disagree with this statement?' But you give people the opportunity to agree or disagree. There is no point in asking a biased question.

"We have a reputation to protect."