
Political polls' accuracy vs the actual 2011 election result

It's election year so we're going to hear a lot about polls.  While it's a matter of time before politicians come out with that hoary old chestnut about the only poll that counts being election day, by and large polls in New Zealand have been pretty good when it comes to picking the election results.  Well, I would say that, wouldn't I?

In this blog I'm going to take a look at the final results from the mainstream polls from the 2011 election, and publicly reveal for the first time the results of UMR's own last poll from that campaign.  I'll also look back at some of the past elections to see how the trends stand up over time.

In New Zealand, there are five major media polls, plus a few others (such as ours) that are done privately.  The five major media polls now are:

  • One News Colmar Brunton
  • NZ Herald Digipoll
  • TV3 Reid Research
  • Fairfax Ipsos
  • Roy Morgan

The first four of those, and UMR (along with one of the other private polls), are all members of the New Zealand Association of Market Research Organisations and recently signed up to an agreed set of guidelines on methodologies and reporting. In theory at least, they're all much of a muchness, but there will inevitably be differences in the exact questions asked and in how they ensure the survey sample is as representative as possible. All of those surveys have margins of error of between +/- 3.1% and +/- 3.6%. I'm not privy to exactly how the other companies ensure that their samples are representative, and I'm not going to share our exact methods with you – we all jealously safeguard those because they can be points of competitive advantage.
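
As an aside, those quoted margins of error line up with the standard formula for a simple random sample at the 95% confidence level. Here's a quick sketch (the sample sizes are illustrative assumptions of roughly 750 to 1,000 people at the worst-case 50/50 split; real polls are weighted, so the published figures are approximations):

    import math

    def margin_of_error(n, p=0.5, z=1.96):
        # 95% margin of error for a simple random sample of size n,
        # at the worst-case proportion p = 0.5.
        return z * math.sqrt(p * (1 - p) / n)

    # A sample of ~1,000 gives roughly +/- 3.1%; ~750 gives roughly +/- 3.6%.
    for n in (1000, 750):
        print(n, round(100 * margin_of_error(n), 1))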

Four of those five polls were around at the 2011 election, the exception being Fairfax (then conducted by Research International). Although some left-wing blogs have been critical of the Fairfax poll on the grounds that it was a long way out in 2011, I think that's manifestly unfair, as Ipsos wasn't doing it then. That's like criticising Cadbury for the taste of a Peanut Slab. The most we can say about the Fairfax Ipsos poll in 2014 is that we don't know how it stacks up historically.

In terms of the polls above, and indeed ours, it's fair to say that they were all reasonably close. Every one of them showed National close to governing alone, Labour in the 20s and the Greens over 10%. While some were clearly closer than others, by and large they produced results that were a reasonable indication of what actually happened.

Let's start by looking at National's vote.  In every case, I've taken the company's final published poll:

  • Actual result: 47.3%
  • UMR: 48.6%
  • One News Colmar Brunton: 50.0%
  • Herald Digipoll: 50.9%
  • Roy Morgan: 49.5%
  • TV3 / Reid Research: 50.8%
  • Fairfax / Research International: 54.0%

Now Labour:

  • Actual result: 27.5%
  • UMR: 28.2%
  • One News Colmar Brunton: 28.0%
  • Herald Digipoll: 28.0%
  • Roy Morgan: 23.5%
  • TV3 / Reid Research: 26.0%
  • Fairfax / Research International: 26.0%

And the Greens:

  • Actual result: 11.1%
  • UMR: 12.4%
  • One News Colmar Brunton: 10.0%
  • Herald Digipoll: 11.8%
  • Roy Morgan: 14.5%
  • TV3 / Reid Research: 13.4%
  • Fairfax / Research International: 12.0%

Lastly, the only other party to pass or come close to the threshold, New Zealand First:

  • Actual result: 6.6%
  • UMR: 6.0%
  • One News Colmar Brunton: 4.2%
  • Herald Digipoll: 5.2%
  • Roy Morgan: 6.5%
  • TV3 / Reid Research: 3.1%
  • Fairfax / Research International: 4.0%

I won't go through the final results for the smaller parliamentary parties but for each of them it's a pretty mixed picture with some polls picking too high and some too low (for example, the range for ACT was 0.7% to 1.8%, versus an actual result of 1.1%).

So what can we learn from all of that? First and foremost, while some polls are closer than others, by and large they provided a reasonable picture of what actually happened. The two big differences for me, however, are:

  • National didn't get enough votes to govern alone, despite all five public polls suggesting that they would (49.5% would almost certainly have been enough for them to govern alone, because of 'wasted' votes cast for parties that didn't get seats in Parliament – see the sketch after this list).
  • Only half the polls picked NZ First getting over the threshold and the three polls that didn't were all out by more than the margin of error. 
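
To make the 'wasted vote' point concrete, here's a rough sketch. Under MMP, seats are shared only among parties that make it into Parliament, so the bar for governing alone is half of the non-wasted vote, not half of all votes. The wasted-vote figure below is illustrative, not the official 2011 number:

    def govern_alone_threshold(wasted_pct):
        # Share of the total party vote needed for just over half the seats,
        # when wasted_pct of votes go to parties that win no seats.
        return (100 - wasted_pct) / 2

    # If ~3% of the vote is wasted, ~48.5% of the total vote suffices:
    # National's actual 47.3% falls just short, while a poll reading
    # of 49.5% would clear it.
    print(govern_alone_threshold(3))  # 48.5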

It's particularly interesting to note that all six polls listed, including our own, picked National too high. Three of them were out by more than the margin of error. That's not what we'd expect if the polls' errors were random, although we do need to recognise that all of these polls closed at least a few days before the election (but less than a week), and votes can shift late in a campaign. That leaves two plausible explanations: either there's a systematic skew towards National in the polls, or National shed votes in the final days.
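
To put a rough number on that: if each poll's error were independent and equally likely to fall on either side of the true result, all six landing on the same side would be roughly a 3% event. A back-of-the-envelope sketch (it ignores that polls share methods and overlapping field dates, so treat it as indicative only):

    # If each poll were independently 50/50 to err high or low, the chance
    # that all six err high is 0.5**6, and the chance they all err on the
    # same side (high or low) is twice that.
    p_all_high = 0.5 ** 6        # ~1.6%
    p_same_side = 2 * 0.5 ** 6   # ~3.1%
    print(p_all_high, p_same_side)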

You might think that's just a one-off result, but I went back and looked at poll results from every election since 1999. That gives us a total of 19 final polls from 1999 to 2011 conducted by companies that are still polling. So how did they do?

  • 16 had National too high, while three had them too low. The most any company had underestimated National's vote by was 2%, while the most a company had overestimated National's vote by was 9%. One poll has had National's vote above their actual vote by more than the margin of error at three of the last five elections.
  • Five had Labour too high, while five had them too low.
  • Nine had the Greens too high, while three had them too low. That overstates the case a little, because the most any poll has been out for the Greens is 3.4%.
  • One had NZ First too high, and nine had them too low. The biggest difference was in 2002, when one poll had them 6% too low – mostly the differences are within 2%.

From all of that, I think it's fair to say there's a tendency for New Zealand polls to overstate the vote for National and, to a lesser extent, the Greens, and to at least slightly understate the vote for NZ First. When it comes to interpreting current polls, it doesn't really matter whether that's because of inherent biases in the polls or because National's and the Greens' vote tends to drop in the last few days of the campaign while NZ First's picks up – the impact on our interpretation should be the same.

One way of looking at this further is to take the average (mean) error for these four parties across the 19 final polls included in this dataset.  That shows us that the average error is:

  • National: 2.7% too high
  • Labour: 0.7% too high
  • Greens: 1.0% too high
  • NZ First: 1.5% too low.

Counting all mainstream media polls since 2005 (excluding UMR but including TV3 and Fairfax / Research International polls in 2008 and 2011) leaves 14 polls and an average error of:

  • National: 2.4% too high
  • Labour: 0.5% too low
  • Greens: 1.5% too high
  • NZ First: 1.1% too low.
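
For anyone wanting to replicate that kind of summary, the calculation is simply the mean of the signed errors (poll result minus actual result) for each party. A minimal sketch, using the six 2011 National figures listed above (the full 19-poll dataset isn't reproduced here):

    # Mean signed error for one party across a set of final polls.
    # Positive means the polls ran high; negative means they ran low.
    def mean_signed_error(poll_results, actual):
        errors = [poll - actual for poll in poll_results]
        return sum(errors) / len(errors)

    national_2011 = [48.6, 50.0, 50.9, 49.5, 50.8, 54.0]  # the six polls above
    print(round(mean_signed_error(national_2011, 47.3), 1))  # 3.3 points high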

These differences didn't really matter at the 2011 election, because the overall result was never really in doubt. I guess you could argue that there would have been more emphasis on National's potential coalition partners had it been known that it probably wasn't going to be able to govern alone, but John Key spent plenty of time on cups of tea anyway, which suggests that National wasn't counting its chickens on that score.

It surely does matter in 2014, when at least until recently most of the public polls have shown Labour plus Greens within touching distance of National plus its current allies.  I think history suggests that:

  • If the total for Labour plus Greens is within about 2% of the total for National and its allies (whichever of ACT, United Future and the Conservatives makes it into Parliament), then it's actually pretty much a dead heat. 
  • If NZ First gets 4% in most of the mainstream polls, then it will probably pass the 5% threshold on election day.

Gavin White is research director for UMR Research. This blog was originally posted on sayit.co.nz, a site run by UMR for members of its online research panel.

Comments and questions

Gavin: Which of these polls are done by landline phone versus face-to-face versus online survey? My understanding is that most are done by landline phone, and inevitably they are therefore skewed against those (especially the young and poorer) who don't have a landline.

Gavin, whilst I agree that the polls appeared to systematically overestimate National in 2011, I don't think you can extrapolate this to current polling. Most polling companies would have reworked their methodology in response to the 2011 election and may not continue to show the apparent bias.

As clear as mud. But who trusts the polls any more after the pro-gay marriage polls definitely did not reflect the wishes of the country.

Remember the severe shock John Campbell got when the results of TV3's own poll were announced on air? 73% of the country were and are against gay marriage - just ask around, outside the metrosexual belt.

Campbell's shock was palpable. (Another independently commissioned poll said the same thing.) But the official polls were all for it.

Let's face it - nobody trusts the polls - they can reflect the pollsters' own thinking and it seems they do.

What on earth are you on about? This is about the election - not any referendum.

I think the polls are valuable and show that it will be close, and I have taken steps because of that. I want a National win as I think it is best for the country going forward. I wouldn't mind a strong Labour win, but the polls show that we will not have a strong Labour result; the polls show a Labgren government is possible, but it is a completely unknown mix and is to me risky. I see there are two Green parties in one - partly good and partly scary - and I am worried about the scary part having too much power.

Anyway based on the polls I have taken steps and see these as valuable planning tools.

As I said, I want a National win for the good of the country, but I stand to make much more money out of a Labgren win - i.e., uncertainty/risk is where you make money.

But the polls have shown that we can expect any outcome and your stats above show that, so planning for the outcome (if you are able) is important.

"In every case, I've taken the company's final published poll:"

Yeah, that was the day before the election, buried 10 mins into the major news broadcasts, so they get their figures within the 3% margin of error. In the weeks before the election they were running National at 53% - twice the margin of error.

When people are consistently told National will win the election regardless of their Greens or Labour vote, guess what? They don't vote - and a million Kiwis didn't vote last time. So go figure.

I think if you told them consistently before the election that Labour would win, National would romp in with their largest ever majority.

What did iPredict show?

So the polls historically have consistently overestimated National's support and underestimated Labour's? There is undoubtedly some element of self-fulfilling prophecy, which will help the centre-right parties. They also don't seem to have coped very well with mobile-only households. In summary, I'd take the polls with a large grain of salt and assume Labour will do better than expected and National a little worse.

All of the polls underestimated the level of support for the Conservatives.

"margins of error of between +/- 3.1% and +/- 3.6%" Does this mean that there is an absolute error of between +/-3.1-3.6% and as such most polls fit the election results within their margin of error. Or is it a relative error and suggest that the error varies with the number of people that pick a party and not the overall sample size and methodology. In the case where the units of measure are % it is hard to tell when the uncertainty is also quoted in percentage as to if it is absolute or relative to the value.