Hard News: Where are all the polls at?
216 Responses
-
There is one other kind of "polling", and the scare quotes are there because it isn't really any kind of polling at all. In previous campaigns, TV1 and TV3 have had text "polls" (self-select and pay, rig as much as you want) during and after leaders' debates. These are worse than useless, and should not be considered by any reputable broadcaster. Worse still, the networks include them in their subsequent news coverage. I don't yet know if they'll be doing them again this time round. Let's hope not.
-
A useful comment on my Facebook from Gavin White (ex-UMR):
FYI I think Digipoll are out of business. Online polls are now the norm in Australia, and did very well at the 2016 Federal election in particular - compulsory voting helps with that. CATI surveys are getting prohibitively expensive, and as you know NZ media outlets don't have much cash. Robos have come in in Oz as the cheap alternative to CATI and proper online surveys, but they're indicative at best.
-
At least we don't have 'push-polling' here. Or do we?
-
I was push-polled once, when Len Brown was campaigning against John Banks. It was amateurish and abundantly clear they were trying to sway my opinion.
-
I was push-polled by National about 10 years ago, full of have-you-stopped-beating-your-wife sorts of questions. From my local list MP (Michael Woodhouse) on the govt's dime no less, disguised as a poll of constituents' opinions.
-
Where did I read recently that NZ's population is far too small to make statistically robust online polling viable?
-
Andrew Robertson, in reply to
Despite the name, push polls aren't actually polls, they're a type of campaigning. For that type of campaigning to work in an electorate, you'd need to call 10 times or more the number of people that get called for a random poll.
-
Andrew Robertson, in reply to
I'm not sure, but you're correct.
However, if you have some really clever data science skills, you could draw useful conclusions from even the most unrepresentative data sources.
-
Paul Campbell, in reply to
I understand that; in this case he just had Parliament pay to send his poll to everybody in his electorate. No one made any phone calls.
-
-
Answering my own question, here's the story.
A big sample of 1350, which will surely have been culled from their individual electorate polls – so I presume Reid is doing those too, and not just tacking on a question.
-
It would be interesting to see the results from Labour's canvassing returns in and around Auckland, although obviously that would be QUITE SECRET
-
Point of order Russell - I'm ex UMR NZ, but still do some work for UMR Au (separate company) 😉 You're right though, I'm not privy to the UMR NZ polls any more so what I'm saying is from an 'interested outsider' perspective.
As Andrew says, what people think of as push polling doesn't meet the definition. Push polling is about using a pretend poll to persuade enough voters to switch to your desired position. The sample sizes of NZ polls are too small to be anything other than a drop in the bucket in terms of the overall vote.
What Mikaere is referring to was probably a poll designed to work out which messages are most effective. The pollster probably asked for a vote up front, tested the messages and then asked the vote again - they would have then looked at who changed their minds, and which messages seem to have caused that change. The politician, company or organisation behind the poll then knows which messages to focus on - it's about influencing the views of tens of thousands of people, not the views of a single survey participant.
Although push polls do exist (and are against our Code of Practice), they're only really practical in contests where a very small number of votes can influence very big issues. With conventional methodologies you'd be talking tens of thousands of dollars to influence a few hundred votes - remembering that not everyone contacted will change their minds, you'd need to push poll a lot more people than the number of votes you need to shift.
It's more feasible with robos I imagine, which is another reason not to trust them.
On your last point Russell, there's no way Reid surveyed all those people just for a preferred PM poll (which I strongly dislike mainly because they're usually skewed to the incumbent, which is what is interesting about the NZ ones at the moment). There must be at least a vote coming, and on the numbers so far it's bad news for the Maori Party.
-
There seems to be a certain irony in the apparently decreasing amount of available information about some things in the so-called information age.
-
Mikaere Curtis, in reply to
IIRC, the "questioning" was along the lines of:
One issue facing our city is [insert topic]. Would you prefer to see a mayor like Len Brown who can [insert awesome credentials/vision], or a mayor like John Banks who [insert negative credentials/vision]? So I think it qualifies as push-polling, but happy to be corrected if I misunderstand your definition.
The questioning seemed to be a vehicle for the various assertions around the two main candidates rather than seeking enlightenment. I can't imagine it having been very effective (that is, I don't think it had an impact on voting patterns, Len still got my vote on base principles, if nothing else).
-
Huh, I always thought push polling was using questions designed to produce a specific result which is then presented as objective, rather than actually influencing the person being polled.
I'm curious about the statistical adjustments made to account for only calling landlines, and how current the model is.
-
Russell Brown, in reply to
On your last point Russell, there’s no way Reid surveyed all those people just for a preferred PM poll (which I strongly dislike mainly because they’re usually skewed to the incumbent, which is what is interesting about the NZ ones at the moment). There must be at least a vote coming, and on the numbers so far it’s bad news for the Maori Party.
Yep, they're polling the individual Māori electorates for Māori Television. When I linked to the story yesterday I missed that further down there was a voting intention question too, with 2500 responses, gathered from July 11 to August 17.
The preferred PM sample is smaller because (for obvious reasons) it only covers responses from August 1 onwards, when Ardern became leader. It would have been interesting if they'd published the Little results for comparison.
-
Bart Janssen, in reply to
Where did I read recently that NZ's population is far too small to make statistically robust Online polling viable ?
Polling is about getting the results from a small sample that represent the results from the entire population.
The problem with any online poll is it samples a particular portion of the population so there is every reason to believe it won't represent the entire population. Same problem exists for landline polling or streetcorner polling or doorknocking or just asking your mates at the pub.
All polls in NZ suffer from biases and small sample sizes. They are always reported with bogus margins of error. And worst of all they are horribly inaccurate.
That last is the worst of their faults. If your sample fails to represent the entire population every time then you'd be an idiot to keep taking the same sample.
The real problem though is we know the public are influenced by polls - or we think we know since that data is also based on polling - but at least it gives Paddy something to talk about and that has to be good ... right?
-
Russell Brown, in reply to
All polls in NZ suffer from biases and small sample sizes. They are always reported with bogus margins of error. And worst of all they are horribly inaccurate.
That last is the worst of their faults. If your sample fails to represent the entire population every time then you’d be an idiot to keep taking the same sample.
Um, no. Sample sizes for national polling in NZ are 750-1000 (Colmar Brunton is usually slightly over 1000). The typical sample size for US national polls is 1000. New Zealand sample sizes are not small by international standards.
And they're not really that inaccurate either. Let's look at 2014.
National's party vote: 47.04%. Last five polls before election day, in reverse order: 47.7, 45, 48.2, 44.4, 46.5.
Labour's party vote: 25.13%. Last five polls: 26.1, 25, 25.9, 25.6, 24.
I wouldn't call that "horribly inaccurate".
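For what it's worth, all five of those National results sit inside the standard theoretical margin of error for a sample of 1000 (the usual formula, assuming simple random sampling):

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Maximum 95% margin of error for a proportion from a sample of n."""
    return z * math.sqrt(p * (1 - p) / n)

moe = margin_of_error(1000)   # about 0.031, i.e. +/- 3.1 points
actual = 47.04                # National's 2014 party vote (%)
polls = [47.7, 45.0, 48.2, 44.4, 46.5]
all_within = all(abs(x - actual) <= moe * 100 for x in polls)
```

The largest miss in that run is 2.64 points, comfortably inside the +/- 3.1 point band.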
-
Nick Russell, in reply to
Yeah, there is an enduring myth that polls in NZ are inaccurate but very little evidence. Sooner or later someone will trot out the cellphone line which is a complete red herring as far as I can see. And the discussions around margins of error are always depressing.
-
The polls (traditional) are generally accurate. The reporting of them generally isn't. Obviously, that's not the fault of the pollsters themselves.
One irritant for me is TV1 and TV3 reporting "our latest poll" over several days. e.g. On a Monday, they'll give the main party vote numbers. Then they will link questions on specific issues (housing, tax etc) to stories later in the week. Always invoking "our latest poll", which is in fact the same one - the same respondents - from before. That may be OK in the quiet years between election campaigns, but when things are moving this fast it can be misleading, if not meaningless.
It also means the same respondents get reported as "news" several times over. Always with "we can reveal ..." teasers, when a more accurate statement would be "we found out several days ago, but are only telling you now."
-
Bart Janssen, in reply to
I wouldn't call that "horribly inaccurate".
The polls are accurate for parties that poll close to 50%.
For smaller parties the polls are less accurate, hence the "surprising" results for NZ First etc.
-
Sample sizes for national polling in NZ are 750-1000 (Colmar Brunton is usually slightly over 1000). The typical sample size for US national polls is 1000.
Very basically: for polls, the margin of error depends on the number of people sampled, not the size of the population being sampled. But that just highlights that not all the potential error is reflected in the margin of error of a survey.
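That point can be checked directly: the finite-population correction is negligible for any electorate much larger than the sample, so a 1000-person poll has essentially the same margin of error in NZ as in the US (population figures below are round illustrative numbers):

```python
import math

def moe(n, N, p=0.5, z=1.96):
    """95% margin of error with the finite-population correction applied."""
    fpc = math.sqrt((N - n) / (N - 1))
    return z * math.sqrt(p * (1 - p) / n) * fpc

nz = moe(1000, 3_500_000)    # NZ-sized electorate (illustrative)
us = moe(1000, 230_000_000)  # US-sized electorate (illustrative)
# Both come out around 0.031 - the correction changes almost nothing.
```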
I think NZ polls have been quite good because the people who are under-represented in phone surveys are also under-represented in voting.
-
Thomas Lumley wrote a great article about quantifying the actual margin of error compared to the theoretical margin of error. It's a bit statsy though.
http://www.statschat.org.nz/2014/07/02/whats-the-actual-margin-of-error/
-
linger, in reply to
for smaller parties the polls are less accurate
Only in terms of relative error, not in terms of the absolute error. The theoretical variance, and therefore the theoretical margin of error, in a proportion estimate is proportional to p(1-p), so is largest for p=(1-p)=50%, and falls as either p or (1-p) decreases. Lumley’s results (referenced by mpledger above) do not show any larger absolute error in poll estimates for smaller parties; indeed, he explicitly states the error should be largest for National because its support is nearest 50%.
Now, it might well be that in some polling methods the sample is less likely to be representative of certain minority voting blocs, and/or the reweightings applied might not adequately correct for that, and so estimates for parties with support concentrated in specific demographics might be less reliable — but that’s a separate issue from effects resulting only from lower overall support.
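The absolute-versus-relative distinction falls straight out of the formula: at n = 1000, a party on 47% has a larger absolute margin of error than one on 5%, but the relative error for the small party is far worse (a quick sketch):

```python
import math

def moe(p, n=1000, z=1.96):
    """95% margin of error for an observed proportion p."""
    return z * math.sqrt(p * (1 - p) / n)

big   = moe(0.47)  # party near 50%: larger absolute error
small = moe(0.05)  # small party: smaller absolute error

rel_big   = big / 0.47    # relative error for the big party
rel_small = small / 0.05  # relative error for the small party - much larger
```

So a 5% party's poll number can easily be out by a quarter of its own size, even though its absolute error band is narrower than National's.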