Hard News: Polls: news you can own
-
Commissioned polls are news you can own. They are exclusive at the time of publication and, if they're sufficiently interesting, they may then be picked up and quoted by other media organisations. The natural approach to maximising the value of your investment in a poll is to make a compelling story of it – to make it look as much like news as possible.
You could make the same argument about a news item that's complete fiction. That wouldn't justify misleading your viewers.
-
Excellent feature on this very subject on yesterday's Media Watch programme from RNZ
-
I rely on Danyl's poll of polls before I even bother with polling news. It's worth noting that he "adjusts for bias", which means that he turns down the average for each party by how much the average got it wrong at the last election. This was substantial for National. Or something like that - I don't know the exact maths of it.
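As I understand it, the adjustment is something like this (the numbers below are invented; his actual method may differ):
```python
# A minimal sketch of the bias adjustment described above
# (all figures invented for illustration).

# Average poll result for each party just before the last election (%).
last_poll_avg = {"National": 50.0, "Labour": 29.0, "Greens": 12.0}
# Actual election-night result (%).
last_result = {"National": 47.3, "Labour": 27.5, "Greens": 11.1}

# Bias = how far the polling average overshot (or undershot) the result.
bias = {p: last_poll_avg[p] - last_result[p] for p in last_poll_avg}

def adjust(current_avg):
    """Turn each party's current average down (or up) by its past bias."""
    return {p: round(current_avg[p] - bias.get(p, 0.0), 1) for p in current_avg}

print(adjust({"National": 49.0, "Labour": 31.0, "Greens": 11.5}))
# {'National': 46.3, 'Labour': 29.5, 'Greens': 10.6}
```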
His analysis is usually interesting, too. I think he has a tendency to overgeneralize from quite small fluctuations, and has serially called for every Labour leader's head as a result of this, but I still consider it essential reading.
-
It would be great if media outlets could stop reporting poll results as if we still had First Past the Post. Six elections is surely enough time to grow up and do their fecking jobs.
I do not care how either Labour or the Nats are rating on their own, just what the feasible coalitions they are part of add up to.
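In other words, something like this (the bloc groupings and numbers are my own illustrative assumptions):
```python
# Sketch: report coalition bloc totals instead of single-party numbers.
poll = {"National": 45.9, "Labour": 31.2, "Greens": 11.0,
        "NZ First": 5.5, "ACT": 1.1}

blocs = {
    "centre-right bloc": ["National", "ACT"],
    "centre-left bloc": ["Labour", "Greens"],
}

for name, parties in blocs.items():
    total = sum(poll.get(p, 0.0) for p in parties)
    print(f"{name}: {total:.1f}%")
# centre-right bloc: 47.0%
# centre-left bloc: 42.2%
```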
Lord knows what our journos and editors will do when National fractures into rural/urban parties like their Australian equivalent. Keep reporting the larger chunk as if the other is meaningless?
-
Russell Brown, in reply to
and has serially called for every Labour leader’s head as a result of this
He's not the only one to do that. I find the mass kvetching a bit wearying at times.
-
Further to the bias adjustment, btw, I do not know to what extent this reasoning is actually correct. It's possible that the pollsters are already taking this into account, in which case Danyl is reliably underestimating National's support. Does anyone actually know if they do? It's not like these polls aren't conducted by experienced statisticians.
-
Russell Brown, in reply to
That Gavin White analysis is a good place to start.
-
3 News was lauding new Act leader Jamie Whyte for lifting his party from zero (clearly not the party’s real level of support) to 1.1%
1.1% of, say, a thousand voters is 11 people. 0.4% (the TVNZ level of support for ACT) is 4 people. In other words, when you are dealing with such low percentages a party can appear to wax or wane based purely on a random half dozen people being home, and the variations in ACT's support are essentially meaningless. The only conclusion you can draw is that ACT remains politically irrelevant.
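To make the noise visible, here's a quick simulation, assuming a true support of 1.1% and samples of 1,000:
```python
# Resample a 1,000-person poll a few times for a party on 1.1% support:
# the headline number jumps around on a handful of respondents alone.
import random

true_support = 0.011
n = 1000

for trial in range(5):
    hits = sum(random.random() < true_support for _ in range(n))
    print(f"poll {trial + 1}: {100 * hits / n:.1f}%")
# Typical runs land anywhere from about 0.5% to 1.8%, so a "jump" from
# 0.4% to 1.1% is well within pure chance.
```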
It would be great if media outlets could stop reporting poll results as if we still had First Past the Post…
This really hurts Labour, because the media persist in framing the narrative around National's "commanding lead" in the polls, and dwell on the 'crisis' in Labour which, John Armstrong would tell us, needs John Key to slip on a "political banana" before it could dent the PM's "impressive connection" with the voters. The reality is that undecided voters are going through the roof (probably being turned off politics in droves by the relentless sensationalism of the right-wing smear machine) and that, among decided voters, the left and right blocs are neck and neck.
-
Peter Green, in reply to
My understanding is that they would prefer to reduce the error by improving their sampling techniques rather than using an ad hoc fudge factor like we do.
It's entirely possible that they're measuring accurately, and that people are consistently having a last-minute change of heart on election day. An arbitrary adjustment would mask that sort of effect.
-
BenWilson, in reply to
Yes, I'm pretty sure Danyl made the same reasoning explicit in a post once. But the rub is in what Gavin says here:
I'm not privy to exactly how the other companies ensure that their samples are representative, and I'm not going to share our exact methods with you – we all jealously safeguard those because they can be points of competitive advantage.
It's quite possible that they have made adjustments to the way they take their samples on account of the obviousness of this apparently systematic bias. I mean, isn't that what statisticians do? Try to remove or account for bias in their methodology?
-
the question of who gets to form a government in December
(my emphasis)
You really think Winnie's going to string everyone along for three months? :P
I guess it's not without precedent.
-
BenWilson, in reply to
It’s entirely possible that they’re measuring accurately, and that people are consistently having a last-minute change of heart on election day.
Yes. Or even that there’s some kind of systematic lying about intentions to vote National. Or that the undecided at polling time subsequently decide before the election itself, and that they’re more likely to not be National voters.
But these are actually questions that possibly could be statistically answered. I’m curious whether polling companies would attempt that, considering that they do get competitive advantage out of having better results. By better I mean “results that more accurately predict what customers really want to know – what the actual election results will be”.
They could, for example, do follow up polls to the original sample, asking whether they changed their minds in the last days before the election, or how they voted if they were originally undecided, and maybe even some kind of analysis of the reasons for that.
-
Russell Brown, in reply to
You really think Winnie's going to string everyone along for three months? :P
I guess it's not without precedent.
Heh. No, it was a typo. Thanks.
-
Matthew Poole, in reply to
Figured as much, but it was good for a cheap laugh.
-
BenWilson, in reply to
Oh, and keep up the good work. I find those graphs extremely useful compared to anything in the MSM news. A real trove of info, and always leading to interesting debate.
-
Russell Brown, in reply to
When the reality is that undecided voters are going through the roof
At the least, the increase in undecided voters -- some of whom may be turning away from Labour and Cunliffe, remember -- seems worthy of comment.
-
It intrigues me that the Green vote in the past has not reflected the various poll results, in that it has come in lower.
Which, if you think their supporters are young and connected, would knock on the head the theory that polls underestimate those who don't have landlines. I guess it might be because they are at the small end of the spectrum and are subject to the swings that all small parties get, but the polls seem to have overestimated it at the last three elections.
-
because landlines or whatever
Even when everyone had a landline, there was a bias in who picked up the phone, could be bothered to talk to you, etc.
I believe that pollsters in these cases make an empirical adjustment based on demographics (so they ask people a/s/l and use that) and on how much they missed on previous elections.
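Something like this, I believe (a toy example, with made-up age brackets and shares):
```python
# Post-stratification in miniature: weight each respondent by their
# group's population share divided by its share of the sample.
population_share = {"18-25": 0.14, "26-59": 0.56, "60+": 0.30}
sample_share     = {"18-25": 0.07, "26-59": 0.53, "60+": 0.40}

weights = {g: population_share[g] / sample_share[g] for g in population_share}
print(weights)
# {'18-25': 2.0, '26-59': 1.06..., '60+': 0.75}: the undersampled young
# respondents each count double, the oversampled 60+ count for less.
```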
But that works for a known unknown - if you're undersampling 18-25s by 50%, for instance. But what if there's a whole group of voters who you hardly sample at all?
This gets magnified with small samples and minor parties:
- if the Wierdo Freak Party gets 1% of the vote, you'd expect they'll have 4 voters in a sample of 400.
- but say they get all their votes from 18-20s living away from home for the first time, and 95% of these people have no landline.
- that means you'd expect zero voters in the sample to support the WFP, and that's the case for any population share of vote up to 5%.
- so after the first election the WFP contests and gets say, 2%, the pollster just adds a 2% fudge onto their 0%.
- but next time, the unpolled voters switch allegiance to the Freaky Wierdo Party, and they get 5%, despite polling zero every time.
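For what it's worth, the scenario is easy to simulate (every rate below is invented to match the example):
```python
# Landline poll of 400 where 95% of Wierdo Freak Party voters have no
# landline: the party polls ~0% regardless of its true 2% share.
import random

random.seed(1)
population = 100_000
wfp_share = 0.02            # true WFP vote share
wfp_landline_rate = 0.05    # only 5% of WFP voters are reachable
other_landline_rate = 0.90

reachable = []
for _ in range(population):
    is_wfp = random.random() < wfp_share
    rate = wfp_landline_rate if is_wfp else other_landline_rate
    if random.random() < rate:
        reachable.append(is_wfp)

sample = random.sample(reachable, 400)
print(f"WFP in poll: {100 * sum(sample) / 400:.1f}% (true share: 2%)")
# Almost always 0.0% or 0.3%: the fudge factor from last election can't
# anticipate these invisible voters switching parties next time.
```
-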
One interesting nugget in the full Colmar Brunton results is the proportion of undecided voters (who are excluded from the headline figures) – 18%, up five points on the previous poll. That didn't feature in the report, but it seems a notable shift. On the preferred Prime Minister question, "don't know" was 30% – the same as the last poll, but up six points on last May. That could signify a number of things.
The usual pollsters have constantly downplayed the undecided/refused vote, which skews the overall message. It’s part of a burning question: are polls intended to reflect public opinion, or influence it?
-
This statistician's view on election polls is still highly relevant.
Apart from the obvious and very troubling bias in all the polls towards National, there is a huge problem with the way the data is reported for any party that gets significantly less than 50% of the vote. By reporting the margin of error as plus or minus 3.5% or so, the news agencies are happy to allow folks to believe that the 31% for Labour could be as low as 27% and that the 11% for the Greens could be as scary as 14%. Neither of those things is true. It is only plus or minus 3.5% IF you get 50%; if you get 25% it is plus or minus a different (smaller) number.
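Here's the arithmetic, assuming n = 1000 and a simple random sample (real polls also weight their results, so this is the idealised figure):
```python
# 95% margin of error for a sample proportion: 1.96 * sqrt(p(1-p)/n).
# It peaks at p = 0.5 and shrinks as support moves away from 50%.
from math import sqrt

n = 1000
for p in (0.50, 0.31, 0.11):
    moe = 1.96 * sqrt(p * (1 - p) / n)
    print(f"support {p:.0%}: margin of error +/-{100 * moe:.1f}%")
# support 50%: +/-3.1%
# support 31%: +/-2.9%
# support 11%: +/-1.9%
```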
What is most troubling is that, with their bias and with the awful reporting of what the numbers mean, both statistically and politically, the media is actually affecting the result of the election: "why bother voting if National is going to win anyway". In some cases I believe that is innocent ignorance, but sometimes I'm less charitable.
-
'Don't know' in the Colmar poll was not 18%; it was 13% (up 3). There was also 'Refused' at 5% (up 2). The two add up to 18% but they are not the same thing.
My guess is that uncertainty will be spread across the spectrum. There's good cause to pause before committing to either National or Labour in particular at the moment; they've both had difficult months.
-
BenWilson, in reply to
Yes, it's extremely hard to make sense of the error margins they give. Each number should get its own 95% confidence interval (or whatever level they pick, which should also be reported). It's pretty shit when even people who do understand statistics can't reliably tell what the published numbers represent.
-
If you haven't already, it's worth reading UMR's Gavin White on the historical accuracy of the major political polls.
I'm not picking on UMR in particular, but one thing I'd like to see more of from pollsters commenting on polls and their reporting is full disclosure of their political/media clients. And, frankly, FUCK "COMMERCIAL SENSITIVITY". You're being presented as expert opinion; you don't get to pretend you're a disinterested party in how your industry is perceived.
-
Richard Aston, in reply to
It’s part of a burning question: are polls intended to reflect public opinion, or influence it?
Bloody good question Deep Red.
It seems to me that in this election more than any other the media is playing a much bigger role in "shaping" perceptions. The recent Nazification of Kim Dotcom was a classic case of the media as propaganda attack dogs.
It concerns me how much poll results influence marginally motivated voters or those on the fence; people like to back winners.
-
Sofie Bribiesca, in reply to
more than any other the media is playing a much bigger role in “shaping” perceptions.
I have just been reading interviews with various people on The Nation, from Rachel Smalley, Gower and Simon Shepherd. Can I say the grotesque Garneresque style of interviewing, spitting out suggestive questions, constantly interrupting any attempt at an answer, and showing no real interest in the answer beyond trying to trip the interviewee up, seems about as ugly as anything a tabloid gossip columnist at the Sun or Daily Mirror produces. How can anyone be informed by this?