Posts by Andrew Robertson
-
I'd love to try banning polls as an experiment - to see what influence this would have on politicians' claims during the banned period about where their party is polling (based on 'internal polls') and what the public think.
-
Hard News: Decidedly Undecided, in reply to
do you mean out of all respondents or just the ones who were asked the leaner?
Out of all responses to the party vote question.
-
Hard News: Decidedly Undecided, in reply to
True, but there’s a complication in that different polling firms handle the likely-voter issue differently. Some push further towards saying “Okay, if you were going to vote, what would that vote be?”
Andrew will be able to explain what Colmar Brunton’s practice is there.
We use a leaner and we also gauge likelihood to vote. The leaner is asked whenever someone says they don’t know which party they would vote for (e.g. “Which party would you be most likely to vote for?”). Everyone is asked who they would vote for, even if they’re unlikely to vote. However, those who are less than ‘very’ or ‘quite’ likely to vote are filtered out at the analysis stage.
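The process described above can be sketched in a few lines of code. This is a hypothetical illustration only (the data, labels, and structure are invented, not Colmar Brunton's actual pipeline): everyone answers the party-vote question, with a leaner follow-up for don't-knows, and the likelihood filter is applied at the analysis stage rather than in the questionnaire.

```python
# Hypothetical sketch of the leaner-plus-likelihood-filter approach.
# Each respondent is (party after leaner, self-reported likelihood to vote).
respondents = [
    ("Party A", "very likely"),
    ("Party B", "quite likely"),
    ("Party A", "unlikely"),      # answered, but dropped at analysis stage
    (None,      "very likely"),   # still undecided even after the leaner
]

# Only 'very' or 'quite' likely voters count toward published party support.
LIKELY = {"very likely", "quite likely"}

# The filter happens here, at analysis time, not when the question is asked.
analysed = [(party, lik) for party, lik in respondents if lik in LIKELY]

# Undecideds (None) are reported separately, not as party support.
decided = [party for party, _ in analysed if party is not None]
print(decided)  # parties counted toward party support
```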
-
Hard News: Decidedly Undecided, in reply to
Does anyone know what plans there are to poll Epsom and wherever Colin Craig is standing?
I suspect that's a sort of 'we could tell you but...' type situation. :)
-
Hard News: Decidedly Undecided, in reply to
Anybody from a research company want to tell us what your overall response rate is please? That is vital to know. As important as the percentage of undecided voters.
I'm planning a blog post about this. All surveys should report response rates. The problem is that NZ has no agreed-upon formula or set of call-outcome codes (the AAPOR has both, and that's what I use). I've seen some VERY dodgy response rate calculations used on VERY public NZ surveys.
I would like all polls to report response and refusal rates in the same way, using an agreed formula.
One thing to keep in mind is that response rate is not always an indicator of sample quality, as I point out here.
-
Something that didn't come up in discussion on the show is that, while the NZ Polling Code is binding on all members of the Research Association, it has no jurisdiction over media organisations.
In the research industry, the client owns the report and all the data. The research company could produce everything required by the Code in their report, but they can't release anything without the client's approval.
-
We own our house now (well, nearly 20% of it), but we had very mixed experiences when renting.
We were the sort of tenants who believed in leaving things in better shape than we found them, and taking care of small problems ourselves if it was easy enough to do.
We had three separate runs of great landlords over about 10 years. We stayed in most places for around three years and never moved unless we really had to. None of our landlords ever increased the rent during our time at each house. I think they appreciated that we took good care of their house and always paid the rent on time. None made us sign on for a fixed term.
Then we moved to a house that had a property manager, who refused to rent to us unless we signed on for a fixed term. We still had the same philosophy, to begin with, but when we couldn’t fix things ourselves the property manager would argue over who should pay. They put the rent up from $330 to $400 while we were there (not always with the landlord’s approval, we found out later), and refused to subtract a week’s rent the owner granted us for painting a room (fortunately we kept the landlord’s thank-you card!). I’m pretty sure the property manager considered us bad tenants. It’s not hard to imagine her telling people how argumentative we were.
The owner ended up cancelling his contract with them after we told him about our experience.
-
Hard News: Meanwhile back at the polls, in reply to
When he or she is referring to media reports, probably not.
-
Hard News: Meanwhile back at the polls, in reply to
I’d be interested to know if the same is true of those excluded from the Colmar Brunton poll because they said they were more likely than not to stay at home this election.
Those figures aren’t shown because the base of unlikely voters is really small, especially after excluding the undecided/refused responses, whose share is high among those unlikely to vote. So those figures aren’t even close to being robust. If they were in the report, someone would almost certainly report “CB says this” – regardless of how many caveats and warnings were provided along with the results (I’ve seen this happen more times than I can count).
In the last CB poll there was a question on likelihood to vote in the upcoming election, which I think provides some of the analysis you’re interested in.
I think the key point to remember, though, is that the point of a political poll is to measure party vote sentiment as if an election were held at the time of the poll. Given that purpose, a poll shouldn’t include those unlikely to vote in its results for party support.
-
Hard News: Meanwhile back at the polls, in reply to
Perhaps, unlike Colmar Brunton, they (Roy Morgan) don’t ask about intention to vote, and therefore don’t exclude those unlikely to vote?
It's very possible they don't filter on likelihood to vote, but I don't think that would account for the difference, because the % of undecideds in the CB poll is based on all eligible voters, not just those likely to vote.
I don't see undecideds as a bad thing - it's an indication that the poll is sampling non-politically obsessed people.