Posts by Andrew Robertson
-
Speaker: ‘Kiwimeter’ is a methodological…, in reply to
1. Social research objectives, market research method. Massively market oriented client (or at least, survey host) with zero demonstrable or stated interest in social goals.
Oh okay. In my view Kiwimeter doesn’t use market research methods.
- Self selecting sample – frowned upon by market researchers.
- Psychometrics – hardly ever used by market researchers.
- Extremely large sample – unusual in market research.
- Data analytics – majority of market researchers don’t do this.
- Data used by academics to publish in peer-reviewed journals – my perception is academics usually look down their noses at market research. (Also – I think the study I referenced in Political Science demonstrates an interest in social goals among those involved in Kiwimeter.)
-
Speaker: ‘Kiwimeter’ is a methodological…, in reply to
Can you fill in the blanks a little – or are you just having a dig at people who think “nice pen” is a good way to bribe someone?
Postal self-complete surveys can have severe non-response bias because people can see all the questions before choosing whether to take part. If the topic of the survey is controversial, or emotionally or politically charged, then the response bias will tend to be more severe.
-
Speaker: ‘Kiwimeter’ is a methodological…, in reply to
Hey, no worries at all.
I’m not for a second saying the Kiwimeter sample is representative. I’m just saying response rate is not the only indicator of sample quality.
For example, say we carried out a survey about legalising prostitution, in two different ways.
1. We post a self-complete questionnaire along with a nice pen to a random sample of people, follow up with two reminder postcards, and mail a second copy of the questionnaire to non-respondents. We get a response rate of 45%.
2. We carry out a random telephone survey where we make no mention of the topic during the introductory script, and we get a response rate of 25%.
Both of these approaches might cost about the same for a given number of respondents. However, in this instance, I’d trust the results of the phone survey a lot more than I’d trust the results of the postal survey (which would have severe response bias directly related to the topic in question). The telephone survey still has response bias, but it’s not related to the topic, so it’s probably easier to make adjustments by weighting the final result.
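To make the weighting point concrete, here’s a minimal sketch of post-stratification weighting. The age groups and shares below are invented for illustration – they’re not from either hypothetical survey, just a toy example of how under-represented groups get counted up.

```python
# Minimal post-stratification weighting sketch (hypothetical numbers).
# Each respondent in a demographic cell gets
#   weight = population share / sample share for that cell.

# Hypothetical age-group shares: population (e.g. from a census) vs. achieved sample.
population_share = {"18-34": 0.30, "35-54": 0.35, "55+": 0.35}
sample_share     = {"18-34": 0.15, "35-54": 0.35, "55+": 0.50}

weights = {g: population_share[g] / sample_share[g] for g in population_share}
# e.g. weights["18-34"] == 2.0 -> each under-represented young respondent counts double

def weighted_mean(responses):
    """responses: list of (age_group, value) pairs; value is e.g. 1 = support, 0 = oppose."""
    total_w = sum(weights[g] for g, _ in responses)
    return sum(weights[g] * v for g, v in responses) / total_w

print(weighted_mean([("18-34", 1), ("55+", 0), ("55+", 0), ("35-54", 1)]))
```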
-
Speaker: ‘Kiwimeter’ is a methodological…, in reply to
None of this is accurate. The best you can call it is a ‘good guess’ based on demographic weighting.
I dunno… we don’t know much about how they calculated their population estimates.
I’ve become much less critical of self-selecting surveys. Check out this one (PDF) for example. They show it was possible to forecast 2012 US election results using extremely large samples of highly unrepresentative data from opt-in polls taken over the Xbox gaming platform.
I guess the caveat is this is just one published study (out of how many studies that tried to do something similar but got it totally wrong?).
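For anyone curious what “forecasting from highly unrepresentative data” looks like in practice, here’s a very rough sketch of the idea behind that study (multilevel regression and post-stratification). The cells, counts and shares are all made up, and the real paper fits a proper multilevel model – this is just the flavour of it.

```python
# Simplified sketch of MRP-style adjustment: estimate the outcome within
# demographic cells, shrink noisy cell estimates toward the overall mean,
# then re-weight cells by their share of the *target* population rather
# than their share of the (unrepresentative) sample. Numbers are invented.

# (n_respondents, observed support rate) from a hypothetical opt-in sample
sample_cells = {
    "young_male":   (8000, 0.40),   # hugely over-represented in the sample
    "young_female": (1500, 0.55),
    "older_male":   (400,  0.48),
    "older_female": (100,  0.60),   # tiny cell -> noisy estimate
}
# share of each cell in the target population (e.g. from a census)
population_share = {"young_male": 0.22, "young_female": 0.23,
                    "older_male": 0.26, "older_female": 0.29}

overall = (sum(n * p for n, p in sample_cells.values())
           / sum(n for n, _ in sample_cells.values()))
PRIOR_N = 200  # pseudo-count controlling how hard small cells are shrunk

def shrunk_estimate(n, p):
    # partial pooling: weighted average of the cell estimate and the overall mean
    return (n * p + PRIOR_N * overall) / (n + PRIOR_N)

poststratified = sum(population_share[c] * shrunk_estimate(n, p)
                     for c, (n, p) in sample_cells.items())
print(round(poststratified, 3))
```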
-
Speaker: ‘Kiwimeter’ is a methodological…, in reply to
Put your graphs and statistical models away. You got a 12.1% response rate.
I hear this argument a lot, but you can have a very representative sample with a 12.1% response rate, and a very unrepresentative sample with a 50% response rate. The response rate is not the only indicator of sample quality.
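A quick toy simulation makes the point. The response probabilities below are invented; the only thing that matters is whether the decision to respond is related to the answer.

```python
# Toy simulation: response *rate* alone doesn't determine bias.
# Survey A has a 12% response rate unrelated to opinion -> roughly unbiased.
# Survey B has ~50% response, but supporters respond more often -> biased.
import random

random.seed(1)
POP = 100_000
population = [random.random() < 0.40 for _ in range(POP)]  # 40% truly "support"

def survey(respond_prob):
    """respond_prob(supports) -> probability that this person responds."""
    sample = [x for x in population if random.random() < respond_prob(x)]
    return len(sample) / POP, sum(sample) / len(sample)

rate_a, est_a = survey(lambda s: 0.12)                    # responding unrelated to opinion
rate_b, est_b = survey(lambda s: 0.70 if s else 0.37)     # supporters much keener to respond

print(f"Survey A: response rate {rate_a:.0%}, estimate {est_a:.3f} (truth 0.400)")
print(f"Survey B: response rate {rate_b:.0%}, estimate {est_b:.3f} (truth 0.400)")
```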
-
Speaker: ‘Kiwimeter’ is a methodological…, in reply to
One might politely enquire as to how the percentage of Maori respondents currently compares to the population proportion.
Not really. The methodology isn’t designed to generate a representative sample in the first place.
With this sort of self-selecting design, modelling and adjustments are performed before results can be generalised to a wider group (I would hope).
The days of representative sampling are coming to an end, so it's becoming less relevant to hold surveys up to this standard.
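For what it’s worth, one common adjustment of the kind I mean is raking (iterative proportional fitting), where weights are nudged in turn to match each known population margin. A minimal sketch with made-up margins follows – I have no idea whether this is what the Kiwimeter team actually does.

```python
# Minimal raking (iterative proportional fitting) sketch with invented data.
# Weights are repeatedly adjusted so the weighted sample matches each known
# population margin (age, then gender) until they settle down.

respondents = [
    {"age": "18-34", "gender": "f"}, {"age": "18-34", "gender": "m"},
    {"age": "35+",   "gender": "m"}, {"age": "35+",   "gender": "m"},
    {"age": "35+",   "gender": "f"},
]
targets = {  # known population shares for each margin
    "age":    {"18-34": 0.35, "35+": 0.65},
    "gender": {"f": 0.51, "m": 0.49},
}
weights = [1.0] * len(respondents)

for _ in range(10):  # a few passes over the margins are usually enough
    for var, shares in targets.items():
        total = sum(weights)
        current = {c: sum(w for w, r in zip(weights, respondents) if r[var] == c) / total
                   for c in shares}
        for i, r in enumerate(respondents):
            weights[i] *= shares[r[var]] / current[r[var]]

print([round(w, 2) for w in weights])
```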
-
Speaker: ‘Kiwimeter’ is a methodological…, in reply to
I guess that is possible, but I really really doubt it.
-
Speaker: ‘Kiwimeter’ is a methodological…, in reply to
There are a lot of assumptions in your post.
1. Kiwimeter isn’t market research.
2. Cognitive testing is not standard in market research, but it is fairly common in robust social research.
3. Cognitive testing is not necessarily a suitable technique for constructing these sorts of psychometric measures (and psychometric measures are not typically used in market research).
4. You’re assuming TVNZ is the client. I have no knowledge at all of the arrangement between TVNZ, the academics at Auckland Uni, and the Vote Compass folk, but I’m guessing (assuming) there’s some sort of partnership arrangement rather than the typical client-supplier arrangement that you might see in market research. It seems they each get something out of this. TVNZ get news coverage and the academics get a lot of data and findings they can publish in academic journals (see, for example, Vote Compass in the 2014 NZ election: Hearing the voice of New Zealanders, in Political Science, Volume 67).
-
Hi all.
The folks involved in the New Zealand Attitudes and Values study (me included) have written this open letter about the measurement of racism and prejudice.
This is not about the Kiwimeter survey. These issues come up from time to time on other research projects, so we thought it would be useful to release this position statement.
-
Oh I stand corrected. It sounds like UMR have designed a poll that might predict the referendum result.