Google recently released a new product aimed squarely at the Market Research industry – Consumer Surveys.
As with anything Google does, they tend to bring a fresh perspective to an old problem and their Consumer Survey product is no different.
Instead of following in the footsteps of the DIY survey crowd (poster boy – Survey Monkey), they have rethought the whole process. Google’s Consumer Surveys are short surveys (two-question maximum) administered as an access requirement to premium content on participating sites.
The idea is beautifully simple. Content providers with large audiences get to monetize some select content, while businesses with research needs get access to on-demand sample. At only $0.10 per completed survey, that’s a ten to forty-fold saving over standard panel rates (depending on who you want to talk to).
That’s a pretty disruptive business model in a market that needs a bit of disrupting.
However, if you want to use this service to do research, there are some things you need to know. The kind of things you are only going to find in the finer details of what Google is offering.
- You can only ask TWO questions of the SAME respondent – This is a pretty big limitation, but not surprising given the cost and methodology. It means that if you have a ten-question survey and you want to understand how people who answered question five also answered question nine, you might be out of luck. Given that a lot of useful analysis involves cross-tabulating answers between questions, this is a big restriction you need to be aware of. NOTE: technically it’s two questions per ‘request’, so the same respondent might be asked more than two questions if they have multiple ‘requests’ for the same study (but it’s not clear if Google even tracks this).
- Demographic information is inferred from IP address and browsing behavior – Given the two-question limit you obviously can’t collect demographic data on each respondent. Google gets around this by inferring demos from IP address and using census tract information to compute average income, age, etc. Another nice use of technology by Google, but there is really no discussion of how accurate this is. In studies that I have done looking at IP addresses of survey responders, upwards of 20% of the sample may be taking the study at work. Work IP addresses are different from home IP addresses and obviously not useful for inferring demos. While Google does a lot of explaining in their White Paper around sampling accuracy, it is pretty much devoid of any discussion of demo inference accuracy – that’s something I’d love to see more data on. NOTE: I don’t consider the comparisons made in the sample accuracy work a direct measure of the accuracy of the demographics, as it’s not clear how demographics interact with the questions they asked.
- Your data may be weighted by the inferred demographic data – Compounding the problem of potentially inaccurate demographic data is that this data is also used for post-stratification weighting, in case the sample is skewed in some way. This makes sense to do, but again, there is not a lot of discussion of the accuracy of the demographic data, so the compounding error when using this data for weighting is also unknown. NOTE: The post-stratification average error in the Consumer Survey test was an improvement on the un-weighted data, but that’s for a single test of a US population sample – does the accuracy still hold as you do more regionally targeted studies, where the accuracy of IP demo inference might be more of an issue?
- You don’t get access to the raw data – Not a big deal for most of the companies that will use this system, but annoying if you want to present results in a third party application (I expect Google will allow raw result extraction at some point in the future).
- You can’t ask open-ended questions – I couldn’t find any reference to open-ended questions. And I don’t really think this type of survey-wall approach lends itself to getting people to explain their thoughts/feelings. So again, probably not a big limitation for the likely target audience for this product.
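To make the weighting concern above concrete, here is a minimal sketch of how post-stratification weighting works. Everything in it is hypothetical – the age groups, population shares, and sample are invented for illustration, and Google has not published its procedure at this level of detail. The point to notice is that the weights are computed entirely from the inferred demographic cells, so any error in the inference flows straight into the weighted estimates.

```python
# Post-stratification weighting: a minimal, illustrative sketch.
# Hypothetical population shares for an age-group variable (e.g. from census data).
population_share = {"18-34": 0.30, "35-54": 0.40, "55+": 0.30}

# Hypothetical inferred age group per respondent (sample skews young).
sample = ["18-34"] * 50 + ["35-54"] * 30 + ["55+"] * 20
n = len(sample)

# Observed share of each cell in the sample.
sample_share = {g: sample.count(g) / n for g in population_share}

# Each respondent's weight = population share / sample share for their cell.
# If the inferred demographic is wrong, the wrong weight is applied.
weights = [population_share[g] / sample_share[g] for g in sample]

# A weighted proportion for some hypothetical yes/no question,
# with answers aligned to the sample order above.
answered_yes = [1] * 40 + [0] * 60
weighted_yes = sum(w * y for w, y in zip(weights, answered_yes)) / sum(weights)
```

In this toy example the 40 "yes" answers all come from the over-represented young cell, so they are down-weighted and the weighted estimate lands below the raw 40%. That is exactly the behavior that makes the accuracy of the inferred demos matter: the correction is only as good as the cell assignments.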
So despite these limitations, I think the product is well suited to smaller, DIY projects that don’t require relatively sophisticated analysis. Which, if you’re honest about it, is probably 80% of the research requirements of small and medium-sized businesses.
I also have to hand it to Google for at least doing some due diligence on the tool’s accuracy. While I agree with their own assessment of the limitations, I don’t see these limitations particularly hurting the majority of studies.
People in Market Research tend to have a very inflated sense of self-importance when it comes to collecting survey data. If Google Consumer Surveys further reinforces the fact that data collection can be simple, cheap and fast – that’s a good thing. It should be.