I’m writing this on Wednesday, 17th September, the day before the Scottish Independence vote, so we don’t have long to wait before we’ll know whether the pollsters are getting it right or will end up horribly wrong, with egg all over their faces. Who could forget the 1992 general election, when pollsters predicted Neil Kinnock’s Labour party would defeat the Tories under John Major? What is certain is that the small sample size of many of these polls (around 1,000) means a margin of error of roughly 3%, so in a race this close they offer no clear answer.
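For the curious, that 3% figure falls straight out of the standard margin-of-error formula for a simple random sample at 95% confidence, taking the worst case of an evenly split electorate. A minimal sketch (assuming simple random sampling, which real polls only approximate):

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Approximate margin of error for a simple random sample of size n.

    p is the assumed proportion (0.5 is the worst case, giving the
    widest interval) and z = 1.96 corresponds to 95% confidence.
    """
    return z * math.sqrt(p * (1 - p) / n)

print(f"{margin_of_error(1000):.1%}")  # prints "3.1%"
```

So a poll of 1,000 people showing 51% Yes could plausibly reflect a true figure anywhere from about 48% to 54% — which is exactly why a narrow lead tells you so little.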
The media are certainly taking much more of an interest now, two weeks after the gap narrowed between the predicted yes and no voters, than they were over the months leading up to the referendum, when the gap seemed so large as to make the result a foregone conclusion. Politics and the current state of the Union have certainly become front-page news.
Whatever the outcome, the pollsters are already debating what could have been skewing their samples and their results. Some are talking about the differences in voting intentions across geographical locations, genders, social classes and age groups, whilst others are already examining potential flaws in the methodology. For example, beyond the specific questions you ask of potential voters, does it make any difference to the truthfulness of their answers whether you interview them face-to-face, on the telephone or ‘anonymously’ online?
What’s clear from a marketeer’s viewpoint is that these things must be considered before you commence your market research. A flaw in your sample size, in your sample selection or in the method you choose to gather people’s opinions could give you the wrong result. So is there any point in doing market research at all?
On Facebook today I came across a straw poll taken by someone who had been asked to advertise their business at a particular location. While consulting all your Facebook friends for their opinion is a reasonable exercise for a bit of fun, I would not place my faith in its accuracy: the sample size must be extremely small, the demographics of the sample are probably highly skewed, and most people in that small sample will probably never post a comment after reading it.
I would always argue that some research is better than no research at all, because of the insights it gives you into the potential behaviour and thought processes of your customers. However, we clearly need to be cautious about placing all our faith in one method. We can get more reassurance that we’re getting things right by combining evidence from several different sources: online questionnaires, telephone interviews, face-to-face interviews or focus groups, and good old-fashioned desk research.