As a signed-up member of the YouGov polling panel I occasionally receive a survey requesting my opinion. This morning I got the latest version delivered to my email inbox.
Now, opinion polls provide vital feedback on the broader political picture, but they also exist within the highly competitive world of the political economy, where commentators constantly seek to produce content valuable enough to 'go pro' and devote more time to their passion for politics as they seek to exert more influence over debate.
Polling organisations are therefore used for their ability to build up a reliable evidence bank for particular perspectives, with the accumulation of results establishing the mood of the nation with ever greater certainty.
Decades of refinement have enhanced the predictive capability of polls, lending the term 'opinion poll' a measure of respectability. Opinion polls have become a fact of life, and big decisions are now rarely unleashed on the public without first being tested on sub-sections through various types of polls or consultations.
In the process this has elevated opinion to a rank where any and all views can be taken equally seriously - thereby stimulating the subordinate arguments that they should, must and deserve to be - without grasping the more complex point that the ends to which opinion works depend on the means by which it is handled: we can be individually wise and collectively stupid, or individually stupid and collectively wise. And this raises the problem of how to separate the two opposing dynamics when pollsters resist the means to differentiate between types of opinion.
With the advent of online polling, costs have been slashed even as the ability to ask more people more varied and detailed questions more regularly has grown: while traditional doorstep and phone polls weight samples of 1,000 people once per month, YouGov is able to compile and collate a similar-sized poll every day. Since its launch in May 2000 YouGov has claimed these methods consistently produce more accurate results than traditional ones.
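For context on what a sample of 1,000 actually buys in precision, here is a standard back-of-envelope calculation of the 95% margin of error - my own illustration of textbook sampling theory, not a figure published by any pollster:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Approximate 95% margin of error for a simple random sample
    of size n at proportion p (p = 0.5 is the worst case)."""
    return z * math.sqrt(p * (1 - p) / n)

# A 1,000-person sample carries roughly a +/-3 point margin of error
print(round(100 * margin_of_error(1000), 1))  # 3.1
```

That three-point band is why two polls of the same population can legitimately disagree by several points without either being 'wrong'.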
This view gained credence at the 2005 General Election, when most traditional polls diverged from YouGov's results by over-stating Labour support, albeit within the upper limits of the margin for error. As it turned out, Tony Blair's landslide majorities of 1997 and 2001 were massively reduced - a result the traditional polling companies failed to predict. It has since served as a powerful marketing point for YouGov's methodology, because the variability of differential swings in UK elections can easily be enough to produce completely different outcomes.
In more recent times, polls which showed Cleggmania in full pomp (at one-third of respondents) after the pre-election leadership debates have rebounded to show the LibDems in a seemingly endless slump (generally hovering at about one-tenth of respondents) - clearly some significant volatility of opinion is present. Although it would be foolish to try to separate LibDem ratings from the seismic shift in politics which produced no overall majority in 2010 and the eventual creation of the first coalition government since wartime, it's worth considering other factors too, such as differences in polling methodology.
I find it odd that monthly polls consistently diverge from daily polls (to the extent that LibDem results can vary by 100% between the two), so I asked Anthony Wells what effect the regularity of sampling may be having, citing my own experience that I recall my earlier responses when answering. I have a lot of time for him, but I found his reply unconvincing.
My underlying point was whether the closed nature of their panel is combining with the regularity of questioning to create stale and unrepresentative results - have they 'overfished their pool'? Is YouGov reinforcing opinion rather than just measuring it?
Anthony answered that, with about 350,000 registered members of the YouGov panel and a turnover of about 1,000 new members each month, this is unlikely. Given his connection with the company he was never going to concede publicly that the methods he depends upon are flawed, but basic maths should nevertheless raise some questions about YouGov's methodology over a long period compared with off-line methods.
YouGov's available panel is a self-selecting sample of about 1% of the voting population (other polling organisations select and weight their samples according to demographic balance across the whole electorate), and YouGov produces political surveys about 20 times more regularly: I have had my opinion professionally surveyed by telephone and on the doorstep once each in my lifetime, whereas YouGov asks me for an online response approximately once every two months. As a respondent I am fully aware that I relate my latest answers to my recollection of previous answers rather than answering in isolation, so my YouGov replies become increasingly relative each time I am polled. And given that members of the YouGov panel can be assumed to be more politically engaged than average, we are more likely to be aware of result trends and more likely to be influenced by them.
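Taking the figures above at face value (a 350,000-strong panel, 1,000 new members a month, daily samples of 1,000, and my own experience of being surveyed roughly every two months), the basic maths can be sketched as follows - the assumptions are mine, not YouGov's:

```python
panel_size = 350_000           # registered panel members, per Anthony Wells
new_members_per_month = 1_000  # stated monthly intake of new members
daily_sample = 1_000           # typical size of one daily poll
polled_every_days = 60         # my experience: surveyed about every two months

# At the stated intake, replacing the whole panel would take decades
years_to_refresh = panel_size / new_members_per_month / 12
print(f"Full panel refresh: ~{years_to_refresh:.0f} years")  # ~29 years

# Responses drawn each month relative to the size of the pool
responses_per_month = daily_sample * 30
print(f"Responses sampled per month: {responses_per_month:,}")  # 30,000

# How often any one panellist is asked, per year
polls_per_year = 365 / polled_every_days
print(f"Surveys per panellist per year: ~{polls_per_year:.0f}")  # ~6
```

On these assumptions the pool turns over extremely slowly while being fished from very heavily, which is precisely the 'overfishing' worry: the same politically engaged respondents are asked again and again, with ample opportunity to anchor each answer to the last one.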
The effects of this change may not be immediately obvious, but they could be powerful, particularly under conditions where social opinion comes under pressure (such as from a struggling economy).
The simple fact that YouGov produces more results means it can effectively 'drown out' the competition, and the social effect is that YouGov carries more weight in political circles than, say, ICM, Ipsos-Mori or ComRes. YouGov is now the dominant force in the field, and perhaps this means it should be feared more than trusted.
By measuring a small sub-sample with such regularity YouGov has reversed the original dynamic of the polling project. It is no longer simply measuring opinion for the purposes of representative accuracy, but driving the polarisation of opinion for the purposes of conformity.
So it is potentially very damaging to general political debate that political commentators do not distinguish between the types of results produced by different types of polls. If I were tempted to take an extreme position, I might suggest that the correlation between the polarisation of opinion towards the coalition government and the polarisation of opinion driven by YouGov's methodology was responsible for the increased level of protest seen in the rapid escalation of the violent riots.
That may seem a strange thing to say, but we saw how opinion treated as fact was the main instigator for flash mobs to spring up and cause violence beyond the control of the Police and beyond the natural restraint of a questioning conscience on the part of those individuals who got involved. So it's entirely fair to conclude that where opinion replaces fact as a legitimate authority for action, nobody should expect sanity to prevail.
As the high priests of the cult of opinion, YouGov should be looking at themselves and the deeper psychological impact of their specific methodology on wider society - it's time to stop conflating what YouGov's polls say with what other pollsters say.
I tried a second time with AW; this time he offered a sceptical response to any obvious answers for the discrepancy between polls - perhaps he's being won over!
Monday, 12 September 2011