The ‘Youthquake’ plot thickens…



Earlier this month we saw an interesting development in the study of young people’s engagement with politics: Professors Will Jennings and Patrick Sturgis published an excellent analysis of Understanding Society data, which led them to argue that there was, in fact, a youthquake in the 2017 general election.

As soon as the polls closed in June last year, speculation began that young people had turned out to vote in high numbers. This surge, driven (depending on who you asked) by Brexit, frustration at austerity or Jeremy Corbyn’s policies, was said to have been pivotal to Labour’s performance in key constituencies and to the eventual loss of Theresa May’s majority in the House of Commons. The evidence underpinning this claim (notwithstanding references to photographs of polling stations and surveys that turned out not to exist) came largely from opinion polls and online surveys, and led numerous academics (including me) to conclude that youth turnout had most likely risen markedly since the 2015 general election and had been central to Labour’s strong performance.

In January, however, the British Election Study (BES) challenged this, suggesting that there had been no (or only a very small) increase in the turnout of younger voters between 2015 and 2017. The BES is the ‘gold standard’ survey of electoral behaviour in the UK. Not only does it have a more representative sample than online surveys (it spends considerable resources recruiting people who often refuse to respond to surveys, such as those who aren’t interested in politics), but it also provides ‘validated’ vote data, in which respondents’ reported turnout is checked against the electoral register. This means that the BES can mitigate two of the major sources of error when measuring turnout in surveys: samples that are biased in favour of those who are more interested in politics (a major factor behind many polls’ failure to accurately predict the result of the 2015 general election and the EU Referendum), and the tendency of people to over-report their political activity.

Professors Jennings and Sturgis’ analysis, however, suggests that turnout did increase amongst the under-30s (by roughly 10%) between 2015 and 2017. Understanding Society also uses sampling procedures superior to those of online surveys, and (like the BES) collects data through face-to-face interviews. The key distinction between the two is sample size: while the BES analysis of turnout was based on data from several hundred respondents, the Understanding Society analysis was based on data from several thousand. This not only makes the Understanding Society analysis more likely to be representative, but (as Jennings and Sturgis point out) also better able to detect statistically significant changes in turnout from one election to the next, as the rough power calculation sketched below illustrates.
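To see why sample size matters so much here, consider a minimal power-calculation sketch in Python (using statsmodels). The turnout rates and subsample sizes below are entirely hypothetical, chosen only to illustrate the argument: the sketch assumes turnout among young respondents of 40% at one election and 50% at the next, and compares a survey with a few hundred young respondents per election against one with a few thousand.

```python
# Illustrative power calculation: how likely is a survey to detect an assumed
# 10-point rise in youth turnout (40% -> 50%), given different numbers of
# young respondents per election? All figures are hypothetical.
from statsmodels.stats.proportion import proportion_effectsize
from statsmodels.stats.power import NormalIndPower

effect = proportion_effectsize(0.50, 0.40)  # Cohen's h for the assumed shift

power_calc = NormalIndPower()
for n_per_election in (300, 3000):  # hypothetical subsample sizes
    power = power_calc.solve_power(effect_size=effect, nobs1=n_per_election,
                                   alpha=0.05, ratio=1.0, alternative='two-sided')
    print(f"{n_per_election} young respondents per election: power = {power:.2f}")
```

The exact figures matter less than the pattern: with a few hundred young respondents per election, a real shift of this size has a non-trivial chance of going statistically undetected, whereas with several thousand it would almost certainly be picked up; for smaller true shifts, the gap in power widens further.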

So is the issue settled? Sadly, no. While Understanding Society’s larger sample makes it better suited to identifying a significant shift in turnout, it still relies on respondents’ own reports of whether or not they voted (although it is highly unlikely that the tendency to exaggerate political activity would have changed substantially among respondents in just a few years). Nonetheless, the BES’s validated measure of turnout is vastly superior, and cannot easily be dismissed. Moreover, if the Understanding Society analysis is taken as the more accurate of the two, this can only be because of its larger sample size. This implies either that the BES data lacks the statistical power to detect a substantial change in turnout (certainly a possibility, though, as the BES team point out, even if true it does not suggest a surge in turnout indicative of a ‘youthquake’), or that the BES sample is biased towards less politically active young people, who were less likely than those surveyed in online polls to vote (which would make it the first survey of electoral behaviour in history to suffer from such a problem).

There is no easy way of adjudicating between the two, and determining which is ‘right’ is impossible. The dispute reflects the inherent methodological challenges that surround every survey-based estimate of turnout, party preference or support for Brexit. What can be said is that, thanks to Understanding Society and Professors Jennings and Sturgis, we finally have some reliable evidence (possibly the only reliable evidence) suggesting that there may have been a ‘youthquake’ in 2017 after all.

Image credit: ericsphotography, iStock

