Why do exit polls get it wrong so often?

Ashraf Engineer

June 22, 2024


Hello and welcome to All Indians Matter. I am Ashraf Engineer.

Now that the dust has settled on the general election results, it’s a good time to talk about the debacle that was the exit polls. You would remember that several pollsters, such as India Today-Axis My India, Chanakya and ABP-CVoter, projected 350 to 400 seats for the National Democratic Alliance or NDA. However, on results day, the NDA managed only 293 seats, with the Bharatiya Janata Party or BJP securing 240. The predicted landslide had turned into a somewhat slim majority. Of course, this is hardly the first time that exit polls have been off the mark. In 2014 and 2019, exit polls underestimated NDA numbers.

The problem is that such polls aren’t projections to be merely consumed, talked about and then forgotten. They impact lives. For example, the latest exit polls sparked a surge in the stock markets, with the Sensex initially soaring by 2,507 points, or 3.39%. When the results came out, it plummeted, wiping out an estimated Rs 33,36,284 crore of investor wealth. So, why do exit polls get it wrong?


The hugely inaccurate exit polls led to not just apologies from pollsters but also calls for them to be investigated for intentional manipulation – in other words, rigging.

Congress leader Rahul Gandhi, credited with the resurgence of the Opposition and its unity, said: “People very high up in the BJP have carried out a scam. We’d like to know if these polls were actually carried out, what was the methodology of the polls and who are these investors.”

First of all, let’s examine what exit polls are and how they are conducted.

These are surveys conducted as voters leave polling stations. The researchers then use statistics and probability formulae to forecast the results.

Perhaps the earliest success of scientific election polling came in 1936, when Gallup, an analytics firm, accurately predicted Franklin Delano Roosevelt’s victory over Alf Landon in the US Presidential election. For that, it employed scientific sampling of a few thousand people.

In India, exit polls took off in the late 1980s at a time when the polity was fracturing. Several sub-groups were emerging within the electorate and there was a strong trend of regionalisation. As a result, the voter was no longer simple to understand. Enter research firms, which promised to dig deep and unravel the mysteries of electoral decision-making.

Among the first to conduct opinion polls – not exit polls; I want to emphasise the difference – was Prannoy Roy, who went on to found NDTV. The idea was to gauge the voters’ mood.

After that, opinion polls and then exit polls became the norm in every election.

The objective was to predict the number of seats, understand what issues had motivated the vote and – as such surveys got more sophisticated – to know the demographic breakdown of the vote.

To achieve this, researchers conduct sample surveys. Here’s where the problems begin. For a country of 1.4 billion, samples are rarely representative of the population.

Some researchers ask people to indicate who they voted for by marking it anonymously on a chart and dropping it in a box. Others pose a series of questions without directly asking whom they voted for.

Many use modelling techniques to predict voter behaviour, which too is problematic. Each state is unique and different from its neighbour. Even among socioeconomically similar regions, voters can choose differently for various historical, political or social reasons.

In a massive country like India, sample biases, weightages assigned, regions chosen for the survey and several other factors can skew the projection.

Sample sizes and budgets are factors too. Usually, the money available is less than needed – which leads to the cutting of corners when it comes to sample sizes or tools used.

There are serious structural challenges to such surveys.

I have already spoken about sampling. There is also the mix of survey data, expert opinions, social media chatter and so on that some researchers use, which can impact the projection. Polling firms are not required to publish their methodology, so there’s no way of knowing what went wrong with the surveys. It’s a crowded field, and who knows what level of quality control is deployed.

Among the other structural issues are voters who refuse to declare their support or simply lie to the researchers. In fact, this has been a major challenge across the world, even when the surveys assure anonymity. Exit polls assume that voters are honest during the interviews, but this is not always true. Many lie, and socially oppressed communities feel safer keeping their choices to themselves. Because these surveys are conducted just outside polling stations, respondents often give socially acceptable answers rather than reveal their actual preference.

This brings us to the margin of error, which is usually 1% to 3%. In tight contests in India, the vote share difference is sometimes less than 1%. In such cases, the margin can throw projections wildly off the mark.
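To make the margin-of-error point concrete, here is a minimal sketch of the standard formula for the 95% margin of error on an estimated vote share. The numbers are purely illustrative assumptions, not drawn from any actual poll:

```python
import math

def margin_of_error(p, n, z=1.96):
    """95% margin of error for an estimated vote share p from a sample of n."""
    return z * math.sqrt(p * (1 - p) / n)

# Hypothetical example: a near 50-50 race surveyed with 10,000 respondents.
moe = margin_of_error(0.50, 10_000)
print(f"Margin of error: +/-{moe * 100:.2f} percentage points")
```

Even with 10,000 respondents, the margin works out to roughly one percentage point either way, which is larger than the sub-1% vote-share gaps seen in tight Indian contests; the winner simply cannot be called from such a sample.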

Human error only adds to the mess. Data gatherers may select polling booths that are easy to get to rather than the ones needed for accuracy, thus skewing results.

Usually, historical data is part of the projection mix. However, societal and demographic changes may have rendered it irrelevant.

We don’t have comprehensive caste and socioeconomic data either. The last caste census was conducted in 1931 and this government has been reluctant to conduct another one. This makes it tough to understand caste dynamics and how they affect the results.

Importantly, exit polls often inadequately represent women in their samples. While they constitute half the electorate, typically they comprise only 25% to 30% of the sample. This flaw is compounded in constituencies with more women voters than men, where the projection would be even more skewed.
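The weighting problem this creates can be sketched with a toy post-stratification example. All the numbers below are hypothetical assumptions chosen only to show the mechanics, not figures from any real survey:

```python
# Hypothetical sample: women are 30% of respondents but ~50% of the electorate.
sample = {
    "women": {"share": 0.30, "party_a": 0.60},  # assumed support for a party
    "men":   {"share": 0.70, "party_a": 0.45},
}

# Raw estimate simply mirrors the (skewed) sample mix.
raw = sum(g["share"] * g["party_a"] for g in sample.values())

# Weighted estimate re-scales each group to its true electorate share.
electorate = {"women": 0.50, "men": 0.50}
weighted = sum(electorate[k] * g["party_a"] for k, g in sample.items())

print(f"raw: {raw:.3f}, weighted: {weighted:.3f}")
```

With these made-up numbers, the unweighted estimate is 49.5% while the properly weighted one is 52.5% — a three-point gap, easily enough to flip a seat projection, produced by nothing more than who the surveyors happened to interview.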

Lastly, there is the persistent allegation that research firms were working for the BJP-led government. This is an allegation made by the Opposition that wants an investigation, but it’s difficult to prove. There is no evidence to support the charge but many point to the uniformity of their flawed numbers. My own sense is that the pollsters simply botched it up.

There is now a demand for a joint parliamentary committee probe into how the exit polls led to a stock market boom followed by a crash as the results came in. The question, then, is: if such polls impact people’s finances, should they be allowed at all?

Research agencies have countered all the criticism by asserting that they were not part of a rigging exercise. While they accept that they got the results wrong, they also continue to claim that their methodology was robust and open to scrutiny. Failing to predict the outcome correctly does not automatically mean they were dishonest or lacking in integrity, they say.

Incidentally, India is not the only nation where such polls go awry. There were failures in the US too in 1948 and 2016 – both Presidential elections.

In 1948, when Harry Truman defeated Thomas Dewey, the pre-election polls failed because of non-random quota sampling. In 2016, when Donald Trump beat Hillary Clinton, the polls were skewed by a high non-response rate – that is, many voters chose not to respond. Researchers in states like Michigan, Pennsylvania and Wisconsin did not spot the swing towards Trump among white blue-collar workers.

In the UK, research firms got the Brexit referendum wrong. Media reports said that only 55 of the 168 surveys predicted a leave vote.

While I don’t believe in bans, I do think that media organisations and pollsters should indulge in some soul searching to consider the purpose and usefulness of exit polls. They are unreliable and, especially in India, face several limitations. These are flawed exercises that can often cause great harm – as they did by sparking a stock market boom and crash. How is a sample-based prediction better than the actual count? And if you can’t get it right, why do it at all?

Thank you all for listening. Please visit allindiansmatter.in for more columns and audio podcasts. You can follow me on Twitter at @AshrafEngineer and @AllIndiansCount. Search for the All Indians Matter page on Facebook. On Instagram, the handle is @AllIndiansMatter. Email me at editor@allindiansmatter.in. Catch you again soon.