Opinion: Sonalde Desai.
Instead of focussing excessively on rankings with well-recognised shortcomings, recognising achievements and refining goals consistent with national priorities will be a more fruitful approach.
Developing global indices and rankings has turned into a minor industry. The Global Competitiveness Index, Global Happiness Index, Global Hunger Index, Ease of Doing Business Index, Corruption Perception Index, Global Go-To Think Tanks rankings, you name it. Think tanks specialise in creating these indices; they are good for increased funding and publicity. Some governments boast of improved rankings, while others rant about the methodology. Life goes on until the following year when the cycle begins again.
Every time these indices appear, I wonder why some countries rank where they do. Apparently, young people in Lithuania and Israel are the happiest in the world. Why are they happier than the youth in Australia, New Zealand, or Sweden? Is Gallup just counting the Jewish population of Israel, or do Arabs count? Unfortunately, these questions rarely get asked, let alone answered.
Sometimes, we get to see strange anomalies. Take, for example, the Global Gender Gap Index. India ranked 26th on educational attainment in 2023 but mysteriously dropped to 112th rank in 2024. As far as I know, no Taliban-style attacks on Indian girls’ education have taken place. This rapid descent remains inexplicable. Could there be some anomalies in the data?
Not all global rankings are equivalent. Some, like the Human Development Index, are well thought out and carefully constructed, although they also face challenges in getting accurate country-level data. Others seem hastily put together, often excluding perspectives from the Global South. For example, the now-abandoned World Bank Ease of Doing Business Index focused on limited liability companies, covering only 14 per cent of Indian businesses and excluding sole proprietorships, the mainstay of Indian business. The Global Gender Gap Index focusses on the gender gap in earnings but not in poverty, an indicator on which the United States might do poorly owing to its large number of mother-only families, but on which South Asian countries might fare better.
Nonetheless, given how much international organisations, and the foundations that fund them, love ranking countries and how convinced they are that rankings are effective tools for holding countries accountable, it is unlikely that any criticism will vanquish this industry. However, it is possible to hold the industry itself accountable through a few simple steps.
First, we must expect every index to include a methodological appendix that justifies why specific indicators were chosen and explains the rationale for the differential weights assigned to them. The publications must include links to source data. The lazy approach of citing the World Bank's indicators or the Food and Agriculture Organisation's indicators is insufficient; index authors must cite the original source for each indicator for each country. As it stands, data errors in index construction are impossible to trace, even when we see absurd results like India's descent from rank 26 to 112 in educational attainment on the Gender Gap Index in a single year. This does involve a considerable amount of work, but hard work is what research is all about, and it is what the public and policymakers deserve. Where primary data are presented, sample sizes, sampling methodology and confidence intervals must be reported.
Second, those who cover the release of various indices must find a way of fact-checking the results. An editorial moratorium on coverage for 48 hours after the release of an index would give time to critically examine the results and consult experts. The rush to be the first to report that India is below war-torn Sudan on the Global Hunger Index, without critical examination, does not serve the public. In particular, rankings that do not provide citations to source data and methodology should not be covered.
Third, governments must stop taking these results seriously. Countries are well aware of their priorities and, one hopes, try to ensure that appropriate data are available to monitor their progress. However, these efforts have little to do with how a country is ranked globally. Take, for example, the Global Hunger Index (GHI). India's child mortality fell from 9.1 per cent at the turn of the century to 3.1 per cent two decades later, and stunting, defined as low height-for-age, fell from 51 per cent to 36 per cent. Where India lags is in caloric intake and in wasting, defined as low weight-for-height, resulting in its rank of 117 on the GHI. The data challenges for these two indicators are well recognised.
Caloric intake is estimated from consumption expenditure data, which is a poor approximation at best. Moreover, the underlying figures for undernourishment, calculated by the FAO, combine the 2011-12 NSS consumption data with a recent Gallup poll of 3,000 people. These models deserve greater scrutiny for external validity. Similarly, the wasting estimate for India is affected by the fact that, owing to pandemic-related delays, most interviews for the fifth National Family Health Survey were conducted during the monsoon. Intestinal infections, more common during the monsoon, are associated with weight loss, which biases wasting estimates. Instead of focussing excessively on rankings with well-recognised shortcomings, recognising achievements and refining goals consistent with national priorities will be a more fruitful approach.
Amartya Sen, one of the originators of the Human Development Index, has suggested it may be time to move beyond rankings. If we cannot get away from these rankings, we should at a minimum establish standards under which they are accurately constructed and sensibly used.
The writer is Professor and Centre Director, NCAER National Data Innovation Centre and Professor Emerita, University of Maryland. Views are personal