Two main yardsticks are usually used to judge a modern economy’s progress: the rate of economic growth and the rate of poverty decline. The first indicates aggregate economic performance, while the second indicates how widely economic gains are being shared. The second yardstick is politically more contentious, and India’s ruling regime has largely avoided the issue since 2014. That has not stopped researchers from drawing their own inferences about poverty in India, often based on limited data and questionable assumptions.
Official poverty lines in India are based on large-scale consumer expenditure surveys conducted by National Sample Survey (NSS) enumerators spread across the country. Since the 1970s, official poverty estimates have been computed by the Planning Commission using data from the quinquennial rounds of the NSS consumer expenditure survey. The last such exercise showed a rapid decline in poverty between 2004-05 and 2011-12.
A consumer expenditure survey was conducted in 2017-18, but it was hushed up. The leaked findings of the survey were published in Business Standard in November 2019, showing a decline in rural consumption. The leaked data implied a one percentage point rise in national poverty rates between 2011-12 and 2017-18, as calculations by Mint showed (‘India’s rural poverty has shot up’, 3 December 2019, bit.ly/3ZWzytc).
The ministry of statistics and programme implementation (Mospi) decided to junk the survey, citing its divergence from other data-sets such as the national accounts statistics, and the findings of an expert panel. However, the expert panel in question did not recommend suppressing the survey on the grounds of such a divergence, a member of that panel told this writer. Mospi refused to share a copy of the expert panel’s report when this writer filed a right-to-information request (and a subsequent appeal) to obtain it, citing the “sensitivity” of the matter.
The divergence between survey estimates and national accounts is neither new nor unique to India. A number of scholars, including the Nobel-winning economist Angus Deaton, have examined this issue in the past, and the broad consensus is that survey data cannot be considered flawed simply because it diverges from national accounts estimates. This is especially true of countries such as India, where national accounts estimates of consumption are imputed, not directly measured. An official committee set up by Mospi in 2015 to examine this issue, led by the statistician A.K. Adhikari, showed that a part of the divergence can be explained by definitional differences. The rest was due to errors in both the survey and the national accounts estimates.
Over the years, NSS consumer expenditure questionnaires have grown longer as new items have been added to reflect changing consumption patterns. However, queries on new items are typically placed towards the end of the questionnaire, by which point respondents may be too fatigued to reply properly. Since the new items make up a large part of India’s national accounts, this could partly explain the growing divergence between the data-sets.
To make the survey results more accurate, the methodology of the ongoing consumer expenditure survey (2022-23) has been revamped. Under the new method, different parts of the questionnaire are used in different visits, so that each individual interview remains short. However, this change makes the survey incomparable with past rounds.
To ensure comparability, the National Statistical Commission (NSC) suggested that the old method be continued in a sub-sample of the survey. But the NSC made a U-turn on this soon after, according to two people aware of the developments. So there won’t be any straightforward way to compare the new results with the past. Such comparisons will probably still be made. The government’s cheerleaders may well use the new survey data to paint a narrative of an unprecedented decline in poverty over the past decade. Others may use alternative data sources to contradict such assertions. Independent scholars might complain, but Mospi officials would claim that academics are unhappy simply because they don’t like change.
What was a much-needed revamp of an important survey is thus likely to get mired in controversy, and India’s statistical establishment must bear the blame for it. A long line of scholars, from P.C. Mahalanobis to Moni Mukherjee to Deaton, has emphasized the importance of conducting the consumer expenditure survey annually. When a survey is conducted regularly, the stakes in any one round are much lower. Regularity also allows for gradual experimentation without sacrificing comparability altogether.
In the absence of an annual survey programme, the next best course of action would have been to conduct large-scale pilot surveys using the new questionnaires, as recommended by the Adhikari panel. If such trials were conducted, the results should have been released publicly before the new survey design was finalized. That didn’t happen.
Finally, note that India’s official statisticians have faced trade-offs between comparability and accuracy in the past as well. When introducing changes in survey design, the NSS would retain a sub-sample that continued to be surveyed using the old method, to allow comparisons with previous rounds. This is precisely what the NSC had initially recommended. It is not clear why it reversed its own decision. India’s statistical leadership needs to come clean on this issue.
Pramit Bhattacharya is a Chennai-based journalist. His Twitter handle is pramit_b.