New Thinking in the News

How to respond to rising sovereign debt? What do food shortages look like now? How can we guard against data authoritarianism? These questions and more in this week’s collection of #NewThinkingintheNews


1 | Hunger amid plenty: how to reduce the impact of COVID-19 on the world’s most vulnerable people in Reuters, by Mari Pangestu

“It’s important to not only ensure people access basic food supplies, but also that they have money to purchase them. On average, food accounts for up to 60 percent of household expenditures in low-income countries and 40 percent in emerging and developing market economies. Economic recession and loss of livelihoods quickly erode the food security of millions of people – especially if food prices increase. The World Bank estimates that 40 to 60 million more people will be living in extreme poverty in coming months, depending on the scale of the economic shock.”


2 | New Laws for the Fissured Workplace in the American Prospect, by David Weil

“After this acute crisis passes, we must confront the reality that our existing workplace policies no longer account for the millions of workers with jobs (often multiple jobs) that do not fit the narrow definitions of employment embodied in federal and state laws. Today’s workforce—and those displaced from it—requires core protections linked to work, not just employment, in areas like assuring a safe and healthy workplace, receiving a minimum wage, and being protected against retaliation from exercising rights granted by our laws. This crisis also reveals the long-term need for wide access for all workers to safety-net protections like unemployment insurance and workers’ compensation as well as to comprehensive paid-leave policies that protect workers, their households, and the wider community.”


3 | How to Develop a COVID-19 Vaccine for All in Project Syndicate, by Mariana Mazzucato

“To succeed, the entire vaccine-innovation process, from R&D to access, must be governed by clear and transparent rules of engagement based on public-interest goals and metrics. That, in turn, will require a clear alignment between global and national public interests… But today’s proprietary science does not follow that model. Instead, it promotes secretive competition, prioritizes regulatory approval in wealthy countries over wide availability and global public-health impact, and erects barriers to technological diffusion. And, although voluntary IP pools like the one that Costa Rica has proposed to the World Health Organization can be helpful, they risk being ineffective as long as private, for-profit companies are allowed to retain control over critical technologies and data – even when these were generated with public investments.”


4 | Preventing Data Authoritarianism in Project Syndicate, by Katharina Pistor

“While digital technologies once promised a new era of emancipatory politics and socio-economic inclusion, things have not turned out quite as planned. Governments and a few powerful tech firms, operating on the false pretense that data is a resource just like oil and gold, have instead built an unprecedented new regime of social control.”


5 | The Necessity of a Global Debt Standstill that Works in Project Syndicate, by Beatrice Weder di Mauro and Patrick Bolton

“Without private-sector participation, any official debt relief for middle-income countries may simply be used to service their private-sector debt. It would be pointless for the official sector to lighten poorer countries’ debt burdens if this results only in a transfer to commercial creditors… All private creditors need to participate on an equal basis in any standstill on debt service, both as a matter of fundamental fairness and to ensure adequate funding for emerging economies. And their participation cannot be purely voluntary. If it is, relief provided by participating private creditors will simply subsidize the non-participants.”


Every week, we share a few noteworthy articles that showcase the work of new economic thinkers around the world. Subscribe to receive these shortlists directly to your email inbox.

It’s gotta be true, because data says so

Data and statistics are everywhere, especially in economics. But we forget that empirical results are often manipulated, biased, or inconclusive. To ensure we design policies responsibly, we must meet empirical work with greater skepticism.

by Selim Yaman

In 2008, Doucouliagos and Ulubasoglu of Deakin University conducted a meta-analysis of 84 studies on democracy and economic growth. After evaluating 483 regression estimates from these studies, they found that every possible outcome for the democracy-growth relationship appears in the literature:

  • 37% of the estimates are positive and statistically insignificant
  • 27% of the estimates are positive and statistically significant
  • 15% of the estimates are negative and statistically significant
  • 21% of the estimates are negative and statistically insignificant.
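The classification behind these percentages is mechanical: each estimate is binned by the sign of its coefficient and by whether its t-statistic clears the conventional 5% threshold. A minimal sketch of that binning, with invented (coefficient, standard error) pairs rather than the actual 483 estimates:

```python
# Bin regression estimates by sign and statistical significance, as
# sign-and-significance meta-analyses do. All numbers are illustrative.
from collections import Counter

# Hypothetical (coefficient, standard_error) pairs, one per regression.
estimates = [(0.8, 0.3), (0.5, 0.6), (-0.4, 0.15), (-0.2, 0.5), (1.1, 0.4)]

def classify(coef, se, crit=1.96):
    """Label an estimate by sign and two-sided significance at the 5% level."""
    sign = "positive" if coef >= 0 else "negative"
    significant = abs(coef / se) >= crit
    return f"{sign}, {'significant' if significant else 'insignificant'}"

counts = Counter(classify(c, se) for c, se in estimates)
for cell, n in counts.items():
    print(f"{cell}: {n} of {len(estimates)} ({100 * n / len(estimates):.0f}%)")
```

Run over hundreds of estimates, this tally is exactly the four-way split reported above; the point of the essay is that all four cells end up well populated.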

The link between inequality and economic growth is equally difficult to pin down. de Dominicis et al. (2006) conducted a meta-analysis of studies focusing on this relationship. Most of the regressions they survey yield a negative relationship between inequality and economic growth. Yet when different estimation techniques and panel datasets are used, this negative effect vanishes. So even after analyzing a vast empirical literature, no clear relationship emerges.

Data and statistics have grown increasingly important in recent decades. Big Data now plays a large role in many fields, from technology to healthcare. Economics is no different; regression analysis had already been popularized by the neoliberal school of thought. Economics was intentionally made “a real science, within which basic connections between phenomena could be established, like in physics.”

In the neoliberal world of economics, you are freed from complicated theoretical discussions and able to draw firm conclusions. Unlike more nuanced fields like sociology or political science, neoliberal economics allows for simple, elegant arguments. With the help of mathematical modeling and statistical results, arguments take up just a few pages. This neoliberal methodology sounds pretty good at first: direct scientific results, no chit-chat. But it’s not as simple as it looks.

To what extent can we trust these statistical methods, or the economists who use them? Economists can easily manipulate data to fit their ideological stances, or to confirm their initial hypotheses. Errors rooted in research design create unreliable results too: the type of data used, the selection of the sample, differences in how estimates are evaluated, the availability of data, the direction of causation, and regional or country-specific characteristics all influence the results. Together, these factors create wide divergence among empirical macroeconomic studies, leaving many questions unresolved.

These problems are not confined to economics; as econometric methodology spreads to other fields, the risks of data interpretation grow with it. Take, say, studies on how religiosity levels affect people’s career paths. Knowing that even large, carefully executed polls failed to predict Brexit, Trump’s victory, and Labour’s success in the UK, how can we trust other surveys to teach us about religion or social preferences? How can someone even build a theory on such data? Above all, how can these studies shape policy design?

Some empirical studies that reached wrong conclusions sit harmlessly in the ivory tower of academia, waiting for a rare reader. But many do make their way into the real world, either through policy-making (top-down) or through media outlets (bottom-up).

Via the policy route, developing countries have been among the victims of flawed empirical studies. The Washington Consensus, for example, prescribed fiscal consolidation and trade liberalization. Later, however, it became clear that this was bad advice: copying economic institutions from the Global North and applying them to developing nations without considering country-specific conditions can be devastating. While the Washington Consensus was an elegant argument supported by data from the West, it failed to account for the complexities of the Global South.

The second route of influence is the media. When people read the news and encounter headlines like “a recent study found…”, it sparks their attention. But that recent study’s sample size may be very small and its data deficient, and neither media editors nor readers will be aware. To them, the study’s seemingly conclusive results are what matter.

To avoid acting on false conclusions, academics, policy-makers, and media professionals all carry the responsibility to treat empirical findings with skepticism. If this were physics, a causal relationship based on data could be trusted. But in economics and politics, human factors create complications that statistical methods cannot always handle. Overall, it’s better not to believe too strongly in statistics, because data says so.

About the Author
Selim Yaman works at the TRT World Research Centre. He received his BSc from the Economics Department of Boğaziçi University and is currently a graduate student in the Political Economy of Development at SOAS in London.