Making Development Economics More Scientific: A Young Journal Leads

April 01, 2011

Researchers who call their work scientific must make their work reproducible. That is, other scientists must be able to reproduce the same result in an essentially similar setting. If they can't, the result gets dumped. When I was a boy, two scientists at the University of Utah claimed to discover a way to cheaply generate energy with "cold fusion". But because other scientists could not reproduce that result, no one today builds energy policy around cold fusion.

I thought of this when a colleague pointed out to me the newly revised mission statement of a startup development journal. The Journal of Development Effectiveness explicitly welcomes papers that try to reproduce others' results. Hear! Hear!

Economists aspire to make their work more scientific. But it is a widely recognized problem that few economists write, and few economics journals want, studies that try to reproduce other studies. An academic economist would be laughed off campus if she dared approach the tenure committee with a stack of papers simply confirming others' work. How unimaginative! Dan Hamermesh of the University of Texas flatly states, "No editor of a major journal is likely to publish replications of previous original pieces." Chris Blattman of Yale University finds development economists' efforts to replicate results "pretty weak".

To be sure, a few attempts at replication do get published—when they aren't able to replicate the original result. Well-known examples include the work of Bill Easterly, Ross Levine, and David Roodman attempting to replicate a famous study on the effects of foreign aid, David Albouy trying to reproduce an influential study of how institutions shape development, and Jesse Rothstein attempting to replicate the results of a famous study on the effects of school choice. But the two big problems remain. First, such studies are rare and can be costly to the authors. Second, studies that confirm others' work—rather than refute it—bring even less reward and are thus even rarer. Economic science suffers.

Journal editors are the ones who must take the lead to solve this problem, writes Hamermesh, with an unforgettable simile:

“Editors need to take the lead by providing sufficient incentives for top-flight authors of empirical work to engage in pure and scientific replication… Without these changes occasional paeans to the virtues of replication are as likely to enhance the scientific soundness of empirical research in economics as programs that urge abstinence are to reduce teenagers’ sexual activity.”
Things might be starting to change. The editorial policy of the Journal of Development Effectiveness is refreshing and laudable:
“Systematic reviews and replication studies are particularly encouraged. … Journal of Development Effectiveness has an explicit policy of ‘learning from our mistakes’, discouraging publication bias in favour of positive results—papers reporting interventions with no, or a negative, impact are welcome.”
From researchers, this stance deserves applause. From other journal editors, it deserves—yes—replication.

Disclaimer

CGD blog posts reflect the views of the authors, drawing on prior research and experience in their areas of expertise. CGD is a nonpartisan, independent organization and does not take institutional positions.
