
David Roodman's Microfinance Open Book Blog


I was once an outsider to the murky debate between Jonathan Morduch and Mark Pitt over the meaning of the influential Pitt & Khandker study of microcredit's impact on poverty in Bangladesh. Struggling to understand the study's practical implications, I was drawn into the debate and eventually became an insider.

Holden Karnofsky of GiveWell started in roughly the same position I did, and he has now responded in his own analytical style, which is probably a lot wiser and more efficient than mine. Instead of immersing himself in the methodological issues, he steps back and draws some good lessons:

  • Never put too much weight on a single study. If nothing else, the issue of publication bias makes this an important guideline. (On this point, note that the 2009 Roodman and Morduch paper was rejected for publication; its sole peer reviewer was an author of the original paper that Roodman and Morduch were questioning.)
  • Strive to understand the details of a study before counting it as evidence. Many “headline claims” in studies rely on heavy doses of assumption and extrapolation. This is more true for some studies than for others.
  • If a study’s assumptions, extrapolations and calculations are too complex to be easily understood, this is a strike against the study. Complexity leaves more room for errors and judgment calls, and means it’s less likely that meaningful critiques have had the chance to emerge. Note that before the 2009 response to the study discussed here was ever published, GiveWell took it with a grain of salt due to its complexity (see quote above). Randomized controlled trials tend to be relatively easy to understand; this is a point in their favor.
  • If a study does not disclose the full details of its data and calculations, this is another strike against it - and this phenomenon is more common than one might think.
  • Context is key. We often see charities or their supporters citing a single study as “proof” of a strong statement (about, for example, the effectiveness of a program). We try not to do this - we generally create broad overviews of the evidence on a given topic and source our statements to these.

I had only minor comments on this post.


Disclaimer

CGD blog posts reflect the views of the authors, drawing on prior research and experience in their areas of expertise. CGD is a nonpartisan, independent organization and does not take institutional positions.