
Malaria Summit

December 14, 2006

As malaria gets its 15 minutes of fame in Washington today at the White House Summit on Malaria, which focuses on the $1.2 billion, five-year President's Malaria Initiative, many will be watching. Some will be looking for signs that Congress will line up with the Administration to fully fund a program that addresses one of the most devastating scourges on the African continent. Others will be observing the enthusiasm for the truly impressive technological developments that have arrived or are on the horizon - the bednets, the artemisinin-based combination therapies, the future vaccine. Still others will be celebrity-watching or listening for clues about which government contractors will benefit. Personally, I will be watching for signs of humility.

Over 50 years of malaria control, successes have been few and far between, and too often not sustained. In fact, over that half-century, the lack of success has at times resulted in deliberate neglect of the problem - even as deaths mounted. Hubris has been partly to blame.

That's why we have to do better this time, learning from history that success will require big-time funding over the long haul and a willingness to pay attention to emerging evidence about which combination of strategies is working or failing in different settings. In the past, the bugs have adapted faster than we have, costing untold lives. Much as we might see potential in the use of bednets, the application of pesticides, the scale-up of ACT use or other strategies, over-reliance on any single approach, combined with unrealistic promises of very rapid progress, is likely to lead us down the same road to nowhere that others have followed before.

The modern malaria control era started with the discovery in the 1940s that DDT was an effective insecticide against the mosquito vector. A few years later, spraying with DDT was used to eliminate malaria in southern Europe, and enthusiasm grew for wider use throughout the more malarious regions of Asia, Africa and South America. By the mid-1950s, WHO had launched an initiative to eradicate malaria worldwide, based on a quasi-military strategy of spraying large areas with DDT and treating cases with the only effective drug known at the time, chloroquine. The program had high-level visibility, substantial funding and technologies that seemed to work. Initial results were promising.

Within a few years the strategy was failing in most regions where malaria's toll was greatest. Cases began to rise again in the 1960s - the result of a complex jumble of technical, financial and political problems. The centrally-controlled anti-malaria efforts were not well accepted by many communities in affected areas, and international donors failed to provide sufficient resources, responding to bad news by cutting funding. As a result, neither treatment nor prevention regimes were followed well. Chloroquine-resistant strains of the most deadly form of the disease began to emerge, and mosquitoes were developing partial resistance to DDT. Adding to this was a manmade environmental risk, with growing exposure to malaria in areas where dams had been constructed. By 1972, after about $1 billion had been spent on the effort, WHO declared that the Global Eradication of Malaria Program had failed.

The visible failure left WHO with a black eye and discouraged a generation of malaria scientists and advocates. Only relatively recently have we seen renewed international enthusiasm for tackling malaria, and promising technological and programmatic innovations.

The President's Malaria Initiative is a clear signal of a new era in the battle against malaria. It is truly remarkable that the US and other wealthy nations appear eager to put resources toward a disease that affects virtually none of their taxpayers, through programs that benefit few of their constituents; applause for doing the right thing is well-deserved. But as the initiative moves forward, we all have a job to do: making sure that the temptation to promise quick wins, and to build political and financial commitment around particular technologies, approaches and timeframes, does not interfere with the vital process of accruing new knowledge about what is really working and changing course as we go - all while maintaining political support and funding.

