The Borna Disease Virus Tragedy

The Robert Koch Institute in Berlin (remember Koch’s Postulates?) cancelled its research into Borna Disease Virus (BDV) earlier this year. That is really regrettable, because BDV, a neurotropic virus, may cause depression and bipolar disorder. BDV may be transmitted by blood transfusions; research on this is being carried out in Australia. There may even be an effective treatment: amantadine sulfate, an antiviral drug approved for use against the common flu.

Given the prevalence of depression and bipolar disorder, you would have thought that a decent randomized trial would have been carried out. A small trial would not cost more than a couple of hundred thousand dollars. But amantadine is a 30-year-old drug, no longer covered by patents, so pharmaceutical firms have practically no incentive to finance such a study. So far there has been none. Given the potentially huge benefits and the low cost, that is a tragedy.

In the meantime, German and Austrian doctors are using amantadine and reportedly getting good results. However, we still need the solid evidence we would get from a well-designed randomized controlled trial.

For background, read this article in Discovery; a more recent paper is here.

Where are the Systematic Reviews we need?

New Scientist writes (subscription required to read the article),

If you want to know how to preserve biodiversity, don’t rely on articles in conservation journals. So says a study which argues that conservationists should follow the medical profession’s lead, and ensure that their decisions are objectively based.

“We’re about 30 years behind the medical revolution,” says Philip Roberts of the Centre for Evidence-Based Conservation at the University of Birmingham in the UK. As a standard to aspire to, Roberts and his team took the systematic reviews that are the bedrock of evidence-based medicine. These reviews start with a carefully framed question, and typically list the search terms used to find the studies to be analysed. Strict criteria are then applied to exclude poor-quality research, and finally rigorous statistical tests on the pooled results are used as the basis of an objective guide for doctors to what treatments work best.

The study referred to is Are review articles a reliable source of evidence to support conservation and environmental management? A comparison with medicine.
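The pooling step that the excerpt above describes ("rigorous statistical tests on the pooled results") can be sketched in a few lines. This is a minimal, hypothetical illustration of fixed-effect, inverse-variance meta-analysis; the study effects and standard errors below are made-up numbers, not data from any review mentioned here.

```python
import math

# (effect estimate, standard error) for each included study -- hypothetical values
studies = [(0.30, 0.12), (0.15, 0.20), (0.25, 0.10)]

# Each study is weighted by the inverse of its variance, so more precise
# studies count for more in the pooled estimate.
weights = [1.0 / (se ** 2) for _, se in studies]
pooled = sum(w * eff for (eff, _), w in zip(studies, weights)) / sum(weights)
pooled_se = math.sqrt(1.0 / sum(weights))

# Approximate 95% confidence interval for the pooled effect
low, high = pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se
print(f"pooled effect = {pooled:.3f}, 95% CI = ({low:.3f}, {high:.3f})")
```

The point of the sketch is simply that pooling is mechanical once the hard work (framing the question, searching, excluding poor-quality studies) has been done honestly; real reviews would also check heterogeneity before trusting a fixed-effect model.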


Useful and interesting blogs

Take a look at these two blogs on evidence-based practice, Bob Sutton’s Work Matters and Tracy Allison Altman’s Evidence Soup.

Bob Sutton is a professor of management science and engineering at Stanford. He writes about Evidence Soup, “Reading Altman’s blog is like taking an ongoing course in how to make evidence-based decisions and how to take – and evaluate – evidence-based actions.”

I have previously mentioned Bob Sutton as the co-author of Hard Facts, Dangerous Half-Truths And Total Nonsense: Profiting From Evidence-Based Management.

Read, for example, his Management Advice: Which 90% is Crap? (a reference to John Wanamaker’s remark that 50% of the money he spent on advertising was wasted, but he didn’t know which 50%).

Data quality problem in systematic review

J.S. Brooks et al., Development as a Conservation Tool: Evaluating Ecological, Economic, Attitudinal, and Behavioral Outcomes: Review Report, is an excellent attempt at a systematic review. Unfortunately, it runs into a serious problem: a lack of quality data.

Main Results
The results of this review are that (1) very few studies provide adequate quantitative measures of success across multiple outcomes to provide a strong test of the hypotheses, and (2) two separate statistical approaches to the data indicate that market selling opportunities are associated with attitudinal outcomes, and that community involvement in decision making and implementation is associated with behavioral success.

Conclusions

As regards the first objective, it is clear that without far better monitoring schemes in place it is still impossible to provide a systematic evaluation of how different strategies are best suited to different conservation challenges. First, there is a paucity of high quality data. Second, few studies provide quantitative evaluations of success. Third, few studies evaluate across the full range of relevant outcomes – behavioral, attitudinal, economic and ecological…

This and other papers can be found at the Centre for Evidence-Based Conservation at the University of Birmingham, U.K.

Gloom & Doom, Inc.

If a doom-monger came up with a set of recommendations that would mean less money and power for the class of people to whom he or she belongs, I would at least listen with interest.

We shouldn’t listen to the latest “obey us, or face doom” message.

A recent commentary in Nature (subscription necessary) calls for an international body of biodiversity experts.

…it should have a formal link to, and be funded by, governments. This feature, which distinguishes it from previous biodiversity initiatives, would ensure that negotiations within international biodiversity conventions are based on validated scientific information and lead to action at national and global levels.

So, because governments would (use taxpayers’ money to) pay for this, that would ensure action “at national and global levels”? And this “action” would be effective? The authors of the proposal owe a payment for ecosystem services to the owners of the ecosystem that produced whatever they smoked.

The consultation process, supervised by an international steering committee, will last 18 months and proceed in two phases. During the first phase, a number of studies will define the need for, and goals of, an international panel on biodiversity. These studies will examine the global decision-making landscape concerned with biodiversity, analyse successes and failures of biodiversity conservation efforts at different scales, and assess existing international mechanisms that deliver scientific expertise. In a second phase, this information will be used to articulate a set of recommendations for an international panel, which will be presented at a set of regional meetings to seek input from all sectors of society and all regions of the world.

This is not about action, or doing science, this is about creating and getting money for a talking shop.

We urgently need a scientific body of knowledge on conservation. What works, what doesn’t work, and what are the costs and benefits? We can’t do systematic reviews before the field trials have been done. Diverting scientists away from science, and funding them to pontificate on the basis of weak science, is a waste both of scientists and of money.

Evidence-based interventions

Examples of systematic reviews, from the Cochrane Collaboration‘s recent newsletter,

  • Don’t bother with intravenous rehydration for diarrhoea – oral rehydration works just as well

In wealthy countries it is fashionable to prefer intravenous therapy (IVT) over oral rehydration therapy (ORT). A recently published Cochrane Review, however, shows that ORT is just as effective as IVT.

  • Treating water at home is effective in preventing diarrhoea, a major cause of death in young children in developing countries

Supplying clean water to a community helps reduce gastrointestinal diseases, but interventions that kill disease-causing waterborne micro-organisms (or microbes) once the water has reached the home can be even more effective. These are the conclusions of a systematic review that considered the outcome of 38 field trials involving more than 53,000 participants.

Before you can carry out systematic reviews, you must have well-designed studies to review. Then you can generalize from those studies. “38 field trials involving more than 53,000 participants”. Sigh. How many useful, well-designed field trials do we have in conservation?