Thursday, October 30, 2008

Hans-Heinrich Reckeweg Prizes and Wormbush for Your Heart

The first of the heel.ca articles I read was the "Hans-Heinrich Reckeweg Prizes." The link points to a Word document. It doesn't contain any actual research; it's a list of prizes awarded to people for research, generally involving products sold by heel.ca. One of the awards refers to its study being published in Cancer.

In short, it's a catalog of back-patting and self-promotion by a homeopathic manufacturer.

The next link held more promise. It's to a PDF described as "Efficacy of a homeopathic Crataegus preparation compared with usual therapy for mild (NYHA II) cardiac insufficiency: results of an observational cohort study."

The first thing I noticed is that the PDF appears to be a scan of a printed document that wasn't put through OCR. This means the PDF contains an image of each page instead of text that can be copied and pasted for quoting purposes. It also means the PDF can't be searched.
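
If you want to check this sort of thing yourself, here's a rough sketch of one way to do it in Python with the pypdf library (the filename is just a placeholder): if a page yields no extractable text, you're almost certainly looking at a scanned image.

# Quick check: does a PDF contain real, selectable text or just page images?
# Uses the pypdf library; "crataegus_study.pdf" is a placeholder filename.
from pypdf import PdfReader

reader = PdfReader("crataegus_study.pdf")
for page_number, page in enumerate(reader.pages, start=1):
    text = (page.extract_text() or "").strip()
    if text:
        print(f"Page {page_number}: {len(text)} characters of extractable text")
    else:
        print(f"Page {page_number}: no extractable text -- probably a scanned image")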

The study consisted of only 212 subjects. While 216 had been enrolled, four had to be disqualified. There was no placebo arm to the study, which is in and of itself a major red flag. The tiny sample size relegates this study to being a "pilot study," the kind of thing pharmaceutical companies do to determine whether further research is even warranted. Small sample sizes like this are problematic, as small to moderate effects are easily swallowed by the mathematical margin of error inherent in all research, and one or two anomalies can skew the results in a significant way. I want to reiterate: Bayer doesn't release a drug to market based on a study with 212 participants and no placebo. Scientists wouldn't consider such a study to be proof of anything other than the need for a better study.
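
To put some rough numbers behind that complaint, here's a back-of-the-envelope sketch, my own illustration and not anything from the study, using the standard formula for how many patients per arm a two-sample comparison needs to reliably detect an effect. The significance level, power, and effect sizes are assumptions I picked for illustration.

# Rough per-arm sample size for a two-sample comparison of means,
# using the normal approximation: n = 2 * ((z_alpha/2 + z_beta) / d)^2,
# where d is the standardized effect size.
# Alpha, power, and the effect sizes below are my own illustrative choices.
from scipy.stats import norm

def per_arm_sample_size(effect_size, alpha=0.05, power=0.80):
    z_alpha = norm.ppf(1 - alpha / 2)  # two-sided test
    z_beta = norm.ppf(power)
    return 2 * ((z_alpha + z_beta) / effect_size) ** 2

for d in (0.2, 0.3, 0.5):  # small to moderate standardized effects
    print(f"effect size {d}: ~{per_arm_sample_size(d):.0f} patients per arm")

For the smaller effect sizes, that works out to roughly 175 to 400 patients per arm, against about 100 per arm in this study.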

The study was conducted across 27 different locations. As noted in the study, "the principal investigator had no control over the treatment assignment and there might have been large differnces (sic) in observed co-variates between treatment groups." Emphasis mine. In other words, not even the researchers are pretending these 212 people were evaluated consistently across the 27 locations used for the study.

The first page claims the paper was published in the "European Journal of Heart Failure 5 (2003) 319-326." This appears to be a publication of the "European Society of Cardiology."

The URL at the top of the document, "elsevier.com/locate/heafei," results in a "Page not Found" error from elsevier.com. A search on elsevier.com revealed the full text of the article could be purchased from ScienceDirect.

The study compares 110 people given "a homeopathic Crataegus preparation" (wormbush) to 102 given "usual therapy". Doctors used their own discretion in the dosages used for the "usual therapy". How many variables did these guys want in their study?

What was the "usual therapy" used as a baseline? "ACE Inhibitor / diuretics". Here's where the study really falls apart. Of the 102 people given the "usual therapy," 52% were given one of 7 different ACE inhibitors, 6% got one of 6 different diuretics, and the remaining 41.2% got some combination of both. The homeopathic arm was a bit more consistent, with 80% getting the recommended dosage and 15.4% getting half that. No word on what the remainder were given.

The study compared two different dosages of a homeopathic treatment against unknown cocktails of 13 possible drugs over which the study had no control. Doctors at 27 different locations chose the drugs and dosages.

All of the patients were medically stable but in need of treatment for mild cardiac insufficiency. None of them were receiving treatment before the study. All of them were enrolled in an outpatient treatment program for the duration of the study.

It's possible that the improvement seen by these subjects was the result not of a homeopathic treatment or an unknown cocktail of drugs, but of receiving regular advice and direction from a cardiac specialist. A placebo arm would have let us see whether this was a likely explanation. Of course, proving the homeopathic treatment works wasn't the goal of the study; the goal was only to show it wasn't inferior to "usual therapy."
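
Here's a toy simulation, with numbers I made up entirely for illustration, of why that matters: if both arms share the same non-specific improvement (attention from a specialist, regression to the mean, the natural course of the condition), a head-to-head comparison looks like a tie whether or not either treatment actually does anything.

# Toy simulation (made-up numbers) of why a head-to-head trial with no
# placebo arm can't tell "both treatments work" from "neither does."
import random

random.seed(1)

def mean_improvement(n, shared_effect=5.0, treatment_effect=0.0, noise=8.0):
    """Average change in some score: a shared, non-specific effect
    (attention, regression to the mean, natural course) plus any real
    treatment effect plus random noise."""
    changes = [shared_effect + treatment_effect + random.gauss(0, noise)
               for _ in range(n)]
    return sum(changes) / n

homeopathy = mean_improvement(110, treatment_effect=0.0)  # no specific effect
usual_care = mean_improvement(102, treatment_effect=0.0)  # no specific effect either

print(f"homeopathy arm: {homeopathy:.1f}")
print(f"usual therapy arm: {usual_care:.1f}")
# Both arms "improve" by about the same amount even though neither treatment
# did anything. Only a placebo arm would expose the shared, non-specific effect.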

The study abstract concluded homeopathy wasn't inferior by most measures, though it did note the "usual treatment" was better at reducing blood pressure. A closer look at the actual study, however, reveals that they had to use far looser criteria to reach that conclusion. Their original, more stringent "non-inferiority" requirements have homeopathy coming out as "non-inferior" in only 7 of 15 tests, many of which were subjective reports by doctors or patients. The actual improvements were small overall. The study even noted that "Baseline BP, HR and performance test scores did not differ significantly between treatments." The study's "Table 2" shows that the bulk of the effects fall within what should have been considered the study's margin of error.
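
For anyone unfamiliar with how "non-inferiority" gets declared, here's a minimal sketch with placeholder numbers of my own, not the study's data. The new treatment counts as non-inferior only if the confidence interval for the difference stays above a pre-chosen margin, which is exactly why loosening the margin makes the claim easier to reach.

# Minimal sketch of a non-inferiority check on a difference in mean change.
# All numbers are placeholders, not the study's data; the point is that the
# verdict depends entirely on the margin chosen in advance.
def non_inferior(mean_diff, se_diff, margin, z=1.96):
    """Declare non-inferiority if the lower 95% confidence bound on
    (new treatment minus comparator) stays above -margin."""
    lower_bound = mean_diff - z * se_diff
    return lower_bound > -margin

mean_diff = -1.5  # new treatment slightly worse on average (placeholder)
se_diff = 1.0     # standard error of the difference (placeholder)

print(non_inferior(mean_diff, se_diff, margin=2.0))  # False with a strict margin
print(non_inferior(mean_diff, se_diff, margin=4.0))  # True with a loose margin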

The study was small. The abstract's claims aren't supported by the study's data. There was no placebo group, so it's entirely possible that the minimal improvements noted were the result of having a doctor pay closer attention to the patient's diet and exercise. All this study really demonstrated is that there may not be a need for drugs in mild cardiac insufficiency, and even that conclusion would require a larger, better-designed study.

Because there HAS to be some humor in all this.

Obama Pictures and McCain Pictures
see Sarah Palin pictures


