Defeat snatched out of the jaws of victory: study conclusions and media reporting


Ok, so this post has been hanging out in my draft queue for over a month. It’s time to send it on, even if it is still unpolished.

Few things bother me more than seeing a flawed study interpretation disseminated widely (usually due to sloppiness, sometimes due to trying to “prove” a point).

An “editorial”:http://www.lef.org/magazine/mag2006/jun2006_awsi_01.htm?sourcecode=MCC02E&source=INFEML_MCC02E&key=website written for “??Life Extension??”:http://www.lef.org, which was recently brought to my attention, discusses the media reporting of several recent studies that seemed to show that certain dietary supplements are worthless. However, the article points out that in many cases, the text of the publication doesn’t match the headline. (The editorial also criticizes some features of these studies, but I’m not knowledgeable enough to comment on whether glucosamine sulfate is the superior form of the supplement, for example.)

h3. Public spin on restricted-access data

A while back “Orac quoted an idea”:http://oracknows.blogspot.com/2006/01/how-is-academic-medicine-perceived-by.html (third quote box) that I think would help alleviate this problem: to make public the manuscripts that are the subject of press releases. While Orac comments that he doesn’t think it would help too much (but probably a little bit), I think that as more people blog as professionals, this step will help out a lot more. It also doesn’t take an advanced degree to point out that the absence of a statistically significant result on the efficacy of a dietary supplement does not make the supplement “worthless.”
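To make the point concrete, here is a quick sketch (the sample size and effect size are hypothetical, chosen only for illustration): if a supplement has a real but modest effect, a small trial will often fail to reach p < 0.05 simply because it is underpowered. A “negative” result in such a trial is not evidence of worthlessness.

```python
import random, math

random.seed(0)

def simulate_trial(n, effect, sd=1.0):
    """Two-arm trial: treatment mean shifted by `effect` standard deviations."""
    control = [random.gauss(0.0, sd) for _ in range(n)]
    treated = [random.gauss(effect, sd) for _ in range(n)]
    diff = sum(treated) / n - sum(control) / n
    se = sd * math.sqrt(2.0 / n)  # known-variance z-test, for simplicity
    return diff, diff / se        # observed difference and z-statistic

# A real but modest effect (0.3 SD) studied with only 30 subjects per arm:
trials = [simulate_trial(30, 0.3) for _ in range(10_000)]
power = sum(abs(z) > 1.96 for _, z in trials) / len(trials)
print(f"fraction of trials reaching p < 0.05: {power:.2f}")
```

Under these assumptions, only roughly a fifth of the trials come out “significant,” even though the effect is genuinely there; most individual studies look negative.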

Of course, since I’m a statistician, I’d prefer to go all the way and see the data, at least on the subjects I’m interested in. Earlier, I “reverse-engineered”:# a graph published by the Geier father-and-son team and as a result was able to scrutinize their results carefully. Fortunately, their sources of data are public, and I know other bloggers have looked at those same sources of data. I think this has helped the discussion.

However, other stories about echinacea and homeopathy are stashed behind subscription barriers. Sometimes, you can get a reprint of the paper for anywhere from $12 to $30. (One time, I actually shelled out the $30, but that was for an article that directly related to a paper I was writing at the time.) Depending on the journal, I can get a copy through the university library, and I suppose I could head down during one of the two hour-and-a-half blocks of spare time that I have each year (park in the parking deck a half mile away, walk over to the health sciences library, locate the article, photocopy it, and walk back to my car).

Note that this isn’t a call to open health journals for free, though there may be an argument for such. However, given the disparity between media reporting, press releases, and the actual content of a study, I think it would be a fair request to open up journal articles that are reported on in the popular press.

h3. Financing and conflicts of interest

I’ve heard both CAM advocates and CAM skeptics hurl accusations of conflict of interest. There seems to be an assumption, the “article I earlier referenced”:http://www.lef.org/magazine/mag2006/jun2006_awsi_01.htm?sourcecode=MCC02E&source=INFEML_MCC02E&key=website included, that financing by a particular entity (e.g. a pharma company or a dietary supplement company) influences a study’s results. This is a well-known problem, and the professional journals and the NIH are taking measures to mitigate it. For example, the NIH requires a disclosure of any potential conflicts of interest. However, funding by a particular entity doesn’t invalidate or confirm the results of a particular study.

As long as the conduct of a public health study is transparent, the study should be scrutinized and criticized on the basis of its merits and deficiencies. (Ironically, shortly after I started this entry, Orac made the same point.) This is one reason we spell out everything beforehand in a protocol and the various plans we produce for a study. This still leaves the problem of publication bias, in which negative results tend not to be published, but positive ones are. Medical journals have proposed one solution to this problem: a registry, so that if a company wishes to publish any results in a journal, it must register all of its studies with that journal. I think this is a pretty good idea, and furthermore I think this registry should be made public.
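Publication bias is easy to demonstrate with a toy simulation (all numbers here are hypothetical, chosen only for illustration). Suppose a treatment has no effect at all, many small studies are run, and only the statistically significant ones get published. The full set of studies averages out to nothing, but the published subset shows a sizable apparent effect:

```python
import random, math

random.seed(1)

def one_study(n=50, true_effect=0.0, sd=1.0):
    """One small two-arm study of a treatment with no real effect."""
    control = [random.gauss(0.0, sd) for _ in range(n)]
    treated = [random.gauss(true_effect, sd) for _ in range(n)]
    diff = sum(treated) / n - sum(control) / n
    z = diff / (sd * math.sqrt(2.0 / n))
    return diff, abs(z) > 1.96  # observed effect, and whether p < 0.05

studies = [one_study() for _ in range(20_000)]
# A registry would preserve all of these; journals see only the "significant" ones.
published = [d for d, significant in studies if significant]
all_mean = sum(d for d, _ in studies) / len(studies)
pub_mean = sum(abs(d) for d in published) / len(published)
print(f"mean effect, all studies:      {all_mean:+.3f}")
print(f"mean |effect|, published only: {pub_mean:+.3f}")
```

The registry idea attacks exactly this gap: with every study registered up front, the unpublished negatives can’t silently disappear from the record.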

At any rate, until these (admittedly imperfect) solutions are in place, I tend not to listen to arguments whose sole point is conflict of interest. That may raise my suspicions, but it does not by itself undermine my confidence in any given study.

“This article”:http://msnbc.msn.com/id/12275329/ seemingly disagrees with my point. However, there are a lot of parameters from which to choose when designing a study, and I maintain that transparency in choosing these parameters will allow an objective evaluation of the study without having to rely on funding arguments.

h3. Editorializing

In some cases, an editorial accompanies the objective results of the study. This happened with an echinacea study in June 2005 in NEJM and with the “End of Homeopathy” meta-analysis in ??The Lancet?? in August 2005. I personally can do without them, especially the ones that call for an end to debate or an end to study on a particular topic.
