Monday, May 9, 2011

Vaccines don't raise IMR. Period.

David Gorski wrote a great article on the blog Science-Based Medicine ripping apart the latest in a series of poorly done studies linking infant mortality rate (IMR) to vaccines.  Since Dr. Gorski is a little long-winded, I'd like to pull out the really big points for you here.  If you want to look at the study in question (which appeared in a peer-reviewed journal), you can find it here.  Of course it's free; antivaxxers are happy to toss money at a study like this to make it open access for everyone.

As Gorski points out, this was a ridiculously simple paper.  The language was simple, the methods were simple, the entire process was simple.  I could have written most of this paper (sans the research/sources) within a week, and I'm not even a grad student yet.  They took some data about infant mortality in the US and other countries, compared it to the number of vaccine doses given, and produced this lovely Excel chart.

[Chart from the paper: scatter plot of IMR against number of vaccine doses, with a linear regression line drawn through it.]
The thing that struck me when I looked at this chart was how all over the place the data points were.  If you ignore the line they drew right through the middle, it doesn't strike me as a great candidate for a linear fit.  As Gorski points out:

Be that as it may, I looked at the data myself and played around with it. One thing I noticed immediately is that the authors removed four nations, Andorra, Liechtenstein, Monaco, and San Marino, the justification being that because they are all so small, each nation only recorded less than five infant deaths. Coincidentally, or not, when all the data are used, r² = 0.426, whereas when those four nations are excluded, r² increases to 0.494, meaning that the goodness of fit improved. Even so, it’s not that fantastic, certainly not enough to be particularly convincing as a linear relationship.
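It's worth seeing just how mechanical that kind of improvement is. Here's a minimal sketch with entirely made-up numbers (this is not the paper's data): dropping the few points that fit the line worst will tend to raise r² almost by construction.

```python
# Minimal sketch with made-up numbers: removing the worst-fitting
# points from a regression inflates r^2 almost by construction.
import numpy as np
from scipy.stats import linregress

rng = np.random.default_rng(0)

# Hypothetical predictor (vaccine doses) and noisy response (IMR)
doses = np.linspace(12, 26, 34)
imr = 2.0 + 0.15 * doses + rng.normal(0, 1.0, doses.size)

full = linregress(doses, imr)
print(f"all 34 points:       r^2 = {full.rvalue**2:.3f}")

# Drop the four points with the largest residuals, a stand-in for
# excluding the four microstates, and refit
resid = np.abs(imr - (full.intercept + full.slope * doses))
keep = np.argsort(resid)[:-4]
trimmed = linregress(doses[keep], imr[keep])
print(f"four points removed: r^2 = {trimmed.rvalue**2:.3f}")
```

That alone doesn't prove the authors excluded those nations to juice the fit, but it shows why an exclusion that happens to raise r² deserves scrutiny.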
The data isn't all that convincing, and worse yet, it isn't all that rigorous.  The paper pulled all of its information from a single 2009 report: a single year's worth of data, two years old, with many countries excluded.  Gorski points this out better than I could:
Miller and Goldman only looked at one year’s data. There are many years worth of data available; if such a relationship between IMR and vaccine doses is real, it will be robust, showing up in multiple analyses from multiple years’ data. Moreover, the authors took great pains to look at only the United States and the 33 nations with better infant mortality rates than the U.S. There is no statistical rationale for doing this, nor is there a scientific rationale. Again, if this is a true correlation, it will be robust enough to show up in comparisons of more nations than just the U.S. and nations with more favorable infant mortality rates. Basically, the choice of data analyzed leaves a strong suspicion of cherry picking. 
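The robustness check Gorski describes is cheap to run. Here's a sketch of what it would look like; the loader below is a hypothetical stand-in for a real pull of WHO/OECD figures, filled with synthetic numbers so the script runs.

```python
# Sketch of a multi-year robustness check. load_imr_and_doses is a
# hypothetical helper; its synthetic numbers merely stand in for a
# real WHO/OECD data pull.
import numpy as np
from scipy.stats import linregress

rng = np.random.default_rng(1)

def load_imr_and_doses(year):
    # Placeholder data with no built-in relationship
    doses = rng.uniform(12, 26, 34)
    imr = rng.uniform(2, 7, 34)
    return doses, imr

# A real correlation should show up year after year, not just in 2009
for year in range(2000, 2010):
    doses, imr = load_imr_and_doses(year)
    fit = linregress(doses, imr)
    print(f"{year}: r^2 = {fit.rvalue**2:.3f}  p = {fit.pvalue:.3g}")
```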
It's possible that they didn't cherry-pick, in which case they simply weren't rigorous enough.  If the only result of this study is that it gets others to look at more of the data, so much the better.  Still, there's one last big problem Gorski cites with the paper that makes the data look worse still.  He quotes Bernadine Healy, M.D., who says:
First, it’s shaky ground to compare U.S. infant mortality with reports from other countries. The United States counts all births as live if they show any sign of life, regardless of prematurity or size. This includes what many other countries report as stillbirths. In Austria and Germany, fetal weight must be at least 500 grams (1 pound) to count as a live birth; in other parts of Europe, such as Switzerland, the fetus must be at least 30 centimeters (12 inches) long. In Belgium and France, births at less than 26 weeks of pregnancy are registered as lifeless. And some countries don’t reliably register babies who die within the first 24 hours of birth. Thus, the United States is sure to report higher infant mortality rates. For this very reason, the Organization for Economic Cooperation and Development, which collects the European numbers, warns of head-to-head comparisons by country.
Infant mortality in developed countries is not about healthy babies dying of treatable conditions as in the past. Most of the infants we lose today are born critically ill, and 40 percent die within the first day of life. The major causes are low birth weight and prematurity, and congenital malformations. As Nicholas Eberstadt, a scholar at the American Enterprise Institute, points out, Norway, which has one of the lowest infant mortality rates, shows no better infant survival than the United States when you factor in weight at birth.
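Definitions drive the numbers here. As a toy illustration (all figures below are made up), the very same cohort of births yields two different "infant mortality rates" depending on which live-birth rule you apply:

```python
# Made-up cohort illustrating Healy's point: the same births produce
# different reported IMRs under different live-birth definitions.
import numpy as np

rng = np.random.default_rng(2)
n = 100_000

# Synthetic birth weights in grams
weight = rng.normal(3400, 600, n).clip(300, 5500)
# Synthetic mortality risk, concentrated among the smallest babies
p_death = np.where(weight < 500, 0.8,
                   np.where(weight < 1500, 0.1, 0.003))
died = rng.random(n) < p_death

us_style = died.mean() * 1000      # count every live birth (US rule)
viable = weight >= 500             # 500 g live-birth cutoff
eu_style = died[viable].mean() * 1000

print(f"all births counted: {us_style:.1f} deaths per 1,000")
print(f"500 g cutoff:       {eu_style:.1f} deaths per 1,000")
```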
Go ahead.  Go back and read Healy's quote again.  In fact, I'll highlight the spot that struck me as most noteworthy: "Norway, which has one of the lowest infant mortality rates, shows no better infant survival than the United States when you factor in weight at birth."  In other words, once you compare like with like, infant survival in the US is no worse than in a country with one of the lowest reported IMRs, which flies directly in the face of the paper's data.  Not that I'd suggest using these data to begin with.  Drawing conclusions from raw IMRs is a load of rubbish when there's no consistent definition of what counts as an infant death, and the authors of the paper certainly didn't make any effort to clarify that.  They do like to make some pretty hefty conjectures about SIDS and its connection to vaccines, though.  From the paper itself:
Although some studies were unable to find correlations between SIDS and vaccines, there is some evidence that a subset of infants may be more susceptible to SIDS shortly after being vaccinated. For example, Torch found that two-thirds of babies who had died from SIDS had been vaccinated against DPT (diphtheria–pertussis–tetanus toxoid) prior to death. Of these, 6.5% died within 12 hours of vaccination; 13% within 24 hours; 26% within 3 days; and 37%, 61%, and 70% within 1, 2, and 3 weeks, respectively.
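Notice what's missing from those percentages: a baseline.  DPT doses are given several times in the first six months of life, right when SIDS risk peaks, so even if vaccination had no effect whatsoever, a large share of SIDS deaths would land within days or weeks of a dose.  A toy null simulation makes the point (every number below is an assumption for illustration, roughly a 2/4/6-month schedule):

```python
# Toy null model: SIDS deaths with NO vaccine effect still cluster
# shortly after doses, because doses and SIDS risk share the same
# age window. Schedule days and the age distribution are assumptions.
import numpy as np

rng = np.random.default_rng(3)
n = 100_000

# Synthetic ages at death (days), peaking around 2-4 months
death_age = rng.gamma(shape=4.0, scale=25.0, size=n)

dose_days = np.array([61, 122, 183])   # ~2, 4, 6 month DPT schedule

# Day of the most recent dose received before death (-inf if none)
last_dose = np.where(death_age[:, None] >= dose_days,
                     dose_days, -np.inf).max(axis=1)
gap = death_age - last_dose            # inf if death preceded dose 1

for window in (3, 7, 14, 21):
    print(f"died within {window:2d} days of a dose: "
          f"{np.mean(gap <= window):.0%}")
```

Under this null model you still get Torch-style percentages, which is exactly why his numbers, with no denominator and no control group, tell us nothing.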

In the interest of being fair and not just leaning on David Gorski's analysis, I decided to give them the benefit of the doubt about this Torch study and look it up.  Or rather, I tried to.  The report is from 1982, and most of my sources only go back to 1985 at best.  A Google search turned up a few articles from anti-vaccination sites accusing scientists of silencing Torch because he "dared to use anecdotal data."  Almost everything I found was from anti-vaccination sites and generally read something like this:
Torch's report provoked an uproar in the American Academy of Pediatrics. At a hastily arranged press conference he was soundly chastised for using "anecdotal data," meaning (will you believe it?) that he actually interviewed the families concerned!
That's the best I have to offer: a few references to his work, but very few results for the work itself.  I'm not the best at searching and don't have all the tools others have to find this information, so it's probably out there somewhere; I'm just not able to find it myself.  Besides, I tend to be hesitant with any controversial data older than 1990 or so.  The older it gets, the more likely it is that there's a newer study with better information.  It's hard to say, though.  What I can say is that the American Academy of Pediatrics was right to react as they did if he used anecdotal evidence.  Anecdotal evidence is a great place to start research, but it's a horrible place to end it.  It raises the questions; it does not give the answers.
