Reanalysis of Epidemiologic Studies – Not Intrinsically WOEful

A recent student law review article discusses reanalyses of epidemiologic studies, an important and overlooked topic in the jurisprudence of scientific evidence. Alexander J. Bandza, “Epidemiological-Study Reanalyses and Daubert: A Modest Proposal to Level the Playing Field in Toxic Tort Litigation,” 39 Ecology L.Q. 247 (2012).

In the Daubert case itself, the Ninth Circuit, speaking through Judge Kozinski, avoided the methodological issues raised by Shanna Swan’s reanalysis of the Bendectin epidemiologic studies by assuming, arguendo, its validity, and holding that the small relative risk yielded by the reanalysis would not support a jury verdict of specific causation. Daubert v. Merrell Dow Pharm., Inc., 43 F.3d 1311, 1317–18 (9th Cir. 1995).
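The arithmetic behind that holding deserves a word. Under the familiar “doubling of the risk” rationale that the court drew on (a heuristic, not a law of nature, and one that assumes the relative risk estimate is itself valid), the probability that a given exposed case is attributable to the exposure is approximated by the attributable fraction:

\[
\mathrm{AF} \;=\; \frac{RR - 1}{RR}, \qquad \mathrm{AF} > 0.5 \;\Longleftrightarrow\; RR > 2 .
\]

A relative risk of, say, 1.3 yields an attributable fraction of roughly 23 percent, well short of the “more likely than not” threshold; hence a reanalysis that produced only a modestly elevated relative risk could not carry the plaintiffs’ burden on specific causation, even if its methodology were assumed to be sound.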

There is much that can, and should, be said about reanalyses in litigation and in the scientific process, but Bandza never really gets down to the business at hand. His 36-page article curiously does not begin to address reanalysis until the bottom of its 20th page. The first half of the article, and then some, reviews time-worn insights and factoids about scientific evidence. Finally, at page 266, the author introduces and defines reanalysis:

“Reanalysis occurs ‘when a person other than the original investigator obtains an epidemiologic data set and conducts analyses to evaluate the quality, reliability or validity of the dataset, methods, results or conclusions reported by the original investigator.’”

Bandza at 266 (quoting Raymond Neutra et al., “Toward Guidelines for the Ethical Reanalysis and Reinterpretation of Another’s Research,” 17 Epidemiology 335, 335 (2006)).

Bandza correctly identifies some of the bases for judicial hostility to reanalyses. For instance, some courts are troubled or confused when expert witnesses disagree with, or reevaluate, the conclusions of a published article. The witnesses’ conclusions may not themselves be published or peer reviewed, and so the proffered testimony fails one of the Daubert factors. Bandza correctly notes that peer review is greatly overrated by judges. Bandza at 270. I would add that peer review is an inappropriate proxy for validity, a “test” that reflects a distrust of the unpublished. Unfortunately, this judicial factor ignores the poor quality of much of what is published, and the extreme variability in the peer-review process. Judges overrate peer review because they are desperate for a proxy for the validity of the studies relied upon, one that will allow them to pass their gatekeeping responsibility on to the jury. Furthermore, the authors’ own conclusions are hearsay, and their qualifications are often not fully before the court. What is important is the opinion of the expert witness, who can be cross-examined and challenged. See “Follow the Data, Not the Discussion.” What counts is the validity of the expert witness’s reasoning and inferences.

Bandza’s article, which by its title advertises itself to be about reanalyses, gives only a few examples of reanalyses, without much detail. He notes concerns that reanalyses may impugn the reputations of published scientists and burden them with defending their data. Who would have it any other way? After this short discussion, the article careens into a discussion of “weight of the evidence” (WOE) methodology. Bandza tells us that the rejection of reanalyses in judicial proceedings “implicitly rules out using the weight-of-the-evidence methodology often appropriate for, or even necessary to, scientific analysis of potentially toxic substances.” Bandza at 270. This argument, however, is one sustained non sequitur. WOE is defined in several ways, but none of the definitions requires or suggests the incorporation of reanalyses. Reanalyses raise reliability and validity issues regardless of whether an expert witness incorporates them into a WOE assessment. Yet Bandza tells us that the rejection of reanalyses “Implicitly Ignores the Weight-of-the-Evidence Methodology Appropriate for the Scientific Analysis of Potentially Toxic Substances.” Bandza at 274. This conclusion simply does not follow from the nature of WOE methodology or of reanalyses.

Bandza’s ipse dixit raises the independent issue of whether WOE methodology is appropriate for scientific analysis. WOE is described as embraced or used by regulatory agencies, but that description hardly recommends the methodology as the basis for a scientific, as opposed to a regulatory, conclusion. Furthermore, Bandza ignores the ambiguity and variability of WOE by referring to it as a methodology when, in reality, WOE describes a wide variety of methods of reasoning to a conclusion. Bandza cites Douglas Weed’s article on WOE, but fails to come to grips with the serious objections Weed raises to the use of WOE methodologies. Douglas Weed, “Weight of Evidence: A Review of Concept and Methods,” 25 Risk Analysis 1545, 1546–52 (2005) (describing the vagueness and imprecision of WOE methodologies). See also “WOE-fully Inadequate Methodology – An Ipse Dixit By Another Name.”

Bandza concludes his article with a hymn to the First Circuit’s decision in Milward v. Acuity Specialty Products Group, Inc., 639 F.3d 11 (1st Cir. 2011). Plaintiffs’ expert witness, Dr. Martyn Smith, claimed to have performed a WOE analysis, which in turn was based upon a reanalysis of several epidemiologic studies. True, true, and immaterial. The reanalyses were not inherently a part of a WOE approach. Presumably, Smith reanalyzed some of the epidemiologic studies because he believed that the data as presented did not support his desired conclusion. Given the motivations at work, the district court in Milward was correct to look skeptically and critically at the reanalyses.

Bandza notes that there are procedural and evidentiary safeguards in federal court against unreliable or invalid reanalyses of epidemiologic studies. Bandza at 277. Yes, there are safeguards, but they help only when they are actually used. The First Circuit in Milward reversed the district court for looking too closely at the reanalyses, spouting the chestnut that the objections went to the weight, not the admissibility, of the evidence. Bandza embraces the rhetoric of the Circuit, but he offers no description or analysis of the liberties that Martyn Smith took with the data, or of the reasonableness of Smith’s reliance upon the reanalyzed data.

There is no necessary connection between WOE methodologies and reanalyses of epidemiologic studies. Reanalyses can be done properly to support or to deconstruct the conclusions of published papers. As Bandza points out, some reanalyses may themselves go on to be peer reviewed and published. Validity is the key, and WOE methodologies have little to do with the process of evaluating the original or the reanalyzed study.