TORTINI

For your delectation and delight, desultory dicta on the law of delicts.

The Cherry-Picking Fallacy in Synthesizing Evidence

June 15th, 2012

What could be wrong with picking cherries?  At the end of the process you have cherries, and if you do it right, you have all ripe, and no rotten, cherries.  Your collection of ripe cherries will, however, be unrepresentative of the universe of cherries, although at least we understand how and why your cherries were selected.

Elite colleges pick the best high school students; leading law schools pick the top college students; and top law firms and federal judges cherry-pick the best students of the best law schools.  Lawyers are all too comfortable with “cherry picking.”  Of course, the cherry picking in these examples proceeds by at least some objective criteria, which can be stated in advance of the selection.

In litigation, each side is expected to “cherry pick” the favorable evidence, and to ignore or flyblow the contrary evidence.  Judges are thus often complacent about selectivity in the parties’ and their witnesses’ presentation of evidence.  In science, this kind of adversarial selectivity is a sure way to inject bias and subjectivity into claims of knowledge.  The development of the systematic review has been driven, in large measure, by the widespread recognition that studies cannot be evaluated with post hoc, subjective evaluative criteria. See Cynthia D. Mulrow, Deborah J. Cook & Frank Davidoff, “Systematic Reviews: Critical Links in the Great Chain of Evidence,” 126 Ann. Intern. Med. 389 (1997).

The Internet Encyclopedia of Philosophy describes “cherry picking” as a fallacy, “a kind of error in reasoning.”  Cherry-picking the evidence, also known as the fallacy of “suppressed evidence,” is described as follows:

“[i]ntentionally failing to use information suspected of being relevant and significant is committing the fallacy of suppressed evidence. This fallacy usually occurs when the information counts against one’s own conclusion. * * * If the relevant information is not intentionally suppressed but rather inadvertently overlooked, the fallacy of suppressed evidence also is said to occur, although the fallacy’s name is misleading in this case.”

Bradley Dowden, “Suppressed Evidence,” Internet Encyclopedia of Philosophy (last updated Dec. 31, 2010).

Cherry picking is a mainstay of the litigator’s rhetoric, and many judges simply do not understand what is so wrong with each side’s selecting the studies that it wishes to emphasize.  Whatever the acceptability of lawyers’ cherry picking in the presentation of evidence, the practice is antithetical to scientific methodology.  See “Cherry picking (fallacy),” Wikipedia (describing cherry picking as pointing to data that appear to confirm one’s opinion, while ignoring contradictory data) (last visited June 14, 2012).

Given the pejorative connotations of “cherry picking,” no one should be very surprised that lawyers and judges couch their Rule 702 arguments and opinions in terms of whether expert witnesses engaged in this fruitful behavior.  Although I had heard plaintiffs’ and defendants’ counsel use the phrase, I only recently came across it in a judicial opinion.  Since the phrase nicely describes a fallacious form of reasoning, I thought it would be helpful to collect pertinent cases that describe the fallaciousness of fruit-pickin’ expert witness testimony.

United States Court of Appeals

Barber v. United Airlines, Inc., 17 Fed.Appx. 433, 437 (7th Cir. 2001) (affirming exclusion of “cherry-picking” expert witness who failed to explain why he ignored certain data while accepting others)

District Courts

Dwyer v. Sec’y of Health & Human Servs., No. 03-1202V, 2010 WL 892250 (Fed. Cl. Spec. Mstr. Mar. 12, 2010)(recommending rejection of thimerosal autism claim)(“In general, respondent’s experts provided more responsive answers to such questions.  Respondent’s experts were generally more careful and nuanced in their expert reports and testimony. In contrast, petitioners’ experts were more likely to offer opinions that exceeded their areas of expertise, to “cherry-pick” data from articles that were otherwise unsupportive of their position, or to draw conclusions unsupported by the data cited… .”)

In re Bausch & Lomb, Inc., 2009 WL 2750462, at *13 (D.S.C. 2009)(“Dr. Cohen did not address [four contradictory] studies in her expert reports or affidavit, and did not include them on her literature reviewed list [. . .] This failure to address this contrary data renders plaintiffs’ theory inherently unreliable.”)

Rimbert v. Eli Lilly & Co., No. 06-0874, 2009 WL 2208570, at *19 (D.N.M. July 21, 2009)(“Even more damaging . . . is her failure to grapple with any of the myriad epidemiological studies that refute her conclusion.”), aff’d, 647 F.3d 1247 (10th Cir. 2011) (affirming exclusion but remanding to permit plaintiff to find a new expert witness)

In re Bextra & Celebrex Prod. Liab. Litig., 524 F. Supp.2d 1166, 1176, 1179, 1181, 1184 (N.D. Cal. 2007) (criticizing plaintiffs’ expert witnesses for “cherry-picking studies”); id. at 1181 (“these experts ignore the great weight of the observational studies that contradict their conclusion and rely on the handful that appear to support their litigation-created opinion.”)

LeClerq v. Lockformer Co., No. 00 C 7164, 2005 U.S. Dist. LEXIS 7602, at *15 (N.D. Ill. Apr. 28, 2005) (holding that expert witness’s “cherry-pick[ing] the facts he considered to render his opinion, and such selective use of facts fail[s] to satisfy the scientific method and Daubert.”)(internal citations and quotations omitted)

Holden Metal & Aluminum Works v. Wismarq Corp., No. 00 C 0191, 2003 WL 1797844, at *2 (N.D. Ill. Apr. 2, 2003).

State Courts

Betz v. Pneumo Abex LLC, 2012 WL 1860853, at *16 (Pa. May 23, 2012)(“According to Appellants, moreover, the pathologist’s self-admitted selectivity in his approach to the literature is decidedly inconsistent with the scientific method. Accord Brief for Amici Scientists at 17 n.2 (“‘Cherry picking’ the literature is also a departure from ‘accepted procedure’.”)”).

George v. Vermont League of Cities and Towns, 2010 VT 1, 993 A.2d 367, 398 (Vt. 2010)(expressing concern about how and why plaintiff’s expert witnesses selected some studies to include in their “weight of evidence” methodology; without an adequate explanation of the selection and weighting criteria, the choices seemed arbitrary)

Scaife v. AstraZeneca LP, 2009 WL 1610575, at *8 (Del. Super. 2009) (“Simply stated, the expert cannot accept some but reject other data from the medical literature without explaining the bases for her acceptance or rejection.”)

In re Bextra & Celebrex, 2008 N.Y. Misc. LEXIS 720, *20, 239 N.Y.L.J. 27 (2008) (holding that New York’s Frye rule requires proponent to show that its expert witness had “look[ed] at the totality of the evidence and [did] not ignore contrary data.”); see also id. at *36 (“Moreover, out of 32 studies (29 published) cited by defendants, plaintiffs chose only 8 to plead their case.  This smacks of ‘cherry-picking,’ skewing their analysis by only looking at the helpful studies. Such practice contradicts the accepted method for an expert’s analysis of epidemiological data.”)

Bowen v. E.I. DuPont de Nemours & Co., 906 A.2d 787, 797 (Del. 2006) (noting that expert witnesses cannot ignore studies contrary to their opinions)

Selig v. Pfizer, Inc., 185 Misc. 2d 600, 607, 713 N.Y.S.2d 898 (Sup. Ct. N.Y. Cty. 2000) (holding that expert witness failed to satisfy Frye test’s requirement of following an accepted methodology when he ignored studies contrary to his opinion), aff’d, 290 A.D.2d 319, 735 N.Y.S.2d 549 (1st Dep’t 2002)

******************

Most, but not all, of the caselaw recognizes that it is fallacious for an expert witness to engage in ad hoc selectivity in choosing the studies upon which to rely.  In the following cases, the cherry-picking was identified, but acquiesced in by the judges.

McClellan v. I-Flow Corp., 710 F. Supp. 2d 1092, 1114 (D. Ore. 2010)(discussing cherry picking but rejecting “document by document” review)(“Finally, defendants contend that plaintiffs’ experts employ unreliable methodologies by ‘cherry-picking’ facts from certain studies and asserting reliance on the ‘totality’ or ‘global gestalt of medical evidence’. Defendants argue that in  doing so, plaintiffs’ experts fail to ‘painstakingly’ link each piece of data to their conclusions or explain how the evidence supports their opinions.”)

United States v. Paracha, 2006 WL 12768 (S.D.N.Y. Jan. 3, 2006)(rejecting challenge to terrorism expert on grounds that he cherry-picked evidence in a conspiracy prosecution involving al Qaeda)

King v. Burlington Northern Santa Fe Ry., ___ N.W.2d ___, 277 Neb. 203, 234 (2009)(noting that the law does “not preclude a trial court from considering as part of its reliability inquiry whether an expert has cherry-picked a couple of supporting studies from an overwhelming contrary body of literature,” but discounting the force of the fallacy in the challenged testimony by noting that the expert witness (Frank) had cited some studies that showed associations between exposure to diesel exhaust or benzene and multiple myeloma).

Another Confounder in Lung Cancer Occupational Epidemiology — Diesel Engine Fumes

June 13th, 2012

Researchers obviously need to be aware of, and control for, potential and known confounders.  In the context of investigating the etiologies of lung cancer, there is a long list of potentially confounding exposures, often ignored in peer-reviewed papers that focus on one particular outcome of interest.  Just last week, I wrote to emphasize the need to account for potential and known confounders, and how that need is particularly strong in studies of weak alleged carcinogens such as crystalline silica.  See Sorting Out Confounded Research – Required by Rule 702.  Yesterday, the World Health Organization (WHO) added another “known” confounder to lung cancer epidemiology:  diesel engine exhaust.

According to the International Agency for Research on Cancer (IARC), a division of the WHO, a working group of international experts voted to reclassify diesel engine exhaust as a “Group 1” carcinogen.  IARC: Diesel engine exhaust carcinogenic (2012).  This classification means, in IARC parlance, that “there is sufficient evidence of carcinogenicity in humans. Exceptionally, an agent may be placed in this category when evidence of carcinogenicity in humans is less than sufficient but there is sufficient evidence of carcinogenicity in experimental animals and strong evidence in exposed humans that the agent acts through a relevant mechanism of carcinogenicity.”  The working group was headed by Dr. Christopher Portier, director of the National Center for Environmental Health and the Agency for Toxic Substances and Disease Registry at the Centers for Disease Control and Prevention.  Id.

The reclassification removes diesel exhaust from its previous categorization as a Group 2A carcinogen, interpreted as “probably carcinogenic to humans.”  Diesel exhaust had been on a high-priority list for re-evaluation since 1998, as a result of epidemiologic research from many countries.  The Working Group specifically found that there was sufficient evidence to conclude that diesel exhaust is a cause of lung cancer in humans, and limited evidence to support an association with bladder cancer.  The Group rejected any change in the classification of gasoline engine exhaust from its current IARC rating as “possibly carcinogenic to humans” (Group 2B).

Unlike some other IARC Working Group decisions (such as the one for crystalline silica), which were weakened by close votes and significant dissents, the diesel Group’s conclusion was unanimous.  The diesel Group appeared to be impressed by two recent studies of lung cancer in underground miners, released in March 2012:  a large cohort study conducted by NIOSH, and a nested case-control study conducted by the National Cancer Institute (NCI).  See Debra T. Silverman, Claudine M. Samanic, Jay H. Lubin, Aaron E. Blair, Patricia A. Stewart, Roel Vermeulen, Joseph B. Coble, Nathaniel Rothman, Patricia L. Schleiff, William D. Travis, Regina G. Ziegler, Sholom Wacholder & Michael D. Attfield, “The Diesel Exhaust in Miners Study: A Nested Case-Control Study of Lung Cancer and Diesel Exhaust,” J. Nat’l Cancer Inst. (2012)(in press and open access); and Michael D. Attfield, Patricia L. Schleiff, Jay H. Lubin, Aaron Blair, Patricia A. Stewart, Roel Vermeulen, Joseph B. Coble & Debra T. Silverman, “The Diesel Exhaust in Miners Study: A Cohort Mortality Study With Emphasis on Lung Cancer,” J. Nat’l Cancer Inst. (2012)(in press).

According to a story in the New York Times, the IARC Working Group described diesel engine exhaust as “more carcinogenic than secondhand cigarette smoke.”  Donald McNeil, “W.H.O. Declares Diesel Fumes Cause Lung Cancer,” N.Y. Times (June 12, 2012).  The Times also quoted Dr. Debra Silverman, NCI chief of environmental epidemiology, at length.  Dr. Silverman, the lead author of the nested case-control study cited by the IARC press release, noted that her large study showed that long-term heavy exposure to diesel fumes increased lung cancer risk seven fold. Dr. Silverman described this risk as much greater than the risk thought to be created by passive smoking, but much smaller than the risk from smoking two packs of cigarettes a day.  She stated that she “totally” supported the IARC reclassification, and that she believed governmental agencies would use the IARC analysis as the basis for changing the regulatory classification of diesel exhaust.

Silverman’s nested case-control study appears to have been based upon careful diesel exhaust exposure information, as well as smoking histories.  The study also examined other potential confounders that might be expected to be at work in underground mining:

“Other potential confounders [ie, duration of cigar smoking; frequency of pipe smoking; environmental tobacco smoke; family history of lung cancer in a first-degree relative; education; body mass index based on usual adult weight and height; leisure time physical activity; diet; estimated cumulative exposure to radon, asbestos, silica, polycyclic aromatic hydrocarbons (PAHs) from non-diesel sources, and respirable dust in the study facility based on air measurement and other data (14)] were evaluated but not included in the final models because they had little or no impact on odds ratios (ie, inclusion of these factors in the final models changed point estimates for diesel exposure by ≤ 10%).”

Silverman, et al., at 4.  The absence of an association between lung cancer and silica exposure is noteworthy in such a large study of underground miners.

Sorting Out Confounded Research – Required by Rule 702

June 10th, 2012

CONFOUNDING

Back in 2000, several law professors wrote an essay in which they detailed some of the problems faced in expert witness gatekeeping.  They noted that judges easily grasped the problem of generalizing from animal evidence to human experience, and thus simplistically emphasized human (epidemiologic) data.  But in their emphasis, the judges missed problems of internal validity, such as confounding, in epidemiologic studies:

“Why do courts have such a preference for human epidemiological studies over animal experiments? Probably because the problem of external validity (generalizability) is one of the most obvious aspects of research methodology, and therefore one that non-scientists (including judges) are able to discern with ease – and then give excessive weight to (because whether something generalizes or not is an empirical question; sometimes things do and other times they do not). But even very serious problems of internal validity are harder for the untrained to see and understand, so judges are slower to exclude inevitably confounded epidemiological studies (and give insufficient weight to that problem). Sophisticated students of empirical research see the varied weaknesses, want to see the varied data, and draw more nuanced conclusions.”

David Faigman, David Kaye, Michael Saks, Joseph Sanders, “How Good is Good Enough?  Expert Evidence Under Daubert and Kumho,” 50 Case Western Reserve L. Rev. 645, 661 n.55 (2000).  I am not sure that the problems are dependent in the fashion suggested by the authors, but their assessment that judges may be slow and frequently lack the ability to draw nuanced conclusions seems fair enough. Judges continue to miss important validity issues, perhaps because the adversarial process levels all studies to debating points in litigation.  See, e.g., In re Welding Fume Prods. Liab. Litig., 2006 WL 4507859, *33 (N.D.Ohio 2006)(reducing all studies to one level, and treating all criticisms as though they rendered all studies invalid).

[This discussion of confounding has been updated; see here and there.]

 

Meta-Meta-Analysis — The Gadolinium MDL — More Than Ix’se Dixit

June 8th, 2012

There is a tendency, for better or worse, for legal bloggers to act as partisan cheerleaders over litigation outcomes.  I admit that most often I am dismayed by judicial failures or refusals to exclude dubious plaintiffs’ expert witnesses’ opinion testimony, and I have been known to criticize such decisions.  Indeed, I wouldn’t mind seeing courts exclude dubious defendants’ expert witnesses as well.  I have written approvingly about cases in which judges have courageously engaged with difficult scientific issues, seen through the smoke screen, and properly assessed the validity of the opinions expressed.  The Gadolinium MDL (No. 1909) Daubert motions and decision offer a fascinating case study of a challenge to an expert witness’s meta-analysis, an effective defense of the meta-analysis, and a judicial decision to admit the testimony based upon the meta-analysis.  In re Gadolinium-Based Contrast Agents Prods. Liab. Litig., 2010 WL 1796334 (N.D. Ohio May 4, 2010) [hereafter Gadolinium], reconsideration denied, 2010 WL 5173568 (N.D. Ohio June 18, 2010).

Plaintiffs proffered general causation opinions, linking gadolinium-based contrast media to Nephrogenic Systemic Fibrosis (“NSF”), through a nephrologist, Joachim H. Ix, M.D., who has training in epidemiology.  Dr. Ix’s opinions were based in large part upon a meta-analysis he conducted of data in published observational studies.  Judge Dan Aaron Polster, the MDL judge, itemized the defendant’s challenges to Dr. Ix’s proposed testimony:

“The previously-used procedures GEHC takes issue with are:

(1) the failure to consult with experts about which studies to include;

(2) the failure to independently verify which studies to select for the meta-analysis;

(3) using retrospective and non-randomized studies;

(4) relying on studies with wide confidence intervals; and

(5) using a “more likely than not” standard for causation that would not pass scientific scrutiny.”

Gadolinium at *23.  Judge Polster confidently dispatched these challenges.  Dr. Ix, as a nephrologist, had subject-matter expertise with which to develop inclusionary and exclusionary criteria on his own.  The defendant never articulated what studies, if any, were inappropriately included or excluded.  The complaint that Dr. Ix had used retrospective and non-randomized studies also rang hollow in the absence of any showing that randomized clinical trials with pertinent data were at hand.  Once a serious concern of nephrotoxicity arose, clinical trials became unethical, and the defendant never explained why observational studies were somehow inappropriate for inclusion in a meta-analysis.

Relying upon studies with wide confidence intervals can be problematic, but addressing that random error is one of the reasons to conduct a meta-analysis, assuming the model assumptions for the meta-analysis can be verified.  The plaintiffs effectively relied upon a published meta-analysis, which pre-dated their expert witness’s litigation effort, in which the authors used less conservative inclusionary criteria and reported a statistically significant summary estimate of risk, with an even wider confidence interval.  R. Agarwal, et al., “Gadolinium-based contrast agents and nephrogenic systemic fibrosis: a systematic review and meta-analysis,” 24 Nephrol. Dialysis & Transplantation 856 (2009).  As the plaintiffs noted in their opposition to the challenge to Dr. Ix:

“Furthermore, while GEHC criticizes Dr. Ix’s CI from his meta-analysis as being “wide” at (5.18864 and 25.326) it fails to share with the court that the peer-reviewed Agarwal meta-analysis, reported a wider CI of (10.27–69.44)… .”

Plaintiff’s Opposition to GE Healthcare’s Motion to Exclude the Opinion Testimony of Joachim Ix at 28 (Mar. 12, 2010)[hereafter Opposition].

Wider confidence intervals certainly suggest greater random error, but Dr. Ix’s intervals excluded the null value (indicating statistical significance), and he had carefully considered statistical heterogeneity.  Opposition at 19.  (The defense never advanced heterogeneity as an attack on Dr. Ix’s meta-analysis.)  Remarkably, the defendant never offered a sensitivity analysis to suggest or to show that reasonable changes to the evidentiary dataset could result in a loss of statistical significance, as might be expected from the wide intervals.  Rather, the defendant relied upon the fact that Dr. Ix had published other meta-analyses in which the confidence intervals were much narrower, and then claimed that he had “required” these narrower confidence intervals for his professional, published research.  Memorandum of Law of GE Healthcare’s Motion to Exclude Certain Testimony of Plaintiffs’ Generic Expert, Joachim H. Ix, MD, MAS, In re Gadolinium MDL No. 1909, Case No. 1:08-gd-50000-DAP, Doc. No. 668 (filed Feb. 12, 2010) [hereafter Challenge].  There never was, however, a showing that narrower intervals were required for publication, and the existence of the published Agarwal meta-analysis contradicted the suggestion.
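The kind of sensitivity analysis the defense never offered is not complicated.  Here is a minimal sketch, in Python, of a leave-one-out analysis on a fixed-effect, inverse-variance pooled odds ratio.  The five studies and their numbers below are hypothetical placeholders for illustration only; they are not the studies or data from Dr. Ix’s meta-analysis.

```python
# A minimal leave-one-out sensitivity analysis for a fixed-effect meta-analysis.
# The (odds ratio, 95% CI) values below are hypothetical, illustrative only.
import math

studies = [  # (odds_ratio, ci_low, ci_high)
    (12.0, 3.0, 48.0),
    (9.0,  2.0, 40.5),
    (15.0, 2.5, 90.0),
    (7.0,  1.5, 32.7),
    (11.0, 2.2, 55.0),
]

def fixed_effect_pool(data):
    """Inverse-variance pooling on the log-odds-ratio scale."""
    num = den = 0.0
    for or_, lo, hi in data:
        se = (math.log(hi) - math.log(lo)) / (2 * 1.96)  # SE recovered from the 95% CI
        w = 1.0 / se ** 2
        num += w * math.log(or_)
        den += w
    pooled = num / den
    se_pooled = math.sqrt(1.0 / den)
    return (math.exp(pooled),
            math.exp(pooled - 1.96 * se_pooled),
            math.exp(pooled + 1.96 * se_pooled))

print("all studies:", fixed_effect_pool(studies))
for i in range(len(studies)):
    subset = studies[:i] + studies[i + 1:]
    print(f"omitting study {i + 1}:", fixed_effect_pool(subset))
# If the lower bound of the pooled CI stays above 1.0 in every row, the finding
# of statistical significance is robust to dropping any single study.
```

If dropping any one study pushed the pooled interval across 1.0, the defense would have had a concrete, rather than abstract, complaint about the wide intervals.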

Interestingly, the defense did not call attention to Dr. Ix’s providing an incorrect definition of the confidence interval!  Here is how Dr. Ix described the confidence interval, in language quoted by plaintiffs in their Opposition:

“The horizontal lines display the “95% confidence interval” around this estimate. This 95% confidence interval reflects the range of odds ratios that would be observed 95 times if the study was repeated 100 times, thus the narrower these confidence intervals, the more precise the estimate.”

Opposition at 20.  The confidence interval does not provide a probability distribution for the parameter of interest; rather, the procedure that generates the intervals has a stated probability (here, 95 percent) of producing an interval that covers the true value of the parameter.
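For readers who want to see the distinction concretely, here is a small, purely illustrative simulation; the assumed “true” odds ratio and standard error are arbitrary and have nothing to do with the gadolinium data.  The 95 percent figure describes the long-run coverage of the interval-generating procedure, not the probability that any single computed interval contains the parameter.

```python
# Simulating the frequentist reading of a 95% confidence interval.
# The "true" log odds ratio and per-study standard error are assumptions.
import random, math

random.seed(1)
TRUE_LOG_OR = math.log(5.0)   # assumed true parameter (illustrative)
SE = 0.4                      # assumed standard error of each study's estimate
N_REPS = 10_000

covered = 0
for _ in range(N_REPS):
    est = random.gauss(TRUE_LOG_OR, SE)          # one hypothetical study result
    lo, hi = est - 1.96 * SE, est + 1.96 * SE    # its 95% confidence interval
    if lo <= TRUE_LOG_OR <= hi:
        covered += 1

print(f"coverage over {N_REPS} repetitions: {covered / N_REPS:.3f}")
# Roughly 0.95: the probability attaches to the procedure's long-run coverage,
# not to the parameter's lying inside any one computed interval.
```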

Finally, the defendant never showed any basis for suggesting that a scientific opinion on causation requires something more than a “more likely than not” standard.

Judge Polster also addressed some more serious challenges:

“Defendants contend that Dr. Ix’s testimony should also be excluded because the methodology he utilized for his generic expert report, along with varying from his normal practice, was unreliable. Specifically, Defendants assert that:

(1) Dr. Ix could not identify a source he relied upon to conduct his meta-analysis;

(2) Dr. Ix imputed data into the study;

(3) Dr. Ix failed to consider studies not reporting an association between GBCAs and NSF; and

(4) Dr. Ix ignored confounding factors.”

Gadolinium at *24

IMPUTATION

The first point, above – the alleged failure to identify a source for conducting the meta-analysis – rings fairly hollow, and Judge Polster easily deflected it.  The second point raised a more interesting challenge.  In the words of defense counsel:

“However, in arriving at this estimate, Dr. Ix imputed, i.e., added, data into four of the five studies.  (See Sept. 22 Ix Dep. Tr. (Ex. 20), at 149:10-151:4.)  Specifically, Dr. Ix added a single case of NSF without antecedent GBCA exposure to the patient data in the underlying studies.

* * *

During his deposition, Dr. Ix could not provide any authority for his decision to impute the additional data into his litigation meta-analysis.  (See Sept. 22 Ix Dep. Tr. (Ex. 20), at 149:10-151:4.)  When pressed for any authority supporting his decision, Dr. Ix quipped that ‘this may be a good question to ask a Ph.D level biostatistician about whether there are methods to [calculate an odds ratio] without imputing a case [of NSF without antecedent GBCA exposure]’.”

Challenge at 12-13.

The deposition reference suggests that the examiner had scored a debating point by catching Dr. Ix unprepared, but by the time the parties briefed the challenge, the plaintiffs had the issue well in hand, citing A. W. F. Edwards, “The Measure of Association in a 2 × 2 Table,” 126 J. Royal Stat. Soc. Series A 109 (1963); R.L. Plackett, “The Continuity Correction in 2 x 2 Tables,” 51 Biometrika 327 (1964).  Opposition at 36 (describing the process of imputation in the event of zero counts in the cells of a 2 x 2 table for odds ratios).  There are qualms to be stated about imputation, but the defense failed to make them.  As a result, the challenge overall lost momentum and credibility.  As the trial court stated the matter:

“Next, there is no dispute that Dr. Ix imputed data into his meta-analysis. However, as Defendants acknowledge, there are valid scientific reasons to impute data into a study. Here, Dr. Ix had a valid basis for imputing data. As explained by Plaintiffs, Dr. Ix’s imputed data is an acceptable technique for avoiding the calculation of an infinite odds ratio that does not accurately measure association.7 Moreover, Dr. Ix chose the most conservative of the widely accepted approaches for imputing data.8 Therefore, Dr. Ix’s decision to impute data does not call into question the reliability of his meta-analysis.”

Gadolinium at *24.
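For readers unfamiliar with the technique at issue, here is a minimal sketch of the continuity-correction idea behind the imputation dispute, along the lines of the Edwards and Plackett papers cited by plaintiffs: when a 2 × 2 table contains a zero cell, the crude odds ratio is infinite, so a small constant (commonly 0.5) is added to each cell before computing the odds ratio and its confidence interval.  The cell counts below are hypothetical, not Dr. Ix’s data.

```python
# Sketch of a zero-cell (continuity) correction for a 2x2 table.
# Cell counts are hypothetical, for illustration only.
import math

def odds_ratio_with_correction(a, b, c, d, k=0.5):
    """a = exposed cases, b = exposed non-cases,
       c = unexposed cases, d = unexposed non-cases."""
    if 0 in (a, b, c, d):
        a, b, c, d = a + k, b + k, c + k, d + k   # add k to every cell
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1/a + 1/b + 1/c + 1/d)     # Woolf's SE of the log odds ratio
    lo = math.exp(math.log(or_) - 1.96 * se_log)
    hi = math.exp(math.log(or_) + 1.96 * se_log)
    return or_, lo, hi

# Hypothetical table: 12 NSF cases among 100 exposed, 0 cases among 100 unexposed.
print(odds_ratio_with_correction(a=12, b=88, c=0, d=100))
```

Whether one adds 0.5 to each cell or, as the defense brief described, a single case to the zero cell, the point of the adjustment is the same: to avoid an undefined or infinite estimate while erring on the conservative side.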

FAILURE TO CONSIDER NULL STUDIES

The defense’s challenge included a claim that Dr. Ix had arbitrarily excluded studies in which there was no reported incidence of NSF. The defense brief unfortunately did not describe the excluded studies, or what effect, if any, their inclusion in the meta-analysis would have had.  This was, after all, the crucial issue. The abstract nature of the defense claim left the matter ripe for misrepresentation by the plaintiffs:

“GEHC continues to misunderstand the role of a meta-analysis and the need for studies that included patients both that did or did not receive GBCAs and reported on the incidence of NSF, despite Dr. Ix’s clear elucidation during his deposition. (Ix Depo. TR [Exh.1] at 97-98).  Meta-analyses such as performed by Dr. Ix and Dr. Agarwal search for whether or not there is a statistically valid association between exposure and disease event. In order to ascertain the relationship between the exposure and event one must have an event to evaluate. In other words, if you have a study in which the exposed group consists of 10,000 people that are exposed to GBCAs and none develop NSF, compared to a non-exposed group of 10,000 who were not exposed to GBCAs and did not develop NSF, the study provides no information about the association between GBCAs and NSF or the relative risk of developing NSF.”

Opposition at 37-38 (emphasis in original).  What is fascinating about this particular challenge, and the plaintiffs’ response, is the methodological hypocrisy exhibited.  In essence, the plaintiffs argued that imputation was appropriate in a case-control study in which one cell contained a zero, but they would ignore a great deal of data in a cohort study with zero events.  To be sure, case-control studies are more efficient than cohort studies for identifying and assessing risk ratios for rare outcomes.  Nevertheless, the plaintiffs could easily have been hoist with their own hypothetical petard.  No one among 10,000 gadolinium-exposed patients developed NSF; and no one in the control group did either.  The hypothetical study suggests that the rate of NSF is low and no different between the exposed and the unexposed patients.  A risk ratio could be obtained by imputing a value for the cells containing zero, and a confidence interval calculated.  The risk ratio, of course, would be 1.0.
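Working the hypothetical through with the same zero-cell correction makes the point.  Under the assumed counts (10,000 exposed and 10,000 unexposed, with no NSF cases in either group), the imputed risk ratio is exactly 1.0, with a computable, if wide, confidence interval; this is a sketch of the arithmetic, not an analysis of any actual study.

```python
# The post's hypothetical, worked through with a zero-cell correction:
# 10,000 exposed and 10,000 unexposed patients, no NSF cases in either arm.
import math

def risk_ratio_with_correction(cases_exp, n_exp, cases_unexp, n_unexp, k=0.5):
    # add k = 0.5 to every cell of the 2x2 table (a standard zero-cell correction)
    a, b = cases_exp + k, n_exp - cases_exp + k        # exposed cases / non-cases
    c, d = cases_unexp + k, n_unexp - cases_unexp + k  # unexposed cases / non-cases
    rr = (a / (a + b)) / (c / (c + d))
    se_log = math.sqrt(1/a - 1/(a + b) + 1/c - 1/(c + d))
    lo = math.exp(math.log(rr) - 1.96 * se_log)
    hi = math.exp(math.log(rr) + 1.96 * se_log)
    return rr, lo, hi

print(risk_ratio_with_correction(0, 10_000, 0, 10_000))
# -> risk ratio of 1.0: the data suggest a rare outcome occurring at the same
#    (negligible) rate in both groups, not "no information" at all.
```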

Unfortunately, the defense did not make this argument; nor did it explore where the meta-analysis might have come out had Dr. Ix applied a more even-handed methodology.  The gap allowed the trial court to brush the challenge aside:

“The failure to consider studies not reporting an association between GBCAs and NSF also does not render Dr. Ix’s meta-analysis unreliable. The purpose of Dr. Ix’s meta-analysis was to study the strength of the association between an exposure (receiving GBCA) and an outcome (development of NSF). In order to properly do this, Dr. Ix necessarily needed to examine studies where the exposed group developed NSF.”

Gadolinium at *24.  Judge Polster, with no help from the defense brief, missed the irony of Dr. Ix’s willingness to impute data in the case-control 2 x 2 contingency tables, but not in the relative risk tables.

CONFOUNDING

Defendants complained that Dr. Ix had ignored the possibility that confounding factors had contributed to the development of NSF.  Challenge at 13.  Defendants went so far as to charge Dr. Ix with misleading the court by failing to consider other possible causative exposures or conditions.  Id.

Defendants never identified the existence, source, or likely magnitude of any confounding factor.  As a result, the plaintiffs’ argument, based on the Reference Manual, that confounding was an unlikely explanation for a very large risk ratio was enthusiastically embraced by the trial court, virtually verbatim from the plaintiffs’ Opposition (at 14):

“Finally, the Court rejects Defendants’ argument that Dr. Ix failed to consider confounding factors. Plaintiffs argued and Defendants did not dispute that, applying the Bradford Hill criteria, Dr. Ix calculated a pooled odds ratio of 11.46 for the five studies examined, which is higher than the 10 to 1 odds ratio of smoking and lung cancer that the Reference Manual on Scientific Evidence deemed to be “so high that it is extremely difficult to imagine any bias or confounding factor that may account for it.” Id. at 376.  Thus, from Dr. Ix’s perspective, the odds ratio was so high that a confounding factor was improbable. Additionally, in his deposition, Dr. Ix acknowledged that the cofactors that have been suggested are difficult to confirm and therefore he did not try to specifically quantify them. (Doc # : 772-20, at 27.) This acknowledgement of cofactors is essentially equivalent to the Agarwal article’s representation that “[t]here may have been unmeasured variables in the studies confounding the relationship between GBCAs and NSF,” cited by Defendants as a representative model for properly considering confounding factors. (See Doc # : 772, at 4-5.)”

Gadolinium at *24.

The real problem is that the defendant’s challenge pointed only to possible, unidentified causal agents.  The smoking/lung cancer analogy, provided by the Reference Manual, was inapposite.  Smoking is indeed a large risk factor for lung cancer, with relative risks over 20.  Although there are other human lung carcinogens, none is consistently of the same order of magnitude (not even asbestos), and as a result, confounding can generally be excluded as an explanation for the large risk ratios seen in smoking studies.  It would be easy to imagine that there are confounders for NSF, especially given that the condition has only relatively recently been identified, and that they might be of the same or greater magnitude than the association suggested for the gadolinium contrast media.  The defense, however, failed to identify confounders that actually threatened the validity of any of the individual studies, or of the meta-analysis.
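The Reference Manual’s intuition can be put in rough quantitative terms with the classic bias formula for a single unmeasured binary confounder (the Cornfield/Bross line of analysis), treating the pooled odds ratio as an approximate risk ratio for a rare outcome.  The confounder strengths and prevalences below are assumptions chosen for illustration, not values drawn from the NSF studies; the 11.46 figure is the pooled odds ratio reported in the Gadolinium opinion.

```python
# Rough quantification of the "too big to be confounding" intuition, using the
# classic bias formula for one unmeasured binary confounder (Cornfield/Bross).
# All confounder inputs are illustrative assumptions, not NSF study values.
def confounding_bias(rr_cd, p_exposed, p_unexposed):
    """Multiplicative bias in a risk ratio from a binary confounder with
    confounder-disease risk ratio rr_cd and prevalence p_exposed / p_unexposed
    in the exposed and unexposed groups."""
    return (p_exposed * (rr_cd - 1) + 1) / (p_unexposed * (rr_cd - 1) + 1)

observed = 11.46  # pooled odds ratio reported in the Gadolinium opinion
for rr_cd, p1, p0 in [(5, 0.5, 0.1), (10, 0.8, 0.1), (12, 1.0, 0.0)]:
    bias = confounding_bias(rr_cd, p1, p0)
    print(f"confounder RR={rr_cd}, prevalence {p1} vs {p0}: "
          f"bias factor {bias:.2f}, adjusted estimate {observed / bias:.2f}")
# Only an extremely strong, extremely unbalanced confounder (last row) could by
# itself account for an estimate of 11.46; its own risk ratio must be at least
# as large as the observed association (Cornfield's condition).
```

Had the defense identified a candidate confounder and shown it plausibly met conditions of this sort, the challenge would have had teeth; pointing to unnamed “cofactors” did not.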

CONCLUSION

The defense hinted at the general unreliability of meta-analysis, with references to the Reference Manual on Scientific Evidence at 381 (2d ed. 2000)(noting problems with meta-analysis), and to other, relatively dated papers.  See, e.g., John Bailar, “Assessing Assessments,” 277 Science 529 (1997)(arguing that “problems have been so frequent and so deep, and overstatements of the strength of conclusions so extreme, that one might well conclude there is something seriously and fundamentally wrong with [meta-analysis].”).  The Reference Manual’s language, carried over into the third edition, is out of date, and represents a failing of the new edition.  See “The Treatment of Meta-Analysis in the Third Edition of the Reference Manual on Scientific Evidence” (Nov. 14, 2011).

The plaintiffs came forward with some descriptive statistics on the prevalence of meta-analysis in contemporary biomedical literature.  The defendants offered mostly argument; there was a dearth of citations to defense expert witnesses, affidavits, consensus papers on meta-analysis, textbooks, papers by leading authors, and the like.  The defense challenge suffered from being diffuse and unfocused; it lost persuasiveness by including weak, collateral issues, such as claims that Dr. Ix was opining “only” on a “more likely than not” basis, that he had not consulted with other experts, and that he had failed to use randomized trial data.  The defense was quick to attack perceived deficiencies, but it did not illustrate how or why the alleged deficiencies threatened the validity of Dr. Ix’s meta-analysis.  Indeed, even when the defense made strong points, such as the exclusion of zero-event cohort studies, it failed to document that such studies existed, and that their inclusion might have made a difference.

 

Politics of Expert Witnesses – The Treating Physician

June 7th, 2012

If a party retains an expert witness who has actually conducted research on the issue in controversy, the witness’s underlying data and analyses will be sought in discovery.  Of course, litigants are entitled to every man’s (and woman’s) evidence, including independent research, but the involvement of an investigator-author as an expert witness will almost certainly increase the scope of discovery.  Counsel will seek manuscript drafts, emails with co-authors, interim data, protocols and protocol amendments, and preliminary analyses, among other documents.  Many would-be expert witnesses are reluctant to put their own research into issue.  The result is that expert witnesses frequently do not have “hands-on” experience with the exact issue raised by the litigation in which they serve.

The combination of these factors creates vulnerabilities for witnesses.  Expert witnesses who have not conducted research or written about the issue end up being more attractive to lawyers.  But even these witnesses will be flawed in the eyes of a jury or trial judge:  they have been paid for their time in reviewing literature, preparing reports, sitting for depositions, traveling, and appearing at trial.  The compensation of a highly skilled and experienced professional can add up to large sums of money, sufficient to make juries skeptical and lawyers uncomfortable.

Physicians who care for and treat a claimant represent a litigation Holy Grail:  the prospect of having a neutral, disinterested, and caring expert witness opine about causation, diagnosis, damages, or prognosis, without the baggage of having been selected and paid by lawyers.  A lot of sharp elbows are thrown in the process of trying to align treating physicians with one side or the other’s litigation positions.

In some litigations, in some states, ex parte interviews by defense counsel are forbidden, while similar interviews by plaintiffs’ counsel are allowed.  Much mischief results.  The practice of trying to turn the treating physician into a “causation” or “damages” witness runs amok, especially when trial courts do not require full disclosures under Rule 26 of the Federal Rules of Civil Procedure from the treating physicians.

Jurors will want to know what treating physicians said, and may regard them as disinterested.  Indeed, the supposed neutrality and beneficence of the treating physician is often emphasized by counsel in their addresses to juries.  See, e.g., Simmons v. Novartis Pharm. Corp., 2012 WL 2016246, at *2, *7 (6th Cir. 2012) (affirming exclusion of retained expert witness, as well as a treating physician who relied solely upon a limited selection of medical studies given to him by plaintiffs’ counsel); Tamraz v. BOC Group Inc., No. 1:04-CV-18948, 2008 WL 2796726 (N.D. Ohio July 18, 2008)(denying Rule 702 challenge to treating physician’s causation opinion), rev’d sub nom. Tamraz v. Lincoln Elec. Co., 620 F.3d 665 (6th Cir. 2010)(carefully reviewing the record of trial testimony of plaintiffs’ treating physician; reversing judgment for plaintiff based in substantial part upon the treating physician’s speculative causal assessment, created by plaintiffs’ counsel), cert. denied, ___ U.S. ___, 131 S. Ct. 2454, 2011 WL 863879 (2011).  See generally Robert Ambrogi, “A ‘Masterly’ Opinion on Expert Testimony,” Bullseye (October 2010); David Walk, “A masterly Daubert opinion” (Sept. 15, 2010); Ellen Melville, “Comment, Gating the Gatekeeper: Tamraz v. Lincoln Electric Co. and the Expansion of Daubert Reviewing Authority,” 53 B.C. L. Rev. 195 (2012) (student comment that mistakenly equates current Rule 702 law with the Supreme Court’s 1993 Daubert decision, while ignoring subsequent precedent and the revision of Rule 702).

In the silicone gel breast implant litigation, plaintiffs corralled a herd of rheumatologists who were sympathetic to their claims of connective tissue disease, and who would support their “creative” causation theories.  As a result, defense rheumatologists were not likely to have seen many of the claimants in their practice.  The plaintiffs’ counsel capitalized upon this “deficiency” in their experience, by attacking the defense experts’ expertise and their experience with the newly emergent phenomenon of “silicone-associated disease” (SAD).  The treating physicians were involved early on in the SAD litigation exploit.

In New Jersey, defense counsel have a limited right to ex parte interviews of treating physicians.  Stempler v. Speidell, 100 N.J. 368, 495 A.2d 857 (1985).  Certain New Jersey state trial judges, however, have ignored the Stempler holding in mass tort contexts, and have severely limited defendants’ ability to get information from treating physicians.  Last week, the New Jersey Appellate Division waded into this contentious area, by reversing an aberrant trial judge’s decision that severely restricted defendants’ retention of any physician who had treated a plaintiff in the mass tort.  In Re Pelvic Mesh/Gynecare Litig., No. A-5685-10T4 (N.J. Super. App. Div. June 1, 2012).

The defendants, Johnson & Johnson and Ethicon, Inc., designed, made, marketed, and sold pelvic mesh medical devices for the treatment of pelvic organ prolapse and stress urinary incontinence.  In re Pelvic Mesh at 2.  Several hundred personal injury cases against the defendants were assigned to the Atlantic County Law Division.  In a pretrial order, the trial court barred “defendants from consulting with or retaining as an expert witness any physician who has at any time treated one or more of the plaintiffs.”  Id. Remarkably, the trial court’s order was not limited to attempts to contact a physician for purposes of discussing a particular plaintiff’s case.  The trial court’s order had the effect of severely limiting defendants’ access to expert witnesses, as well as disqualifying expert witnesses already retained.  Plaintiffs’ counsel, however, were free to line up their clients’ treating physicians, and other treating physicians with substantial clinical experience with the allegedly defective device.

The Appellate Division reversed the trial court’s asymmetrical rules regarding treating physicians as manifestly inconsistent with the New Jersey Supreme Court’s mandate in Stempler and other cases.  The Appellate Division showed little patience for the trial court’s weak attempt to justify the one-sided treatment of access to treating physicians.  The trial court had invoked the potential for interference with the doctor-patient privilege as a basis for its pretrial order, but hornbook law, in New Jersey and in virtually every state, treats the filing of a lawsuit as a waiver of the privilege.  Id. at 11.  Similarly, the Appellate Division rejected the trial court’s insistence that a treating physician is obligated to protect and advance patients’ litigation interests by either testifying for patients or refraining from testifying for defendants. Id. at 15.  A treating physician has no “duty of loyalty” to help advance a patient’s litigious goals.  Id. at 26. The trial court had myopically confused the duty to provide medical care and treatment with a duty to help plaintiffs’ counsel advance their view of the patients’ welfare.

The Appellate Division’s reversal is a welcome return of sanity and equity to the New Jersey law of expert witnesses.  The trial court’s over-reaching rationale carried some remarkable implications.  The appellate court noted, as an example, that “radiologists, orthopedists, and neurologists who routinely testify as experts for the defense in numerous personal injury cases in our courts are likely to be treating or consulting physicians for other patients with similar injuries, and some of those patients may also have filed lawsuits or may do so in the future.”  Id. at 16.  The trial court’s reasoning would strip defendants in virtually all personal injury litigation of access to expert physician opinion.  In asbestos litigation, for instance, the defense would find any and all pulmonary physicians who were treating workers with asbestos-related disease off limits for consulting or testifying.  The Appellate Division’s strong ruling should be seen as casting a cloud on the validity of the continuing practice of barring defense counsel from ex parte interviews of treating physicians in mass or other tort litigation.