The Eleventh Circuit Confuses Adversarial and Methodological Bias, Manifestly Erroneously

The Eleventh Circuit’s decision in Adams v. Laboratory Corporation of America is disturbing on many levels. Adams v. Lab. Corp. of Am., 760 F.3d 1322 (11th Cir. 2014). Professor David Bernstein has already taken the Circuit judges to task for their failure to heed the statutory requirements of Federal Rule of Evidence 702. See David Bernstein, “A regrettable Eleventh Circuit expert testimony ruling, Adams v. Lab. Corp. of America,” Wash. Post (May 26, 2015). Sadly, such strident refusals to acknowledge the statutory nature of Rule 702, and the Congressional ratification of the 2000 amendments to the rule, have become commonplace in the federal courts. Ironically, the holding of the Supreme Court’s decision in Daubert itself was that the lower courts were not free to follow common law that had not been incorporated into the first version of Rule 702.

There is much more wrong with the Adams case than just a recalcitrant disregard for the law: the Circuit displayed an equally distressing disregard for science. The case began as a claim against defendant Laboratory Corporation of America for negligent failure to diagnose cervical cancer. Plaintiffs claimed that the failure to diagnose the cancer led to delays in treatment, which eroded Mrs. Adams’s chance for a cure.

Before the Adams case arose, two professional organizations, the College of American Pathologists (CAP) and the American Society of Cytopathology (ASC), issued guidelines about how an appropriate retrospective review should be conducted. Both organizations were motivated by two concerns: a desire to protect their members from exaggerated, over-extended, and bogus litigation claims, and a scientific understanding that a false-negative finding by a cytopathologist does not necessarily reflect a negligent interpretation of a Pap smear[1]. Both organizations called for a standard of blinded review in litigation to protect against hindsight bias. The Adamses retained a highly qualified pathologist, Dr. Dorothy Rosenthal, who, with full knowledge of the later diagnosis and of the professional guidelines, reviewed the earlier Pap smears that were allegedly misdiagnosed as non-malignant. 760 F.3d at 1326. Rosenthal’s approach violated the CAP and ASC guidelines, as well as common sense.

The district judge ruled that Rosenthal’s approach was little more than an ipse dixit, and a subjective method that could not be reviewed objectively. Adams v. Lab. Corp. of Am., No. 1:10-CV-3309-WSD, 2012 WL 370262, at *15 (N.D. Ga. Feb. 3, 2012). In a published per curiam opinion, the Eleventh Circuit reversed, holding that the district judge’s analysis of Rosenthal’s opinion was “manifestly erroneous.” 760 F.3d at 1328. Judge Garza, of the Fifth Circuit, sitting by designation, concurred to emphasize his opinion that Rosenthal did not need a methodology, as long as she showed up with her qualifications and experience to review the contested Pap smears.

The Circuit opinion is a model of conceptual confusion. The judges refer to the professional society guidelines, but never provide citations. (See note 1, infra.) The Circuit judges were obviously concerned that the professional societies were promulgating standards to be used in judging claims against their own members for negligent false-negative interpretations of cytology or pathology. What the appellate judges failed to recognize, however, is that the professional societies had a strong methodological basis for insisting upon “blinded” review of the slides in controversy. Knowledge of the outcome must of necessity bias any subsequent reviewer, such as plaintiffs’ expert witness, Rosenthal. Even a cursory reading of the two guidelines would have made clear that they rested on more than simply a desire to protect members: yes, they were designed to protect members against bogus claims, but they also cited data in support of the blinded-review requirement[2]. Since the guidelines were issued, several publications have corroborated the evidence-based need for blinded review[3].
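To make the methodological point concrete, the following short Python sketch shows the basic architecture of a blinded rescreening of the sort the CAP and ASC guidelines contemplate. It is purely illustrative: the function names, slide identifiers, and size of the control pool are assumptions for the example, not specifications drawn from the guidelines themselves.

```python
import random

def build_blinded_review_set(contested_slides, control_slides, n_controls=9, seed=1):
    """Seed the contested (litigation) slides into a pool of control slides from
    patients with no later cancer diagnosis, then shuffle the pool so that the
    reviewer cannot tell which slides are the ones at issue."""
    rng = random.Random(seed)
    pool = list(contested_slides) + rng.sample(list(control_slides), n_controls)
    rng.shuffle(pool)
    return pool

def unblind(review_calls, contested_ids):
    """After the blinded review, split the reviewer's calls into calls on the
    contested slides and calls on the controls, so that discordance rates can
    be compared between the two groups."""
    contested = {sid: call for sid, call in review_calls.items() if sid in contested_ids}
    controls = {sid: call for sid, call in review_calls.items() if sid not in contested_ids}
    return contested, controls

# Illustrative use with made-up slide identifiers
contested = ["P-1998-041", "P-1999-317"]
controls = [f"C-{i:03d}" for i in range(50)]
review_set = build_blinded_review_set(contested, controls)
# ... the reviewer records a call for each slide without knowing any outcome ...
calls = {sid: "within normal limits" for sid in review_set}
flagged, background = unblind(calls, set(contested))
```

Under such a design, an expert’s claim that the contested slides were obviously abnormal can be tested against how the same reviewer, or a panel of reviewers, calls the surrounding controls, which is the kind of comparison the blinded-review literature cited in the guidelines relies upon.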

The concepts of sensitivity, specificity, and positive predictive value are inherent in any screening procedure; they are very much part of the methodology of screening. These measures, along with statistical analyses of concordance and discordance among experienced cytopathologists, can be measured and assessed for accuracy and reliability, as the short sketch after the quoted passage below illustrates. The Circuit judges in Adams, however, were blinded (in a bad way) to the scientific scruples that govern screenings. The per curiam opinion suggests that:

“[t]he only arguably appreciable differences between Dr. Rosenthal’s method and the review method for LabCorp’s cytotechnologists is that Dr. Rosenthal (1) already knew that the patient whose slides she was reviewing had developed cancer and (2) reviewed slides from just one patient. Those differences relate to the lack of blinded review, which we address later.”

760 F.3d at 1329 n. 10. And when the judges addressed the lack of blinded review, they treated hindsight bias, a cognitive bias and methodological flaw, in the same way they would have trial courts and litigants treat Dr. Rosenthal’s “philosophical bent” in favor of cancer patients: as “a credibility issue for the jury.” Id. at 1326-27, 1332. This conflation of methodological bias with adversarial bias, however, is a prescription for eviscerating judicial gatekeeping of flawed opinion testimony. Judge Garza, in his concurring opinion, would have gone further and declared that plaintiffs’ expert witness Rosenthal had no methodology at all, and thus was free to opine ad libitum.
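To give the screening measures mentioned above some concrete content, here is a minimal sketch, in Python and using hypothetical counts rather than anything in the Adams record, of how sensitivity, specificity, positive predictive value, and chance-corrected inter-reader agreement (Cohen’s kappa) are computed from screening results.

```python
def screening_metrics(tp, fp, fn, tn):
    """Standard measures from a 2x2 table of screening calls against outcomes."""
    sensitivity = tp / (tp + fn)   # share of true cancers that the screen flags
    specificity = tn / (tn + fp)   # share of cancer-free patients the screen clears
    ppv = tp / (tp + fp)           # share of positive calls that turn out correct
    return sensitivity, specificity, ppv

def cohens_kappa(both_abnormal, only_first, only_second, both_normal):
    """Chance-corrected agreement between two readers of the same slides."""
    n = both_abnormal + only_first + only_second + both_normal
    observed = (both_abnormal + both_normal) / n
    expected = ((both_abnormal + only_first) * (both_abnormal + only_second)
                + (both_normal + only_first) * (both_normal + only_second)) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical numbers, for illustration only
print(screening_metrics(tp=80, fp=40, fn=20, tn=860))   # roughly (0.80, 0.96, 0.67)
print(cohens_kappa(70, 15, 10, 905))                    # roughly 0.83
```

Measures of this kind, estimated under blinded conditions, are what permit courts and litigants to say something objective about how often competent screeners miss the sort of abnormality at issue; an unblinded review of a single patient’s slides generates no comparable numbers.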

Although Rosenthal’s “philosophical bent” might perhaps be left to the crucible of cross-examination, hindsight bias could and should have been eliminated by insisting that Rosenthal wear the same “veil of ignorance” about Mrs. Adams’s future clinical course that the defendant wore when it originally evaluated the plaintiff’s Pap smears. Here, Rosenthal’s adversarial bias was very probably exacerbated by her hindsight bias, and the Circuit missed a valuable opportunity to rein in both kinds of bias.

Certainly in other areas of medicine, such as radiology, physicians are blinded to the correct interpretation and evaluated on their ability to agree with a gold standard. The NIOSH B-reader examination, for all its problems, at least tries to qualify physicians in the use of the International Labour Organization’s pneumoconiosis classification for interpreting plain-film radiographs of pulmonary dust diseases, by having candidates read and interpret films while blinded to the NIOSH/ILO consensus interpretations.


[1] See Patrick L. Fitzgibbons & R. Marshall Austin, “Expert review of histologic slides and Papanicolaou tests in the context of litigation or potential litigation — Surgical Pathology Committee and Cytopathology Committee of the College of American Pathologists,” 124 Arch. Pathol. Lab. Med. 1717 (2000); American Society of Cytopathology, “Guidelines for Review of Gyn Cytology Samples in the Context of Litigation or Potential Litigation” (2000).

[2] The CAP guideline, for instance, cited R. Marshall Austin, “Results of blinded rescreening of Papanicolaou smears versus biased retrospective review,” 121 Arch. Pathol. Lab. Med. 311 (1997).

[3] Andrew A. Renshaw, K.M. Lezon, and D.C. Wilbur, “The human false-negative rate of rescreening Pap tests: Measured in a two-arm prospective clinical trial,” 93 Cancer (Cancer Cytopathol.) 106 (2001); Andrew A. Renshaw, Mary L. Young, and E. Blair Holladay, “Blinded review of Papanicolaou smears in the context of litigation: Using statistical analysis to define appropriate thresholds,” 102 Cancer Cytopathology 136 (2004) (showing that data from blinded reviews can be interpreted in a statistically appropriate way, and defining standards to improve the accuracy and utility of blinded reviews); D. V. Coleman & J. J. R. Poznansky, “Review of cervical smears from 76 women with invasive cervical cancer: cytological findings and medicolegal implications,” 17 Cytopathology 127 (2006); Andrew A. Renshaw, “Comparing Methods to Measure Error in Gynecologic Cytology and Surgical Pathology,” 130 Arch. Pathol. Lab. Med. 626 (2006).