TORTINI

For your delectation and delight, desultory dicta on the law of delicts.

Statistical Deontology

March 2nd, 2018

In courtrooms across America, there has been a lot of buzzing and palavering about the American Statistical Association’s Statement on Statistical Significance Testing,1 but very little discussion of the Association’s Ethical Guidelines, which were updated and promulgated in the same year, 2016. Statisticians and statistics, like lawyers and the law, receive their fair share of calumny over their professional activities, but the statistician’s principal North American professional organization is trying to do something about members’ transgressions.

The American Statistical Association (ASA) has promulgated ethical guidelines for statisticians, as has the Royal Statistical Society,2 even if these organizations lack the means and procedures to enforce their codes. The ASA’s guidelines3 are rich with implications for statistical analyses put forward in all contexts, including in litigation and regulatory rule making. As such, the guidelines are well worth studying by lawyers.

The ASA Guidelines were prepared by the Committee on Professional Ethics, and approved by the ASA’s Board in April 2016. There are lots of “thou shalts” and “thou shalt nots,” but I will focus on the issues that are more likely to arise in litigation. What is remarkable about the Guidelines is that, if followed, they would probably do more to eliminate unsound statistical practices in the courtroom than the ASA Statement on P-values.

Defining Good Statistical Practice

“Good statistical practice is fundamentally based on transparent assumptions, reproducible results, and valid interpretations.” Guidelines at 1. The Guidelines thus incorporate something akin to the Kumho Tire standard that an expert witness “employs in the courtroom the same level of intellectual rigor that characterizes the practice of an expert in the relevant field.” Kumho Tire Co. v. Carmichael, 526 U.S. 137, 152 (1999).

A statistician engaged in expert witness testimony should provide “only expert testimony, written work, and oral presentations that he/she would be willing to have peer reviewed.” Guidelines at 2. “The ethical statistician uses methodology and data that are relevant and appropriate, without favoritism or prejudice, and in a manner intended to produce valid, interpretable, and reproducible results.” Id. Similarly, the statistician, if ethical, will identify and mitigate biases, and use analyses “appropriate and valid for the specific question to be addressed, so that results extend beyond the sample to a population relevant to the objectives with minimal error under reasonable assumptions.” Id. If the Guidelines were followed, a lot of spurious analyses would drop off the litigation landscape, regardless of whether they used p-values, confidence intervals, or a Bayesian approach.

Integrity of Data and Methods

The ASA’s Guidelines also have a good deal to say about data integrity and statistical methods. In particular, the Guidelines call for candor about limitations in the statistical methods or the integrity of the underlying data:

“The ethical statistician is candid about any known or suspected limitations, defects, or biases in the data that may impact the integrity or reliability of the statistical analysis. Objective and valid interpretation of the results requires that the underlying analysis recognizes and acknowledges the degree of reliability and integrity of the data.”

Guidelines at 3.

The statistical analyst “openly acknowledges the limits of statistical inference, the potential sources of error, as well as the statistical and substantive assumptions made in the execution and interpretation of any analysis,” including data editing and imputation. Id. The Guidelines urge analysts to address potential confounding not assessed by the study design. Id. at 3, 10. How often do we see these acknowledgments in litigation-driven analyses, or in peer-reviewed papers, for that matter?

Affirmative Actions Prescribed

In aid of promoting data and methodological integrity, the Guidelines also urge analysts to share data when appropriate, without revealing the identities of study participants. Statistical analysts should publicly correct any disseminated data and analyses in their own work, as well as work to “expose incompetent or corrupt statistical practice.” Of course, the Lawsuit Industry will call this ethical duty “attacking the messenger,” but maybe that is a rhetorical strategy based upon an assessment of risks versus benefits to the Lawsuit Industry.

Multiplicity

The ASA Guidelines address the impropriety of substantive statistical errors, such as:

“[r]unning multiple tests on the same data set at the same stage of an analysis increases the chance of obtaining at least one invalid result. Selecting the one “significant” result from a multiplicity of parallel tests poses a grave risk of an incorrect conclusion. Failure to disclose the full extent of tests and their results in such a case would be highly misleading.”

Guidelines at 9.
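
The arithmetic behind this warning is simple. As a minimal numerical sketch (the figures below are hypothetical, not drawn from the Guidelines or from any case), suppose an analyst runs twenty independent tests, each at the conventional 0.05 level, on data in which every null hypothesis is in fact true:

    # Hypothetical illustration: 20 independent tests, each at alpha = 0.05,
    # with every null hypothesis true.
    alpha = 0.05
    k = 20

    # Chance of at least one spuriously "significant" result across the family of tests:
    family_wise_error = 1 - (1 - alpha) ** k
    print(f"P(at least one false positive) = {family_wise_error:.2f}")   # about 0.64

    # A Bonferroni-style correction would tighten the per-test threshold instead:
    print(f"Bonferroni per-test alpha = {alpha / k:.4f}")                # 0.0025

On these assumptions, the chance of at least one false positive is roughly 64 percent, which is why the Guidelines treat selective reporting of the one “significant” result, without disclosure of the full extent of testing, as highly misleading.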

There are some Lawsuit Industrialists who have taken comfort in the pronouncements of Kenneth Rothman on corrections for multiple comparisons. Rothman’s views on multiple comparisons are, however, much broader and more nuanced than the Industry’s sound bites.4 Given that Rothman opposes anything like strict statistical significance testing, it follows that he is relatively unmoved by the need for adjustments to alpha or to the coefficient of confidence. Rothman, however, has never deprecated the need to consider the multiplicity of testing, nor the need for researchers to be forthright in disclosing the scope of comparisons originally planned and actually done.


2 Royal Statistical Society – Code of Conduct (2014); Steven Piantadosi, Clinical Trials: A Methodologic Perspective 609 (2d ed. 2005).

3 Shelley Hurwitz & John S. Gardenier, “Ethical Guidelines for Statistical Practice: The First 60 Years and Beyond,” 66 Am. Statistician 99 (2012) (describing the history and evolution of the Guidelines).

4 Kenneth J. Rothman, “Six Persistent Research Misconceptions,” 29 J. Gen. Intern. Med. 1060, 1063 (2014).

Gatekeeping of Expert Witnesses Needs a Bair Hug

December 20th, 2017

For every Rule 702 (“Daubert”) success story, there are multiple gatekeeping failures. See David E. Bernstein, “The Misbegotten Judicial Resistance to the Daubert Revolution,” 89 Notre Dame L. Rev. 27 (2013).1 Exemplars of inadequate expert witness gatekeeping in state or federal court abound, and overwhelm the bar. The only solace one might find is that the abuse-of-discretion appellate standard of review keeps the bad decisions from precedentially outlawing the good ones.

Judge Joan Ericksen recently provided another Berenstain Bears’ example of how not to keep the expert witness gate, in litigation over claims that the Bair Hugger forced-air warming devices (“Bair Huggers”) cause infections. In re Bair Hugger Forced Air Warming, MDL No. 15-2666, 2017 WL 6397721 (D. Minn. Dec. 13, 2017). Although Her Honor properly cited and quoted Rule 702 (2000), she announced a new standard in a bold heading:

“Under Federal Rule of Evidence 702, the Court need only exclude expert testimony that is so fundamentally unsupported that it can offer no assistance to the jury.”

Id. at *1. This new standard thus permits largely unsupported opinion that can offer bad assistance to the jury. As Judge Ericksen demonstrates, this new standard, which has no warrant in the statutory text of Rule 702 or its advisory committee notes, allows expert witnesses to rely upon studies that have serious internal and external validity flaws.

Jonathan Samet, a specialist in pulmonary medicine, not infectious disease or statistics, is one of the plaintiffs’ principal expert witnesses. Samet relies in large measure upon an observational study,2 which purports to find an increased odds ratio for use of the Bair Hugger among infection cases in one particular hospital. The defense epidemiologist, Jonathan B. Borak, criticized the McGovern observational study on several grounds, including that the study was highly confounded by the presence of other known infection risks. Id. at *6. Judge Ericksen characterized Borak’s opinion as an assertion that the McGovern study was an “insufficient basis” for the plaintiffs’ claims. A fair reading of even Judge Ericksen’s précis of Borak’s proffered testimony compels the conclusion that, in Borak’s view, the McGovern study was invalid because of data collection errors and confounding. Id.

Judge Ericksen’s judicial assessment, taken from the disagreement between Samet and Borak, is that there are issues with the McGovern study, which go to the “weight of the evidence.” This finding obscures, however, the strong challenges to the internal and external validity of the study. Drawing causal inferences from an invalid observational study is a methodological issue, not a weight-of-the-evidence problem for the jury to resolve. The MDL opinion never addresses the Rule 703 issue of whether an epidemiologic expert witness would reasonably rely upon such a confounded study.
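
Why confounding is a validity problem, and not a mere matter of “weight,” can be shown with a small, entirely hypothetical example; the counts below are invented for illustration and are not the McGovern data. Suppose that longer operations both raise the infection risk and make use of the warming device more likely. Collapsing over that confounder can manufacture an association that exists in neither stratum of the data:

    # Invented counts, for illustration only; not the McGovern study data.
    def odds_ratio(a, b, c, d):
        # a, b: exposed with / without infection; c, d: unexposed with / without infection
        return (a / b) / (c / d)

    # Stratum A (long operations): exposed 16/80 infected, unexposed 4/20 infected
    print(odds_ratio(16, 64, 4, 16))            # 1.0 -- no association
    # Stratum B (short operations): exposed 1/20 infected, unexposed 4/80 infected
    print(odds_ratio(1, 19, 4, 76))             # 1.0 -- no association

    # Crude table, ignoring the confounder: exposed 17/100, unexposed 8/100 infected
    print(round(odds_ratio(17, 83, 8, 92), 2))  # about 2.36 -- spurious elevation

On these invented numbers, the crude odds ratio of roughly 2.4 is an artifact of the confounder, not evidence of an effect, which is why drawing causal inferences from a confounded study is a methodological question for the gatekeeper rather than a matter of “weight” for the jury.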

The defense proffered the opinion of Theodore R. Holford, who criticized Dr. Samet for drawing causal inferences from the McGovern observational study. Holford, a professor of biostatistics at Yale University’s School of Public Health, analyzed the raw data behind the McGovern study. Id. at *8. The plaintiffs challenged Holford’s opinions on the ground that he relied on data in “non-final” form, from a temporally expanded dataset. Even more intriguingly, given that the plaintiffs did not present a statistician expert witness, they argued that Holford’s opinions should be excluded because

(1) he insufficiently justified his use of a statistical test, and

(2) he “emphasizes statistical significance more than he would in his professional work.”

Id.

The MDL court dismissed the plaintiffs’ challenge on the mistaken conclusion that the alleged contradictions between Holford’s practice and his testimony “impugn his credibility at most.” If there were truly such a deviation from the statistical standard of care, the issue is methodological, not a credibility issue of whether Holford was telling the truth. And as for the alleged over-emphasis on statistical significance, the MDL court again fell back on the glib conclusions that the allegation goes to the weight, not the admissibility, of expert witness opinion testimony, and that plaintiffs can elicit testimony from Dr. Samet as to how and why Professor Holford over-emphasized statistical significance. Id. Inquiring minds, at the bar and in the academy, are left with no information about what the real issues are in the case.

Generally, both sides’ challenges to expert witnesses were denied.3 The real losers, however, were the scientific and medical communities, bench, bar, and general public. The MDL court glibly and incorrectly treated methodological issues as “credibility” issues, confused sufficiency with validity, and banished methodological failures to consideration by the trier of fact for “weight.” Confounding was mistreated as simply a debating point between the parties’ expert witnesses. The reader of Judge Ericksen’s opinion never learns what statistical test was used by Professor Holford, what justification was needed but allegedly absent for the test, why the justification was contested, and what other test was alleged by plaintiffs to have been a “better” statistical test. As for the emphasis given statistical significance, the reader is left in the dark about exactly what that emphasis was, and how it led to Holford’s conclusions and opinions, and what the proper emphasis should have been.

Eventually appellate review of the Bair Hugger MDL decision must turn on whether the district court abused its discretion. Although appellate courts give trial judges discretion to resolve Rule 702 issues, the appellate courts cannot reach reasoned decisions when the inferior courts fail to give even a cursory description of what the issues were, and how and why they were resolved as they were.


2 P. D. McGovern, M. Albrecht, K. G. Belani, C. Nachtsheim, P. F. Partington, I. Carluke, and M. R. Reed, “Forced-Air Warming and Ultra-Clean Ventilation Do Not Mix: An Investigation of Theatre Ventilation, Patient Warming and Joint Replacement Infection in Orthopaedics,” 93 J. Bone & Joint Surg. (Br.) 1537 (2011). The article as published contains no disclosures of potential or actual conflicts of interest. A persistent rumor has it that the investigators were funded by a commercial rival to the manufacturer of the Bair Hugger at issue in Judge Ericksen’s MDL. See generally Melissa D. Kellam, Loraine S. Dieckmann, and Paul N. Austin, “Forced-Air Warming Devices and the Risk of Surgical Site Infections,” 98 Ass’n periOperative Registered Nurses (AORN) J. 354 (2013).

3 A challenge to plaintiffs’ expert witness Yadin David was sustained to the extent he sought to offer opinions about the defendant’s state of mind. Id. at *5.

White Hat Bias in the Lab and in the Courtroom

February 20th, 2017


FOIA Exemptions Gobble Up The Statute

November 27th, 2015

Last week, the Supreme Court refused to hear a case in which petitioners sought review of a First Circuit decision that upheld the “commercial information” exemption (exemption 4) to the Freedom of Information Act, 5 U.S.C. § 552 (FOIA). New Hampshire Right to Life v. Dep’t Health & Human Services, 778 F.3d 43 (1st Cir. 2015). See Lyle Denniston, “Court bypasses FOIA challenge,” SCOTUSblog (Nov. 16, 2015).

An anti-abortion group filed a FOIA request to obtain documents that Planned Parenthood had sent to the federal government’s Department of Health and Human Services, in support of federal funding, for family planning activities in New Hampshire. The requested documents described Planned Parenthood’s internal medical standards and guidelines, as well as its set fees for various services. The federal trial court upheld the agency’s refusal to disclose the Planned Parenthood documents on the basis of § 552(b)(4) (Exemption 4, for “trade secrets and commercial or financial information obtained from a person and privileged or confidential”), as well as internal agency documents, on the basis of § 552(b)(5) (Exemption 5). The First Circuit affirmed the non-freedom of information. 778 F.3d 43.

Justice Thomas, joined by Justice Scalia, dissented from the Court’s denial of review. New Hampshire Right to Life, No. 14–1273, SCOTUS (Nov. 16, 2015) [Thomas Dissent]. Justice Thomas intimated that the First Circuit’s decision may well have offended the Supreme Court’s interpretation of FOIA as reflecting “a general philosophy of full agency disclosure unless information is exempted under clearly delineated statutory language.” Department of Defense v. FLRA, 510 U. S. 487, 494 (1994).

Justice Thomas noted that the First Circuit based its conclusion not on the ordinary meaning of the term “confidential,” but on speculation whether FOIA disclosure might harm Planned Parenthood’s position in a conjectured market. The First Circuit ordained the Planned Parenthood manual confidential because “[a] potential future competitor could take advantage of the institutional knowledge contained in the Manual” to compete against the organization in the future. Justice Thomas intimated that he, and concurring Justice Scalia, disapproved of this speculation-upon-speculation approach. Thomas Dissent at 2. The dissenters also noted that the Supreme Court has yet to interpret Exemption 4 to FOIA, and that the lower courts have embraced this exemption as a broad exclusion, in derogation of the language and spirit of FOIA.

In discovery efforts to obtain information about litigation science, funded by the National Institute of Environmental Health Sciences (NIEHS), FOIA officers appear to invoke Exemption 4 routinely to deny disclosure. One case in point was the effort to obtain information about NIEHS-funded research of Dr. Brad A. Racette, on the prevalence of parkinsonism among welding tradesmen in Wisconsin Great Lakes shipyards. Racette is an academic researcher, on the faculty of Washington University in St. Louis; he is not engaged in any commercial enterprise, in any imaginable use of the word “commercial.” His Wisconsin research was sponsored by the Boilermakers’ union, which had worked with the litigation industry (trial bar) to develop a litigation case against the manufacturers of welding rods. FOIA requests for scientific data, protocols, and analyses were met by NIEHS with over-zealous redactions, justified by invocation of FOIA exemptions, including assertions that data and analyses were “confidential commercial information.”

The redaction of one of Racette’s ESNAP reports, on Grant Number SR01ES13743-4, is illustrative. The multi-year grant, entitled “Epidemilogy [sic] of Parkinsonism in Welders,” was awarded to principal investigator Brad Racette in 2007. On October 29, 2009, Racette submitted a report that included data and data analysis. The NIEHS, on its own, or acting at the request of the principal investigator, redacted data, analyses, and conclusions, on grounds of “confidential commercial information.” Invoking an exemption for “commercial information” to withhold the results of a federally funded epidemiologic study, conducted by university-based scientists, seems an extreme distortion of the FOIA statute.

Cynics may say that Justices Thomas and Scalia dissented in the Planned Parenthood case because they were eager, in advancing their theological ideology, to exploit an opportunity to order disclosure that could hurt the good work that Planned Parenthood does. The dissenting justices deserve, however, to be taken at their word, and applauded for chastising their colleagues who were willing to abide the frustration of the word and spirit of the FOIA statute. Sadly, federal agencies seem determined to make information unfree. In the most recent evaluations, the Department of Health and Human Services received a failing grade, among the lowest grades for FOIA performance and responsiveness; only the State Department failed with a lower score. National Freedom of Information Coalition, “FOIA report card shows federal agencies missing the mark” (Mar. 16, 2015); Center for Effective Government, “Making the Grade – Access to Information Scorecard 2015.”