TORTINI

For your delectation and delight, desultory dicta on the law of delicts.

Dodgy Data Duck Daubert Decisions

March 11th, 2020

Judges say the darndest things, especially when it comes to their gatekeeping responsibilities under Federal Rules of Evidence 702 and 703. One of the darndest things judges say is that they do not have to assess the quality of the data underlying an expert witness’s opinion.

Even when acknowledging their obligation to “assess the reasoning and methodology underlying the expert’s opinion, and determine whether it is both scientifically valid and applicable to a particular set of facts,”[1] judges have excused themselves from examining the trustworthiness of the underlying data when assessing the admissibility of an expert witness’s opinion.

In McCall v. Skyland Grain LLC, the defendant challenged an expert witness’s reliance upon oral reports of clients. The witness, Mr. Bradley Walker, asserted that he regularly relied upon such reports in contexts similar to the allegations that the defendant had misapplied herbicide to plaintiffs’ crops. The trial court ruled that the defendant could cross-examine the declarant, who was available at trial, and concluded that the “reliability of that underlying data can be challenged in that manner and goes to the weight to be afforded Mr. Walker’s conclusions, not their admissibility.”[2] Remarkably, the district court never evaluated the reasonableness of Mr. Walker’s reliance upon client reports in this or any context.

In another federal district court case, Rodgers v. Beechcraft Corporation, the trial judge explicitly acknowledged the responsibility to assess whether the expert witness’s opinion was based upon “sufficient facts and data,” but disclaimed any obligation to assess the quality of the underlying data.[3] The trial court in Rodgers cited a Tenth Circuit case from 2005,[4] which in turn cited the Supreme Court’s 1993 decision in Daubert, for the proposition that the admissibility review of an expert witness’s opinion was limited to a quantitative sufficiency analysis, and precluded a qualitative analysis of the underlying data’s reliability. Quoting from another district court criminal case, the court in Rodgers announced that “the Court does not examine whether the facts obtained by the witness are themselves reliable – whether the facts used are qualitatively reliable is a question of the weight to be given the opinion by the factfinder, not the admissibility of the opinion.”[5]

In a 2016 decision, United States v. DishNetwork LLC, the court explicitly disclaimed that it was required to “evaluate the quality of the underlying data or the quality of the expert’s conclusions.”[6] This district court pointed to a Seventh Circuit decision, which maintained that “[t]he soundness of the factual underpinnings of the expert’s analysis and the correctness of the expert’s conclusions based on that analysis are factual matters to be determined by the trier of fact, or, where appropriate, on summary judgment.”[7] The Seventh Circuit’s decision, however, was issued in June 2000, several months before the effective date of the amendments to Federal Rule of Evidence 702 (December 2000).

In 2012, a magistrate judge issued an opinion along the same lines, in Bixby v. KBR, Inc.[8] After acknowledging what must be done in ruling on a challenge to an expert witness, the judge took joy in what could be overlooked. If the facts or data upon which the expert witness has relied are “minimally sufficient,” then questions about “the nature or quality of the underlying data bear upon the weight to which the opinion is entitled or to the credibility of the expert’s opinion, and do not bear upon the question of admissibility.”[9]

There need not be any common law mysticism to the governing standard. The relevant law is, of course, a statute, which appears to be forgotten in many of the failed gatekeeping decisions:

Rule 702. Testimony by Expert Witnesses

A witness who is qualified as an expert by knowledge, skill, experience, training, or education may testify in the form of an opinion or otherwise if:

(a) the expert’s scientific, technical, or other specialized knowledge will help the trier of fact to understand the evidence or to determine a fact in issue;

(b) the testimony is based on sufficient facts or data;

(c) the testimony is the product of reliable principles and methods; and

(d) the expert has reliably applied the principles and methods to the facts of the case.

It would seem that you could not produce testimony that is the product of reliable principles and methods by starting with unreliable underlying facts and data. Certainly, having a reliable method would require selecting reliable facts and data from which to start. What good would come from the reliable application of reliable principles to crummy data?

The Advisory Committee Note to Rule 702 hints at an answer to the problem:

“There has been some confusion over the relationship between Rules 702 and 703. The amendment makes clear that the sufficiency of the basis of an expert’s testimony is to be decided under Rule 702. Rule 702 sets forth the overarching requirement of reliability, and an analysis of the sufficiency of the expert’s basis cannot be divorced from the ultimate reliability of the expert’s opinion. In contrast, the ‘reasonable reliance’ requirement of Rule 703 is a relatively narrow inquiry. When an expert relies on inadmissible information, Rule 703 requires the trial court to determine whether that information is of a type reasonably relied on by other experts in the field. If so, the expert can rely on the information in reaching an opinion. However, the question whether the expert is relying on a sufficient basis of information—whether admissible information or not—is governed by the requirements of Rule 702.”

The answer is only partially satisfactory. First, if the underlying data are independently admissible, then there may indeed be no gatekeeping of an expert witness’s reliance upon such data. Rule 703 imposes a reasonableness test for reliance upon inadmissible underlying facts and data, but appears to give otherwise admissible facts and data a pass. Second, the above judicial decisions do not mention any Rule 703 challenge to the expert witnesses’ reliance. If none was made, then there is a clear lesson for counsel. When framing a challenge to the admissibility of an expert witness’s opinion, show that the witness has unreasonably relied upon facts and data, from whatever source, in violation of Rule 703. Then show that without the unreasonably relied upon facts and data, the witness cannot show that his or her opinion satisfies Rule 702(a)-(d).


[1]  See, e.g., McCall v. Skyland Grain LLC, Case 1:08-cv-01128-KHV-BNB, Order (D. Colo. June 22, 2010) (Brimmer, J.) (citing Dodge v. Cotter Corp., 328 F.3d 1212, 1221 (10th Cir. 2003), citing in turn Daubert v. Merrell Dow Pharms., Inc., 509 U.S. 579, 592-93 (1993)).

[2]  McCall v. Skyland Grain LLC, Case 1:08-cv-01128-KHV-BNB, Order at p.9 n.6 (D. Colo. June 22, 2010) (Brimmer, J.).

[3]  Rodgers v. Beechcraft Corp., Case No. 15-CV-129-CVE-PJC, Report & Recommendation at p.6 (N.D. Okla. Nov. 29, 2016).

[4]  Id., citing United States v. Lauder, 409 F.3d 1254, 1264 (10th Cir. 2005) (“By its terms, the Daubert opinion applies only to the qualifications of an expert and the methodology or reasoning used to render an expert opinion” and “generally does not, however, regulate the underlying facts or data that an expert relies on when forming her opinion.”), citing Daubert v. Merrell Dow Pharms., Inc., 509 U.S. 579, 592-93 (1993).

[5]  Id., citing and quoting United States v. Crabbe, 556 F. Supp. 2d 1217, 1223 (D. Colo. 2008) (emphasis in original). In Crabbe, the district judge mostly excluded the challenged expert witness, thus rendering its verbiage on the quality of data obiter dicta. The pronouncements about the nature of gatekeeping proved harmless error when the court dismissed the case on other grounds. Rodgers v. Beechcraft Corp., 248 F. Supp. 3d 1158 (N.D. Okla. 2017) (granting summary judgment).

[6]  United States v. DishNetwork LLC, No. 09-3073, Slip op. at 4-5 (C.D. Ill. Jan. 13, 2016) (Myerscough, J.).

[7]  Smith v. Ford Motor Co., 215 F.3d 713, 718 (7th Cir. 2000).

[8]  Bixby v. KBR, Inc., Case 3:09-cv-00632-PK, Slip op. at 6-7 (D. Or. Aug. 29, 2012) (Papak, M.J.).

[9]  Id. (citing Hangarter v. Provident Life & Accident Ins. Co., 373 F.3d 998, 1017 (9th Cir. 2004), quoting Children’s Broadcasting Corp. v. Walt Disney Co., 357 F.3d 860, 865 (8th Cir. 2004) (“The factual basis of an expert opinion goes to the credibility of the testimony, not the admissibility, and it is up to the opposing party to examine the factual basis for the opinion in cross-examination.”)).

Practical Solutions for the Irreproducibility Crisis

March 3rd, 2020

I have previously praised the National Association of Scholars (NAS) for its efforts to sponsor a conference on “Fixing Science: Practical Solutions for the Irreproducibility Crisis.” The conference was a remarkable event, with a good deal of diverse viewpoints, civil discussion and debate, and collegiality.

The NAS has now posted a follow-up to its conference, with a link to the slide presentations, and to a YouTube page with videos of the presentations. The NAS, along with The Independent Institute, should be commended for their organizational efforts, and for their transparency in making the conference contents available now to a wider audience.

The conference took place on February 7th and 8th, and I had the privilege of starting the event with my presentation, “Not Just an Academic Dispute: Irreproducible Scientific Evidence Renders Legal Judgments Unsafe”.

Some, but not all, of the interesting presentations that followed:

Tim Edgell, “Stylistic Bias, Selective Reporting, and Climate Science” (Feb. 7, 2020)

Patrick J. Michaels, “Biased Climate Science” (Feb. 7, 2020)

Daniele Fanelli, “Reproducibility Reforms if there is no Irreproducibility Crisis” (Feb. 8, 2020)

On Saturday, I had the additional privilege of moderating a panel on “Group Think” in science, and its potential for skewing research focus and publication:

Lee Jussim, “Intellectual Diversity Limits Groupthink in Scientific Psychology” (Feb. 8, 2020)

Mark Regnerus, “Groupthink in Sociology” (Feb. 8, 2020)

Michael Shermer, “Giving the Devil His Due” (Feb. 8, 2020)

Later on Saturday, the presenters turned to methodological issues, many of which are key to understanding ongoing scientific and legal controversies:

Stanley Young, “Prevention and Management of Acute and Late Toxicities in Radiation Oncology”

James E. Enstrom, “Reproducibility is Essential to Combating Environmental Lysenkoism”

Deborah Mayo, “P-Value ‘Reforms’: Fixing Science or Threats to Replication and Falsification?” (Feb. 8, 2020)

Ronald L. Wasserstein, “What Professional Organizations Can Do To Fix The Irreproducibility Crisis” (Feb. 8, 2020)

Louis Anthony Cox, Jr., “Causality, Reproducibility, and Scientific Generalization in Public Health” (Feb. 8, 2020)

David Trafimow, “What Journals Can Do To Fix The Irreproducibility Crisis” (Feb. 8, 2020)

David Randall, “Regulatory Science and the Irreproducibility Crisis” (Feb. 8, 2020)