Appeared in Law360 on February 4, 2016. Originally appeared in Kaye Scholer's Product Liability Report.
—by Pamela Yates, Bert Slonim and Aaron Levine
In any deposition, counsel wants to get the jury-friendly admission from a plaintiff’s medical expert that will be the key to a defense verdict—whether it is a causation admission that something other than your client’s product could have caused the injury or a credibility admission that casts doubt on the expert’s testimony. In some cases, however, there may be an opportunity to get something more than the admissions that will sway a jury. There may be an opportunity to exclude the expert from testifying at all or to substantially limit the scope of the expert’s testimony. Here, we discuss strategies and techniques for deposing an expert to set up a Daubert motion aimed at excluding plaintiff’s expert from testifying at trial.
Federal Rule of Evidence 702, which sets forth the requirements for the admissibility of expert testimony, provides:
If scientific, technical, or other specialized knowledge will assist the trier of fact to understand the evidence or to determine a fact in issue, a witness qualified as an expert by knowledge, skill, experience, training, or education, may testify thereto in the form of an opinion or otherwise, if (1) the testimony is based upon sufficient facts or data, (2) the testimony is the product of reliable principles and methods, and (3) the witness has applied the principles and methods reliably to the facts of the case.
Thus, the primary requirements for an expert to offer opinion testimony at trial are: (1) the testimony must rest on knowledge beyond that of a layperson and assist the jury; (2) the expert must be qualified; and (3) the expert’s methodology must be reliable and must fit the case. An expert deposition should probe each of these requirements, particularly the reliability of the expert’s methodology.
In many ways, a Daubert challenge is like a high-school math test. You may get the right answer to the question, but if you don’t show that you got the answer the right way, you won’t pass the test. The same applies here. A Daubert challenge goes not to the expert’s conclusion, but to the methodology. It bears repeating that a Daubert motion is not fundamentally about whether the plaintiff’s expert is right or wrong, because that is a jury question. Thus, a common rebuttal to any Daubert motion is that defendants are merely challenging the conclusions of the expert, not the expert’s actual methodology. The proponent of the expert will portray the issue as a legitimate scientific debate where reasonable experts merely disagree. In order to prevail on a Daubert motion, the aim is not to show that the expert reached a conclusion that is wrong, but that the expert used flawed methods to reach it.
Is the Expert Qualified?
Although it is rare that a party will put forth an expert who simply lacks any special knowledge or skill beyond that of a layperson, setting up an appropriate attack on qualifications as an initial matter can ultimately alert the court to any limitations or biases that may raise skepticism about the expert’s methods. Further, even if the attack on an expert’s qualifications does not play directly into the Daubert challenge, it may provide useful material for cross-examination at trial. Therefore, when preparing for any expert deposition, counsel should start by thoroughly vetting the expert. Items that should be reviewed include: the expert’s CV; testimonial history; IDEX/DRI/Bloomberg reports; prior disciplinary actions; publications; Google searches; Twitter/Facebook and other social media profiles; and any YouTube or other video sources. Regardless of whether these searches turn up a smoking gun, they will inform the strategy for the deposition, and counsel will have a feel for the witness before first meeting him or her at the deposition.
Digging into the expert’s background does not end the inquiry into qualifications. As part of the deposition strategy—whether for a Daubert challenge or for trial—it is important to wall off an expert from testifying about subjects and from offering opinions that go beyond his or her expertise. Consider asking a series of “you are not . . .” questions. For example: You are not a medical doctor? You are not a cardiologist? You are not an epidemiologist? You are not a biologist? You are not a toxicologist? You are not board-certified in psychiatry? You are not a PhD in statistics? Just because a witness is an expert in one field does not give him or her license to offer opinions across the spectrum of medical or scientific disciplines.
Moreover, once the expert’s field has been suitably defined and narrowed, it is worth exploring whether there are guidelines about expert testimony that have been established by the professional societies related to the expert’s specialty. Some examples include the AMA Code of Medical Ethics Opinion 9.07, “Medical Testimony”; American Academy of Neurology Qualifications and Guidelines for the Physician Expert Witness; and The Teratology Society’s 2005 position paper regarding “Causation in Teratology-Related Litigation.” In particular, it may be possible to show that the expert’s methodology deviates from the standards promulgated by the professional society. In any event, these sources are useful for providing standards set forth by the expert’s peers in the relevant field.
Of course, there also are many other areas to explore with respect to the expert’s background and bias—how much money the expert has made testifying, whether the testimony is always on behalf of plaintiffs, whether the expert has published on the subject matter, whether the expert teaches the subject, etc.—but again, these attacks simply lay the foundation for casting doubt about the expert. The true Daubert attack must be on the methodology.
The Core Daubert Factors
While there is no single factor or group of factors that is determinative of a reliable methodology, courts have set forth a variety of factors that should be considered. These factors are aimed at drawing the distinction between an expert who properly employs an established method versus one who engages in junk science, speculation and inappropriate extrapolation. In Daubert, the Supreme Court set forth the following non-exhaustive list of core factors:
• Whether the expert’s theory or technique can be (and has been) tested
• Whether the expert’s theory or technique has been subjected to peer review and publication
• Whether there is a known or potential rate of error and/or standards that control the technique’s operation
• Whether the assessment or technique is generally accepted in the scientific community
Daubert, 113 S. Ct. at 2796–97.
Since the Supreme Court’s decision, Daubert progeny cases have set forth multiple other criteria for courts to consider. See, e.g., General Electric Co. v. Joiner, 522 U.S. 136, 146 (1997) (excluding expert testimony where there was “simply too great an analytical gap between the data and the opinion proffered”); In re Paoli R.R. Yard PCB Litig., 35 F.3d 717, 765 (3d Cir. 1994) (excluding expert testimony where the expert “place[d] heavy reliance on unreliable . . . data”).
In exploring the core Daubert factors, counsel should ask the expert whether the expert has tested his or her theory; what the error rate is of his or her method; whether the expert has published his or her findings to his or her peers; whether the expert has presented his or her theories at professional scientific meetings and symposia; and whether the expert can cite any published studies, peer-reviewed papers, text books or scientific authorities that endorse his or her methodology and concur with his or her conclusions.
Of course, the reliability of the method is not revealed simply by asking the expert whether she can answer the questions above. In order to uncover the actual failures in an expert’s method, a thorough exploration of the expert’s analysis is warranted. Counsel should start with the expert’s outlier conclusion and work backwards, showing that it could only have been reached through a faulty methodology. Every link in the expert’s chain of reasoning should be explored; if even one link is unreliable, the opinion may fall.
What Are the Methodological Flaws?
The most important question in any Daubert inquiry is this: why is this expert’s methodology unreliable? The expert will almost always say that he or she reviewed the relevant studies, analyzed the strengths and weaknesses of the data, and applied the proper criteria (e.g., causation experts will often say that they applied the Bradford Hill factors). But paying lip service to the methodology is not the methodology. So it is important to demonstrate where the specific failures lie within the expert’s methods. In preparing to do so, counsel should consider the following:
• Did the expert ignore contrary data?
• Did the expert cherry-pick data?
• Did the expert fail to consider all the relevant data?
If it is discovered that an expert cherry-picked only the most favorable studies or results supporting his or her conclusion, the inquiry does not end there. It still should be explored why the expert cherry-picked. In other words, could the exclusion of unfavorable studies be justified by a reliable methodology? (Maybe the excluded studies were seriously flawed.) To that end, counsel will want to know what inclusion/exclusion criteria the expert used in selecting studies; why some studies were omitted; and what criteria the expert used to determine which results within a particular study were most reliable.
Beyond just cherry-picking, methodological flaws may exist in the type of data relied on by an expert. An expert opining on human causation is usually required to cite human epidemiological studies in support of his or her opinion. But in a case where the epidemiologic data does not support the plaintiff’s case, an expert may look for other support for his or her opinions. To that end, counsel should consider:
• Did the expert inappropriately rely on data based on excessive doses?
• Did the expert inappropriately rely on animal studies?
• Did the expert inappropriately rely on in vitro (test-tube) studies?
• Did the expert inappropriately rely on weak and/or unreplicated statistical associations?
Where these issues occur, you will want to establish the significance of the expert’s reliance on such data. Some approaches to consider here are: Isn’t it true that the dose you cite in support of this finding is 10 times the dose that is given to humans? Isn’t it true that you are extrapolating from this in vitro or animal study to humans? Isn’t it true that the only finding you cite for this proposition is not statistically significant or that it has not been replicated in other studies?
Where Are the Analytical Gaps?
It is also important to consider how the expert is using the data to support his or her opinion. Most experts will provide some rationale for the manner in which they weighed or interpreted the data, but at times, uncovering that rationale will expose the flawed methodology. For instance, an expert claiming to consider all of the data, or the totality of the data, may simply be putting together a laundry list and then drawing unsupported conclusions about it. In determining how to uncover the analytical gaps in the expert’s methodology, counsel should consider:
• Is the expert overstating the results of some studies, or drawing conclusions that go beyond what the study authors themselves concluded?
• Is the expert drawing inappropriate extrapolations?
• Is the expert substituting personal opinion as scientific knowledge?
• Is the expert merely offering untested or unproven hypotheses (e.g., biological plausibility)?
To this end, it is often important to ask the expert what support there is for each area where she may be speculating, assuming or overstating the evidence. Useful questions include: What is the basis for that statement? Isn’t it true that the study authors offer other explanations for this finding (quote the study)? Did you consider this study’s express qualifications/limitations regarding the results? What criteria did you use for determining how to weigh the results of different studies? Can you cite any peer-reviewed, published study that supports that statement? Have you tested your theory? Have you ruled out other possible causes?
Just because an expert uses an unreliable method doesn’t mean it is easily exposed. A savvy expert will not only know how to find a way to support his or her conclusion, but will also be prepared to defend the method at deposition. Since courts are not as well versed in the science as the expert or the lawyers, many judges can do little more than look for the expert to provide a plausible explanation for any departure from standard methods. When an expert is prepared to explain why he or she did what he or she did, you may need to consider other options.
The use of hypothetical questions may be helpful in this regard. Of course, these questions vary greatly and must be tailored to the specific case, but the following may help you decide how to do so in yours: You wouldn’t cite a finding from one study as reliable, but reject other findings from the same study as unreliable? You wouldn’t use one methodology to determine causation in this case, and use a different methodology to determine causation in another case involving a different drug?
Many times, an expert who testifies in one case will be designated in future cases involving the same product. Where an expert is offering case-specific opinions, you will not only want to consider the case at hand, but will also want to lock the expert in for future cases. Hypotheticals can be particularly useful for the “next” case where the expert may testify. For example: You agree that you must rule out “X” risk factor in order to determine that this product caused the injury? If Mr. Smith took drug “A,” drug “B” and drug “C,” how would you know which product caused his injury? If Mrs. Doe used the medication for less than one month, would that affect your opinion?
Finally, counsel should be prepared to go off script. While it is essential to prepare a detailed outline that attacks the flaws identified in the expert’s methodology, merely moving from one scripted question to another may not yield the necessary testimony. The toughest expert to challenge is a hired gun who knows where the weaknesses in the data are and how to stay on message. However, just because an expert says he or she follows a reliable methodology does not make it so. It is important to prepare for any deposition by working closely with the corresponding defense expert so that counsel is equally, if not better, prepared. So when the expert doesn’t adequately answer a question, don’t just ask it over and over again, but do not simply move on either. Be prepared to fight. When the science is strong enough to file a Daubert motion, counsel needs to know it and must recognize where each answer digs the expert a new hole. If the expert rejects one finding because the study had a certain flaw, counsel needs to know which of the studies the expert relied on share the same flaw. When an expert relies on a nonsignificant finding in support of his or her opinion, counsel will need to know which nonsignificant findings refute the expert’s opinion. At the end of the day, this may or may not accomplish what is needed for a Daubert motion, but even if it doesn’t, it will develop useful material for trial.