
Thread: A D.C. judge issues a much-needed opinion on ‘junk science’

  1. #1
    Join Date
    Jun 2017
    Location
    Tennessee
    Posts
    2,273
    Feedback Score
    0

    A D.C. judge issues a much-needed opinion on ‘junk science’

    https://www.washingtonpost.com/opini...rce=reddit.com

    Last September, the D.C. Superior Court restricted the testimony of a prosecution ballistics expert in a felony case. I want to draw some attention to the opinion, which I haven’t seen written up elsewhere, because it is one of the best decisions I have read in response to a challenge to the scientific validity of forensic evidence, particularly in a criminal case.

    As I’ve written here ad nauseam, judges are entrusted to be the gatekeepers of good and bad science in the courtroom. By and large, they’ve performed poorly. Judges are trained to perform legal analysis, not scientific analysis, and law and science are two very different fields. Science is forward-looking, always changing and adapting to discoveries and new empirical evidence. The law, by contrast, puts a premium on consistency and predictability. It relies on precedent, so courts look to previous courts for guidance and are often bound by prior decisions.

    By and large, judges have approached their task of scientific analysis just as we might expect them to: They have tried to apply it within a legal framework. This means when assessing whether a given field of forensics is scientifically reliable, judges tend to look to what previous courts have already determined. And when confronted with a new field, they tend to err on the side of relying on our adversarial system — they let the evidence in but also let the defense call its own experts to dispute the prosecution’s witness. The problem here is that by simply admitting the evidence, the courts lend it an air of legitimacy. Once the evidence is allowed in, whether jurors find it convincing tends to come down to which witness is most persuasive. State’s witnesses are often seen as unbiased and altruistic, while jurors tend to see defense witnesses as hired guns. And the set of skills it takes to persuade a jury isn’t necessarily the same skill set of a careful and cautious scientist. Indeed, the two are often in conflict.
    Religion is doing what you are told no matter what is right. Morality is doing what is right no matter what you are told...

  2. #2
    Join Date
    Mar 2012
    Location
    Kansas
    Posts
    9,937
    Feedback Score
    1 (100%)
    Would it be possible to copy the whole story? WaPo requires you to pay to view.
    Patriotism means to stand by the country. It does not mean to stand by the President... - Theodore Roosevelt, Lincoln and Free Speech, Metropolitan Magazine, Volume 47, Number 6, May 1918.

    Every Communist must grasp the truth: Political power grows out of the barrel of a gun. Our principle is that the Party commands the gun, and the gun must never be allowed to command the Party. - Mao Zedong, 6 November 1938, speech to the Communist Party of China's sixth Central Committee

  3. #3
    Join Date
    Jun 2017
    Location
    Tennessee
    Posts
    2,273
    Feedback Score
    0
    Quote Originally Posted by 26 Inf View Post
    Would it be possible to copy the whole story? WaPo requires you to pay to view.
    I wasn't sure if that was OK here. If it's not, I apologize to the MODS.

    Last September, the D.C. Superior Court restricted the testimony of a prosecution ballistics expert in a felony case. I want to draw some attention to the opinion, which I haven’t seen written up elsewhere, because it is one of the best decisions I have read in response to a challenge to the scientific validity of forensic evidence, particularly in a criminal case.

    As I’ve written here ad nauseam, judges are entrusted to be the gatekeepers of good and bad science in the courtroom. By and large, they’ve performed poorly. Judges are trained to perform legal analysis, not scientific analysis, and law and science are two very different fields. Science is forward-looking, always changing and adapting to discoveries and new empirical evidence. The law, by contrast, puts a premium on consistency and predictability. It relies on precedent, so courts look to previous courts for guidance and are often bound by prior decisions.
    By and large, judges have approached their task of scientific analysis just as we might expect them to: They have tried to apply it within a legal framework. This means when assessing whether a given field of forensics is scientifically reliable, judges tend to look to what previous courts have already determined. And when confronted with a new field, they tend to err on the side of relying on our adversarial system — they let the evidence in but also let the defense call its own experts to dispute the prosecution’s witness. The problem here is that by simply admitting the evidence, the courts lend it an air of legitimacy. Once the evidence is allowed in, whether jurors find it convincing tends to come down to which witness is most persuasive. State’s witnesses are often seen as unbiased and altruistic, while jurors tend to see defense witnesses as hired guns. And the set of skills it takes to persuade a jury isn’t necessarily the same skill set of a careful and cautious scientist. Indeed, the two are often in conflict.

    This is why a field such as bite-mark analysis — which has been found to be unreliable by multiple scientific bodies — has yet to be disallowed by any courtroom in the country. Every time it has been challenged, the court has upheld its validity.
    This brings me to the September D.C. opinion of United States v. Marquette Tibbs, written by Associate Judge Todd E. Edelman. In this case, the prosecution wanted to put on a witness who would testify that the markings on a shell casing matched those of a gun discarded by a man who had been charged with murder. The witness planned to testify that after examining the marks on a casing under a microscope and comparing it with marks on casings fired by the gun in a lab, the shell casing was a match to the gun.
    This sort of testimony has been allowed in thousands of cases in courtrooms all over the country. But this type of analysis is not science. It’s highly subjective. There is no way to calculate a margin for error. It involves little more than looking at the markings on one casing, comparing them with the markings on another and determining whether they’re a “match.” Like other fields of “pattern matching” analysis, such as bite-mark, tire-tread or carpet-fiber analysis, there are no statistics that analysts can produce to back up their testimony. We simply don’t know how many other guns could have created similar markings. Instead, the jury is simply asked to rely on the witness’s expertise about a match.

    Because this sort of testimony has been accepted by courts thousands of times over, it would have been easy and relatively unremarkable for Edelman to have cited those decisions and allowed the evidence. He could have argued that any doubts about the evidence could have been addressed by the defense during cross examination or by putting on its own expert. Instead, Edelman held a thorough evidentiary hearing, known as a Daubert hearing (named for a Supreme Court case on the admissibility of scientific evidence), personally reviewed the testimony and scientific literature, and reached a conclusion.

    Here’s the heart of the opinion:

    After conducting an extensive evidentiary hearing in this case—one that involved detailed testimony from a number of distinguished expert witnesses, review of all of the leading studies in the discipline, pre- and post-hearing briefing, and lengthy arguments by skilled and experienced counsel—this Court ruled on August 8, 2019 that application of the Daubert factors requires substantial restrictions on specialized opinion testimony in this area. Based largely on the inability of the published studies in the field to establish an error rate, the absence of an objective standard for identification, and the lack of acceptance of the discipline’s foundational validity outside of the community of firearms and toolmark examiners, the Court precluded the government from eliciting testimony identifying the recovered firearm as the source of the recovered cartridge casing. Instead, the Court ruled that the government’s expert witness must limit his testimony to a conclusion that, based on his examination of the evidence and the consistency of the class characteristics and microscopic toolmarks, the firearm cannot be excluded as the source of the casing. The Court issues this Memorandum Opinion to further elucidate the ruling it made in open court.

    Note that Edelman did not rule that the witness couldn’t testify at all. He ruled that the witness could testify only to conclusions backed by scientific research. The witness could tell the jury that he could not exclude the gun as the weapon that produced the casing. But he could not say it’s a match because such a conclusion could not be proved.

    This is an important distinction. Even the most strident critics of these fields of forensics don’t claim that they’re useless. Even bite-mark analysis can have some (minimal) investigative value. If there are clear bite marks all over a victim, for example, and the main suspect has no teeth, it seems safe to say that the suspect isn’t the source of the bites.

    But it’s useful to compare fields like this with single-source DNA evidence, which is backed by science. DNA analysts don’t tell jurors that a suspect is a match. Instead, they use percentages. Because we know the frequency with which specific DNA markers are distributed across the population, analysts can calculate the odds that anyone other than the suspect was the source of the DNA in question. We can’t do that with marks on shell casings, or bite marks, or pry marks on a door because there is no way of knowing how many different guns or teeth or crowbars might, under the right conditions, produce identical marks.
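    As a rough illustration (mine, not the article's) of the "product rule" arithmetic behind those DNA percentages, here is a minimal Python sketch; the locus frequencies are made-up placeholder values, not real population data.

    ```python
    # Hypothetical sketch: a single-source DNA random match probability is commonly
    # estimated by multiplying the population frequency of the observed genotype
    # at each tested locus (the "product rule"). These frequencies are invented
    # example values for illustration only.
    locus_genotype_frequencies = [0.10, 0.05, 0.08, 0.12, 0.07]  # one per tested locus

    random_match_probability = 1.0
    for freq in locus_genotype_frequencies:
        random_match_probability *= freq

    print(f"Estimated random match probability: 1 in {1 / random_match_probability:,.0f}")
    # With these example numbers: roughly 1 in 300,000 unrelated individuals.
    ```

    No analogous calculation exists for toolmarks or bite marks, which is the article's point: there is no database of mark frequencies from which to derive such a probability.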

    What is remarkable about Edelman’s opinion is that he acknowledges that his ruling will be unusual and that it will cut against nearly every court to rule before him, including appellate courts. But he issues it anyway, because it happens to be correct.

    Judges across the United States have considered similar challenges to firearms and toolmark identification evidence. Of course, “for many decades ballistics testimony was accepted almost without question in most federal courts in the United States.” Based on the pleadings in this case, as well as the Court’s own research, there do not appear to be any reported cases in which this type of evidence has been excluded in its entirety. Earlier this year, the United States District Court for the District of Nevada also surveyed the relevant case law and concluded that no federal court had found the method of firearms and toolmark examination promoted by AFTE—the method generally used by American firearms examiners and employed by Mr. Coleman in this case—to be unreliable.
    Nevertheless, he determines that the guiding principle here should not be precedent. It should be science.

    In evaluating the persuasive weight of these decisions, however, the undersigned could not help but note that, despite the enhanced gatekeeping role demanded by Daubert, see 509 U.S. at 589, the overwhelming majority of the reported post-Daubert cases regarding this type of expert opinion testimony have not engaged in a particularly extensive or probing analysis of the evidence’s reliability. In 2009, the National Research Council (“NRC”) specifically criticized the judiciary’s treatment of issues relating to the admissibility of firearms and toolmark evidence and the judiciary’s failure to apply Daubert in a meaningful fashion. In the NRC’s view, “[t]here is little to indicate that courts review firearms evidence pursuant to Daubert’s standard of reliability.” …
    Without disparaging the work of other courts, the NRC’s critique of our profession rings true, at least to the undersigned: many of the published post-Daubert opinions on firearms and toolmark identification involved no hearing on the admissibility of the evidence or only a cursory analysis of the relevant issues.

    Yet, the case law in this area follows a pattern in which holdings supported by limited analysis are nonetheless subsequently deferred to by one court after another. This pattern creates the appearance of an avalanche of authority; on closer examination, however, these precedents ultimately stand on a fairly flimsy foundation. The NRC credited Professor David Faigman—one of the defense experts who testified at the Daubert hearing in this matter—with the observation that trial courts defer to expert witnesses; appellate courts then defer to the trial courts; and subsequent courts then defer to the earlier decisions.

    As someone who has been beating this drum for years, I can’t tell you how satisfying it is to see this in a court opinion. It’s just remarkable.
    In Daubert v. Merrell Dow Pharmaceuticals Inc., the Supreme Court laid out markers that judges should look for when assessing scientific evidence, such as whether the methods in question are subject to peer review and whether the expert’s methods are generally accepted in the scientific community. Consequently, Daubert spawned cottage industries of forensic boards, certifying organizations and quasi-academic journals, all aimed at conferring legitimacy on dubious fields. When assessing a challenge to the scientific reliability of an entire discipline of forensics such as ballistics analysis or bite-mark analysis, then, too many judges have simply looked to these bogus boards and journals and concluded that the state’s expert and his or her methods are “generally accepted.”

    But they’re accepted only by other experts within those same suspect fields. These judges neglect to consider how the entire field has been assessed by actual scientists. It’s like judging the scientific validity of an astrologer by citing astrology journals or by consulting other astrologers.
    In this case, the prosecution cited a publication called the Association of Firearm and Tool Mark Examiners (AFTE) Journal, which it claimed had published “peer-reviewed” studies concluding that ballistics analysts had a low rate of error. In his opinion, Edelman deftly slices through this noise:
    Overall, the AFTE Journal’s use of reviewers exclusively from within the field to review articles created for and by other practitioners in the field greatly reduces its value as a scientific publication, especially when considered in conjunction with the general lack of access to the journal for the broader academic and scientific community as well as its use of an open review process. …

    Other courts considering challenges to this discipline under Daubert have concluded that publication in the AFTE Journal satisfies this prong of the admissibility analysis. …
    It is striking, however, that these courts devote little attention to the sufficiency of this journal’s peer review process or to the issues stemming from a review process dominated by financially and professionally interested practitioners, and instead, mostly accept at face value the assertions regarding the adequacy of the journal’s peer review process. …

    In the undersigned’s view, if Daubert, Motorola, and Rule 702 are to have any meaning at all, courts must not confine the relevant scientific community to the specific group of practitioners dedicated to the validity of the theory—in other words, to those whose professional standing and financial livelihoods depend on the challenged discipline. As Judge Jon M. Alander of the Superior Court of Connecticut aptly stated, “[i]t is self evident that practitioners accept the validity of the method as they are the ones using it. Were the relevant scientific community limited to practitioners, every scientific methodology would be deemed to have gained general acceptance.”
    Edelman’s opinion is the Platonic ideal of a Daubert analysis. It ought to be the norm. But we should also be careful not to conclude that because Edelman did it correctly, other judges will too. Again, it’s just not realistic to expect people trained in law to accurately assess the validity of scientific evidence that sometimes gets quite complicated.

    One additional item worth noting: In 2016, President Barack Obama nominated Edelman to be a federal district court judge. Despite his impressive résumé, the Republican-controlled Senate never voted on the nomination; it expired eight months later.
    In 2017, President Trump nominated Matthew S. Petersen to fill the vacancy Edelman was denied. Petersen had no previous trial or criminal law experience. In a viral video taken during his confirmation hearing, Sen. John Neely Kennedy (R-La.) asked Petersen about basic concepts in criminal law with which any judge should be familiar. One of those concepts was the Daubert standard. Clearly flustered, Petersen responded, “I don’t have that readily at my disposal. But I would be happy to take a closer look at that.” Petersen later withdrew from consideration for the position.
    Religion is doing what you are told no matter what is right. Morality is doing what is right no matter what you are told...

  4. #4
    Join Date
    Jul 2009
    Posts
    34,104
    Feedback Score
    3 (100%)
    Good. They call this shit ballistic fingerprinting and everyone thinks it's exactly like DNA.

    Change the barrel and put an emery board to the chamber face and presto - no match. That's to say nothing of simply dropping "other" spent cases at the scene. So many ways to undermine it, I don't know where to start.

    Innocent people go to jail, guilty people go free. It's hard enough when the Nifongs of this world suppress reliable DNA evidence to win a case; when you let them have pseudoscience and present it as irrefutable, it only gets worse.
    It's hard to be an ACLU-hating, philosophically Libertarian, socially liberal, fiscally conservative, scientifically grounded, agnostic, porn-admiring gun owner who believes in self-determination.

    Chuck, we miss ya man.

    كافر

  5. #5
    Join Date
    Sep 2009
    Posts
    1,378
    Feedback Score
    8 (100%)
    Quote Originally Posted by tn1911 View Post
    I wasn't sure if that was OK here. If its not I apologize to the MODS.
    Even if so, please at least mark the passage that you’re pasting as coming from the linked article or the original work of the author. It reads better, and adds to the overall tone of the post and forum. It also helps to differentiate between the source material and your own commentary.

    On topic: It seems that every couple of decades, someone has to pump the brakes to make sure we’re obtaining and using evidence (not proof, evidence) the right way. The polygraph went through similar scrutiny 20 or so years ago, and to me, it looks like other fields might be headed for the same kind of examination.
    The advice above is worth exactly what you paid for it.

  6. #6
    Join Date
    Feb 2011
    Location
    Not in a gun friendly state
    Posts
    3,810
    Feedback Score
    0
    Quote Originally Posted by Chameleox View Post
    Even if so, please at least mark the passage that you’re pasting as coming from the linked article or the original work of the author. It reads better, and adds to the overall tone of the post and forum. It also helps to differentiate between the source material and your own commentary.

    On topic: It seems that every couple decades, someone has to pump the brakes to make sure we’re obtaining and using evidence (not proof- evidence) the right way. The polygraph went through similar scrutiny 20 or so years ago, and to me, it looks like other fields might start coming under similar scrutiny.
    The problem is, the polygraph is still widely used in law enforcement. The majority of local, state, and federal LE agencies use it for pre-hire screening, and it's still used in criminal investigations, though the results aren't admissible in court unless the defense stipulates to it.

    A big part of the problem is that law enforcement personnel tend to be very rigid and unchanging in their beliefs. Even if science proves them wrong, most of them will just dismiss it on the grounds that it's not what they were taught. Even if what they were doing wasn't working, there was no doubt a reason they were doing it that way, and you do NOT question tradition under any circumstances. I've had conversations with cops who insisted that "everything we do is done for a very specific reason," and when I asked whether things should be changed if they aren't working, the answer was pretty much "No. Because if we're doing it a certain way, it means it's the way it's supposed to be done." With that mentality, it's no surprise that junk science is so rampant in law enforcement. And it's not just ballistics or the polygraph. It's body language/deception detection, profiling, fingerprinting, psychosocial theories, behavior predictions... the list goes on.
    Those who beat their swords into plowshares will plow for those who do not. - Ben Franklin

    There’s some good in this world, Mr. Frodo. And it’s worth fighting for. - Samwise Gamgee
