Popular Mechanics Separates CSI Fact from CSI Fiction

CSI, Forensic Files, The First 48 and other television programs of this genre are among my favorites. Investigators study a crime scene and learn all sorts of valuable information from blood spatter, shoe prints, tire marks, hair fibers, ballistics, and trace evidence. We are to believe that “the evidence doesn’t lie” and that these noble CSI crusaders seek only the truth and determine this truth by their many years of expertise in all areas of science.

That is what we are to believe, but is this reliance on forensic science in solving crimes misplaced? The cover story in the August 2009 issue of Popular Mechanics argues that the “science” in forensic science isn’t always all it’s cracked up to be.

On television and in the movies, forensic examiners unravel difficult cases with a combination of scientific acumen, cutting-edge technology and dogged persistence. The gee-whiz wonder of it all has spawned its own media-age legal phenomenon known as the “CSI effect.” Jurors routinely afford confident scientific experts an almost mythic infallibility because they evoke the bold characters from crime dramas. The real world of forensic science, however, is far different. America’s forensic labs are overburdened, understaffed and under intense pressure from prosecutors to produce results. According to a 2005 study by the Department of Justice, the average lab has a backlog of 401 requests for services. Plus, several state and city forensic departments have been racked by scandals involving mishandled evidence and outright fraud.

But criminal forensics has a deeper problem of basic validity. Bite marks, blood-splatter patterns, ballistics, and hair, fiber and handwriting analysis sound compelling in the courtroom, but much of the “science” behind forensic science rests on surprisingly shaky foundations. Many well-established forms of evidence are the product of highly subjective analysis by people with minimal credentials—according to the American Society of Crime Laboratory Directors, no advanced degree is required for a career in forensics. And even the most experienced and respected professionals can come to inaccurate conclusions, because the body of research behind the majority of the forensic sciences is incomplete, and the established methodologies are often inexact. “There is no scientific foundation for it,” says Arizona State University law professor Michael Saks. “As you begin to unpack it you find it’s a lot of loosey-goosey stuff.”

This pokes some serious holes in the notion that the evidence doesn’t lie.

Here’s the money quote of the whole article:

[The National Academy of Science report concerning the state of forensic science used in the criminal justice system] specifically noted that apart from DNA, there is not a single forensic discipline that has been proven “with a high degree of certainty” to be able to match a piece of evidence to a suspect.

That’s right; according to the NAS report, ballistics, trace evidence, and even fingerprint analysis are far from perfect.

A 2006 study by the University of Southampton in England asked six veteran fingerprint examiners to study prints taken from actual criminal cases. The experts were not told that they had previously examined the same prints. The researchers’ goal was to determine if contextual information—for example, some prints included a notation that the suspect had already confessed—would affect the results. But the experiment revealed a far more serious problem: The analyses of fingerprint examiners were often inconsistent regardless of context. Only two of the six experts reached the same conclusions on second examination as they had on the first.

Ballistics has similar flaws. A subsection of tool-mark analysis, ballistics matching is predicated on the theory that when a bullet is fired, unique marks are left on the slug by the barrel of the gun. Consequently, two bullets fired from the same gun should bear the identical marks. Yet there are no accepted standards for what constitutes a match between bullets. Juries are left to trust expert witnesses. “‘I know it when I see it’ is often an acceptable response,” says Adina Schwartz, a law professor and ballistics expert with the John Jay College of Criminal Justice.

The good news, according to the article, is that there are certain forensic techniques which are considered good science:

Techniques that grew out of organic chemistry and microbiology have a strong scientific foundation. For example, chromatography, a method for separating complex mixtures, enables examiners to identify chemical substances in bodily fluids—evidence vital to many drug cases. The evolution of DNA analysis, in particular, has set a new scientific standard for forensic evidence. But it also demonstrates that good science takes time.

So should these other methods, which lack a strong scientific foundation, all be junked? Not even the critics quoted in the article are willing to go that far. The article goes on to argue that these methods should be presented to jurors in their proper context (e.g., their strengths and weaknesses, the variables that can affect the results, and whether the evidence is exclusionary or merely qualified supporting evidence). All of this should be disclosed up front rather than relying on a defense attorney, who likely has no background in forensic science, to identify each problem with the presentation of the evidence.

Of course, with the damning NAS report, others like it, and growing exposure of the weaknesses of courtroom forensic science by mainstream publications like Popular Mechanics, criminal defense lawyers everywhere now have a new weapon in their arsenal to create reasonable doubt in the minds of jurors, at least until expert witnesses are required to give full disclosure regarding their techniques.