By Hon. Mark E. Windham

In People v. Nelson, 43 Cal.4th 1242 (2008), the California Supreme Court spoke definitively on the latest controversy over DNA evidence.
The objective of this article and self-study test is to review the holding of the California Supreme Court's important DNA case in Nelson. Readers will learn how the Supreme Court allowed a "cold-hit" DNA case to proceed and how it allowed the presentation of powerful statistical DNA information.
While it might appear self-evident that a DNA match between a defendant and an object associated with a crime would be powerful evidence identifying him as a perpetrator, this is not necessarily so. Even though every person's DNA is unique, and every trace of an individual's DNA that he leaves behind is identical, the testing technology cannot compare every single gene or locus in the sample.
Matching at every gene (or more precisely, "locus") tested doesn't necessarily prove the identity of the source. Almost the entire human genome is identical from person to person; roughly ninety-nine percent of it is shared by all humans. The standard 13-locus DNA test was designed to assess discrete locations on the genome that have a high degree of variability between individuals. In most cases this system works extraordinarily well, with an extremely high degree of discrimination.
In some cases testing limitations may result in matches at only a few loci, but millions of other individuals might also match these results. Even in a test indicating exclusion of an innocent person (because there are differences at one or more loci), the DNA will match at several other loci. Evidence indicating a match at multiple loci is an incomplete showing regarding identity.
Given a DNA match at each of the multiple loci tested, how valuable is this evidence in ascertaining identity? The answer has been provided through population statistics indicating the rarity of each type. The frequencies observed at each locus can be multiplied together once the statistical independence of the loci is established. Use of this "product rule" has been upheld in People v. Soto, 21 Cal.4th 512 (1999) (a case involving the now obsolete restriction fragment length polymorphism (RFLP) technology). Soto held that the product rule meets the Kelly standard for admissibility. See People v. Kelly, 17 Cal.3d 24 (1976).
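The product-rule arithmetic can be sketched briefly. The locus frequencies below are hypothetical, illustrative figures only, not drawn from any real population database:

```python
from math import prod

# Hypothetical frequency of the matching type at each tested locus
# (illustrative numbers only, not real population data).
locus_frequencies = [0.08, 0.11, 0.05, 0.09, 0.07]

# The product rule: if the loci are statistically independent, the
# frequency of the combined profile is the product of the individual
# locus frequencies.
profile_frequency = prod(locus_frequencies)

print(f"Estimated profile frequency: about 1 in {1 / profile_frequency:,.0f}")
```

Each additional independent locus multiplies the rarity, which is why a full 13-locus match can yield the figures in the trillions or quadrillions discussed below.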
While no California case has yet examined the general acceptance of product rule statistics for the standard 13-locus short tandem repeat (STR) DNA test, the issue has been litigated extensively and is no longer seriously in dispute. Nelson implicitly holds that general acceptance of the product rule for RFLP loci resolves the issue for any DNA loci, including the current standard STR loci.
As in People v. Jackson, 163 Cal.App.4th 313 (2008), this use of the product rule is an example of a generally accepted technology being used in a somewhat different manner, which does not give rise to a prong-one Kelly admissibility challenge. (The statistical methods for STR typing have passed the Daubert analysis applicable under federal law. U.S. v. Trala, 162 F.Supp.2d 336 (D. Del. 2001); Daubert v. Merrell Dow Pharmaceuticals, Inc., 509 U.S. 579 (1993).) The bottom line is that in most cases the standard 13-locus STR DNA test can indicate that the suspect type found at the crime scene, or in the rape kit, and matching the defendant is so rare that we would expect to see it one time in hundreds of trillions, or even quadrillions, of genetic profiles. Obviously this is powerful evidence of the identity of the source.
Nelson addresses the most recent controversy in the assessment of the weight of DNA evidence. The power and promise of STR/DNA testing technology has always been the digital format of its results, which allows easy integration with databases. Nationwide, the Combined DNA Index System (CODIS) has facilitated the creation of huge databases of felons' genetic profiles in the National DNA Index System. Law enforcement agencies have also created databases of suspect types in unsolved cases, linking cases to one another to establish serial crimes and, increasingly, matching recidivist offenders to unsolved crimes, generating "cold-hit" matches. The law enforcement paradigm has shifted, with DNA matches now coming at the beginning of an investigation rather than merely confirming identity after probable cause has been otherwise developed.
The power of this technology is also the source of its unwanted side effect: it is so good at finding matches that occasionally there are coincidental matches to innocent persons. Though rarely reported, such coincidental matches have been documented. In the U.K., where database searching has been the norm for more than a decade, several "adventitious" matches are expected annually. These relatively rare coincidental matches can be readily identified as such by credible alibi or other evidence of impossibility, but it is plausible that someone matched from a database of felons would not be so fortunate. It is the risk of a coincidental match that is the bone of contention in Nelson.
DNA database searches are effective at finding matches because they systematically compare the forensic unknown to known offenders. This method by its nature involves an "ascertainment bias," typically illustrated by a statistical riddle known as "the birthday problem": "Assume that birthdays are evenly distributed and that the chance of two people having the same birthday is about 1 in 365. The chance of finding two people in one room having the same birthday however is greater than 50 percent once there are 23 people in the room. When there are 64 people in the room, the chance of a shared birthday is more than 99 percent. Comparing each person against the other is what makes the computational difference, and it is analogous to a database search." Scientific Evidence in California Criminal Cases, Section 5.34 (Cal CEB 2008).
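The birthday figures quoted above can be checked directly. This sketch computes the chance of at least one shared birthday among n people under the same even-distribution assumption:

```python
from math import prod

def p_shared_birthday(n: int) -> float:
    """Chance that at least two of n people share a birthday,
    assuming 365 equally likely birthdays."""
    # Probability that all n birthdays are distinct, multiplied
    # pairwise-comparison by pairwise-comparison.
    p_all_distinct = prod((365 - k) / 365 for k in range(n))
    return 1 - p_all_distinct

print(p_shared_birthday(23))  # just over 50 percent
print(p_shared_birthday(64))  # more than 99 percent
```

The counterintuitive jump comes from the number of pairwise comparisons, just as a databank trawl makes one comparison per stored profile.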
This riddle illustrates the source of the controversy in Nelson, in which the victim was discovered raped and drowned in 1976. Detectives were unable to develop sufficient evidence to focus the investigation on a specific person, and eventually the matter became a cold case, that is, unsolved but inactive. Later, for unrelated offenses, the defendant was sentenced to a lengthy prison term. A biological sample was obtained from him for DNA analysis and entry into the state convicted offender databank. In July 2001, a review of the victim's death determined that the case had biological evidence warranting analysis. A databank search resulted in a match to the defendant, and DNA obtained from a subsequent oral swab of the defendant matched the DNA profile of each of the evidence samples. The defendant was charged with and convicted of the murder.
Nelson first resolved the due process issue of precharging delay, holding that although the "defendant demonstrated some prejudice sufficient to require the prosecution to justify the preaccusation delay...the prejudice was minimal."
Regarding the DNA issue, the defense objected to evidence of the cold hit on the grounds that the statistics which gave it meaning were not generally accepted. The defendant pointed to a dispute regarding the appropriate calculation.
Nelson held that Kelly does not apply. The court acknowledged the reality of the ascertainment-bias problem and the four divergent approaches to reconciling the rarity of the type with the chance of a coincidental match. It relied to a large extent on U.S. v. Jenkins, 887 A.2d 1013 (D.C. 2005), which considered use of the product rule and the three leading variant approaches, including the recommendations of two reports of the National Research Council (NRC) and the "Balding/Donnelly" approach.
Briefly, those approaches work as follows. The first is simply to use the product rule, imperfect in a databank trawl because of ascertainment bias, but arguably still appropriate. However, as Nelson noted, there are problems with using the product rule as if the case were an ordinary DNA case rather than a cold hit.
The second approach is suggested by the report commonly known as "NRC I" (National Research Council, DNA Technology in Forensic Science, 1992): in a retest, use new loci that would be independent of the original search. This is conceptually unassailable, but valuable data is lost and the statistics may be far too conservative. A third approach is the database match probability, or "DMP," approach suggested in the NRC's 1996 report, "NRC II" (National Research Council, The Evaluation of Forensic DNA Evidence, 1996): multiply the random match probability by the size of the offender database searched (thereby reducing the denominator to adjust for the likelihood of coincidence - presently the offender database contains over one million genetic profiles). This adjustment compensates for the ascertainment bias of the testing methodology by a multiplication proportionate to the degree of that bias. Because the methodology makes one million separate comparisons in an offender database of one million genetic profiles, the result is multiplied by one million. Though the mathematics are conceded to be sound, arguably this overstates the risk of a coincidental match. As noted in Nelson: "The Federal Bureau of Investigation's DNA Advisory Board suggests that [this] recommendation...is best read to require a presentation of both the database match probability and the rarity statistic. The database match probability ascertains the probability of a match from a given database. But the database is not on trial. Only the defendant is. . . . Thus, the question of how probable it is that the defendant, not the database, is the source of the crime scene DNA remains relevant. The rarity statistic addresses this question."
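The NRC II adjustment is simple arithmetic. A minimal sketch, using a hypothetical one-in-a-billion rarity statistic and the one-million-profile database size mentioned above:

```python
# Hypothetical random match (rarity) probability for the profile:
random_match_probability = 1e-9      # i.e., 1 in a billion (illustrative)

# Number of profiles compared in the databank trawl (per the text, the
# offender database contains over one million genetic profiles):
database_size = 1_000_000

# NRC II: scale the rarity statistic by the number of comparisons made.
database_match_probability = random_match_probability * database_size

print(f"Database match probability: 1 in {1 / database_match_probability:,.0f}")
```

Even after the million-fold adjustment, the quoted odds in a case like Nelson remain long; but with a larger database and a less rare profile, the adjusted figure can become much less impressive than the rarity statistic alone.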
The fourth approach, known as the "Balding/Donnelly (Bayesian) method," expresses the result in terms of a "likelihood ratio": "Instead of focusing on the probability of obtaining a match, Balding-Donnelly focuses on the elimination of other profiles during the search. In their analysis, a match becomes more significant with larger database searches. They posit that in obtaining a match in a database search, one simultaneously eliminates other profiles as being the source of the sample."
Despite its refined precision, this method may be "inherently confusing, difficult to explain to a jury, and possibly misleading." Though which method is the most appropriate measure is debatable, the proponents of each do not dispute the scientific validity of the others.
Nelson recognized that the debate here is not reliability, but relevancy. Each approach is scientifically valid and tends in reason to show the significance of a DNA match. While the product rule (rarity statistic) may not accurately describe the chance of coincidence in a database search scenario (random match probability), the rarity of the type still is relevant in its tendency to show the weight of the evidence. It is not excluded on account of its imperfection in this regard. By the same token, after Nelson, the other measures also may be relevant at trial: "The conclusion that statistics derived from the product rule are admissible in a cold hit case does not mean that they are the only statistics that are relevant and admissible. The database match probability statistic might also be admissible...[i]t is unlikely the database match probability statistic would have been significant to the jury in this case given the size of even that number. But in a different case, if the database were large enough and the odds shorter than those here, the database match probability statistic might also be probative. Nothing we say prohibits its admission."
Nelson effectively ended years of trial court litigation of the admissibility of statistics in cold hit cases. As cold hit cases proliferate, with the expansion of the offender database and the elimination of the backlog of untested crime scene samples, the importance of Nelson will increasingly become evident.
The Hon. Mark E. Windham is a judge with the Superior Court of California, County of Los Angeles.