The use of genetic testing in forensic science dates back roughly 100 years to the advent of typing an individual's blood. It was not until the first deoxyribonucleic acid (DNA) fingerprinting in 1984, however, followed by the development of polymerase chain reaction (PCR) based methods, that the field realized its full potential. As DNA methods became more widespread, research in the later part of the 20th century and the early part of the 21st century significantly improved on those advances, extending the application of DNA to circumstances that could not have been fathomed twenty years earlier.
Friedrich Miescher first isolated DNA in 1869, and by the end of the 1940s scientists understood that DNA contained chemical bases, but they did not know its structure. The chemist Linus Pauling claimed to have discovered the DNA molecule's structure in 1953; however, James Watson perused Pauling's yet-to-be-published paper and realized it was incorrect. Watson later saw an x-ray diffraction photograph of a DNA crystal taken by Rosalind Franklin, which convinced him that the DNA molecule consisted of two chains arranged in a paired helix resembling a ladder or spiral staircase. Along with British researcher Francis Crick, Watson built a model of DNA's structure; they finished the model in March of 1953 and published their findings (Johnson, 2010).
Over the next three decades, DNA research continued, but courts were forced to rely heavily on eyewitness accounts and other testimony, which were often unreliable. In 1984, however, Alec Jeffreys gave the court system new hope when he discovered an interesting new marker in the human genome. Jeffreys developed the first DNA test capable of distinguishing individuals, based on a technique called RFLP (restriction fragment length polymorphism). Together with Peter Gill, who was developing techniques for recovering DNA from bloodstains and from semen collected after a rape, he subsequently demonstrated that RFLP could be applied successfully in a forensic setting. The RFLP method would go on to become the nearly universal DNA forensic tool until it was eventually overtaken by STR (short tandem repeat) based methods in the 1990s (Johnson, 2010).
Most DNA is identical between individuals; it is the noncoding "junk" DNA between the genes that makes each individual unique, and that noncoding DNA is a useful investigative tool because it is found in bodily fluids, hair, skin, and bone marrow. Alec Jeffreys found that certain sequences of 10 to 100 base pairs repeated several times; these tandem repeats occurred in everyone, but the number of repetitions varied, making them specific to an individual. Prior to Jeffreys's work, a blood droplet discovered at a crime scene would reveal only the blood type of an individual, but since his discovery, the same blood droplet can reveal a vast amount of information (Jeffreys, 2007). In the last 20 years, tens of thousands of STRs have been found to occur not only within genes and chromosomes but also between them, without disrupting normal cell functions. These discoveries have also established that STRs mutate readily, which accounts for the extreme level of variability among individuals (Butler, 2001), and they have become vital in criminal investigations today.
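The core idea above, that identity rests not on the sequence of the repeated motif (which is shared) but on how many times it repeats, can be illustrated with a minimal sketch. This is not a forensic tool: the GATA motif is a common STR repeat unit, but the repeat counts and flanking sequence below are invented for demonstration.

```python
def count_repeats(sequence: str, motif: str) -> int:
    """Count consecutive tandem repeats of `motif` at the start of `sequence`."""
    count = 0
    while sequence.startswith(motif):
        count += 1
        sequence = sequence[len(motif):]
    return count

# Two hypothetical samples share the same 4-base motif; only the
# number of repetitions differs, and that number is what an STR
# test measures at each locus.
motif = "GATA"
sample_a = "GATA" * 11 + "TTCC"  # 11 tandem repeats
sample_b = "GATA" * 14 + "TTCC"  # 14 tandem repeats

print(count_repeats(sample_a, motif))  # -> 11
print(count_repeats(sample_b, motif))  # -> 14
```

A real STR profile records such counts at a dozen or more loci, and it is the combination of counts across loci that makes a match highly specific to one individual.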
The first application of DNA fingerprinting in a criminal case turned out to be a powerful portent of the revolutionary role DNA would play in exonerating the innocent. In 1986, police were investigating the rape and murder of two girls, Lynda Mann and Dawn Ashworth. The police had a 17-year-old man, Richard Buckland, in custody. Buckland, who suffered from learning disabilities, had confessed to the murder of Dawn Ashworth after police questioning; however, he denied murdering Lynda Mann. The police hoped to use DNA to link Buckland to the Mann murder. They called in Jeffreys and Gill to perform a test to match semen from the crime scenes to Buckland, which ultimately freed an innocent man and identified the real murderer (Pitchfork, R v, Court of Appeal - Criminal Division. 2009. EWCA Crim 963).
While many of the techniques used today were developed in the 1990s, some were not in widespread use until well into the 2000s. More recent developments in DNA testing have focused on a number of areas, including (1) the recovery of DNA from increasingly trace amounts of biological evidence and (2) the recovery of DNA from degraded samples. Tests based on alternative regions, such as mini-STRs, have been developed and are starting to gain use. In these tests, the DNA fragments examined are smaller than those used in standard STR tests, allowing for the use of more highly degraded DNA. One of the critical lessons that has emerged over the history of DNA testing is the importance of retaining biological samples, even when the samples do not provide useful results at the time of the investigation. In many cases, a conviction was overturned only when a new technique was developed that allowed for testing of a saved sample. In untold numbers of other cases, an innocent person has remained in prison because potentially exculpatory evidence was lost (Elkin, 2010).
Schiro (2000) best summarized the scientific changes, methodologies, and theories that have led to significant advances in DNA analysis when he said:
"Identification by DNA analysis is to the 20th century what fingerprints were to the 19th century. DNA analysis has revolutionized how blood and body fluids from the most violent crimes are analyzed and used for investigative information. DNA analysis wasn't even part of forensic science 20 years ago. Back then, ABO blood types and other genetic markers were used to analyze blood and body fluids. This analysis required a relatively large amount of bloodstain and these genetic markers were susceptible to environmental degradation. Today, a bloodstain the size of a pinhead can be analyzed using Polymerase Chain Reaction (PCR) technology invented by Kary Mullis. His invention was so revolutionary that he won a Nobel Prize. Since DNA is a very stable molecule, old and degraded blood and body fluid stains can be accurately typed. By analyzing a combination of sites on the DNA, a blood or body fluid stain can be linked back to a suspect or victim with a high degree of probability. All of these factors make DNA data ideal to be databanked."
Biological evidence gathered from crime scenes, whether hair, body fluids, or other material, has become more significant with the discovery and development of DNA analysis. Because DNA is found in body fluids and tissues, traditional serological methods have been rendered largely obsolete in forensics: not only can DNA withstand harsher environments, but it can also succeed when other serological testing fails. Since the advent of Sherlock Holmes, deductive reasoning and the objective view of scientific evidence have been widely used in criminal investigations. When a suspect is presented with irrefutable forensic evidence, a confession will likely ensue; if it does not, the evidence will likely be damning in front of a jury. Forensic evidence has progressed at a rapid rate over the last 20 years with the introduction of fingerprint databases and the DNA database known as CODIS. These technological advances have become a staple of criminal procedure despite objections, challenges, and controversies surrounding DNA testing. Questions and challenges regarding the validity and reliability of DNA testing have all but disappeared as scores of published court opinions have solidified its standing (Riley, 2005). Perhaps the most probing questions are what the current impact of DNA testing on the criminal justice system is, and how DNA has revolutionized that system.
The longevity of DNA is also impressive and interesting because the power of this microscopic molecule is significant not only for crimes of today and the future, but also for crimes of the past. Evidentiary specimens from decades-old crimes can still be tested, which can lead to the exoneration of the wrongly accused. America's first DNA death row exoneree was Kirk Bloodsworth (State v. Bloodsworth, 84-CR-3138, 1984), who spent nearly nine years in prison for a crime he did not commit. Bloodsworth was convicted on the testimony of five witnesses who placed him with the victim or near the crime scene, and on the prosecution's forensic evidence that purported to link Bloodsworth's shoes to marks on the victim's body. After ascertaining that the prosecution illegally withheld exculpatory evidence from the defense, the Maryland Court of Appeals overturned Bloodsworth's conviction (Bloodsworth v. State, 307 Md. 164, 1986). Bloodsworth was retried and convicted again; he appealed once more (Bloodsworth v. State, 76 Md. App. 23, 1988), but the appellate court affirmed the conviction.
In 1992, Bloodsworth obtained court approval for testing of biological evidence from the crime scene with newly available DNA technology, the polymerase chain reaction, which established his innocence, and he was released from prison on June 28, 1993. Nine years later, a Baltimore County forensic biologist reviewing evidence from the case found stains on a sheet that had not been analyzed and submitted the results to the national DNA database, which linked Kimberly Shay Ruffner (who had once occupied a cell directly below Bloodsworth) to the crime. Ruffner was formally charged with the crime on September 5, 2003 (Connors, Lundregan, Miller, and McEwen, 1996).
DNA analysis has made great strides, which has led to its success, but it is not without its failures. These failures do not appear to lie in the science itself, but in the fallible effects of human involvement. After the commission of a crime, it is standard practice to collect any physical evidence from the scene. The examination of four scholarly research articles may elucidate the effects of DNA recovered from a crime scene on particular cases and on the community of forensic science itself. McClure, Weisburd, and Wilson (2008) argued that in addition to science in the laboratory, science in the field is crucial to ascertain the usefulness of the assorted crime-solving methods. There is a need for further research into many features of forensics that question the strengths and interpretations of scientific evidence, DNA testing excepted. In homicides and sexual assault cases, DNA's presence at the scene and/or on the victim increases the probability of both prosecution of the perpetrator and ultimate conviction. The odds that DNA will be present in sexual assault cases are 33:1, while the odds that DNA will be present in homicide cases are 23:1. Their research indicated there was a gradual but consistent decrease in homicide rates nationally, which began near the end of the 20th century and continued into the 21st century (McClure, Weisburd, and Wilson, 2008).
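For readers less accustomed to odds notation, odds of a:1 in favor of an event correspond to a probability of a/(a + 1). A minimal conversion of the figures reported by McClure, Weisburd, and Wilson (2008):

```python
def odds_to_probability(a: float, b: float = 1.0) -> float:
    """Probability implied by odds of a:b in favor of an event."""
    return a / (a + b)

# The odds reported above, converted to probabilities:
print(round(odds_to_probability(33), 3))  # sexual assault cases, 33:1 -> 0.971
print(round(odds_to_probability(23), 3))  # homicide cases,       23:1 -> 0.958
```

In other words, the reported odds imply that DNA evidence is present in roughly 97 percent of sexual assault cases and 96 percent of homicide cases in their sample.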
McClure et al. (2008) reported that the national homicide clearance rate dropped from over 90 percent in the last half of the 20th century to 62 percent at the beginning of the 21st century. This dramatic decrease occurred alongside the advent of DNA testing; however, it is still argued that the decrease is, at least in part, the result of stranger-on-stranger deaths resulting from drugs, the decline in domestic homicide, and the pressures and restraint of law enforcement. Cole (2007) identified two primary corollaries of DNA advances: (1) the expansive databases resulting from genetic profiles and (2) the reliance on DNA recovered from the crime scene. To support his point, Cole (2007) examined two cases: one involving an African-American woman who accused members of the Duke University lacrosse team of rape, and a 1987 case that highlighted the problematic intersection of class, gender, and race. Both of these cases raise the broader sentiment that the criminal justice system in America is not equal, fair, or just.
Research has recognized the constant scrutiny DNA analysis faces when parties attempt to admit it in court. Judges have the discretion to exclude scientific evidence if it seems ambiguous, obfuscating, prejudicial, or misleading to the trier of fact (Palermo, 2006).
There have been challenges over the past few years to some scientific disciplines that have been around for a long time. In Daubert v. Merrell Dow Pharmaceuticals, Inc., 509 U.S. 579 (1993), the Supreme Court set new standards for scientific evidence, and defense counsel may use the Daubert standard to challenge scientific and forensic techniques, including, but not limited to, DNA testing. Carrell (2008) recognized that the Daubert trilogy presents potential issues for forensic testimony. The history of DNA suggests that in cases involving exoneration, there were forensic errors that led to wrongful convictions and became fodder for popular television programming, which may bias the trier of fact to weigh DNA evidence unfairly.
Carrell (2008) examined 86 cases involving DNA and the eventual exoneration of the accused and found that testing errors by forensic scientists were a cause of wrongful convictions second only to misidentification by eyewitnesses. When the DNA results of the Buckland case, mentioned supra, came in, they revealed that the same man had murdered both Ashworth and Mann; the perpetrator, however, was not Buckland, who, it became clear, had given a false confession. Law enforcement tested the DNA of 4,000 men without ascertaining the perpetrator. The case was going cold when a man from the town told the interesting tale that a friend of his named Colin Pitchfork had paid the taleteller to take the DNA test for him. Pitchfork was subsequently arrested and convicted of the murders (Pitchfork, R v, Court of Appeal - Criminal Division. 2009. EWCA Crim 963).
Predicting future fundamental leaps in technology is no easy feat, at least with any accuracy; such predictions are at best an educated guess and at worst mere conjecture. Three decades ago, very few researchers or professionals, if any, could have predicted the impact the PCR method would have on molecular biology. A short time spent perusing the vast literature on forensic biological evidence quickly makes clear that there are gaps in the handling and subsequent analysis of such evidence. Those gaps are currently being closed, or will probably be closed, using the tools of molecular biology. The predictions regarding these gaps are reasonable and will likely be the driving force behind developments in forensics over the next decade. The gaps likely to be addressed include enhancements to the present limits of typing samples of limited quantity and quality; investigative information, including, but not limited to, phenotype inference and pharmacogenetic data for the molecular autopsy; microbial forensics; and computerization focusing on field testing (Budowle and van Daal, 2009).
In conclusion, this paper has shown that the use of genetic testing in forensic science dates back roughly 100 years to the advent of typing an individual's blood. Its potential was not fully realized, however, until the first DNA fingerprinting in 1984, followed by the development of PCR-based methods. As DNA methods became more widespread, research in the later part of the 20th century and the early part of the 21st century significantly improved on those advances, extending the application of DNA to circumstances that could not have been fathomed twenty years earlier; the field has continued to advance, and will continue to do so, as new technological breakthroughs occur.