Forensic Science in the 19th and 20th Centuries Research Paper

Medicine was the first science-based profession to be used for forensic purposes, and forensic medicine can be considered the mother of forensic science. In the sixth century, the Emperor Justinian recognized the special position of the expert witness when he declared that physicians were not ordinary witnesses but rather persons who gave judgments rather than testimony. In England, the Statute of Westminster of 1275 contained the first provision for the office of coroner, and, in 1507, the Bamberg Code in Germany required that medical evidence be presented in all cases of violent death. These provided a legal basis for the practice of forensic medicine and were the precursors of modern coroner and medical examiner legislation. In North America, the early English settlers brought the coroner system with them, and inquests were held as early as 1635 in New England.

Prior to the nineteenth century, evidence of poisoning was purely clinical and circumstantial because there were very few chemical tests for poisons available. As knowledge of chemistry increased, it was applied to the detection of poisons, usually by university professors ensconced in traditional university departments. Other now commonly recognized forensic sciences – fingerprint identification, trace evidence examination, questioned document examination, firearms and tool-mark examination, and serology – were essentially unknown.

Toward the end of the eighteenth century, precursors of what would now be considered “Institutes of Forensic Medicine” began to appear as personal chairs in university departments. The first of these was the Chaire de médecine légale et d’histoire de l’art de guérir, established in the University of Paris in 1794. Within a few years, similar chairs were established in the Universities of Strasbourg and Montpellier. Forensic science laboratories, however, were nonexistent.

Developments In The Nineteenth Century

Forensic Medicine

Although most of what we now refer to as the forensic sciences did not begin to develop until the latter half of the nineteenth century, forensic medicine began to be recognized as a specialized branch of medicine, “legal medicine,” early in the 1800s. In Austria, a Chair in Legal Medicine and Hygiene was established in the University of Vienna in 1804, while in France l’Institut de médecine légale de Paris was founded in 1868.

The first British Professor of Legal Medicine was Andrew Duncan Sr. in the University of Edinburgh. Duncan persuaded the University to establish the Regius (government-funded) Chair in Medical Jurisprudence in 1806; however, it was his son, Andrew Duncan Jr., who became the first incumbent in that Chair. This father/son succession pattern was repeated in the same Chair in 1906 when Henry Harvey Littlejohn succeeded his father Sir Henry Littlejohn. Similarly, at the University of Glasgow, John Glaister Sr. and Jr. combined to hold the Regius Chair of Medical Jurisprudence and Forensic Medicine, established in 1839, from 1898 to 1962. Edinburgh became the cradle of forensic medicine for the English-speaking world as graduates established forensic medical services in many other countries.

In England, the most prominent early name was Alfred Swaine Taylor who became Professor of Medical Jurisprudence at Guy’s Hospital Medical School in London in 1834. Taylor was a prodigious author, and his “Principles and Practice of Medical Jurisprudence,” first published in 1865 and continuously revised and updated by later authors, was long recognized as a standard work.

The first lectures on legal medicine in the USA were given in New York in 1804 by James S. Stringham, a graduate of the University of Edinburgh. A major development occurred in Massachusetts in 1877 with the abolition of elected coroners and the establishment of a medical examiner system which required that the medical examiner be a qualified pathologist.

Forensic Toxicology

As the science of chemistry developed, a few professors of forensic medicine began to use this new science for one of their more intractable problems, the detection of poisons. One of these was Mathieu Joseph Bonaventure Orfila, universally acknowledged to be the “Father of Toxicology” (Schuller 2003). Orfila studied at Valencia and Barcelona before graduating in medicine from the University of Paris in 1811.

He soon became Professor of Legal Medicine and Chemistry at that University and, in 1814, published the first major work on toxicology, “Traité des poisons tirés des règnes minéral, végétal et animal, ou Toxicologie générale,” which rapidly gained international recognition. Much of Orfila’s research was devoted to the study of arsenic, and, in 1839, he was the first to extract it from human organs rather than from just stomach contents. Orfila used the results of this research in 1840 in his testimony at the trial of Marie Lafarge for the murder of her husband by means of an arsenic-laced cake. This trial focused worldwide attention on the new science of toxicology; L’affaire Lafarge was probably the first case in history in which convincing scientific testimony (other than medical) was presented.

The introduction of analytical methods for metallic poisons caused potential poisoners to turn their attention to the alkaloids that were beginning to be isolated from plants. These were considered to be undetectable in human organs. That began to change in 1850, when Jean Servais Stas, a Professor of Chemistry at l’École Royale Militaire in Brussels and a former student of Orfila’s, developed a method for extracting nicotine from the organs of a murder victim. Stas’ basic procedure (as modified by F.J. Otto in 1856) was based on the differing solubility of such drugs in acids and bases and is still in use. Identification of any extracted drugs was based on the colors developed, and the microscopic morphology of crystals formed, when the extract was reacted with other chemicals.
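The acid/base partition principle underlying the Stas-Otto procedure can be illustrated with a short calculation. The sketch below applies the standard Henderson-Hasselbalch relationship to a weak base such as an alkaloid; the pKa value used for nicotine is an approximate textbook figure, not a value from the original work.

```python
# Illustrative sketch of the chemistry behind the Stas-Otto extraction:
# an alkaloid (a weak base) is ionized, and hence water-soluble, at acidic
# pH, but largely un-ionized, and hence organic-solvent-soluble, at basic pH.
# The pKa below is an approximate textbook value for nicotine.

def fraction_unionized_base(pH: float, pKa: float) -> float:
    """Henderson-Hasselbalch: fraction of a weak base left un-ionized at a given pH."""
    return 1.0 / (1.0 + 10 ** (pKa - pH))

pKa_nicotine = 8.0  # approximate pKa of nicotine's more basic nitrogen

acidic = fraction_unionized_base(2.0, pKa_nicotine)   # after acidifying the extract
basic = fraction_unionized_base(12.0, pKa_nicotine)   # after adding alkali

# At pH 2 almost none of the base is un-ionized, so it stays in the aqueous
# acid layer; at pH 12 nearly all of it is un-ionized and can be shaken out
# into an organic solvent such as ether.
print(f"un-ionized fraction at pH 2:  {acidic:.6f}")
print(f"un-ionized fraction at pH 12: {basic:.6f}")
```

This pH-driven shift between aqueous and organic phases is why cycling an extract between acid and base conditions separates alkaloids from the bulk of the tissue material.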

A trend toward specialization in toxicology began near the end of the nineteenth century with the emergence of figures such as Rudolph August Witthaus, the most prominent of the early American toxicologists, who had studied chemistry at Columbia University and at the Sorbonne. He became a Professor of Chemistry and Physiology at New York University in 1876 and published one of the earliest American textbooks on chemistry and toxicology in 1879.

Fingerprints

In the late nineteenth century, the only accepted method for identifying habitual criminals was photography. While useful when the criminal’s name was available, there was no way to file photographs other than by name; if the criminal changed his name, photograph files were of no assistance. In 1879, Alphonse Bertillon, a junior records clerk in the headquarters of the Sûreté in Paris, devised a detailed system of anthropometric measurements, such as the length and width of the head and the lengths of the left middle finger, right foot, and left forearm, which would be unique and relatively constant for an individual. His system, which he called “anthropometry” and which the press dubbed “Bertillonage,” was adopted by the Sûreté in 1882. Bertillon went on to become the Director of the Police Identification Service in Paris.

Bertillonage was a cumbersome process with an unwieldy filing system and even Bertillon recognized that criminals do not leave their anthropometric measurements behind at a crime scene. The basis for a better technique had been established in a paper published in England in 1684 by Dr. Nehemiah Grew in which he referred to “innumerable little ridges” on the ends of the fingers. In 1686, these ridges were independently described by Marcello Malpighi, a Professor of Anatomy in the University of Bologna. Neither, however, suggested any potential of these for personal identification. It was not until 1823 that the patterns formed by these ridges were described by a Czech physiologist, Jan Evangelista Purkinje, of the University of Breslau. While Purkinje suggested a classification system based on nine major types, he too failed to recognize their potential for individualization.

In 1877, Thomas Taylor, a microscopist in the US Department of Agriculture, published an article in the “American Journal of Microscopy” in which he suggested that “markings on the palms of the hands and the tips of the fingers” could be used for identification in criminal cases. This suggestion was not pursued in the USA, and it was left to a Scottish physician working in Tokyo, Dr. Henry Faulds, to publish a letter in Nature in 1880 in which he suggested that finger marks might be useful as a means of personal identification (Faulds 1880). His letter prompted publication of a second letter a month later in the same journal from William Herschel, a British officer working in the Indian Civil Service, reporting that he had been using thumb impressions to identify illiterate prisoners since 1856. Herschel, who later received a knighthood, also made the critical observation that the patterns of what he called “papillary lines” did not change with time.

Despite the fundamental importance of these observations, they were of limited practical value until Francis Galton, an English scientist and explorer, published a book titled “Fingerprints” in 1892 (Galton 1892). Galton corresponded with Herschel and they agreed that fingerprints were individual and permanent – a fundamental requirement for their value as a system for identification. Galton described a classification system, using the arches, loops, and whorls of fingerprint patterns, which came to be known as “dactyloscopy.” Also in 1892, in Argentina, the Croatian-born Juan Vucetich used the identification of a bloody fingerprint to solve the murder of two small children by their mother – the first use of fingerprint identification in a criminal case.

Edward Richard Henry, the Inspector General of Police in Bengal in the 1890s, became aware of Galton’s work and, by 1897, had enhanced Galton’s classification system into a more practical system that bears his name and is still used worldwide. Henry published “Classification and Use of Fingerprints” in 1900 (Henry 1900), and his system formally replaced anthropometry in England in 1901. Henry was by then Assistant Commissioner in charge of the Criminal Investigation Branch at Scotland Yard.

Trace Evidence

Two of the commonest requests made of a forensic science laboratory are “What is this material?” and “Could these two items have come from the same source?” The items can be almost anything but typically are paint, glass, soil, hairs, fibers, metals, gunshot residue, flammable liquids, or explosives. The science involved in trace evidence examination is principally chemistry and the early practitioners were usually professors of chemistry who accepted this type of request from local police investigators as a public service.

Remarkably, it was a lawyer, not a scientist, who first emphasized the assistance that police investigators could derive from the basic sciences. Hans Gustav Adolf Gross was a Professor of Criminal Law at the University of Graz in Austria. He also served as an examining magistrate, which inspired him to spend 20 years studying science textbooks and developing principles of criminal investigation. Gross first published his classic book “Handbuch für Untersuchungsrichter” (“Handbook for Examining Magistrates”) in 1893. It was later translated into English under the simple title “Criminal Investigation” and resulted in Gross becoming widely referred to as the “Father of Scientific Criminology.”

Although chemistry was developing rapidly, it was microscopy which was the fundamental tool of the nineteenth-century forensic scientist and which piqued Gross’s interest. Even today, the examination of trace evidence usually starts with a low-power stereo microscope, often just to find the evidence, and then may progress to higher power microscopes and, in the nineteenth century, to the basic “wet chemical” procedures such as color and crystal tests. One of the first types of trace evidence to be studied was human hair. In 1857, Jean Louis Lassaigne, a French chemist, published the results of his systematic microscopic studies of hair, “De l’examen physique des poils et des cheveux” (“On the Physical Examination of Hair”).

Firearms/Toolmarks

The marks left by a hard tool on a softer surface are known as “toolmarks,” and their examination in an effort to, for example, associate a particular tool such as a crowbar with a forced-entry mark is quite common. The striation marks made in a rifled firearm barrel by both the original rifling tool and subsequent use are impressed onto a bullet’s surface as it passes through the barrel and are a particular form of toolmark.

In 1835, Henry Goddard in London identified a lead ball which had killed a man as having been made in a mold belonging to a suspect on the basis of an unusual mold mark on the ball. During the Civil War in the USA, a number of shootings (including that of Confederate General Stonewall Jackson in 1863) were resolved by associating “class” characteristics of the bullets, for example, shape, caliber, number of lands and grooves, with the types of firearm suspected (but not with an individual weapon).

In 1889, Jean Alexandre Lacassagne, the founder of the Faculty of Legal Medicine at the University of Lyon in 1880, was probably the first to use a microscope to examine a bullet removed from a shooting victim. He observed seven longitudinal grooves on it and concluded that they must have been made by the rifling of the barrel. Further advances in firearms examination did not develop until the twentieth century.

Forensic Biology

Attempts to identify and individualize body fluids have been a major activity in forensic science since the early nineteenth century. In 1827, the above-mentioned Mathieu Orfila published his research on the microscopic identification of red blood cells based on their size and whether or not they were nucleated. This technique was quite impractical for dried blood, in which the blood cells are no longer intact. In 1853, Ludwig Teichmann, a Polish Professor of Anatomy in Göttingen, Germany, described a microscopic crystal test for hemoglobin based on the formation of brownish crystals of hemin. A Dutch scientist, Izaac Van Deen, discovered in 1861 that when tincture of guaiacum and oil of turpentine were added to a bloodstain a blue color resulted. This “guaiac test” was the first of many color tests for hemoglobin; some, such as luminol and phenolphthalein, are still in use. These tests, although very sensitive, are not specific for blood and, therefore, are referred to as “presumptive tests.”

Questioned Document Examination

Establishing the genuine or bogus nature of handwriting, and of documentary material in general, has been an issue in investigations for centuries. Early “handwriting experts” were teachers, bankers, etc., usually having little or no experience in forensic work. For example, Alphonse Bertillon, who had no qualifications in handwriting examination, testified in the infamous Dreyfus affair in 1894 in France and, incorrectly (as it later turned out), identified Captain Alfred Dreyfus’ handwriting on the incriminating document.

By the end of the nineteenth century, although forensic medicine was frequently used in criminal investigations and trials, the other forensic sciences were just beginning to be recognized as having the potential to assist in such procedures. Some of the techniques (or at least the fundamental bases for them) had been established, but practitioners were relatively few. That was to change dramatically in the twentieth century.

Developments In The Twentieth Century

Forensic Science Institutes And Laboratories

Institutes of legal medicine began to increase in number in Europe in the early part of the century, but remained primarily based in universities. The Edinburgh influence continued to spread with the appointment of Sydney Smith, an Edinburgh graduate, as the Principal Medico-Legal Expert in Cairo, Egypt. There he persuaded a team of experts in other disciplines to devote some of their time to forensic science. One of these was Arthur Lucas of the Government Chemistry Laboratory who, in 1921, published “Forensic Chemistry,” one of the first texts to cover forensic science topics other than medicine and toxicology. Smith himself developed expertise in the emerging field of firearms identification. He returned to Edinburgh to take up the Chair in the University in 1927 and received a knighthood in 1949.

In North America, there was minimal progress in forensic medicine until the establishment of the Office of the Chief Medical Examiner in New York City in 1918 under the leadership of Dr. Charles Norris. Norris’ objective was to develop a medicolegal institute which would provide service and research equivalent to the leading institutes in Europe. The Office also became a leading teaching center after Milton S. Helpern became an integral member in 1931. “Alumni” of the Office went on to establish important institutes across the USA.

More broadly based forensic science laboratories gradually began to appear, the first being L’Institut de police scientifique in the University of Lausanne, established in 1909 by Dr. Rodolphe Archibald Reiss. Within a year, it was followed by the Laboratoire de police technique in Lyon under the leadership of Edmond Locard, a native of Lyon who had studied under Lacassagne in the Faculty of Legal Medicine. Because he wanted to significantly expand the scope of Lacassagne’s Institute, Locard persuaded the police of the Rhône préfecture to provide him with two rooms in an attic and two assistants to establish a police laboratory in Lyon, equipped initially with only a microscope and a simple spectrometer. This lab went on to become a major center for teaching and research, in addition to providing forensic science services to the police. Students came from around the world, many returning home to establish laboratories of their own. Over the ensuing 15 years, similar laboratories were established in Austria, Canada, Finland, Germany, Holland, and Sweden.

In Britain, it was not until 1935 that a forensic science laboratory was formally established, by the London Metropolitan Police. The “Met Lab” went on to become one of the most respected such organizations in the world. In the rest of England, laboratories that had been started by local police forces gradually came under the wing of the Home Office Forensic Science Service (FSS) – as did the Met Lab in 1996. (Ironically, one of the reasons for these transfers was to enhance the appearance of impartiality and independence from the police. However, following the decision by the Home Office in 2010 to disband the FSS (see below), some police forces decided to again establish their own labs.)

In 1966, the Home Office opened the Central Research Establishment (CRE) at Aldermaston. This laboratory had a major impact on forensic science not only within the FSS but worldwide. Most operational laboratories have neither the time nor the capability to perform systematic research on fundamental issues. This dearth was the subject of a classic essay by Paul Kirk of the University of California in 1963 (Kirk 1963). He wrote: “In short, there exists in the field of criminalistics a serious deficiency in basic theory and principles, as contrasted with the large assortment of effective technical procedures.” CRE was staffed and funded to perform such studies and share the results internationally thus greatly enhancing the reliability and validity of scientific evidence. Unfortunately, CRE was closed in 1997.

All of the FSS labs were funded from central and local government funds until 1991, when the FSS became an Executive Agency of the Home Office and changed to a fee-for-service basis. (This move toward “privatization” evolved into a full Government-Owned Contractor-Operated company in 2005. The format was not successful financially in spite of the FSS’s stellar scientific achievements and reputation, and, in December 2010, the Home Office announced that the operation would be closed in March 2012. A fundamental issue is whether forensic science service delivery should be “commercialized” or should be recognized as an essential part of the justice system and thus a core public service funded by the state.)

In North America, the first government-funded forensic science laboratory was established by Dr. Wilfrid Derôme in Canada in 1914. Derôme had gone to Paris in 1909 to study under Professor Victor Balthazard and, while there, became aware of the laboratory Locard was developing in Lyon. He determined to do the same in Canada and, on his return from Paris, Derôme persuaded the Attorney General of Quebec of the importance of this new type of service. As a result, in July 1914, the Premier of Quebec announced the establishment of the Laboratoire de recherches médico-légales in Montreal (Côté 2003).

The first forensic science laboratory in the USA was established in the Los Angeles Police Department in 1923 by Chief August Vollmer, a great innovator in police work. After his brief term with the LAPD, Vollmer returned to the Berkeley Police Department, where he had developed a program in criminology at the University of California. Although located on the campus, the program had no official status until 1948, when the School of Criminology was formed and Professor Paul Kirk became the first Chair of its Criminalistics Department. Kirk became a legendary figure in forensic science in the USA, and many of his students went on to establish forensic science laboratories across the state and the country. Unfortunately, the School, one of the few in the USA with a graduate program in forensic science, was closed in the 1970s.

Although the LAPD Laboratory was the first in the USA, its early days were overshadowed by the prominence of a laboratory established in Chicago in 1930. Following the St. Valentine’s Day Massacre in 1929, a coroner’s jury of prominent citizens was so impressed with the firearms evidence presented by Calvin Goddard (see below) that one of their members provided $125,000 to fund the creation of a full-service “crime laboratory” under Goddard’s leadership. With the support of John Henry Wigmore, Dean of the Law School in Northwestern University, the “Scientific Crime Detection Laboratory of Chicago” was housed in the Law School building. At this lab, Goddard trained staff from many agencies around the USA. Following his retirement, the Laboratory was transferred to the Chicago Police Department in 1938 and then to the Illinois State Police in 1996.

What was to eventually become the largest and best known of the forensic science laboratories in the USA was not established until 1932 when J. Edgar Hoover created the FBI Laboratory in Washington, D.C., a modest facility with a staff of one who had studied firearms examination under Goddard in Chicago.

Beginning in the late 1960s, the number of forensic science laboratories in the USA virtually exploded, driven largely by “the war on drugs” and Federal Government funding through the Law Enforcement Assistance Administration (LEAA). The vast majority of these were established within law enforcement agencies, some staffed with police officers who had limited or no science backgrounds. As the century progressed, the trend was toward better qualified civilian scientific staff although sometimes still with police administrators.

Forensic Medicine

For much of the first half of the century, a giant in forensic medicine in England was Sir Bernard Spilsbury, who was a Home Office Pathologist from 1908 to 1948. It was he who convinced Scotland Yard detectives of the importance of having the pathologist and other specialists at murder scenes. In the courtroom, he was able to exert a spell over judges and juries, sometimes with little more basis than confidence in his own vast experience of over 25,000 cases. His reputation was not based on his writings, however, since, unlike most of his predecessors and contemporaries, he published virtually nothing.

Following Spilsbury, forensic medicine in England was led by Keith Simpson at Guy’s Hospital and Francis Camps at the London Hospital Medical College, where Camps set up the Department of Forensic Medicine in 1945. By the time of his retirement in 1970, he had built this Department into one of the foremost of its kind in the world.

In the USA during much of the twentieth century, forensic medicine was dominated by a relatively small number of individuals. In addition to Charles Norris, there were Milton Helpern in New York, Alan Moritz in Cleveland, Theodore Curphey in Los Angeles, Russell Fisher in Baltimore, and Charles Petty in Dallas. Forensic pathology services in much of the rest of the country were provided by “part-timers” with varying degrees of qualifications and expertise. That began to change toward the end of the century with more and better training and with recognition of the need for research-based (as opposed to experience-based) testimony. Nevertheless, in large parts of the country, the quality of death investigation services remained variable – elected coroners (often funeral directors) presided and, for much of the century, autopsies had to be performed in makeshift facilities such as funeral homes.

Forensic Toxicology

It was not until Norris established a Toxicology Section led by Dr. Alexander Gettler in 1918 that improvements in toxicological service began to occur. Gettler had no model to follow, so much of his early work was of a pioneering nature, and his laboratory became the birthplace of forensic toxicology in the USA.

Virtually all of the early analytical methodology was based on wet chemistry and microscopic procedures which required large samples – 500 g of liver was the typical starting point – as contrasted with the 1–2 ml of blood that is now the norm. In addition to his innovative analytical procedures, Gettler’s principal contribution to modern forensic toxicology was a graduate course in toxicology at New York University which he started in 1935 and taught until his retirement in 1959. His graduates went on to become an outstanding “second generation” of forensic toxicologists who started laboratories across the country.

As the twentieth century proceeded, developments in the pharmaceutical industry made forensic toxicology much more complex. While the inorganic and vegetable poisons continued to be used, synthetic drugs which were much more potent became more common. Their detection required more sensitive analytical procedures. The traditional color and crystal tests began to be supplemented by ultraviolet spectrophotometry and paper chromatography in the late 1940s, gas chromatography in the late 1950s, thin layer chromatography in the mid-1960s, mass spectrometry combined with gas chromatography (GC/MS) in the late 1960s, and by liquid chromatography and immunological techniques in the early 1970s. GC/MS became the predominant technique for identification; when it was introduced, the complexity of interpretation of the data was such that a Ph.D. was considered an essential qualification. Within a few years, interpretation of mass spectra became more a matter of “pattern matching” with databases, which could be done by persons with lesser qualifications. (A similar phenomenon later occurred with DNA analysis.)
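The “pattern matching” of a measured mass spectrum against a database can be sketched with a toy example. A common approach is to score each library entry by the cosine similarity of peak intensities at matching m/z values; the spectra and compound names below are invented purely for illustration, not real reference data.

```python
import math

# Toy illustration of mass-spectral library matching: an unknown spectrum
# is compared to each library spectrum by cosine similarity of intensities
# at each m/z value. All spectra here are hypothetical.

def cosine_similarity(a: dict, b: dict) -> float:
    """Cosine similarity between two spectra given as {m/z: intensity} dicts."""
    mz = set(a) | set(b)
    dot = sum(a.get(m, 0.0) * b.get(m, 0.0) for m in mz)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

library = {  # hypothetical reference spectra
    "compound A": {43: 100.0, 58: 40.0, 91: 15.0},
    "compound B": {77: 100.0, 105: 80.0, 182: 30.0},
}
unknown = {43: 95.0, 58: 42.0, 91: 12.0}

# Report the library entry whose spectrum most resembles the unknown.
best = max(library, key=lambda name: cosine_similarity(unknown, library[name]))
print(best)
```

Real library-search software adds intensity weighting and quality thresholds, but the core idea is this kind of numerical similarity score rather than expert interpretation of each fragment.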

Although not toxicology per se, identification of “street drugs” employs many of the same analytical techniques, and this type of analysis, because of the “war on drugs,” constitutes the largest volume of casework in most forensic labs in the USA.

Of all the drugs which forensic toxicologists deal with, by far the most significant in terms of its prevalence and impact on human behavior is ethyl alcohol. Dr. Erik Widmark, a Professor of Physiological Chemistry at the University of Lund in Sweden, was among the first to systematically study the absorption, distribution, and elimination of alcohol in the body (Widmark 1932). A prerequisite for such studies was a reliable microanalytical method for alcohol in blood. Widmark’s method, published in 1922 (Widmark 1922), became the most commonly used procedure worldwide and continued to be used well into the 1960s and 1970s. His mathematical pharmacokinetic parameters are still in use.
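Widmark's model can be summarized in a short calculation: peak blood alcohol concentration is the absorbed dose divided by the product of body weight and a distribution factor, reduced over time by a constant elimination rate. The sketch below uses the usual textbook parameter values (r ≈ 0.68 for men, 0.55 for women; β ≈ 0.015 g/100 mL per hour), which are illustrative figures rather than values quoted in this paper.

```python
# A minimal sketch of Widmark's pharmacokinetic model, still the basis for
# blood-alcohol calculations. Parameter defaults are common textbook values.

def widmark_bac(alcohol_g: float, body_weight_kg: float,
                r: float = 0.68, beta: float = 0.015, hours: float = 0.0) -> float:
    """Estimate blood alcohol concentration in g/100 mL (percent w/v).

    alcohol_g       grams of ethanol absorbed
    body_weight_kg  body weight in kilograms
    r               Widmark distribution factor (~0.68 men, ~0.55 women)
    beta            elimination rate in g/100 mL per hour (~0.010-0.020)
    hours           time elapsed since drinking began
    """
    # Peak concentration: dose spread through the body's water-containing mass.
    peak = alcohol_g / (body_weight_kg * 1000 * r) * 100  # g per 100 mL
    # Zero-order elimination at rate beta; concentration cannot go negative.
    return max(0.0, peak - beta * hours)

# e.g. 40 g of ethanol (roughly three standard drinks) in an 80 kg man:
bac_peak = widmark_bac(40, 80)             # estimated peak concentration
bac_later = widmark_bac(40, 80, hours=2)   # after two hours of elimination
```

The same relationship, run in reverse, is what allows a measured blood alcohol concentration to be extrapolated back to the time of an incident, which is why Widmark's parameters remain in routine forensic use.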

Another major development with alcohol was the invention of breath alcohol testing devices. Rolla N. Harger, a Professor of Pharmacology and Toxicology in the Indiana University Medical School, perfected the first such device, the “Drunkometer,” in 1937. Robert F. Borkenstein, of the Indiana State Police, improved substantially on this in 1954 with the “Breathalyzer®,” an instrument which, although now superseded by more modern equipment, continued to be used in the USA and Canada for over 50 years.

Fingerprints

The first investigator to use fingerprints for criminal investigation in the USA was Detective Sergeant Joseph A. Faurot of the New York City Police Department in 1911. It was not until 1930 that the Director of the Federal Bureau of Investigation, J. Edgar Hoover, set up a national bureau for identification which became the largest repository of fingerprints in the world.

Inked fingerprint impressions compared with the known fingerprints of an individual are still considered to be the most positive form of personal identification, although other biometrics may be used for specialized applications. Latent fingerprints, on the other hand, present greater challenges. For much of the century, development of latents was primarily by dusting with fine powders which adhered to the moisture or sebaceous deposits retained in the impression.

In the 1970s, it was recognized that the chemicals in perspiration, including amino acids and various salts, presented opportunities for chemical detection, and some quite sophisticated chemistry and physics were applied to this challenge. These reactions, with other chemicals or induced by various forms of radiation, including argon lasers (Dalrymple et al. 1977), can produce colored or fluorescent patterns distinguishing the ridges from the valleys. One interesting technique was discovered by accident in Japan in 1977: when a hair examiner in a crime lab was mounting hairs on a microscope slide using cyanoacrylate (Superglue), he noticed that his fingerprints were developing on the slides and brought this to the attention of a colleague, who confirmed it to be a valuable latent fingerprint development process.

Latent prints left at the scene of a crime can be important investigative tools but only if they can be related to the known prints of a person. Until the late 1970s, finding the knowns for such comparisons depended largely on experienced latent print examiners’ memory for patterns. In 1977, the FBI introduced an automatic fingerprint recognition system for the computerized search of fingerprint databases. Commercial providers developed other systems over the next several years, culminating in the introduction in 1999 by the FBI of the Integrated Automated Fingerprint Identification System (IAFIS), containing the fingerprints and criminal histories of millions of individuals. The ability of such systems to make “cold hits” has made fingerprints the important investigative tool that the pioneers in the field had originally envisaged.

Although the use of fingerprints for individual identification originated in the medical/scientific world, their widespread application became primarily a police function. In many areas in North America, the service is provided by police “identification bureaus” rather than by forensic labs. These bureaus are generally staffed by individuals with minimal scientific background. While the presumption that fingerprints (friction ridge patterns) are unique to an individual and persistent for life is now rarely disputed, there remains some valid criticism of the manner in which examinations are sometimes performed. Major efforts are being exerted in the twenty-first century to deal with this through enhanced training, research, and development of recognized standardized protocols.

Trace Evidence

The exchange principle that is the basis for trace evidence examinations – “every contact leaves a trace” – was first enunciated in 1920 by Edmond Locard in L’enquête criminelle et les méthodes scientifiques. When two objects, two persons, or an object and a person come into contact, dust, fibers, hair, paint, glass, soil, etc., may transfer between them and comparisons may establish the fact of the contact. Finding, identifying, and comparing such “trace evidence” introduced additional chemistry professors to the world of forensic applications of science during the first half of the twentieth century, and this type of examination later became a common part of the work of forensic science laboratories.

Although the microscope continues to be the basic tool of the trace evidence examiner, a variety of analytical instruments to supplement microscopy began to appear, and these are now capable of providing almost unbelievable sensitivity. The first of these analytical instruments, introduced in the early 1940s, was the emission spectrograph, which was ideally suited to trace evidence examination because it permitted the comparison of the trace element composition of a wide range of small samples. These results were often interpreted as indicating “batch” compositions of many types of materials, interpretations which are no longer considered feasible as a result of improvements in manufacturing processes. Later analytical techniques to appear included infrared (IR) spectrophotometry in the early 1950s, gas chromatography (GC) in the late 1950s, atomic absorption spectrophotometry (AAS) in the early 1960s, gas chromatography linked with mass spectrometry (GC/MS) in the late 1960s, scanning electron microscopy linked with X-ray spectrometry (SEM/XRS) in the early 1970s, and Fourier transform infrared spectrophotometry (FTIR) later in the 1970s. New developments such as the various forms of inductively coupled plasma spectroscopy (ICPS) in the 1980s/1990s continue to be adopted.

Ironically, the enhancements in analytical capability did not substantially change the answers that could be given to the typical question “Could these two items have had a common origin?” Because the evidence items are often common materials – either manufactured, for example, paint, glass, fibers, or natural, for example, soil, hair – the answers to the question were still typically limited to “Yes they could” or “No they could not.” However, the answers were of much greater quality and reliability because of the significant increase in the amount and type of analytical data derived from the items. Toward the end of the century, with investigators and courts becoming used to the much more definitive answers provided by DNA analysis (although applicable to only about 10 % of cases received in a typical forensic lab), demands for this type of examination began to diminish despite the fact that the results can still often be of considerable value to an investigation.

Questioned Document Examination

In the early years of the twentieth century, Albert S. Osborn of New York City began to develop the examination of handwriting and questioned documents into a specialized forensic science discipline. Osborn, originally a teacher of penmanship, published several papers on document examination, including one on typewriting identification in 1901, but it was the first edition of his book “Questioned Documents” in 1910 that cemented his position as the leader of the profession (Osborn 1910). This work, with later revised versions, formed the cornerstone for much of what document examiners continue to do. Today, these types of examinations require knowledge about inks, paper, writing instruments, printers, copiers, and computers. Nevertheless, the majority of the work remains the comparison of handwriting.

In England, Wilson R. Harrison published his massive work “Suspect Documents, Their Scientific Examination” in 1958, and Locard in France published Les Faux en Écriture et Leur Expertise in 1959.

Following World War II, many new challenges began to develop as a result of advances in the technology of writing and printing instruments. In 1945, the ballpoint pen was introduced and by the mid-1950s had almost completely replaced the fountain pen. Around 1960, the fiber-tipped pen started to become popular and brought its own challenges to identification work. Typewriter technology also began to change with the introduction of the electric typewriter in the 1930s, proportional spacing typewriters in the early 1950s, the single element type-ball machine in 1961, and the correcting “lift off” typewriter ribbon in 1973. The ability to easily change balls between machines and correct errors quickly presented another challenge to the examiner. The print-wheel typing unit which became common in word processing and computer systems was introduced in 1972 followed not long after by the dot matrix and laser printers that are so common today. All of these developments in “typewriting” reveal fewer of the individual anomalies which the document examiner relies on to identify the product of a particular machine.

As the office copier became common, it introduced a whole new set of challenges to the document examiner, including the identification of the machine on which a particular copy had been produced. The thermographic process, introduced in 1950, was quickly replaced by the electrostatic plain paper reproduction process in the late 1950s.

The tools available to the document examiner have also expanded. Electronic ultraviolet and infrared viewers for the differentiation of inks; electrostatic detection apparatus for revealing pressure patterns and indented writing; specialized photographic procedures; and computerized digital imaging equipment have all been added to the examiner’s arsenal. Improved training, enhanced quality assurance, and additional generally accepted protocols have made questioned document examinations very reliable, although, as stated in the 2009 National Academy of Sciences Report (NRC 2009), “the scientific basis for handwriting comparison needs to be strengthened.”

Firearms/Toolmarks

While interest in the identification of the firearm from which a bullet has been fired began before the nineteenth century, it was not until the beginning of the twentieth century that an important discovery was made – that microscopic marks (“striations”) on bullets of the same caliber and type, fired through guns of different makes, varied in appearance. This critical observation was published in 1900 by Dr. Albert Llewellyn Hall in the Buffalo (New York) Medical Journal. One of the techniques used for the examination of bullets was to roll them on a plane surface of wax or lead foil to reproduce surface markings; however, only the gross markings could be reproduced.

The first use of striation matching did not involve bullets but rather knife marks in the wood of vandalized trees. R. Kockel, a Professor at the Institute of Legal Medicine in the University of Leipzig, published two papers in 1900 and 1903 describing a method for producing test marks with the knife and for using oblique lighting and photography for the comparisons.

The first case in the USA in which identifications of ammunition components were made occurred in what became known as the “Brownsville Affray,” in which a civilian was killed in Brownsville, Texas, in 1906 by shots fired by several soldiers during a riot. In 1907, staff of the Frankford Arsenal in Pennsylvania were able to match eleven of the fired cartridge cases to one of the soldiers’ rifles, eight to a second, eleven to a third, and three more to a fourth. This pioneering work remained relatively unknown for many years because the only publication of it was in the Annual Report of the US Army Chief of Ordnance.

In 1912, Victor Balthazard, a Professor of Legal Medicine in the University of Paris, described the principles involved in linking a particular fired bullet to an individual firearm. His procedure involved making a detailed sequential series of photographs of the surface of the complete circumference of the bullets, enlarging the photographs and comparing the striations revealed in the process. In 1922, he published “Identification des projectiles. Perfectionnement de la technique” in Annales de Me´decine Le´gale. Unfortunately, the system was too cumbersome to ever gain wide acceptance.

Between 1919 and 1923, Charles E. Waite, an investigator in the New York State Attorney General’s Office, accumulated rifling data for almost all the firearms manufactured in the USA and Europe. In 1925, he persuaded John H. Fisher, a precision instrument designer, Phillip O. Gravelle, a microscopist at Columbia University, and Major (later Colonel) Calvin Goddard, to join him in establishing the “Bureau of Forensic Ballistics” in New York City. Goddard, a graduate in medicine from Johns Hopkins, had served in the Army Medical Corps but, following World War I, had transferred to the Ordnance Corps. Gravelle, who had used a comparison microscope in work with textiles, had the inspiration to suggest applying it to the comparison of fired bullets and cartridge cases. He therefore assembled a hybrid instrument suitable for examining ammunition components. Initial tests of the equipment by Goddard in April 1925 confirmed its value, and the comparison microscope subsequently became the universal tool of firearms/ toolmark examiners worldwide. This technique continues to be used in much the same way that it was used by Goddard.

In 1925, Goddard published the first description of the use of the comparison microscope for firearms identification in the November/December issue of “Army Ordnance” under the unfortunate title “Forensic Ballistics” – “unfortunate” because, from then onward, “Ballistics” became the popular – yet incorrect – name for the work of the forensic firearms examiner. Over the next several years, Goddard was contacted by many experts, including Professor J. Howard Mathews of the Department of Chemistry in the University of Wisconsin, Sydney Smith (then still in Egypt), Derôme in Montreal, and Locard in France. Derôme published a book on the subject (Derôme 1929) and Professor Mathews produced a massive three-volume series “Firearms Identification,” the first two volumes in 1962 and the third (posthumously) in 1973.

Confirmation of Goddard’s stature as the preeminent firearms expert in the USA came with the infamous “St. Valentine’s Day Massacre” in Chicago in 1929. He was asked to examine the 70 fired .45 caliber cartridge cases and 14 bullets recovered from the scene. His conclusion was that 50 of the casings had been fired from one Thompson submachine gun and 20 from another. Later that year, two “Tommy” guns were located in a house owned by a member of the Al Capone gang and identified as having fired the bullets.

There has been little fundamental change in the examination of firearms and tool-marks since the early work of Goddard and others. Attempts to bring greater objectivity to identifications, such as that by Alfred Biasotti with his publication “A Statistical Study of the Individual Characteristics of Fired Bullets” (Biasotti 1959), were initially received coolly by practitioners. Toward the end of the century, however, Biasotti’s concept of “Consecutive Matching Striations” began to receive greater acceptance by the discipline.

In 1991, Walsh Automation Inc. of Montreal developed the “Integrated Ballistics Identification System” (IBIS) for scanning the surface of bullets and cartridge cases using sophisticated laser and computerized digital imaging techniques. This system was adopted by the Bureau of Alcohol, Tobacco, Firearms and Explosives (ATF). In 1993, Mnemonic Systems, under contract to the FBI, developed a similar system, “DRUGFIRE,” for fired cartridge cases. The two systems were integrated in 1999 into the National Integrated Ballistics Information Network (NIBIN) operated by ATF. While highly valuable for developing and searching databases, final identifications continue to be made visually by examiners using the comparison microscope.

Similar to the situation with fingerprint identification, the delivery of firearms examination services developed in the latter half of the century, primarily within law enforcement agencies. Examiners frequently had little or no scientific background and were ineligible for membership in professional organizations such as the American Academy of Forensic Sciences (AAFS). They therefore established their own organization, the “Association of Firearm and Tool Mark Examiners” (AFTE), in 1969. AFTE has created valuable criteria for operational aspects such as training, examination protocols, and terminology. In 1972, the AFTE Journal was established and became the principal vehicle for publication of research results. Most of this research, however, relates to practical developments rather than the fundamental science of the process. Again, the NAS Report commented that “Individual patterns from manufacture or wear might, in some cases, be distinctive enough to suggest one particular source, but additional studies should be performed to make the process of individualization more precise and repeatable.”

Forensic Biology

By the end of the nineteenth century, forensic scientists could determine that a stain was blood but could not answer the follow-up question – “Is it human blood?” That answer was not long in coming. In 1901, Paul Theodor Uhlenhuth, an Assistant Professor in the University of Greifswald in Germany, published “A Method of Investigation of Different Types of Blood, Especially the Differential Diagnosis of Human Blood” in the German journal Deutsche Medizinische Wochenschrift. Uhlenhuth had observed that when chicken egg protein was injected into rabbits, the blood of the rabbit would subsequently react with the chicken protein, causing it to precipitate, but would not react with other types of bird protein, that is, it was specific for chicken. This reaction became the basis of the “precipitin test.” Uhlenhuth soon expanded his research to develop antisera that would react with the protein of humans and other species. In his paper, Uhlenhuth modestly foresaw that his process “could prove to be of great importance in medical jurisprudence.”

Although blood could now be identified as human, it might (theoretically) be from any human. Progress toward reducing the number of potential human sources began almost immediately. Later in 1901, Karl Landsteiner, an Assistant Professor at the Institute of Pathology and Anatomy in the University of Vienna, published a paper entitled “On Agglutination Phenomena of Normal Human Blood.” It began with: “Some time ago I observed the fact that blood serum of normal persons can frequently agglutinate the red blood corpuscles of other healthy individuals.” His eventual conclusion was that human blood was either Type A, B, O, or AB. The first step toward demonstrating the biological individuality of humans, as observable in blood, had been taken.

Landsteiner’s test was based on antigens on the cells and, in 1915, Leon Lattes at the Institute of Forensic Medicine in the University of Turin in Italy, developed the first serum antibody test for the ABO blood groups. Lattes’s procedure continued to be used by many forensic serologists for the remainder of the twentieth century.

Blood is not the only body fluid of importance in forensic science; semen and saliva are also common evidence materials. Identification of saliva is usually based on tests for its amylase activity. Semen can be identified by the microscopic detection of spermatozoa, a process first described by Henri-Louis Bayard in France in 1839. A presumptive color test for semen based on its high acid phosphatase content was developed in 1945 by Frank Lundquist in the University of Copenhagen. In 1925, a Japanese scientist, Saburo Sirai, discovered that the ABO system could also be detected in these other body fluids in about 80 % of the population who are referred to as “secretors.”

Although the ABO system was widely used in forensic serology, its discriminating power (e.g., about 45 % of the population are group O) could establish exclusion of possible sources but was of only limited value for inclusion. Forensic serologists therefore continued to look for other “genetic markers” which could enhance their ability to discriminate between sources. This search led them in the 1950s and 1960s to a multitude of genetically controlled systems in blood including plasma proteins and variants of some of the enzyme systems. While the biological functions of these systems were important to medical researchers, genetic variants within any system were not. For this reason, identification and validation of these variants had to be performed almost exclusively by forensic scientists, primarily in the UK at the Metropolitan London Police Lab and the Home Office Central Research Establishment (CRE).

The first of these polymorphic (“many forms”) systems to be introduced was the phosphoglucomutase (PGM) system, discovered in 1964. Brian Culliford of the Met Lab was able to apply it to bloodstains in 1967. PGM was particularly useful because it was subsequently shown to be present in semen, vaginal secretions, and hair roots while most of the later systems could only be applied to blood. Using Culliford’s original procedure, three different PGM polymorphs could be identified. Later work in 1977 at CRE using a different process permitted the observation of ten different PGM types, thereby vastly increasing its discriminating power. By the late 1970s, nine different polymorphic enzyme systems had been validated. In addition to the enzymes in red blood cells, some proteins in serum were also found to exhibit polymorphism. Not all of these systems became widely used, but they collectively provided a useful “toolbox” for the forensic serologist (Culliford 1971).

The incentive driving the development of additional blood group systems was the fact that each is inherited independently of the others. As independent variables, the population frequencies of each could be multiplied, sometimes resulting in relatively low frequencies of occurrence for bloodstains. So, by the mid-1980s, forensic serologists had come a long way in their ability to exclude possible donors of a bloodstain.
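The multiplication of independent marker frequencies described above (often called the “product rule”) can be sketched as follows. Only the ABO group O figure of about 45 % comes from the text; the other marker names and frequencies are hypothetical, chosen purely for illustration.

```python
# Sketch of the "product rule" for independently inherited genetic markers.
# The ABO group-O frequency (~45 %) is from the text; the other systems
# and frequencies are hypothetical, for illustration only.
from math import prod

marker_frequencies = {
    "ABO (group O)": 0.45,   # from the text
    "PGM (type 1)": 0.60,    # hypothetical
    "EAP (type BA)": 0.40,   # hypothetical
    "Hp (type 2-1)": 0.35,   # hypothetical
}

# Independent inheritance allows the individual frequencies to be multiplied.
combined = prod(marker_frequencies.values())
print(f"Combined profile frequency: {combined:.4f}")      # 0.0378
print(f"Roughly 1 in {round(1 / combined)} individuals")  # about 1 in 26
```

Even a handful of systems thus narrows the pool of possible donors considerably, which is why adding validated polymorphic systems was so valuable; the later STR-based DNA profiling described below in effect extends the same multiplication across many far more discriminating loci.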

As genetic variants, the blood group systems are manifestations of their genetic control by DNA. Why not, instead, look directly at the DNA? In forensic science, the ability to do so began in March 1985 when Alec J. Jeffreys (now Professor Sir Alec after being knighted in 1999) and two colleagues at the Department of Genetics in the University of Leicester in England published their seminal paper in Nature (Jeffreys et al. 1985). In that paper, they described regions along the length of the DNA molecule which were highly variable from one person to another. In the conclusion of that paper, Jeffreys wrote – in a massive understatement – “We anticipate that these DNA ‘fingerprints’ ... can be used in forensic applications ...”, thereby coining a name which, although like “ballistics” before it was incorrect, became popular in the media.

The first forensic application of Jeffreys’ process came in 1986 when a young man, accused of the rape/murder of two young girls near Jeffreys’ laboratory in Leicester, was exonerated by his DNA, and Jeffreys concluded that one other man, then unknown, was the source of the semen in both victims. After a massive investigation which came to be known as “The Blooding,” Colin Pitchfork was eventually identified through his DNA and was convicted. As a result of this case, Peter Gill and David Werrett at the Forensic Science Service (FSS) became actively involved in the development of DNA profiling, and the FSS became a world leader in this field.

The analytical process Jeffreys used was “restriction fragment length polymorphism” (RFLP), which became the first technique used for “DNA profiling” in forensic laboratories in the late 1980s and 1990s.

In North America, the FBI quickly became interested in DNA and, through close cooperation with other forensic labs in the USA and Canada, developed standards of practice which became widely accepted. The first RFLP cases in North America were in 1987/1988.

The RFLP process, while highly discriminatory, was extremely labor intensive, very demanding in the size and condition of the sample, and required several weeks to carry out. In 1983, Kary Banks Mullis at the Cetus Corporation in California conceived the “polymerase chain reaction” (PCR) process which dramatically changed the face of molecular biology including forensic DNA profiling (Mullis et al. 1986). PCR uses small synthesized pieces of DNA (primers) to make a million or more copies of short but highly informative regions of the DNA molecule. As a result, much smaller samples in poorer condition can be profiled. Most forensic applications of DNA analysis are now based on PCR technology using “short tandem repeats” (STRs) of the molecule, introduced in 1992 by Thomas Caskey of the Baylor College of Medicine in Texas.

DNA profiling represents what is unquestionably the most dramatic advance in forensic science of the twentieth century. In a period of only a few years, forensic science had gone from reporting frequencies of occurrence ranging from one in a few hundreds for blood, to frequencies of one in many millions or even billions – in effect individualization – for all body fluids. The dream of forensic serologists – individualization based on stains of body fluids – had virtually become a reality.

Other Developments

Until about the middle of the twentieth century, communication of developments in forensic science was primarily through publication in scientific journals, of which there were few devoted to forensic science. As a result, exchanges of information and discussion were limited and slow. In 1948, Dr. Rutherford B. Gradwohl, Director of the St. Louis Police Department Laboratory, invited 150 forensic scientists to attend a meeting in St. Louis which led to the formation of the American Academy of Forensic Sciences (AAFS). The Academy has since become the preeminent forensic science organization in the world and, in 2011, had over 6,000 members from 65 countries. The Academy’s Journal of Forensic Sciences, established in 1956, has become a major peer-reviewed vehicle for the exchange of scientific data and research.

In the UK, Stuart Kind of the FSS led the creation of The Forensic Science Society in 1959. The Society’s journal, now called “Science and Justice,” is another important vehicle for communication among forensic scientists. Such organizations also facilitated communication by personal contact which was then dramatically enhanced with the development of the Internet.

When Dr. Briggs White became the Director of the FBI Lab in 1970, he recognized the need for closer cooperation with state and local labs. In 1973, he invited 30 US and Canadian lab directors to a meeting at the FBI Academy in Quantico, Virginia. This led in 1974 to the birth of the American Society of Crime Laboratory Directors (ASCLD) devoted to management and policy matters. ASCLD has gone on to play a leading role in the improvement of forensic labs in North America. In Europe, an organization with similar goals and objectives, the European Network of Forensic Science Institutes (ENFSI) was established in 1995.

ASCLD played a critical role in a major initiative which has contributed significantly to dramatic improvements in quality assurance in forensic science – laboratory accreditation. In the late 1970s, ASCLD recognized the need for objective standards for crime lab operations. A committee began studying the matter and drafting a series of objective criteria which lab directors could use to evaluate their laboratories. What emerged in 1981 was the American Society of Crime Laboratory Directors/Laboratory Accreditation Board (ASCLD/LAB). The first laboratories were accredited in 1982 and, in 1988, ASCLD/LAB was incorporated as a totally separate entity from its parent. Recognition of the value of accreditation gradually expanded and there now are almost 400 forensic laboratories in the USA and around the world which have been accredited by ASCLD/LAB. Accreditation of forensic labs in other countries is done by national accrediting bodies such as UKAS in the UK, NATA in Australia, and the Standards Council of Canada.

Another major contributor to the enhancement of quality in forensic science is a “spin off” from the introduction of DNA profiling. The FBI recognized the enormous impact that this new process was likely to have and realized that strict standards of performance would be required to ensure acceptance of the evidence. This could best be accomplished through collaboration of many experts, and so it established the Technical Working Group on DNA Analysis Methods (TWGDAM), consisting of representatives from the forensic, industrial, commercial, academic, and international communities. The name was subsequently changed to SWGDAM (“Technical” was changed to “Scientific”). The ongoing development of these standards has led to much greater uniformity and consensus on the part of practitioners. SWGDAM was so successful that the concept was applied to other forensic science disciplines, and there now are about 20 “SWGs” developing recognized standards and protocols (Cormier et al. 2005).

Conclusion

Despite the many advances in forensic science in the past two centuries, there remain some fundamental limitations as described in the NAS Report. This, of course, does not mean that the procedures are invalid or unreliable but rather that there is a need to more adequately demonstrate their validity through conventional empirical data, peer review, and publication. Much of the necessary data already exists because it is collected by every trainee during their training process to validate for themselves the reliability of the techniques and their ability to use them. Unfortunately, these data are scattered in thousands of widely distributed and uncoordinated laboratory training files and therefore not readily accessible; more systematized data collection is required to meet the required standards. This work is being done – in the twenty-first century!

Bibliography:

  1. Ashbaugh D (1999) Quantitative-qualitative friction ridge analysis: an introduction to basic and advanced ridgeology. CRC Press, Boca Raton
  2. Barnes JG (2011) History. In: McRoberts A (ed) The fingerprint source book. NIJ OJP, Washington, DC. Chap. 1
  3. Biasotti A (1959) A statistical study of the individual characteristics of fired bullets. J Foren Sci 9:428–433
  4. Cormier K, Calandro L, Reeder D (2005) Evolution of the quality assurance documents for DNA laboratories. Foren Mag 2:1–3
  5. Côté J (2003) Wilfred Derôme: expert en homicides. Les Éditions du Boréal, Montréal
  6. Culliford B (1971) The examination and typing of bloodstains in the crime laboratory. National Institute of Law Enforcement and Criminal Justice, Washington, DC
  7. Dalrymple B, Duff J, Menzel E (1977) Inherent fingerprint luminescence – detection by laser. J Foren Sci 22:106–115
  8. Derôme W (1929) Expertise en armes à feu. Beauchemin, Montréal
  9. Faulds H (1880) On the skin – furrows of the hand. Nature 22:605
  10. Galton F (1892) Finger prints. MacMillan, New York
  11. Gunn A (2006) Essential forensic biology. Wiley, Chichester
  12. Hamby J, Thorpe J (1999) The history of firearms and tool-mark identification. AFTE J 33:266–284
  13. Henry E (1900) Classification and uses of finger prints, 4th edn. George Routledge, London
  14. Houck M (2003) Trace evidence analysis. Academic, Burlington
  15. Houck M, Siegel J (2010) Fundamentals of forensic science. Academic, Burlington
  16. Jeffreys A, Wilson V, Thein S (1985) Hypervariable “minisatellite” regions in human DNA. Nature 314:67–73
  17. Jickells S, Negruz A (2008) Clarke’s analytical forensic toxicology. Pharmaceutical, London
  18. Kelly J, Lindblom B (2006) Scientific examination of questioned documents. CRC Press, Boca Raton
  19. Kirk P (1963) The ontogeny of criminalistics. J Crim Law Criminol Police Sci 54:235–238
  20. Mullis K, Faloona F, Scharf S, Saiki R, Horn G, Erlich H (1986) Specific enzymatic amplification of DNA In Vitro: the polymerase chain reaction. Cold Spring Harb Symp Quant Biol 51:263
  21. National Research Council of the National Academies of Science (2009) Strengthening forensic science in the United States: a path forward. National Academies, Washington, DC
  22. Osborn A (1910) Questioned documents. Lawyers’ Cooperative Publishing, St. Paul
  23. Schuller P (2003) Mateo Orfila: a biography. National Library of Medicine. http://www.ncbi.nlm.nih.gov/pubmed/15027707. Accessed 02 April 2012
  24. Siegel J (ed) (2000) Encyclopedia of forensic science. Academic, London
  25. Smith S (1951) History and development of forensic medicine. Brit Med J 24:599–607
  26. Tilstone W, Savage K, Clark L (2006) Forensic science: an encyclopedia of history, methods, and techniques. ABC-CLIO, Santa Barbara
  27. Widmark E (1922) Eine Mikromethode zur Bestimmung von Äthylalkohol im Blut. Biochem Z 131:473–484
  28. Widmark E (1932) Principles and applications of medicolegal alcohol determination. Translated from the 1932 German edition (trans: Baselt RC). Biomedical Publications, Seal Beach, CA, 1991, pp 1–163
