Drugs enjoy a social significance different from other commodities, technologies, or artifacts. Celebrated by artists and visionaries from the 19th-century Romantics to the 20th-century Beats to 21st-century hip-hop musicians, drugs have been seen to shape minds and bodies in socially positive and problematic ways. Prescription drugs are credited with improving health, productivity, and well-being, whereas nonprescription drugs are blamed for destroying minds and bodies. How society views drugs depends on who produces them, how they are distributed and marketed, and who consumes them and how. Many controversies surround the workings of these fascinating, functional, and sometimes dangerous technologies.
Drugs as Pharmaceutical Wonders
History reveals a remarkable parade of “wonder drugs,” such as heroin, introduced in 1898 by the German pharmaceutical company Bayer as a nonaddicting painkiller useful for treating tuberculosis and other respiratory diseases. Bayer introduced aspirin the following year as a treatment for rheumatoid arthritis but promoted it aggressively for relief of headache and everyday aches and pains. Today, aspirin is the world’s most widely available drug, but there was a time when pharmacists smuggled it across the U.S.-Canadian border because it was so much more expensive in the United States than elsewhere. Cocaine, distributed to miners in the Southwest as an energizing tonic, was used much as amphetamines and caffeine are used in postindustrial society. Barbiturates; sedative-hypnotics such as thalidomide, Seconal, or Rohypnol; major and minor tranquilizers; benzodiazepines such as Valium; and painkillers or analgesics have all been promoted as wonder drugs before turning out to have significant potential for addiction or abuse. Many of these substances nevertheless retain important medical uses; cocaine, for instance, is still used as a local anesthetic.
Wonder drugs are produced by pharmacological optimism: the myth that a drug will free human societies from pain and suffering, sadness, anxiety, boredom, fatigue, mental illness, or aging. Today, lifestyle drugs are used to cope with everything from impotence to obesity to shyness to short attention spans. Yet adverse prescription drug reactions are the fourth leading cause of preventable death among adults in the United States. Some drugs are blamed for causing social problems; others are expected to solve them. Drugs become social problems when important interest groups define them as such. Recreational use of illegal drugs by adolescents has been considered a public health problem since the early 1950s, when the U.S. public attributed a wave of juvenile delinquency to teenage heroin addiction. Since our grandparents’ generation, adolescence has been understood as a time when many choose to experiment with drugs. Today, a pattern of mixed legal, illegal, and prescription drug use has emerged among the first generation to be prescribed legal amphetamines and antidepressants. Many legal pharmaceuticals have been inadequately tested in children, and the short-term effects and long-term consequences of these drugs are unknown.
Controversy and Social Context
Portrayed as double-edged swords, drugs do not lend themselves to simple pros and cons. Drug controversies can best be mapped by asking which interest groups benefit from current policies, whose interests are at stake in changing them, and how drugs are defined differently by each group of producers, distributors, and consumers.
The basic terms through which drug debates are framed are not natural and do not simply reflect pharmacological properties. The meaning of drug use is best thought of as socially constructed, because it is assigned meaning within social and historical contexts. Varied meanings were attributed to the major subcultural groups of opiate addicts in the early 20th-century United States. Opium smoking by 19th-century Chinese laborers in the United States was tolerated until the labor shortage that attracted them became a labor surplus. Although laborers have long used drugs to relieve pain, stress, and monotony, the larger population of 19th-century opiate addicts was white women, born in the United States, who did not work outside the home. Pharmacy records indicate that rates of morphine addiction were high among rural Southern women from the upper and middle classes and almost nonexistent among African Americans. Morphine addiction among men was concentrated among physicians, dentists, and pharmacists, professions with access to the drug.
Why did so many native-born white people rely on opiates through the early 20th century? Before antibiotics became available to fight infection around World War II, doctors and patients had few effective treatments at their disposal. Opiates were used to treat tuberculosis because they slow respiration and suppress cough, for diarrhea because they constipate, and for pain (their most common use today). Physicians and patients noticed, however, that opiate drugs such as morphine and heroin were habit-forming. They used the term addict to refer to someone who was physiologically or psychologically dependent on these drugs. In the early 20th century, physicians began to refrain from prescribing opiates except in cases of dire need. Improved public health and sanitation further reduced the need, and per-capita opium consumption fell. Despite this, the United States could still be termed a “drugged nation.”
Since the criminalization of narcotics with the Harrison Act (1914), U.S. drug policy has been based on the idea of abstinence. There was a brief period in the early 1920s when over 40 U.S. cities started clinics to maintain addicts on opiates. This experiment in legal maintenance was short-lived. Physicians, whose prescriptions had once been a leading source of addiction, were prosecuted, and they began to refuse to prescribe opiates to their upper- and middle-class patients. By the 1920s, the opiate-addicted population was composed of persons from the lower or “sporting” classes. Drug users’ median age did not fall, however, until after World War II. The epidemiology, or population-wide incidence, of opiate use in the United States reveals that groups with the greatest exposure to opiates have the highest rates of addiction.
Exposure mattered, especially in urban settings where illegal drug markets took root. Urban subcultures existed in the 19th century among Chinese and white opium smokers, but as users switched to heroin injection or aged out of smoking opium, the Chinese began to disappear from the ranks of addicts. Older dope-fiend subcultures gave way to injection heroin users, who developed rituals, argots or languages, and standards of moral and ethical behavior of their own. Jazz musicians, Hollywood celebrities, and those who frequented social scenes where they were likely to encounter drugs such as heroin, cocaine, and marijuana were no longer considered members of the respectable classes. The older pattern of rural drug use subsided, and the new urban subcultures trended away from whites after World War II. African Americans who had migrated to Northern cities began to enjoy increased access to illicit drugs that had once been unavailable to them. So did younger people.
Social conflict between the so-called respectable classes and those categorized as less respectable often takes place around drugs. Debates over how specific drugs should be handled and how users of these drugs should be treated by society mark conflicts between dominant social groups, who construct their drug use as normal, and subordinate social groups whose drug use is labeled as abnormal, deviant, or pathological. As historian David Courtwright (2001) points out, “What we think about addiction very much depends on who is addicted.” How drugs are viewed depends on the social contexts in which they are used, the groups involved, and the symbolic meanings assigned to them.
Recent medical marijuana campaigns have sought to challenge marijuana’s definition as a nonmedical drug by demonstrating its legitimate medical uses and backing up that assertion with clinical testimonials from chronic pain patients, glaucoma sufferers, and the terminally ill. Who are the dominant interest groups involved in keeping marijuana defined as nonmedical? The voices most often heard defending marijuana’s status as an illegal drug are those of drug law enforcement. On the other hand, the drug policy reform movement portrays hemp production as an industry and marijuana use as a minor pleasure that should be decriminalized, if not legalized altogether. Views on drug policy range from those who want to regulate drugs entirely as medicines to those who are proponents of criminalization. A credible third alternative has emerged, called harm reduction, risk reduction, or reality-based drug policy. Asking whose voices are most often heard as authoritative in a drug debate and whose voices are less often heard or heard as less credible can be a method for mapping the social relations and economic interests involved in drug policy. Who was marginalized when the dominant policy perspective was adopted? Who lost out? Who profited? Although the frames active in the social construction of drugs change constantly, some remain perennial favorites.
Drug Panics and Regulation
Not all psychoactive substances used as recreational drugs are currently illegal. Alcohol and tobacco have been commonly available for centuries, despite attempts to prohibit them. Both typically remain legal, except where age-of-purchase or religious bans are enforced. Alcohol prohibition in the United States lasted from 1920 to 1933. Although Prohibition reduced per-capita consumption of alcohol, it encouraged organized crime and bootlegging, and repeal efforts led to increased drinking and smoking among the respectable classes. Prohibition opened more segments of the U.S. population to the recreational use of drugs such as the opiates (morphine and heroin), cannabis, and cocaine. Although cannabis, or marijuana, was not included in the 1914 legislation, Congress passed the Marijuana Tax Act (1937) during a period when the drug was associated with, for example, Mexican laborers in the southwestern United States and criminal elements throughout the country. Cocaine was relatively little used and was not widely considered addictive until the 1970s. Although cocaine was present in opiate-using subcultures, it was expensive and not preferred.
Social conflicts led legal suppliers to strongly differentiate themselves from illegal drug traffickers. The early 20th-century experience with opiates (morphine, heroin, and other painkillers) was the real basis for U.S. and global drug control policy. The Harrison Act was a tax law that criminalized the possession and sale of narcotic drugs. It effectively extended law enforcement powers to the Treasury Department, which later also enforced alcohol prohibition. In 1930, the Treasury’s narcotics unit became the Federal Bureau of Narcotics, the forerunner of today’s Drug Enforcement Administration.
Pharmaceutical manufacturing firms began to use the term ethical to distance themselves from patent medicine makers. Pharmaceutical firms rejected the use of patents on the grounds that they created unethical monopolies. Unlike the patent medicine makers with their secret recipes, ethical firms avoided branding and identified ingredients by generic chemical names drawn from the U.S. Pharmacopeia (which standardized drug nomenclature). Ethical houses did not advertise directly to the public like pharmaceutical companies do today. They limited their business to pharmacists and physicians, whom they reached through the professional press. Around the turn of the 20th century, however, even ethical firms began to act in questionable ways, sponsoring lavish banquets for physicians and publishing advertisements dressed up as legitimate, scientifically proven claims. Manufacturing facilities were not always clean, so the drug industry was a prime target of the Progressive campaigns that followed publication of Upton Sinclair’s muckraking book The Jungle, an exposé of the meatpacking industry. The Pure Food and Drug Act (1906) charged the Bureau of Chemistry with assessing fraudulent claims by drugmakers. After more than 100 deaths in 1937 were attributed to a drug marketed as “elixir of sulfanilamide,” which contained the toxic solvent diethylene glycol (an ingredient of antifreeze), the U.S. Congress passed the Food, Drug, and Cosmetic Act (FDCA) in 1938. The FDCA made the Food and Drug Administration (FDA) the government agency responsible for determining the safety of drugs and approving them for the market. Relying on clinical trials performed by pharmaceutical companies themselves, the FDA determines the level of control to which a drug should be subjected. In 1962, the FDCA was amended in the wake of the thalidomide disaster, and the FDA was charged with ensuring not only the safety but also the effectiveness of drugs on the market and with approving drugs for specific conditions. Companies must determine in advance whether a drug has abuse potential or is in any way dangerous to consumers. Despite attempts to predict accurately which wonder drugs will go awry, newly released drugs are tested on only a small segment of potential users. For instance, OxyContin, developed by Purdue Pharma as a prolonged-release painkiller, was considered impossible to tamper with and hence not abusable. Soon known as “hillbilly heroin,” the drug became the center of a new drug panic.
Drug panics are commonly recognized as amplifying extravagant claims: the substance at the center of the panic is portrayed in mainstream media as the most addictive or most dangerous drug ever known. Wonder drugs turn to “demon drugs” as their availability is widened and prices fall. This pattern applies to both legal and illegal drugs. Another major social frame through which drugs are constructed, however, is the assumption that medical and nonmedical use are mutually exclusive.
Medical use versus nonmedical use is a major social category through which drugs have been classified since the criminalization of narcotics. If you are prescribed a drug by a medical professional and you use it as prescribed, you are a medical user. The old divisions between medical and nonmedical use break down when we think about something like cough medicine, once available over the counter with little restriction despite containing small amounts of controlled substances. Today, retail policies and laws restrict the amount of cough medicine that can be bought at one time, and purchasing-age limits are enforced. The availability of cough suppressants in home medicine cabinets led high school students to experiment with “chugging” or “robo-tripping” on Robitussin and other dextromethorphan-based cough suppressants.
Medication, Self-Medication, and Medicalization
Practices of self-medication blur the medical-versus-nonmedical category. In some places, illegal drug markets have made these substances more widely available than the tightly controlled legal market has. Many people who use heroin, cocaine, or marijuana are medicating themselves for depression, anxiety, or disease conditions. They lack health insurance and turn to drugs close at hand. Legal pharmaceuticals are also diverted to illegal markets, leading to dangerous intermixing, as in the illegal use of legal benzodiazepines as “xaniboosters” to extend the high of an illegal drug. The social construction of legal drugs as a social good has been crucial to the expansion of pharmaceutical markets. The industry has distanced itself from the construction of illegal drugs as a serious social problem, but this has become difficult in the face of a culture that has adopted a pill for every ill.
Drug issues would look different if other interest groups had the cultural capital to define their shape. Some substances are considered to be essential medicines, whereas others are controlled or prohibited altogether. When drugs are not used in prescribed ways, they are considered unnecessary or recreational. Like the other frames discussed, this distinction has long been controversial.
The history of medicine reveals sectarian battles over which drugs to use or not use, when to prescribe for what conditions, and how to prescribe dosages. The main historical rivals were regular or allopathic physicians, who relied heavily on “heroic” doses of opiates and purgatives, and homeopathic physicians, who gave tiny doses and operated out of different philosophies regarding the mind–body relation. Christian Scientists and chiropractors avoided drugs, and other practitioners relied primarily on herbal remedies. As organized medicine emerged as a profession, allopathic physicians became dominant. After World War II, physicians were granted prescribing power during a period of affluence and optimism about the capacity of technological progress to solve social problems. By the mid- to late 1950s, popular attitudes against using a pill for every ill turned around thanks to the first blockbuster drug, the minor tranquilizer Miltown, which was mass marketed to middle-class Americans for handling the stresses of everyday life. Miltown was displaced first by the benzodiazepine Valium and then by the antidepressants Prozac and Zoloft and the antianxiety drugs Xanax and Paxil. A very high proportion of U.S. adults are prescribed these drugs, which illustrates the social process of medicalization.
Medicalization is the process by which a social problem comes to be seen as a medical disorder to be treated by medical professionals and prescription drugs. Many of today’s diseases were once defined as criminal or deviant acts, vices, or moral problems. Some disorders have been brought into existence only after a pharmacological fix has become available. During Depression Awareness Week, you will find self-tests aimed at young people, especially at young men. Typically, women medicalize their problems at higher rates, but the men’s market is now being tapped. Health care accounts for a large share of the U.S. gross national product, and pharmaceutical companies maintain some of the highest profit margins of any industry, so there are huge economic stakes involved in getting you to go to your doctor and ask for a particular drug. Judging from the high proportion of the U.S. population on antidepressant prescriptions at any given time, these tactics have convinced people to treat even mild depression. Antidepressants are now used as tools to enhance productivity and the capacity to balance many activities, bringing up another active frame in the social construction of drugs: the difference between drugs said to enhance work or sports performance and drugs said to detract from performance.
Performance enhancement drugs first arose as a public controversy in relation to steroid use in professional sports and bodybuilding. However, this frame is also present in the discussion of Ritalin, the use of which has expanded beyond children diagnosed with attention deficit and hyperactivity-related disorders. Amphetamines were known as early as the late 1930s to have the paradoxical effect of settling down hyperactive children and allowing them to focus, but today the numbers of children and adolescents diagnosed with attention deficit disorder and attention deficit hyperactivity disorder are extremely high in the United States. Stimulants such as cocaine, amphetamines, and caffeine are performance-enhancing drugs in those who are fatigued. Caffeine is associated with productivity in Western cultures but with leisure and relaxation in Southern and Eastern Europe, Turkey, and the Middle East, where it is consumed just before bedtime. Different cultural constructions lead people to interpret pharmacological effects differently. Today, caffeine and amphetamines are, respectively, among the world’s most widely used legal and illegal drugs; the scope of global trade in caffeine exceeds even that of another substance on which Western societies depend: oil.
Performance detriments are typically associated with addictive drugs, a notion that draws on older concepts of disease, compulsion, and habituation. With opiates, delight became necessity as individuals built up tolerance to the drug and became physically and psychologically dependent on it. Addiction was studied scientifically in response to what reformers called the opium problem evident on the streets of New York City by the early 1920s. The U.S. Congress created a research laboratory through the Public Health Service in the mid-1930s where alcohol, barbiturates, and opiates were shown to cause a physiological withdrawal syndrome when individuals suddenly stopped using them. The Addiction Research Center in Lexington, Kentucky, supplied data on the addictiveness of many drugs in popular use from the 1930s to the mid-1960s. In the 1960s, the World Health Organization changed the name of what it studied to “drug dependence” in an attempt to destigmatize addiction. It promoted the view that, as a matter of public health, drug dependence should be treatable by medical professionals whose treatment practices were based on science. This view brought the World Health Organization into political conflict with the expanding drug law enforcement apparatus, which saw the problem as one to be solved by interrupting international trafficking. Public health proponents lost out during the 1950s, when the first mandatory minimum sentences were put into place by the 1951 Boggs Act. These were strengthened in 1956. By the end of that decade, law enforcement authorities believed that punishment-oriented drug policies had gotten criminals under control. They were proven wrong in the next decade.
Drug Usage and Historical Trends
Patterns of popular drug use often follow the contours of social change. Several factors tipped the scale toward constructing drug addiction as a disease in the 1960s. The U.S. Supreme Court interpreted addiction as an illness, opining, “Even one day in prison would be a cruel and unusual punishment for the ‘crime’ of having a common cold” (Robinson v. California, 1962). Finding it “unlikely that any State at this moment in history would attempt to make it a criminal offense for a person to be mentally ill, or a leper, or to be afflicted with a venereal disease,” the Court stated that prisons could not be considered “curative” unless jail sentences were made “medicinal” and prisons provided treatment. Four decades later, treatment in prison is still sparse, despite jails and prisons being filled with individuals on drug charges. In the late 1960s, civil commitment came about with passage of the Narcotic Addict Rehabilitation Act (1966), just as greater numbers of white middle-class youth entered the ranks of heroin addicts. Law enforcement was lax in suburban settings, where heroin drug buys and use took place behind closed doors, unlike in urban settings. New drugs, including hallucinogens, became available, and marijuana was deeply integrated into college life. The counterculture adopted these drugs and created new rituals centered on mind expansion.
During this time, racial-minority heroin users and returning Vietnam veterans came to attention on the streets. In a classic paper titled “Taking Care of Business,” Edward Preble and John J. Casey (1969) observed that urban heroin use did not reflect apathy, lack of motivation, or laziness, but rather a different way of pursuing a meaningful life, one that conflicted with the ideas of the dominant social group. Hustling activities provided income and full-time, if informal, jobs where there were often no legitimate jobs in the formal economy. The lived experiences of drug users suggested that many people who got into bad relationships with drugs were simply self-medicating in ways designated by mainstream society as illegal. Members of this generation of heroin users suffered from the decline of the social rituals and cultural solidarity that had once held drug-using subcultures together and enabled their members to hold down legitimate jobs while maintaining heroin habits in the 1950s and early 1960s.
By the 1970s, heroin-using subcultures were more engaged in street crime than they had once been. The decline of solidarity became pronounced when crack cocaine came onto the scene in the mid-1980s at far lower cost than powder cocaine had commanded in the 1970s. Reading Preble and Casey’s ethnographic work, done decades before the emergence of crack cocaine and the reemergence of heroin use among middle-class adolescents, we see how drug-using social networks met members’ needs for a sense of belonging by forming social systems for gaining status and respect. In the 1970s, the Nixon administration focused the “war on drugs” on building a national treatment infrastructure of methadone clinics distributed throughout U.S. cities. Methadone maintenance has enabled many former heroin addicts to lead stable and productive lives. For a time, it appeared the opium problem might be resolved through public health.
But there is always a next drug, and cocaine surfaced as the new problem in the 1980s. Powder cocaine had been more expensive than gold, so it was viewed as a jet-set drug and was used in combination with heroin. However, a cheaper form called crack cocaine became available in the poorest of neighborhoods during the 1980s. Mainstream media tend to amplify differences between drug users and nonusers, a phenomenon that was especially pronounced in the racialized representation of the crack cocaine crisis. Crack widened the racial inequalities of the war on drugs at a time when social policy was cutting access to health care and service delivery and when urban African American communities were hit hard by economic and social crisis. The pregnant, crack cocaine–using woman became an icon of this moment. Women had long made up about one-third of illegal drug users (down from the majority status of white women morphine users in the early 20th century), and little attention was paid to them. They were represented as a distinct public threat by the late 1980s and early 1990s, however. Despite so-called crack babies turning out not to have long-lasting neurobehavioral difficulties (especially in comparison with peers raised in similar socioeconomic circumstances), “crack baby” remains an epithet. Nor did crack babies grow up to become crack users—like all drug epidemics, the crack cocaine crisis waned early in the 1990s.
Like fashion, fads, or earthquakes, drug cycles wax and wane, and policies swing back and forth between treatment and punishment. Policy is not typically responsible for declining numbers of addicts. Other factors, including wars, demographic shifts such as aging out or baby booms that yield large pools of adolescents, new drugs, and new routes of administration (techniques by which people get drugs into their bodies), change the shape of drug use. Social and personal experience with the negative social and economic effects of a particular drug is a far better deterrent to problematic drug use than antidrug education and prevention programs; punitive drug policy; incarceration, which often leads to increased drug exposure; or even drug treatment. Although flawed in many ways, drug policy is nevertheless important because it shapes the experiences of drug sellers and users as they interact with each other.
The War on Drugs and Its Critics
Just as drugs have shaped the course of global and U.S. history, so have periodic wars on drugs. The current U.S. drug policy regime is based on the Controlled Substances Act (1970), which classifies legal and illegal drugs into five schedules that proceed from Schedule I (heavily restricted drugs classified as having “no medical use,” such as heroin, LSD, psilocybin, mescaline, or peyote) to Schedule V (less restricted drugs that have a legitimate medical use and low potential for abuse despite containing small amounts of controlled substances). This U.S. law implements the United Nations’ Single Convention on Narcotic Drugs (1961), which added cannabis to earlier international treaties covering opiates and coca. The Convention on Psychotropic Substances (1971) added LSD and legally manufactured amphetamines and barbiturates to the list. These treaties do not control alcohol, tobacco, or nicotine. They make evident the fact that drugs with industrial backing tend to be less restricted and more available than drugs without it, such as marijuana. Drugs that cannot be transported long distances, such as West African kola nuts or East African qat, also tend to remain regional drugs. Many governments rely heavily on tax revenue from alcohol and cigarettes and would be hard pressed to give them up. Courtwright (2001) argues that many of the world’s governing elites were concerned with taxing the traffic, not suppressing it. Modernity brought with it factors that shifted elite priorities toward control and regulation as industrialization and mechanization made the social costs of intoxication harder to absorb.
Drug regulation takes many forms depending on its basis and goals. Hence, there is disagreement among drug policy reformers about process and goals. Some seek to legalize marijuana and regulate currently illegal drugs more like currently legal drugs. Some see criminalization as the problem and advocate decriminalizing drugs. Others believe that public health measures should be aimed at preventing adverse health consequences and social harms, a position called harm reduction that gained ground with the discovery that injection drug users were a main vector for transmitting HIV/AIDS in the United States. This alternative public health approach aims to reduce the risks associated with drug use.
Conflicts between those who advocate the status quo and those who seek to change drug policy have unfolded. Mainstream groups adhere to the idea that abstinence from drugs is the only acceptable goal. Critics contend that abstinence is an impossible dream that refuses to recognize the reality that many individuals experiment with drugs, but only a few become problematically involved with them. They offer evidence of controlled use and programs such as reality-based drug education, which is designed to teach people how to use drugs safely rather than simply avoid them. Critics argue that “just say no” campaigns and “drug-free” schools and workplaces have proven ineffective (see the entry on drug testing for a full account of how drug-free legislation was implemented). In arguing that the government should not prohibit consensual adult drug consumption, drug policy reformers have appealed to both liberal and conservative political ideals about drug use in democratic societies. Today’s drug policy reform movement stretches across the political spectrum and has begun to gain ground among those who see evidence that the War on Drugs has failed to curb drug use.
Bibliography:
- Burnham, John, Bad Habits: Drinking, Smoking, Taking Drugs, Gambling, Sexual Misbehavior, and Swearing in American History. New York: New York University Press, 1994.
- Campbell, Nancy D., Discovering Addiction: The Science and Politics of Substance Abuse Research. Ann Arbor: University of Michigan Press, 2007.
- Courtwright, David, Forces of Habit: Drugs and the Making of the Modern World. Cambridge, MA: Harvard University Press, 2001.
- DeGrandpre, Richard, The Cult of Pharmacology: How America Became the World’s Most Troubled Drug Culture. Durham, NC: Duke University Press, 2006.
- DeGrandpre, Richard, Ritalin Nation: Rapid-Fire Culture and the Transformation of Human Consciousness. New York: W. W. Norton, 1999.
- Dingelstad, David, Richard Gosden, Brian Martin, and Nickolas Vakas, “The Social Construction of Drug Debates.” Social Science and Medicine 43, no. 12 (1996): 1829–1838. http://www.uow.edu.au/~bmartin/pubs/96ssm.html
- Husak, Douglas, Legalize This! The Case for Decriminalizing Drugs. London: Verso, 2002.
- Inciardi, James, and Karen McElrath, The American Drug Scene, 5th ed. New York: Oxford University Press, 2007.
- McTavish, Jan, Pain and Profits: The History of the Headache and Its Remedies. New Brunswick, NJ: Rutgers University Press, 2004.
- Musto, David, The American Disease: Origins of Narcotic Control, 3rd ed. New York: Oxford University Press, 1999.
- Preble, Edward, and John J. Casey, “Taking Care of Business: The Heroin Addict’s Life on the Street.” International Journal of the Addictions 4, no. 1 (1969): 1–24.