Today, it’s widely accepted by major scientific associations that addiction is a medical illness. The National Institute on Drug Abuse (NIDA) and the American Psychiatric Association (APA) both define addiction as a “brain disease,” and the DSM-5 lists diagnostic criteria for addiction as a mental health condition called “substance use disorder.”
However, it wasn’t always this way. In the United States, there’s a long history of vilifying not only drugs and alcohol, but also the people who use them. Less than a century ago, addiction wasn’t seen as an illness outside of one’s control, but rather as a moral failing rooted in one’s personality.
In the 1930s, when scientists first began to study addiction, the prevailing view was that addicts were simply those too weak in willpower to say no. Because addiction wasn’t seen as an illness, there was no concept of treating it with rehabilitation centers and 12-step programs. Instead, heavy users of drugs and alcohol were seen as degenerates and criminals and were treated accordingly; they were imprisoned or institutionalized so as not to be a nuisance to society.
The tide of scientific opinion began to change as advances in research and technology revealed that repeated drug use leads to physical changes in the brain that inhibit self-control and perpetuate intense cravings for the drug. This discovery shattered the notion of continued drug use as a “choice” and discredited the argument that addicts could simply stop using anytime they wanted to.
Loss of Control: How Addiction Changes Your Brain
The major argument for why addiction should not be considered an illness centers on the role of choice. For example, some argue that you can’t choose to stop having cancer, but you can choose to stop using drugs if you exert enough willpower. This argument has been applied to other mental illnesses as well; some argue, for example, that people who suffer from depression should just “stop being sad.” Both arguments fail to acknowledge that these illnesses correspond to changes in the brain’s structure and function that perpetuate the illness.
Drugs work by stimulating the reward circuitry in your brain. Typically, the reward circuit plays a role in learning — it exists to ensure you learn to repeat activities that are life-sustaining, like eating and sleeping. To do this, it releases dopamine — a chemical that causes feelings of pleasure — into your brain whenever you do an activity that is evolutionarily beneficial to your survival. As a result, an association is created between that activity and feelings of pleasure so that you’re motivated to do the activity again.
Drugs exploit the same learning pathway but kick it into overdrive. When you take a drug, it triggers the release of anywhere from 2 to 10 times the amount of dopamine that natural rewards produce. This causes extreme feelings of euphoria that strongly motivate you to take the drug again. But as you continue to use the drug, your brain adapts to these unnaturally large surges of dopamine by desensitizing itself to them.
The result is not only tolerance, the need to take increasingly larger doses to feel an effect, but also a loss of pleasure from normal activities, like eating, sleeping, and hanging out with friends, that were once reinforced by smaller amounts of dopamine. Some people even become physically dependent on the drug, facing withdrawal symptoms like nausea, fatigue, and insomnia without it. At this point, continuing to use the drug is no longer a matter of choice; both your body and your brain have become addicted to it, needing it to function and feel pleasure.
Some People Are at Greater Risk for Addiction
Despite evidence that long-term drug use leads to brain changes, some still argue that addiction differs from other mental illnesses because the initial decision to try drugs remains an individual’s choice. In other words, if you exert the willpower to not try drugs in the first place, you’ll never become addicted.
However, this line of thinking ignores the fact that several risk factors outside of one’s control increase one’s likelihood of trying drugs. Environmental factors, for example, include growing up with parents who use drugs or attending a school where drug use is widespread. Then, once you’ve started using drugs, factors like genetics can increase your likelihood of quickly becoming addicted; studies suggest genetic factors account for somewhere between 40 and 60 percent of a person’s vulnerability to addiction.
Addiction Is Still Stigmatized in Society
Scientific thinking around addiction has come a long way in the last 100 years. Most medical professionals today treat addiction as an illness, with criteria to diagnose it and guidelines to treat it. However, despite this changed attitude in the medical community, addiction remains highly stigmatized in broader society.
A 2014 study from the Johns Hopkins Bloomberg School of Public Health found that “people are significantly more likely to have negative attitudes toward those suffering from drug addiction than those with mental illness, and don’t support insurance, housing, and employment policies that benefit those dependent on drugs.” Colleen L. Barry, PhD, MPP, who led the study, attributes the difference in attitude to the fact that the “American public is more likely to think of addiction as a moral failing than a medical condition.” This study makes it clear that there is still work to be done to educate the wider public on the fact that addiction is a mental illness.
If we are to help people with addiction recover, we have to stop blaming them for making “bad choices” and dismissing them as inherently bad or weak. Instead, we must recognize the complex web of social and environmental factors that can lead to drug use, and understand that addiction is rooted in changes to the brain that impair judgment, decision-making, and self-control.
Bio: Tiffany Chi is a San Francisco-based writer who specializes in health and wellness. She enjoys reading, yoga, and trying out new recipes.
This post originally appeared on Talkspace, and you can view it here.