I cut myself for the first time at age 18, in the closet of my freshman dorm room. It was late afternoon, and my roommates weren’t around. I snuck into the closet, pulled a pink disposable shaving razor from underneath my socks and underwear in the top drawer of my dresser, took off the protective cap, and dragged the blade in a lateral direction across the top of my left thigh. I felt hot, nervous, exhilarated, and guilty.
I would continue this ritual over and over for several months. During this period, there wasn’t really anything that might signal to the casual observer that I was at risk of harming myself: I was getting pretty good grades in school. I had friends. Overall, I was functioning. But I was also deeply depressed and anxious, and always felt I was at the end of my rope. My depression told me that I deserved to be hurt physically, and I was also desperate for a distraction from my constant mental anguish. Physical pain accomplished both things at once, and did so very effectively.
Cutting can be difficult to discuss, and not just because of the shame and stigma surrounding mental-health issues. Self-harm is violent, visceral, and a hard image to process. It’s something that makes people uncomfortable, even disturbed. But despite the relative silence on the subject (except for very active online self-harm communities), it’s not uncommon.
Statistics on self-harm in the U.S. are not all that easy to come by. By some estimates, around one percent of the population engages in some form of “self-abusive behavior,” a category that includes disordered eating as well as cutting. In 2010, The Journal of the American Board of Family Medicine published a paper on non-suicidal self-harm, stating that it was a practice most common among adolescents and young adults. Research suggests that men and women self-harm in equal numbers, though the methods likely differ: The Journal article stated that “men more frequently report burning and hitting themselves, whereas women are more likely to report cutting and burning themselves.”
It’s well-established in medical research that most people who self-harm do so with the intention of releasing and relieving psychological pain. Inducing physical pain triggers the body to release endorphins, which produce a natural, morphine-like effect that relieves the emotional pain. But while the pain that drives the decision to self-harm comes from the inside, the idea to self-harm itself is very much external.
I was deeply depressed when I started cutting, but in the moments leading up to the first time I hurt myself, I wasn’t thinking about my problems so much as a once-popular and very wholesome TV show I watched with my family: 7th Heaven. The family drama about a pastor, his wife, and their seven kids was popular in the late 1990s, and featured plotlines meant to address family dynamics and the low-hanging social-issue fruit of the day. The episode I had in mind was from 1998, where the third-oldest child in the family, Lucy, has a friend who cuts herself.
When my mental health deteriorated during my freshman year of college, I remember thinking that self-harm was what one did when severely depressed. It seemed like the logical extension of what I was feeling, a connection I’d unconsciously begun to make back when I first learned about the concept of self-harm in my middle-school health class: The way the teacher had described it — a way of gaining control, a brief relief of pain, and so on — sounded to me, in hindsight, like treatment instructions for severe depression. I’ve often wondered if I would have felt the same way about cutting had I never heard about it in school or on television. There are historical precedents for self-harm, and in each case it appears to be a learned social behavior. If I’d never been taught about self-harm, would I ever have started doing it? Would anyone?
According to Janis Whitlock, the director of Cornell’s Research Program on Self-Injury and Recovery, there’s no definitive answer, but it seems nearly certain that there is no biological imperative to self-harm. In fact, humans naturally have the opposite impulse: to be physically safe and healthy.
Through her research, Whitlock has found that there are two primary ways that individuals begin to self-injure. The first is by accident: One young woman Whitlock worked with told her that one day, while experiencing intense emotional pain, she accidentally scraped her leg against the sharp edge of a table. Noticing the slight relief and distraction that came with the physical pain in her leg, the young woman got the idea that she could reproduce this sensation through intentional self-harm.
But for those who don’t happen upon self-harm by chance, the idea comes from peers, pop culture, and school settings, Whitlock says: Between exposure to peers who self-injure and depictions in the media, it is “really uncommon for a young person not to have come across it” in some fashion. Everyone knows it’s out there, and some people end up seeing it as an option.
Whitlock explains that these days, young people who self-injure almost always fall into this second camp. Whitlock observed the pop-culture effect on self-harm while working with self-injuring teenagers in the 2000s, a time when explicit pop-culture references to self-harm had been showing up for a couple of decades.
Between the 1990s and the present, celebrities such as Princess Diana, Johnny Depp, and Angelina Jolie have publicly disclosed a history of self-harm. In the mid-1990s, musician Marilyn Manson went on the infamous “Smells Like Children” tour, where he repeatedly cut himself onstage during his performances. The movie Thirteen, where the main character experiences druglike effects from cutting herself, was released in 2003. In a more recent example, the Netflix show 13 Reasons Why has inspired a bevy of articles about the glorification of self-harm. In other words, it’s a topic that’s spent some significant time in the cultural spotlight. And depressed young people have noticed.
Whitlock says this spotlight is exactly what should be avoided, both in educating teenagers about self-harm and in treating the behavior itself. In fact, she adds, the best prevention strategy may be not to dwell on it at all: focusing on self-harm in health education, psychological settings, or peer-to-peer support scenarios can also backfire. Instead, Whitlock recommends a “notice and respond” approach, shifting the focus from the behavior itself to other alternatives. (She’s also found that discussing the negative consequences of self-harm, like the lasting scars that can come from cutting or burning, seems to be an effective deterrent.)
I can speak to the effectiveness of these types of diversion-based techniques. When I stopped cutting myself, one of my first psychologists told me that when I felt the urge to self-harm, I should squeeze ice cubes instead. I barely shaved my legs for several months because I didn’t trust myself with a razor. I got a lot of tattoos and piercings in a short period of time because I wanted to feel pain without feeling guilty. I know others who would snap an elastic hair-binder or rubber band on their wrists when they were “weaning” themselves off of self-harming behaviors.
For many of us, these alternatives worked. Whitlock’s takeaway from her years of research — a take with which I strongly agree — is that the key to preventing self-harm is talking about it in just the right way: It’s about reframing the conversation to address internal pain without putting self-destructive behaviors in the spotlight. My detour came in the form of therapy, ice cubes, and tattoos, but whatever it is for others, what matters most is that it’s anything other than harming our own bodies.