Who Exactly Is Ashton Kutcher’s Anti-Sex-Trafficking Tech Company Helping?

The actor’s recent PR crisis has led to scrutiny around his advocacy work.

Photo: Paul Morigi/WireImage


Ashton Kutcher has not yet recovered from a public-relations fire that first ignited in early September. The crisis began with character letters written by Kutcher and his wife, Mila Kunis, to the judge in charge of sentencing their That ’70s Show co-star, Danny Masterson, for sexually assaulting two women. That Kutcher and Kunis described the convicted rapist as a wonderful father and an “outstanding role model” who “has always treated people with decency” was enough to fuel backlash. Their mea culpa was widely mocked on Twitter, spawning meme videos and jokes about the “apology wall” celebrities use to convey humility. But Kutcher was called out for a particular strand of hypocrisy. The actor is also a self-professed anti-trafficking crusader, who co-founded a company that fights child sex abuse and has testified in front of Congress on the issue. Why, then, was he defending someone who violently ignored consent?

On September 14, Kutcher resigned from the organization, called Thorn. “I cannot allow my error in judgment to distract from our efforts and the children we serve,” Kutcher said in a statement to the board that was later made public. “The mission must always be the priority and I want to offer my heartfelt apology to all victims of sexual violence and everyone at Thorn who I hurt by what I did.” That mission, to be clear, is building “technology to defend children from sexual abuse” through a “company with the mission of a nonprofit.” But the anti-sex-trafficking advocates, sex workers, and trafficking survivors I spoke with all fear the nonprofit’s approach may harm the very people it claims to protect.

Kutcher and his first wife, Demi Moore, founded an organization called DNA in 2009 after watching a Dateline special about sex trafficking in Cambodia. (DNA rebranded to Thorn after the couple split in 2012.) “Those Cambodian kids were 7, 8, 9 years old,” Kutcher said in a 2017 interview with W. “I started asking around, and people said to me, ‘Oh, no, it’s happening right here in Los Angeles.’” Kutcher told W that he was looking to lend his celebrity to a cause. In its first ten years, the organization says it raised roughly $27 million and partnered with powerful companies like Goldman Sachs and nonprofits like the McCain Institute. In 2011, a public-awareness campaign for DNA featured Donald Trump, Jamie Foxx, and the slogan “Real Men Don’t Buy Girls.”

As a prolific tech investor, Kutcher leveraged his relationships with companies like Google, Twitter, and Amazon to help create a digital tool named Spotlight. The organization claimed the software would help save vulnerable children and, by the mid-2010s, gave Spotlight out to police departments around the country for free. Thorn’s flagship product acts like a search engine. If cops suspect someone is being trafficked, they can enter certain pieces of information, like a name, phone number, or photo, into the database, which then combs through millions of online escort ads to turn up results. It also has an algorithm that compiles ads with signs of trafficking — like sex workers calling themselves “young” or specifying race — so cops can look for potential victims in their area. Last year, the company spent almost $9 million on such “victim identification” efforts. Only a few similar programs exist on the market, and Spotlight’s backend logistics are mysterious (the company did not respond to any of my detailed questions about how it’s run, nor did Ashton Kutcher respond to a request for comment). Previous reporting from the New York Times and Forbes found that the software relies on tools like Rekognition, Amazon’s AI facial-recognition program. Its surveillance is sophisticated: The program can set up alerts that allow police to track potential victims in real time based on new ads, according to Forbes, and map out possible connections between people, creating a visual trafficking network.

Thorn says it has helped identify more than 17,000 child survivors over the past four years, one of the many impressive-sounding statistics the company often touts. Experts have pointed out, however, that based on similar government numbers, those metrics seem impossible. In 2017, Kutcher told Congress that 2,000 child-trafficking victims were identified in six months, thanks to Spotlight. Thorn’s impact report that year noted that 103 children were “rescued.” By the FBI’s own count, the bureau found 175 minor victims between 2009 and 2015 (at the state level, in 2015, there were 744 investigations into all sex-trafficking offenses across the country). Cops still love the technology, though, and Thorn claims Spotlight is used by more than 8,000 of them. One detective working in the Seattle area, whom I’ll call Brian, describes it as a “phenomenal tool.”

“If it were to go away, that would bring law enforcement back to the stone age,” he says. The program recently helped his unit discover a 15-year-old who was being trafficked. Brian’s colleague responded to a domestic disturbance, where he found a bruised and emaciated teenager. He asked for her phone number and ran it through Spotlight, which instantly turned up ads posted by the girl’s trafficker on a popular website called Skip the Games. “Without Spotlight, he probably wouldn’t have found that,” the detective told me, since survivors don’t always see themselves as being exploited or may lie to cops out of fear.

One group that doesn’t love this technology is sex workers. The software can’t credibly discern between real escort ads and sex-trafficking-disguised-as-escort ads, meaning consenting adults often get swept up in its surveillance. In fact, a recent study funded by the Department of Justice found that police regularly mistake certain “red flags” in escort ads — like 24/7 availability or the use of specific emojis — for signs of trafficking. That Thorn uses Amazon’s facial-recognition tool is especially contentious. Research by MIT and the ACLU has shown that it falsely identified people of color, and Amazon itself has banned police departments from using Rekognition, except in trafficking cases through software like Spotlight.

Though officers are only supposed to use Spotlight for child-abuse cases, they can easily surveil sex workers and set up stings to arrest them. It’s an issue that worries Blair, a dominatrix, given that she was recently flagged by border patrol when trying to enter Canada. An agent questioned her and seemed to know about her cam work. “They treated me like a victim,” says the New York–based 28-year-old. “They were like, ‘You’re being trafficked.’” Blair’s sex-worker colleagues have had similar experiences while traveling, she suspects, because facial-recognition software like the kind Thorn offers has crawled their posts. “When you’re databasing my face off my ads, which I have to post to pay my bills, you’re putting me in closer proximity to people who could arrest me, deport me, or evict me,” she says.

The ability to post online ads has made sex workers safer, enabling them to screen clients and work from home. But recent laws like FOSTA/SESTA, which ban platforms from hosting these ads, and surveillance tools like Thorn sometimes make the job more dangerous under the pretense of protecting trafficking victims. Even the survivors I spoke with saw this tool as a potential threat, citing the high rates of sexual violence perpetrated by police. When Jax-Prince Cottrell was around 20 years old, they were trafficked by a man who claimed he wanted a live-in dominatrix. Cottrell, who is nonbinary and transgender, was addicted to heroin and desperate for a place to stay. Instead, they were repeatedly raped by him and forced to have sex with multiple men after moving into his New York apartment. As a sex worker, Cottrell had already been sexually assaulted by a cop in the back of a police car. And as a Black person, they see law enforcement as threatening. “I would literally rather take the risk of dying before I ask a cop for help,” they told me. “They are some of our biggest predators.” Cottrell grew up in poverty, and what they needed more than anything else was safe shelter and a support system.

But Spotlight does nothing to address the underlying factors. “Thorn builds products for police, not trafficking survivors,” says Sabra Boyd, a Seattle-based writer and consultant. “Silicon Valley loves funding the newest shiniest thing, not affordable housing.” She had been trafficked by multiple men, including her father, since she was a child. When Boyd was 13, an officer told her that if she wanted to press charges against one of her abusers, he couldn’t offer her any protection. “My trafficker had threatened to kill me and my family,” she says. “He’d been sentenced to jail and prison dozens of times and was always released more violent.” At 17, she remembers that while sleeping outside as a homeless teenager, another officer “kicked and beat me awake.” If Thorn really cared about victims, she says, they would invest in technology-based solutions that make finding a home, legal aid, healthcare, and counseling more accessible to vulnerable kids. Instead, Boyd says, “Thorn, like other tech companies, uses human trafficking — especially child trafficking — as a ruse to erode everyone’s privacy rights worldwide.”

Spotlight is a blunt instrument that can’t do anything beyond identifying a survivor. Brian, the Seattle cop, acknowledged that taking a victim away from their trafficker and keeping them safe over the long term are two different things. He and his colleagues sent the teenager they had found home with her mom, but that’s where she’d run away from in the first place. Brian couldn’t give more specifics about the situation, since the investigation is ongoing, but said: “When you have a 15-year-old who is suffering from substance-abuse disorders, mental-health issues, and other things, there’s just a number of barriers or factors that we have to overcome.” In the past few years, Thorn has rolled out programs and products that are more focused on mainstream use. Safer is a commercial software to help platforms like TikTok remove sexually explicit content involving kids, while NoFiltr and Thorn for Parents focus on educational resources and online campaigns.

Jean Bruggeman, the executive director of Freedom Network USA, the country’s largest anti-trafficking coalition, is concerned with what happens to survivors after they are identified by Thorn. How many trafficked children identified by Spotlight, Bruggeman wonders, are harmed by police intervention, physically or psychologically? How many are coerced into giving testimony against their trafficker? How many continue to be exploited even after a predator is arrested? Bruggeman alleges that she raised these concerns with an employee from Thorn about six years ago. They had coffee after a meeting hosted by Microsoft in Washington, D.C., and Bruggeman asked whether the company was doing anything to help children get their basic needs met. Had Thorn looked into the ways its technology might actually hurt survivors, she wanted to know, especially those who are most vulnerable already? The employee seemed “willing to have the conversation and hear my concern.” But Bruggeman offered to have a follow-up call and claims she never heard back from anyone at the company. “I just got a sense they really believe that their work is having some positive impact,” says Bruggeman, “and didn’t really seem interested in looking too deeply into any negative impact.”

Over the past two decades, the fight against child sex trafficking has become an obsession of the religious right and conspiracy theorists. Groups have used it as a recruitment tool; QAnon’s “Save the Children” campaign, for example, was a palatable gateway to more insidious right-wing ideologies. Organizations that claim to care about child exploitation tend to fabricate statistics and tell sensationalist stories of white children being kidnapped by strangers in airports or tied up in basements. The crime becomes synonymous with big-screen narratives, like Taken or Sound of Freedom, which don’t reflect the reality that most victims are exploited by people they know. Bruggeman says an emphasis on rescue missions and flashy numbers does more to boost egos than to help make sure children don’t end up exploited. “Hollywood loves a good white-savior narrative,” says Bruggeman. Only three of the eight people on Thorn’s majority-white board of directors have spent their careers doing anti-child-sex-trafficking work, and the organization’s CEO and executive director, Julie Cordua, has a background in marketing. “I think it’s critically important that the work be led by those with lived experience,” says Bruggeman, and that celebrities “defer to experts in the field.”

When Kutcher testified before Congress during a 2017 hearing on modern slavery, he somberly declared that his day job is doing anti-trafficking work, as if the dozen or so movies he acted in after founding Thorn were a side hustle. After John McCain teasingly told Kutcher he was “better looking in the movies,” Kutcher blew him a kiss, generating breathless media headlines. (“Ashton Kutcher has been doing incredible work to combat child trafficking & we didn’t even realise,” gushed Glamour U.K.) Though he may have stepped down from the board, the organization has certainly been shaped in his image. Now that Spotlight is a firmly entrenched part of law enforcement’s arsenal, it doesn’t need a celebrity figurehead to spread even further. “If you create the technology, people tend to want to have the broadest access to it,” says Jared Trujillo, a former sex worker who teaches constitutional law at CUNY Law. That’s a scary thought, given how little the public knows about this surveillance tool, one that is most likely to harm the most vulnerable groups. “There’s really no sunlight on exactly how Spotlight operates, how its algorithms operate, and how people end up in their database,” says Trujillo. “‘Just trust Ashton Kutcher’ is terrible public policy.”