The Debate on Deepfake Porn Misses the Point

Online discussions tend to focus on the fact that the images aren’t “real,” ignoring the real harms caused.

When Blaire, a streamer known online as QTCinderella, first heard that her face had been deepfaked onto a porn performer’s body, she was puzzled. A popular creator with more than 800,000 followers on Twitch, she often streamed herself playing video games or baking. When her boyfriend told her what had happened, she was busy planning the second annual Streamer Awards, an event she launched in 2022. The deepfake was creepy, and it was definitely gross, but it was a stranger’s body. “I didn’t really understand what was in store for me,” she says. 

The images themselves first came to the internet’s attention on January 26, when viewers of Brandon “Atrioc” Ewing’s Twitch stream spotted a website on his screen that contained nonconsensual deepfake pornography he’d bought that depicted popular streamers, like Blaire, Pokimane, and Maya Higa. These were Ewing’s colleagues and, in some cases, friends. Blaire and Ewing occasionally streamed together. She made Ewing and his wife a wedding cake. To make matters worse, he’d exposed the existence of those deepfakes to countless thousands of people; Ewing has more than 300,000 followers on Twitch alone. It took mere hours for some viewers to spread screenshots of the site, and then of the deepfakes themselves. Ewing had lit a match, and the fire was running wild.

Deepfakes are powerful tools for spreading disinformation, but they can also have a long-term effect on people’s perceptions. It’s not enough that some viewers can tell the media is fake. The consequences are real. Victims are harassed with explicit video and images made in their likeness, an experience some liken to assault. Repeated harassment with these videos or images can be traumatizing. Friends and family don’t always have the online literacy to understand that the media has been falsified. Streamers watch as their personal and professional brands are polluted through a proliferation of explicit content created without their knowledge or consent. 

Female Twitch streamers face intense scrutiny more often than their male counterparts. They’re harassed, threatened, stalked, and constantly sexualized against their will. It is a miserable, yet widely understood, component of their work. In the weeks since Ewing’s stream, and his subsequent apology, the chatter around streamer deepfakes has largely focused on whether these women have the right to be upset over “fake” images. Fans have also homed in on the responses of Ewing, as well as Blaire’s boyfriend, a fellow streamer who was friends with Ewing. 

But these conversations miss the point. They brush aside legitimate harm in favor of bad-faith arguments. Centering “real” versus “fake” diminishes the lasting impact these images have on the streamers and their careers. “We are hurting,” Blaire says. “Every single woman involved in this is hurting.” 

On the morning she learned of the deepfakes, Blaire, whose last name WIRED has withheld for privacy reasons, began to get calls from some of the other women posted on the site. Finally, she saw screenshots. “It was a slap in the face,” she says. “Even though ‘it’s not my body,’ it might as well be. [It was] the same feeling—a violation that comes with seeing a body that’s not yours being represented as yours.” The photos were upsetting for what they were, but they also triggered the body dysmorphia she’s struggled with for years. She threw up her lunch for the first time in a long while. “Seeing these photos spread and seeing people just sexualizing you against your will without your consent, it feels very dirty. You feel very used.”

In the weeks after Ewing’s stream, the online conversation about the deepfakes continued to spiral. 

Commenters went on tirades about how streamers who’ve been deepfaked shouldn’t care, while others claimed it was all for attention. Above all, it was men, online and off, who had not been deepfaked who seemed determined to decide what was a proper way to react. Blaire says male friends apologized to her boyfriend, rather than to her, for the trouble the incident was causing. Pokimane rebuffed comments suggesting that photos she’d posted justified her, or anyone else’s, treatment. “People can post whatever they want, and that still means you still need their consent to do certain things, including sexualizing them and then profiting off of it,” she said during a stream.

On Twitter, Higa lambasted critics. “The debate over our experience as women in this is, not shockingly, amongst men,” Higa wrote. “None of you should care or listen to what any male streamer’s ‘take’ is on how we feel.” The situation has made her feel “disgusting, vulnerable, nauseous, and violated,” she continued. “This is not your debate. Stop acting like it is.”  

Other streamers who’d been targeted remained quiet. There seemed to be an unspoken understanding between them: damned if you do, damned if you don’t. Talking about how it felt to be deepfaked came with the unfortunate by-product of adding fuel to the fire. For Blaire, efforts to defend herself led to more harassment. Some streamers choose to have OnlyFans accounts, where they have the power to decide what gets posted and profit from it. Although Blaire is pro-sex work, this is not something she’s opted to do. Instead, sexualized images were created without her knowledge. “The most tepid take on all this is like, ‘Hey, consent is important,’” she says, “and there are still people that will argue that.” 

In an attempt to show viewers the impact these deepfakes had on her, a real person, she did a very human thing: She got on Twitch and streamed herself, red-faced and vulnerable. “This is what pain looks like,” she repeated in the video, crying openly. “It should not be a part of my job to have to pay money to get this stuff taken down,” she said. “It should not be part of my job to be harassed, to see pictures of me ‘nude’ spread around … It shouldn’t be a part of my job. And the fact that it is, is exhausting.” 

Blaire’s impassioned plea—the closest she could get to sitting in a room with thousands of people to let them absorb her presence as a person hurting—prompted some critics to double down. Fellow Twitch streamers made reaction content and jokes out of her video; mega-popular creator Ethan Klein of h3h3Productions streamed a segment where he played “Chestnuts Roasting on an Open Fire” over Blaire’s video, giggling and covering his face throughout. He later issued an apology. Across communities on Reddit and Twitter, commenters accused the women involved of exaggerating the impact of the deepfakes, comparing the fakes to a harmless photoshop job. One user tweeted a picture of a tablet at Blaire; the device showed an image of her from her pain-filled stream. Its screen was covered with semen.

“People are mad at you for reacting,” Blaire says. “And then other people are saying, ‘Oh, she’s baiting sympathy.’ It just never ends.” 

Arguing that deepfakes can’t be harmful because they’re not “real” is as reductive as it is false. It’s ignorant to proclaim they’re no big deal while the people impacted are telling you they are. Deepfakes can inflict “the same kinds of harms as an actual piece of media recorded from a person would,” says Cailin O’Connor, author of The Misinformation Age and a professor at the University of California, Irvine. “Whether or not they’re fake, the impression still lasts.”

Memes about politicians, for example, influence our perceptions of them, inspiring everything from disgust to playful affection. Once a right-wing tool against President Joe Biden, the “dark Brandon” memes have since been claimed by the left to turn Biden into an edgy agitator. “It’s not exactly misinformation because it doesn’t give you a false belief, but it changes someone’s attitudinal response and the way they feel about them,” O’Connor says. With pornography, it’s turning a person into a sex object against their will, in situations some find degrading or shameful. 

Ewing released his tearful apology on January 30, a 14-minute stream with his wife in the background in which he claims his interest in AI, deepfakes of music and art, and being “morbidly curious” fueled his decision to visit the site and view the videos of his peers. “I was on fucking Pornhub … and there was an ad [for the deepfake site],” he said. “There’s an ad on every fucking video for this so I know other people must be clicking it.” 

His apology—his regret, the quality of how he’d handled it—dominated headlines and discussion, instead of the damage caused. For some fans, the apology was enough: In their eyes, here was an otherwise good guy who’d made a mistake. He followed with a written apology on Twitter on February 1. “I think the worst thing of all of this is he released that apology before he apologized to some of the women involved,” Blaire says. “That’s not OK.” (Ewing did not respond to a request for comment about this.) 

The page hosting those deepfakes is now gone. Its owner took it down. In his apology, Ewing credits the page’s demise to Blaire and law firm Morrison Rothman. Ewing says he will also cover the costs for anyone affected who wishes to use Morrison Rothman to remove the deepfakes from the web and that he will continue to work with other law firms to get them taken down on sites like Reddit. (Morrison Rothman confirmed to WIRED that the firm is representing those affected by Ewing’s stream.) 

However, removing any content from the internet is a Sisyphean task, even under the best of circumstances. Blaire, who had vowed to sue the deepfake creator responsible, learned from multiple lawyers that she’s unable to do so without the help of federal legislation. Only three states—California, Texas, and Virginia—have laws in place to specifically address deepfakes, and these are shaky at best. Section 230 absolves a site’s owner of legal liability for illicit content posted by users. And as long as someone acts in “good faith” to remove content, they’re essentially safe from punishment—which may explain why the page’s owner posted an apology in which they call the impact of their deepfakes “eye opening.”

Laws and regulations dealing with issues like these are impossible to enact with any speed, let alone against the lightning-fast culture of the internet. “The general picture we ought to be looking at is something like the equivalent of the FDA or the EPA,” says O’Connor, “where you have a flexible regulatory body, and various interests online have to work with that body in order to be in compliance with certain kinds of standards.” With that kind of system in place, O’Connor believes progress could be made. “The picture that I think we should all be forwarding is one where our regulation is as flexible and able to change as things on the internet are flexible and able to change.”

For Blaire and the women involved, there is little in the way of immediate relief. They’ve found sympathy and support in their communities. Yet whatever steps Ewing takes to do right by his victims will never erase the impact of these deepfakes, or the harm that distributing them has done. Some of the deepfaked images of Blaire have even made it to “very religious” members of her family, who don’t understand what deepfakes are and are convinced the photos are authentic. “They had no clue. They couldn’t understand it at all,” she says. “People that I love and care about think I [did something] I never want to do.”

These deepfakes harm more than just the women directly affected. People have even gone so far as to directly harass some of Blaire’s family with those photos. Her young cousin occasionally comes into her streams; after finding his Twitch account, people messaged the deepfakes to him directly. “That’s unfair,” Blaire says. “He’s 17.”

Blaire has lost time and work to the chaos that’s followed all of this. She temporarily stepped away from streaming—a hit to her income. She worries that in the future she may be unable to attract sponsors, who could Google her only to find her name attached to pornography, or even to this issue, which could drive them away. These days, more people are talking about her deepfakes than the Streamer Awards. “The taboo of it is more interesting than actual hard work,” she says. “Business-wise, it’s just been demoralizing. If you go to my YouTube comments, it’s just a lot of harassment. If you go online anywhere, it’s just a lot of harassment.” All she can do is try to keep her head down and continue to work.  

“At the end of the day, you realize this is all because of a pervert online,” Blaire says. “This is nothing that I signed up for.”

Updated 3-1-23, 3:15 pm EST: This story was updated to correct that a page hosting streamer deepfakes was taken down, not the site as previously stated.