So...should I be worried about deepfakes?

Exploring what deepfakes are and how they disproportionately harm women

In late January, Taylor Swift was about to kick off the first set of 2024 shows for her massively successful Eras Tour. Arguably the most famous woman in the world – and now a billionaire – Swift has long been at the center of the public eye. But in the weeks leading up to the Super Bowl, where millions awaited her appearance, the content featuring her that swept across the Internet did not showcase her singing, songwriting, or support for Travis Kelce. Instead, sexually explicit deepfake images of Swift were everywhere on X. One of the most viewed posts amassed over 45 million views, nearly 25,000 reposts, and hundreds of thousands of likes and bookmarks in the 17 hours it remained visible before X managed to take it down. The company struggled so much to get the situation under control that it resorted to temporarily disabling searches for Swift’s name while her fans attempted to reclaim hashtags related to the star that had been flooded with AI-generated pornographic content. Within days, the content was removed from X. But countless concerns, fears, and questions remain. So today we’ll try to wrap our heads around it all.


The deepfake floodgates are open

First things first, let’s define what we mean by “deepfake.” Simply put, a deepfake is an “artificial image or video (a series of images) generated by a special kind of machine learning called ‘deep’ learning (hence the name).” Algorithms and AI models, such as generative adversarial networks (GANs) – a favorite of deepfake generators – train on large sets of data to produce results so realistic that the human eye can’t tell they are in fact synthetic. (Try refreshing thispersondoesnotexist.com a few times to see for yourself.) In practice, this means these AI tools can swap a person’s likeness into an existing image or video of a moment where they simply weren’t present. And it’s happening. A lot.
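For the technically curious, here’s roughly what that “training on large sets of data” looks like under the hood. This is a minimal, illustrative sketch of the GAN idea in PyTorch – a toy generator and discriminator competing against each other – not code from any real deepfake tool, and every name and size below is our own assumption.

```python
# Minimal GAN sketch: a generator learns to produce fakes that a
# discriminator can no longer distinguish from real data.
import torch
import torch.nn as nn

latent_dim, image_dim = 64, 28 * 28  # toy sizes, purely illustrative

# Generator: turns random noise into a fake "image" vector.
generator = nn.Sequential(
    nn.Linear(latent_dim, 256), nn.ReLU(),
    nn.Linear(256, image_dim), nn.Tanh(),
)

# Discriminator: guesses whether an input image is real or generated.
discriminator = nn.Sequential(
    nn.Linear(image_dim, 256), nn.LeakyReLU(0.2),
    nn.Linear(256, 1), nn.Sigmoid(),
)

loss_fn = nn.BCELoss()
g_opt = torch.optim.Adam(generator.parameters(), lr=2e-4)
d_opt = torch.optim.Adam(discriminator.parameters(), lr=2e-4)

def train_step(real_images: torch.Tensor) -> None:
    batch = real_images.size(0)
    real_labels = torch.ones(batch, 1)
    fake_labels = torch.zeros(batch, 1)

    # 1) Train the discriminator to separate real images from fakes.
    noise = torch.randn(batch, latent_dim)
    fake_images = generator(noise).detach()
    d_loss = loss_fn(discriminator(real_images), real_labels) + \
             loss_fn(discriminator(fake_images), fake_labels)
    d_opt.zero_grad(); d_loss.backward(); d_opt.step()

    # 2) Train the generator to fool the discriminator.
    noise = torch.randn(batch, latent_dim)
    g_loss = loss_fn(discriminator(generator(noise)), real_labels)
    g_opt.zero_grad(); g_loss.backward(); g_opt.step()

# Stand-in "real" data just to show the loop runs; a real system would
# train on huge datasets of actual photos or video frames.
train_step(torch.rand(32, image_dim) * 2 - 1)
```

Repeat that loop over enormous photo datasets and the generator’s output becomes the kind of imagery that fools the human eye.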

Some estimates put the number of deepfake videos available online in the hundreds of thousands, with that number increasing by as much as 500% from 2019 to 2023. One of the earliest reports in the space, from security company Sensity, which specializes in deepfake detection, found that a staggering 96% of the deepfake videos it identified online were pornographic in nature. More recent reports put that number even higher and find that 99% of the individuals targeted by deepfake pornography are women. WIRED also reported that an analysis of over 35 websites fully or partially dedicated to hosting deepfake pornography showed that more deepfake videos were uploaded in 2023 than in all prior years combined.

So easy, a caveman could do it

While extremely disturbing, all of this is perhaps not particularly surprising given that the term “deepfake” was “first coined in late 2017 by a Reddit user of the same name…where they shared pornographic videos that used open source face-swapping technology.” In 2019, Vice’s Motherboard published an article titled “This Horrifying App Undresses a Photo of Any Woman With a Single Click.” And that’s exactly what DeepNude did. Built on a GAN-based algorithm developed by researchers at the University of California, Berkeley, the app allowed users to simply upload a photo of a clothed woman and generate a realistic nude image of her. DeepNude, of course, was only developed to work on female bodies and could not produce images of naked men. Its creator, known only by an alias, stated that he thought he could make “an economic return from this algorithm” and that if it wasn’t created by him, “someone else will do it in a year.” He later shut down the app, citing the inability to control traffic and…some ethical concerns (genius).

Today, Reddit posts telling you about the “best deepnude apps” are not hard to find. A quick Google search will present you with numerous options for undressing women in any photos you happen to have. Alternatively, if you have the absolute bare minimum of technological skill required to download photos from an unwitting woman’s social media or dating app profile – to name just a few terrifying options – that works, too. Many of these search results (view – or don’t view – these links at your own discretion), like this one and this one, share the DeepNude name with their predecessor, and the former even offers free examples of what it can do. Within about 30 seconds, I was able to test this out on myself and view the results for free. (We very much recommend you don’t do this yourself – don’t give them your data.) Of course, the fake nude image the site generated considerably altered and accentuated parts of my body to better fit societal beauty standards (smaller waist, larger breasts, more defined abs…we’ll dive into beauty standards in a future post). But overall, with a little tweaking and a few more tries, it would quite likely look reasonably realistic. The ease with which anyone can generate these images of other women – ones they know, and ones they don’t – is startling. There are likely millions of such images in existence already.

The new revenge porn

Women have long been the victims of online sexual harassment and of nonconsensual pornographic threats and distribution. A 2021 Pew Research Center survey found that “33% of women under 35 say they have been sexually harassed online.” Another meta-analysis of studies estimated that as many as 12% of sexually explicit messages exchanged between young people were forwarded to others without the sender’s consent. A 2016 report, surveying over 3,000 U.S. Internet users over the age of 15, found that 1 in 25 Americans had “been threatened with or faced a vicious form of digital harassment in which explicit images are shared online without the subject's consent.” Women and those identifying as lesbian, gay, or bisexual were considerably more likely to face this harassment, with 10% of women reporting threats and 6% reporting that their photos were distributed without their consent. The lead author, Amanda Lenhart, stressed the importance of focusing on threats as well as distribution, because threats are a common coercive tactic in domestic abuse cases. This “revenge porn,” Lenhart also notes, can threaten victims’ safety, as their names, locations, and places of work are often shared alongside the explicit photos.

A 2014 study from the Cyber Civil Rights Initiative revealed that “more than 50 percent of survivors’ full names and links to their social media profiles were posted with their naked photos on revenge porn sites, as were 20 percent of survivors’ email addresses and phone numbers.” It also found that roughly 90% of nonconsensual pornography victims are women and 57% of all victims are harassed by an ex-boyfriend despite the fact that more men than women have sent their partners explicit photographs. The impacts of such personal violations on their victims are extremely serious: 93% suffered significant emotional distress, 82% suffered significant impairment in social and occupational areas of their lives, 49% were stalked or harassed online by those who saw their content, 42% sought out psychological services in response to the experience, 37% were teased by others for their content, and 30% were stalked or harassed outside of the Internet.

New technology, same problem

Clearly, deepfake pornography is not where the sexualized digital exploitation of women's bodies began. However, these AI advances have made it dramatically easier to make a woman the victim of such harassment. You no longer need a real explicit image or video of a woman, nor do you need to go through the trouble of learning extensive Photoshop skills to create a remotely believable fake nude; you simply need a couple of minutes, zero dollars, and a decent-quality photograph of a clothed woman in order to violate someone in the most personal of ways. And if you want to share it virally? No problem. Various corners of the Internet will quickly pick it up and spread it so far and wide that it becomes next to impossible for the subject of the content – who likely has no idea it even exists – to ever fully remove it from the public sphere.

One survey found that 48% of U.S. men have already seen deepfake pornography at least once. Another, from the University of Antwerp, surveyed nearly 3,000 Belgian young people between the ages of 15 and 25 and found that 7% owned deepnudes (largely men), nearly 14% had received deepnudes, and 60% of those familiar with deepnudes (12% of the total) had actually tried to create one themselves. Actresses, female musicians, and other high-profile women have been frequent targets of nonconsensual deepfake pornography. Emma Watson’s likeness, for example, was used in highly sexually suggestive ads that ran on Meta’s platforms for a “DeepFake FaceSwap” app. The Prime Minister of Italy, Giorgia Meloni, is seeking over $100,000 in damages from two men who created and circulated deepfake pornographic videos of her. In Northern Ireland, politician Cara Hunter was targeted with AI-generated sexualized videos of herself just weeks before the 2022 Assembly elections, in what she called “a campaign to undermine me politically.” As one article put it, incidents like these are essentially “men telling a powerful woman to get back in her box.”

So what’s being done about it? 

It is not, by any means, just celebrities who have been forced to deal with these traumatic and distressing events in recent years. Across the country, from Florida to California to New Jersey, high school and even middle school administrators and parents have had to confront deeply upsetting situations in which male students used AI to generate nude, sexually explicit images of their female classmates. In the years since deepfake videos first emerged, Google has received tens of thousands of takedown requests for AI-generated porn videos on websites hosting such content. For victims of these weaponized machine-learning tools, it can be extremely difficult or even impossible to track down – let alone successfully remove – all traces of explicit content of oneself from the Internet. While there are certain tools, like the Take It Down site, that can help in these efforts with revenge porn, they require the requester to provide the exact image in question; if the images have been altered in any way (cropped, edited, etc.) or if many different versions of the images have been created, such requests won’t be effective.
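To make that “exact image” limitation concrete, here’s a minimal sketch of why exact-match takedowns break the moment a file is edited. We’re assuming a simple cryptographic-hash comparison purely for illustration; the actual matching used by services like Take It Down may work differently (for instance, perceptual hashing that tolerates some edits).

```python
# Illustrative only: why an "exact image" match fails after any edit.
# Assumes simple cryptographic hashing; real takedown services may differ.
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    """Return a digest that identifies this exact sequence of bytes."""
    return hashlib.sha256(image_bytes).hexdigest()

original = b"...raw bytes of the original image..."  # stand-in for a real file
edited = original[10:]                               # stand-in for a crop, re-encode, or resize

print(fingerprint(original) == fingerprint(original))  # True: an exact copy matches
print(fingerprint(original) == fingerprint(edited))    # False: any change breaks the match
```

In other words, a takedown system keyed to one specific file can be sidestepped by trivial edits, which is exactly the gap victims of altered or regenerated images fall into.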

Furthermore, the legal system does not yet seem well positioned to tackle such a complex problem. In 2019, the Duke Law and Technology Review published an article titled “Deepfakes: False Pornography is Here and the Law Cannot Protect You,” which more or less concluded what its name states and focused specifically on “the legality of pornographic deepfakes featuring a non-celebrity, such as an acquaintance, and their circulation on websites and amongst friends.” The Vanderbilt Law Review seconded this opinion the following year with its own article: “The New Weapon of Choice: Law’s Current Inability to Properly Address Deepfake Pornography.” While citizens are legally protected against revenge porn in nearly all states, victims of pornographic deepfakes are left with far less clear support. In the United States today, “there is no federal law regarding deepfakes,” though 10 states have a mix of laws that attempt to address these forms of harassment and violation. As one expert put it, “you’ve got a patchwork of criminal charges, which are going to be difficult in these cases,” as some states have legislation targeting the distribution of explicit deepfake content but don’t thoroughly address its creation. There is also the question of who should be liable in these cases: the software creator, the content creator, or the content host. Some victims have turned to litigation, suing those who shared AI-generated content of them. However, for the many women who don’t have access to the resources of high-profile celebrities, ensuring that this content is removed and that those who violated them face consequences may be next to impossible.

Let us know what you think by voting in our poll and leaving an anonymous comment.

💭 Our two cents

We hope you take us at our word when we say the purpose of this post is not fearmongering. Most women will never be the target of deepfake pornography or revenge porn, especially those of us who aren’t in the public eye the way celebrities and politicians are. You shouldn’t lose sleep at night over the possibility of explicit images and videos of yourself floating around on the (dark) web. But at the same time, we think it’s important to know what’s happening out there. It can be extremely upsetting to hear how thousands of women have been violated by deepfakes, and extremely frustrating that resources on the cutting edge of AI are being dedicated to the objectification of women. Obviously, the technological advances that make this content generation possible are remarkable and will likely have many revolutionary benefits. But that doesn’t make it any less okay to be incredibly disappointed, angered, or [insert whatever emotion you want] that the moment men got their hands on this technology, one of its core purposes became reducing women from all walks of life to overly sexualized, fantasized, unrealistic versions of themselves. All the more reason we need more women at the table.

💃 The girls have spoken

Last week’s piece, So…why is women's pain so often dismissed?, talked through how and why women’s pain tends to be ignored by medical professionals. In the poll, a whopping 92% of you shared that you’ve had pain dismissed by a clinician at some point (with 18% saying you’ve experienced it a lot). One comment addressed how the gender of your practitioner might make a difference - for this reader, seeing a female clinician helped mitigate the bias. Of course, not all male doctors will be more dismissive and not all female doctors will be more attentive - and we couldn’t find any research showing that female doctors treat women better. However, the Association of American Medical Colleges (AAMC) does report that Black patients receive better care from Black doctors - so there is research showing that, in some cases, seeing a doctor who looks like you can result in a better healthcare experience. Something to keep in mind!

💌 Up Next

That’s all for today! If you liked this edition of Not That Personal, we think one of your friends probably will too – refer one (or two or three) below. ;)

Have something to say? We’d love to hear it – reply to this email or leave an anonymous comment here :) 

Up next: So…when did marriage and kids become our only milestones?

💖 S & J