On Gender-Based Cyber Sexual Violence

By Malak Mansour

Amidst the rise and ever-growing development of artificial intelligence in all its forms, it truly seems as if everything is within a laptop's reach. Powerful AI tools can write essays, code in multiple languages, and solve complex mathematical problems, among other day-to-day tasks. However, that also means AI is now sophisticated enough to generate artificial videos and images of virtually anyone when fed enough data, and often from surprisingly little of it. While bleak to admit, women have once again received the short end of the stick. It didn't take long for people to create scarily accurate deepfakes of women in pornographic contexts, whether as revenge porn or in an attempt to cater to a fantasy of any woman a person desires. It may seem that creating such content would be reserved for the highly technologically literate with malicious intent. However, powerful AI tools render this task accessible and relatively easy for anyone with basic computer skills. The emergence of more AI tools that manipulate and generate visual content appears to be the starting ground for ever more ways to violate women.

In a recent article in The Washington Post, Hany Farid, a digital-image forensics expert and professor at the University of California, Berkeley, explained, "since these models learn what to do by ingesting billions of images from the internet, they can reflect societal biases, sexualizing images of women by default." The author, Tatum Hunter, then provides the example of Lauren Gutierrez, a 29-year-old from Los Angeles who fed the AI portrait app Lensa publicly available photos of herself, such as her LinkedIn profile picture, after which Lensa returned naked images. This can be attributed to the fact that the training data these algorithms ingest includes a great deal of pornographic content available online, which, in turn, means they can easily return NSFW (not necessarily just nude) content without being specifically asked.

Some might argue that the main targets of such content would be celebrities, but as we have seen, it can be applied to any woman with a social media presence. It goes without saying that the non-consensual generation and propagation of such content is itself sexual harassment or violence. This sort of violence can be even more threatening because the perpetrator can maintain complete anonymity: not only does the victim have fabricated videos of herself readily available online, she also does not know their source. This creates a different power structure, one that exerts power and control over a victim from behind a screen and a stable Wi-Fi connection. In an article published in MIT Technology Review, author Karen Hao shares the story of Helen Mort, a UK-based poet and broadcaster, who was subjected to a fake pornography campaign. "It really makes you feel powerless, like you're being put in your place," she [Mort] says. "Punished for being a woman with a public voice of any kind. That's the best way I can describe it. It's saying, 'Look: we can always do this to you.'"

The offense does not end with the release of the videos; it can also extend into these women's professional lives. Women have lost jobs over such content and struggled to find new employment. Image-based sexual abuse is a gendered security issue. The concern with deepfakes is not limited to political campaigns, the initial reason people were alarmed, but also extends to the personal livelihoods of many women. Many online communities perpetuate such media with little surveillance or regulation thus far, Reddit being a notable example: it was the hub of a now-banned subreddit dedicated to creating pornographic celebrity deepfakes. The problem may seem confined to a "few" hundred redditors, but a 2019 analysis by the cybersecurity company Deeptrace found that 96% of all deepfakes online are pornographic and disproportionately target women. This statistic is alarming, to say the least.

It is important to note that there are many legislative and social efforts to combat deepfakes that target and victimize women. Software to detect and recognize pornographic deepfakes is being developed and fine-tuned, but the battle of the AIs continues as both the generation and detection tools keep growing more powerful and nuanced. The grave reality remains that our use of technology has always been gendered, so while it is important to celebrate revolutionary creations and developments, we must stay vigilant and wary of how they are being utilized, without turning a blind eye.
