by Lauren Pope
—
Content warning: discussion of porn and sexual violence
My husband showed me a video a few days ago that depicted Jerry Seinfeld as a character from Pulp Fiction. Through the magic of AI manipulation, someone had created a pretty passable scene of Seinfeld confronting John Travolta and Samuel L. Jackson. The joke, of course, rested in the fact that the viewer knew they were watching a deepfake. While I’m sure there will eventually be some sort of restriction on using an actor’s likeness for unauthorized videos, there was no intent to deceive the audience or besmirch Jerry Seinfeld’s reputation.
The same cannot be said about the disturbing trend of deepfake and AI pornography. For those unfamiliar with this phenomenon, the technology now exists to create pornographic material that appears to depict anyone with enough video content available online. This is not limited to celebrities, though that would be disturbing enough. Everyone from Twitch streamers to YouTube personalities to TikTok dancers can have their likeness stolen and transposed into non-consensual sexual material. This is illegal in only a small handful of states. Anywhere else, someone could take your persona and create and distribute videos or pictures of you engaging in whatever acts they so desire.
Obviously, this is extraordinarily dehumanizing. Our laws recognize the harm that “revenge porn” does to its victims, and there is little, if any, practical difference in the harm done by the digitally-imagined version. Although we may be more than simply our bodies or our voices, we are also not less than them. If someone creates a plausible image of another person performing a non-consensual sexual act, a sexual violation has occurred.
We recognize that each viewing of a non-consensual sexual image is a repeat violation. How much more of a violation is it, then, when someone can create new violations at whim? Worse still, because these videos so often target women who have a public voice, they seem very much intended to shame them into silence. According to Sensity AI, about 90% of deepfake videos worldwide are non-consensual pornography targeting women.
Beyond all of this, there’s a growing movement of AI porn that harvests pictures from various corners of the internet to generate a composite image based on a user’s specifications. Although the violation here is diffused across, well, all of us, so too is the potential for dehumanization. If we accept a society in which someone can create a personalized sex object on demand, what then are those users’ expectations for people in the real world?
We already know that consuming violent sexual media correlates with a person’s likelihood of committing a sexual crime. The key word here is consume. In this scenario, there is no person behind the image. It is simply a unit of titillation designed to look like a person. There can, by definition, be no human intimacy here. A composite of photos of muscles and fat and hair is thrown up on a screen to be consumed by a user who has no emotional connection to the pixels before them.
But this reduction cannot help but cause users and distributors to start seeing living, breathing people as nothing more than the sum of their parts. In a world where we already struggle with body dysmorphia and violent sexual expectations, do we really need another medium encouraging people to degrade us?
My hope is that we can turn away from this precipice and remember our shared humanity. We are not just images and body parts. We are humans, capable of beauty and creation. We have empathy and compassion and anger and passion. We are not marionettes, and it would be a tragedy if we allowed ourselves to be diminished by AI strings.