While lawmakers on Capitol Hill have released a roadmap for effectively and ethically dealing with the rising challenges of Machine Learning and Generative AI technologies, very little has yet been done about the ubiquity of non-consensual NSFW media this technology can create. From AOC to Taylor Swift to kids in high school, no one is safe from technology like deepfakes, inpainting, or image generators, which allow some creep to turn them into porn stars.
In a 2023 study by Home Security Heroes, a company dedicated to protecting people from ID fraud, the data revealed that 98% of all deepfake videos online are pornographic, and 99% of those feature women. At the moment, there is no hard data on how many of these NSFW deepfake videos and images depict minors. However, stories about faked images, including fake nudes, being passed around high schools have made headlines in recent years.
The 'Take It Down' Act
To combat the increasingly pervasive problem of AI-generated NSFW content, legislators recently introduced a bipartisan bill called the "Take It Down" Act that uses common-sense solutions to protect people, especially minors, from this digital-age crime. You can read a one-page explainer here.
Suspicion of any anti-porn legislation is understandable, especially bills that purport to "protect children," which tend to use kids as a pretext for banning all porn and sex work. But so far, this bill is a rare exception. What is shocking about it is not just the lack of hidden dangers to adults, Queer folks, or Sex Workers, but who is spearheading it.
Ted Cruz. Yeah. That Ted Cruz. Hardcore Conservative, widely despised, leaves-his-constituents-during-an-emergency Ted Cruz is doing a genuinely good thing.
To better understand this baffling phenomenon, let’s take a deeper look at the "Take It Down" Act itself and what it means for the future of AI porn.
What is the 'Take It Down' Act trying to address?
For people who are not familiar with Machine Learning (ML) or Generative Artificial Intelligence (Generative AI), and there seem to be many, here’s a quick explainer of the tech and how it’s being used to create Non-Consensual Intimate Imagery (NCII).
Contrary to popular belief, what we call AI today is not Skynet or some other evil pop sci-fi villain. Think of it more like the predictive text function we have all had on our phones for years.
ML models are trained on massive amounts of data mined from the internet and use what they learn to give the user whatever the prompt asks for. It works much like predictive text: your phone learns what you are most likely to type next, and that you aren’t saying “There’s nowhere to ducking lark” as you circle the block. The more it gets used and the more data it gets, the better it becomes.
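To make the analogy concrete, here’s a toy Python sketch of the predictive-text idea: count which word tends to follow which, then suggest the most frequent follower. This is purely illustrative; real generative models are neural networks trained on billions of examples, not simple word counts.

from collections import Counter, defaultdict

# Learn from "training data": count which word follows which.
training_text = "there's nowhere to park there's nowhere to turn".split()

followers = defaultdict(Counter)
for current_word, next_word in zip(training_text, training_text[1:]):
    followers[current_word][next_word] += 1

def predict_next(word):
    # Suggest the most common word seen after `word`, if any.
    counts = followers.get(word)
    return counts.most_common(1)[0][0] if counts else None

print(predict_next("nowhere"))  # -> "to"
print(predict_next("to"))       # -> "park": the more data it sees, the better its guesses

The same learning-from-data principle, scaled up enormously, is what lets an image model produce a convincing fake from a handful of photos.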
When Generative AI became available to the public in 2022 and started producing images, the quality wasn’t great. Hands looked weird, eyes went in every direction, and face replacements looked like someone wearing a Halloween mask.
These days, however, it takes a person with ill intentions less than a minute to make reasonably convincing fake porn from images of anyone they choose, be it a celebrity or a random stranger.
While many programs, especially popular ones like Midjourney, have protections in place, they are easily defeated with a quick tutorial. There are also plenty of free apps and sites where the lack of protection is the selling point.
To be clear, the issue here is not porn itself. The issue is how easy it is to make fake pornographic images of people and the unwillingness of social media sites like Snapchat and X (formerly known as Twitter) to take those images down. That unwillingness is actually how this all started and why Cruz is so heavily involved.
Late last year, high school student Elliston Berry became a victim of this technology. Only 14 at the time, she awoke to find faked nude images of herself posted on Snapchat and being shared around her school.
Beyond shocked and horrified, Berry and her mother, Ana McAdams, tried desperately to get Snapchat to take the images down. The bureaucratic response they got meant it took around eight months to have the images removed. Snapchat only did so, according to Cruz, after the mother and daughter came to him for help and he ordered his team to get him in touch with Snapchat immediately.
Legislative solutions for NSFW deepfakes
Cruz, himself a father of teenage girls, has said that Elliston Berry’s horrific ordeal was the inspiration for this bill. He’s been pretty hands-off about the rampant fake porn featuring Democratic Representative Alexandria Ocasio-Cortez over the last few years, but whatever.
Cruz is not alone in his endeavor to curb the scourge of non-consensual fake porn. A similar bipartisan bill, the "DEFIANCE" Act, which gives citizens the ability to sue the creators and distributors of this media, was being pushed by Democratic Senator Dick Durbin. It should be noted that this method of civil lawsuits is less than ideal: going to court is expensive, time-consuming, makes private information public, and puts the onus of responsibility on the victim.
That bill was blocked by Republican Sen. Cynthia Lummis, who said it was “overly broad in scope” and would “stifle American technological innovation,” according to CNBC.
While Lummis’s block reads like a clear case of shielding Big Tech from having to behave responsibly, the Take It Down Act aims right at companies like Meta, Snapchat, and X. The proposed bill would make websites and platforms liable for hosting the material and require them to develop a system that allows victims to report fake porn and have it taken down within 48 hours. The bill also encourages them to make every reasonable effort to remove any copies of the image or video, including those shared in private groups.
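For a sense of what that 48-hour requirement means operationally, here is a minimal, hypothetical Python sketch of the compliance clock a platform would need to track. The names and structure are invented for illustration; nothing here comes from the bill’s actual text.

from datetime import datetime, timedelta, timezone

# Hypothetical sketch: every report starts a 48-hour clock, and the
# platform is out of compliance if the content is still up afterward.
TAKEDOWN_WINDOW = timedelta(hours=48)

class RemovalRequest:
    def __init__(self, content_id, reported_at):
        self.content_id = content_id
        self.reported_at = reported_at
        self.deadline = reported_at + TAKEDOWN_WINDOW  # when the clock runs out

    def is_overdue(self, now):
        # True once the 48-hour window has elapsed without removal.
        return now > self.deadline

report = RemovalRequest("img_1234", datetime(2024, 6, 18, 9, 0, tzinfo=timezone.utc))
print(report.deadline)                                # 2024-06-20 09:00:00+00:00
print(report.is_overdue(datetime.now(timezone.utc)))  # True once the deadline passes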
These actions would be enforced by the Federal Trade Commission (FTC), which is responsible for consumer protection. The FTC would also have the power to bring criminal charges against the creators of these images.
In a statement made to CNBC, Cruz said “By creating a level playing field at the federal level and putting the responsibility on websites to have in place procedures to remove these images, our bill will protect and empower all victims of this heinous crime.”
During his press conference on June 18th of this year, Cruz pointed out that if a celebrity or politician has fake nudes posted, they have the power to get them taken down. That power is out of reach for most regular people, especially minors.
“We’re all aware that this happened to Taylor Swift,” said Cruz. “And if you happen to be a global popstar, you can get the images pulled down. But, if you’re just a teenager in Texas, or New Jersey, or anywhere else, Big Tech ignores your pleas for help.”
The Take It Down Act is modeled after copyright law, specifically the well-established notice-and-takedown process of the DMCA.
“If you put up The Lion King,” said Cruz, “That can get taken down incredibly quickly… and yet if you put up non-consensual intimate images, the Big Tech companies act as if they have no idea how to do this... This is designed to make it mandatory to take it down. Once passed, and I do believe we will see this passed, we will see immediate compliance just like we do on the copyright side.”
Speaking at the same press conference, Dawn Hawkins, CEO of the National Center on Sexual Exploitation, called out platforms like Reddit, X, and Pornhub for allowing anyone to post anything without consequence.
Reddit and Pornhub both have their issues, but to be fair, Reddit banned deepfake porn in 2018, as did Pornhub around the same time. X, on the other hand, has done very little to fight AI porn fakes.
Despite Hawkins’ callout, this bill is not meant to attack all porn. The focus is limited to AI-generated NCII.
An impressive part of this bill is how it avoids encroaching on lawful free speech. According to the bill’s summary:
“The bill is narrowly tailored to criminalize knowingly publishing NCII without chilling lawful speech. The bill conforms to current first amendment jurisprudence by requiring that computer-generated NCII meet a 'reasonable person' test for appearing to realistically depict an individual.”
When asked about how companies like TikTok or X will respond, Cruz said “...I hope they do the right thing and recognize this is the right solution. Frankly, it’s what they should have been doing anyways.”
Not like those other anti-porn bills
The fact that the "Take It Down" Act is necessary at all reflects the sad state of affairs surrounding AI. The Pandora’s Box that is Generative AI has been opened and there is no closing it.
As it's written right now, the "Take It Down" Act would hold those who profit and benefit from non-consensual AI images accountable and protect those victimized by them. And unlike much of the anti-porn legislation conservatives have tried to introduce, there are no sinister bits in the "Take It Down" Act that would help rich creeps or make life harder for erotica-loving adults.
It’s a good bill and it came from Ted Cruz. Crazy, right?