‘Take It Down Act’ targets deepfake perverts exploiting teens online

Apr 29, 2025 - 21:28

Elliston Berry was 14 years old when a classmate used an AI editing app to turn her social media photo into a deepfake nude. He circulated the fake image on Snapchat. The next day, similar deepfake images of eight more girls spread among classmates.

The victims’ parents filed a Title IX complaint. Authorities charged the student who created the images with a class A misdemeanor. Still, the deepfake nudes stayed online. Berry’s mother appealed to Snapchat for more than eight months to remove the images. Only after U.S. Sen. Ted Cruz (R-Texas) personally contacted the company did Snapchat finally take the pictures down.

As AI becomes cheaper and more accessible, anyone can create exploitative digital content — and anyone can become a victim. In 2023, one in three deepfake tools allowed users to produce AI-generated pornography. With just one clear photo, anyone could create a 60-second pornographic video in under 25 minutes for free.

The explosion of deepfake pornography should surprise no one. Pornography accounted for 98% of all online deepfake videos in 2023. Women made up 99% of the victims.

Even though AI-generated images are fake, the consequences are real — humiliation, exploitation, and shattered reputations. Without strong laws, explicit deepfakes can haunt victims forever, circulating online, jeopardizing careers, and inflicting lifelong damage.

First lady Melania Trump has made tackling this crisis an early priority — and she’s right. In the digital age, technological advancement must come with stronger protections for kids and families online. AI’s power to innovate also carries a power to destroy. To curb its abuse, the first lady has championed the Take It Down Act, a bipartisan bill sponsored by Cruz and Sen. Amy Klobuchar (D-Minn.).

The bill would make it illegal to knowingly publish “nonconsensual intimate imagery” depicting real, identifiable people on social media or other online platforms. Crucially, it would also require websites to remove such images within 48 hours of receiving notice from a victim.

The Take It Down Act marks an essential first step in building federal protections for kids online. Pornography already peddles addiction in the guise of pleasure. AI-generated pornography, created without the subject’s knowledge or consent, takes the exploitation even further. Deepfake porn spreads like wildfire. One in eight teenagers ages 13 to 17 knows someone who has been victimized by fake nudes.

The bill also holds AI porn creators accountable. Victims would finally gain the legal means to demand removal of deepfake images from social media and pornography sites alike.

Forty-nine states and Washington, D.C., ban the nonconsensual distribution of real intimate images, often called “revenge porn.” As AI technology advanced, 20 states also passed laws targeting the distribution of deepfake pornographic images.

State laws help, but they cannot fully protect Americans in a borderless digital world. AI-generated pornography demands a federal solution. The Take It Down Act would guarantee justice for victims no matter where they live — and force websites to comply with the 48-hour removal rule.

We are grateful that the first lady has fought for this cause and that the Senate has acted. Now the House must follow. With President Trump’s signature, this critical protection for victims of digital exploitation can finally become law.
