U.S. President Donald Trump signed a bill making it a federal crime to post “revenge porn” – whether it is real or generated by artificial intelligence. The “Take It Down Act,” passed with overwhelming bipartisan congressional support, criminalizes non-consensual publication of intimate images, while also mandating their removal from online platforms.
What is the Take It Down Act?
The Take It Down Act makes it illegal to “knowingly publish” or threaten to publish intimate images without a person’s consent, including AI-created “deepfakes.” It also requires websites and social media companies to remove such material within 48 hours of notice from a victim, and to take steps to delete duplicate content. Many states have already banned the dissemination of sexually explicit deepfakes or revenge porn, but the Take It Down Act is a rare example of federal regulators imposing requirements on internet companies.
Who supports the Take It Down Act?
The Take It Down Act has garnered strong bipartisan support and has been championed by Melania Trump, who lobbied on Capitol Hill in March, saying it was “heartbreaking” to see what teenagers, especially girls, go through after they are victimized by people who spread such content.
The Act was inspired by Elliston Berry and her mother, who sought help from lawmakers after Snapchat refused for nearly a year to remove an AI-generated “deepfake” of the then 14-year-old.
Meta, which owns and operates Facebook and Instagram, supports the legislation.
Why is the new law important?
An online boom in non-consensual deepfakes is outpacing efforts to regulate the technology around the world, driven by a proliferation of AI tools, including photo apps that digitally “undress” women.
While high-profile politicians and celebrities, including singer Taylor Swift and Democratic congresswoman Alexandria Ocasio-Cortez, have been victims of deepfake porn, experts say women not in the public eye are equally vulnerable.
A wave of AI porn scandals has been reported at schools across the United States, with hundreds of teenagers targeted by their own classmates.
Such non-consensual imagery can lead to harassment, bullying or blackmail, sometimes causing devastating mental health consequences, experts warn.
Experts say that the bill is a “significant step” in addressing the exploitation of AI-generated deepfakes and non-consensual imagery.
(With inputs from agencies)
Published – May 20, 2025 12:26 pm IST