Pennsylvania has become the first state to approve a bill that cracks down on AI-generated child sexual abuse material by outlawing the distribution of sexually explicit deepfakes.
Deepfakes are images, videos and audio that are generated by artificial intelligence tools to depict both real and non-existent people.
In June, the Pennsylvania Senate unanimously approved Senate Bill 1213. The legislation makes it a crime to harass someone by distributing a deepfake image of them without their consent while in a state of nudity or engaged in a sexual act. The offense would be more serious if the victim is a minor.
In October, the state House passed the bipartisan bill which is now Act 125 of 2024.
On Monday, Dec. 9, Sen. Tracy Pennycuick (R – Berks and Montgomery counties), one of the bill's sponsors, joined Montgomery County District Attorney Kevin Steele and Angela Sperrazza, Pennsylvania's Chief Deputy Attorney General of the Child Predator Section, to discuss the new law.
“What we are witnessing is the troubling rise in AI-generated sexual deepfake images of minors and non-consenting adults,” Sen. Pennycuick said. “What’s most alarming is that the children are the prime targets.”
Officials said the new law will hold the creators of pornographic deepfakes accountable, even if they’re underage.
“This is a crime and they will be held accountable for it,” Steele said. “Crimes go to adults and juveniles within the system.”
The law also allows officials to prosecute creators of sexually explicit deepfakes the same way they prosecute child pornographers, by replacing statutory references to "child pornography" with references to "child sexual abuse material."
“This new law has removed a substantial obstacle for us to achieve justice for those families,” Steele said.
While Pennsylvania's Act 125 is the first law of its kind, federal lawmakers aren't far behind. In July, the U.S. Senate unanimously passed the Defiance Act, a federal bill that would allow victims of nonconsensual sexually explicit deepfakes to sue people who create, share and receive the images. The legislation is still awaiting a vote in the House.
The problem of deepfakes isn't new, but experts say it's getting worse as the technology to produce them becomes more widely available and easier to use.
Researchers have been sounding the alarm on the explosion of AI-generated child sexual abuse material using depictions of real victims or virtual characters. Last year, the FBI warned it was continuing to receive reports from victims, both minors and adults, whose photos or videos were used to create explicit content that was shared online.
Several states have passed their own laws to try to combat the problem, such as criminalizing nonconsensual deepfake porn or giving victims the ability to sue perpetrators for damages in civil court.