Sen. Tracy Pennycuick: Addressing the dangers of AI-generated deepfakes in Pennsylvania
In August, Lancaster County police launched an investigation into a disturbing case involving 20 female high school students. The perpetrator took real photos of these teenage girls, used artificial intelligence (AI) to generate nude “deepfake” images, and distributed them on the internet. Despite the clear harm caused, the district attorney pointed out a critical problem: a gray area in the law prevented charges from being filed in cases like these.
This incident is far from unique. We are witnessing a troubling rise in AI-generated sexual images of both minors and non-consenting adults. This technology can be used to create photos and videos that depict individuals in explicit scenarios that never occurred, with results nearly indistinguishable from real images.
Unfortunately, these deepfake images were not explicitly covered by existing state laws, including our child sexual abuse statutes.
Until recently, for example, it was not illegal for a friend, colleague, or even a stranger to take photos from someone’s public social media profile, use AI to create explicit content, and then distribute the “deepfakes” online. Shockingly, some websites have even published realistic AI-generated sexual images of children.
As AI technology advances, it offers significant benefits to our daily lives, from healthcare innovations to improvements in transportation and business operations. But with this progress come serious risks and unintended consequences. The National Institute of Standards and Technology has already called for federal standards to address the potential misuse of AI. However, Congress has yet to fully address the dangers posed by AI-generated content.
Here in Pennsylvania, as chair of the Senate Communications and Technology Committee, I introduced Senate Bill 1213 to address the alarming rise of AI-generated deepfake sexual images of children and non-consenting adults. Although existing state law prohibited the distribution of intimate images without consent, it did not clearly address images created with AI deepfake technology. This loophole left many Pennsylvanians vulnerable to a new form of digital abuse, as seen in the recent case in Lancaster County.
The bill also explicitly prohibits the use of AI to generate child sexual abuse material — previously referred to as “child pornography.” With the changes contained in SB 1213, law enforcement now has the ability to prosecute individuals who generate and disseminate these types of child sexual abuse materials.
I’m pleased to report that the Pennsylvania General Assembly passed Senate Bill 1213, which was signed into law earlier this week. For the first time in Pennsylvania’s history, legislation combating the prevalent and highly disturbing AI-generated “deepfake” images of minors and AI-generated child sexual abuse material is now law.
This bipartisan effort garnered widespread support, including from the Pennsylvania Attorney General and district attorneys throughout the commonwealth.
AI technology has incredible potential for good, but it can also be exploited. Pennsylvania needs strong laws to protect its citizens from those who use this technology to generate sexual images without consent, particularly child sexual abuse materials. With the passage of SB 1213, we are sending a clear message: the insidious use of AI to harm others will not be tolerated in our state.
And most importantly, innocent victims, like the high school girls in Lancaster County, will be able to seek justice.
Sen. Tracy Pennycuick represents the 24th Senatorial District, covering parts of Berks and Montgomery counties.