Two lawmakers have reintroduced a bill that would make the nonconsensual sharing of digitally altered pornographic images a federal crime.
The renewed push on Tuesday to pass the “Stopping Deepfakes of Intimate Images Act” was led by Rep. Joseph Morelle, according to The Wall Street Journal.
The Democratic New York congressman first authored the act in May 2023, saying in a press release at the time that it was created “to protect the right to privacy online amid a rise of artificial intelligence and digitally-manipulated content.”
Morelle’s renewed push for the “Stopping Deepfakes of Intimate Images Act” now includes a co-sponsor, Rep. Tom Kean, a Republican from New Jersey, The Journal reported.
The congressmen created their respective bills in the wake of an incident at Westfield High School in NJ, where AI-generated pornographic images of female students were circulated by male classmates without their consent.
On Oct. 20, Westfield High School Principal Mary Asfendis confirmed the troubling event to the parents of each of the school’s roughly 1,900 students after girls reported the photos to school administrators.
“This is a very serious incident,” Asfendis wrote. “New technologies have made it possible to falsify images and students need to know the impact and damage those actions could cause to others.”
The reintroduced legislation was referred to the House Committee on the Judiciary, but the committee has yet to decide whether to pass the bill.
Kean has also been outspoken about putting guardrails around AI, introducing the AI Labeling Act of 2023 in November, which would require creators to disclose whether images or written content were created using AI.
Aside from making the sharing of digitally altered intimate images — also known as “deepfakes” — a crime, Morelle and Kean’s proposed legislation would also allow victims to sue offenders in civil court.
When The Post sought comment from Morelle, his office shared a statement on his renewed push for the bill, in which he said: “Deepfake pornography is sexual exploitation, it’s abusive, and I’m astounded it isn’t already a federal crime. My legislation will finally make this dangerous practice illegal and hold perpetrators accountable.”
More than 30 female students at Westfield High School allegedly fell victim to the deepfake scheme, though it wasn’t immediately clear how many students were involved in creating the fake nude images, or whether any disciplinary action had been taken.
At a high school in Miami, however, two boys were suspended for creating and spreading images so disturbing that several victims didn’t want to return to class.
The perpetrators obtained headshots of the students — both female and male — from the school’s social media account and used an AI deepfake app to create the nude images that were then shared on social media.
According to visual threat intelligence company Sensity, more than 90% of deepfake images are pornographic.
Another concerning report, from University College London, said that humans are unable to detect over a quarter of deepfake speech samples generated by AI.
The UCL researchers warned that deepfake technology is only getting stronger, as the latest pre-trained algorithms “can recreate a person’s voice using just a 3-second clip of them speaking.”
In an example of how convincing this technology can be, several Taylor Swift fans were reportedly scammed out of hundreds of dollars earlier this month after tricksters released advertisements using AI-generated video of the Grammy winner peddling Le Creuset cookware in an attempt to steal money and data from fans.
The ads — which can be found across all social media platforms — show Swift, 34, standing next to a Le Creuset Dutch oven, which, according to the official website, runs anywhere from $180 to $750 depending on the size and style.
Earlier this year, other deepfake images of Pope Francis in a Balenciaga puffer jacket and Donald Trump resisting arrest also took the internet by storm.