June 14, 2023

AI and deepfake identity theft could become illegal in New Jersey

A proposed law would impose fines and jail time on those who use digitally altered media to impersonate and defraud

As the use of artificial intelligence through chatbots and photo generators becomes more popular and accessible, some New Jersey lawmakers are cracking down on the creation of deepfakes — pictures and videos of real people digitally altered to look like someone else.

The bipartisan bill, introduced earlier this month by Sen. Doug Steinhardt (R-Warren) and Sen. Brian Stack (D-Hudson), would update New Jersey's existing identity theft law to include provisions about AI-generated images and deepfakes. Deepfakes are often used maliciously, either to spread misinformation or to create pornographic material without the consent of the person whose likeness is used.

The legislation would not ban the use of deepfakes or other artificial intelligence software entirely. Rather, it would criminalize the use of deepfake technology to impersonate or falsely depict individuals, groups or organizations in order to obtain money, benefits or services.

If passed, punishment would be tied to the severity of the theft. Those who use the technology to defraud a single person could face up to 18 months in jail; the penalty rises to four years if four people are defrauded and five years if five or more people are impacted.

"As artificial intelligence tools have grown increasingly more powerful and available to the general public, they've opened the door for scammers to commit shockingly disturbing new crimes involving identity theft," Steinhardt said. "With very little technical expertise, scammers can download pictures or videos of a person from online sources and run them through AI tools to imitate their voice or generate realistic video of the person saying or doing things that never happened. It's leading to new scams that put both the imitated victim and other parties, including relatives, at risk."

Victims would be able to file civil lawsuits against alleged perpetrators even if they are already being prosecuted criminally. If the perpetrator is convicted, they would be required to issue a public retraction of any false depiction or statement attributed to the victim. 

The bill notes that deepfakes can be used to cause "perceptible individual or societal harm," including misrepresentation, damage to a person's reputation, harassment, financial loss, incitement of violence or interference in official proceedings, like courts or elections. 

Recent pushes to regulate the technology concern deepfake porn, in which photos or videos of a person are inserted into existing pornographic material without their consent. But advocates have been pushing for regulation for several years, also noting how the technology can be used to apply for jobs and impersonate political candidates; one viral example came in 2018, when director Jordan Peele teamed up with Buzzfeed to create a deepfake video of former President Barack Obama delivering a public service announcement about fake news.

In March, Sen. Kristin Corrado, a Republican serving portions of North Jersey, introduced a bill that would subject deepfake porn to civil and criminal penalties under New Jersey's revenge porn laws, though it has yet to advance out of the Senate Judiciary Committee.

"It should alarm everyone that you can get a call that sounds exactly like a child, grandchild, or friend asking for help or money, but it's not really them," Steinhardt said. "Imagine receiving a pornographic picture or video in your inbox of someone who looks just like you with the warning that it'll be shared with your friends, relatives, and co-workers unless you pay up. While these horrific acts sound like science fiction, they are happening to victims today." 

Pennsylvania lawmakers have attempted to pass similar legislation against deepfake technology. In 2021, Rep. Nick Pisciottano, a Democrat from Allegheny County, introduced a bill that would bar people from making political deepfakes within 90 days of an election. The bill stalled in the House Judiciary Committee shortly after its introduction. 

The only states with legislation against deepfakes are California, Virginia and Texas. Most of those laws are specific to pornographic deepfakes, while others focus on misinformation that seeks to alter public opinion, according to the Davis Political Review.

Steinhardt and Stack's bill was referred to the Senate Judiciary Committee, where it will receive hearings and a vote. If passed, it will be put before the full Senate later this year.