NJ residents getting scammed by deepfake technology
⚠ Crooks now using AI to copy your voice and your pictures to commit crimes
⚠ All they need is 20 seconds of audio or video
⚠ A push is made to update NJ’s identity theft law
A New Jersey lawmaker is pushing a plan to update the state’s identity theft law to include fraudulent impersonation using artificial intelligence and deepfake technology.
According to state Sen. Doug Steinhardt, R-Warren, deepfake technology uses audio or video to falsely impersonate somebody else.
He said scammers can now record about 20 seconds of someone’s voice and replicate it, producing an exact copy of that voice saying whatever they want, in order to “steal someone’s identity or just steal from them personally, or quite frankly extort some concession from them. That’s the state of the world we’re in today.”
It's your voice but it's not you
“To take a voice and be able to make it sound like I was talking to you or you were talking to me and we would never know the difference, that’s a terrifying thought," Steinhardt said.
“With very little technical expertise, scammers can download pictures or video of a person from online sources and run it through AI tools to imitate their voice or generate realistic video of the person saying or doing things that never happened. It’s leading to new scams that put both the imitated victim and other parties, including relatives, at risk.”
Steinhardt’s legislation, S3926, extends the crime of identity theft to include fraudulent impersonation or false depiction by means of artificial intelligence or deepfake technology. Depending on the severity of the crime, measured by monetary value or number of victims, it could be prosecuted as a second-degree crime punishable by up to 10 years in prison.
Easy pickings for scammers
Many people now have audio and video clips of themselves posted across the web, which means fraudsters can easily access that material and create whatever kind of scenario they want.
“Kids, young adults, even adults, they post things on social media accounts where they’re talking or just having a conversation with someone, and that can be picked up and sampled,” he said.
The FBI has warned about a number of recent scams in which grandparents get calls from what they believe are their grandchildren, pleading for money to supposedly get out of prison or some other horrible predicament.
Phony porn starring you
The FBI has also seen cases where scammers used downloaded pictures to generate deepfake pornography that is then used to blackmail victims in “sextortion” schemes.
“The old-time identity theft of stealing somebody’s signature or ID card or copying a fraudulent document, it’s so much more sophisticated than that these days," Steinhardt said. “Criminals are always a couple of steps ahead of the rest of us, and sometimes the law needs to catch up.”