Law enforcement agencies are raising concerns about voice-cloning AI tools, warning consumers that scam callers are using the tech to mimic loved ones’ voices in elaborate phone call frauds.
New AI systems can create realistic audio clips from just a short snippet of a target's voice. Scammers can easily pull that audio from content posted online and reportedly use cloned voices of victims' loved ones to pressure victims into forking over cash.
KIRO7 reported last week that a family in Tacoma, Wash., recently received a call from a scammer impersonating their 16-year-old daughter and claiming she had been involved in a serious car accident. The news station reported that the scammer, using voice-cloning software, demanded at least $10,000 for the daughter's safe return.
The incident, which involved local law enforcement, prompted the county's Sheriff's Department to issue a warning about the rise of phone scams using voice-cloning software.
“Artificial intelligence is no longer a far-fetched idea out of a sci-fi movie,” said the post from Pierce County, originally published by the FTC in March. “We’re living with it, here and now.”
The FTC noted that consumers should create plans with their family members to avoid falling victim to scam calls that use voice-cloning tools. If targeted, victims should verify a loved one's situation by calling a known phone number, or try to reach another family member or close friend. The agency also advised watching out for requests that obscure the money trail, such as wiring money, paying in cryptocurrency, or sharing gift card details.
The release of new voice-cloning tools comes amid an ongoing onslaught of fraud and spam calls. A recent report by Seattle caller identification startup Hiya found that during the first quarter of this year, nearly 25% of all unknown calls globally were flagged as spam or fraud — that’s 73.6 million per day.
Government officials are trying to crack down on these illegal robocalls. The FCC declared the matter a top priority last year and implemented fines and policies to combat fraudulent calls.
In April, Washington state lawmakers passed new legislation to limit robocalls and give residents and the state’s attorney general the ability to sue companies for unsolicited calls. The legislation is part of a broader effort initiated last year by state AG Bob Ferguson to battle the often annoying calls and to aid victims targeted by scams.