Survey Reveals Shocking Statistics on AI Voice-Cloning Scams – Are You at Risk?

Cybercriminals are using new AI voice-cloning tools to impersonate people’s friends and family members over the phone and run financial scams. With an audio sample as short as three seconds, these tools can clone a voice with up to 85% accuracy. The scammers then send voice messages asking for money, citing fake emergencies such as accidents or robberies. More than a quarter of the people surveyed had experienced this type of scam directly or knew someone who had. The research also found that voice data is easy to harvest from social media posts, giving scammers a ready supply of samples, and several startups now offer open-source voice-cloning tools that work almost instantaneously.

The U.S. Federal Trade Commission (FTC) is launching the Voice Cloning Challenge to promote ideas that protect consumers from the misuse of AI-enabled voice-cloning technology. The goal is to address consumer harms before they reach the marketplace, complementing the agency’s power to enforce the law when harms do occur. The challenge will award $25,000 to the winner, and submissions will be accepted online through the challenge website from January 2 to January 12, 2024.

Submissions will be evaluated on the following criteria: Administrability and Feasibility to Execute, Increased Company Responsibility, Reduced Consumer Burden, and Resilience.

The challenge seeks solutions at three points of intervention: prevention or authentication to block unauthorized voice cloning, real-time detection and monitoring of cloned voices, and post-use evaluation of audio clips (a toy sketch of this last idea follows below). It also aims to address the harms of AI-enabled voice cloning, such as fraudulent extortion scams targeting families and small businesses, the appropriation of voice artists’ voices in ways that threaten their livelihoods and deceive the public, and the broader misuse of biometric data and creative content.
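To make “post-use evaluation” concrete, here is a minimal, hypothetical sketch of a real-vs-cloned audio classifier. This is not the FTC’s or any challenge entrant’s method: the file paths are placeholders, and the MFCC features and logistic regression are deliberately simplistic stand-ins. It assumes the librosa and scikit-learn libraries are installed.

```python
# Toy sketch: score an audio clip as genuine vs. cloned ("post-use evaluation").
# Assumptions (not from the article): librosa and scikit-learn are available,
# and you have a small labeled corpus of clips; every path below is a placeholder.
import numpy as np
import librosa
from sklearn.linear_model import LogisticRegression


def clip_features(path: str, sr: int = 16000) -> np.ndarray:
    """Summarize a clip as the mean and std of its MFCCs -- deliberately simple."""
    y, _ = librosa.load(path, sr=sr, mono=True)
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=20)
    return np.concatenate([mfcc.mean(axis=1), mfcc.std(axis=1)])


# Hypothetical labeled training clips: 0 = genuine, 1 = cloned.
train_paths = ["real_01.wav", "real_02.wav", "cloned_01.wav", "cloned_02.wav"]
train_labels = [0, 0, 1, 1]

X = np.stack([clip_features(p) for p in train_paths])
clf = LogisticRegression(max_iter=1000).fit(X, train_labels)

# Evaluate a suspicious voicemail after the fact.
features = clip_features("suspicious_voicemail.wav").reshape(1, -1)
print(f"Estimated probability the clip is cloned: {clf.predict_proba(features)[0, 1]:.2f}")
```

Serious detectors use far richer features and large labeled corpora, but the shape is the same: featurize the clip, then score it.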

A study by McAfee Labs found that 36% of people who reported losing money lost between $500 and $3,000, while 7% were taken for amounts between $5,000 and $15,000. Cybercriminals can easily source original voice files to create clones, as 53% of adults share their voice data online or in recorded notes at least once a week. Nearly half of respondents would reply to a voicemail or voice message purporting to be from a friend or loved one in need of money, particularly if they thought the request had come from their partner, spouse, mother, or child. Cybercriminals often ask for difficult-to-trace payment methods, such as gift cards, wire transfers, reloadable debit cards, and cryptocurrency.

To protect yourself from AI voice-clone attacks, set a unique codeword with trusted family and friends that a caller must give before you send money. Question the source of any request, even a voicemail or text from a familiar number. Be cautious about sharing personal information on social media, and set your profiles to “friends and family” only. Protect your identity by using monitoring services that alert you if your personal information is exposed on the dark web, and clear your name from data broker sites, since scammers may have obtained your phone number there.


Links

FTC Announces Exploratory Challenge to Prevent the Harms of AI-enabled Voice Cloning

The FTC Voice Cloning Challenge

FTC Now Accepting Submissions for Voice Cloning Challenge

Artificial Imposters—Cybercriminals Turn to AI Voice Cloning for a New Breed of Scam
