Last Updated on January 11, 2024 by SPN Editor
The Federal Trade Commission (FTC) is soliciting innovative solutions to safeguard consumers from the fraudulent use of AI voice cloning technology. The individual or organization that proposes the most effective solution will be awarded a $25,000 prize.
AI voice cloning technology, while improving rapidly, is increasingly being exploited for phone scams and fraud. The FTC, responsible for protecting consumers from such schemes, is finding it hard to keep pace with the technology's advances.
Despite the controversies surrounding AI voice cloning, there are valid reasons to continue developing this technology. For instance, it can restore the ability to communicate for someone who has lost their voice or provide a voice for a beloved movie character after the original voice actor has passed away.
Last year, New York Mayor Eric Adams utilized an AI clone of his voice to converse with city residents in multiple languages.
However, this technology has also simplified the process for scammers and fraudsters to deceive unsuspecting individuals over the phone. A report by McAfee last year highlighted a rise in AI voice cloning fraud, and the FTC issued a warning to consumers about “family emergency scams” where a caller impersonates a distressed family member.
Distinguishing an AI voice clone from a real person is extremely difficult; often the only reliable check is to end the call and redial the person on a known number.
In response to this, the FTC has launched a challenge inviting individuals or organizations to propose “innovative ideas aimed at preventing, monitoring, and evaluating the malicious use of AI voice cloning technology”. The challenge, which runs from January 2 to January 12, offers a $25,000 reward for the winning idea.
The proposed solutions must address at least one of the following three intervention points identified by the FTC:
- Prevention or authentication: The solution should restrict the use or application of voice cloning software by unauthorized users.
- Real-time detection or monitoring: The solution should be able to detect cloned voices or the use of voice cloning technology.
- Post-use evaluation: The solution should be able to verify if an audio clip contains cloned voices.
Samuel Levine, Director of the FTC’s Bureau of Consumer Protection, stated, “We will use every tool to prevent harm to the public stemming from abuses of voice cloning technology.”
Stephanie T. Nguyen, the FTC’s Chief Technology Officer, indicated that the challenge is designed to ensure that companies are held accountable for the direct and indirect effects of the products they release.
However, achieving this may be difficult or even impossible. Even if reputable companies embed a “watermark” in the audio generated by their tools, scammers can still exploit freely available open-source solutions.
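The "watermark" idea can be illustrated with a toy sketch: a vendor hides a known bit pattern in the low-order bits of the audio samples, and a detector later checks for that pattern. The pattern and function names below are hypothetical, and this is a deliberately minimal illustration; real watermarking schemes are far more robust (designed to survive compression and editing), and, as noted above, open-source tools can simply omit the watermark altogether.

```python
# Toy illustration of audio watermarking: hide a known bit pattern
# in the least significant bit (LSB) of each 16-bit PCM sample.
# Hypothetical example only -- not a production scheme.

WATERMARK = [1, 0, 1, 1, 0, 0, 1, 0]  # hypothetical vendor signature

def embed_watermark(samples):
    """Overwrite the LSB of each sample with the repeating watermark."""
    out = []
    for i, s in enumerate(samples):
        bit = WATERMARK[i % len(WATERMARK)]
        out.append((s & ~1) | bit)  # clear the LSB, then set it to the bit
    return out

def detect_watermark(samples):
    """Return True if the samples' LSBs match the watermark pattern."""
    if len(samples) < len(WATERMARK):
        return False
    return all(
        (s & 1) == WATERMARK[i % len(WATERMARK)]
        for i, s in enumerate(samples)
    )

clip = [120, -340, 55, 9001, -12, 777, 0, 42, 256, -1024]
marked = embed_watermark(clip)
print(detect_watermark(marked))  # True
print(detect_watermark(clip))    # False: unmarked audio lacks the pattern
```

The obvious weakness, and the reason the article is skeptical, is that detection only works if the generating tool cooperated in embedding the mark in the first place.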
If you’ve ever sent a voice note, left a message on an answering machine, or posted a video of yourself on YouTube, your voice could potentially be cloned.
As generative AI continues to evolve, our best defense may be to become more skeptical and not readily trust what we see and hear.
How can I protect myself from AI voice cloning fraud?
Safeguarding yourself against AI voice cloning fraud necessitates a combination of vigilance, critical thinking, and proactive measures. Here are some strategies you can employ:
Confirm the Caller’s Identity:
When receiving a call from someone claiming to be a friend or relative, especially from an unfamiliar number, take steps to verify their identity. Asking a personalized question that only they can answer is one way to confirm they are who they claim to be.
Establish a Secure Phrase:
Create a secure phrase that is known exclusively to you and your close associates. This can serve as an additional layer of verification when communicating with others, helping to ensure that you are interacting with the intended person.
Restrict Voice Sharing:
Exercise caution in sharing your voice or video content online. Cybercriminals can exploit such information to mimic your voice for fraudulent activities. Limiting the exposure of your content can mitigate the risk of voice cloning.
Make Social Media Accounts Private:
Increase your privacy settings on social media platforms to restrict access to your voice or video content. By making your accounts private, you reduce the chances of your personal information being exploited by potential fraudsters.
Verify Information Independently:
In situations where a caller claims that a family member is in distress and urgently requires financial assistance, independently verify this information before taking any action. Contact the family member or another reliable source to confirm the authenticity of the request.
Stay vigilant and cultivate a healthy level of skepticism, particularly when dealing with unexpected or urgent calls. Be attuned to inconsistencies in vocal patterns and to undue urgency in the message, and be cautious with calls from unfamiliar numbers.