New Scam In Town: AI Voice Cloning Scam Fools Victims Into Giving Thousands

Earlier, scams came in the form of fake-looking SMS messages claiming you had won a jackpot or a mansion, or calls asking for your credentials in exchange for some offer. Those were scams you could ignore and avoid with a little awareness. But today, in the era of AI, scams have become far more clever and personal. Meet your new threat: the AI voice cloning scam.

In the six months since OpenAI first released ChatGPT to the public and set off an artificial intelligence arms race, a new breed of cybercriminals has been among the first to profit.

These cutting-edge thieves use high-tech tools and methods to steal hundreds of thousands of dollars from regular folks like you and me. Cybercriminals have developed a new kind of swindle built on newly available artificial intelligence (AI) voice cloning technologies. With only a short audio sample, they can virtually clone anyone’s voice and use it to deliver fake voicemails and voice messages. This type of AI voice scam is growing rapidly because AI voice generators are available with just one click.

Scam in Seconds

Anyone with a dollar, a few minutes, and an internet connection can now recreate a stranger’s voice. What’s at risk is our ability, as ordinary people, to trust that the voices of those we connect with remotely are authentic. We may someday live in a world where it is difficult to tell whether a call from your mother or your employer is really coming from them.

Although AI voice cloning scams may not yet be widespread, the problem is visible on the horizon. According to some experts, now is the time to create safeguards with your loved ones against the possibility that your voices are synthesized, such as code phrases or a form of human two-factor authentication.

You can anticipate the kind of messages cybercriminals produce: distressed and in a rush. Using the cloning technique, they imitate a victim’s friend or family member, leaving a voicemail that claims the person has been in a car accident, been robbed, or otherwise come to harm. In every case, the fake message claims they urgently need money.

Is It a Serious Issue?

In a recent worldwide poll of 7,000 people, one in four respondents said they had fallen victim to an AI voice cloning scam themselves or knew someone who had.

These voice-clone messages need only a small sample of a person’s voice and a script written by a cybercriminal. In the same poll, 70% of respondents said they were not sure they could tell the difference between a cloned voice and the real thing.

Overall, the strategy has so far proven fairly successful. In the same poll, one in ten respondents reported receiving a message from an AI voice clone, and 77% of those victims said they suffered financial losses as a result.

A Real Example

We come across studies like these all the time, and the question of credibility always lingers. But when it comes to AI voice scams, there are plenty of real-life incidents to back them up.

One day, Ruth Card received an unexpected call. The person on the other end sounded exactly like her grandson and was crying, claiming he was in jail. Her “grandson” also said he didn’t have his phone, wallet, or cards on him and needed money for bail.

Frightened, Ruth, 73, and her husband Greg Grace, 75, rushed to their bank in Regina, Saskatchewan, and withdrew 3,000 Canadian dollars.

However, as they were about to withdraw more money at a second branch, the manager pulled them aside and told them that another customer had received a similar call: all of it was an AI voice scam made possible by AI voice generators.

Fortunately, in this particular incident there was no permanent or major loss of money or valuables, but there are many cases in which there is.

Are There Laws and Regulations?

Advances in artificial intelligence have introduced a disturbing new element, making it possible for bad actors to mimic a voice with just a short audio sample, such as a few spoken phrases. A plethora of low-cost, AI-powered web programs can turn an audio clip into a voice clone, letting a con artist make whatever they write “speak.”

According to experts, government authorities, law enforcement, and the courts lack the tools needed to stop the growing swindle. Most victims have few leads to follow, and it is difficult for authorities to trace calls and money back to international fraudsters. There is also little legal precedent for courts to hold the makers of these tools responsible for their misuse. With AI voice cloning scams picking up pace, laws and regulations need to do the same to control them.

Another issue is that AI voice generators can be found with a single search, and they have extremely simple interfaces. Once a voice sample is fed in, a scammer is ready to begin.

Be Safe, and Be Aware!

If the caller claims to be from law enforcement, tell them you have a terrible connection and will call them back. Ask what organization they are calling from (college security, the local jail, the FBI), then hang up, even though con artists will say almost anything to keep you on the line. If you cannot reach your loved one, call the local police or look up the facility’s phone number yourself to verify what is happening.

Pro Tip! Agree on a secret passcode or phrase with your close circle and use it on every call to avoid such incidents.
