83% of Indians fell prey to AI Voice Scam, with 48% losing ₹50,000+


According to McAfee's "The Artificial Imposter" report, 83% of Indian victims of AI voice scams lost money, 48% of them lost more than ₹50,000, and 69% of Indians could not differentiate between a human voice and an AI-generated one.

Twitter is swamped with complaints and posts about AI scams, with one user saying, "I have had two scam calls like this today. One AI voice message said $200+ would be deducted from my bank for some reason, and another one was in Chinese, IDKWTF."

Another user took aim at spam callers from Kolkata, saying, "Just wait until scam callers in Kolkata use voice AI to sound like a legitimate customer service rep calling from a reputable company."

Another user expressed concern about how easily a voice can be captured from social media, phone calls, or even the playground, adding, "So just three seconds of your voice opens you up to AI voice scams. This worries streamers with grudges or anyone with a sensitive family."

McAfee conducted a study titled "The Artificial Imposter" (1) covering 7,054 individuals across seven countries, including India, and found that 47% of Indians have experienced or know someone who has fallen victim to an AI voice scam, nearly double the global average.

The cost of falling for such frauds is steep, with 48% of victims losing more than ₹50,000. The researchers also discovered that dozens of AI voice-cloning tools are freely available on the internet, many requiring only basic skill and expertise to use.

These tools enable users to produce an 85% voice match from only three seconds of audio. By training more advanced cloning models on a handful of video recordings available online, they can push the match to 95%.

Cybercriminals worldwide, including in India, are employing AI to clone voices and then send deceptive voice notes and voicemails, or even call a victim's contacts directly while claiming the person is in trouble.

As a result, 66% of Indian respondents said they would respond to a voicemail or voice note that appeared to come from a friend or close relative. Unsurprisingly, the tactic is gaining traction, because scammers know how strongly an emotional appeal lands with an Indian audience.

The scam casts a wide net, with 34-46% of these messages posing as a partner or spouse and 12% posing as the victim's child. Messages stating that the sender has been robbed, has been in an accident, has lost their phone or wallet, or needs assistance while traveling abroad are the most likely to receive a response.

According to the survey, texts claiming the sender had been robbed or involved in a car accident drew responses from 70% of people, followed by a lost phone or wallet (69%) and needing assistance while traveling abroad (62%).

For phone calls, approximately 45% of those surveyed said they would respond to a request for cash, 48% to a car breakdown or robbery, 43% to a lost phone or wallet, and 41% to someone saying they needed assistance abroad.

The similarity between a genuine voice and a cloned one is astonishing. As noted earlier, anyone can now use freely available voice-cloning tools to generate moderate- to high-quality recordings of almost any voice. With AI able to match up to 95% of a person's voice and tone, the result can sound convincingly "real."

People are becoming more skeptical about the integrity of online content due to the rise of deepfakes and disinformation. According to the report, 27% of Indian adults have begun to lose trust in social media platforms, and 43% are concerned about the growing presence of false information.

According to McAfee, voice-cloning scams are on the rise in India, with cybercriminals using artificial intelligence to mimic the voices of friends and family members, aided by the fact that 86% of Indian adults share their voice data online at least once a week.

This makes voice cloning an even more potent tool in criminals' hands, and to complicate matters, the researchers found that 69% of Indian adults could not tell the difference between a cloned voice and the real one, making it easier for scammers to fool and loot people.

Here are some warning signs and tips to help you identify a potential scam call or message:

  • If the caller insists on urgency, treat it as a likely hoax; scammers frequently claim they need immediate help or that something bad will happen otherwise.
  • Be wary when someone asks you to transfer money to a specific account citing an emergency or a personal or medical crisis; complying can cost you money and expose your personal information.
  • If you notice anything out of the ordinary from someone you know, such as unusual distress or a shift in tone suggesting they need help, it could well be a scam.
  • If you get a strange call from a friend or family member asking for your personal information, never share it; it could be a scam or an attempt at a data breach or leak.

McAfee Labs shared its findings from a comprehensive investigation into AI voice-cloning technology and its use by cybercriminals. Since each person's voice is unique, it is the spoken equivalent of a biometric fingerprint and is generally regarded as a trustworthy form of identification.

However, 86% of Indian adults share their voice data via social media or recorded notes at least once a week. Combined with AI-generated voice cloning, this makes the technology a powerful weapon in cybercriminals' arsenals worldwide.

The popularity and usage of AI tools have made it easier to alter photographs, videos, texts, and, most alarmingly, the voices of relatives, close friends, loved ones, and celebrities.

As mentioned earlier, McAfee's report finds that scammers are using AI to clone voices and then send false voicemails and voice notes, or even call a victim's contacts directly while posing as them.

The McAfee team used one of these programs to replicate a researcher's voice and see how the process worked. With just three to four seconds of recorded speech, the free tool generated a convincing clone of the researcher's voice with an estimated 85% match.

On a paid tier, the team recorded 100 prompts at a cost of just 0.0006 per second of generated audio, producing higher-quality output; the next premium plan let them add elements like emotion and accent, rendering the cloned voice nearly indistinguishable from the real one.
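
To put that per-second rate in perspective, here is a minimal back-of-the-envelope sketch in Python. It assumes the quoted 0.0006 figure is a flat per-second price in a single currency unit and that a typical scam voice note runs about 15 seconds; both are illustrative assumptions, not figures from the report.

```python
# Rough cost estimate for generating cloned audio at the per-second rate
# quoted above. Assumption (not from the McAfee report): 0.0006 is a flat
# per-second price in a single currency unit.

RATE_PER_SECOND = 0.0006  # quoted price per second of generated audio

def batch_cost(num_clips: int, seconds_per_clip: float) -> float:
    """Total cost of generating num_clips clips, each seconds_per_clip long."""
    return num_clips * seconds_per_clip * RATE_PER_SECOND

print(f"100 clips of 15 s each: {batch_cost(100, 15):.2f} units")    # 0.90
print(f"One hour of cloned audio: {batch_cost(1, 3600):.2f} units")  # 2.16
```

Even under these assumed numbers, an hour of convincing cloned speech costs only a couple of currency units, which is what makes the economics of these scams so attractive to criminals.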

With more investment and effort in voice cloning, matching capabilities will only improve, making AI voice scams a global menace. The report noted that the researchers had no trouble replicating accents from around the world.

Popular accents, typically from the United States, the United Kingdom, India, or Australia, were easier to reproduce. Cloning a person's voice with an unusual tempo, rhythm, or accent was more difficult and less likely to succeed.

The more precise the clone becomes, the more likely it is to manipulate someone out of money. As the technology advances, a scammer can make off with thousands of rupees in minutes, which is scarier than ever.

Are you aware of the recent $1 million AI Kidnapping Scam?

Jennifer DeStefano, an Arizona mother, reportedly received a call from a scammer who had duplicated her daughter's voice and pretended to have kidnapped her. The call came from an unknown number, and when she answered, she heard her daughter screaming and saying, "Mom, I messed up."

Since her daughter was out of town on a skiing trip, the scammer had a prime opportunity to extort her. She later reported that a man's voice told her daughter to put her head back and lie down, and then said:

"Lister here, I've got your daughter."

While the scammer was speaking, her daughter's voice kept calling out for help in the background, and she believed it was really her daughter. "It was completely her voice," she added. "It was the same way she would have cried. I never doubted for one second that it was her."

The scammer initially demanded $1 million but eventually reduced it to $50,000, threatening to kill them if she did not deliver the money. He warned her not to call the police, or he would inject her daughter with something laced with drugs.

After the panicked string of events around the $1 million ransom demand, the mother went into the dance class shaking and crying for help. A 911 call and an attempt to reach her daughter exposed the "kidnapping" as a scam.

Her daughter, Brianna, had no idea what the fuss was about and called her mother to say that everything was OK; it turned out she had been safe with her father the whole time.

Jennifer further emphasized that her daughter does not have any social media profiles, but one school interview might have provided a sample of her voice, or someone local might have learned about the trip.

Siddhesh Surve

With a background in Journalism, Siddhesh aims to educate readers on tech news in India. Covering national and global events, he wants his readers to be the first to know what’s new in tech today!