T.N. Cyber Crime Police issue advisory on new scam involving AI voice cloning

Police said scamsters were now employing AI to clone voices; victims received phone calls in which the voice sounded like that of a relative or friend in distress, and were then asked to quickly transfer large sums of money to help their loved ones

April 27, 2024 12:02 pm | Updated 12:02 pm IST - CHENNAI

File photograph used for representational purposes only | Photo Credit: REUTERS

The Cyber Crime Wing of the Tamil Nadu Police has issued an advisory on a new impersonation scam that uses AI (Artificial Intelligence) voice cloning, and has asked the public to be cautious about unsolicited calls received on their mobile phones.

Police said that cyber fraudsters are now employing voice cloning, using advanced AI technology, to mimic the voices of trusted individuals, such as family members, over phone calls.

The calls are made under the pretext of an emergency; by creating a sense of urgency or distress, the fraudsters deceive victims into transferring money quickly, exploiting their trust.

This tactic highlights the evolving sophistication of cybercrimes and emphasises the importance of awareness and caution to prevent residents from falling victim to such fraudulent schemes, the police said.

The Cyber Crime Wing said the scam begins with a phone call to the victim from a scamster posing as someone the victim knows and trusts, such as a family member or friend. The scamster may claim to be in urgent need of financial assistance due to a fabricated emergency or threat. The scamster uses various tactics to evoke a sense of urgency and emotional distress in the victim. He/she may employ sobbing or pleading tones, claiming to be in a dire situation that requires immediate help.

Behind the scenes, the scamster utilises sophisticated AI software to clone the voice of the person they are impersonating. They obtain a voice sample of the person from social media posts or videos, or simply by talking to the person over the phone using a ‘wrong number’ tactic. This technology allows them to convincingly mimic not only the voice but also the intonation and emotional nuance of the victim’s trusted contact.

“In a nutshell, the scamsters use an AI-generated cloned voice to commit cybercrimes,” said Sanjay Kumar, ADGP, Cyber Crime Wing.

Once they have established a sense of urgency and trust, the scamster requests the victim to transfer money immediately to help resolve the crisis. He/she often suggests using fast and convenient payment methods like the Unified Payments Interface (UPI) system to expedite the transaction. Driven by concern and a desire to help their loved one, the victim may comply with the scamster’s demands without verifying the authenticity of the caller or the legitimacy of the situation.

After the money transfer is completed, the victims may later realise that they have been deceived when they independently contact their family member or friend and discover that they were never in distress or in need of financial assistance. The victim suffers a financial loss, and additionally, may feel betrayed, violated, and emotionally distressed upon realising they were scammed, the police said.

Mr. Sanjay Kumar advised members of the public to always verify the identity of the person calling, especially if they request urgent financial assistance, and to ask probing questions or contact the friend or relative through a known, verified number to confirm their identity before taking any action.

He also asked the public to stay informed about common scams, including this voice cloning fraud, and learn to recognise warning signs. Be wary of unexpected requests for money, especially if they involve urgent situations or emotional manipulation, he said.

If you suspect that you have been the victim of such a fraud, or have come across any suspicious activity, report the incident to the Cyber Crime Toll-Free Helpline 1930 or register a complaint at www.cybercrime.gov.in.
