Tamil Nadu cybercrime police issue advisory on deepfake scams

The police said deepfake tech is being used by scamsters to target citizens through video calls and other means, and defraud them of money; citizens must call the cybercrime helpline if they come across any such activity

Updated - August 08, 2023 06:59 pm IST

Published - August 08, 2023 05:23 pm IST - CHENNAI

Photograph used for representational purposes only | Photo Credit: Reuters

The Cyber Crime Wing of the Tamil Nadu Police has issued an advisory on fake video calls being made by scamsters using artificial intelligence (AI) technology.

Deepfake technology is being used to perpetrate several types of fraud by creating highly convincing and realistic fake content, often using AI to manipulate audio, video, or images. Initially, this technology was primarily utilised for entertainment purposes, enabling filmmakers and content creators to seamlessly integrate actors into scenes or impersonate historical figures. However, as the technology evolved, so did its misuse by criminals seeking to exploit the power of deception, the police said.

Additional Director General of Police, Cyber Crime Wing, Sanjay Kumar said, “The scam involving AI-generated deepfake video calls typically follows a series of carefully orchestrated steps, combining technological sophistication with psychological manipulation. The scamster creates a fake profile, often using stolen images or publicly available photographs of trusted individuals, like friends or family members. They then use AI-powered deepfake technology to create highly realistic video calls on social media and other online platforms, impersonating someone the victim knows, such as a friend, family member, or colleague, to deceive them into thinking it’s a genuine conversation. Later, they create a sense of urgency and request the victim to transfer money to their bank accounts.”

Police said such deepfakes are carefully designed to mimic the appearance and mannerisms of the impersonated person. In addition to the video manipulation, scamsters are using AI-generated voice synthesis to mimic the voice of the impersonated person, enhancing the illusion of authenticity during the video call.

Mr. Kumar said, “Though no complaint in this regard has been received so far from the State, we wish to alert citizens to be aware and vigilant about such frauds. People should stay informed about the latest scams, including those involving AI technology, and be cautious when receiving video calls from unexpected sources.”

When receiving a video call from someone claiming to be a friend or family member, make a phone call to their personal mobile number to verify their identity before transferring any money, said the ADGP.

Limit sharing of personal data

The advisory also asks citizens to limit the amount of personal data shared online and adjust privacy settings on social media platforms to restrict access to information, and to consider using multi-factor authentication and other identity verification measures to protect accounts from unauthorised access.

The Cyber Crime Wing also said that if anyone suspects they have been a victim of a deepfake video call fraud or has come across suspicious activity, it is crucial to take immediate action and report the incident by calling the Cyber Crime toll-free helpline number 1930 or by registering a complaint at www.cybercrime.gov.in. Citizens can also contact the concerned platform where the fraudulent activity took place and provide it with all the relevant details, including the scamster’s profile information, messages, and any evidence collected.
