A brief clip of what appears to be popular Indian star Rashmika Mandanna entering an elevator has blown up in India and received condemnation the world over.
At first glance, the video looks to be a harmless clip of the 27-year-old Bollywood star – who has 39 million Instagram followers – in activewear getting into the lift.
But despite looking painfully realistic, the video isn’t Mandanna at all.
The woman in the video is actually a British-Indian influencer named Zara Patel, whose real face is visible in the first frame of the six-second clip.
Deepfakes are false images or videos created using artificial intelligence.
The phenomenon is nothing new, but recent advancements in technology have led to eerily convincing videos being posted online daily.
The star herself is now calling for greater regulation of AI technology, describing the clip as “extremely scary” and saying it shows how easily the technology can be misused.
Abhishek Kumar, a journalist from India, tracked down the fake video’s origins and called for new “legal and regulatory” measures to tackle the disturbing phenomenon, as hundreds condemned the video for using Mandanna’s likeness without her permission.
The incident has sparked further discussions in Indian media publications about exactly how to combat deepfake technology as artificial intelligence continues to develop at breakneck speed.
“There is an urgent need for a legal and regulatory framework to deal with deepfakes in India. You might have seen this viral video of actor Rashmika Mandanna on Instagram. But wait, it is a deepfake video of Zara Patel,” Kumar posted.
Mandanna took a stand against deepfake technology on Monday and thanked her fans for the support.
“I feel really hurt to share this and have to talk about the deepfake video of me being spread online. Something like this is honestly extremely scary, not only for me, but also for every one of us who today is vulnerable to so much harm because of how technology is being misused,” she wrote.
“But if this happened to me when I was in school or college, I genuinely can’t imagine how I could ever tackle this. We need to address this as a community and with urgency before more of us are affected by such identity theft.”
Several celebrities showed support for Mandanna and expressed their shock at the deceptive use of the technology.
Bollywood star Amitabh Bachchan supported Mandanna and called for legal action against the creators of the deepfake video.
Other celebrities, including singer Chinmayi Sripaada, also voiced their concerns about the misuse of technology and the need for legal protection.
“It’s truly disheartening to see how technology is being misused and the thought of what this could progress to in the future is even scarier,” Sripaada posted online.
“Action needs to be taken and some kind of law needs to be enforced to protect people who have been and will be victims of this. Strength to you.”
Fans defended Mandanna and demanded strict laws be brought in to combat the fakes.
The deepfake phenomenon has made headlines in recent weeks, with Australia’s own Hamish Blake being caught up in a “scary” video scam.
An advertisement running on Instagram features a somewhat convincing video of the comedian and broadcaster appearing to promote weight loss gummies.
“Two months ago, I saw an advertisement for gummies and the website claimed that with the help of this product, you can lose 12 kilos in four weeks,” the fake Blake says in the ad.
“I decided to order four bottles and in the first few days, nothing changed. I was sceptical about this. But what was my surprise when my weight began to evaporate.
“After only two weeks, I had lost six kilos. At the end of the course, I had lost 13 kilos.”
The fake Blake sounds alarmingly like the real one and the vision, although low resolution, animates his face and shows his mouth moving.
On air this morning, 2GB Breakfast host Ben Fordham said he knows Blake well and was shocked when he spotted the Instagram ad on the weekend.
“That sounds like Hamish Blake,” Fordham said, before introducing the real-life star.
“I promise this is the real Hamish,” Blake said. “This one won’t sell you magic beans in the form of weight loss gummies.”
He said with some 20 years of recorded examples of his voice available online thanks to his prolific career in radio and TV, AI technology has plenty to work with.
“I suppose there’s enough words out there to effectively make me say anything,” he said.
Authorities around the globe are scrambling to establish guardrails for AI, with several US states such as Minnesota passing laws to criminalise deepfakes aimed at hurting political candidates or influencing elections.
On Monday, US President Joe Biden signed an ambitious executive order to promote the “safe, secure and trustworthy” use of AI.
“Deep fakes use AI-generated audio and video to smear reputations… spread fake news, and commit fraud,” Biden said at the signing of the order.
He voiced concern that fraudsters could take a three-second recording of someone’s voice to generate an audio deepfake.
“I’ve watched one of me,” he said.
“I said, ‘When the hell did I say that?’”