In a strongly worded order addressing the growing menace of artificial intelligence-driven impersonation and misrepresentation, the Bombay High Court has granted immediate ex-parte interim relief to actor Suniel Shetty against unauthorized use of his persona through deepfakes and AI-generated content.
Justice Arif S. Doctor, in an order passed on October 10 and made available on October 13, described the infringing material as “a lethal combination of a distorted mind and misuse of technology, resulting in loss of personality rights of the plaintiff”.
The court was hearing Mr Shetty’s commercial IP suit, which sought protection of his personality rights, privacy and dignity under Article 21 of the Constitution and the Copyright Act, 1957. The actor had approached the court after discovering a series of AI-generated images and videos circulating online which falsely portrayed him and his family in obscene and misleading contexts. These were hosted on platforms operated by Meta and X Corp and promoted by various known and unknown entities.
Right to live with dignity
Justice Doctor said that “the unauthorized creation and uploading of deepfake images of the plaintiff on social media platforms is a serious violation not only of his personality rights but also of his right to live with dignity.”
The court said that such exploitation, especially when used to falsely associate actors with gambling websites, astrology services and commercial endorsements, amounts to abuse of goodwill and consumer fraud.
The court barred seven named defendants: John Doe/Ashok Kumar (Defendant 1), MyBhavishvani (Defendant 2), Tring.co.in (Defendant 4), IcePoster.com (Defendant 6), PaisaWapas.com (Defendant 13), WallpaperCave.com (Defendant 15), and BCGame.co.in (Defendant 18), from using Mr. Shetty’s name, voice, likeness, signature, and other identifiable features in any medium, including images, AI-generated content, deepfake videos, voice-clone audio, and metaverse environments.
Takedown orders on Meta, X
The court also directed Meta Platforms (Defendant 3) and X Corp (Defendant 19) to remove all infringing content listed in the suit and to act on Mr Shetty’s future removal requests. They were also ordered to provide customer and vendor information to help identify violators.
Justice Doctor stressed the urgency of granting relief without prior notice to the defendants and said, “Given the seriousness and likelihood of irreparable damage and injury, this Court finds that delay in issuing notice would frustrate the grant of injunction, and thus, the present case warrants immediate grant of ex parte ad-interim relief.”
The judge further held that the plaintiff had made out a strong prima facie case and that the balance of convenience was entirely in Mr Shetty’s favor.
Social media firms asked to act swiftly
The court also invoked Rule 3(1) of the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021, which requires platforms to prevent hosting of misleading, obscene or impersonating content. This legal basis was used to instruct Meta and X Corp to act swiftly.
In support of the relief sought, Mr Shetty’s lawyer, Dr Birendra Saraf, cited several precedents where courts had recognized personality rights, including cases involving Asha Bhosle, Arijit Singh, Anil Kapoor, Jackie Shroff, Aishwarya Rai Bachchan and Karan Johar.
The court acknowledged Mr Shetty’s stature as a public figure with over three decades of work in the film industry, a wide following on social media and numerous brand endorsements. It held that the unauthorized exploitation of his persona not only resulted in commercial loss but also posed a significant risk to the public, who could be misled into believing false endorsements.
“Unauthorized exploitation of these features, while directly harming the plaintiff’s business interests, right to privacy and right to live with dignity, also poses a significant risk of harm to the public,” the order said.
The matter has been listed for next hearing on November 17, 2025.
Published – October 13, 2025 10:40 PM IST