MeitY warning to X signals tougher AI compliance for big tech

The ministry warned that failure to demonstrate adherence to due diligence obligations under the Information Technology Act, 2000 and the IT Rules, 2021 could result in withdrawal of safe harbour protection under Section 79.

Last Friday, MeitY issued a formal notice to X after Grok was allegedly used to generate and circulate obscene images and videos of women. (Image generated by Gemini)

The government’s warning to Elon Musk-owned X over alleged obscene AI-generated content produced by its chatbot Grok is expected to push large technology platforms towards tighter, India-specific compliance frameworks, particularly for generative AI tools embedded within social media services.

The notice issued by the ministry of electronics and information technology (MeitY) marks a shift from advisory oversight to enforcement, signalling that global platforms may now be required to demonstrate auditable safeguards against misuse of AI systems distributed at scale.

What do industry observers say?

“Large platforms already have governance mechanisms, but this episode will force India-specific internal and third-party evaluations of whether those safeguards are actually effective,” Arun Prabhu, partner and co-head, digital, TMT, at Cyril Amarchand Mangaldas, told FE. He added that particular attention would be paid to anti-circumvention measures for sexually explicit content and content involving minors generated using AI.

Last Friday, MeitY issued a formal notice to X after Grok was allegedly used to generate and circulate obscene images and videos of women. The platform has been directed to remove the objectionable content, take action against offending users, and submit a compliance report within 72 hours.

Ministry’s warning to Musk

The ministry warned that failure to demonstrate adherence to due diligence obligations under the Information Technology Act, 2000 and the IT Rules, 2021 could result in withdrawal of safe harbour protection under Section 79. Such a move would expose the platform to direct legal risk for content hosted or distributed through its systems.

This has broadened the discussion around intermediary liability to include AI systems that are embedded within, or distributed through, large social media platforms. “While providers of AI tools may argue that they merely generate outputs based on user instructions, the defence has limits,” a MeitY official said.

The official pointed to the possibility that if an AI system is found to play an active role in generating unlawful content, courts may view the intermediary as having crossed into the role of a publisher, rather than a passive conduit.

Although X and Grok are owned by the same parent company and Grok also operates as a standalone AI service, regulators are unlikely to treat them as fully separate for liability purposes. “Regulatory scrutiny focuses on functional integration and content delivery rather than corporate structure alone,” said Rahul Sundaram, partner at IndiaLaw LLP. According to him, unlawful content generated by Grok and disseminated through or integrated into X could attract intermediary liability for X, making separation arguments difficult to sustain.

The loss of safe harbour would carry serious consequences. Without Section 79 protection, platforms could face civil liability for unlawful content and, in certain cases, criminal exposure, along with the risk of blocking orders. “Loss of safe harbour does not itself create liability, but it removes the statutory shield that keeps intermediaries out of the line of fire,” said Mohammad Atif Ahmad, attorney at Clement Law.

In such scenarios, intermediaries may be subject to civil claims including defamation, copyright infringement, or negligence. Criminal liability would arise only where specific statutory offences under the Bharatiya Nyaya Sanhita or the IT Act are established, which requires proof of knowledge or intent.

“Indian courts have repeatedly affirmed that foreign incorporation does not exempt platforms from domestic legal obligations. Similar assertions have arisen in competition law, data governance, and digital taxation contexts,” Simrean Bajwa, IP lawyer and global partnerships lead at BITS Law School, said.

The prospect of losing safe harbour is likely to prompt big technology platforms to recalibrate internal governance. This could include strengthening content moderation systems, improving notice-and-takedown workflows, and exercising tighter oversight over algorithmic tools that generate or amplify content. Companies are also expected to increase investments in compliance infrastructure, including grievance redressal mechanisms, transparency reporting, and local compliance officers.

This article was first uploaded on January 6, 2026, at 8:01 pm.