The proposed amendments to the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 are drawing concern from policy and legal experts, who say the changes could expand government oversight over online content while going beyond the scope of the parent Information Technology Act, 2000.
At the centre of the debate are two shifts: giving greater legal weight to government advisories issued to intermediaries, and widening the regulatory ambit to include user-generated news and current affairs content. Together, these changes could alter how platforms moderate content and how users engage online.
Cyberlaw expert Pavan Duggal said the approach raises questions of legal validity. “The IT Act empowers the government to frame rules, but it does not permit converting advisories or standard operating procedures into binding obligations. That goes beyond the scope of the parent law,” he said.
Linking compliance with advisories
The draft links compliance with advisories, clarifications and SOPs to due diligence requirements, meaning platforms risk losing safe harbour protection if they fail to comply. Executives said this effectively blurs the distinction between guidance and enforceable law.
Former Ministry of Electronics and Information Technology official Rakesh Maheshwari said that while SOPs can form part of subordinate legislation, advisories and FAQs occupy a different space. “The intent of the rules is clear. Normally, SOPs are part of the rule, because they are further subordinate. But not advisories and FAQs, that’s the area which might need some rethinking,” he said.
The amendments also expand the role of the government-led oversight mechanism under the grievance redressal framework. Rohit Kumar, founding partner at The Quantum Hub, said the earlier system was designed as an escalation model to preserve editorial independence. “The idea was that complaints move from publisher-level to self-regulation, and only then to government oversight,” he said.
Under the proposed changes, however, the inter-departmental committee can examine not just escalated complaints but also matters referred to it directly by the government. “This expands the committee’s remit significantly. What earlier depended on escalation can now be directly examined,” Kumar said.
What do experts say?
Experts said the changes could increase compliance burdens and lead platforms to adopt a more cautious approach. Shweta Venkatesan, fellow at Esya Centre, said failure to comply with directions related to user-generated news content could expose platforms to liability. “If platforms fail to comply with blocking orders, they may be seen as not fulfilling due diligence obligations,” she said.
The widening definition of news content has also raised concerns about its impact on users. Venkatesan said ordinary users posting analysis or commentary could be affected. “This could result in ordinary users’ takes on current events being blocked, leading to disproportionate interference with free expression,” she said.
Some executives also drew parallels with the now-withdrawn Broadcasting Services Bill. “After the pushback on the Broadcasting Bill, this appears to be a way to introduce similar provisions through rules rather than legislation,” one executive said.
Legal experts said while the objective of tackling misinformation and deepfakes is valid, the method raises questions. Arpit Choudhary, partner at King Stubb & Kasiva, said intermediaries may face greater expectations even if safe harbour remains formally intact.
