On 10 February 2026, the Ministry of Electronics and Information Technology notified the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Amendment Rules 2026, which come into force on 20 February 2026. The Amendment Rules introduce new obligations for intermediaries in relation to artificial intelligence systems and update the compliance framework established by the 2021 Rules.
Background
The Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules 2021, notified under Section 87 read with Section 79 of the Information Technology Act 2000, established the foundational compliance architecture for internet intermediaries and digital media entities in India. The 2021 Rules introduced requirements for significant social media intermediaries, including grievance redressal mechanisms, content traceability provisions, and the digital media ethics code.
Since 2021, the rapid proliferation of AI-powered tools and generative AI platforms has raised fresh regulatory questions about intermediary responsibility. The deployment of AI models for content generation, automated moderation, and personalised recommendation systems creates new vectors of platform liability that the original 2021 framework did not specifically address. The 2026 Amendment Rules represent the government's response to these evolving technological realities.
Key Provisions
The Amendment Rules 2026 introduce the following changes to the 2021 framework:
AI-specific compliance obligations: Intermediaries deploying artificial intelligence systems for content generation, recommendation, or moderation are required to implement specified safeguards and disclosure mechanisms.
Updated due diligence requirements: The amendment expands the due diligence obligations under the 2021 Rules to address AI-related risks, including requirements for algorithmic transparency and accountability.
Revised intermediary compliance standards: The Rules update the compliance architecture for significant social media intermediaries, incorporating new requirements that reflect the current operational practices of AI-integrated platforms.
Effective date with compliance window: With notification on 10 February and enforcement from 20 February, intermediaries have only a ten-day window within which to align their practices with the new requirements.
Implications for Practitioners
Technology law practitioners advising digital platforms, social media companies, and AI service providers should conduct an immediate gap analysis between existing compliance frameworks under the 2021 Rules and the additional obligations imposed by the 2026 Amendment.
The ten-day compliance window is notably short for substantive new obligations. Intermediaries that have already invested in AI governance frameworks will be better positioned; those that have not will need to prioritise the most critical compliance requirements and develop phased implementation plans for the remainder.
The intersection of these Amendment Rules with the Digital Personal Data Protection Act 2023 and its anticipated rules creates a layered compliance environment. Practitioners should assess whether AI systems that process personal data trigger obligations under both frameworks simultaneously, potentially requiring integrated compliance architectures.
Litigators should monitor whether the new AI-related obligations generate fresh grounds for challenging intermediary safe harbour claims under Section 79 of the IT Act, particularly in cases involving AI-generated content that causes harm.