Intermediary Guidelines refers to the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021, the subordinate legislation that prescribes due diligence obligations for digital intermediaries, content takedown procedures, and a self-regulatory framework for digital news media and OTT platforms in India. The Rules are framed under Section 87 of the Information Technology Act, 2000, read with Sections 79(2)(c) and 69A, and establish the conditions that intermediaries must satisfy to retain safe harbour protection from liability for third-party content.
Legal definition
The Intermediary Guidelines were notified on 25 February 2021, replacing the Information Technology (Intermediaries Guidelines) Rules, 2011. The Rules are divided into three parts: Part I contains definitions, while Parts II and III set out the substantive obligations summarised below.
Part II — Due Diligence by Intermediaries (Rules 3-7):
Rule 3(1): All intermediaries must observe the following due diligence: (a) prominently publish their rules and regulations, privacy policy, and user agreement; (b) inform users not to host, display, upload, modify, publish, transmit, store, update, or share any information that belongs to another person, is defamatory, obscene, invasive of another's privacy, harmful to a child, infringes intellectual property rights, or violates any law in force.
Rule 3(1)(d): Upon receiving actual knowledge in the form of an order from a court of competent jurisdiction, or on being notified by the appropriate government or its agency, the intermediary must remove or disable access to the flagged content as early as possible, and in no case later than thirty-six hours from receipt of the order or notification.
Rule 3(2): The intermediary must appoint a Grievance Officer who acknowledges complaints within 24 hours and disposes of them within 15 days. Following the October 2022 amendment, complaints seeking removal of content in the prohibited categories must be resolved within 72 hours, and Rule 3(2)(b) separately requires removal of content exposing private areas, depicting nudity or sexual acts, or impersonating a person (including morphed images) within 24 hours of a complaint.
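These timelines lend themselves to mechanical tracking. The following is a minimal, illustrative sketch of how a hypothetical compliance workflow might compute the statutory deadlines described above; it is not drawn from the Rules themselves, and the function and variable names are assumptions for illustration only.

```python
from datetime import datetime, timedelta

# Statutory windows under Rules 3(1)(d) and 3(2), expressed as timedeltas.
TAKEDOWN_WINDOW = timedelta(hours=36)         # Rule 3(1)(d): court/government order
ACKNOWLEDGEMENT_WINDOW = timedelta(hours=24)  # Rule 3(2): acknowledge complaint
ORDINARY_RESOLUTION = timedelta(days=15)      # Rule 3(2): ordinary complaints
EXPEDITED_RESOLUTION = timedelta(hours=72)    # Rule 3(2) proviso: removal requests

def grievance_deadlines(received_at: datetime, removal_request: bool = False) -> dict:
    """Latest compliant timestamps for a complaint received at `received_at`."""
    resolve_window = EXPEDITED_RESOLUTION if removal_request else ORDINARY_RESOLUTION
    return {
        "acknowledge_by": received_at + ACKNOWLEDGEMENT_WINDOW,
        "resolve_by": received_at + resolve_window,
    }

# Example: a government takedown notification received on 1 March 2021 at 10:00
order_received = datetime(2021, 3, 1, 10, 0)
print("remove content by:", order_received + TAKEDOWN_WINDOW)
print(grievance_deadlines(order_received, removal_request=True))
```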
Additional obligations for Significant Social Media Intermediaries (SSMIs) — Rule 4:
Rule 4 imposes enhanced obligations on social media intermediaries with 50 lakh (5 million) or more registered users:
- Rule 4(1): Appoint a Chief Compliance Officer, a Nodal Contact Person (for 24x7 coordination with law enforcement), and a Resident Grievance Officer, all of whom must be employees resident in India
- Rule 4(2): Enable identification of the first originator of information when required by a judicial order or an order under Section 69 of the IT Act, for specified offences (including threats to sovereignty, sexual offences, and child sexual abuse material)
- Rule 4(4): Endeavour to deploy technology-based measures, including automated tools, to proactively identify content depicting rape or child sexual abuse, as well as content identical to material previously removed
- Rule 4(7): Provide a mechanism for users to voluntarily verify their accounts, with a visible mark of verification
- Rule 4(8): Publish monthly compliance reports detailing complaints received, action taken, and content removed through proactive monitoring (an illustrative sketch follows this list)
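As a rough illustration of how the user threshold and the Rule 4(8) reporting heads fit together, the sketch below models them in code; the class, field names, and figures are hypothetical and are not prescribed by the Rules.

```python
from dataclasses import dataclass

# Notified threshold for "significant social media intermediary" status:
# 50 lakh (5,000,000) registered users in India.
SSMI_THRESHOLD = 5_000_000

def is_significant(registered_users_in_india: int) -> bool:
    """Rule 4 obligations attach once the notified user threshold is crossed."""
    return registered_users_in_india >= SSMI_THRESHOLD

@dataclass
class MonthlyComplianceReport:
    """Hypothetical schema loosely mirroring the heads of information in Rule 4(8)."""
    month: str                   # reporting period, e.g. "2021-06"
    complaints_received: int     # grievances received from users in the period
    action_taken: int            # complaints on which action was taken
    proactive_removals: int      # items removed via automated/proactive monitoring

# Usage with made-up figures
report = MonthlyComplianceReport(month="2021-06", complaints_received=1200,
                                 action_taken=950, proactive_removals=30000)
print(is_significant(7_500_000), report)
```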
Part III — Code of Ethics and Procedure and Safeguards in Relation to Digital Media (Rules 8-19):
This part establishes a three-tier regulatory framework for digital news media and OTT platforms: (i) Level I, self-regulation by the publisher; (ii) Level II, self-regulation by a self-regulatory body of publishers; and (iii) Level III, an oversight mechanism of the Central Government (administered by the Ministry of Information and Broadcasting), which includes an Inter-Departmental Committee chaired by a senior government officer.
2023 Amendments: The April 2023 amendment introduced a "fact check unit" mechanism. Rule 3(1)(b)(v), as amended, empowered the Central Government to notify a fact check unit, and intermediaries were required to make reasonable efforts not to host content relating to the business of the Central Government that the unit identified as "fake or false or misleading", or risk losing safe harbour protection.
How courts have interpreted this term
Kunal Kamra v. Union of India [2025 SC]
The Supreme Court examined the constitutional validity of the 2023 amendments to the Intermediary Guidelines, particularly the "fact check unit" provision under Rule 3(1)(b)(v). The Court struck down the provision as unconstitutional, holding that empowering a government-designated body to determine what constitutes "fake or false or misleading" content — with the consequence of intermediaries losing safe harbour for non-compliance — constituted a form of pre-censorship violating Article 19(1)(a). The Court observed that the provision created a chilling effect on free speech by incentivising intermediaries to over-censor content to retain their safe harbour protection.
Agij Promotion of Nineteenonea Media Pvt. Ltd. v. Union of India [Bombay HC, 2021]
The Bombay High Court granted partial interim relief to digital media publishers challenging the Intermediary Guidelines 2021, staying the operation of Rules 9(1) and 9(3), which subject publishers to the Code of Ethics, on the prima facie view that those provisions travelled beyond the rule-making power under the IT Act and would chill the freedom of speech guaranteed by Article 19(1)(a). The Court declined to stay the remaining provisions, including the Inter-Departmental Committee mechanism, and left the substantive constitutional challenge open for final hearing.
WhatsApp LLC v. Union of India [Delhi HC, 2021]
WhatsApp challenged Rule 4(2) — the traceability requirement — arguing that it would necessitate breaking end-to-end encryption, violating users' right to privacy. The Delhi High Court did not grant interim relief, but the case raised fundamental questions about the balance between law enforcement access and encryption-protected privacy. The matter remains significant for the future of encrypted communications in India.
Why this matters
The Intermediary Guidelines 2021 fundamentally reshaped the regulatory landscape for digital platforms in India. For every social media company, search engine, e-commerce marketplace, and online service provider operating in India, compliance with these rules is a prerequisite for retaining safe harbour immunity under Section 79 of the IT Act. Loss of safe harbour exposes a platform to potential civil and criminal liability for unlawful user-generated content hosted on its systems, as though the platform itself were the publisher.
For large platforms classified as Significant Social Media Intermediaries — including Meta (Facebook, Instagram, WhatsApp), Google (YouTube), Twitter/X, Koo, LinkedIn, and ShareChat — the enhanced obligations create substantial compliance costs. The requirement to appoint India-based compliance officers, publish monthly reports, enable traceability of first originators, and deploy proactive content filtering technology represents a significant operational and legal commitment.
For users and civil society, the Guidelines raise fundamental questions about the balance between content moderation and free expression. The 36-hour takedown window, combined with the breadth of prohibited content categories, creates concerns about over-censorship. The traceability requirement (Rule 4(2)) remains particularly contested, as it potentially undermines end-to-end encryption and the privacy of communications.
For practitioners, the Intermediary Guidelines create a new advisory frontier. Advising platforms requires understanding both the IT Act framework and the specific due diligence obligations under the Rules. Content-related disputes increasingly involve intermediary liability questions — whether in defamation suits, intellectual property enforcement, or regulatory compliance. The interplay between the Intermediary Guidelines and the DPDP Act, 2023 adds another compliance layer, as platforms must simultaneously satisfy their obligations as intermediaries (under the IT Act) and as data fiduciaries (under the DPDP Act).
Related terms
Parent framework: Information Technology Act, 2000 (Sections 69A, 79, and 87)
Related concepts: safe harbour, intermediary, Significant Social Media Intermediary, Grievance Officer, traceability, Code of Ethics, Digital Personal Data Protection Act, 2023
Frequently asked questions
What are the key obligations for social media platforms under the Intermediary Guidelines?
All intermediaries must: publish terms of service prohibiting specified categories of content; appoint a Grievance Officer who acknowledges complaints within 24 hours and resolves them within 15 days; remove flagged content within 36 hours of receiving a court order or government notification; and preserve information for investigation purposes. Significant Social Media Intermediaries (50 lakh+ users) face additional obligations including appointing a Chief Compliance Officer, Nodal Contact Person, and Resident Grievance Officer in India, and publishing monthly compliance reports.
What is the traceability requirement under Rule 4(2)?
Rule 4(2) requires Significant Social Media Intermediaries to enable identification of the "first originator" of any information on their platform when required by a judicial order or an order passed under Section 69 of the IT Act. The order must relate to specified offences including threats to sovereignty, public order, sexual offences, or child sexual abuse material. This provision is controversial because it may require platforms offering end-to-end encryption (such as WhatsApp) to modify their architecture to enable traceability while maintaining encryption. WhatsApp's challenge to this provision remains pending before the Delhi High Court.
Was the government's fact-check unit provision upheld?
No. The Supreme Court in Kunal Kamra v. Union of India (2025) struck down Rule 3(1)(b)(v), the 2023 amendment that required intermediaries to take action against content identified as "fake or false or misleading" by a government-designated fact-check unit, on pain of losing safe harbour protection. The Court held this constituted unconstitutional pre-censorship violating Article 19(1)(a), as it empowered the government to effectively determine truth and compel private platforms to suppress content.
Do the Intermediary Guidelines apply to OTT platforms and digital news media?
Yes. Part III of the Intermediary Guidelines (Rules 8-19) establishes a Code of Ethics and a three-tier regulatory framework specifically for "publishers of news and current affairs content" and "publishers of online curated content" (OTT platforms). OTT publishers must classify content into age-based categories and apply access controls for restricted content, news publishers must adhere to established journalistic norms, and both must participate in the self-regulatory structure with government oversight.
This entry is part of the Veritect Indian Legal Glossary, a comprehensive reference of Indian legal terminology grounded in statutory text and judicial interpretation.
Last updated: 2026-03-27. Veritect provides this content for informational purposes only; it does not constitute legal advice.