Section 79 of the Information Technology Act, 2000 provides safe harbour protection to intermediaries, including social media platforms, e-commerce websites, internet service providers, and search engines, by exempting them from liability for third-party content hosted or transmitted on their platforms, provided they observe due diligence and remove unlawful content upon receiving actual knowledge of it. The scope of this safe harbour was significantly clarified by the Supreme Court in Shreya Singhal v. Union of India, (2015) 5 SCC 1, which read down the "actual knowledge" requirement in Section 79(3)(b) to mean knowledge received through a court order or a notification by the appropriate government or its agency.
Legal definition
Section 79 of the IT Act provides:
Section 79(1): Notwithstanding anything contained in any law for the time being in force but subject to the provisions of sub-sections (2) and (3), an intermediary shall not be liable for any third party information, data, or communication link made available or hosted by him.
Section 79(2): The provisions of sub-section (1) shall apply if — (a) the function of the intermediary is limited to providing access to a communication system over which information made available by third parties is transmitted or temporarily stored or hosted; or (b) the intermediary does not — (i) initiate the transmission, (ii) select the receiver of the transmission, and (iii) select or modify the information contained in the transmission; and (c) the intermediary observes due diligence while discharging his duties under this Act and also observes such other guidelines as the Central Government may prescribe in this behalf.
Section 79(3): The provisions of sub-section (1) shall not apply if — (a) the intermediary has conspired or abetted or aided or induced, whether by threats or promise or otherwise in the commission of the unlawful act; (b) upon receiving actual knowledge, or on being notified by the appropriate Government or its agency that any information, data or communication link residing in or connected to a computer resource controlled by the intermediary is being used to commit the unlawful act, the intermediary fails to expeditiously remove or disable access to that material on that resource without vitiating the evidence in any manner.
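Read together, these sub-sections form a layered test: the exemption in sub-section (1) applies only if the conditions of sub-section (2) are satisfied, and is lost if either disqualification in sub-section (3) is triggered. The following Python sketch is purely illustrative of that logical structure; every name and field is a hypothetical simplification, not a legal test:

```python
from dataclasses import dataclass

@dataclass
class IntermediaryFacts:
    """Hypothetical fact pattern; field names are invented for illustration."""
    passive_conduit: bool         # s. 79(2)(a)/(b): no initiating, selecting, or modifying
    observes_due_diligence: bool  # s. 79(2)(c): due diligence and prescribed guidelines
    conspired_or_abetted: bool    # s. 79(3)(a): participation in the unlawful act
    actual_knowledge: bool        # s. 79(3)(b): court order or government notification
    removed_expeditiously: bool   # s. 79(3)(b): expeditious removal after such knowledge

def safe_harbour_available(facts: IntermediaryFacts) -> bool:
    # The s. 79(2) conditions must all hold...
    qualifies = facts.passive_conduit and facts.observes_due_diligence
    # ...and neither s. 79(3) disqualification may apply.
    disqualified = facts.conspired_or_abetted or (
        facts.actual_knowledge and not facts.removed_expeditiously
    )
    return qualifies and not disqualified
```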
How courts have interpreted this term
Shreya Singhal v. Union of India [(2015) 5 SCC 1]
The Supreme Court read down Section 79(3)(b), holding that "actual knowledge" must be interpreted to mean knowledge received through a court order — not mere private complaints or notices from individuals. The Court held that intermediaries are only obligated to remove content upon receipt of a court order directing such removal, or a notification from the appropriate government under Section 69A. This interpretation significantly strengthened safe harbour protection by shielding platforms from liability arising from private takedown notices.
Google India Pvt. Ltd. v. Visaka Industries Ltd. (Supreme Court, 2020)
The Supreme Court clarified that intermediaries enjoy safe harbour under Section 79 only if they comply with the due diligence requirements prescribed under the Intermediary Guidelines (IT Rules). An intermediary that fails to observe due diligence — including appointing a grievance officer, publishing a privacy policy, and implementing content moderation mechanisms — loses its safe harbour protection.
Myspace Inc. v. Super Cassettes Industries Ltd. (Delhi High Court, 2017)
The Delhi High Court held that an intermediary that exercises editorial control over content — by curating, selecting, modifying, or promoting specific content — goes beyond the passive conduit role contemplated by Section 79(2) and cannot claim safe harbour protection. The distinction between a passive intermediary and an active publisher is critical to the applicability of safe harbour.
Why this matters
Section 79 is the legal foundation upon which India's entire digital economy operates. Social media platforms (Meta, X, YouTube), e-commerce marketplaces (Amazon, Flipkart), messaging services (WhatsApp, Telegram), search engines (Google), and internet service providers all rely on safe harbour protection to function without facing liability for every piece of user-generated content on their platforms.
For intermediaries, maintaining safe harbour protection requires strict compliance with the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021. These rules prescribe due diligence requirements including the following (a small worked sketch of the timelines follows this list):
- publishing a privacy policy, terms of service, and community guidelines;
- appointing a Grievance Officer who must acknowledge user complaints within 24 hours and resolve them within 15 days;
- implementing automated tools to proactively identify and remove content depicting child sexual abuse material, rape, and non-consensual intimate images; and
- for Significant Social Media Intermediaries (SSMIs, platforms with over 50 lakh registered users), appointing a Chief Compliance Officer, a Nodal Contact Person, and a Resident Grievance Officer, each of whom must be resident in India.
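This sketch is illustrative only; the function names are invented, and the Rules themselves, not this code, are authoritative:

```python
from datetime import datetime, timedelta

# 50 lakh = 5,000,000 registered users: the notified SSMI threshold.
SSMI_THRESHOLD = 5_000_000

def grievance_deadlines(received_at: datetime) -> dict:
    """Timelines under the 2021 Rules: acknowledge within 24 hours,
    resolve within 15 days of receipt."""
    return {
        "acknowledge_by": received_at + timedelta(hours=24),
        "resolve_by": received_at + timedelta(days=15),
    }

def is_ssmi(registered_users: int) -> bool:
    """Platforms with over the threshold of registered users are SSMIs,
    triggering the additional India-resident officer appointments."""
    return registered_users > SSMI_THRESHOLD

# Example: a complaint received on 1 June at 10:00 must be acknowledged
# by 2 June 10:00 and resolved by 16 June 10:00.
print(grievance_deadlines(datetime(2021, 6, 1, 10, 0)))
```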
For users and content creators, Section 79 creates a framework where their content is not pre-screened by platforms but may be removed upon court order or government notification. The Shreya Singhal interpretation means that platforms cannot be compelled to remove content merely on the basis of a private complaint without judicial oversight — a critical free speech protection.
For legal practitioners handling defamation, intellectual property, or content removal cases, the practical implication is that a court order is the most effective mechanism for compelling content removal from platforms. Simply sending a legal notice to the platform may trigger the grievance redressal process but does not create a legal obligation under Section 79 to remove the content.
Frequently asked questions
Does safe harbour mean platforms can host any content?
No. Safe harbour under Section 79 exempts intermediaries from liability for third-party content only if they maintain a passive role and comply with due diligence requirements. Platforms must remove content upon receiving a court order or government notification under Section 69A. Additionally, platforms must proactively remove certain categories of content (child sexual abuse material, content depicting rape) under the 2021 IT Rules. Platforms that actively curate, promote, or modify content lose safe harbour protection.
What happens when a platform receives a private complaint about content?
Under the Shreya Singhal interpretation, a private complaint does not constitute "actual knowledge" under Section 79(3)(b). However, platforms must process complaints through their grievance redressal mechanism as required by the IT Rules 2021 — acknowledging the complaint within 24 hours and resolving it within 15 days. If the content violates the platform's terms of service, the platform may remove it voluntarily. For legally compelled removal, the complainant must obtain a court order.
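To make the distinction concrete, here is a minimal sketch of how the trigger for removal maps to the platform's obligation, as described above (names are hypothetical, and the mapping simplifies the legal position):

```python
from enum import Enum, auto

class Trigger(Enum):
    COURT_ORDER = auto()
    GOVT_NOTIFICATION = auto()   # e.g. a blocking direction under Section 69A
    PRIVATE_COMPLAINT = auto()

def removal_obligation(trigger: Trigger) -> str:
    """Sketch of the post-Shreya Singhal position described above."""
    if trigger in (Trigger.COURT_ORDER, Trigger.GOVT_NOTIFICATION):
        # "Actual knowledge" under s. 79(3)(b): binding removal obligation.
        return "remove or disable access expeditiously"
    # A private complaint creates no s. 79 removal obligation, only the
    # grievance process; removal remains voluntary under the platform's ToS.
    return "acknowledge within 24 hours, resolve within 15 days"
```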
Can a social media platform be sued for defamatory content posted by a user?
If the platform complies with its due diligence obligations and does not exercise editorial control over the content, it is protected by Section 79 safe harbour. The primary liability lies with the person who posted the defamatory content. However, if a court order directs the platform to remove the content and the platform fails to comply, it loses safe harbour protection and may face liability.
This entry is part of the Veritect Indian Legal Glossary, a comprehensive reference of Indian legal terminology grounded in statutory text and judicial interpretation.
Last updated: 2026-03-27. Veritect provides this content for informational purposes only; it does not constitute legal advice.