Intermediary Liability Under IT Act: Safe Harbor and Due Diligence

Executive Summary

Section 79 of the Information Technology Act, 2000, read with the IT (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021, establishes the legal framework for intermediary liability and safe harbor protection:

  • Safe harbor: Exemption from liability for third-party content
  • Conditions: Passive role, due diligence, prompt removal
  • IT Rules 2021: Enhanced due diligence obligations
  • Significant social media intermediaries: Additional compliance for platforms with 50 lakh+ registered users in India
  • Grievance redressal: Mandatory officer and 24-hour response timeline
  • Loss of safe harbor: Active involvement, knowledge of illegality, non-compliance

This guide examines intermediary liability, safe harbor conditions, and compliance requirements.

1. Statutory Framework

Section 79 - Exemption from Liability

Key provisions of safe harbor protection:

Element | Requirement
Function | Intermediary role (enabling access/transmission)
Passive role | No initiation, selection, or modification
Due diligence | Compliance with prescribed guidelines
Removal on notice | Expeditious action on actual knowledge
Government directions | Compliance with removal orders

Definition of Intermediary

Characteristic | Description
Function | Enables transmission or access to information
No content creation | Does not create content
Facilitator | Provides platform, not publisher
Examples | ISPs, social media, search engines, messaging apps

2. Conditions for Safe Harbor Protection

Three Core Requirements

Requirement | Specification
1. Intermediary function | Mere conduit or platform
2. Due diligence | Compliance with IT Rules 2021
3. No actual knowledge | Or prompt removal upon knowledge

When Safe Harbor is Lost

Situation | Consequence
Active involvement | Selection or modification of content
Actual knowledge | Awareness of illegal content + no removal
Non-compliance | Failure to follow due diligence
Court/government order | Non-compliance with removal direction
Failure to assist | Not aiding investigation

"Actual Knowledge" Standard

Source | Effect
Court order | Immediate knowledge
Government notification | Immediate knowledge
User complaint | Triggers review obligation
Automated detection | May not constitute knowledge
General awareness | Insufficient without specificity

3. IT (Intermediary Guidelines) Rules, 2021

Core Due Diligence Obligations

Obligation | Timeline/Requirement
Terms of use | Prohibit unlawful content
Privacy policy | Disclose data practices
Grievance officer | Appoint and publish details
User complaints | Acknowledge within 24 hours
Complaint resolution | Dispose within 15 days
Removal of content | Expeditious upon knowledge
Government orders | Comply within 36 hours

Prohibited Content Categories

Category | Description
National security | Sovereignty, integrity threats
Public order | Incitement to violence
Decency/morality | Obscene, pornographic content
Defamation | False statements harming reputation
Contempt of court | Scandalizing or interfering with justice
Intellectual property | Copyright, trademark infringement
Child safety | CSAM, child exploitation
Impersonation | Fake accounts

4. Significant Social Media Intermediaries (SSMI)

Definition and Threshold

Criterion | Specification
User base | 50 lakh (5 million) or more registered users in India
Service type | Primarily enabling online interaction
Determination | Based on registration data

Additional Compliance Obligations for SSMI

Obligation | Requirement
Chief Compliance Officer | Resident in India; responsible for ensuring compliance with the Act and Rules
Nodal Contact Person | 24x7 coordination with law enforcement
Resident Grievance Officer | Based in India; handles user complaints
Monthly report | Compliance details, complaints received/resolved
Content removal | Within 72 hours for specified violations
Proactive monitoring | Automated tools for prohibited content
User verification | Voluntary mechanism for users in India to verify their accounts

First Originator Tracing

Aspect | Requirement
Applies to | SSMIs providing services primarily in the nature of messaging
Trigger | Order of a court or an order passed under Section 69
Purpose | Identify the first originator of the information
Scope | Specific unlawful content
Safeguards | Judicial or competent-authority order required
Limitation | No general monitoring obligation

5. Grievance Redressal Mechanism

Grievance Officer Requirements

Requirement | Specification
Appointment | Mandatory for all intermediaries
Publication | Name, contact details on website/app
Availability | Accessible to users
Residence | In India (for SSMI)
Responsiveness | 24-hour acknowledgment

Complaint Handling Process

Stage | Timeline | Action
Receipt | Immediate | Log complaint
Acknowledgment | 24 hours | Confirm receipt to user
Review | Within 15 days | Assess complaint merit
Decision | Within 15 days | Remove, retain, or reject
Communication | Promptly | Inform complainant of decision
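
The timelines above lend themselves to simple automated tracking. Below is a minimal sketch of how a complaint-handling system might compute the statutory deadlines; the 24-hour and 15-day figures follow the IT Rules 2021, but the `Complaint` record, field names, and structure are illustrative assumptions, not a prescribed design.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

# Timelines under the IT Rules 2021: acknowledge within 24 hours,
# dispose within 15 days. Everything else here is illustrative.
ACK_WINDOW = timedelta(hours=24)
DISPOSAL_WINDOW = timedelta(days=15)

@dataclass
class Complaint:
    complaint_id: str
    received_at: datetime

    @property
    def ack_deadline(self) -> datetime:
        """Latest time by which receipt must be acknowledged to the user."""
        return self.received_at + ACK_WINDOW

    @property
    def disposal_deadline(self) -> datetime:
        """Latest time by which the complaint must be disposed of."""
        return self.received_at + DISPOSAL_WINDOW

    def is_overdue(self, now: datetime) -> bool:
        """True if the statutory disposal window has lapsed."""
        return now > self.disposal_deadline

# Example: a complaint logged on receipt
c = Complaint("GRV-2024-0001", datetime(2024, 6, 1, 10, 0))
print(c.ack_deadline)       # 2024-06-02 10:00 -> acknowledge within 24 hours
print(c.disposal_deadline)  # 2024-06-16 10:00 -> dispose within 15 days
```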

Complaint Categories

Type | Response
Illegal content | Immediate review and removal
Terms of service violation | Review against community standards
Privacy violation | Assess and take down if valid
Impersonation | Verify and remove fake accounts
Intellectual property | Notice-and-takedown procedure

6. Content Removal and Takedown

Removal Timelines

Source | Timeline
Court order | Immediately upon receipt
Government notification | 36 hours
User complaint | Within 15 days (or sooner if illegal)
SSMI (specified content) | 72 hours for CSAM, impersonation

Notice-and-Takedown Process

Step | Action
1. Notice | Complainant submits takedown request
2. Evaluation | Intermediary reviews for validity
3. Removal | Content taken down if it violates policy/law
4. Notification | Both parties informed
5. Counter-notice | User may contest removal
6. Restoration | Content may be restored if counter-notice is valid
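
These six steps can be modelled as a simple status flow. The sketch below is one possible internal design under stated assumptions; the status names, transitions, and `advance` helper are illustrative only and not drawn from the Rules.

```python
from enum import Enum, auto

class TakedownStatus(Enum):
    NOTICE_RECEIVED = auto()   # 1. complainant submits request
    UNDER_EVALUATION = auto()  # 2. intermediary reviews validity
    REMOVED = auto()           # 3. content taken down
    COUNTER_NOTICED = auto()   # 5. uploader contests removal
    RESTORED = auto()          # 6. restored on a valid counter-notice
    REJECTED = auto()          # notice found invalid, content retained

# Allowed transitions in this illustrative workflow
TRANSITIONS = {
    TakedownStatus.NOTICE_RECEIVED: {TakedownStatus.UNDER_EVALUATION},
    TakedownStatus.UNDER_EVALUATION: {TakedownStatus.REMOVED, TakedownStatus.REJECTED},
    TakedownStatus.REMOVED: {TakedownStatus.COUNTER_NOTICED},
    TakedownStatus.COUNTER_NOTICED: {TakedownStatus.RESTORED, TakedownStatus.REMOVED},
    TakedownStatus.RESTORED: set(),
    TakedownStatus.REJECTED: set(),
}

def advance(current: TakedownStatus, target: TakedownStatus) -> TakedownStatus:
    """Move a takedown request to the next status, rejecting invalid jumps."""
    if target not in TRANSITIONS[current]:
        raise ValueError(f"cannot move from {current.name} to {target.name}")
    return target
```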

7. Traceability and Encryption

Messaging Services and End-to-End Encryption

Requirement | Application
Traceability | Identify first originator on court/government order
Implementation | Technical means to trace without breaking encryption
Privacy balance | No general content monitoring
Scope | Only for specified unlawful content

Challenges for Encrypted Platforms

Challenge | Potential Solution
E2E encryption | Hash-based tracing mechanisms
Privacy concerns | Limit to judicial orders
Technical feasibility | Metadata analysis
User trust | Transparency reports
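
"Hash-based tracing" is usually described as keeping a fingerprint of each forwarded message that can later be matched against content identified in a court or Section 69 order, without the platform storing or reading plaintext. The sketch below only illustrates that idea; the salt, log structure, and functions are hypothetical, and this is not a mechanism specified by the Rules or used by any particular platform.

```python
import hashlib
from typing import Optional

SALT = b"platform-wide-salt"  # illustrative only

def fingerprint(plaintext: bytes) -> str:
    """Fixed-length digest of a message; the plaintext itself is never retained."""
    return hashlib.sha256(SALT + plaintext).hexdigest()

# Hypothetical append-only log kept by the platform: (digest, sender, sent_at)
forward_log: list[tuple[str, str, str]] = []

def record_send(plaintext: bytes, sender: str, sent_at: str) -> None:
    """Log only the digest and routing metadata at send time."""
    forward_log.append((fingerprint(plaintext), sender, sent_at))

def first_originator(ordered_content: bytes) -> Optional[str]:
    """Given content identified in a lawful order, return the earliest matching sender."""
    target = fingerprint(ordered_content)
    matches = [(sent_at, sender) for digest, sender, sent_at in forward_log if digest == target]
    return min(matches)[1] if matches else None

record_send(b"forwarded rumour", "user_a", "2024-05-01T09:00")
record_send(b"forwarded rumour", "user_b", "2024-05-01T11:30")
print(first_originator(b"forwarded rumour"))  # user_a
```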

8. Automated Content Moderation

Proactive Monitoring for SSMI

Tool | Purpose
Content filters | Detect prohibited content
Image hashing | Identify known illegal images (CSAM)
AI/ML models | Flag potential violations
User reports | Community-driven moderation
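
Image hashing in this context means comparing a digest of an uploaded file against a database of hashes of previously identified illegal images, so known material is flagged without anyone viewing it. The snippet below is a bare-bones sketch using an exact cryptographic hash; production systems typically rely on curated hash databases and perceptual hashing that survives resizing and re-encoding, and the hash value shown is fictitious.

```python
import hashlib

# Hypothetical denylist of digests of images already identified as illegal.
# The value below is fictitious and stands in for a curated hash database.
KNOWN_ILLEGAL_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def sha256_of_file(path: str) -> str:
    """Stream the file through SHA-256 so large uploads are not read into memory at once."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def should_block_upload(path: str) -> bool:
    """True if the upload matches a known illegal image (exact-match check only)."""
    return sha256_of_file(path) in KNOWN_ILLEGAL_HASHES
```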

Balancing Automation and Human Review

Consideration | Best Practice
False positives | Human review for borderline cases
Transparency | Disclose use of automated tools
Appeals | Allow users to contest automated decisions
Bias mitigation | Regular algorithm audits
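
A common way to strike this balance is to act automatically only when a classifier is highly confident and route everything borderline to a human queue. The thresholds and names below are assumptions for illustration; they are not drawn from the Rules or any specific platform policy.

```python
# Illustrative thresholds; in practice these are tuned per content category.
AUTO_REMOVE_ABOVE = 0.98   # act automatically only on very confident scores
HUMAN_REVIEW_ABOVE = 0.60  # anything borderline goes to a reviewer

def route(violation_score: float) -> str:
    """Decide how a piece of flagged content is handled."""
    if violation_score >= AUTO_REMOVE_ABOVE:
        return "auto_remove"       # removed; user is notified and may appeal
    if violation_score >= HUMAN_REVIEW_ABOVE:
        return "human_review"      # queued for a moderator decision
    return "no_action"             # logged for audit and bias review only

print(route(0.99))  # auto_remove
print(route(0.75))  # human_review
print(route(0.10))  # no_action
```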

9. Comparison: Publishers vs. Intermediaries

Aspect | Publisher | Intermediary (Safe Harbor)
Content responsibility | Fully liable | Conditional exemption
Editorial control | Yes | No
Defamation liability | Primary liability | No liability if compliant
Copyright liability | Direct infringement | Secondary liability if aware
Duty to monitor | Editorial discretion | No general monitoring duty

10. Government Powers and Blocking

Section 69A - Content Blocking

Power | Specification
Authority | Central Government or its designated officer
Grounds | Sovereignty and integrity of India, defence, security of the State, friendly relations with foreign States, public order, or preventing incitement to a cognizable offence
Procedure | Reasoned order with an opportunity to be heard
Compliance | Intermediary must block access to the content
Confidentiality | Blocking requests and orders may be kept confidential

Transparency and Accountability

Measure | Purpose
Transparency reports | Disclose government requests
Judicial review | Challenge blocking orders
User notification | Inform affected users (if feasible)

11. Penalties for Non-Compliance

Loss of Safe Harbor

Violation | Consequence
Non-compliance with Rules | Liable as publisher/creator
Failure to remove content | Potential criminal/civil liability
Non-cooperation | Prosecution under IT Act

Other Penalties

Offense | Penalty
Sections 67, 67A, 67B (obscene, sexually explicit, or child sexual abuse material) | Imprisonment (up to 3 to 5 years on first conviction) and fine
Section 69A non-compliance | Imprisonment up to 7 years and fine
Violation of court orders | Contempt of court proceedings

12. International Comparison

Safe Harbor Across Jurisdictions

Aspect | India (Sec 79 + Rules) | US (CDA 230) | EU (DSA)
Safe harbor | Conditional | Broad | Conditional
Due diligence | Mandatory | Minimal | Risk-based
Proactive monitoring | For SSMI | Not required | For VLOPs
Traceability | Required (SSMI) | Not required | Limited
Government removal | 36 hours | No obligation | Judicial order
Grievance redressal | 15 days | Not mandated | Mandatory

13. Compliance Checklist

For All Intermediaries

  • Publish clear terms of use prohibiting unlawful content
  • Maintain privacy policy
  • Appoint and publish grievance officer details
  • Acknowledge user complaints within 24 hours
  • Dispose of complaints within 15 days
  • Remove content upon actual knowledge or notice
  • Comply with court orders and government notifications
  • Assist law enforcement investigations
  • Maintain records of compliance

For Significant Social Media Intermediaries (SSMI)

  • Appoint Chief Compliance Officer (India resident)
  • Appoint Nodal Contact Person (24x7 law enforcement)
  • Appoint Resident Grievance Officer (India-based)
  • Publish monthly compliance reports
  • Implement automated content moderation tools
  • Enable traceability for first originator
  • Provide a voluntary account-verification mechanism for users in India
  • Remove specified content within 72 hours
  • Conduct regular compliance audits

14. Key Takeaways for Practitioners

  1. Safe Harbor is Conditional: Intermediaries must comply with IT Rules 2021 to retain exemption.

  2. Due Diligence Mandatory: Terms of use, privacy policy, and grievance officer are non-negotiable.

  3. SSMI Enhanced Obligations: Platforms with 50 lakh+ users face additional compliance burdens.

  4. 24-Hour Acknowledgment: User complaints must be acknowledged within one day.

  5. First Originator Tracing: SSMIs providing services primarily in the nature of messaging must be able to identify the first originator of information when ordered by a court or under Section 69.

  6. No General Monitoring: Intermediaries are not required to proactively monitor all content, though SSMIs must endeavour to deploy automated tools for specified content such as CSAM.

  7. Prompt Removal Critical: Failure to remove content upon actual knowledge destroys safe harbor.

  8. Government Orders Binding: 36-hour compliance timeline for content blocking under Section 69A.

Conclusion

Intermediary liability under the IT Act and IT Rules 2021 creates a balanced framework that protects platforms while ensuring accountability. Safe harbor protection is contingent on maintaining a passive role, exercising due diligence, and promptly addressing illegal content. Significant social media intermediaries face enhanced obligations including grievance redressal, proactive monitoring, and traceability. Compliance is essential to avoid losing safe harbor and facing liability as publishers.
