On 6 April 2023, the Ministry of Electronics and Information Technology (MeitY) notified the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Amendment Rules, 2023. The amendment empowers the Central Government to designate a fact-check unit whose determinations on "fake or false or misleading" information about government business trigger content moderation obligations for social media intermediaries. The provision, which continued to generate regulatory debate through May and June 2023, carries significant implications for intermediary safe harbour protections under the IT Act, 2000.
Background
The Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 established the compliance framework for social media intermediaries, significant social media intermediaries, and digital media publishers in India. The 2023 amendment added a new dimension by inserting a provision (Rule 3(1)(b)(v)) requiring intermediaries to make reasonable efforts to prevent users from hosting or sharing information that has been identified as fake, false, or misleading by a fact-check unit of the Central Government, in respect of any business of the Central Government.
The amendment was issued alongside online gaming regulations, which formed a separate regulatory track within the same notification. The fact-checking provision was the more controversial element, drawing criticism from digital rights organisations and media bodies.
Key Provisions
The amendment introduces the following regulatory architecture:
Fact-check unit designation: MeitY may notify a fact-check unit of the Central Government. This unit is empowered to identify information relating to any business of the Central Government as fake, false, or misleading.
Intermediary obligation: Once information is flagged by the designated fact-check unit, intermediaries are required to make reasonable efforts to not host, display, upload, modify, publish, transmit, store, update, or share such information.
Safe harbour consequences: An intermediary's failure to comply with this obligation could result in the loss of safe harbour protection under Section 79 of the IT Act, 2000, exposing the intermediary to liability for third-party content hosted on its platform.
Scope limited to government business: The fact-checking power is limited to information "in respect of any business of the Central Government" — not all content on the platform. However, the breadth of this phrase remains undefined.
Online gaming provisions: The same notification introduced requirements for online gaming intermediaries regarding verification, registration, and transparency of terms and privacy policies.
Implications for Practitioners
This amendment raises fundamental Article 19(1)(a) concerns that practitioners should monitor closely. The power of a government-designated unit to determine what constitutes "fake" information about the government's own activities, with the consequence being content removal or loss of safe harbour, creates a structural conflict of interest. The government is simultaneously the subject of the information and the arbiter of its truthfulness.
Technology law practitioners advising social media platforms must evaluate the operational feasibility of implementing fact-check unit determinations at scale. The amendment does not prescribe timelines for compliance, notice requirements to content creators, or appeal mechanisms for users whose content is flagged.
Constitutional law practitioners should note that multiple legal challenges have been filed before various High Courts; the Bombay High Court, in particular, is hearing a significant challenge to this provision. The outcome of these challenges will determine whether the fact-check unit framework survives constitutional scrutiny under the Article 19(2) reasonable restriction test.