A report by the Standing Committee on Communications and Information Technology has called for a clear legal definition of “fake news” and recommended revisiting the ‘Safe Harbour’ clause under Section 79 of the Information Technology (IT) Act, 2000.
What is Fake News?
- Fake News: It consists of false information presented as fact to manipulate people intellectually and emotionally, often sparking strong emotional responses and even violence.
- Disinformation: It is understood as false information that is created or spread with the deliberate intent of causing harm.
- Misinformation: It is false information that is shared without the deliberate intent of causing harm; the element of intent is absent.
Key Highlights from the Report
Concerns in the Current Mechanism
- Definition Ambiguity: The parliamentary panel notes that the absence of a clear legal definition of “fake news” creates regulatory confusion across print, electronic, and digital media platforms.
- Safe Harbour Concerns: The panel highlights that intermediaries often misuse the “safe harbour” protection under Section 79 of the IT Act, even though their algorithms amplify sensational and misleading content.
- Algorithmic Incentives: The committee highlighted that digital platforms profit from content that maximises engagement, which frequently includes misleading or false information.
- PIB Fact-Check Data: The report notes that the PIB Fact Check Unit received 1.63 lakh queries but debunked only 2,279 items, indicating limited reach relative to the scale of misinformation.
- Cross-Border Jurisdiction: Misinformation often originates from foreign creators, and varying international laws make it difficult to enforce accountability across borders.
- AI Risks: The rapid spread of deepfakes and AI-generated videos is fuelled by high internet penetration and low levels of digital literacy.
- Self-Regulation Weakness: The committee notes that more than half of India’s TV channels are outside any Self-Regulatory Body, weakening internal accountability systems.
- Penalty Limitations: The panel states that current penalties, including fines up to ₹25 lakh, are insufficient deterrents for repeated dissemination of fake news.
Recommendations by the Committee
- Legal Definition: The committee recommends formulating a precise, legally sound definition of “fake news” through broad stakeholder consultation.
- Balance Requirement: The committee stresses that any definition of “fake news” must balance misinformation control with the constitutional right to freedom of speech.
- Safe Harbour Reform: The committee advises revisiting Section 79 of the IT Act to enhance intermediary accountability and reduce misuse of immunity protections.
- Algorithm Transparency: The committee calls for mandatory transparency disclosures on how digital platform algorithms promote or suppress content.
- Stricter Penalties: The report proposes introducing higher fines and harsher punishments for repeat offenders to deter misinformation.
- Nodal Officer Appointment: The report recommends appointing a dedicated nodal officer in India to coordinate with big tech companies and handle compliance issues.
- Inter-Ministerial Task Force: The committee recommends establishing a dedicated task force involving the Ministry of External Affairs and legal experts to handle jurisdictional challenges.
- Artificial Intelligence-Human Oversight: The committee supports a hybrid system in which AI flags misinformation and human experts perform second-layer verification.
- AI Regulation: The panel proposes licensing requirements for AI content creators and mandatory labelling of AI-generated videos and digital content.
- Strengthen Self-Regulation: The committee recommends bringing all TV channels under Self-Regulatory Bodies and making fact-checking units and internal ombudsmen mandatory across all media sectors (print, electronic, and digital).
- Accreditation Penalties: The panel proposes the cancellation of accreditation for journalists or creators found guilty of repeatedly producing or spreading fake news.
Initiatives to Prevent the Spread of Fake News
- Information Technology Act, 2000: The IT Act empowers the government to regulate online intermediaries and digital content to curb unlawful and misleading information.
- Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021: The IT rules regulate digital news publishers, current affairs content creators, and curated audio-visual platforms.
- Safe Harbour Provision: Intermediaries that fail to follow due diligence under the IT Rules risk losing their safe-harbour protection under Section 79 of the IT Act, making them legally liable for user-generated content.
- Bharatiya Nyaya Sanhita (BNS): Section 353 of the Bharatiya Nyaya Sanhita criminalizes the deliberate spread of false information or rumours when intended to cause public harm.
- PIB Fact-Check Unit: The Press Information Bureau operates a fact-check unit dedicated to verifying and countering misinformation related to government policies and actions.
- Ministry of Information & Broadcasting Advisory 2024: The Ministry issued an advisory prohibiting the promotion of online betting platforms and surrogate advertisements aimed at Indian users.
- Election Commission Initiatives: The Election Commission of India introduced the “Myth vs Reality Register” during the 2024 General Elections to identify and correct misinformation, and it also runs campaigns to educate voters against fake news.
- National Cyber Crime Reporting Portal: The Portal enables citizens to report cybercrimes, which are then forwarded to the respective State or Union Territory police for action.