Australia has taken a groundbreaking step by enacting a social media ban for children under 16, sparking a mix of praise, concern, and criticism.
Key Provisions of the Ban
- The legislation, described as a world-first, aims to protect minors from the physical and mental health risks associated with social media but has also raised questions about its feasibility and broader implications.
- Platforms Covered: Social media platforms such as Instagram, Facebook, and TikTok must prevent under-16s from logging in.
- Penalties: Non-compliance could result in fines of up to A$49.5 million (about US$32 million).
- Implementation: A trial phase for enforcement begins in January, with full implementation set for next year.
- Absolute Ban: Unlike similar measures in other countries, which typically require parental consent, Australia's law imposes an outright ban with no consent-based exemption.
Challenges and Concerns
- Implementation Hurdles: The need for effective age verification mechanisms raises privacy and feasibility issues.
- Unintended Risks: Critics warn that children may migrate to alternative, less secure online platforms.
- International Relations: The ban adds tension to Australia’s already complex relationship with US-based tech giants.
Other Global Initiatives
- France and the U.S.: Require parental consent for minors to access social media, but do not impose outright bans.
- Florida (US): A similar ban targeting children under 14 faces legal challenges on free speech grounds.
Social Media Regulation in India
- Information Technology Act, 2000: The primary legislation governing digital activities in India, including social media.
- Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021: These rules specifically regulate social media platforms, imposing various obligations on them.
- Key Provisions under the Rules:
- Appointment of Grievance Officer: Social media platforms must appoint a grievance officer to address user complaints.
- Removal of Harmful Content: Platforms are required to remove content that is illegal, harmful, or objectionable.
- Traceability of Origin of Messages: Platforms must be able to identify the first originator of information, a requirement that has raised privacy concerns.
- Due Diligence: Platforms must exercise due diligence in verifying the authenticity of user accounts and content.
- Transparency Reports: Platforms must submit periodic transparency reports to the government.
- Relevant Authorities:
- Ministry of Electronics and Information Technology (MeitY): The primary government body responsible for formulating and implementing policies related to information technology, including social media.
- Cybercrime Investigation Cell (Cyber Cell): Investigates cybercrimes, including those related to social media.
- Indian Computer Emergency Response Team (CERT-In): Monitors and responds to cyber threats and vulnerabilities.
- Purpose of the Regulations: These regulations aim to balance free speech with the need to control misinformation, hate speech, and other harmful content on social media platforms.
- However, they have also raised concerns about potential censorship and surveillance.
Way Forward
- Balanced Approach: Implement regulations that safeguard children while preserving individual freedoms, ensuring that privacy and free speech are not compromised.
- Effective Age Verification: Develop secure, privacy-focused age verification systems to prevent misuse while maintaining data protection standards.
- Collaborative Policymaking: Involve stakeholders, including mental health experts, tech companies, and parents, to create inclusive, evidence-based policies.
- Promoting Digital Literacy: Introduce programs to educate children and parents about responsible social media use, empowering them to navigate online spaces safely.
Additional Reading: Debate on Whether Children Should be Barred From Social Media