A Los Angeles jury found tech giants Meta (Facebook and Instagram) and Google (YouTube) liable for deliberately designing addictive social media products.
- The companies were found liable for malice, oppression, and fraud, and a 20-year-old plaintiff (KGM) was awarded $1 million in damages, with Meta bearing 70% of the liability and YouTube 30%.
Victim’s Experience
- Early Exposure: The plaintiff began using YouTube at age 6 and Instagram at age 9.
- Health Consequences: Prolonged exposure contributed to anxiety, depression, and body dysmorphia.
- Body Dysmorphia: A psychological condition where individuals perceive their bodies as flawed due to unrealistic beauty standards created by filters and edited images.
Mechanism of Addiction
- Digital Casino Design: Social media platforms function like “dopamine-seeking slot machines.”
- Dopamine Triggers: Actions such as feed refresh and scrolling through reels create unpredictable rewards that release dopamine in the brain.
- Endless Scroll: This design encourages continuous engagement and has been associated with eating disorders, anxiety, and self-harm among young users.
Legal Breakthrough – Product Design vs Content
- Section 230 Protection: Section 230 of the US Communications Decency Act traditionally shields platforms from liability for user-generated content.
- Strategic Legal Approach: The lawsuit targeted product design (algorithms, feed structure, and engagement tools) rather than the content itself.
- Product Liability Argument: Platforms were held responsible for design features that intentionally maximise engagement and addiction, allowing the case to bypass Section 230 protections.
Proving Malice and Responsibility
- Four Elements of Judgment: The jury evaluated four factors—duty of care (responsibility to users), breach (violation of that responsibility), causation (link between actions and harm), and actual harm caused.
- Substantial Factor Test: Although Meta argued that the plaintiff’s pre-existing personal struggles caused her decline, the court applied this test to establish that Meta’s platform played a meaningful role in the deterioration of her health.
- Evidence of Malice: Malice was defined as a conscious disregard for safety. Internal company emails and research papers showed that the companies knew their products were harmful to children but continued promoting them for profit.
Global Significance
- Precedent-setting Case: The verdict is seen as a bellwether trial, establishing an important precedent for holding technology companies accountable for addictive platform design and likely influencing future global litigation on social media responsibility.
- Relevance for India: The case highlights the need for stronger regulation of digital platforms in India under frameworks such as the Digital Personal Data Protection Act, 2023 and the Information Technology Rules, alongside the protection of constitutional rights to health and privacy.
Conclusion
The case marks a significant shift toward holding tech companies accountable for the design and psychological impact of their platforms, signalling stronger global regulation of social media.