Context of the News
A Los Angeles court (USA) held Meta (Instagram) and YouTube negligent for:
- Designing addictive platform features
- Failing to warn users, especially children, about risks
They were ordered to pay USD 6 million in damages.
Background
- Social media platforms earlier claimed "neutral pipe" status:
  - Not responsible for content or user harm
- Legal protection came from:
  - Section 230 of the Communications Decency Act (USA)
  - Section 79 of the IT Act, 2000 (India)
- However, concerns have been mounting over:
  - Mental health issues among youth
  - Addictive platform design
  - Data exploitation
News Breakdown
This ruling marks a major shift in global digital regulation.
1. End of “Neutral Pipe” Defence
Definition:
Under the Neutral Pipe Theory, platforms act only as passive intermediaries and are therefore not responsible for the content they carry or its impact on users.
- Court rejected this argument
- Platforms now seen as active designers influencing behaviour
Implication:
- Platforms can be held legally liable
- Marks a shift towards a product-liability approach
2. Addictive Design Features Identified
Definition:
Dark Patterns are design tricks that manipulate user behaviour.
Key Addictive Features
- Infinite Scrolling: an endless content feed with no natural stopping point
- Autoplay Videos: videos play automatically without user action
- Algorithm-led Recommendations: AI suggests highly engaging content
Impact:
- Designed to maximize screen time
- Triggers dopamine-based addiction
- Makes children “never put down the phone”
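The "intermittent rewards" mechanism behind these features can be illustrated with a minimal sketch. This is a hypothetical simulation, not code from any platform: an endless feed paired with a variable-ratio reward schedule, the pattern behavioural research links to habit formation.

```python
import random

def infinite_feed():
    """Generator that never runs out of posts -- there is no natural stopping point."""
    post_id = 0
    while True:
        post_id += 1
        yield f"post-{post_id}"

def intermittent_reward(p=0.3, rng=random.random):
    """Variable-ratio reward: a 'like' notification arrives unpredictably,
    rather than on a fixed schedule."""
    return rng() < p

# Simulate a short scrolling session: the feed never ends,
# and rewards fire at unpredictable moments.
feed = infinite_feed()
rewards = 0
for _ in range(10):
    post = next(feed)
    if intermittent_reward():
        rewards += 1
print(f"Viewed 10 posts, received {rewards} unpredictable rewards")
```

The unpredictability is the point: a reward that arrives on every post is quickly discounted, while one that arrives at random keeps the user checking.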
3. Algorithmic Transparency Requirement
Definition:
Algorithmic Transparency means platforms must disclose how their recommendation systems work.
New Expectations
- Conduct design risk assessments
- Disclose:
  - Impact on mental health
  - Risks such as addiction or body-image issues
Key Point:
- If companies knowingly ignore evidence of harm → "reckless disregard"
4. Redesign of User Experience for Minors
Platforms may need to:
- Introduce:
  - "You're all caught up" alerts
  - Time limits
- Remove:
  - Infinite scroll for minors
  - Manipulative logout barriers
5. Global Regulatory Trend
| Country | Measure |
|---|---|
| Australia | Restricts social media access for under-16s |
| UK | Piloting age-based access restrictions |
| USA | Court-led accountability |
| India | Data protection & content regulation |
6. Impact on India’s Digital Framework
India may shift from safe harbour → product liability.
Key Laws in India
(A) Digital Personal Data Protection (DPDP) Act, 2023
Definition:
Regulates collection and processing of personal data.
- Child = Below 18 years
- Requires parental consent
- Prohibits:
  - Behavioural tracking of children
  - Targeted advertising aimed at children
- Penalty: up to Rs 250 crore
(B) IT Rules, 2021 (Amended 2026)
- Content classification: U, U/A 7+, U/A 13+, U/A 16+, A
- Mandatory:
  - Parental locks
  - Age verification
- Strict timelines:
  - Harmful content removal: 3 hours
  - Intimate content removal: 2 hours
New Addition:
- Label SGI (Synthetically Generated Information)
(C) POCSO Act, 2012
Definition:
Protects children from sexual offences.
- Covers:
  - Online grooming
  - Child Sexual Abuse Material (CSAM)
(D) Juvenile Justice Act, 2015
- Addresses:
  - Child trafficking
  - Online exploitation
7. Risks of Social Media for Children
(A) Addiction Engineering
- Uses:
  - Dopamine triggers
  - Intermittent rewards (likes, notifications)
(B) Mental Health Issues
- Leads to:
  - Body Dysmorphic Disorder
  - Anxiety and depression
(C) Cyberbullying
- Persistent harassment
- Can lead to:
  - Self-harm
  - Suicidal tendencies
(D) Data Exploitation
- Children lack digital literacy
- Vulnerable to:
  - Predators
  - Privacy breaches
(E) Brain Development Impact
- Affects the prefrontal cortex
- Leads to poor:
  - Impulse control
  - Decision-making
(F) Filter Bubble Effect
Definition:
Exposure only to similar content, reinforcing existing beliefs.
- Leads to:
  - Radicalisation
  - Misinformation spread
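The filter-bubble mechanism can be sketched in a few lines. This is an illustrative toy model, not any real platform's algorithm: a recommender that always serves more of whatever topic the user already engages with most quickly collapses the feed to a single topic.

```python
from collections import Counter

def recommend(history, catalogue):
    """Naive engagement-maximising recommender: always serve more of the
    topic the user already views most -- the mechanism behind filter bubbles."""
    if not history:
        return catalogue[0]
    top_topic, _ = Counter(history).most_common(1)[0]
    return next(item for item in catalogue if item == top_topic)

catalogue = ["sports", "news", "gaming"]
history = ["gaming", "gaming", "news"]

# Each recommendation reinforces the dominant topic,
# so diversity of exposure shrinks with every round.
for _ in range(5):
    history.append(recommend(history, catalogue))

print(Counter(history))  # the feed has collapsed to one topic
```

A real system ranks by predicted engagement rather than a raw count, but the feedback loop is the same: past behaviour narrows future exposure.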
Measures to Mitigate Risks
1. Parental Measures
- Co-viewing content with children
- Modelling healthy screen-time habits
2. Educational Measures
- Teach:
  - Digital literacy
  - How to identify dark patterns
- Use phone-free classrooms
3. Technical Solutions
- Age-gating technologies
- Remove addictive features
- Introduce break reminders
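Two of these technical measures are simple enough to sketch. The thresholds below (18 years, 30 minutes) are illustrative assumptions, not figures from any law or ruling, and the function names are hypothetical.

```python
from datetime import date

ADULT_AGE = 18            # assumed threshold, per DPDP Act's definition of a child
BREAK_AFTER_MINUTES = 30  # illustrative value; real products tune this

def is_minor(birth_year, today=None):
    """Rough age gate: compares birth year with the current year.
    Real age verification is much harder than this comparison."""
    today = today or date.today()
    return (today.year - birth_year) < ADULT_AGE

def needs_break(minutes_on_screen):
    """Break reminder: trigger a 'take a break' prompt after a
    continuous session crosses the threshold."""
    return minutes_on_screen >= BREAK_AFTER_MINUTES
```

The hard engineering problem is not this logic but reliable age verification: self-declared birth years are trivially falsified, which is why regulators are pushing for stronger age-assurance technologies.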
4. Legal Safeguards
- Enforce:
  - POCSO Act
  - Juvenile Justice Act
- Promote initiatives like Mission Shakti
5. Right to Be Forgotten
Definition:
Allows users to have their personal data erased from digital platforms.
- Important for protecting children's future identity
Prelims Focus
- Section 79 (IT Act, 2000) → Safe harbour provision
- DPDP Act, 2023 → Parental consent mandatory for children
- POCSO Act, 2012 → Covers online sexual offences
- SGI → Synthetically Generated Information
- Dark Patterns → Manipulative UX designs
Conclusion / Way Forward
Balancing innovation with child safety requires stricter platform accountability, strong enforcement, and collaborative efforts among governments, parents, and technology companies.
Prelims Check
Question 1
Consider the following statements:
1. Section 79 of the IT Act, 2000 provides safe harbour protection to intermediaries.
2. The DPDP Act, 2023 allows targeted advertising to children with parental consent.
3. Dark patterns are designed to manipulate user behaviour.
Which of the statements given above is/are correct?
(a) 1 and 3 only
(b) 1 only
(c) 2 and 3 only
(d) 1, 2 and 3
Question 2
With reference to the IT Rules, 2021, consider the following:
1. Platforms must classify content based on age categories.
2. Harmful content must be removed within 24 hours.
3. Platforms must label AI-generated content.
Which of the statements given above is/are correct?
(a) 1 and 3 only
(b) 1 only
(c) 2 and 3 only
(d) 1, 2 and 3
Question 3
Consider the following statements regarding social media risks:
1. Infinite scrolling is a form of dark pattern.
2. Filter bubbles promote exposure to diverse viewpoints.
3. Excessive screen time can affect brain development.
Which of the statements given above is/are correct?
(a) 1 and 3 only
(b) 2 only
(c) 1 and 2 only
(d) 1, 2 and 3
Answers with Explanation
Answer 1: (a) 1 and 3 only
- Statement 1 is correct: Section 79 provides safe harbour.
- Statement 2 is incorrect: Targeted ads to children are prohibited.
- Statement 3 is correct: Dark patterns manipulate behaviour.
Answer 2: (a) 1 and 3 only
- Statement 1 is correct: Content classification mandatory.
- Statement 2 is incorrect: Removal is within 3 hours, not 24.
- Statement 3 is correct: AI content labelling required.
Answer 3: (a) 1 and 3 only
- Statement 1 is correct: Infinite scroll is a dark pattern.
- Statement 2 is incorrect: Filter bubbles limit diversity.
- Statement 3 is correct: Screen time affects brain development.
“Technology must empower minds, not imprison them.”



