🗞️ Why in News The World Happiness Report 2026 established a statistically significant link between heavy social media use (5+ hours/day) and declining life satisfaction among teenagers — particularly girls — in high-income English-speaking countries. The Hindustan Times editorial argues this data has direct policy implications for India, where teen screen time averages 6.5 hours/day and no dedicated children’s online safety framework exists.

The Evidence — What the Data Shows

World Happiness Report 2026 (UN SDSN) key findings:

  • Teenagers using social media 5+ hours/day show significantly lower subjective well-being
  • The correlation is strongest among adolescent girls (ages 13–17)
  • Moderate use (under 1 hour/day) shows no statistically significant negative effect
  • The decline in youth life satisfaction in high-income countries tracks closely with the 2012 inflection point — the year smartphone penetration crossed mass adoption thresholds in the US, Canada, UK, Australia, and New Zealand

India’s digital reality:

  • Average teen screen time in India: ~6.5 hours/day (FICCI-EY India Media & Entertainment Report, 2025)
  • Social media penetration (15–24 age group): >70% (IAMAI, 2025)
  • India’s youth population (10–24 years): ~340 million — the world’s largest
  • Internet users under 18: ~200 million — rapidly growing with affordable 4G/5G

India-specific factors: India’s social media risk profile differs from that of high-income countries in important ways:

  • Content moderation gap: Platforms moderate English-language content far more aggressively than Hindi, Tamil, Telugu, Bengali — exposing Indian-language users (majority of Indian teens) to more harmful content
  • Digital literacy deficit: Many first-generation smartphone users lack the media literacy to critically evaluate algorithmic content
  • Economic pressure: Social media creators/influencers normalise high-consumption lifestyles that create economic anxiety among aspirational middle-class teens

India’s Current Regulatory Framework — The Gap

India’s existing digital regulation does not specifically address children’s online safety:

| Law / Rule | What It Does | Gap |
|---|---|---|
| IT Act, 2000 (Section 67B) | Criminalises child sexual abuse material (CSAM) online | Does not address non-CSAM harms (cyberbullying, self-harm content, eating-disorder promotion) |
| IT Rules, 2021 (Intermediary Guidelines) | Grievance redressal; significant social media intermediary (SSMI) obligations | No age-verification requirement; no algorithmic safety obligation for minors |
| DPDP Act, 2023 | Requires parental consent for processing data of children under 18 | Consent mechanism not operationalised; no specific content-safety obligation |
| POCSO Act, 2012 | Protects children from sexual offences, offline and online (including CSAM) | Does not address non-sexual psychological harms |
| NEP 2020 | Mentions a digital literacy curriculum | No specific social media safety component |

The missing piece: No Indian law currently:

  • Requires platforms to verify user age
  • Mandates algorithmic design changes for minor users
  • Prohibits addictive design features (infinite scroll, notification pings, engagement gamification) for children
  • Requires platforms to provide parents with screen-time tools or content controls

Global Models — What Works

UK’s Online Safety Act (2023) — The Most Comprehensive Model:

  • Requires platforms to conduct Children’s Risk Assessments before launching products
  • Bans addictive design features for users under 18 (infinite scroll, autoplay, push notifications at night)
  • Mandates age assurance: platforms must check that users are adults before serving pornography or other content harmful to children
  • Requires platforms to offer robust parental controls as a default
  • Ofcom (Office of Communications) is the regulator; it can fine non-compliant platforms up to £18 million or 10% of global annual turnover, whichever is greater
  • Safety by Design: Platforms must proactively prevent harm to children, not just reactively remove content

US — Kids Online Safety Act (KOSA, 2024):

  • Passed the Senate in 2024; not yet enacted into law
  • Would require platforms to provide mental health resources and would ban targeted advertising to minors
  • Would give the FTC (Federal Trade Commission) enforcement powers

Australia — Online Safety Act (2021) + Age Restrictions (2024):

  • Social media age limit of 16+ — platforms must use “reasonable steps” to verify age or face penalties
  • eSafety Commissioner with powers to order takedown of harmful content within 24 hours

France:

  • Smartphone use has been banned in primary and middle schools since 2018
  • A 2023 law requires verified parental consent for social media users under 15 (a self-declared age is not enough)

The Editorial’s Core Argument

The Hindustan Times editorial makes four arguments specific to India:

1. Digital Literacy Is Necessary But Insufficient

School-based digital literacy programmes (the component envisaged under NEP 2020) teach children to be critical consumers of media, but they cannot override algorithmic systems that billion-dollar companies design specifically to maximise engagement time. Platform-level design mandates are necessary because individual literacy cannot neutralise structural addiction architecture.

2. Parental Consent Is Ineffective Without Verification

The DPDP Act 2023 requires parental consent for processing children’s data — but most platforms comply with a checkbox (“Are you over 13?” → click yes). Real age verification (Aadhaar-linked, or mobile number-based via parents’ accounts) is needed to make consent meaningful.
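The gap between a self-declared checkbox and verified consent can be sketched in a few lines. This is a purely illustrative sketch, not any platform’s real API; the type and function names (`SignupAttempt`, `consent_is_meaningful`) are invented:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class SignupAttempt:
    declared_over_13: bool            # the checkbox -- trivially gameable
    verified_parent_consent: bool     # e.g. consent via a parent's verified account
    verified_age_years: Optional[int] # age from a verified source, if available

def consent_is_meaningful(attempt: SignupAttempt) -> bool:
    """A self-declared checkbox alone never satisfies the check;
    only a verified age or verified parental consent does."""
    if attempt.verified_age_years is not None and attempt.verified_age_years >= 18:
        return True  # verified adult: no parental consent needed
    return attempt.verified_parent_consent

# The checkbox alone ("Are you over 13?" -> click yes) fails the check:
checkbox_only = SignupAttempt(declared_over_13=True,
                              verified_parent_consent=False,
                              verified_age_years=None)
```

Under this rule the checkbox alone never satisfies the consent requirement; only a verified age or verified parental consent does, which is the editorial’s point.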

3. India Needs a Dedicated Children’s Digital Safety Authority

Rather than distributing responsibilities across MeitY, NCPCR (National Commission for Protection of Child Rights), and CBSE, India needs a single statutory authority for children’s online safety, along the lines of the UK’s Ofcom, or a dedicated NCPCR digital wing with teeth (investigation powers plus penalty authority).

4. Algorithmic Transparency for Minors

Platform recommendation feeds are algorithmically personalised, and for teens in vulnerable mental states, personalisation can trap them in self-harm, eating-disorder, or extremism content loops. India should mandate that chronological feeds be the default for users under 18; algorithmic feeds should be opt-in, with clear parental notification.
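The proposed default can be expressed as a simple gating rule. A minimal sketch under the editorial’s proposal; the names (`default_feed_mode`, `switch_to_algorithmic`) are hypothetical and no real platform exposes this API:

```python
from typing import Literal

FeedMode = Literal["chronological", "algorithmic"]

def default_feed_mode(age: int) -> FeedMode:
    # Under-18 accounts default to a chronological feed (the proposed mandate)
    return "chronological" if age < 18 else "algorithmic"

def switch_to_algorithmic(age: int, parent_notified: bool) -> FeedMode:
    # Minors may opt in to algorithmic ranking only after parental notification
    if age < 18 and not parent_notified:
        raise PermissionError("parental notification required for under-18 opt-in")
    return "algorithmic"
```

The design choice here is that the safe option is the zero-effort default, so protection does not depend on a teen (or parent) actively configuring anything.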


UPSC Relevance

Prelims: World Happiness Report 2026 (UN SDSN), India teen screen time (~6.5 hrs/day), India youth population 10-24 years (~340 million), IT Rules 2021 (SSMI obligations), DPDP Act 2023 (children’s consent), POCSO Act 2012, UK Online Safety Act 2023 (Ofcom regulator), NEP 2020, NCPCR (National Commission for Protection of Child Rights).
Mains GS2: Children’s rights — digital safety, state duty of care, regulatory frameworks for online platforms, IT Rules 2021 vs. UK Online Safety Act comparison, DPDP Act gaps, NCPCR role. GS4: Ethics of algorithmic design, corporate responsibility vs. state regulation, duty of care to vulnerable populations.


📌 Facts Corner — Knowledgepedia

World Happiness Report 2026 — Social Media Findings:

  • Risk threshold: 5+ hours/day social media use → significant drop in life satisfaction
  • Most affected: Teenage girls (13–17); English-speaking high-income countries
  • Safe zone: Under 1 hour/day — no significant negative effect
  • 2012 inflection: Smartphone mass adoption → youth happiness decline began

India’s Digital Statistics (2025-26):

  • Average teen screen time: ~6.5 hours/day (FICCI-EY 2025)
  • Social media penetration (15–24): >70% (IAMAI 2025)
  • Youth population (10–24): ~340 million (world’s largest)
  • Internet users under 18: ~200 million

India’s Current Legal Framework (Gaps):

  • IT Act 2000 Section 67B: CSAM only (not broader child harm)
  • IT Rules 2021: No age verification or algorithmic safety obligation for minors
  • DPDP Act 2023: Parental consent required for <18 data — not operationalised; no content safety
  • POCSO 2012: Sexual offences only; no mental health/psychological harm coverage

Global Models:

  • UK Online Safety Act (2023): Children’s risk assessments; addictive design ban for minors; Ofcom regulator; fines up to £18M or 10% of global turnover (whichever is greater)
  • Australia (2024): Social media age limit 16+; reasonable steps for age verification
  • France: Smartphone ban in primary and middle schools (since 2018); 2023 law requires parental consent for under-15s on social media
  • US KOSA (2024): Would mandate mental health resources and ban targeted ads to minors; passed the Senate only

Key Institutions:

  • NCPCR: National Commission for Protection of Child Rights — statutory body; can suo motu investigate child rights violations; no specific digital safety enforcement power currently
  • Ofcom (UK): Communications regulator; enforces Online Safety Act for platforms; model for India
  • MeitY: Ministry of Electronics and Information Technology — nodal for IT Act, IT Rules, DPDP Act

Other Relevant Facts:

  • Addictive design features: Infinite scroll (no natural stopping point), autoplay, variable reward (like notifications), red dot notifications — all designed to maximise time-on-app
  • Jonathan Haidt: “The Anxious Generation” (2024) — key academic reference for social media-youth mental health link; basis for many policy reform proposals
  • India’s content moderation gap: Non-English content receives far less moderation investment from global platforms (internal Facebook documents disclosed in the 2021 Frances Haugen leaks)
  • Digital Sky platform: India’s centralised platform for drone (UAS) registration and traffic management; cited here as an analogy for centralised platform regulation

Sources: Hindustan Times, IAMAI, MeitY