When Bengaluru-based professional Shikha Rai noticed her teenage daughter perpetually glued to her mobile phone for four to five hours a day, either chatting with friends or watching endless loops of slime, sand, and gaming videos, she expressed a concern echoed by millions of parents across the world. The teen’s justification, “I feel lonely without social media,” captures the emotional paradox of the digital age—constant connectivity, yet growing isolation.
Rai’s daughter is not alone. In a country with one of the world’s largest social media user bases—approximately 491 million users as of 2025, or about a third of the population—how teens navigate social media is a major concern, especially given the risks of cyberbullying, mental health issues, and exposure to inappropriate content.
According to the nationwide household survey in the Annual Status of Education Report (ASER) 2024, 76% of children aged 14-16 years use smartphones to access social media, while over 57% use them for educational purposes. A 2025 AIIMS Raipur meta-analysis found that children under five spend 2.22 hours a day on screens, double the limit recommended by the World Health Organization (WHO) as well as the Indian Academy of Paediatrics (IAP).
On average, Indians spend 2.28 hours daily on social media, surpassing the global average of 2.09 hours and the US average of 1.46 hours.
Experts argue that India urgently needs a comprehensive national social media policy for children and adolescents. “India risks raising a generation ill-prepared for the challenges of a rapidly evolving digital world, burdened by the mental health effects of unchecked digital use,” says Dr Satish Suhas, department of psychiatry, NIMHANS, Bengaluru. He advocates for a national digital policy focusing on education, digital literacy, and parental training.
Kolkata-based digital creator Bhavya Gandhi says social media is a double-edged sword. “There are many young talents and entrepreneurs who benefit from social media. So it’s important to strike a balance, protecting kids without completely taking away opportunities,” says the 38-year-old mother of two kids, who runs an Instagram account on parenting called ‘thehappysiblingz’ with over 20,000 followers.
Gandhi has restricted her kids’ collaborations and kept them “relevant”, choosing to partner only with brands that genuinely add value, such as stationery or products her kids actually use. “The earnings are minimal and the account is not aimed at building only revenue, but to share the journey of my motherhood, creating positive space for moms and kids to connect and learn,” adds Gandhi.
Sameer Kulkarni, senior VP, IT infra, cloud and security, Decimal Point Analytics, feels “safeguarding young minds without stifling their creativity or access to knowledge, while respecting the dignity of both tradition and innovation”, is the need of the hour.
Many countries are coming to the conclusion that children should either not have access to social media at all, or that restrictions must be in place to protect them from cyberbullying, grooming, coercion, and algorithmic manipulation.
Next month, Australia will enforce its first nationwide social media ban for users under 16 years. The government’s policy compels tech giants like Meta, TikTok, and Snapchat to deactivate accounts belonging to users aged 13 to 15 years, or allow them to ‘freeze’ accounts until they reach the legal threshold.
In India, however, the story is different. Vikram Jeet Singh, a partner at law firm BTG Advaya, who leads the technology, media, and communications (TMC) practice group, feels social media bans are not easy to conceptualise, enforce, or defend in court.
“Mass bans are not common, in part because bans on social media are not easy to conceptualise, enforce, or (for that matter) defend in court. From a legal angle, there can be a few challenges to a blanket ban—are business-oriented applications like LinkedIn or Microsoft Teams included, given they have similar functionalities? If the intent is solely to ‘protect’ children from harmful messages, should they be banned from Gmail too?” questions Singh.
Australia has added Reddit to the list of platforms covered by the ban, but WhatsApp remains outside its purview. “Gaming applications are not banned, even though one could argue the line between gaming and socialising disappeared years ago. So where do we draw the line?” adds Singh.
“Social media is no longer just entertainment, it’s an economic engine. Platforms like YouTube, Instagram, and emerging Indian apps such as Moj and Josh have become career incubators for thousands of young creators. A sudden age ban could disrupt this ecosystem,” says a senior executive at a digital marketing agency in Mumbai on condition of anonymity.
Sahil Chopra, chairman of the Indian Influencer Governing Council, a self-regulatory body, feels, “Banning is an extreme step, too harsh and not the right solution. We should work towards building it in the right manner, encouraging age-based filters and controlled usage.”
New direction
Recently, the Union Ministry of Electronics and Information Technology notified the Digital Personal Data Protection (DPDP) Rules, 2025, which aim to give citizens more control over their data and protect their privacy in the digital space.
The rules provide a framework for social media sites, online gateways, and other organisations that handle personal data, requiring them to give users a clear explanation of what information they collect and how it will be used. They also make verifiable parental consent mandatory for processing the personal data of children under 18 years of age. A parent can authenticate their identity and age through DigiLocker, which then generates an age token that the company can verify to confirm the consent is legitimate, without needing to access the parent’s full personal data.
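In practice, such a consent check can be thought of as a signed token exchange. The sketch below is purely illustrative and assumes a hypothetical HMAC-signed token format, demo key, and function names—neither the DPDP Rules nor DigiLocker prescribe this API—but it shows how a platform could accept a simple yes/no age claim without ever handling the parent’s identity documents:

# Illustrative sketch only: the DPDP Rules describe verifiable parental consent
# via DigiLocker-issued age tokens, but do not prescribe this format or API.
import base64, hashlib, hmac, json, time

SHARED_SECRET = b"demo-secret"  # hypothetical key shared with the token issuer

def issue_age_token(parent_is_adult: bool, child_account_id: str) -> str:
    """Hypothetical issuer side: sign a minimal claim; no personal data included."""
    claim = {"adult": parent_is_adult, "child": child_account_id, "iat": int(time.time())}
    payload = base64.urlsafe_b64encode(json.dumps(claim).encode())
    sig = hmac.new(SHARED_SECRET, payload, hashlib.sha256).hexdigest()
    return payload.decode() + "." + sig

def verify_age_token(token: str, child_account_id: str, max_age_s: int = 3600) -> bool:
    """Hypothetical platform side: confirm the consent claim is authentic and fresh."""
    payload, sig = token.rsplit(".", 1)
    expected = hmac.new(SHARED_SECRET, payload.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        return False  # tampered or forged token
    claim = json.loads(base64.urlsafe_b64decode(payload))
    return (claim["adult"]
            and claim["child"] == child_account_id
            and time.time() - claim["iat"] < max_age_s)

# The platform only ever sees this signed yes/no claim, never the parent's documents.
token = issue_age_token(True, "child-42")
print(verify_age_token(token, "child-42"))  # True

The design point the rules emphasise is data minimisation: the verifying company receives proof that a valid adult consented, not the adult’s underlying identity records.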
Meanwhile, social media companies are also getting their act together. Meta will comply with Australia’s new social media law, despite having earlier raised concerns about its implementation. Facebook and Instagram will need to remove users under 16 or face fines of up to $32.5 million for non-compliance. The company will contact underage users and give them the choice to delete their data or have it stored until they turn 16.
Last year, Meta also launched Teen Accounts to enhance safety for teens on Instagram, Facebook, and Messenger.
“Hundreds of millions use Meta today since the feature has been rolled out globally by Instagram. Teen Accounts limit who teens can interact with, restrict the content they see, and help ensure their time online is positive, giving parents greater peace of mind. These accounts display less sensitive content by default, and teens aged 13-15 require parental approval to change this setting. Content related to violence, sexual imagery, regulated products, cosmetic procedures, or health claims is also harder to find, even when it doesn’t violate community guidelines,” a Meta spokesperson says.
Meta rolled out Teen Accounts in India in February this year, bringing its safety features and age-appropriate experience to the country while giving parents more control and oversight over their kids’ online activity. At a Teen Safety Forum organised by Meta India in Delhi earlier this year, Tara Hopkins, global director of public policy, Instagram, told FE, “Young people deserve safe, age-appropriate online experiences, and these updates are part of our long-term commitment to building platforms that prioritise their well-being.”
Over 54 million teens globally use Instagram Teen Accounts, which were rolled out in September last year. Of these, at least 97% of teenagers aged 13 to 15 have so far stayed within the protective settings designed by the platform.
Most social media platforms have a common threshold of 13 years, the minimum age at which an individual can create an account, aligning with US laws like the Children’s Online Privacy Protection Act (COPPA).
Meanwhile, quick commerce platform Blinkit launched a new parental control feature this year, becoming the first app in its category in India to do so. The update allows users to hide products from categories like sexual wellness, nicotine, and other age-sensitive items behind a secure 6-digit PIN. Announcing the feature on X (formerly Twitter), Blinkit CEO Albinder Dhindsa said the move is aimed at giving families a safer browsing experience. “You can now go into your profile and hide sensitive items behind a PIN and also set up a recovery phone number. This will allow younger ones in the family to browse the app without seeing any age-inappropriate products,” he posted.
Policy & pragmatism
Countries across the world are weighing changes to their social media policies. In October last year, the Norwegian government proposed raising the age at which children can consent to the terms required to use social media from 13 to 15 years. Government data suggests that half of Norway’s nine-year-olds use social media in some form.
In the European Union, parental permission is mandatory for children under the age of 16. France has passed a law requiring social platforms to obtain parental consent before minors under 15 can set up accounts. In April last year, a panel commissioned by President Emmanuel Macron proposed stricter measures, including banning cellphones for children under 11 and internet-enabled phones for those under 13.
Meanwhile, in Germany, minors aged between 13 and 16 can use social media only with their parents’ consent. In 2018, Belgium introduced a law requiring children to be at least 13 years old to create a social media account without parental consent. The Netherlands banned mobile devices in classrooms from January 2024 to limit distractions during lessons, though the country has no law setting a minimum age for social media use.
In Italy, children under the age of 14 require parental permission to set up social media accounts. Denmark’s government has also announced an agreement to ban social media access for anyone under 15.
The hidden cost
Globally, screen time among adolescents has reached record highs. In the US, over half of teenagers spend an average of 7.22 hours daily on screens, according to a survey done by the multinational analytics and advisory company Gallup. Of that, nearly 4.8 hours are devoted solely to social media platforms such as YouTube, TikTok, Instagram, and X (formerly Twitter).
Medical experts say excessive social media use affects kids, creating a constant sense of comparison that slowly chips away at self-esteem.
Dr Achal Bhagat, senior consultant psychiatrist, Indraprastha Apollo Hospitals, stresses the mounting mental health cost of excessive social media exposure. Studies link high screen time to depression, anxiety, insomnia, and cognitive decline, which translates into reduced academic and workplace productivity and an increased burden on healthcare systems.
“The compulsive need to stay updated creates chronic stress,” says Bhagat, adding, “It impacts memory, sleep, and even physical health through sedentary behaviour.” He warns that the illusion of online connection often masks deep psychological disconnection, a phenomenon increasingly seen among Indian teens.
Mumbai-based Dr Amit Malik, founder and CEO of Amaha & Children First, a mental health service for kids, also feels: “The barrage of content lacks nuance, so young people start measuring their worth against unrealistic standards of success, beauty, or popularity. It contributes to loneliness. Many children and adolescents tell us they feel ‘connected’ online but deeply isolated in real life. Of course, there is gendered impact like girls often struggle with body image concerns while boys get pulled into spaces like the hustle culture which reinforce shame, aggression, or distorted ideas of masculinity.”
The mental health burden has financial implications too. A 2024 Deloitte report estimated that India loses nearly $14 billion annually due to mental health-related productivity declines, a figure likely to rise if digital overuse remains unchecked.
Neerja Birla, founder and chairperson, Aditya Birla Education Trust (ABET), feels that parents, educators and society at large need to take an active role in helping young people build a healthy and balanced relationship with technology.
“Social media platforms, content creators and individual users share the responsibility of ensuring that what is shared online is positive, age-appropriate and constructive, as every post influences impressionable young minds. By teaching digital mindfulness and emotional resilience early on, we can help children use technology as a tool for learning and growth rather than comparison,” she adds.
The Aditya Birla Education Trust has embedded mental health and digital literacy within its education ecosystem through Project Oorja.
Some experts believe that the challenge lies in balancing child protection with innovation. A rigid age ban could alienate users and stifle digital creativity, while unregulated access deepens the mental health crisis.
“A potential middle path would help. Practise digital detox weekends, prioritise in-person relationships, and pursue hobbies outside the digital space to re-balance mental and social wellbeing, while key strategies include limiting daily social media time, turning off non-essential notifications, and scheduling device-free hours throughout the day,” adds Bhagat.
According to a 2023 study by the NGO Population Foundation of India (PFI), prolonged social media use among young adolescents is linked to sleep disruption, anxiety and reduced offline social interaction. “In India, where digital access and parental literacy remain uneven, a more balanced approach is needed, combining parental consent, digital-literacy education and stronger platform accountability. Rather than replicate another country’s policy, India should design child-safety measures suited to its own socio-digital realities,” says Sanghamitra Singh, chief of programmes, PFI.
