By Uthara Ganesh
With 888 million broadband subscribers as of October 2023, India is witnessing a digital revolution that will uniquely impact its youngest citizens. Globally, one in three internet users is a child under 18. In India, children are likely to be at the forefront of this trend, exceeding the global average by as much as 7%.
This rapid digital adoption brings with it significant risks. Research by McAfee Corp. highlights that Indian children are among the earliest to achieve digital maturity globally but also face some of the most severe online risks. For instance, 22% of Indian children have encountered cyberbullying, 5 percentage points above the global average. Moreover, Indian children engage in private conversations with strangers at a rate 11% higher than their global counterparts. Snapchat’s own research complicates this picture further. Its latest Digital Well-Being Index revealed that, on average, 69% of parents of teens aged 13-19 were aware of the risks their teens were experiencing online. Yet the findings also showed that as those risks grew more serious, teens were less inclined to tell a parent.
As India digitizes, it’s crucial that we place children’s safety at the centre. We need to get this right from the start. When we introduce children to the digital world, it’s not just about getting them online; it’s about helping to ensure they have a safe and positive experience. This is especially true for children from remote parts of India or from marginalised and vulnerable groups, who are discouraged from participating online and in the digital economy by unpleasant and unsafe online experiences. As the internet grows, the threat landscape will inevitably evolve. Building safe platforms is now both a moral responsibility and a business imperative. Our aim must be to ensure that online interactions are safer than offline interactions.
Historically, tech companies have prioritised rapid product release over a thorough, slower review of safety considerations. This approach has proven detrimental. At Snap, I’ve witnessed first-hand how a robust Safety by Design approach differs from this. This practice involves integrating safety considerations into the design and development of products, systems, and processes from the very beginning. The aim is to proactively address potential safety issues or risks before they become problems, instead of reacting to them after they have occurred. This approach is based on the concept of “choice architecture,” which emphasises providing better choices and information to nudge users towards using platforms safely. For new-age applications serving children, safety and security features should be built into products, such as default privacy features, extra protections for young people, remedial tools, transparent and robust content moderation, and strong and consistent enforcement of product policies. This is the approach that platforms such as Snap have long evangelised and followed, and continue to build on, as the digital world evolves.
Creating tailored design features for those who need them most, such as people with disabilities, women and children, sits at the heart of safety by design. This involves employing approaches suited to the risk profile of each vulnerable group. These could include features that restrict adults from sending private messages to teens who are not connected with them, tools for reporting content, including the detection and removal of Child Sexual Exploitation and Abuse Imagery (CSEAI), moderating illegal content, muting harmful accounts, and blocking unwanted communications.
The design process must be iterative, and informed by user experience and insights. Such findings help inform the development of features that give parents, caregivers, and other trusted adults more insight into any irregular activities by their teens.
While several platforms are embracing the safety-by-design approach, there’s a real need for a broader, collaborative push for its wider adoption. To ensure this, what we need is a whole-of-society approach. First, regulators will need to evangelise safety-by-design principles in a meaningful way. As India crafts key legislation that will determine compliance considerations for platforms over the next several decades, regulators have a unique opportunity to encourage the tech industry to lean into safety by design as a programmatic product consideration. They can also provide incentives through risk-based regulation that makes it easier for platforms to be safer by design. Second, the investor community could play a key role – Australia’s eSafety Commissioner pointed out in 2023 the influence investors have to steer platforms toward ethical design considerations right from the get-go. Third, we need to foster design consciousness amongst technologists, emphasising empathy, creativity, experimentation, and collaboration. This can be done through partnerships among academic institutes, industry, and the government to grow capacity among young talent to build thoughtful and risk-limiting products.
As we mark Safer Internet Day 2024, it’s essential to remember that just as we proactively secure our homes against potential threats, we must also build our online spaces on the foundation of safety. This approach is not just about reacting to risks but about preventing them from arising in the first place.
The author is head of public policy, India and South Asia, Snap Inc