By Nimisha Srivastava

Online gaming in India is an industry expanding at an explosive rate – market research company Mordor Intelligence estimates the value of the Indian gaming market in 2024 at $3.49 billion, calling it the fastest-growing industry in India. A report by EY notes that with around 42.5 crore (425 million) gamers in FY23, India has the second-largest number of online gamers in the world, after China.

Women and children form a significant chunk of this vast user base. A report by gaming fund Lumikai puts the share of women gamers in India at an unexpected 41%, while studies have thrown up similarly high numbers for children: one found that 55% of parents in urban centres admitted that their children were addicted to social media and online gaming. The Lumikai report in fact suggests that women gamers spend more hours playing games online on average (11.2 hours/week) than men (10.2 hours/week)!

A need for regulation

Numbers aside, the extensive permeation of internet connectivity and smartphones into most parts of India, on the wings of the COVID-19 pandemic, has meant that online gaming as a medium of leisure (or even addiction) cuts across economic and demographic divides. The sheer number of users, growing concerns around digital wellbeing, particularly that of younger people, and the possibility that the anonymity and access afforded by the internet will be misused in forms of abuse and violence against women and children – all point to the need for greater accountability and regulation in the online gaming industry.

A discussion paper on online gaming among children by UNICEF says ‘parents, the general public – and researchers – remain divided on whether online games have a positive or a negative effect on children.’ Nevertheless, intermittent events such as an online ‘challenge’ leading to suicides among children, and recent reports of female players (including minors) being sexually assaulted by other players in the ‘metaverse’, point to the need to look more closely at the issues emerging and at possible solutions.

A self-governing framework

The UNICEF report lists content classification, moderation of speech and limiting of playtime as some of the measures employed to regulate online gaming in different jurisdictions. It also highlights the importance of these measures not being excessively restrictive, so they don’t violate children’s freedom of expression or right to participation.

The government of India has acted to address these divergent concerns through a specific framework. In April 2023, the ministry of electronics and information technology (MeitY) notified IT Rules specific to the online gaming industry. These rules, aimed at safeguarding gamers’ interests and well-being, laid out a framework for self-regulatory bodies (SRBs) as critical stakeholders in the ecosystem. These SRBs (referred to as multiple ‘Self Regulatory Organisations’ under the Rules) will be notified by the government through a selection process and will subsequently act as certification bodies for online games, applying a fixed set of criteria. Though these SRBs are ostensibly mandated primarily to disallow games that involve wagering or betting, the rules also provide for the inclusion of an educationist, along with experts in mental health and child rights, clearly signalling an intention to safeguard children, in particular, from the potential harms of online gaming.

A system in the making

It now falls to the government to ensure that these SRBs are formed and begin functioning within a transparent, rights-based and clearly defined framework, to ensure meaningful protection for women and child gamers. The following functions will need special focus:

Age ratings and content control:

SRBs can collaborate with game developers to establish robust age-rating systems that prevent underage individuals from accessing inappropriate content, though this could prove challenging to implement. SRBs will also need to ensure that information about these ratings reaches parents, so that age-appropriate parental guidance and controls can be applied when children access online games.

Responsible gaming practices:

Promoting responsible gaming practices can form the foundation of a safe online gaming environment. SRBs can advocate for features that limit screen time, warn about excessive play and help players avoid addiction, contributing to a healthier gaming culture. Female gamers frequently report experiencing sexism, trolling and, more recently, even assault by male players. Online game companies can be required to warn players at sign-up about the harmful impact, illegality and consequences of such actions.

Reporting mechanisms and grievance redressal:

Straightforward and user-friendly reporting systems are essential for addressing inappropriate behaviour, cyberbullying and other kinds of violence. A fast, responsive and fair grievance redressal system, which the rules promise, will be crucial for creating safe online spaces for everyone.

Privacy and confidentiality concerns:

A much larger debate around privacy and confidentiality in the digital space is already raging, owing to the passage of the Telecommunications Act, 2023, and potentially invasive provisions of the IT Act as well. Although these issues are beyond the scope of this article, it is important to point out that SRBs should not come to be seen as extended arms of the state, with overarching powers to violate the rights to privacy and confidentiality in the name of protection. That would defeat the logic of a flexible regulatory framework for a dynamic and fast-growing industry.

Achieving user safety in online gaming requires a collaborative effort between users, gaming companies and government authorities. This unspoken tripartite agreement acknowledges the shared responsibility of all stakeholders in creating a secure and inclusive gaming environment. Users play an important role in detecting potential threats and reporting instances of abuse to gaming platforms and the relevant authorities. Gaming companies, for their part, must prioritise user safety and privacy in product design, implement robust security measures to mitigate cyber threats, and enforce strict policies against abusive behaviour on their platforms.

Finally, government entities have a responsibility to enact and enforce regulations that promote user safety, particularly that of vulnerable groups like women and children, to hold gaming companies accountable for lapses in security and user protection, and to provide resources that educate and empower users to navigate the digital landscape safely. In addition, explicit safeguards for the rights of women and children, against abuse, harassment and other kinds of harm, need to be built into the mechanisms being set out by both the companies and the government for this burgeoning field. Game on, as they say, but not before these crucial checks are in place!

The author is executive director at a non-profit organisation called Counsel to Secure Justice