By Disha Verma and Prateek Waghre

We’ve all experienced deceptive user interface designs on apps, websites, and platforms, or in advertisements, intended to subvert user autonomy and influence decision-making. These include incessant spam notifications, minuscule ‘close ad’ buttons, sneaking items into carts, auto-checking paid products or services, making it very difficult to cancel subscriptions or accounts, and more. Such practices are called ‘dark patterns’, and they come in all shapes, sizes, and degrees of intrusion and harm.

The rapidly accelerating reliance on such deceptive designs has evoked a response from the consumer affairs ministry, which published a set of draft guidelines to restrict their usage under the Consumer Protection Act, 2019. While a framework comprehensively recognising, restricting, and penalising the use of dark patterns is welcome, there are some roadblocks. Regulators and policymakers need to demonstrate a deep and nuanced understanding of the issue, along with an appetite to bolster consumer feedback and redress mechanisms for effective governance of the space. To that end, we pose four questions that must be considered before the draft Guidelines are adopted.

Who will regulate dark patterns?

Dark patterns are used across an array of sectors (payments, insurance, e-commerce, social media, travel, entertainment, and so on). Sectors like insurance and advertising have their own legal frameworks that penalise the use of dark patterns: IRDAI prohibits travel portals in India from selling insurance as a default option, and ASCI has its own guidelines regulating deceptive patterns in advertising. Competition law also governs ‘unfair trade practices’, the category under which the Guidelines classify dark patterns.

Given this regulatory overlap, it is imperative to clearly define jurisdiction. Sectoral regulators will want to govern their spaces narrowly and undertake assessments as necessary, independent of a broad consumer law, all the more so as dark patterns evolve in form and fashion. The Guidelines must clearly set out the ministry’s jurisdiction, which is consumer facing, and distinguish it from sectoral regulation, which is market facing. Sectoral regulators can undertake compliance assessments of market players, while the Guidelines can provide recourse for aggrieved consumers to report dark patterns. As the Guidelines apply in addition to existing sectoral laws, regulators should remove any ambiguity or possibility of dual penalties, as that may hurt design innovation and creativity or affect the personalisation preferences of consumers.

Additionally, the Guidelines seek to govern a space that has hitherto been self-regulating. Organic competition and market forces drive design innovation online. Advertisers subscribe to the ASCI code of self-regulation and set legal-ethical standards for themselves. This environment is visibly skewed more towards the interests of its players than those of the consumer, so an intervention is appropriate and timely. However, it has to be staggered.

The answer lies in striking a balance between regulatory oversight and self-regulation among market players. Companies can continue setting standards for themselves, but the Guidelines can add a layer of consumer accountability. They can educate the consumer on what a dark pattern looks like and construct channels to report them, thus avoiding the need to be too prescriptive about the actual design elements that companies use.

What will they regulate?

The Guidelines define and illustrate 10 categories of dark patterns and are confined in application to these. Such categorisation does not cover the full extent of deceptive designs and practices prevalent today. Other practices include obstruction, social proofs, psychological pricing, growth hacking, linguistic dead-ends, roach motels, and privacy zuckering — and the list is ever-expanding. IFF conducted a survey in which people reported 110 instances of dark patterns across apps and sites, and found that a large number would not fall into any of the listed categories. The Guidelines should not prescribe exhaustive categories of dark patterns; such lists should merely be illustrative, helping consumers identify and report them. They must also be updated through consumer feedback to reflect evolving trends.

What about consumer privacy?

Any intervention against dark patterns also needs to address the plethora of privacy risks associated with them. The collection of excessive and unnecessary personal data through deception and without informed consent is a grave threat, and one that will not be mitigated by the implementation of the Digital Personal Data Protection Act, 2023 alone.

The first step to plugging this gap is making the consumer aware of privacy harms caused by dark patterns. The Guidelines could include additional categories of dark patterns that compromise consumer privacy (such as privacy zuckering, nagging, privacy maze, bait and switch) and provide illustrations or steps on how to recognise them.

In addition, the ministry can suggest standards for market players engaging in data collection, or require them to set standards themselves, on how to minimise the data they collect from consumers and process it transparently. Specific articulation and actual implementation of universal principles like ‘privacy by design’, which encourages data protection through inherent technology design, are also required to ensure data minimisation and mitigate risks to consumer data.

How will contraventions be reported?

The Guidelines lean on consumers reporting instances of dark patterns in order to penalise the market players deploying them. At this stage, they neither lay down a grievance redress mechanism nor define thresholds for penalties. They simply state that using any of the dark patterns listed in the Guidelines will be a broad contravention of the Consumer Protection Act, 2019. Regulators must set clear parameters and thresholds for contraventions, as penalties under the Act can be criminal.

Any penalties need to be proportional to the nature of the deception and the extent of user harm, and not just based on the size or market share of the market player causing it, since smaller or newer entities can also infringe on consumer rights. This could, however, be challenging, because the institutional capability to determine the extent of user harm must also be developed in this emerging context.

The regulation of dark patterns is fundamentally rooted in reporting and feedback mechanisms. The regulators must strengthen existing mechanisms and grievance redress channels established by the Consumer Protection Act, 2019. Without an accessible and proactive feedback mechanism, these Guidelines will lose their teeth.

Regulatory interventions to curb dark patterns are still at a nascent stage. The Guidelines and any follow-up interventions will need to balance regulatory overlap and an evolving scope in a dynamic space, with competition, innovation, and consumer interest as priorities. Success is far from guaranteed, but they can shift incentives towards the consumer interest.

The writers are, respectively, associate policy counsel and policy director at the Internet Freedom Foundation