By K Yatish Rajawat
The Draft Digital Personal Data Protection (DPDP) Rules of 2025 mark a milestone in India’s journey towards establishing a robust data privacy and protection framework. However, the draft fails to understand the environment and the stakeholders it targets. Hence, it will fail in its fundamental objective of protecting citizens and digital entities and their privacy. The entities the DPDP Act targets are complex adaptive platforms that have embedded artificial intelligence (AI) in almost all their processes. Expecting these entities to adhere to static regulations or the bureaucratic standards of regulatory bodies is a mistake.
This is not the first time policymakers have made this mistake; a similar snafu occurred with the guidelines on dark patterns. The Centre for Innovation in Public Policy (CIPP) had warned that those guidelines would fail from the day they were announced, as they did not account for the ability of digital platforms to bypass them. The same regulatory approach has been followed in the DPDP Act, which presumes that the policy will be enforced through a structure of regulators, fines, and penalties, when we have seen that this system does not work. Violations are rapid, adaptive, and difficult to detect and report, except in the case of breaches, where the onus of reporting lies primarily with the platform itself.
Therefore, the approach itself has to change. Moreover, protection has to be defined in the regulation: what is the regulatory framework trying to protect, from whom, and why? The interpretation of protection is limited to external misuse or data breaches, and excludes abuse arising from the hoarding and monopolising of data. This narrow definition of data misuse seems to have been orchestrated by big tech firms, which do not want to acknowledge the role of data, such as citizen data, in maintaining their monopolies and the abuse that flows from them. For example, because data ownership is not linked to individuals, citizens are spammed and bombarded with real and fraudulent messages on their phones, emails, and online. This happens because platforms have usurped citizens’ personal data and claim it as their own. The search engine, for instance, treats your search records as its own data and uses them to build a psychographic profile of you, which is then used to serve you search advertising. The same is done with email: if a citizen uses a public email service, every message is read and stored, and is not easily portable if one wants to shift providers. This is a serious breach of privacy, but it is not covered in the DPDP Act because data ownership is not clearly defined from an individual’s perspective.
This definition becomes even more critical in the case of a corporation, which is legally a person. A seller on an e-commerce platform is not given access to data about its buyers, and hence is never able to fully profile them. The e-commerce entity, meanwhile, holds a complete profile of those consumers and uses it to launch a white-label brand that matches the seller’s pricing and targets the same consumers, destroying the seller’s business with its own data. This kind of data misuse by platforms is not defined in the draft.
The ministry of commerce and industry has pointed out this definitional gap several times. The government even created the Open Network for Digital Commerce (ONDC) to ensure that sellers have some control over data and its portability. But the DPDP Act ignores the ONDC as a platform, and its role in protecting the data privacy of consumers and sellers on e-commerce platforms.
Similarly, the DPDP draft ignores the efforts of the Central Consumer Protection Authority (CCPA) to curb mis-selling and the misuse of platforms to force buying decisions. These practices are called dark patterns, and the government has issued guidelines asking platforms to refrain from them. But platforms have ignored the guidelines because the CCPA has not clearly defined the penalties or fines. Moreover, the root of dark patterns lies in the data produced by consumers. The CCPA cannot touch data while the DPDP law is still in the process of defining it. Yet the DPDP Act does not even mention the misuse of data to create dark patterns, nor does the draft mention the data sovereignty of citizens.
What role non-corporate algorithms will play in the use and misuse of data is another aspect not covered in the DPDP Act. The largest AI engine resides on a foreign platform that has scraped data from across the internet and is now misusing it on a global scale. There are many such AI platforms scraping internet data, user data, and creative output. Will the DPDP Act ignore this?
The portability of personal and private data also has to be defined; without it, there is neither privacy nor choice. The concept has been addressed in several policies, including the telecom policy, which established number portability as a consumer choice. The DPDP Act neither defines nor addresses data portability, and hence ignores a crucial duty of data fiduciaries: storing data in a format, and through a process, that makes it portable.
In addition to consent and security, the draft needs to define the rights of individuals over their data. How will users be able to port their data from one e-commerce platform to another? Will the platform allow this, or will it hoard the data, preventing the entry of new players? This is the most crucial question, and again it has not been answered in the draft of the DPDP Act. The CIPP has been advocating the data sovereignty of individuals, and this Act is the perfect place to define it.
The writer is Founder, Centre for Innovation in Public Policy.
Disclaimer: Views expressed are personal and do not reflect the official position or policy of FinancialExpress.com.