Whether or not the Supreme Court reverses its stand and concludes that privacy is a fundamental right remains to be seen. In either event, it is critical to secure the data being given to a plethora of organisations like Apple, Google and Facebook, and to countless apps, and not just to Aadhaar, though the public imagination seems to have become fixated on it. Recognising this, the SC has tasked the Justice BN Srikrishna panel with recommending a framework for data protection. Against this backdrop, a discussion paper by the Bengaluru-based Takshashila Institution lends some perspective on what such a legal framework should look like. The paper posits that the present ‘consent model’, in which the data controller is free to collect, process and use the data once a user has given her consent, does not offer complete, or even adequate, data protection. This could be because the user does not always understand what she is giving her ‘consent’ to, or because the ‘consent’ is forced: most apps cannot be loaded without granting access to your phone-book, and so on. Moreover, with data being collected, processed and used in far too many ways, by far too many platforms, for an average user to comprehend (interconnected databases and machine learning complicate the scenario further), there is consent fatigue.
Takshashila argues that true data protection can be secured through a “rights-based model” built on the accountability of the data controller, the autonomy of the user over her data and the security of that data. The model would guarantee certain broad rights to individuals, with the onus on the controller to ensure these are not violated. Under it, the individual has the right to opt out of the processing of data collected by a controller at any point in time. The controller would have to show the subject, on an interactive dashboard, all the data it has collected or intends to collect, and the ways in which this can be processed. The user can then opt out of data processing for certain categories or for certain outcomes, even as the dashboard informs her of the consequences; for instance, withdrawing consent for location tracking would mean the data subject cannot use the navigation services necessary for, say, Uber. The data controller must also assure the security of the data collected, and must share with the subject what use the data has been put to and with whom it has been shared. While it is difficult to argue that the data controller must compensate the subject if there is discrimination due to data use (journalists’ data may suggest to bankers, for instance, that they don’t repay loans on time), the overall approach is a sound one.