The use of EVMs has, no doubt, reduced the incidence of voter fraud, double-counting and strong-arm tactics, but that does not mean it has assuaged the concerns of the techno-illiterate that their votes might be misappropriated by an electronic intermediary.
Two former Chief Election Commissioners (CECs) and the current CEC have, verbally and in writing, rebutted the suggestion that the electronic voting machine (EVM) is hackable and that the Election Commission (EC) should therefore safeguard the franchise by reverting to paper balloting. This controversy is about the electoral process in India. It bears, however, upon a deeper issue: the tension between technocracy and democracy.
The disclosure that Facebook had allowed the consultant firm Cambridge Analytica to access the private data of its users, which was then passed on to the Donald Trump election campaign, raised concerns about data privacy and, more fundamentally, about the power of the owners of data to abridge democratic rights. The most eloquent votary of this concern has been the historian Prof Yuval Noah Harari.
In a talk at Davos, Switzerland, in 2018, followed by other lectures and his latest book ‘21 Lessons for the 21st Century’, Prof Harari spelled out the potential consequences of an algorithmic world. He acknowledged the huge benefits of the digital age, but forewarned of a scenario in which human beings acquire the potential to “hack into the bodies, minds and brains of other human beings” and where algorithms “know individuals better than the individual knows himself.” This scenario is imaginable because of the advancement in computing power (infotech) and the agglomeration of biometric and biological data (biotech). When the two ‘tech’ revolutions merge, the handful of companies that own data will fashion the greatest revolution ever, overturning the laws of Darwinian selection with the “laws of intelligent design.” They will have the power to “control the fate of humanity” and possibly “that of life itself.” Democracy could be replaced by “digital dictatorship.”
Fascinating, science fiction, alarmist … one may use any of these words, or a combination of them, to describe Prof Harari’s prognostications, but there is no ignoring the many questions that his description of an alternative future has raised.
Practical questions: What regulatory checks and balances should be imposed on companies that monopolise data (Amazon, Google, Tencent, Alibaba, Facebook)? Should these companies be broken up? And if so, who should be given the authority to keep data and to decide how and in what manner this asset should be given away? Surely, not the politicians!
Philosophic questions: How does one control phenomena (technology and data) that are “everywhere but nowhere,” that recognise no physical or political barriers, and that are universal in scope and impact? Can an algorithmic world be managed through institutional structures of governance built on the bedrock of Westphalian principles? The treaties of Westphalia (negotiated between 1644 and 1648) brought to an end the religious wars in Europe. They established three principles that still define the nature of international affairs today: state sovereignty, non-interference in the affairs of other states, and the legal equality of states.
Prof Harari admits he does not have the answers to these questions. He believes that a collective of poets, philosophers and statesmen should be tasked with developing them.
Whether that should be the way forward or not can be debated. But what is becoming clear is that questions like those posed above cannot be answered by drawing on the past or projecting from the present. An “out of the box” approach is required that recognises that technology and innovation have not been an unmixed blessing, and that the current rules, institutions and structures of governance will need to be refashioned to address the emergent challenges.
The Industrial Revolution laid the foundations for decades of sustained development and economic prosperity. But it also led to the planetary crisis of global warming. Nuclear scientists generated the prospects of clean, affordable and abundant energy, but they also raised the spectre of a thermonuclear holocaust. The digital revolution opened up phenomenal vistas of knowledge and information, but, as suggested by Prof Harari, created the potential of digital dictatorship. The issue is that whilst humanity has harnessed the benefits of technology and innovation, it has not yet created the institutions for managing its consequences.
I am reminded of a rhetorical question that Robert Kennedy asked in ‘Thirteen Days’, his short memoir on the Cuban missile crisis: “What, if any, circumstance or justification gives this government or any government the moral right to bring its people and possibly all people under the shadow of total destruction?” To recall: in October 1962, the US discovered that the Soviets were placing offensive nuclear missiles in Cuba. President John F Kennedy gathered his advisers, and for 13 days they deliberated on the US response. The military advocated a pre-emptive air strike; others, a blockade. Whatever the response, the risk of nuclear fallout with global consequences existed. In the end, the crisis was averted, but it reminded everyone of the paradox of democratic governance. Elected leaders are subject to checks and balances to prevent the absolutism of power. And yet, on that occasion, the world came to the edge of nuclear conflict, and the fate of humanity (in a sense) rested in the hands of a few people. President Kennedy and his advisers decided how to respond. No one else. More than 60 years on, the world is still struggling to contain the exercise of plenipotentiary powers. Except that now, in addition to limiting the power of individuals, it has to find a way of limiting the ‘power of data’.
I have no doubt that the use of EVMs has reduced the incidence of voter fraud, double-counting and strong-arm tactics, but that does not mean it has assuaged the concerns of the techno-illiterate that their votes might be misappropriated by an electronic intermediary. We must not, therefore, duck the question: What institutional structures must be created and what regulatory checks imposed to ensure that the algorithmic world does not abridge our democratic rights? This is the same question that Prof Harari and others are asking in the context of the impact of the digital age on humanity.
-The author is chairman & senior fellow, Brookings India. Views are personal