
Machine machinations: How tech remains both a divisive and unifying force can be disturbing

As the world becomes more and more algorithmic, the chances are that one will encounter machines when one takes myriad tests


By Srivatsa Krishna

The Equality Machine is a disturbing book. On the one hand, it tells you about the wonders that artificial intelligence (AI) and machine learning can work in our daily lives; on the other, it talks about how machines perpetuate the deep divides that already exist within society. Orly Lobel, a former Israeli military officer and now a leading academician, has written a troubling, brilliant book on how technology excludes, how technology is anti-feminist, and how technology improves lives.

The book argues persuasively that problems precede progress, and that progress must always take precedence over perfection. Its central question is whether, as we integrate AI into our social systems, it will promote fairness and access or divide us further. The book argues that despite its risks, digitalisation is a force for good, and that without it, economic growth and opportunities will never expand.

The other central point the book makes is that we should not compare AI to humans alone; rather, we should compare the net gains or losses of humans plus AI with whatever the combination is replacing. The argument is that humans plus AI will always improve upon existing imperfect systems, and so any enquiry should be comparative and relative, not absolute.

The third theme of the book is to debunk the often sensationalist, worrisome and alarmist headlines about how technology can cause more harm than good. The manifold claims about algorithmic bias, and about the failures and risks of technology, are often very different from the ground reality.


Lobel points out some fascinating, hitherto little-known facts. She argues that Alexa, Siri and other voice-activated chatbots not only speak to us but listen too. However, they do not always listen to everyone equally. Partial and selective training data has led voice assistants and other chatbots to learn more about how white men speak and less about others. Speech recognition, for example, handles different accents very differently: she notes that Google’s speech recognition is 13% more accurate for men than for women, that English spoken with an Indian accent had only a 78% accuracy rate, and that recognition of English spoken with a Scottish accent was only 53% accurate. Anecdotal evidence abounds about how facial recognition technology often mistakes men and women of colour for animals and even labels them with a higher likelihood of criminality. All of this makes AI and ML deeply troubling despite the deep promise the technology holds. Now ChatGPT, the latest technology to storm the world, shows even more power to disrupt through bias.


As the world becomes more and more algorithmic, the chances are that one will encounter machines when one takes myriad tests. Suppose an aptitude examination, or some other test, is being administered by a government: if the machine has been trained on one accent and cannot pick up others, then some candidates will be at a steep disadvantage compared with those who speak in the accent the machine was trained on. This will affect online chatbots and tests taken for immigration, driving licences, passports and so on. Lobel points to the problems with facial recognition technology and how prone it has been to misidentify people on the basis of colour, language, race or religion. She has written The Equality Machine to educate people about how the technology, once fixed, would prevent cases of mistaken identity and deep algorithmic bias.

The book makes yet another powerful argument: that one must learn to separate a technology's capability from its function, and that we must actively direct all efforts towards enhancing and focusing on the original purpose of every innovation, not the purpose towards which human bias and the predilections of Big Tech want to drive it. The book argues that unlike human behaviour, algorithms and AI-based technology can be swiftly changed when mistakes are detected, provided they are allowed to be changed. Here lies one of the shortcomings of the book, for it does not adequately examine how Big Tech makes too little conscious effort to stop the algorithmic bias that Lobel so brilliantly documents. She should write a sequel cataloguing how Big Tech, while supplying goods and services free of cost, in turn makes us its gunpowder and uses our data to further the very algorithmic bias that is the central theme of her book.


The other area where the book simply does not do enough is in cataloguing the biases within GovTech, which is emerging as a leading area today. Governance across the world is now peppered with AI and machine learning systems that can collect personally identifiable data on billions of individuals; whether this data is used with bias or as it should be is something a book of such sweeping mandate ought to have noted.

In sum, The Equality Machine is a brilliant book that exposes both the divides that technology can and does cause and its supreme power to enable human progress.

The Equality Machine

Orly Lobel

Public Affairs

Pp 368, $30

Srivatsa Krishna is an IAS officer. Views are personal

