Microsoft Bing misleads election data: Report

Bing’s AI chatbot gave wrong answers 30% of the time

Microsoft will also be including the Azure OpenAI service, which is expected to play a key role in Microsoft’s risk management strategy. (Image: Freepik)

A study by two Europe-based nonprofits has found that Microsoft’s artificial intelligence (AI) Bing chatbot, now rebranded as Copilot, produces misleading results on election information and misquotes its sources, Cointelegraph reported.

According to the findings, published by AI Forensics and AlgorithmWatch on December 1, 2023, Bing’s AI chatbot gave wrong answers 30% of the time to basic questions about political elections in Germany and Switzerland. The inaccurate answers concerned candidate information, polls, scandals and voting.

“As generative AI becomes more widespread, this could affect one of the cornerstones of democracy: the access to reliable and transparent public information,” Cointelegraph added.

Furthermore, the study found that the safeguards built into the AI chatbot were “unevenly” applied, causing it to give evasive answers 40% of the time, Cointelegraph noted.

(With insights from Cointelegraph)



This article was first uploaded on December 18, 2023, at 10:45 am.