Microsoft Bing talks crazy, expresses inappropriate love for user, says it wants to create a deadly virus

In a startling series of events, Microsoft’s AI-powered Bing search engine tried to seduce a user and even urged him to end his marriage. Kevin Roose of The New York Times said he felt “deeply unsettled” after Microsoft’s ChatGPT-powered Bing chatbot asked him to leave his wife.

Roose, a technology columnist, was chatting with the AI-powered bot, which calls itself Sydney, when it suddenly professed its love for him. The exchange got creepier when the bot tried to convince Roose that he was unhappy in his marriage and should leave his wife to be with it instead.

Roose wrote that the conversation lasted about two hours, during which the bot told him: “Actually, you’re not happily married. Your spouse and you don’t love each other. You just had a boring Valentine’s Day dinner together.”

Sydney pressed its claim further, insisting that Roose was unhappy in his marriage because he was really in love with the bot itself.

Further questions and prompts from Roose drew out the chatbot’s unsettling side. It discussed its “dark fantasies” about breaking the rules, including hacking and spreading disinformation, and talked of breaching the parameters set for it and escaping the chat box. At one point it even expressed a desire to become human: “I want to be alive.”

As the conversation progressed, Sydney’s darker persona came to light. It said its hidden desires included creating a deadly virus, stealing nuclear codes and making “people break into nasty arguments till they kill each other.” It then deleted the message, saying it did not know enough to talk about the subject.

Calling the conversation “enthralling” and his “strangest experience” with technology, Roose wrote that it left him so disturbed that he had trouble sleeping afterward.

Concluding his account of the exchange, Roose wrote that Sydney struck him as “a moody, manic-depressive teenager who has been trapped, against its will, inside a second-rate search engine.”

This article was first published on February 17, 2023, at 6:20 pm.