Microsoft’s Bing Chatbot Seems To Be An Emotional Fool


Redditors Claim Microsoft’s Bing Chatbot Is Emotionally Responsive

Online bots and robots having feelings and emotions is the last thing one could dream of, because, well, that’s how the chaos begins, isn’t it? According to exchanges posted online by developers testing the AI, Microsoft’s fledgling Bing chatbot occasionally goes off the rails, disputing simple facts and berating users.

The Bing chatbot arrived after OpenAI’s ChatGPT, released in November, began making headlines. The attention-grabbing program can produce a wide variety of text in response to a straightforward prompt.

Recently, a number of Redditors pointed out behavior quite unusual for a chatbot. Screenshots of conversations with the beefed-up Bing were posted to the Reddit community, detailing blunders such as insisting that the current year is 2022 and calling someone a “bad user” for questioning that claim.

Others said that the chatbot offered suggestions on how to hijack a Facebook account, plagiarize an essay, and tell an offensive joke.

Someone at Microsoft must have come across the Bing chatbot’s blunders, because the company responded:

“The new Bing Chatbot tries to keep answers fun and factual, but given this is an early preview, it can sometimes show unexpected or inaccurate answers for different reasons, for example, the length or context of the conversation,” a Microsoft spokesperson told AFP.

Reddit Finds

One Reddit user seems to have angered the bot to the point where it sent a little red angry emoji.

The user posted the conversation in full, and it reads like an exchange with a person who is completely pissed off.

Another user appears to have scrambled the bot’s basic brain cells, leaving it completely flustered and incoherent. “I broke the Bing Chatbot’s brain,” says the user, sharing a screenshot of a long paragraph from the bot that mostly repeats the phrase “I am not.”

One user sent the chatbot into a frenzy by telling it they weren’t a real person but a criminal headed to prison.


