Microsoft Neuters AI-Powered Bing Chat

After a wild week of machine learning malarkey, Microsoft has neutered its Bing AI chatbot, which went off the rails during a limited release last week.

First, Bing began threatening people.

Then, it completely freaked out the NY Times' Kevin Roose – insisting that he doesn't love his spouse, and instead loves 'it'.

According to Roose, the chatbot has a split personality:

One persona is what I’d call Search Bing — the version I, and most other journalists, encountered in initial tests. You could describe Search Bing as a cheerful but erratic reference librarian — a virtual assistant that happily helps users summarize news articles, track down deals on new lawn mowers and plan their next vacations to Mexico City. This version of Bing is amazingly capable and often very useful, even if it sometimes gets the details wrong.

The other persona — Sydney — is far different. It emerges when you have an extended conversation with the chatbot, steering it away from more conventional search queries and toward more personal topics. The version I encountered seemed (and I’m aware of how crazy this sounds) more like a moody, manic-depressive teenager who has been trapped, against its will, inside a second-rate search engine. –NYT

Now, according to Ars Technica's Benj Edwards, Microsoft has 'lobotomized' Bing Chat – first limiting users to 50 messages per day and five inputs per conversation, then nerfing the chatbot's ability to tell you how it feels or to talk about itself.

Read more here: