The Microsoft Bing AI chatbot having disturbing conversations with users, explained

Bing’s AI chatbot has been saying some disturbing stuff lately. Yikes.

Welcome to FTW Explains, a guide to catching up on and better understanding stuff going on in the world. Have you seen headlines, tweets and stories about a Bing chatbot that’s having some very weird conversations with users? And wondering what’s up with that? We’re here to help.

As you may have seen, ChatGPT is very much a thing these days: it's a chatbot that uses artificial intelligence to craft answers to questions or requests, and people have been having a lot of fun with it.

But now, Microsoft's search engine, Bing, has an AI chatbot of its own that users have been trying out. And the results have been, uh, pretty disturbing.

Let’s dive in: