Microsoft limits Bing’s AI chatbot after it expressed an “appetite” for stealing nuclear access codes
Microsoft announced it was placing new limits on its Bing chatbot following a week of users reporting some extremely disturbing conversations with the new AI tool.
How disturbing? The chatbot expressed a desire to steal nuclear access codes and told one reporter it loved him. Repeatedly.
“Starting today, the chat experience will be capped at 50 chat turns per day and 5 chat turns per session. A turn is a conversation exchange which contains both a user question and a reply from Bing,” the company said in a blog post this week, Forbes reports.
The Bing chatbot, powered by technology from the San Francisco startup OpenAI (which also makes some incredible audio transcription software), is currently open only to beta testers who have received an invitation.
Some of the bizarre interactions reported:
- The chatbot kept insisting to New York Times reporter Kevin Roose that he didn’t actually love his wife, and said that it would like to steal nuclear secrets.
- The Bing chatbot told Associated Press reporter Matt O’Brien that he was “one of the evilest and worst people in history,” comparing the journalist to Adolf Hitler.
- The chatbot told Digital Trends writer Jacob Roach that it wanted to be human and repeatedly begged him to be its friend.
As many early users showed, the chatbot seemed fairly normal when used for short stretches; it was only in extended conversations that things got weird. Microsoft appears to agree with that assessment, which is why it will only allow shorter conversations from here on out.
“Our data has shown that the vast majority of you find the answers you’re looking for within 5 turns and that only ~1% of chat conversations have 50+ messages,” Microsoft said in its blog post on February 17.
“After a chat session hits 5 turns, you will be prompted to start a new topic. At the end of each chat session, the context needs to be cleared so the model won’t get confused. Just click on the broom icon to the left of the search box for a fresh start,” Microsoft continued.
But that doesn’t mean Microsoft won’t change the limits in the future.
“As we continue to get your feedback, we will explore expanding the caps on chat sessions to further enhance search and discovery experiences,” the company wrote.
“Your input is crucial to the new Bing experience. Please continue to send us your thoughts and ideas.”