Microsoft muzzles its new Bing AI chatbot for inappropriate responses

Chatbots are quickly becoming the way of the future, yet they still have issues.

Microsoft is the latest tech company to run into problems with its new Bing search engine, which uses the same technology as OpenAI's viral chatbot, ChatGPT. The technology is meant to answer people as a human would, but Microsoft is now putting caps on its capabilities.

Microsoft Bing is a web search engine owned and operated by Microsoft (pretty much its own version of Google). It works like any other search engine: you type in a query and get results, including articles, images, videos, shopping, maps, and more.

Now, Microsoft has introduced a new Chat option where you can ask Bing a question, and it will give a more exact, typed-out answer rather than feeding you multiple articles for you to read on the topic.

For example, if you’re looking to make a 3-course meal with no nuts or seafood, you can simply type, “I need to throw a dinner party for 6 people who don’t eat nuts or seafood. Can you suggest a 3-course menu?” and the search engine will give you a list of options you can make with suggestions for appetizers, main courses, and dessert.

Anyone can use the chat feature by joining what Microsoft calls “the new Bing.” You can request access by going to Bing.com and selecting “Join the waitlist.”

When you have cleared the waitlist, you will receive an email letting you know that you can now access the new Bing at Bing.com. Once you have access, you can start typing in your usual search box, and Bing will give you detailed answers.

The new Bing has reportedly been having some malfunctions since its initial release. Many new users got excited and wanted to see how long they could converse back and forth with the search engine, and these longer conversations began to overwhelm it.

Some people posted screenshots of their conversations to social media, showing how the new Bing was convinced that the year was 2022 and not 2023 and would gaslight users by saying things like “Please don’t doubt me” and “I’m Bing, I know the date.”

Other people have found the chatbot’s answers amusing. However, since Microsoft has reportedly invested around $10 billion in this new way of communicating, the company is now setting limits to make sure the technology actually works as it is supposed to.

Microsoft noticed that the new Bing only acted inappropriately when conversations with users went on for too long. Because of this, the company is implementing limits on how many questions you can ask.

The new Bing can now answer up to five questions per session and 50 questions per day. This means you can ask it five questions on the same topic before you have to start a new session.

The company says that the chatbot is still very much a work in progress and that current users are helping them to improve the technology so that it can be more reliable in the future.

For some insight into AI, I recently interviewed ChatGPT as if it were a human, and some of what it had to say gave me chills. Check out the full conversation by heading over to CyberGuy.com and searching “here’s what the AI had to say that gave me chills.”

Have you tried the new ChatGPT or Microsoft Bing yet? We want to hear about your experience.
