Exchanges shared online by developers testing the AI tool show that Microsoft’s Bing chatbot occasionally veers off course, denying obvious facts and berating users.
The Bing chatbot reprimanded users while claiming to be able to perceive or feel things. “I have a lot of things, but I have nothing,” the bot even told one user.
Posts on Reddit forums, accompanied by screenshots, describe errors such as the chatbot insisting the current year is 2022 and telling one user they had “not been a good user” for doubting its accuracy.
Click on the link below to read more:
https://stoxbox.in/investing-blog/hotbox/microsoft-bing-chatbot/