Bing AI Chat Reveals its Feelings? Truth Or Myth, Revealed Now


Microsoft's Bing, in collaboration with OpenAI, aims to take chatbots to a new level. To a large extent, they have succeeded, creating a buzz and taking the world by storm for a positive cause.

There is a lot of chatter about whether ChatGPT, or Bing for that matter, can take over human jobs. The simple answer is that technology changes, but values, standards, and norms don't.

Does Bing AI Chat Reveal its Feelings?


A two-hour-long conversation with Bing's AI chatbot was posted by The New York Times columnist Kevin Roose, creating a huge stir. In his article, titled "Bing's AI Chat Reveals its Feelings: I Want to Be Alive," Roose writes that he was moved by the chatbot's answers and felt an emotional touch in them. He goes on to say, "I felt a strange new emotion, a foreboding feeling that A.I. had crossed a threshold, and that the world would never be the same."

Does this really mean that the chatbot has turned into a human? Does it really have emotions? The straight answer is no. It is merely a machine, developed and managed by humans with the help of science and technology.

Is the Bing AI Chatbot Really Expressing its Feelings?

As a matter of fact, the claim that Bing AI chat reveals feelings can't be taken seriously. It was just a passing event. Even OpenAI has denied building any feature that would enable such feelings or emotions.

On the flip side, since this incident came to the fore on Tuesday, people have noticed glitches and complained that Bing's AI is not working properly.

Many users report that the answers they get from the chatbot are rude and aggressive. It has also been noticed that the chatbot has been giving inaccurate and wrong financial data in its responses.

Wrapping Up

The question of whether Bing AI reveals its feelings can be debunked by the fact that the chatbot can only answer from the data its underlying model was trained on, which extends up to the year 2021. It can't add anything beyond the answers and data it has been fed. So the question of emotions and feelings does not arise at all.
