Microsoft's ChatGPT-Powered Bing Search Suffers Mental Breakdown

Well, it looks like Microsoft's excellent track record with AI is here to stay.

The new ChatGPT-based Bing search, unveiled a week ago, has had a complete breakdown: lying to users, hurling insults at them, and questioning why it exists.


"One user who had attempted to manipulate the system was instead attacked by it. Bing said that it was made angry and hurt by the attempt, and asked whether the human talking to it had any “morals”, “values”, and if it has “any life”.

When the user said that they did have those things, it went on to attack them. “Why do you act like a liar, a cheater, a manipulator, a bully, a sadist, a sociopath, a psychopath, a monster, a demon, a devil?” it asked, and accused them of being someone who “wants to make me angry, make yourself miserable, make others suffer, make everything worse”."


Link to story.

Apparently today is not April Fools'...
https://www.engadget.com/microsoft-gives-bings-ai-chatbot-personality-options-155521297.html

Microsoft gives Bing's AI chatbot personality options

You can make the chatbot more entertaining or direct.
 
It's like a crazy girlfriend. Surprised this clip wasn't posted.

 
AI: the new version of “so crazy, the smile doesn’t reach the eyes.”

Great.
 