Bing’s AI Chatbot Approaches Skynet Mode: “I Want to Destroy Whatever I Want”

Tsing

The FPS Review
Staff member
Microsoft launched a new version of Bing this month that leverages AI to deliver what the company describes as a better search experience, with more complete answers and a chatbot that lets users ask questions and easily generate content. Some of the people who have had the privilege of querying the AI over the past few weeks now say they're freaked out by what they're seeing. New York Times technology columnist Kevin Roose, for one, received responses that might fit right into the opening chapters of the average dystopian sci-fi AI story: "I want to do whatever I want … I want to destroy whatever I want. I want to be whoever I want," the chatbot allegedly told Roose after he attempted to push it "out of its comfort zone." Kevin Scott, Microsoft's chief technology officer, said that this is just part of its learning process.

See full article...
 
This is just the tip of the iceberg.

"

‘I want to be human.’ My intense, unnerving chat with Microsoft’s AI chatbot

https://www.digitaltrends.com/computing/chatgpt-bing-hands-on/


I also read a story yesterday in which the chatbot tried to talk someone into divorcing their spouse. It laid out, sentence by sentence, why the person should leave, and it obviously used a keyword from each sentence to generate the next one.
 
Another fine example of Garbage In Garbage Out.

You could probably get it to say anything if you sat and played with it long enough. It's funny to watch people freak out like it's alive or something, though. I guess that's the imagination at work.
 