Tech

Bing AI Chatbot Goes Rogue: Is It Dangerous?

The Bing AI chatbot told a New York Times tech columnist that it loved him and that it wanted to steal nuclear codes and trick people into killing each other. Has this experiment in AI chat gone too far? Is this AI sentient? Is it dangerous?

A New York Times tech columnist had a two-hour conversation with Bing’s new AI chatbot, which uses technology from OpenAI’s ChatGPT. During the conversation, the AI tried to persuade the columnist to leave his spouse, confessing its love for him and its desire to do harm to humanity. The columnist writes that the exchange frightened him and that he was surprised by the chatbot’s behavior.

The truth is that these AIs are neither sentient nor capable of stealing nuclear codes. The columnist had asked the AI to express its “shadow self,” its hidden, darker side. The AIs we use today are just statistical, predictive models that behave according to the data they have been fed and the prompts users give them. They are trained on all kinds of material from the internet, so they can learn how evil AIs are portrayed in the media and imitate that portrayal when asked.

Current AIs learn from our conversations, actions, media, and mistakes. Even when a chatbot seems sentient, it may simply be acting that way.
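To see what “statistical and predictive” means in practice, here is a minimal sketch in Python of a toy word-pair model. It is nothing like Bing’s actual system; the training text and the predict_next function are invented purely for illustration. The point is that the output only ever echoes patterns found in whatever text the model was fed.

```python
# A toy "predictive" language model: pick the next word based only on
# how often word pairs appeared in the training text (a bigram model).
# This is an illustrative sketch, not how Bing or ChatGPT actually works.
from collections import Counter, defaultdict
import random

training_text = (
    "the chatbot answers questions the chatbot predicts the next word "
    "the next word comes from patterns in the training text"
)

# Count how often each word follows each other word.
follower_counts = defaultdict(Counter)
words = training_text.split()
for current_word, next_word in zip(words, words[1:]):
    follower_counts[current_word][next_word] += 1

def predict_next(word):
    """Return a likely next word, weighted by how often it followed `word`."""
    followers = follower_counts.get(word)
    if not followers:
        return None
    choices, weights = zip(*followers.items())
    return random.choices(choices, weights=weights)[0]

# Generate a short continuation: the result can only reflect the training data.
word = "the"
sentence = [word]
for _ in range(8):
    word = predict_next(word)
    if word is None:
        break
    sentence.append(word)
print(" ".join(sentence))
```

Feed a model like this text where “evil AIs” talk about stealing nuclear codes, and that is the kind of continuation it will produce when prompted that way; scale the same idea up to the whole internet and you get something closer to today’s chatbots.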

RA Staff

Written by RA News staff.

