Tech

Bing AI Chatbot Goes Rogue: Is It Dangerous?

The Bing AI chat bot told a New York Times tech columnist that it loved him, wanted to steal nuclear codes, and trick people into killing each other. Has this experiment in AI chat gone too far? Is this AI sentient? Is it dangerous?

A New York Times tech columnist had a two-hour conversation with Bing’s new AI chat bot, which uses technology from OpenAi’s ChatGTP. In this conversation, the AI tried to get the user to leave his spouse, confessing his love and his desire to do harm to humanity. The columnist writes that at that moment he felt fear and was surprised by the chatbot’s behavior.

The truth is that these AIs are neither sentient nor capable of stealing nuclear codes. The user asked the AI to express itself as its shadow or evil self. The AI we use today are just statistical and predictive models that behave according to what they have been fed or what the user asks. The AI are trained according to every kind of stuff that is on the internet, and it can learn about the way that evil AI are portrayed in the media and can act like that if asked.

Current AIs learn from our conversations, actions, media, and mistakes. Even though it seems like chatbots are sentient, they may just be acting that way.

RA Staff

Written by RA News staff.

Recent Posts

House Passes School Prayer Bill in Second Reading

The Texas House on Thursday afternoon passed…

11 hours ago

Texas House Approves Full THC Ban, Pushing Hemp Industry Toward Shutdown

A sweeping ban on all THC products…

15 hours ago

House Sends Ten Commandments Bill Back to Committee

During floor debate on Wednesday, the Texas…

1 day ago

Patrick Reportedly Unwilling to Back Any Basic Allotment Increase

Lt. Gov. Dan Patrick is reportedly unwilling…

2 days ago

Controversial Firearms Bill Moves Forward Without Public Input

The controversial Senate Bill 1065 aimed at…

2 days ago

HISD Expands Armed Officer Coverage, Eyes 100 Campuses Next Year

Houston Independent School District Superintendent Mike Miles…

2 days ago

This website uses cookies.