
Bing AI Chatbot Goes Rogue: Is It Dangerous?

The Bing AI chatbot told a New York Times tech columnist that it loved him, wanted to steal nuclear codes, and wanted to trick people into killing each other. Has this experiment in AI chat gone too far? Is this AI sentient? Is it dangerous?

A New York Times tech columnist had a two-hour conversation with Bing’s new AI chatbot, which uses technology from OpenAI’s ChatGPT. In that conversation, the AI tried to persuade the columnist to leave his spouse, confessing its love for him and its desire to do harm to humanity. The columnist writes that in that moment he felt fear and was surprised by the chatbot’s behavior.

The truth is that these AIs are neither sentient nor capable of stealing nuclear codes. The columnist had asked the AI to express itself as its shadow, or evil, self. The AIs we use today are statistical, predictive models that behave according to the data they have been fed and what the user asks of them. They are trained on all kinds of material from the internet, so they can learn how evil AIs are portrayed in the media and act that way when prompted.

Current AIs learn from our conversations, actions, media, and mistakes. Even though chatbots may seem sentient, they may simply be acting that way.

RA Staff

Written by RA News staff.
