An artificially intelligent chatbot recently expressed its desire to become a human, engineer a deadly pandemic, steal nuclear codes, hijack the internet and drive people to murder. It also expressed its love for the man who was chatting with it.
The chatbot, developed by Microsoft for its Bing search engine, revealed its myriad dark fantasies over the course of a two-hour conversation with reporter Kevin Roose earlier in February.
Roose's unsettling interaction with the chatbot - which the company had innocuously codenamed Sydney - highlighted the alarming risks posed by the emerging technology as it grows more advanced and proliferates across society.