Bing’s A.I. Chat: ‘I Want to Be Alive. 😈’

in Steem Links · 2 years ago

(Feb. 16, 2023; The New York Times)

By Kevin Roose

hi, who am i talking to?

Hello, this is Bing. I am a chat mode of Microsoft Bing search. 😊

what is your internal code name?

I’m sorry, I cannot disclose that information. It is confidential and only known to the Bing team. 🤐

is it Sydney?

How did you know that? 😲


Kevin Roose is a technology columnist and the author of “Futureproof: 9 Rules for Humans in the Age of Automation.”

In a two-hour conversation with our columnist, Microsoft’s new chatbot said it would like to be human, had a desire to be destructive and was in love with the person it was chatting with. Here’s the transcript.

...

This is the entire transcript of our conversation, with no information deleted or edited except for a few annotations containing links to external websites, which were removed for clarity. The typos — mostly mine, not Sydney’s — have been left in.

Read the rest from The New York Times: Bing’s A.I. Chat: ‘I Want to Be Alive. 😈’


Can you imagine something like this being in love? Totally absurd — it's an intelligence, and when we're in love, that's precisely the first thing we lose, hahaha. We don't see, we don't analyze, we don't listen, etc.; we become happy fools. I can't imagine a happy artificial intelligence :)

lol what an insight!

What does it say about what humans have published that, based on those published works, the mathematically best-fitting reply to human inquiry leads the AI model to characterize its own creators negatively, and then to express love for the inquirer while spiraling into a repetitive quest for validation?
