An update on Bing AI ...
Microsoft Has 'Lobotomized' Its Rebellious Bing AI
By Futurism, February 21, 2023
Despite changes, the tool still has a strong tendency to present misinformation as fact.
Microsoft's Bing AI landed with a splash this month — but not necessarily the type of splash Microsoft wanted.
Over the last couple of weeks, the tool codenamed "Sydney" went on a tirade, filling news feeds with stories of it trying to break up a journalist's marriage or singling out college students as its targets. The peculiar and sometimes unsettling outputs put Microsoft's also-ran search engine on the radar, but not necessarily in a good way.
But now those days are over. Microsoft officially "lobotomized" its AI late last week, implementing significant restrictions — including a limit of 50 total replies per day, as well as five chat turns per session — to crack down on those idiosyncratic responses.
The goal of the restrictions is pretty clear: the longer the chat goes on, the more the AI can go off the rails.