
Friday, March 03, 2023

Microsoft Tones down Bing’s AI Chatbot and Gives it Multiple Personalities

I have been impressed by what I have seen, including some of the personalities, though which one you use needs to be carefully considered and tested. It is no more hallucinatory than other approaches.


By James Farrell

After Microsoft Corp.’s Bing Chat went off the rails shortly after its introduction, the company has now reined in the bot and given users a selection of personalities they can choose from when chatting with it.

Millions of people signed up to use Bing powered by OpenAI LLC’s ChatGPT when it first became available, but many who pushed the bot to its limits discovered the AI was prone to having what looked like nervous breakdowns. It was anything but the “fun and factual” experience Microsoft had promised, with the bot at times airing existential despair and sometimes insulting people.

Earlier this week, Microsoft updated Windows 11 to integrate the Bing chatbot. And today, the bot was given three personalities in an effort by Microsoft to counter the outlandish responses people had been seeing at the start. Now users can choose from “creative, balanced and precise” responses, although even the creative version is more constrained than the seemingly unhinged entity the company unleashed into the wild just a few weeks ago.  ... ' (more)
