This made me think about the how and why of it. Watch the road before prompting.
Mercedes says it will test ChatGPT in its in-car voice assistant for the next three months. (via ExtremeTech)
ChatGPT may be well on its way to remaking the internet, but you know where there isn't enough generative AI? On the roads. Microsoft and Mercedes have announced a partnership to test the integration of ChatGPT with Mercedes vehicles. The feature will launch in beta on more than 900,000 vehicles in the US.
Like most high-end carmakers, Mercedes has spent the last few years developing bespoke vehicle technology. For example, the company has its own Hey Mercedes voice assistant, and that is where ChatGPT plugs in. Instead of relying on Mercedes' own AI model to understand spoken commands, the beta software will use ChatGPT to interpret what's said.
Microsoft and Mercedes contend that using ChatGPT with Hey Mercedes will make the system more reliable and expand its capabilities. Most voice assistants, Hey Mercedes included, are limited in what they can do and understand. You might use a phrase that a person would interpret immediately that flummoxes the AI. ChatGPT is much better at understanding commands, and its grasp of context will allow drivers to have multi-part conversations with the AI.
The carmaker also sought to allay any fears about data privacy with the ChatGPT rollout. It will access the OpenAI model's smarts via the Microsoft Azure OpenAI Service. The actual data is uploaded to the Mercedes-Benz Intelligent Cloud for anonymous processing. The company says it "keeps a close eye" on the potential risks of AI and will adjust the system as necessary for the continued benefit of customers.
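For illustration only, here is a minimal sketch of what that routing could look like, assuming a plain Azure OpenAI chat-completions endpoint. The resource name, deployment name, and system prompt are hypothetical placeholders, not details of Mercedes' actual (non-public) integration:

```python
import json

# Hypothetical Azure OpenAI chat-completions request for an in-car
# voice command. The endpoint and deployment names below are placeholder
# assumptions for illustration; only the URL shape and payload format
# follow the documented Azure OpenAI REST API.
AZURE_ENDPOINT = "https://example.openai.azure.com"  # placeholder resource
DEPLOYMENT = "gpt-35-turbo"                          # placeholder deployment

def build_request(transcript: str) -> dict:
    """Package a transcribed utterance as a chat-completions payload."""
    return {
        "url": (
            f"{AZURE_ENDPOINT}/openai/deployments/{DEPLOYMENT}"
            "/chat/completions?api-version=2023-05-15"
        ),
        "body": {
            "messages": [
                {
                    "role": "system",
                    "content": "You are an in-car voice assistant. "
                               "Answer briefly and safely.",
                },
                {"role": "user", "content": transcript},
            ],
            "max_tokens": 100,
        },
    }

req = build_request("Hey Mercedes, find a charging station on my route")
print(json.dumps(req["body"]["messages"][1], indent=2))
```

In a real deployment the request would be sent with an API key to the Azure endpoint, and, per Mercedes' description, the conversation data would pass through the Mercedes-Benz Intelligent Cloud for anonymization rather than going to OpenAI directly.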
Generative AIs like ChatGPT are famously prone to going off on disinformation tangents known as hallucinations. No one has found a way to prevent hallucinations, as these models don't technically know anything. They just generate outputs based on the data they've ingested, and no one knows exactly how the AI arrives at a specific output. … (much more)