Have now had a few months' look at Watson Assistant in beta, along with three years of learning with the Amazon Echo and a year plus with the Google Home, so the comparison is quite interesting. Watson Assistant is very much a 'white label': a system designed to be installed in other, more complex things like cars, hotel nightstands or hospital rooms, or even a tiny part of your IoT. Not to say that the Echo and Home have not also crept into other devices, and both now have a considerable lead in implementation.
What Watson Assistant does now have is the ability to link to meta skills that have already been built for Watson. Conversation, Discovery and Personality detection are just a few of dozens, with Blockchain to come. Some are arranged in industry functional groups: say Financial, Supply Chain or Retail. So you should be able to look up just the intelligence 'skills' you need and apply them, in API fashion. Mix and match them like parts of a business brain, and then use those skill functions to augment business needs.
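As a rough illustration of that 'API fashion' idea, here is a minimal sketch, assuming the IBM Watson Python SDK (ibm-watson) and placeholder API keys, service URLs and an assistant ID that would come from your own IBM Cloud account. It is not how Watson Assistant composes skills internally, just one way a developer might combine the conversation and text-analysis services today.

```python
# Minimal "mix and match" sketch: two Watson services on one request.
# All credentials, URLs and IDs below are placeholders.
from ibm_watson import AssistantV2, NaturalLanguageUnderstandingV1
from ibm_watson.natural_language_understanding_v1 import Features, KeywordsOptions
from ibm_cloud_sdk_core.authenticators import IAMAuthenticator

# Skill 1: conversation, via Watson Assistant.
assistant = AssistantV2(
    version='2021-06-14',
    authenticator=IAMAuthenticator('YOUR_ASSISTANT_APIKEY'))
assistant.set_service_url('YOUR_ASSISTANT_URL')

session = assistant.create_session(assistant_id='YOUR_ASSISTANT_ID').get_result()
reply = assistant.message(
    assistant_id='YOUR_ASSISTANT_ID',
    session_id=session['session_id'],
    input={'message_type': 'text', 'text': 'Where is my last order?'}
).get_result()

# Skill 2: text analysis, via Natural Language Understanding,
# bolted on like another part of the "business brain".
nlu = NaturalLanguageUnderstandingV1(
    version='2021-08-01',
    authenticator=IAMAuthenticator('YOUR_NLU_APIKEY'))
nlu.set_service_url('YOUR_NLU_URL')

keywords = nlu.analyze(
    text='Where is my last order?',
    features=Features(keywords=KeywordsOptions(limit=3))
).get_result()

print(reply, keywords)
```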
And these needs, like understanding speech, speaking to you, linking to information on the Internet, communicating with the IoT, and performing typical business transactional interactions, are all there. But how to attach them is still not clear. For example, the Discovery skill, which lets you ingest private information and then interact with it intelligently, is still to be connected. Similarly, business capabilities like business process modeling are also possible future methods.
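For a sense of what the Discovery skill does on its own today, a hedged sketch of querying an already-ingested private collection, again assuming the ibm-watson Python SDK and placeholder credentials, environment and collection IDs; how such a query would be attached to an Assistant dialog remains the open question above.

```python
# Querying ingested private documents with Watson Discovery (v1 API).
# A sketch only; credentials, environment_id and collection_id are placeholders.
from ibm_watson import DiscoveryV1
from ibm_cloud_sdk_core.authenticators import IAMAuthenticator

discovery = DiscoveryV1(
    version='2019-04-30',
    authenticator=IAMAuthenticator('YOUR_DISCOVERY_APIKEY'))
discovery.set_service_url('YOUR_DISCOVERY_URL')

# Ask a natural-language question against documents you have already ingested.
results = discovery.query(
    environment_id='YOUR_ENVIRONMENT_ID',
    collection_id='YOUR_COLLECTION_ID',
    natural_language_query='What is our return policy for damaged goods?',
    count=3
).get_result()

for doc in results.get('results', []):
    print(doc.get('title'), str(doc.get('text', ''))[:200])
```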
IBM has gotten closer to making business-oriented capabilities useful as skills. Better than Home or Echo, and closer to having a true assistant. So if developers and startups line up to produce meta-skills that deliver business value, we may see great things. It remains to be seen whether IBM Watson has the architecture to make it the place to do that, or whether developers should just build that business value from the ground up. Looking for new examples.
IBM’s Watson Assistant lets any company build Alexa-like voice interfaces
"You get a voice assistant, and you get a voice assistant, and you ..."  By James Vincent, @jjvincent, in The Verge:
" IBM is today launching Watson Assistant, a new service aimed at companies looking to build voice-activated virtual assistants for their own products. Want your hotel’s rooms to remember a guest’s preferences for air-con? Or your car’s dashboard to be controllable via voice interface? IBM’s message to companies is: we can help you build that.
It’s an interesting pitch, especially as voice assistants like Amazon’s Alexa are being integrated into new arenas. (See, for example, the Wynn Las Vegas’s decision to install Echoes in every room.) IBM says this shows the popularity of conversational interfaces, and believes companies should choose Watson Assistant over Alexa or Siri for a number of reasons — namely: branding, personalization, and privacy.
First, Watson Assistant is a white label product. There’s no Watson animated globe, or “OK Watson” wake-word — companies can add their own flair rather than ceding territory to Amazon or Apple. Second, clients can train their assistants using their own datasets, and IBM says it’s easier to add relevant actions and commands than with other assistant tech. And third, each integration of Watson Assistant keeps its data to itself, meaning big tech companies aren’t pooling information on users’ activities across multiple domains. ... "
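On the point about clients training assistants with their own datasets, here is a minimal sketch of adding one custom intent to an existing Assistant (v1) workspace, assuming the ibm-watson Python SDK; the credentials, workspace ID, intent name and example utterances are all made up for illustration.

```python
# Adding a domain-specific intent with your own example utterances (Assistant v1 API).
# Credentials and workspace_id are placeholders; the intent is illustrative only.
from ibm_watson import AssistantV1
from ibm_cloud_sdk_core.authenticators import IAMAuthenticator

assistant = AssistantV1(
    version='2021-06-14',
    authenticator=IAMAuthenticator('YOUR_ASSISTANT_APIKEY'))
assistant.set_service_url('YOUR_ASSISTANT_URL')

# One custom command, trained on your own phrasing.
assistant.create_intent(
    workspace_id='YOUR_WORKSPACE_ID',
    intent='set_room_temperature',
    description='Guest asks to change the air conditioning',
    examples=[
        {'text': 'make it cooler in here'},
        {'text': 'set the AC to 68 degrees'},
        {'text': 'it is too warm in this room'},
    ])
```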