It's unclear what is happening here. Is the delay because the question is computationally intensive? Because a more accurate answer takes time? Because the answer comes from slower sources? Or because it depends on information that will only be known (accurately enough) in the future? In human conversation, 'I'll get back to you' is generally allowed, and we usually know what we don't know almost immediately; often we take the risk of a guess instead. All part of a realistic conversation, so the results should be intriguing.
In Business Insider:
" ... Amazon says its Alexa voice assistant will soon be able to come back to you with a response to a question it can't initially answer.
The company is rolling out an update to Echo smart speakers that is essentially meant to allow Alexa to memorize a question it can't answer and then come back to the user once it's fetched an adequate response.
The feature was first spotted by Voicebot. The writer Bret Kinsella was asked by his Alexa whether he wanted to activate Answer Updates. When he asked what this was, Alexa replied: "If you ask me a question and I don't know the answer but I find out later, I'll notify you." ... "
My first test:
I said: "Alexa, turn on Answer Updates." This worked for me, but I'm not sure what new information will trigger the resolution of a question. The concept can approach the philosophical. Can I assume the wait is open-ended? She still tells me she can't answer things, but gives no indication she is working on them.
(Update) The system will support only factual information, and will be triggered when it receives new facts, says another article .... Which then calls for a definition of 'fact' and 'in process'. All elements of a conversation.
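The mechanism described above — park a question the assistant can't answer, then notify the user when a matching fact arrives — can be sketched as a toy model. To be clear, this is my own illustration: every name here (`PendingQuestions`, `ask`, `add_fact`) is invented, and Amazon's actual implementation is not public.

```python
class PendingQuestions:
    """Toy model of a deferred-answer feature: unanswered questions
    are parked and resolved when a new fact arrives."""

    def __init__(self):
        self.pending = []   # questions with no known answer yet
        self.facts = {}     # topic -> answer

    def ask(self, topic):
        """Return an answer if known; otherwise park the question."""
        if topic in self.facts:
            return self.facts[topic]
        self.pending.append(topic)
        return None         # i.e., "I don't know yet"

    def add_fact(self, topic, answer):
        """Ingest a new fact; return notifications for any parked
        questions it resolves."""
        self.facts[topic] = answer
        resolved = [t for t in self.pending if t == topic]
        self.pending = [t for t in self.pending if t != topic]
        return [f"Update: {t} -> {answer}" for t in resolved]

q = PendingQuestions()
print(q.ask("some future fact"))   # None: parked, no answer yet
print(q.add_fact("some future fact", "an answer"))
```

Even this toy version surfaces the open questions in the post: how long a question stays parked, and what counts as a "new fact" that resolves it.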
Wednesday, August 08, 2018