
Alexa Is Implementing Self-Learning Techniques To Better Understand Users

Monday, December 10, 2018, 10:30, by Slashdot
In a developer blog post published this week, Alexa AI director of applied science Ruhi Sarikaya detailed the advances in machine learning technologies that have allowed Alexa to better understand users through contextual clues. From a report: According to Sarikaya, these improvements have played a role in reducing user friction and making Alexa more conversational. Since this fall, Amazon has been working on self-learning techniques that teach Alexa to automatically recover from its own errors. The system has been in beta until now, and it launched in the US this week. It doesn't require any human annotation, and, according to Sarikaya, it uses customers' 'implicit or explicit contextual signals to detect unsatisfactory interactions or failures of understanding.'
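
The post doesn't spell out the mechanics, but the idea of mining implicit contextual signals for failures of understanding can be illustrated with a small sketch. Everything below, including the Turn fields, the 15-second window, and the word-overlap heuristic, is invented for illustration and is not Amazon's implementation:

# Hypothetical sketch: flag a turn as unsatisfactory using only implicit
# behavioral signals (a cancel/barge-in, or a quick rephrase of a similar
# request). Field names and thresholds are assumptions, not Alexa's API.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Turn:
    utterance: str          # what the customer said
    action_taken: str       # what the assistant did in response
    seconds_to_next: float  # gap before the customer's next request
    cancelled: bool         # customer stopped, skipped, or interrupted the response

def looks_unsatisfactory(turn: Turn, next_turn: Optional[Turn]) -> bool:
    """Heuristic failure detector: no human annotation, just behavior."""
    if turn.cancelled:
        return True
    if next_turn is not None and turn.seconds_to_next < 15:
        # A near-immediate, similar-sounding rephrase suggests the first
        # interpretation missed the mark.
        overlap = set(turn.utterance.lower().split()) & set(next_turn.utterance.lower().split())
        if len(overlap) >= 2:
            return True
    return False

In a production system these labels would presumably feed a learning loop rather than a hand-written rule, but the signal being exploited is the same kind of unannotated behavioral feedback the post describes.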

The contextual signals range from customers' historical activity, preferences, and the Alexa skills they use to where the Alexa device is located in the home and what kind of device it is. For example, during the beta phase, Alexa learned to recognize a customer's mistaken request to "Play 'Good for What'" and correct it by playing Drake's song 'Nice for What' instead.
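
The 'Good for What' example amounts to learning a query rewrite from customers' own corrections. A minimal sketch of that aggregation idea follows, with an invented support threshold and in-memory storage standing in for whatever Amazon actually uses:

# Hypothetical sketch: when a failed request is repeatedly followed by a
# successful near-duplicate, remember the mapping and apply it next time.
# The threshold and storage are assumptions for illustration only.
from collections import Counter

correction_counts = Counter()   # (failed_request, corrected_request) -> count
learned_rewrites = {}           # failed_request -> corrected_request

def record_correction(failed_request, corrected_request, min_support=50):
    """Aggregate corrections across many interactions; promote frequent ones to rewrites."""
    pair = (failed_request.lower(), corrected_request.lower())
    correction_counts[pair] += 1
    if correction_counts[pair] >= min_support:
        learned_rewrites[pair[0]] = pair[1]

def rewrite(request):
    """Apply a learned rewrite, e.g. 'play good for what' -> 'play nice for what'."""
    return learned_rewrites.get(request.lower(), request)

With enough recorded corrections, rewrite('Play Good for What') would return the corrected request without a human in the loop, which is roughly the behavior the example describes.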
