Amazon.com revealed Monday that visually impaired customers can now ask an Echo Show what they are holding. The new feature, called “Show and Tell,” enables customers who are blind or have low vision to hold an item in front of first- and second-generation Echo Show devices and ask, “Alexa, what am I holding?” or “Alexa, what’s in my hand?” The object is then identified using Amazon’s machine learning technology. Partnering with the California-based Vista Center for the Blind and Visually Impaired, Amazon worked with visually impaired people to understand the problems they face at home and how Alexa could help.
In an Amazon video, a blind woman asks Alexa what she is holding, and Alexa replies, “It looks like tea.” “The whole idea for Show and Tell came about from feedback from blind and low vision customers,” said Sarah Caplener, who leads Amazon’s Alexa for Everyone team. “We heard that product identification can be a challenge and something customers wanted Alexa’s help with. Whether a customer is sorting through a bag of groceries, or trying to determine what item was left out on the counter, we want to make those moments simpler by helping identify these items and giving customers the information they need in that moment.”