SEATTLE, Sept 24 — Amazon yesterday added a feature enabling its Echo Show smart screens to recognise household pantry items, part of an effort to help the blind and visually impaired.
A new "Show and Tell” capability available for Alexa digital assistant on Echo Show devices in the US lets users get audible replies to the question "What am I holding?”
Echo Show devices equipped with cameras use computer vision and machine learning to recognise what people are holding, according to the Seattle-based technology titan.
"It's a tremendous help and a huge time saver because the Echo Show just sits on my counter, and I don't have to go and find another tool or person to help me identify something,” mechanical engineer Stacie Grijalva said in a blog post.
"I can do it on my own by just asking Alexa.”
Grijalva, who lost her sight as an adult, is technology manager at a centre for the blind and visually impaired in the coastal California city of Santa Cruz that worked with the Amazon team.
"The whole idea for 'Show and Tell' came about from feedback from blind and low vision customers,” said Sarah Caplener, head of Amazon's Alexa for Everyone team.
"Whether a customer is sorting through a bag of groceries, or trying to determine what item was left out on the counter, we want to make those moments simpler by helping identify these items and giving customers the information they need in that moment.”
Major technology firms including Apple, Google and Microsoft invest in making their innovations more accessible to people with disabilities, which is seen as good for business as well as socially beneficial.
Being able to interact with smart speakers or other devices by voice can be a boon for the visually impaired, while features such as automatic captioning of online videos can aid those who can't hear. — AFP-Relaxnews