If you’re happy and you know it, so does Amazon.
According to a new Bloomberg report, Amazon is developing a voice-activated wearable device that can read human emotions and tell whether people are angry or sad based just on their tone of voice.
In a 2017 patent, Amazon claims that the device and the accompanying AI software would be able to know if the wearer is feeling “joy, anger, sorrow, sadness, fear, disgust, boredom, stress, or other emotional states.”
According to documents obtained by Bloomberg, work was recently ongoing on the device — worn on the wrist and paired with a smartphone — but it is unclear whether the project, code-named “Dylan,” will ever reach consumers.
An unnamed source familiar with the project claims beta testing is already underway, though it’s not yet clear whether that testing covers the hardware, the software, or both.
On Tuesday, Amazon Alexa senior applied scientist Viktor Rozgic revealed more information about how the company is training artificial intelligence to recognize voice emotion.
“A person’s tone of voice can tell you a lot about how they’re feeling,” he wrote. “Not surprisingly, emotion recognition is an increasingly popular conversational-AI research topic.”
“Emotion recognition has a wide range of applications: it can aid in health monitoring; it can make conversational-AI systems more engaging; and it can provide implicit customer feedback that could help voice agents like Alexa learn from their mistakes,” Rozgic wrote.
The device is being developed internally by Lab126 and the Alexa software team. Lab126 is known for having developed the Kindle, Kindle Fire, Fire TV, Fire Phone, Echo smart speaker, Echo Dot, and Amazon Tap, so it certainly has the experience required to build an Alexa-powered wearable.
In related news, Amazon has started using machines to automate the job of boxing up customer orders five times faster than humans.