Smart assistants may be a great way to streamline the purchasing process and eliminate the need to visit a website altogether, but allowing purchases through voice recognition alone can cause several problems. The most recent case is a black mark against Amazon’s smart assistant Alexa, which ordered doll houses for users after hearing a local news broadcast about Alexa.
San Diego TV station CW6 ran a report on a six-year-old girl from Dallas, Texas, who had managed to order a $160 doll house through Alexa. During the report, the presenter, Jim Patton, said, “I love the little girl saying ‘Alexa, order me a doll house’.”
Many people called in to complain about the report, stating that Patton’s words had been heard by their own Amazon Echo device and interpreted as a command to buy more dollhouses.
Thankfully, Amazon allows free returns for accidental purchases, and it also lets users set a four-digit code that prevents unauthorized orders, which is useful for households with young children.
Nonetheless, additional safeguards are needed to support voice command devices and voice-controlled Internet of Things applications. If Amazon allows anyone within earshot of an Echo to order an item with someone else’s payment details, it’s going to cost the company a fortune in refunds and mass returns.