Your Words? Amazon’s Assets.
A change to Amazon’s Echo undermines users’ consent, treating their data as an asset useful for training the company’s AI
Late last month, Amazon alarmed privacy advocates when they announced they would no longer allow users of their Amazon Echo devices to have their Alexa voice prompts stored and processed locally rather than sent to the cloud. That means that every customer’s voice prompt will now be sent to Amazon’s servers and, potentially, stored there. As of March 28th, Amazon is removing the “Do Not Send Voice Recordings” setting, which allowed users to prevent that from happening.
In an email to customers, Amazon explained: “As we continue to expand Alexa’s capabilities with generative AI features that rely on the processing power of Amazon’s secure cloud, we have decided to no longer support this feature.” That would indicate, of course, that Amazon plans to use everything you say to Alexa to train their AI model, too.
Amazon did respond to critics, noting that users can still use a “Don’t save recordings” setting in Alexa and claiming that only 0.03% of customers had ever used the “Do Not Send Voice Recordings” setting. However, users who opt out of having their recordings saved will not have access to the new, enhanced Alexa+ features. Amazon later clarified that…