This week, I read through a history of everything I'd said to Alexa, and it felt a little bit like reading an old diary. Until I remembered that everything I'd told Alexa had been stored on an Amazon server and possibly listened to by an Amazon employee. This is all to make Alexa better, the company keeps saying. But many people don't realize that humans may be reviewing their seemingly private voice commands. Alexa, these people say, is a spy hiding in a wiretapping device.
The debate over whether Alexa or any voice assistant is spying on us is years old at this point, and it's not going away. Privacy advocates have filed a complaint with the Federal Trade Commission (FTC) alleging violations of the Federal Wiretap Act. Journalists have investigated the dangers of always-on microphones and artificially intelligent voice assistants. Skeptical tech bloggers like me have argued that these devices were more powerful than people realized and riddled with privacy problems. Recent news stories about how Amazon employees review certain Alexa commands suggest the situation is worse than we thought.
It's starting to feel like Alexa and other voice assistants are destined to spy on us, because that's how the system was designed to work. These systems rely on machine learning and artificial intelligence to improve themselves over time. The technology is also currently prone to error, and even if it were perfect, the data-hungry companies that built it are constantly thinking of new ways to exploit users for profit. And where imperfect technology and powerful companies collide, the government tends to struggle so much with understanding what is going on that regulation seems like an impossible solution.
The situation is not completely dire. This technology could be really cool if we pay closer attention to what's happening.

Never-ending Errors
One fundamental problem with Alexa and other voice assistants is that the technology is prone to fail. Devices like the Echo come equipped with always-on microphones that are only supposed to record when you want them to listen. While some devices require the push of a physical button to activate Alexa, many are designed to start recording you after you say the wake word. Anyone who has spent any time using Alexa knows that it doesn't always work like this. Sometimes the software mistakes random noise for the wake word and starts recording.
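The mechanics behind those false positives can be illustrated with a toy sketch. This is not Amazon's actual algorithm: real devices score raw audio with a neural model, but here a fuzzy string comparison against a wake word stands in for that score, and a fixed sensitivity threshold decides when the device starts recording. The wake word, threshold, and function names are all hypothetical.

```python
# Toy illustration of wake-word detection and why it produces false
# positives. A similarity score stands in for a real acoustic model;
# anything that clears the threshold triggers recording.
from difflib import SequenceMatcher

WAKE_WORD = "alexa"
THRESHOLD = 0.7  # hypothetical sensitivity: lower means more false triggers

def wake_score(heard: str) -> float:
    """Similarity between what the microphone heard and the wake word."""
    return SequenceMatcher(None, heard.lower(), WAKE_WORD).ratio()

def should_record(heard: str) -> bool:
    """Start streaming audio to the server if the score clears the threshold."""
    return wake_score(heard) >= THRESHOLD

print(should_record("alexa"))   # the real wake word triggers recording
print(should_record("alexis"))  # a similar-sounding word also triggers it
print(should_record("omelet"))  # unrelated speech does not
```

The design tension is visible in the threshold: raise it and the device misses real commands; lower it and random noise gets recorded and shipped to the server, which is exactly the failure mode described above.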
The extent to which false positives are a problem became glaringly evident the moment I started reading through my history of Alexa commands on Amazon's website. Most of the entries are dull: "Hey Alexa," "Show me an omelet recipe," "What's up?" But sprinkled amongst the mundane drivel was a series of messages that said, "Audio was not intended for Alexa." Every time I saw one, I read it again in my head: "Audio was not intended for Alexa." These are the things Alexa heard that it should not have heard, commands that were sent to Amazon's servers and then flagged because the machine realized it had recorded audio when the user wasn't giving a command.