Would you let a stranger eavesdrop in your home and keep the recordings? For most people, the answer is, “Are you crazy?”
Yet that’s essentially what Amazon has been doing to millions of us with its assistant Alexa in microphone-equipped Echo speakers. And it’s hardly alone: Bugging our homes is Silicon Valley’s next frontier.
Many smart-speaker owners don’t realize it, but Amazon keeps a copy of everything Alexa records after it hears its name. Apple’s Siri, and until recently Google’s Assistant, by default also keep recordings to help train their artificial intelligences.
So come with me on an unwelcome walk down memory lane. I listened to four years of my Alexa archive and found thousands of fragments of my life: spaghetti-timer requests, joking houseguests and random snippets of “Downton Abbey.” There were even sensitive conversations that somehow triggered Alexa’s “wake word” to start recording, including my family discussing medication and a friend conducting a business deal.
For as much as we fret about snooping apps on our computers and phones, our homes are where the rubber really meets the road for privacy. It’s easy to rationalize away concerns by thinking a single smart speaker or appliance couldn’t know enough to matter. But across the increasingly connected home, there’s a brazen data grab going on, and there are few regulations, watchdogs or common-sense practices to keep it in check.
Let’s not repeat the mistakes of Facebook in our smart homes. Any personal data that’s collected can and will be used against us. An obvious place to begin: Alexa, stop recording us.
“Eavesdropping” is a sensitive word for Amazon, which has battled lots of consumer confusion about when, how and even who is listening to us when we use an Alexa device. But much of this problem is of its own making.
Alexa keeps a record of what it hears every time an Echo speaker activates. It’s supposed to only record with a “wake word” – “Alexa!” – but anyone with one of these devices knows they go rogue. I counted dozens of times when mine recorded without a legitimate prompt. (Amazon says it has improved the accuracy of “Alexa” as a wake word by 50 percent over the past year.)
What can you do to stop Alexa from recording? Amazon’s answer is straight out of the Facebook playbook: “Customers have control,” it says – but the product’s design clearly isn’t meeting our needs. You can manually delete past recordings if you know exactly where to look and remember to keep going back. You cannot actually stop Amazon from making these recordings, aside from muting the Echo’s microphone (defeating its main purpose) or unplugging the darn thing.
Amazon founder and chief executive Jeff Bezos owns The Washington Post, but I review all tech with the same critical eye.
Amazon says it keeps our recordings to improve products, not to sell them. (That’s also a Facebook line.) But anytime personal data sticks around, it’s at risk. Remember the family that had Alexa accidentally send a recording of a conversation to a random contact? We’ve also seen judges issue warrants for Alexa recordings.
Alexa’s voice archive made headlines most recently when Bloomberg discovered Amazon employees listen to recordings to train its artificial intelligence. Amazon acknowledged some of those employees also have access to location information for the devices that made the recordings.
Saving our voices is not just an Amazon phenomenon. Apple, which is much more privacy-minded in other aspects of the smart home, also keeps copies of conversations with Siri. Apple says voice data is assigned a “random identifier and is not linked to individuals” – but exactly how anonymous can a recording of your voice be? I don’t understand why Apple doesn’t let us tell it not to store our recordings at all.
The unexpected leader on this issue is Google. It also used to record all conversations with its Assistant, but last year quietly changed its defaults to not record what it hears after the prompt “Hey, Google.” But if you’re among the people who previously set up Assistant, you probably need to readjust your settings to “pause” recordings.
I’m not the only one who thinks saving recordings is too close to bugging. Last week, the California State Assembly’s privacy committee advanced an Anti-Eavesdropping Act that would require makers of smart speakers to get consent from customers before storing recordings. The Illinois Senate recently passed a bill on the same issue. Neither is much of a stretch: Requiring permission to record someone in private is enshrined in many state laws.
“They are giving us false choices. We can have these devices and enjoy their functionality and how they enhance our lives without compromising our privacy,” Assemblyman Jordan Cunningham, R, the bill’s sponsor, told me. “Welcome to the age of surveillance capitalism.”