Here’s why Amazon Echo recorded a family’s conversation


After an understandable freakout over the news that Amazon Echo recorded a family’s conversation and sent that recording to a person on their contact list, the company has released an explanation.

Amazon’s full statement, sent to media outlets including this one, seems to show Alexa, the company’s virtual personal assistant powered by artificial intelligence, was trying a little too hard:

Echo woke up due to a word in background conversation sounding like “Alexa.” Then, the subsequent conversation was heard as a “send message” request. At which point, Alexa said out loud “To whom?” At which point, the background conversation was interpreted as a name in the customer’s contact list. Alexa then asked out loud, “[contact name], right?” Alexa then interpreted background conversation as “right.” As unlikely as this string of events is, we are evaluating options to make this case even less likely.
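To see how this chain of errors compounds, here is a minimal sketch in Python. It is purely illustrative, not Amazon's actual pipeline: the `sounds_like` matcher, the contact names, and the dialog steps are all hypothetical stand-ins for the real acoustic models, but the structure mirrors the four-step sequence in Amazon's statement, where each step independently accepts a loose match.

```python
# Hypothetical sketch (not Amazon's actual code) of the chain of
# misrecognitions described above: each dialog step accepts a
# low-confidence match, and the errors compound into an unintended send.

CONTACTS = ["Alice", "Bob"]  # stand-in contact list

def sounds_like(heard: str, target: str) -> bool:
    # Toy matcher: treat any substring overlap as a "match".
    # Real systems score acoustic similarity with a confidence threshold.
    return target.lower() in heard.lower()

def dialog(utterances):
    """Walk the Echo-style dialog over a stream of background speech."""
    it = iter(utterances)
    if not sounds_like(next(it), "Alexa"):         # 1. wake word heard
        return "idle"
    if not sounds_like(next(it), "send message"):  # 2. intent heard
        return "idle"
    heard = next(it)                               # 3. reply to "To whom?"
    contact = next((c for c in CONTACTS if sounds_like(heard, c)), None)
    if contact is None:
        return "idle"
    if sounds_like(next(it), "right"):             # 4. "[name], right?"
        return f"message sent to {contact}"
    return "idle"

# Background chatter that happens to loosely match every prompt:
chatter = ["...Alexa...", "...send message...", "...Bob...", "...right..."]
print(dialog(chatter))
```

With four independent loose matches in a row, the chatter walks the dialog all the way to "message sent to Bob"; if any single step had demanded a stricter confirmation, the chain would have stopped at "idle", which is presumably the kind of change Amazon means by making "this case even less likely."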

The Portland family Alexa eavesdropped on had an Echo smart home speaker in every room of their house to help control their heat, lights and security system, they told Seattle CBS affiliate KIRO-7, which was the first to report the story Thursday.

A woman named Danielle, who did not give her last name, told the station her family found out about the eavesdropping when one of her husband’s employees called to tell them, “Unplug your Alexa devices right now. You’re being hacked.” That person had received audio files of their conversations.

Now her family has unplugged all the devices, and although Amazon offered to “de-provision” the devices of their communications features so they could keep using them to control their home, Danielle and her family reportedly want a refund instead.

When reached Friday, an Amazon spokeswoman would not comment about whether the company will issue a refund.


Other smart home speakers carry similar privacy risks. Last year, for example, Google had to release a patch for its Home Mini speakers after some of them were found to be recording everything.

And earlier this month, researchers from UC Berkeley published a paper showing how they were able to embed commands into music or spoken text, according to the New York Times. Humans can’t hear the commands, but smart speakers can.

If all this is creeping you out, here’s an article about how to turn off the listening and recording capabilities of some common apps and devices, including Facebook and Apple’s Siri.

Or just don’t buy any IoT devices.