So, you’ve bought a new smart speaker assistant and it’s proudly sitting in the center of your coffee table.
It doesn’t matter whether you have an Amazon Echo, a Google Home, or you’re thinking of buying the new Harman Kardon Cortana speaker: with it in your life, everything is going to be a lot more organized.
But at what price? What security risks and privacy problems are you exposing yourself to? Here are five security pitfalls of using speaker-based smart assistants.
1. Who Is Listening?
If a representative from Google showed up on your doorstep and asked to bug your house with mini-microphones, I think most of you would give a profanity-laden response.
What’s so different about the new class of speaker-based smart assistants? It’s like willfully allowing a giant corporation to bug your private space. It’s amazing how quickly public opinion changes. In early 2015, there was outrage over the Samsung TV affair. Today, we’re asking companies to listen in.
And let’s be clear: the speakers are always listening. Sure, they only react when they hear an activation phrase, but they are constantly listening for that phrase. Google has admitted that some of that background audio is stored locally, but has declined to reveal how long it is kept.
Could a hacker tune in and listen to everything in your home? Given the recent controversy over hacked baby monitors, the threat seems very real. You certainly shouldn’t put the device anywhere you discuss highly confidential topics.
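That “always listening, rarely transmitting” design can be sketched roughly like this. This is a simplified illustration only, using text tokens in place of audio frames; the wake word and buffer size are made-up values, not any vendor’s actual implementation:

```python
from collections import deque

WAKE_WORD = "ok speaker"   # hypothetical activation phrase
BUFFER_FRAMES = 3          # assumed size of the local rolling buffer

def listen(frames, buffer_frames=BUFFER_FRAMES):
    """Simulate an always-on microphone: keep only a short rolling
    buffer on the device, and 'transmit' audio only after the wake
    word is heard."""
    rolling = deque(maxlen=buffer_frames)  # old frames fall off the end
    transmitted = []
    awake = False
    for frame in frames:
        rolling.append(frame)              # every frame passes through locally
        if awake:
            transmitted.append(frame)      # request audio goes to the server
        elif frame == WAKE_WORD:
            awake = True                   # everything before this stayed local
    return transmitted

# Only the post-wake-word frames ever leave the device.
frames = ["tv noise", "private chat", "ok speaker", "what is the weather"]
print(listen(frames))  # ['what is the weather']
```

The point of the sketch is that even though nothing is sent until the wake word, every single frame of audio still flows through the device’s microphone pipeline — which is exactly why the storage and hacking questions below matter.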
2. Storage of Data
At least the devices are not transmitting all your conversations (yet), so let’s move on from the fact that they’re always listening. What happens when you’re actively engaging with one?
The speaker records your request and sends it to the company’s central servers, which process it. Sounds fine, but what about storage of that data?
The answers might surprise you. Google Home and Amazon Alexa both save the audio snippets and log them against your user account. You can hear all your previous requests by logging into your respective account. What if someone gains unauthorized access? There could be a lot of personal information saved there.
At least you can delete those histories. But you can’t do anything about the aggregated data stored on Google’s or Amazon’s servers, which the companies continually mine to improve their assistants.
As for Apple, it keeps Siri requests tagged to a random device identifier for six months, then strips the identifier and keeps the raw audio for a further 18 months.
3. Ambient Audio
The audio snippets the speaker sends to Google or Amazon contain more than just your requests. By their nature, the devices are going to pick up background “ambient” audio too.
Think about this: the recordings could reveal what TV shows you watch, what sports you like, what pets you have, what time of day you’re at home, what the gender ratio in your household is, what music you like, and a lot, lot more.
It’s naïve to think Google and Amazon will just discard this data. For them, it’s like gold dust — it’s all going to go towards building your advertising profile.
You could be having a casual conversation about which new car to buy while someone else uses the assistant. Google can match your voice to your Google ID, and within seconds it’ll be peppering you with car ads.
4. Law Enforcement
How long will it be until countries start passing laws that allow the police to activate a smart assistant and listen in on suspects remotely?
We’re already well on the way. The NSA’s spying over the last few years is well-documented, and the UK has just passed the hugely controversial Investigatory Powers Act, which allows everything from the Food Standards Agency to the Department for Work and Pensions to access residents’ internet histories. Other countries are heading down a similar path.
Some of you might argue it’s a good thing. If snooping can help to keep the population safe, the law should allow it. But where does it stop? Who will monitor the snoopers? And what’s to stop cyber-criminals from using the same backdoor methods that law enforcement can use?
I hope you’ve read 1984…
5. Risk of Someone Else Using It
One of Alexa’s biggest selling points is the ability to order products directly from Amazon. It’s not hard to imagine it one day becoming the main storefront for the online retail giant.
But that raises issues. If someone steals your device, could they spend hundreds of dollars on your credit card before you have a chance to react? It’s not inconceivable.
There are also security implications inside your home. At the moment, Alexa is not smart enough to distinguish its owner’s voice from anyone else’s. That means anyone in your home has access to every app you’ve linked to Alexa. Strangers could pull up your bank balance, your naughty children could secretly buy that new toy they want, and friends could order random things as a prank.
Presumably, this will become less of an issue as the technology develops, but right now, it’s a worry.
Convenience or Catastrophe?
I’m not saying speaker-based smart assistants are all bad, and I’m certainly not saying you shouldn’t have one in your home. But you do need to be aware of the trade-off: the added convenience comes at a potentially catastrophic price.
Is it worth it? That’s for you to decide.
What is your opinion of smart assistants? Do you use one? Do the security and privacy implications concern you? Or are they a “must-have” at the cutting edge of consumer technology?
Let me know your thoughts and feedback in the comments section below.
Image Credit: SpeedKingz and Andrey Makurin via Shutterstock.com