

With tens
of millions of people now using smart
speakers in their homes and businesses, the question of whether devices such as
the Amazon Echo or Google Home may be being used to listen into private
conversations is a real concern for many.
First of all, it’s worth remembering what smart
speakers have been designed to do. Every day, millions of people use their
smart speakers to do everything from playing their favourite song to finding
out the day’s weather forecast. In addition, smart speakers are often used to
answer questions about everything from what to cook for dinner, to what is the
latest Brexit news. Indeed, they could be described as smart listeners, rather
than simply smart speakers, as their job is to listen carefully to what we’re
saying and respond appropriately.
It is this ability to make sense of what we say
and then provide a suitable response, that is key to understanding whether or
not smart speakers are listening in to what we say all, or just some, of the
time. For example, take Alexa. Like a large number of software tools, Amazon’s
digital assistant is built to learn from experience. This learning, or training
if you like, is not, like many people assume, carried out without any human
input. It is actually quite the contrary, as there is an important human role
in training software algorithms involved in devices such as the Amazon Echo.
According to Bloomberg, Amazon employs thousands of people whose job it is to help to
improve Alexa. These employees listen to voice recordings, which are then
transcribed and annotated, before being put back into the software, in order to
fill in any gaps in the way Alexa understands our speech, so that its response
can be improved.
Amazon claims that it only annotates “an
extremely small sample of Alexa voice recordings in order [to] improve the
customer experience” and says that its employees in this area do not have
access to information that would enable the person involved in the recording to
be identified.
Maybe most importantly, according to Amazon,
audio is not stored unless the ‘wake up’ word is used (usually ‘Alexa’) or a
button is pressed.
It’s reportedly a similar story at Google, with
some reviewers being able to access audio snippets from its Assistant for
improvement purposes. Again, Google says that the reviewers should not be able
to access information which would allow them to identify the person in the
recording.
Of course, it’s not just the software created by
the companies themselves which could be listening in. There are apps made by
third parties available for both the Amazon Echo and Google Home.
The BBC recently
reported that Security Research Labs (SRL)
had built eight “smart spies” which could listen in on Amazon Echo and Google
Home speaker users after the app was supposedly turned off and users had heard
a “Goodbye” message. One indication that all was not well, was that the light
on the smart speaker remained on, a giveaway that it was indeed still
listening.
While it seems highly unlikely that smart
speakers are listening to us all of the time, what SRL did highlight was the
need to be aware of the apps that we use with them.
As David Emm, a security analyst at Kaspersky
Lab told the BBC: “We all need to be aware of the capabilities of these
devices”.