
OK, Google … Is anyone else listening to me?

The surprise was considerable. Google admitted on Thursday that "language experts" hired by the firm listen to approximately 0.2 percent of the conversations users have with its virtual assistant, which means that some of those interactions, long believed to be completely off-limits to human ears, are not entirely private.

The thing about virtual assistants

The usual assumption, one that the companies behind virtual assistants (Amazon, Samsung and Apple, in addition to Google) often reiterate, is that conversations between a user and their assistant are entirely private and that the interaction happens exclusively through artificial intelligence; in other words, the only ones "listening" to the user are automated systems.

However, Google's admission, accompanied by assurances that the practice exists to improve the quality of the service, sheds light on something companies usually avoid advertising. More than a few industry observers say it is well known that this happens to a greater or lesser extent, but it sits far from the privacy-centered marketing around these systems.

The revelation came from David Monsees, a search product manager at the Californian company, who posted an entry on Google's official blog in response to a report by the Belgian broadcaster VRT NWS, which had gained access to about a thousand recordings of anonymous individuals.

Where did the recordings come from?

The Dutch-language recordings were provided to the Belgian broadcaster by one of the "experts" Google had hired in that country to listen to segments of conversations and "understand the particularities and accents of each specific language".

The firm, which has already announced that it will "take action" over the leak as a "violation" of its data security policies, admitted to having "experts all over the world" whose job is to listen to and transcribe "a small part of the dialogues to help us better understand those languages".

Specifically, the Mountain View (California, USA) firm put the share of interactions reviewed by humans at 0.2% and assured that these fragments are not associated with user accounts, and that the experts are instructed not to transcribe background sounds or conversations that are not directed at the assistant.

Real, identifiable information

However, the investigators were able to identify "postal addresses and other sensitive information" in the recordings, which allowed them to contact the people whose voices had been recorded and confirm that it was indeed them.

"A couple from Waasmunster (Belgium) immediately recognized the voice of their son and grandson", VRT NWS offered as an example.

We have experts around the world whose function is to listen and transcribe a small part of the dialogues to help us better understand those languages.

What is Google saying?

Google indicated that the virtual assistant only sends audio recordings once it has detected that the user is interacting with it, after saying, for example, "Hey, Google", and that it has several tools to avoid "false activations", that is, cases where the software mistakenly interprets a sound as the wake word.
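To picture how such a wake-word gate works in principle, here is a minimal sketch in Python. It is purely illustrative and not Google's actual implementation; the callback names (read_audio_frame, detect_wake_word, stream_to_server) are hypothetical placeholders standing in for on-device audio capture, local keyword spotting, and cloud processing.

```python
# Illustrative sketch of a wake-word gate: audio is only forwarded after a
# local detector believes the wake word was spoken. This is NOT Google's
# implementation; all callback names are hypothetical.

WAKE_WORDS = ("hey google", "ok google")

def wake_word_gate(read_audio_frame, detect_wake_word, stream_to_server):
    """Forward microphone audio to the server only after activation.

    A "false activation" is the case where detect_wake_word() wrongly
    reports a match for background noise, which would start streaming
    audio the user never intended to send.
    """
    listening = False
    while True:
        frame = read_audio_frame()  # short chunk of microphone audio
        if not listening:
            # Local, on-device check; nothing leaves the device here.
            if detect_wake_word(frame, WAKE_WORDS):
                listening = True
        else:
            # After activation, audio goes out for cloud processing.
            finished = stream_to_server(frame)
            if finished:            # end of the user's query detected
                listening = False
```

The point of the sketch is simply that everything hinges on the local detector: if it fires on a stray sound, the device starts streaming, which is exactly the failure mode described below.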

Despite this, of the roughly one thousand voice fragments the broadcaster had access to (all of them in Dutch), 153 were conversations in which nobody gave the activation command to the virtual assistant; instead, a sound had been misinterpreted as the wake word. Among other things, the Belgian outlet claimed to have heard bedroom conversations, exchanges between parents and their children, professional calls, arguments, and scenes of both sex and violence.


Also published on Medium.
