Thursday, April 25, 2024 | Shawwal 15, 1445 H
EDITOR IN CHIEF- ABDULLAH BIN SALIM AL SHUEILI

Listening in? How voice assistants lost their innocence in 2019


Voice assistants seem to be everywhere these days, whether tucked away in Amazon’s Echo smart speakers or deeply baked into Apple’s iPhone. Some find them futuristic; others dismiss them as half-baked gimmickry.


Either way, this year their dirty little secret came out: The things you say to these assistants are sometimes reviewed by teams of quality controllers. Reviewing recordings helps make the software better at understanding and responding to users, but it also means statements made in one’s home might not be as private as some think.


The vast majority of users were unaware of the practice, since the eavesdropping was mentioned only in the privacy small print, if at all.


It became public in April when US financial information and media company Bloomberg reported that voice recordings from Echo speakers, enabled by the voice assistant Alexa, were transcribed, annotated and then fed back into the software by a mix of contractors and full-time Amazon employees working at far-flung sites from Boston to Costa Rica, India and Romania.


A worker in Boston was quoted as saying he had mined voice data for “Taylor Swift” and annotated it to indicate the user meant the US singer-songwriter of that name.


Following the Bloomberg report, it became clear that human monitors also analysed a small percentage of anonymized recordings from Apple’s voice assistant Siri and the Google Assistant.


Providers of voice-activated intelligent virtual assistants have a dilemma. Users expect a voice assistant to understand them, but they want privacy too. And how can the software’s mistakes be eliminated if they’re not identified? Misunderstandings are especially likely when the software has to grapple, for example, with different dialects or accents.


It’s important that humans intervene to gauge whether the voice assistant’s interpretation of requests is accurate, and to train its speech recognition and natural language processing algorithms, a machine-learning practice known as “supervised learning.” Simply providing general input data and then letting the software learn on its own isn’t enough, say industry experts.
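The “supervised learning” loop described above can be sketched in miniature: human reviewers label transcribed utterances with the intent the user actually meant, and a model predicts intents for new utterances from those labelled examples. The snippet below is an illustrative toy, not any vendor’s actual pipeline; all utterances, intent names and the simple nearest-example classifier are invented for the sketch.

```python
# Toy sketch of supervised learning for a voice assistant's
# intent classifier. Human annotators supply the labels; the
# "model" here is a 1-nearest-neighbour lookup by word overlap,
# standing in for a far larger speech/NLU model.

# Labels produced by human reviewers of transcripts (invented data).
labelled_utterances = [
    ("play taylor swift", "play_music"),
    ("play some jazz", "play_music"),
    ("what's the weather today", "get_weather"),
    ("will it rain tomorrow", "get_weather"),
    ("set a timer for ten minutes", "set_timer"),
    ("timer for five minutes", "set_timer"),
]

def predict_intent(utterance, examples):
    """Return the label of the labelled example that shares the
    most words with the utterance (1-nearest-neighbour)."""
    words = set(utterance.lower().split())
    best_label, best_overlap = None, -1
    for text, label in examples:
        overlap = len(words & set(text.lower().split()))
        if overlap > best_overlap:
            best_label, best_overlap = label, overlap
    return best_label

print(predict_intent("play the new taylor swift album", labelled_utterances))
```

The point the experts make in the paragraph above is visible even in this toy: without the human-supplied labels there is nothing for the model to learn from, and each corrected annotation (such as the “Taylor Swift” example) becomes another training example.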


Another stumbling block for the technology is accidental activation. Voice assistants are supposed to begin recording when prompted by a “wake word” such as “Alexa” or “Hey, Siri.” But some activations occur without a prompt, in which case reviewers try to determine the words or noises that triggered them so they can fix the problem.
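The accidental-activation problem can be sketched as a thresholding decision: a detector scores each snippet of audio against the wake word and starts recording only when the score crosses a threshold. The scoring function below is a crude stand-in (character matching rather than acoustics) and the threshold is invented, but it shows the trade-off reviewers chase: a threshold low enough to catch noisy or accented speech also admits near-misses.

```python
# Hedged sketch of wake-word gating. Real detectors score audio
# features, not text; this character-matching "similarity" and the
# threshold value are invented purely to illustrate the trade-off.

WAKE_WORD = "alexa"
THRESHOLD = 0.6  # low enough to tolerate noisy speech...

def similarity(heard, wake_word):
    """Fraction of matching leading characters -- a crude stand-in
    for an acoustic model's confidence score."""
    matches = sum(1 for a, b in zip(heard, wake_word) if a == b)
    return matches / max(len(heard), len(wake_word))

for heard in ["alexa", "alexis", "a letter"]:
    score = similarity(heard.lower(), WAKE_WORD)
    state = "recording" if score >= THRESHOLD else "idle"
    print(f"{heard!r}: score {score:.2f} -> {state}")
```

Here “alexis” scores above the threshold and triggers recording even though no one addressed the device, which is exactly the kind of false activation human reviewers try to diagnose and fix.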


Recordings made after accidental activations are especially problematic from a privacy standpoint, since they capture matters not meant for the voice assistant. An Apple contractor told the British newspaper The Guardian that recordings sometimes picked up confidential medical information, business deals, possible criminal activity and sexual encounters.


In the wake of the revelations, Apple, particularly embarrassed in view of its promises of strict data protection, announced it was suspending human monitoring for users who hadn’t given the company express permission for it. It also said that only Apple employees, not contractors, would review Siri recordings. Google then introduced an opt-in procedure as well.


Amazon lets users opt out of human monitoring, but monitoring remains the default setting. “We’re very convinced that the service gets better when customers allow us to use the data to improve it,” says Dave Limp, Amazon’s devices and services chief, adding, “I hope that someday we won’t need human interaction.”


As for Amazon’s rival tech giants that have made the practice an opt-in, he remarks, “Either they’ve figured out some computer science that we haven’t yet — which I’m deeply sceptical of — or their services aren’t going to improve as fast.”


Limp says reaction to the revelations was stronger in the media than among users. “Consumers continued to use Alexa — they didn’t stop using it.” — dpa

