Bloomberg writes that Amazon employs thousands of contractors and full-time workers around the world, "from Boston to Costa Rica, India and Romania", to listen to voice recordings captured by Echo devices. They work nine hours a day, with each reviewer parsing as many as 1,000 audio clips per shift, according to two workers based at Amazon's Bucharest office, which takes up the top three floors of the Globalworth building in the Romanian capital's up-and-coming Pipera district.

The transcribed recordings can then be fed back to the team programming Alexa, ostensibly to improve the assistant's understanding in areas where it isn't strong already. Alexa needs help from humans to become the "smart" device it is: the human role in developing its software algorithms is often overlooked, according to seven employees who have worked on it. Florian Schaub, a University of Michigan professor who has researched privacy issues with smart speakers, told Bloomberg that many consumers assume such devices are "just doing magic machine learning" on their own. "You don't necessarily think of another human listening to what you're telling your smart speaker in the intimacy of your home", he said.

A spokesperson for the trillion-dollar company said that only "an extremely small sample of Alexa voice recordings" is analysed by staff, and that the information is treated as strictly confidential, protected with access restrictions and encryption. "We have strict technical and operational safeguards, and have a zero tolerance policy for the abuse of our system", the company statement said.

However, some workers say that they hear disturbing things in the recordings: two employees told Bloomberg they heard "what they believe was a sexual assault", but were told by colleagues that it was not Amazon's job to intervene. Other times, there's no clear procedure for handling such material. Workers also allegedly use an internal chat room to share recordings they need help transcribing, or recordings they find amusing.

Amazon is not alone. Apple has disclosed that Siri recordings are stored for six months, after which the audio is stripped of identifying information but can still be used for machine learning. "User voice recordings are saved for a six-month period so that the recognition system can utilize them to better understand the user's voice", the company's white paper says. "After that, the data is stripped of its random identification information but may be stored for longer periods to improve Siri's voice recognition". Google likewise has reviewers who train Assistant, but the clips carry no personally identifiable information and the audio itself is distorted to prevent identification.