
Apple Siri contractors often hear up to 30 seconds of accidental recordings

over 5 years ago by Lucy Cinder


Apple has been open about the fact that it screens some Siri recordings to improve the AI assistant’s ability to “understand you better and recognize what you say.” But there are more of these recordings — notably accidental ones — than most customers realize, The Guardian reports, and they’re being heard by human contractors with high turnover rates, not just internal Apple teams. Worse yet, a whistleblower says that the recordings often contain highly private details that could be used to identify and compromise Apple’s users.

According to the report, the key issue is that Siri can be accidentally triggered by audio cues, anything from the sound of a zipper to the word “Syria,” as well as gestures such as raising an Apple Watch at a certain angle. These inadvertent activations aren’t just frequent; on the Watch, they also produce recordings up to 30 seconds long, some fraction of which are shared with Apple’s contractors for analysis, who “can gather a good idea of what’s going on” from the audio.

The snippets have allegedly included “countless instances” of discussions between doctors and patients, sexual encounters, business deals, and criminal activities such as drug deals, accompanied by the user’s location, contact details, and app data. While contractors aren’t specifically listening for private activities in the recordings, the whistleblower claims that Apple doesn’t do “much vetting of who works there, and the amount of data that we’re free to look through seems quite broad … If there were someone with nefarious intentions, it wouldn’t be hard to identify [people on the recordings].”

Compounding the issue, Siri responds to the question “are you always listening?” by saying “I only listen when you’re talking to me,” providing a false sense of security regarding accidental activations. The most frequent accidental recordings come from Apple Watches and HomePod speakers, the whistleblower said, without specifying ratios for Apple’s other devices, such as iPhones, iPads, and Apple TVs.

Asked for comment, Apple said that Siri requests aren’t associated with the user’s Apple ID, and claimed that Siri responses are analyzed in “secure facilities” by reviewers who are all “under the obligation to adhere to Apple’s strict confidentiality requirements.” The company said that only a random subset of under 1% of daily Siri activations is used for accuracy grading, a number that could still be massive given that there are at least 500 million Siri-enabled devices in use, and that most recordings are only a few seconds long.
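For a sense of scale, here is a rough back-of-the-envelope calculation. Only the 500-million-device floor and the “under 1%” sampling rate come from the article; the per-device activation rate is a hypothetical assumption for illustration.

```python
# Rough estimate of how many Siri clips "under 1%" could mean per day.
# From the article: at least 500M Siri devices, under 1% of activations sampled.
# ASSUMPTION: the average daily activations per device is invented for illustration.
devices = 500_000_000           # Apple's stated lower bound on Siri devices
activations_per_device = 2      # hypothetical average daily activations per device
sampling_rate = 0.01            # "under 1%" of daily activations, per Apple

daily_activations = devices * activations_per_device
sampled_clips = int(daily_activations * sampling_rate)
print(f"~{sampled_clips:,} clips sampled per day")  # ~10,000,000 under these assumptions
```

The result scales linearly with the assumed activation rate; the point is simply that 1% of a very large number is still a very large number.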

Apple has positioned itself as the world’s most privacy-conscious tech giant, in recent months marketing its devices as particularly secure with users’ personal data. In January, it put up a gigantic billboard at CES under the theme “what happens on your iPhone, stays on your iPhone,” and representatives have portrayed the company as wholly uninterested in what’s on users’ devices, apart from keeping their content safe from unwanted inspection. The company also explicitly said last year that iPhones aren’t spying on users, though bugs in its FaceTime Audio system have twice enabled other users to surreptitiously hear private conversations in the vicinity of iPhones.

Source: VentureBeat
