“You can definitely hear a doctor and patient, talking about the medical history of the patient. Or you’d hear someone, maybe with car engine background noise – you can’t say definitely, but it’s a drug deal … you can definitely hear it happening. And you’d hear, like, people engaging in sexual acts that are accidentally recorded on the pod or the watch.”
The contractor also said incidents of accidental recordings on the Apple Watch were “incredibly high”, presumably because the Watch is meant to be worn throughout the day, which increases the chances of it picking up audio it shouldn’t be hearing. There are two major issues here. The first is that Apple doesn’t make it readily known that your Siri audio may be processed by humans. Apple’s own privacy explainer details that Siri data is sent to “Apple servers” to improve the quality of the service, but it doesn’t explicitly mention that humans are reviewing it, nor does it mention third-party contractors. The second is that the contractor said it wouldn’t be difficult to identify people from the recordings – depending on what’s said – and that there isn’t much vetting of who is hired to process them. However, on the same privacy explainer site, Apple repeatedly makes it clear that it takes multiple steps to remove anything identifying from Siri data sent to Apple.
“Analysis happens only after the data has gone through privacy-enhancing techniques so that it cannot be associated with you or your account,” the explainer reads.

In a statement to The Guardian, Apple said that a random subset – less than 1% of daily Siri activations – was used for grading. It also said that “user requests are not associated with the user’s Apple ID. Siri responses are analysed in secure facilities and all reviewers are under the obligation to adhere to Apple’s strict confidentiality requirements.”