Le Bonniec worked for Apple subcontractor Globe Technical Services in Ireland for two months, performing this manual analysis of audio recorded by Siri, and witnessed what he says was a “massive violation of the privacy of millions of citizens.” “All over the world, people had their private life recorded by Apple up to the most intimate and sensitive details,” he explained. “Enormous amounts of personal data were collected, stored and analyzed by Apple in an opaque way. These practices are clearly at odds with the company’s privacy-driven policies and should be urgently investigated by Data Protection Authorities and Privacy watchdogs.”
But despite the fact that Apple acknowledged it was in fact transcribing and tagging huge numbers of conversations that users were unaware had been recorded by their Macs and iOS devices, promised a “thorough review of our practices and policies,” and apologized that it hadn't “been fully living up to our high ideals,” Le Bonniec says nothing has changed.
“Nothing has been done to verify if Apple actually stopped the programme. Some sources already confirmed to me that Apple has not,” he said.
"I believe that Apple's statements merely aim to reassure their users and public authorities, and they do not care for their user's consent, unless being forced to obtain it by law,” says the letter. “It is worrying that Apple (and undoubtedly not just Apple) keeps ignoring and violating fundamental rights and continues their massive collection of data.”
In effect, he argues, “big tech companies are basically wiretapping entire populations despite European citizens being told the EU has one of the strongest data protection laws in the world. Passing a law is not good enough: it needs to be enforced upon privacy offenders.”
Not good

How bad is the situation? According to Le Bonniec: “I listened to hundreds of recordings every day, from various Apple devices (e.g. iPhones, Apple Watches, or iPads). These recordings were often taken outside of any activation of Siri, i.e. without any actual intention from the user to activate it for a request.
“These processings were made without users being aware of it, and were gathered into datasets to correct the transcription of the recording made by the device. The recordings were not limited to the users of Apple devices, but also involved relatives, children, friends, colleagues, and whoever could be recorded by the device.
“The system recorded everything: names, addresses, messages, searches, arguments, background noises, films, and conversations. I heard people talking about their cancer, referring to dead relatives, religion, sexuality, pornography, politics, school, relationships, or drugs with no intention to activate Siri whatsoever.”
So, pretty bad.
How did Apple justify what would appear to be a transparently illegal act carried out daily on millions of people? It didn’t. After the program was exposed last year, Apple said it would make changes: it would move the review system in-house and make sure that only recordings made by users who had explicitly opted in would be analyzed.

That opt-in/opt-out option was added to software updates for iPhones and Macs late last year, but the system and process remain entirely opaque. And Apple has maintained its usual approach of refusing to even acknowledge requests for more information.
What about the Irish Data Protection Commission (DPC) whose job it is to make sure companies within its jurisdiction (most tech giants have put their European headquarters in Ireland thanks to very generous tax breaks) comply with the law?
And the regulators?

In December 2019, when the news broke of the program, the DPC put out a statement that referenced digital assistants from Google and Amazon as well as Apple and said it was “currently engaging with those organisations to establish the manner by which their voice assistant products comply with data protection requirements.”
Its plan, it said, was to “identify common areas of concern and to identify what further steps including guidance may be necessary to bring additional clarity to the application of data protection requirements in the use of voice assistant technology.” We're still waiting.
Le Bonniec makes it plain he doesn’t believe the issue is being taken seriously enough and that his letter is intended to push the matter. “This public letter is meant to ask authorities to take action and to call upon people who can testify to their experience with Apple, through a public channel or whistleblowing. This statement will also be shared with the press, and to the organisations protecting our digital rights.
"By doing so, I am breaching my Non Disclosure Agreement in order to help the authorities investigate and determine whether Apple actually ceased these practices. The risk I am taking will be worth it only if this letter is followed by a proper investigation and action from your side. I trust you understand that the privacy of millions of people is at stake and that your action is crucial to protect it.” ®