A new report by the Guardian reveals that Apple contractors also have access to Siri voice recordings, and while reviewing those recordings they regularly hear confidential medical information, drug deals, and even couples engaged in sexual encounters.
That’s incredibly horrifying, but what’s more worrying is that many of these confidential, extremely private moments are captured because Siri wakes up accidentally, without users ever saying the ‘Hey Siri’ wake word. To make matters worse, these clips are accompanied by identifying information such as the user’s location, contact details, and app data, according to a whistleblower who spoke to the Guardian on condition of anonymity.
The whistleblower said, “There have been countless instances of recordings featuring private discussions between doctors and patients, business deals, seemingly criminal dealings, sexual encounters and so on. These recordings are accompanied by user data showing location, contact details, and app data.” The whistleblower added, “Apple is subcontracting out, there’s a high turnover. It’s not like people are being encouraged to have consideration for people’s privacy, or even consider it. If there were someone with nefarious intentions, it wouldn’t be hard to identify [people on the recordings].”
Apple, for its part, said (in language that sounds exactly like Google’s response) that the data “is used to help Siri and dictation … understand you better and recognise what you say.” And yet, Siri still sucks.