According to a company whistleblower, Apple passes on a small proportion of Siri recordings to contractors, who grade the responses on a variety of factors: whether the activation of Siri was deliberate or accidental, whether the query was something Siri could be expected to help with, and whether the response was appropriate.
Apple says the data is used to help Siri better understand the user and recognise what is said.
Just a small number of Siri requests are analysed, according to Apple, and this forms part of the company's efforts to improve the service. All analysis is also carried out in secure facilities.
The company added that this grading accounts for less than 1% of daily Siri activations, and that the recordings involved are only a few seconds long.
All well and good, but the real concern is the lack of proper disclosure, given how frequently accidental activations pick up extremely sensitive personal information.
Siri can be activated accidentally, and mistakes are certainly possible - that's not the issue. What is of concern is that these recordings capture private discussions, from conversations with doctors to sexual encounters, all of which are accompanied by user data showing location, contact details, and app data.
Apparently, the Apple Watch and the company’s HomePod smart speakers are the main sources of mistaken recordings.
So what is Apple doing to address this? Not as much as you'd think, and there are real fears that this information could be misused. Is it vetting the subcontractors it uses for this work? How are these recordings kept? Are they as private as Apple claims?
Apple is not alone in employing human oversight of its voice assistant; Google and Amazon do the same. But Apple, unlike its rivals, prides itself on the privacy of its users.
As with so many consumer-oriented devices, security should be taken much more seriously than at present.