It does take steps to maintain privacy, but there are still concerns.
As much as Apple emphasizes privacy, it hasn’t completely avoided eyebrow-raising behavior. The Guardian has learned from a source that Siri quality control contractors regularly hear sensitive info, including medical details, criminal activity and even “sexual encounters,” much like their counterparts at Amazon and Google. They listen to less than one percent of daily Siri activations, and frequently for only a few seconds each, but some recordings come with request-linked data like app info, contacts and locations. Like its peers, Apple is gauging how well its voice assistant fulfills requests and wants to know what happened after a command.
Apple stressed in a statement that it has multiple privacy protections in place. The recordings aren’t tied to Apple IDs, and they’re studied in “secure facilities” by people bound to “strict confidentiality requirements.” The company says it can’t know whose device made a request or otherwise connect recordings to individual users.
That still leaves some concerns, though, and not just over the sensitive info itself. The source claims there’s “high turnover” among contractors and relatively little vetting of new hires, meaning a malicious recruit could theoretically abuse the data. There also don’t appear to be clear policies for handling sensitive recordings. Staff are encouraged to report accidental “hey Siri” activations as technical problems, but not to flag the content itself.
It would be difficult to avoid these recordings so long as Apple and its rivals want humans checking the quality of voice assistant requests, as hotwords like “hey Siri” and “Alexa” aren’t foolproof. The companies haven’t readily informed the public about their human QA teams, though. And where Amazon and Google let you opt out of some uses for your recordings, Siri tends to be an all-or-nothing affair: you can disable “hey Siri” or turn off Siri altogether, but there’s no way to use the assistant while opting out of human review. This doesn’t mean you’re at serious risk of privacy violations just by using Siri — only that Apple might have room for improvement in how it handles inadvertent audio captures.