“Hey, Siri…” “I am listening…”: Apple, Privacy & Siri
Summer 2019 was quite eventful for virtual assistants across platforms. The big five tech companies were all exposed for recording their virtual assistant users’ audio, Apple among them. These revelations prompted Apple first to temporarily suspend the practice and then to change its privacy settings for Siri, introduced with a new update in the fall of 2019.
Virtual assistant technology is relatively new in the world of digital technology, yet it has already changed how many households interact with their devices. Siri is featured in most Apple devices today, notably the HomePod and the Apple Watch. So when it emerged that Siri’s audio recordings were being sent back to Apple, and moreover that the company used contractors as human moderators to “grade” the program’s performance, virtual assistants were propelled into the ongoing debate about privacy in the digital world. (2)
What is “grading”?
The grading system pairs recordings with computer-generated transcripts so that human reviewers can effectively “grade” Siri’s performance and accuracy. However, Siri is activated by accident quite often, and rarely at the most appropriate moments; for some users it triggered at the mere sound of a zip. Graders also report that most false triggers come from the HomePod and the Apple Watch, the two devices that potentially hear the most sensitive information about their users. (2)
Apple reassures its users that recordings and transcripts are not commercialised in any way and, in particular, are never sold to third parties. Moreover, the system explicitly keeps recordings away from personal data such as the Apple ID or phone number; instead, each recording is assigned a “random identifier”. Furthermore, less than 0.2% of requests are reviewed at all. Yet even if Apple records its users solely to improve Siri, as it insists, the practice remains quite a sensitive issue. (3)
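To make the mechanics concrete, here is a minimal sketch of what such a pipeline could look like. All type and function names are hypothetical, and the sampling rate is simply the figure Apple cites, not its actual implementation:

```swift
import Foundation

// Hypothetical model of a request queued for grading: the audio is tied to
// a random identifier rather than to the Apple ID or phone number.
struct GradingSample {
    let randomIdentifier = UUID()   // stands in for Apple's "random identifier"
    let audio: Data                 // the recorded snippet
    let transcript: String          // the computer-generated transcript
}

// Select fewer than 0.2% of requests for human review.
func selectedForGrading(rate: Double = 0.002) -> Bool {
    Double.random(in: 0..<1) < rate
}

// A human grader would then compare the audio against the transcript and
// mark whether Siri understood and responded correctly.
let sample = GradingSample(audio: Data(), transcript: "Set a timer for ten minutes")
if selectedForGrading() {
    print("Queue sample \(sample.randomIdentifier) for a grader")
}
```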
Fall Update
Following the exposure, Apple changed its Siri privacy and grading policy: the default is now not to share recordings with Apple, and users must opt in rather than opt out as in previous versions. However, computer-generated transcripts are still recorded and sent to Apple regardless of the user’s choice. Furthermore, to delete their Siri request history, a user has to disable the virtual assistant entirely; otherwise, the data is kept for up to six months. (1)
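Modelled as code, the reported policy might look roughly like the sketch below. The names are hypothetical, and the six-month window is the retention period described in the coverage, not a documented constant:

```swift
import Foundation

// Hypothetical model of the post-update defaults: audio sharing is opt-in
// (off by default), while transcripts are kept regardless of the choice.
struct SiriPrivacySettings {
    var shareAudioRecordings = false   // user must opt in, not opt out
    var siriEnabled = true             // disabling Siri is the only way to clear history
}

struct LoggedRequest {
    let transcript: String             // sent to Apple in any case
    let audio: Data?                   // present only if the user opted in
    let loggedAt: Date
}

// Whether a logged request is still retained under the reported policy.
func isRetained(_ request: LoggedRequest,
                settings: SiriPrivacySettings,
                now: Date = Date()) -> Bool {
    guard settings.siriEnabled else { return false }   // history cleared when Siri is off
    let sixMonths: TimeInterval = 182 * 24 * 60 * 60   // roughly six months
    return now.timeIntervalSince(request.loggedAt) < sixMonths
}
```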
Outsourcing of these recordings to contractors would also stop; after the update, only Apple employees would be allowed to grade Siri’s performance. (3) However, as of the writing of this blog post, it is unclear whether old voice recordings harvested prior to the exposure will be deleted. (1)
New Platform-Made Legislation on Privacy
These changes were brought about by the public outcry following The Guardian’s article in July 2019; Apple reacted quickly, first suspending the grading programme and then changing its configuration. But Siri users’ data is still handled and regulated by Apple alone, and though the company insists that user privacy is of the utmost importance to it, one is left to wonder why this kind of policy was not implemented from the start.
Issue of Privacy & Apple
Apple has quite explicitly positioned itself on the market as a pro-privacy company. It directly highlights this point in its public statements and even in the support document accompanying the new update. (3)
However, prior to the update Apple was outsourcing this grading to contractors, while reassuring users that those contractors abided by Apple’s regulations and guidelines. Nonetheless, some of those recordings contained the most sensitive information imaginable: private conversations, confidential medical details, drug deals, even sexual encounters. Moreover, contractors who went public said that when a recording was a false trigger, it was classified as a technical problem, but the recorded content itself remained untouched. Some of them also confessed that there was not much of a vetting process for who gets hired as a grader. (4) So your most private information could, without your knowledge, potentially be overheard by a third-party grader who was not properly vetted for this kind of work and who could use that information in any way they saw fit. For the average user this may not be such a problem (who would be interested in the private comings and goings of an average university student?), but for a public figure the situation changes, since they could be recognised by their voice and taken advantage of. To its credit, Apple learnt from this mistake and responded with an update rather swiftly.
With the fall update, Apple also promised that any recording classified as a “false trigger” would be deleted. The tension remains, however: if Siri does not collect data about requests, it cannot learn, and its performance stagnates. This illustrates how “user privacy expectations are sometimes characterised as obstacles to development” (Greene and Shilton, 2017): to improve Siri’s performance and accuracy, data about users’ requests and the assistant’s responses needs to be analysed. Apple has been facing this dilemma on different fronts for quite a while now (Greene and Shilton, 2017).
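A hypothetical sketch of that promise, under the assumption that graded recordings feed some training store, shows why deletion and learning pull in opposite directions:

```swift
import Foundation

// Hypothetical grading outcome for a reviewed recording.
enum GradingOutcome {
    case intentional    // the user really addressed Siri
    case falseTrigger   // accidental activation
}

// Only intentional recordings remain available for improving Siri;
// false triggers are dropped, so nothing is learnt from them.
func ingest(_ recording: Data, outcome: GradingOutcome, trainingStore: inout [Data]) {
    switch outcome {
    case .intentional:
        trainingStore.append(recording)   // retained: Siri can learn from it
    case .falseTrigger:
        break                             // deleted: privacy preserved, no learning signal
    }
}
```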
Your audio requests to Siri can be viewed through the lens of browser search history, given their frequent use and mass proliferation. In the browser’s case, however, your searches and data are visible to you and can be deleted; to do the same with Siri, you need to turn it off entirely, and when you activate it again, the process of learning begins anew. To continue the search history analogy: your browser still collects information about your usage in some form or another in order to improve, which is what Siri does as well. Virtual assistant technology, however, stores only requests, which brings us to the debate about how much users actually value privacy and how much of it they are willing to give up to improve their experience with virtual assistants.
The real world is becoming ever more integrated with the digital one through HomePods and similar devices, many of which use virtual assistant technology. But even if Siri and its peers have come quite far already, they still have a long way to go in terms of privacy settings.
Notes:
- Brodkin, Jon. “Apple to Stop Storing Siri Audio after Contractors Heard Private Talks and Sex.” Ars Technica, 29 Aug. 2019, https://arstechnica.com/tech-policy/2019/08/apple-will-stop-storing-your-siri-voice-recordings-by-default/.
- Hern, Alex. “Apple Contractors ‘Regularly Hear Confidential Details’ on Siri Recordings.” The Guardian, Guardian News and Media, 26 July 2019, https://www.theguardian.com/technology/2019/jul/26/apple-contractors-regularly-hear-confidential-details-on-siri-recordings.
- “Siri Privacy and Grading.” Apple Support, Apple, 28 Aug. 2019, https://support.apple.com/en-us/HT210558.
- Casey, Jess. “Apple Contractors Listened to 1,000 Siri Recordings per Shift, Says Former Employee.” Irish Examiner, Irishexaminer.com, 22 Aug. 2019, https://www.irishexaminer.com/breakingnews/ireland/apple-contractors-listened-to-1000-siri-recordings-per-shift-says-former-employee-945575.html.
- Greene, Daniel, and Katie Shilton. “Platform Privacies: Governance, Collaboration, and the Different Meanings of ‘Privacy’ in iOS and Android Development.” New Media & Society, vol. 20, no. 4, 2017, pp. 1640–1657, doi:10.1177/1461444817702397.
- Brandom, Russell. “Apple Wants to Be the Only Tech Company You Trust.” The Verge, The Verge, 26 Mar. 2019, https://www.theverge.com/2019/3/26/18282158/apple-services-privacy-credit-card-tv-data-sharing.
- Hern, Alex. “Apple Apologises for Allowing Workers to Listen to Siri Recordings.” The Guardian, Guardian News and Media, 29 Aug. 2019, https://www.theguardian.com/technology/2019/aug/29/apple-apologises-listen-siri-recordings.
- Hollister, Sean. “Microsoft’s New Privacy Policy Admits Humans Are Listening to Some Skype and Cortana Recordings.” The Verge, The Verge, 14 Aug. 2019, https://www.theverge.com/2019/8/14/20805801/microsoft-privacy-policy-change-humans-listen-skype-cortana-voice-recording.
- “Apple Contractors Listened to 1k Siri Recordings per Shift: Former Employee.” Hacker News, https://news.ycombinator.com/item?id=20788463.
- Courtney, M. “Careless Talk Costs Privacy [Digital Assistants].” Engineering & Technology, vol. 12, no. 10, Jan. 2017, pp. 50–53., doi:10.1049/et.2017.1005.