Don’t Leave the Light On: Visualizing Privacy in iOS14

September 28, 2020
The author with the iOS14 camera indicator light (photo by Leo Bevington)

The iOS14 update offers new ways of visualizing privacy through features such as the camera indicator light and (still forthcoming) opt-in tracking. Rather than putting pressure on developers, these features visualize a continued shift of power over digital privacy from the corporation to the individual. This article takes a critical approach to analyzing the visualization of user privacy.

Visuals of an Algorithmic Gaze

            Apple’s iOS14 software update started rolling out earlier this month, and along with a slew of new customizable features, Apple has placed significant focus on privacy. Users now have more options for restricting photo access by individual apps, Sign In with Apple is now available across more platforms, and Safari now presents tracking reports on both desktop and iOS (“iOS14”). With the new “approximate location” toggle, users can give apps (like weather or local news) only a rough location that is still precise enough to provide relevant data (O’Flaherty). An indicator light tells users when an app is using their camera or microphone, and Apple’s Control Center specifies which app(s) have been using those features recently (“iOS14”). Announced alongside this month’s launch (but still forthcoming) is an App Store requirement that developers provide detailed privacy usage reports up front, along with a requirement that users be allowed to opt in to tracking (“User Privacy and Data Use”).
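            For a sense of how the “approximate location” toggle looks from the other side of the screen, the sketch below is a minimal illustration of Core Location’s new iOS14 accuracy API, not code taken from Apple’s documentation or from this article’s sources. It assumes a delegate-based setup and a hypothetical purpose key named “WeatherAlerts” registered in the app’s Info.plist, and simply shows how an app learns whether the user has left “Precise Location” switched on.

import CoreLocation

// Minimal sketch of how an app observes the iOS14 "approximate location"
// setting. The purpose key "WeatherAlerts" is a placeholder and would need
// a matching entry under NSLocationTemporaryUsageDescriptionDictionary.
final class LocationChecker: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()

    override init() {
        super.init()
        manager.delegate = self
        manager.requestWhenInUseAuthorization()
    }

    // iOS14 calls this whenever authorization or accuracy changes.
    func locationManagerDidChangeAuthorization(_ manager: CLLocationManager) {
        switch manager.accuracyAuthorization {
        case .fullAccuracy:
            // "Precise Location" is on: exact coordinates are available.
            manager.startUpdatingLocation()
        case .reducedAccuracy:
            // The user flipped the toggle off: coordinates are coarse.
            // The app can still ask for precision on a one-off basis.
            manager.requestTemporaryFullAccuracyAuthorization(
                withPurposeKey: "WeatherAlerts")
        @unknown default:
            break
        }
    }
}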

            Is this a big step for user privacy on iOS devices? If the last 13 years of smartphone proliferation have taught us anything, it should be a heightened wariness about “dataveillance,” the discreet kind of data collection that “vampirically feeds off of our identities, our ‘likes,’ and our everyday habits” (Gitelman 10). The adage “if you’re not paying for it, you’re the product” becomes especially poignant as we subject our everyday moves to “an algorithmic gaze, a machine vision that emphasizes market values like productivity, efficiency, profit, and mitigation of risk and liability” (Silverman 149). Making tracking visible is not a new idea, either: the search engine DuckDuckGo and various third-party browser extensions have been exposing trackers in much the way the Safari update now does, and the Google Play store has let users see which permissions an app may ask for since 2012 (Kummer and Schulte).

            However, the camera indicator light in iOS14 introduces a new form of visualizing privacy, one that directly invokes the image of an “algorithmic gaze.” Data collection that was once invisible is made visible by a small green eye that blinks open in the corner of the screen. You know you’re being watched, but what are you going to do about it?

Governing (and a Privacy Policy)

            A close reading of Apple’s privacy policy (which has not changed since 2019) sheds light on how users are afforded specific tools with which to resist tracking while being discouraged from others (Cooke). These governed forms of resistance are enacted through visual techniques, such as privacy toggles or the camera indicator light, that encourage users to be self-motivated in their individual fight against tracking. These disciplinary techniques of power divide the task of digital privacy into something seemingly accessible and attainable by the individual. The ability to stop tracking is at your fingertips: just slide the toggle and Instagram can’t know where you are! This sense of control “engenders a specific understanding of ‘security threat’ that is central to the maintenance of negative feedback” (Cooke 97). Negative feedback here refers to features built into the software so that users’ choices about privacy are productive (aligning with choices governed by the software) rather than disruptive to the software system. Cooke argues that this cybernetic understanding of governmentality offers a new way of thinking about biopolitical techniques that act on the level of the species body through principles of “maintaining efficient, reliable, and feedback-oriented control loops” (93). Tools that visualize the limitations of privacy mitigation by users demonstrate that “the public gaze is directed to particular parts of the problem” (Flyverbom 8). This “guidance of attention” allows users to interact with some forms of information control while drawing attention away from others (Flyverbom 10; Cooke 93).

            Cooke distinguishes between two forms of data referred to in Apple’s privacy policy: content data and metadata. While Apple promises to protect specific kinds of personal information (such as names, mailing addresses, and phone numbers), it also outlines a wealth of metadata (data that, Apple says, is not on its own directly associated with the user) that it collects for the purposes of advertising and understanding user behavior. While this metadata might seem unimportant, developers can collect a unique device code (the Identifier for Advertisers, or IDFA) that aggregates the metadata and ties together this seemingly random user information (Cox).
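            To make concrete what that “unique device code” looks like in practice, here is a minimal sketch (my own illustration, not drawn from Cooke or Cox) of how an app reads the IDFA through Apple’s AdSupport framework. The identifier is the same value in every app on the device, which is exactly what lets metadata that looks anonymous in isolation be stitched back together into a profile.

import AdSupport

// The advertising identifier is shared across every app on the device,
// so metadata that seems random in isolation can be joined on it.
let idfa = ASIdentifierManager.shared().advertisingIdentifier

// When the user has disallowed tracking, the value collapses to all zeroes.
print(idfa.uuidString)   // e.g. "00000000-0000-0000-0000-000000000000"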

            The Apple privacy policy shifts liability and responsibility away from the company by noting that third parties (apps downloaded from the App Store) govern and protect the metadata they collect without oversight by Apple (Cooke). Kummer and Schulte’s 2019 study noted that Apple’s lack of up-front information about what data an app will access means that “apps with privacy-sensitive permissions are on average more successful in the Apple iOS than in the Android OS, where these permissions are visible to the user before the app is installed” (3485). While the App Store plans to introduce these features later this year (along with requiring developers to tell users if they collect an IDFA and letting users opt in to tracking), the update has been held off while the Big Bad Wolf of data collection (Facebook) scrambles to understand an iOS world in which users are given the option to opt out of its invasive tracking tools (“Preparing Audience Network for iOS14”).
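            When the delayed opt-in requirement does arrive, it takes the form of the App Tracking Transparency framework: the IDFA only resolves to a usable value after the user explicitly agrees. The sketch below is a hedged illustration of that flow under the assumption of an NSUserTrackingUsageDescription string in the app’s Info.plist; it is not Facebook’s or Apple’s own implementation.

import AppTrackingTransparency
import AdSupport

// Sketch of the opt-in flow under App Tracking Transparency (iOS14+).
// Until the user taps "Allow", the IDFA below stays all zeroes.
func requestTrackingConsent() {
    ATTrackingManager.requestTrackingAuthorization { status in
        switch status {
        case .authorized:
            // Opt-in: the IDFA can be used for cross-app tracking.
            let idfa = ASIdentifierManager.shared().advertisingIdentifier
            print("Tracking allowed, IDFA:", idfa.uuidString)
        case .denied, .restricted, .notDetermined:
            // Opt-out (or no decision yet): no usable identifier.
            print("Tracking not authorized")
        @unknown default:
            break
        }
    }
}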

            While we wait for opt-in tracking and an updated privacy policy, iOS14 has done little more than give users the ability to visualize how pervasive tracking really is. The true novelty of iOS14 lies not in its new techniques of visualizing privacy, but in its continued shift of the power and responsibility for digital privacy into the hands of the consumer, who remains the product. Perhaps hope will come in the form of opt-in tracking later this year. With an updated privacy policy and a genuine focus on user safety, Apple still has the ability to provide a truly distinctive privacy environment for iOS users.

Works Cited

Cooke, Thomas. “Metadata, Jailbreaking, and the Cybernetic Governmentality of iOS: Or, the Need to Distinguish Digital Privacy from digital privacy.” Surveillance & Society, vol. 18, no. 1, 2020, pp. 90-103, https://ojs.library.queensu.ca/index.php/surveillance-and-society/index.

Cox, Kate. “iOS14 privacy settings will tank ad targeting business, Facebook warns.” Ars Technica, 26 August 2020, https://arstechnica.com/tech-policy/2020/08/ios-14-privacy-settings-will-tank-ad-targeting-business-facebook-warns/. Accessed 23 September 2020.

Flyverbom, Mikkel. “Disclosing and Concealing: Internet Governance, Information Control and the Management of Visibility.” Internet Policy Review, vol. 5, no. 3, Sept. 2016, DOI:10.14763/2016.3.428.

Gitelman, Lisa. Raw Data Is an Oxymoron. MIT Press, 2013. Introduction, pp. 1-14.

Kummer, Michael, and Patrick Schulte. “When Private Information Settles the Bill: Money and Privacy in Google’s Market for Smartphone Applications.” Management Science, vol. 65, no. 8, Aug. 2019, pp. 3470-94, DOI:10.1287/mnsc.2018.3132.

“iOS14.” Apple, https://www.apple.com/ios/ios-14/. Accessed 25 September 2020.

O’Flaherty, Kate. “Apple iOS14: Brilliant New Security and Privacy Features You Can Use Now.” Forbes, 15 September 2020, https://www.forbes.com/sites/kateoflahertyuk/2020/09/15/apple-ios-14-launch-confirmed-brilliant-new-security-and-privacy-features-arriving-tomorrow/#2e7fc7394b38. Accessed 23 September 2020. 

“Preparing Audience Network for iOS14.” Facebook, 26 August 2020, https://www.facebook.com/audiencenetwork/news-and-insights/preparing-audience-network-for-ios14/. Accessed 25 September 2020. 

Silverman, Jacob. “Privacy Under Surveillance Capitalism.” Social Research: An International Quarterly, vol. 84, no. 1, 2017, pp. 147-164.

“User Privacy and Data Use.” Apple, https://developer.apple.com/app-store/user-privacy-and-data-use/. Accessed 24 September 2020.
