“Encrypt your face”: protestors Signal digital data privacy fears with new face-blurring feature

Published: 28 September 2020
Image: Elijah Nouvelage/Getty Images

In the wake of George Floyd’s death, protestors flocked to the encrypted messaging application (“app”) Signal to plan events and share content. In response, Signal quickly released a new tool that allows faces to be blurred in photos. These events illuminate a shift in user behaviour, with fears over digital data privacy acting as the catalyst for a change in communication preferences.

Signal is a free, encrypted communication service that uses the internet to send one-to-one and group messages, including images, videos, files, and voice notes, as well as to make voice and video calls. The open-source software employs end-to-end encryption, meaning only the sender and the receiver can read the messages. This contrasts with the Short Message Service (“SMS”), which is unencrypted. The popular communication app WhatsApp does apply end-to-end encryption to message content (it is built on the Signal Protocol), but its owner, Facebook, Inc., collects metadata about who messages whom and when, and cloud backups of chats were not end-to-end encrypted at the time.
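To make the end-to-end idea concrete, the minimal sketch below uses the PyNaCl library rather than Signal’s actual protocol (which is built on the Double Ratchet and adds forward secrecy): the message is encrypted on the sender’s device with keys held only by the two endpoints, so a relaying server sees nothing but ciphertext. The key names, message text, and library choice are illustrative assumptions, not Signal internals.

```python
# Minimal sketch of the end-to-end principle using PyNaCl (illustrative only;
# Signal itself uses the more sophisticated Double Ratchet protocol).
from nacl.public import PrivateKey, Box

# Each party generates a key pair; the private keys never leave their devices.
sender_private = PrivateKey.generate()
receiver_private = PrivateKey.generate()

# The sender encrypts with their own private key and the receiver's public key.
# The resulting ciphertext is all that any relaying server ever sees.
sending_box = Box(sender_private, receiver_private.public_key)
ciphertext = sending_box.encrypt(b"Meet at the corner of 5th and Main at noon.")

# Only the receiver, who holds the matching private key, can decrypt it.
receiving_box = Box(receiver_private, sender_private.public_key)
plaintext = receiving_box.decrypt(ciphertext)
print(plaintext.decode())
```

The sketch shows the principle rather than the protocol: Signal layers per-message key rotation and other machinery on top of this basic public-key exchange.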

In addition to end-to-end encryption, the Signal Foundation, the non-profit organisation behind the development of the software underlying the Signal app, released a feature on 3 June 2020 that further protects the privacy of its users by allowing them to blur faces in photos within the app, either through automatic face detection or by drawing over faces manually. This was in direct response to the Black Lives Matter protests in America, with the organisation stating that “we support the people who have gone into the streets to make their voices heard”.
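As a rough illustration of how automatic face blurring can work, the sketch below pairs OpenCV’s stock Haar cascade face detector with a Gaussian blur. Signal’s own in-app implementation is not reproduced here; the detector choice, blur parameters, and file names are assumptions for demonstration only.

```python
# Rough sketch of automatic face blurring with OpenCV's bundled Haar cascade
# detector; not Signal's implementation, and the file names are hypothetical.
import cv2

image = cv2.imread("protest_photo.jpg")  # hypothetical input file
gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)

# Load the frontal-face cascade that ships with opencv-python.
detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)
faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

# Replace each detected face region with a heavily blurred copy of itself.
for (x, y, w, h) in faces:
    roi = image[y:y + h, x:x + w]
    image[y:y + h, x:x + w] = cv2.GaussianBlur(roi, (51, 51), 0)

cv2.imwrite("protest_photo_blurred.jpg", image)
```

Because the processing happens entirely on the local machine, no unblurred copy of the image needs to leave the device, which is the privacy property the feature trades on.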

In turn, users have shown their support for the app and its focus on their privacy. In the week before George Floyd’s death on 25 May 2020, Signal was downloaded about 50,000 times; the following week, new downloads increased to almost 80,000; and in the first week of June, downloads reached 183,000. The recent surge in downloads signals a shift in user sentiment: digital privacy and data protection are being reprioritised over the convenience that the network effects of apps such as WhatsApp afford.

This trend can be related to the digital data classes discussed by Manovich (2011), where he writes about “those who create data (both consciously and by leaving digital footprints), those who have the means to collect it, and those who have expertise to analyze it”. Further, boyd and Crawford (2012) posit that the analysers are “the most privileged” and “get to determine the rules”. This digital divide strikes at the heart of the issue, where unwilling producers want to take back control of their data, especially when it may be used against them.

An example of how data can be used against the wishes of its creator was witnessed when Signal was subpoenaed to provide information relating to a phone number for a grand jury investigation. By design, the only information Signal could provide was “the date and time a user registered with Signal and the last date of a user’s connectivity to the Signal service”. The organisation shared this on Twitter after a gag order had elapsed, fuelling the app’s reputation as a user privacy vigilante.

Of course, not everyone agrees with the idea of end-to-end encryption and the impunity that it can provide. On 23 June 2020, Attorney General William Barr stated: “While strong encryption provides enormous benefits to society and is undoubtedly necessary for the security and privacy of Americans, end-to-end encryption technology is being abused by child predators, terrorists, drug traffickers, and even hackers to perpetrate their crimes and avoid detection”. This perspective can be viewed within Christin’s (2020) framework of understanding “tracking as a specific kind of power that operates from a distance”. In this approach, accessing user data could serve the common good of society without necessarily encroaching on the privacy of individuals.

This issue of data accessibility versus data privacy has continued to evolve through the role that Signal is playing in the recent American protests. Features such as the face-blurring tool shield the identity of users while they participate, both online by sharing content and in person by partaking in activist demonstrations: “We don’t want them to know where we are, so they can’t stop us at any point. On Signal, being able to communicate efficiently, and knowing that nothing is being tracked, definitely makes me feel very secure”.

This sense of security contrasts with what Solove (2015) describes as a “chilling effect”, where tracking “can discourage meaningful social participation and democratic debate”. Without digital artifacts such as Signal and the safety that its features afford its users, power may become completely centralised in the hands of profit-driven technology companies, as well as the law enforcement agencies that govern them. In this Orwellian scenario, would as many people still speak up, either in public or in “private”, knowing that their every word could be tracked and potentially used against them in a digitised panopticon?

The dichotomy between surveillance and privacy is further highlighted by the face-blurring feature, as it encourages users to hide in plain sight, distributing content without revealing the identity of the image’s subject or sender. The technological affordances of this tool (the capability to send information securely, secretly, and anonymously) simultaneously encourage deeper social affordances, including a decentralised culture of democratic expression.

These affordances can be overlaid with a problem articulated by Olson (1965), who observed “that in large groups in which individual contributions are less noticeable, rational individuals will free-ride on the efforts of others”; however, Lupia and Sin (2003) showed how this “may play out differently under conditions of radically reduced communication costs”, with Bennett and Segerberg (2012) recognising the power of “digital media as organizing agents”. It can be argued that Signal and its tools reduce the potential costs of communicating, owing to the confidentiality and privacy afforded to its users, and consequently play a role in the battle for control over digital data.

The recent rise in the popularity of Signal and the reactive release of the face-blurring feature present a reversal of the pattern noted by Lovink (2008), who observes: “Security and privacy of information are rapidly becoming the new economy and technology of control. And the majority of users, and indeed companies, are happily abandoning the power to self-govern their informational resources”. As more and more people become increasingly concerned with how their data is collected, preserved, shared, analysed, monetised, and (ab)used, will we see a paradigm shift in how we treat our data and, more concretely, the organisations, platforms, and applications that we choose to create it with in the first place?

References

Bennett, W. Lance and Segerberg, Alexandra (2012); “The Logic of Connective Action”; Information, Communication & Society; 15(5): 739–768

boyd, danah and Crawford, Kate (2012); “Critical Questions for Big Data”; Information, Communication & Society; 15(5): 662–679

Christin, Angèle (2020); “What Data Can Do: A Typology of Mechanisms”; International Journal of Communication; 14: 1115–1134

Lovink, Geert (2008); “The Society of the Query and the Googlization of Our Lives: A Tribute to Joseph Weizenbaum”; Eurozine; https://www.eurozine.com/the-society-of-the-query-and-the-googlization-of-our-lives/

Lupia, Arthur and Sin, Gisela (2003); “Which public goods are endangered? How evolving communication technologies affect ‘The Logic of Collective Action’”; Public Choice; 117: 315–331

Manovich, Lev (2011); “Trending: the promises and the challenges of big social data”; in Gold, Matthew K. (ed.), Debates in the Digital Humanities; The University of Minnesota Press, Minneapolis, MN; http://manovich.net/content/04-projects/067-trending-the-promises-and-the-challenges-of-big-social-data/64-article-2011.pdf

Olson, Mancur (1965); “The Logic of Collective Action: Public Goods and the Theory of Groups”; Harvard University Press, Cambridge, MA

Solove, Daniel J. (2015); “Why privacy matters even if you have ‘nothing to hide’”; The Chronicle of Higher Education; https://www.chronicle.com/article/Why-Privacy-Matters-Evenif/127461
