Digital resistance: A brief introduction to decentralized data-rights movements
This post offers a brief introduction to collective organization as a form of resistance: movements whose ambition is to overcome technological determinism and open new paths for redefining our relationship with “new” technologies, connecting the labor and knowledge of many kinds of organizations (non-profits, associations, NGOs, citizen initiatives, and so on). It also gives summary descriptions of some examples of online organizations.
Recommended song while reading this: “All Along the Watchtower”, as covered by Jimi Hendrix
These examples are far from a representative sample of what these movements, initiatives, and platforms are; rather, this post attempts to open the door to new possible ways of inhabiting digital environments. It should be read as a critical introduction to some of the means that we as a society have (and that are already being developed), suggesting alternative ways of using and cohabiting with “new” technologies.
“Protocols, so we fear, cannot be questioned and are looked down upon as religious rituals that have lost their original social context” (Lovink and Rossiter 1).
To understand this technical assemblage critically (infrastructure, platforms, algorithms, protocols, machine learning, big data, artificial intelligence, and so on), it is crucial to examine these technologies, who promotes them, and for what purposes. In short, we must be aware that these infrastructures are not neutral (Dijck and Poell 2; Lovink and Rossiter 1) and, worse, that they are generally read from a naïve and uncritical standpoint.
“Far from being neutral platforms for everyone, social media have changed the conditions and rules of social interaction” (Dijck and Poell 2).
These technologies are often treated as black boxes (Gillespie 178), mystified and magnified as if they had no errors or biases, under the false belief that anything backed by data must be correct. Far from making a prohibitionist argument, the motivation here is to highlight the need for collective reflection on how we encode our reality, not only in academic literacies but especially in our collective imagination and everyday use.
Fortunately, there are movements and initiatives that seek to:
- Research and evaluate the practices that are being developed, especially those of large-scale businesses and centralized governments, identifying bad practices regarding our data (privacy) and what “they” do with it (political inference), such as data discrimination;
- Empower people, raise awareness, and provide critical responses and context that lead to a more critical use of technologies (critical literacy);
- Create decentralized forms of governance and self-organized spaces, and develop new ways to inhabit digital environments: tools and means that would guide us towards a reformulated technological world, where we can explore areas that would not otherwise be open to “popular” use.
“The black box metaphor fails us here, as the workings of the algorithm are both obscured and malleable, ‘likely so dynamic that a snapshot of them would give us little chance of assessing their biases’” (Gillespie 178).
The fragmentation of society, and therefore of the technologies that are part of it, leads to the granularity (Ruppert et al. 16) of digital devices and their content. However, an unconscious day-to-day use of technologies (technologies that we use individually, to the detriment of collectiveness (Hernando 115), even though we make use of “Social Media”) makes us lose sight of the specificities (Ruppert et al. 4) of our world and its technologies. In other words, we only see the specific applications we can make of these tools, rather than the context in which they are framed. Even when we are able to make use of their affordances, we forget that we can reinterpret and redefine them and that, in the end, we always operate within a narrow framework for action, accepting these technologies without questioning their design.
Initiatives focused on highlighting bias:
Two great examples of platforms working to prevent this uncritical eye are the Algorithmic Justice League and AlgorithmWatch:
- The Algorithmic Justice League is an association that aims to “highlight bias and provide space for people to voice concerns and experiences with coded bias”. One example of its work is the Gender Shades Project, which pilots an intersectional approach to inclusive product testing for AI.
- AlgorithmWatch is a non-profit focused on algorithmic decision-making (ADM) processes that are used either to predict or prescribe human action or to make decisions automatically (see their ADM Manifesto).
While raising awareness is important, it is not enough to call attention to the design flaws and biases of these technologies; studying their consequences and implications for our lives is equally important. What is more, the key is not just analyzing what kind or quantity of our personal data is being used, but also how it is being used. Again, a shift to a more critical perspective would be highly favorable.
To take an example of how we perceive the implications of technologies: when thinking about mass surveillance and control, one quickly links it to digital authoritarianism and the Social Credit System developed by the Government of China; yet we barely look at the global-scale monopolies, even though “governmental control is nothing compared to what Google is up to” (Zuboff 111).
Initiatives focused on the impact of technologies:
- Tactical Tech is an NGO investigating “how digital technologies impact society and individual autonomy, engaging citizens and civil-society organizations.” Their approach is particularly relevant, as their work is addressed to ordinary people: exhibitions, camps, guides, and workshops developed to increase data literacy, digital privacy, data detox, and much more.
- The Data Justice Lab’s focal point is the study and practice of datafication from the perspective of social justice. Their research examines the “implications of data-driven policies, as well as the institutional and organizational uses of data“.
Projects of this kind are growing in popularity, and not only those linked to the biases and impacts of technologies: we can also find politically motivated movements around techno-politics, critical digital literacy, freedom of information and speech, data privacy rights, open-source advocacy, net neutrality, civic technologies, hacktivism, free circulation of culture, open democracy, and so on. Many initiatives have digital-technological-data sovereignty as their main objective, to the detriment of large-scale monopolies. The list is too long to confine to a few lines, nor is it the aim of this post to describe all of them.
Instead of focusing on the initiatives mentioned above, the intention is to draw attention to us, ordinary people, as key actors in leading real and lasting change. It is high time for us to take a step forward, to assume responsibilities and conduct awareness-raising activities, even and especially in private and intimate living spaces. It is not just a matter of regulation; more than restrictions, these rules need to be accompanied (and empowered) by a change of mindset. Only through a conscious use of technologies may we be able to use them as tools, without generating dependencies or passively assuming their effects.
References:
- Dijck, José van, and Thomas Poell. ‘Understanding Social Media Logic’. Media and Communication, vol. 1, no. 1, 12 Aug. 2013, pp. 2–14. DOI.org (Crossref), doi:10.12924/mac2013.01010002.
- Gillespie, Tarleton. ‘The Relevance of Algorithms’. Media Technologies, edited by Tarleton Gillespie et al., The MIT Press, 2014, pp. 167–94. DOI.org (Crossref), doi:10.7551/mitpress/9780262525374.003.0009.
- Hernando, Almudena. La fantasía de la individualidad: sobre la construcción sociohistórica del sujeto moderno. 2018.
- Lovink, Geert, and Ned Rossiter. Organized Networks / “Seriality for All”: The Role of Protocols and Standards in Critical Theory. https://nedrossiter.org/?p=286. Accessed 29 Aug. 2019.
- Ruppert, Evelyn, et al. ‘Reassembling Social Science Methods: The Challenge of Digital Devices’. Theory, Culture & Society, vol. 30, no. 4, July 2013, pp. 22–46. DOI.org (Crossref), doi:10.1177/0263276413484941.
- Zuboff, Shoshana. The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power. First edition, PublicAffairs, 2019.