Could a Digital Expiry Date Policy Limit Extractive Platform Capitalism Business Models?

On: April 16, 2019
By Rachel Blennerhassett



Since 2016, numerous digital data leaks, data-sharing revelations, and scandals have come to light, from Cambridge Analytica to Google+ and beyond. These breaches compromised the personal information of millions of users and showed not only that most consumers were unaware of the extent of the behavioural data that digital businesses have collected on them over the years (and continue to collect), but also that these same platforms may be poor stewards of the accrued data (Grothaus, 2019; Vogelstein, 2019). Taken together, these effects carry potentially volatile long-term implications for the world's hegemonic capitalist economic model, which increasingly relies on data to maintain growth in the face of a presently sluggish production sector (Srnicek, 2017).

Economist Joseph Stiglitz has argued that governmental policy determines market conditions, which in turn determine the overall wellbeing of an economy (Stiglitz, 2012). In this case, the absence of stringent consumer data protection policy has produced market conditions in which technology companies are continuously permitted to extract, store, market, and analyse their users' data. Here, dominant digital conglomerates such as GAFAM (Google, Amazon, Facebook, Apple, and Microsoft) have largely adopted rentier business models in which users 'rent' a 'free' service by giving up access to their data (Stiglitz, 2012). Over time, this data is accumulated and used by the platforms both to increase their profit margins and to make further technological advances in fields, such as facial recognition, that consumers could not have dreamed of when they first consented to the data extraction (Liao, 2019).

Tiziana Terranova has conceptualised this for-profit use of consumers' data and online activity as a clandestine form of unpaid labour. The Italian theorist was influenced here by the Autonomists' concept of the 'social factory', which rejected the separation between consumption and production and argued that the production of value could no longer be confined to the times and spaces of waged work (Gill and Pratt, 2008). 'Free labour' is Terranova's term for the expansion of the social factory to the internet, where users' online time is converted by tech corporations into unpaid work (Terranova, 2004). Free labour can consequently be seen as the platform monetisation of users' voluntary online contributions and actions: moderating a subreddit, writing open source code, or even posting a selfie.

Terranova acknowledged that the digital monetisation of free labour accelerated with Web 2.0, as growing user numbers and significant technological improvements expanded platforms' capacity to store, analyse, and monetise user behavioural data and interactions. She advocated the liberation of free labour: social networking platforms should be deprivatised, with their profits returned to those responsible for generating the value, the users. She also suggested that ownership of data should be rightfully returned to users in the form of the freedom to access and modify the protocols and diagrams that structure their digital participation.

Several politicians and governing institutions in the western world now seem inclined to agree with Terranova, with the European Union's General Data Protection Regulation (GDPR) and American presidential candidate Elizabeth Warren's proposal to regulate big technology companies being two recent and notable examples of such concordance. While these policies may well represent important first steps in raising awareness of the long-run value of digital user data, a potential problem is that they fail to challenge large online companies' storage of vast troves of historical user data. The move towards Web 2.0, in other words, has meant that various entities, both currently functioning and defunct, possess considerable amounts of historical data on user behaviour that ongoing efforts to regulate internet monopolies have largely steered clear of addressing. A digital expiry date for the use of historical user data might therefore be needed if current efforts are to have their intended impact.

A significant difficulty with existing legislation is that while policies such as GDPR permit EU citizens to request that platforms delete all of their historical data, doing so requires the users in question to delete their profiles on the relevant platforms as well (GDPR, 2018). Digital heavyweights such as Facebook, Google, and Amazon are deeply integrated into western consumers' lives, however, and provide convenience and valuable services for work, socialisation, and recreation. Losing access to a platform's services and one's profile on it is therefore too strong a deterrent for consumers seeking greater control over their data, which in turn consolidates the status quo. The research question that this paper hence aims to tackle is: how can policy decrease the accumulation of historical user data and increase user ownership of the data generated from free digital labour?

 

Proposed Solution:

A digital expiry date is proposed as a measure to increase consumer control over the historical data gathered by large technology platforms without restricting access to the platforms themselves. The policy would consist of a biannual email from these major online platforms asking users whether they would like to continue allowing their data to be stored, or whether they would like to explore further choices, including the option to reset their user data. Activating the digital expiry date would mean that a user's explicit data (profile, posts, friends list, messages, photos, etc.) is kept, while the historical implicit data a given platform has accumulated, such as location history, advertising categorisation, or deleted messages, is deleted. The policy would also include an option for users to explore further data choices: either allowing platforms to continue gathering data on top of the previous two years' worth of information, or deleting or downloading specific categories of explicit and implicit data themselves. If a user does not respond to the expiry date email or the subsequent reminders, the default would be for the digital expiry date to be activated, with their profile and explicit data maintained.
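The decision logic described above can be sketched as a small illustrative model. All names here (`ExpiryChoice`, `UserData`, `apply_expiry_policy`) are hypothetical constructs for this sketch, not part of any actual regulation or platform API:

```python
from dataclasses import dataclass
from enum import Enum

class ExpiryChoice(Enum):
    KEEP_ALL = "keep_all"        # continue full data collection (status quo)
    RESET = "reset"              # activate the digital expiry date
    CUSTOM = "custom"            # delete or download specific categories
    NO_RESPONSE = "no_response"  # user ignored the email and reminders

@dataclass
class UserData:
    explicit: dict  # profile, posts, friends list, messages, photos
    implicit: dict  # location history, ad categorisation, deleted messages

def apply_expiry_policy(data: UserData, choice: ExpiryChoice,
                        categories_to_delete: frozenset = frozenset()) -> UserData:
    """Apply the proposed digital expiry date to one user's stored data.

    Per the proposal, non-response defaults to activating the expiry date:
    explicit data is kept, accumulated implicit data is deleted.
    """
    if choice == ExpiryChoice.KEEP_ALL:
        return data
    if choice in (ExpiryChoice.RESET, ExpiryChoice.NO_RESPONSE):
        return UserData(explicit=dict(data.explicit), implicit={})
    # CUSTOM: user removes only the categories they selected
    return UserData(
        explicit={k: v for k, v in data.explicit.items()
                  if k not in categories_to_delete},
        implicit={k: v for k, v in data.implicit.items()
                  if k not in categories_to_delete},
    )
```

The key design choice this sketch makes concrete is the privacy-protective default: silence resets implicit data rather than preserving it.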

How the policy is presented to consumers will be critical to its implementation. As American economists Richard Thaler and Cass Sunstein have theorised, in drawing up the environment in which a decision is made, a designer or 'choice architect' can nudge consumers towards certain choices (Thaler and Sunstein, 2009). The best choice as determined by the choice architect is not always the best choice for the consumer, though. GDPR is again a good case in point: the regulation mandates that all websites obtain informed consent from users before attaching tracking cookies to their web browsers, yet provides no concrete or uniform method for how businesses must present their consent requests. Consequently, consent notices have often been designed to nudge users towards accepting the maximum number of cookies by default, to the benefit of websites and platforms.

To counter this potential for user manipulation, the digital expiry date policy would use a set email format designed and written by policymakers rather than by digital entities with a vested interest in users' decisions. The email would be simply worded and clearly designed to maximise both comprehension and convenience for users.
Given the far-reaching scope of the digital expiry date policy as well as the varied nature of the data collected by different online businesses and platforms, it is proposed that the regulation be applied to the big five of GAFAM in its first year. In its second year, the policy can be expanded to other major western platforms with annual revenues exceeding $20 billion. If the rollout proves successful, the policy can then be gradually expanded to include more data-collecting digital players.
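As a sketch, the phased rollout could be expressed as a simple eligibility rule. The function and its return values below merely restate the proposal's phasing and are illustrative, not a legislative mechanism:

```python
# The big five named in the proposal's first rollout year.
GAFAM = {"Google", "Amazon", "Facebook", "Apple", "Microsoft"}

def rollout_year(company: str, annual_revenue_usd: float) -> int:
    """Illustrative phasing of the proposed digital expiry date policy.

    Year 1: the big five (GAFAM).
    Year 2: other major western platforms with revenues above $20 billion.
    Year 3+: gradual expansion to smaller data-collecting platforms.
    """
    if company in GAFAM:
        return 1
    if annual_revenue_usd > 20e9:  # $20 billion threshold from the proposal
        return 2
    return 3
```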

To ensure a smooth and consistent implementation, it is similarly proposed that the policy first apply to the EU, the United States, and Canada, regions likely to have the infrastructure and governing institutions needed to enforce the regulation properly and fairly. Other regions and states would be welcome to adopt the policy in the future.

 

Possible Consequences:

A digital expiry date would aim to result in informed consumers, with increased ownership of the data generated by their free labour. A limitation on platforms‟ ability to profit from users‟ historical data could potentially incentivise technology companies to pursue alternative, non-data-extractive business models. This renewed ownership of personal data could also lead to more knowledgeable discourse and even legislation on the online practice of sharing user data with third parties. The compartmentalisation of historical data into usable categories could likewise lead to greater interoperability between platforms. This would potentially decrease platform control over users and promote competition. Citizens might also be encouraged to voluntarily donate anonymised sections of their data to academia, publicly funded platforms, or artificial intelligence projects.

A possible hurdle for the implementation of the proposed digital expiry date regulation, on the other hand, is that a policy limiting platforms' ownership of users' historical data threatens the extractive business model of powerful internet conglomerates. Regulatory capture, whereby business leaders use their societal influence to have figures sympathetic to their cause appointed to the very institutions that are supposed to regulate their businesses, could prevent a bill from ever being debated, let alone becoming law (Stiglitz, 2013).

Similarly, while a digital expiry date might limit commercial data extraction, it would not address governmental surveillance of users. GDPR grants governments the right to process a user's personal data when a "public security concern" is feared (Human Rights Watch, 2018). State-led security agencies in both Europe and North America typically use GAFAM platforms as surveillance tools in such instances. It is thereby likely that any digital expiry date legislation passed in the western world would grant governments a similar power to prevent certain users from deleting their data. It could be argued, though, that the potential continuity of government surveillance is an inevitable consequence of addressing platforms' aggregation of user data through governmental means.

On the whole, the digital expiry date is nevertheless likely to be well received by those it aims to empower: consumers. Apathetic users could choose to continue allowing platforms access to all of their historical data, maintaining the status quo of data aggregation by large platforms. But with this policy in place, that continued data accumulation would at least become an active choice that users make, rather than a default determined by under-regulated platforms.


Works cited:

General Data Protection Regulation (2018). Article 17.

Gill, R., & Pratt, A. (2008). “In the Social Factory?” Theory, Culture & Society, 25(7-8), pp 1–30.

Grothaus, M. (2019). How our data got hacked, scandalized, and abused in 2018. [online] Fast Company. Available at: https://www.fastcompany.com/90272858/how-our-data-got-hacked-scandalized-and-abused-in-2018 [Accessed 23 Mar. 2019].

Human Rights Watch. (2018). The EU General Data Protection Regulation. [online] Available at: https://www.hrw.org/news/2018/06/06/eu-general-data-protection-regulation [Accessed 24 Mar. 2019].

Liao, S. (2019). IBM didn’t inform people when it used their Flickr photos for facial recognition training. [online] The Verge. Available at: https://www.theverge.com/2019/3/12/18262646/ibm-didnt-inform-people-when-it-used-their-flickr-photos-for-facial-recognition-training [Accessed 23 Mar. 2019].

Srnicek, N. (2017). Platform Capitalism. Cambridge: Polity Press.

Thaler, R. and Sunstein, C. (2009). Nudge. [United States]: Gildan Audio.

Terranova, T. (2004). "Free Labour" in Network Culture: Politics for the Information Age. London: Pluto Press, pp. 33–57.

Stiglitz, J. E. (2012). “Rent Seeking and The Making of an Unequal Society” in The Price of Inequality: How Today’s Divided Society Endangers Our Future. New York: W. W. Norton & Company.

Vogelstein, F. (2019). Why Should Anyone Believe Facebook Anymore?. [online] WIRED. Available at: https://www.wired.com/story/facebook-data-sharing-privacy-investigation/ [Accessed 23 Mar. 2019].
