Health Code: a tool to tackle COVID-19, or a way to automate social control?
“Please show your Health Code.” A few months after the COVID-19 pandemic started, almost every public place in China had posted such a request at its entrance. Health Code, a color-based application, was rolled out to control people’s movements and curb the spread of the coronavirus. A green code means free travel; yellow and red codes indicate different levels of infection risk.
A new field of surveillance
Once users authorize the national government service platform to obtain their name, ID number, and mobile phone number through Alipay, WeChat, or other local applications, their Health Code is displayed within a few seconds.
The algorithm behind it has not been disclosed: it remains unclear who controls the data flows, who owns user health data, and how governments regulate the Health Code (Liang, 2020).
It is not a surprise that big data and computational revolutions have substantially promoted surveillance and social sorting (Cheney-Lippold, 2011). Nevertheless, while we tend to think such surveillance and social sorting are based on online information gathered through cookies, preferences, and personal behaviors, the Health Code opens a new field of surveillance: people cannot avoid logging in to the web of physical reality. The Health Code was born not at the moment people authorized it, but before.
The algorithm is hidden; the power emerges.
Whether people use it does not depend on personal will but on a mandatory obligation: “No green code, no entry.” Such an obligation is an instance of what Foucault calls “discipline,” essentially a power mechanism.
More than that, this algorithmic, automated template of technology can shift the governmental logic from surveillance and discipline to capture and control (Deleuze, 1992).
When responsibility for medical security shifts to the individual, a panopticon of control emerges. Besides hospitals and public transportation, some companies, and even small private parties and forums, ask for the Health Code to guarantee participants’ safety and to demonstrate their social responsibility. Discipline promotes self-control: people hang up the green Health Code as their safety flag and supervise others’ screens, and in this way a decentralized, panoramic template of control is built.
A new status of data
The basic logic is that the code developer can assess people’s contagion risks based on factors like travel history, duration of time spent in risky areas, and relationships to potential carriers (Liang, 2020). The main elements are therefore the mapping of risky zones and the tracking of people’s locations.
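Since the actual algorithm is undisclosed, the kind of heuristic Liang (2020) describes can only be sketched; the factors, weights, and thresholds below are entirely hypothetical, chosen to illustrate how coarse inputs could be mapped onto a traffic-light status.

```python
# Hypothetical sketch of a color-assignment heuristic. The real Health Code
# algorithm has not been published, so every factor, weight, and threshold
# here is invented for illustration only.

def assign_color(days_in_risky_area: int,
                 contacts_with_confirmed_cases: int,
                 visited_high_risk_city: bool) -> str:
    """Map coarse risk factors to a traffic-light status."""
    if contacts_with_confirmed_cases > 0:
        return "red"      # direct contact: highest risk
    score = days_in_risky_area + (5 if visited_high_risk_city else 0)
    if score >= 5:
        return "yellow"   # indirect exposure: medium risk
    return "green"        # no known exposure: free travel

print(assign_color(0, 0, False))  # green
print(assign_color(2, 0, True))   # yellow
print(assign_color(0, 1, False))  # red
```

Even this toy version shows why opacity matters: a single threshold or weight, invisible to the user, decides whether someone may enter a supermarket.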
However, big data are often noisy and messy, with gaps, errors, biases, and inconsistencies that prompt questions of veracity (accuracy and precision) and reliability (consistency over time) (Kitchin, 2014). In this case, the coarseness of cell-site location information (CSLI) based on connections to nearby towers, the patchy coverage of Wi-Fi connections outside densely urbanized places, and the limited precision with which GPS and Bluetooth record location and contact all point to a kind of uncertainty (Stanley & Granick, 2020).
What is frustrating is that the technical limits of big data are only part of the problem.
- People living in rural areas, and the elderly, who do not carry smartphones or do not know how to generate a Health Code, are passively marginalized by the system.
- Geographic display and tracking rely primarily on phone numbers, which are registered with a Chinese ID or a passport number, so the tracked location may not match the actual user when someone else has purchased the SIM card.
- The risk rating and classification of areas make the system even harder to operate. When Beijing is labeled a yellow zone, it is difficult to confirm, through GPS and data monitoring, how people living in adjacent regions should be labeled.
Beyond these risks of technical error, there is no manual to guide people on how to change their status. After being classified as yellow, all people can do is take a fourteen-day quarantine at home. The inability to prove that the data are wrong imposes a dominance hierarchy on people.
A new presentation of code
The Health Code is presented as a QR code. We are already familiar with QR codes, which we scan to log in to a device or to obtain a URL.
However, the Health Code’s QR code does not rely on others scanning it to obtain information; it uses color directly as a mark, a visual way of profiling. Users only need to “display” their QR code, not let others “scan” it. Even if others scan it, all they get is a garbled string. From the perspective of transmission, color as a symbol is an eye-catching way to profile people. At the same time, it is a refusal of communication or negotiation: the code cannot be explained by scanning it to reveal personal information; it only announces the result.
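The point that scanning reveals nothing can be made concrete with a minimal sketch: the QR payload is an opaque token that only the issuing server can resolve, while the publicly readable message is carried entirely by the color. The token scheme and all names here are hypothetical, not the actual Health Code design.

```python
# Minimal sketch: the displayed color, not the scannable payload, carries the
# message. The payload is an opaque digest only the issuing server could map
# back to a person. The scheme and names are hypothetical.
import hashlib

COLOR_OF = {0: "green", 1: "yellow", 2: "red"}  # risk level -> public color

def issue_code(user_id: str, risk_level: int, server_secret: str) -> dict:
    # An outsider scanning the QR code sees only this unreadable hex string.
    token = hashlib.sha256(f"{server_secret}:{user_id}".encode()).hexdigest()
    return {"payload": token, "color": COLOR_OF[risk_level]}

code = issue_code("alice", 0, "server-secret")
print(code["color"])    # what gatekeepers read at a glance
print(code["payload"])  # what a scanner gets: meaningless without the server
```

The design choice is one-directional by construction: nothing in the payload lets the person being profiled, or a bystander, recover or contest the reasoning behind the color.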
Color profiles people. And profiling is a new type of knowledge, a new form of power.
Profiling can be understood as a set of correlated data used to individuate, represent, or identify a person, and its purpose is to assess risks and/or opportunities for the data controller (Hildebrandt & Gutwirth, 2008). We have been well trained to recognize the color “red” and to respond with caution. This natural affordance, the world’s properties defined with respect to people’s interaction with objects (Gaver, 1991: 80), was initially used to profile dangerous areas, but is now used to profile “high-risk” people.
The danger of this kind of one-way-output affordance is that it changes people’s relationships with one another and creates a unidimensional, uncompromisingly profiled environment.
Furthermore, this application is designed to profile people by their state of health today. What information will it afford tomorrow?
In 1990, Gilles Deleuze (1992) recalled that Felix Guattari imagined a city where one would be able to leave one’s apartment, one’s street, one’s neighborhood, thanks to one’s (dividual) electronic card that raises a given barrier; but the card could just as easily be rejected on a given day or between certain hours.
This new automated city has prompted much debate among scholars, and it may now be arriving much earlier than we thought.
References
Liang, Fan. “COVID-19 and Health Code: How Digital Platforms Tackle the Pandemic in China.” Social Media + Society 6, no. 3 (July 2020): 205630512094765. https://doi.org/10.1177/2056305120947657.
Mozur, Paul, Raymond Zhong, and Aaron Krolik. “In Coronavirus Fight, China Gives Citizens a Color Code, With Red Flags.” The New York Times, March 1, 2020, sec. Business. https://www.nytimes.com/2020/03/01/business/china-coronavirus-surveillance.html.
Daugelaite, Tautvile. “China’s Health Code System Shows the Cost of Controlling Coronavirus.” Wired UK, July 17, 2020. https://www.wired.co.uk/article/china-coronavirus-health-code-qr.
Xu, Dawei. “健康码遭层层‘加码’ 全国范围互认真的这么难吗? [Health Codes Face Layer upon Layer of Extra Requirements: Is Nationwide Mutual Recognition Really That Hard?].” China News Service, April 27, 2020. http://www.chinanews.com/gn/2020/04-28/9170116.shtml.
Cheney-Lippold, John. “A New Algorithmic Identity: Soft Biopolitics and the Modulation of Control.” Theory, Culture & Society 28, no. 6 (November 2011): 164–81. https://doi.org/10.1177/0263276411424420.
Deleuze, Gilles. “Postscript on the Societies of Control.” October 59 (1992): 3–7. https://theanarchistlibrary.org/library/gilles-deleuze-postscript-on-the-societies-of-control.
Stanley, Jay, and Jennifer Stisa Granick. “The Limits of Location Tracking in an Epidemic.” American Civil Liberties Union, 2020.
Kitchin, Rob. “Civil Liberties or Public Health, or Civil Liberties and Public Health? Using Surveillance Technologies to Tackle the Spread of COVID-19.” Space and Polity, June 3, 2020, 1–20. https://doi.org/10.1080/13562576.2020.1770587.
Kandias, Miltiadis, Lilian Mitrou, Vasilis Stavrou, and Dimitris Gritzalis. “Profiling Online Social Networks Users: An Omniopticon Tool.” International Journal of Social Network Mining 2, no. 4 (2017): 293. https://doi.org/10.1504/IJSNM.2017.091807.
Hildebrandt, Mireille. “Defining Profiling: A New Type of Knowledge?” In Profiling the European Citizen, edited by Mireille Hildebrandt and Serge Gutwirth, 17–45. Dordrecht: Springer Netherlands, 2008. https://doi.org/10.1007/978-1-4020-6914-7_2.
Hildebrandt, Mireille, and Serge Gutwirth, eds. Profiling the European Citizen: Cross-Disciplinary Perspectives. New York: Springer, 2008.
Bilal, Muhammad, Abdullah Gani, Muhammad Ikram Ullah Lali, Mohsen Marjani, and Nadia Malik. “Social Profiling: A Review, Taxonomy, and Challenges.” Cyberpsychology, Behavior, and Social Networking 22, no. 7 (July 2019): 433–50. https://doi.org/10.1089/cyber.2018.0670.
Wisniewski, Pamela J., Bart P. Knijnenburg, and Heather Richter Lipford. “Making Privacy Personal: Profiling Social Network Users to Inform Privacy Education and Nudging.” International Journal of Human-Computer Studies 98 (February 2017): 95–108. https://doi.org/10.1016/j.ijhcs.2016.09.006.
Pötzsch, Holger. “Archives and Identity in the Context of Social Media and Algorithmic Analytics: Towards an Understanding of IArchive and Predictive Retention.” New Media & Society 20, no. 9 (September 2018): 3304–22. https://doi.org/10.1177/1461444817748483.
Gaver, William W. “Technology Affordances.” In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems Reaching through Technology – CHI ’91, 79–84. New Orleans, Louisiana, United States: ACM Press, 1991. https://doi.org/10.1145/108844.108856.
Krivý, Maroš. “Towards a Critique of Cybernetic Urbanism: The Smart City and the Society of Control.” Planning Theory 17, no. 1 (February 2018): 8–30. https://doi.org/10.1177/1473095216645631.