Big Brother is still watching you
Big data is everywhere. The term is widely discussed on television sets, in political debates, at lunch tables, and in online conversations. But what exactly constitutes big data, why is it “subjective”, and why does it raise an ethical debate? Here, I will discuss facial recognition as an example of the instrumentalization of big data by the Israeli government.
Big Data: Can it be objective?
Big data is defined by boyd and Crawford as a “cultural, technological, and scholarly phenomenon that rests on the interplay of technology, analysis, and mythology that provokes extensive utopian and dystopian rhetoric” (boyd & Crawford 2012: 663). What is especially interesting about big data is how it has changed the way people, governments, and private corporations approach data. As the authors explain, big data has the advantage of analyzing information “with an unprecedented breadth and depth and scale” (idem: 665), meaning that we are now able to process data in a quick and quasi-complete way.
A key takeaway of the authors’ argument is that “working with Big Data is still subjective” (idem: 667): big data is shaped by interpretation and design decisions. When collecting data, a choice must always be made about what to include, “what attributes and variables that will be counted”, as part of the data cleaning process (ibid). This subjectivity is especially visible in the Israeli government’s instrumentalization of facial recognition, and in its unequal collection of data on non-citizens living in the West Bank.
Utilizing facial recognition for surveillance
Facial recognition is increasingly used by authoritarian governments to regulate, watch, and keep information on their citizens. It operates through a “biometric software application capable of uniquely identifying or verifying a person by analyzing patterns based on the person’s facial contours” (Nijmeh, 2020). It has, for example, been used by the Chinese police to track the Uighurs, a Muslim minority (Mozur, 2019).
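To make the “biometric software” description concrete: facial-recognition systems typically convert a face image into a fixed-length numeric vector (an “embedding”) and identify a person by comparing that vector against a database of known faces using a similarity threshold. The sketch below is a simplified, hypothetical illustration of that matching step only; the names, dimensions, and threshold are invented for the example, and it does not represent any specific deployed system such as Mabat 2000 or Better Tomorrow.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors (1.0 = identical direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def identify(probe: np.ndarray, gallery: dict, threshold: float = 0.8):
    """Return the best-matching identity above the threshold, or None if no match."""
    best_name, best_score = None, threshold
    for name, embedding in gallery.items():
        score = cosine_similarity(probe, embedding)
        if score > best_score:
            best_name, best_score = name, score
    return best_name

# Toy 4-dimensional embeddings; real systems use 128 or more dimensions
# produced by a neural network, not hand-written vectors.
gallery = {
    "person_a": np.array([0.9, 0.1, 0.0, 0.1]),
    "person_b": np.array([0.0, 0.8, 0.6, 0.0]),
}
probe = np.array([0.88, 0.12, 0.05, 0.1])  # a new camera capture
print(identify(probe, gallery))  # → person_a
```

The threshold is the crucial design decision boyd and Crawford’s “subjectivity” point applies to: set it low and the system produces false matches; set it high and it misses people. Either choice is made by the system’s designers, not by the people being scanned.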
In the Old City of Jerusalem, there is almost “one camera per 100 people out of the total population of 40,000 covering only 0.9 square kilometres” (Amer, 2021).
The Israeli government deployed this technology throughout the Old City of Jerusalem, equipping the streets with “closed-circuit television (CCTV) cameras” (Nijmeh, 2020) as part of the Mabat 2000 project. With these cameras, the government collects information such as license plates and can track citizens’ movements using facial recognition (ibid). Interestingly, the government and the police apparatus refuse to disclose the number of cameras, their locations, or the “length of time the data was stored” (ibid). The government has argued that the cameras have been significantly useful in “lowering the number of knife and shooting attacks in Israel in recent years” (CBS News, 2018).
More recently, investigations by NBC and Haaretz concluded that an Israeli startup named AnyVision had set up “facial recognition software to conduct biometric surveillance on Palestinians” (Brown, 2019). The startup developed a program named Better Tomorrow that “allows the tracking of objects and people on live video feeds, even tracking between independent camera feeds” (ibid). This allows the government to register Palestinians and trace them, as they must scan their faces to pass the checkpoints between the occupied West Bank and Israel (Amer, 2021).
Predictive Policing: A general trend backed by big data
Facial recognition reveals the biases and failures of big data when it is deployed against citizens or minorities. Moreover, it is part of a broader trend observed throughout the world: predictive policing. As Hälterlein explains, predictive policing covers the “different ways future crimes are rendered knowledgeable in order to act upon them that reaffirm or reconfigure the status of criminological knowledge within the criminal justice system” (Hälterlein 2021: 2). Through technologies such as facial recognition, predictive policing is redefining “police organizational practices” (ibid), invading the lives of citizens without their awareness and without giving them any say in whether their data is used and shared.
This poses an ethical dilemma, especially when the state instrumentalizes such data to accumulate more information about its citizens. In the particular case of Palestine, there is no clear communication about the use of these technologies. boyd and Crawford argue that “practices like informed consent” (boyd & Crawford 2012: 672) can be a solution to these problems, yet here the privacy of Palestinians is violated and consent is never discussed. In fact, in 2018 the Israeli government enacted a law protecting Israeli citizens’ right to privacy, requiring “that information on Israeli citizens be done only with their consent” (Brown, 2019). However, Palestinians living in the West Bank “don’t hold Israeli citizenship and, therefore, are not protected by Israeli privacy laws”, as Vox explains (ibid).
This example points to a larger debate on the sharing of data and on what constitutes “public data” (boyd & Crawford 2012: 672). Even when data is available to everyone, should permission still be requested before using it?
Facial recognition is now part of our daily lives. Big Brother is still watching you, and me. From logging in to a website, to unlocking your phone, to passing from one checkpoint to another, this technology has spread through people’s everyday lives over the past few years and is constantly improving. It is now up to consumers to become more aware of their privacy rights when sharing their data, while waiting for states to introduce regulations on personal information. Moreover, researching facial recognition is also more challenging than it used to be, as moral dilemmas are shaking up academia. A growing number of scholars refuse to work with firms or departments that are “linked to unethical projects” (Van Noorden 2020: 355), meaning opaque ways to collect data and to “distribute facial recognition data sets” (ibid). They have called for more extensive “ethics checks on studies” (ibid).
boyd, d. & Crawford, K., 2012. Critical Questions for Big Data. Information, Communication & Society, 15(5), pp. 662–679.
Brown, H., 2019. An artificial intelligence company backed by Microsoft is helping Israel surveil Palestinians. [online] Vox.
Hälterlein, J., 2021. Epistemologies of Predictive Policing: Mathematical Social Science, Social Physics, and Machine Learning. Big Data & Society, 8(1).
Mozur, P., 2019. One Month, 500,000 Face Scans: How China Is Using A.I. to Profile a Minority (Published 2019). [online] Nytimes.com.
Nijmeh, A., 2020. Israeli attacks on Palestinian digital rights rise during pandemic. 7amleh – Arab Center for the Advancement of Social Media.
CBS News, 2018. Israel claims 200 attacks predicted, prevented with data tech. [online] CBS News.
Amer, S., 2021. How Israel is automating the occupation of Palestine. [online] The New Arab.
Van Noorden, R., 2020. The ethical questions that haunt facial-recognition research. Nature, 587(7834), pp.354-358.