A(m)I beautiful? – Questioning AI through a Biometric Mirror

On: September 26, 2020

As AI integrates into everyday life, it raises growing questions about whether to see this technology as assistive or oppressive. The facial recognition art installation Biometric Mirror (2018) contributes to this critical approach, challenging the ethics of artificial intelligence.


In today's society, artificial intelligence has made itself ever more inescapable, promising insights that were too complex to even contemplate before this advanced computational age. The power of this technology is ever-growing and trickles into every part of life. I therefore believe that AI, although not "new" per se, in the sense that its tracks date back to the late 1950s (Edwards, 1996), is "new" in that this "old" medium remixes with new technologies, takes on a more variable form, and can exist in different versions and in different aspects of life (Manovich, 2018).

A common perception of pervasive AI technologies is that they bring something positive to the table and add convenience to everyday life. Think of not having to try on 50 shades of lipstick thanks to ModiFace's augmented reality app 'MakeUp' (ModiFace Inc., 2019), which lets you virtually test beauty products, or of facial recognition as a surveillance technology, like the 'Smartscreen Network' placed throughout Westfield shopping malls to capture the age, gender and mood of shoppers for optimal advertising (Anscombe, 2017).

But facial recognition is also being adopted in more critical areas, such as policing, to identify criminals. This pervasiveness of AI in ever more parts of our lives raises all sorts of questions around privacy, surveillance and the ethics of AI.

Challenging these notions is the AI artwork Biometric Mirror (2018), created by artist and body architect Lucy McRae and scientist Dr. Niels Wouters. In this artwork, the veracity, presumptions and boundaries of facial recognition algorithms are questioned as you enter a so-called sci-fi beauty salon that scans your face and reveals a mathematically 'perfect' version of it. On their website www.biometricmirror.com (Biometric Mirror, 2018), they describe Biometric Mirror as "an ethically provocative interactive system that enables public participation in the debate around ethics of artificial intelligence." Because what happens when algorithms make mistakes?

Coded gazing at you

The concerns that Biometric Mirror points out are the algorithmic biases and ethical problems that facial recognition produces. Today, a large share of user-generated data is analyzed by algorithms to classify patterns and predict outcomes. But as several researchers have shown, these algorithms are far from objective, containing biases that are, for example, racist (Noble, 2018) or sexist (Hamilton, 2019). Some refer to algorithmic bias as the 'coded gaze', an algorithmic way of seeing embedded in AI techniques (Feuston and Piper, 2018). This 'coded gaze' classifies content through categories invented by researchers and machines.

In the case of Biometric Mirror, you are confronted with this 'coded gaze' in several ways, starting with the 14 characteristics on which the system analyzes you. Why did the programmers choose these specific characteristics and not swap certain ones out for others? After all, humans surely have more than 14 aspects to them.

Once the system comes up with these 14 characteristics, the question of accuracy arises. Take the aspect of ethnicity, for example. Has the system calculated your ethnicity "right"? Can it even do so in the globalized world we live in, where ethnicity is an ever-changing and therefore almost elusive concept? In addition, there is the Hollywood plastic surgeon who collaborated on this project. Here we see another way in which someone with particular beliefs influences the outcome of this creation of the 'perfect' self: an influence that implements predominantly Western beauty standards into the algorithm, whereas there are so many perceptions of what beauty actually is.

Here you see how the programmers, and thereby the generated data and algorithms, interfere with the result. As boyd and Crawford (2012) rightfully point out, claims to objectivity and accuracy are misleading. For data to exist it needs to be imagined, and imagining entities entails an interpretative base, leading to inevitable subjectivity. The same, you could say, holds for the choices made within the Biometric Mirror AI system, where (design) decisions determine the measurements. These decisions stem from interpretations, making the system inherently subjective and its biases almost inescapable.
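This dynamic can be made concrete with a small sketch. The following is purely hypothetical, not Biometric Mirror's actual code: the trait names, weights and scoring function are my own invented illustration of how a fixed trait list and hand-tuned weights bake designer choices into a seemingly objective score.

```python
# Hypothetical sketch (not Biometric Mirror's actual code) of how design
# decisions make a "perfection" score inherently subjective.

# The designers decide which traits exist at all...
TRAITS = ["attractiveness", "trustworthiness", "aggressiveness"]

# ...and how much each one counts. Both choices are interpretations.
WEIGHTS = {"attractiveness": 0.5, "trustworthiness": 0.3, "aggressiveness": -0.2}

def perfection_score(ratings: dict) -> float:
    """Combine per-trait ratings (0 to 1) into one 'perfection' score."""
    return sum(WEIGHTS[t] * ratings.get(t, 0.0) for t in TRAITS)

# Two faces rated identically except on a trait the designers chose to
# penalize end up ranked differently: the bias lives in the weights.
face_a = {"attractiveness": 0.8, "trustworthiness": 0.8, "aggressiveness": 0.1}
face_b = {"attractiveness": 0.8, "trustworthiness": 0.8, "aggressiveness": 0.9}
print(perfection_score(face_a) > perfection_score(face_b))  # True
```

Swap the weights, or the trait list itself, and the "perfect" face changes, which is precisely the point the installation makes.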

Imagining this system in real-life situations, how would this unfold? Did you not get the job because an AI system didn't think you were trustworthy? Or perhaps a facial recognition technology misidentifies you as a criminal because this particular machine has an inherent bias against minorities?

According to Zeynep Tufekci (TED, 2017), a considerable problem with algorithmic bias is that it is inherent to power structures. Big, generally Western, enterprises that steer these complex technological structures gain ever more power, echoing philosopher Francis Bacon's famous phrase "knowledge is power", and create a divide between themselves and their end users. This is yet another aspect that comes to mind when discussing the ubiquity of technologies in our society.

Debating AI

Biometric Mirror gives an interesting, provocative view of the flaws of current technology by presenting us to ourselves through the biased lens of AI. It confronts us with questions about the accuracy and objectivity of these technologies, and about whether they are here to help us or to dissect our every move in the interest of large corporations and government agencies. As artist Lucy McRae describes it, Biometric Mirror gives you "a moment to start thinking about transparency of algorithms (..) and the current trend of perceiving algorithms (and AI) as the holy grail that will ultimately improve society" (McRae, 2018). It dares you to be critical and ask yourself: is this really the future that we want?

Biometric Mirror is showing at the Nxt Museum in Amsterdam from 29 August 2020 to 28 February 2021.

Bibliography

Anscombe, Luke. “Westfield is using facial detection software to watch how you shop.” News.com.au. 2017. 26 September 2020. <https://www.news.com.au/finance/business/retail/westfield-is-using-facial-detection-software-to-watch-how-you-shop/news-story/7d0653eb21fe1b07be51d508bfe4626>.

Biometric Mirror. 2018. Microsoft Research Centre for Social Natural User Interfaces, The University of Melbourne, Microsoft Australia, Science Gallery Melbourne. 26 September 2020. <https://www.biometricmirror.com>.

boyd, danah, and Kate Crawford. “Critical Questions for Big Data: Provocations for a Cultural, Technological, and Scholarly Phenomenon.” Information, Communication & Society, vol. 15, no. 5, June 2012, pp. 662–79. DOI.org (Crossref), doi:10.1080/1369118X.2012.678878.

Edwards, Paul N. The Closed World: Computers and the Politics of Discourse in Cold War America. MIT Press, 1996.

Feuston, Jessica L., and Anne Marie Piper. “Beyond the Coded Gaze: Analyzing Expression of Mental Health and Illness on Instagram.” Proceedings of the ACM on Human-Computer Interaction, vol. 2, no. CSCW, Nov. 2018, pp. 1–21. DOI.org (Crossref), doi:10.1145/3274320.

Hamilton, Melissa. “The Sexist Algorithm.” Behavioral Sciences & the Law, vol. 37, no. 2, John Wiley & Sons, Inc., Mar. 2019, pp. 145–57. EBSCOhost, doi:10.1002/bsl.2406.

Manovich, Lev. “How Media Became New.” Communication in History, by Peter Urquhart and Paul Heyer, edited by Paul Heyer and Peter Urquhart, 7th ed., Routledge, 2018, pp. 293–96. DOI.org (Crossref), doi:10.4324/9781315189840-42.

McRae, Lucy, and Niels Wouters. Biometric Mirror. Melbourne: Science Gallery, 2018.

McRae, Lucy. “Biometric Mirror.” Lucy McRae. 2018. 20 September 2020. <https://www.lucymcrae.net/biometric-mirror->.

ModiFace Inc. 2019. MakeUp (version 15.0.3). [Mobile app] [Accessed 26 September 2020].

Noble, Safiya Umoja. Algorithms of Oppression: How Search Engines Reinforce Racism. University Press, 2018.

Science Gallery Dublin. “Biometric Mirror.” YouTube. 23 July 2019. 20 September 2020. <https://www.youtube.com/watch?v=CypZuv56iuI>.

TED. “We’re building a dystopia just to make people click on ads | Zeynep Tufekci.” YouTube. 17 November 2017. 20 September 2020. <https://www.youtube.com/watch?v=iFTWM7HV2UI>.
