Silicone Robotic Mask: Making Computed Emotions Physical

On: September 24, 2017
About Nena Snoeren

Since the digital revolution, emoticons have been an important aid in computing our emotions. But our emotions could be captured and communicated better: by moving away from mimicking facial expressions towards representations rooted in real physical functions, and by making emotions physical again after they have been made digital, a technique demonstrated by the artwork of a silicone robotic mask.

Emoticons

We use emoticons (and emoji) to express how a text message should be interpreted or, less commonly, on their own, without accompanying text, to communicate or contextualize our feelings. Joseph Walther and Kyle D’addario describe emoticons as “[…] graphic representations of facial expressions […] substituting for the nonverbal cues that are missing from computer-mediated communication” in their text The Impacts of Emoticons on Message Interpretation in Computer-Mediated Communication (324). One might think that emoticons came into existence during the digital revolution, but this is not true. The emoticon is much older than the digital age, with its alleged first appearance in The New York Times in 1862 (or was it a typo? ;). In 1881, however, Puck, a satirical U.S. magazine, published four emoticons: joy, melancholy, indifference and astonishment. These emoticons were described as ‘typographical art’ in a small tongue-in-cheek article mocking cartoonists. Emoticons really started to flourish in the digital age though, as many of us can remember from texting on our first mobile phones and on instant messaging services on the internet. Emoticons are still widely used to quickly mimic the facial expressions we might have made speaking in real life. A physical smile, frown or wink is made digital so that the receiver of our message can correctly interpret and understand our intention. This practice is the computing of human behaviour, a concept that Philip Agre discusses throughout his text “Surveillance and Capture”. By the computing of human behaviour, Agre means a process in which human activities are thoroughly integrated with distributed computational processes (743). An important concept within these computational processes is ‘capture’: the human activity that is being computed should be captured well in this process (Agre 744).
The human activity of exchanging sentiments and emotions is heavily computed nowadays, and we have become quite comfortable and skilled at capturing and computing our own emotions, making our feelings and facial expressions digitally clear. But what if these captured and computed expressions of emotion could be made physical again after being made digital? This poses the central questions: would it enrich our computed emotional communication, and would it capture the process of exchanging emotions better?

Silicone Robotic Mask

A soft silicone robotic mask introduced on the online platform Digital Trends might help us think about this futuristic possibility. The robotic mask, designed by a team of designers from the Bartlett School of Architecture in London, changes shape and colour depending on the facial expressions of the person wearing it. An electromyography (EMG) sensor in the mask measures the wearer’s muscle movements and sends signals that trigger different changes in the mask’s shape and colour. This device, or rather art object, is of course not needed to offer context to what the wearer is saying; after all, the real emotional expressions are right there on the wearer’s face. But the possibility of making emotional expressions digital and then physical again offers a new way of thinking about how we capture and compute our feelings and communicate with each other. Another article introducing the robotic mask on www.3ders.org, a website on 3D printing news, poses the question: ‘But what if we saw emotions in a completely different way, entirely removed from the human facial features that now appear so natural to us?’ Communicating emotions could move away from facial expressions and computed representations of facial expressions towards a more conceptual and physical feeling. It is an interesting thought experiment that can make us rethink how we perceive the computed communication of emotions. After all, we feel emotions; why not communicate with them physically too in computed communication?
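To make the mask’s capture process concrete, here is a minimal, hypothetical sketch in Python of the kind of mapping it performs: EMG readings are classified into a coarse expression, which then selects a shape and colour response. The sensor names, thresholds and responses are invented for illustration; the actual mask’s mapping is not documented in the articles cited here.

```python
def classify_expression(cheek_emg: float, brow_emg: float) -> str:
    """Map raw muscle-activity levels (0.0-1.0) to a coarse expression."""
    if cheek_emg > 0.6:   # strong cheek activity -> smiling
        return "joy"
    if brow_emg > 0.6:    # strong brow activity -> frowning
        return "anger"
    return "neutral"

# Each coarse expression triggers a different physical response in the mask.
MASK_RESPONSES = {
    "joy":     {"shape": "inflate",  "colour": "yellow"},
    "anger":   {"shape": "contract", "colour": "red"},
    "neutral": {"shape": "rest",     "colour": "white"},
}

def mask_response(cheek_emg: float, brow_emg: float) -> dict:
    """One pass of the capture loop: sensor reading in, actuation out."""
    return MASK_RESPONSES[classify_expression(cheek_emg, brow_emg)]
```

The point of the sketch is the architecture, not the numbers: a bodily signal is captured, reduced to a digital representation, and immediately made physical again.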

Virtual Interpersonal Touch

An interesting text about this kind of communication is “Virtual Interpersonal Touch: Expressing and Recognizing Emotions Through Haptic Devices” by Bailenson et al., published in the online journal Human-Computer Interaction. Bailenson et al. call this kind of communication ‘Virtual Interpersonal Touch’. In the introduction, the authors state that one of the primary purposes of nonverbal behaviour is to communicate subtleties of emotional states between individuals (327). They go on to state: “Clearly, if social interaction mediated by […] digital communication systems is to be successful, it will be necessary to allow for a full range of emotional expressions […]” (327). As we all know, touch is a very powerful way of communicating emotions. It is one of the most definitive markers of intimacy in social interaction (Bailenson et al. 330). Yet it is still missing, at least in large part, from our day-to-day digitalised communication. Do we have examples of Virtual Interpersonal Touch in our daily lives? Yes, some of us do. One example is the Apple Watch feature with which one can send self-made vibration patterns from one watch to another, with the (romantic) option to record and send your heartbeat to another Apple Watch wearer. This heartbeat could represent a non-specific emotion, like being angry, afraid or happy, but one’s heartbeat could also be raised from trying to catch a bus or sitting in a sauna. Another, perhaps surprising, example is a set of his-and-hers sex toys designed by porn star Bobby Eden. The physical touch of one partner (again, disputably emotional per se) is made digital through haptic technology and, without being too specific here, can in turn be felt by the other partner. Research like The Erotic Engine: How Pornography has Powered Mass Communication, from Gutenberg to Google by Patchen Barss shows that revolutionary steps in technology and communication are advanced, and sometimes even driven, by the porn industry.
Think, for example, of the VCR, the Internet, video streaming software, 3D and virtual reality: pornography always finds a way to be consumed more easily and to seem more realistic. The technology surrounding the communication of physical touch could be next.
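As a concrete illustration of what such haptic capture might involve, here is a minimal Python sketch of encoding a measured heart rate as a replayable vibration pattern, loosely in the spirit of the Apple Watch heartbeat feature. The encoding is an assumption made purely for illustration; it is not Apple’s actual protocol.

```python
def heartbeat_to_pattern(bpm: int, beats: int = 4) -> list:
    """Encode a heart rate as a list of (action, duration_in_seconds)
    haptic events: one short vibration per beat, then a pause until
    the next beat is due."""
    interval = 60.0 / bpm  # seconds between beats
    pattern = []
    for _ in range(beats):
        pattern.append(("vibrate", 0.1))                   # the felt pulse
        pattern.append(("pause", round(interval - 0.1, 3)))  # wait for next beat
    return pattern
```

A receiving device would simply replay the events in order; note that, as the text observes, the same pattern could equally come from running for a bus, so the emotional meaning still depends on context.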

Digital Emotional Capture

But let us now move away from vibrations and towards the possibility of physically changing the shape or size of an object digitally through emotional responses, like the robotic mask discussed above. What if, for example, one smiles slightly and nostalgically while typing a message? An EMG sensor like the one in the robotic mask, or another technical device designed for picking up human emotions, would turn these emotions into a digital signal that physically and very specifically alters the device the respondent is receiving the message on (or another object, like a phone case, if you will). One could imagine these functions and sensors being built into our phones and computers, or even into our bodies, all in order to capture and communicate our emotions more accurately. We could physically experience the complex emotions that were sent to us, with their roots in real bodily emotional expression rather than in a mimicked facial expression. If this all seems a little too futuristic to you, read the work “Sustainable Wearables: Wearable Technology for Enhancing the Quality of Human Life” by Jaewoon Lee et al. It explores the types of sensors, called wearables in their text, that compute our bodies, and predicts what these sensors will look like in the future: ‘Future wearables will […] eventually be invisible’ (5.). They state that wearables can measure heart rate, blood flow resistance, skin tissue, muscle status, respiratory volume, skin temperature, skin conductance, and blood volume pulse (2.1). They conclude: ‘The future wearables will collect more accurate and diverse individual information than ever and will be utilized in wider areas’ (5.). Although they do not mention the ‘collection’ of emotions specifically, one can easily see how this could very well become possible. Virtual Interpersonal Touch enables the communication of more complex emotions, and it also makes digital communication richer and more real in a very physical sense.
The capture of human emotional expression will become more diverse, and by computing emotions this way, digital emotional experience will become less visual and more intrinsic: closer to the core of what emotions are.
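To make this speculative pipeline concrete, here is a hypothetical end-to-end sketch in Python: wearable sensor readings captured while typing are collapsed into a coarse emotion signal, transmitted alongside the message, and translated into a physical actuation on the receiving device (here, an imagined shape-changing phone case). All names, thresholds and mappings are invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class SensorReading:
    heart_rate: int          # beats per minute
    skin_conductance: float  # microsiemens
    cheek_emg: float         # 0.0-1.0 muscle activity

def capture_emotion(reading: SensorReading) -> str:
    """Collapse raw body signals into a coarse emotional label (the
    'capture' step, in Agre's sense)."""
    if reading.cheek_emg > 0.5 and reading.heart_rate < 100:
        return "warmth"
    if reading.skin_conductance > 8.0 or reading.heart_rate > 120:
        return "arousal"
    return "calm"

# The receiving device maps the transmitted label back to a physical change.
ACTUATIONS = {
    "warmth":  "case softens and warms slightly",
    "arousal": "case pulses rapidly",
    "calm":    "case stays at rest",
}

def deliver(message: str, reading: SensorReading) -> tuple:
    """Attach a captured emotion to a message and resolve the physical
    actuation the receiver's device would perform."""
    return message, ACTUATIONS[capture_emotion(reading)]
```

The sketch is deliberately crude: the argument above is that the interesting part is not the classifier but the final step, where the digital signal becomes a physical sensation again rather than a pictured face.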

References:

Agre, Philip E. “Surveillance and Capture: Two Models of Privacy.” The Information Society 10.2 (1994): 101–127.

Bailenson, Jeremy N., Nick Yee, Scott Brave, Dan Merget and David Koslow. “Virtual Interpersonal Touch: Expressing and Recognizing Emotions Through Haptic Devices.” Human–Computer Interaction 22.3 (2007): 325-353.

Barss, Patchen. The Erotic Engine: How Pornography has Powered Mass Communication, from Gutenberg to Google. Toronto: Doubleday Canada, 2010.

Lee, Jaewoon, Dongho Kim, Han-Young Ryoo and Byeong-Seok Shin. “Sustainable Wearables: Wearable Technology for Enhancing the Quality of Human Life.” Sustainability 8.5 (2016): 466.

Walther, Joseph B. and Kyle P. D’addario. “The Impacts of Emoticons on Message Interpretation in Computer-Mediated Communication.” Social Science Computer Review 19.3 (2001): 324-347.
