The wisdom of Wysa – Mental health apps, the (AI) friend who is always there
Introduction
Ever felt lonely, depressed, or anxious at four AM and felt like you had no one to talk to? Mental health issues are a daily struggle for many people. In recent years people have become more open about these issues, making them a contemporary public topic. Take for instance the much-discussed idea of the “burnout society”: the notion that we live in a society no longer plagued by viruses and bacteria but by our own minds. Byung-Chul Han puts it this way in The Burnout Society (2015): “Neurological illnesses such as depression, attention deficit hyperactivity disorder (ADHD), borderline personality disorder (BPD), and burnout syndrome mark the landscape of pathology at the beginning of the twenty-first century.” (p. 1) The same period saw the meteoric rise of the smartphone, by now an integrated part of most people’s lives. The same people who struggle with minor or major mental illness, often alone, continuously carry around a very capable technological device, which brings me to the subject of this paper: mental health apps. As described in mHealth for Mental Health: Integrating Smartphone Technology in Behavioral Healthcare (2011), smartphones have a huge range of capabilities: access to the information on the internet, connections to databases, real-time communication without physical presence, even connections to biofeedback sensors. All of these functions could serve as very useful tools for a person experiencing symptoms of mental illness and in need of help.
Wysa
The application I am going to discuss in this blog post is Wysa, a mental health application. As described by Becky Inkster et al. in An Empathy-Driven, Conversational Artificial Intelligence Agent (Wysa) for Digital Mental Well-Being: Real-World Data Evaluation Mixed-Methods Study (2018), Wysa “is an AI-based emotionally intelligent mobile chatbot app aimed at building mental resilience and promoting mental well-being using a text-based conversational interface.” The fact that Wysa is AI-based adds an interesting dimension. The services that Wysa offers through written communication include “evidence-based self-help practices such as CBT, dialectical behaviour therapy, motivational interviewing, positive behaviour support, behavioural reinforcement, mindfulness, and guided microactions and tools to encourage users to build emotional resilience skills.” All of these services are, again, provided by an AI chatbot. A subscription to Wysa and these tools are free; a user who would rather talk through their feelings with ‘a real human’ can pay for that option.
AI vs face-to-face
A frightening, or at least disappointing, possibility is that a chatbot just spews learnt clichés that apply to the person’s needs only in a general sense, until the user comes to realise it is simply ‘not listening’ and not tailoring its responses to them. But AI has become far more capable over the years, with a wide range of learning abilities, including the ability to learn about the individual user. So how big is the difference between help from an AI chatbot and help from a real human over the same device? According to Cuijpers et al. in Is guided self-help as effective as face-to-face psychotherapy for depression and anxiety disorders? A systematic review and meta-analysis of comparative outcome studies (2010), guided self-help (in this case delivered by an AI chatbot) does not differ greatly from face-to-face treatment in the treatment of, for instance, anxiety or depression.
Enter the Wysa app. Its website states that “people don’t want their problems ‘fixed’. Mostly, they just want to talk through them, with someone who doesn’t judge.” In this specific case of AI versus human therapy, Wysa builds on the idea that some ‘problems’ or minor mental health issues can be resolved by a simple conversation with someone who listens. An AI chatbot can achieve that by simply letting someone vent their emotions until they feel relieved. Wysa also offers the option to talk to, and get help from, ‘a real human’, but that is a paid service. A user who thinks treatment by a professionally trained therapist is worth it, and who can afford the price, may opt for this additional service level.
Privacy issues
These helpful tools seem mostly as effective as face-to-face help and are much more accessible, via a device (the smartphone) that is already pervasive in most people’s lives. So what are the drawbacks, and how do they relate to privacy? Mental health apps obviously deal with a lot of personal and therefore sensitive material; health-related apps in general process sensitive information and therefore also have to be mindful of privacy. In a person-to-person medical exchange there is always doctor-patient confidentiality. But what changes when your doctor is an AI chatbot collecting your personal data and ‘learning’ from your sensitive and personal troubles? Wysa states that it does not request or collect Personally Identifiable Information and that the chat experience is completely anonymized. Yet past conversations can be accessed via a simple email request, and can also be deleted by email request. Both facts mean that the information does get stored in some form. What happens to these data once they are stored is a difficult question to answer. Does a user feel safe with the idea that an AI chatbot is learning new skills from their personal troubles, and that those troubles are stored and accessible via a simple email?
Conclusion
Mental health apps on smartphones such as Wysa, enabled by tools such as AI chatbots, can be a really helpful resource for relieving the minor mental health issues that can occur in day-to-day life. It is a great thing that many occurrences of depression and/or anxiety can be helped by these AI chatbots as well as by a face-to-face session, especially since the smartphones on which these apps run are always accessible, so a user can find relief quickly. Doctor-patient confidentiality, which in this specific case means data privacy, is something to keep in mind during these sessions, especially considering that the AI chatbots learn from every user they ‘help’. All in all, it is a great opportunity to use these pervasive devices to the best of their ability; nonetheless it is important to keep a critical eye on privacy, machine learning, and personal data.
Bibliography
Andersson, Gerhard. “Predicting Treatment Outcome in Internet versus Face to Face Treatment of Panic Disorder.” Computers in Human Behavior 24, no. 5 (September 1, 2008): 1790–1801. https://doi.org/10.1016/j.chb.2008.02.003.
Andersson, Gerhard, and Pim Cuijpers. “Internet-Based and Other Computerized Psychological Treatments for Adult Depression: A Meta-Analysis.” Cognitive Behaviour Therapy 38, no. 4 (December 1, 2009): 196–205. https://doi.org/10.1080/16506070903318960.
Cuijpers, P., T. Donker, A. van Straten, J. Li, and G. Andersson. “Is Guided Self-Help as Effective as Face-to-Face Psychotherapy for Depression and Anxiety Disorders? A Systematic Review and Meta-Analysis of Comparative Outcome Studies.” Psychological Medicine, December 2010.
Fiske, Amelia, Peter Henningsen, and Alena Buyx. “Your Robot Therapist Will See You Now: Ethical Implications of Embodied Artificial Intelligence in Psychiatry, Psychology, and Psychotherapy.” Journal of Medical Internet Research 21, no. 5 (2019): e13216. https://doi.org/10.2196/13216.
Han, Byung-Chul. The Burnout Society. Stanford University Press, 2015.
Ly, Kien Hoa. “Experiences of a Guided Smartphone-Based Behavioral Activation Therapy for Depression: A Qualitative Study.” Internet Interventions 2, no. 1 (March 1, 2015): 60–68. https://doi.org/10.1016/j.invent.2014.12.002.
Inkster, Becky, Shubhankar Sarda, and Vinod Subramanian. “An Empathy-Driven, Conversational Artificial Intelligence Agent (Wysa) for Digital Mental Well-Being: Real-World Data Evaluation Mixed-Methods Study.” JMIR mHealth and uHealth 6, no. 11 (2018): e12106. https://doi.org/10.2196/12106.
Dehling, Tobias, Fangjian Gao, Stephan Schneider, and Ali Sunyaev. “Exploring the Far Side of Mobile Health: Information Security and Privacy of Mobile Health Apps on iOS and Android.” JMIR mHealth and uHealth 3, no. 1 (2015): e8. https://mhealth.jmir.org/2015/1/e8/.
Kretzschmar, Kira, Holly Tyroll, Gabriela Pavarini, Arianna Manzini, Ilina Singh, and NeurOx Young People’s Advisory Group. “Can Your Phone Be Your Therapist? Young People’s Ethical Perspectives on the Use of Fully Automated Conversational Agents (Chatbots) in Mental Health Support.” Biomedical Informatics Insights, March 5, 2019. https://doi.org/10.1177/1178222619829083.
Luxton, David D., Russell A. McCann, Nigel E. Bush, Matthew C. Mishkind, and Greg M. Reger. “mHealth for Mental Health: Integrating Smartphone Technology in Behavioral Healthcare.” Professional Psychology: Research and Practice 42, no. 6 (2011): 505–12. https://psycnet.apa.org/record/2011-25015-001.
Wallach, Eric. “An Interview with Jo Aggarwal, Co-Inventor of Wysa.” The Politic (blog), March 28, 2018. http://thepolitic.org/an-interview-with-jo-aggarwal-co-inventor-of-wysa/.
“Wysa – Your 4 AM Friend and AI Life Coach.” Accessed September 18, 2019. https://www.wysa.io/.