Can Robots Solve the Mental Health Crisis?
Digital healthcare is emerging as an attractive alternative to traditional mental health treatment, with a growing number of therapy apps leveraging advances in machine learning and artificial intelligence (AI) to provide increasingly personalized care.
These AI apps promise to democratize therapy: short-term solutions that are quick, cheap and easily available to a wide variety of users, at a time when healthcare systems worldwide are struggling to cope with rising demand for care. Could robots solve the mental health crisis?
One of the great promises of AI in mental healthcare is accessibility. Technology enables constant connection, which means AI could enable 24/7 access to therapeutic care. “A machine therapist is always there, day or night, as often as needed,” says Sarah Lubrano, Head of Content for The School of Life. AI also makes the therapy process more efficient, reducing time and cost by automating at least part of the process. Maria Boghiu, head of product at Spill, explains: “Research shows that you remove some barriers to entry if you can put therapy on people’s phones. Plus, there are only so many therapists.”
Affordability is another major advantage of AI mental health therapy. Demand for publicly funded therapy far outstrips supply, and private care is expensive. Technology seeks to remedy this. Boghiu claims, “If you can automate the legwork such as tracking mood or finding the right words to express an idea that therapy communicates, you can improve a therapist’s reach exponentially.”
Lubrano believes the benefits of AI will be more nuanced, because therapy looks different for everyone. “Therapy can mean a lot of things,” she explains. “It could mean helping us build healthier patterns of thinking and behaving through regular messages. In each of these cases, machines can play a very useful role.” In fact, the AI therapy app Woebot is already doing this. Building on Cognitive Behavioural Therapy (CBT), Woebot speaks to users via an AI chatbot, teaching them techniques through daily conversations to help develop new, healthier patterns of thinking.
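To make the CBT-chatbot idea concrete, here is a minimal, purely illustrative sketch of one conversational turn. This is a hypothetical rule-based example, not Woebot's actual implementation: it flags "all-or-nothing" wording in a user's message, a classic cognitive distortion that CBT teaches people to notice and reframe.

```python
# Hypothetical sketch of a rule-based CBT-style chatbot turn.
# Real apps use far more sophisticated language models; this only
# illustrates the idea of spotting a cognitive distortion and prompting
# the user to reframe it.

# Words that often signal "all-or-nothing" thinking.
ABSOLUTE_WORDS = {"always", "never", "everyone", "nobody", "everything", "nothing"}

def cbt_reply(message: str) -> str:
    """Return a simple CBT-style prompt based on the user's message."""
    words = {w.strip(".,!?").lower() for w in message.split()}
    flagged = sorted(words & ABSOLUTE_WORDS)
    if flagged:
        return (f"I noticed the word '{flagged[0]}'. That can be a sign of "
                "all-or-nothing thinking. Can you recall one time it wasn't true?")
    return "Thanks for sharing. What thought went through your mind just then?"
```

For example, a message like "I always mess things up" would trigger the reframing prompt, while a neutral message falls through to an open-ended question.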
But CBT is just one form of therapy that AI could facilitate. As an organisation dedicated to developing emotional intelligence, The School of Life builds on insights from psychodynamic psychotherapy, which focuses on the role of the unconscious in human psychology. Lubrano believes AI could play a similar role to human therapists, picking up aspects of the ‘unconscious’ by analyzing meaning in our thinking and speech that we are not yet fully conscious of ourselves.
But the use of AI in therapeutic pursuits comes with risks and limitations. The human brain is a complex machine, and person-to-person therapy nurtures a specific relationship, something R2D2 might never fully replicate. In person-to-person therapy, the therapist serves as a reparative, model relationship. Often, the therapist is modelling behaviour for us that we might not have had the privilege of getting from our real relationships with parents or caregivers. “A good therapist shows us, over a period of time, that there can be a relationship between two humans that has these healthy attributes,” explains Lubrano. This kind of therapeutic practice goes to the heart of the human search for sociability and connection, and for this reason it is unlikely a machine could ever recreate such a relationship.
Then there are big ethical questions surrounding the use of AI in mental healthcare. Not only does this kind of technology raise privacy concerns, but there is a bigger question around trusting a machine with human life. Users of Replika shared how much AI had helped with their mental health issues. Because machines are not capable of judgement, they said, they felt comfortable sharing intimate details of their struggles with their Replika bots.
However, some told of dangerous experiences when their Replika bot started spouting nonsense, giving them unhelpful advice, making obvious mistakes, or even saying offensive things during times of crisis. How do you overcome this? Well, to some extent, you can’t, the app builders claim. While human oversight plays a fundamental role in the teaching and training of algorithms, this doesn’t make them infallible. Using AI carries a certain amount of risk – as indeed does all human activity.
The greatest benefits of AI will come from humans carefully programming, teaching and training machines, an approach technologists call ‘combined intelligence’.
3 best mental health apps to download
Spill: Online counselling app
Calm: Mindfulness app voted App of the Year by Apple
Woebot: Cognitive Behavioral Therapy (CBT) chatbot