Mental health apps work best when you feel emotionally close to your chatbot, study reveals
By: Vicky Welstead
Last updated: Friday, 12 December 2025
Mental health chatbots work best when people form an emotional connection with the AI, according to new research by the University of Sussex.
With more than one in three UK residents now using AI to support their mental health, a new study highlights both the key to effective chatbot therapy and the need for responsible design to protect users.
Analysis of feedback from 4,000 users of a market-leading mental health app found that people experience positive changes in their wellbeing when they develop emotional intimacy with the AI. However, the study also raises fresh questions about how any AI that people may turn to in moments of distress should be designed to protect them.
University of Sussex Assistant Professor Dr Runyu Shi said: “Forming an emotional bond with an AI sparks the healing process of self-disclosure. Extraordinary numbers of people say this works for them, but we need to look at how AI can be used appropriately and when cases need to be escalated. The app we studied, Wysa, has been specially designed for mental health, but many people are using standard AI and we need to make sure people don’t get stuck in a self-fulfilling loop, with dangerous perceptions going unchallenged.”
Reports of people around the globe in relationships or even marriages with artificial intelligence have put synthetic intimacy in the spotlight. The researchers say this is the extreme end of a common phenomenon and have pinpointed the stages by which intimacy with AI is generated.
The process is described as a loop, where users take part in intimate behaviour by disclosing personal information, then they have an emotional response, with feelings of gratitude, safety and freedom from judgement. This can lead to positive changes in thinking and wellbeing, such as self-confidence and higher energy levels. Over time this loop creates an intimate relationship, with human-like roles attributed to the app.
Published in Social Science & Medicine, the paper was based on feedback from users of Wysa, the consumer version of an AI-powered mental health application. The study reports that users commonly referred to the app as a friend, companion or therapist, and occasionally even as a partner.
University of Sussex Professor Dimitra Petrakaki said: “Intimacy towards AI is a fact of modern life now. Policymakers and app designers would be wise to accept this reality and consider how to ensure people get help when an AI witnesses someone in serious need of clinical intervention.”
With chatbots increasingly filling the gaps left by overstretched services, charities such as Mental Health UK are calling for urgent safeguards to make sure people receive safe and appropriate information.