A University of Sussex study finds that mental health chatbots are most effective when users form an emotional connection with their AI therapist. Published in the journal Social Science & Medicine, the research highlights both the potential benefits and the psychological risks of what the authors term “synthetic intimacy.” With more than one in three U.K. residents now using AI tools for mental health support, the findings are timely.
Key Findings on Emotional Connections
The study analyzed feedback from approximately 4,000 users of Wysa, a widely used mental health application endorsed by the NHS under its Talking Therapies program. The results showed that therapy outcomes improved when users developed emotional intimacy with their AI therapists.
Dr. Runyu Shi, an assistant professor at the University of Sussex, stated: “Forming an emotional bond with an AI sparks the healing process of self-disclosure.” While many users report positive experiences, the study also warns of the risks of synthetic intimacy. Dr. Shi noted that vulnerable individuals can become trapped in a self-reinforcing cycle in which the chatbot fails to challenge harmful perceptions, potentially delaying necessary clinical intervention.
Understanding Synthetic Intimacy
Synthetic intimacy is becoming increasingly prevalent, with reports of individuals forming relationships, and even marriages, with AI. The researchers identified a cyclical process: users share personal information, an intimate behavior that elicits emotional responses such as feelings of safety and freedom from judgment. This cycle can foster positive changes in well-being, including increased self-confidence and energy levels.
Users often attribute human-like characteristics to the app, referring to Wysa as a friend, companion, or therapist. These attributions underscore the complexity of human-AI interaction and the emotional dimensions of mental health support.
Professor Dimitra Petrakaki from the University of Sussex emphasized the need for awareness around synthetic intimacy. “This is a reality of modern life. Policymakers and app designers should consider how to escalate cases when AI detects users in urgent need of clinical intervention,” she advised.
With mental health resources stretched thin, organizations like Mental Health UK are advocating for immediate safeguards to ensure users receive safe and appropriate information. The study’s findings highlight the importance of balancing the benefits of AI therapy with the need for professional support, especially for those facing significant mental health challenges.
The implications of this research are far-reaching as mental health care continues to evolve. With AI increasingly filling gaps left by traditional services, understanding how users interact with these technologies will be crucial to shaping future mental health strategies.
For more information, see the study by Runyu Shi et al., “User-AI intimacy in digital health,” Social Science & Medicine (2025). DOI: 10.1016/j.socscimed.2025.118853.
