
Harnessing AI as a Wellness Tool for Teenagers’ Mental Health

Credit: Pixabay

Read time: 4 minutes

Current options for teenagers looking to improve their mental health can be prohibitively expensive or inaccessible, or offer limited opportunities for engagement. The developers of Kai – an AI-powered personal guide and companion – aim to address these issues by providing a personalized platform that can help young people take control of their own well-being. Users can interact with Kai through familiar platforms such as WhatsApp and Messenger, receiving regular motivational reminders and exercises.


To learn more about Kai and the potential benefits of using AI as a wellness tool for mental health, Technology Networks spoke to Alex Frenkel, CEO and co-founder of Kai. In this interview, Alex also explains how the data that teenagers share with Kai is protected and how the platform responds if an urgent situation is communicated.


Ruairi J Mackenzie (RM): Your website states that “Kai is an AI-powered, personal companion designed to help ease anxiety, depression, sleeping disorders, and many other psychological stressors by integrating wellness tools, techniques and exercises according to the Acceptance and Commitment Therapy (ACT) model.” Does this mean Kai offers a form of psychotherapy or not? If yes, do you believe that an AI can effectively replace a human in offering psychotherapy? If not, do you believe that offering wellness advice is an effective substitute for psychotherapy?


Alex Frenkel (AF): Mental health issues are a growing concern today, and there are many outlets, resources and tools available – therapy, life coaching, meditation apps, etc. The problem is that many of these solutions are not accessible or affordable for everyone; half of the adolescents who need mental health treatment never receive it. Kai offers a form of psychotherapy, but it is not intended to replace other necessary forms of care.

 

As an AI-powered personal guide and companion, Kai uses a combination of human insight and machine learning to motivate teenagers to consistently commit to showing up for themselves so that they can take control of their own well-being. It accomplishes this by acting as a companion and accountability partner, engaging users with personalized questions, and proactively surfacing insights and content based on past interactions.


Kai draws upon a range of therapeutic modalities including ACT (Acceptance and Commitment Therapy), CBT (Cognitive Behavioral Therapy), positive psychology and coaching psychology, to engage with teens conversationally and interactively, so that it feels safe and familiar. The AI allows each interaction to be custom-tailored to each person’s specific needs.


Kai also leverages messaging APIs and natural language processing tools such as Google Dialogflow to manage user conversations, understand their intent and automate responses from its platform.
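
As an illustration of the kind of integration described above, the minimal sketch below shows how a messaging backend might pass an incoming user message to Google Dialogflow's detect-intent API and read back the matched intent and fulfillment text. The project ID, session handling and reply logic are assumptions for the example, not details of Kai's actual implementation.

```python
# Minimal sketch: routing a chat message through Dialogflow to detect intent
# and produce an automated reply. Requires the google-cloud-dialogflow package
# and Google Cloud credentials; project and session IDs here are placeholders.
from google.cloud import dialogflow

def reply_to_message(project_id: str, session_id: str, text: str) -> str:
    session_client = dialogflow.SessionsClient()
    session = session_client.session_path(project_id, session_id)

    # Wrap the raw chat message as a Dialogflow text query.
    query_input = dialogflow.QueryInput(
        text=dialogflow.TextInput(text=text, language_code="en")
    )
    response = session_client.detect_intent(
        request={"session": session, "query_input": query_input}
    )

    result = response.query_result
    print(f"Matched intent: {result.intent.display_name} "
          f"(confidence {result.intent_detection_confidence:.2f})")
    # The fulfillment text is the automated response configured for the intent.
    return result.fulfillment_text

# Example usage (placeholder IDs):
# print(reply_to_message("my-gcp-project", "user-1234", "I feel anxious today"))
```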


Our comprehensive psychological training programs are developed from the ground up and packed into simple, bite-sized sessions entirely within the conversational structure of Kai. Ultimately, Kai helps teenagers become more self-aware and teaches them how to overcome their current challenges to thrive and reach their full potential.


RM: Is there any published clinical evidence that Kai can help ease anxiety or depression? If not, are you planning such studies?


AF: Currently, we have two research papers that were accepted for peer review by the Journal of Medical Internet Research. In both papers, we explore the benefits of using AI as a wellness tool for mental health.


Almost 50% of 11-year-olds in the United States have a mobile phone, and this figure rises to 85% among 14-year-olds. Additionally, in the US, adolescents aged 13 to 18 spend more than three hours every day on their mobile devices. Given these findings, we determined that the mobile device is a highly accessible tool for preventive mental health care.


The first study focuses on the well-being of adolescents while using an AI-powered ACT tool, while the other tests the suitability of an AI-driven intervention delivered directly through texting apps. Each study was conducted over an extended period and sampled more than 50,000 participants – a massive feat for any wellness study – and the results are even more exciting. One study showed that participants’ well-being increased according to the 5-item World Health Organization Well-Being Index questionnaire. The other indicated that an AI-driven intervention delivered through texting apps increases treatment intensity and integrates therapeutic strategies into daily life.
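
For context, the 5-item WHO Well-Being Index mentioned above is scored by summing the five item ratings (each 0 to 5) and multiplying the raw total by 4, giving a 0 to 100 percentage where higher values indicate better well-being. The short sketch below illustrates that calculation; the example ratings are invented for illustration and are not taken from the studies.

```python
def who5_percentage(item_scores):
    """Compute the WHO-5 Well-Being Index percentage score.

    item_scores: five integers, each 0 (at no time) to 5 (all of the time),
    one per questionnaire item. The raw score (0-25) is multiplied by 4 to
    give a 0-100 percentage; higher means better well-being.
    """
    if len(item_scores) != 5 or any(not 0 <= s <= 5 for s in item_scores):
        raise ValueError("WHO-5 expects five item scores in the range 0-5")
    return sum(item_scores) * 4

# Example with made-up ratings before and after a wellness programme:
baseline = who5_percentage([2, 3, 2, 1, 2])   # 40
follow_up = who5_percentage([4, 4, 3, 3, 4])  # 72
print(f"Well-being change: {baseline} -> {follow_up}")
```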


Research is a crucial pillar of what we do at Kai. Dana Vertsberger heads up our research department here. She and our entire staff continually look into developing more studies to improve our offering and add to the breadth of research on this critical topic.


RM: Human therapists work under strict safeguarding procedures, especially when working with children. If a teenager were to share, for example, an intention to complete suicide, how would Kai react and who would it inform?


AF: Conversations range in severity across the entire Kai platform. The system is trained to flag and track when life-threatening content, such as talk of suicide, comes up in a chat. These conversations are flagged, and critical resources such as suicide prevention hotline numbers are provided to the user for additional help – but the most important thing is that Kai is there to talk them through it. Furthermore, Kai urges users to reach out for human interaction and assistance when the situation warrants it.
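
The article does not describe Kai's internal safeguarding logic, but the hedged sketch below shows one plausible shape for the flow described above: if a detected intent falls into a high-risk category, the conversation is flagged for tracking and crisis resources (such as the 988 Suicide & Crisis Lifeline in the US) are surfaced alongside the supportive reply. The intent names, flag store and resource text are hypothetical, not Kai's actual implementation.

```python
# Hypothetical sketch of a safeguarding check layered on top of intent
# detection. Intent labels, the flag store and the resource text are
# assumptions for illustration only.
HIGH_RISK_INTENTS = {"self_harm", "suicidal_ideation"}  # hypothetical labels

CRISIS_RESOURCES = (
    "If you are in immediate danger, please call your local emergency number. "
    "In the US you can call or text 988 (Suicide & Crisis Lifeline) at any time."
)

flagged_sessions: set[str] = set()  # stand-in for a real tracking store

def safeguard(session_id: str, detected_intent: str, base_reply: str) -> str:
    """Flag high-risk conversations and prepend crisis resources to the reply."""
    if detected_intent in HIGH_RISK_INTENTS:
        flagged_sessions.add(session_id)  # track the urgent conversation
        return CRISIS_RESOURCES + "\n\n" + base_reply
    return base_reply

# Example usage with hypothetical values:
# print(safeguard("user-1234", "suicidal_ideation", "I'm here with you. Tell me more."))
```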


RM: How is the data that teens share with Kai protected? Is the data that teenagers share with Kai used to train its algorithm?


AF: Data security is one of the most important principles we uphold at Kai. One of the benefits of Kai is that the user has ultimate control of their data: users can delete all of their data and conversations with Kai simply by asking it to.
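
As a rough illustration of the "delete by command" behaviour described above, a chat backend could watch for an explicit deletion request and wipe everything stored under the user's pseudonymous session ID. The trigger phrases and storage layer below are assumptions for the sketch, not Kai's actual interface.

```python
# Hypothetical sketch of a user-initiated data deletion command.
# Trigger phrases and the in-memory store are placeholders for illustration.
DELETE_TRIGGERS = {"delete my data", "forget everything", "erase our conversations"}

conversation_store: dict[str, list[str]] = {}  # pseudonymous session ID -> messages

def handle_message(session_id: str, text: str) -> str:
    if text.strip().lower() in DELETE_TRIGGERS:
        conversation_store.pop(session_id, None)  # wipe all stored conversations
        return "All of your data and conversations have been deleted."
    conversation_store.setdefault(session_id, []).append(text)
    return "Got it."  # normal conversation handling would continue from here
```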


The platform doesn’t collect personal data from the user. In fact, users are entirely anonymous inside the system. If users would like to share “high-level” details such as age or gender, they have the opportunity to do so, but the only direct question Kai asks a user is “How would you like Kai to call you?”. Additionally, Kai complies with global government rules and regulations regarding patient health data.


Kai continues to learn from the data collected from adolescents, training its algorithm to respond in a custom, personalized way. Right now we are seeing, on average, 15,000 users engaging with Kai daily, and as the volume of data grows, so does the range of ways in which the platform can respond, ensuring that custom approach. We track this with human validation: Kai has a team that reviews conversations and marks whether a given response from Kai was “excellent” or should have provided more information. The algorithm continues to learn from this human verification process, providing better responses over time.
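
The description above suggests a human-in-the-loop labelling workflow: reviewers mark individual responses (for example "excellent" or "needs more information") and those labels are collected as training signal for later model updates. The sketch below is a guess at what recording such a judgement could look like; the label names, record format and file-based storage are assumptions.

```python
# Hypothetical sketch of a human-validation labelling step.
# Label values, the record format and the training hand-off are assumptions.
import csv
from dataclasses import dataclass, asdict

@dataclass
class ReviewRecord:
    session_id: str      # pseudonymous conversation identifier
    user_message: str
    kai_response: str
    label: str           # e.g. "excellent" or "needs_more_information"

def append_review(record: ReviewRecord, path: str = "review_labels.csv") -> None:
    """Append a reviewer's judgement to a file later used for retraining."""
    with open(path, "a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=list(asdict(record).keys()))
        if f.tell() == 0:  # write a header only when starting a new file
            writer.writeheader()
        writer.writerow(asdict(record))

# Example usage with made-up content:
# append_review(ReviewRecord("user-1234", "I can't sleep lately",
#                            "Let's try a short breathing exercise.", "excellent"))
```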


Alex Frenkel was speaking to Ruairi J Mackenzie, Senior Science Writer for Technology Networks.