

AI Tracks Facial Expressions To Support PTSD Detection in Children

Credit: Ben Wicks/Unsplash

Read time: 2 minutes

Diagnosing post-traumatic stress disorder (PTSD) in children presents distinct challenges, especially for those with limited verbal communication or emotional self-awareness. A research team at the University of South Florida (USF) has developed an artificial intelligence (AI) system that analyzes children’s facial expressions to help clinicians identify PTSD and track symptom changes over time.


The interdisciplinary team, led by Alison Salloum of the USF School of Social Work and Shaun Canavan from the Bellini College of Artificial Intelligence, Cybersecurity and Computing, designed the system to offer an objective and cost-effective assessment method. Published in Pattern Recognition Letters, the study is the first to integrate context-aware PTSD classification using a fully privacy-preserving approach.

Analyzing facial cues in a clinical context

Traditional diagnostic methods for PTSD in young people rely heavily on clinical interviews and self-reported questionnaires, which may be influenced by developmental stage, emotional avoidance or linguistic limitations. The USF team’s system aims to reduce subjectivity by focusing on subtle facial muscle movements captured during therapy sessions.


Instead of using raw video, the AI analyzes de-identified data such as eye gaze, head movement and the positions of facial features. The system strips out personally identifiable information and evaluates the dynamics of facial expressions in different conversational contexts, including those with parents and clinicians.
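The paper does not publish its pipeline, but the privacy-preserving idea described above can be sketched in a few lines: retain only non-identifying dynamics (gaze, head pose, landmark geometry) and discard the raw pixels before anything leaves the capture step. The record keys and the `DeidentifiedFrame` structure below are illustrative assumptions, not the authors' code.

```python
from dataclasses import dataclass

@dataclass
class DeidentifiedFrame:
    """Non-identifying per-frame signals (illustrative structure)."""
    gaze: tuple       # normalized (x, y) gaze direction
    head_pose: tuple  # (yaw, pitch, roll) in degrees
    landmarks: list   # normalized facial-landmark coordinates

def deidentify(frame_record: dict) -> DeidentifiedFrame:
    """Keep only expression dynamics; the raw image is never copied out."""
    return DeidentifiedFrame(
        gaze=frame_record["gaze"],
        head_pose=frame_record["head_pose"],
        landmarks=frame_record["landmarks"],
    )

# Example: a hypothetical frame with raw pixels attached.
record = {
    "raw_image": b"\x00" * 16,          # identifiable data, dropped below
    "gaze": (0.12, -0.05),
    "head_pose": (4.0, -2.5, 0.3),
    "landmarks": [(0.45, 0.52), (0.55, 0.52)],
}
safe = deidentify(record)               # carries no image data
```

Downstream models would then operate only on sequences of `DeidentifiedFrame`-style records, which is what makes the approach "fully privacy-preserving" in the sense the study describes.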

Data collection and model development

To develop the system, the team used footage from 18 therapy sessions in which children described emotional experiences. Each child contributed over 100 minutes of video, with around 185,000 frames per session. The AI models processed these data to detect patterns in facial movement linked to PTSD.
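The reported figures are internally consistent: about 185,000 frames spread over a little more than 100 minutes of video works out to roughly 30 frames per second, a standard video frame rate. (The fps interpretation is an inference from the article's numbers, not a figure the paper states.)

```python
# Sanity check on the article's reported numbers.
frames_per_session = 185_000   # "around 185,000 frames per session"
minutes_of_video = 100         # "over 100 minutes of video"

fps = frames_per_session / (minutes_of_video * 60)
print(f"{fps:.1f}")  # ~30.8, i.e. an ordinary ~30 fps recording
```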


The findings revealed that children with PTSD exhibited distinguishable facial expression patterns. Moreover, the analysis indicated that children’s emotional expressions were more revealing during clinician-led interviews than during parent-child conversations. This observation aligns with psychological literature suggesting children may be more expressive with therapists and may suppress emotions in the presence of parents.

Potential applications and future directions

While the system is not intended to replace human clinicians, it may eventually offer real-time feedback during sessions and provide a non-invasive way to monitor recovery. The researchers are planning further studies to assess how factors like age, gender and cultural background might affect facial expression analysis. They are particularly interested in expanding the system’s applicability to preschool-aged children, for whom verbal reporting is often limited.


The study included participants with diverse clinical backgrounds, including comorbid conditions such as anxiety, depression and attention-deficit/hyperactivity disorder (ADHD). These complexities reflect real-world scenarios and suggest the system may perform well in clinical settings.


Although still in early development, the AI-based tool offers a promising complement to existing diagnostic strategies. If validated in larger trials, it may support more accurate and timely identification of PTSD symptoms in young patients while minimizing distress during assessment.


Reference: Aathreya S, Nourivandi T, Salloum A, et al. Multimodal, context-based dataset of children with Post Traumatic Stress Disorder. Pattern Recognit Lett. 2025. doi: 10.1016/j.patrec.2025.05.003


This content includes text that has been generated with the assistance of AI. Technology Networks' AI policy can be found here.