Detailed Information


Machine learning prediction of anxiety symptoms in social anxiety disorder: utilizing multimodal data from virtual reality sessions

Authors
Park, Jin-Hyun; Shin, Yu-Bin; Jung, Dooyoung; Hur, Ji-Won; Pack, Seung Pil; Lee, Heon-Jeong; Lee, Hwamin; Cho, Chul-Hyun
Issue Date
Jan-2025
Publisher
Frontiers Media S.A.
Keywords
machine learning; multimodal data; digital phenotyping; digital psychiatry; social anxiety disorder; virtual reality intervention; anxiety prediction
Citation
Frontiers in Psychiatry, v.15
Indexed
SCIE
SSCI
SCOPUS
Journal Title
Frontiers in Psychiatry
Volume
15
URI
https://scholarworks.korea.ac.kr/kumedicine/handle/2021.sw.kumedicine/76278
DOI
10.3389/fpsyt.2024.1504190
ISSN
1664-0640
Abstract
Introduction: Machine learning (ML) is an effective tool for predicting mental states and is a key technology in digital psychiatry. This study aimed to develop ML algorithms that predict the upper tertile group of various anxiety symptoms from multimodal data collected during virtual reality (VR) therapy sessions for patients with social anxiety disorder (SAD), and to evaluate their predictive performance across data types.

Methods: This study included 32 individuals diagnosed with SAD and finalized a dataset of 132 samples from 25 participants. Multimodal (physiological and acoustic) data were collected during VR sessions simulating social anxiety scenarios. Acoustic features were extracted with the extended Geneva Minimalistic Acoustic Parameter Set (eGeMAPS), and statistical attributes were derived from the time-series physiological responses. We developed ML models that predict the upper tertile group for various anxiety symptoms in SAD using Random Forest, extreme gradient boosting (XGBoost), light gradient boosting machine (LightGBM), and categorical boosting (CatBoost). The best hyperparameters were explored through grid search or random search, and the models were validated using stratified cross-validation and leave-one-out cross-validation.

Results: The CatBoost model using multimodal features showed high performance, particularly for the Social Phobia Scale, with an area under the receiver operating characteristic curve (AUROC) of 0.852. It also performed strongly in predicting cognitive symptoms, with the highest AUROC of 0.866 for the Post-Event Rumination Scale. For generalized anxiety, the LightGBM prediction for the State-Trait Anxiety Inventory-trait achieved an AUROC of 0.819. In the same analyses, models using only physiological features yielded AUROCs of 0.626, 0.744, and 0.671, whereas models using only acoustic features yielded AUROCs of 0.788, 0.823, and 0.754.

Conclusions: This study showed that an ML algorithm integrating multimodal data can predict upper tertile anxiety symptoms in patients with SAD with higher performance than models using only acoustic or physiological data obtained during VR sessions. These results can serve as evidence for personalizing VR sessions and demonstrate the strength of multimodal data for clinical use.
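As a rough illustration of the modeling setup described in the abstract, the sketch below labels the upper tertile of a clinical anxiety score (e.g., a Social Phobia Scale total), tunes a gradient-boosting classifier (LightGBM here) with grid search, and reports stratified cross-validated AUROC. The file paths, score column name, and hyperparameter grid are hypothetical placeholders and are not taken from the study.

    # Minimal sketch, assuming hypothetical feature/label files; not the authors' code.
    import pandas as pd
    from lightgbm import LGBMClassifier
    from sklearn.model_selection import StratifiedKFold, GridSearchCV

    # Hypothetical multimodal feature table: eGeMAPS acoustic statistics joined with
    # summary statistics of physiological time series, one row per VR session sample.
    X = pd.read_csv("multimodal_features.csv")          # placeholder path
    scores = pd.read_csv("anxiety_scores.csv")["SPS"]   # placeholder column name

    # Binary target: membership in the upper tertile of the anxiety score.
    y = (scores >= scores.quantile(2 / 3)).astype(int)

    # Gradient-boosting classifier tuned over a small illustrative hyperparameter grid,
    # scored by AUROC under stratified 5-fold cross-validation.
    cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=42)
    search = GridSearchCV(
        LGBMClassifier(random_state=42),
        param_grid={"num_leaves": [15, 31], "learning_rate": [0.05, 0.1]},
        scoring="roc_auc",
        cv=cv,
    )
    search.fit(X, y)

    print(f"Best cross-validated AUROC: {search.best_score_:.3f}")

The same pattern applies to the other scales and model families mentioned in the abstract (e.g., CatBoost or XGBoost classifiers), with leave-one-out cross-validation substituted where sample counts are small.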
Appears in Collections
4. Research institute > Chronobiology Institute > 1. Journal Articles
1. Basic Science > Department of Medical Informatics > 1. Journal Articles


Related Researcher

Lee, Hwa Min
College of Medicine (Department of Medical Informatics)
