Automated Characterization of Mouth Activity for Stress and Anxiety Assessment

Title: Automated Characterization of Mouth Activity for Stress and Anxiety Assessment
Publication Type: Conference Paper
Year of Publication: 2016
Authors: Pampouchidou A., Pediaditis M., Chiarugi F., Marias K., Simos P., Yang F., Meriaudeau F., Tsiknakis M.
Conference Name: 2016 IEEE International Conference on Imaging Systems and Techniques (IST)
Publisher: IEEE; IEEE Instrumentation & Measurement Society
Conference Location: 345 E 47th St, New York, NY 10017 USA
ISBN Number: 978-1-5090-1817-8
Keywords: Anxiety, automatic assessment, image processing, mouth gesture recognition, stress
Abstract

Non-verbal information conveyed by human facial expression encompasses, apart from emotional cues, information relevant to psychophysical status. Mouth activity in particular has been found to correlate with signs of several conditions: depressed people smile less, while fatigued people yawn more. In this paper, we present a semi-automated, robust and efficient algorithm for extracting mouth activity from video recordings based on Eigen-features and template matching. The algorithm was evaluated for mouth openings and mouth deformations on a minimum-specification dataset of 640x480 resolution at 15 fps. The extracted features were the signals of mouth expansion (openness estimation) and correlation (deformation estimation). The achieved classification accuracy reached 89.17%. A second series of experiments, for a preliminary evaluation of the proposed algorithm in assessing stress/anxiety, was conducted on an additional dataset. The proposed algorithm showed consistent performance across both datasets, indicating high robustness. Furthermore, normalized openings per minute and average openness intensity were extracted as video-based features, revealing a significant difference between video recordings of stressed/anxious and relaxed subjects.
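The abstract describes a correlation signal obtained by template matching: the mouth region in each frame is compared against a reference (neutral) mouth template, and the correlation drops as the mouth deforms. The paper's actual implementation is not reproduced here; the following is a minimal illustrative sketch of one common choice, normalized cross-correlation, in plain NumPy. The function name, the toy data, and the use of NCC itself are assumptions for illustration, not the authors' code.

```python
import numpy as np

def ncc(template, patch):
    """Normalized cross-correlation between a template and an equally
    sized image patch; 1.0 means identical up to brightness/contrast
    changes, lower values indicate increasing deformation."""
    t = template.astype(float) - template.mean()
    p = patch.astype(float) - patch.mean()
    denom = np.sqrt((t * t).sum() * (p * p).sum())
    return float((t * p).sum() / denom) if denom > 0 else 0.0

# Toy stand-ins for mouth-region patches (random textures, not real frames).
rng = np.random.default_rng(0)
template = rng.random((20, 40))          # "neutral mouth" reference template
same = template.copy()                   # frame with no deformation
deformed = np.roll(template, 5, axis=1)  # crude stand-in for a deformed mouth

print(ncc(template, same))      # ~1.0: mouth matches the neutral template
print(ncc(template, deformed))  # noticeably lower: correlation signal drops
```

Evaluating this per frame yields the deformation-estimation signal the abstract refers to; the openness signal would come from a separate mouth-expansion measurement.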