
A FACIAL EMOTIONS RECOGNITION APPLICATION FOR SUBJECTS WITH AUTISM SPECTRUM DISORDER

F. Bertacchini, L. Gabriele, P. S. Pantano, E. Bilotta
2018-01-01

Abstract

Individuals with autism spectrum disorder have difficulties in facial emotion recognition (Joseph & Tanaka, 2002). Although these difficulties have long been investigated using different technologies (Alves et al., 2013; Bertacchini et al., 2013; Kim et al., 2015), the results have not yet yielded a shared identification of all the variables involved, nor common findings. To investigate this complex problem, an advanced application has been designed and implemented, incorporating the most useful features of other tools currently available on the market. It displays 3D faces expressing the six basic emotions: Joy, Sadness, Anger, Fear, Disgust, and Surprise. Furthermore, the application serves broader educational purposes: it records both the mathematical recognition parameters for each emotion and each subject, and several performance variables (number of trials, recognition time, percentage of success, total time). To demonstrate the effectiveness and robustness of the developed application, eighteen subjects diagnosed with high-functioning autism spectrum disorder (aged between 5 and 18) took part in an experiment aimed at evaluating the most recognized parameters for each emotion, precisely identifying the parametric configuration of each emotion that each subject recognizes, across the whole emotional spectrum. This allowed the investigation of the most recognized basic emotions within the sample. The results were then used to implement specific personalized paths in the application, deployed on the device to improve its effectiveness. Even though these outcomes are preliminary, they highlight that the new system could have many educational applications for users with impairments in recognizing facial emotions.
978-84-09-02709-5


Use this identifier to cite or link to this document: https://hdl.handle.net/20.500.11770/288854
