
Real-time Detection of Worker's Emotions for Advanced Human-Robot Interaction during Collaborative Tasks in Smart Factories

Chiurco A.; Frangella J.; Longo F.; Padovano A.; Solina V.; Mirabelli G.; Citraro C.
2022-01-01

Abstract

Human-robot collaboration (HRC) has become increasingly popular in modern assembly systems because it combines the flexibility of human capabilities with the precision and efficiency of the collaborative robot. However, previous research has identified several challenges in achieving a genuine and natural human-robot interaction, one being the real-time adaptation of robot behavior to the worker's emotions as revealed by facial or body signals. Human emotional state recognition has been widely explored in the fields of human-machine interaction and affective computing, but a practical, real-time implementation of the technology during a collaborative task involves considerable complexities and challenges. In this paper, the authors tested and compared twelve different models, all based on Deep Learning and Convolutional Neural Networks (CNN), to recognize emotions using the CK+ and FER2013 datasets. The DeepFace algorithm proved to be the most accurate and was further tested on real subjects in working and industry-like contexts to assess its actual validity and the modifications needed for a possible large-scale industrial application. Finally, the main challenges to be addressed for a practical application of this technology in the field are discussed.
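
As a purely illustrative sketch of the kind of per-frame pipeline the abstract describes, the snippet below feeds webcam frames to the open-source deepface Python package and prints an emotion label for each detected face. It is not the authors' implementation; the camera source, the OpenCV capture loop, and the printed output are assumptions for demonstration only.

```python
# Illustrative sketch only (not the authors' code): per-frame emotion
# recognition with the open-source `deepface` package, in the spirit of
# the real-time pipeline described in the abstract.
# Assumed dependencies: pip install deepface opencv-python

import cv2
from deepface import DeepFace

cap = cv2.VideoCapture(0)  # assumption: default webcam as the video source
try:
    while True:
        ok, frame = cap.read()
        if not ok:
            break

        # analyze() performs face detection plus an emotion CNN;
        # enforce_detection=False keeps the loop alive on frames with no visible face.
        results = DeepFace.analyze(frame, actions=["emotion"], enforce_detection=False)
        if isinstance(results, dict):  # older deepface versions return a single dict
            results = [results]

        for face in results:  # one entry per detected face
            print(face["dominant_emotion"], face["emotion"])  # label and per-class scores

        cv2.imshow("emotion recognition demo", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):  # press 'q' to stop
            break
finally:
    cap.release()
    cv2.destroyAllWindows()
```

In a collaborative-robot setting, the per-class scores printed above would typically be smoothed over a short window of frames before being used to adapt robot behavior, rather than acting on single-frame predictions.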
deep learning
emotion recognition
human-robot collaboration
smart factory
social cobots


Use this identifier to cite or link to this document: https://hdl.handle.net/20.500.11770/332049

Citations
  • Scopus: 15
  • Web of Science: 11