Work Engagement Recognition in Smart Office

Li Q.; Gravina R.
2022-01-01

Abstract

The COVID-19 pandemic has forced a sudden shift from traditional office work to smart working models, which, however, require many workers to stay at home, with a significant increase in sedentary lifestyle. Physical inactivity and chronic stress at work can also cause metabolic disorders, mental illnesses, and musculoskeletal injuries, threatening office workers' physical and psychological health. In the modern vision of smart workplaces, cyber-physical systems play a central role in augmenting objects, environments, and workers with integrated sensing, data processing, and communication capabilities. In this context, a work engagement system is proposed to monitor psycho-physical comfort and provide health suggestions to office workers. Recognizing their activities, such as sitting postures and facial expressions, could help assess their level of work engagement. In particular, head and body posture can reflect a state of engagement, boredom, or a neutral condition. In this paper, we propose a method to recognize such activities with an infrared sensor array by analyzing sitting postures. The proposed approach can unobtrusively sense these activities in a privacy-preserving way. To evaluate the performance of the system, a working scenario was set up, and the subjects' activities were annotated by reviewing video recordings. We carried out an experimental analysis and compared Decision Tree and k-NN classifiers; both showed high recognition rates for the eight postures. As for work engagement assessment, we analyzed the sitting postures to suggest that users take a break when postures such as leaning left/right with or without arm support occur very often.
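As a rough illustration of the classification step described in the abstract, the sketch below compares Decision Tree and k-NN classifiers on flattened frames from a low-resolution infrared array. The 8x8 frame size, the synthetic data, the posture labels, and the scikit-learn models and parameters are assumptions for illustration only; the record does not publish the paper's exact sensor resolution, features, or settings.

```python
# Hypothetical sketch: comparing Decision Tree and k-NN on infrared-array frames
# for eight sitting-posture classes (data, frame size, and parameters are assumed).
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import accuracy_score

POSTURES = [
    "upright", "lean forward", "lean back", "slouch",
    "lean left with arm support", "lean right with arm support",
    "lean left without arm support", "lean right without arm support",
]  # illustrative labels; the paper's exact eight classes may differ

rng = np.random.default_rng(0)

def synthetic_frames(n_per_class=100, h=8, w=8):
    """Generate fake 8x8 thermal frames: each class gets a distinct warm blob."""
    X, y = [], []
    for label in range(len(POSTURES)):
        center = (1 + (label % 4) * 2, 1 + (label // 4) * 4)
        yy, xx = np.mgrid[0:h, 0:w]
        blob = np.exp(-((yy - center[0]) ** 2 + (xx - center[1]) ** 2) / 6.0)
        for _ in range(n_per_class):
            frame = 22.0 + 8.0 * blob + rng.normal(0, 0.5, (h, w))  # deg C + noise
            X.append(frame.ravel())  # flatten pixels into a feature vector
            y.append(label)
    return np.array(X), np.array(y)

X, y = synthetic_frames()
X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=0
)

for name, clf in [
    ("Decision Tree", DecisionTreeClassifier(max_depth=10, random_state=0)),
    ("k-NN", KNeighborsClassifier(n_neighbors=5)),
]:
    clf.fit(X_tr, y_tr)
    print(f"{name}: accuracy = {accuracy_score(y_te, clf.predict(X_te)):.2f}")
```

On real data, the same comparison would be run on annotated frames collected in the working scenario; a simple frequency check on the predicted lean-left/right classes could then trigger the break suggestion mentioned in the abstract.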
2022
Infrared Sensor
Sitting Posture Recognition
Smart Office
Work Engagement

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/20.500.11770/336403

Citations
  • PMC: N/A
  • Scopus: 3
  • Web of Science (ISI): 3