Multi-User Activity Monitoring Based on Contactless Sensing
Li Q.; Lal B.; Gravina R.
2024-01-01
Abstract
Multi-user activities are essential to human communication and significantly shape social interactions, behaviors, and relationships. Understanding these activities is crucial for developing smart human-computer interaction systems, such as those used in security, safety, and healthcare applications. Recent advances in Wi-Fi signal analysis have opened up new possibilities for contactless sensing of human activities. Wi-Fi infrastructure is pervasive and offers a convenient, non-invasive means of detecting multi-user activities in indoor environments. In this paper, we propose a data-level fusion method based on Wi-Fi Channel State Information (CSI) analysis to recognize multi-user activities (e.g., walking together) and gestures (e.g., handshaking). Our approach uses artificial neural networks (ANNs) to analyze the CSI data and extract features representing different activities. We evaluate the performance of our method on a publicly available dataset and compare it to other approaches, such as those based on computer vision and wearable sensors. Our results show that off-the-shelf Wi-Fi devices can be used effectively for contactless multi-user activity recognition, providing an alternative to approaches that may be limited by occlusion or privacy concerns.
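The abstract outlines a data-level fusion of Wi-Fi CSI streams followed by an ANN classifier, but does not give the network architecture or parameters. The snippet below is therefore only a minimal illustrative sketch, not the authors' model: it concatenates CSI amplitude windows from a few assumed transmitter-receiver links at the data level and feeds them to a small fully connected network. The number of links, subcarriers, window length, and activity classes are hypothetical placeholders.

# Minimal illustrative sketch (not the authors' model): data-level fusion of
# CSI amplitude windows from several Wi-Fi links, followed by a small MLP
# classifier. All dimensions and class counts below are hypothetical.
import torch
import torch.nn as nn

N_LINKS = 2            # assumed transmitter-receiver links being fused
N_SUBCARRIERS = 30     # assumed CSI subcarriers per packet
WINDOW_LEN = 100       # assumed packets per activity window
N_CLASSES = 4          # placeholder activity/gesture classes

class FusedCsiMlp(nn.Module):
    """Concatenates per-link CSI windows (data-level fusion) and classifies them."""
    def __init__(self) -> None:
        super().__init__()
        in_features = N_LINKS * WINDOW_LEN * N_SUBCARRIERS
        self.net = nn.Sequential(
            nn.Flatten(),                  # (B, N_LINKS, WINDOW_LEN, N_SUBCARRIERS) -> (B, in_features)
            nn.Linear(in_features, 256),
            nn.ReLU(),
            nn.Linear(256, 64),
            nn.ReLU(),
            nn.Linear(64, N_CLASSES),      # raw logits, one per activity class
        )

    def forward(self, csi_windows: torch.Tensor) -> torch.Tensor:
        return self.net(csi_windows)

if __name__ == "__main__":
    model = FusedCsiMlp()
    # Random stand-in for CSI amplitude windows collected over N_LINKS links.
    batch = torch.randn(8, N_LINKS, WINDOW_LEN, N_SUBCARRIERS)
    print(model(batch).shape)  # torch.Size([8, 4])

In practice such a network would be trained with a standard cross-entropy loss on labeled activity windows; the check at the end only verifies that the fused input yields one logit vector per window.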