Emotion assessment using Machine Learning and low-cost wearable devices

Valesi, Riccardo; Russo, Vincenzo
2020-01-01

Abstract

Advances in bioelectrical measurement technologies, together with the push for a greater impact of Brain-Computer Interfaces and Affective Computing in daily life, have made non-invasive, low-cost devices for recording physiological states available to the general population. The aim of this study is to assess the ability of the MUSE headband, combined with the Shimmer GSR+ device, to estimate people's emotional state during exposure to stimuli. Twenty-four pictures from the IAPS database were shown to 54 subjects, who rated their emotional content by means of the Self-Assessment Manikin (SAM). Using a Machine Learning approach, fifty-two scalar features were extracted from the signals and used to train six binary classifiers to predict the valence and arousal elicited by each stimulus. Across all classifiers, we obtained accuracies ranging from 53.6% to 69.9%, confirming that these devices can provide information about the emotional state.
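
The abstract does not disclose the feature set, the classifier family, or how SAM ratings were binarised. The following is a minimal, hypothetical sketch only, assuming scikit-learn, an SVM as a stand-in classifier, median-split binary labels, and synthetic placeholder data, to illustrate how 52 scalar features per trial could be used to train one binary classifier per affective dimension (valence and arousal).

# Illustrative sketch only: the classifier choice (SVM), scikit-learn, and the
# synthetic data below are assumptions, not the authors' actual pipeline.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Placeholder data: 54 subjects x 24 IAPS pictures = 1296 trials,
# 52 scalar features extracted from the EEG (MUSE) and GSR (Shimmer) signals.
n_trials, n_features = 54 * 24, 52
X = rng.normal(size=(n_trials, n_features))       # extracted scalar features
y_valence = rng.integers(0, 2, size=n_trials)     # high/low valence (e.g. SAM median split)
y_arousal = rng.integers(0, 2, size=n_trials)     # high/low arousal (e.g. SAM median split)

# One binary classifier per target dimension, evaluated with 5-fold cross-validation.
for name, y in [("valence", y_valence), ("arousal", y_arousal)]:
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
    acc = cross_val_score(clf, X, y, cv=5, scoring="accuracy")
    print(f"{name}: mean CV accuracy = {acc.mean():.3f}")

With real features and labels, the printed cross-validated accuracies would be the quantities comparable to the 53.6%-69.9% range reported in the abstract.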
Laureanti, Rita; Bilucaglia, Marco; Zito, Margherita; Circi, R.; Fici, A.; Rivetti, F.; Valesi, Riccardo; Oldrini, C.; Mainardi, L. T.; Russo, Vincenzo
File attached to this record:
(3)[Conference](2020)Emotion assessment.pdf: publisher's version, Aisberg default license, Adobe PDF, 183.34 kB (access restricted to archive administrators)

Use this identifier to cite or link to this item: https://hdl.handle.net/10446/228916
Citations
  • Scopus: 28
  • Web of Science (ISI): 19