Corresponding author: Patrizia Grifoni (patrizia.grifoni@irpps.cnr.it)

© Alessia D'Andrea, Maria Chiara Caschera, Fernando Ferri, Patrizia Grifoni. This is an open access article distributed under the terms of the Creative Commons Attribution-NoDerivatives License (CC BY-ND 4.0), which allows reusers to copy and distribute the material in any medium or format in unadapted form only, provided that attribution is given to the creator. The license permits commercial use.

Citation: D'Andrea A, Caschera MC, Ferri F, Grifoni P (2021) MuBeFE: Multimodal Behavioural Features Extraction Method. JUCS - Journal of Universal Computer Science 27(3): 254-284. https://doi.org/10.3897/jucs.66375
The paper provides a method to analyse and observe the characteristics that distinguish an individual's communication style, such as voice intonation, the size and slant used in handwriting, and the stroke, pressure and dimension used in sketching. These features are referred to as Communication Extensional Features. From the Communication Extensional Features, the user's behavioural features, such as communicative intention, social style and personality traits, can be extracted. These behavioural features are referred to as Communication Intentional Features. For the extraction of Communication Intentional Features, the paper provides a method based on Hidden Markov Models. The Communication Intentional Features are extracted at both the modal and the multimodal level; this represents an important novelty of the paper. The accuracy of the method was tested at both levels: the evaluation results indicate an accuracy of 93.3% for the Modal layer (handwriting layer) and 95.3% for the Multimodal layer.
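The extraction step described above treats the intentional features (e.g. communicative intentions) as hidden states of an HMM whose observations are the extensional features. As a minimal sketch of this idea, the following Viterbi decoder recovers the most likely hidden-state sequence from a sequence of observations; all state names, observation symbols and probabilities here are illustrative assumptions, not the paper's actual model.

```python
# Hedged sketch: decoding hidden behavioural states from observable
# communication features with an HMM (Viterbi algorithm). The states,
# observations and probabilities are invented for illustration only.

def viterbi(obs, states, start_p, trans_p, emit_p):
    """Return the most likely hidden-state sequence for `obs`."""
    # V[t][s] = (best probability of reaching state s at time t, predecessor)
    V = [{s: (start_p[s] * emit_p[s][obs[0]], None) for s in states}]
    for t in range(1, len(obs)):
        V.append({})
        for s in states:
            prob, prev = max(
                (V[t - 1][p][0] * trans_p[p][s] * emit_p[s][obs[t]], p)
                for p in states
            )
            V[t][s] = (prob, prev)
    # Backtrack from the most probable final state.
    last = max(states, key=lambda s: V[-1][s][0])
    path = [last]
    for t in range(len(obs) - 1, 0, -1):
        path.append(V[t][path[-1]][1])
    return list(reversed(path))

# Toy example: infer a communicative intention ("assertive" vs
# "hesitant") from observed handwriting pressure levels.
states = ["assertive", "hesitant"]
start_p = {"assertive": 0.5, "hesitant": 0.5}
trans_p = {"assertive": {"assertive": 0.8, "hesitant": 0.2},
           "hesitant": {"assertive": 0.3, "hesitant": 0.7}}
emit_p = {"assertive": {"high": 0.7, "low": 0.3},
          "hesitant": {"high": 0.2, "low": 0.8}}
obs = ["high", "high", "low"]
print(viterbi(obs, states, start_p, trans_p, emit_p))
```

In the paper's setting, separate models of this kind can be applied per modality (e.g. handwriting) and then at the multimodal level; the sketch above only shows the single-modality decoding step.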