Human Self-touch vs Other-Touch Resolved by Machine Learning
Perception, INteraction, Robotique sociales
Conference paper, Year: 2022


Abstract

Using a database of vibratory signals captured from the index finger of participants performing self-touch or touching another person, we asked whether these signals contain information that enables automatic classification into self-touch and other-touch categories. The database included signals in which the tactile pressure, the sliding speed, and the touching posture were each varied systematically. We found that, using standard sound feature extraction, a random forest classifier predicted with greater than 90% accuracy whether a signal came from self-touch or from other-touch, regardless of the variation of the other factors. This result demonstrates that tactile signals produced during active touch contain latent cues that could play a role in the distinction between touching and being touched, and which could have important applications in the creation of artificial worlds and in the study of social interactions, sensory deficits, and cognitive conditions.
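The abstract names the overall pipeline (sound-style features fed to a random forest) but not the specific features, hyperparameters, or data. The sketch below is only an illustration of that kind of pipeline on synthetic stand-in signals: the two vibration classes, their frequencies, the sampling rate, and the log-band-energy features are all assumptions, not the authors' actual method or data.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
FS = 1000  # assumed sampling rate in Hz (not from the paper)

def make_signal(kind, n=1024):
    # Hypothetical stand-in data: the two touch classes are modeled as
    # noisy sinusoids at different dominant frequencies.
    t = np.arange(n) / FS
    f = 40.0 if kind == "self" else 120.0
    return np.sin(2 * np.pi * f * t) + 0.5 * rng.standard_normal(n)

def spectral_features(x, n_bands=16):
    # Log energy in equal-width frequency bands: a simple stand-in for
    # the "standard sound feature extraction" mentioned in the abstract.
    spectrum = np.abs(np.fft.rfft(x)) ** 2
    bands = np.array_split(spectrum, n_bands)
    return np.log([band.sum() for band in bands])

# Build a balanced toy dataset of 200 labeled signals.
kinds = ["self", "other"] * 100
X = np.array([spectral_features(make_signal(k)) for k in kinds])
y = np.array([0 if k == "self" else 1 for k in kinds])

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)
acc = clf.score(X_test, y_test)
print(f"test accuracy: {acc:.2f}")
```

On such cleanly separable synthetic classes the classifier reaches near-perfect accuracy; the paper's result concerns real vibratory recordings, where the >90% figure holds across variations in pressure, speed, and posture.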
Main file: Ramasamy-et-al-EH-2022.pdf (270.51 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-03937431, version 1 (04-12-2023)

Identifiers

Cite

Aruna Ramasamy, Damien Faux, Vincent Hayward, Malika Auvray, Xavier Job, et al. Human Self-touch vs Other-Touch Resolved by Machine Learning. 13th International Conference on Human Haptic Sensing and Touch Enabled Computer Applications, EuroHaptics 2022, May 2022, Hamburg, Germany. pp. 216-224, ⟨10.1007/978-3-031-06249-0_25⟩. ⟨hal-03937431⟩