Emotive Response to a Hybrid-Face Robot and Translation to Consumer Social Robots

Wairagkar M, Lima MR, Bazo D, Craig R, Weissbart H, Etoundi AC, Reichenbach T, Iyengar P, Vaswani S, James C, Barnaghi P, Melhuish C, Vaidyanathan R (2021)


Publication Type: Journal article

Publication year: 2021

Journal: IEEE Internet of Things Journal

DOI: 10.1109/JIOT.2021.3097592

Abstract

We present the conceptual formulation, design, fabrication, control, and commercial translation of an IoT-enabled social robot, mapped through validation of human emotional responses to its affective interactions. The robot design centres on a humanoid hybrid-face that integrates a rigid faceplate with a digital display, simplifying the conveyance of complex facial movements while providing the impression of three-dimensional depth. We map the robot's emotions to specific facial feature parameters, characterise the recognisability of archetypal facial expressions, and introduce pupil dilation as an additional degree of freedom for emotion conveyance. Human interaction experiments demonstrate that the hybrid-face robot can effectively convey emotion to humans. Conveyance is quantified by studying neurophysiological electroencephalography (EEG) responses to perceived emotional information as well as through qualitative interviews. Results demonstrate that core hybrid-face robotic expressions can be discriminated by humans (80%+ recognition) and evoke face-sensitive event-related potentials such as the N170 and Vertex Positive Potential in EEG. The hybrid-face robot concept has been modified, implemented, and released by Emotix Inc in the commercial IoT robotic platform Miko ('My Companion'), an affective robot currently in use for human-robot interaction with children. We demonstrate that human EEG responses to Miko's emotions are comparable to those elicited by the hybrid-face robot, validating design modifications implemented for large-scale distribution. Finally, interviews show expression recognition rates above 90% for our commercial robot. We conclude that the simplified hybrid-face abstraction conveys emotions effectively and enhances human-robot interaction.
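To illustrate the parametric approach the abstract describes (mapping archetypal emotions to facial feature parameters, with pupil dilation as an extra degree of freedom), the following is a minimal sketch only. The parameter names, ranges, and values are assumptions made for illustration and are not taken from the authors' published implementation.

```python
# Illustrative sketch only: the paper maps robot emotions to facial feature
# parameters and adds pupil dilation as an extra degree of freedom, but the
# parameter names, ranges, and values below are hypothetical assumptions,
# not the authors' actual design.

from dataclasses import dataclass


@dataclass
class FaceParams:
    """Hypothetical facial feature parameters for the digital display."""
    eyebrow_angle: float    # degrees; positive = raised brows
    eye_openness: float     # 0.0 (closed) .. 1.0 (wide open)
    mouth_curvature: float  # -1.0 (frown) .. 1.0 (smile)
    pupil_dilation: float   # 0.0 (constricted) .. 1.0 (fully dilated)


# Example archetypal expressions encoded as parameter sets (values are made up).
EMOTION_MAP = {
    "happy":     FaceParams(eyebrow_angle=10.0,  eye_openness=0.8, mouth_curvature=0.9,  pupil_dilation=0.7),
    "sad":       FaceParams(eyebrow_angle=-15.0, eye_openness=0.4, mouth_curvature=-0.8, pupil_dilation=0.3),
    "surprised": FaceParams(eyebrow_angle=25.0,  eye_openness=1.0, mouth_curvature=0.2,  pupil_dilation=1.0),
    "angry":     FaceParams(eyebrow_angle=-25.0, eye_openness=0.9, mouth_curvature=-0.6, pupil_dilation=0.5),
    "neutral":   FaceParams(eyebrow_angle=0.0,   eye_openness=0.6, mouth_curvature=0.0,  pupil_dilation=0.5),
}


def render_emotion(name: str) -> FaceParams:
    """Look up the parameter set a display renderer would animate toward."""
    return EMOTION_MAP[name]


if __name__ == "__main__":
    print(render_emotion("surprised"))
```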

How to cite

APA:

Wairagkar, M., Lima, M.R., Bazo, D., Craig, R., Weissbart, H., Etoundi, A.C.,... Vaidyanathan, R. (2021). Emotive Response to a Hybrid-Face Robot and Translation to Consumer Social Robots. IEEE Internet of Things Journal. https://doi.org/10.1109/JIOT.2021.3097592

MLA:

Wairagkar, Maitreyee, et al. "Emotive Response to a Hybrid-Face Robot and Translation to Consumer Social Robots." IEEE Internet of Things Journal (2021).
