We present the engagement in human–robot interaction (eHRI) database, containing natural interactions between two human participants and a robot under a story-shaping game scenario. The audio-visual recordings provided with the database are fully annotated on a 5-point intensity scale for head nods and smiles, together with speech transcriptions and continuous engagement values. In addition, we present baseline results for smile and head nod detection, along with a real-time multimodal engagement monitoring system. We believe that the eHRI database will serve as a novel asset for research in affective human–robot interaction by providing raw data, annotations, and baseline results.
Key words: Engagement, Gesture, Multimodal Data, Human–Robot Interaction
Authors: Kesim E., Numanoglu T., Bayramoglu O., Turker B. B., Hussain N., Yemez Y., Erzin E., Sezgin T. M.