Home Page

We are interested in enabling natural human-computer interaction by combining techniques from machine learning, computer vision, computer graphics, human-computer interaction, and psychology. Specific areas we focus on include multimodal human-computer interfaces, affective computing, pen-based interfaces, sketch-based applications, intelligent user interfaces, and applications of computer vision and machine learning to real-world problems. Browse the publications and research pages to get a flavor of IUI@Koc.

Haptic Negotiation and Role Exchange for Collaboration in Virtual Environments
We investigate how collaborative guidance can be realized in multimodal virtual environments for dynamic tasks involving motor control. Haptic guidance…
Read more.
The role of roles: Physical cooperation between humans and robots
Since the strict separation between the working spaces of humans and robots has softened due to recent robotics research…
Read more.
Analysis of Engagement and User Experience with a Laughter Responsive Social Robot
We explore the effect of laughter perception and response on engagement in human-robot interaction. We designed two distinct…
Read more.
Speech Driven Backchannel Generation using Deep Q-Network for Enhancing Engagement in Human-Robot Interaction
We present a novel method for training a social robot to generate backchannels during human-robot interaction. We address the problem… (An illustrative sketch follows below.)
Read more.
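
The state, action, and reward design used for this project are specified in the paper itself; purely as a hedged illustration of the deep Q-network idea it builds on, the sketch below maps a window of speech features to a backchannel action. The feature dimension, the three-action set (stay silent, nod, vocal backchannel), the reward, and all names are hypothetical placeholders, not the published method.

    # Minimal DQN sketch (illustrative only, not the paper's implementation).
    # Assumed setup: a 20-dim window of acoustic features per step, and three
    # hypothetical actions: 0 = stay silent, 1 = nod, 2 = vocal backchannel.
    import random
    from collections import deque

    import torch
    import torch.nn as nn

    FEATURE_DIM, N_ACTIONS, GAMMA = 20, 3, 0.99

    def make_net() -> nn.Sequential:
        return nn.Sequential(nn.Linear(FEATURE_DIM, 64), nn.ReLU(),
                             nn.Linear(64, N_ACTIONS))

    q_net, target_net = make_net(), make_net()
    target_net.load_state_dict(q_net.state_dict())
    optimizer = torch.optim.Adam(q_net.parameters(), lr=1e-3)
    replay: deque = deque(maxlen=10_000)

    def act(state: torch.Tensor, epsilon: float) -> int:
        # Epsilon-greedy choice over the current Q estimates.
        if random.random() < epsilon:
            return random.randrange(N_ACTIONS)
        with torch.no_grad():
            return int(q_net(state).argmax())

    def train_step(batch_size: int = 32) -> None:
        # One DQN update: move Q(s, a) toward r + GAMMA * max_a' Q_target(s', a').
        if len(replay) < batch_size:
            return
        batch = random.sample(replay, batch_size)
        s = torch.stack([b[0] for b in batch])
        a = torch.tensor([b[1] for b in batch])
        r = torch.tensor([b[2] for b in batch], dtype=torch.float32)
        s2 = torch.stack([b[3] for b in batch])
        q = q_net(s).gather(1, a.unsqueeze(1)).squeeze(1)
        with torch.no_grad():
            target = r + GAMMA * target_net(s2).max(dim=1).values
        loss = nn.functional.mse_loss(q, target)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()

    # Random stand-in transitions; a real system would use live speech
    # features as states and an engagement-based reward signal.
    for step in range(500):
        s = torch.randn(FEATURE_DIM)
        a = act(s, epsilon=0.1)
        replay.append((s, a, random.random(), torch.randn(FEATURE_DIM)))
        train_step()
        if step % 100 == 0:
            target_net.load_state_dict(q_net.state_dict())

Periodically syncing the target network, as in the final loop, is the standard DQN stabilization trick; everything else about the environment here is a stand-in.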
Stroke-Based Sketched Symbol Reconstruction and Segmentation
Hand-drawn objects usually consist of multiple semantically meaningful parts. For example, a stick figure consists of a head, a torso… (A toy example of the representation follows below.)
Read more.
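
As a toy illustration of the stroke-level representation this line of work operates on, the snippet below stores a sketch as a list of strokes and assigns each stroke a part label. The Stroke class, the labels, and the mean-height heuristic are hypothetical stand-ins; the actual segmenter is learned, not rule-based.

    # Toy stroke-level sketch representation (illustrative; not the paper's code).
    from dataclasses import dataclass
    from typing import List, Optional, Tuple

    @dataclass
    class Stroke:
        points: List[Tuple[float, float]]   # pen trajectory of one stroke
        label: Optional[str] = None         # semantic part, filled by the segmenter

    def segment_stick_figure(strokes: List[Stroke]) -> List[Stroke]:
        # Hypothetical stand-in segmenter: label each stroke of a stick figure
        # by its mean height; a real system would use a learned model instead.
        for stroke in strokes:
            mean_y = sum(y for _, y in stroke.points) / len(stroke.points)
            stroke.label = "head" if mean_y > 0.8 else ("torso" if mean_y > 0.4 else "legs")
        return strokes

    # A crude three-stroke stick figure on a unit canvas (y grows upward).
    sketch = [
        Stroke([(0.50, 0.90), (0.55, 0.95), (0.50, 1.00)]),
        Stroke([(0.50, 0.80), (0.50, 0.45)]),
        Stroke([(0.50, 0.45), (0.40, 0.10)]),
    ]
    for s in segment_stick_figure(sketch):
        print(len(s.points), "points ->", s.label)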
Generation of 3D Human Models and Animations Using Simple Sketches
Generating 3D models from 2D images or sketches is an important and widely studied problem in computer graphics. We describe the first…
Read more.
The ASC-Inclusion Perceptual Serious Gaming Platform for Autistic Children
‘Serious games’ are becoming extremely relevant to individuals who have specific needs, such as children with an Autism Spectrum Condition…
Read more.
HapTable: An Interactive Tabletop Providing Online Haptic Feedback for Touch Gestures
We present HapTable, a multimodal interactive tabletop that allows users to interact with digital images and objects through natural touch…
Read more.
Audio-Visual Prediction of Head-Nod and Turn-Taking Events in Dyadic Interactions
Head-nods and turn-taking both contribute significantly to conversational dynamics in dyadic interactions. Timely prediction and use of these events is quite…
Read more.
Multifaceted Engagement in Social Interaction with a Machine: the JOKER Project
This paper addresses the problem of evaluating the engagement of the human participant by combining verbal and nonverbal behaviour along with…
Read more.