Humanoid Robot and Avatar Assisted Sign Language Tutoring for Children

Hatice Kose
Istanbul Technical University, Turkey

4:15pm-5:15pm, 6 September 2016
EM G.44


In this talk I'll summarize two projects on sign language for deaf children. In the first project, a social humanoid robot assists in sign language tutoring through multimodal interaction games for child-robot pairs based on imitation and turn-taking. Children communicate with the robot via touch, sound (speech and drumming), signs from Turkish Sign Language (TSL), non-verbal gestures, colored flashcards, and face recognition. Signs from TSL consist of basic upper-torso actions together with hand, body, and facial gestures. The humanoid robot platforms employed in the studies are able to express and recognize selected signs from TSL using HMM-based vision modules, express non-verbal gestures (e.g. smiling, nodding) and verbal comments to motivate children, detect faces, and recognize selected colored cards of cartoon characters. The games involve training sessions to familiarize the children with the robots, an interactive turn-taking part, where robot and child actively participate in the game in turns, and a test part to evaluate the children's subjective and objective perception of the robot and the game. In the second project, the humanoid robot is replaced by an avatar, employed in a machine translation system built to translate written educational material into sign language for deaf primary school children.
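The abstract mentions HMM-based recognition of selected TSL signs. As a hedged illustration of the general idea (not the project's actual implementation), one HMM can be trained per sign and an observed gesture sequence assigned to the sign whose model scores it highest. A minimal sketch with discrete observation symbols and the forward algorithm, using hypothetical model names:

```python
import math

def forward_log_likelihood(obs, pi, A, B):
    """Log-likelihood of a discrete observation sequence under an HMM.

    obs: list of observation symbol indices
    pi:  initial state probabilities, length N
    A:   N x N state transition matrix
    B:   N x M emission probability matrix
    """
    n = len(pi)
    # initialise forward variables with the first observation
    alpha = [pi[i] * B[i][obs[0]] for i in range(n)]
    log_p = 0.0
    for t in range(1, len(obs)):
        # rescale at each step to avoid numerical underflow,
        # accumulating the log of the scaling factors
        s = sum(alpha)
        log_p += math.log(s)
        alpha = [a / s for a in alpha]
        alpha = [
            sum(alpha[i] * A[i][j] for i in range(n)) * B[j][obs[t]]
            for j in range(n)
        ]
    log_p += math.log(sum(alpha))
    return log_p

def classify_sign(obs, models):
    """Pick the sign whose HMM assigns the observed sequence
    the highest likelihood. models maps sign name -> (pi, A, B)."""
    return max(models, key=lambda name: forward_log_likelihood(obs, *models[name]))
```

In practice the observations would be features extracted from the vision module (e.g. quantized hand positions) rather than toy symbols, and the per-sign HMMs would be trained with Baum-Welch on recorded gesture data.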

Keywords: avatar, humanoid, sign language, imitation, interaction games, child-robot interaction


Hatice Kose is an Associate Professor at Istanbul Technical University, Turkey, where she has coordinated the Cognitive Social Robotics Lab since 2010. She received her M.S. and Ph.D. degrees from the Computer Engineering Department, Bogazici University, Turkey, in 2000 and 2006, respectively.

During her M.S. and Ph.D. studies, she worked on several research projects involving vision, localization, and multi-agent planning in robot soccer. From 2006 to 2010, she worked as a Research Fellow at the University of Hertfordshire on the EU Sixth Framework project RobotCub. She was a visiting researcher at Imperial College, UK, in 2010. Her current research focuses on gesture communication and imitation-based interaction with humanoid robots. Her motivation is to teach sign language to children with communication impairments (children with hearing impairment and children with autism) through imitation-based interaction games. She leads several national research projects on robot/avatar-assisted sign language tutoring and text-to-sign machine translation, supported by the Scientific and Technological Research Council of Turkey, and takes part in several EU projects.

Currently, she is an academic visitor at the University of Manchester as part of the EtexWeld Horizon 2020 project, working on sensor-based activity recognition via e-textiles.