Learning to Understand Natural Language in Interaction

Arash Eshghi
Heriot-Watt University, UK

3:15pm-4:15pm, 3 May 2017
EM 3.06

Abstract

Human perception and action are goal-oriented, adaptive, and domain-sensitive: they are highly tuned to specific areas of human activity and problem-solving. We learn from live interactional feedback and adjust not only our actions and plans, but our perceptions, i.e. how we conceptualise the world around us. In this talk, I will argue that capturing this domain-sensitivity of perception and action, and their systemic plasticity in the face of live feedback, is the key to better, more human-like Artificial Intelligence, spanning the fields of Robotics & HRI as well as Language Technologies. I will then focus on the problem of Semantics and Natural Language Understanding, where this issue shows itself most acutely. I'll present some of our recent work on the BABBLE project, which combines insights from Formal Linguistics and Cognitive Science with modern machine learning techniques (e.g. Deep Reinforcement Learning) to automate the development of naturally interactive conversational systems. I will show that the method allows action schemas as well as perceptual word meanings to be learned and adapted in interaction, rather than assumed as given. I will then sketch a short- to medium-term pathway to address the many challenges that remain in and around this area, as well as some ideas about how I could integrate this research within the context of MACS more broadly.

Bio

Arash Eshghi is currently a member of the Interaction Lab in the School of Mathematical and Computer Sciences at Heriot-Watt University, where he works as a researcher on the BABBLE project. Previously, he was a member of the Cognitive Science Research Group at Queen Mary University of London, where he did his PhD. His research is focused on dialogue modelling: understanding people's everyday use of language in interaction with others, and building better computational models of it. His PhD explored the processes whereby people reach mutual understanding in everyday conversation. Later, he worked on Dynamic Syntax, a formal/computational model of how people produce and understand language incrementally, word by word. He is now applying this model, in combination with machine learning techniques, to build better, more natural conversational systems, and systems that can learn language from interaction with a human partner.

Website: https://sites.google.com/site/araesh81/