KCL Social AI & Robotics Lab (SAIR)


Team selfie taken by Kaiko!

We are part of the Centre for Robotics Research (CoRe), Department of Engineering, King's College London, United Kingdom. Our research focuses on computer vision and machine learning for artificial intelligence and human-machine interaction. In particular, we are interested in learning multimodal representations of human behaviour and the environment from data, and in integrating such models into the perception, learning and control of real-world systems such as robots. Key application areas include, but are not limited to, autonomous systems, intelligent interfaces and assistive technologies in healthcare, education, and public and personal spaces; indeed, any area that demands human-machine interaction.

Interested in joining us? Check out [this link] and email Oya.

News


Team Members


Oya Celiktutan, Director
Assistant Professor / Lecturer in Robotics

Viktor Schmuck, PhD Student
Socially Aware Robotic Assistance

Edoardo Cetin, PhD Student
Deep Reinforcement Learning

Jian Jiang, PhD Student
Continual Learning for Human-Robot Interaction

Gerard Canal, Associate Member
Postdoctoral Researcher, Department of Informatics

Miriam Redi, Visiting Research Fellow
Senior Research Scientist, Wikimedia Foundation

Iman Ismail, PhD Student
Personalised Machine Learning

Nguyen Tan Viet Tuyen, Postdoctoral Researcher
Social Human-Robot Interaction

Robots


Kaiko, Human Support Robot


Emo, NAO Robot


Publications (since 2020)


Learning Routines for Effective Off-policy Reinforcement Learning
Edoardo Cetin, and Oya Celiktutan
International Conference on Machine Learning 2021 (ICML'21)
[Project page]
IB-DRR: Incremental Learning with Information-Back Discrete Representation Replay
Jian Jiang, Edoardo Cetin, and Oya Celiktutan
CVPR Workshop on Continual Learning 2021 (CLVision'21)
[Paper]
Domain-Robust Visual Imitation Learning with Mutual Information Constraints
Edoardo Cetin, and Oya Celiktutan
International Conference on Learning Representations 2021 (ICLR'21)
[Project page]
Robocentric Conversational Group Discovery
Viktor Schmuck, Tingran Sheng, and Oya Celiktutan
Proceedings of the 29th IEEE International Conference on Robot & Human Interactive Communication 2020 (RO-MAN'20)
[Project page]
RICA: Robocentric Indoor Crowd Analysis Dataset
Viktor Schmuck, and Oya Celiktutan
Proceedings of the UKRAS20 Conference: “Robots into the Real World”, 2020
[Project page]
Inferring Student Engagement in Collaborative Problem Solving from Visual Cues
Angelika Kasparova, Oya Celiktutan, and Mutlu Cukurova
Companion Publication of the 2020 International Conference on Multimodal Interaction (ICMI'20 Companion)
[Project page]

Projects


EPSRC New Investigator Award, 2021 - 2023
LISI - Learning to Imitate Nonverbal Communication Dynamics for Human-Robot Social Interaction
[Project page]

Partners


Toyota Motor Europe
National Gallery X

Invited Talks


Date | Where | Speaker(s) | Title | Slides/Video
25/03/2021 | INRIA Perception Team, Deeptails Seminar Series | Oya, Edoardo | Towards Observational Imitation Learning | [slides-part1, slides-part2] [video]
04/03/2021 | Nokia Bell Labs, Social Dynamics Seminar Series | Oya | Visual Learning of Group Interaction Dynamics | [slides] [video]

Social Events


Our 2020 Christmas activity for the team was to play Dungeons and Dragons. We had a lot of fun!
Due to Covid-19, we had to move our group meetings to Hyde Park. Not bad actually! :)
Viktor and Kaiko met with Olympic athletes!
We had a lot of fun playing Star Wars together!
The SAIR Lab was established on the 1st of October, 2019!