Dr Oya Celiktutan
Reader in AI and Robotics
Research subject areas: Engineering
Contact details: oya.celiktutan@kcl.ac.uk | @oyaceliktutan
Publications
MASSXR 2025: The 3rd Workshop on Multi-modal Affective and Social Behavior Analysis and Synthesis in Extended Reality (Affiliated with IEEE VR 2025)
Predicting When and What to Explain from Multimodal Eye Tracking and Task Signals
Are Large Language Models Aligned with People's Social Intuitions for Human–Robot Interactions?
A Taxonomy of Explanation Types and Need Indicators in Human–Agent Collaborations
A Time Series Classification Pipeline for Detecting Interaction Ruptures in HRI Based on User Reactions
Towards a Modular Architecture for eXtended Reality Systems
When Do People Want an Explanation from a Robot?
A Multimodal Dataset for Robot Learning to Imitate Social Human-Human Interaction
A Survey of Evaluation Methods and Metrics for Explanations in Human–Robot Interaction (HRI)
It takes two, not one: context-aware nonverbal behaviour generation in dyadic interactions
Learning Pessimism for Reinforcement Learning
Neural Weight Search for Scalable Task Incremental Learning
The Impact of Robot’s Body Language on Customer Experience: An Analysis in a Cafe Setting
A Cloud-based Robot System for Long-term Interaction: Principles, Implementation, Lessons Learned
A Computational Approach for Analysing Autistic Behaviour during Dyadic Interactions
Agree or Disagree? Generating Body Gestures from Affective Contextual Cues during Dyadic Interactions
Analysing Eye Gaze Patterns during Confusion and Errors in Human–Agent Collaborations
Context-Aware Body Gesture Generation for Social Robots
Context-Aware Human Behaviour Forecasting in Dyadic Interactions
iGROWL: Improved Group Detection With Link Prediction
Learning Personalised Models for Automatic Self-Reported Personality Recognition
Towards Autonomous Collaborative Robots that Adapt and Explain
What Does Shared Understanding in Students’ Face-to-Face Collaborative Learning Gaze Behaviours “Look Like”?
GROWL: Group Detection with Link Prediction
IB-DRR: Incremental Learning with Information-Back Discrete Representation Replay
Socially Informed AI for Healthcare: Understanding and Generating Multimodal Nonverbal Cues
Audio-driven Robot Upper-body Motion Synthesis
Inferring Student Engagement in Collaborative Problem Solving from Visual Cues
RICA: Robocentric Indoor Crowd Analysis Dataset
Robocentric Conversational Group Discovery
Inferring Human Knowledgeability from Eye Gaze in Mobile Learning Environments
Learning to self-manage by intelligent monitoring, prediction and intervention
Live Human-Robot Interactive Public Demonstrations with Automatic Emotion and Personality Prediction
Computational Analysis of Affect, Personality, and Engagement in Human–Robot Interactions
Automatic Prediction of Impressions in Time and across Varying Context: Personality, Attractiveness and Likeability
Automatic Replication of Teleoperator Head Movements and Facial Expressions on a Humanoid Robot
Multimodal Human-Human-Robot Interactions (MHHRI) Dataset for Studying Personality and Engagement
Personality Perception of Robot Avatar Teleoperators in Solo and Dyadic Tasks
Fully Automatic Analysis of Engagement and Its Relationship to Personality in Human-Robot Interactions
Personality Classification from Robot-mediated Communication Cues
Personality perception of robot avatar tele-operators
Computational analysis of human-robot interactions through first-person vision: Personality and interaction experience
Fast Exact Hyper-Graph Matching with Dynamic Programming for Spatio-Temporal Data
Group-level arousal and valence recognition in static images: Face, body and context
Let me tell you about your personality!: Real-time personality prediction from nonverbal behavioural cues
News
4 April 2025: King's Culture announces Creative Practice Catalyst Seed Fund recipients
King's Culture announces the recipients of a new creative practice seed fund, enabling researchers…