
PhD Project:
From Simulation to Real-World: Measuring
Social Engagement for Social Robots


My doctoral research develops methods to measure social engagement in real-world interactions, enabling social robots to better understand and respond to human social dynamics. I address a fundamental challenge: how can robots perceive engagement the way humans intuitively do?


Core Contribution: Visual Social Engagement Metric

I developed the Visual Social Engagement (VSE) metric based on two observable social signals:

  • Proximity: The physical distance between people interacting

  • Mutual Gaze: Whether people are looking toward each other

Combining these signals yields a simple yet effective measure of engagement during interactions: the metric produces a score indicating how intensely two individuals are engaged with each other.
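The exact formulation appears in the RO-MAN 2022 paper; as a rough illustration of the idea only, the sketch below combines a proximity decay term with a mutual-gaze term. The field-of-view angle, social-distance threshold, and the multiplicative weighting are illustrative assumptions, not the published formula.

```python
import math

def gaze_score(position, gaze_dir, target, fov_deg=60.0):
    """Score in [0, 1] for how directly someone at `position`, looking along
    `gaze_dir`, faces `target`. The 60-degree cutoff is an assumed value."""
    to_target = (target[0] - position[0], target[1] - position[1])
    dist = math.hypot(*to_target)
    if dist == 0:
        return 1.0
    cos_angle = (gaze_dir[0] * to_target[0] + gaze_dir[1] * to_target[1]) / (
        dist * math.hypot(*gaze_dir))
    angle = math.degrees(math.acos(max(-1.0, min(1.0, cos_angle))))
    # Falls off linearly from 1 (direct gaze) to 0 at the field-of-view edge.
    return max(0.0, 1.0 - angle / fov_deg)

def vse(pos_a, gaze_a, pos_b, gaze_b, social_distance=1.2):
    """Toy engagement score for a pair: proximity factor x mutual-gaze factor.
    `social_distance` (metres) is an assumed proxemics threshold."""
    d = math.hypot(pos_b[0] - pos_a[0], pos_b[1] - pos_a[1])
    proximity = min(1.0, social_distance / d) if d > 0 else 1.0
    mutual_gaze = gaze_score(pos_a, gaze_a, pos_b) * gaze_score(pos_b, gaze_b, pos_a)
    return proximity * mutual_gaze
```

For example, two people one metre apart and facing each other score 1.0, while the same pair scores 0.0 if one of them looks directly away.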


Studies

Study 1: Online Game Data (88 Participants)

I developed a 3D murder mystery game during COVID-19 to collect interaction data remotely. Despite identical virtual character behaviours, participants showed engagement styles that were consistent within individuals yet diverse across them.


Study 2: Real-World Dataset (SoGrIn)

I collected interaction data from 30 participants in 6 groups performing collaborative tasks. I captured data using motion capture (VICON), GoPro cameras, and action unit detection. I created a publicly available dataset, filling a gap in real-world group interaction resources.


Key Findings
  • The VSE metric successfully distinguishes engagement from non-engagement in real-world data (Precision: 65.81%, though Recall: 40.34% indicates room for improvement)

  • Individual participants demonstrate unique "engagement signatures": characteristic patterns of how they engage over time

  • Interaction profiles vary significantly across individuals, suggesting distinct engagement styles

  • Non-verbal signals alone capture some but not all aspects of social engagement
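The precision and recall figures above come from comparing the metric's binary engagement predictions against human annotations. As a reminder of what those two numbers measure, here is a minimal sketch using made-up frame labels (not the SoGrIn annotations):

```python
def precision_recall(predicted, actual):
    """Precision: share of predicted-engaged frames that were truly engaged.
    Recall: share of truly engaged frames that the predictions caught."""
    tp = sum(1 for p, a in zip(predicted, actual) if p and a)
    fp = sum(1 for p, a in zip(predicted, actual) if p and not a)
    fn = sum(1 for p, a in zip(predicted, actual) if not p and a)
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    return precision, recall

# Hypothetical per-frame labels (1 = engaged), for illustration only.
pred  = [1, 1, 0, 0, 1]
truth = [1, 0, 1, 0, 1]
```

A precision of 65.81% against a recall of 40.34% means the metric is conservative: when it flags engagement it is usually correct, but it misses many genuinely engaged moments.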


Publications
  • Webb, N., Giuliani, M., and Lemaignan, S. (2022). Measuring Visual Social Engagement from Proxemics and Gaze. RO-MAN.

  • Webb, N., Giuliani, M., and Lemaignan, S. (2023). SoGrIn: A Non-Verbal Dataset of Social Group-Level Interactions. RO-MAN.

  • Webb, N., Giuliani, M., and Lemaignan, S. (2024). Measuring Visual Social Engagement from Proxemics and Gaze in the Real World. HRI Companion.
