Dual eye-tracking



Dual Eye Tracking (DUET) uses two eye-tracking devices to investigate collaborative mechanisms at a depth that has not been attained before. Eye-tracking methods are not new; what is new is that they are now stable enough to investigate complex phenomena such as collaborative problem solving, collaborative work, and collaborative learning. This topic lies at the frontier between computer science (groupware research and machine learning) and cognitive science (learning mechanisms and gaze analysis).



Cross-recurrence is a general way to explore the coupling between two dynamical systems. In cross-recurrence plots like the ones shown above, each axis is the timeline of one system and the colored points show whether the two systems are in the same state at a given lag in time (the diagonal corresponds to synchronicity). In the context of dual gaze, the dark points on the diagonal represent two people looking at the “same thing” at the “same time”.
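The construction can be sketched in a few lines, assuming both gaze streams are reduced to area-of-interest (AOI) labels sampled at the same rate (the labels and data below are invented for illustration):

```python
# Sketch: cross-recurrence of two gaze streams reduced to AOI labels.
# R[i][j] = 1 when person A's gaze at time i matches person B's gaze at
# time j; the main diagonal holds the moments of exact synchrony.

def cross_recurrence(seq_a, seq_b):
    """Return the cross-recurrence matrix of two equal-rate label streams."""
    return [[1 if a == b else 0 for b in seq_b] for a in seq_a]

# Invented example: two readers visiting the same code regions.
a = ["header", "loop", "loop", "output"]
b = ["header", "header", "loop", "output"]
R = cross_recurrence(a, b)
diagonal = [R[i][i] for i in range(len(a))]  # 1 where both look at the same AOI
```

Off-diagonal recurrent points capture the same-state moments at a lag, which is what the plots visualize.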

The two cross-recurrence plots above correspond to a ten-minute interaction of two programmers reading Java code. The graphs are strikingly different. For the dyad on the left, the density of recurrence points on the diagonal is weak, and the numerous white stripes indicate frequent scrolling. For the dyad on the right, clearly defined rectangular areas appear along the diagonal of the plot. This pattern indicates gaze coupling: the two programmers look at the same parts of the code within a few seconds of each other, and hence explore the code together. We have found that high cross-recurrence is one criterion for good collaboration flow.
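One way to quantify this contrast, as a sketch, is to measure the density of recurrent points within a band around the diagonal. The band width and toy matrices below are illustrative assumptions, not the parameters used in the experiments:

```python
# Sketch: a coupling index as the density of recurrent points near the
# diagonal of a cross-recurrence matrix (toy data).

def coupling_index(R, max_lag):
    """Fraction of cells within +/- max_lag of the diagonal that recur."""
    n = len(R)
    band = [(i, j) for i in range(n) for j in range(n) if abs(i - j) <= max_lag]
    return sum(R[i][j] for i, j in band) / len(band)

# A tightly coupled pair recurs along the whole diagonal...
coupled = [[1, 1, 0], [1, 1, 1], [0, 1, 1]]
# ...while a decoupled pair shows only scattered recurrence.
decoupled = [[0, 1, 0], [0, 0, 0], [1, 0, 0]]
```

A dyad like the one on the right would score high on such an index; the scrolling-heavy dyad on the left would score low.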



We ran several behavioral experiments in which people had to collaborate through a shared workspace. We have collected more than 35 million raw gaze points, aggregated into 1 million fixations, produced by 600 subjects.

Our main and most stable finding is that the gaze coupling between collaborators is predictive of the quality of their interaction. Simply put, when the gaze of a speaker and a listener follow each other (with some delay), they will understand each other better, and (sometimes) achieve better problem-solving performance.
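This lag structure can be illustrated with a toy recurrence profile: if the listener trails the speaker, recurrence peaks at a positive lag rather than at zero. The streams below are invented; a real analysis works on fixation sequences:

```python
# Sketch: recurrence as a function of lag between a speaker's and a
# listener's gaze streams (invented AOI labels).

def recurrence_at_lag(speaker, listener, lag):
    """Proportion of samples where the listener matches the speaker `lag` steps earlier."""
    pairs = [(speaker[t - lag], listener[t])
             for t in range(len(listener)) if 0 <= t - lag < len(speaker)]
    return sum(s == l for s, l in pairs) / len(pairs)

speaker = ["A", "B", "C", "D", "E", "F"]
listener = ["X", "A", "B", "C", "D", "E"]  # trails the speaker by one step

profile = {lag: recurrence_at_lag(speaker, listener, lag) for lag in range(3)}
best_lag = max(profile, key=profile.get)  # the delay at which gaze couples best
```

In this toy example the profile peaks at lag 1, mirroring the delay with which a listener's gaze follows a speaker's.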

The main experiments we conducted are summarized below:

  • Shoutspace (CSCW)
    The goal of this task was to organize a festival on the EPFL campus by annotating a shared map. The participants could communicate through a chat tool. Gaze cross-recurrence predicts performance and misunderstanding: we have shown a positive correlation between task performance and the level of gaze cross-recurrence. From the same experiment, we found that the degree of overlap between the writer's gaze, collected while writing a message, and the reader's gaze, collected while reading it, predicts a potential misunderstanding by the reader.
  • Knowledge-Awareness tool / Scripting (CSCL)
    Two collaborative learning experiments were conducted in similar experimental settings. The task consisted of individually reading a text about neuron physiology, followed by collaboratively building a concept map (a kind of entity-association diagram). The participants could speak to each other.

    Based on the fact that gaze precedes speech production and follows speech comprehension, we developed REGARD (REmote Gaze-Aware Reference Detector), a computational model that combines real-time gaze data with speech data to automatically associate spoken names with the objects present on the screens of two conversing partners.

  • Collaborative Tetris / Gaze-aware collaborative Tetris (CSCW)
    Two experiments were run with a collaborative version of the Tetris game. It is similar to classical Tetris, but two pieces fall at the same time onto the same gameboard and each player controls one piece. We also built a version of the game that shows each player the gaze of their partner.

    Experts and novices look at the game differently. When playing in mixed-ability pairs, novices play faster and look at the contour of the stack as much as their expert partners do. These gaze patterns allow us to predict the pair composition using Support Vector Machines and Gaussian Mixture Models.

  • Gaze-aware REmote Pair Programming (GREPP)
    This experiment studies program understanding. We have collected data from more than 50 individuals and 50 pairs doing a Java program comprehension task. Subjects had to describe the rules of a mathematical game given its code, and to find and fix bugs in the program.
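The intuition behind REGARD can be sketched as follows: since gaze tends to precede the spoken mention of an object, a referent can often be resolved to the object the speaker fixated shortly before the word was uttered. The window length, object names, and timestamps below are illustrative assumptions, not parameters of the actual model:

```python
# Sketch: resolving a spoken referent from the speaker's recent fixations.

def resolve_referent(fixations, word_time, window=1.5):
    """fixations: list of (timestamp, object_id). Return the object the
    speaker fixated most recently within `window` seconds before the word."""
    candidates = [(t, obj) for t, obj in fixations
                  if word_time - window <= t <= word_time]
    return max(candidates)[1] if candidates else None

fixations = [(0.2, "axon"), (1.1, "dendrite"), (2.0, "synapse")]
referent = resolve_referent(fixations, word_time=2.4)  # speaker says "this one"
```

A production model would combine such gaze evidence with the speech stream itself, but the gaze-precedes-speech window is the core signal.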
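As a rough illustration of how pair composition might be predicted from aggregate gaze statistics, the sketch below uses a minimal nearest-centroid classifier with invented feature values; the actual studies used Support Vector Machines and Gaussian Mixture Models on real gaze features:

```python
# Sketch: classifying pair composition from gaze features (toy stand-in
# for the SVM/GMM classifiers used in the studies).

def centroid(rows):
    """Component-wise mean of a list of feature vectors."""
    return [sum(col) / len(rows) for col in zip(*rows)]

def predict(x, centroids):
    """Return the label of the nearest centroid (squared Euclidean distance)."""
    dist = lambda c: sum((a - b) ** 2 for a, b in zip(x, c))
    return min(centroids, key=lambda label: dist(centroids[label]))

# Hypothetical features per pair: (share of fixations on the stack
# contour, mean fixation duration in seconds).
expert_novice = [[0.62, 0.21], [0.58, 0.24], [0.65, 0.19]]
novice_novice = [[0.31, 0.33], [0.28, 0.35], [0.35, 0.30]]
centroids = {"expert-novice": centroid(expert_novice),
             "novice-novice": centroid(novice_novice)}

label = predict([0.60, 0.22], centroids)  # a contour-heavy pair
```

The design point is only that aggregate gaze statistics separate the pair types well enough for a classifier to exploit.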



K. Sharma, P. Jermann, M.-A. Nüssli and P. Dillenbourg. Understanding Collaborative Program Comprehension: Interlacing Gaze and Dialogues. Computer Supported Collaborative Learning (CSCL 2013), Madison, Wisconsin, USA, June 15-19, 2013.




K. Sharma, P. Jermann, M.-A. Nüssli and P. Dillenbourg. Gaze Evidence for Different Activities in Program Understanding. 24th Annual conference of Psychology of Programming Interest Group, London, UK, November 21-23, 2012.



M.-A. Nüssli, P. Jermann, M. Sangin and P. Dillenbourg. Collaboration and abstract representations: towards predictive models based on raw speech and eye-tracking data. Computer Support for Collaborative Learning (CSCL) 2009, Rhodes, 2009.

M. Cherubini, M.-A. Nüssli and P. Dillenbourg. Deixis and gaze in collaborative work at a distance (over a shared map): a computational model to detect misunderstandings. International Symposium on Eye Tracking Research & Applications (ETRA2008), Savannah, 2008.

M. Sangin, G. Molinari, M.-A. Nüssli and P. Dillenbourg. How learners use awareness cues about their peer knowledge? Insights from synchronized eye-tracking data. International Conference of the Learning Sciences, Utrecht, 2008.

G. Molinari, M. Sangin, M.-A. Nüssli and P. Dillenbourg. Effects of informational interdependence on visual attention and action transactivity in collaborative concept mapping. International Conference of the Learning Sciences, Utrecht, 2008.

M. Cherubini, M.-A. Nüssli and P. Dillenbourg. Deixis and coupling of partners’ eye movements in collaborative work at distance. GROUP’07: ACM 2007 International Conference on Supporting Group Work, Sanibel Island, FL, 2007.