Classroom Attention Monitoring

Motion & gaze

The aim of this project is to give feedback to teachers without interfering with their lectures. We use a set of computer vision techniques to analyse how students react to a lecture and to help teachers find weak points in their presentations. Using a system of cameras, we observe how the students move and where they look, and we use this body-language information to infer the attention level of the audience.


People sitting

Our first set of features focuses on the analysis of motion and between-person motion synchronisation during class. The diagram below shows that our system can extract the level of movement of a single person from an annotated video. We also supplemented the visual measurements with an extensive set of additional data, annotating the locations of people in the videos as well as class events.

Movement timeline

Our observations are based on a formulated measure of the relative intensity of a student's motion, which is our context-dependent measurement of motion intensity. This measure allowed us to extract meaningful information about motion from low-resolution recordings of groups of students, and to compare values between persons recorded in the same session.
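A minimal sketch of how such a relative measure might work. The frame-differencing and the per-person z-score normalisation below are our illustration of the idea (making values comparable across people regardless of how large each student appears on camera), not the project's exact formulation:

```python
# Sketch: per-person motion intensity from frame differencing inside a
# person's bounding box, then normalised against that person's own
# session statistics so values are comparable between people.
# All names and the normalisation scheme are illustrative assumptions.

def raw_motion(prev_pixels, pixels):
    """Mean absolute pixel difference between two crops of one person."""
    return sum(abs(a - b) for a, b in zip(prev_pixels, pixels)) / len(pixels)

def relative_intensity(raw_series):
    """Z-score a person's motion timeline against their own mean/std."""
    mean = sum(raw_series) / len(raw_series)
    var = sum((x - mean) ** 2 for x in raw_series) / len(raw_series)
    std = var ** 0.5 or 1.0
    return [(x - mean) / std for x in raw_series]

# Two students with the same behaviour but different on-camera scales:
student_a = [2.0, 2.5, 2.0, 8.0, 2.5]        # small in the frame
student_b = [20.0, 25.0, 20.0, 80.0, 25.0]   # same pattern, 10x scale
rel_a = relative_intensity(student_a)
rel_b = relative_intensity(student_b)
# After normalisation the two timelines become directly comparable.
```

The key design point is that the normalisation is per-person: absolute pixel motion depends on camera distance and resolution, so only each student's deviation from their own baseline is meaningful across a classroom.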

We found that increased synchronicity between persons was one of the indicators of attention, in accordance with our view that analysing a group of students has greater potential for meaningful information than trying to profile individuals.
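One simple way to quantify such synchronicity is the correlation between two students' motion timelines; the Pearson correlation used here is our illustration, not necessarily the measure used in the published work:

```python
# Sketch: between-person motion synchrony as the Pearson correlation
# of two students' motion-intensity timelines (illustrative measure).

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Hypothetical motion timelines (arbitrary units):
attentive_pair = ([1, 5, 1, 4, 1], [1, 4, 1, 5, 1])   # move together
detached_pair  = ([1, 5, 1, 4, 1], [5, 1, 4, 1, 5])   # move independently
r_sync = pearson(*attentive_pair)    # close to +1
r_async = pearson(*detached_pair)    # negative
```

A class reacting to the same lecture events (writing down a slide, laughing at a joke) tends to produce correlated timelines, which is what makes the group-level signal more robust than any individual profile.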


gaze visualized

Following research on gaze tracking, the second source of information about the students that we considered was the individual's head orientation. In order to approximate the direction of the gaze, we conducted additional tests in controlled settings and created a refined model of gaze direction based on the observed head orientation, illustrated in the following figure.


mpp gaze model

The probabilistic model of gaze direction was combined with a face detector / pose estimator pipeline adjusted to processing the large number of students present in each recording. The data was associated with the students' attention as captured by a set of questionnaires.
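The intuition behind such a model is that the eyes usually rotate further than the head, so gaze can be treated as a distribution around a scaled head yaw. A minimal sketch under assumed parameters (the gain and spread below are invented for illustration, not the fitted values from the controlled tests):

```python
# Sketch: probabilistic head-orientation -> gaze model. Gaze yaw is
# modelled as a linear amplification of head yaw plus Gaussian noise.
# GAIN and SIGMA are assumed values, not the project's fitted ones.
import math

GAIN = 1.4     # eyes tend to overshoot the head turn (assumption)
SIGMA = 15.0   # degrees of uncertainty around the predicted gaze

def gaze_likelihood(target_yaw_deg, head_yaw_deg):
    """Gaussian density of looking at a target, given observed head yaw."""
    mu = GAIN * head_yaw_deg
    z = (target_yaw_deg - mu) / SIGMA
    return math.exp(-0.5 * z * z) / (SIGMA * math.sqrt(2 * math.pi))

# A head turned 20 degrees right makes a target near 28 degrees far
# more plausible than a target on the opposite side of the room:
p_near = gaze_likelihood(28.0, head_yaw_deg=20.0)
p_far = gaze_likelihood(-30.0, head_yaw_deg=20.0)
```

Keeping the model probabilistic rather than point-valued matters here: at classroom camera resolutions, head pose is noisy, and a density over possible gaze targets degrades more gracefully than a single estimated ray.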

In the class, as the context of the analysed behaviour, there are a number of information sources and activities that can draw a student's visual attention. Focusing on one prominent indicator, we showed that students whose head orientation was more predictive of the teacher's location also reported higher levels of attention.
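A simple proxy for "predictive of the teacher's location" is how closely a student's head yaw tracks the direction to the teacher over time; the angular-error score below is our illustrative stand-in for the probabilistic version used in the study:

```python
# Sketch: mean absolute angular error between observed head yaw and
# the direction to the teacher, per frame. Lower error = head
# orientation is more predictive of the teacher's location.
# Data and threshold-free comparison are illustrative.

def tracking_error(head_yaws, teacher_yaws):
    """Mean absolute yaw difference (degrees) across frames."""
    n = len(head_yaws)
    return sum(abs(h - t) for h, t in zip(head_yaws, teacher_yaws)) / n

teacher_dir = [0, 10, 20, 10, 0]          # teacher walking across the front
attentive   = [2, 8, 18, 12, -1]          # head follows the teacher
distracted  = [-30, -28, -31, -29, -30]   # head fixed on a laptop

err_attentive = tracking_error(attentive, teacher_dir)
err_distracted = tracking_error(distracted, teacher_dir)
```

This is what makes the measure contextual: the same head motion is scored against where the teacher actually was in that session, not against a fixed "ideal" pose.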

The research has focused on contextual measures – modelling the interaction between the two sides of the learning/teaching activity – instead of profiling the behaviour of the “perfect” student. For more details, please see the publication list below.


Mirko Raca <>


M. Raca / P. Dillenbourg (Dir.) : Camera-based estimation of student's attention in class. Lausanne, EPFL, 2015. DOI : 10.5075/epfl-thesis-6745.
M. Raca; L. Kidzinski; P. Dillenbourg : Translating Head Motion into Attention - Towards Processing of Student’s Body-Language. 2015. 8th International Conference on Educational Data Mining, Madrid, Spain, June 26-29, 2015.
M. Raca; P. Dillenbourg : Classroom Social Signal Analysis; Journal of Learning Analytics. 2014. DOI : 10.18608/jla.2014.13.16.
M. Raca; P. Dillenbourg : Holistic Analysis of the Classroom. 2014. 3rd Multimodal Learning Analytics Workshop and Grand Challenges, Istanbul, Turkey, November 12, 2014. DOI : 10.1145/2666633.2666636.
M. Raca; P. Dillenbourg; R. Tormey : Sleepers’ lag - study on motion and attention. 2014. 4th International Conference on Learning Analytics and Knowledge, Indianapolis, Indiana, USA, March 24-28, 2014.
M. Raca; R. Tormey; P. Dillenbourg : Student motion and its potential as a classroom performance metric. 2013. 3rd International Workshop on Teaching Analytics (IWTA), Paphos, Cyprus, September 17-21, 2013.
M. Raca; P. Dillenbourg : System for Assessing Classroom Attention. 2013. 3rd International Learning Analytics & Knowledge Conference, Leuven, Belgium, April 8-12, 2013.