Master/Semester projects

Overview of available projects

The CHILI lab is inventing learning technologies that exploit recent advances in human-computer interaction (e.g. eye tracking, augmented reality, …) and in human-robot interaction. We are working on several educational platforms described below. Each platform offers possibilities for semester and master projects. While semester projects are often limited to development, master projects usually include an empirical study with learners, supervised by our team. The platforms are:

  1. MOOC Platforms EdX and Coursera: the projects here are about processing the large data sets of interaction traces produced by these platforms in order to classify, understand or predict the behavior of learners, using machine learning or visualisation methods. Contact: patrick.jermann (at) epfl.ch
  2. FROG is a platform to Fabricate and Run Orchestration Graphs, i.e. graphs that describe rich pedagogical scenarios. The projects concern the design of online real-time collaborative activities for learning and building a data analytics pipeline for monitoring and understanding students' behavior. Contact: stian.haklev (at) epfl.ch.
  3. REALTO is a social platform for vocational education. Apprentices collect pictures at the workplace and upload them on their class flow, where several picture annotation tools and augmented reality tools are available. Current projects concern these tools as well as a dashboard for teachers. Contact: catharine.oertel (at) epfl.ch
  4. CELLULO is a small robot for education. It moves by itself and can be moved by pupils. The hardware is ready and projects concern the software environments as well as designing and experimenting with new learning activities. Contact: wafa.johal (at) epfl.ch
  5. CO-WRITER is a project in which kids who face writing difficulties are invited to teach Nao how to write. Nao is a small humanoid robot available on the market. The projects concern smoothing the interaction between the robot and young children. Contact: wafa.johal (at) epfl.ch

Some of these projects are described below, but since our research is constantly evolving, we always have new opportunities. You can always contact the names above or pierre.dillenbourg (at) epfl.ch if you are interested in advancing digital education.

A variety of other projects in Data Science & Machine Learning are offered by the Center for Digital Education (CEDE). See project list here.

NEW. In fall 2018, 10 master theses will be funded in the Swiss EdTech Collider, an incubator that gathers 65 EdTech start-ups on the EPFL campus. Students will be paid. If you are interested, contact pierre.dillenbourg@epfl.ch


[Semester] Bibliometric analysis of scholarly community (SNA, topic extraction)

This student will get access to 15 years of digitized conference proceedings from a Learning Sciences conference – around 200-300 papers per year for 15 years (PDFs with metadata; the text is easy to extract). The project is to do a bibliometric study on citations (metadata, co-authorships over time) and on text contents (using topic modeling, word embeddings, ConceptNet, etc.). We could look at co-authorship (who is co-authoring with whom) over time, or at topic modeling over time (when did MOOCs first start getting mentioned? How much is about natural sciences vs. literature?). It might also be interesting to look at open repositories of publications from other conferences to see how similar they are – how many people participate in both communities, do ideas begin in one community and then “jump over” to another community with a time lag, etc. The aim is a journal paper; if interested, the student can be involved and become a co-author (this will require some additional time beyond the end of the project), but it is optional.
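
As a concrete starting point, the sketch below shows how the co-authorship network and a first topic model could be built in Python. The `papers` structure (with "year", "authors" and "text" fields) is an assumption about how the extracted metadata might be organized, not the actual format of the proceedings.

```python
# Minimal sketch: co-authorship network and LDA topics from conference metadata.
# Assumes `papers` is a list of dicts with "year", "authors" (list of names)
# and "text" fields; the real metadata schema may differ.
from itertools import combinations

import networkx as nx
from gensim.corpora import Dictionary
from gensim.models import LdaModel

def coauthorship_graph(papers, year=None):
    """Build an undirected co-authorship graph, optionally restricted to one year."""
    G = nx.Graph()
    for paper in papers:
        if year is not None and paper["year"] != year:
            continue
        for a, b in combinations(sorted(set(paper["authors"])), 2):
            if G.has_edge(a, b):
                G[a][b]["weight"] += 1   # one more co-authored paper
            else:
                G.add_edge(a, b, weight=1)
    return G

def topics_over_corpus(papers, num_topics=20):
    """Fit a plain LDA model on whitespace-tokenized full texts."""
    docs = [paper["text"].lower().split() for paper in papers]
    dictionary = Dictionary(docs)
    dictionary.filter_extremes(no_below=5, no_above=0.5)  # drop very rare/common terms
    corpus = [dictionary.doc2bow(doc) for doc in docs]
    return LdaModel(corpus=corpus, id2word=dictionary,
                    num_topics=num_topics, passes=5)

# Example: central authors in one year, and the fitted topics.
# G = coauthorship_graph(papers, year=2010)
# print(sorted(nx.degree_centrality(G).items(), key=lambda kv: -kv[1])[:10])
# print(topics_over_corpus(papers).print_topics(5))
```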

Pre-requisites: Should have some experience with machine learning, NLP/text analytics, social network analysis etc. Python or R.
Contact: stian.haklev@epfl.ch


[Master/semester] FROG

FROG is an open-source web platform to author (design) and run rich collaborative learning activities. It introduces the concept of pluggable activity types (such as a video player, quiz, brainstorming or programming exercises), which can be configured by the designer and connected in a learning graph, with data from one activity being transformed and reused in another (e.g. ideas from a brainstorm flowing into a concept map tool), and with pluggable “operators” which can transform data, intelligently group students, etc., through the use of algorithms and machine learning.

While the teacher runs the class, intelligent visualizations show not only simple facts like how many people have watched a given video, but can provide predictions based upon analysis of student actions with the activities. This platform might be used in small classrooms, large lectures (we are currently doing experiments with 350 students in a lecture at UNIL), or MOOCs.

The FROG team offers a number of interesting and challenging projects, in terms of web development, infrastructure and testing, developing and testing algorithms, running experiments, and user interface design/HCI.

For anyone interested in the projects listed below, please follow these instructions:

  • Please visit https://github.com/chili-epfl/FROG/wiki, and look at some of the short videos and short papers there to get a better understanding of the project (taking into account that the project is progressing rapidly, and many current features are not shown in the videos)
  • For students who will be doing web development of new activities, data visualizations or core FROG functionality, we require prior knowledge of Javascript. We use a modern JS stack, with ES6, React and Flow type-checking. In your email, please mention any previous JS projects and link to code if available.
  • For students who will be doing algorithm development, we support mainly JS and Python, although R could be a possibility. Again, mention previous JS/Python/R projects, with links to code if available.
  • For all students, you should be comfortable with a Github development flow (branches, pull requests, issues, etc).
  • Possibility to publish: For some projects, there will be the option to work on an academic publication. This is not a requirement, but a possibility for motivated students. This paper is the result of a previous semester project.
  • In your email, please include your CV, and a short description of your specific interests as they relate to FROG.

Advantages include:

  • Your project has the potential to have a real impact on student learning at EPFL and internationally, as well as on researcher productivity for multiple educational research teams
  • Work in a dynamic team – we’re housed in the Rolex Learning Center, and you can work from here one day per week – we are usually available for discussions, and invite you to join the lab life – group lunches, invited talk series, etc.
  • Learn cutting-edge technologies – our web stack and practices are industry standard, and learning React development is a very marketable skill
  • Open source project, so your contributions are public to the world. Can serve as part of your portfolio, and we are happy to write recommendation letters for hard-working students.

If you find the project interesting, and have a specific idea that is not listed below, we are flexible and would still like to hear from you! For any of these positions, please contact stian.haklev@epfl.ch unless otherwise noted.

[semester] Prediction of student completion of short text creation

Within the classroom, students are often required to produce small amounts of text, either as a short answer on a quiz or as a justification for an answer. It can be difficult for the teacher to judge when students will finish writing. Based upon log data from the writing process, this project aims to predict when a class will complete a short writing task, with no prior information about the students. Using existing data, you will develop an algorithm that, at any point in the process, can predict when students will finish writing.
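
One possible direction, sketched below under assumptions about the log format (per-student snapshots of text length over time): turn each time point into a small feature vector and regress the remaining writing time. The actual FROG logs and the final model may look quite different.

```python
# Hedged sketch: predict remaining writing time from writing-process logs.
# Assumes `logs` maps student id -> list of (timestamp_seconds, text_length)
# snapshots and `finish_time` maps student id -> completion time in seconds.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

def features_at(snapshots, t):
    """Features observable at time t: elapsed time, length, overall and recent rate."""
    seen = [(ts, n) for ts, n in snapshots if ts <= t]
    if not seen:
        return [t, 0, 0.0, 0.0]
    length = seen[-1][1]
    rate = length / max(t, 1.0)                      # characters per second overall
    recent = [n for ts, n in seen if ts > t - 30]    # activity in the last 30 s
    recent_rate = (recent[-1] - recent[0]) / 30.0 if len(recent) > 1 else 0.0
    return [t, length, rate, recent_rate]

def build_training_set(logs, finish_time, horizon=300, step=10):
    X, y = [], []
    for sid, snapshots in logs.items():
        for t in range(0, horizon, step):
            X.append(features_at(snapshots, t))
            y.append(max(finish_time[sid] - t, 0))   # remaining seconds
    return np.array(X), np.array(y)

# X, y = build_training_set(logs, finish_time)
# model = GradientBoostingRegressor().fit(X, y)
# model.predict([features_at(logs["s42"], 60)])     # estimate at t = 60 s
```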

Prerequisites: Experience in data analytics or machine learning, using Python or R.

Contact: jennifer.olsen@epfl.ch

[semester] Prediction of student completion of activities using prior knowledge of the students and activity

Within the classroom, when students are working on an individual or collaborative activity, it can be difficult for teachers to judge when to move to the next activity. Teachers can make better predictions of when students will finish based upon how long the activity took in another class or based on their knowledge of their students. Using existing data, in this project you will enhance existing algorithms to account for prior activity and student knowledge and provide a better prediction of when an activity will be completed.
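
A minimal illustration of the idea, under assumed inputs (activity durations from earlier classes and live progress fractions); the existing FROG algorithms to be extended are more elaborate than this.

```python
# Hedged sketch: blend a prior estimate (how long the activity took in other
# classes) with a live extrapolation for the current class. The weighting and
# the data format are assumptions, not the existing algorithm.
import numpy as np

def predicted_completion(prior_durations, fractions_done, elapsed, alpha=0.5):
    """
    prior_durations: durations (seconds) of the same activity in earlier classes
    fractions_done:  per-student progress in [0, 1] for the current class
    elapsed:         seconds since the activity started
    alpha:           trust in the prior vs. the live extrapolation
    """
    prior = np.mean(prior_durations)
    live = [elapsed / f for f in fractions_done if f > 0]   # per-student extrapolation
    live_estimate = np.median(live) if live else prior
    return alpha * prior + (1 - alpha) * live_estimate

# predicted_completion([600, 540, 660], [0.4, 0.55, 0.3], elapsed=240)
```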

Prerequisites: Experience in data analytics or machine learning, using Python or R.

Contact: jennifer.olsen@epfl.ch

Innovative activity supporting rich collaboration in React

Activity types (tools) in FROG are the cornerstones of collaborative learning. Past student projects have resulted in a synchronous PDF viewer/annotation tool, flexible video chat with WebRTC, an online code editor which can run Python in the browser, and collaboratively editable forms. This is an opportunity to work with cutting-edge web technologies (React, Flow typing, Meteor, etc.) on an open source project that will be used by many. There are several ideas for new activities, including a collaborative concept map, rich text editing with some wiki integration, or integration with an external knowledge base such as Slack or Discourse, but we are also open to other ideas.

Prerequisites: Experience with Javascript, and ideally with React.
Contact: stian.haklev@epfl.ch

Analyzing collaborative synchronous writing traces to automatically determine role taking and collaboration quality

Students who collaborate using Etherpad or Google Docs generate large amounts of very granular data (every single key press, time sequences, etc.), which could potentially tell us which roles the students are taking (outlining, adding information, brainstorming, re-organizing, fixing spelling mistakes) and how well they are collaborating – all measures which can be calculated over time to see transitions between states/roles. We could also add semantic data on the text contents to this dynamic analysis, to see, for example, how two students who read different articles gradually “converge” in their editing (or not). We have two available positions: one will focus on pure behavioural (editing) data, and try to do unsupervised and supervised modeling to detect and visualize role taking, the editing process, etc. The second will attempt to enhance the analysis with semantic information (using things like word2vec). We have data from collaborative writing experiments which the students will work on. Previous paper.
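
A possible first pass for the behavioural position, assuming the edit logs can be reduced to (timestamp, author, operation, character count) tuples (real Etherpad changesets require more parsing): compute per-window editing features and cluster them into tentative roles.

```python
# Hedged sketch: unsupervised role detection from Etherpad-style edit logs.
# Assumes `edits` is a list of (timestamp, author, op, n_chars) tuples with
# op in {"insert", "delete"}.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

def window_features(edits, author, start, end):
    """Editing behaviour of one author inside one time window."""
    own = [(t, op, n) for t, a, op, n in edits if a == author and start <= t < end]
    inserted = sum(n for _, op, n in own if op == "insert")
    deleted = sum(n for _, op, n in own if op == "delete")
    return [inserted, deleted, deleted / (inserted + 1), len(own)]

def role_clusters(edits, authors, session_end, window=60, k=4):
    rows, index = [], []
    for author in authors:
        for start in range(0, session_end, window):
            rows.append(window_features(edits, author, start, start + window))
            index.append((author, start))
    X = StandardScaler().fit_transform(np.array(rows, dtype=float))
    labels = KMeans(n_clusters=k, n_init=10).fit_predict(X)
    # Each (author, window) pair now carries a tentative "role" label whose
    # interpretation (writing, revising, cleaning up...) is done by inspection.
    return dict(zip(index, labels))
```
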
Prerequisites: Experience in data analytics or machine learning, using Python or R. Experience with or interest in NLP, word2vec, topic modeling, etc.
Contact: stian.haklev@epfl.ch

Developing/prototyping the ideal video chat interface for distance learning

Skype or Google Hangouts are commonly used in education to support small groups of students discussing or working on a problem; however, they were never designed for education. In FROG, we currently have flexible video chat built into the platform. This means that we can randomly assign students to various groups, switch them from one group to another, and have them work with different synchronous collaboration tools (text editor, code editor, image annotation, concept mapping, etc.) while talking.

Since we have full control over the video chat, we can experiment with different kinds of visual interfaces and interactions to see what works best for different kinds of collaborative learning – is there a difference between small or large video boxes, making the current speaker larger, or maybe even hiding video altogether while the students are working on a collaborative problem? Perhaps we can support explicit turn-taking, show visualizations of which student is speaking more to support group awareness, etc. We’re looking for a student interested in prototyping different interfaces and running small experiments to see which works better (and in what situations). This could lead to an academic paper, if the student is interested.
Prerequisites: Students should have some experience with the theory or practice of user interface design, designing prototypes and/or conducting usability studies. Please provide examples of prototyping projects that you have completed and the tools that were used.
Contact: stian.haklev@epfl.ch

Automated testing of complex multi-user web application

FROG consists of an editor, an engine, and a large number of pluggable activity types and operator types (data transformation). The activity types all support live collaboration with data sync, enabling two or more students to work together. When a graph is running, there is a flow of data between activities, students are grouped based on the output of algorithms etc. While we use unit-tests to ensure the functionality of small modules, we are currently investigating integration tests that will enable us to automatically test a large number of different configurations and flows.

Students interested in advanced automatic testing could for example help us develop a domain-specific language to describe a graph and the inputs that students would provide in each activity, letting a test-runner “execute” an entire graph, with virtual students and teachers. Another approach could be to use property-based testing and our internal type system to automatically generate theoretically valid graphs and data inputs, and test that no “allowed input” breaks the system. A final issue is how to do integration testing of a multi-user live-synced app – ensuring that if one user types something, it appears on the other users’ screen, etc.
Prerequisites: Strong knowledge of Javascript and interest in automated software testing, reliability/QA, etc.
Contact: stian.haklev@epfl.ch

Large scale instructional design analysis, comparing thousands of graphs

LAMS is a platform similar to FROG, which lets users visually design a learning scenario, with individual and collaborative activities connected with data flow and teacher actions. Example of a script. The difference is that LAMS has been in use for over 15 years. We will get access to 18,000 different LAMS scripts designed by teachers and actually used by students, and we are interested in doing data analysis on this unique data set. The graphs are stored as structured XML, and we should be able to parse them and extract, for example, common structures – can we detect common instructional patterns in scripts? Can we visualize which activity sequences often follow each other, or common transitions (which activities are involved when switching from an individual to a group plane)? This project requires some creativity – beginning with what intermediary format to store the graphs in: is it better to use a graph database, or perhaps treat them as a corpus, to be able to analyze n-grams and even word2vec (with activities as tokens)? There is a possibility to publish a paper if the student is interested.
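
A hedged sketch of the first step: parse each exported design into an activity-type sequence and count transition bigrams. The tag and attribute names below ("activity", "orderID", "activityTypeID") are guesses, not the actual LAMS schema.

```python
# Hedged sketch: extract activity-type sequences from LAMS XML exports and
# count common transitions between activity types.
from collections import Counter
from pathlib import Path
import xml.etree.ElementTree as ET

def activity_sequence(xml_path):
    """Return the ordered list of activity types in one design (assumed schema)."""
    root = ET.parse(xml_path).getroot()
    ordered = sorted(root.iter("activity"), key=lambda a: int(a.get("orderID", 0)))
    return [a.get("activityTypeID") for a in ordered]

def transition_counts(folder):
    """Count bigrams of consecutive activity types over all designs."""
    bigrams = Counter()
    for path in Path(folder).glob("*.xml"):
        seq = activity_sequence(path)
        bigrams.update(zip(seq, seq[1:]))
    return bigrams

# transition_counts("lams_exports/").most_common(20)
```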

Prerequisites: Experience in data analytics or machine learning, using Python or R. Experience with or interest in graph databases, data wrangling, NLP, word2vec, topic modeling, etc.
Contact: stian.haklev@epfl.ch

Devops: How to automatically deploy, scale on demand, and have error recovery for a somewhat complex app?

Running FROG in development mode is designed to be as easy as possible, but in production when we need to scale to multiple servers, we have several different services that need to be coordinated – the Meteor processes that run FROG, other node servers, Redis, MongoDB, some Python servers, nginx for SSL reverse proxying and load balancing etc. We often spin up 10 VMs, but if we want to do large scale deployments in MOOCs we would probably need many more. Currently this is all done semi-manually. We’re looking for someone to help us develop a proper devops approach, possibly using the EPFL Kubernetes cluster, letting us deploy a whole infrastructure in one keystroke, automatically restarting failed servers, scaling up based on demand, etc.
Pre-requisites: Some background and interest in devops, Docker, Kubernetes and other cloud infrastructure.
Contact: stian.haklev@epfl.ch

REALTO: Online learning platform for integrated vocational education

REALTO is a social platform for vocational education. Apprentices collect pictures (and videos) at the workplace and upload them on their class flow. Supervisors and teachers have the possibility to provide feedback on the student's private flow, and peers have the possibility to comment on other students' pictures and videos. Over 2000 apprentices from a wide variety of disciplines, such as florists, carpenters and fashion designers, are currently registered on REALTO.

Below is the list of available projects and their descriptions. In case of interest, please send an email to the contact person. In your email, please include your CV and a short description of your specific interests.

[Master] Designing a Virtual Communication Trainer

Description: Sales clerks are confronted every day with very different and often challenging situations: an angry customer, an undecided customer, a customer who tells his or her whole life story in an unsolicited manner. All of these situations are potentially hard to handle, especially for someone who is just starting out in a new career. Currently, apprentices already have the possibility to upload videos of such everyday work experiences to REALTO. The goal of this project is to organize these interactions by topic (e.g. angry customer) and then use this data to implement a virtual agent who is able to take on the role of, for example, an angry customer. The purpose of the implementation is to provide the apprentice with a communication trainer.

Tasks:

  • Design a dialogue flow
  • Implement a limited domain dialogue system with a virtual agent
  • Integrate the system within the REALTO platform
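
As a concrete, hedged illustration of the first two tasks, the sketch below implements a tiny finite-state dialogue manager for a hypothetical angry-customer scenario. Intent detection is reduced to keyword matching; the real trainer would rely on an NLU component trained on the annotated REALTO videos and on a proper virtual agent front end.

```python
# Minimal sketch of a "limited domain" dialogue flow for an angry-customer role.
# States, intents and replies are illustrative assumptions.
TRANSITIONS = {   # (customer_state, apprentice_intent) -> next customer state
    ("angry", "apologize"): "calmer",
    ("angry", "offer_solution"): "negotiating",
    ("calmer", "offer_solution"): "satisfied",
    ("negotiating", "confirm"): "satisfied",
}

REPLIES = {
    "angry": "This is unacceptable, I bought it yesterday and it already broke!",
    "calmer": "I appreciate the apology, but what are you going to do about it?",
    "negotiating": "A replacement? When exactly could I pick it up?",
    "satisfied": "Alright, thank you for sorting this out.",
}

def detect_intent(utterance):
    """Keyword-based placeholder for a real NLU component."""
    text = utterance.lower()
    if any(w in text for w in ("sorry", "apologize", "apologise")):
        return "apologize"
    if any(w in text for w in ("replace", "refund", "repair", "exchange")):
        return "offer_solution"
    if any(w in text for w in ("today", "tomorrow", "right away")):
        return "confirm"
    return "other"

def step(state, utterance):
    """One dialogue turn: the apprentice speaks, the virtual customer answers."""
    state = TRANSITIONS.get((state, detect_intent(utterance)), state)
    return state, REPLIES[state]

# state = "angry"
# state, reply = step(state, "I am very sorry about that, I can exchange it for you.")
```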

Contact: catharine.oertel (at) epfl.ch

[Master] Detection of the technological vanguard on Stack Overflow

Description: Stack Overflow is a widely-known Q&A platform for developers, and many developers are themselves contributors, both in asking questions and answering them. These developers may have varying levels of expertise, and also varying levels of knowledge diversity. We hypothesise that many of said developers are constantly in touch with the latest news and developments in the IT industry, and that their behaviour on Stack Overflow would reflect emerging innovations. One of the main challenges of the project is to come up with an evaluation scheme or some kind of ground truth. Since this project is part of a greater project on understanding trends on Stack Overflow, you will be working in a team and there will be a lot of collaboration involved.

Aims: The aim of this project is to detect this ‘vanguard’ on Stack Overflow, to understand the entry points of innovations into Stack Overflow, and to describe the features of this vanguard.
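
One hedged way to operationalise "earliness", sketched below with pandas: measure how soon each user starts posting with a tag after that tag first appears on the site. Column names loosely follow the public Stack Overflow data dump and should be checked against the actual schema.

```python
# Hedged sketch of one candidate vanguard signal: mean tag-adoption lag per user.
import pandas as pd

def earliness_scores(posts: pd.DataFrame) -> pd.Series:
    """Mean adoption lag (days) per user; lower = earlier adopter."""
    posts = posts.dropna(subset=["Tags"]).copy()          # answers carry no tags
    posts["CreationDate"] = pd.to_datetime(posts["CreationDate"])
    # Explode the "<tag1><tag2>" string so each row is one (post, tag) pair.
    posts["Tags"] = posts["Tags"].str.strip("<>").str.split("><")
    pairs = posts.explode("Tags")
    tag_birth = pairs.groupby("Tags")["CreationDate"].min()
    pairs["lag_days"] = (pairs["CreationDate"] - pairs["Tags"].map(tag_birth)).dt.days
    return pairs.groupby("OwnerUserId")["lag_days"].mean().sort_values()

# earliness_scores(posts_df).head(50)   # candidate vanguard users
```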

Prerequisites: The project will utilise machine learning and network analysis techniques and therefore familiarity with at least one is necessary. Experience with Python is also necessary, and familiarity with some of the implementations of said techniques in Python is a big plus. Last but not least, we’re looking for highly motivated people – if you are one, sign right up!

Contact: ramtin [dot] yazdanian [at] epfl.ch; catharine.oertel (at) epfl.ch

[Master] Educational Game Design for Inter-Class Collaboration

Description: One fact of vocational education is that apprentices often work in small groups and do not have the possibility to exchange ideas across classrooms. Many apprentices in fact feel quite alone in their apprenticeship. Yet, being successful in an apprenticeship, and later on in building a successful career, is facilitated by having a large network of peers.
The goal of this project is to develop a game which makes it possible for students to collaborate across classrooms on a topic which is relevant to their studies.

Tasks:

  • Design a collaborative game
  • Implement it on REALTO

Contact: catharine.oertel (at) epfl.ch

[Master] Designing a virtual reality learning environment with Unity

Description: The practical experience of the students in their apprenticeship can be quite limited. One of the ideas of Realto is to provide a digital space for the learners to “expand their experience.” The goal of this project is to design a 3D virtual learning environment where the learners can gain some additional experience that would support their vocational training. Some examples: an interactive 3D garden for gardeners, or a church to be decorated for florists. The project will involve working with Unity and a VR headset.

Prerequisites: experience in following topics or interest in learning: Unity, C#, VR

Contact: kevin.kim (at) epfl.ch

[Master] Object detection using deep learning

Description: On REALTO, apprentices upload pictures taken at their workplaces and share them in the digital space. In order to make better use of the uploaded data, it is important to have some semantic understanding of the images. Recent advances in deep learning have improved performance on the problem of image-based object detection. The goal of this project is to implement, train and test state-of-the-art deep learning algorithms to recognize objects in images. We are currently working with a dataset of flower bouquets (for florists).
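
For orientation, a minimal inference sketch with a pre-trained torchvision detector is shown below (it assumes a recent torchvision); the project itself would fine-tune such a model on the flower-bouquet dataset rather than rely on the generic COCO classes.

```python
# Hedged sketch: run a pre-trained object detector on an uploaded REALTO image.
import torch
import torchvision
from torchvision.transforms.functional import to_tensor
from PIL import Image

model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
model.eval()

def detect(image_path, score_threshold=0.7):
    """Return boxes, labels and scores above the threshold for one image."""
    image = to_tensor(Image.open(image_path).convert("RGB"))
    with torch.no_grad():
        output = model([image])[0]            # dict with boxes, labels, scores
    keep = output["scores"] > score_threshold
    return output["boxes"][keep], output["labels"][keep], output["scores"][keep]

# boxes, labels, scores = detect("bouquet.jpg")
```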

Prerequisites: experience in following topics or interest in learning: machine learning, deep learning, image processing, python

Contact: kevin.kim (at) epfl.ch, catharine.oertel (at) epfl.ch


CoWriter

The CoWriter Project aims at exploring how a robot can help children with the acquisition of handwriting, with an original approach: the children are the teachers who help the robot to write better! This paradigm, known as learning by teaching, has several powerful effects: it boosts the children's self-esteem (which is especially important for children with handwriting difficulties), it gets them to practise handwriting without even noticing, and it engages them in a particular interaction with the robot known as the Protégé effect: because they unconsciously feel that they are somehow responsible if the robot does not succeed in improving its writing, they commit to the interaction and make particular efforts to figure out what is difficult for the robot, thus developing their metacognitive skills and reflecting on their own errors.

[Semester] CoPainter: Drawing Activity with Nao Robot

The CoWriter project aims to develop a child-robot interaction based on the learning by teaching paradigm: we ask a child to teach handwriting to a Nao robot. The educational/therapeutic success of such an interaction is mainly based on the engagement of children in the interaction. For the moment we have 3 activities, but our idea is to implement a broad range of activity types in order to switch activities according to the child's learning level and attentional state.

The aim of this semester project is to develop a drawing activity involving the Nao robot, which will later be tested with children. During this project, the student will be in charge of the software development of the activity controlling the Nao robot, and will also develop the tablet application that allows drawing.
Using the same principle as the QuickDraw online game, the robot and the child will play a pictionary-like game. The goal will be to collect drawing data and adapt the difficulty of the game to the ability of the child.

The activity should be ROS based and be integrated into the whole CoWriter project. The student will:

  • Explore the QuickDraw dataset to generate different levels of drawing difficulty (see the sketch after this list).
  • Develop an interactive scenario with Nao using an android tablet.
  • Integrate the activity within the CoWriter project.
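
A hedged sketch for the first bullet: the public QuickDraw dataset ships as ndjson files (one JSON object per line, with a "word" label and a "drawing" field holding strokes as [xs, ys] lists), so stroke count and pen-path length can serve as crude difficulty proxies. The file name below is illustrative, and the actual levels would need piloting with children.

```python
# Hedged sketch: split one QuickDraw category into difficulty buckets.
import json
import math

def drawing_stats(drawing):
    """Number of strokes and total pen-path length of one QuickDraw drawing."""
    length = 0.0
    for xs, ys in drawing:
        for i in range(1, len(xs)):
            length += math.hypot(xs[i] - xs[i - 1], ys[i] - ys[i - 1])
    return len(drawing), length

def difficulty_levels(ndjson_path, n_levels=3):
    """Sort drawings by stroke count and path length, then split into buckets."""
    stats = []
    with open(ndjson_path) as f:
        for line in f:
            item = json.loads(line)
            if item.get("recognized", True):      # keep drawings the game recognized
                stats.append(drawing_stats(item["drawing"]))
    stats.sort(key=lambda s: (s[0], s[1]))        # fewer/shorter strokes first
    size = max(len(stats) // n_levels, 1)
    return [stats[i * size:(i + 1) * size] for i in range(n_levels)]

# levels = difficulty_levels("cat.ndjson")        # illustrative file name
```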

Prerequisites: Experience in the following skills or interest in learning: Machine Learning, Python, Android, OpenCV, ROS, git.
Contact: wafa.johal (at) epfl.ch


Cellulo

In the Cellulo Project, we are aiming to design and build the pencils of the classroom of the future, in the form of robots. We imagine these as swarm robots, each of them very simple and affordable, that reside on large paper sheets containing the learning activities. Our vision is that they become ubiquitous, namely a natural part of the classroom ecosystem, so as to shift the focus from the robot to the activity. With Cellulo you can actually grab and move a planet to see what happens to its orbit, or vibrate a molecule with your hands to see how it behaves. Cellulo makes tangible what is intangible in learning.

[Master] Soft Cellulo Component

The Cellulo project is concerned with tangible haptic-enabled swarm robots: 15+ small tabletop robots that work synchronously, each of which can move and be moved, see https://www.youtube.com/watch?v=CeSF6-75cY4. So far, we have implemented an array of applications ranging from education (see https://www.youtube.com/watch?v=zv6nDMQCWCo and https://infoscience.epfl.ch/record/224129?ln=en) to upper limb rehabilitation (see https://infoscience.epfl.ch/record/254966?ln=en). While our robots can move and be moved, their haptic actuation is limited to simple kinesthetic sensations (such as low-resolution forces and motions), and their haptic sensing is non-existent.

The goal of this master’s project is to integrate technologies from soft robotics in order to alleviate this problem, namely to enable detailed tactile actuation and sensing. More specifically, the student will collaborate with our lab and the Reconfigurable Robotics Lab in order to integrate soft robotics technologies developed there into Cellulo; see e.g. vacuum-powered soft actuators (https://rrl.epfl.ch/page-148999.html) and soft pneumatic actuators (https://rrl.epfl.ch/page-148990-en.html). While we expect this integration to benefit the aforementioned applications, the student will aim to show that this is the case through a user study. In summary, the student will do the following:

  • Become familiar with soft robotics technologies suitable for Cellulo
  • Integrate a soft sensing/actuation component into Cellulo
  • Design and run a user study probing the effectiveness of the developed component

Prerequisites: Experience in the following skills or interest in learning: Soft robotics, embedded systems development, rapid prototyping, QML/QtQuick development
Contact: wafa.johal (at) epfl.ch, ayberk.ozgur (at) epfl.ch

[Master/Semester] 3D Rehabilitation Game Using IMU and Cellulo

In the Cellulo project, we are designing tangible swarm robots to be used in classrooms. Our robots come out-of-the-box with connectivity to tablets and smartphones in order to use these readily available devices as an orchestration/visualization tool, communication router and source of processing power. Our application framework on these devices is Qt/QtQuick (C++ and QML), which allows us to write cross-platform modular code (PC and Android) where we can easily use existing peripherals and capabilities on the device.

The unique functionalities of Cellulo (i.e. haptic feedback and submillimeter-precision localisation) make it an interesting device for upper-arm rehabilitation.

A first pacman game has been developed and tested with nearly 50 participants. You can find patients playing Pacman within the therapy concept in the following video:
https://drive.google.com/file/d/0B8UFszzja-gPR21tellfVnJzRmM/view?usp=sharing

This first game has indeed been found to be good training for upper-arm motor learning. However, so far we have only explored games in the 2D plane. Research suggests that 3D rehabilitation is often necessary to recover better functional motor capabilities.

The goal of this project is to generate and recognise 3D arm gestures for interaction with Cellulo robots. Using wearable devices (possibly attached to both arms), we can define and detect a set of gestures and use them to select and control one or many Cellulos, e.g. select several Cellulos, group them, split a group, or move them as a group. Some of these gestures may need additional touch interaction with the Cellulos, e.g. touch one Cellulo on the left and another on the right, then make a 3D selection gesture to group all Cellulos in between.

This master project will investigate the combination of the Cellulo platform with one or two IMU sensors placed on the upper arm of the patient in order to enable rehabilitation activities in 3D space. (These sensors were previously used to control small drones, as you can see in the following videos: https://www.youtube.com/watch?v=xEZX8j3JpOs, https://www.youtube.com/watch?v=VaQ3aZBf_uE)
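
As a hedged sketch of the gesture-recognition part, the code below classifies short IMU windows with simple per-axis statistics and a random forest; the IMU driver, the window format and the gesture set are assumptions to be replaced by the project's actual design.

```python
# Hedged sketch: classify short 3D arm gestures from windowed IMU samples.
# Assumes each window is an (N, 6) array of accelerometer and gyroscope readings.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def window_features(samples):
    """Simple per-axis statistics over one gesture window."""
    samples = np.asarray(samples)
    return np.concatenate([samples.mean(axis=0), samples.std(axis=0),
                           samples.min(axis=0), samples.max(axis=0)])

def train_gesture_model(windows, labels):
    """windows: list of (N, 6) arrays; labels: gesture names ('select', 'group', ...)."""
    X = np.stack([window_features(w) for w in windows])
    return RandomForestClassifier(n_estimators=200).fit(X, labels)

# model = train_gesture_model(recorded_windows, recorded_labels)
# model.predict([window_features(new_window)])   # -> e.g. ['group']
```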

The project will consist in:

  • Integrate the two technologies, IMU and Cellulo, by developing a ROS or QtPlugin for the IMU
  • Define basic 3D atomic gestures, coupling IMU data and localisation of the Cellulo robots on paper
  • Propose a proof of concept game using the 3D space above a map for interaction with the robots

This project is a collaborative project between the CHILI Lab and IDSIA.

Prerequisites: Experience in the following skills or interest in learning: Signal Processing, Python, ROS, QtQuick, C++, git.

Contact: wafa.johal (at) epfl.ch & arzu.guneysu (at) epfl.ch & alessandrog (at) idsia.ch

[Master/Semester] Hand Motion Detection of Cellulo Robots

In the Cellulo project, we are designing tangible swarm robots to be used in classrooms. Our robots come out-of-the-box with connectivity to tablets and smartphones in order to use these readily available devices as an orchestration/visualization tool, communication router and source of processing power. Our application framework on these devices is Qt/QtQuick (C++ and QML), which allows us to write cross-platform modular code (PC and Android) where we can easily use existing peripherals and capabilities on the device.

The unique functionalities of Cellulo (i.e. haptic feedback and submillimeter-precision localisation) make it an interesting device for upper-arm rehabilitation.

A first pacman game has been developed and tested with nearly 50 participants. This first game has indeed been found to be good training for upper-arm motor learning. You can find patients playing Pacman within the therapy concept in the following video:
https://drive.google.com/file/d/0B8UFszzja-gPR21tellfVnJzRmM/view?usp=sharing

The next step in the project is to implement a collaborative game (with two or more participants). However, while the Cellulo robots can detect a grasp, they cannot tell who grasped them.

Using a wearable sensor on the users’ arm, we can detect the contraction of muscles.

The goal of this project is to integrate the wearable sensors used at IDSIA with the Cellulo robots in order to recognize, in real time, which user grasps the robot. These sensors were previously used to control small drones, as you can see in the following videos: https://www.youtube.com/watch?v=xEZX8j3JpOs, https://www.youtube.com/watch?v=VaQ3aZBf_uE

The project will consist in:

  • Integrate the two technologies, wearable sensor and Cellulo.
  • Use classification methods to recognize tangible interactions with the robot, such as a grasp or a finger touch. A first step will be to record sensor inputs for a variety of gestures (grasp, finger touch, tap, varying strength) and to build different classification models (using only robot data, only IMU data, or both) to evaluate the added value of the combined data for each gesture (see the sketch after this list).
  • Implement an event-based library that links grasp events to the Cellulo platform
  • Propose a proof-of-concept game using these new grasp features.
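
The sketch below illustrates the comparison described in the second bullet: train the same classifier on robot-only features, IMU-only features and their combination, and compare cross-validated accuracy. The feature extraction and the label set are illustrative assumptions.

```python
# Hedged sketch: compare grasp/touch classifiers built on robot-only, IMU-only
# and combined per-window features.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

def evaluate_feature_sets(robot_X, imu_X, y, cv=5):
    """Cross-validated accuracy for each feature source and for their fusion."""
    results = {}
    for name, X in [("robot only", robot_X),
                    ("imu only", imu_X),
                    ("combined", np.hstack([robot_X, imu_X]))]:
        clf = RandomForestClassifier(n_estimators=200)
        results[name] = cross_val_score(clf, X, y, cv=cv).mean()
    return results

# robot_X: e.g. touch-sensor and velocity statistics per window from the robot
# imu_X:   e.g. accelerometer/gyroscope statistics per window from the wearable
# y:       labels such as "grasp", "finger touch", "tap", "none"
# evaluate_feature_sets(robot_X, imu_X, y)
```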

This project is a collaborative project between the CHILI Lab and IDSIA.

Prerequisites: Experience in the following skills or interest in learning: Signal Processing, Python, ROS, QtQuick, C++, git.

Contact: wafa.johal (at) epfl.ch & arzu.guneysu (at) epfl.ch & alessandrog (at) idsia.ch

[Semester] Cellulo: Tangible Game with Dynamic Workspace

In the Cellulo project, we are designing tangible robots to be used in games, among other purposes. Our robots operate on tabletop paper sheets and are used as game elements where they can be physical input devices and/or autonomous agents. Our hypothesis is that we can build tabletop games with these robots that can move and be moved, possibly at the same time. We can further design games with many robots that can exhibit swarm behaviors, and one of our current goals is to explore game design options that create engaging interactions between multiple players and these robots.

Our robots work on printed paper sheets that can be produced with up to ~1m width and with unlimited length, and so far we have used these large shared workspaces within our activities to promote scalable multi-user interaction. We have recently developed smaller workspaces that can be tiled dynamically during runtime, allowing the growing, shrinking and modification of the workspace and its shape by the players. The goal of this project is to design a tangible game that uses this element at its core, together with the Cellulo robots. The resulting game will require the players to build the shape and the functionality of the workspace itself as the game progresses, on which their robots will move and be moved. A small user study will be conducted at the end to validate the player interaction with the developed game.

Prerequisites: Experience in the following skills or interest in learning: Game design, Qt/QtQuick development, QML programming, git
Contact: wafa.johal (at) epfl.ch