Affective Computing and Multimodal Interaction Group

The Affective Computing and Multimodal Interaction group studies forms of interaction beyond the classical screen-based visual mode. The interaction modalities considered include haptics, audio, and physiological signals.

Research topics:

User emotion assessment

Assessment of users' emotional states using multimodal physiological signals
Movie affective characterization using physiological signals and content analysis
Emotion Awareness Tools for Mediated Interaction (EATMINT)
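
To give a flavor of emotion assessment from physiological signals, the sketch below extracts toy features from two channels (skin conductance and inter-beat intervals) and applies a simple threshold rule for arousal. The feature set, thresholds, and rule are purely illustrative assumptions, not the group's actual methods.

```python
from statistics import mean, stdev

def extract_features(gsr, ibi):
    """Toy features from two physiological channels:
    gsr: galvanic skin response samples (microsiemens)
    ibi: inter-beat intervals (seconds)
    """
    return {
        "gsr_mean": mean(gsr),
        "hr_mean": 60.0 / mean(ibi),  # mean heart rate (bpm)
        "hrv_sd": stdev(ibi),         # crude heart-rate variability index
    }

def classify_arousal(features, gsr_thresh=2.0, hr_thresh=80.0):
    """Hypothetical rule: elevated skin conductance OR elevated heart
    rate suggests high arousal. Thresholds are illustrative only."""
    if features["gsr_mean"] > gsr_thresh or features["hr_mean"] > hr_thresh:
        return "high"
    return "low"

calm = extract_features(gsr=[1.1, 1.2, 1.0, 1.1], ibi=[0.90, 0.95, 0.92, 0.93])
aroused = extract_features(gsr=[3.5, 3.8, 4.0, 3.6], ibi=[0.65, 0.60, 0.62, 0.63])
print(classify_arousal(calm), classify_arousal(aroused))  # low high
```

In practice such features would be fed to a trained classifier rather than fixed thresholds, and combined with further modalities (EEG, content analysis, and so on).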

Interaction for visually impaired and blind people

Modality conversion and visual data sonification
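
As a minimal illustration of visual data sonification, the sketch below maps the grayscale values of one image row to pitches, so that sweeping left to right yields an auditory rendering of the image. The brightness-to-frequency mapping and range are assumptions for illustration, not the group's actual design.

```python
def sonify_row(pixels, f_min=200.0, f_max=2000.0):
    """Map grayscale pixel values (0-255) in one image row to pitches:
    a brighter pixel maps linearly to a higher frequency (Hz)."""
    return [f_min + (p / 255.0) * (f_max - f_min) for p in pixels]

row = [0, 128, 255]
print(sonify_row(row))  # darkest pixel -> 200 Hz, brightest -> 2000 Hz
```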

Eyewalker

A smart walker for senior citizens

Brain-Computer Interface

Interaction between the user and the computer using only electrical brain activity, i.e. without the motor pathways that a keyboard or mouse requires.
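
As a rough illustration of how a BCI can decode intent from EEG, the sketch below compares mu-band (8-12 Hz) power on synthetic signals: motor imagery typically suppresses the mu rhythm over sensorimotor cortex. The signals, threshold, and decoding rule are hypothetical stand-ins, not the group's actual pipeline.

```python
import math, cmath

def band_power(signal, fs, f_lo, f_hi):
    """Naive DFT-based power in the band [f_lo, f_hi] Hz."""
    n = len(signal)
    power = 0.0
    for k in range(n // 2):
        if f_lo <= k * fs / n <= f_hi:
            X = sum(signal[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n))
            power += abs(X) ** 2 / n
    return power

fs = 128  # sampling rate (Hz)
t = [i / fs for i in range(fs)]
# Synthetic EEG: a strong 10 Hz (mu-band) rhythm at rest vs. an
# attenuated one during movement imagery (mu suppression).
rest = [math.sin(2 * math.pi * 10 * ti) for ti in t]
imagery = [0.1 * math.sin(2 * math.pi * 10 * ti) for ti in t]

def decode(trial):
    """Hypothetical rule: low mu-band power -> movement imagery."""
    return "imagery" if band_power(trial, fs, 8, 12) < 10.0 else "rest"

print(decode(rest), decode(imagery))  # rest imagery
```

A real system would use spatially filtered multichannel EEG and a trained classifier, but the band-power contrast is the core signal exploited by many motor-imagery BCIs.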

3D brain activity reconstruction

Stochastic approaches for localizing active brain areas, based on forward and inverse solutions.
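
The inverse problem maps sensor readings back to the source activity that produced them. As a minimal, purely illustrative sketch, the code below simulates a tiny forward model and recovers the sources with a regularized minimum-norm estimate; this is a standard deterministic estimator shown as a stand-in, not the group's stochastic approach, and the leadfield values are invented.

```python
def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def transpose(A):
    return [list(r) for r in zip(*A)]

def inv2(M):
    """Inverse of a 2x2 matrix."""
    (a, b), (c, d) = M
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

# Forward (direct) model: y = L j, with leadfield L (2 sensors x 3 sources).
L = [[1.0, 0.5, 0.0],
     [0.0, 0.5, 1.0]]
j_true = [[2.0], [0.0], [0.0]]  # only source 0 is active
y = matmul(L, j_true)           # simulated sensor readings

# Minimum-norm inverse solution: j_hat = L^T (L L^T + lam I)^-1 y
lam = 0.01                      # regularization strength (illustrative)
Lt = transpose(L)
G = matmul(L, Lt)
G[0][0] += lam
G[1][1] += lam
j_hat = matmul(Lt, matmul(inv2(G), y))
print([round(v[0], 2) for v in j_hat])  # source 0 has the largest amplitude
```

The estimate correctly assigns the largest amplitude to the truly active source; stochastic methods address the same under-determined problem with probabilistic models of source activity.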

Members

Prof. Thierry Pun (Professor)
Dr. Guido Bologna (Associate Researcher)
Dr. Guillaume Chanel (Senior Researcher and Teaching Assistant)
Dr. Mohammad Soleymani (Senior Researcher)
Dr. Theodoros Kostoulas (Postdoctoral Researcher)
Viviana Weiss (Research Assistant)
Séverine Cloix (Research Assistant)
Sunny Avry (Research Assistant)
Michał Muszyński (Research Assistant)
Dr. Patrick Roth (Associate Researcher)

MMI publications

The list of MMI team publications

Resources and grants

MMI Resources

Tutorial

BCI2000 software online tutorial
BCI2000 software