Demonstrations and Downloads

This page provides access to online demonstrations and downloads of systems produced by our group.

Recording emotions in a public theater (interview)

Profs. Thierry Pun and Patrizia Lombardo (Faculty of Letters) in an interview for Radio Télévision Suisse (RTS), talking about the experiment “Recording emotions in a public theater”. For this experiment, around 40 people had their affective reactions recorded while watching Martin Scorsese’s classic film “Taxi Driver” at the Grütli cinema in Geneva.

Listen

Swiss TV (TSR) report on SpudTV project

The SpudTV project uses physiological signal analysis and multimedia content analysis (MCA) for emotional understanding of videos, applied to music video recommendation. The project was carried out in the context of the PetaMedia European Network of Excellence (NoE), in collaboration with Queen Mary University of London, TU Berlin, EPFL and the University of Twente.

MediaEval benchmarking Initiative

MediaEval is a benchmarking initiative dedicated to evaluating new algorithms for multimedia access and retrieval. MediaEval focuses on speech, language and contextual aspects of video. We are co-organizing this benchmarking initiative.

Body driven electric guitar effect

Video Affective Annotation

Participate in our affective annotation experiment. We would very much appreciate it if you could spend some time as a participant. In this online experiment you will watch video clips extracted from ordinary movies of different genres (action, horror, drama, and comedy). You will then be asked to give feedback about the emotion you felt, using an interface to assess your level of valence (pleasantness, attractiveness) and arousal (excitement).
Your feedback will be anonymized before being used in our analysis, and giving personal details (name and contact) is optional. In this experiment you will be shown a sequence of video clips, and you can watch and annotate as many clips as you like. You can also stop the experiment at any time and resume it later by logging in to the system again. Each video clip is about one to two minutes long, and participants who annotate more than 30 clips by the end of this month (May 31st) will have the chance to win one of four 50 CHF FNAC vouchers.
We would be grateful for your voluntary participation: access this URL and sign up using your @unige email address. If you do not have a unige email address, contact us at mohammad.soleymani@unige.ch for authorization.
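
As a rough illustration of what one annotation in this experiment might look like, here is a minimal sketch of a self-report record. The field names and the nine-point rating scales are assumptions for illustration, not the actual data format used by the experiment.

<code python>
# Minimal sketch of one affective self-report record.
# Field names and the 1-9 rating scales are illustrative assumptions,
# not the actual schema used by the online experiment.
from dataclasses import dataclass

@dataclass
class ClipAnnotation:
    participant_id: str   # anonymized identifier
    clip_id: str          # which video clip was watched
    valence: int          # 1 (unpleasant) .. 9 (pleasant)
    arousal: int          # 1 (calm) .. 9 (excited)

    def __post_init__(self):
        if not (1 <= self.valence <= 9 and 1 <= self.arousal <= 9):
            raise ValueError("valence and arousal must be on a 1-9 scale")

# Example: a participant rates a horror clip as unpleasant and arousing.
report = ClipAnnotation("p042", "horror_017", valence=2, arousal=8)
</code>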

Viper Content-Based Image Retrieval

Try out the Java interface demonstration of the latest version of the Viper system, which uses a variety of colour and texture features organised using an inverted file approach. Retrieved images can be marked as either relevant (R) or non-relevant (NR) and fed back into the next query, making querying an iterative process. We suggest requesting that at least 10 images be returned for each query. Details of the available algorithms can be found in our recent publications.
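
To give a rough idea of how an inverted-file index supports this kind of iterative, feedback-driven querying, here is a simplified sketch. It is not the Viper implementation: the sparse feature representation, the weighting scheme and the Rocchio-style feedback rule are placeholder assumptions.

<code python>
# Simplified sketch of inverted-file scoring with relevance feedback.
# Not the actual Viper code: feature extraction, weighting and the
# feedback rule are placeholder assumptions for illustration only.
from collections import defaultdict

class InvertedFileIndex:
    def __init__(self, image_features):
        # image_features: {image_id: set of quantised feature ids}
        self.image_features = image_features
        self.postings = defaultdict(set)   # feature id -> images containing it
        for img, feats in image_features.items():
            for f in feats:
                self.postings[f].add(img)

    def score(self, query_weights):
        # Accumulate per-image scores by walking only the postings lists of
        # features present in the query (the inverted-file trick).
        scores = defaultdict(float)
        for f, w in query_weights.items():
            for img in self.postings.get(f, ()):
                scores[img] += w
        return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

def feedback(query_weights, relevant, non_relevant, index, alpha=0.75, beta=0.25):
    # Rocchio-style update (an assumption): boost features of relevant
    # images, penalise features of non-relevant ones.
    new_q = dict(query_weights)
    for img in relevant:
        for f in index.image_features[img]:
            new_q[f] = new_q.get(f, 0.0) + alpha
    for img in non_relevant:
        for f in index.image_features[img]:
            new_q[f] = new_q.get(f, 0.0) - beta
    return {f: w for f, w in new_q.items() if w > 0}

# Toy usage: three images with small feature sets, one feedback round.
index = InvertedFileIndex({"a": {1, 2}, "b": {2, 3}, "c": {3, 4}})
q = {1: 1.0, 2: 1.0}
print(index.score(q))                               # image "a" ranks first
q = feedback(q, relevant={"a"}, non_relevant={"c"}, index=index)
print(index.score(q))
</code>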

Videos

BCI Speller

BCI speller video
By analyzing the electrical activity of the brain, a computer can detect which letter the user is focusing on. This way, people can write whole sentences just by looking at the appropriate letters in sequence.
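
To illustrate the principle, here is a hedged sketch of how an attended letter could be picked out from averaged EEG responses to row and column flashes of a letter matrix. The 6x6 matrix, the epoching and the simple mean-amplitude score are assumptions for illustration, not the actual speller implementation.

<code python>
# Minimal sketch of P300-style letter selection from averaged EEG epochs.
# Not the actual speller code: the 6x6 matrix layout, epoching and the
# crude mean-amplitude score are assumptions made to illustrate the idea.
import numpy as np

MATRIX = [list("ABCDEF"), list("GHIJKL"), list("MNOPQR"),
          list("STUVWX"), list("YZ1234"), list("56789_")]

def select_letter(row_epochs, col_epochs, window):
    """row_epochs / col_epochs: lists (one per row/column) of arrays shaped
    (n_flashes, n_samples) with EEG epochs time-locked to each flash.
    The attended row and column are taken to be the ones whose averaged
    response is largest inside the given sample window (a crude stand-in
    for a trained classifier)."""
    def score(epochs):
        avg = epochs.mean(axis=0)               # average over repeated flashes
        return avg[window[0]:window[1]].mean()  # mean amplitude around ~300 ms
    best_row = int(np.argmax([score(e) for e in row_epochs]))
    best_col = int(np.argmax([score(e) for e in col_epochs]))
    return MATRIX[best_row][best_col]

# Toy usage with synthetic data: the "attended" row 2 / column 3 get an
# added positive deflection, so the sketch should return the letter "P".
rng = np.random.default_rng(0)
rows = [rng.normal(0, 1, (15, 200)) for _ in range(6)]
cols = [rng.normal(0, 1, (15, 200)) for _ in range(6)]
rows[2][:, 70:90] += 3.0
cols[3][:, 70:90] += 3.0
print(select_letter(rows, cols, window=(70, 90)))
</code>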

Muscular pinball demonstration

Video from flipper game
MOV version
The pinball flippers are controlled with muscular activity, processed in real time using BCI2000.
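
As a rough sketch of such a signal path, the example below rectifies and smooths one EMG channel and raises the flipper whenever the envelope crosses a threshold. The window length and threshold value are placeholder assumptions, not the BCI2000 configuration used in the demo.

<code python>
# Minimal sketch of EMG-triggered flipper control.
# Not the BCI2000 pipeline from the video: the rectify-smooth-threshold
# chain, window length and threshold are illustrative assumptions.
import numpy as np

def flipper_commands(emg, fs=512, window_s=0.05, threshold=0.5):
    """emg: 1-D array of one EMG channel (e.g. one forearm).
    Returns a boolean array that is True while the flipper should be up."""
    rectified = np.abs(emg - emg.mean())                     # remove offset, rectify
    win = max(1, int(window_s * fs))
    kernel = np.ones(win) / win
    envelope = np.convolve(rectified, kernel, mode="same")   # moving-average envelope
    return envelope > threshold                              # threshold -> flipper up/down

# Toy usage: a burst of simulated muscle activity in the middle of the signal.
rng = np.random.default_rng(1)
signal = rng.normal(0, 0.05, 512)
signal[200:300] += rng.normal(0, 1.0, 100)                   # simulated contraction
active = flipper_commands(signal)
print("flipper up for", active.sum(), "samples")
</code>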

Emotive Game

See color video

SeeColor video
Video of the SeeColor prototype, showing two experiments: matching coloured socks, and following a serpentine path in an outdoor environment. Click on the image to download the video. For more videos on this project and related topics, you can visit this page.

BCI2000 Tutorial

BCI Tutorial

Tutorial on Brain-Computer Interaction (a PowerPoint presentation with audio comments, ~10 MB), providing a broad overview of the BCI field.
