Multimedia information indexing and retrieval is concerned with developing techniques that allow people to find the media they are looking for effectively. Content-based methods become necessary when dealing with large databases, owing to the limitations inherent in metadata-based systems. Current technology allows researchers to explore the emotional space, which is known to carry rich semantic information; however, emotion recognition systems still lack sufficient reliability when dealing with real-world data. A possible solution to this problem lies in the multimodal fusion paradigm, which aims to improve robustness to real-world noise. We argue for an integrated methodology that extracts reliable affective information through a multimodal fusion system and tags this semantic information onto the medium itself. A framework, EMMA, currently under development in our laboratory, is described.
Using emotions to tag media
Jamboree 2007, Workshop By and For KSpace PhD Students, September 15th, 2007, Berlin, Germany
Best Poster Award (2nd prize)
PERMALINK : https://www.eurecom.fr/publication/2353