B-Interface 2010 Abstracts


Full Papers
Paper Nr: 7
Title:

Identifying Psychophysiological Correlates of Boredom and Negative Mood induced during HCI

Authors:

Dimitris Giakoumis, Athanasios Vogiannou, Ilkka Kosunen, Konstantinos Moustakas, Dimitrios Tzovaras and George Hassapis

Abstract: This paper presents work conducted towards the automatic recognition of negative emotions like boredom and frustration, induced by the subject’s loss of interest during HCI. The focus was on the basic prerequisite for the future development of systems utilizing an “affective loop”, namely effective recognition of the human affective state. Based on the concept of “repetition that causes loss of interest”, an experiment for the monitoring and analysis of biosignals during repetitive HCI tasks was deployed. During this experiment, subjects were asked to play a simple labyrinth-based 3D video game repeatedly, while biosignals from different modalities were monitored. Twenty-one subjects participated in the experiment, allowing a rich biosignals database to be populated. Statistically significant correlations were identified between features extracted from two of the modalities used in the experiment (ECG and GSR) and the actual affective state of the subjects.
Download
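A minimal sketch of the kind of statistical analysis the abstract above describes: extracting a simple feature per game repetition from a biosignal (here a mean GSR level, on toy data) and correlating it with self-reported affect ratings. The feature choice and data are illustrative assumptions, not the authors' actual pipeline.

```python
# Hypothetical sketch (not the authors' code): correlating a simple
# GSR-derived feature with boredom ratings across game repetitions.
import numpy as np

def gsr_mean_level(gsr: np.ndarray) -> float:
    """Mean skin-conductance level over one game repetition."""
    return float(np.mean(gsr))

def pearson_r(x, y) -> float:
    """Pearson correlation coefficient between two feature vectors."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    xc, yc = x - x.mean(), y - y.mean()
    return float(np.sum(xc * yc) / np.sqrt(np.sum(xc ** 2) * np.sum(yc ** 2)))

# Toy data: one synthetic GSR trace per repetition, plus a boredom rating (1-5).
rng = np.random.default_rng(0)
ratings = np.array([1, 2, 2, 3, 4, 5], dtype=float)
features = [gsr_mean_level(rng.normal(loc=r, scale=0.1, size=100)) for r in ratings]

r = pearson_r(features, ratings)
print(round(r, 2))
```

In the actual study the significance of such correlations would additionally be tested (e.g. via a p-value), and many more features per modality would be examined.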

Paper Nr: 11
Title:

The Smart Sensor Integration Framework and its Application in EU Projects

Authors:

Johannes Wagner, Frank Jung, Jonghwa Kim, Thurid Vogt and Elisabeth André

Abstract: Affect sensing by machines is an essential part of next-generation human-computer interaction (HCI). However, despite the large effort carried out in this field during the last decades, only few applications exist that are able to react to a user's emotion in real time. This is partly because emotion recognition is a challenging task in itself, and partly because most effort so far has been put into offline analysis rather than online processing. In response to this deficit we have developed a framework called Smart Sensor Integration (SSI), which considerably jump-starts the development of multimodal online emotion recognition (OER) systems. In this paper, we introduce the SSI framework and describe how it is successfully applied in different projects funded by the European Union, namely the CALLAS and METABO projects and the IRIS network.
Download

Paper Nr: 12
Title:

A Spectral Mapping Method for EMG-based Recognition of Silent Speech

Authors:

Matthias Janke, Michael Wand and Tanja Schultz

Abstract: This paper reports our recent findings in speech recognition by surface electromyography (EMG), which captures the electric potentials of the human articulatory muscles. This technology can be used to enable Silent Speech Interfaces, since no audible signal needs to be transmitted or captured in order to recognize speech. Previous experiments showed the existence of large discrepancies between the EMG signals of audible and silent speech, which negatively affect the performance of an EMG-based speech recognizer. In this paper we analyze EMG signals from audible, silent, and whispered speech and show how discrepancies between these speaking modes may be quantified. We then present a spectral mapping method to compensate for these differences, which improves our recognition rates on silent speech by up to 12.3% relative.
Download

Short Papers
Paper Nr: 3
Title:

Facial Features’ Localization using a Morphological Operation

Authors:

Kenz Bozed, Osei Adjei and Ali Mansour

Abstract: Facial feature localization is an important part of various applications such as face recognition, facial expression detection and human-computer interaction. It plays an essential role in human face analysis, especially in searching for facial features (mouth, nose and eyes) once the face region has been found within the image. Most of these applications require face and facial feature detection algorithms. In this paper, a new method is proposed to locate facial features. A morphological operation is used to locate the pupils of the eyes and estimate the mouth position relative to them. Once the features are located, their boundaries are computed. The results obtained from this work indicate that the algorithm has been very successful in recognising different types of facial expressions.
Download

Paper Nr: 8
Title:

Start and End Point Detection of Weightlifting Motion using CHLAC and MRA

Authors:

Fumito Yoshikawa

Abstract: Extracting human motion segments of interest from image sequences is essential for quantitative analysis and effective video browsing, but doing so manually requires laborious effort. In the analysis of sport motions such as weightlifting, the start and end of each weightlifting motion should be detected in an automated manner, ideally regardless of the camera viewpoint. This paper describes a weightlifting motion detection method employing cubic higher-order local auto-correlation (CHLAC) and multiple regression analysis (MRA). The method extracts spatio-temporal motion features and learns the relationship between the features and a specific motion, without prior knowledge about the objects. To demonstrate its effectiveness, an experiment was conducted on data captured from eight different viewpoints in practical situations. The detection rates for the start and end motions were more than 94% over 140 recordings in total, even across different viewing angles, and 100% for some angles.
Download

Paper Nr: 9
Title:

Prerequisites for Affective Signal Processing (ASP) - Part IV

Authors:

Egon L. van den Broek, Joris Janssen, Marjolein van der Zwaag, Jennifer A. Healey and Joyce M. Westerink

Abstract: In van den Broek et al. (2009, 2010a,b), a series of prerequisites for affective signal processing (ASP) was defined: validation (e.g., mapping of constructs on signals), triangulation, a physiology-driven approach, contributions of the signal processing community, identification of users, theoretical specification, integration of biosignals, and physical characteristics. This paper defines three additional prerequisites: historical perspective, temporal construction, and real-world baselines. In addition, an overview of biosignals and their response time is included. Moreover, the third part of a review on the classification of emotions through ASP is presented, complementary to the reviews presented in van den Broek et al. (2009, 2010a).
Download

Paper Nr: 10
Title:

Biometrics for Emotion Detection (BED): Exploring the combination of Speech and ECG

Authors:

Egon L. van den Broek, Marleen H. Schut, Joyce M. Westerink and Kees Tuinenbreijer

Abstract: The paradigm Biometrics for Emotion Detection (BED) is introduced, which enables unobtrusive emotion recognition, taking into account varying environments. It uses the electrocardiogram (ECG) and speech, as a powerful but rarely used combination to unravel people’s emotions. BED was applied in two environments (i.e., office and home-like) in which 40 people watched 6 film scenes. It is shown that both heart rate variability (derived from the ECG) and, when people’s gender is taken into account, the standard deviation of the fundamental frequency of speech indicate people’s experienced emotions. As such, these measures validate each other. Moreover, it is found that people’s environment can indeed influence the experienced emotions. These results indicate that BED might become an important paradigm for unobtrusive emotion detection.
Download
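The heart rate variability measure mentioned in the abstract above is derived from the intervals between successive R-peaks of the ECG. A minimal sketch of two standard time-domain HRV measures on toy R-R intervals (illustrative only; the abstract does not specify which HRV measure was used):

```python
# Hypothetical sketch (not the authors' pipeline): two common time-domain
# HRV measures computed from R-R intervals in milliseconds.
import numpy as np

def sdnn(rr_ms: np.ndarray) -> float:
    """Standard deviation of normal-to-normal R-R intervals (ms)."""
    return float(np.std(rr_ms, ddof=1))

def rmssd(rr_ms: np.ndarray) -> float:
    """Root mean square of successive R-R interval differences (ms)."""
    return float(np.sqrt(np.mean(np.diff(rr_ms) ** 2)))

rr = np.array([812.0, 800.0, 790.0, 830.0, 815.0, 805.0])  # toy intervals
print(sdnn(rr), rmssd(rr))
```

Lower HRV values are commonly associated with higher arousal or stress, which is what makes such measures candidates for emotion detection.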

Paper Nr: 13
Title:

Contagion of Physiological Correlates of Emotion between Performer and Audience: An Exploratory Study

Authors:

Javier Jaimovich, Niall Coghlan and R. B. Knapp

Abstract: Musical and performance experiences are often described as evoking powerful emotions, both in the listener/observer and player/performer. There is a significant body of literature describing these experiences, along with related work examining physiological changes in the body during music listening and the physiological correlates of emotional state. However, there are still open questions as to how and why emotional responses may be triggered by a performance, how audiences may be influenced by a performer's mental or emotional state, and what effect the presence of an audience has on performers. We present a pilot study and some initial findings of our investigations into these questions, utilising a custom software and hardware system we have developed. Although this research is still at a pilot stage, our initial experiments point towards significant correlation between the physiological states of performers and audiences, and we here present the system, the experiments and our preliminary data.
Download

Paper Nr: 15
Title:

Motion and single-trial biosignal analysis platform for monitoring of rehabilitation

Authors:

Perttu Ranta-aho, Stefanos Georgiadis, Timo Bragge, Eini Niskanen, Mika P. Tarvainen, Ina M. Tarkka and Pasi Karjalainen

Abstract: Three-dimensional motion analysis is a powerful tool for the assessment of human movements in different rehabilitation applications. An adaptive virtual reality rehabilitation environment based on modern motion and biosignal analysis techniques is described.
Download

Paper Nr: 16
Title:

AffectPhone: A Handset Device to Present User’s Emotional State with Warmth/Coolness

Authors:

Ken Iwasaki, Takashi Miyaki and Jun Rekimoto

Abstract: We developed AffectPhone, a system which detects a user's emotional state via galvanic skin response (GSR) and conveys it as warmth or coolness of the back panel of another handset. GSR is a good measure of arousal, and is detected through electrodes attached to the sides of the handset. When the arousal level of the user rises or drops, a Peltier module in the back panel of the other device generates warmth or coolness. The system needs no special sensor attached to the user's body, so it does not interrupt the user's daily use of the mobile phone. Moreover, because the system presents non-verbal information in an ambient manner, it may be more effective than presenting it via a display or speaker. This system could help enhance existing telecommunication.
Download
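The control logic the AffectPhone abstract describes can be sketched as a small mapping from arousal changes to a thermal command. Function names, the deadband parameter and the thresholds below are illustrative assumptions, not details from the paper:

```python
# Hypothetical sketch (names and thresholds are illustrative): mapping a
# change in GSR-derived arousal to a Peltier drive direction, as the
# AffectPhone abstract describes.
def peltier_command(prev_arousal: float, arousal: float, deadband: float = 0.05) -> str:
    """Return 'warm' when arousal rises, 'cool' when it drops, else 'off'.

    The deadband suppresses reactions to small fluctuations in the signal.
    """
    delta = arousal - prev_arousal
    if delta > deadband:
        return "warm"
    if delta < -deadband:
        return "cool"
    return "off"

print(peltier_command(0.40, 0.60))  # arousal rose: warm the other handset
print(peltier_command(0.60, 0.42))  # arousal dropped: cool it
print(peltier_command(0.50, 0.52))  # within deadband: no change
```

In a real handset this decision would run continuously on smoothed GSR samples, with the resulting command driving the Peltier module in the remote device.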