Smartphones get emotional: mind reading images and reconstructing the neural sources
Michael Kai Petersen, Carsten Stahlhut, Arkadiusz Stopczynski, Jakob Eg Larsen, Lars Kai Hansen
Abstract | Combining a 14-channel neuroheadset with a smartphone to capture and process brain imaging data, we demonstrate the ability to distinguish among emotional responses reflected in different scalp potentials when viewing pleasant and unpleasant pictures compared to neutral content. Clustering independent components across subjects, we are able to remove artifacts and identify common sources of synchronous brain activity, consistent with earlier findings based on conventional EEG equipment. Applying a Bayesian approach to reconstruct the neural sources not only facilitates differentiation of emotional responses but may also provide an intuitive interface for interacting with a 3D rendered model of brain activity. Integrating a wireless EEG set with a smartphone thus offers completely new opportunities for modeling the mental state of users, as well as providing a basis for novel bio-feedback applications.
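As a rough illustration of the per-subject artifact-removal step mentioned in the abstract (not the authors' smartphone implementation), the following Python sketch uses MNE-Python on a simulated 14-channel recording with an assumed Emotiv-style electrode layout and sampling rate; it decomposes the signal into independent components and drops those judged to be artifacts. The cross-subject clustering of components and the Bayesian source reconstruction described in the paper are omitted here.

import numpy as np
import mne
from mne.preprocessing import ICA

# Simulated stand-in for a 14-channel neuroheadset recording
# (assumed Emotiv EPOC electrode names and 128 Hz sampling rate).
sfreq = 128.0
ch_names = ['AF3', 'F7', 'F3', 'FC5', 'T7', 'P7', 'O1',
            'O2', 'P8', 'T8', 'FC6', 'F4', 'F8', 'AF4']
info = mne.create_info(ch_names, sfreq, ch_types='eeg')
data = np.random.randn(14, int(sfreq * 60)) * 1e-5   # 60 s of noise as placeholder data
raw = mne.io.RawArray(data, info)
raw.set_montage('standard_1020')
raw.filter(1.0, 40.0)                                 # typical EEG band-pass

# ICA decomposition; components reflecting blinks or muscle activity
# would be listed in ica.exclude (chosen by inspection) and removed.
ica = ICA(n_components=14, random_state=0, max_iter=500)
ica.fit(raw)
ica.exclude = []                                      # artifact component indices go here
raw_clean = ica.apply(raw.copy())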
Type | Conference paper [With referee] |
Conference | Affective Computing and Intelligent Interaction |
Workshop | 1st Workshop on Machine Learning for Affective Computing
Year | 2011
Month | October
Electronic version(s) | [pdf] |
BibTeX data | [bibtex] |
IMM Group(s) | Intelligent Signal Processing |