Affective BCI: is emotion detection within reach?

Is it possible at this point to reliably detect “emotions” (and differentiate between them) using EEG, or EEG combined with other biometric data like heart rate, muscle tension, etc.?

I’m not sure how you would go about it, but some commercial apps claim they can do it. Is that actually feasible, or are the systems making these claims shams?

Thanks.

2 Likes

Emotion detection is a tough nut to crack because of how subjective it is. I haven’t done much with affective BCI myself, but these links may be of interest:

1 Like

Here is an admittedly vague, hearsay starting point: when I asked a professional researcher working on affective state detection about this, he said that a promising avenue is measuring bilateral synchrony (or other bilateral comparisons) over the frontal region. He also said that this can give a general affect measure, but cannot really identify specific emotions.

Might be worth some experiments.
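For anyone who wants to try it, the most common “bilateral comparison” in the literature is frontal alpha asymmetry: compare alpha-band power at a left/right frontal electrode pair such as F3/F4. Here is a minimal Python sketch; the channel pair, the 8–13 Hz band, and the log-ratio index are my assumptions about a reasonable setup, not anything the researcher specified.

```python
# A minimal sketch of frontal alpha asymmetry, one common "bilateral comparison".
# Assumes eeg_left / eeg_right are 1-D NumPy arrays holding a few seconds of data
# from a left/right frontal pair (e.g. F3/F4), sampled at fs Hz. The 8-13 Hz band
# and the log-ratio index are assumptions, not a prescription.
import numpy as np
from scipy.signal import welch


def alpha_power(signal, fs, band=(8.0, 13.0)):
    """Integrate the power spectral density over the alpha band."""
    freqs, psd = welch(signal, fs=fs, nperseg=min(len(signal), 2 * int(fs)))
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return np.trapz(psd[mask], freqs[mask])


def frontal_asymmetry(eeg_left, eeg_right, fs):
    """Classic asymmetry index: ln(right alpha power) - ln(left alpha power)."""
    return np.log(alpha_power(eeg_right, fs)) - np.log(alpha_power(eeg_left, fs))
```

As noted above, at best this gives a coarse affect/approach–withdrawal measure, not discrete emotions.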

Here is another intriguing but obscure starting point: at TSC 2015 in Helsinki there was a presentation on “EEG guided meditation”. During this presentation, the author reported on an uncontrolled (and therefore not terribly reliable) experiment showing improved cognitive and psychological metrics as a result of a period of meditation training. Interestingly, the results were based on something called “BrainMind Audit”, which is an EEG-based tool developed by the Fingelkurts twins. More about this can be seen here: http://eegmeditation.eu/en/the-brainmind-audit

At first glance, BrainMind Audit looks suspect. How do we know it’s really measuring what it claims to measure? However, Andrew and Alexander Fingelkurts are serious EEG experts with an impressive list of peer-reviewed publications, so I’m not inclined to dismiss it out of hand. Unfortunately, BrainMind Audit is closed-source and not publicly available. That said, I have spoken with Andrew and Alexander, and there may be a possibility of at least getting access to an alpha version of the software for testing purposes. If it does what it seems to, it could open up many interesting applications, including affective measurements.

Really interesting links and info!
I’m looking into that area for a thesis and out of personal curiosity. I’ll post back any new info I can gather.
For now I’m looking for professors in the field who would be interested in tutoring me or collaborating. So if you have any ideas, let me know :wink:

Facial expression recognition through a depth sensor or video might be a way to start gathering more data on how the brain responds to different feelings.

1 Like

Hi Deams – thanks for joining NeuroBB!

That’s a very interesting idea: using facial expression recognition to gather emotion-correlated EEG data. If this hasn’t been done yet, I’d say you have a winning research topic.

With video-based automated emotion recognition + EEG, it would likely be possible to use machine learning techniques to find affective indicators in the EEG data that we don’t yet know exist. Very interesting!
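Purely as a thought experiment, here is a minimal Python sketch of what that pipeline might look like. Everything in it is an assumption on my part: the video-based recognizer is treated as a black box that emits one emotion label per second, the EEG is cut into matching one-second epochs, and simple band-power features feed a standard classifier.

```python
# Thought-experiment pipeline: use labels from a video-based emotion recognizer
# as ground truth for EEG epochs, then let a classifier search for EEG correlates.
# Hypothetical setup: video_labels holds one label per 1-second window, time-aligned
# with the EEG; band-power features and an SVM are just reasonable defaults.
import numpy as np
from scipy.signal import welch
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC


def bandpower_features(epoch, fs, bands=((4, 8), (8, 13), (13, 30))):
    """Band power per channel for one epoch of shape (n_channels, n_samples)."""
    freqs, psd = welch(epoch, fs=fs, axis=-1)
    feats = []
    for lo, hi in bands:
        mask = (freqs >= lo) & (freqs <= hi)
        feats.append(np.trapz(psd[..., mask], freqs[mask], axis=-1))
    return np.concatenate(feats)


def label_eeg_with_video(eeg, fs, video_labels):
    """eeg: (n_channels, n_samples); video_labels: one emotion label per second."""
    n = int(fs)  # samples per one-second epoch
    X, y = [], []
    for i, label in enumerate(video_labels):
        epoch = eeg[:, i * n:(i + 1) * n]
        if epoch.shape[1] == n:
            X.append(bandpower_features(epoch, fs))
            y.append(label)
    return np.array(X), np.array(y)


# Example use, once eeg and labels exist:
# X, y = label_eeg_with_video(eeg, fs=256, video_labels=labels)
# print(cross_val_score(make_pipeline(StandardScaler(), SVC()), X, y, cv=5))
```

If nothing else, running something like this on a pilot recording would show quickly whether the video-derived labels carry any EEG-decodable signal at all.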

Good luck, and keep us posted.

1 Like

Here are a few additional Affective BCI (aBCI) links I ran across:

Affective Brain-Computer Interfaces — Neuroscientific Approaches to Affect Detection
http://people.ict.usc.edu/~gratch/CSCI534/ACII-Handbook-BCI.pdf

Affective Brain-Computer Interfaces - University of Twente, Netherlands
http://doc.utwente.nl/68954/

Towards Affective BCI/BMI paradigms (abstract only):

4th Workshop on Affective Brain Computer Interaction - aBCI 2015 (conference):
http://www.affective-sciences.org/aBCI2015

TL;DR: no

It is extremely difficult to conduct hard science in this area, because it is hard to reliably induce an emotion in a person.

For example, if we want to distinguish anger from joy, we must obtain an EEG dataset recorded from a person who is experiencing true anger at one time and true joy at another. Furthermore, we must know when the subject became angry and when they became joyous.

Some researchers have tried inducing these emotions by showing pictures or video clips, but there is no evidence that, to state it bluntly, showing a picture of a puppy puts a person into a true joyous state. Our emotions are not that easily manipulated.

I have not come across any research article that employs a convincing way to collect EEG recordings of different emotions, which makes any results obtained vague and hard to replicate. (We’ve had some master’s students try to replicate some studies: their results were never as good as those claimed in the original papers.)

There are numerous EEG-related companies that claim they can measure emotional states, but these claims are never backed up by hard science.

Good points @wmvanvliet. It raises the question though: EEG aside, how are emotions even defined in a hard science sense? Is there anything that we can say measures or indicates “true joy” or “true anger”? If there is, then couldn’t this — at least in principle — be correlated with EEG recordings?

Of course, in practice, as you point out, if genuine emotions can’t be reliably induced, it would be quite difficult to collect enough emotion-correlated EEG data to, say, train a machine learning system.

OTOH, we know that heart rate and GSR are related to affective activation. These two could be combined with a video of the subject’s face, recorded along with an EEG while the subject was, for example, watching a dramatic movie. Couldn’t a technique like this be used to measure rigorously-definable affective states, matched to EEG recordings?
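For what it’s worth, the peripheral side of that is easy to prototype. Here is a rough sketch of pulling a crude heart-rate and skin-conductance value for each segment of the movie, so they can be paired with the corresponding EEG epochs; all variable names and thresholds are guesses on my part, and a real pipeline would need proper artifact handling.

```python
# Sketch of crude peripheral arousal measures per movie segment, to pair with
# EEG epochs. Assumes ecg and gsr are 1-D arrays recorded in sync with the EEG
# at fs_ecg / fs_gsr Hz; segment boundaries (in seconds) come from the movie.
# The peak threshold and refractory period are rough, dataset-dependent guesses.
import numpy as np
from scipy.signal import find_peaks


def mean_heart_rate(ecg, fs_ecg, t_start, t_end):
    """Crude heart rate (bpm) from R-peaks within [t_start, t_end) seconds."""
    segment = ecg[int(t_start * fs_ecg):int(t_end * fs_ecg)]
    peaks, _ = find_peaks(segment, height=np.percentile(segment, 95),
                          distance=int(0.4 * fs_ecg))
    if len(peaks) < 2:
        return np.nan
    ibi = np.diff(peaks) / fs_ecg  # inter-beat intervals in seconds
    return 60.0 / ibi.mean()


def mean_skin_conductance(gsr, fs_gsr, t_start, t_end):
    """Mean skin conductance level over the same window."""
    return gsr[int(t_start * fs_gsr):int(t_end * fs_gsr)].mean()
```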

Thanks for contributing your expertise to the NeuroBB community!

I don’t know about EEG, but measuring emotions is, in theory, possible by measuring facial muscle movements. Paul Ekman is famous for studying universal emotions (those present in every culture); you might want to check out his work. He and his colleagues studied these emotions and developed the Facial Action Coding System (FACS). Basic emotions like surprise, anger, sadness, and disgust produce culture-independent, involuntary changes in facial expression. Allegedly, even when people try to conceal an emotion, they tend to produce involuntary muscle movements for a fraction of a second (so-called micro-expressions).

Facial EMG has been studied to assess its utility as a tool for measuring emotional reaction.
Studies have found that activity of the corrugator muscle, which lowers the eyebrow and is involved in producing frowns, varies inversely with the emotional valence of presented stimuli and reports of mood state. Activity of the zygomatic major muscle, which controls smiling, is said to be positively associated with positive emotional stimuli and positive mood state.

More: Facial electromyography - Wikipedia
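That description translates into a very simple measurement, if anyone wants to record facial EMG alongside EEG. A rough sketch, where the electrode sites, the filter band, and the difference score are my own guesses at a sensible setup rather than a validated measure:

```python
# Sketch of the facial-EMG valence idea described above: band-pass and rectify
# corrugator and zygomaticus EMG, then compare their mean activity. The electrode
# sites, 20-450 Hz band, and simple difference score are assumptions, not a
# validated measure.
import numpy as np
from scipy.signal import butter, filtfilt


def emg_amplitude(emg, fs, band=(20.0, 450.0)):
    """Band-pass filter, rectify, and average a raw EMG trace."""
    nyq = fs / 2.0
    b, a = butter(4, [band[0] / nyq, min(band[1], 0.99 * nyq) / nyq], btype="band")
    return np.abs(filtfilt(b, a, emg)).mean()


def crude_valence_score(zygomaticus, corrugator, fs):
    """Positive when the smiling muscle dominates, negative when the frowning one does."""
    return emg_amplitude(zygomaticus, fs) - emg_amplitude(corrugator, fs)
```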

1 Like

I was thinking of Ekman in the discussion above, but I didn’t realize his work was systematized to that extent. Interesting! This seems like an excellent possibility for connecting EEG and affect measures. It should be possible to record facial EMG and EEG simultaneously, then use FACS to identify emotions, and then train a machine learning system on the FACS/EEG data to see if reliable EEG/emotion correlations could be found.
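As a toy illustration of the FACS step in that pipeline: once an automatic coder (or a human) has turned the EMG/video into detected action units (AUs), labelling epochs could be as simple as matching against prototype AU combinations. The combinations below are the commonly cited ones, written from memory, so they should be checked against the FACS/EMFACS manuals before being trusted.

```python
# Toy illustration of the "FACS -> emotion label" step: map detected action units
# (AUs) to a basic-emotion label via prototype combinations. The AU sets below are
# commonly cited but written from memory; check them against the FACS/EMFACS
# manuals before trusting them.
EMOTION_PROTOTYPES = {
    "happiness": {6, 12},
    "sadness": {1, 4, 15},
    "surprise": {1, 2, 5, 26},
    "anger": {4, 5, 7, 23},
    "disgust": {9, 15, 16},
}


def label_from_aus(detected_aus):
    """Return the first prototype fully contained in the detected AU set, else None."""
    detected = set(detected_aus)
    for emotion, prototype in EMOTION_PROTOTYPES.items():
        if prototype <= detected:
            return emotion
    return None


# e.g. label_from_aus({6, 12, 25}) -> "happiness"
```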

Seems to me like an interesting research project waiting to happen, if it hasn’t already!

2 Likes

An interesting paper on emotion and EEG turned up on the MIT Technology Review a few days ago:

http://www.technologyreview.com/view/545986/how-one-intelligent-machine-learned-to-recognize-human-emotions

And the paper by Zheng et al. can be found here:
Identifying Stable Patterns over Time for Emotion Recognition from EEG (PDF)

They had 15 students watch 15 video clips “associated with positive, negative, or neutral emotions”. During each film clip they recorded 64 channels of EEG data along with the subject’s facial expression. Afterwards, they asked the subjects about their subjective response to each clip (did it produce a positive, neutral, or negative emotion, and how strong was the response on a scale of 1 to 5).

Running a machine learning algorithm on the data to check for correlations turned up the following result (as reported by the MIT Technology Review):

…the algorithm found a set of patterns that clearly distinguished positive, negative, and neutral emotions that worked for different subjects and for the same subjects over time with an accuracy of about 80 percent.

Progress on the aBCI front!

Better yet, the ML aBCI study authors are making their data set available to other experimenters: http://bcmi.sjtu.edu.cn/~seed/
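For anyone planning to experiment with that data set: if I recall correctly, the features behind results like this are band-wise differential entropy (DE) values per channel. Here is a minimal sketch of that computation; the band edges, filter choice, and epoch shape are my assumptions, not necessarily the paper’s exact preprocessing.

```python
# Sketch of band-wise differential entropy (DE) features: band-pass filter each
# channel, then, assuming the filtered signal is roughly Gaussian, DE reduces to
# 0.5 * log(2*pi*e*variance). Band edges and epoch shape are assumptions, not the
# paper's exact preprocessing.
import numpy as np
from scipy.signal import butter, filtfilt

BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 14),
         "beta": (14, 31), "gamma": (31, 50)}


def differential_entropy_features(epoch, fs):
    """epoch: (n_channels, n_samples) -> one DE value per channel and band."""
    nyq = fs / 2.0
    feats = []
    for lo, hi in BANDS.values():
        b, a = butter(4, [lo / nyq, hi / nyq], btype="band")
        filtered = filtfilt(b, a, epoch, axis=-1)
        variance = filtered.var(axis=-1)
        feats.append(0.5 * np.log(2 * np.pi * np.e * variance))
    return np.concatenate(feats)
```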

1 Like