International Journal of Integrated Engineering (Issue on Electrical and Electronic Engineering)
Classification of Human Emotions from EEG
Signals using Statistical Features and Neural
Network
Chai Tong Yuen1,*, Woo San San1, Mohamed Rizon2 and Tan Ching Seong1
1 Department of Mechatronics and Biomedical Engineering, Faculty of Engineering and Science, Universiti Tunku Abdul Rahman
2 Department of Electrical Engineering, King Saud University
*Corresponding email: chaity@utar.edu.my
Abstract
A statistical-based system for the classification of human emotions from electroencephalogram (EEG) signals is proposed in this paper. The data used in this study are acquired through EEG recordings of six human subjects under the effect of emotion stimuli, and an emotion stimulation experiment using visual stimuli is also proposed. From the EEG data, a total of six statistical features are computed, and a back-propagation neural network is applied to classify five types of emotions: anger, sadness, surprise, happiness and neutral. An overall classification rate as high as 95% is achieved.
Keywords: EEG, Human emotions, Neural network, Statistical features.
1. INTRODUCTION
A considerable amount of research effort has been channeled towards the identification and utilization of information on human emotions. Various modes of human-computer and human-machine interaction have been studied in the effort to enable computers and machines to be more alert to the emotions and affective needs of human beings. Information on human emotions can be gathered from a living body through various channels, including the electroencephalogram (EEG), in which brainwaves are recorded directly from a human subject and the patterns of the waves are studied to classify emotions. Other techniques explored for the classification of emotions include facial emotion recognition using vision systems, respiration rate, and tone recognition from the human voice [1].
EEG records information on human brain activity in the form of measurements of the electrical activity of the brain. The electrical activity is recorded from electrodes placed on the scalp, and this measurement may indicate the emotional state of the human subject at the time the information is recorded [2], [11], [15]. Researchers believe that the state of the brain changes as feelings change; therefore, EEG is suitable for recording the changes in brain waves that vary in accordance with feelings or emotions [3]. The EEG has several advantages for the study of human emotions: it offers high temporal resolution, is non-invasive, and causes no pain to the human subjects. These are important aspects for acquiring natural and genuine emotions from human subjects. However, there are difficulties in interpreting EEG data: a large number of organizations, structures and processes underlie the recorded signals, and the number of associations and aspects of emotions is also large [2].
The analysis of brain waves [16] has utilized various signal processing and artificial intelligence techniques in the effort to develop emotion classification. Such systems are mainly developed to facilitate the interaction between humans and computers, and further to be incorporated into machines such as robots to build intelligent systems such as patient monitoring systems in hospitals and medical robots.
The expressions 'Yes' and 'No' are among the most essential expressions in interactions between humans. Chang Su Ryu [4] developed a system to discriminate 'Yes' and 'No' using a Support Vector Machine (SVM) to classify features extracted by the Fast Fourier Transform (FFT); a recognition rate of 80% was achieved. Ishino and Hagiwara [3] applied FFT, the wavelet transform, principal component analysis, and the mean and variance to extract features from EEG data; a neural network was used to classify four types of emotions (joy, sorrow, relaxation and anger), and the highest success rate was 67.7%. Takahashi [5] developed an emotion recognition system using an SVM to classify five emotions (joy, anger, sadness, happiness and relaxation) based on statistical features computed from the raw signal; the recognition rate was 41.7%. Takahashi and Tsukaguchi [6] compared the effectiveness of a neural network and an SVM in classifying two emotions, pleasure and displeasure; using statistical features, the recognition rates achieved were 62.3% and 59.7% for the neural network and the SVM respectively.
Fuzzy logic has provided new possibilities in control, data analysis and data modeling. One of the issues in fuzzy-clustering-based classification is setting the number of clusters in each class; the generalization is acceptable when large sets of samples are available for classification [12]. Fuzzy C-Means [13], [14] has been a popular approach in recent research.

2. EEG DATA ACQUISITION
This section describes the acquisition of physiological signals from EEG under emotion stimulation experiments. Various ways of eliciting emotions in human subjects have been employed with the aim of developing databases of brain wave data for different emotional states. The methods studied include the acquisition of EEG data from subjects who act out the emotions based on imagination, and the use of stimuli such as audio and visual stimuli.
The strategy of requiring an actor to feel or express a particular mood has been widely used for emotion assessment from facial expressions and physiological signals [7]. This strategy has a major weakness: it is highly difficult to ensure that the physiological signals obtained can be consistently reproduced by non-actors. Therefore, an actor-play database is often far from the real emotions found in real scenarios [8]. Alternatively, visual stimuli, audio stimuli, or a combination of both can be used to induce emotions; this method is capable of producing responses that are closer to real life.
In this study, visual stimuli in the form of adult facial stimuli [9] are used in the emotion stimulation experiment. An evaluation is carried out after the stimulation experiment, since the emotions induced during the experiment may differ from the expected emotions; this can be explained by individual differences in past experience and personal understanding when viewing the stimuli [8].
The database developed in this study consists of EEG data acquired from 6 human subjects (3 males and 3 females, aged from 23 to 26 years old). A 64-channel biosensor is used, in which 62 channels are occupied by the EEG electrodes and the remaining 2 channels by the electrooculogram (EOG) electrodes, used to detect eye blinks and eye movements.
Table 1: Strength of the elicited emotions as rated by the subjects [%]
Emotion     Very weak   Weak    Moderate   Strong   Very strong
Anger 33.33 0.00 16.67 16.67 33.33
Happy 0.00 16.67 33.33 16.67 33.33
Sadness 0.00 16.67 0.00 50.00 33.33
Neutral 0.00 0.00 0.00 83.33 16.67
Surprise 16.67 16.67 33.33 33.33 0.00
Table 2: Emotions reported by the subjects for each emotion stimulus [%]
Stimulus \ Reported   Anger   Happy   Sadness   Neutral   Surprise   No Emotion
Anger 66.67 0.00 16.67 16.67 0.00 0.00
Happy 0.00 83.33 0.00 16.67 0.00 0.00
Sadness 0.00 0.00 83.33 0.00 0.00 16.67
Neutral 0.00 0.00 0.00 100.00 0.00 0.00
Surprise 0.00 0.00 0.00 0.00 100.00 0.00
The signals were sampled at a rate of 256 Hz. For the recording of the EEG data during the experiment, the subject wearing the EEG and EOG sensors sits comfortably in front of a computer screen presenting the stimuli as a Microsoft Office PowerPoint slideshow with automated slide transitions.
First, before the experiment starts, a slide containing the instructions is displayed for 10 seconds to prepare the subject for the experiment; this includes a reminder to minimize physical movements and eye blinks. A set of 4 images of relaxing sceneries is then presented for a period of 20 seconds to record the 'neutral' emotion from the subject. Then, an introductory slide is displayed to prepare the subject to react to the visual stimuli about to be shown. Next, two sets of visual stimuli, each consisting of 6 facial stimulus images [9], are displayed for 3 seconds per image to stimulate one emotion. Between the two sets, a dark screen is displayed for 5 seconds to give the subject a short rest period.
After the two sets of visual stimuli, a dark screen is shown for 45 seconds and soothing music is played so that the subject can relax and prepare for the next emotion stimulus. This completes one cycle of stimulation for one emotion; the total time consumed for the stimulation of one emotion is approximately two minutes. The flow of stimuli described above is repeated for the remaining emotions: 'happy', 'anger', 'surprise' and 'sadness'. An assessment is carried out after the whole experiment, in which the subjects describe the particular emotions elicited when the stimuli were shown and rate the strength of the emotions felt. The subjects are also required to specify whether multiple emotions were aroused during the display of a particular emotion stimulus.
The results of the assessment for the emotion elicitation experiment are shown in Table 1 and Table 2. Table 1 shows the strength ratings of the emotions felt when viewing the stimuli. For example, for the emotion 'anger': 33.33% of the subjects rated the 'anger' they felt as 'very weak', 16.67% rated it 'moderate', 16.67% rated it 'strong' and 33.33% rated it 'very strong'. Table 2 shows how the subjects categorized the emotions elicited by the visual stimuli. For example, when the stimulus for the emotion 'anger' was viewed, 66.67% of the subjects correctly identified the elicited emotion as 'anger'.

3. METHODOLOGY: STATISTICAL FEATURES
For the emotion classification stage, significant and important features need to be extracted from the raw EEG data for training and testing. Let the signals recorded from the EEG be designated by $X$, and let $X_n$ represent the value of the $n$th sample of the raw signal, where $n = 1, \ldots, N$, with $N = 1024$ (1024 samples correspond to 4 seconds of the EEG recording at 256 Hz). In this study, six statistical features are computed from the EEG data [1], [6]:
1. The mean of the raw signal:
$$\mu_x = \frac{1}{N}\sum_{n=1}^{N} X_n \qquad (1)$$

2. The standard deviation of the raw signal:
$$\sigma_x = \left(\frac{1}{N-1}\sum_{n=1}^{N}\left(X_n - \mu_x\right)^2\right)^{1/2} \qquad (2)$$

3. The mean of the absolute values of the first differences of the raw signal:
$$\delta_x = \frac{1}{N-1}\sum_{n=1}^{N-1}\left|X_{n+1} - X_n\right| \qquad (3)$$

4. The mean of the absolute values of the first differences of the normalized signal:
$$\tilde{\delta}_x = \frac{1}{N-1}\sum_{n=1}^{N-1}\left|\tilde{X}_{n+1} - \tilde{X}_n\right| = \frac{\delta_x}{\sigma_x} \qquad (4)$$

5. The mean of the absolute values of the second differences of the raw signal:
$$\gamma_x = \frac{1}{N-2}\sum_{n=1}^{N-2}\left|X_{n+2} - X_n\right| \qquad (5)$$

6. The mean of the absolute values of the second differences of the normalized signal:
$$\tilde{\gamma}_x = \frac{1}{N-2}\sum_{n=1}^{N-2}\left|\tilde{X}_{n+2} - \tilde{X}_n\right| = \frac{\gamma_x}{\sigma_x} \qquad (6)$$

where $\tilde{X}_n = (X_n - \mu_x)/\sigma_x$ denotes the normalized signal, as implied by the identities in equations (4) and (6).
The chosen features cover and extend a range of statistics typically measured in the emotion physiology literature [10]. The combinations of statistical features computed from equations (1)-(6) are defined as the feature vectors $\chi_n$ below:

$$\chi_1 = [\mu_x \ \ \sigma_x] \qquad (7)$$
$$\chi_2 = [\delta_x \ \ \gamma_x] \qquad (8)$$
$$\chi_3 = [\tilde{\delta}_x \ \ \tilde{\gamma}_x] \qquad (9)$$
$$\chi_4 = [\delta_x \ \ \tilde{\delta}_x] \qquad (10)$$
$$\chi_5 = [\gamma_x \ \ \tilde{\gamma}_x] \qquad (11)$$
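Continuing the sketch above, the five candidate feature vectors are simply pairings of these six statistics (again illustrative only, reusing the hypothetical statistical_features helper):

```python
def feature_vectors(x):
    """Build the five 2-element feature vectors of equations (7)-(11)."""
    mu, sigma, delta, delta_n, gamma, gamma_n = statistical_features(x)
    return {
        "chi1": [mu, sigma],         # eq. (7)
        "chi2": [delta, gamma],      # eq. (8)  (best performer in Section 5)
        "chi3": [delta_n, gamma_n],  # eq. (9)
        "chi4": [delta, delta_n],    # eq. (10)
        "chi5": [gamma, gamma_n],    # eq. (11)
    }
```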
4. METHODOLOGY: CLASSIFICATION USING BACK-PROPAGATION NEURAL NETWORK
Neural networks are inspired by the way the human brain works [3]. A neural network is an information processing paradigm closely related to the biological nervous system of a human: it comprises processing elements, highly interconnected neurons, that work in unison to solve specific problems. Neural networks possess an advantage over other classification techniques in their ability to derive meaning from complex and imprecise data, which means they can be used to extract and detect trends and patterns that are too complex to identify otherwise [3].
In this study, the combinations of statistical features are first used for the classification of 5 types of emotions using a neural network. The features are computed using equations (1)-(11). The number of input learning data is 30 for each emotion (150 in total), and the neural network is tested with 60 data for each emotion (300 in total). The target output for each emotion is set as '000' for 'anger', '001' for 'happy', '010' for 'sadness', '011' for 'neutral' and '100' for 'surprise'. To obtain the final result, the following rule is applied to each output of the neural network: the result is 0 if the output is less than 0.5, the result is 1 if the output is greater than or equal to 0.5, and outputs greater than 1.0 are rejected.
For the combination that produces the highest success rate in classifying the emotions, a second study is carried out. In this second study, the neural network is trained and tested with inputs extracted from 4 emotions, then 3 emotions, and lastly 2 emotions, in order to determine whether the number of categories to be classified affects the performance. Table 3 shows the parameters used for the back-propagation neural network.

Table 3: Parameters of the neural network for combinations of statistical features
Number of input layer units: 2
Number of hidden layer units: 30
Number of output layer units: 3
Learning rate: 0.01
Maximum epoch: 10000
Learning goal: 0.01
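To make the setup concrete, the sketch below shows one plausible realization of this classifier with scikit-learn's MLPRegressor, trained to reproduce the 3-bit target codes and decoded by the thresholding rule described above. This is an illustration roughly following the parameters of Table 3, not the authors' implementation, and the training data are random placeholders for real feature vectors.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# 3-bit target codes for the five emotions, as defined in Section 4
CODES = {"anger": (0, 0, 0), "happy": (0, 0, 1), "sadness": (0, 1, 0),
         "neutral": (0, 1, 1), "surprise": (1, 0, 0)}

# Hypothetical training set: 30 two-element feature vectors (e.g. chi2)
# per emotion; random numbers stand in for real extracted features.
X_train = np.random.rand(150, 2)
y_train = np.array([CODES[e] for e in CODES for _ in range(30)], dtype=float)

# Back-propagation network roughly matching Table 3: 2 inputs, 30 hidden
# units, 3 outputs, learning rate 0.01, at most 10000 epochs.
net = MLPRegressor(hidden_layer_sizes=(30,), activation="logistic",
                   learning_rate_init=0.01, max_iter=10000)
net.fit(X_train, y_train)

def decode(out):
    """Threshold each output at 0.5; reject patterns with any output > 1.0."""
    if np.any(out > 1.0):
        return "rejected"
    bits = tuple(int(o >= 0.5) for o in out)
    return next((e for e, c in CODES.items() if c == bits), "unknown")

print([decode(o) for o in net.predict(X_train[:5])])
```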
5. RESULTS AND DISCUSSION
The results for the classification of the 5 types of emotions are shown in Table 4. The effectiveness of the combinations of statistical features is compared based on the rate of correct classification as well as the time consumed for training the neural network. The results show that the combination computed using equation (8) produces the highest rate of correct classification: using these features, a correct classification rate of 95% is achieved for the 5 types of emotions, with 12.68 seconds consumed for training. In terms of time consumption, the combination of equation (9) is the lowest at 7.50 seconds, but it achieves only 78.33% in terms of performance.
Table 5 shows the classification results of the combination $[\delta_x \ \gamma_x]$ broken down by emotion. From the table, 100% correct classification is achieved for the emotion 'sadness', which means all the testing inputs for 'sadness' were correctly identified as 'sadness', while the other emotions achieved correct classification rates between 90.00% and 96.67%.
Table 4: Overall classification rate and time consumption for combinations of statistical features
Combination                                          Time Consumption (s)   Classification Rate (%)
$\chi_1 = [\mu_x \ \sigma_x]$                        36.69                  69.00
$\chi_2 = [\delta_x \ \gamma_x]$                     12.68                  95.00
$\chi_3 = [\tilde{\delta}_x \ \tilde{\gamma}_x]$     7.50                   78.33
$\chi_4 = [\delta_x \ \tilde{\delta}_x]$             36.56                  78.00
$\chi_5 = [\gamma_x \ \tilde{\gamma}_x]$             23.26                  81.00
Table 5: Classification results for the combination $[\delta_x \ \gamma_x]$ [%]
(Highest overall classification rate among combinations)
Input \ Output   Anger   Happy   Sadness   Neutral   Surprise
Anger 95.00 1.67 0.00 3.33 0.00
Happy 1.67 91.67 0.00 6.67 0.00
Sadness 0.00 0.00 100.00 0.00 0.00
Neutral 1.67 1.67 0.00 96.67 0.00
Surprise 8.33 0.00 1.67 0.00 90.00
Table 6: Classification results for 5 emotions, 4 emotions,
3 emotions and 2 emotions
Emotions Time Consumption (s) Classification Rate (%)
Sad, Neutral, Happy, Anger, Surprise 12.68 95.00
Sad, Neutral, Happy, Anger 8.91 95.42
Sad, Neutral, Happy 5.36 97.20
Sad, Neutral 2.87 97.50
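Per-emotion tables such as Table 5 are row-normalized confusion matrices. Given the true and predicted labels of a test set, such a table can be reproduced as follows (an illustrative sketch; y_true and y_pred below are hypothetical label lists, not the study's data):

```python
import numpy as np

EMOTIONS = ["anger", "happy", "sadness", "neutral", "surprise"]

def confusion_percent(y_true, y_pred):
    """Row-normalized confusion matrix in percent; rows are true emotions."""
    index = {e: i for i, e in enumerate(EMOTIONS)}
    counts = np.zeros((len(EMOTIONS), len(EMOTIONS)))
    for t, p in zip(y_true, y_pred):
        counts[index[t], index[p]] += 1
    return 100.0 * counts / counts.sum(axis=1, keepdims=True)

# Toy example: 60 test samples per emotion, all classified correctly
y_true = [e for e in EMOTIONS for _ in range(60)]
print(confusion_percent(y_true, y_true).round(2))
```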
The results also show that some of the inputs were mistaken for the wrong emotion, and such cases produced wrong outputs.
This study also investigates the effect of the number of categories to be classified on the performance of the neural network. The combination $[\delta_x \ \gamma_x]$ produced the highest correct classification rate; therefore, this combination is used in comparing the number of emotions against the rate of successful classification. Table 6 shows the results of the classification using the neural network, where the emotions are broken down into smaller numbers of states. From the table, it can be observed that as the number of states to be classified is reduced, the performance improves. The classification of only 2 types of emotions (neutral and sadness) produced the highest percentage of correct classification at 97.50%. This is followed by the classification of 3 emotions at 97.20%, 4 emotions at 95.42%, and 5 emotions, the lowest performance, at 95.00%. Combinations of two features are used instead of all six features jointly in order to reduce the complexity due to high dimensionality; high dimensionality and complexity can affect the performance of the neural network.
In this study, two points have been demonstrated: combinations of two features are effective in achieving a high classification rate for emotions using a back-propagation neural network, and, as the number of categories to be classified is reduced, the performance improves significantly.

6. CONCLUSION
The results produced in this paper show that a neural network can perform well in classifying emotions. This paper also suggests that statistical features can be successfully used in the classification of different types of emotions from the EEG. Unlike other feature extraction methods, statistical features can be computed easily for real-time applications. Future work should emphasize increasing the effectiveness of the algorithm in recognizing a higher number of emotional states, as well as reducing the processing time needed by the algorithm to produce positive results.
REFERENCES

[1] R.W. Picard, E. Vyzas and J. Healey, "Toward Machine Emotional Intelligence: Analysis of Affective Physiological State", IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 23, No. 10, 2001, pp. 1175-1191.
[2] M. Teplan, "Fundamentals of EEG Measurement", Measurement Science Review, Vol. 2, Section 2, 2002, pp. 1-11.
[3] K. Ishino and M. Hagiwara, "A Feeling Estimation System Using a Simple Electroencephalograph", Proc. of the IEEE International Conference on Systems, Man and Cybernetics, 2003, pp. 4204-4209.
[4] S.R. Chang, S. Kim and H.P. Seon, "An Estimation of the Correlation Dimension for the EEG in the Emotional States", Proc. of the 19th International Conference IEEE/EMBS, 1997, pp. 1547-1550.
[5] K. Takahashi, "Remarks on SVM-Based Emotion Recognition from Multi-modal Bio-Potential Signals", Proc. of the IEEE International Workshop on Robot and Human Interactive Communication, 2004, pp. 95-100.
[6] K. Takahashi and A. Tsukaguchi, "Remarks on Emotion Recognition from Multimodal Bio-Potential Signals", The Japanese Journal of Ergonomics, Vol. 41, No. 4, 2005, pp. 248-253.
[7] Y.K. Min, S.C. Chung and B.C. Min, "Physiological Evaluation on Emotional Change Induced by Imagination", Applied Psychophysiology and Biofeedback, Vol. 30, No. 2, 2005, pp. 137-150.
[8] G. Chanel, J. Kronegg, D. Grandjean and T. Pun, "Emotion Assessment: Arousal Evaluation Using EEG's and Peripheral Physiological Signals", Technical Report, 2005.
[9] M. Minear and D.C. Park, "A Lifespan Database of Adult Facial Stimuli", Behavior Research Methods, Instruments, & Computers, Vol. 36, 2004, pp. 630-633.
[10] E. Vyzas and R.W. Picard, "Affective Pattern Recognition", Proc. of the AAAI 1998 Fall Symposium, Emotional and Intelligent: The Tangled Knot of Cognition, 1998.
[11] E.L. Van Den Broek, M.H. Schut, J.H.D.M. Westerink, J.V. Herk and K. Tuinenbreijer, "Computing Emotion Awareness through Facial Electromyography", HCI/ECCV, 2006, pp. 52-63.
[12] R.N. Khushaba and A. Al-Jumaily, "Fuzzy Wavelet Packet based Feature Extraction Method for Multifunction Myoelectric Control", IEEE Transactions on Biomedical Sciences, Vol. 2, No. 3, 2007, pp. 186-194.
[13] K.G. Srinivasa, K.R. Venugopal and L.M. Patnaik, "Feature Extraction using Fuzzy C-Means Clustering for Data Mining Systems", International Journal of Computer Science and Network Security, Vol. 6, No. 3A, 2006, pp. 230-236.
[14] M. Murugappan, M. Rizon, R. Nagarajan and S. Yaacob, "FCM Clustering of Human Emotions using Wavelet based Features from EEG", Biomedical Soft Computing and Human Sciences, Vol. 14, No. 2, 2009, pp. 35-40.
[15] F. Nasoz, C.L. Lisetti, K. Alvarez and N. Finkelstein, "Emotion Recognition from Physiological Signals for User Modeling of Affect", Engineering Applications of Artificial Intelligence, Vol. 20, No. 3, 2007, pp. 337-345.
[16] J. Wagner, J. Kim and E. Andre, "From Physiological Signals to Emotions: Implementing and Comparing Selected Methods for Feature Extraction and Classification", Proc. of the IEEE International Conference on Multimedia and Expo, 2005, pp. 940-943.