Identification of Human Emotions Using EEG Signals Based on an Intelligent Discriminative Sparse Model

Document Type : Original Article

Authors

1 Department of Biomedical Engineering, Hamedan University of Technology, Hamedan, Iran

2 Hamedan University of Technology, Department of Biomedical Engineering, Hamedan, Iran, 65169-13733

Abstract
Automatic categorization of human affective states is one of the main tasks required to implement a human-machine interface (HMI). Electroencephalogram (EEG) signals are particularly useful for developing efficient models for such interfaces. However, the signal patterns associated with emotional feelings are often subject-dependent, compromising the accuracy of the recognition system. In this paper, a novel sparse framework is proposed to enhance the discriminative characteristics of emotional states by utilizing dictionaries as the basis functions. The proposed framework increases the between-class discrimination of the emotional states through the use of class-specific atoms in the dictionaries. In addition to the representation residual commonly employed by sparse models, we utilized separate classifiers to improve the discriminating capability of the proposed model. The model is employed to distinguish between human emotional states based on the recorded EEG signals in the 'DEAP' dataset. Our results are compared with those produced by well-known feature extraction methods and classification approaches, indicating that the proposed model can identify between-class differences associated with the input patterns. While the representation residual alone led to limited performance, the advantage of the proposed model was prominent when separate classifiers were applied. Finally, our results were compared with those of relevant investigations reported in the literature, indicating that our method identifies valence and arousal categories with reasonable accuracy.
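The paper's exact formulation is not reproduced here, but the general idea the abstract describes (class-specific atoms pooled into one dictionary, with the per-class representation residual driving the decision) follows the well-known sparse-representation classification (SRC) scheme. The sketch below is a minimal illustration of that scheme, not the authors' implementation: the `omp` solver, the dictionary sizes, and the synthetic signals are all assumptions chosen for brevity.

```python
import numpy as np

def omp(D, x, n_nonzero=5):
    """Orthogonal Matching Pursuit: find a sparse code for x over dictionary D
    (columns of D are unit-norm atoms). Illustrative solver, not the paper's."""
    residual = x.copy()
    support = []
    coef = np.zeros(D.shape[1])
    for _ in range(n_nonzero):
        # pick the atom most correlated with the current residual
        idx = int(np.argmax(np.abs(D.T @ residual)))
        if idx not in support:
            support.append(idx)
        # re-fit x on the selected atoms and update the residual
        sol, *_ = np.linalg.lstsq(D[:, support], x, rcond=None)
        residual = x - D[:, support] @ sol
    coef[support] = sol
    return coef

def src_classify(D, atom_labels, x, n_nonzero=5):
    """SRC decision rule: keep only the coefficients belonging to one class,
    reconstruct x, and assign the class with the smallest residual."""
    a = omp(D, x, n_nonzero)
    residuals = {}
    for c in np.unique(atom_labels):
        a_c = np.where(atom_labels == c, a, 0.0)   # zero out other classes' atoms
        residuals[c] = np.linalg.norm(x - D @ a_c)
    return min(residuals, key=residuals.get)

# Synthetic demo: two classes, ten 20-dimensional atoms each.
rng = np.random.default_rng(0)
D = rng.normal(size=(20, 20))
D /= np.linalg.norm(D, axis=0)                 # unit-norm atoms
atom_labels = np.array([0] * 10 + [1] * 10)    # first 10 atoms are class 0

# A test signal built sparsely from class-0 atoms only.
x = 1.0 * D[:, 0] + 0.5 * D[:, 3] - 0.8 * D[:, 7]
print(src_classify(D, atom_labels, x))
```

On this toy signal the class-0 residual is essentially zero, so the rule returns class 0. The abstract's point is that this residual rule alone was limited in their experiments, and separate per-class classifiers on top of the sparse codes performed better.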

Keywords

Subjects


Volume 1, Issue 4
Autumn 2025
Pages 217-227

  • Receive Date 15 October 2024
  • Revise Date 26 February 2025
  • Accept Date 08 March 2025
  • First Publish Date 06 July 2025