Scientific Information Database (SID) - Trusted Source for Research and Academic Resources

View:

30
Download:

2
Cites:

Journal Paper Information

Title

Enhancing Emotion Classification via EEG Signal Frame Selection

Pages

  83-93

Keywords

Electroencephalogram (EEG) 
Common Spatial Pattern (CSP) 

Abstract

Classifying emotions from electroencephalography (EEG) signals is inherently challenging because of the complex nature of brain activity. Overcoming inconsistencies in EEG signals and establishing a universally applicable emotion recognition model are essential objectives. This study introduces an approach to cross-subject emotion recognition that employs a Genetic Algorithm (GA) to discard non-informative frames. The optimal frames identified by the GA then undergo spatial feature extraction using common spatial patterns (CSP) and the logarithm of variance. These features are fed into a Transformer network to capture spatial-temporal characteristics, and emotion classification is performed by a fully connected (FC) layer with a Softmax activation function. The contributions of this paper are therefore threefold: using a limited number of channels for emotion classification without sacrificing accuracy, selecting optimal signal segments with the GA, and employing a Transformer network for fast, high-accuracy classification. The proposed method is evaluated on two publicly available datasets, SEED and SEED-V, under two distinct scenarios. It attains mean accuracy rates of 99.96% and 99.51% in the cross-subject scenario, and 99.93% and 99.43% in the multi-subject scenario, for the SEED and SEED-V datasets, respectively. The proposed method outperforms the state-of-the-art (SOTA) in both scenarios on both datasets, underscoring its efficacy. In addition, comparing per-subject accuracies with previous work in the cross-subject scenario further confirms the superiority of the proposed method on both datasets.
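
For illustration, the sketch below shows the feature-extraction and classification stages described in the abstract: CSP spatial filtering with log-variance features, followed by a small Transformer encoder and a fully connected softmax head. It uses NumPy, SciPy, and PyTorch; the binary (two-class) CSP formulation, the layer sizes, and the mean-pooling over frames are illustrative assumptions rather than the paper's exact configuration, and the GA-based frame selection step is omitted.

```python
# Minimal sketch (assumed configuration, not the paper's exact setup):
# CSP spatial filters + log-variance features per frame, then a Transformer
# encoder over the frame sequence with an FC classification head.
import numpy as np
import scipy.linalg
import torch
import torch.nn as nn


def csp_filters(trials_a, trials_b, n_components=4):
    """Compute CSP spatial filters from two classes of EEG trials.
    trials_*: arrays of shape (n_trials, n_channels, n_samples)."""
    def mean_cov(trials):
        return np.mean([np.cov(t) for t in trials], axis=0)

    ca, cb = mean_cov(trials_a), mean_cov(trials_b)
    # Generalized eigenvalue problem: Ca w = lambda (Ca + Cb) w
    eigvals, eigvecs = scipy.linalg.eigh(ca, ca + cb)
    eigvecs = eigvecs[:, np.argsort(eigvals)[::-1]]
    # Keep filters from both ends of the eigenvalue spectrum
    idx = np.r_[0:n_components // 2, -(n_components // 2):0]
    return eigvecs[:, idx].T  # (n_components, n_channels)


def log_var_features(frame, filters):
    """Project one EEG frame (n_channels, n_samples) through the CSP
    filters and take the log of the normalized variance."""
    projected = filters @ frame          # (n_components, n_samples)
    var = projected.var(axis=1)
    return np.log(var / var.sum())


class EmotionTransformer(nn.Module):
    """Transformer encoder over a sequence of per-frame CSP features,
    ending in an FC head; the head returns logits, so Softmax is applied
    implicitly by CrossEntropyLoss during training."""
    def __init__(self, n_features=4, d_model=64, n_classes=3):
        super().__init__()
        self.proj = nn.Linear(n_features, d_model)
        layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=4,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        self.fc = nn.Linear(d_model, n_classes)

    def forward(self, x):                # x: (batch, frames, n_features)
        h = self.encoder(self.proj(x))
        return self.fc(h.mean(dim=1))    # pool over frames, then classify
```

In a pipeline following the abstract, each GA-selected frame would be reduced to a log-variance vector with log_var_features, and the per-trial sequence of these vectors would form the (batch, frames, n_features) input to EmotionTransformer.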
