Deep-Emotion: Facial Expression Recognition Using Attentional Convolutional Network

  • Title: Deep-Emotion: Facial Expression Recognition Using Attentional Convolutional Network
  • Author: Minaee, Shervin; Minaei, Mehdi; Abdolrashidi, Amirali
  • Subjects: Accuracy; attention mechanism; convolutional neural network; Datasets; Deep learning; Emotions; Face; Facial Expression; facial expression recognition; Facial Recognition; Machine learning; Neural networks; Neural Networks, Computer; Recognition; spatial transformer network
  • Is part of: Sensors (Basel, Switzerland), 2021-04, Vol. 21 (9), p. 3046 [Peer-reviewed journal]
  • Notes: ObjectType-Article-1; SourceType-Scholarly Journals-1; ObjectType-Feature-2
  • Description: Facial expression recognition has been an active area of research over the past few decades, and it remains challenging due to high intra-class variation. Traditional approaches to this problem rely on hand-crafted features such as SIFT, HOG, and LBP, followed by a classifier trained on a database of images or videos. Most of these works perform reasonably well on datasets of images captured under controlled conditions but fail to perform as well on more challenging datasets with greater image variation and partial faces. In recent years, several works have proposed end-to-end frameworks for facial expression recognition using deep learning models. Despite the better performance of these works, there is still much room for improvement. In this work, we propose a deep learning approach based on an attentional convolutional network that is able to focus on important parts of the face and achieves significant improvement over previous models on multiple datasets, including FER-2013, CK+, FERG, and JAFFE. We also use a visualization technique that is able to find important facial regions for detecting different emotions based on the classifier's output. Through experimental results, we show that different emotions are sensitive to different parts of the face. (An illustrative sketch of such an attentional network follows this record.)
  • Publisher: Switzerland: MDPI AG
  • Language: English
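
The description and subject terms point to an attentional convolutional network that uses a spatial transformer to focus on salient facial regions. Below is a minimal sketch of that idea in PyTorch; it is not the authors' released implementation. The layer sizes, the 48x48 grayscale input (FER-2013-style), the 7-class output, and the AttentionalCNN name are assumptions made for illustration.

# Minimal sketch (assumed architecture, not the paper's code): a localization
# branch predicts a 2x3 affine transform (spatial transformer) that warps the
# input so the feature extractor attends to informative face regions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class AttentionalCNN(nn.Module):
    def __init__(self, num_classes: int = 7):
        super().__init__()
        # Feature-extraction branch (48x48 grayscale input -> 10 x 9 x 9 feature map)
        self.features = nn.Sequential(
            nn.Conv2d(1, 10, kernel_size=3), nn.ReLU(),
            nn.Conv2d(10, 10, kernel_size=3), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(10, 10, kernel_size=3), nn.ReLU(),
            nn.Conv2d(10, 10, kernel_size=3), nn.ReLU(),
            nn.MaxPool2d(2),
        )
        # Localization branch of the spatial transformer (48x48 input -> 10 x 8 x 8)
        self.localization = nn.Sequential(
            nn.Conv2d(1, 8, kernel_size=7), nn.MaxPool2d(2), nn.ReLU(),
            nn.Conv2d(8, 10, kernel_size=5), nn.MaxPool2d(2), nn.ReLU(),
        )
        # Regress the 2x3 affine matrix from the localization features
        self.fc_loc = nn.Sequential(
            nn.Linear(10 * 8 * 8, 32), nn.ReLU(),
            nn.Linear(32, 2 * 3),
        )
        # Initialize to the identity transform so training starts with no warping
        self.fc_loc[2].weight.data.zero_()
        self.fc_loc[2].bias.data.copy_(torch.tensor([1, 0, 0, 0, 1, 0], dtype=torch.float))
        # Classifier over the flattened convolutional features
        self.classifier = nn.Sequential(
            nn.Linear(10 * 9 * 9, 50), nn.ReLU(), nn.Dropout(0.5),
            nn.Linear(50, num_classes),
        )

    def stn(self, x: torch.Tensor) -> torch.Tensor:
        # Spatial transformer: warp the input toward the informative face regions
        xs = self.localization(x)
        theta = self.fc_loc(xs.flatten(1)).view(-1, 2, 3)
        grid = F.affine_grid(theta, x.size(), align_corners=False)
        return F.grid_sample(x, grid, align_corners=False)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.stn(x)          # attend to important parts of the face
        x = self.features(x)     # extract convolutional features
        return self.classifier(x.flatten(1))

# Example: one forward pass on a batch of four 48x48 grayscale face crops
logits = AttentionalCNN()(torch.randn(4, 1, 48, 48))  # -> shape (4, 7)

Initializing the affine regression to the identity is the usual spatial-transformer choice: the network begins as a plain CNN and learns to shift its focus only where doing so reduces the classification loss.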
