MRMR-based ensemble pruning for facial expression recognition

Multimedia tools and applications, 2018-06, Vol.77 (12), p.15251-15272 [Peer Reviewed Journal]

© Springer Science+Business Media, LLC 2017. All Rights Reserved. ISSN: 1380-7501; EISSN: 1573-7721; DOI: 10.1007/s11042-017-5105-z

Full text available
  • Title:
    MRMR-based ensemble pruning for facial expression recognition
  • Author: Li, Danyang ; Wen, Guihua
  • Subjects: Accuracy ; Algorithms ; Classifiers ; Computer Communication Networks ; Computer Science ; Data Structures and Information Theory ; Face recognition ; Feature selection ; Methods ; Multimedia Information Systems ; Neural networks ; Optimization ; Pruning ; Redundancy ; Side effects ; Special Purpose and Application-Based Systems
  • Is Part Of: Multimedia tools and applications, 2018-06, Vol.77 (12), p.15251-15272
  • Description: Facial expression recognition (FER) can assist the interaction between humans and devices. Combining FER with ensemble learning usually improves the final recognition results. However, in many cases the produced ensemble contains many redundant members, and these components can degrade the final results. Previous studies have shown that a more compact subset of a classifier pool performs better than the original pool, while also reducing storage space and computational complexity. This paper proposes a maximum relevance and minimum redundancy-based ensemble pruning (MRMREP) method that treats prediction results as features and extends the feature selection method to the ensemble classifier reduction problem, yielding a more representative subset. The method orders all base classifiers according to two factors: the relevance between the target labels and the predictions, and the redundancy between classifiers (a minimal illustrative sketch of this selection procedure appears after this record). The final ensemble performance was evaluated by comparing the method with other ensemble pruning methods, and superior results were obtained on the FER2013, JAFFE, and CK+ databases.
  • Publisher: New York: Springer US
  • Language: English
  • Identifier: ISSN: 1380-7501
    EISSN: 1573-7721
    DOI: 10.1007/s11042-017-5105-z
  • Source: ProQuest Central
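
The abstract above describes ranking base classifiers by relevance to the target labels and redundancy among themselves. The following is a minimal sketch of that general MRMR-style selection idea, not the authors' implementation: it assumes discrete class predictions from each base classifier on a held-out validation set, and the names `mrmr_prune` and `n_keep` are illustrative.

```python
# Illustrative MRMR-style ensemble pruning sketch (not the paper's exact algorithm).
import numpy as np
from sklearn.metrics import mutual_info_score

def mrmr_prune(predictions, labels, n_keep):
    """Greedily keep n_keep classifiers with high relevance to the true
    labels and low redundancy with the classifiers already selected.

    predictions: array of shape (n_classifiers, n_samples); each row holds
                 one base classifier's predicted labels on validation data.
    labels:      array of shape (n_samples,) with the ground-truth labels.
    """
    n_clf = predictions.shape[0]
    # Relevance: mutual information between each classifier's predictions
    # and the target labels (higher = more informative).
    relevance = np.array([mutual_info_score(labels, predictions[i])
                          for i in range(n_clf)])
    selected = [int(np.argmax(relevance))]      # start with the most relevant member
    remaining = set(range(n_clf)) - set(selected)

    while len(selected) < n_keep and remaining:
        best, best_score = None, -np.inf
        for j in remaining:
            # Redundancy: average mutual information with already chosen members.
            redundancy = np.mean([mutual_info_score(predictions[j], predictions[s])
                                  for s in selected])
            score = relevance[j] - redundancy   # MRMR criterion (difference form)
            if score > best_score:
                best, best_score = j, score
        selected.append(best)
        remaining.remove(best)
    return selected                             # indices of the retained classifiers
```

The pruned ensemble would then combine only the retained members (e.g., by majority vote); the difference-form criterion is one common MRMR variant, and the paper may weight relevance and redundancy differently.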
