Image-based gender estimation from body and face across distances

Gonzalez-Sosa, Ester; Dantcheva, Antitza; Vera-Rodriguez, Ruben; Dugelay, Jean-Luc; Bremond, Francois; Fierrez, Julian
ICPR 2016, 23rd International Conference on Pattern Recognition, December 4-8, 2016, Cancun, Mexico

Gender estimation has received increased attention due to its use in a number of pertinent security and commercial applications. Automated gender estimation algorithms are mainly based on extracting representative features from face images. In this work we study gender estimation based on information deduced jointly from face and body, extracted from single-shot images. The approach addresses challenging settings, such as low-resolution images, as well as settings in which faces are occluded. Specifically, the face-based features include local binary patterns (LBP) and scale-invariant feature transform (SIFT) features, projected into a PCA space. The features of the novel body-based algorithm proposed in this work include continuous shape information extracted from body silhouettes and texture information retained by HOG descriptors. Support Vector Machines (SVMs) are used for classification of both body and face features. We conduct experiments on images extracted from video sequences of the Multi-Biometric Tunnel database, focusing on three distance settings: close, medium and far, ranging from full body exposure (far setting) to head-and-shoulders exposure (close setting). The experiments suggest that while face-based gender estimation performs best in the close-distance setting, body-based gender estimation performs best when a large part of the body is visible. Finally, we present two score-level fusion schemes combining face- and body-based features, which outperform the two individual modalities in most cases.
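The two-modality design described in the abstract (one SVM per modality, combined at score level) can be sketched as follows. This is a minimal illustration with synthetic feature vectors standing in for the real LBP+SIFT face features and silhouette+HOG body features; the feature dimensions, SVM kernel, and fusion weight are all assumptions, not the paper's actual configuration.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Synthetic stand-ins for face features (e.g. LBP+SIFT after PCA) and
# body features (silhouette shape + HOG); dimensions are hypothetical.
n, d_face, d_body = 200, 64, 128
y = rng.integers(0, 2, n)  # binary gender labels
X_face = rng.normal(y[:, None] * 0.8, 1.0, (n, d_face))
X_body = rng.normal(y[:, None] * 0.5, 1.0, (n, d_body))

# One SVM per modality, as in the paper; the RBF kernel is an assumption.
svm_face = SVC(kernel="rbf", probability=True).fit(X_face[:150], y[:150])
svm_body = SVC(kernel="rbf", probability=True).fit(X_body[:150], y[:150])

# Score-level fusion: weighted sum of the per-modality posterior scores.
# In practice the weight could be tuned per distance setting
# (close/medium/far), since each modality dominates at a different range.
w = 0.6  # hypothetical weight favouring the face modality
s_face = svm_face.predict_proba(X_face[150:])[:, 1]
s_body = svm_body.predict_proba(X_body[150:])[:, 1]
fused = w * s_face + (1 - w) * s_body
pred = (fused >= 0.5).astype(int)
accuracy = (pred == y[150:]).mean()
print(f"fused accuracy on held-out samples: {accuracy:.2f}")
```

Fusing at score level rather than feature level keeps the two pipelines independent, so either modality can be dropped (e.g. when the face is occluded) without retraining the other.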


Type: Conference
City: Cancun
Date: 2016-12-04
Department: Digital Security
Eurecom Ref: 5031
Copyright:
© 2016 IEEE. Personal use of this material is permitted. However, permission to reprint/republish this material for advertising or promotional purposes or for creating new collective works for resale or redistribution to servers or lists, or to reuse any copyrighted component of this work in other works must be obtained from the IEEE.

PERMALINK : https://www.eurecom.fr/publication/5031