Automatic extraction of facial interest points based on 2D and 3D data

Erdogmus, Nesli; Dugelay, Jean-Luc
3DIP 2011, Electronic Imaging Conference on 3D Image Processing and Applications, Vol. 7864, January 23-27, 2011, San Francisco, CA, USA

Facial feature points are one of the most important cues for many computer vision applications such as face normalization, registration and model-based human face coding. Hence, automating the extraction of these points would have a wide range of uses. In this paper, we aim to automatically detect a subset of the Facial Definition Parameters (FDPs) defined in MPEG-4 by utilizing both 2D and 3D face data. The main assumption in this work is that the 2D images and the corresponding 3D scans are taken for frontal faces with neutral expressions. This limitation is realistic with respect to our scenario, in which enrollment is done in a controlled environment and the detected FDP points are to be used for the warping and animation of the enrolled faces [1], where the choice of MPEG-4 FDPs is justified. For the extraction of the points, 2D data, 3D data, or both are used, according to the distinctive information they carry in each particular facial region. As a result, a total of 29 interest points are detected. The method is tested on the neutral set of the Bosphorus database, which includes 105 subjects with registered 3D scans and color images.
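As a purely illustrative sketch (not taken from the paper), one common way to exploit the 3D channel on a frontal, neutral scan is to locate the nose tip as the vertex closest to the sensor; the snippet below assumes a NumPy point cloud with the +z axis pointing toward the camera, and all names in it are hypothetical.

import numpy as np

def nose_tip_from_frontal_scan(vertices):
    """Return the vertex with the largest depth value.

    `vertices` is an (N, 3) array from a frontal, neutral 3D face scan with
    the +z axis pointing toward the camera; under that convention the nose
    tip is typically the point closest to the sensor.
    """
    idx = np.argmax(vertices[:, 2])
    return vertices[idx]

# Synthetic example: a coarse face-sized grid with a Gaussian "nose" bump at the origin.
xs, ys = np.meshgrid(np.linspace(-80, 80, 50), np.linspace(-100, 100, 60))
zs = 40.0 * np.exp(-(xs**2 + ys**2) / (2 * 20.0**2))
scan = np.column_stack([xs.ravel(), ys.ravel(), zs.ravel()])
print(nose_tip_from_frontal_scan(scan))  # prints a point near (0, 0, 40)

In practice, such a 3D-only cue would be combined with 2D texture cues (e.g., color or edge information around the eyes and mouth) for regions where shape alone is not distinctive, which is the kind of per-region choice between 2D and 3D data the abstract refers to.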


Type:
Conference
City:
San Francisco
Date:
2011-01-23
Department:
Sécurité numérique
Eurecom Ref:
3327
Copyright:
© 2011 Society of Photo-Optical Instrumentation Engineers.
This paper is published in 3DIP 2011, Electronic Imaging Conference on 3D Image Processing and Applications, Vol 7864, January 23-27, 2011, San Francisco, CA, USA and is made available as an electronic preprint with permission of SPIE. One print or electronic copy may be made for personal use only. Systematic or multiple reproduction, distribution to multiple locations via electronic or other means, duplication of any material in this paper for a fee or for commercial purposes, or modification of the content of the paper are prohibited.

PERMALINK : https://www.eurecom.fr/publication/3327