Affiliations: [a] Department of Electrical and Computer Engineering, North South University, Bangladesh | [b] Department of Medical Engineering, California Institute of Technology, Pasadena, CA, USA
Corresponding author: Mayamin Hamid Raha, Department of Electrical and Computer Engineering, North South University, Bangladesh. E-mail: [email protected].
Abstract: Face recognition is one of the most widely used image analysis applications, and dimensionality reduction is an essential requirement for it. The curse of dimensionality refers to the fact that, as dimensionality increases, sample density decreases exponentially. Dimensionality reduction is the process of reducing the dimensionality of the feature space by obtaining a set of principal features. The purpose of this manuscript is to present a comparative study of Principal Component Analysis (PCA) and Linear Discriminant Analysis (LDA), two of the most popular appearance-based projection methods for face recognition. PCA constructs a lower-dimensional representation of the data that captures as much of its variance as possible, while LDA finds the vectors in the underlying space that best discriminate between classes. The main idea of PCA is to project the high-dimensional input space onto the feature space that exhibits the maximum variance. Traditional LDA obtains its features by maximizing the between-class scatter while minimizing the within-class scatter.
Keywords: Eigenvalues, face recognition, linear discriminant analysis, principal component analysis, supervised learning