Significance of dimensionality reduction in image processing

VB Shereena, JM David - Signal & Image Processing: An …, 2015 - academia.edu
Abstract
The aim of this paper is to present a comparative study of two linear dimension reduction methods, namely PCA (Principal Component Analysis) and LDA (Linear Discriminant Analysis). The main idea of PCA is to project the high-dimensional input space onto a feature space in which the maximal variance is displayed. Feature extraction in traditional LDA is obtained by maximizing the distance between classes while minimizing the distance within classes. PCA finds the axes with maximum variance for the whole data set, whereas LDA tries to find the axes that give the best class separability. A neural network is trained on the reduced feature set (obtained using PCA or LDA) of the images in the database, using the back-propagation algorithm, for fast retrieval of images from the database. The proposed method is evaluated on a general image database using MATLAB. The performance of these systems has been evaluated by precision and recall measures. Experimental results show that PCA gives better performance, in terms of higher precision and recall values, with lower computational complexity than LDA.
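The PCA-versus-LDA contrast described in the abstract (maximum total variance versus maximum class separability) can be sketched in a few lines of NumPy. This is a minimal illustrative example, not the authors' MATLAB implementation: the synthetic two-class data, the 3-D feature dimension, and the `separation` score are all assumptions made for demonstration.

```python
import numpy as np

# Toy stand-in for the paper's image feature vectors: two classes in 3-D.
rng = np.random.default_rng(0)
X0 = rng.normal(loc=[0.0, 0.0, 0.0], scale=1.0, size=(50, 3))
X1 = rng.normal(loc=[4.0, 4.0, 0.0], scale=1.0, size=(50, 3))
X = np.vstack([X0, X1])
y = np.array([0] * 50 + [1] * 50)

# --- PCA: axis of maximum variance over the WHOLE data set ---
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
pca_axis = Vt[0]              # first principal component (unit vector)
X_pca = Xc @ pca_axis         # 1-D projection

# --- LDA (two-class Fisher discriminant): axis of best class separability ---
m0, m1 = X0.mean(axis=0), X1.mean(axis=0)
# Within-class scatter matrix Sw (sum of per-class scatter).
Sw = (np.cov(X0, rowvar=False) * (len(X0) - 1)
      + np.cov(X1, rowvar=False) * (len(X1) - 1))
lda_axis = np.linalg.solve(Sw, m1 - m0)   # w ∝ Sw^{-1} (m1 - m0)
lda_axis /= np.linalg.norm(lda_axis)
X_lda = Xc @ lda_axis

def separation(z, labels):
    """Distance between projected class means in pooled-std units."""
    z0, z1 = z[labels == 0], z[labels == 1]
    pooled = np.sqrt((z0.var() + z1.var()) / 2)
    return abs(z0.mean() - z1.mean()) / pooled

print(f"PCA separation: {separation(X_pca, y):.2f}")
print(f"LDA separation: {separation(X_lda, y):.2f}")
```

Because the class means differ strongly here, PCA's top variance axis happens to align with the class-mean direction, so both projections separate the classes; LDA targets that separation directly, which is why the paper's comparison is between methods with different objectives rather than different outputs.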