Linear discriminant analysis for signal processing problems

Abstract
Linear discriminant analysis (LDA) and principal components analysis (PCA) are two common techniques used for classification and dimensionality reduction. Both rely on a linear transformation that can be implemented in either a class-dependent or a class-independent fashion. PCA is a feature extraction technique in which the data in the input space are transformed to a feature space where the features are decorrelated. In contrast, the optimization criterion for LDA maximizes class separability. We quantify the efficacy of these two algorithms along with two other techniques, support vector machines (SVM) and independent component analysis (ICA). The problem of classifying forestry images based on their scenic beauty is considered. On a standard evaluation task consisting of 478 training images and 159 test images, class-dependent LDA produced a 35.22% misclassification rate, which is significantly better than the 43.3% rate obtained using PCA and is on par with the performance of ICA and SVM.
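The paper itself provides no code; as a rough sketch of the class-independent LDA variant referred to in the abstract, the NumPy snippet below forms the within-class and between-class scatter matrices and projects the data onto the leading eigenvectors of S_w^{-1} S_b. The function name lda_transform and all variable names are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def lda_transform(X, y, n_components):
    """Class-independent LDA sketch (illustrative, not the paper's code).

    X: (n_samples, n_features) data matrix, y: class labels.
    Projects X onto the top eigenvectors of Sw^{-1} Sb.
    """
    classes = np.unique(y)
    mean_total = X.mean(axis=0)
    n_features = X.shape[1]

    Sw = np.zeros((n_features, n_features))  # within-class scatter
    Sb = np.zeros((n_features, n_features))  # between-class scatter
    for c in classes:
        Xc = X[y == c]
        mean_c = Xc.mean(axis=0)
        Sw += (Xc - mean_c).T @ (Xc - mean_c)
        diff = (mean_c - mean_total).reshape(-1, 1)
        Sb += len(Xc) * (diff @ diff.T)

    # Solve the generalized eigenvalue problem Sb w = lambda Sw w
    # (via the pseudo-inverse for numerical robustness).
    eigvals, eigvecs = np.linalg.eig(np.linalg.pinv(Sw) @ Sb)
    order = np.argsort(eigvals.real)[::-1]
    W = eigvecs[:, order[:n_components]].real
    return X @ W

# Toy usage: two Gaussian classes in 3-D reduced to one discriminant axis.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, (50, 3)), rng.normal(2.0, 1.0, (50, 3))])
y = np.array([0] * 50 + [1] * 50)
Z = lda_transform(X, y, n_components=1)
print(Z.shape)  # (100, 1)
```

A class-dependent variant would instead compute a separate within-class scatter matrix, and hence a separate transform, for each class before classification.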