Orthogonal vs. Uncorrelated Least Squares Discriminant Analysis for Feature Extraction
Authors | Feiping Nie, Shiming Xiang, Yun Liu, Chenping Hou, Changshui Zhang |
Journal/Conference Name | Pattern Recognition Letters |
Paper Category | Computer Science |
Paper Abstract | In this paper, a new discriminant analysis for feature extraction is derived from the perspective of least squares regression. To obtain strong discriminative power between classes, all the data points in each class are regressed to a single target vector, and the basic task is to find a transformation matrix that minimizes the squared regression error. To this end, two least squares discriminant analysis methods are developed, one under an orthogonal constraint and one under an uncorrelated constraint. We show that orthogonal least squares discriminant analysis extends null space linear discriminant analysis, while uncorrelated least squares discriminant analysis is exactly equivalent to traditional linear discriminant analysis. Comparative experiments show that the orthogonal variant is preferable for real-world applications. |
Date of publication | 2012 |
Code Programming Language | MATLAB |
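The abstract describes regressing all samples of each class onto a single target vector and constraining the transformation matrix to be orthogonal. The paper's own code is MATLAB; purely as an illustrative sketch of the underlying idea, the NumPy snippet below regresses centered data onto class-indicator targets via ridge-regularized least squares and then orthonormalizes the resulting projection with a QR step. The toy data, variable names, and the QR orthogonalization are all assumptions for illustration; the paper derives an exact solution under the orthogonal constraint, which this does not reproduce.

```python
import numpy as np

rng = np.random.default_rng(0)

# toy data: n_per samples per class, d features, c classes (invented for illustration)
n_per, d, c = 20, 5, 3
X = np.vstack([rng.normal(loc=3 * k, size=(n_per, d)) for k in range(c)])
y = np.repeat(np.arange(c), n_per)

# class-indicator target matrix H: every sample of a class shares one target vector,
# mirroring the abstract's "regress each class to a single vector"
H = np.eye(c)[y]                     # shape (n, c)

# center the data (a typical preprocessing step)
Xc = X - X.mean(axis=0)

# ridge-regularized least squares: min_W ||Xc W - H||_F^2 + lam ||W||_F^2
lam = 1e-3
W = np.linalg.solve(Xc.T @ Xc + lam * np.eye(d), Xc.T @ H)   # shape (d, c)

# impose the orthogonal constraint approximately by orthonormalizing the columns
# of W (QR); a stand-in for the paper's exact constrained solution
Q, _ = np.linalg.qr(W)

# extracted low-dimensional features
Z = Xc @ Q                           # shape (n, c)
print(np.allclose(Q.T @ Q, np.eye(c)))
```

Because the columns of Q are orthonormal, the extracted features are projections onto an orthogonal basis, which is the property the orthogonal variant of the method enforces.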