Semi-Supervised Dimension Reduction Using Trace Ratio Criterion


Disclaimer: “The code links provided for this paper are external links. Science Nest takes no responsibility for the accuracy, legality, or content of these links. By downloading the code(s), you agree to comply with the terms of use set out by the author(s) of the code(s).”

Please contact us if any link on this page is broken.

Authors: Yi Huang, Dong Xu, Feiping Nie
Journal/Conference Name: IEEE Transactions on Neural Networks and Learning Systems (TNNLS)
Paper Abstract: In this brief, we address the trace ratio (TR) problem for semi-supervised dimension reduction. We first reformulate the objective function of the recent work semi-supervised discriminant analysis (SDA) in a TR form. We also observe that in SDA the low-dimensional data representation F is constrained to be in the linear subspace spanned by the training data matrix X (i.e., F=X^{T}W). In order to relax this hard constraint, we introduce a flexible regularizer \Vert F-X^{T}W\Vert^{2} which models the regression residual into the reformulated objective function. With such relaxation, our method referred to as TR based flexible SDA (TR-FSDA) can better cope with data sampled from a certain type of nonlinear manifold that is somewhat close to a linear subspace. In order to address the non-trivial optimization problem in TR-FSDA, we further develop an iterative algorithm to simultaneously solve for the low-dimensional data representation F and the projection matrix W. Moreover, we theoretically prove that our iterative algorithm converges to the optimum based on the Newton-Raphson method. The experiments on two face databases, one shape image database and one webpage database demonstrate that TR-FSDA outperforms the existing semi-supervised dimension reduction methods.
Date of publication: 2012
Code Programming Language: MATLAB
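
The abstract above states that TR-FSDA is optimized by an iterative, Newton-Raphson-based procedure for a trace ratio objective, but the listing does not reproduce the algorithmic details. As a rough illustration only, the following is a minimal MATLAB sketch of a generic Newton-Raphson-style trace ratio iteration of the kind the abstract refers to. It is not the authors' released code; the input matrices A and B are assumed stand-ins for the between-class-like and regularized within-class-like matrices that TR-FSDA would construct from the labeled and unlabeled data.

% Minimal sketch of a Newton-Raphson-style trace ratio iteration:
%   maximize  trace(W'*A*W) / trace(W'*B*W)   subject to  W'*W = I.
% A and B are symmetric placeholders (B assumed positive definite); they are
% NOT taken from the paper but stand in for its scatter-like matrices.
function W = trace_ratio_sketch(A, B, d, maxIter, tol)
    if nargin < 4, maxIter = 50; end
    if nargin < 5, tol = 1e-8;  end
    n = size(A, 1);
    W = orth(randn(n, d));                    % random orthonormal start
    lambda = trace(W'*A*W) / trace(W'*B*W);   % current trace ratio value
    for t = 1:maxIter
        % Subproblem: maximize trace(W'*(A - lambda*B)*W) over W'*W = I,
        % solved by the top-d eigenvectors of (A - lambda*B).
        M = A - lambda * B;
        M = (M + M') / 2;                     % enforce symmetry numerically
        [V, D] = eig(M);
        [~, idx] = sort(diag(D), 'descend');
        W = V(:, idx(1:d));
        lambdaNew = trace(W'*A*W) / trace(W'*B*W);
        if abs(lambdaNew - lambda) < tol      % ratio has stopped improving
            lambda = lambdaNew;
            break;
        end
        lambda = lambdaNew;
    end
end

In a typical use, A and B are symmetric n-by-n matrices built from the training data, d is the target dimensionality, and the returned W collects the d projection directions. In this classic scheme the ratio trace(W'*A*W)/trace(W'*B*W) is non-decreasing across iterations, which is the usual basis for convergence arguments of this type.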

Copyright Researcher II 2021