LSDT: Latent Sparse Domain Transfer Learning for Visual Adaptation


Authors: Lei Zhang, Wangmeng Zuo, and David Zhang
Journal/Conference Name: IEEE Transactions on Image Processing
Paper Abstract: We propose a novel reconstruction-based transfer learning method called latent sparse domain transfer (LSDT) for domain adaptation and visual categorization of heterogeneous data. To handle cross-domain distribution mismatch, we advocate reconstructing the target domain data from the combined source and target domain data points via ℓ1-norm sparse coding. Furthermore, we propose a joint learning model that simultaneously optimizes the sparse coding and the optimal subspace representation. In addition, we generalize the proposed LSDT model into a kernel-based linear/nonlinear basis transformation learning framework for tackling nonlinear subspace shifts in a reproducing kernel Hilbert space. The proposed methods have three advantages: 1) the latent space and the reconstruction are jointly learned in pursuit of an optimal subspace transfer; 2) following the theory of sparse subspace clustering, a few valuable source and target data points are selected to reconstruct the target data, with noise (outliers) from the source domain removed during domain adaptation, so that robustness is guaranteed; and 3) a nonlinear projection of the latent space with a kernel is easily generalized to deal with highly nonlinear domain shift (e.g., face poses). Extensive experiments on several benchmark vision data sets demonstrate that the proposed approaches outperform other state-of-the-art representation-based domain adaptation methods.
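The reconstruction step described in the abstract, expressing each target sample as an ℓ1-sparse combination of the combined source and target samples, can be sketched as follows. This is an illustrative NumPy sketch (Python rather than the authors' MATLAB release) using a plain ISTA solver; the function name `sparse_reconstruct` and all parameter values are assumptions for illustration, not the paper's LSDT implementation, which additionally learns the latent subspace jointly.

```python
import numpy as np

def sparse_reconstruct(X_src, X_tgt, lam=0.01, n_iter=2000):
    """Sparse-code each target column over the combined dictionary
    D = [X_src, X_tgt] by solving min_Z 0.5*||D Z - X_tgt||_F^2 + lam*||Z||_1
    with ISTA (proximal gradient). Columns of X_* are samples."""
    D = np.hstack([X_src, X_tgt])          # combined source+target dictionary
    L = np.linalg.norm(D, 2) ** 2          # Lipschitz const. of the gradient (sigma_max^2)
    Z = np.zeros((D.shape[1], X_tgt.shape[1]))
    for _ in range(n_iter):
        G = Z - (D.T @ (D @ Z - X_tgt)) / L                     # gradient step
        Z = np.sign(G) * np.maximum(np.abs(G) - lam / L, 0.0)   # soft-threshold (l1 prox)
    return Z

# Usage: reconstruct 3 target samples from 8 source samples in a 5-D feature space.
rng = np.random.default_rng(0)
X_src = rng.standard_normal((5, 8))
X_tgt = rng.standard_normal((5, 3))
Z = sparse_reconstruct(X_src, X_tgt)       # (8+3) x 3 sparse coefficient matrix
```

Rows of `Z` with large magnitude indicate which source/target points are "valuable" for reconstructing the target domain; near-zero rows correspond to points pruned by the ℓ1 penalty.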
Date of publication: 2016
Code Programming Language: MATLAB

Copyright Researcher 2021