Probabilistic labeled Semi-supervised SVM


Disclaimer: “The provided code links for this paper are external links. Science Nest has no responsibility for the accuracy, legality, or content of these links. By downloading the code(s), you agree to comply with the terms of use set out by the author(s) of the code(s).”

Please contact us here in case of a broken link.

Authors Mingjie Qian, Feiping Nie, Changshui Zhang
Journal/Conference Name Workshop on Optimization Based Methods for Emerging Data Mining Problems in conjunction with IEEE International Conference on Data Mining (ICDM Workshop)
Paper Category
Paper Abstract Semi-supervised learning has received increasing attention and is widely used in many fields, such as data mining, information retrieval, and knowledge management, because it can exploit both labeled and unlabeled data. Laplacian SVM (LapSVM) is a classical method whose effectiveness has been validated by a large number of experiments. However, LapSVM is sensitive to the labeled data and suffers from cubic computational complexity, which limits its application in large-scale scenarios. In this paper, we propose a multi-class method called Probabilistic labeled Semi-supervised SVM (PLSVM), in which the optimal decision surface is learned from probabilistic labels of all the training data, both labeled and unlabeled. We then propose a kernel-version dual coordinate descent method to efficiently solve the dual problems of PLSVM and reduce its memory requirements. Experiments on synthetic data and several real-world benchmark datasets show that PLSVM is less sensitive to the labeling and outperforms traditional methods such as SVM, LapSVM, and Transductive SVM (TSVM).
Date of publication 2009
Code Programming Language MATLAB
Comment
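For orientation only, the sketch below illustrates the general idea of kernel dual coordinate descent that the abstract refers to, applied to a plain binary hinge-loss SVM. It is not the authors' PLSVM: the multi-class formulation and the probabilistic labels on all training points are omitted, every function name and parameter here is an illustrative assumption, and the actual implementation linked from this page is in MATLAB. In the standard dual, each variable alpha_i is box-constrained to [0, C] and can be updated in closed form one at a time, which avoids forming or inverting large matrices.

import numpy as np

def rbf_kernel(X, gamma=0.5):
    # Pairwise RBF kernel matrix K[i, j] = exp(-gamma * ||x_i - x_j||^2).
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * d2)

def dual_coordinate_descent_svm(K, y, C=1.0, n_epochs=50, tol=1e-6):
    # Kernel dual coordinate descent for a binary L1-SVM (hinge loss).
    # Solves: min_a 0.5 * sum_ij a_i a_j y_i y_j K_ij - sum_i a_i,  0 <= a_i <= C,
    # by updating one dual variable at a time with a clipped Newton step.
    n = len(y)
    alpha = np.zeros(n)
    f = np.zeros(n)                       # f[i] = sum_j alpha_j * y_j * K[j, i]
    for _ in range(n_epochs):
        max_change = 0.0
        for i in np.random.permutation(n):
            if K[i, i] <= 0:
                continue
            grad = y[i] * f[i] - 1.0      # partial derivative w.r.t. alpha_i
            new_alpha = np.clip(alpha[i] - grad / K[i, i], 0.0, C)
            delta = new_alpha - alpha[i]
            if delta != 0.0:
                f += delta * y[i] * K[i]  # keep cached decision values in sync
                alpha[i] = new_alpha
                max_change = max(max_change, abs(delta))
        if max_change < tol:
            break
    return alpha

def decision_function(K_test_train, alpha, y):
    # sum_j alpha_j * y_j * K(x_j, x_test); its sign is the predicted label.
    return K_test_train @ (alpha * y)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(-1, 0.5, (40, 2)), rng.normal(1, 0.5, (40, 2))])
    y = np.hstack([-np.ones(40), np.ones(40)])
    K = rbf_kernel(X)
    alpha = dual_coordinate_descent_svm(K, y, C=1.0)
    pred = np.sign(decision_function(K, alpha, y))
    print("training accuracy:", np.mean(pred == y))

As described in the abstract, PLSVM departs from this standard setup by fitting the decision surface to probabilistic labels assigned to all training points, labeled and unlabeled alike, rather than to hard labels on the labeled subset only; the coordinate-wise dual updates are what keep the memory footprint modest.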

Copyright Researcher II 2021