New Primal SVM Solver with Linear Computational Cost for Big Data Classifications


Disclaimer: "The provided code links for this paper are external links. Science Nest has no responsibility for the accuracy, legality, or content of these links. Also, by downloading the code(s), you agree to comply with the terms of use set out by the author(s) of the code(s)."


Authors: Feiping Nie, Yizhen Huang, Xiaoqian Wang, Heng Huang
Journal/Conference Name: The 31st International Conference on Machine Learning (ICML)
Paper Category:
Paper Abstract: The Support Vector Machine (SVM) is among the most popular classification techniques in machine learning, so designing fast primal SVM algorithms for large-scale datasets has been a hot topic in recent years. This paper presents a new L2-norm regularized primal SVM solver using Augmented Lagrange Multipliers, with linear computational cost for Lp-norm loss functions. The most computationally intensive steps of the proposed algorithm (those that determine the algorithmic complexity) are purely and simply matrix-by-vector multiplications, which can be easily parallelized on a multi-core server. We implement and integrate our algorithm into the interfaces and framework of the well-known LibLinear software toolbox. Experiments show that our algorithm has stable performance and is on average faster than state-of-the-art solvers such as SVMperf, Pegasos, and LibLinear with the TRON, PCD, and DCD algorithms. (An illustrative sketch of the per-iteration cost structure is given below.)
Date of Publication: 2014
Code Programming Language: MATLAB
Comment:
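
For orientation, below is a minimal MATLAB sketch (MATLAB being the listed code language) of an L2-norm regularized primal SVM with a squared hinge loss, trained by plain gradient descent on synthetic data. This is not the authors' Augmented Lagrange Multiplier solver; it only illustrates the cost structure highlighted in the abstract: each pass is dominated by two matrix-by-vector products, X*w and X'*v, and therefore costs on the order of n*d per iteration. All variable names and parameter values (lambda, step, iters, and the synthetic X and y) are illustrative assumptions, not values from the paper.

% Illustrative sketch only: gradient descent on the L2-regularized
% squared-hinge primal SVM objective
%   (lambda/2)*||w||^2 + (1/(2n)) * sum_i max(0, 1 - y_i * x_i' * w)^2
rng(0);
n = 1000; d = 50;                         % synthetic problem size (assumed)
X = randn(n, d);
w_true = randn(d, 1);
y = sign(X * w_true + 0.1 * randn(n, 1)); % labels in {-1, +1}

lambda = 1e-2;                            % L2 regularization strength (assumed)
step   = 0.1;                             % fixed step size, for illustration only
iters  = 300;

w = zeros(d, 1);
for t = 1:iters
    margin = y .* (X * w);                % first matrix-by-vector product: X*w
    viol   = max(0, 1 - margin);          % squared-hinge violations
    grad   = lambda * w - (X' * (y .* viol)) / n;  % second product: X'*v
    w = w - step * grad;                  % gradient step
end

fprintf('training accuracy: %.3f\n', mean(sign(X * w) == y));

As in the abstract's description of the ALM solver, the dominant work here is matrix-by-vector multiplication, which is linear in the data size and straightforward to parallelize across cores.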
