| Title | A Simple Maximum Gain Algorithm for Support Vector Regression. Proceedings of IWANN (International Work-Conference on Artificial Neural Networks). LNCS (Lecture Notes in Computer Science) 5517, pp. 73-80. Springer-Verlag, Germany. |
|---|---|
| Authors | Barbero, A., & Dorronsoro, J.R. |
| Year | 2009 |
| Link | http://link.springer.com/chapter/10.1007/978-3-642-02478-8_10 |
Abstract
Shevade et al.'s Modification 2 is one of the most widely used algorithms for building Support Vector Regression (SVR) models. It selects as a size-2 working set the index pair giving the maximum KKT violation and combines this with the updating heuristics of Smola and Schölkopf, enforcing at each training iteration the condition α_iα_i^* = 0. In this work we present an alternative, much simpler procedure that selects the updating indices as those giving a maximum gain in the SVR dual function. While we do not try to enforce the α_iα_i^* = 0 condition, we show that it holds at every iteration provided it holds for the starting multipliers.
We show numerically that the proposed procedure requires essentially the same number of iterations as Modification 2, and thus has the same time performance, while being much simpler to code.
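To make the "maximum gain" idea concrete, the following is a minimal sketch, not the paper's exact procedure. It works with the combined multipliers β_i = α_i − α_i^* (so α_iα_i^* = 0 holds automatically when each |β_i| ≤ C), uses a linear kernel, and for simplicity omits the ε-insensitive penalty term from the step computation. At each iteration it scans all index pairs (i, j), computes the box-clipped step that keeps Σβ_i = 0, and applies the pair whose step yields the largest increase of the smooth part of the dual. All names and parameter choices here are illustrative assumptions.

```python
import numpy as np

def svr_max_gain_sketch(X, y, C=1.0, iters=50):
    """Gain-based pairwise updates for a simplified SVR dual (sketch).

    beta_i = alpha_i - alpha_i^*; the dual's smooth part is
    D(beta) = y @ beta - 0.5 * beta @ K @ beta, subject to
    sum(beta) == 0 and |beta_i| <= C.  The epsilon term of the
    true SVR dual is deliberately ignored in this illustration.
    """
    n = len(y)
    K = X @ X.T                       # linear kernel (assumption)
    beta = np.zeros(n)
    for _ in range(iters):
        g = y - K @ beta              # gradient of the smooth dual part
        best, pair, delta = 0.0, None, 0.0
        for i in range(n):
            for j in range(n):
                if i == j:
                    continue
                eta = K[i, i] + K[j, j] - 2.0 * K[i, j]
                if eta <= 1e-12:      # skip degenerate directions
                    continue
                d = (g[i] - g[j]) / eta            # unclipped optimal step
                # clip so both updated multipliers stay in [-C, C]
                d = np.clip(d, -C - beta[i], C - beta[i])
                d = np.clip(d, beta[j] - C, beta[j] + C)
                gain = d * (g[i] - g[j]) - 0.5 * eta * d * d
                if gain > best:
                    best, pair, delta = gain, (i, j), d
        if pair is None or best < 1e-10:
            break                     # no pair yields further gain
        i, j = pair
        beta[i] += delta              # the paired update keeps
        beta[j] -= delta              # sum(beta) == 0 exactly
    return beta
```

The quadratic gain formula d·(g_i − g_j) − η·d²/2 is the exact change of the smooth dual objective along the feasible direction (e_i − e_j), which is why every accepted step increases it; the real algorithm additionally accounts for the ε term and a more refined treatment of the box constraints.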