Indexed Optimization: Learning Ramp-Loss SVM in Sublinear Time
Main Authors:
Other Authors:
Format: Others
Language: en_US
Published: 2012
Online Access: http://ndltd.ncl.edu.tw/handle/73543025733430311023
Summary: Master's thesis, National Taiwan University, Graduate Institute of Computer Science and Information Engineering, academic year 100 (2011-2012). Multidimensional indexing has frequently been used for sublinear-time nearest-neighbor search in a variety of applications. In this paper, we demonstrate how this technique can be integrated into learning problems with sublinear sparsity, such as the ramp-loss SVM. We propose an outlier-free convex relaxation of the ramp-loss SVM and an indexed optimization algorithm that solves large-scale problems in sublinear time, even when the data cannot fit into memory. We compare our algorithm with state-of-the-art linear hinge-loss and ramp-loss solvers under both sufficient- and limited-memory conditions; our algorithm not only learns several times faster but also achieves more accurate results on noisy, large-scale datasets.
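The record does not reproduce the thesis's algorithm, but two standard ingredients behind the idea can be sketched: the ramp loss (a hinge loss clipped below a margin level s, so gross outliers add only a bounded penalty and carry no gradient, leaving a small "active" set of points) and a multidimensional index that answers neighbor queries in sublinear time, so the active set can be retrieved without scanning all data. The Python sketch below is a background illustration under those assumptions, using scikit-learn's generic KDTree as a stand-in for the index; it is not the thesis's outlier-free convex relaxation or its indexed optimizer.

```python
# Background sketch only (not the thesis's algorithm): the ramp loss as a
# clipped hinge, the sparsity of its (sub)gradient support, and a k-d tree
# as a toy stand-in for sublinear-time candidate retrieval.
import numpy as np
from sklearn.neighbors import KDTree

def hinge(margin, a=1.0):
    """Hinge loss H_a(m) = max(0, a - m)."""
    return np.maximum(0.0, a - margin)

def ramp(margin, s=-1.0):
    """Ramp loss R_s(m) = H_1(m) - H_s(m): a hinge clipped at 1 - s, so badly
    misclassified points (m < s) incur a constant, bounded penalty."""
    return hinge(margin, 1.0) - hinge(margin, s)

rng = np.random.default_rng(0)
X = rng.normal(size=(5000, 20))
y = np.sign(X[:, 0] + 0.3 * rng.normal(size=5000))
w = np.zeros(20)
w[0] = 5.0                      # a rough separating direction (toy model)

m = y * (X @ w)                 # functional margins y_i * <w, x_i>
s = -1.0
active = (m > s) & (m < 1.0)    # only these points carry a ramp-loss (sub)gradient
print(f"active fraction: {active.mean():.1%}")

# A k-d tree answers k-NN queries without a full scan (roughly logarithmic per
# query in low dimension), which is the kind of index the abstract refers to.
tree = KDTree(X)
query = np.zeros((1, 20))       # a point on the toy decision boundary w.x = 0
dist, idx = tree.query(query, k=100)
print("retrieved candidate points:", idx.shape[1])
```

The point of the sketch is the contrast in cost: a full pass over the data touches every row to find the active points, whereas an index structure can surface candidates near the decision boundary directly, which is what makes sublinear-time iterations plausible when the active set is small.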