Indexed Optimization: Learning Ramp-Loss SVM in Sublinear Time

Master's thesis === National Taiwan University === Graduate Institute of Computer Science and Information Engineering === 100 === Multidimensional indexing has frequently been used for sublinear-time nearest-neighbor search in various applications. In this paper, we demonstrate how this technique can be integrated into learning problems with sublinear sparsity, such as the ramp-loss SVM. We propose an outlier-free convex relaxation for the ramp-loss SVM and an indexed optimization algorithm that solves large-scale problems in sublinear time, even when the data cannot fit into memory. We compare our algorithm with state-of-the-art linear hinge-loss and ramp-loss solvers under both sufficient- and limited-memory conditions, where our algorithm not only learns several times faster but also achieves more accurate results on noisy and large-scale datasets.
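The ramp loss referred to in the abstract is the hinge loss clipped at a constant, which bounds the penalty any single outlier can contribute. A minimal sketch of the idea (the cap parameter `s` and the exact functional form here are illustrative assumptions, not taken from the thesis itself):

```python
import numpy as np

def hinge_loss(margin):
    # Standard hinge loss: grows without bound as the margin
    # becomes more negative, so outliers dominate the objective.
    return np.maximum(0.0, 1.0 - margin)

def ramp_loss(margin, s=-1.0):
    # Clipped hinge ("ramp"): identical to hinge for margin > s,
    # but capped at 1 - s, so badly misclassified outliers
    # contribute a bounded amount (here at most 2).
    return np.minimum(1.0 - s, np.maximum(0.0, 1.0 - margin))

margins = np.array([2.0, 0.5, -3.0])
print(hinge_loss(margins))  # outlier at margin -3 costs 4
print(ramp_loss(margins))   # same outlier capped at 2
```

Because the ramp loss is non-convex, solvers typically work with a relaxation; the thesis proposes an outlier-free convex relaxation combined with multidimensional indexing to keep per-iteration cost sublinear in the dataset size.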


Bibliographic Details
Main Authors: EN-HSU YEN, 嚴恩勗
Other Authors: 林守德
Format: Others
Language: en_US
Published: 2012
Online Access: http://ndltd.ncl.edu.tw/handle/73543025733430311023
id ndltd-TW-100NTU05392071
record_format oai_dc
spelling ndltd-TW-100NTU053920712015-10-13T21:50:17Z http://ndltd.ncl.edu.tw/handle/73543025733430311023 Indexed Optimization: Learning Ramp-Loss SVM in Sublinear Time 利用索引技術達到低於線性時間的支持向量機訓練 EN-HSU YEN 嚴恩勗 Master's thesis National Taiwan University Graduate Institute of Computer Science and Information Engineering 100 Multidimensional indexing has frequently been used for sublinear-time nearest-neighbor search in various applications. In this paper, we demonstrate how this technique can be integrated into learning problems with sublinear sparsity, such as the ramp-loss SVM. We propose an outlier-free convex relaxation for the ramp-loss SVM and an indexed optimization algorithm that solves large-scale problems in sublinear time, even when the data cannot fit into memory. We compare our algorithm with state-of-the-art linear hinge-loss and ramp-loss solvers under both sufficient- and limited-memory conditions, where our algorithm not only learns several times faster but also achieves more accurate results on noisy and large-scale datasets. 林守德 2012 Academic thesis ; thesis 32 en_US
collection NDLTD
language en_US
format Others
sources NDLTD
description Master's thesis === National Taiwan University === Graduate Institute of Computer Science and Information Engineering === 100 === Multidimensional indexing has frequently been used for sublinear-time nearest-neighbor search in various applications. In this paper, we demonstrate how this technique can be integrated into learning problems with sublinear sparsity, such as the ramp-loss SVM. We propose an outlier-free convex relaxation for the ramp-loss SVM and an indexed optimization algorithm that solves large-scale problems in sublinear time, even when the data cannot fit into memory. We compare our algorithm with state-of-the-art linear hinge-loss and ramp-loss solvers under both sufficient- and limited-memory conditions, where our algorithm not only learns several times faster but also achieves more accurate results on noisy and large-scale datasets.
author2 林守德
author_facet 林守德
EN-HSU YEN
嚴恩勗
author EN-HSU YEN
嚴恩勗
spellingShingle EN-HSU YEN
嚴恩勗
Indexed Optimization: Learning Ramp-Loss SVM in Sublinear Time
author_sort EN-HSU YEN
title Indexed Optimization: Learning Ramp-Loss SVM in Sublinear Time
title_short Indexed Optimization: Learning Ramp-Loss SVM in Sublinear Time
title_full Indexed Optimization: Learning Ramp-Loss SVM in Sublinear Time
title_fullStr Indexed Optimization: Learning Ramp-Loss SVM in Sublinear Time
title_full_unstemmed Indexed Optimization: Learning Ramp-Loss SVM in Sublinear Time
title_sort indexed optimization: learning ramp-loss svm in sublinear time
publishDate 2012
url http://ndltd.ncl.edu.tw/handle/73543025733430311023
work_keys_str_mv AT enhsuyen indexedoptimizationlearningramplosssvminsublineartime
AT yánēnxù indexedoptimizationlearningramplosssvminsublineartime
AT enhsuyen lìyòngsuǒyǐnjìshùdádàodīyúxiànxìngshíjiāndezhīchíxiàngliàngjīxùnliàn
AT yánēnxù lìyòngsuǒyǐnjìshùdádàodīyúxiànxìngshíjiāndezhīchíxiàngliàngjīxùnliàn
_version_ 1718068875944263680