Distributed Consensus Reduced Support Vector Machine

Master's === National Chiao Tung University === Master's Program of Mathematical Modeling and Scientific Computing, Department of Applied Mathematics === 107 === Machine learning now performs astonishingly well in many different fields, and the more data we have, the better our machine learning methods tend to perform. However, in some cases the data owners may not want to share the information they hold, because it raises privacy concerns. In other cases we encounter a dataset so large that it is difficult to store on a single machine. To deal with these two problems, we propose the distributed consensus reduced support vector machine (DCRSVM) for binary classification. Imagine that we have many local working units and a central master, where each working unit owns its own data. The DCRSVM has two merits. First, it preserves data privacy: local data is never disclosed to the central master. Second, when a dataset is too large to store on a single server, the central master can still derive a good machine learning model even though the data is stored only on local devices. Our method solves both problems and produces competitive results.
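The abstract describes the architecture only at a high level: local working units train on their own data and a central master arrives at a consensus model without ever seeing the raw data. Below is a minimal Python sketch of that consensus idea, assuming a simple parameter-averaging scheme over local hinge-loss (linear SVM) updates. The thesis's actual DCRSVM update rules and its reduced-kernel construction are not given in this record, so the function names, learning rates, and averaging step here are illustrative only.

```python
import numpy as np

def local_update(w, X, y, lam=0.1, lr=0.01, steps=10):
    """One round of local training: subgradient descent on this unit's
    L2-regularized hinge loss. Only the (X, y) owned by this unit is used."""
    for _ in range(steps):
        margins = y * (X @ w)                 # labels y are in {-1, +1}
        active = margins < 1                  # points violating the margin
        grad = lam * w - (y[active, None] * X[active]).sum(axis=0) / len(y)
        w = w - lr * grad
    return w

def consensus_round(w_global, units):
    """Master broadcasts the consensus model, each working unit refines it
    on its private data, and the master averages the returned parameters.
    Only weight vectors travel over the network, never raw data."""
    local_models = [local_update(w_global.copy(), X, y) for X, y in units]
    return np.mean(local_models, axis=0)

# Toy demonstration: synthetic binary data split across three working units.
rng = np.random.default_rng(0)
true_w = np.array([1.0, -2.0, 0.5, 0.0, 1.5])
units = []
for _ in range(3):
    X = rng.normal(size=(200, 5))
    y = np.sign(X @ true_w + rng.normal(scale=0.1, size=200))
    units.append((X, y))

w = np.zeros(5)
for _ in range(50):                           # consensus iterations
    w = consensus_round(w, units)

acc = np.mean([np.mean(np.sign(X @ w) == y) for X, y in units])
print("consensus weights:", np.round(w, 2), " train accuracy: %.3f" % acc)
```

The privacy property claimed in the abstract is mirrored here in consensus_round: the master receives only model parameters from each unit, so no local rows of X ever leave their owner.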

Bibliographic Details
Title (Chinese): 分散式共識縮減支持向量機
Main Author: Chen, Hsiang-Hsuan (陳祥瑄)
Other Authors: Lee, Yuh-Jye (李育杰)
Format: Others (degree thesis, 37 pages)
Language: en_US
Published: 2019
Online Access: http://ndltd.ncl.edu.tw/handle/as23xk