Binary Classification with a Pseudo Exponential Model and Its Application for Multi-Task Learning
In this paper, we investigate the basic properties of binary classification with a pseudo model based on the Itakura–Saito distance and reveal that the Itakura–Saito distance is the unique appropriate measure for estimation with the pseudo model in the framework of general Bregman divergence. Furthermore, we propose a novel multi-task learning algorithm based on the pseudo model in the framework of the ensemble learning method. We focus on a specific setting of multi-task learning for binary classification problems. The set of features is assumed to be common to all tasks, each of which is a target of performance improvement. We consider a situation where the shared structures among the datasets are represented by divergence between the underlying distributions associated with the multiple tasks. We discuss statistical properties of the proposed method and investigate its validity with numerical experiments.
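As background on the terminology in the abstract, the Itakura–Saito distance between positive values \(p\) and \(q\) can be written as the Bregman divergence generated by the convex function \(\varphi(x) = -\log x\); a short sketch of this standard identity:

\[
D_\varphi(p, q) \;=\; \varphi(p) - \varphi(q) - \varphi'(q)\,(p - q)
\;=\; -\log p + \log q + \frac{p - q}{q}
\;=\; \frac{p}{q} - \log\frac{p}{q} - 1
\;=\; d_{\mathrm{IS}}(p, q).
\]

This quantity is nonnegative and vanishes only when \(p = q\), which is what makes it usable as an estimation loss.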
Main Authors: | Takashi Takenouchi, Osamu Komori, Shinto Eguchi |
---|---|
Format: | Article |
Language: | English |
Published: | MDPI AG, 2015-08-01 |
Series: | Entropy |
Subjects: | multi-task learning; Itakura–Saito distance; pseudo model; un-normalized model |
Online Access: | http://www.mdpi.com/1099-4300/17/8/5673 |
id
doaj-41c4a7329be84343b7f879df451a242c
record_format
Article
spelling
Entropy, vol. 17, no. 8, pp. 5673–5694 (2015-08-01); doi: 10.3390/e17085673. Author affiliations: Takashi Takenouchi, Future University Hakodate, 116-2 Kamedanakano, Hakodate, Hokkaido 041-8655, Japan; Osamu Komori and Shinto Eguchi, The Institute of Statistical Mathematics, 10-3 Midori-cho, Tachikawa, Tokyo 190-8562, Japan.
collection |
DOAJ |
language |
English |
format |
Article |
sources |
DOAJ |
author |
Takashi Takenouchi; Osamu Komori; Shinto Eguchi
title |
Binary Classification with a Pseudo Exponential Model and Its Application for Multi-Task Learning |
publisher |
MDPI AG |
series |
Entropy |
issn |
1099-4300 |
publishDate |
2015-08-01 |
description |
In this paper, we investigate the basic properties of binary classification with a pseudo model based on the Itakura–Saito distance and reveal that the Itakura–Saito distance is the unique appropriate measure for estimation with the pseudo model in the framework of general Bregman divergence. Furthermore, we propose a novel multi-task learning algorithm based on the pseudo model in the framework of the ensemble learning method. We focus on a specific setting of multi-task learning for binary classification problems. The set of features is assumed to be common to all tasks, each of which is a target of performance improvement. We consider a situation where the shared structures among the datasets are represented by divergence between the underlying distributions associated with the multiple tasks. We discuss statistical properties of the proposed method and investigate its validity with numerical experiments.
topic |
multi-task learning; Itakura–Saito distance; pseudo model; un-normalized model
url |
http://www.mdpi.com/1099-4300/17/8/5673 |