General and Local: Averaged k-Dependence Bayesian Classifiers
The inference of a general Bayesian network has been shown to be an NP-hard problem, even for approximate solutions. Although the k-dependence Bayesian (KDB) classifier can be constructed at arbitrary points (values of k) along the attribute-dependence spectrum, it cannot identify changes in interdependencies when attributes take different values. Local KDB, which learns within the framework of KDB, is proposed in this study to describe the local dependencies implicated in each test instance. Based on an analysis of functional dependencies, substitution-elimination resolution, a new type of semi-naive Bayesian operation, is proposed to substitute for or eliminate generalization, achieving accurate estimation of the conditional probability distribution while reducing computational complexity. The final classifier, the averaged k-dependence Bayesian (AKDB) classifier, averages the outputs of KDB and local KDB. Experimental results on the University of California Irvine (UCI) repository of machine learning databases show that AKDB has significant advantages in zero-one loss and bias relative to naive Bayes (NB), tree-augmented naive Bayes (TAN), averaged one-dependence estimators (AODE), and KDB. Moreover, KDB and local KDB show mutually complementary characteristics with respect to variance.
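The AKDB combination step the abstract describes can be sketched as follows. This is a minimal illustration assuming two already-fitted models that each return per-class posteriors; the function names and inputs are hypothetical, not the paper's code:

```python
import numpy as np

def akdb_average(p_kdb, p_local):
    """Combine the posteriors of the general KDB model and the
    instance-specific local KDB by averaging, then renormalize.
    Both inputs are sequences of per-class probabilities."""
    avg = (np.asarray(p_kdb, dtype=float) + np.asarray(p_local, dtype=float)) / 2.0
    return avg / avg.sum()

def akdb_predict(p_kdb, p_local, classes):
    """Return the class label with the highest averaged posterior."""
    return classes[int(np.argmax(akdb_average(p_kdb, p_local)))]
```

For a test instance where KDB outputs [0.7, 0.3] and local KDB outputs [0.2, 0.8], the averaged posterior is [0.45, 0.55], so the second class is predicted.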
Main Authors: | Limin Wang, Haoyu Zhao, Minghui Sun, Yue Ning |
---|---|
Format: | Article |
Language: | English |
Published: | MDPI AG, 2015-06-01 |
Series: | Entropy |
Subjects: | k-dependence Bayesian classifier; substitution-elimination resolution; functional dependency rules of probability |
Online Access: | http://www.mdpi.com/1099-4300/17/6/4134 |
id: | doaj-673bcef3777f411ea1bf72273d6ba747 |
---|---|
Collection: | DOAJ |
DOI: | 10.3390/e17064134 |
ISSN: | 1099-4300 |
Citation: | Entropy, vol. 17, no. 6 (2015-06-01), pp. 4134-4154 |
Authors and affiliations: | Limin Wang, Minghui Sun, Yue Ning: Key Laboratory of Symbolic Computation and Knowledge Engineering of Ministry of Education, Jilin University, Changchun 130012, China; Haoyu Zhao: School of Software, Jilin University, Changchun 130012, China |
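For concreteness, the k-dependence structure underlying both KDB and local KDB can be sketched as follows: each attribute is conditioned on the class plus at most k attribute parents. This is a minimal illustration with hypothetical conditional-probability tables, not the authors' implementation:

```python
def kdb_posterior(x, classes, prior, cpts, parents):
    """Posterior over classes in a k-dependence Bayesian network (sketch).

    x       : tuple of attribute values for one instance
    prior   : dict class -> P(class)
    parents : dict attribute index -> tuple of parent attribute indices
              (at most k parents per attribute)
    cpts    : dict attribute index ->
              dict (value, class, parent_values) -> P(x_i | class, parents)
    """
    scores = {}
    for c in classes:
        p = prior[c]
        for i, xi in enumerate(x):
            pv = tuple(x[j] for j in parents.get(i, ()))
            p *= cpts[i][(xi, c, pv)]  # P(x_i | class, parent values)
        scores[c] = p
    z = sum(scores.values())  # normalize the joint scores
    return {c: s / z for c, s in scores.items()}
```

With empty parent sets everywhere (k = 0) this reduces to naive Bayes; KDB's structure-learning step, which selects the parent sets by mutual information, is omitted here.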