Pattern classification via unsupervised learners

Full description

We consider classification problems in a variant of the Probably Approximately Correct (PAC) learning framework, in which an unsupervised learner creates one discriminant function per class and an observation is labeled with the class whose discriminant function returns the highest value on that observation. We examine whether this approach gains a significant advantage over traditional discriminant techniques.

It is shown that PAC-learning the distributions over class labels under L1 distance or KL-divergence implies PAC classification in this framework. We give bounds on the regret associated with the resulting classifier, taking into account the possibility of variable misclassification penalties, and we demonstrate the advantage of estimating the a posteriori probability distributions over class labels in the setting of Optical Character Recognition.

We show that unsupervised learners can be used to learn a class of probabilistic concepts (stochastic rules denoting the probability that an observation has a positive label in a two-class setting). This demonstrates a situation where unsupervised learners can be used even when it is hard to learn the distributions over class labels; in this case the discriminant functions do not estimate the class probability densities.

We use a standard state-merging technique to PAC-learn a class of probabilistic automata, and we show that by learning the distribution over outputs under the weaker L1 distance, rather than KL-divergence, we can learn without knowledge of the expected length of an output. It is also shown that, for a restricted class of these automata, learning under L1 distance is equivalent to learning under KL-divergence.

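To make the classification rule concrete, here is a minimal Python sketch of the arg-max-of-discriminants idea: each class gets a discriminant fitted by an unsupervised density estimate, and a new observation takes the label with the highest score. The single-Gaussian-per-class estimate, the toy samples, and all names below are illustrative assumptions; the thesis's learners and regret bounds are not reproduced here.

import math

def fit_gaussian(xs):
    """Unsupervised step: estimate mean/variance from one class's observations."""
    n = len(xs)
    mu = sum(xs) / n
    var = sum((x - mu) ** 2 for x in xs) / n + 1e-9  # small floor for stability
    return mu, var

def density(x, mu, var):
    """Gaussian density used as the per-class discriminant function."""
    return math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def classify(x, models, priors):
    """Label x with the class whose prior-weighted density is largest."""
    return max(models, key=lambda c: priors[c] * density(x, *models[c]))

# Toy usage: two classes with hypothetical 1-D samples.
samples = {"A": [0.1, 0.3, -0.2, 0.0], "B": [2.1, 1.8, 2.4, 2.0]}
models = {c: fit_gaussian(xs) for c, xs in samples.items()}
priors = {c: len(xs) / sum(map(len, samples.values())) for c, xs in samples.items()}
print(classify(1.6, models, priors))  # -> "B"
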

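The contrast between L1 distance and KL-divergence drives several of the results above. By Pinsker's inequality, the L1 distance between two distributions is at most sqrt(2 · KL), so learning under KL-divergence is at least as demanding as learning under L1. The short sketch below, using made-up toy distributions, shows the practical gap: an estimate that misses part of the target's support has infinite KL-divergence but small, bounded L1 error.

import math

def l1(p, q):
    """L1 distance between two discrete distributions; always at most 2."""
    keys = set(p) | set(q)
    return sum(abs(p.get(k, 0.0) - q.get(k, 0.0)) for k in keys)

def kl(p, q):
    """KL(p || q); infinite if q misses any outcome that p can produce."""
    total = 0.0
    for k, pk in p.items():
        if pk == 0.0:
            continue
        qk = q.get(k, 0.0)
        if qk == 0.0:
            return math.inf
        total += pk * math.log(pk / qk)
    return total

target = {"a": 0.5, "b": 0.4, "c": 0.1}
estimate = {"a": 0.55, "b": 0.45}   # misses outcome "c" entirely

print(l1(target, estimate))   # 0.2 -- small, bounded error
print(kl(target, estimate))   # inf -- KL cannot tolerate the missing support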

Bibliographic Details
Main Author: Palmer, Nicholas James
Published: University of Warwick, 2008
Subjects: 006.3; LB Theory and practice of education; QA Mathematics
Online Access: http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.491521
Institutional Repository: http://wrap.warwick.ac.uk/2373/
Format: Electronic Thesis or Dissertation
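
Finally, the state-merging result described in the abstract can be pictured with a toy sketch: build a prefix tree of counts from sampled output strings, then test whether two states' empirical next-step distributions are close in L1 distance before merging them. This is a hypothetical, heavily simplified cousin of ALERGIA-style algorithms, not the construction analysed in the thesis; the recursive folding step of a real learner is omitted.

from collections import defaultdict

class Node:
    """One state of the prefix-tree acceptor, with empirical counts."""
    def __init__(self):
        self.counts = defaultdict(int)  # symbol -> times it followed this prefix
        self.stops = 0                  # strings ending exactly at this prefix
        self.total = 0                  # visits through this state
        self.children = {}

def build_prefix_tree(sample):
    """Build a prefix tree whose counts estimate the automaton's behaviour."""
    root = Node()
    for string in sample:
        node = root
        node.total += 1
        for symbol in string:
            node.counts[symbol] += 1
            node = node.children.setdefault(symbol, Node())
            node.total += 1
        node.stops += 1
    return root

def l1_gap(a, b):
    """L1 distance between two states' empirical next-step distributions,
    treating 'stop' as an extra symbol (echoing the thesis's use of L1)."""
    gap = abs(a.stops / a.total - b.stops / b.total)
    for s in set(a.counts) | set(b.counts):
        gap += abs(a.counts[s] / a.total - b.counts[s] / b.total)
    return gap

def compatible(a, b, epsilon=0.3):
    """Candidate merge test: states behave alike if their gap is under epsilon."""
    return l1_gap(a, b) <= epsilon

# Toy usage on a hypothetical sample of output strings.
sample = ["ab", "ab", "abb", "a", "b"]
root = build_prefix_tree(sample)
print(compatible(root.children["a"], root.children["a"].children["b"]))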