Learning with Markov logic networks : transfer learning, structure learning, and an application to Web query disambiguation

Traditionally, machine learning algorithms assume that training data is provided as a set of independent instances, each of which can be described as a feature vector. In contrast, many domains of interest are inherently multi-relational, consisting of entities connected by a rich set of relations....

Full description

Bibliographic Details
Main Author: Mihalkova, Lilyana Simeonova
Format: Others
Language: English
Published: 2011
Subjects:
Online Access:http://hdl.handle.net/2152/10574
id ndltd-UTEXAS-oai-repositories.lib.utexas.edu-2152-10574
record_format oai_dc
collection NDLTD
language English
format Others
sources NDLTD
topic Markov logic networks
Web query disambiguation
Statistical Relational Learning
Artificial intelligence
Machine learning
Multi-relational data
First-order logic
Probability distribution
Structure learning
Transfer learning
Algorithms
spellingShingle Markov logic networks
Web query disambiguation
Statistical Relational Learning
Artificial intelligence
Machine learning
Multi-relational data
First-order logic
Probability distribution
Structure learning
Transfer learning
Algorithms
Mihalkova, Lilyana Simeonova
Learning with Markov logic networks : transfer learning, structure learning, and an application to Web query disambiguation
description Traditionally, machine learning algorithms assume that training data is provided as a set of independent instances, each of which can be described as a feature vector. In contrast, many domains of interest are inherently multi-relational, consisting of entities connected by a rich set of relations. For example, the participants in a social network are linked by friendships, collaborations, and shared interests. Likewise, the users of a search engine are related by searches for similar items and clicks to shared sites. The ability to model and reason about such relations is essential, not only because exploiting this additional information yields better predictive accuracy, but also because the goal is frequently to predict whether a set of entities is related in a particular way. This thesis falls within the area of Statistical Relational Learning (SRL), which combines ideas from two traditions within artificial intelligence, first-order logic and probabilistic graphical models, to address the challenge of learning from multi-relational data. We build on one particular SRL model, Markov logic networks (MLNs), which consist of a set of weighted first-order-logic formulae and provide a principled way of defining a probability distribution over possible worlds. We develop algorithms for learning MLN structure both from scratch and by transferring a previously learned model, as well as an application of MLNs to the problem of Web query disambiguation. The ideas we present are unified by two main themes: the need to deal with limited training data and the use of bottom-up learning techniques. Structure learning, the task of automatically acquiring a set of dependencies among the relations in the domain, is a central problem in SRL. We introduce BUSL, an algorithm for learning MLN structure from scratch that proceeds in a more bottom-up fashion, breaking away from the tradition of top-down learning typical in SRL.
Our approach first constructs a novel data structure called a Markov network template that is used to restrict the search space for clauses. Our experiments in three relational domains demonstrate that BUSL dramatically reduces the search space for clauses and attains a significantly higher accuracy than a structure learner that follows a top-down approach. Accurate and efficient structure learning can also be achieved by transferring a model obtained in a source domain related to the current target domain of interest. We view transfer as a revision task and present an algorithm that diagnoses a source MLN to determine which of its parts transfer directly to the target domain and which need to be updated. This analysis focuses the search for revisions on the incorrect portions of the source structure, thus speeding up learning. Transfer learning is particularly important when target-domain data is limited, such as when data on only a few individuals is available from domains with hundreds of entities connected by a variety of relations. We also address this challenging case and develop a general transfer learning approach that makes effective use of such limited target data in several social network domains. Finally, we develop an application of MLNs to the problem of Web query disambiguation in a more privacy-aware setting where the only information available about a user is that captured in a short search session of 5-6 previous queries on average. This setting contrasts with previous work that typically assumes the availability of long user-specific search histories. To compensate for the scarcity of user-specific information, our approach exploits the relations between users, search terms, and URLs. We demonstrate the effectiveness of our approach in the presence of noise and show that it outperforms several natural baselines on a large data set collected from the MSN search engine.
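The MLN semantics summarized in the abstract (a set of weighted first-order formulae defining a probability distribution over possible worlds, with a world's probability proportional to the exponentiated sum of weights of its satisfied formula groundings) can be sketched in a few lines. This is a minimal illustrative example, not code from the thesis: the domain, predicates, formula, and weight below are assumptions chosen for clarity.

```python
from itertools import product
import math

# Toy MLN: one formula, Smokes(x) -> Cancer(x), with weight 1.5,
# over a two-person domain (all names and the weight are illustrative).
people = ["Anna", "Bob"]
atoms = [f"Smokes({p})" for p in people] + [f"Cancer({p})" for p in people]
w = 1.5  # formula weight

def n_satisfied(world):
    """Count satisfied groundings of Smokes(x) -> Cancer(x) in a world."""
    return sum(1 for p in people
               if (not world[f"Smokes({p})"]) or world[f"Cancer({p})"])

# A possible world is a truth assignment to all ground atoms.
worlds = [dict(zip(atoms, bits))
          for bits in product([False, True], repeat=len(atoms))]

# Partition function: sum of exp(weight * satisfied-grounding count).
Z = sum(math.exp(w * n_satisfied(x)) for x in worlds)

def prob(world):
    """P(world) = exp(sum_i w_i * n_i(world)) / Z."""
    return math.exp(w * n_satisfied(world)) / Z

# A world with no violated grounding outweighs one with a violation.
consistent = {a: True for a in atoms}
violated = {"Smokes(Anna)": True, "Cancer(Anna)": False,
            "Smokes(Bob)": False, "Cancer(Bob)": False}
assert prob(consistent) > prob(violated)
```

With a positive weight, worlds that satisfy more groundings of the formula receive exponentially more probability mass, which is the sense in which the weighted formulae "softly" constrain the possible worlds.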
author Mihalkova, Lilyana Simeonova
author_facet Mihalkova, Lilyana Simeonova
author_sort Mihalkova, Lilyana Simeonova
title Learning with Markov logic networks : transfer learning, structure learning, and an application to Web query disambiguation
title_short Learning with Markov logic networks : transfer learning, structure learning, and an application to Web query disambiguation
title_full Learning with Markov logic networks : transfer learning, structure learning, and an application to Web query disambiguation
title_fullStr Learning with Markov logic networks : transfer learning, structure learning, and an application to Web query disambiguation
title_full_unstemmed Learning with Markov logic networks : transfer learning, structure learning, and an application to Web query disambiguation
title_sort learning with markov logic networks : transfer learning, structure learning, and an application to web query disambiguation
publishDate 2011
url http://hdl.handle.net/2152/10574
work_keys_str_mv AT mihalkovalilyanasimeonova learningwithmarkovlogicnetworkstransferlearningstructurelearningandanapplicationtowebquerydisambiguation
_version_ 1716821292619923456
spelling ndltd-UTEXAS-oai-repositories.lib.utexas.edu-2152-10574 2015-09-20T16:58:21Z
date 2011-03-18T20:34:36Z 2011-03-18T20:34:36Z 2009-08 2011-03-18
format electronic
url http://hdl.handle.net/2152/10574
language eng
rights Copyright is held by the author. Presentation of this material on the Libraries' web site by University Libraries, The University of Texas at Austin was made possible under a limited license grant from the author who has retained all copyrights in the works.