Implementing and Evaluating sparsification methods in probabilistic networks
Most queries on probabilistic networks assume possible-world semantics, which causes an exponential increase in execution time. Deterministic networks can be sparsified to reduce their size while preserving certain structural properties, but until recently there were no equivalent methods for probabilistic networks. In a first work in the field, Parchas, Papailiou, Papadias and Bonchi proposed sparsification methods for probabilistic networks by adapting a gradient-descent and an expectation-maximization algorithm. In this report, the two proposed algorithms, Gradient Descent Backbone (GDB) and Expectation-Maximization Degree (EMD), were implemented and evaluated on different input parameters by comparing how well general graph properties, expected vertex degrees, and ego-betweenness approximations are preserved after sparsifying different datasets. In the sparsified networks we found that the entropies had mostly gone down to zero, effectively creating a deterministic network. EMD generally showed better results than GDB, especially when using relative discrepancies; however, at lower alpha values the EMD methods can generate disconnected networks, more so when using absolute discrepancies. Both methods produced unexpected results at higher alpha values, which suggests they are not stable there. Our evaluations confirm that the proposed algorithms produce acceptable results in some cases, but finding the right input parameters for a specific network can be time-consuming; further testing on networks of diverse structure with different input parameters is therefore recommended.
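The evaluation described above rests on three measurable quantities: the expected degree of each vertex, the (absolute or relative) discrepancy in expected degrees between the original and the sparsified network, and the entropy of the edge probabilities, which is zero exactly when every edge is certain. The Python sketch below illustrates these quantities, assuming a probabilistic graph represented as a dict from undirected edges to existence probabilities; the representation, function names, and the relative-discrepancy normalization are illustrative choices, not the thesis's implementation.

```python
import math
from collections import defaultdict

def expected_degrees(edges):
    """Expected degree per vertex: the sum of incident edge probabilities."""
    deg = defaultdict(float)
    for (u, v), p in edges.items():
        deg[u] += p
        deg[v] += p
    return deg

def degree_discrepancy(original, sparsified, relative=False):
    """Total expected-degree discrepancy between two probabilistic graphs.

    With relative=True, each per-vertex difference is normalized by the
    vertex's expected degree in the original graph (an assumed convention).
    """
    d_orig = expected_degrees(original)
    d_spar = expected_degrees(sparsified)
    total = 0.0
    for v in set(d_orig) | set(d_spar):
        diff = abs(d_orig.get(v, 0.0) - d_spar.get(v, 0.0))
        if relative:
            diff /= max(d_orig.get(v, 0.0), 1e-12)  # avoid division by zero
        total += diff
    return total

def edge_entropy(edges):
    """Total Shannon entropy of the edge probabilities (bits).

    Zero entropy means every probability is 0 or 1, i.e. the network has
    effectively become deterministic -- the outcome the report observed.
    """
    h = 0.0
    for p in edges.values():
        if 0.0 < p < 1.0:
            h -= p * math.log2(p) + (1.0 - p) * math.log2(1.0 - p)
    return h

# Example: a small probabilistic graph and a hand-made "sparsification".
g = {("a", "b"): 0.9, ("b", "c"): 0.5, ("a", "c"): 0.3}
s = {("a", "b"): 1.0, ("b", "c"): 0.7}
print(degree_discrepancy(g, s))           # absolute discrepancy
print(degree_discrepancy(g, s, True))     # relative discrepancy
print(edge_entropy(g), edge_entropy(s))   # entropy before and after
```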
Main Author: | Dahlin, Oskar |
---|---|
Format: | Others |
Language: | English |
Published: | Uppsala universitet, Institutionen för informationsteknologi, 2020 |
Series: | IT ; 20078 |
Thesis Level: | Student thesis (Bachelor thesis) |
Subjects: | Engineering and Technology (Teknik och teknologier) |
Online Access: | http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-428591 (open access) |
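To make the problem setting concrete: a sparsifier must output a subgraph with roughly an alpha fraction of the original edges while keeping expected degrees close. The sketch below is a deliberately naive baseline under that framing, not the GDB or EMD algorithms evaluated in the thesis; selecting edges by highest probability and rescaling globally are our simplifications.

```python
import math

def naive_sparsify(edges, alpha):
    """Naive baseline sparsifier (illustration only, not the thesis's GDB/EMD).

    Keeps the ceil(alpha * |E|) most probable edges, then rescales the
    surviving probabilities so the total expected degree is roughly
    preserved (capping at 1.0 can still lose a little probability mass).
    """
    k = max(1, math.ceil(alpha * len(edges)))
    kept = dict(sorted(edges.items(), key=lambda kv: kv[1], reverse=True)[:k])
    mass_before = sum(edges.values())  # proportional to total expected degree
    mass_after = sum(kept.values())
    scale = mass_before / mass_after if mass_after > 0 else 1.0
    return {e: min(1.0, p * scale) for e, p in kept.items()}

# Keeping half the edges of the example graph above:
print(naive_sparsify({("a", "b"): 0.9, ("b", "c"): 0.5, ("a", "c"): 0.3}, 0.5))
```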
id | ndltd-UPSALLA1-oai-DiVA.org-uu-428591
record_format | oai_dc
collection | NDLTD
language | English
format | Others
sources | NDLTD
topic | Engineering and Technology (Teknik och teknologier)
author | Dahlin, Oskar
title | Implementing and Evaluating sparsification methods in probabilistic networks
publisher | Uppsala universitet, Institutionen för informationsteknologi
publishDate | 2020
url | http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-428591