On Minimizing Weighted Finite Automata


Bibliographic Details
Main Authors: Ting-Yuan Huang, 黃亭遠
Other Authors: Hsu-Chun Yen
Format: Others
Language: en_US
Published: 2008
Online Access: http://ndltd.ncl.edu.tw/handle/64923272723918819375
Description
Summary: Master's thesis === National Taiwan University === Graduate Institute of Electrical Engineering === 96 === Weighted finite automata form a very general model that goes by many different names, such as probabilistic finite automata (PFA), hidden Markov models (HMM), stochastic regular grammars, Markov chains, and n-grams. These models play central roles in many domains, such as machine learning, speech processing, computational linguistics, pattern recognition, language modeling, bioinformatics, music modeling, circuit testing, image processing, path queries, and time-series analysis. This wide range of applications makes weighted finite automata a very valuable research topic: even a small breakthrough or improvement would benefit many domains. In this thesis, we introduce the notion of "weight redistribution" by investigating the algebraic properties along with the graph structure inside weighted finite automata. We also generalize the shortest-path problem to finding the infimum over all paths, and give an efficient algorithm to solve it. Our algorithm for computing "weight redistribution" not only plays the central role in our minimization algorithm, but is also applicable to determining the equivalence of Z-automata. We also give two new algorithms to shrink the state space; these algorithms outperform previous results.
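The abstract's generalization of shortest paths to "the infimum over all paths" can be illustrated with a minimal sketch. The code below is not the thesis's algorithm; it merely shows the underlying idea in the familiar tropical (min, +) semiring, where the infimum of total weight over all paths from a source state is computed by Bellman-Ford-style relaxation. The example automaton, its states, and its weights are invented for illustration.

```python
INF = float("inf")

def shortest_distance(states, edges, source):
    """Infimum of total weight over all paths from `source` to each state.

    `edges` is a list of (src, dst, weight) transitions of a weighted
    automaton, interpreted in the tropical (min, +) semiring.  With
    non-negative weights the infimum is attained by a simple path, so
    len(states) - 1 relaxation rounds suffice.
    """
    dist = {q: INF for q in states}
    dist[source] = 0.0
    for _ in range(len(states) - 1):
        for u, v, w in edges:
            # Relax: a cheaper path to v via u lowers the current infimum.
            if dist[u] + w < dist[v]:
                dist[v] = dist[u] + w
    return dist

# Toy weighted automaton with states 0..3 (hypothetical example).
states = [0, 1, 2, 3]
edges = [(0, 1, 2.0), (0, 2, 5.0), (1, 2, 1.0), (2, 3, 2.0), (1, 3, 6.0)]
print(shortest_distance(states, edges, 0))
# → {0: 0.0, 1: 2.0, 2: 3.0, 3: 5.0}
```

Replacing (min, +) with another semiring changes what "infimum" means; the thesis's contribution lies in handling such generalized path problems efficiently.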