Summary:
Thesis: M. Eng., Massachusetts Institute of Technology, Department of Electrical Engineering and Computer Science, 2018.
This electronic version was submitted by the student author. The certified thesis is available in the Institute Archives and Special Collections.
Cataloged from student-submitted PDF version of thesis.
Includes bibliographical references (page 47).
The EM (Expectation-Maximization) algorithm is a heuristic for parameter estimation in statistical models with latent variables, where explicit computation of the maximum likelihood estimate (MLE) is infeasible. Although widely used in practice, the theoretical guarantees associated with EM are quite weak. We study the setting of a hidden Markov model (HMM) with two hidden states, where the (symmetric) transition matrix is unknown and observations are Gaussian with known covariance and unknown mean [mu]. The EM algorithm for HMMs, also known as the Baum-Welch algorithm, was previously studied by Yang, Balakrishnan, and Wainwright [1], but without global convergence guarantees. In this paper we propose a "local" version of the EM algorithm and prove absolute convergence of this algorithm to the true parameters (the mean [mu] and the transition matrix) in both the population and finite-sample regimes. To the best of our knowledge, this is the first algorithm for simultaneous parameter estimation with global convergence guarantees. Additionally, we prove several theoretical results and supply some counterexamples for the ordinary Baum-Welch algorithm in this setting.
by Dhroova Aiylam.
M. Eng.
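For orientation only, the sketch below shows the ordinary Baum-Welch (EM) updates for the model class described in the abstract: a two-state HMM whose symmetric transition matrix is parameterized by a flip probability eps, with one-dimensional Gaussian emissions of known variance and unknown mean. The +/-mu emission means, the eps parameterization, the function name, and the unit-variance default are assumptions made for this illustration; this is not the thesis's "local" EM variant.

import numpy as np


def baum_welch_symmetric(X, mu0, eps0, sigma=1.0, n_iters=50):
    # Illustrative sketch: ordinary Baum-Welch (EM) for a 2-state HMM with
    # symmetric transition matrix [[1-eps, eps], [eps, 1-eps]] and 1-D
    # Gaussian emissions N(+mu, sigma^2) / N(-mu, sigma^2) with known sigma.
    # This parameterization is an assumption for the sketch, not the
    # thesis's "local" EM algorithm.
    X = np.asarray(X, dtype=float)
    n = len(X)
    mu, eps = float(mu0), float(eps0)

    for _ in range(n_iters):
        means = np.array([mu, -mu])
        # Emission likelihoods up to a constant (the shared Gaussian
        # normalizer cancels in every posterior).
        B = np.exp(-(X[:, None] - means[None, :]) ** 2 / (2 * sigma ** 2))
        A = np.array([[1 - eps, eps], [eps, 1 - eps]])
        pi = np.array([0.5, 0.5])  # stationary distribution of the symmetric chain

        # Scaled forward pass.
        alpha = np.zeros((n, 2))
        c = np.zeros(n)
        alpha[0] = pi * B[0]
        c[0] = alpha[0].sum()
        alpha[0] /= c[0]
        for t in range(1, n):
            alpha[t] = (alpha[t - 1] @ A) * B[t]
            c[t] = alpha[t].sum()
            alpha[t] /= c[t]

        # Scaled backward pass (reuses the forward scaling constants).
        beta = np.ones((n, 2))
        for t in range(n - 2, -1, -1):
            beta[t] = (A @ (B[t + 1] * beta[t + 1])) / c[t + 1]

        # E-step: marginal state posteriors gamma and pairwise transition
        # posteriors xi; each gamma[t] and each xi[t] sums to 1.
        gamma = alpha * beta
        xi = (alpha[:-1, :, None] * A[None, :, :]
              * (B[1:] * beta[1:])[:, None, :]) / c[1:, None, None]

        # M-step: closed-form updates for the flip probability and the mean.
        eps = (xi[:, 0, 1].sum() + xi[:, 1, 0].sum()) / xi.sum()
        mu = (gamma[:, 0] @ X - gamma[:, 1] @ X) / gamma.sum()

    return mu, eps

Given a sample path x from such a chain, baum_welch_symmetric(x, mu0=0.5, eps0=0.3) would return the fitted (mu, eps) after 50 EM iterations; like any EM-style procedure, the output depends on the initialization (mu0, eps0).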