Maximum Likelihood Embedding of Logistic Random Dot Product Graphs

A latent space model for a family of random graphs assigns real-valued vectors to the nodes of the graph such that edge probabilities are determined by the latent positions. Latent space models provide a natural statistical framework for graph visualization and clustering. A latent space model of particular interest is the Random Dot Product Graph (RDPG), which can be fit using an efficient spectral method; however, this method is based on a heuristic that can fail, even in simple cases. Here, we consider a closely related latent space model, the Logistic RDPG, which uses a logistic link function to map from latent positions to edge likelihoods. For this model, we show that asymptotically exact maximum likelihood inference of the latent position vectors can be achieved using an efficient spectral method. Our method involves computing the top eigenvectors of a normalized adjacency matrix and scaling the eigenvectors using a regression step; this novel regression scaling step is an essential part of the proposed method. In simulations, we show that the proposed method is more accurate and more robust than commonly used methods. We also demonstrate the effectiveness of our approach on two standard real networks, the karate club and the political blogs networks.
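To make the method description in the abstract concrete, here is a minimal Python sketch of a spectral embedding with a regression-based scaling step. It is not the authors' exact procedure: the mean-centering used as the "normalization," the least-squares form of the scaling regression, and the function name spectral_embed_with_regression_scaling are illustrative assumptions introduced for this sketch.

import numpy as np

def spectral_embed_with_regression_scaling(A, d):
    """Embed graph nodes into R^d: top-d eigenvectors of a centered
    adjacency matrix, rescaled via a least-squares regression step."""
    n = A.shape[0]
    off_diag = ~np.eye(n, dtype=bool)

    # "Normalize" the adjacency matrix; here this means subtracting the
    # mean off-diagonal edge density (an illustrative assumption).
    M = A.astype(float)
    M[off_diag] -= A[off_diag].mean()
    np.fill_diagonal(M, 0.0)

    # Top-d eigenvectors (largest eigenvalues) of the symmetric matrix M.
    eigvals, eigvecs = np.linalg.eigh(M)
    top = np.argsort(eigvals)[::-1][:d]
    U = eigvecs[:, top]                        # n x d

    # Regression scaling step (illustrative): regress the centered adjacency
    # entries on the per-dimension products U[i, k] * U[j, k] and absorb the
    # fitted coefficients into the embedding.
    i, j = np.where(np.triu(off_diag))
    features = U[i] * U[j]                     # one row per node pair
    coeffs, *_ = np.linalg.lstsq(features, M[i, j], rcond=None)
    return U * np.sqrt(np.abs(coeffs))         # scaled latent positions

# Example usage on a small random graph.
rng = np.random.default_rng(0)
A = (rng.random((30, 30)) < 0.2).astype(int)
A = np.triu(A, 1)
A = A + A.T                                    # symmetric, zero diagonal
X = spectral_embed_with_regression_scaling(A, d=2)
print(X.shape)                                 # (30, 2)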


Bibliographic Details
Main Authors: Médard, Muriel (Author), Feizi, Soheil (Author)
Other Authors: Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science (Contributor)
Format: Article
Language: English
Published: Association for the Advancement of Artificial Intelligence (AAAI), 2021-04-27T15:23:52Z.
Subjects:
Online Access: Get fulltext
LEADER 01936 am a22001813u 4500
001 130525
042 |a dc 
100 1 0 |a Médard, Muriel  |e author 
100 1 0 |a Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science  |e contributor 
700 1 0 |a Feizi, Soheil  |e author 
245 0 0 |a Maximum Likelihood Embedding of Logistic Random Dot Product Graphs 
260 |b Association for the Advancement of Artificial Intelligence (AAAI),   |c 2021-04-27T15:23:52Z. 
856 |z Get fulltext  |u https://hdl.handle.net/1721.1/130525 
520 |a A latent space model for a family of random graphs assigns real-valued vectors to the nodes of the graph such that edge probabilities are determined by the latent positions. Latent space models provide a natural statistical framework for graph visualization and clustering. A latent space model of particular interest is the Random Dot Product Graph (RDPG), which can be fit using an efficient spectral method; however, this method is based on a heuristic that can fail, even in simple cases. Here, we consider a closely related latent space model, the Logistic RDPG, which uses a logistic link function to map from latent positions to edge likelihoods. For this model, we show that asymptotically exact maximum likelihood inference of the latent position vectors can be achieved using an efficient spectral method. Our method involves computing the top eigenvectors of a normalized adjacency matrix and scaling the eigenvectors using a regression step; this novel regression scaling step is an essential part of the proposed method. In simulations, we show that the proposed method is more accurate and more robust than commonly used methods. We also demonstrate the effectiveness of our approach on two standard real networks, the karate club and the political blogs networks. 
546 |a en 
655 7 |a Article 
773 |t 10.1609/AAAI.V34I04.5975 
773 |t Proceedings of the AAAI Conference on Artificial Intelligence