Variational inference for neural network matrix factorization and its application to stochastic blockmodeling
Published in ICML Workshop on Learning and Reasoning with Graph-Structured Data, 2019
Recommended citation: Kampman and Heaukulani (2019). "Variational inference for neural network matrix factorization and its application to stochastic blockmodeling." ICML Workshop on Learning and Reasoning with Graph-Structured Data. https://arxiv.org/pdf/1905.04502
We consider the probabilistic analogue of neural network matrix factorization (Dziugaite & Roy, 2015), which we construct with Bayesian neural networks and fit with variational inference. We find that a linear model fit with variational inference attains predictive performance equivalent to the regular neural network variants on the MovieLens data sets. We discuss the implications of this result, including some pros and cons of the neural network construction and of the variational approach to inference. Such a probabilistic approach is required, however, for the important class of stochastic blockmodels. We describe a variational inference algorithm for a neural network matrix factorization model with nonparametric block structure and evaluate its performance on the NIPS co-authorship data set.
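To make the construction concrete, the following is a minimal sketch (not the paper's implementation) of the probabilistic model's predictive step: each entry of the matrix is predicted by passing concatenated user and item latent vectors through a small MLP whose weights carry a mean-field Gaussian variational posterior, and predictions are averaged over Monte Carlo samples of the weights via the reparameterization trick. All dimensions, the single hidden layer, and the fixed variational parameters here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy sizes (illustrative, not from the paper):
# n_users x n_items matrix, latent dimension d, hidden width h.
n_users, n_items, d, h = 4, 5, 3, 8

# Latent feature vectors for users and items.
U = rng.normal(size=(n_users, d))
V = rng.normal(size=(n_items, d))

# Mean-field Gaussian variational posterior over the MLP weights:
# one mean and one log standard deviation per weight.
W1_mu = 0.1 * rng.normal(size=(2 * d, h))
W1_logsig = np.full((2 * d, h), -2.0)
w2_mu = 0.1 * rng.normal(size=h)
w2_logsig = np.full(h, -2.0)

def sample(mu, logsig):
    # Reparameterization: w = mu + sigma * eps, with eps ~ N(0, I),
    # so gradients could flow through mu and logsig during training.
    return mu + np.exp(logsig) * rng.normal(size=mu.shape)

def predict(i, j, n_samples=50):
    """Monte Carlo predictive mean for entry (i, j): average the
    network output over samples from the weight posterior."""
    x = np.concatenate([U[i], V[j]])
    preds = []
    for _ in range(n_samples):
        W1 = sample(W1_mu, W1_logsig)
        w2 = sample(w2_mu, w2_logsig)
        hidden = np.tanh(x @ W1)   # one hidden layer
        preds.append(hidden @ w2)  # scalar prediction
    return float(np.mean(preds))

print(predict(0, 1))
```

In the full model these variational parameters would be optimized by maximizing an evidence lower bound (a likelihood term on observed entries minus a KL penalty to the weight prior); the sketch shows only the Bayesian forward pass that distinguishes this from a point-estimated network.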