pith. machine review for the scientific record.

arxiv: 1406.1440 · v3 · submitted 2014-06-05 · 📊 stat.ML · math.ST · stat.CO · stat.TH

Recognition: unknown

Bayesian matrix completion: prior specification

Authors on Pith: no claims yet
classification 📊 stat.ML · math.ST · stat.CO · stat.TH
keywords prior · matrix · priors · bayesian · singular · algorithms · behaviour · completion
0 comments
original abstract

Low-rank matrix estimation from incomplete measurements has recently received increased attention due to the emergence of several challenging applications, such as recommender systems; see in particular the famous Netflix challenge. While the behaviour of algorithms based on nuclear norm minimization is now well understood, an as yet unexplored avenue of research is the behaviour of Bayesian algorithms in this context. In this paper, we briefly review the priors used in the Bayesian literature for matrix completion. A standard approach is to assign an inverse gamma prior to the singular values of a certain singular value decomposition of the matrix of interest; this prior is conjugate. However, we show that two other types of priors (again on the singular values) are also conjugate for this model: a gamma prior and a discrete prior. Conjugacy is very convenient, as it makes it possible to implement either Gibbs sampling or Variational Bayes. Interestingly enough, the maximum a posteriori estimates under these different priors are related to nuclear norm minimization problems. We also compare all these priors on simulated datasets, and on the classical MovieLens and Netflix datasets.
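The connection the abstract draws between MAP estimation and nuclear norm minimization can be illustrated with a minimal sketch (not the paper's own algorithm): iterative soft-thresholding of singular values, where the shrinkage step is the proximal operator of the nuclear norm. The function names and the threshold parameter `tau` below are illustrative assumptions, not from the paper.

```python
import numpy as np

def svd_soft_threshold(X, tau):
    """Proximal operator of the nuclear norm: shrink singular values by tau."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    s_shrunk = np.maximum(s - tau, 0.0)
    return (U * s_shrunk) @ Vt

def complete_matrix(Y, mask, tau=1.0, n_iter=200):
    """Naive matrix completion sketch: alternate between shrinking
    singular values and re-imposing the observed entries of Y
    (mask is True where Y is observed)."""
    X = np.where(mask, Y, 0.0)
    for _ in range(n_iter):
        X = svd_soft_threshold(X, tau)
        X = np.where(mask, Y, X)  # keep observed entries fixed
    return X
```

For example, shrinking the all-ones 3x3 matrix (singular values 3, 0, 0) by `tau = 1` leaves a single singular value of 2, i.e. the constant matrix with entries 2/3. The shrinkage promotes low rank, which is the mechanism behind nuclear-norm-based estimators discussed in the abstract.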

This paper has not been read by Pith yet.

discussion (0)

Sign in with ORCID, Apple, or X to comment. Anyone can read Pith papers without signing in.

Forward citations

Cited by 3 Pith papers

Reviewed papers in the Pith corpus that reference this work. Sorted by Pith novelty score.

  1. CoreFlow: Low-Rank Matrix Generative Models

    cs.LG 2026-04 unverdicted novelty 6.0

    CoreFlow is a low-rank matrix generative model that trains normalizing flows on shared subspaces to improve efficiency and quality for high-dimensional limited-sample data, including incomplete matrices.

  2. PAC-Bayes Bounds for Gibbs Posteriors via Singular Learning Theory

    stat.ML 2026-04 unverdicted novelty 6.0

    PAC-Bayes bounds for Gibbs posteriors are obtained via singular learning theory, producing explicit and tighter posterior-averaged risk bounds that adapt to data structure in overparameterized models.

  3. Empirical Bayes 1-bit matrix completion

    stat.ML 2026-05 unverdicted novelty 4.0

    An empirical Bayes 1-bit matrix completion method using Efron-Morris-style singular value shrinkage outperforms baselines in accuracy, calibration, and speed on simulations and real data.