
#1 The Quotient Bayesian Learning Rule

Authors: Mykola Lukashchuk, Raphaël Trésor, Wouter W. L. Nuijten, Ismail Senoz, Bert de Vries

This paper introduces the Quotient Bayesian Learning Rule, an extension of natural-gradient Bayesian updates to probability models that fall outside the exponential family. Building on the observation that many heavy-tailed and otherwise non-exponential distributions arise as marginals of minimal exponential families, we prove that such marginals inherit a unique Fisher–Rao information geometry via the quotient-manifold construction. Exploiting this geometry, we derive the Quotient Natural Gradient algorithm, which takes steepest-descent steps in the well-structured covering space, thereby guaranteeing parameterization-invariant optimization in the target space. Empirical results on the Student-$t$ distribution confirm that our method converges more rapidly and attains higher-quality solutions than previous variants of the Bayesian Learning Rule. These findings position quotient geometry as a unifying tool for efficient and principled inference across a broad class of latent-variable models.
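The abstract's central observation, that heavy-tailed distributions arise as marginals of minimal exponential families, can be illustrated concretely for the Student-$t$ case. A Student-$t$ variable with $\nu$ degrees of freedom is the marginal of a joint Normal–Gamma model: draw a precision $\tau \sim \mathrm{Gamma}(\nu/2, \nu/2)$, then $x \mid \tau \sim \mathcal{N}(0, 1/\tau)$. The sketch below (our own illustrative code, not the authors' implementation, and not the quotient-geometry algorithm itself) verifies this by checking the marginal variance $\nu/(\nu-2)$ empirically:

```python
import numpy as np

rng = np.random.default_rng(0)
nu = 10.0
n = 200_000

# Precision tau ~ Gamma(shape=nu/2, rate=nu/2); numpy parameterizes by
# scale = 1/rate, hence scale = 2/nu.
tau = rng.gamma(shape=nu / 2, scale=2 / nu, size=n)

# Conditional on tau, x is Gaussian with precision tau (std = 1/sqrt(tau)).
x = rng.normal(loc=0.0, scale=1.0 / np.sqrt(tau))

# Marginally, x is Student-t with nu degrees of freedom, whose variance
# is nu / (nu - 2) = 1.25 for nu = 10.
print(x.var())
```

Because the joint $(x, \tau)$ model is a minimal exponential family while the Student-$t$ marginal is not, updates with a clean Fisher–Rao geometry can be performed in the joint (covering) space; the paper's quotient construction is what makes the induced geometry on the marginal well-defined.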

Subject: NeurIPS.2025 - Poster