From Geyer’s reverse logistic regression to GANs, a statistician’s tale on normalising constants
Speaker: Christian Robert
Bio
Professor at Université Paris Dauphine since 2000, part-time professor at the University of Warwick (Fall 2013– ), fellow of the ASA (2012) and the IMS (1996), former editor of the Journal of the Royal Statistical Society (2006–2010), deputy editor of Biometrika (2018– ), and senior member of the Institut Universitaire de France (2010–2021).
Abstract
The problem of unknown normalising constants has been a long-standing issue in statistics, and in Bayesian statistics in particular. While many simulation-based proposals have been made to address it, one class of methods stands out by relying on statistical representations to produce estimators of these normalising constants, together with uncertainty quantification. The starting point is Geyer’s (1994) reverse logistic regression, which proves highly efficient and robust to the curse of dimensionality. It relates to later Monte Carlo methods such as bridge sampling and multiple mixtures, as well as to statistical and learning principles such as non-parametric MLE, noise contrastive estimation (NCE), and generative adversarial networks (GANs).
[This talk is based on ongoing joint work with Jean-Michel Marin and Judith Rousseau.]
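
To fix ideas, here is a minimal sketch (not drawn from the talk itself) of the reverse-logistic-regression / NCE device in its simplest form: the unknown log normalising constant is recovered as the intercept of a logistic regression that discriminates draws from the target against draws from a known reference distribution. The toy target, the Gaussian reference, and all variable names below are illustrative assumptions.

# Minimal sketch of the reverse-logistic-regression / NCE idea on a toy target
# (the example and all names are illustrative, not taken from the talk)
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(0)

# Unnormalised target: exp(-x^2/2); its true normalising constant is sqrt(2*pi)
def log_p_tilde(x):
    return -0.5 * x**2

# Known reference ("noise") distribution: N(0, 2^2), with fully known density
sigma_q = 2.0
def log_q(x):
    return -0.5 * (x / sigma_q)**2 - np.log(sigma_q * np.sqrt(2.0 * np.pi))

n = 50_000
x_p = rng.normal(0.0, 1.0, n)       # draws from the target (available here by construction)
x_q = rng.normal(0.0, sigma_q, n)   # draws from the reference

# Log-odds of "drawn from the target", up to the unknown log Z
h_p = log_p_tilde(x_p) - log_q(x_p)
h_q = log_p_tilde(x_q) - log_q(x_q)

def neg_log_lik(c):
    # Logistic (classification) log likelihood with c = log Z as the sole parameter;
    # log(sigmoid(z)) = -logaddexp(0, -z) keeps the evaluation numerically stable
    return np.logaddexp(0.0, -(h_p - c)).sum() + np.logaddexp(0.0, h_q - c).sum()

log_Z_hat = minimize_scalar(neg_log_lik).x
print(f"estimated log Z = {log_Z_hat:.4f}, true value = {0.5 * np.log(2.0 * np.pi):.4f}")

With equal sample sizes and a single free intercept, maximising this likelihood over c gives the NCE estimate of log Z; Geyer’s full reverse logistic regression treats several unnormalised densities at once through a multinomial version of the same regression.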