Practical pre-asymptotic diagnostic of Monte Carlo estimates in Bayesian inference and machine learning
Abstract: I discuss the use of the Pareto-k diagnostic as a simple and practical approach for estimating both the required minimum sample size and the empirical pre-asymptotic convergence rate of Monte Carlo estimates. Even when a Monte Carlo estimate has finite variance by construction, its pre-asymptotic behaviour and convergence rate can be very different from the asymptotic behaviour implied by the central limit theorem. I demonstrate the diagnostic with practical examples from importance sampling, stochastic optimization, and variational inference, which are commonly used in Bayesian inference and machine learning.
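To make the idea concrete, the sketch below shows one way such a diagnostic could be computed in Python: fit a generalized Pareto distribution to the largest draws (for example, importance ratios) and read off the estimated shape parameter k-hat. The function name `pareto_k_diagnostic`, the 20% tail fraction, and the use of SciPy's maximum-likelihood fit are illustrative assumptions, not the paper's reference implementation; the paper's own tail-fitting procedure and thresholds may differ.

```python
# Minimal sketch of a Pareto-k style tail diagnostic (illustrative only).
import numpy as np
from scipy.stats import genpareto


def pareto_k_diagnostic(draws, tail_fraction=0.2):
    """Fit a generalized Pareto distribution to the upper tail of `draws`
    (e.g. importance ratios) and return the estimated shape parameter k-hat."""
    draws = np.asarray(draws, dtype=float)
    s = draws.size
    # Size of the tail sample; 20% of the draws is a simple heuristic choice.
    m = max(5, int(np.ceil(tail_fraction * s)))
    sorted_draws = np.sort(draws)
    # Exceedances over a tail threshold (the largest draw not in the tail sample).
    threshold = sorted_draws[-m - 1]
    exceedances = sorted_draws[-m:] - threshold
    # Maximum-likelihood GPD fit with the location parameter fixed at zero.
    k_hat, _, _ = genpareto.fit(exceedances, floc=0)
    return k_hat


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Toy example: importance ratios with a heavy right tail.
    ratios = np.exp(rng.normal(0.0, 2.0, size=4000))
    k_hat = pareto_k_diagnostic(ratios)
    # Rule of thumb from the PSIS literature: k-hat below about 0.7 suggests
    # the Monte Carlo estimate is reliable at practical sample sizes.
    print(f"Pareto k-hat estimate: {k_hat:.2f}")
```

Larger k-hat values indicate heavier tails and hence a larger minimum sample size and a slower pre-asymptotic convergence rate than the central limit theorem would suggest.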