Variational inference with continuously-indexed normalizing flows
Anthony Caterini, Rob Cornish, Dino Sejdinovic, Arnaud Doucet
Proceedings of the Thirty-Seventh Conference on Uncertainty in Artificial Intelligence, PMLR 161:44-53, 2021.
Abstract
Continuously-indexed flows (CIFs) have recently achieved improvements over baseline normalizing flows on a variety of density estimation tasks. CIFs do not possess a closed-form marginal density, and so, unlike standard flows, cannot be plugged in directly to a variational inference (VI) scheme in order to produce a more expressive family of approximate posteriors. However, we show here how CIFs can be used as part of an auxiliary VI scheme to formulate and train expressive posterior approximations in a natural way. We exploit the conditional independence structure of multi-layer CIFs to build the required auxiliary inference models, which we show empirically yield low-variance estimators of the model evidence. We then demonstrate the advantages of CIFs over baseline flows in VI problems when the posterior distribution of interest possesses a complicated topology, obtaining improved results in both the Bayesian inference and surrogate maximum likelihood settings.
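To illustrate the scheme described in the abstract: with an auxiliary variable u, a joint variational density q(z, u), and an auxiliary inference model r(u | z), the generic auxiliary variational bound is log p(x) >= E_{q(z,u)}[log p(x, z) + log r(u | z) - log q(z, u)], which stays tractable even when the marginal q(z) = \int q(z | u) q(u) du has no closed form, as is the case for CIFs. Below is a minimal sketch of a single-sample Monte Carlo estimator of this bound in PyTorch; the callable names (log_joint, q_sampler, log_q, log_r) are placeholders assumed for illustration, not the paper's actual interface.

    import torch

    def auxiliary_elbo(log_joint, q_sampler, log_q, log_r, n_samples=1):
        """Monte Carlo estimate of the auxiliary variational lower bound
        E_{q(z,u)}[ log p(x,z) + log r(u|z) - log q(z,u) ].

        log_joint(z): log p(x, z) for the fixed observed data x
        q_sampler(n): (z, u) drawn from q(z, u) via reparameterization
        log_q(z, u):  log q(z, u) = log q(u) + log q(z | u)
        log_r(u, z):  log density of the auxiliary inference model r(u | z)
        """
        # Reparameterized samples keep the estimator differentiable in the
        # variational parameters, so the bound can be maximized by SGD.
        z, u = q_sampler(n_samples)
        bound = log_joint(z) + log_r(u, z) - log_q(z, u)
        return bound.mean()

This reproduces only the generic auxiliary-VI estimator; the paper's specific contribution, per the abstract, is constructing r from the conditional independence structure of multi-layer CIFs so that the resulting evidence estimators have low variance, which this sketch does not attempt to replicate.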
Cite this Paper
BibTeX
@InProceedings{pmlr-v161-caterini21a,
  title = {Variational inference with continuously-indexed normalizing flows},
  author = {Caterini, Anthony and Cornish, Rob and Sejdinovic, Dino and Doucet, Arnaud},
  booktitle = {Proceedings of the Thirty-Seventh Conference on Uncertainty in Artificial Intelligence},
  pages = {44--53},
  year = {2021},
  editor = {de Campos, Cassio and Maathuis, Marloes H.},
  volume = {161},
  series = {Proceedings of Machine Learning Research},
  month = {27--30 Jul},
  publisher = {PMLR},
  pdf = {https://proceedings.mlr.press/v161/caterini21a/caterini21a.pdf},
  url = {https://proceedings.mlr.press/v161/caterini21a.html},
  abstract = {Continuously-indexed flows (CIFs) have recently achieved improvements over baseline normalizing flows on a variety of density estimation tasks. CIFs do not possess a closed-form marginal density, and so, unlike standard flows, cannot be plugged in directly to a variational inference (VI) scheme in order to produce a more expressive family of approximate posteriors. However, we show here how CIFs can be used as part of an auxiliary VI scheme to formulate and train expressive posterior approximations in a natural way. We exploit the conditional independence structure of multi-layer CIFs to build the required auxiliary inference models, which we show empirically yield low-variance estimators of the model evidence. We then demonstrate the advantages of CIFs over baseline flows in VI problems when the posterior distribution of interest possesses a complicated topology, obtaining improved results in both the Bayesian inference and surrogate maximum likelihood settings.}
}
Endnote
%0 Conference Paper
%T Variational inference with continuously-indexed normalizing flows
%A Anthony Caterini
%A Rob Cornish
%A Dino Sejdinovic
%A Arnaud Doucet
%B Proceedings of the Thirty-Seventh Conference on Uncertainty in Artificial Intelligence
%C Proceedings of Machine Learning Research
%D 2021
%E Cassio de Campos
%E Marloes H. Maathuis
%F pmlr-v161-caterini21a
%I PMLR
%P 44--53
%U https://proceedings.mlr.press/v161/caterini21a.html
%V 161
%X Continuously-indexed flows (CIFs) have recently achieved improvements over baseline normalizing flows on a variety of density estimation tasks. CIFs do not possess a closed-form marginal density, and so, unlike standard flows, cannot be plugged in directly to a variational inference (VI) scheme in order to produce a more expressive family of approximate posteriors. However, we show here how CIFs can be used as part of an auxiliary VI scheme to formulate and train expressive posterior approximations in a natural way. We exploit the conditional independence structure of multi-layer CIFs to build the required auxiliary inference models, which we show empirically yield low-variance estimators of the model evidence. We then demonstrate the advantages of CIFs over baseline flows in VI problems when the posterior distribution of interest possesses a complicated topology, obtaining improved results in both the Bayesian inference and surrogate maximum likelihood settings.
APA
Caterini, A., Cornish, R., Sejdinovic, D. & Doucet, A. (2021). Variational inference with continuously-indexed normalizing flows. Proceedings of the Thirty-Seventh Conference on Uncertainty in Artificial Intelligence, in Proceedings of Machine Learning Research 161:44-53. Available from https://proceedings.mlr.press/v161/caterini21a.html.