Edinburgh Research Archive

Neural density estimation and likelihood-free inference

dc.contributor.advisor
Murray, Iain
en
dc.contributor.advisor
Williams, Chris
en
dc.contributor.author
Papamakarios, Georgios
en
dc.contributor.sponsor
Engineering and Physical Sciences Research Council (EPSRC)
en
dc.date.accessioned
2019-11-22T11:40:21Z
dc.date.available
2019-11-22T11:40:21Z
dc.date.issued
2019-11-23
dc.description.abstract
I consider two problems in machine learning and statistics: the problem of estimating the joint probability density of a collection of random variables, known as density estimation, and the problem of inferring model parameters when their likelihood is intractable, known as likelihood-free inference. The contribution of the thesis is a set of new methods for addressing these problems that are based on recent advances in neural networks and deep learning.

The first part of the thesis is about density estimation. The joint probability density of a collection of random variables is a useful mathematical description of their statistical properties, but can be hard to estimate from data, especially when the number of random variables is large. Traditional density-estimation methods such as histograms or kernel density estimators are effective for a small number of random variables, but scale badly as the number increases. In contrast, models for density estimation based on neural networks scale better with the number of random variables, and can incorporate domain knowledge in their design. My main contribution is Masked Autoregressive Flow, a new model for density estimation based on a bijective neural network that transforms random noise to data. At the time of its introduction, Masked Autoregressive Flow achieved state-of-the-art results in general-purpose density estimation. Since its publication, Masked Autoregressive Flow has contributed to the broader understanding of neural density estimation, and has influenced subsequent developments in the field.

The second part of the thesis is about likelihood-free inference. Typically, a statistical model can be specified either as a likelihood function that describes the statistical relationship between model parameters and data, or as a simulator that can be run forward to generate data. Specifying a statistical model as a simulator can offer greater modelling flexibility and can produce more interpretable models, but can also make inference of model parameters harder, as the likelihood of the parameters may no longer be tractable. Traditional techniques for likelihood-free inference such as approximate Bayesian computation rely on simulating data from the model, but often require a large number of simulations to produce accurate results. In this thesis, I cast the problem of likelihood-free inference as a density-estimation problem, and address it with neural density models. My main contribution is the introduction of two new methods for likelihood-free inference: Sequential Neural Posterior Estimation (Type A), which estimates the posterior, and Sequential Neural Likelihood, which estimates the likelihood. Both methods use a neural density model to estimate the posterior/likelihood, and a sequential training procedure to guide simulations. My experiments show that the proposed methods produce accurate results, and are often orders of magnitude faster than alternative methods based on approximate Bayesian computation.
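The abstract's description of Masked Autoregressive Flow as "a bijective neural network that transforms random noise to data" refers to the change-of-variables construction of affine autoregressive flows. A minimal NumPy sketch of one such layer in two dimensions follows; the hand-set linear `conditioner`, and all function names, are hypothetical stand-ins for the masked neural network used in the actual model, kept tiny so the invertibility and Jacobian bookkeeping are visible:

```python
import numpy as np

# Minimal sketch of one affine autoregressive (MAF-style) layer in 2-D.
# The conditioner is a hand-set linear function, a hypothetical stand-in
# for the masked neural network used in the real model.

def conditioner(x, i):
    # mean mu_i and log-scale alpha_i may depend only on x_1 .. x_{i-1}
    if i == 0:
        return 0.5, 0.1                 # constants: no preceding variables
    return 0.3 * x[0], 0.2 * x[0]       # depend on x_0 only

def forward(u):
    """Map noise u -> data x, one dimension at a time."""
    x = np.zeros_like(u)
    for i in range(len(u)):
        mu, alpha = conditioner(x, i)
        x[i] = u[i] * np.exp(alpha) + mu
    return x

def inverse_and_logdet(x):
    """Map data x -> noise u; the Jacobian log-determinant is -sum(alpha)."""
    u = np.zeros_like(x)
    logdet = 0.0
    for i in range(len(x)):
        mu, alpha = conditioner(x, i)
        u[i] = (x[i] - mu) * np.exp(-alpha)
        logdet -= alpha
    return u, logdet

def log_density(x):
    # Change of variables: log p(x) = log N(u; 0, I) + log |det du/dx|
    u, logdet = inverse_and_logdet(x)
    return -0.5 * np.sum(u**2) - 0.5 * len(x) * np.log(2 * np.pi) + logdet
```

Because each dimension depends only on earlier ones, the inverse is computed dimension-by-dimension and the Jacobian is triangular, which is what makes the density of such a flow cheap to evaluate exactly.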
en
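The sequential training procedure described in the abstract (a density model of the likelihood, refitted each round on simulations drawn from the current posterior estimate) can be illustrated with a deliberately toy stand-in. Everything here is an assumption for illustration, not the thesis code: the simulator is x | theta ~ N(theta, 1), the prior is N(0, 5^2), and a least-squares Gaussian surrogate replaces the neural density model:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy illustration of a sequential likelihood-estimation loop.
# Simulator: x | theta ~ N(theta, 1). Prior: theta ~ N(0, 5^2).
# A Gaussian surrogate q(x | theta) = N(a*theta + b, s^2), fitted by
# least squares, stands in for the neural density model.
x_obs = 2.0
grid = np.linspace(-10.0, 10.0, 2001)       # discretised theta axis
log_prior = -0.5 * (grid / 5.0) ** 2

def fit_surrogate(thetas, xs):
    """Least-squares fit of the mean a*theta + b; residual std gives s."""
    A = np.column_stack([thetas, np.ones_like(thetas)])
    (a, b), *_ = np.linalg.lstsq(A, xs, rcond=None)
    s = np.std(xs - (a * thetas + b)) + 1e-6
    return a, b, s

proposal = np.exp(log_prior - log_prior.max())
proposal /= proposal.sum()                  # round 1 proposes from the prior
thetas_all, xs_all = [], []
for _ in range(3):                          # sequential rounds
    thetas = rng.choice(grid, size=200, p=proposal)
    xs = thetas + rng.standard_normal(200)  # run the simulator
    thetas_all.append(thetas)
    xs_all.append(xs)
    a, b, s = fit_surrogate(np.concatenate(thetas_all), np.concatenate(xs_all))
    log_lik = -0.5 * ((x_obs - (a * grid + b)) / s) ** 2 - np.log(s)
    log_post = log_prior + log_lik          # posterior ∝ prior × surrogate likelihood
    proposal = np.exp(log_post - log_post.max())
    proposal /= proposal.sum()              # next round proposes from the posterior

post_mean = float(np.sum(grid * proposal))  # close to the exact value 2*25/26
```

The point of the loop is the one the abstract makes: later rounds spend simulations where the current posterior puts mass, rather than spreading them over the whole prior, which is why these methods need far fewer simulations than rejection-style approximate Bayesian computation.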
dc.identifier.uri
https://hdl.handle.net/1842/36394
dc.language.iso
en
dc.publisher
The University of Edinburgh
en
dc.relation.hasversion
G. Papamakarios, T. Pavlakou, and I. Murray. Masked autoregressive flow for density estimation. Advances in Neural Information Processing Systems 30, 2017
en
dc.relation.hasversion
G. Papamakarios and I. Murray. Fast ε-free inference of simulation models with Bayesian conditional density estimation. Advances in Neural Information Processing Systems 29, 2016
en
dc.relation.hasversion
G. Papamakarios, D. C. Sterratt, and I. Murray. Sequential neural likelihood: Fast likelihood-free inference with autoregressive flows. Proceedings of the 22nd International Conference on Artificial Intelligence and Statistics, 2019
en
dc.relation.hasversion
K. Gregor, G. Papamakarios, F. Besse, L. Buesing, and T. Weber. Temporal difference variational auto-encoder. Proceedings of the 7th International Conference on Learning Representations, 2019
en
dc.relation.hasversion
G. Papamakarios. Preprocessed datasets for MAF experiments, 2018. URL https://doi.org/10.5281/zenodo.1161203
en
dc.relation.hasversion
G. Papamakarios and I. Murray. Distilling intractable generative models. Probabilistic Integration Workshop at Neural Information Processing Systems, 2015
en
dc.subject
probabilistic relationships
en
dc.subject
computer simulations
en
dc.subject
density estimation
en
dc.subject
likelihood-free inference
en
dc.subject
neural networks
en
dc.subject
Masked Autoregressive Flow
en
dc.title
Neural density estimation and likelihood-free inference
en
dc.type
Thesis or Dissertation
en
dc.type.qualificationlevel
Doctoral
en
dc.type.qualificationname
PhD Doctor of Philosophy
en

Files

Original bundle

Name:
Papamakarios2019.pdf
Size:
6.41 MB
Format:
Adobe Portable Document Format