Fast statistical inference with neural networks and amortisation: Golden ticket or red herring?



Neural networks can provide solutions to tasks that were inconceivable just a few years ago and have benefitted society in numerous ways. These benefits primarily stem from a property often referred to as “amortisation”: Training a neural network usually requires significant effort and resources but, once trained, the network can solve similar problems repeatedly and rapidly with virtually no additional computational cost. Hence, the substantial initial cost of training a neural network is “amortised” over time. Amortisation can also be used to enable fast inference with parametric statistical models: Once a network is trained using observational data as input and inferential statements (e.g., point parameter estimates) as output, the network can perform inference with future data in a tiny fraction of the computing time needed by conventional likelihood or Monte Carlo methods. These amortised inferential tools have several compelling advantages over classical methods: they do not require knowledge of the likelihood function, are relatively easy to implement, and facilitate inference at a substantially reduced computational cost. In this lecture I will first give a brief review of recent work that has leveraged the property of “amortisation” in statistical inference in the context of (spatial) environmental and geophysical applications. I will then evaluate the merits and drawbacks of amortised inference from a statistician’s perspective and conclude by outlining the challenges that need to be overcome for these inferential tools to gain widespread acceptance.
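
    As a rough illustration of the idea described above (a hypothetical sketch, not drawn from the lecture or any particular library), the JAX example below trains a small network on simulated data to map a data set directly to a point estimate of the parameter that generated it. The simulator, network sizes, and training settings are all illustrative choices; the point is that, after the up-front training cost, inference for a new data set is a single forward pass.

    ```python
    # Minimal, hypothetical sketch of amortised point estimation:
    # train a network on simulated (data, parameter) pairs, then reuse it
    # to estimate parameters for new data at negligible cost.
    import jax
    import jax.numpy as jnp

    key = jax.random.PRNGKey(0)
    n_obs = 50        # observations per data set (illustrative choice)
    n_train = 4096    # number of simulated training data sets

    # Simulator: each data set is n_obs draws from N(theta, 1), theta ~ U(-3, 3).
    key, k1, k2 = jax.random.split(key, 3)
    theta = jax.random.uniform(k1, (n_train, 1), minval=-3.0, maxval=3.0)
    data = theta + jax.random.normal(k2, (n_train, n_obs))

    def init_params(key, sizes=(n_obs, 64, 64, 1)):
        params = []
        for m, n in zip(sizes[:-1], sizes[1:]):
            key, sub = jax.random.split(key)
            w = jax.random.normal(sub, (m, n)) * jnp.sqrt(2.0 / m)
            params.append((w, jnp.zeros(n)))
        return params

    def forward(params, x):
        for w, b in params[:-1]:
            x = jax.nn.relu(x @ w + b)
        w, b = params[-1]
        return x @ w + b

    def loss(params, x, y):
        # Squared-error loss, so the network targets a point estimate of theta.
        return jnp.mean((forward(params, x) - y) ** 2)

    @jax.jit
    def sgd_step(params, x, y, lr=1e-3):
        grads = jax.grad(loss)(params, x, y)
        return jax.tree_util.tree_map(lambda p, g: p - lr * g, params, grads)

    params = init_params(jax.random.PRNGKey(1))
    for _ in range(2000):                 # the up-front (amortised) training cost
        params = sgd_step(params, data, theta)

    # Amortisation pay-off: estimation for new data is one cheap forward pass.
    key, k3 = jax.random.split(key)
    new_data = 1.7 + jax.random.normal(k3, (1, n_obs))   # truth: theta = 1.7
    theta_hat = forward(params, new_data)
    print("estimated theta:", float(theta_hat[0, 0]))
    ```

    In this toy setting the same trained network can be applied to any number of future data sets of the same form, which is what makes the initial simulation and training expense worthwhile.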
