Softcover ISBN: 978-3-98547-053-2
Product Code: EMSZLEC/30
List Price: $45.00
AMS Member Price: $36.00
Book Details
EMS Zurich Lectures in Advanced Mathematics
Volume: 30; 2023; 171 pp
MSC: Primary 62; Secondary 35; 65
Bayesian methods based on Gaussian process priors are frequently used in statistical inverse problems arising from partial differential equations (PDEs). They can be implemented by Markov chain Monte Carlo (MCMC) algorithms. The underlying statistical models are naturally high- or infinite-dimensional, and this book presents a rigorous mathematical analysis of the statistical performance and algorithmic complexity of such methods in a natural setting of non-linear random design regression.
Due to the non-linearity present in many of these inverse problems, the natural least squares functionals are non-convex, and the Bayesian paradigm presents an attractive alternative to optimization-based approaches. This book develops a general theory of Bayesian inference for non-linear forward maps and rigorously treats two PDE model examples arising from Darcy's problem and a Schrödinger equation. The focus is initially on the statistical consistency of Gaussian process methods and then moves on to local fluctuations and approximations of posterior distributions by Gaussian or log-concave measures whose curvature is described by the PDE mapping properties of the underlying “information operators”. Applications to the algorithmic runtime of gradient-based MCMC methods are discussed, as well as computation time lower bounds for the worst-case performance of some algorithms.
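The description above mentions Gaussian priors, non-convex least squares functionals, and MCMC sampling only in passing. Purely as an illustrative sketch (not code from the book), the following preconditioned Crank–Nicolson (pCN) sampler shows how a posterior arising from a standard Gaussian prior and a non-linear random design regression model can be explored; pCN is a gradient-free MCMC method, whereas the book also analyzes gradient-based variants. The forward map, design, noise level, and dimension below are all hypothetical choices made for this example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes: a discretised parameter of dimension d observed at
# n random design points with noise level sigma (toy choices, not from the book).
d, n, sigma = 20, 50, 0.1

def forward_map(theta, x):
    # A smooth, non-linear toy forward map G(theta)(x) for a design point x in [0, 1].
    return np.tanh(theta @ np.cos(np.arange(1, d + 1) * x))

# Synthetic random-design regression data Y_i = G(theta_true)(X_i) + noise.
X = rng.uniform(0.0, 1.0, size=n)
theta_true = rng.normal(size=d) / np.arange(1, d + 1)
Y = np.array([forward_map(theta_true, x) for x in X]) + sigma * rng.normal(size=n)

def log_likelihood(theta):
    residuals = Y - np.array([forward_map(theta, x) for x in X])
    return -0.5 * np.sum(residuals ** 2) / sigma ** 2

def pcn_sampler(n_iter=5000, beta=0.2):
    # pCN proposals leave the N(0, I) prior invariant, so the accept/reject
    # step involves only the likelihood ratio.
    theta = np.zeros(d)
    ll = log_likelihood(theta)
    samples = []
    for _ in range(n_iter):
        proposal = np.sqrt(1.0 - beta ** 2) * theta + beta * rng.normal(size=d)
        ll_prop = log_likelihood(proposal)
        if np.log(rng.uniform()) < ll_prop - ll:
            theta, ll = proposal, ll_prop
        samples.append(theta.copy())
    return np.array(samples)

samples = pcn_sampler()
print("posterior mean, first 5 coordinates:", samples[2500:].mean(axis=0)[:5])
```

In the infinite-dimensional setting treated in the book, the Gaussian prior would typically be a Gaussian process truncated at some resolution level; the plain N(0, I) prior here stands in for its coefficients.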
A publication of the European Mathematical Society (EMS). Distributed within the Americas by the American Mathematical Society.
Readership
Graduate students and research mathematicians interested in probability, statistics, and inverse problems with partial differential equations.