Academy of Mathematics and Systems Science, CAS Colloquia & Seminars

Speaker: Prof. Liang Yan, Southeast University
Inviter:
Title: Stein variational gradient descent with local approximations for Bayesian inference
Time & Venue: 2021.09.17 09:30-10:30, Tencent Meeting ID: 837 778 5033 (https://meeting.tencent.com/p/8377785033?rs=25, password: 067269)

Abstract: Bayesian computation plays an important role in modern machine learning and statistics as a means of reasoning about uncertainty. A key computational challenge in Bayesian inference is to develop efficient techniques to approximate, or draw samples from, posterior distributions. Stein variational gradient descent (SVGD) has been shown to be a powerful approximate inference algorithm for this task. However, vanilla SVGD requires calculating the gradient of the target density and cannot be applied when the gradient is unavailable or too expensive to evaluate. In this talk, we explore one way to address this challenge: constructing a local surrogate for the target distribution from which the gradient can be obtained in a far more computationally feasible manner. More specifically, we approximate the forward model with a deep neural network (DNN) trained on a carefully chosen training set, whose design also determines the quality of the surrogate. To this end, we propose a general adaptation procedure that refines the local approximation online without destroying the convergence of the resulting SVGD. This significantly reduces the computational cost of SVGD and leads to a suite of algorithms that are straightforward to implement. The new algorithm is illustrated on a set of challenging Bayesian inverse problems, and numerical experiments demonstrate a clear improvement in performance and applicability over standard SVGD.
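
For readers unfamiliar with the method, the sketch below shows one SVGD iteration in NumPy with an RBF kernel and the median bandwidth heuristic. It is a minimal illustration, not the speaker's implementation: the names svgd_update and grad_logp and the toy Gaussian target are assumptions for this sketch, and in the surrogate setting described in the abstract grad_logp would instead be the automatic-differentiation gradient of the DNN-based log-posterior.

    import numpy as np

    def svgd_update(X, grad_logp, h=None, eps=1e-2):
        # One SVGD step for n particles X of shape (n, d),
        # pushed toward a target whose score function is grad_logp.
        n = X.shape[0]
        sq = np.sum(X**2, axis=1)
        d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T    # pairwise squared distances
        if h is None:
            h = np.median(d2) / np.log(n + 1.0) + 1e-8    # median bandwidth heuristic
        K = np.exp(-d2 / h)                               # RBF kernel matrix
        G = grad_logp(X)                                  # scores at the particles, (n, d)
        # Kernel-smoothed gradient (attraction) plus kernel-gradient term (repulsion,
        # which keeps the particles spread out as an approximation of the posterior).
        repulse = (2.0 / h) * (np.sum(K, axis=1)[:, None] * X - K @ X)
        phi = (K @ G + repulse) / n
        return X + eps * phi

    # Toy usage: drive particles toward a 2-D standard Gaussian, whose score is -x.
    # With a DNN surrogate, grad_logp would backpropagate through the trained network.
    X = np.random.randn(50, 2) * 3.0 + 5.0
    for _ in range(500):
        X = svgd_update(X, lambda Z: -Z)

The repulsion term is what distinguishes SVGD from plain gradient ascent on the log-density: without it, all particles would collapse onto the posterior mode instead of approximating the full distribution.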