Open Access (ePrints)

Stein Points

Lookup NU author(s): Professor Chris Oates

Licence

This work is licensed under a Creative Commons Attribution 4.0 International License (CC BY 4.0).


Abstract

© 2018 by the Authors. All rights reserved. An important task in computational statistics and machine learning is to approximate a posterior distribution p(x) with an empirical measure supported on a set of representative points {x_i}_{i=1}^n. This paper focuses on methods where the selection of points is essentially deterministic, with an emphasis on achieving accurate approximation when n is small. To this end, we present Stein Points. The idea is to exploit either a greedy or a conditional gradient method to iteratively minimise a kernel Stein discrepancy between the empirical measure and p(x). Our empirical results demonstrate that Stein Points enable accurate approximation of the posterior at modest computational cost. In addition, theoretical results are provided to establish convergence of the method.
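As an illustration of the greedy variant described above, the following is a minimal sketch, not the authors' implementation: each new point is chosen from a candidate grid to minimise the increase in squared kernel Stein discrepancy. It assumes a one-dimensional standard-normal target (score s(x) = -x) and an inverse multiquadric base kernel k(x,y) = (1 + (x-y)^2)^(-1/2); the function names and the grid-search candidate set are illustrative choices, not from the paper.

```python
import numpy as np

def stein_kernel(x, y):
    # Stein-modified IMQ kernel for target N(0,1), derived from
    # k0(x,y) = dxdy k + s(x) dy k + s(y) dx k + s(x)s(y) k with s(x) = -x.
    d = x - y
    u = 1.0 + d * d
    return u**-1.5 * (1.0 - d * d) - 3.0 * d * d * u**-2.5 + x * y * u**-0.5

def stein_points(n, grid):
    # Greedily pick n points: each step minimises the increment to the
    # squared KSD of the empirical measure, i.e. k0(x,x) + 2*sum_j k0(x_j, x).
    pts = []
    for _ in range(n):
        obj = stein_kernel(grid, grid)
        for xj in pts:
            obj = obj + 2.0 * stein_kernel(xj, grid)
        pts.append(grid[np.argmin(obj)])
    return np.array(pts)

grid = np.linspace(-4.0, 4.0, 801)  # candidate set for the grid search
pts = stein_points(5, grid)
print(pts)
```

Because k0(x,x) = 1 + x^2 for this target, the first selected point is the mode x = 0; subsequent points spread out to cover the distribution, which is the qualitative behaviour the paper reports for small n.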


Publication metadata

Author(s): Chen WY, Mackey L, Gorham J, Briol F-X, Oates CJ

Publication type: Conference Proceedings (inc. Abstract)

Publication status: Published

Conference Name: Proceedings of the 35th International Conference on Machine Learning

Year of Conference: 2018

Pages: 843-852

Online publication date: 15/07/2018

Acceptance date: 02/04/2018

Date deposited: 02/01/2020

ISSN: 1533-7928

Publisher: Proceedings of Machine Learning Research

URL: https://icml.cc/Conferences/2018/Schedule?type=Poster


Series Title: Proceedings of Machine Learning Research

ISBN: 9781510867963
