Quasi-Newton Methods for Noisy Objectives

Paper: [pdf] [BibTeX]
Hennig, P.: Fast probabilistic optimization from noisy gradients
Proceedings of the 30th International Conference on Machine Learning (ICML); Atlanta, Georgia; June 2013 (Please cite this work when you use this algorithm)
Code: NoisyNewton_v0.1.zip (after downloading, please refer to the README.txt file)

As we showed in our earlier work on quasi-Newton methods, these algorithms are instances of Gaussian regression. It is well known that Gaussian regression extends analytically to observations corrupted by Gaussian noise. In this paper, we investigate the utility of this idea as a low-cost training method for large-scale machine learning models, such as neural networks, for which stochastic gradient descent remains the most widely used optimizer.
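To make the central observation concrete, the following is a minimal sketch of Gaussian regression from observations corrupted by Gaussian noise: the noise variance is simply added to the diagonal of the kernel Gram matrix, and the posterior remains analytic. This is not the released code (see README.txt for that interface); the actual algorithm applies the same mechanism to Hessian estimates built from noisy gradient differences, and all names and parameters below are purely illustrative.

import numpy as np

def gp_posterior(X, y, Xstar, noise_var, lengthscale=1.0, signal_var=1.0):
    """Posterior mean and covariance of a Gaussian process with a
    squared-exponential kernel, observed under i.i.d. Gaussian noise
    of variance noise_var."""
    def k(A, B):
        d = A[:, None, :] - B[None, :, :]
        return signal_var * np.exp(-0.5 * np.sum(d ** 2, axis=-1) / lengthscale ** 2)

    # Gaussian observation noise enters only as a diagonal term here;
    # everything else is identical to noise-free regression.
    K = k(X, X) + noise_var * np.eye(len(X))
    Ks = k(Xstar, X)
    Kss = k(Xstar, Xstar)

    alpha = np.linalg.solve(K, y)
    mean = Ks @ alpha
    cov = Kss - Ks @ np.linalg.solve(K, Ks.T)
    return mean, cov

# Toy usage: regression from noisy evaluations of a 1-d function.
rng = np.random.default_rng(0)
X = rng.uniform(-3.0, 3.0, size=(20, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(20)
Xstar = np.linspace(-3.0, 3.0, 100)[:, None]
mean, cov = gp_posterior(X, y, Xstar, noise_var=0.1 ** 2)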

This is a research-grade implementation of a novel optimization algorithm. While there is considerable theoretical support for this being a very good nonlinear optimization method, the numerical implementation is nontrivial, and it is hard in general to design an optimizer that works well on most problems. We strongly encourage you to try the algorithm and read the paper, but we cannot guarantee that this code will always work. As research and development continue, we will update the algorithm when new insights become available. If you think you have found an interesting failure case, please consider writing to us, ideally with a simple, reduced test case.