Invited Speakers

Persi Diaconis, Stanford
Matthias Seeger, EPFL
Ben Calderhead, UCL
Jacek Gondzio, Edinburgh

Schedule

Accepted Papers

Organizers

Philipp Hennig, Tübingen
John Cunningham, Wash U
Michael Osborne, Oxford

We are grateful for financial support from the PASCAL2 network.

Announcement

The first international workshop on probabilistic numerics will take place on Saturday, 8 December 2012, at Lake Tahoe, Nevada, co-located with the Neural Information Processing Systems (NIPS) conference.

We invite contributions of recent results on the development and analysis of numerical methods based on probability theory. This includes, but is not limited to, optimization, sampling, linear algebra, quadrature, and the solution of differential equations.

Overview

The traditional remit of machine learning is inference on complex data. At the computational bottlenecks of our algorithms, we typically find a numerical problem: optimization, integration, sampling. These inner routines are often treated as black boxes, but many of these numerical tasks can themselves be viewed as learning problems. For example:

optimizers gather evaluations of an objective function and must decide where to evaluate next [1];
quadrature rules estimate an integral from a finite set of evaluations of the integrand [2,3,4] (a minimal sketch follows below);
adaptive samplers tune their proposal distributions using the samples drawn so far [5];
even linear-algebra quantities such as Gaussian probabilities can be computed by approximate inference [6].
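
To illustrate the quadrature case, integration can be phrased as Gaussian process inference over the integrand, in the spirit of [2,3,4]: conditioned on a few function evaluations, the integral has a closed-form posterior mean and variance. The sketch below is our own toy illustration, not code from any of the cited works; the kernel, its hyperparameters, and the test function are arbitrary choices.

    import numpy as np

    def bayes_quad(f, x, ell=1.0, s2=1.0, sigma2=1.0, jitter=1e-9):
        # Posterior over Z = E[f(x)] for x ~ N(0, sigma2), under a GP prior
        # f ~ GP(0, k) with k(x, x') = s2 * exp(-(x - x')^2 / (2 ell^2)).
        y = f(x)
        K = s2 * np.exp(-0.5 * (x[:, None] - x[None, :])**2 / ell**2)
        K += jitter * np.eye(len(x))
        # Closed-form kernel mean z_i = integral of k(x, x_i) N(x; 0, sigma2) dx.
        z = s2 * np.sqrt(ell**2 / (ell**2 + sigma2)) \
            * np.exp(-0.5 * x**2 / (ell**2 + sigma2))
        w = np.linalg.solve(K, z)   # Bayesian quadrature weights, z^T K^{-1}
        mean = w @ y
        # Prior variance of Z, minus the part explained by the evaluations.
        var = s2 * np.sqrt(ell**2 / (ell**2 + 2 * sigma2)) - z @ w
        return mean, var

    # E[cos(x)] for x ~ N(0, 1); exact value is exp(-1/2) = 0.6065...
    m, v = bayes_quad(np.cos, np.linspace(-3.0, 3.0, 12))
    print(m, v)

Note that the vector w plays the role of classical quadrature weights, while the posterior variance quantifies the remaining numerical error, something a classical rule does not provide.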

Many of these problems can be seen as special cases of decision theory, active learning, or reinforcement learning. Work along these lines was pioneered over twenty years ago by Diaconis [7] and O'Hagan [2]. But modern desiderata for a numerical algorithm differ markedly from those common elsewhere in machine learning: numerical methods are "inner-loop" algorithms, used as black boxes by large groups of users on wildly different problems. As such, robustness and computational and memory cost matter more here than raw predictive power or convergence speed, and the availability of good implementations matters as well. These kinds of challenges are well suited to machine learning researchers, once the relevant community is brought together to discuss these topics.
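
To make the decision-theoretic view concrete, here is a minimal Bayesian optimization loop in the spirit of Jones et al. [1]: a Gaussian process surrogate is conditioned on past evaluations, and the next evaluation point is chosen to maximize expected improvement. This is a sketch under our own assumptions; the kernel, its hyperparameters, and the toy objective are illustrative, not prescriptions.

    import numpy as np
    from scipy.stats import norm

    def rbf(a, b, ell=0.3):
        # Squared-exponential kernel on scalar inputs.
        return np.exp(-0.5 * (a[:, None] - b[None, :])**2 / ell**2)

    def expected_improvement(xs, x, y, noise=1e-6):
        # GP posterior mean and variance at candidates xs, given data (x, y).
        K = rbf(x, x) + noise * np.eye(len(x))
        Ks = rbf(xs, x)
        mu = Ks @ np.linalg.solve(K, y)
        var = 1.0 - np.einsum('ij,ji->i', Ks, np.linalg.solve(K, Ks.T))
        sd = np.sqrt(np.clip(var, 1e-12, None))
        # Expected improvement over the best value seen so far (minimization).
        g = (y.min() - mu) / sd
        return sd * (g * norm.cdf(g) + norm.pdf(g))

    f = lambda t: np.sin(3 * t) + t**2              # toy objective on [-2, 2]
    x = np.array([-1.5, 0.0, 1.5]); y = f(x)        # three initial evaluations
    cand = np.linspace(-2.0, 2.0, 401)
    for _ in range(10):                             # ten adaptive evaluations
        xn = cand[np.argmax(expected_improvement(cand, x, y))]
        x, y = np.append(x, xn), np.append(y, f(xn))
    print("best x:", x[np.argmin(y)], "f(x):", y.min())

Each step trades off exploiting the surrogate's current minimum against exploring regions of high posterior uncertainty, which is exactly the active-learning flavour of the numerical problem.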

Some of the algorithms we use for numerical problems were developed generations ago. They have aged well, showing impressively good performance over a broad spectrum of problems. Of course, they also have a variety of shortcomings, which can be addressed by modern probabilistic models (see the work cited above). In the other direction, numerical mathematics, a much larger field than machine learning, brings experience, theoretical rigour, and a focus on computational performance to the table. So there is great potential for cross-fertilization between the machine learning and numerical mathematics communities. The main goals of this workshop are to survey existing probabilistic approaches to numerical problems, to bring the two communities into contact, and to identify open research questions at their intersection.

References

[1] D.R. Jones, M. Schonlau, and W.J. Welch.
Efficient global optimization of expensive black-box functions.
Journal of Global Optimization, 13(4):455-492, 1998.

[2] A. O'Hagan.
Some Bayesian numerical analysis.
In Bayesian Statistics, volume 4, pp. 345-363, 1992.

[3] C.E. Rasmussen and Z. Ghahramani.
Bayesian Monte Carlo.
In Advances in Neural Information Processing Systems, volume 15, pp. 489-496, 2003.

[4] T.P. Minka.
Deriving quadrature rules from Gaussian processes.
Technical report, Statistics Department, Carnegie Mellon University, 2000.

[5] H. Haario, E. Saksman, and J. Tamminen.
An adaptive Metropolis algorithm.
Bernoulli, 7(2):223-242, 2001.

[6] J.P. Cunningham, P. Hennig, and S. Lacoste-Julien.
Gaussian probabilities and expectation propagation.
arXiv:1111.6832 [stat.ML], November 2011.

[7] P. Diaconis.
Bayesian numerical analysis.
In S. Gupta and J. Berger, editors, Statistical Decision Theory and Related Topics IV, volume 1, pp. 163-175. Springer-Verlag, New York, 1988.