Distributionally Robust Optimization Techniques in Batch Bayesian Optimization

N. Rontsis, M. A. Osborne and P. J. Goulart

Journal of Machine Learning Research, vol. 21, no. 149, pp. 1-26, July 2020.

@article{ROG:2020,
  author = {N. Rontsis and M. A. Osborne and P. J. Goulart},
  title = {Distributionally Robust Optimization Techniques in Batch Bayesian Optimization},
  journal = {Journal of Machine Learning Research},
  year = {2020},
  volume = {21},
  number = {149},
  pages = {1-26},
  url = {https://jmlr.org/papers/v21/18-211.html}
}

We propose a novel, theoretically grounded acquisition function for batch Bayesian optimization informed by insights from distributionally robust optimization. Our acquisition function is a lower bound on the well-known Expected Improvement function, which requires a multi-dimensional Gaussian expectation over a piecewise affine function. The bound is computed by instead evaluating the best-case expectation over all probability distributions consistent with the same mean and variance as the original Gaussian distribution. We show that, unlike alternative approaches including Expected Improvement, our proposed acquisition function avoids multi-dimensional integrations entirely and can be calculated exactly as the solution of a convex optimization problem in the form of a tractable semidefinite program (SDP). Moreover, we prove that the solution of this SDP also yields exact numerical derivatives, which enable efficient optimization of the acquisition function. Numerical results suggest that our acquisition function performs very similarly to the computationally intractable exact Expected Improvement and considerably better than other heuristics.
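For illustration only, the sketch below evaluates a best-case expectation of the batch improvement over all distributions sharing a given mean and covariance, using the standard moment-based dual SDP for piecewise affine functions and the cvxpy modeling library. This is an assumption of how such a bound can be computed, not the paper's own implementation; the function name, variable names, and solver choice are illustrative.

import numpy as np
import cvxpy as cp

def best_case_batch_improvement(mu, Sigma, y_best):
    # Best-case expectation of the batch improvement
    #   max(0, y_best - xi_1, ..., y_best - xi_q)
    # over all distributions of xi with mean mu and covariance Sigma,
    # written as the standard moment-based dual semidefinite program.
    # (Illustrative sketch; details may differ from the paper's formulation.)
    mu = np.asarray(mu, dtype=float).reshape(-1)
    q = len(mu)
    # Second-order moment matrix Omega = [[Sigma + mu mu^T, mu], [mu^T, 1]]
    Omega = np.block([[Sigma + np.outer(mu, mu), mu[:, None]],
                      [mu[None, :], np.ones((1, 1))]])
    M = cp.Variable((q + 1, q + 1), symmetric=True)
    # One linear matrix inequality per affine piece of the improvement.
    # Piece k = 0 (no improvement, the zero function): M >= 0.
    constraints = [M >> 0]
    # Piece k = i (improvement at batch point i):
    #   M >= [[0, -e_i/2], [-e_i^T/2, y_best]]
    for i in range(q):
        A = np.zeros((q + 1, q + 1))
        A[i, q] = A[q, i] = -0.5
        A[q, q] = y_best
        constraints.append(M >> A)
    prob = cp.Problem(cp.Minimize(cp.trace(Omega @ M)), constraints)
    prob.solve(solver=cp.SCS)  # SCS ships with cvxpy and handles SDPs
    return prob.value

# Example: a batch of q = 2 candidate points under a GP posterior
value = best_case_batch_improvement(mu=np.array([0.3, -0.1]),
                                    Sigma=np.array([[1.0, 0.2], [0.2, 0.5]]),
                                    y_best=0.0)
print(value)

Because the acquisition value is obtained as the optimal value of a small SDP in the batch size q, no multi-dimensional Gaussian integration is required, which is the computational advantage highlighted in the abstract.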