About Bayesian Optimization


Bayesian optimization is a global optimization strategy for black-box and expensive-to-evaluate functions. Generic Bayesian optimization follows these steps:

  1. Build a surrogate function \(\hat{f}\) from the historical inputs \(\mathbf{X}\) and their observations \(\mathbf{y}\), where the surrogate is defined by its posterior mean and variance functions.

\[\hat{f}(\mathbf{x} \mid \mathbf{X}, \mathbf{y}) = \mathcal{N}(\mu(\mathbf{x} \mid \mathbf{X}, \mathbf{y}), \sigma^2(\mathbf{x} \mid \mathbf{X}, \mathbf{y}))\]
  1. Compute and maximize an acquisition function \(a\), defined by the outputs of the surrogate function, i.e., \(\mu(\mathbf{x} \mid \mathbf{X}, \mathbf{y})\) and \(\sigma^2(\mathbf{x} \mid \mathbf{X}, \mathbf{y})\).

\[\mathbf{x}^{*} = \underset{\mathbf{x}}{\arg \max} \ a(\mathbf{x} \mid \mu(\mathbf{x} \mid \mathbf{X}, \mathbf{y}), \sigma^2(\mathbf{x} \mid \mathbf{X}, \mathbf{y}))\]
  1. Evaluate the maximizer of the acquisition function with the true objective function \(f\), whose observation is corrupted by random noise \(\epsilon\).

\[y = f(\mathbf{x}^{*}) + \epsilon\]
  1. Update the historical inputs \(\mathbf{X}\) and their observations \(\mathbf{y}\) by appending the maximizer \(\mathbf{x}^{*}\) and its observation \(y\).
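The four steps above can be sketched as a minimal Bayesian optimization loop. This is an illustrative implementation, not the project's own code: it assumes a one-dimensional minimization problem, a Gaussian process surrogate with a squared-exponential kernel, expected improvement as the acquisition function, and random candidate search in place of a proper inner optimizer. The kernel length scale, noise level, and toy objective `f` are all made up for the example.

```python
import numpy as np
from scipy.stats import norm

def rbf_kernel(A, B, length_scale=0.5):
    # Squared-exponential (RBF) kernel between two sets of points.
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2.0 * A @ B.T
    return np.exp(-0.5 * d2 / length_scale**2)

def surrogate(X, y, X_test, noise=1e-2):
    # Step 1: GP posterior mean and variance given history (X, y).
    K = rbf_kernel(X, X) + noise * np.eye(len(X))
    Ks = rbf_kernel(X, X_test)
    Kss = rbf_kernel(X_test, X_test)
    K_inv = np.linalg.inv(K)
    mu = Ks.T @ K_inv @ y
    var = np.diag(Kss - Ks.T @ K_inv @ Ks)
    return mu, np.maximum(var, 1e-12)

def expected_improvement(mu, var, y_best):
    # Expected improvement for minimization.
    sigma = np.sqrt(var)
    z = (y_best - mu) / sigma
    return sigma * (z * norm.cdf(z) + norm.pdf(z))

rng = np.random.default_rng(0)

def f(x):
    # Hypothetical noisy black-box objective (step 3 evaluates this).
    return np.sin(3.0 * x[..., 0]) + 0.1 * rng.standard_normal(x.shape[:-1])

X = rng.uniform(-1.0, 2.0, (3, 1))   # initial design
y = f(X)
for _ in range(10):
    cand = rng.uniform(-1.0, 2.0, (1000, 1))
    mu, var = surrogate(X, y, cand)              # step 1
    acq = expected_improvement(mu, var, y.min()) # step 2
    x_star = cand[np.argmax(acq)]                # step 2: maximizer
    y_star = f(x_star[None])                     # step 3: observe
    X = np.vstack([X, x_star[None]])             # step 4: update history
    y = np.concatenate([y, y_star])

print("best input:", X[np.argmin(y)], "best observation:", y.min())
```

A real implementation would optimize the acquisition function with a gradient-based or multi-start optimizer rather than random candidates, and would fit the kernel hyperparameters instead of fixing them.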

This project executes this Bayesian optimization procedure. In particular, it includes several surrogate functions, such as Gaussian process regression and Student-\(t\) process regression, and various acquisition functions, such as probability of improvement, expected improvement, and Gaussian process upper confidence bound.
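For reference, the three acquisition functions named above are commonly defined as follows; this uses one standard convention (minimization of \(f\), with incumbent \(y^{+} = \min_i y_i\) and \(z = (y^{+} - \mu(\mathbf{x})) / \sigma(\mathbf{x})\)), and other sign conventions appear in the literature:

\[a_{\text{PI}}(\mathbf{x}) = \Phi(z)\]

\[a_{\text{EI}}(\mathbf{x}) = \sigma(\mathbf{x}) \left( z \Phi(z) + \phi(z) \right)\]

\[a_{\text{UCB}}(\mathbf{x}) = -\mu(\mathbf{x}) + \kappa \, \sigma(\mathbf{x})\]

where \(\Phi\) and \(\phi\) are the standard normal CDF and PDF, and \(\kappa > 0\) trades off exploitation against exploration.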