Statistical methods for designing computer experiments usually rely on a Gaussian process (GP) surrogate model, and typically aim at selecting design points (combinations of algorithmic and model parameters) that minimize the average prediction variance. Sequential or adaptive design strategies select design points one at a time, and use each collected observation to update the surrogate.
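The one-at-a-time selection described above can be sketched as follows. A useful property is that the GP posterior variance depends only on the design locations, not on the observed responses, so candidates can be ranked before running the simulator. The kernel, noise level, and 1-D toy setup are assumptions for illustration, not part of the method presented in the talk.

```python
import numpy as np

def rbf_kernel(A, B, lengthscale=0.2):
    """Squared-exponential kernel between two sets of 1-D points."""
    d2 = (A[:, None] - B[None, :]) ** 2
    return np.exp(-0.5 * d2 / lengthscale ** 2)

def avg_posterior_variance(X_design, X_ref, noise=1e-6):
    """Average GP posterior variance over a reference grid X_ref.

    Only the design locations matter here: the variance formula
    k(x,x) - k_x^T (K + noise*I)^{-1} k_x never uses the responses."""
    K = rbf_kernel(X_design, X_design) + noise * np.eye(len(X_design))
    Ks = rbf_kernel(X_design, X_ref)
    v = 1.0 - np.sum(Ks * np.linalg.solve(K, Ks), axis=0)
    return v.mean()

def next_design_point(X_design, candidates, X_ref):
    """Pick the candidate whose addition minimizes the average variance."""
    scores = [avg_posterior_variance(np.append(X_design, c), X_ref)
              for c in candidates]
    return candidates[int(np.argmin(scores))]

# Toy run: grow a design on [0, 1] from two initial points.
X_ref = np.linspace(0.0, 1.0, 101)       # grid where variance is averaged
candidates = np.linspace(0.0, 1.0, 21)   # finite candidate set
X = np.array([0.1, 0.9])
for _ in range(4):
    X = np.append(X, next_design_point(X, candidates, X_ref))
print(np.sort(X))
```

Each iteration is a greedy step: it evaluates the average-variance criterion for every candidate and keeps the best, which is the standard one-step-lookahead heuristic for this class of designs.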
In many applications, one software parameter controls the accuracy of the observation, and is directly related to the amount of computing time (e.g. the number of Monte Carlo runs, or the mesh size in an FEM simulation). We formulate the problem of allocating a budget of computing time over a finite set of candidate points for the goal mentioned above, namely minimizing the average prediction variance. This is a continuous optimization problem, which is moreover convex whenever the accuracy-vs-computing-time tradeoff function is concave. On the other hand, using non-concave weight functions can help identify sparse designs.
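A minimal numerical sketch of this budget-allocation problem, under the assumption that giving a point computing time t (e.g. t Monte Carlo runs) yields a concave accuracy gain, i.e. observation noise variance sigma2 / t. The kernel, budget, noise level, and use of a generic SLSQP solver are all illustrative choices, not the specific formulation from the talk.

```python
import numpy as np
from scipy.optimize import minimize

def rbf_kernel(A, B, lengthscale=0.2):
    """Squared-exponential kernel between two sets of 1-D points."""
    d2 = (A[:, None] - B[None, :]) ** 2
    return np.exp(-0.5 * d2 / lengthscale ** 2)

candidates = np.linspace(0.0, 1.0, 8)   # finite set of candidate points
X_ref = np.linspace(0.0, 1.0, 101)      # grid where variance is averaged
T, sigma2 = 10.0, 0.5                   # total budget, base noise variance
K = rbf_kernel(candidates, candidates)
Ks = rbf_kernel(candidates, X_ref)

def avg_variance(t):
    """Average posterior variance when point i receives computing time t[i].

    A point with t[i] near 0 gets huge noise variance sigma2 / t[i],
    so it is effectively dropped from the design."""
    D = np.diag(sigma2 / np.maximum(t, 1e-9))  # heteroscedastic noise
    v = 1.0 - np.sum(Ks * np.linalg.solve(K + D, Ks), axis=0)
    return v.mean()

# Continuous optimization over the simplex {t >= 0, sum(t) = T},
# starting from the uniform allocation.
n = len(candidates)
res = minimize(avg_variance, x0=np.full(n, T / n),
               method="SLSQP",
               bounds=[(1e-9, T)] * n,
               constraints=[{"type": "eq", "fun": lambda t: t.sum() - T}])
print(np.round(res.x, 3), avg_variance(res.x))
```

With the concave (1/t) noise model the objective is well behaved and a generic constrained solver suffices; an allocation component driven to (near) zero corresponds to dropping that candidate, which is how non-concave weight functions can produce sparse designs.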
At the end of this talk, we will present preliminary results concerning the adaptation of these techniques to black-box optimization. The goal is now to select design points, together with their associated computing times, in a sequential manner, in order to find a global minimizer of the sampled function.