Sequential Optimization in Locally Important Dimensions

20 Jan 2019  ·  Munir A. Winkel, Jonathan W. Stallings, Curt B. Storlie, Brian J. Reich

Optimizing an expensive, black-box function $f(\cdot)$ is challenging when its input space is high-dimensional. Sequential design frameworks first model $f(\cdot)$ with a surrogate function and then optimize an acquisition function to determine input settings to evaluate next. Optimization of both $f(\cdot)$ and the acquisition function benefits from effective dimension reduction. Global variable selection detects and removes input variables that do not affect $f(\cdot)$ anywhere in the input space. Further dimension reduction may be possible if we consider local variable selection around the current optimum estimate. We develop a sequential design algorithm called Sequential Optimization in Locally Important Dimensions (SOLID) that incorporates global and local variable selection to optimize a continuous, differentiable function. SOLID performs local variable selection by comparing the surrogate's predictions in a localized region around the estimated optimum with the $p$ alternative predictions made by removing each input variable in turn. The search space of the acquisition function is then restricted to the variables deemed locally active, placing greater emphasis on refining the surrogate model in locally active dimensions. A simulation study across three test functions and an application to the Sarcos robot dataset show that SOLID outperforms conventional approaches.
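The local variable selection idea can be sketched as follows. This is an illustrative reading of the abstract, not the paper's exact procedure: given a surrogate predictor and the current optimum estimate, we sample points in a localized region, then "remove" each variable by pinning it at its value at the estimated optimum and check whether the surrogate's predictions change appreciably. The function name, the box-shaped region, and the tolerance threshold are all assumptions made for this sketch.

```python
import numpy as np

def locally_active_dims(predict, x_star, radius=0.1, n_samples=200,
                        tol=1e-2, rng=None):
    """Flag locally active dimensions (illustrative sketch of SOLID's
    local variable selection step; details are assumed, not from the paper).

    predict : callable mapping an (n, p) array to surrogate predictions.
    x_star  : current optimum estimate, shape (p,).
    """
    rng = np.random.default_rng(rng)
    p = len(x_star)
    # Sample points in a localized region around the estimated optimum.
    X = x_star + rng.uniform(-radius, radius, size=(n_samples, p))
    base = predict(X)
    active = np.zeros(p, dtype=bool)
    for j in range(p):
        # "Remove" variable j by pinning it at its value in x_star.
        Xj = X.copy()
        Xj[:, j] = x_star[j]
        # If predictions barely move, dimension j is locally inactive,
        # and the acquisition search can be restricted to the others.
        active[j] = np.max(np.abs(base - predict(Xj))) > tol
    return active

# Example: a surrogate that ignores its second input near x_star = (0.5, 0.5).
surrogate = lambda X: X[:, 0] ** 2
print(locally_active_dims(surrogate, np.array([0.5, 0.5]), rng=0))
```

On this toy surrogate only the first dimension is flagged as locally active, so a restricted acquisition search would vary only that coordinate while holding the inactive one at its current value.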


Categories: Methodology
