Sequential Optimization in Locally Important Dimensions
Optimizing an expensive, black-box function f(⋅) is challenging when its input space is high-dimensional. Sequential design frameworks first model f(⋅) with a surrogate function and then optimize an acquisition function to determine the input settings to evaluate next. Optimization of both f(⋅) and the acquisition function benefits from effective dimension reduction. Global variable selection detects and removes input variables that do not affect f(⋅) anywhere in the input space. Further dimension reduction may be possible if we consider local variable selection around the current optimum estimate. We develop a sequential design algorithm called sequential optimization in locally important dimensions (SOLID) that incorporates global and local variable selection to optimize a continuous, differentiable function. SOLID performs local variable selection by comparing the surrogate's predictions in a localized region around the estimated optimum with the p alternative predictions made by removing each of the p input variables in turn. The search space of the acquisition function is then restricted to the variables deemed locally active, placing greater emphasis on refining the surrogate model in locally active dimensions. A simulation study across multiple test functions and an application to the Sarcos robot dataset show that SOLID outperforms conventional approaches. Supplementary materials for this article are available online.
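The local variable selection step described above can be illustrated with a minimal sketch. This is not the authors' implementation: the surrogate here is a hypothetical callable standing in for a fitted surrogate mean, "removing" a variable is approximated by pinning its coordinate to the value at the estimated optimum, and the sampling box, sample size, and threshold are illustrative choices.

```python
import numpy as np

def locally_active_dims(surrogate, x_hat, radius=0.1, n=256, tol=1e-6, seed=0):
    """Flag dimensions that matter in a box around x_hat.

    For each dimension j, compare the surrogate's predictions on points
    sampled near x_hat against the alternative predictions obtained by
    pinning coordinate j to x_hat[j] (a crude stand-in for removing the
    variable).  A near-zero discrepancy suggests dimension j is locally
    inactive, so the acquisition search could skip it.
    """
    rng = np.random.default_rng(seed)
    p = len(x_hat)
    X = x_hat + rng.uniform(-radius, radius, size=(n, p))  # local region
    base = surrogate(X)
    scores = np.empty(p)
    for j in range(p):
        Xj = X.copy()
        Xj[:, j] = x_hat[j]                 # "remove" variable j locally
        scores[j] = np.mean((surrogate(Xj) - base) ** 2)
    return scores, scores > tol

# Toy surrogate mean that depends only on x0 and x1 near the optimum.
surrogate = lambda X: X[:, 0] ** 2 + np.sin(X[:, 1])
scores, active = locally_active_dims(surrogate, np.array([0.5, 0.0, 0.3]))
print(active)
```

In this toy example the third coordinate never enters the surrogate, so its discrepancy score is exactly zero and it is flagged as locally inactive, while the first two coordinates are retained.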