Evolutionary Machine Learning With Minions: A Case Study in Feature Selection
Many decisions in a machine learning (ML) pipeline involve nondifferentiable and discontinuous objectives and search spaces. Examples include feature selection, model selection, and hyperparameter tuning, where candidate solutions in an outer optimization loop must be evaluated via a learning subsystem. Evolutionary algorithms (EAs) are prominent gradient-free methods for such tasks. However, EAs are known to pose steep computational challenges, especially on large-instance datasets. In contrast to prior works that often fall back on parallel computing hardware to resolve this big data problem of EAs, in this article we propose a novel algorithm-centric solution based on evolutionary multitasking. Our approach involves the creation of a band of minions, i.e., small data proxies of the main target task, constructed by subsampling a fraction of the large dataset. We then combine the minions with the main task in a single multitask optimization framework, boosting evolutionary search by using small data to quickly optimize for the large dataset. Our key algorithmic contribution in this setting is to allocate computational resources to each of the tasks in a principled manner. The article considers wrapper-based feature selection as an illustrative case study of the broader idea of using multitasking to speed up outer-loop evolutionary configuration of any ML subsystem. The experiments reveal that multitasking can indeed speed up baseline EAs, by more than 40% on some datasets.
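The core idea in the abstract, building small subsampled "minion" proxy tasks whose wrapper fitness is cheap to evaluate, can be illustrated with a minimal sketch. This is not the authors' implementation: the synthetic dataset, the subsampling fractions, and the nearest-centroid stand-in for the learning subsystem are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "large" dataset: 2000 instances, 10 features;
# only the first 3 features carry class signal (an assumption for illustration).
n, d = 2000, 10
X = rng.normal(size=(n, d))
y = (X[:, :3].sum(axis=1) > 0).astype(int)

def make_minions(X, y, fractions=(0.1, 0.3), seed=0):
    """Build small-data proxy tasks ("minions") by subsampling rows."""
    rng = np.random.default_rng(seed)
    minions = []
    for f in fractions:
        idx = rng.choice(len(X), size=int(f * len(X)), replace=False)
        minions.append((X[idx], y[idx]))
    return minions

def wrapper_fitness(mask, X, y):
    """Cheap wrapper objective: nearest-centroid training accuracy on the
    selected feature subset (a stand-in for a real learning subsystem)."""
    if not mask.any():
        return 0.0
    Xs = X[:, mask]
    c0, c1 = Xs[y == 0].mean(axis=0), Xs[y == 1].mean(axis=0)
    pred = (np.linalg.norm(Xs - c1, axis=1) <
            np.linalg.norm(Xs - c0, axis=1)).astype(int)
    return (pred == y).mean()

minions = make_minions(X, y)

# A candidate solution in the outer EA loop: a binary feature mask.
mask = np.zeros(d, dtype=bool)
mask[:3] = True  # select the informative features

# Evaluating on a minion costs a fraction of the full-data evaluation,
# while giving a correlated fitness signal.
fit_small = wrapper_fitness(mask, *minions[0])
fit_full = wrapper_fitness(mask, X, y)
```

In the multitask framework described above, an EA would interleave such cheap minion evaluations with occasional full-data evaluations, with the principled resource-allocation rule deciding how much compute each task receives.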
| Barcode | Collection Type | Call Number | Location | Status |
|---|---|---|---|---|
| art141630 | null | Article | Gdg9-Lt3 | Available, but not for loan (No Loan) |
No other versions available.