{"user_id":"102979","year":"2015","status":"public","_id":"48838","type":"conference","keyword":["evolutionary algorithms","model-based optimization","parameter tuning"],"publisher":"Association for Computing Machinery","citation":{"short":"J. Bossek, B. Bischl, T. Wagner, G. Rudolph, in: Proceedings of the Genetic and Evolutionary Computation Conference, Association for Computing Machinery, New York, NY, USA, 2015, pp. 1319–1326.","ieee":"J. Bossek, B. Bischl, T. Wagner, and G. Rudolph, “Learning Feature-Parameter Mappings for Parameter Tuning via the Profile Expected Improvement,” in Proceedings of the Genetic and Evolutionary Computation Conference, 2015, pp. 1319–1326, doi: 10.1145/2739480.2754673.","bibtex":"@inproceedings{Bossek_Bischl_Wagner_Rudolph_2015, place={New York, NY, USA}, series={GECCO ’15}, title={Learning Feature-Parameter Mappings for Parameter Tuning via the Profile Expected Improvement}, DOI={10.1145/2739480.2754673}, booktitle={Proceedings of the Genetic and Evolutionary Computation Conference}, publisher={Association for Computing Machinery}, author={Bossek, Jakob and Bischl, Bernd and Wagner, Tobias and Rudolph, Günter}, year={2015}, pages={1319–1326}, collection={GECCO ’15} }","ama":"Bossek J, Bischl B, Wagner T, Rudolph G. Learning Feature-Parameter Mappings for Parameter Tuning via the Profile Expected Improvement. In: Proceedings of the Genetic and Evolutionary Computation Conference. GECCO ’15. Association for Computing Machinery; 2015:1319–1326. doi:10.1145/2739480.2754673","chicago":"Bossek, Jakob, Bernd Bischl, Tobias Wagner, and Günter Rudolph. “Learning Feature-Parameter Mappings for Parameter Tuning via the Profile Expected Improvement.” In Proceedings of the Genetic and Evolutionary Computation Conference, 1319–1326. GECCO ’15. New York, NY, USA: Association for Computing Machinery, 2015. https://doi.org/10.1145/2739480.2754673.","mla":"Bossek, Jakob, et al. “Learning Feature-Parameter Mappings for Parameter Tuning via the Profile Expected Improvement.” Proceedings of the Genetic and Evolutionary Computation Conference, Association for Computing Machinery, 2015, pp. 1319–1326, doi:10.1145/2739480.2754673.","apa":"Bossek, J., Bischl, B., Wagner, T., & Rudolph, G. (2015). Learning Feature-Parameter Mappings for Parameter Tuning via the Profile Expected Improvement. Proceedings of the Genetic and Evolutionary Computation Conference, 1319–1326. https://doi.org/10.1145/2739480.2754673"},"language":[{"iso":"eng"}],"page":"1319–1326","publication":"Proceedings of the Genetic and Evolutionary Computation Conference","publication_identifier":{"isbn":["978-1-4503-3472-3"]},"author":[{"last_name":"Bossek","full_name":"Bossek, Jakob","first_name":"Jakob","id":"102979","orcid":"0000-0002-4121-4668"},{"first_name":"Bernd","full_name":"Bischl, Bernd","last_name":"Bischl"},{"full_name":"Wagner, Tobias","last_name":"Wagner","first_name":"Tobias"},{"full_name":"Rudolph, Günter","last_name":"Rudolph","first_name":"Günter"}],"publication_status":"published","date_created":"2023-11-14T15:58:51Z","doi":"10.1145/2739480.2754673","department":[{"_id":"819"}],"date_updated":"2023-12-13T10:40:30Z","place":"New York, NY, USA","series_title":"GECCO ’15","extern":"1","abstract":[{"lang":"eng","text":"The majority of algorithms can be controlled or adjusted by parameters. Their values can substantially affect the algorithms’ performance. 
Since the manual exploration of the parameter space is tedious – even for few parameters – several automatic procedures for parameter tuning have been proposed. Recent approaches also take into account some characteristic properties of the problem instances, frequently termed instance features. Our contribution is the proposal of a novel concept for feature-based algorithm parameter tuning, which applies an approximating surrogate model for learning the continuous feature-parameter mapping. To accomplish this, we learn a joint model of the algorithm performance based on both the algorithm parameters and the instance features. The required data is gathered using a recently proposed acquisition function for model refinement in surrogate-based optimization: the profile expected improvement. This function provides an avenue for maximizing the information required for the feature-parameter mapping, i.e., the mapping from instance features to the corresponding optimal algorithm parameters. The approach is validated by applying the tuner to exemplary evolutionary algorithms and problems, for which theoretically grounded or heuristically determined feature-parameter mappings are available."}],"title":"Learning Feature-Parameter Mappings for Parameter Tuning via the Profile Expected Improvement"}