{"article_number":"885207","citation":{"bibtex":"@article{Yegenoglu_Subramoney_Hater_Jimenez-Romero_Klijn_Pérez_Martín_van_der_Vlag_Herty_Morrison_Diaz-Pier_2022, title={Exploring Parameter and Hyper-Parameter Spaces of Neuroscience Models on High Performance Computers With Learning to Learn}, volume={16}, DOI={10.3389/fncom.2022.885207}, number={885207}, journal={Frontiers in Computational Neuroscience}, publisher={Frontiers Media SA}, author={Yegenoglu, Alper and Subramoney, Anand and Hater, Thorsten and Jimenez-Romero, Cristian and Klijn, Wouter and Pérez Martín, Aarón and van der Vlag, Michiel and Herty, Michael and Morrison, Abigail and Diaz-Pier, Sandra}, year={2022} }","apa":"Yegenoglu, A., Subramoney, A., Hater, T., Jimenez-Romero, C., Klijn, W., Pérez Martín, A., van der Vlag, M., Herty, M., Morrison, A., & Diaz-Pier, S. (2022). Exploring Parameter and Hyper-Parameter Spaces of Neuroscience Models on High Performance Computers With Learning to Learn. Frontiers in Computational Neuroscience, 16, Article 885207. https://doi.org/10.3389/fncom.2022.885207","chicago":"Yegenoglu, Alper, Anand Subramoney, Thorsten Hater, Cristian Jimenez-Romero, Wouter Klijn, Aarón Pérez Martín, Michiel van der Vlag, Michael Herty, Abigail Morrison, and Sandra Diaz-Pier. “Exploring Parameter and Hyper-Parameter Spaces of Neuroscience Models on High Performance Computers With Learning to Learn.” Frontiers in Computational Neuroscience 16 (2022). https://doi.org/10.3389/fncom.2022.885207.","ieee":"A. Yegenoglu et al., “Exploring Parameter and Hyper-Parameter Spaces of Neuroscience Models on High Performance Computers With Learning to Learn,” Frontiers in Computational Neuroscience, vol. 16, Art. no. 885207, 2022, doi: 10.3389/fncom.2022.885207.","ama":"Yegenoglu A, Subramoney A, Hater T, et al. Exploring Parameter and Hyper-Parameter Spaces of Neuroscience Models on High Performance Computers With Learning to Learn. Frontiers in Computational Neuroscience. 2022;16. doi:10.3389/fncom.2022.885207","short":"A. Yegenoglu, A. Subramoney, T. Hater, C. Jimenez-Romero, W. Klijn, A. Pérez Martín, M. van der Vlag, M. Herty, A. Morrison, S. Diaz-Pier, Frontiers in Computational Neuroscience 16 (2022).","mla":"Yegenoglu, Alper, et al. “Exploring Parameter and Hyper-Parameter Spaces of Neuroscience Models on High Performance Computers With Learning to Learn.” Frontiers in Computational Neuroscience, vol. 16, 885207, Frontiers Media SA, 2022, doi:10.3389/fncom.2022.885207."},"volume":16,"language":[{"iso":"eng"}],"doi":"10.3389/fncom.2022.885207","type":"journal_article","year":"2022","title":"Exploring Parameter and Hyper-Parameter Spaces of Neuroscience Models on High Performance Computers With Learning to Learn","abstract":[{"text":"Neuroscience models commonly have a high number of degrees of freedom and only specific regions within the parameter space are able to produce dynamics of interest. This makes the development of tools and strategies to efficiently find these regions of high importance to advance brain research. Exploring the high dimensional parameter space using numerical simulations has been a frequently used technique in the last years in many areas of computational neuroscience. Today, high performance computing (HPC) can provide a powerful infrastructure to speed up explorations and increase our general understanding of the behavior of the model in reasonable times. Learning to learn (L2L) is a well-known concept in machine learning (ML) and a specific method for acquiring constraints to improve learning performance. This concept can be decomposed into a two loop optimization process where the target of optimization can consist of any program such as an artificial neural network, a spiking network, a single cell model, or a whole brain simulation. In this work, we present L2L as an easy to use and flexible framework to perform parameter and hyper-parameter space exploration of neuroscience models on HPC infrastructure. Learning to learn is an implementation of the L2L concept written in Python. This open-source software allows several instances of an optimization target to be executed with different parameters in an embarrassingly parallel fashion on HPC. L2L provides a set of built-in optimizer algorithms, which make adaptive and efficient exploration of parameter spaces possible. Different from other optimization toolboxes, L2L provides maximum flexibility for the way the optimization target can be executed. In this paper, we show a variety of examples of neuroscience models being optimized within the L2L framework to execute different types of tasks. The tasks used to illustrate the concept go from reproducing empirical data to learning how to solve a problem in a dynamic environment. We particularly focus on simulations with models ranging from the single cell to the whole brain and using a variety of simulation engines like NEST, Arbor, TVB, OpenAIGym, and NetLogo.","lang":"eng"}],"date_created":"2025-08-06T15:02:30Z","status":"public","publisher":"Frontiers Media SA","date_updated":"2025-08-08T11:40:08Z","user_id":"117951","publication_status":"published","_id":"60900","publication":"Frontiers in Computational Neuroscience","publication_identifier":{"issn":["1662-5188"]},"author":[{"first_name":"Alper","orcid":"0000-0001-8869-215X","id":"117951","full_name":"Yegenoglu, Alper","last_name":"Yegenoglu"},{"first_name":"Anand","full_name":"Subramoney, Anand","last_name":"Subramoney"},{"first_name":"Thorsten","full_name":"Hater, Thorsten","last_name":"Hater"},{"first_name":"Cristian","full_name":"Jimenez-Romero, Cristian","last_name":"Jimenez-Romero"},{"last_name":"Klijn","full_name":"Klijn, Wouter","first_name":"Wouter"},{"first_name":"Aarón","last_name":"Pérez Martín","full_name":"Pérez Martín, Aarón"},{"full_name":"van der Vlag, Michiel","last_name":"van der Vlag","first_name":"Michiel"},{"first_name":"Michael","last_name":"Herty","full_name":"Herty, Michael"},{"first_name":"Abigail","last_name":"Morrison","full_name":"Morrison, Abigail"},{"first_name":"Sandra","last_name":"Diaz-Pier","full_name":"Diaz-Pier, Sandra"}],"intvolume":"16"}