---
_id: '51159'
abstract:
- lang: eng
  text: Sparsity is a highly desired feature in deep neural networks (DNNs) since
    it ensures numerical efficiency, improves the interpretability of models (due
    to the smaller number of relevant features), and improves robustness. In machine
    learning approaches based on linear models, it is well known that there exists
    a connecting path between the sparsest solution in terms of the $\ell^1$ norm
    (i.e., zero weights) and the non-regularized solution, which is called the
    regularization path. Very recently, a first attempt was made to extend the
    concept of regularization paths to DNNs by treating the empirical loss and the
    sparsity ($\ell^1$ norm) as two conflicting criteria and solving the resulting
    multiobjective optimization problem. However, due to the non-smoothness of the
    $\ell^1$ norm and the high number of parameters, this approach is not very
    efficient from a computational perspective. To overcome this limitation, we
    present an algorithm that allows for the approximation of the entire Pareto
    front for the above-mentioned objectives in a very efficient manner. We present
    numerical examples using both deterministic and stochastic gradients. We
    furthermore demonstrate that knowledge of the regularization path allows for a
    well-generalizing network parametrization.
author:
- first_name: Augustina Chidinma
  full_name: Amakor, Augustina Chidinma
  id: '97916'
  last_name: Amakor
- first_name: Konstantin
  full_name: Sonntag, Konstantin
  id: '56399'
  last_name: Sonntag
- first_name: Sebastian
  full_name: Peitz, Sebastian
  id: '47427'
  last_name: Peitz
  orcid: 0000-0002-3389-793X
citation:
  ama: Amakor AC, Sonntag K, Peitz S. A multiobjective continuation method to compute
    the regularization path of deep neural networks. <i>arXiv</i>. Published online
    2023.
  apa: Amakor, A. C., Sonntag, K., &#38; Peitz, S. (2023). A multiobjective continuation
    method to compute the regularization path of deep neural networks. In <i>arXiv</i>.
  bibtex: '@article{Amakor_Sonntag_Peitz_2023, title={A multiobjective continuation
    method to compute the regularization path of deep neural networks}, journal={arXiv},
    author={Amakor, Augustina Chidinma and Sonntag, Konstantin and Peitz, Sebastian},
    year={2023} }'
  chicago: Amakor, Augustina Chidinma, Konstantin Sonntag, and Sebastian Peitz. “A
    Multiobjective Continuation Method to Compute the Regularization Path of Deep
    Neural Networks.” <i>arXiv</i>, 2023.
  ieee: A. C. Amakor, K. Sonntag, and S. Peitz, “A multiobjective continuation method
    to compute the regularization path of deep neural networks,” <i>arXiv</i>. 2023.
  mla: Amakor, Augustina Chidinma, et al. “A Multiobjective Continuation Method to
    Compute the Regularization Path of Deep Neural Networks.” <i>arXiv</i>, 2023.
  short: A.C. Amakor, K. Sonntag, S. Peitz, arXiv (2023).
date_created: 2024-02-06T08:51:00Z
date_updated: 2024-02-06T08:52:07Z
department:
- _id: '655'
language:
- iso: eng
main_file_link:
- open_access: '1'
  url: https://arxiv.org/pdf/2308.12044.pdf
oa: '1'
publication: arXiv
status: public
title: A multiobjective continuation method to compute the regularization path of
  deep neural networks
type: preprint
user_id: '47427'
year: '2023'
...
