---
_id: '57472'
abstract:
- lang: eng
  text: In this paper we introduce, in a Hilbert space setting, a second order dynamical
    system with asymptotically vanishing damping and vanishing Tikhonov regularization
    that approaches a multiobjective optimization problem with convex and differentiable
    components of the objective function. Trajectory solutions are shown to exist
    in finite dimensions. We prove fast convergence of the function values, quantified
    in terms of a merit function. Depending on the regime considered, we establish both
    weak and, in some cases, strong convergence of trajectory solutions toward a weak
    Pareto optimal solution. To achieve this, we apply Tikhonov regularization individually
    to each component of the objective function. This work extends results from single
    objective convex optimization into the multiobjective setting.
author:
- first_name: Radu Ioan
  full_name: Bot, Radu Ioan
  last_name: Bot
- first_name: Konstantin
  full_name: Sonntag, Konstantin
  id: '56399'
  last_name: Sonntag
  orcid: https://orcid.org/0000-0003-3384-3496
citation:
  ama: Bot RI, Sonntag K. Inertial dynamics with vanishing Tikhonov regularization
    for multiobjective optimization. <i>Journal of Mathematical Analysis and Applications</i>.
    Published online 2025.
  apa: Bot, R. I., &#38; Sonntag, K. (2025). Inertial dynamics with vanishing Tikhonov
    regularization for multiobjective optimization. <i>Journal of Mathematical Analysis
    and Applications</i>.
  bibtex: '@article{Bot_Sonntag_2025, title={Inertial dynamics with vanishing Tikhonov
    regularization for multiobjective optimization}, journal={Journal of Mathematical
    Analysis and Applications}, author={Bot, Radu Ioan and Sonntag, Konstantin}, year={2025}
    }'
  chicago: Bot, Radu Ioan, and Konstantin Sonntag. “Inertial Dynamics with Vanishing
    Tikhonov Regularization for Multiobjective Optimization.” <i>Journal of Mathematical
    Analysis and Applications</i>, 2025.
  ieee: R. I. Bot and K. Sonntag, “Inertial dynamics with vanishing Tikhonov regularization
    for multiobjective optimization,” <i>Journal of Mathematical Analysis and Applications</i>,
    2025.
  mla: Bot, Radu Ioan, and Konstantin Sonntag. “Inertial Dynamics with Vanishing Tikhonov
    Regularization for Multiobjective Optimization.” <i>Journal of Mathematical Analysis
    and Applications</i>, 2025.
  short: R.I. Bot, K. Sonntag, Journal of Mathematical Analysis and Applications (2025).
date_created: 2024-11-28T08:58:17Z
date_updated: 2025-10-16T11:56:36Z
ddc:
- '510'
department:
- _id: '101'
- _id: '530'
- _id: '655'
external_id:
  arxiv:
  - '2411.18422'
file:
- access_level: open_access
  content_type: application/pdf
  creator: sonntagk
  date_created: 2024-11-28T08:58:00Z
  date_updated: 2024-11-28T08:58:00Z
  file_id: '57473'
  file_name: Inertial dynamics with vanishing Tikhonov regularization for multobjective
    optimization.pdf
  file_size: 4291134
  relation: main_file
file_date_updated: 2024-11-28T08:58:00Z
has_accepted_license: '1'
keyword:
- Pareto optimization
- Lyapunov analysis
- gradient-like dynamical systems
- inertial dynamics
- asymptotic vanishing damping
- Tikhonov regularization
- strong convergence
language:
- iso: eng
main_file_link:
- url: https://arxiv.org/pdf/2411.18422
oa: '1'
publication: Journal of Mathematical Analysis and Applications
status: public
title: Inertial dynamics with vanishing Tikhonov regularization for multiobjective
  optimization
type: journal_article
user_id: '56399'
year: '2025'
...
---
_id: '62750'
abstract:
- lang: deu
  text: 'Diese Dissertation enthält Beiträge zum Bereich der Mehrzieloptimierung mit
    einem Fokus auf unbeschränkten Problemen, die auf einem allgemeinen Hilbertraum
    definiert sind. Für Mehrzieloptimierungsprobleme mit lokal Lipschitz-stetigen
    Zielfunktionen definieren wir ein multikriterielles Subdifferential, das wir erstmals
    im Kontext allgemeiner Hilberträume analysieren. Aufbauend auf diesen theoretischen
    Untersuchungen präsentieren wir ein Abstiegsverfahren, bei welchem in jeder Iteration
    eine Abstiegsrichtung mittels einer numerischen Approximation des multikriteriellen
    Subdifferentials bestimmt wird. Im Kontext konvexer, stetig differenzierbarer
    Zielfunktionen mit Lipschitz-stetigen Gradienten, führen wir eine Familie von
    dynamischen Gradientensystemen mit Trägheitsterm ein, die bekannte kontinuierliche
    Systeme aus der skalaren Optimierung verallgemeinern. Wir stellen drei neue Systeme
    vor: eines mit konstanter Dämpfung, eines mit asymptotisch abnehmender Dämpfung
    und eines, das zusätzlich eine zeitabhängige Tikhonov-Regularisierung beinhaltet.
    Aufbauend auf den Untersuchungen der neuen dynamischen Gradientensysteme, entwickeln
    wir ein beschleunigtes Gradientenverfahren zur Mehrzieloptimierung, das auf einer
    Diskretisierung des multikriteriellen Gradientensystems mit asymptotisch abnehmender
    Dämpfung beruht. Das hergeleitete Verfahren bewahrt die günstigen Konvergenzeigenschaften
    des kontinuierlichen Systems und erreicht eine schnellere Konvergenz als klassische
    Verfahren.'
- lang: eng
  text: 'This dissertation contributes to the field of multiobjective optimization,
    with a focus on unconstrained problems formulated in a general Hilbert space.
    For multiobjective optimization problems with locally Lipschitz continuous objective
    functions, we define a multiobjective subdifferential, which we analyze for the
    first time in the context of general Hilbert spaces. Building on these theoretical
    investigations, we present a descent method in which, at each iteration, a descent
    direction is determined via a numerical approximation of the multiobjective subdifferential.
    In the setting of convex, continuously differentiable objective functions with
    Lipschitz continuous gradients, we introduce a family of inertial gradient dynamical
    systems that generalize well-known continuous-time systems from scalar optimization.
    We present three novel systems: one with constant damping, one with asymptotic
    vanishing damping, and one combining vanishing damping with time-dependent Tikhonov
    regularization. Building on the investigation of the novel gradient dynamical
    systems, we develop an accelerated gradient method for multiobjective optimization
    via discretization of the multiobjective gradient system with asymptotic vanishing
    damping. The proposed method retains the favorable convergence properties of the
    continuous system while achieving faster convergence than classical methods.'
author:
- first_name: Konstantin
  full_name: Sonntag, Konstantin
  id: '56399'
  last_name: Sonntag
  orcid: https://orcid.org/0000-0003-3384-3496
citation:
  ama: Sonntag K. <i>First-Order Methods and Gradient Dynamical Systems for Multiobjective
    Optimization</i>. Paderborn University; 2025. doi:<a href="https://doi.org/10.17619/UNIPB/1-2457">10.17619/UNIPB/1-2457</a>
  apa: Sonntag, K. (2025). <i>First-order methods and gradient dynamical systems for
    multiobjective optimization</i>. Paderborn University. <a href="https://doi.org/10.17619/UNIPB/1-2457">https://doi.org/10.17619/UNIPB/1-2457</a>
  bibtex: '@book{Sonntag_2025, title={First-order methods and gradient dynamical systems
    for multiobjective optimization}, DOI={<a href="https://doi.org/10.17619/UNIPB/1-2457">10.17619/UNIPB/1-2457</a>},
    publisher={Paderborn University}, author={Sonntag, Konstantin}, year={2025} }'
  chicago: Sonntag, Konstantin. <i>First-Order Methods and Gradient Dynamical Systems
    for Multiobjective Optimization</i>. Paderborn University, 2025. <a href="https://doi.org/10.17619/UNIPB/1-2457">https://doi.org/10.17619/UNIPB/1-2457</a>.
  ieee: K. Sonntag, <i>First-order methods and gradient dynamical systems for multiobjective
    optimization</i>. Paderborn University, 2025.
  mla: Sonntag, Konstantin. <i>First-Order Methods and Gradient Dynamical Systems
    for Multiobjective Optimization</i>. Paderborn University, 2025, doi:<a href="https://doi.org/10.17619/UNIPB/1-2457">10.17619/UNIPB/1-2457</a>.
  short: K. Sonntag, First-Order Methods and Gradient Dynamical Systems for Multiobjective
    Optimization, Paderborn University, 2025.
date_created: 2025-12-03T06:55:01Z
date_updated: 2025-12-03T07:04:36Z
ddc:
- '510'
department:
- _id: '101'
- _id: '530'
doi: 10.17619/UNIPB/1-2457
has_accepted_license: '1'
language:
- iso: eng
main_file_link:
- open_access: '1'
  url: https://digital.ub.uni-paderborn.de/hs/download/pdf/8141881
oa: '1'
publisher: Paderborn University
status: public
supervisor:
- first_name: Michael
  full_name: Dellnitz, Michael
  last_name: Dellnitz
- first_name: Sina
  full_name: Ober-Blöbaum, Sina
  id: '16494'
  last_name: Ober-Blöbaum
title: First-order methods and gradient dynamical systems for multiobjective optimization
type: dissertation
user_id: '56399'
year: '2025'
...
---
_id: '46019'
abstract:
- lang: eng
  text: We derive efficient algorithms to compute weakly Pareto optimal solutions
    for smooth, convex and unconstrained multiobjective optimization problems in general
    Hilbert spaces. To this end, we define a novel inertial gradient-like dynamical
    system in the multiobjective setting, whose trajectories converge weakly to Pareto
    optimal solutions. Discretization of this system yields an inertial multiobjective
    algorithm which generates sequences that converge weakly to Pareto optimal solutions.
    We employ Nesterov acceleration to define an algorithm with an improved convergence
    rate compared to the plain multiobjective steepest descent method (Algorithm 1).
    A further improvement in terms of efficiency is achieved by avoiding the solution
    of a quadratic subproblem to compute a common step direction for all objective
    functions, which is usually required in first-order methods. Using a different
    discretization of our inertial gradient-like dynamical system, we obtain an accelerated
    multiobjective gradient method that does not require the solution of a subproblem
    in each step (Algorithm 2). While this algorithm does not converge in general,
    it yields good results on test problems while being faster than standard steepest
    descent.
author:
- first_name: Konstantin
  full_name: Sonntag, Konstantin
  id: '56399'
  last_name: Sonntag
  orcid: https://orcid.org/0000-0003-3384-3496
- first_name: Sebastian
  full_name: Peitz, Sebastian
  id: '47427'
  last_name: Peitz
  orcid: 0000-0002-3389-793X
citation:
  ama: Sonntag K, Peitz S. Fast Multiobjective Gradient Methods with Nesterov Acceleration
    via Inertial Gradient-Like Systems. <i>Journal of Optimization Theory and Applications</i>.
    Published online 2024. doi:<a href="https://doi.org/10.1007/s10957-024-02389-3">10.1007/s10957-024-02389-3</a>
  apa: Sonntag, K., &#38; Peitz, S. (2024). Fast Multiobjective Gradient Methods with
    Nesterov Acceleration via Inertial Gradient-Like Systems. <i>Journal of Optimization
    Theory and Applications</i>. <a href="https://doi.org/10.1007/s10957-024-02389-3">https://doi.org/10.1007/s10957-024-02389-3</a>
  bibtex: '@article{Sonntag_Peitz_2024, title={Fast Multiobjective Gradient Methods
    with Nesterov Acceleration via Inertial Gradient-Like Systems}, DOI={<a href="https://doi.org/10.1007/s10957-024-02389-3">10.1007/s10957-024-02389-3</a>},
    journal={Journal of Optimization Theory and Applications}, publisher={Springer},
    author={Sonntag, Konstantin and Peitz, Sebastian}, year={2024} }'
  chicago: Sonntag, Konstantin, and Sebastian Peitz. “Fast Multiobjective Gradient
    Methods with Nesterov Acceleration via Inertial Gradient-Like Systems.” <i>Journal
    of Optimization Theory and Applications</i>, 2024. <a href="https://doi.org/10.1007/s10957-024-02389-3">https://doi.org/10.1007/s10957-024-02389-3</a>.
  ieee: 'K. Sonntag and S. Peitz, “Fast Multiobjective Gradient Methods with Nesterov
    Acceleration via Inertial Gradient-Like Systems,” <i>Journal of Optimization Theory
    and Applications</i>, 2024, doi: <a href="https://doi.org/10.1007/s10957-024-02389-3">10.1007/s10957-024-02389-3</a>.'
  mla: Sonntag, Konstantin, and Sebastian Peitz. “Fast Multiobjective Gradient Methods
    with Nesterov Acceleration via Inertial Gradient-Like Systems.” <i>Journal of
    Optimization Theory and Applications</i>, Springer, 2024, doi:<a href="https://doi.org/10.1007/s10957-024-02389-3">10.1007/s10957-024-02389-3</a>.
  short: K. Sonntag, S. Peitz, Journal of Optimization Theory and Applications (2024).
date_created: 2023-07-12T06:35:58Z
date_updated: 2024-02-21T10:13:33Z
department:
- _id: '101'
- _id: '655'
doi: 10.1007/s10957-024-02389-3
language:
- iso: eng
main_file_link:
- open_access: '1'
  url: https://link.springer.com/content/pdf/10.1007/s10957-024-02389-3.pdf
oa: '1'
publication: Journal of Optimization Theory and Applications
publication_status: published
publisher: Springer
status: public
title: Fast Multiobjective Gradient Methods with Nesterov Acceleration via Inertial
  Gradient-Like Systems
type: journal_article
user_id: '56399'
year: '2024'
...
---
_id: '51334'
abstract:
- lang: eng
  text: The efficient optimization method for locally Lipschitz continuous multiobjective
    optimization problems from [1] is extended from finite-dimensional problems to
    general Hilbert spaces. The method iteratively computes Pareto critical points,
    where in each iteration, an approximation of the subdifferential is computed in
    an efficient manner and then used to compute a common descent direction for all
    objective functions. To prove convergence, we present some new optimality results
    for nonsmooth multiobjective optimization problems in Hilbert spaces. Using these,
    we can show that every accumulation point of the sequence generated by our algorithm
    is Pareto critical under common assumptions. Computational efficiency for finding
    Pareto critical points is numerically demonstrated for multiobjective optimal
    control of an obstacle problem.
author:
- first_name: Konstantin
  full_name: Sonntag, Konstantin
  id: '56399'
  last_name: Sonntag
  orcid: https://orcid.org/0000-0003-3384-3496
- first_name: Bennet
  full_name: Gebken, Bennet
  id: '32643'
  last_name: Gebken
- first_name: Georg
  full_name: Müller, Georg
  last_name: Müller
- first_name: Sebastian
  full_name: Peitz, Sebastian
  id: '47427'
  last_name: Peitz
  orcid: 0000-0002-3389-793X
- first_name: Stefan
  full_name: Volkwein, Stefan
  last_name: Volkwein
citation:
  ama: Sonntag K, Gebken B, Müller G, Peitz S, Volkwein S. A Descent Method for Nonsmooth
    Multiobjective Optimization in Hilbert Spaces. <i>arXiv:240206376</i>. Published
    online 2024.
  apa: Sonntag, K., Gebken, B., Müller, G., Peitz, S., &#38; Volkwein, S. (2024).
    A Descent Method for Nonsmooth Multiobjective Optimization in Hilbert Spaces.
    In <i>arXiv:2402.06376</i>.
  bibtex: '@article{Sonntag_Gebken_Müller_Peitz_Volkwein_2024, title={A Descent Method
    for Nonsmooth Multiobjective Optimization in Hilbert Spaces}, journal={arXiv:2402.06376},
    author={Sonntag, Konstantin and Gebken, Bennet and Müller, Georg and Peitz, Sebastian
    and Volkwein, Stefan}, year={2024} }'
  chicago: Sonntag, Konstantin, Bennet Gebken, Georg Müller, Sebastian Peitz, and
    Stefan Volkwein. “A Descent Method for Nonsmooth Multiobjective Optimization in
    Hilbert Spaces.” <i>ArXiv:2402.06376</i>, 2024.
  ieee: K. Sonntag, B. Gebken, G. Müller, S. Peitz, and S. Volkwein, “A Descent Method
    for Nonsmooth Multiobjective Optimization in Hilbert Spaces,” <i>arXiv:2402.06376</i>.
    2024.
  mla: Sonntag, Konstantin, et al. “A Descent Method for Nonsmooth Multiobjective
    Optimization in Hilbert Spaces.” <i>ArXiv:2402.06376</i>, 2024.
  short: K. Sonntag, B. Gebken, G. Müller, S. Peitz, S. Volkwein, ArXiv:2402.06376
    (2024).
date_created: 2024-02-13T09:35:26Z
date_updated: 2024-02-21T10:21:03Z
department:
- _id: '101'
- _id: '655'
external_id:
  arxiv:
  - '2402.06376'
has_accepted_license: '1'
language:
- iso: eng
main_file_link:
- open_access: '1'
  url: https://arxiv.org/abs/2402.06376
oa: '1'
publication: arXiv:2402.06376
status: public
title: A Descent Method for Nonsmooth Multiobjective Optimization in Hilbert Spaces
type: preprint
user_id: '56399'
year: '2024'
...
---
_id: '53858'
author:
- first_name: Junaid
  full_name: Akhter, Junaid
  id: '97994'
  last_name: Akhter
- first_name: Paul David
  full_name: Fährmann, Paul David
  last_name: Fährmann
- first_name: Konstantin
  full_name: Sonntag, Konstantin
  id: '56399'
  last_name: Sonntag
  orcid: https://orcid.org/0000-0003-3384-3496
- first_name: Sebastian
  full_name: Peitz, Sebastian
  id: '47427'
  last_name: Peitz
  orcid: 0000-0002-3389-793X
citation:
  ama: Akhter J, Fährmann PD, Sonntag K, Peitz S. Common pitfalls to avoid while using
    multiobjective optimization in machine learning. <i>arXiv</i>. Published online
    2024.
  apa: Akhter, J., Fährmann, P. D., Sonntag, K., &#38; Peitz, S. (2024). Common pitfalls
    to avoid while using multiobjective optimization in machine learning. In <i>arXiv</i>.
  bibtex: '@article{Akhter_Fährmann_Sonntag_Peitz_2024, title={Common pitfalls to
    avoid while using multiobjective optimization in machine learning}, journal={arXiv},
    author={Akhter, Junaid and Fährmann, Paul David and Sonntag, Konstantin and Peitz,
    Sebastian}, year={2024} }'
  chicago: Akhter, Junaid, Paul David Fährmann, Konstantin Sonntag, and Sebastian
    Peitz. “Common Pitfalls to Avoid While Using Multiobjective Optimization in Machine
    Learning.” <i>ArXiv</i>, 2024.
  ieee: J. Akhter, P. D. Fährmann, K. Sonntag, and S. Peitz, “Common pitfalls to avoid
    while using multiobjective optimization in machine learning,” <i>arXiv</i>. 2024.
  mla: Akhter, Junaid, et al. “Common Pitfalls to Avoid While Using Multiobjective
    Optimization in Machine Learning.” <i>ArXiv</i>, 2024.
  short: J. Akhter, P.D. Fährmann, K. Sonntag, S. Peitz, ArXiv (2024).
date_created: 2024-05-03T13:38:34Z
date_updated: 2024-05-06T08:29:38Z
department:
- _id: '655'
language:
- iso: eng
main_file_link:
- open_access: '1'
  url: https://arxiv.org/pdf/2405.01480
oa: '1'
publication: arXiv
status: public
title: Common pitfalls to avoid while using multiobjective optimization in machine
  learning
type: preprint
user_id: '47427'
year: '2024'
...
---
_id: '32447'
abstract:
- lang: eng
  text: 'We present a new gradient-like dynamical system related to unconstrained
    convex smooth multiobjective optimization which involves inertial effects and
    asymptotic vanishing damping. To the best of our knowledge, this system is the
    first inertial gradient-like system for multiobjective optimization problems including
    asymptotic vanishing damping, expanding the ideas previously laid out in [H. Attouch
    and G. Garrigos, Multiobjective Optimization: An Inertial Dynamical Approach to
    Pareto Optima, preprint, arXiv:1506.02823, 2015]. We prove existence of solutions
    to this system in finite dimensions and further prove that its bounded solutions
    converge weakly to weakly Pareto optimal points. In addition, we obtain a convergence
    rate of order \(\mathcal{O}(t^{-2})\) for the function values measured with a
    merit function. This approach presents a good basis for the development of fast
    gradient methods for multiobjective optimization.'
article_type: original
author:
- first_name: Konstantin
  full_name: Sonntag, Konstantin
  id: '56399'
  last_name: Sonntag
  orcid: https://orcid.org/0000-0003-3384-3496
- first_name: Sebastian
  full_name: Peitz, Sebastian
  id: '47427'
  last_name: Peitz
  orcid: 0000-0002-3389-793X
citation:
  ama: Sonntag K, Peitz S. Fast Convergence of Inertial Multiobjective Gradient-Like
    Systems with Asymptotic Vanishing Damping. <i>SIAM Journal on Optimization</i>.
    2024;34(3):2259-2286. doi:<a href="https://doi.org/10.1137/23M1588512">10.1137/23M1588512</a>
  apa: Sonntag, K., &#38; Peitz, S. (2024). Fast Convergence of Inertial Multiobjective
    Gradient-Like Systems with Asymptotic Vanishing Damping. <i>SIAM Journal on Optimization</i>,
    <i>34</i>(3), 2259–2286. <a href="https://doi.org/10.1137/23M1588512">https://doi.org/10.1137/23M1588512</a>
  bibtex: '@article{Sonntag_Peitz_2024, title={Fast Convergence of Inertial Multiobjective
    Gradient-Like Systems with Asymptotic Vanishing Damping}, volume={34}, DOI={<a
    href="https://doi.org/10.1137/23M1588512">10.1137/23M1588512</a>}, number={3},
    journal={SIAM Journal on Optimization}, publisher={Society for Industrial and
    Applied Mathematics}, author={Sonntag, Konstantin and Peitz, Sebastian}, year={2024},
    pages={2259–2286} }'
  chicago: 'Sonntag, Konstantin, and Sebastian Peitz. “Fast Convergence of Inertial
    Multiobjective Gradient-Like Systems with Asymptotic Vanishing Damping.” <i>SIAM
    Journal on Optimization</i> 34, no. 3 (2024): 2259–86. <a href="https://doi.org/10.1137/23M1588512">https://doi.org/10.1137/23M1588512</a>.'
  ieee: 'K. Sonntag and S. Peitz, “Fast Convergence of Inertial Multiobjective Gradient-Like
    Systems with Asymptotic Vanishing Damping,” <i>SIAM Journal on Optimization</i>,
    vol. 34, no. 3, pp. 2259–2286, 2024, doi: <a href="https://doi.org/10.1137/23M1588512">10.1137/23M1588512</a>.'
  mla: Sonntag, Konstantin, and Sebastian Peitz. “Fast Convergence of Inertial Multiobjective
    Gradient-Like Systems with Asymptotic Vanishing Damping.” <i>SIAM Journal on Optimization</i>,
    vol. 34, no. 3, Society for Industrial and Applied Mathematics, 2024, pp. 2259–86,
    doi:<a href="https://doi.org/10.1137/23M1588512">10.1137/23M1588512</a>.
  short: K. Sonntag, S. Peitz, SIAM Journal on Optimization 34 (2024) 2259–2286.
date_created: 2022-07-28T11:53:02Z
date_updated: 2024-07-02T09:27:39Z
department:
- _id: '101'
- _id: '655'
doi: 10.1137/23M1588512
intvolume: '34'
issue: '3'
keyword:
- multiobjective optimization
- Pareto optimization
- Lyapunov analysis
- gradient-like dynamical systems
- inertial dynamics
- asymptotic vanishing damping
- fast convergence
language:
- iso: eng
page: 2259-2286
publication: SIAM Journal on Optimization
publication_identifier:
  issn:
  - 1095-7189
publication_status: published
publisher: Society for Industrial and Applied Mathematics
status: public
title: Fast Convergence of Inertial Multiobjective Gradient-Like Systems with Asymptotic
  Vanishing Damping
type: journal_article
user_id: '56399'
volume: 34
year: '2024'
...
---
_id: '51159'
abstract:
- lang: eng
  text: Sparsity is a highly desired feature in deep neural networks (DNNs) since
    it ensures numerical efficiency, improves the interpretability of models (due
    to the smaller number of relevant features), and increases robustness. In machine
    learning approaches based on linear models, it is well known that there exists
    a connecting path between the sparsest solution in terms of the $\ell^1$ norm
    (i.e., zero weights) and the non-regularized solution, which is called the
    regularization path. Very
    recently, there was a first attempt to extend the concept of regularization paths
    to DNNs by means of treating the empirical loss and sparsity ($\ell^1$ norm) as
    two conflicting criteria and solving the resulting multiobjective optimization
    problem. However, due to the non-smoothness of the $\ell^1$ norm and the high
    number of parameters, this approach is not very efficient from a computational
    perspective. To overcome this limitation, we present an algorithm that allows
    for the approximation of the entire Pareto front for the above-mentioned objectives
    in a very efficient manner. We present numerical examples using both deterministic
    and stochastic gradients. We furthermore demonstrate that knowledge of the regularization
    path allows for a well-generalizing network parametrization.
author:
- first_name: Augustina Chidinma
  full_name: Amakor, Augustina Chidinma
  id: '97916'
  last_name: Amakor
- first_name: Konstantin
  full_name: Sonntag, Konstantin
  id: '56399'
  last_name: Sonntag
- first_name: Sebastian
  full_name: Peitz, Sebastian
  id: '47427'
  last_name: Peitz
  orcid: 0000-0002-3389-793X
citation:
  ama: Amakor AC, Sonntag K, Peitz S. A multiobjective continuation method to compute
    the regularization path of deep neural networks. <i>arXiv</i>. Published online
    2023.
  apa: Amakor, A. C., Sonntag, K., &#38; Peitz, S. (2023). A multiobjective continuation
    method to compute the regularization path of deep neural networks. In <i>arXiv</i>.
  bibtex: '@article{Amakor_Sonntag_Peitz_2023, title={A multiobjective continuation
    method to compute the regularization path of deep neural networks}, journal={arXiv},
    author={Amakor, Augustina Chidinma and Sonntag, Konstantin and Peitz, Sebastian},
    year={2023} }'
  chicago: Amakor, Augustina Chidinma, Konstantin Sonntag, and Sebastian Peitz. “A
    Multiobjective Continuation Method to Compute the Regularization Path of Deep
    Neural Networks.” <i>ArXiv</i>, 2023.
  ieee: A. C. Amakor, K. Sonntag, and S. Peitz, “A multiobjective continuation method
    to compute the regularization path of deep neural networks,” <i>arXiv</i>. 2023.
  mla: Amakor, Augustina Chidinma, et al. “A Multiobjective Continuation Method to
    Compute the Regularization Path of Deep Neural Networks.” <i>ArXiv</i>, 2023.
  short: A.C. Amakor, K. Sonntag, S. Peitz, ArXiv (2023).
date_created: 2024-02-06T08:51:00Z
date_updated: 2024-02-06T08:52:07Z
department:
- _id: '655'
language:
- iso: eng
main_file_link:
- open_access: '1'
  url: https://arxiv.org/pdf/2308.12044.pdf
oa: '1'
publication: arXiv
status: public
title: A multiobjective continuation method to compute the regularization path of
  deep neural networks
type: preprint
user_id: '47427'
year: '2023'
...
---
_id: '46578'
abstract:
- lang: eng
  text: 'Multiobjective optimization plays an increasingly important role in modern
    applications, where several criteria are often of equal importance. The task in
    multiobjective optimization and multiobjective optimal control is therefore to
    compute the set of optimal compromises (the Pareto set) between the conflicting
    objectives. The advances in algorithms and the increasing interest in Pareto-optimal
    solutions have led to a wide range of new applications related to optimal and
    feedback control - potentially with non-smoothness either on the level of the objectives
    or in the system dynamics. This results in new challenges such as dealing with
    expensive models (e.g., governed by partial differential equations (PDEs)) and
    developing dedicated algorithms handling the non-smoothness. Since, in contrast
    to single-objective optimization, the Pareto set generally consists of an infinite
    number of solutions, the computational effort can quickly become challenging,
    which is particularly problematic when the objectives are costly to evaluate or
    when a solution has to be presented very quickly. This article gives an overview
    of recent developments in the field of multiobjective optimization of non-smooth
    PDE-constrained problems. In particular we report on the advances achieved within
    Project 2 "Multiobjective Optimization of Non-Smooth PDE-Constrained Problems
    - Switches, State Constraints and Model Order Reduction" of the DFG Priority Programme
    1962 "Non-smooth and Complementarity-based Distributed Parameter Systems: Simulation
    and Hierarchical Optimization".'
author:
- first_name: Marco
  full_name: Bernreuther, Marco
  last_name: Bernreuther
- first_name: Michael
  full_name: Dellnitz, Michael
  last_name: Dellnitz
- first_name: Bennet
  full_name: Gebken, Bennet
  id: '32643'
  last_name: Gebken
- first_name: Georg
  full_name: Müller, Georg
  last_name: Müller
- first_name: Sebastian
  full_name: Peitz, Sebastian
  id: '47427'
  last_name: Peitz
  orcid: 0000-0002-3389-793X
- first_name: Konstantin
  full_name: Sonntag, Konstantin
  id: '56399'
  last_name: Sonntag
  orcid: https://orcid.org/0000-0003-3384-3496
- first_name: Stefan
  full_name: Volkwein, Stefan
  last_name: Volkwein
citation:
  ama: Bernreuther M, Dellnitz M, Gebken B, et al. Multiobjective Optimization of
    Non-Smooth PDE-Constrained Problems. <i>arXiv:230801113</i>. Published online
    2023.
  apa: Bernreuther, M., Dellnitz, M., Gebken, B., Müller, G., Peitz, S., Sonntag,
    K., &#38; Volkwein, S. (2023). Multiobjective Optimization of Non-Smooth PDE-Constrained
    Problems. In <i>arXiv:2308.01113</i>.
  bibtex: '@article{Bernreuther_Dellnitz_Gebken_Müller_Peitz_Sonntag_Volkwein_2023,
    title={Multiobjective Optimization of Non-Smooth PDE-Constrained Problems}, journal={arXiv:2308.01113},
    author={Bernreuther, Marco and Dellnitz, Michael and Gebken, Bennet and Müller,
    Georg and Peitz, Sebastian and Sonntag, Konstantin and Volkwein, Stefan}, year={2023}
    }'
  chicago: Bernreuther, Marco, Michael Dellnitz, Bennet Gebken, Georg Müller, Sebastian
    Peitz, Konstantin Sonntag, and Stefan Volkwein. “Multiobjective Optimization of
    Non-Smooth PDE-Constrained Problems.” <i>ArXiv:2308.01113</i>, 2023.
  ieee: M. Bernreuther <i>et al.</i>, “Multiobjective Optimization of Non-Smooth PDE-Constrained
    Problems,” <i>arXiv:2308.01113</i>. 2023.
  mla: Bernreuther, Marco, et al. “Multiobjective Optimization of Non-Smooth PDE-Constrained
    Problems.” <i>ArXiv:2308.01113</i>, 2023.
  short: M. Bernreuther, M. Dellnitz, B. Gebken, G. Müller, S. Peitz, K. Sonntag,
    S. Volkwein, ArXiv:2308.01113 (2023).
date_created: 2023-08-21T05:50:12Z
date_updated: 2024-02-21T12:22:20Z
department:
- _id: '655'
- _id: '101'
external_id:
  arxiv:
  - '2308.01113'
language:
- iso: eng
main_file_link:
- open_access: '1'
  url: https://arxiv.org/pdf/2308.01113
oa: '1'
publication: arXiv:2308.01113
status: public
title: Multiobjective Optimization of Non-Smooth PDE-Constrained Problems
type: preprint
user_id: '47427'
year: '2023'
...
