---
_id: '57472'
abstract:
- lang: eng
  text: In this paper we introduce, in a Hilbert space setting, a second-order dynamical
    system with asymptotically vanishing damping and vanishing Tikhonov regularization,
    associated with a multiobjective optimization problem whose objective function
    has convex and differentiable components. Trajectory solutions are shown to exist
    in finite dimensions. We prove fast convergence of the function values, quantified
    in terms of a merit function. Depending on the regime considered, we establish
    both weak and, in some cases, strong convergence of the trajectory solutions toward
    a weak Pareto optimal solution. To achieve this, we apply Tikhonov regularization
    individually to each component of the objective function. This work extends results
    from single-objective convex optimization to the multiobjective setting.
author:
- first_name: Radu Ioan
  full_name: Bot, Radu Ioan
  last_name: Bot
- first_name: Konstantin
  full_name: Sonntag, Konstantin
  id: '56399'
  last_name: Sonntag
  orcid: https://orcid.org/0000-0003-3384-3496
citation:
  ama: Bot RI, Sonntag K. Inertial dynamics with vanishing Tikhonov regularization
    for multiobjective optimization. <i>Journal of Mathematical Analysis and Applications</i>.
    Published online 2025.
  apa: Bot, R. I., &#38; Sonntag, K. (2025). Inertial dynamics with vanishing Tikhonov
    regularization for multiobjective optimization. <i>Journal of Mathematical Analysis
    and Applications</i>.
  bibtex: '@article{Bot_Sonntag_2025, title={Inertial dynamics with vanishing Tikhonov
    regularization for multiobjective optimization}, journal={Journal of Mathematical
    Analysis and Applications}, author={Bot, Radu Ioan and Sonntag, Konstantin}, year={2025}
    }'
  chicago: Bot, Radu Ioan, and Konstantin Sonntag. “Inertial Dynamics with Vanishing
    Tikhonov Regularization for Multiobjective Optimization.” <i>Journal of Mathematical
    Analysis and Applications</i>, 2025.
  ieee: R. I. Bot and K. Sonntag, “Inertial dynamics with vanishing Tikhonov regularization
    for multiobjective optimization,” <i>Journal of Mathematical Analysis and Applications</i>,
    2025.
  mla: Bot, Radu Ioan, and Konstantin Sonntag. “Inertial Dynamics with Vanishing Tikhonov
    Regularization for Multiobjective Optimization.” <i>Journal of Mathematical Analysis
    and Applications</i>, 2025.
  short: R.I. Bot, K. Sonntag, Journal of Mathematical Analysis and Applications (2025).
date_created: 2024-11-28T08:58:17Z
date_updated: 2025-10-16T11:56:36Z
ddc:
- '510'
department:
- _id: '101'
- _id: '530'
- _id: '655'
external_id:
  arxiv:
  - '2411.18422'
file:
- access_level: open_access
  content_type: application/pdf
  creator: sonntagk
  date_created: 2024-11-28T08:58:00Z
  date_updated: 2024-11-28T08:58:00Z
  file_id: '57473'
  file_name: Inertial dynamics with vanishing Tikhonov regularization for multobjective
    optimization.pdf
  file_size: 4291134
  relation: main_file
file_date_updated: 2024-11-28T08:58:00Z
has_accepted_license: '1'
keyword:
- Pareto optimization
- Lyapunov analysis
- gradient-like dynamical systems
- inertial dynamics
- asymptotic vanishing damping
- Tikhonov regularization
- strong convergence
language:
- iso: eng
main_file_link:
- url: https://arxiv.org/pdf/2411.18422
oa: '1'
publication: Journal of Mathematical Analysis and Applications
status: public
title: Inertial dynamics with vanishing Tikhonov regularization for multiobjective
  optimization
type: journal_article
user_id: '56399'
year: '2025'
...
---
_id: '62750'
abstract:
- lang: ger
  text: 'Diese Dissertation enthält Beiträge zum Bereich der Mehrzieloptimierung mit
    einem Fokus auf unbeschränkten Problemen, die auf einem allgemeinen Hilbertraum
    definiert sind. Für Mehrzieloptimierungsprobleme mit lokal Lipschitz-stetigen
    Zielfunktionen definieren wir ein multikriterielles Subdifferential, das wir erstmals
    im Kontext allgemeiner Hilberträume analysieren. Aufbauend auf diesen theoretischen
    Untersuchungen präsentieren wir ein Abstiegsverfahren, bei welchem in jeder Iteration
    eine Abstiegsrichtung mittels einer numerischen Approximation des multikriteriellen
    Subdifferentials bestimmt wird. Im Kontext konvexer, stetig differenzierbarer
    Zielfunktionen mit Lipschitz-stetigen Gradienten führen wir eine Familie von
    dynamischen Gradientensystemen mit Trägheitsterm ein, die bekannte kontinuierliche
    Systeme aus der skalaren Optimierung verallgemeinern. Wir stellen drei neue Systeme
    vor: eines mit konstanter Dämpfung, eines mit asymptotisch abnehmender Dämpfung
    und eines, das zusätzlich eine zeitabhängige Tikhonov-Regularisierung beinhaltet.
    Aufbauend auf den Untersuchungen der neuen dynamischen Gradientensysteme, entwickeln
    wir ein beschleunigtes Gradientenverfahren zur Mehrzieloptimierung, das auf einer
    Diskretisierung des multikriteriellen Gradientensystems mit asymptotisch abnehmender
    Dämpfung beruht. Das hergeleitete Verfahren bewahrt die günstigen Konvergenzeigenschaften
    des kontinuierlichen Systems und erreicht eine schnellere Konvergenz als klassische
    Verfahren.'
- lang: eng
  text: 'This dissertation contributes to the field of multiobjective optimization,
    with a focus on unconstrained problems formulated in a general Hilbert space.
    For multiobjective optimization problems with locally Lipschitz continuous objective
    functions, we define a multiobjective subdifferential, which we analyze for the
    first time in the context of general Hilbert spaces. Building on these theoretical
    investigations, we present a descent method in which, at each iteration, a descent
    direction is determined via a numerical approximation of the multiobjective subdifferential.
    In the setting of convex, continuously differentiable objective functions with
    Lipschitz continuous gradients, we introduce a family of inertial gradient dynamical
    systems that generalize well-known continuous-time systems from scalar optimization.
    We present three novel systems: one with constant damping, one with asymptotic
    vanishing damping, and one combining vanishing damping with time-dependent Tikhonov
    regularization. Building on the investigation of the novel gradient dynamical
    systems, we develop an accelerated gradient method for multiobjective optimization
    via discretization of the multiobjective gradient system with asymptotic vanishing
    damping. The proposed method retains the favorable convergence properties of the
    continuous system while achieving faster convergence than classical methods.'
author:
- first_name: Konstantin
  full_name: Sonntag, Konstantin
  id: '56399'
  last_name: Sonntag
  orcid: https://orcid.org/0000-0003-3384-3496
citation:
  ama: Sonntag K. <i>First-Order Methods and Gradient Dynamical Systems for Multiobjective
    Optimization</i>. Paderborn University; 2025. doi:<a href="https://doi.org/10.17619/UNIPB/1-2457">10.17619/UNIPB/1-2457</a>
  apa: Sonntag, K. (2025). <i>First-order methods and gradient dynamical systems for
    multiobjective optimization</i>. Paderborn University. <a href="https://doi.org/10.17619/UNIPB/1-2457">https://doi.org/10.17619/UNIPB/1-2457</a>
  bibtex: '@book{Sonntag_2025, title={First-order methods and gradient dynamical systems
    for multiobjective optimization}, DOI={<a href="https://doi.org/10.17619/UNIPB/1-2457">10.17619/UNIPB/1-2457</a>},
    publisher={Paderborn University}, author={Sonntag, Konstantin}, year={2025} }'
  chicago: Sonntag, Konstantin. <i>First-Order Methods and Gradient Dynamical Systems
    for Multiobjective Optimization</i>. Paderborn University, 2025. <a href="https://doi.org/10.17619/UNIPB/1-2457">https://doi.org/10.17619/UNIPB/1-2457</a>.
  ieee: K. Sonntag, <i>First-order methods and gradient dynamical systems for multiobjective
    optimization</i>. Paderborn University, 2025.
  mla: Sonntag, Konstantin. <i>First-Order Methods and Gradient Dynamical Systems
    for Multiobjective Optimization</i>. Paderborn University, 2025, doi:<a href="https://doi.org/10.17619/UNIPB/1-2457">10.17619/UNIPB/1-2457</a>.
  short: K. Sonntag, First-Order Methods and Gradient Dynamical Systems for Multiobjective
    Optimization, Paderborn University, 2025.
date_created: 2025-12-03T06:55:01Z
date_updated: 2025-12-03T07:04:36Z
ddc:
- '510'
department:
- _id: '101'
- _id: '530'
doi: 10.17619/UNIPB/1-2457
has_accepted_license: '1'
language:
- iso: eng
main_file_link:
- open_access: '1'
  url: https://digital.ub.uni-paderborn.de/hs/download/pdf/8141881
oa: '1'
publisher: Paderborn University
status: public
supervisor:
- first_name: Michael
  full_name: Dellnitz, Michael
  last_name: Dellnitz
- first_name: Sina
  full_name: Ober-Blöbaum, Sina
  id: '16494'
  last_name: Ober-Blöbaum
title: First-order methods and gradient dynamical systems for multiobjective optimization
type: dissertation
user_id: '56399'
year: '2025'
...
---
_id: '20731'
abstract:
- lang: eng
  text: We present a novel algorithm that allows us to gain detailed insight into
    the effects of sparsity in linear and nonlinear optimization, which is of great
    importance in many scientific areas such as image and signal processing, medical
    imaging, compressed sensing, and machine learning (e.g., for the training of neural
    networks). Sparsity is an important feature to ensure robustness against noisy
    data, but also to find models that are interpretable and easy to analyze due to
    the small number of relevant terms. It is common practice to enforce sparsity
    by adding the ℓ1-norm as a weighted penalty term. In order to gain a better understanding
    and to allow for an informed model selection, we directly solve the corresponding
    multiobjective optimization problem (MOP) that arises when we minimize the main
    objective and the ℓ1-norm simultaneously. As this MOP is in general non-convex
    for nonlinear objectives, the weighting method will fail to provide all optimal
    compromises. To avoid this issue, we present a continuation method which is specifically
    tailored to MOPs with two objective functions, one of which is the ℓ1-norm. Our
    method can be seen as a generalization of well-known homotopy methods for linear
    regression problems to the nonlinear case. Several numerical examples, including
    neural network training, demonstrate our theoretical findings and the additional
    insight that can be gained by this multiobjective approach.
article_type: original
author:
- first_name: Katharina
  full_name: Bieker, Katharina
  id: '32829'
  last_name: Bieker
- first_name: Bennet
  full_name: Gebken, Bennet
  id: '32643'
  last_name: Gebken
- first_name: Sebastian
  full_name: Peitz, Sebastian
  id: '47427'
  last_name: Peitz
  orcid: 0000-0002-3389-793X
citation:
  ama: Bieker K, Gebken B, Peitz S. On the Treatment of Optimization Problems with
    L1 Penalty Terms via Multiobjective Continuation. <i>IEEE Transactions on Pattern
    Analysis and Machine Intelligence</i>. 2022;44(11):7797-7808. doi:<a href="https://doi.org/10.1109/TPAMI.2021.3114962">10.1109/TPAMI.2021.3114962</a>
  apa: Bieker, K., Gebken, B., &#38; Peitz, S. (2022). On the Treatment of Optimization
    Problems with L1 Penalty Terms via Multiobjective Continuation. <i>IEEE Transactions
    on Pattern Analysis and Machine Intelligence</i>, <i>44</i>(11), 7797–7808. <a
    href="https://doi.org/10.1109/TPAMI.2021.3114962">https://doi.org/10.1109/TPAMI.2021.3114962</a>
  bibtex: '@article{Bieker_Gebken_Peitz_2022, title={On the Treatment of Optimization
    Problems with L1 Penalty Terms via Multiobjective Continuation}, volume={44},
    DOI={<a href="https://doi.org/10.1109/TPAMI.2021.3114962">10.1109/TPAMI.2021.3114962</a>},
    number={11}, journal={IEEE Transactions on Pattern Analysis and Machine Intelligence},
    publisher={IEEE}, author={Bieker, Katharina and Gebken, Bennet and Peitz, Sebastian},
    year={2022}, pages={7797–7808} }'
  chicago: 'Bieker, Katharina, Bennet Gebken, and Sebastian Peitz. “On the Treatment
    of Optimization Problems with L1 Penalty Terms via Multiobjective Continuation.”
    <i>IEEE Transactions on Pattern Analysis and Machine Intelligence</i> 44, no.
    11 (2022): 7797–7808. <a href="https://doi.org/10.1109/TPAMI.2021.3114962">https://doi.org/10.1109/TPAMI.2021.3114962</a>.'
  ieee: 'K. Bieker, B. Gebken, and S. Peitz, “On the Treatment of Optimization Problems
    with L1 Penalty Terms via Multiobjective Continuation,” <i>IEEE Transactions on
    Pattern Analysis and Machine Intelligence</i>, vol. 44, no. 11, pp. 7797–7808,
    2022, doi: <a href="https://doi.org/10.1109/TPAMI.2021.3114962">10.1109/TPAMI.2021.3114962</a>.'
  mla: Bieker, Katharina, et al. “On the Treatment of Optimization Problems with L1
    Penalty Terms via Multiobjective Continuation.” <i>IEEE Transactions on Pattern
    Analysis and Machine Intelligence</i>, vol. 44, no. 11, IEEE, 2022, pp. 7797–808,
    doi:<a href="https://doi.org/10.1109/TPAMI.2021.3114962">10.1109/TPAMI.2021.3114962</a>.
  short: K. Bieker, B. Gebken, S. Peitz, IEEE Transactions on Pattern Analysis and
    Machine Intelligence 44 (2022) 7797–7808.
date_created: 2020-12-15T07:46:36Z
date_updated: 2022-10-21T12:27:16Z
ddc:
- '510'
department:
- _id: '101'
- _id: '530'
- _id: '655'
doi: 10.1109/TPAMI.2021.3114962
file:
- access_level: closed
  content_type: application/pdf
  creator: speitz
  date_created: 2021-09-25T11:59:15Z
  date_updated: 2021-09-25T11:59:15Z
  file_id: '25040'
  file_name: On_the_Treatment_of_Optimization_Problems_with_L1_Penalty_Terms_via_Multiobjective_Continuation.pdf
  file_size: 7990831
  relation: main_file
  success: 1
file_date_updated: 2021-09-25T11:59:15Z
has_accepted_license: '1'
intvolume: '44'
issue: '11'
language:
- iso: eng
main_file_link:
- open_access: '1'
  url: https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=9547772
oa: '1'
page: 7797-7808
publication: IEEE Transactions on Pattern Analysis and Machine Intelligence
publication_status: epub_ahead
publisher: IEEE
status: public
title: On the Treatment of Optimization Problems with L1 Penalty Terms via Multiobjective
  Continuation
type: journal_article
user_id: '47427'
volume: 44
year: '2022'
...
