Multi-Objective Optimization for Sparse Deep Multi-Task Learning
S.S. Hotegni, M.B. Berkemeier, S. Peitz, in: 2024 International Joint Conference on Neural Networks (IJCNN), IEEE, Yokohama, Japan, 2024, p. 9.
Conference Paper | Published | English
Abstract
Different conflicting optimization criteria arise naturally in various Deep
Learning scenarios. These can address different main tasks (i.e., in the
setting of Multi-Task Learning), but also main and secondary tasks such as loss
minimization versus sparsity. The usual approach is a simple weighting of the
criteria, which formally only works in the convex setting. In this paper, we
present a Multi-Objective Optimization algorithm using a modified Weighted
Chebyshev scalarization for training Deep Neural Networks (DNNs) with respect
to several tasks. By employing this scalarization technique, the algorithm can
identify all optimal solutions of the original problem while reducing its
complexity to a sequence of single-objective problems. The simplified problems
are then solved using an Augmented Lagrangian method, enabling the use of
popular optimization techniques such as Adam and Stochastic Gradient Descent,
while efficaciously handling constraints. Our work aims to address the
(economic and ecological) sustainability issue of DNN models, with a
particular focus on Deep Multi-Task models, which are typically designed with a
very large number of weights to perform equally well on multiple tasks. Through
experiments conducted on two Machine Learning datasets, we demonstrate the
possibility of adaptively sparsifying the model during training without
significantly impacting its performance, if we are willing to apply
task-specific adaptations to the network weights. Code is available at
https://github.com/salomonhotegni/MDMTN.
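To make the abstract's approach more concrete, the following is a minimal sketch of how a weighted Chebyshev scalarization of several task losses can be combined with an augmented Lagrangian penalty so that a standard optimizer such as Adam can be applied. It is written in PyTorch-style Python; the function name chebyshev_al_loss, the reference point z_ref, and the placeholder losses are illustrative assumptions and do not reproduce the authors' MDMTN implementation.

import torch

def chebyshev_al_loss(task_losses, weights, z_ref, t, lmbda, mu):
    """Augmented Lagrangian surrogate for the weighted Chebyshev problem
        min t   s.t.   w_i * (f_i - z_i) <= t   for every task i,
    which is equivalent to minimizing  max_i w_i * (f_i - z_i)."""
    total = t
    for f_i, w_i, z_i, l_i in zip(task_losses, weights, z_ref, lmbda):
        c_i = w_i * (f_i - z_i) - t                      # constraint: c_i <= 0
        # Standard augmented Lagrangian term for an inequality constraint:
        shifted = torch.clamp(l_i / mu + c_i, min=0.0)
        total = total + 0.5 * mu * (shifted ** 2 - (l_i / mu) ** 2)
    return total

# Illustrative usage with placeholder losses (two "tasks" plus an l1
# sparsity objective) on a dummy parameter vector:
params = [torch.randn(10, requires_grad=True)]           # stand-in for DNN weights
t = torch.zeros((), requires_grad=True)                  # auxiliary scalar variable
opt = torch.optim.Adam(params + [t], lr=1e-3)

weights = [0.5, 0.3, 0.2]   # preference weights for the Chebyshev scalarization
z_ref   = [0.0, 0.0, 0.0]   # reference (ideal) point, assumed known here
lmbda   = [0.0, 0.0, 0.0]   # Lagrange multiplier estimates
mu      = 10.0              # penalty parameter

loss_task1  = (params[0] ** 2).mean()                    # placeholder task-1 loss
loss_task2  = (params[0] - 1.0).abs().mean()             # placeholder task-2 loss
loss_sparse = params[0].abs().mean()                     # l1 proxy for sparsity

loss = chebyshev_al_loss([loss_task1, loss_task2, loss_sparse],
                         weights, z_ref, t, lmbda, mu)
opt.zero_grad()
loss.backward()
opt.step()
# After a number of inner iterations one would update the multipliers,
# e.g. lmbda[i] <- max(0, lmbda[i] + mu * c_i), and possibly increase mu.

The point of the reformulation is that the non-smooth min-max objective becomes a sequence of smooth single-objective problems, so any stock stochastic optimizer can be reused; only the outer multiplier update is additional.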
Publishing Year
2024
Proceedings Title
2024 International Joint Conference on Neural Networks (IJCNN)
Page
9
Conference
2024 International Joint Conference on Neural Networks (IJCNN)
Conference Location
Yokohama, Japan
Conference Date
2024-06-30 – 2024-07-05
Cite this
Hotegni SS, Berkemeier MB, Peitz S. Multi-Objective Optimization for Sparse Deep Multi-Task Learning. In: 2024 International Joint Conference on Neural Networks (IJCNN). IEEE; 2024:9. doi:10.1109/IJCNN60899.2024.10650994
Hotegni, S. S., Berkemeier, M. B., & Peitz, S. (2024). Multi-Objective Optimization for Sparse Deep Multi-Task Learning. 2024 International Joint Conference on Neural Networks (IJCNN), 9. https://doi.org/10.1109/IJCNN60899.2024.10650994
@inproceedings{Hotegni_Berkemeier_Peitz_2024,
  place     = {Yokohama, Japan},
  title     = {Multi-Objective Optimization for Sparse Deep Multi-Task Learning},
  DOI       = {10.1109/IJCNN60899.2024.10650994},
  booktitle = {2024 International Joint Conference on Neural Networks (IJCNN)},
  publisher = {IEEE},
  author    = {Hotegni, Sedjro Salomon and Berkemeier, Manuel Bastian and Peitz, Sebastian},
  year      = {2024},
  pages     = {9}
}
Hotegni, Sedjro Salomon, Manuel Bastian Berkemeier, and Sebastian Peitz. “Multi-Objective Optimization for Sparse Deep Multi-Task Learning.” In 2024 International Joint Conference on Neural Networks (IJCNN), 9. Yokohama, Japan: IEEE, 2024. https://doi.org/10.1109/IJCNN60899.2024.10650994.
S. S. Hotegni, M. B. Berkemeier, and S. Peitz, “Multi-Objective Optimization for Sparse Deep Multi-Task Learning,” in 2024 International Joint Conference on Neural Networks (IJCNN), Yokohama, Japan, 2024, p. 9, doi: 10.1109/IJCNN60899.2024.10650994.
Hotegni, Sedjro Salomon, et al. “Multi-Objective Optimization for Sparse Deep Multi-Task Learning.” 2024 International Joint Conference on Neural Networks (IJCNN), IEEE, 2024, p. 9, doi:10.1109/IJCNN60899.2024.10650994.
All files available under the following license(s):
Copyright Statement:
This Item is protected by copyright and/or related rights. [...]
Link(s) to Main File(s)
Access Level
Closed Access