{"conference":{"name":"International Conference on Learning Representations, ICLR","location":"Kigali, Ruanda"},"publication":"International Conference on Learning Representations, ICLR","date_updated":"2023-06-29T09:14:26Z","type":"conference","main_file_link":[{"url":"https://arxiv.org/abs/2206.05530","open_access":"1"}],"language":[{"iso":"eng"}],"date_created":"2022-06-14T14:48:36Z","citation":{"chicago":"Nguyen, Duc Anh, Ron Levie, Julian Lienen, Gitta Kutyniok, and Eyke Hüllermeier. “Memorization-Dilation: Modeling Neural Collapse Under Noise.” In International Conference on Learning Representations, ICLR, 2023.","short":"D.A. Nguyen, R. Levie, J. Lienen, G. Kutyniok, E. Hüllermeier, in: International Conference on Learning Representations, ICLR, 2023.","bibtex":"@inproceedings{Nguyen_Levie_Lienen_Kutyniok_Hüllermeier_2023, title={Memorization-Dilation: Modeling Neural Collapse Under Noise}, booktitle={International Conference on Learning Representations, ICLR}, author={Nguyen, Duc Anh and Levie, Ron and Lienen, Julian and Kutyniok, Gitta and Hüllermeier, Eyke}, year={2023} }","apa":"Nguyen, D. A., Levie, R., Lienen, J., Kutyniok, G., & Hüllermeier, E. (2023). Memorization-Dilation: Modeling Neural Collapse Under Noise. International Conference on Learning Representations, ICLR. International Conference on Learning Representations, ICLR, Kigali, Ruanda.","ieee":"D. A. Nguyen, R. Levie, J. Lienen, G. Kutyniok, and E. Hüllermeier, “Memorization-Dilation: Modeling Neural Collapse Under Noise,” presented at the International Conference on Learning Representations, ICLR, Kigali, Ruanda, 2023.","mla":"Nguyen, Duc Anh, et al. “Memorization-Dilation: Modeling Neural Collapse Under Noise.” International Conference on Learning Representations, ICLR, 2023.","ama":"Nguyen DA, Levie R, Lienen J, Kutyniok G, Hüllermeier E. Memorization-Dilation: Modeling Neural Collapse Under Noise. In: International Conference on Learning Representations, ICLR. ; 2023."},"user_id":"44040","title":"Memorization-Dilation: Modeling Neural Collapse Under Noise","_id":"31880","abstract":[{"text":"The notion of neural collapse refers to several emergent phenomena that have been empirically observed across various canonical classification problems. During the terminal phase of training a deep neural network, the feature embedding of all examples of the same class tend to collapse to a single representation, and the features of different classes tend to separate as much as possible. Neural collapse is often studied through a simplified model, called the unconstrained feature representation, in which the model is assumed to have \"infinite expressivity\" and can map each data point to any arbitrary representation. In this work, we propose a more realistic variant of the unconstrained feature representation that takes the limited expressivity of the network into account. Empirical evidence suggests that the memorization of noisy data points leads to a degradation (dilation) of the neural collapse. Using a model of the memorization-dilation (M-D) phenomenon, we show one mechanism by which different losses lead to different performances of the trained network on noisy data. 
Our proofs reveal why label smoothing, a modification of cross-entropy empirically observed to produce a regularization effect, leads to improved generalization in classification tasks.","lang":"eng"}],"oa":"1","author":[{"first_name":"Duc Anh","full_name":"Nguyen, Duc Anh","last_name":"Nguyen"},{"first_name":"Ron","last_name":"Levie","full_name":"Levie, Ron"},{"full_name":"Lienen, Julian","last_name":"Lienen","id":"44040","first_name":"Julian"},{"first_name":"Gitta","last_name":"Kutyniok","full_name":"Kutyniok, Gitta"},{"full_name":"Hüllermeier, Eyke","id":"48129","last_name":"Hüllermeier","first_name":"Eyke"}],"year":"2023","status":"public"}