Deep Neural Collapse: Theory and Applications

Neural Collapse reveals a simple yet elegant mathematical structure in the learned features during the terminal phase of training. Research on Neural Collapse spans both theory, aimed at better understanding the phenomenon, and practical applications in domains such as transfer learning, long-tailed classification, continual learning, and out-of-domain detection. This page collects recent work in this line of research. (Last updated: August 10, 2024.)

Figure: Neural Collapse in the balanced case (left); the multi-label case (middle); and the large-class (K > d) case (right).
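In the balanced case, Neural Collapse predicts that the last-layer class means (after centering) align with a simplex equiangular tight frame (ETF): unit-norm vectors whose pairwise cosine similarity is exactly -1/(K-1). The sketch below, written for illustration only and not taken from any of the papers listed here, constructs such a frame and verifies both properties numerically (it assumes feature dimension d >= K - 1):

```python
import numpy as np

K, d = 4, 16  # K classes embedded in a d-dimensional feature space

rng = np.random.default_rng(0)
# Random orthonormal frame U in R^{d x K} via reduced QR decomposition.
U, _ = np.linalg.qr(rng.standard_normal((d, K)))

# Simplex ETF: M = sqrt(K/(K-1)) * U (I_K - (1/K) 11^T).
M = np.sqrt(K / (K - 1)) * U @ (np.eye(K) - np.ones((K, K)) / K)

# Every column has unit norm, and every pair of distinct columns has
# cosine similarity -1/(K-1): maximal equiangular separation.
norms = np.linalg.norm(M, axis=0)
G = M.T @ M  # Gram matrix
print(np.round(norms, 4))    # all ones
print(np.round(G[0, 1], 4))  # -1/(K-1) = -0.3333 for K = 4
```

The Gram matrix of a simplex ETF equals (K/(K-1)) I - (1/(K-1)) 11^T, which is the target geometry against which empirical class means are compared in the works below.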

Neural Collapse in Last-Layer

Phenomena

  • Prevalence of Neural Collapse During the Terminal Phase of Deep Learning Training. Vardan Papyan, X.Y. Han, David L. Donoho. Proceedings of the National Academy of Sciences.
  • Exploring Deep Neural Networks via Layer-Peeled Model: Minority Collapse in Imbalanced Training. Cong Fang, Hangfeng He, Qi Long, Weijie J. Su. Proceedings of the National Academy of Sciences.
  • Neural Collapse in Multi-label Learning with Pick-all-label Loss. Pengyu Li, Xiao Li, Yutong Wang, Qing Qu. International Conference on Machine Learning (ICML).
  • Generalized Neural Collapse for a Large Number of Classes. Jiachen Jiang, Jinxin Zhou, Peng Wang, Qing Qu, Dustin Mixon, Chong You, Zhihui Zhu. arXiv preprint.

Theoretical Insights

  • Neural Collapse with Unconstrained Features. Dustin G. Mixon, Hans Parshall, Jianzong Pi. Sampling Theory, Signal Processing, and Data Analysis.
  • Extended Unconstrained Features Model for Exploring Deep Neural Collapse. Tom Tirer, Joan Bruna. International Conference on Machine Learning (ICML).
  • Memorization-Dilation: Modeling Neural Collapse Under Label Noise. Duc Anh Nguyen, Ron Levie, Julian Lienen, Gitta Kutyniok, Eyke Hüllermeier. arXiv preprint.
  • A Geometric Analysis of Neural Collapse with Unconstrained Features. Zhihui Zhu, Tianyu Ding, Jinxin Zhou, Xiao Li, Chong You, Jeremias Sulam, Qing Qu. Advances in Neural Information Processing Systems.
  • An Unconstrained Layer-Peeled Perspective on Neural Collapse. Wenlong Ji, Yiping Lu, Yiliang Zhang, Zhun Deng, Weijie J. Su. International Conference on Learning Representations.
  • Neural Collapse under Cross-Entropy Loss. Jianfeng Lu, Stefan Steinerberger. Applied and Computational Harmonic Analysis.
  • On the Emergence of Simplex Symmetry in the Final and Penultimate Layers of Neural Network Classifiers. Weinan E, Stephan Wojtowytsch. arXiv preprint.
  • Neural Collapse with Normalized Features: A Geometric Analysis over the Riemannian Manifold. Can Yaras, Peng Wang, Zhihui Zhu, Laura Balzano, Qing Qu. Advances in Neural Information Processing Systems.
  • Neural Collapse Under MSE Loss: Proximity to and Dynamics on the Central Path. X.Y. Han, Vardan Papyan, David L. Donoho. International Conference on Learning Representations.
  • On the Optimization Landscape of Neural Collapse under MSE Loss: Global Optimality with Unconstrained Features. Jinxin Zhou, Xiao Li, Tianyu Ding, Chong You, Qing Qu, Zhihui Zhu. International Conference on Machine Learning.
  • Neural Collapse in Deep Homogeneous Classifiers and the Role of Weight Decay. Akshay Rangamani, Andrzej Banburski-Fahey. IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP).
  • Are All Losses Created Equal: A Neural Collapse Perspective. Jinxin Zhou, Chong You, Xiao Li, Kangning Liu, Sheng Liu, Qing Qu, Zhihui Zhu. Advances in Neural Information Processing Systems.


Progressive Neural Collapse across Layers

Phenomena

Theoretical Insights