Deep Neural Collapse: Theory and Applications

Neural Collapse reveals a simple yet elegant mathematical structure in the learned features during the terminal phase of training. Research on Neural Collapse spans both theory, aimed at better understanding the phenomenon, and practical applications in domains such as transfer learning, long-tailed classification, continual learning, and out-of-distribution detection. This page collects recent research along this line. (Update: Aug 10th, 2024.)

Neural Collapse in the balanced case (left); the multi-label case (middle); and the large-class (K > d) case (right).
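In the balanced case, the structure referred to above is the simplex equiangular tight frame (ETF): the K class-mean features (after centering) become equal-norm, maximally separated vectors whose pairwise cosine is exactly -1/(K-1). A minimal sketch of this structure, using the standard construction (up to rotation and scaling):

```python
import numpy as np

def simplex_etf(K):
    """K columns of a simplex ETF in R^K: equal norms, pairwise cosine -1/(K-1)."""
    return np.sqrt(K / (K - 1)) * (np.eye(K) - np.ones((K, K)) / K)

K = 4
M = simplex_etf(K)

# Every column has unit norm.
norms = np.linalg.norm(M, axis=0)

# Every pair of distinct columns has cosine similarity -1/(K-1).
cosines = (M.T @ M) / np.outer(norms, norms)
off_diag = cosines[~np.eye(K, dtype=bool)]
```

During terminal-phase training, the centered class means (and, by self-duality, the classifier weights) converge to such a configuration.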

Neural Collapse in the Last Layer

Phenomena

  • Prevalence of Neural Collapse During the Terminal Phase of Deep Learning Training. Vardan Papyan, X.Y. Han, David L. Donoho. Proceedings of the National Academy of Sciences.
  • Exploring Deep Neural Networks via Layer-Peeled Model: Minority Collapse in Imbalanced Training. Cong Fang, Hangfeng He, Qi Long, Weijie J. Su. Proceedings of the National Academy of Sciences.
  • Neural Collapse in Multi-label Learning with Pick-all-label Loss. Pengyu Li, Xiao Li, Yutong Wang, Qing Qu. 41st International Conference on Machine Learning.
  • Generalized Neural Collapse for a Large Number of Classes. Jiachen Jiang, Jinxin Zhou, Peng Wang, Qing Qu, Dustin Mixon, Chong You, Zhihui Zhu. ArXiv preprint.
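The phenomena cataloged above are typically quantified with metrics such as the within-class variability collapse (NC1) measure from Papyan et al., tr(Σ_W Σ_B†)/K, which tends to zero as features collapse to their class means. A minimal sketch (function name and interface are illustrative, not from any of the papers above):

```python
import numpy as np

def nc1_metric(features, labels):
    """Within-class variability collapse: tr(Sigma_W @ pinv(Sigma_B)) / K.

    features: (N, d) array of last-layer features; labels: (N,) class labels.
    Returns ~0 when every feature equals its class mean (full NC1 collapse).
    """
    classes = np.unique(labels)
    K = len(classes)
    N, d = features.shape
    global_mean = features.mean(axis=0)
    Sigma_W = np.zeros((d, d))  # within-class covariance
    Sigma_B = np.zeros((d, d))  # between-class covariance
    for c in classes:
        Hc = features[labels == c]
        mu_c = Hc.mean(axis=0)
        diff = Hc - mu_c
        Sigma_W += diff.T @ diff / N
        delta = (mu_c - global_mean)[:, None]
        Sigma_B += (delta @ delta.T) / K
    return np.trace(Sigma_W @ np.linalg.pinv(Sigma_B)) / K
```

Tracking this quantity across training epochs is the standard way to observe the terminal-phase collapse empirically.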

Theoretical Insights

  • To be filled


Progressive Neural Collapse across Layers

Phenomena

Theoretical Insights