Neural Collapse-Informed Initialization with Perturbation Injection in Classification-based Metric Learning
  • Park, Jinhee
  • Yoo, Hee Bin
  • Kim, Minjun
  • Zhang, Byoung-Tak
  • Kwon, Junseok

Abstract

Recent studies have revealed Neural Collapse (NC) in deep classifiers, where last-layer weights and features align into an equiangular tight frame (ETF), concentrating class information along specific embedding directions. However, conventional fine-tuning typically disregards this structure, initializing task-specific classifier heads randomly. To explicitly leverage this phenomenon, we propose a simple yet effective method for metric learning: (1) initializing the classifier head along each class's NC direction from a pretrained model to preserve the emergent structure, and (2) injecting small isotropic Gaussian noise during fine-tuning to boost generalization. In addition, we provide a theoretical bound proving that our method explicitly reduces cumulative weight drift from the NC initialization compared to standard fine-tuning. This suggests that our method better preserves the pretrained model's class-specific structure. Empirically, this structural preservation yields Recall@K gains: reduced weight drift correlates with better performance. Concurrent decreases in the Neural Collapse 1 (NC1) measure confirm that stronger intra-class cohesion underlies these improvements. Furthermore, we validate the effectiveness of our method on class-imbalanced benchmarks.
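The two steps described above (NC-direction initialization of the classifier head, followed by small Gaussian perturbations during fine-tuning) can be illustrated with a minimal PyTorch sketch. The names (`backbone`, `noise_std`, `nc_init_classifier`) are hypothetical, and the choice to approximate NC directions by normalized per-class feature means and to inject noise into the features (rather than the weights) is an assumption; the paper's exact procedure may differ.

```python
# Minimal sketch, assuming a PyTorch backbone that returns penultimate features.
import torch
import torch.nn as nn
import torch.nn.functional as F

@torch.no_grad()
def nc_init_classifier(backbone, loader, num_classes, feat_dim, device="cpu"):
    """Initialize a bias-free linear head along each class's NC direction,
    approximated here by the normalized per-class mean of pretrained features."""
    sums = torch.zeros(num_classes, feat_dim, device=device)
    counts = torch.zeros(num_classes, 1, device=device)
    backbone.eval()
    for x, y in loader:
        feats = backbone(x.to(device))                # (B, feat_dim) features
        y = y.to(device)
        sums.index_add_(0, y, feats)
        counts.index_add_(0, y, torch.ones(len(y), 1, device=device))
    means = sums / counts.clamp_min(1)
    head = nn.Linear(feat_dim, num_classes, bias=False).to(device)
    head.weight.copy_(F.normalize(means, dim=1))      # align rows with NC directions
    return head

def finetune_step(backbone, head, x, y, optimizer, noise_std=1e-3):
    """One fine-tuning step with small isotropic Gaussian noise injected into
    the features (one plausible reading of 'perturbation injection')."""
    feats = backbone(x)
    feats = feats + noise_std * torch.randn_like(feats)
    loss = F.cross_entropy(head(feats), y)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```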

Title
Neural Collapse-Informed Initialization with Perturbation Injection in Classification-based Metric Learning
Authors
Park, Jinhee; Yoo, Hee Bin; Kim, Minjun; Zhang, Byoung-Tak; Kwon, Junseok
DOI
10.1609/aaai.v40i10.37777
Publication Date
2026
Type
Conference Paper
Journal
Proceedings of the AAAI Conference on Artificial Intelligence, Vol. 40, No. 10
Pages
8287 - 8295