Detail View
Abstract
Recent studies have revealed Neural Collapse (NC) in deep classifiers, where last-layer weights and features align into an equiangular tight frame (ETF), concentrating class information along specific embedding directions. However, conventional fine-tuning typically disregards this structure, initializing task-specific classifier heads randomly. To explicitly leverage this phenomenon, we propose a simple yet effective method for metric learning: (1) initializing the classifier head along each class’s NC direction from a pretrained model to preserve the emergent structure, and (2) injecting small isotropic Gaussian noise during fine-tuning to boost generalization. In addition, we provide a theoretical bound proving that our method explicitly reduces cumulative weight drift from the NC initialization compared to standard fine-tuning. This suggests that our method better preserves the pretrained model’s class-specific structure. Empirically, this structural preservation yields Recall@K gains: reduced weight drift correlates with better performance. Concurrent decreases in the Neural Collapse 1 (NC1) measure confirm that stronger intra-class cohesion underlies these improvements. Furthermore, we validate the effectiveness of our method on class-imbalanced benchmarks.
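The two-step recipe described in the abstract can be sketched in a few lines. The following is a minimal PyTorch illustration, not the authors' implementation: the tensor `class_means` (per-class feature means from the pretrained model, standing in for the NC directions), the noise scale `sigma`, and the choice to perturb the head weights are all assumptions, since the abstract does not specify these details.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def nc_informed_init(head: nn.Linear, class_means: torch.Tensor) -> None:
    """Point each classifier-head row along its class's NC direction.

    `class_means` is a (num_classes, dim) tensor of per-class feature
    means extracted from the pretrained model, used here as an assumed
    stand-in for the NC/ETF directions described in the abstract.
    """
    with torch.no_grad():
        # Each row of an nn.Linear weight is one class's direction.
        head.weight.copy_(F.normalize(class_means, dim=1))
        if head.bias is not None:
            head.bias.zero_()

def inject_isotropic_noise(head: nn.Linear, sigma: float = 1e-3) -> None:
    """Perturb the head with small isotropic Gaussian noise.

    The abstract does not say where or how often the noise is injected;
    perturbing the weights once per fine-tuning step is a guess.
    """
    with torch.no_grad():
        head.weight.add_(sigma * torch.randn_like(head.weight))
```

In a training loop, `nc_informed_init` would run once before fine-tuning begins and `inject_isotropic_noise` at each step; both the placement and the value of `sigma` are illustrative, not taken from the paper.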
- Title
- Neural Collapse-Informed Initialization with Perturbation Injection in Classification-based Metric Learning
- Authors
- Park, Jinhee; Yoo, Hee Bin; Kim, Minjun; Zhang, Byoung-Tak; Kwon, Junseok
- Publication Date
- 2026
- Type
- Conference Paper
- Journal
- Proceedings of the AAAI Conference on Artificial Intelligence
- Volume
- 40
- Issue
- 10
- Pages
- 8287–8295