Announcement_7
I’m excited to share that our paper, “The Price of Format: Diversity Collapse in LLMs”, has been accepted to the 2025 Conference on Empirical Methods in Natural Language Processing (EMNLP 2025)! In this work, we show that the structured templates used in instruction-tuned LLMs cause diversity collapse, limiting open-ended generation even under high-temperature sampling, and we systematically evaluate this effect across tasks to characterize the trade-off between alignment, task performance, and output diversity.