Rethinking Layer Relevance in Large Language Models Beyond Cosine Similarity
Track: Main Track, Up to 8 pages, excluding references
Anonymization: This submission has been anonymized for double-blind review via the removal of identifying information such as names, affiliations, and identifying URLs.
Attendance: Maybe, the presenting author may attend in person, depending on factors such as funding, visa approval, or travel constraints.
Presenter: ~Cristian_Hinostroza1
Serve As Reviewer: ~Cristian_Hinostroza1
Submission Number: 49