ICLR 2026 - Submissions

Submissions

Summary Statistics

AI Content   Count      Avg Rating
0-10%        0 (0%)     N/A
10-30%       0 (0%)     N/A
30-50%       1 (100%)   2.67
50-70%       0 (0%)     N/A
70-90%       0 (0%)     N/A
90-100%      0 (0%)     N/A
Total        1 (100%)   2.67
Title: From Compression to Specialization: An Information-Preserving Approach for Dense to Mixture-of-Experts Construction
Abstract: The high cost of training Mixture-of-Experts (MoE) models from scratch has spurred interest in converting pre-trained dense models into sparse MoE models. However, existing dense-to-sparse MoE methods...
Avg Rating: 2.67
AI Content: 33%