ICLR 2026 - Submissions


Summary Statistics

Quantity of AI Content   Count      Avg Rating
0-10%                    0 (0%)     N/A
10-30%                   0 (0%)     N/A
30-50%                   0 (0%)     N/A
50-70%                   1 (100%)   1.60
70-90%                   0 (0%)     N/A
90-100%                  0 (0%)     N/A
Total                    1 (100%)   1.60
Submission

Title: Decoupling of Experts: A Knowledge-Driven Architecture for Efficient LLMs
Abstract: Current large language models (LLMs), particularly Mixture-of-Experts (MoE) variants, face challenges in achieving efficient, structured, and interpretable scaling. We introduce the Decoupling of Expe...
Avg Rating: 1.60
Quantity of AI Content: 69%