ICLR 2026 - Submissions

Summary Statistics

| AI Content | Count | Avg Rating |
|------------|----------|------------|
| 0-10%      | 0 (0%)   | N/A        |
| 10-30%     | 0 (0%)   | N/A        |
| 30-50%     | 0 (0%)   | N/A        |
| 50-70%     | 1 (100%) | 2.67       |
| 70-90%     | 0 (0%)   | N/A        |
| 90-100%    | 0 (0%)   | N/A        |
| Total      | 1 (100%) | 2.67       |
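
The binned counts, shares, and per-bin average ratings above can be reproduced with a few lines of aggregation code. The following is a minimal sketch, assuming each submission row exposes an AI-content percentage and a mean review rating; the `Submission` record, its field names, and the half-open bin edges are assumptions for illustration, not the dashboard's actual implementation.

```python
from dataclasses import dataclass

# Hypothetical record for one submission row; field names are assumptions.
@dataclass
class Submission:
    title: str
    ai_content_pct: float  # Pangram "Quantity AI Content" score, 0-100
    avg_rating: float      # mean of the reviewers' scores

# Bin edges matching the summary table; half-open [lo, hi) except the last bin.
BINS = [(0, 10), (10, 30), (30, 50), (50, 70), (70, 90), (90, 100)]

def summarize(subs: list[Submission]) -> None:
    total = len(subs)
    for lo, hi in BINS:
        in_bin = [s for s in subs
                  if lo <= s.ai_content_pct < hi
                  or (hi == 100 and s.ai_content_pct == 100)]
        count = len(in_bin)
        share = 100 * count / total if total else 0.0
        if in_bin:
            avg = sum(s.avg_rating for s in in_bin) / count
            print(f"{lo}-{hi}%\t{count} ({share:.0f}%)\t{avg:.2f}")
        else:
            print(f"{lo}-{hi}%\t{count} ({share:.0f}%)\tN/A")

# Reproduces the one-row table above: 54% AI content, average rating 2.67.
summarize([Submission("AgentDistill", 54.0, 2.67)])
```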
Title: AgentDistill: Training-Free Agent Distillation with Generalizable MCP Boxes
Abstract: While knowledge distillation has become a mature field for compressing large language models (LLMs) into smaller ones by aligning their outputs or internal representations, the distillation of LLM-bas...
Avg Rating: 2.67
AI Content: 54%