ICLR 2026 - Submissions
Summary Statistics
| AI Content | Count (%) | Avg. Rating |
|---|---|---|
| 0-10% | 1 (100%) | 6.50 |
| 10-30% | 0 (0%) | N/A |
| 30-50% | 0 (0%) | N/A |
| 50-70% | 0 (0%) | N/A |
| 70-90% | 0 (0%) | N/A |
| 90-100% | 0 (0%) | N/A |
| Total | 1 (100%) | 6.50 |
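
For reference, a minimal sketch of how the binned summary above could be reproduced from per-submission records. The record layout, bin edges, and tie-breaking at bin boundaries are assumptions inferred from the two tables on this page, not Pangram's actual pipeline.

```python
from collections import defaultdict

# Hypothetical per-submission records mirroring the detail table below:
# (title, avg_rating, ai_content_fraction). Only one submission is listed
# on this page, so the 0-10% bin is the only populated one.
submissions = [
    ("SGD-Based Knowledge Distillation with Bayesian Teachers: Theory and Guidelines", 6.50, 0.00),
]

# Bin edges assumed to match the dashboard's "AI Content" buckets.
BINS = [(0.0, 0.10), (0.10, 0.30), (0.30, 0.50), (0.50, 0.70), (0.70, 0.90), (0.90, 1.00)]

def bin_label(lo, hi):
    return f"{int(lo * 100)}-{int(hi * 100)}%"

counts = defaultdict(int)
rating_sums = defaultdict(float)

for _, rating, ai_frac in submissions:
    for lo, hi in BINS:
        # Lower edge inclusive; the top bin also includes exactly 100%.
        if lo <= ai_frac < hi or (hi == 1.0 and ai_frac == 1.0):
            label = bin_label(lo, hi)
            counts[label] += 1
            rating_sums[label] += rating
            break

total = len(submissions)
for lo, hi in BINS:
    label = bin_label(lo, hi)
    n = counts[label]
    share = 100 * n / total if total else 0
    avg = f"{rating_sums[label] / n:.2f}" if n else "N/A"
    print(f"{label}: {n} ({share:.0f}%), avg rating {avg}")
```

Running this reproduces the summary rows, including the `N/A` average for empty bins.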
Submissions

| Title | Abstract | Avg. Rating | AI Content |
|---|---|---|---|
| SGD-Based Knowledge Distillation with Bayesian Teachers: Theory and Guidelines | Knowledge Distillation (KD) is a central paradigm for transferring knowledge from a large teacher network to a typically smaller student model, often by leveraging soft probabilistic outputs. While KD... | 6.50 | 0% |