ICLR 2026 - Submissions
Summary Statistics

| AI Content Range | Count (%) | Avg Rating |
|---|---|---|
| 0-10% | 1 (100%) | 5.50 |
| 10-30% | 0 (0%) | N/A |
| 30-50% | 0 (0%) | N/A |
| 50-70% | 0 (0%) | N/A |
| 70-90% | 0 (0%) | N/A |
| 90-100% | 0 (0%) | N/A |
| Total | 1 (100%) | 5.50 |

| Title | Abstract | Avg Rating | AI Content |
|---|---|---|---|
| Theory of Scaling Laws for In-Context Regression: Depth, Width, Context and Time | We study in-context learning (ICL) of linear regression in a deep linear self-attention model, characterizing how performance depends on various computational and statistical resources (width, depth, ... | 5.50 | 0% |