Rating 6 · Source: cs.LG updates on arXiv.org · Published 2026-04-17
Rating rationale: An important finding that challenges a core PTQ assumption — flat minima are not necessarily quantization-ready — with practical implications for deployment.
arXiv:2604.15167v1 Announce Type: new Abstract: Post-training quantization (PTQ) assumes that a well-converged model is a quantization-ready model. We show this assumption fails in a structured, measurable, and previously uncharacterized way.
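To make the setting concrete, here is a minimal sketch of what "post-training quantization" means in this context (this is a generic illustration, not the paper's method): trained weights are snapped to a low-precision grid with no retraining, and PTQ assumes the resulting perturbation is benign. The function name and tensor shapes below are illustrative.

```python
import numpy as np

def quantize_int8(w: np.ndarray) -> np.ndarray:
    """Symmetric per-tensor int8 quantize-dequantize (a common PTQ baseline)."""
    scale = np.abs(w).max() / 127.0               # map max |weight| to int8 range
    q = np.clip(np.round(w / scale), -127, 127)   # round onto the int8 grid
    return q * scale                              # dequantize back to float

rng = np.random.default_rng(0)
w = rng.normal(0.0, 0.02, size=(256, 256))        # stand-in for converged weights
w_q = quantize_int8(w)

# PTQ's implicit assumption: this per-weight perturbation (at most half a
# quantization step) does not meaningfully degrade the converged model.
max_err = np.abs(w - w_q).max()
print(max_err <= np.abs(w).max() / 254 + 1e-12)
```

The abstract's claim is that this assumption can fail in a structured way even for well-converged models, i.e. convergence quality alone does not bound the loss increase induced by such a perturbation.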