The Training Bottleneck and Model Collapse
Abstract. No matter how fast AI inference scales, capability growth is bounded by the pace of frontier training; and when models train on their own output, the diversity that makes AI valuable quietly erodes.