Nathan Lambert predicts that the gap between open and closed models will narrow by mid-2026, arguing that synthetic data and improved distillation techniques will accelerate open models' progress toward parity. That pressure, in turn, pushes proprietary labs to innovate faster. As high-quality open weights become more competitive with closed-source offerings from labs like OpenAI, researchers should prioritize efficient fine-tuning.