OpenAI’s newest AI model, Orion, reportedly shows only moderate improvement over GPT-4, sparking concerns about the future of generative AI.
The industry’s reliance on scaling laws—using more data and computing power for better results—seems to be hitting a wall. Challenges like limited training data, dependence on synthetic content, and high computing costs are compounding the problem.
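The "wall" here is a consequence of how scaling laws behave: empirically, model loss tends to fall as a power of compute, so each additional order of magnitude buys a smaller absolute improvement than the last. A minimal sketch of that dynamic, using made-up exponent and scale values purely for illustration (not any lab's actual fitted constants):

```python
def power_law_loss(compute, alpha=0.05, scale=10.0):
    """Illustrative scaling law: loss falls as compute ** (-alpha).

    alpha and scale are hypothetical values chosen for illustration;
    real fitted constants vary by model family and dataset.
    """
    return scale * compute ** (-alpha)

# Each 10x jump in compute shaves off a smaller slice of loss than the last.
for c in (1e21, 1e22, 1e23, 1e24):
    print(f"compute={c:.0e}  loss={power_law_loss(c):.3f}")
```

Running the loop shows the gap between successive rows shrinking, which is the diminishing-returns pattern the industry is reportedly confronting: the cost of each step up grows tenfold while the payoff narrows.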
Investors are growing wary of diminishing returns, while companies may struggle to justify rising costs for products that no longer deliver significant leaps. As AI development slows, the industry faces a critical juncture, having been built on the promise of rapid innovation.