OpenAI’s latest AI project, code-named Orion, has hit a wall.
Billions spent. Countless hours invested. Still, no clear timeline for success.
Why?
• Not enough data in the world to make it “smart enough.”
• Training costs soaring past half a billion dollars per run.
• Internal challenges and competitor pressure growing every day.
Orion (expected to power GPT-5) is meant to redefine AI: a "Ph.D.-level" successor to GPT-4. But here's the catch: the "bigger is better" scaling strategy appears to be hitting a plateau.
And yet… OpenAI's CEO, Sam Altman, remains ambitious. He argues that reasoning models might be the answer: teaching machines to work through problems the way humans do, instead of just feeding them more data.
Will Orion become the breakthrough that unlocks the next era of AI? Or will it be an expensive lesson in the limits of scaling?