OpenAI’s efforts to develop its next major model, GPT-5, are running behind schedule, with results that do not yet justify the enormous costs involved, according to a report from The Wall Street Journal. The struggle echoes earlier reports that the company is exploring new strategies, as GPT-5 may not represent as large a leap as its predecessors did. But the WSJ adds new details about the estimated 18-month development of GPT-5, code-named Orion.
OpenAI has reportedly completed multiple large training runs aimed at enhancing the model’s capabilities with vast amounts of data. An initial round of training went slower than anticipated, suggesting that subsequent runs would be both time-consuming and extremely expensive. And although GPT-5 shows signs of outperforming its predecessors, it has not yet advanced enough to justify the high cost of keeping it running.
Adding to the project’s complexity, OpenAI is moving beyond its traditional reliance on publicly available data and licensing agreements. The company has reportedly hired people to create fresh training data by writing code and solving mathematical problems, and it is also using synthetic data generated by its o1 model. These efforts to enrich the training data are intended to address some of the quality problems that have hampered development.
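For readers unfamiliar with the term, “synthetic data generated from another model” usually means prompting a stronger model for worked solutions and saving the results as training examples. Below is a minimal, hypothetical sketch of such a pipeline using the public OpenAI Python SDK; the seed problems, prompt format, filtering rule, and output file are illustrative assumptions, not details from the WSJ report or OpenAI’s actual internals.

```python
# Hypothetical synthetic-data pipeline sketch. The client calls follow the
# public OpenAI Python SDK; everything else (seed tasks, filtering, file
# layout) is an illustrative assumption.
import json
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

SEED_PROBLEMS = [
    "Prove that the sum of two even integers is even.",
    "Write a Python function that returns the nth Fibonacci number.",
]

def generate_example(problem: str) -> dict:
    """Ask a reasoning model (here, o1) for a worked solution to reuse as a training example."""
    response = client.chat.completions.create(
        model="o1",
        messages=[{"role": "user", "content": problem}],
    )
    return {"prompt": problem, "completion": response.choices[0].message.content}

def main() -> None:
    with open("synthetic_data.jsonl", "w") as f:
        for problem in SEED_PROBLEMS:
            example = generate_example(problem)
            # A production pipeline would verify outputs before admitting them,
            # e.g. by running unit tests on generated code or checking proofs.
            if example["completion"]:
                f.write(json.dumps(example) + "\n")

if __name__ == "__main__":
    main()
```

The key design point is the verification step: model-generated data is only useful if bad outputs are filtered out before training, which is one reason math and code, where answers can be checked mechanically, are natural targets for this approach.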
OpenAI has yet to comment publicly on the WSJ’s findings. The company previously said it would not release any model code-named Orion this year, effectively pushing its timeline back as it works on improving the underlying technology.
CEO Sam Altman expressed skepticism about the WSJ report, writing on X: "I think the Wall Street Journal (WSJ) is currently the best newspaper in the United States, but just hours after we announced the o3 model, they published their article titled 'The Next Great Leap in AI Is Behind Schedule and Crazy Expensive'?!" His remark suggests he believes the publication rushed to judgment on the heels of OpenAI’s o3 announcement.
Reports have detailed several problems facing the GPT-5 project. One prominent issue is the quality of the datasets initially used for training: the data was reportedly not diverse enough to support broader learning within the model. Partway through testing, OpenAI also executed a training run dubbed "Arrakis" that proved frustratingly sluggish, hinting that future training runs would be prolonged and complex.
The slow progress on GPT-5 isn’t OpenAI’s only concern. The company has seen significant executive departures this year, including co-founder and Chief Scientist Ilya Sutskever and Chief Technology Officer Mira Murati, leading some analysts to question the organization’s stability.
While OpenAI grapples with these challenges, competition is ramping up from other tech firms eager to field their own AI technologies. Amazon, for example, announced plans to invest an additional $4 billion in Anthropic, the maker of Claude AI and an OpenAI rival, bringing its total investment to around $8 billion. And earlier this month, Elon Musk’s xAI revealed plans to quintuple the number of GPUs powering its systems, expanding its Colossus supercomputer project in Memphis, Tennessee, to house at least one million GPUs, an ambitious leap aimed squarely at rivaling OpenAI.
Despite its setbacks, OpenAI continues to refine its models, pursuing breakthroughs in advanced reasoning. Notably, the company skipped the "o2" designation, reportedly to avoid a trademark conflict with the UK telecommunications operator O2, and moved directly to "o3", a naming decision that hints at both progress and strategic adjustment as it navigates this competitive environment.
Time will tell whether these efforts pay off or leave OpenAI vulnerable to its rapidly advancing rivals.