
OpenAI’s abrupt decision to shut down Sora, its AI video-generation app, just six months after its public release triggered an immediate wave of speculation. The product had encouraged people to upload their own faces, prompting worries that the whole effort might have been a data-harvesting exercise. But new reporting suggests the real reason was far more prosaic: Sora was losing money at a scale OpenAI could no longer justify, and it wasn’t attracting enough users to make the burn worthwhile.
According to a Wall Street Journal investigation referenced by TechCrunch, Sora turned into a costly distraction at a critical moment in the AI race. Rather than being a secret data grab, it was a high-profile experiment that never found sustainable traction.
After a highly publicised launch, Sora’s global user base reportedly peaked at around one million before falling to fewer than 500,000. That weakening demand might not have been fatal on its own, but it collided with the realities of running state-of-the-art video generation in the cloud.
The reporting indicates Sora was costing roughly $1 million per day to operate. Crucially, that wasn’t because the app was a runaway success; it was because video generation itself is extraordinarily compute-intensive. Every cinematic, fantastical clip a user generated with their likeness drew on OpenAI’s limited pool of AI chips.
In practice, that meant a shrinking active user base was still incurring very high infrastructure costs. Instead of scaling toward efficiency, Sora was effectively locking up expensive compute that could have been used for more popular or more profitable products.
A dedicated team inside OpenAI had been working to make Sora viable, but the economics appear to have been moving in the wrong direction. At the same time, the broader AI market was shifting further toward tools that directly serve software developers and enterprise customers, where spending is more predictable and revenue more immediate.
Focus shifts back to core AI and enterprise competition
While OpenAI was trying to turn Sora into a hit consumer product, rival Anthropic was steadily building influence among developers and corporate buyers. The Wall Street Journal’s reporting, as summarised by TechCrunch, points specifically to Anthropic’s Claude Code as a product that was gaining momentum in the market OpenAI also needs to win: the engineers and enterprises that drive recurring revenue.
Within that context, Sora’s compute bill became more than just a line item. Every GPU hour spent rendering creative video clips was one not spent training or serving the models that underpin OpenAI’s more strategic offerings. In a period defined by fierce competition, limited high-end chips and rapidly evolving AI capabilities, that trade-off grew harder to justify.
Ultimately, CEO Sam Altman decided to shut Sora down, reallocate compute resources and refocus the company’s efforts. The move underscores how central access to AI hardware has become in this industry: even a highly visible product can be cut if it threatens to slow progress elsewhere.
The decision also appeared to catch at least one major partner off guard. The Wall Street Journal reporting cited by TechCrunch notes that Disney had agreed to a $1 billion partnership tied to Sora. The entertainment giant reportedly learned that OpenAI was pulling the plug less than an hour before the news became public. With Sora’s closure, that deal collapsed as well.
The short life span of Sora and the rapid end to such a large prospective partnership highlight the volatility of today’s AI product experiments. Companies are willing to move quickly and reverse course just as quickly when the economics or strategic value no longer add up.
For users who worried that Sora’s shutdown pointed to a hidden data play, the current reporting suggests a more conventional corporate calculation: usage fell, costs remained extremely high, and the opportunity cost in compute and focus was too steep for OpenAI to absorb.