Oracle co-founder and CTO Larry Ellison has highlighted what he considers the fundamental weakness plaguing most of today’s popular large language models, including OpenAI’s ChatGPT, Google’s Gemini, xAI’s Grok, Meta’s Llama, and Anthropic’s Claude. Ellison believes that these AI systems are becoming commoditised far too quickly because they are all trained on essentially the same pool of publicly available internet data.
During Oracle’s fiscal Q2 2026 earnings call in December 2025, Ellison explained, “All the large language models—OpenAI, Anthropic, Meta, Google, xAI—they’re all trained on the same data. It’s all public data from the internet. So they’re all basically the same. And that’s why they’re becoming commoditized so quickly.”
Ellison argued that this shared reliance on open web-sourced training data creates little meaningful differentiation among the major players. As a result, the current wave of generative AI, while impressive, risks turning into a race to the bottom on price and features, with limited barriers to entry or unique value propositions.
Private enterprise data the next frontier for AI
Rather than viewing this as a dead end, Ellison positioned it as an opportunity for the next phase of AI development. He predicted that the real breakthroughs and massive economic value will come from AI systems that reason over private, proprietary enterprise data instead of public sources.
“The future lies in leveraging private enterprise data,” Ellison said, predicting that this “second wave” of AI would dwarf the current boom in GPUs, data centers, and public-model infrastructure. He pointed to Oracle’s strategic advantage, arguing that much of the world’s high-value corporate and institutional data already resides in Oracle databases, giving the company a natural edge in building secure, enterprise-grade AI applications.
Oracle is aggressively pursuing this vision through its AI Data Platform, which incorporates techniques like Retrieval-Augmented Generation (RAG) to enable real-time querying of private information without compromising security or requiring model retraining on sensitive datasets.
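The RAG pattern mentioned above can be illustrated in a few lines: retrieve the most relevant private records at query time and prepend them to the prompt, so the base model never has to be retrained on sensitive data. The sketch below is purely illustrative and uses a toy keyword-overlap scorer in place of the vector-embedding search a production system would use; none of the names or data refer to Oracle's actual implementation.

```python
# Minimal sketch of Retrieval-Augmented Generation (RAG).
# Illustrative only: real systems score relevance with vector embeddings
# and approximate nearest-neighbour search, not word overlap.

def score(query: str, doc: str) -> float:
    """Toy relevance score: fraction of query words found in the doc."""
    q = set(query.lower().split())
    d = set(doc.lower().split())
    return len(q & d) / len(q) if q else 0.0

def retrieve(query: str, docs: list[str], k: int = 2) -> list[str]:
    """Return the k most relevant private documents for the query."""
    return sorted(docs, key=lambda doc: score(query, doc), reverse=True)[:k]

def build_prompt(query: str, docs: list[str]) -> str:
    """Augment the user query with retrieved context before calling an LLM."""
    context = "\n".join(f"- {d}" for d in retrieve(query, docs))
    return f"Context:\n{context}\n\nQuestion: {query}"

# Hypothetical private records that never leave the enterprise boundary.
private_docs = [
    "Q2 revenue for the widgets division was 14.2 million dollars",
    "The Berlin office headcount is 312 employees",
    "Widgets division revenue target for Q3 is 16 million dollars",
]

prompt = build_prompt("What was the widgets division revenue in Q2?", private_docs)
print(prompt)
```

The key property Ellison alludes to is visible here: the sensitive data is injected into the prompt at query time rather than baked into model weights, so access control and security can be enforced at retrieval.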
Oracle ramps up its investments
To capitalise on this opportunity, Oracle has significantly ramped up its investments. The company now projects roughly $50 billion in capital expenditures for the full fiscal year—up from $35 billion estimated just three months earlier. Recent announcements include a 50,000-GPU supercluster powered by AMD MI450 chips set to launch in Q3 2026, and the OCI Zettascale10 supercomputer linking hundreds of thousands of NVIDIA GPUs.
Oracle’s cloud backlog had surpassed $500 billion by late 2025, fueled largely by surging enterprise AI demand. Ellison’s thesis faces stiff competition, however: rivals like Amazon Web Services, Microsoft Azure, and Google Cloud are rapidly expanding their own enterprise AI offerings, while advances in synthetic data generation could reduce dependence on exclusive proprietary datasets.