Here’s a number that should make your jaw drop: $130 billion.
That’s what Meta plans to spend on capital expenditure in 2026 alone. Not over five years. Not cumulatively. In a single calendar year. For context, that’s roughly the entire GDP of Hungary.
So when Meta signs a $60 billion chip deal with AMD, the instinct is to read it as a story about AMD’s comeback. About how Lisa Su finally cracked Nvidia’s fortress. About the underdog winning.
But that’s the wrong frame entirely.
This is actually a story about what happens when one company becomes so powerful, so indispensable, and so expensive that its biggest customers start engineering their way out of dependency. And it tells you something uncomfortable about where the AI industry is really headed.
The Nvidia tax
Nvidia has over 90% market share in AI chips. That’s not a competitive moat. That’s a monopoly with better branding.
And monopolies charge monopoly prices.
When you’re Meta, running AI inference across 3.5 billion daily active users — recommending reels, generating ad copy, powering chatbots, summarizing threads — every chip you buy is multiplied across millions of servers running 24 hours a day. Improving your cost-per-computation by even 10% doesn’t save you a little money. It saves you hundreds of millions of dollars a year.
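A back-of-the-envelope sketch makes the scale concrete. Every input below (fleet size, accelerators per server, unit cost, refresh cycle) is a hypothetical illustration, not Meta’s actual figures:

```python
# What a 10% efficiency gain is worth at hyperscale.
# All inputs are hypothetical illustrations, not Meta's real numbers.

servers = 500_000           # assumed inference fleet size
chips_per_server = 2        # assumed accelerators per server
chip_cost = 10_000          # assumed cost per accelerator, USD
refresh_years = 5           # assumed hardware refresh cycle

# Annualized chip spend: replace the whole fleet every refresh cycle.
annual_chip_spend = servers * chips_per_server * chip_cost / refresh_years
savings_10pct = annual_chip_spend * 0.10

print(f"Annual chip spend: ${annual_chip_spend / 1e9:.1f}B")
print(f"10% efficiency saving: ${savings_10pct / 1e6:.0f}M per year")
```

Even with these deliberately conservative assumptions, a 10% gain lands in the hundreds of millions of dollars per year, which is why procurement leverage matters this much at Meta’s scale.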
This is why Meta didn’t just sign the AMD deal. It built an entire four-layer silicon architecture. Nvidia for frontier model training. AMD for inference at scale. Its own homegrown MTIA chips for recommendation algorithms. Google’s TPUs for Llama workloads.
Meta essentially said: we’re too big to have a single point of failure in our supply chain. And we’re too big to keep paying one vendor’s premium when alternatives exist.
That’s not a chip story. That’s a procurement strategy story. And it has enormous implications.
Why inference changes everything
Most people think about AI chips in terms of training — the massive, months-long process of teaching a model to think. That’s where Nvidia dominates and where its H100 and Blackwell GPUs earn their legendary reputation.
But here’s what’s quietly shifting: the AI industry is moving from the training era into the inference era. Training happens once. Inference happens billions of times a day — every search query, every chatbot response, every content recommendation.
AMD’s CEO Lisa Su has called inference an 80% annual growth market. And crucially, inference has a different cost calculus than training. Raw performance matters less. Performance per dollar matters enormously.
This is the gap AMD has been targeting. Its MI355 chip reportedly delivers up to 40% more AI outputs per dollar compared to rivals in inference workloads. It doesn’t need to beat Nvidia’s H100 on benchmarks. It just needs to be substantially cheaper for the jobs that run constantly and at scale.
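The “performance per dollar” framing can be reduced to one ratio. The throughput and price figures below are hypothetical stand-ins chosen only to show the shape of the math, not either vendor’s real specs:

```python
# Performance-per-dollar for steady-state inference workloads.
# Throughput and price figures are hypothetical, not real chip specs.

def outputs_per_dollar(tokens_per_sec: float, chip_price: float,
                       lifetime_years: float = 4) -> float:
    """Total tokens generated over the chip's lifetime, per dollar of cost."""
    seconds = lifetime_years * 365 * 24 * 3600
    return tokens_per_sec * seconds / chip_price

# A pricier, faster incumbent vs. a slower but much cheaper challenger.
incumbent = outputs_per_dollar(tokens_per_sec=10_000, chip_price=30_000)
challenger = outputs_per_dollar(tokens_per_sec=8_000, chip_price=17_000)

print(f"Challenger advantage: {challenger / incumbent - 1:.0%}")
```

The point of the sketch: a chip that is 20% slower but roughly 40% cheaper still comes out ahead on cost per output, which is the only number that matters for workloads running constantly at scale.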
Meta, processing hundreds of billions of AI interactions daily, understood this math better than most. The AMD deal is effectively a bet that inference economics will dominate AI infrastructure spending going forward — and that AMD has built the right chip for that moment.
The equity twist nobody is talking about
Here’s where it gets genuinely interesting — and a little uncomfortable.
As part of the deal, Meta gets the option to acquire up to 160 million AMD shares at essentially one cent each, depending on how many chips it actually buys. A potential 10% stake in AMD, earned through purchases.
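The size of that optionality is simple arithmetic. The share count and penny strike come from the deal as reported; the market price below is a hypothetical illustration:

```python
# Intrinsic value of a warrant package: shares * (market price - strike).
# 160M shares at a $0.01 strike reflect the reported deal terms;
# the $250 market price is a hypothetical illustration.

shares = 160_000_000
strike = 0.01
market_price = 250.00       # hypothetical AMD share price

intrinsic_value = shares * (market_price - strike)
print(f"Warrant package worth: ${intrinsic_value / 1e9:.1f}B")
```

At a penny strike the discount from market price is effectively the full share price, so the package behaves almost like an outright stock grant that vests as chip purchases hit milestones.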
AMD did the exact same thing with OpenAI in October 2025.
Think about what this structure actually is: AMD is offering equity in itself as a discount mechanism. Rather than simply cutting chip prices, it’s giving customers an ownership stake that appreciates if the company succeeds. Customers become investors. Investors become advocates. Advocates become long-term buyers.
It’s creative. It’s also a sign that AMD cannot yet win these deals on chip quality alone. Nvidia doesn’t offer equity warrants to land customers. It doesn’t need to. The fact that AMD does tells you the competitive gap still exists — even as it narrows.
But there’s a deeper implication here. These deals create alignment that goes beyond vendor relationships. If Meta holds 10% of AMD, it has a financial incentive for AMD to succeed. It will push workloads toward AMD chips, share feedback on chip design, and co-develop infrastructure. The relationship stops being transactional and becomes structural.
That is genuinely threatening to Nvidia — not because AMD suddenly matches its chips, but because AMD is stitching itself into its customers’ balance sheets in a way that makes switching back expensive in more ways than one.
The bubble nobody wants to name
Zoom out further and the picture gets stranger.
AMD offers equity to Meta to buy chips. Nvidia has invested billions into OpenAI, which spends that money on Nvidia chips. Microsoft and Google pour capital into AI startups that then spend everything on Microsoft and Google’s cloud infrastructure. The same dollars are rotating through the same hands, each time generating a transaction that looks like revenue.
Gil Luria of D.A. Davidson put it bluntly: “The cost of the AI build-out is so high that these are the only companies that can fund it. They’re making a bet that the technology will be so powerful that ultimately everyone will be buying it from them.”
That’s either the clearest-eyed description of a transformative technology cycle — or a politely worded description of a circular economy that looks more robust than it actually is. Nobody knows which yet. Not even the people signing the deals.
What’s certain is this: the AI infrastructure race has reached a scale where the only way to keep it funded is for the participants to fund each other. And that creates a system where the headline numbers — $60 billion here, $130 billion there — can obscure how much of the underlying demand is self-reinforcing rather than organically generated.
What this actually means
For AMD, the Meta deal is validation at a scale that changes analyst models and investor narratives. Morningstar raised its fair value estimate for the company to $300 per share. Two anchor customers — Meta and OpenAI — now publicly stake their AI futures partly on AMD silicon. That’s real.
For Meta, it’s a supply chain insurance policy with upside optionality. If AMD’s chips perform, Meta saves money at massive scale and gains on its equity position. If they underperform, it still has Nvidia running the most critical workloads.
For Nvidia, it’s a warning shot. Not a fatal one — Nvidia’s earnings the very next day were expected to remind markets who still runs the industry. But the direction of travel is clear. The largest AI spenders in the world are methodically building alternatives to Nvidia dependency. Not to replace it. To make sure they never have to.
That’s the real chip story here. Not AMD’s comeback. Not Meta’s spending. But the quiet, expensive, structurally fascinating project of the world’s most powerful tech companies trying to ensure no single vendor can ever tell them no.
Sonia Boolchandani is a seasoned financial writer. She has written for prominent firms like Vested Finance and Finology, where she has crafted content that simplifies complex financial concepts for diverse audiences.
Disclosure: The writer and her dependents do not hold the stocks discussed in this article.
The website managers, its employee(s), and contributors/writers/authors of articles have or may have an outstanding buy or sell position or holding in the securities, options on securities or other related investments of issuers and/or companies discussed therein. The content of the articles and the interpretation of data are solely the personal views of the contributors/writers/authors. Investors must make their own investment decisions based on their specific objectives and resources, and only after consulting such independent advisors as may be necessary.

