Will there be an AI bubble peak? Yes. Every breakthrough technology has had overinvestment.
Has the AI bubble peaked? If you keep reading mainstream media and listening to Michael Burry, you'd believe it has.
You'd be losing a lot of money though.
Real demand is through the roof:
- H100 prices are recovering to their highest level in 8 months. This is a clear indicator that Burry's claim that old GPUs become useless faster than expected is wrong. Source: mvcinvesting on X (can't post the link here since X is banned).
- Alibaba's Justin Lin in China just said they're severely constrained by inference demand, and that Tencent is the same. They simply do not have the compute to meet demand. They're having to spend their precious compute on inference, which doesn't leave enough to train new models to keep up with the Americans. Source: https://www.bloomberg.com/news/articles/2026-01-10/china-ai-leaders-warn-of-widening-gap-with-us-after-1b-ipo-week
- Google says it needs to double its AI serving capacity every 6 months to meet demand. Source: https://www.cnbc.com/2025/11/21/google-must-double-ai-serving-capacity-every-6-months-to-meet-demand.html
Notice how "compute" is always followed by "demand". It's real demand, not a circular economy. It's genuine user demand.
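To put the "doubling every 6 months" claim in perspective, here's a quick back-of-the-envelope sketch (Python; the 1x starting capacity is just an arbitrary unit for illustration):

```python
# What "double serving capacity every 6 months" compounds to.
# The 6-month doubling period is from the CNBC article above; the 1.0
# starting capacity is an arbitrary illustrative unit.

DOUBLING_MONTHS = 6

def capacity_multiple(months: int) -> float:
    """Capacity relative to today after `months` of steady doubling."""
    return 2 ** (months / DOUBLING_MONTHS)

for months in (6, 12, 24, 48):
    print(f"after {months:2d} months: {capacity_multiple(months):6.0f}x today's capacity")
# after  6 months:      2x today's capacity
# after 12 months:      4x today's capacity
# after 24 months:     16x today's capacity
# after 48 months:    256x today's capacity
```

If that pace holds for even two years, that's 16x the serving capacity just to keep up.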
Listen to the people who are actually close to AI demand: they're all saying they're compute constrained. Literally nobody has enough compute. Every software developer has experienced unreliable inference with Anthropic's Claude models because Anthropic simply doesn't have enough compute to meet demand.
So why is demand increasing?
- Because, contrary to popular belief on Reddit, AI is tremendously useful even at its current intelligence level. Every large company I know is building agents to increase productivity and efficiency. Every small company I know is using some form of AI, whether it's ChatGPT or video gen.
- Models are getting smarter faster. In the last 6 months, GPT-5, Gemini 3, and Claude 4.5 have increased capabilities faster than expected. The curve is now exponential. Source 1: https://metr.org/blog/2025-03-19-measuring-ai-ability-to-complete-long-tasks Source 2: https://arcprize.org/leaderboard
- There are reasons to believe the next generation of foundational models from OpenAI and Anthropic will accelerate again. GPT-5 and Claude 4.5 were still trained on H100 GPUs or H100-class chips; the next generation will be trained on Blackwell GPUs.
- LLMs aren't just chatbots anymore. They're trading stocks, doing automated analysis, and writing apps from scratch. Token usage has exploded.
At some point, the AI bubble will peak. Anyone who thought it peaked in 2025 is seriously going to regret it.
The railroad bubble in the US peaked at 6% of GDP in spend. AI is at 1% right now. I'd argue that AI is more important than railroads.
Believing that the AI bubble has peaked is going to lose people a lot of money.
Posted by auradragon1
27 Comments
Yes
Nobody can see into the future, but I'll think back to this post and wonder how you're doing mentally if the bubble pops.
Most of my stocks are outside of AI because I see it as gambling.
It's no use trying to convince Reddit. The hivemind has concluded that AI is overvalued trash and it will all come crashing down. Just keep reading and listening to a variety of sources on the matter and draw your own conclusions.
Yeah, Burry didn’t say “it has peaked”. In fact he says in his Substack that historically investment has continued for some time _after_ the peak, which makes the peak of an investment bubble very hard to see. The housing crisis had real mortgages with balloon payments that could be calculated to a specific timeframe, but this bubble does not…so a “big short” really isn’t possible (or at least, not that he knows).
Which bubble?
Reddit hates AI while a bunch of posts are literally written with AI.
Says the person who generated this post by using AI…
People talking about GPUs depreciating rapidly really have no idea what they're talking about. Just the Chinese market itself would buy every single "old" GPU they could get their hands on if they got the chance, and not only that, they would also be willing to pay a hefty premium for them.
I have 70% of my portfolio in TSM and Google. They have done well.
Since this is about stocks, people seem to miss the point. At some point the ratio of investable money to money required for normal business becomes unstable.
Just like you, the markets may front-run stock price as a percentage of investable money versus GDP spend. They justify that the bubble isn't over, but everyone thinks they're beating everyone else to the punch.
At some point the market pegs back to whatever liquidity is available to go up. This is irrespective of your arguments for AI.
Neat thing about markets is that everyone loads the boat and front runs literally everything with max leverage because of human greed. You will never know how much is “priced in” until something breaks.
I’d be interested to know if “demand” is revenue-generating demand, or just demand for free services or more features under existing revenue streams.
The biggest unknown about the bubble is whether they can generate enough revenue to justify the trillions in buildout and the large opex costs.
Microsoft was complaining the other day about people calling AI slop amid low consumer demand, so I’m not convinced this is catching on as well as some believe.
I love it when Reddit bets against things.
* Reddit IPO will flop.
* Meta is dead
* AI bubble
* SoFi is overvalued
* Advanced Money Destroyer
Inverse Reddit made me a lot of money this year.
You are absolutely right that data centers are valuable assets. In a normal market, if one tenant leaves, you just rent it to the next one. This is the main “Bull Case” for Oracle.
However, Michael Burry is betting on a specific scenario where those assets turn into liabilities.
Here is why Burry thinks Oracle’s data centers might not be as “safe” as they look:
1. The “Rotting Fruit” Problem (Obsolescence)
A data center is made of two things: the Building/Power (which keeps value) and the Chips/Servers inside (which lose value).
The Trap: Oracle is spending billions on Nvidia H100 chips right now.
The Risk: Nvidia releases new, faster chips (like Blackwell) every 1–2 years.
Burry’s Point: If OpenAI leaves Oracle in 3 years, those H100 servers will be “old technology.” No other company will want to pay premium prices to rent 3-year-old chips when they can get the new ones elsewhere. The “asset” depreciates much faster than the debt Oracle took out to buy it.
2. The Accounting Trick (Depreciation)
Burry specifically called out an accounting maneuver Oracle (and others) are using to look more profitable:
The Trick: Oracle changed its accounting rules to say its servers will last 6 years. This spreads the cost out, making its yearly profits look higher on paper.
The Reality: In AI, a server rarely stays “state of the art” for 6 years.
The Consequence: If those servers become obsolete in 3 years (not 6), Oracle will suddenly have to write off billions of dollars in losses, which would crash the stock.
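A minimal sketch of that mechanic, using straight-line depreciation and made-up numbers (the $10B outlay is illustrative, not Oracle's actual figure):

```python
# Straight-line depreciation with hypothetical numbers (the $10B capex is
# made up for illustration; it is not Oracle's real figure).

CAPEX = 10e9  # $10B of GPU servers (illustrative)

def annual_expense(capex: float, useful_life_years: int) -> float:
    """Straight-line depreciation: the same expense booked every year."""
    return capex / useful_life_years

three_yr = annual_expense(CAPEX, 3)  # if the hardware is really done in ~3 years
six_yr = annual_expense(CAPEX, 6)    # the longer schedule in the filings

print(f"3-year schedule: ${three_yr / 1e9:.2f}B expense/year")   # $3.33B
print(f"6-year schedule: ${six_yr / 1e9:.2f}B expense/year")     # $1.67B
print(f"paper profit boost: ${(three_yr - six_yr) / 1e9:.2f}B/year")

# If the servers are actually obsolete at year 3, the remaining book value
# on the 6-year schedule has to be written off in one hit:
write_off = CAPEX - 3 * six_yr
print(f"write-off at year 3: ${write_off / 1e9:.2f}B")  # $5.00B
```

Same hardware, same cash spent; the only difference is when the expense hits the income statement.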
3. The “Glut” of 2026
You mentioned that “others will use it.” That is true today because there is a shortage.
But Amazon, Google, Microsoft, Meta, and CoreWeave are all building massive data centers right now.
Burry fears that by 2026/2027, there will be too many data centers and not enough profitable AI companies to fill them.
If supply exceeds demand, rental prices crash. Oracle would be stuck with high-interest debt payments while collecting lower rent.
Summary
You are right that the building and power connection will always have value. But Burry is betting that the expensive computers inside will lose value faster than Oracle expects, leaving them with massive debt for “old” technology.
AI has its uses, but it’s not for everything. That’s it!
$NBIS
Nah, you’re missing a few points in your premise and not understanding Burry’s argument.
First of all, Blackwell is already available (generally available since July 2025).
Second, Nvidia has already announced its next generation GPUs based on Rubin.
Burry’s argument wasn’t about availability but depreciation instead.
So, top-of-the-line GPUs are required for model training, whereas inference is more memory-bound (the whole model has to fit in memory). That can be done with cheaper cards, like GeForce parts or the A100 with higher memory (80GB). Using a Blackwell chip for inference is highly inefficient in terms of cost.
Think of cars as an example: model training is like racing, whereas inference is everyday driving. You can use an older-generation race car for everyday purposes, but it’s not at all efficient. The same goes for the new generation of GPUs.
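Here's the rough arithmetic behind the memory-bound point (the 70B model is a hypothetical example; KV cache and batching headroom are ignored to keep it simple):

```python
# Why inference is memory-bound: the weights have to fit in GPU memory
# before raw FLOPS matter. The 70B parameter count is a hypothetical
# example; KV cache and batching headroom are ignored for simplicity.
import math

PARAMS = 70e9        # hypothetical 70B-parameter model
BYTES_PER_PARAM = 2  # FP16/BF16 weights

weights_gb = PARAMS * BYTES_PER_PARAM / 1e9
print(f"weights alone: ~{weights_gb:.0f} GB")  # ~140 GB

for gpu, mem_gb in [("A100 80GB", 80), ("H100 80GB", 80)]:
    count = math.ceil(weights_gb / mem_gb)
    print(f"{gpu}: at least {count} cards just to hold the weights")
```

The bottleneck is memory capacity and bandwidth, not compute throughput, which is why a top-end training chip is wasted on this workload.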
Burry’s argument was that CEOs extending the depreciation from 2-3 years to 5-7 years is wrong. It means companies will have to jump to the new generation (when available, usually on Nvidia’s 2-3 year cycle) or be left behind (this is an arms race).
This sounds like something AI would write.
Which bubble pops first, AI or Silver?
See, this is the thing about every single one of these predictions. “At some point, the AI bubble will peak” is pretty much rubbish, along with “it may peak end of ’26 or early ’27”. You make a prediction, hedge it by putting it a year out; if it doesn’t happen, no one remembers, and if it does, you’ll remind us how smart you were.
The only way to make these credible is to post up your short positions. Put your money where your mouth is, as they say.
I believe the AI bubble is now just another bet that goes like this: if you believe that American tech billionaires are gonna be successful in installing a stable fascist regime with full surveillance of its citizens in the US, then it wasn’t a bubble and your investment will pay off. If you believe, however, that the US can save itself and the world from the techno-fascism, then the bubble will pop.
He’s wrong so much that the one time he was right they made a movie about it. He’s the perfect Reddit dweeb hero because he’s always wrong.
I work for one of the FAANGs, and I know there is insane demand. The servers cannot keep up, so much so that they have to severely constrain capacity for regular users in order to prioritize enterprise customers. This will only slow once GPUs become 50-60x faster than today.
H100 prices are rising because supply is slowing down. Now everyone is moving to GB200 and H200.
I googled the air quality in Boston last summer for a date in July, because we had some weirdness in a meter we used for work that day, and the AI result said it couldn’t provide an answer because that date is in the future.
I asked Grok the other day to summarize the time it defamed and then sexually harassed Twitter CEO Linda Yaccarino, and it said it couldn’t find any information that this ever happened. It did.
How the hell is crap like this helpful to anyone?
I don’t think it’s right to short AI, because it’s hard to know when the bubble will peak.
However, I don’t think it’s fair to say “the bubble hasn’t peaked because demand is currently high”. That’s circular reasoning.
AI is useful, but it’s not as useful as a lot of industry executives lead you to believe. 95% of AI projects are failing to achieve ROI. For coding, it has made senior devs less productive according to studies.
With that said, I think AI is here to stay and there are useful applications, but I think the bubble is in compute. Attempting to brute-force model quality by expanding the compute used in training is hitting significant diminishing returns. Leading models have hardly improved in the past year in real-world performance; most of the perceived gains are just from overfitting RL to benchmarks. This gives shareholders the illusion of progress, when in reality LLMs are not much different from where they were a year ago.
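To make the diminishing-returns point concrete: if loss falls as a power law in training compute, each extra doubling buys a smaller absolute improvement. A toy sketch (the exponent 0.05 is an assumed value for illustration, not from any measured scaling law):

```python
# Toy power-law scaling curve: loss proportional to compute**(-alpha).
# The exponent is an assumed illustrative value, not a measured one.

ALPHA = 0.05

def loss(compute: float) -> float:
    """Loss under a simple power law in training compute."""
    return compute ** -ALPHA

for doubling in range(1, 5):
    before = loss(2 ** (doubling - 1))
    after = loss(2 ** doubling)
    print(f"doubling #{doubling}: loss {before:.4f} -> {after:.4f} "
          f"(gain {before - after:.4f})")
# Each successive doubling of compute buys a smaller absolute gain.
```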
I don’t think AI progress will stall forever, but I think future progress will come from researchers achieving improved designs, not larger datacenters.
The main thing keeping the compute bubble going is that no one wants to be the first to cut back. As long as OpenAI is spending hundreds of billions, xAI, Google, Anthropic, etc. feel obligated to follow their lead.
It’s sort of like how in 2020/2021, everyone was hoarding tech workers, and placing them on the bench, just to “have them”, because it was seen as scarce. In 2026, compute is seen as very scarce, so companies are doing whatever they can to procure compute, whether they need it or not.
Buy some BYND to diversify.
Last I checked, Burry hasn’t said it has peaked, and I’m pretty sure he doesn’t think it has.