Year-End Thoughts on AI
Yes, it's a bubble, but not like the ones we know.
Cartoon by Nick Anderson. Scroll down to the end to see three videos of what I did NOT do on my vacation and to read my other recent essays on AI.
📣 Dear friends, I wanted to take a moment and thank you for being a subscriber to this newsletter. We are all drowning in content options, so your willingness to give my ramblings any attention is a special gift. We now have 16,500 subscribers (9.5K here and 7K for the LinkedIn version) and have big plans in 2026.
This core newsletter will remain free and available as is. Thanks to all of you who have pledged to support us if/when we turn on paid subscriptions. As of this writing, we have arrived at that point. Your paid subscription will help us pay outside contributors and editors, and provide, on occasion, exclusive content and events. Thank you for being part of this journey. With your help, we will do even more.
***
We’re currently in a bubble of some sort, but this feels like a different kind of bubble. It’s a bubble of cash flow, capital expenditure and risk, for sure. However, the technology itself is here to stay (as is the threat to hundreds of millions of jobs over the next few years).
I will use some data points, but this is more of a vibe-check than an economic analysis—there are more than enough economists to cover that. I use AI all the time. My team has worked with several organizations to build practical guides for AI and do workshops on advanced prompt writing. I am not a programmer, I don’t code, and I am not a tech executive. I’m a person who has options for when to use different systems, and has other AI shenanigans thrust upon me by the myriad platforms that govern my personal and professional lives. If you’ve made it this far, I suspect you know what I mean.
First, the financial bubble at the top of the AI industry is very real, with a lot of the main players investing in each other. OpenAI has raised $60 billion in funding so far and is trying to raise another $100 billion. Anthropic just raised another $13 billion. Amazon, Google, and Microsoft alone are spending unprecedented amounts of cash on data center infrastructure and computing power, and McKinsey estimates that these facilities will cost nearly $7 trillion by 2030. That number may be low if Goldman Sachs’ estimates for 2026 are even close to correct:
If recent history holds, analyst estimates for technology capex may be revised higher still. The hyperscalers spent $106 billion in capex in the third quarter this year (including AI and non-AI expenditures), representing a year-over-year growth rate of 75%. Analysts expect this growth rate will slow sharply to 49% in the fourth quarter and to 25% by the end of 2026.
But Goldman Sachs Research notes that consensus capex estimates have proven to be too low for two years running. At the start of both 2024 and 2025, consensus estimates implied capex growth of roughly 20% for the year. In reality, it exceeded 50% in both years.
According to economist Jason Furman, AI writ large is the single driving force in the U.S. economy—it has accounted for essentially all economic growth over the last year-plus: Without data centers, GDP growth was just 0.1% in the first half of 2025.
The average public user of ChatGPT or Gemini does not really see any of this. If you’re making funny images with image generators, or putting together shopping lists, holiday plans, or whatever else with these tools, each incremental improvement in response quality feels marginal. I firmly believe that ChatGPT could collapse and disappear next week and the fallout would be almost entirely on the financial side. There would be very little overall effect on the technology.
To me, that’s where AI has staying power. This is also where the comparison to the dot-com bubble starts to unravel. The narrative is familiar for sure—the silicon tech barons are at it again with this new digital technology that is churning through mind-numbing amounts of money. But, away from these headlines, businesses and organizations of all sizes are finding real use cases for AI that are moving the needle. These are not apps built on top of a commercial ChatGPT model; these are small, specialized, closed models that are laser-focused on tasks they can complete reliably.
Call it “AI’s Refinement Era.” Large models that are sorta good at a lot of things are so resource-intensive that there almost has to be a correction. The arms race is driving up energy prices and causing as-yet-untold environmental destruction. Elections are already being decided by voters for whom data center construction is the single issue. High electricity prices and contaminated water are costs that are simply untenable for large swathes of most countries, not just the U.S.
The tension here is obvious—Western companies can probably tolerate some level of data offshoring, but the populace and public sector simply cannot. Again, this is where smaller, more efficient, more purpose-driven AI looks more like the near-term reality. The consumer side will be there, but I don’t see how a company like OpenAI can honestly expect to generate revenue significant enough to maintain its spending habits via John and Jane Q. Public. To date, the evidence bears this out.
There is a lot at stake here, and it goes far beyond the assets of a few mega-rich tech oligarchs and mega corporations. The chart below, from McKinsey, illustrates this perfectly. There are a lot of places where AI just isn’t up to the task yet, but the inverse is true as well, and this trend will only accelerate.
There will be a lot of destruction throughout the AI industry over the next 12-18 months, but this is not the dot-com bubble. There is too much underlying value, there’s been too much progress, and supercomputing isn’t going anywhere.
Back in 2023, I wrote a piece here on what the rise of AI demanded of us as a society, and how it was only going to get harder to separate truth from fiction online. Image and video generation engines are exponentially more capable than they were when I published that essay two years ago and they only continue to improve.
Remember this: Computers do not have ideas; they work on probability, and that has not changed. As I’ve written many times here, the human spirit has never been more important, and AI can’t generate anything a human didn’t generate first.
Speaking of generation, here’s the state of the art:
My previous essays on AI:
DeepSeek says I died in 2020 Feb. 9, 2025
The problem with fake AI avatars Jan. 5, 2025
AI is coming for your search traffic May 21, 2024
AI isn’t ready to be everywhere April 25, 2024
Don’t call them “hallucinations!” April 16, 2024
Copyright, fair use & AI March 7, 2024
The great AI acceleration Nov. 27, 2023
AI companies use the public as a laboratory Sept. 26, 2023
The liberal arts are more important than ever April 23, 2023
What AI demands of us April 16, 2023
Education in the age of AI Feb. 27, 2023
Did we miss anything? Make a mistake? Do you have an idea for anything we’re up to? Let’s collaborate! sree.sreenivasan1@gmail.com and please connect w/ me: Twitter | IG | LinkedIn | FB | YouTube | Threads | Spread | TikTok
Sree’s Sunday Note is reader-supported. If you enjoyed this post, you can tell Sree that his writing is valuable by pledging a future subscription. You won’t be charged unless we enable payments.