The period of frenzy is followed by the period of deployment; the time to harvest the fruits of the revolution. - Carlota Perez
The AI hype cycle is over for now. I’ve been an AI advocate since the first time I used GPT-4 to write a working Python trading algorithm in 2023. I’m still convinced it will take the wider economy years, maybe decades, to fully digest the productivity shock we’ve already unwrapped. But the recent curve we’ve been riding just flattened into a long plateau.
The problem isn’t that the models stopped improving. It’s that the improvements we need are measured in orders of magnitude, not percentage points. Every step up the scaling laws now demands a city’s worth of electricity and a sovereign wealth fund’s worth of GPUs. You can still squeeze clever things out of mixture-of-experts or chain tiny specialists into something that looks like agency; that keeps the demo videos cinematic. It just doesn’t get us to super-intelligence.
For that we need either an architectural miracle (un-forecastable by definition) or a civil-engineering miracle (a decade-long sprint to build nuclear plants and 2-nanometer fabs). The first is luck. The second is politics. Both are scarce.
Meanwhile the models we have remain, at their core, next-token roulette wheels. Chain enough spins together and tiny error probabilities compound into existential glitches. In domains where you can automatically verify an answer (the unit tests pass, the protein binds), those glitches are an acceptable tax.
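The compounding of per-step errors can be sketched numerically. This is a toy model, assuming independent errors at each step; the accuracy figures are illustrative, not measurements of any real system:

```python
def chain_success_prob(per_step_accuracy: float, steps: int) -> float:
    """Probability that every step in a chain succeeds, assuming
    each step fails independently with the same probability."""
    return per_step_accuracy ** steps

# A model that is right 99% of the time per step looks reliable in
# isolation, but chained calls erode that quickly:
print(round(chain_success_prob(0.99, 1), 3))    # 0.99
print(round(chain_success_prob(0.99, 50), 3))   # ~0.605
print(round(chain_success_prob(0.99, 200), 3))  # ~0.134
```

At 99 percent per-step accuracy, a 200-step chain succeeds barely one time in seven.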
You iterate until it works. In domains where judgment is qualitative, the tax becomes fatal. My trading example still hurts: I asked ChatGPT to find the optimal MACD strategy for a new asset. It quickly produced a strategy that looked optimized on the surface but accounted for neither implied volatility nor bet sizing.
A layperson would have seen a tidy strategy and moved on. Scale that failure mode to law, medicine, or national security and you understand why “human-in-the-loop” isn’t a slogan: it’s a ceiling.
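To make the failure mode concrete, here is a deliberately naive MACD crossover backtest, similar in spirit to what a chat model tends to produce. This is a hypothetical sketch, not the code I actually received; every name and number is illustrative:

```python
import numpy as np

def macd_signal(prices, fast=12, slow=26, signal=9):
    """Return +1/-1 positions from the MACD line crossing its signal line."""
    def ema(x, span):
        # Exponential moving average with the standard span-based smoothing.
        alpha = 2 / (span + 1)
        out = np.empty(len(x), dtype=float)
        out[0] = x[0]
        for i in range(1, len(x)):
            out[i] = alpha * x[i] + (1 - alpha) * out[i - 1]
        return out
    macd = ema(prices, fast) - ema(prices, slow)
    return np.where(macd > ema(macd, signal), 1.0, -1.0)

# The glaring omission: every trade is the same size, regardless of how
# volatile the asset is or what the options market implies about risk.
prices = np.cumsum(np.random.default_rng(0).normal(0, 1, 500)) + 100
positions = macd_signal(prices)
pnl = positions[:-1] * np.diff(prices)  # unit-size bets, no risk scaling
```

The entry logic is plausible enough to pass a glance, which is exactly the trap: nothing in it touches implied volatility or position sizing, the two things that decide whether the strategy survives contact with a real market.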
What comes next is not the next spectacular demo but the quiet absorption of today’s tools into the 80 percent of the economy that still runs on Excel and email. Code will ship with more AI-authored lines, but senior devs will still sign the diffs. Customer-support bots will escalate the five percent of tickets that matter; the other ninety-five percent will vanish so smoothly customers won’t notice. Radiologists will still stare at scans, except now a model will have pre-read every study and flagged the one in a thousand the tired resident would have missed. And yes, you’ll still need a PhD to notice that the portfolio weights got renormalized, but you’ll build the new clustering module in five minutes instead of an afternoon.
The productivity gains are real; they’re just not cinematic. For founders, this means stop chasing the next 0.3 percent on MMLU. Find a vertical where verification is cheap and margins fat, then build the scaffolding that lets domain experts ride the model instead of babysit it. For investors, treat “AI” the way we treated “mobile” circa 2011: infrastructure bets can still clear the hurdle rate if you triple the time horizon, but the application layer is a graveyard of demos wearing revenue costumes. For policymakers, forget the AGI manifestos and write zoning rules that let utilities run high-voltage lines to data centers.
THE LONG PLATEAU IS THE NORM
Every general-purpose technology follows the same rhythm: hype, overbuild, crash, consolidation, integration, and—decades later—reinvention. Railroads, electricity, automobiles, the Internet—each cycle runs 60 to 70 years.
Railroads (1830s–1900s): Frenzied expansion followed by decades of operational optimization—standard gauges, timetables, refrigerated cars.
Electricity (1880s–1940s): Edison’s lightbulbs gave way to the slow work of wiring cities, building power grids, and electrifying industry.
Automobiles (1900s–1970s): Thousands of car companies collapsed into a few giants; the real transformation came from highways, logistics, and dealership networks.
Internet (1990s–present): The dot-com boom crashed, then broadband, mobile, and cloud turned the web into essential infrastructure.
The plateau phase isn’t stagnation—it’s when the technology seeps into every pore of the economy. It’s also where the biggest fortunes are made by those who can operationalize, standardize, and integrate.
AI, CRYPTO, ROBOTICS: ONE SUPER-CYCLE
AI isn’t a standalone revolution in the way railroads or electricity were. It’s an application-layer wave riding on the back of the Internet cycle that began around 1990. So is crypto. So are robotics and self-driving vehicles.
The Internet’s long wave looks like this:
1. 1990: Web launch: The browser opened the network to the public.
2. 2007: iPhone: Computing became constant, personal, and sensor-rich.
3. 2022: ChatGPT: AI entered mass public consciousness.
4. 2025: Integration plateau begins: AI, crypto, robotics, AV systems embed into existing workflows.
5. ~2050: Post-network wave: Brain–computer interfaces, energy revolutions, off-planet industry.
The plateau we’re entering now is simply the integration phase of the Internet super-cycle, one that could last until mid-century.
WHAT TO DO IN THE PLATEAU
Founders: Build in verticals with cheap verification, high margins, and entrenched demand. Sell to domain experts, not tourists.
Investors: Prefer infrastructure over flashy apps; triple your time horizon.
Policymakers: Focus on enabling physical capacity: power, data centers, chips. Ignore the AGI manifestos.
We are not going back to the pre-2022 world. The ceiling just got higher, but the ladder is longer than we thought. That isn’t failure; it’s physics. The next breakthrough will arrive, maybe from a grad student with a sparse-attention kernel, maybe from a national lab running a ten-gigawatt reactor. Until then, the boring work of integration is the only game in town.
Exponential curves always look flat when you zoom in.