From 1980 to 2000, GDP doubled during the PC revolution. Why assume AI ends differently?
KEY TAKEAWAYS
AI-driven productivity fears are not new and resemble prior technological panics such as the PC revolution and the internet boom. Historically, these transitions ultimately accelerated growth rather than causing sustained contraction.
Between 1980 and 2000, US real GDP nearly doubled while personal computers spread throughout the workplace. Productivity growth accelerated and unemployment fell, contradicting fears of automation-driven collapse.
The Citrini report frames AI as potentially creating “ghost GDP,” where output rises but wages stagnate. The argument assumes permanent labor displacement without sufficient reallocation.
AI qualifies as a general-purpose technology, similar to electricity or computing. Such technologies restructure production functions and create entirely new industries over time.
Capital expenditures in AI infrastructure, semiconductors, and data centers signal expansionary investment rather than contraction. Rising private fixed investment directly contributes to GDP growth.
Productivity gains have historically expanded the economic pie rather than shrinking it. AI should be viewed through the same long-run lens.
Transitional labor dislocation is real, but permanent structural unemployment is unlikely. Human adaptability and incentive structures consistently drive reallocation.
Capital deployment at massive scale is not consistent with systemic contraction. Investment waves signal belief in long-term expansion.
Markets often overreact to technological fear narratives. Short-term volatility may obscure long-term structural opportunity.
Betting against human ingenuity and productivity has repeatedly proven costly. Long-term investors should anchor in historical precedent rather than headline anxiety.
You can quote me: “Every technological panic has been the ignition switch for growth.”
When you grow up. I was always a computer nerd of sorts. My dad was a scientist who encouraged me to experiment and expand my knowledge, and he provided me with the tools to do that at a very young age. I learned to program computers by spending hours in a freezing computer center in the basement of one of Bell Labs' iconic locations. I was part of a small group known as the Explorers, chartered by Bell Labs to foster young engineering talent. We would meet twice a week and hack away. Our mentor was a legendary scientist, and each week he would bring in different engineers to work with us, shoulder to shoulder. That is where I met Brian Kernighan and Dennis Ritchie.
If you know who they are, you get a prize, but if you don't: they were pioneering computer scientists at Bell Labs who helped shape the modern software world (a bold statement, but a true one). Ritchie co-created the C programming language and was a key developer of the Unix operating system, laying the technical foundation for much of today's computing infrastructure. Kernighan co-authored The C Programming Language with Ritchie (often called "K&R"), a book that became the definitive guide to C and influenced generations of programmers (I still have my copy). Those two helped spark a revolution, quietly building the tools that power everything from operating systems to today's internet.
From that young age, I almost always had a computer by my side; I even built my first Apple computer by hand. It became my superpower. I was able to write programs to do my homework, and when VisiCalc, the first spreadsheet program, was released, it became a tool for me to earn money. YES, I actually earned big bucks in my early teens putting people's businesses on spreadsheets. I could do all this, and go to school, and play sports, AND still have plenty of time to do silly kid things. It was an incredible productivity hack.
In college, when my classmates were clacking away at messy typewriters, I was spell-checking in WordPerfect! After college, when I started on Wall Street, I was one of the few people on the trading floor with a real computer, where spreadsheets gave me the edge over, quite literally, just about everyone who was still doing math with pencils on the backs of trade tickets. Eventually, more and more computers showed up on desks. Some traders embraced them and had great wealth to show for it, while others resisted and, ultimately, went the way of the Dodo.
Did technology cause folks to lose their jobs?
Well, if you read the headlines at the time, you would certainly think so. But alas, what happened was quite the opposite. Could you imagine if we had never adopted PCs in the workplace? How about spreadsheets, which were expected to completely eliminate the accounting profession? Could you imagine where our economy would be without computers and software?
When I first went to college, I was slated to become an engineer. Once I started my studies, which included all manner of brutal high-level, theoretical math, I quickly realized that being an engineer had very little commercial viability. No. This was the early 80s, and Wall Street was booming; that was where the money was. 🤣 It didn't take me long to switch majors. In retrospect, that was a silly trade, wasn't it? Imagine that: no one wanted to be a computer scientist when they grew up.
Ok, that was a long lead up to what I want to discuss today. Over the weekend, Citrini Research released a report titled “The 2028 Global Intelligence Crisis.” It is written as a forward-looking thought experiment set in mid-2028, imagining a world in which rapid advances in AI dramatically boost productivity while displacing large swaths of white-collar labor. In this scenario, wages come under pressure, credit markets strain, consumption weakens, and asset prices reprice, even as headline output looks strong. The authors frame it clearly as a scenario exercise, not a formal forecast, but the core message is that AI-driven labor disruption could trigger higher defaults and broader economic contraction. It went viral on Monday and helped spark a sharp selloff in anything remotely connected to AI, which, if we’re honest, is almost everything. I touched on it yesterday, but because the fear is still lingering, I want to tackle it again from a different angle today.
Now let me tell you why I think they have it exactly backwards.
I lived through the last great technological panic. Actually, I lived through a few of them. The PC revolution. The internet boom. The automation scare. The offshoring scare. The spreadsheet-will-kill-accountants scare. Each one came with breathless warnings about structural unemployment and permanent dislocation. And each one, in hindsight, was the ignition switch for growth.
Let’s anchor this in facts, not vibes.
From 1980 to 2000, as personal computers moved from hobbyist kits to corporate necessity, US real GDP nearly doubled, rising from roughly $7 trillion to about $13 trillion. Labor productivity growth accelerated sharply in the second half of the 1990s, averaging over 2.5% annually, compared to closer to 1.5% in the prior two decades. Unemployment, instead of permanently spiking, fell to 4% by 2000. That was not an economy collapsing under automation. That was an economy supercharged by it.
And let's be clear about scale. In 1980, there were essentially zero software engineers relative to today's standards. By the early 2000s, millions of Americans worked in computer and mathematical occupations. The Bureau of Labor Statistics now counts well over 4 million jobs in computer and mathematical fields. Those jobs did not exist in any meaningful way when I was freezing in that Bell Labs basement. I think there were six of us, tops, on any Tuesday or Thursday night.
Technology did eliminate tasks. It did eliminate some jobs. It absolutely displaced people. But it created more value than it destroyed. That is not opinion. That is observable history. 👀
The Citrini scenario hinges on the idea that AI creates "ghost GDP": output without wages. But historically, productivity gains do not remove wages from the system. They raise real wages over time. When output per worker increases, the pie gets larger. Firms compete for talent. New industries form. Entire categories of demand appear.
Think about this in human terms. When you grow up, what do you want to be? In 1975, nobody answered "cybersecurity analyst"; I certainly didn't. In 1985, nobody said "app developer." In 1995, nobody said "cloud architect." In 2005, nobody said "social media strategist." In 2015, nobody said "AI prompt engineer."
When I was growing up, being a computer scientist was niche. When I pivoted to Wall Street in the early 80s, engineering felt commercially irrelevant. That sounds absurd now. Today, some of the highest-compensated professionals in the world sit at the intersection of code and capital.
If I only knew.
AI is not a single-product innovation. It is a general-purpose technology, like electricity or computing. Economists use that phrase carefully. A general-purpose technology does not just replace labor. It restructures the production function of the entire economy. Electricity did not eliminate candle makers and then stop. It enabled refrigeration, telecommunications, appliances, industrial scaling, and entirely new consumer categories. Computing did not just eliminate typists. It enabled e-commerce, digital advertising, SaaS, mobile ecosystems, and globalized supply chains. AI is on that exact trajectory.
Look at the capital flows. The largest technology companies in the world are deploying hundreds of billions of dollars into AI infrastructure. Data centers are being built at a pace not seen since the telecom buildout of the late 1990s. Semiconductor capital expenditure is surging. Global data center electricity demand is projected to more than double by the end of the decade. That is not a contractionary signal. That is investment!
Investment is a component of GDP. When private fixed investment rises, GDP rises. That investment supports construction jobs, engineering jobs, grid modernization, hardware manufacturing, cooling systems, networking equipment, and logistics. AI is not just a software story. It is a physical infrastructure story as well.
Now, let’s address the labor fear directly. Yes, AI will compress certain white-collar tasks. Entry-level legal drafting, basic coding, junior analyst report writing–these are clearly vulnerable to automation. Transitional unemployment in those pockets is likely. Wage pressure in commoditized roles is plausible. But the same thing happened when spreadsheets arrived on trading desks. Junior clerks were no longer needed to reconcile books manually. Instead, analysts became more powerful. Portfolio managers scaled larger pools of capital. Firms grew. GREW!
When worker productivity increases, firms expand output without proportionally expanding headcount at first. That is the tremor phase. But over time, lower costs translate into lower prices or higher margins. Lower prices raise real purchasing power. Higher margins fuel investment. Investment creates new roles.
And here is the macro mechanism that matters. Real GDP growth equals labor force growth plus productivity growth. The US labor force is not growing rapidly. That means productivity is the lever. If AI can sustainably raise productivity growth from, say, 1.5% annually to 2.5% or 3% over a multi-year stretch, the compounding effect is enormous. Over a decade, that difference is trillions of dollars in additional output. Trillions!
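To make that compounding claim concrete, here is a minimal back-of-the-envelope sketch. The starting GDP figure (~$28 trillion), the flat labor force, and the growth rates are my illustrative assumptions, not figures from the report; the point is only to show how a one-percentage-point lift in productivity growth compounds into trillions within a decade.

```python
# Back-of-the-envelope sketch of the compounding math described above.
# Assumptions (illustrative, not the author's figures): ~$28T starting
# real GDP, flat labor force, productivity as the only growth lever.

def gdp_path(start, growth_rate, years):
    """Project real GDP forward at a constant annual growth rate."""
    path = [start]
    for _ in range(years):
        path.append(path[-1] * (1 + growth_rate))
    return path

start_gdp = 28.0  # trillions of dollars (assumed base)
years = 10

baseline = gdp_path(start_gdp, 0.015, years)  # 1.5% productivity growth
ai_boost = gdp_path(start_gdp, 0.025, years)  # 2.5% productivity growth

gap = ai_boost[-1] - baseline[-1]
print(f"Year-{years} GDP at 1.5% growth: ${baseline[-1]:.1f}T")
print(f"Year-{years} GDP at 2.5% growth: ${ai_boost[-1]:.1f}T")
print(f"Difference in year {years} alone: ${gap:.1f}T")
```

Even under these rough assumptions, the two paths diverge by more than $3 trillion in the tenth year alone, and the cumulative gap across the decade is far larger. That is the lever the argument rests on.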
An expanding economy does not permanently starve its own labor base. It reallocates it. The friction is the cost of progress. The end state is higher living standards.
Credit defaults? Economic contraction? Those are cyclical outcomes. They happen with or without AI. We had defaults in 1991. In 2001. In 2008. In 2020. None were caused by productivity booms. They were caused by financial excess, leverage, and policy error.
The idea that AI-driven productivity automatically leads to systemic default assumes that displaced workers remain permanently unemployable. History rejects that assumption. Skills change. Education adapts. Incentives shift. Humans respond.
There is another concept economists call non-satiation. Human wants are infinite. “More is good,” is a basic tenet of economics. When productivity frees resources, we do not stop consuming. We upgrade. We demand higher-quality healthcare, better entertainment, personalized services, new experiences, faster logistics, smarter homes. Entire value chains form around those demands.
The Bell Labs engineers I met were not trying to destroy employment. They were trying to build better tools. Those tools scaled the global economy. They did not shrink it. We are at a similar inflection point.
AI will change what it means to be competitive. It will reward adaptability. It will punish complacency. It will expose skill mismatches. Some roles will disappear. Others will be born. The distribution of income may shift. Policy debates will intensify. That is the tremor.
But the direction of travel is clearly that of expansion, and I am firm on this. AI is not a zero-sum substitution machine. It is a productivity amplifier. Productivity is the engine of prosperity. Prosperity creates opportunity. Opportunity creates jobs.
When you grow up, what do you want to be? The honest answer is that we do not fully know yet. And that is precisely the point. The most valuable jobs of 2035 likely do not have clear titles today. Just as I could not have imagined cloud computing from a freezing basement data center, many cannot yet see the contours of AI-based industries.
If I only knew. What I do know is this: betting against human adaptation has historically been a losing trade. Betting on productivity has historically been a winning one. This is not the end of work. It is the beginning of a different kind of work. And that, in my view, is a boom.
YESTERDAY’S MARKETS
Stocks staged a comeback yesterday as dip-buyers thought better of the doom and gloom offered by the weekend report that caused markets to spin out on Monday. Treasury yields declined, Bitcoin slipped, and Gold stayed strong above $5,000.
NEXT UP
No economic numbers, but we will hear from Barkin, Schmid, and Musalem today.
Important earnings today: Lowe’s, Medline, Owens Corning, Circle, TJX, NVIDIA, Synopsys, VICI Properties, Chime Financial, Zoom Communications, IonQ, Pure Storage, Salesforce, Paramount Skydance, Snowflake, and Agilent.