The capitalists have already won
Nvidia controls >90% of the datacenter GPU market
One of the biggest anxieties people have, in the long run, is about capital concentration. This is the idea that wealth will end up in the hands of the few and, thanks to strong property rights and the way power distorts democratic institutions, stay there forever.
Data centers are ludicrously expensive. Not just to build, but to operate. They consume huge amounts of power, water, and bandwidth, which is why they tend to cluster along internet backbone rails (see map below). Data center construction spending hit nearly $13B by June of this year alone, and total expenditure for 2025 is expected to land around $60B. That’s not pocket change for anyone. We the people can’t even play in that league. The best we can do is invest in Microsoft, Google, Nvidia, and Amazon and hope for the best. But the possibility of the masses ever controlling those resources, let alone building them, is a pipe dream.
This is just the nature of capital.
As in bygone eras, when a peasant could not buy a large farm and a shop worker could not buy a large factory, the big capital assets are out of ordinary reach. Data centers are this era’s primary capital asset. Whatever the current economic paradigm (agrarian, manufacturing, services), just look at the biggest companies. During the industrial revolutions, it was the steel, rail, and oil tycoons. They were the richest. Right now? It’s the Magnificent 7, which is not a racehorse name, but the seven biggest tech companies.
Don’t get me wrong, I am a big believer in Why Nations Fail and believe in strong property rights—without those, you don’t get data centers at all. That is, unless the State builds them, but that’s arguably far worse for society in both the short and long run.
So where does that leave the rest of us?
There are those who say “ban it, burn it all down,” ostensibly under the banner of “AI safety,” though to me it looks more like crab mentality: if I can’t control it, no one can have it!
To date, generative AI has shown no signs of becoming homicidal.
And while I may be a bit cynical about these folks, most of them are non-technical. They have no relevant education or experience with AI, and unless they have a crystal ball they aren’t telling anyone about, they cannot predict the future any more accurately than the rest of us. In fact, when you look at their track record, it’s pretty abysmal.
If AI isn’t about to nuke humanity out of existence, then what’s the next biggest threat? What is actually observable and measurable?
To me, it’s capital concentration as well as capital intensification.
This is what we can literally see when Sam Altman gets billions of dollars of capital to build Stargate. This is what we see when Nvidia becomes a multi-trillion dollar company for making AI chips.
So where does this leave us?
A few of my friends are extremely worried about this future. Power begets more power. Wealth begets more wealth. Inequality reigns supreme. This is, indeed, the default attractor state. High tech, low life. We could very reasonably end up with the “Standard Oil of AI”—back in the day, Sam Altman thought it would be him, when he said that AI was going to generate $100T in economic value and that OpenAI could “capture most of it.”
Follow the money, amiright?
The way I see it, there are a few possible future attractor states. And keep in mind: many of these are not mutually exclusive.
Data Fortresses—I borrowed this term from Cyberpunk 2077, but the meaning is pretty obvious. Instead of “data centers,” the Big Tech and Big Iron shops fortify their positions: literally, financially, and legally. In any civilization that respects property rights, civil liberties, and other freedoms, this is an inevitable outcome. Some level of wealth concentration is going to happen.
Ubiquitous Compute—On the polar opposite end of the spectrum is the idea of open source dominating: local AI on your own hardware. We’re getting pretty close to this already. OpenAI released GPT-OSS, an open-weight reasoning model that can run on a high-end laptop. Eventually, who knows, maybe ruthless competition will produce powerful AI chips and free models to the point that there’s no real moat around HPC clusters (“high-performance compute,” i.e., the GPUs and specialized hardware used to train and run AI).
Hybrid Models—Reality will probably land somewhere in between. Chinese labs are releasing popular open source models, as are the big American shops, which means some AI will be free for everyone. Some people will remain “AI poor,” though, if they can’t afford even the hardware and power. Some level of regulation will step in and shape the landscape, though the US’s current policy is explicitly “we will not regulate AI,” which, for now, I tend to agree with.
Does this completely solve capitalism? Absolutely not. It’s a far larger problem than can be solved in a blog post. That’s why I’m working on my book THE GREAT DECOUPLING, which covers the compounding effects of capital concentration and capital intensification, and what we can do about them.
Is it guaranteed that we can find a way out of a dystopian cyberpunk future? No. Nothing is ever guaranteed.
Do I see a viable path forward? Absolutely. I do think it is possible to solve this problem, and navigate towards a more utopian, generative, and abundant future.
Libertarian Dave Smith said it himself: “No one hates the free market more than Capitalists.”