Musk and Altman Say Money Is Dying. They're Raising $100 Billion Anyway.
Inside the race to convert cash into compute, energy, and orbital real estate before the currency expires.
Elon Musk says money may not survive artificial general intelligence. In his words, “I think long term, I think money disappears as a concept, honestly... In a future where anyone can have anything, you no longer need money as a database for labor allocation.” Sam Altman’s camp has been more lawyerly, but not less radical. In an investor warning, OpenAI wrote that “It may be difficult to know what role money will play in a post-AGI world.”
Then came the part I couldn’t stop staring at.
In February 2026, OpenAI reportedly closed a $100 billion funding round at a valuation exceeding $850 billion. That is an almost absurd amount of capital for a company openly telling investors that the thing they’re handing over may become hard to value, or even conceptually unstable, if the company succeeds too well.
At first glance, that looks like hypocrisy. But I think it’s something more interesting.
If the people closest to the AGI race really believe money becomes less important, then the obvious move is not to hoard money. It’s to convert money into whatever replaces money as the medium of power. That changes the whole story. The contradiction is only superficial. Underneath it sits a coherent strategy.
So I kept asking a simple question: if money is headed for obsolescence, what exactly are Musk and Altman buying with all of it?
The answer, as far as I can tell, is control over the scarce physical inputs of a post-labor economy. Compute. Energy. Launch capacity. Orbital position. The rhetoric is about abundance. The capital allocation is about ownership.
Money is a claim on labor
To understand why this argument has a certain internal logic, I think you have to start with a very old idea. Money is not just paper, digits, or central bank plumbing. At its core, money is a social claim on other people’s time.
When I spend $10, I’m not really trading a green piece of paper for an object. I’m exercising a claim on labor that has already been performed, or will be performed to replace what I consume. The cashier’s time. The truck driver’s time. The refinery worker’s time. The software engineer’s time. The warehouse worker’s time. Money works because other human beings still have economically valuable hours to exchange.
That sounds abstract until you apply it to something stupidly ordinary, like a $3.29 bag of Doritos.
That bag contains the labor of farmers growing corn, workers processing oils and flavorings, people building and maintaining tractors, petroleum workers producing packaging inputs, shipbuilders moving goods, warehouse operators routing inventory, truckers delivering product, and even the people who keep GPS satellites and telecom systems functioning so the logistics chain doesn’t fall apart. A trivial snack is really a compressed global labor contract.
This is why the deflation examples matter. They show what happens when labor embedded in a product collapses.
Research from the National Bureau of Economic Research (NBER) captures this dynamic clearly. Long-distance phone calls once cost $3 to $5 per minute in the era of manual switching and expensive network infrastructure. Over time, digitization, compression, and internet delivery drove that cost toward near zero. The labor claim embedded in the call largely vanished.
The same pattern hit music. The same NBER research notes that 1,000 songs could cost more than $10,000 on vinyl and CD in 1985. Once music became digital files, the marginal cost of replication became negligible. Again, the labor claim collapsed.
We tend to describe these as stories about cheaper technology. They are. But they’re also stories about shrinking labor content. Once the human effort needed to reproduce something falls close to zero, the price follows it down.
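To make the scale of that collapse concrete, here is a quick arithmetic sketch using the NBER figures quoted above (the per-song breakdown is my own illustration, not a figure from the research):

```python
# Back-of-envelope deflation arithmetic using the figures cited above.
# 1985: roughly 1,000 songs on vinyl and CD cost more than $10,000.
songs = 1_000
library_cost_1985 = 10_000  # dollars, a lower bound per the NBER figure
cost_per_song_1985 = library_cost_1985 / songs
print(f"Cost per song in 1985: ~${cost_per_song_1985:.2f}")  # ~$10.00

# Once music became digital files, replication cost collapsed toward zero,
# so nearly the entire per-song labor claim simply evaporated.
marginal_cost_today = 0.0  # negligible cost of copying a file
erased_claim = cost_per_song_1985 - marginal_cost_today
print(f"Labor claim erased per song: ~${erased_claim:.2f}")  # ~$10.00
```

The point is not the precision of the inputs but the shape of the curve: when the labor needed to reproduce something goes to zero, the price follows.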
Now push that logic much further.
If AGI plus robotics can perform not just some labor, but most economically relevant labor, then the old foundation of money starts to wobble. If machines can design, negotiate, code, diagnose, manufacture, transport, and even coordinate supply chains with minimal human input, then what exactly is a dollar claiming? Not much human effort. Mostly access to raw materials, land, power, and machine time.
That’s the logic chain Musk is working from. And to be fair to him, he states it plainly. He isn’t talking about a mystical end of economics. He’s talking about abundance overwhelming the old allocation mechanism.
I don’t know if that future arrives on the timelines being priced in. Nobody honest does. AGI by 2029 or 2030 is still a forecast, not a settled fact. Progress in AI has been rapid, but plateaus are not imaginary. This is where the story gets uncertain.
Still, the framework matters because it tells you how the protagonists themselves may be thinking. If money is a claim on labor, and labor is heading toward radical devaluation, then money’s importance falls with it.
And that leads directly to the next question. If money stops being the main scarce asset, what replaces it?
The answer is already visible in where the capital is flowing.
The new scarcity is physical
The cleanest mistake in a lot of AGI commentary is the assumption that post-scarcity means no scarcity. It doesn’t. It means scarcity migrates.
If intelligence becomes abundant, then intelligence stops being the bottleneck. Something else becomes the bottleneck instead. In practice, that means compute, electricity, raw materials, and physical space. Not metaphorical space. Real space. Land. Grid access. Cooling. And maybe, if Musk is right, orbital real estate.
This is where his behavior becomes more revealing than his rhetoric.
Just days before Musk laid out his case for space-based AI economics, his empire made the strategic move that clarifies the whole thesis. Reuters reported that SpaceX acquired xAI on February 2, 2026, creating a combined entity valued at roughly $1.25 trillion. That number deserves a pause. One person now sits atop a trillion-plus structure that combines the world’s most important private launch company with a frontier AI developer.
The official language was revealing. The announcement described the combination as “xAI joins SpaceX... vertically integrated innovation engines on (and off) Earth.”
On and off Earth. That’s not branding fluff. That’s strategy.
Musk’s argument is that Earth is becoming a constrained environment for frontier AI. In his discussion of infrastructure bottlenecks, he pointed to the fact that US average power consumption is about 0.5 terawatts, while power growth outside China has been broadly flat. If compute demand keeps exploding, the limiting factor is no longer model architecture alone. It’s whether you can physically feed the machines.
That is why his comments about solar in orbit matter. He said, “Any given solar panel can do about five times more power in space than on the ground.” The claim is simple enough to understand. No weather. No night cycle in the same way. Different operating conditions. More continuous energy capture.
Then he made the timeline explicit:
“In 36 months... the most economically compelling place to put AI will be space.”
I have a view on that, but I hold it loosely. The timeline sounds extraordinarily aggressive. Orbital data centers are still more thesis than mature industry, and a lot of technically literate observers think the idea is closer to a decade-scale project than a three-year one. But that’s almost beside the point. The important thing is that Musk is not merely talking about post-money abundance. He is using present-day money to buy control over the infrastructure stack that would matter if his view is right.
The scale of the ambition is staggering. In the same discussion, he projected 100 gigawatts of space AI compute by 2030 and suggested that reaching that kind of system would require 1 terawatt per year of launch capacity. Even if those numbers prove far too optimistic, they tell you what game he thinks he’s in. The ambition is to lock down the next scarcity regime.
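Those projections are easier to judge when set against the consumption figure Musk himself cites. A quick sanity check (all inputs come from the quotes above; the 5x solar multiplier is Musk's claim, not a measured constant, and the comparisons are my own illustration):

```python
# Back-of-envelope scale check on the space-compute projections.
us_avg_power_tw = 0.5            # US average consumption, per Musk's figure
space_ai_target_gw = 100         # projected space AI compute by 2030
launch_capacity_tw_per_year = 1  # claimed launch requirement for that system
orbital_solar_multiplier = 5     # "about five times more power in space"

# 100 GW of space compute measured against total US average consumption:
share_of_us = (space_ai_target_gw / 1000) / us_avg_power_tw
print(f"100 GW is {share_of_us:.0%} of US average power consumption")  # 20%

# The claimed launch requirement dwarfs even the compute target itself:
ratio = launch_capacity_tw_per_year * 1000 / space_ai_target_gw
print(f"1 TW/yr of launch capacity is {ratio:.0f}x the 100 GW target")  # 10x

# Ground-equivalent solar: if panels really do 5x in orbit, 100 GW up there
# stands in for roughly 500 GW of ground-based capacity.
ground_equiv_gw = space_ai_target_gw * orbital_solar_multiplier
print(f"100 GW in orbit ~ {ground_equiv_gw} GW of ground solar")  # 500 GW
```

Even on his own numbers, the project is a meaningful fraction of an entire national grid, which is exactly why he frames launch capacity, not model quality, as the binding constraint.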
And that regime is physical.
On Earth, AI runs into permitting fights, transformer shortages, transmission constraints, water issues, local politics, and grids that do not expand at the pace software people expect. In orbit, at least in theory, you get vast solar input and a much less crowded map of property claims. What looks crazy in ordinary industrial terms starts to look rational if you believe intelligence will soon be cheap and power will be the choke point.
So yes, Musk talks like money may become conceptually obsolete. But he is very obviously treating money as urgently useful right now, because it can still be exchanged for launch systems, data centers, and strategic position before those things become even harder to buy.
He is converting financial capital into infrastructure capital.
Altman is doing something similar, though in a different register. Musk is buying hardware. Altman is raising money while openly hinting that money itself may not be the endgame.
That makes the OpenAI story even stranger.
OpenAI is selling a future that may dissolve the price tag
OpenAI’s February 2026 raise was not just large. It was historic in scale and weird in structure.
The reported terms were $100 billion at a valuation above $850 billion. To understand how fast that happened, look at the valuation path. OpenAI reportedly went from $157 billion in October 2024 to $300 billion in March 2025, then $500 billion in October 2025, and then $850 billion-plus in February 2026. More than a fivefold increase in roughly 16 months.
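The "more than fivefold" figure checks out. A short sketch of the implied growth rate, using the reported valuations (the 16-month window is a month-level approximation):

```python
# Implied growth rate of OpenAI's reported valuations, Oct 2024 -> Feb 2026.
valuations = {
    "2024-10": 157,  # $ billions
    "2025-03": 300,
    "2025-10": 500,
    "2026-02": 850,  # reported as "$850 billion-plus"
}

start, end = valuations["2024-10"], valuations["2026-02"]
months = 16  # roughly Oct 2024 to Feb 2026

multiple = end / start
monthly_rate = multiple ** (1 / months) - 1   # compound monthly growth
annualized = (1 + monthly_rate) ** 12 - 1     # same rate over 12 months

print(f"Total multiple: {multiple:.2f}x")             # 5.41x
print(f"Implied monthly growth: {monthly_rate:.1%}")  # 11.1%
print(f"Annualized: {annualized:.0%}")                # ~255%
```

No fundamental cash-flow story compounds at that pace; the rate itself is the evidence that buyers are bidding for position, not discounting earnings.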
That trajectory doesn’t resemble ordinary fundamental repricing. It looks like urgency. It looks like strategic bidding for position before the board gets crowded.
And yet the company was also warning investors, in plain language, that “It may be difficult to know what role money will play in a post-AGI world.”
I keep coming back to that sentence because it is one of the most honest things any company in this race has said. It is also, on its face, bizarre. Imagine raising the largest private round in history while telling backers the unit of account they are using may become philosophically unstable if your product works too well.
So why pay anyway?
Part of the answer is that these are not passive investors clipping coupons. According to reporting on the cap table and financing dynamics, key backers include Microsoft, which reportedly owns 27%, as well as SoftBank, Nvidia, and Amazon. These are not tourists. They are strategic actors buying influence over a possible future bottleneck.
The financials make clear that this is not a normal profitability story. Analyst projections cited in reporting on OpenAI’s economics point to cumulative losses of $143 billion from 2024 through 2029, including a projected $14 billion loss in 2026 alone. HSBC reportedly sees a $207 billion shortfall by 2030. Even with roughly $20 billion in annual recurring revenue, the cost structure remains in another universe.
That means investors are not underwriting a neat discounted cash flow model. They’re underwriting access. Optionality. Survival insurance. A seat at the table if AGI arrives quickly enough to reorder the economy before the income statement ever looks healthy.
One bull case, cited in reporting on investor thinking, came from Ethan Choi of Khosla Ventures, who indicated that “$10 billion revenue per gigawatt of compute is viable”. Maybe. But notice what even that argument implies. The path to justification still runs through massive compute scale. Not clever branding. Not marginal SaaS economics. Infrastructure again.
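It is worth running that bull case against the revenue figure cited earlier. A hedged sketch (the $10 billion-per-gigawatt ratio is Choi's claim, the roughly $20 billion ARR comes from the reporting above, and the resulting compute figures are my own illustrative inference):

```python
# What "$10B revenue per gigawatt of compute" implies at reported scale.
revenue_per_gw = 10  # $ billions per gigawatt, per the bull case
current_arr = 20     # $ billions, roughly, per reporting

# Compute scale that today's revenue would "justify" at the bull-case ratio:
implied_gw = current_arr / revenue_per_gw
print(f"Current ARR corresponds to ~{implied_gw:.0f} GW at that ratio")  # ~2 GW

# Conversely, filling a 100 GW buildout at the same ratio demands enormous revenue:
buildout_gw = 100
implied_revenue = buildout_gw * revenue_per_gw
print(f"100 GW at $10B/GW implies ${implied_revenue} billion in revenue")  # $1 trillion
```

Either direction you run the ratio, the argument resolves to the same place: the investment case lives or dies on physical compute scale.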
This is why I don’t think the contradiction is fake. OpenAI’s warning may read like legal disclosure, but it also contains a strategic truth. If AGI actually changes the role of labor, then conventional money returns stop being the only thing that matters. Investors may be paying for a claim on the systems that allocate intelligence in a world where intelligence is the base layer of production.
There is real uncertainty here. If AGI timelines slip badly, these valuations look fragile. If capability progress slows over the next two or three years, the repricing could be brutal. If infrastructure commoditizes the way cloud partially did, then no single company gets to own the future. Those are real tripwires, not footnotes.
But if the race continues at anything like its current pace, then the behavior makes sense. Musk and Altman are not acting like traditional executives maximizing quarterly profit. They are acting like people trying to secure position in a world where position matters more than currency.
Both are converting money into strategic control as fast as they can.
The question is what kind of world that creates for everyone else.
Abundance could still end in concentration
The utopian version of AGI is easy to picture. Intelligence gets cheap. Goods and services get cheap. Production explodes. Human drudgery collapses. Everyone gets more.
The harder question is who owns the machinery that makes that world run.
If everybody can command enormous productive capacity, then old status markers lose meaning. If everyone is rich in some nominal sense, then richness stops differentiating. But that does not mean power disappears. It means power relocates to whatever remains scarce and gatekept.
In a post-labor economy, what exactly does the average person trade for access to premium compute, energy allocation, bandwidth, launch, or scarce materials? If human labor has little market value, then the old wage loop breaks. I don’t have a confident answer, and I don’t think anyone else does either. The allocation mechanism for post-scarcity resources may turn out to be the biggest political question of the next era.
What I do know is that the ownership pattern already looks concentrated. After the SpaceX-xAI combination, Musk’s integrated vehicle sits at about $1.25 trillion. OpenAI’s latest financing pegs it above $850 billion. Two entities. Two central figures. Nearly $2.1 trillion in AGI infrastructure value.
That is before you count the strategic investors circling them.
The academic literature is not blind to this possibility. The NBER work on technological deflation and post-scarcity economics points toward a world where AGI can end labor scarcity while disproportionately enriching owners of capital. Separate research on AGI-driven inequality frames the danger in similar terms: capital ownership can bypass the need to distribute income through wages at all. Some scholars use the phrase techno-feudalism for a reason.
I don’t use that phrase lightly. But I understand why it keeps showing up. If a tiny elite owns the intelligence layer, the energy layer, the physical compute layer, the land, and the orbital access, then everyone else may enjoy cheap services while remaining structurally dependent. And dependency is power. The ultimate leverage is not charging a high price. It’s the ability to cut you off.
This is the paradox that gets buried under the abundance talk. AGI could make many things effectively free at the margin while making social power more concentrated than ever at the center.
That doesn’t mean this outcome is inevitable. Infrastructure could commoditize. Governments could intervene. Public or sovereign compute initiatives could emerge at scale. Competition from other tech giants could fragment the market. The picture here is murky, and honest observers disagree.
But the early ownership map matters. These systems are expensive enough that the entry barriers are already enormous. If the future depends on giant clusters, giant power deals, giant fabs, giant launch systems, and giant balance sheets, then the number of players who can shape that future stays very small.
Which brings me to the real synthesis.
What this actually means
The phrase “money becomes irrelevant” can sound mystical, or unserious, or like Silicon Valley performance art. I don’t think that’s the right way to read it.
What Musk and Altman are pointing toward, whether consciously or not, is a migration of power. Not from economics to utopia, but from one form of capital to another. From financial capital to infrastructure capital.
If money is a claim on labor, and labor becomes less scarce, then money’s role weakens. But the world does not become structureless. Power condenses around the things that still cannot be summoned instantly: compute capacity, energy generation, chip supply, raw materials, launch access, and maybe orbital positioning.
That is exactly where the money is going now.
Musk is the clearest case. He is taking present-day capital and converting it into a vertically integrated stack that spans AI, rockets, and potentially off-world energy and compute. OpenAI is doing the same in a more financialized form, using extraordinary investor appetite to lock in scale before anyone can cheaply replicate it. Their major backers, from Microsoft to SoftBank to Nvidia to Amazon, are not just funding research. They are buying adjacency to the bottleneck.
So I don’t read the current behavior as hypocrisy. I read it as consistency.
They may actually believe that ordinary money matters less in a post-AGI world. Which is precisely why they are spending so aggressively to acquire what might matter more.
Who benefits if they’re right? The entities that control scarce post-AGI inputs. Frontier model operators. Cloud-scale compute owners. Energy suppliers. Chipmakers. Launch providers. Investors with strategic claims on those systems.
Who gets pressured? Anyone whose income depends mainly on selling labor, or labor-adjacent services, into a market where machine intelligence keeps getting cheaper. Traditional financial assets tied to currency claims could also become harder to think about if the labor foundation beneath the currency weakens dramatically. That’s still theoretical. But it is no longer a silly question.
What would change my view? A meaningful AI plateau over the next few years would matter. So would existential financial stress at OpenAI or SpaceX-xAI before AGI arrives. So would rapid commoditization, with multiple competitive providers making infrastructure cheap and widely distributed. And Musk’s own timeline gives us a clean test: if his 36-month claim about space becoming the most economically compelling place for AI passes without meaningful orbital compute deployment, that would tell us something important about how much of this thesis is engineering and how much is aspiration.
I wouldn’t watch the rhetoric first. I’d watch the buildout.
Watch energy permitting. Watch grid access. Watch giant compute clusters. Watch launch cadence. Watch whether any real system of compute credits or resource allocation starts to emerge. That’s where the politics of post-scarcity will live.
I don’t know if money becomes irrelevant after AGI. Sam Altman has effectively admitted he doesn’t know either. Musk sounds more certain, but certainty is cheap and orbital data centers are not.
What I do know is what these men are doing with money while it still works. They are converting it into compute, energy, launch capacity, and strategic control as fast as possible. A concrete behavioral signal, not an abstract philosophy.
And behavioral signals matter more than grand theory.
Watch what they buy when they think the currency is expiring.