Summary: Information Access & Transparency Undermined
The CNBC article URL about OpenAI's spending projections is inaccessible to most readers due to a paywall/authentication wall, directly impeding Article 19 (freedom to seek and receive information) and Article 12 (privacy protections through information control). The structural barrier to content access undermines transparency about technology development spending, a matter of significant public interest.
> OpenAI is projecting that its total revenue for 2030 will be more than $280 billion
For context, that is more than the annual revenue of all but 3 tech companies in the world (Nvidia, Apple, Google), and about the same as Microsoft.
OpenAI meanwhile is projected to make $20 billion in 2026. So a casual 1300% revenue growth in under 4 years for a company that is already valued in the hundreds of billions.
Must be nice to pull numbers out of one's ass with zero consequence.
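Snark aside, the growth arithmetic itself is easy to check (the 2026 and 2030 figures are from the comments above; the implied annual growth rate is my own derivation):

```python
# Sanity check: $20B (2026) projected to $280B (2030).
rev_2026 = 20e9
rev_2030 = 280e9
years = 4

multiple = rev_2030 / rev_2026        # 14x in four years
growth_pct = (multiple - 1) * 100     # 1300% total growth
cagr = multiple ** (1 / years) - 1    # implied compound annual growth rate, ~93%/yr

print(f"{multiple:.0f}x, {growth_pct:.0f}% growth, {cagr:.0%}/yr compounded")
```

Sustaining roughly 93% compounded annual growth for four straight years, at this revenue base, is the claim being made.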
It’s interesting that they felt the need to leak this to the press.[0] Some investors or partners (or LPs, board members, etc. of those) are getting spooked by the spending plans and rightfully questioning if the return is there. Putting it in public may feel like a stronger commitment (though I doubt it).
Even with the revised numbers, I cannot believe that they’ll have $280bn in revenue by 2030.
[0]: You can tell by the reason the sources are granted anonymity: because the information is private, not because they aren’t authorized to speak on the matter
These numbers were always out of line with basic infrastructure constraints. People were talking like the US would build 50 new nuclear power plants in 10 years. And I believe we will not see $600B either, there are basic infrastructure, permitting, and power delivery limits.
This is more complicated than just hand-wavy resets of spending expectations. Other companies were taking these “commitments” and gearing up for capital investments to meet all that demand, which is now vaporizing. That creates a big mess as the AI hype machine starts to unravel.
This looks very much like a careful move to deflate the bubble without popping it, but we’ve likely passed that point.
The market is spooked by capex projections generally. Interesting that Microsoft, despite some apparent hesitation in 2025, seems to be still going all in on AI spend over the next several years according to the most recent earnings call.
A trillion here, a trillion there and all the AI companies are also telling us they're planning on wiping out 2/3 of jobs in the next 10 years? Nothing about the economics of the AI boom makes any sense.
I'm not saying it's not possible, but if we wipe out 2/3 of jobs with AI, who is going to be buying *all the stuff*?
Unemployed people aren't much of a demographic, and you can't just say UBI because that doesn't make sense either. You think the billionaires are going to allow themselves to be taxed heavily enough to support UBI just so that there's a market for people to buy stuff from them? That's nonsense.
Not trying to creep anybody out, but I just don't see a stable outcome for a society that doesn't need 2/3 of the population.
This article is bad. It is mixing up capex and opex. OpenAI is projecting more spending on compute through their income statement now than they were 6 months ago.
> After previously boasting $1.4 trillion in infrastructure commitments, OpenAI is now telling investors that it plans to spend $600 billion by 2030.
does the word "commitment" have a different meaning in this context? How do you cut a commitment by more than 50%? OpenAI's partners are making decisions based on the previous commitment because... OpenAI committed to it. I must be completely wrong, because how does this not set off a severe chain reaction?
edit: as others have pointed out, the article is misleading. $1.4T was over 8 years or by 2034. 2030 is halfway to 2034 and $600B is not too far from half of $1.4T.
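A rough linear proration supports that reading (assuming, as the edit above does, that the $1.4T commitment spans 2026 through 2034):

```python
# Hypothetical linear proration of the $1.4T commitment over its assumed 8-year span.
total_commitment = 1.4e12
span_years = 2034 - 2026         # 8 years
elapsed_by_2030 = 2030 - 2026    # 4 years

prorated = total_commitment * elapsed_by_2030 / span_years  # $700B by 2030
print(f"${prorated / 1e9:.0f}B prorated vs the $600B figure in the article")
```

$600B by 2030 is only about 14% below the straight-line pace of the original commitment, not a >50% cut.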
What do we think? Is this possible without AGI level breakthroughs?
If we see a continuation or even a slowdown of the current trend, the technology overhang, lagging productization, and catch up from the slow adoption of AI by businesses probably gets them part of the way there, but I don't know about 1000% growth at this point... Seems kinda like they're banking on another breakthrough no? And if they don't get the breakthrough, the downside risks such as a competitor of some sort destroying their margin can't exactly be ignored...
I'm not convinced that companies venturing into the unknown really know more than anyone else, they just survive or don't. I've no idea what OpenAI is up to and honestly the public actions of Sam & Co seem like they feel kinda insecure about their position... whatever that position is.
I like the little blurb at the end which said that Codex had 1.5 million users. So, if you can get each of them to pony up a mere $186k a piece, they can hit those revenue numbers.
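To spell out that back-of-envelope division (the $186k figure is just the projected revenue spread evenly over the reported user count):

```python
# Revenue target divided across the Codex user count cited in the article.
revenue_target = 280e9    # projected 2030 revenue
codex_users = 1.5e6       # Codex users reported in the article

per_user = revenue_target / codex_users  # ~$186,667 per user
print(f"${per_user:,.0f} per user")
```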
However, we are all going to be paying higher energy costs for these ridiculous infrastructure claims. Utilities typically price out energy three years in advance. If they were projecting for twice as many energy sinks, that represents an enormous amount of generation capacity which needs to be accounted for in projections.
I saw a report that previous capacity pricing was $28/MW-day. Latest numbers have shot up to $300/MW-day.
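For scale, the quoted jump is roughly an order of magnitude (figures as reported in the comment above, which I have not independently verified):

```python
# Ratio of the quoted capacity auction prices.
old_price = 28.0    # dollars per MW-day, previous auction
new_price = 300.0   # dollars per MW-day, latest reported

increase = new_price / old_price  # ~10.7x
print(f"{increase:.1f}x increase in capacity pricing")
```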
> I'm not saying it's not possible, but if we wipe out 2/3 of jobs with AI, who is going to be buying all the stuff?
Money is just a proxy for access to resources. If a machine that is capable of replacing almost all jobs is really created then money will matter much less than access to said machine.
Taken to the extreme to make the point: if you had a genie that could grant your every wish, what would you need money for?
I think TSMC laughed them out of the room when they announced the original numbers. So maybe there’s no reaction now because everyone already knew not to trust OpenAI’s promises.
I have used AI a bit, like it for a bunch of use cases. But god damn, these numbers are so big. Gotta wonder, are the returns even worth it? RAM prices up, electricity prices up, hard disk prices up… Maybe this is the price to pay for “progress”, but it sure is wild
> how does this not set off a severe chain reaction?
Just like you and me, Sam Altman can say anything he likes to say. To pump the investors' confidence, to make the US administration believe he's serious about AGI, or just to make himself feel good. It's not legally binding in any way.
You should never read it as "OpenAI committed to..." but as "Altman said these words..." and words mean very little today.
We're already there. Most of us have jobs that are just made up to fill the gaps after steam power and automation. In the future, we'll have jobs that fill up the AI gap. It's UBI, but more arbitrary so we can tell ourselves we're useful while group X is not.
It's not unforeseeable that the US demarcates Special Economic Zones without environmental oversight or labor regulations to speed up the construction.
The number of projects accessing OpenAI directly, who might only reach for OpenRouter once an alternative is desired, is unknowable (since OpenAI doesn't share usage statistics), but likely meaningful.
The number of tokens seen per model on OpenRouter is not a good measure of quality.
There are so many plausible explanations for why a particular model is or is not ranked in the top 10 by this metric.
Maybe people using OpenAI models are so happy that they don't care about other models and have no need for OpenRouter. Maybe OpenAI models produce fewer tokens, or are more expensive per token.
Your conclusion might be correct, but citing the number of tokens seen by OpenRouter is not very strong evidence.
OpenAI is a bet on LLMs replacing a large chunk of the labour force in whatever sector it’s best at replacing. It’s essentially looking to get companies to pay $5k-$10k a month to have coding agents replace the output of a single software engineer.
If the S-curve levels off below that level OpenAI will be an unsuccessful company.
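A rough sketch of what that bet implies, using the $5k-$10k/month price points from the comment above (the seat-count extrapolation is mine, not OpenAI's):

```python
# How many engineer-replacement "seats" would be needed to reach $280B/year
# at the hypothetical monthly price points quoted above.
revenue_target = 280e9

for monthly_price in (5_000, 10_000):
    seats = revenue_target / (monthly_price * 12)
    print(f"${monthly_price:,}/mo -> {seats / 1e6:.1f}M seats")
```

Even at the high end of that pricing, it takes on the order of 2-5 million paying seats, i.e. a large fraction of the world's professional software engineers, to hit the target from this product alone.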
I, too, can make $280B in revenue by 2030 (by selling $10 bills for $5 (as long as I bamboozle enough investors into giving me sufficient capital, of course)).
Easy - a greater portion of the world's resources can go toward the luxury market for the wealthy. This is already the trend.
It's dark but certainly not impossible to have a smaller and smaller group doing all the spending and keep spending the same, and to keep stability by force using technology.
I think the optimistic scenario is AI can do the jobs but humans don't become unemployed so the workforce is 1 lot of humans +2/3 that in AI. The humans are wealthier and can buy the stuff.
Like rather than Dilbert writing code, he gets promoted to pointy haired boss and manages an AI which writes the code.
build 1ad9551+j7zs · deployed 2026-03-02 09:09 UTC · evaluated 2026-03-02 11:31:12 UTC