October 8, 2025 - 6:15pm

Is Sam Altman a Steve Jobs-level genius? Or is he just a smart guy who struck it lucky and is now about to go shopping with the global capital markets’ credit cards?

His company OpenAI has this year gone on a $1 trillion spending spree, and earlier this week signed a deal with chipmaker AMD to buy chips with a total expected value of $200 billion. Last month, Altman’s company finalized a contract with computing power colossus Oracle to the tune of $300 billion, in a deal whose risks credit ratings agency Moody’s has already flagged. These agreements followed a January pledge from Altman’s Stargate initiative to pump $500 billion into AI infrastructure.

OpenAI wants to extend the advantage of its 700 million weekly users, betting that the future is more: more computing, more chips, more raw wattage to run those chips. For inspiration, Altman is looking to Amazon Web Services (AWS), which stole a march on cloud computing by dipping into Amazon's vast pockets 15 years ago.

There is also the question of the particular financial architecture OpenAI has deployed here, a bit like one of those recursive loops ChatGPT will occasionally send a user round. AMD wants to sell its new AI-ready chips, and OpenAI has committed to buying millions of them. AMD's stock price rose 24% on the announcement of the deal; and since part of the agreement involves OpenAI taking a shareholding in AMD, OpenAI now holds even more collateral, against which it could potentially borrow even more. Flywheels are fun on the way up, but when the flywheel reverses, all hell can break loose.
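
For the numerically minded, here is a minimal back-of-envelope sketch of how that loop compounds. Every figure except the reported 24% pop is an illustrative assumption, not a deal term: the stake size and the loan-to-value ratio are hypothetical.

```python
# Back-of-envelope sketch of the financing flywheel. Every input except
# the reported 24% one-day rise is a hypothetical, illustrative figure.

stake_value_before = 10e9  # assumed value of OpenAI's AMD stake, USD (not a deal term)
announcement_pop = 0.24    # AMD's reported one-day share-price rise on the deal
loan_to_value = 0.5        # assumed haircut a lender might apply to equity collateral

stake_value_after = stake_value_before * (1 + announcement_pop)
extra_collateral = stake_value_after - stake_value_before
extra_borrowing = extra_collateral * loan_to_value

print(f"Stake gains ${extra_collateral / 1e9:.1f}bn on the announcement alone,")
print(f"supporting roughly ${extra_borrowing / 1e9:.1f}bn of fresh borrowing.")

# The loop: buying commitments lift the stock, the stock lifts the collateral,
# the collateral funds more commitments. In reverse, the same arithmetic
# amplifies a fall.
```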

The true demand for AMD's chips is far from clear. They have all been marked to market against the expectation of future computing needs, which are, for now at least, a linear function of Altman's dreams. The plans call for 20 gigawatts of power: roughly the output of 20 new nuclear power stations, or three times the annual electricity needs of Singapore.
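
Those comparisons hold up on a napkin, assuming a typical reactor of about one gigawatt and Singapore consuming roughly 58 terawatt-hours a year, both rough public figures rather than numbers from the deals themselves:

```python
# Sanity check on the power comparisons. The reactor size and Singapore's
# consumption are rough public figures, not drawn from the article.

plan_gw = 20                 # OpenAI's planned draw, per the article
reactor_gw = 1.0             # typical output of a single nuclear reactor (approx.)
singapore_twh_per_year = 58  # Singapore's annual electricity use, roughly

hours_per_year = 24 * 365
singapore_avg_gw = singapore_twh_per_year * 1_000 / hours_per_year  # TWh -> GWh -> GW

print(f"Reactor equivalents: {plan_gw / reactor_gw:.0f}")           # ~20
print(f"Multiples of Singapore: {plan_gw / singapore_avg_gw:.1f}")  # ~3.0
```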

But could models actually become more efficient rather than bigger, as Chinese upstart DeepSeek seemed to show at the start of the year? Are we, as some suggest, already topping out on AI utility and intelligence? Or will the next iterations owe more to the traditional software built around the models than to the raw machine learning that is currently the most expensive element? A bet laid today must hold up for at least 20 years.

It’s easy to imagine that Altman simply has Silicon Valley fever. For him, the calculation is between godlike status and “well, we tried” — an asymmetric bet. For those of a nervous disposition, comfort can at least be taken from the thought that putting billions into developing more power and chip infrastructure will likely result in a solid consumer surplus, even if the private profits ultimately don’t accrue to OpenAI.

Both the railroad boom and the dotcom boom demonstrated the power of bubbles to build critical infrastructure that would ordinarily have taken decades to come together. If some foolhardy folks lose their shirts in the process, so be it. Either way, at least this will be one crash the savvy traders saw coming: a WeWork-style collapse tends to barrel in from left field. The market is already divided on the question, so the drawdown likely won't be one hideous cliff edge.


Gavin Haynes is a journalist and former editor-at-large at Vice.

@gavhaynes