Estimates of GPU or equivalent resources of large AI players for 2024/5
AI infrastructure numbers are hard to find with any precision. There are many reports of “[company] is spending $Xbn on infrastructure this quarter” and “[company] has bought 100k H100s” or “has a cluster of 100k H100s”, but when I went looking for an estimate of how much compute a given company has access to, I could not find consistent numbers. Here I’ve tried to pull together information from a variety of sources to get ballpark estimates of (i) as of EOY 2024, who do we expect to have how much compute? and (ii) how do we expect that to change in 2025? I then spend a little time discussing what that might mean for training compute availability at the main frontier labs.

Before going into this, I want to lay out a few caveats:

* These numbers are all estimates I’ve made from publicly available data, in limited time, and they are likely to contain errors and miss some important information somewhere.
* There are very likely much better estimates available from paywalled vendors, who can spend more time working through how many fabs there are, what each fab is likely producing, where the data centers are and how many chips are in each one, and other detailed minutiae, and so arrive at much more accurate numbers. This is not meant to be a substitute for that, and if you need very accurate estimates I suggest you go pay one of those vendors for the data.

With that said, let’s get started.

Nvidia chip production

The first place to start is with the producer of the most important data center GPUs: Nvidia. As of November 21st, after Nvidia reported its fiscal 2025 Q3 earnings,[1] calendar-year 2024 Data Center revenue for Nvidia looks to be around $110bn. This is up from $42bn in 2023 and is projected to reach $173bn in 2025 (based on this estimate of $177bn for fiscal 2026).[2] Data Center revenue is overwhelmingly driven by chip sales. 2025 chip sales are estimated at 6.5-7m GPUs, which will almost entirely be Hopper and Blackwell models. I have est
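To make the arithmetic behind these ballpark figures explicit, here is a minimal back-of-envelope sketch in Python. The revenue and unit numbers are the ones quoted above; the per-GPU price range used for the 2024 unit estimate is my own illustrative assumption, not a figure from any source, and the outputs should be read as rough sanity checks only.

```python
# Back-of-envelope cross-check of the revenue and unit figures quoted above.
# Inputs are either taken from the text or are clearly-labelled assumptions.

dc_revenue_bn = {   # Nvidia Data Center revenue by calendar year, $bn (from the text)
    2023: 42,
    2024: 110,
    2025: 173,      # projection, based on the ~$177bn fiscal-2026 estimate
}

gpu_units_2025_m = (6.5, 7.0)   # estimated 2025 GPU shipments, millions (from the text)

# Implied average revenue per GPU in 2025, assuming Data Center revenue is
# overwhelmingly chip sales (as noted above). Networking and systems revenue
# would push the true per-GPU figure somewhat lower.
for units_m in gpu_units_2025_m:
    implied_asp = dc_revenue_bn[2025] * 1e9 / (units_m * 1e6)
    print(f"{units_m:.1f}m GPUs in 2025 -> implied ~${implied_asp:,.0f} per GPU")

# Rough 2024 unit estimate under an assumed blended Hopper/Blackwell price
# (hypothetical $22k-$30k range, for illustration only).
for asp in (22_000, 30_000):
    units_2024_m = dc_revenue_bn[2024] * 1e9 / asp / 1e6
    print(f"Assumed ASP ${asp:,} -> ~{units_2024_m:.1f}m GPUs shipped in 2024")
```

Running this gives an implied 2025 figure of roughly $25-27k of Data Center revenue per GPU, and, under the assumed price range, somewhere in the region of 3.5-5m GPUs shipped in 2024; the point is only to show how the headline revenue and unit estimates hang together, not to pin down exact shipments.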