5 Comments

Thanks for this post — really interesting and helpful stuff.

I'm also not sure what to think about the consumer surplus issue, but here's one thought:

As argued in this post — https://www.adamsmith.org/blog/economics/why-were-very-much-richer-than-the-gdp-numbers-tell-us-we-are — I think it's plausible that consumer surplus is greater than the GDP impact. So even if your assumption that AI companies only capture 25% of consumer value is right, all of that consumer value might not translate into GDP. This estimate is also extremely imprecise, but I would lean towards thinking the industry's revenue is a better approximation of the GDP impact — so more like $1 trillion (which I think lines up a bit better with the other sources you reference?).
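To make the arithmetic concrete, here's a rough sketch: the 25% capture share is the assumption from the post, the ~$1 trillion revenue figure is my rough number, and everything else is illustrative.

```python
# Back-of-envelope sketch contrasting two ways of turning AI revenue into a
# GDP-impact figure. All numbers are rough/assumed, not measured.

revenue = 1.0e12        # very rough annual AI industry revenue, in dollars (assumed)
capture_share = 0.25    # the post's assumption: firms capture 25% of the value they create

# If all value created showed up in GDP, you'd gross revenue up by the capture share:
total_value_created = revenue / capture_share        # ~$4T, including consumer surplus

# But consumer surplus (the other 75%) mostly isn't measured in GDP, so the
# measured GDP impact may be closer to revenue itself:
gdp_impact_revenue_proxy = revenue                    # ~$1T

print(f"Total value created (incl. surplus): ${total_value_created / 1e12:.1f}T")
print(f"GDP impact, revenue proxy:           ${gdp_impact_revenue_proxy / 1e12:.1f}T")
```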

Of course, the revenue=GDP approximation assumes that all the revenues created by AI companies are "new" revenues, neither taking away from other (non-AI) companies' revenues nor adding to them. Both assumptions are unrealistic. Maybe a better way of getting at it would be to look at how much market-wide dividend futures have gone up over the past few years, and then make some assumptions about a) how much of that rise is due to AI and b) how much of AI industry revenue will come from existing public companies. But writing that out, I guess it's also not great.
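Very roughly, the calculation I have in mind would look something like this. Every number below is a made-up placeholder, just to show how assumptions a) and b) would combine:

```python
# Hypothetical sketch of the dividend-futures approach. None of these inputs
# are real estimates; they're placeholders showing the shape of the calculation.

sp500_annual_dividends = 600e9       # rough current annual S&P 500 dividends, in dollars (assumed)
rise_in_long_dated_futures = 0.10    # assumed rise in long-dated dividend futures over the past few years
share_of_rise_due_to_ai = 0.30       # assumption (a): fraction of that rise attributable to AI
share_of_ai_rev_public = 0.70        # assumption (b): share of AI revenue flowing through existing public firms

# Extra annual dividends the market now expects, attributed to AI:
ai_dividend_boost_public = sp500_annual_dividends * rise_in_long_dated_futures * share_of_rise_due_to_ai

# Gross up for AI activity happening outside existing public companies:
ai_dividend_boost_total = ai_dividend_boost_public / share_of_ai_rev_public

print(f"AI-attributed dividend boost (public firms): ${ai_dividend_boost_public / 1e9:.0f}B/yr")
print(f"Grossed up for private/new firms:            ${ai_dividend_boost_total / 1e9:.0f}B/yr")
```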

Here's a piece on looking at long-dated dividend futures if anyone's curious: https://www.federalreserve.gov/econres/notes/feds-notes/the-stock-market-real-economy-disconnect-a-closer-look-20201014.html

May 5 · Liked by Benjamin Todd

This is a really good way of looking at it; I wish I had thought of this.

$100B spent on AI chips is crazy. Who's spending all that money, and what are they using it on? I thought big AI models cost on the order of $100M. (Maybe that was the number a year ago, and by a year from now it will be much higher?)

author

Training the biggest models costs ~$200m now, but the labs are targeting $1bn runs roughly this year.

Then inference (rather than training) is the bigger use of FLOP.

Then you need to consider the trailing models and random people using ML to do random analyses.

It's plausible AI software *broadly considered* is already generating $100bn of revenue, which would make $100bn on chips less crazy, especially if you expect dramatic growth - it probably makes sense for AI chip spending to be a couple of years ahead of the revenues.
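As a rough illustration of that timing argument (the growth rate and lead time below are illustrative assumptions, not figures anyone has claimed):

```python
# Sketch of the "chip spending runs ahead of revenue" logic. Growth rate and
# lead time are illustrative assumptions.

current_ai_revenue = 100e9   # plausible AI software revenue today, broadly considered
annual_growth_factor = 2.0   # assumed: revenue roughly doubling each year
lead_years = 2               # assumed: chip spending anticipates revenue ~2 years out

expected_future_revenue = current_ai_revenue * annual_growth_factor ** lead_years  # ~$400B
chip_spend = 100e9           # ~$100B/year currently going into AI chips

print(f"Revenue expected in {lead_years} years: ${expected_future_revenue / 1e9:.0f}B")
print(f"Chip spend as a share of that revenue: {chip_spend / expected_future_revenue:.0%}")
```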

But yeah, it also wouldn't be surprising if revenues don't grow fast enough in the next few years and there's a crash in AI chip spending.

author

To be more concrete, a lot of asset managers use ML to trade. Google uses a lot of ML to optimise search, and has revenues of ~$300bn. Other tech companies are doing similar things. These uses are far from the leading models, but they could account for a significant amount of revenue.
