The AIpocalypse we should be worried about

We are fairly optimistic about the prospects for AI, albeit in some decidedly mundane places. But we are still in the early days of this transition, with many unknowns. We are aware that there is a strain of thinking among some investors that we are in an “AI Bubble”. The hard form of that thesis holds that AI is just a passing fad, and that once the bubble deflates the semis market will revert to the status quo of two years ago. Somewhere between the extremes of “AI is so powerful it will end the human race” and “AI is a useless toy” sits a much milder downside case for semis.

As far as we can gauge right now, the consensus seems to hold that the market for AI semis will be modestly additive to overall demand. Companies will still need to spend billions on CPUs and traditional compute, but will now also need AI capabilities, necessitating the purchase of GPUs and accelerators. At the heart of this case is the market for inference semis. As AI models percolate into widespread usage, the bulk of AI demand will fall in this area, the work of actually making AI useful to users. There are a few variations within this case. Some CPU demand will disappear in the transition to AI, but not a large share. And investors can debate how much of inference will be run in the cloud versus at the edge, and who will pay for that capex. But this is essentially the base case: good for Nvidia, with plenty of the inference market left over for everyone else as the overall market grows.

The downside case really comes in two forms. The first centers on the size of that inference market. As we have mentioned a few times, it is not clear how much demand there is going to be for inference semis. The most glaring problem is at the edge. As much as users today seem taken with generative AI, willing to pay $20+/month for access to OpenAI’s latest, the case for running that generative AI on device is not clear. People will pay for OpenAI, but will they really pay $1 more to run it on their device rather than in the cloud? How would they even be able to tell the difference? Admittedly, there are legitimate reasons why enterprises would not want to share their data and models with third parties, which would require on-device inference. On the other hand, this seems like a problem solved by a bunch of lawyers and a tightly worded license agreement, which is surely much more affordable than building up a bunch of GPU server racks (if you could even find any to buy). All of which goes to say that companies like AMD, Intel, and Qualcomm, which are building big expectations for on-device AI, are going to struggle to charge a premium for their AI-ready processors. On their latest earnings call, Qualcomm’s CEO framed the case for AI-ready Snapdragon as providing a positive uplift from mix shift, which is a polite way of saying limited price increases for a small subset of products.

The market for cloud inference should be much better, but even here there are questions about the size of the market. What if models shrink enough that they can be run fairly well on CPUs? This is technically possible; the preference for GPUs and accelerators is at heart an economic one, and change a few variables and CPU inference is probably good enough for many workloads. This would be catastrophic, or at least very bad, for expectations across all the processor makers.

Probably the scariest scenario is one in which generative AI fades as a consumer product. Useful for programming and authoring catchy spam emails, but little else. This is the true bear case for Nvidia, not some nominal share gains by AMD, but a lack of compelling use cases. This is why we get nervous at the extent to which all the processor makers seem so dependent on Microsoft’s upcoming Windows refresh to spark consumer interest in the category.

That all being said, we think the market for AI semis will continue to grow, driving healthy demand across the industry. Probably not as much as some hope, but far from the worst-case, “AI is a fad” camp. It will take a few more cycles to really find the interesting use cases for AI, and there is no reason to think Microsoft is the only company that can innovate here. All of which places us firmly in the middle of expectations: long-term structural demand will grow, but there will be ups and downs before we get there, and probably no post-apocalyptic zombies to worry about.

Photo by Google Bard