Sometimes we have ideas for posts that come from work we are doing. Other times, we start with the title image. This is one of those posts. We were recently listening to a pitch from a stealth-mode start-up designing an AI accelerator chip. In the middle of that meeting, we saw an image of a computer driving a car, with the accelerator pedal fully depressed, right off a cliff. The image was so vivid that clearly our subconscious was trying to tell us something.
There are now a dozen or so companies in the US designing chips specifically for “AI” workloads. There are a few dozen more in China, and of course all the hyperscalers have some version of this chip as well. When we lay out the landscape this way, the problem is clear – there are a lot of these chips on the market or coming to it. Is anyone going to buy them?
Our best guess is that the outcomes for many of these will not be great. We say this for several reasons.
The first is historical. This is not the first wave of AI chip start-ups; it is the third. The first came around 2017, when Google unveiled the TPU. There was another wave four years ago, and now the latest, spurred by ChatGPT. Almost none of the first wave still exists today – those companies were acquired, closed, or sit swirling in limbo. The second wave was not looking great either, but it was saved at the last minute by the excitement around ChatGPT. The third wave could do better, but it still faces the problems that hampered the previous two.
The second factor here is technical. Put simply, software is moving too fast. We have seen the same pattern repeated many times. A company unveils its plans for a new chip that, on paper, offers a meaningful performance advantage. But between that announcement and the chip getting taped out and into production, the software it needs to run has changed so much that the chip no longer has any performance advantage.
The third factor is commercial. Customers do not like to try out new chips. There are considerable costs in porting code to new designs, and no one likes to be a guinea pig. Add to this the fact that the pool of customers capable of deploying these chips in data centers is limited. Then add that all of those potential customers are working on their own chips. And finally, let’s not forget that everyone is competing with Nvidia, which sometimes seems unstoppable. Put simply, there are a lot of companies chasing a small serviceable market.
To be clear, we hate writing this post. We are big advocates for increased venture funding in US semis, and it pains us to criticize chip start-ups. Notice that we are not naming anyone in this piece. Maybe one or two companies can get to scale, and likely many more will be acquired by the big companies that badly need a new approach to their AI designs. And if someone gets it right – some combination of technical brilliance, an innovative business model, and luck – the rewards could be huge. Beyond that, though, this space is likely to prove very challenging.
Photo credits: Microsoft Copilot