Can we pick winners in AI’s memory obsession?
The current infatuation with artificial intelligence has fixated on the fastest processors that make AI possible, such as Nvidia’s GPU chips and competing efforts from Advanced Micro Devices and Intel.
It’s clear, however, that what will make or break AI going forward is memory: specifically, the circuits that hold the data on which data-hungry AI feeds.
The speed and energy efficiency of memory circuits such as DRAM have not kept pace with those of “logic” chips such as GPUs, leading to a disconnect: the faster the processors go, the more memory holds everything back.
Is that an investment opportunity? Reason dictates it should be. The problem of how to have really big, really efficient memory is one of the key challenges of the AI age. Herewith, some thoughts on how to play the memory race.