Will the world’s biggest AI chip find buyers?

COMPUTER brains used to be tiny rectangles that shrank with each new generation. No longer. These days Andrew Feldman, the boss of Cerebras, a startup, pulls a block of Plexiglas out of his backpack. Baked into it is a microprocessor the size of a sheet of letter paper. “It’s the world’s biggest,” he says proudly, rattling off its technical specs: 400,000 cores (sub-brains), 18 gigabytes of memory and 1.2trn transistors. That is, respectively, about 78, 3,000 and 57 times more than the largest existing processor from Nvidia, a big chipmaker.
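
The arithmetic behind those multiples is easy to check. Here is a rough sketch in Python, assuming the baseline is Nvidia's V100 chip, with about 5,120 cores, roughly 6 megabytes of on-chip memory and 21.1bn transistors; the article does not say which Nvidia processor it is comparing against.

    # Back-of-the-envelope check of the ratios quoted above.
    # Baseline figures assume Nvidia's V100 (an assumption, not stated in the text).
    cerebras = {"cores": 400_000, "memory_bytes": 18e9, "transistors": 1.2e12}
    nvidia_v100 = {"cores": 5_120, "memory_bytes": 6e6, "transistors": 21.1e9}

    for key in cerebras:
        print(f"{key}: roughly {cerebras[key] / nvidia_v100[key]:,.0f}x")
    # cores: roughly 78x; memory: roughly 3,000x; transistors: roughly 57x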

Cerebras—the name echoes cerebrum, the largest part of the brain, as well as Cerberus, the giant three-headed dog that guards the entrance to Hades—is leading a shift in semiconductors that was on full display at Hot Chips, an industry gathering at Stanford University where startups such as Cerebras and Habana, as well as giants such as Nvidia and Intel, showed off their new silicon wares earlier this week. Thanks to Moore’s law, which states that computer power doubles every two years at the same cost, cramming ever more transistors onto standard chips used to be the way to go. But with transistors now just dozens of atoms across, improvements have become less predictable. And with the spread of artificial intelligence (AI), demand for computing power has grown much faster than Moore’s law—by more than 300,000 times for certain applications between 2012 and 2018, according to some estimates. As a result, chipmakers are now dialling up performance by, among other things, increasing the size of the processors that inhale data to train AI services, such as facial recognition.
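
The scale of that gap is worth a moment’s arithmetic. Using only the figures in this paragraph, Moore’s law implies roughly three doublings over the six years from 2012 to 2018, an eightfold increase, whereas a 300,000-fold increase implies a doubling roughly every four months. A short Python sketch of the comparison:

    # Compare Moore's-law growth with the cited growth in AI compute demand, 2012-2018.
    import math

    years = 2018 - 2012
    moore_growth = 2 ** (years / 2)      # doubling every two years
    ai_growth = 300_000                  # figure cited for certain AI applications

    months_per_doubling = years * 12 / math.log2(ai_growth)
    print(f"Moore's law over {years} years: about {moore_growth:.0f}x")
    print(f"AI demand over the same period: {ai_growth:,}x, "
          f"a doubling roughly every {months_per_doubling:.1f} months")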

Cerebras has pushed this approach to the limit: its chip is the biggest that can be cut from the largest available wafers, the round sheets of silicon into which transistors are etched. To get there, the firm had to overcome several technical hurdles. One is defects: every wafer has some, so Mr Feldman’s team had to find a way to route around faulty cores. Another is cooling: water pumped through tiny pipes carries away the considerable heat that the cores generate. And that is not all: Cerebras has also built a specialised computer for its new chip that it claims will deliver 150 times more number-crunching power than the best server based on graphics processing units, today’s AI workhorses.
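
How a chip can tolerate the inevitable flaws is easier to see with a toy example. The sketch below is a generic illustration of routing messages around faulty cores on a two-dimensional mesh; it is not Cerebras’s actual redundancy scheme, and the mesh size, core coordinates and fault positions are invented for illustration.

    # Toy example: breadth-first search for a path between two cores on a
    # small mesh, avoiding cores marked as faulty. Not Cerebras's real scheme.
    from collections import deque

    def route(width, height, faulty, start, goal):
        frontier = deque([(start, [start])])
        seen = {start}
        while frontier:
            (x, y), path = frontier.popleft()
            if (x, y) == goal:
                return path
            for nxt in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
                if (0 <= nxt[0] < width and 0 <= nxt[1] < height
                        and nxt not in faulty and nxt not in seen):
                    seen.add(nxt)
                    frontier.append((nxt, path + [nxt]))
        return None  # no defect-free path exists

    # A 4x4 mesh with two faulty cores; the message detours around them.
    print(route(4, 4, faulty={(1, 1), (2, 1)}, start=(0, 0), goal=(3, 3)))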

At Hot Chips, attendees were suitably impressed when Cerebras presented its new processor. But the biggest hurdle for the company may be economic, not technical, says Linley Gwennap of Microprocessor Report, an industry newsletter. The firm has to convince big providers of cloud computing, such as Amazon Web Services, Microsoft Azure and Google Cloud, that it is worth their while, for instance in terms of power consumption, to use Cerebras’s computers instead of machines packed with Nvidia chips. Another question is whether other firms with huge demand for computing power, including banks and oil majors, will want to buy such AI supercomputers instead of having their data crunched in the cloud. Don’t be surprised if Cerebras is taken over by a bigger firm, be it another chipmaker or a computer vendor—just like other AI-chip pioneers before it.

This article was originally published at https://www.economist.com/business/2019/08/19/will-the-worlds-biggest-ai-chip-find-buyers.
