Chip to accelerate artificial intelligence

The field of AI has seen a remarkable surge of progress recently. Software is becoming much better at understanding images and speech, and it is quickly learning how to play games. Nvidia, the company whose hardware has underpinned much of that progress, has created a new chip to keep the momentum going. The chip, called the Tesla P100, is designed to put more power behind a technique called “deep learning”, which is responsible for most of the recent major advances, including Google's AlphaGo, the software that defeated one of the world's top Go players.

The P100 is designed to let computer scientists pass more data through their artificial neural networks, or to build larger collections of virtual neurons. Passing data through large collections of crudely simulated neurons is the core of deep learning, so more powerful hardware means models can get bigger and train faster. Artificial neural networks have been around for decades, but deep learning has only taken off in the last four or five years, after researchers discovered, somewhat by accident, that chips designed to handle videogame graphics made the technique much more powerful. Deep learning at today's scale would not be practical without powerful graphics processors, yet until the P100 no chip had actually been designed with deep learning in mind.
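To make the idea concrete, here is a minimal sketch, not taken from Nvidia's announcement and with arbitrary layer sizes, of what “passing data through crudely simulated neurons” looks like in code. The work reduces to large matrix multiplications plus simple nonlinearities, which is exactly the kind of arithmetic graphics chips, and now the P100, accelerate.

    import numpy as np

    # Toy two-layer network; the sizes below are illustrative assumptions.
    rng = np.random.default_rng(0)

    inputs = rng.standard_normal((64, 100))      # a batch of 64 examples, 100 features each
    w1 = rng.standard_normal((100, 256)) * 0.01  # weights of the first layer of simulated neurons
    w2 = rng.standard_normal((256, 10)) * 0.01   # weights of the second (output) layer

    def forward(x):
        # Passing data through the network is repeated matrix multiplication
        # followed by a simple nonlinearity.
        hidden = np.maximum(0, x @ w1)   # ReLU activation of the hidden neurons
        return hidden @ w2               # raw scores from the output neurons

    scores = forward(inputs)
    print(scores.shape)                  # (64, 10)

Scaling this same arithmetic to far larger layers and far more data is what chips like the P100 are built to speed up.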

Nvidia says it spent more than $2 billion on R&D for the new chip, which packs a total of 15 billion transistors, roughly three times as many as its previous chips. The company claims this should let artificial neural networks learn up to 12 times faster than they could on its older chips. Nvidia has already granted early access to the P100 to deep learning researchers at Facebook, Microsoft and other companies involved in the field. Nvidia's CEO said he expects the chip to be in full production by the end of this year, with cloud computing companies using it by the end of next year, and he expects IBM, Dell and HP to sell it inside servers starting in mid-2017.

Along with the new chip, Nvidia has built a special computer for deep learning researchers, called the DGX-1, which combines eight P100 chips with the necessary memory chips and flash storage. Some leading academic research groups, including those at the University of California, Berkeley, Stanford, New York University and MIT, are to be given the machine, and it will go on sale in mid-2017 priced at $129,000. The point of both the chip and the computer is to make machine learning software faster and more capable. That requires massive amounts of computational power, and with that power harnessed, machines can now recognize objects and translate speech in real time. In other words, artificial intelligence is finally getting smarter.