Intel to release 'Nervana' neuromorphic chip by December
Firm collaborating with Facebook to develop the chip's AI capabilities
Intel has announced what it describes as the "industry's first neural network processor".
The microchip giant says it will ship the Nervana Neural Network Processor (NNP) by the end of the year, and that it is collaborating with Facebook to develop its AI capabilities.
In a blog post, Intel CEO Brian Krzanich says the new NNP will enable companies to "develop new classes of AI applications", and goes on to make familiar but vague promises for the technology as a whole, including benefits for healthcare, social media, the automotive industry and weather forecasting.
A more concrete goal mentioned by Krzanich, but one that is perhaps just as hard to measure, is an aim set out by Intel last year: to increase the company's AI performance 100-fold by 2020.
Neuromorphic chips attempt to model the workings of the human brain, in which information captured by billions of sensory receptors is processed in parallel by neurons and synapses. Over time the connections between neurons alter according to their inputs; that is, they learn from experience.
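As a rough illustration of "connections altering according to their inputs", the sketch below applies a Hebbian-style update in which a synaptic weight strengthens when the input and neuron it connects are active together. It is a toy in plain Python/NumPy with invented sizes and a made-up learning rate; it describes the general principle, not Intel's or anyone else's hardware.

```python
import numpy as np

# Toy Hebbian update: an illustrative sketch of connections strengthening
# when pre- and post-synaptic activity coincide. Sizes and the learning
# rate are invented for this example, not taken from any chip design.

rng = np.random.default_rng(0)

n_inputs, n_neurons = 8, 4
weights = rng.normal(scale=0.1, size=(n_inputs, n_neurons))  # synaptic strengths
learning_rate = 0.01

for _ in range(100):
    x = rng.random(n_inputs)      # sensory input (pre-synaptic activity)
    y = np.tanh(x @ weights)      # neuron responses (post-synaptic activity)
    # Hebbian rule: weights grow where input and response are active together
    weights += learning_rate * np.outer(x, y)
    # Normalise each neuron's incoming weights so they do not grow without bound
    weights /= np.linalg.norm(weights, axis=0, keepdims=True)

print(weights.round(3))
```

Over repeated presentations the weights come to reflect regularities in the inputs, which is the sense in which such a system "learns from experience".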
The idea of replicating this process in silicon dates back to at least the late 1980s when US scientist Carver Mead coined the term 'neuromorphic' in a research paper.
It is an area beset by hyperbole, and claims of significant breakthroughs occur with some frequency. For example, in 2014 IBM announced it had created a neuromorphic chip that could perform 46 billion operations per second while drawing just 70 milliwatts of power (the brain is also extremely energy efficient), and it later claimed a breakthrough in creating artificial neurons based on an alloy used in Blu-ray discs. How far such developments are from a production device is rarely made clear, so the current Intel announcement is unusual in that regard.
Other chipmakers from Qualcomm to Nvidia have made claims and patented architectures and designs of their own, while Facebook, Intel's collaborator on the NNP, has long been using AI to try to understand the context of users' posts.
Intel purchased deep learning startup Nervana in 2016. At the time the firm said the acquisition would "advance Intel's AI portfolio and enhance the deep learning performance and TCO of our Intel Xeon and Intel Xeon Phi processors".
Unlike standard chips, the NNP's on-chip memory is managed directly by software rather than operating as a hardware cache, Intel says, which increases memory bandwidth while allowing greater parallelisation and lower power consumption.
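To give a flavour of what software-managed on-chip memory means in practice, the hypothetical sketch below tiles a matrix multiplication so that the code, rather than a transparent cache, decides which blocks of data are "resident" at any moment. The tile size, buffer copies and function name are invented for illustration and say nothing about the NNP's actual architecture.

```python
import numpy as np

# Illustrative only: a tiled matrix multiply in which the software decides
# which blocks of A and B occupy a small "scratchpad" at any moment,
# mimicking explicitly managed on-chip memory instead of a hardware cache.
# The tile size and explicit copies are assumptions made for this sketch.

TILE = 64  # stand-in for the capacity of the on-chip scratchpad

def tiled_matmul(A: np.ndarray, B: np.ndarray) -> np.ndarray:
    n, k = A.shape
    k2, m = B.shape
    assert k == k2, "inner dimensions must match"
    C = np.zeros((n, m), dtype=A.dtype)
    for i in range(0, n, TILE):
        for j in range(0, m, TILE):
            for p in range(0, k, TILE):
                # "Load" tiles into the scratchpad: explicit copies stand in
                # for software-directed data movement from off-chip memory.
                a_tile = A[i:i+TILE, p:p+TILE].copy()
                b_tile = B[p:p+TILE, j:j+TILE].copy()
                C[i:i+TILE, j:j+TILE] += a_tile @ b_tile
    return C

A = np.random.rand(256, 256).astype(np.float32)
B = np.random.rand(256, 256).astype(np.float32)
assert np.allclose(tiled_matmul(A, B), A @ B, atol=1e-3)
```

Because the programmer (or compiler) schedules every data movement explicitly, bandwidth can be spent only on the blocks that are actually needed, which is the broad argument Intel makes for the design.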