Neuromorphic Computing

Neuromorphic computing tries to mimic the way the human brain works.

Neuromorphic computing is a strong candidate for next-generation computation. The term was coined by Professor Carver Mead in the late 1980s to describe computation that mimics the human brain. Over the last decade, a number of companies and institutions have been working on neuromorphic computing. Take IBM as an example: its TrueNorth chip was one of the first neuromorphic chips in the world, built on a non-von Neumann architecture.


Figure – Carver Mead


The TrueNorth chip has 4096 cores, and each core contains 256 neurons, giving about 1 million neurons and over 250 million synapses.
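The neuron count above follows directly from the core layout, which a few lines of Python can verify:

```python
# TrueNorth core arithmetic (figures taken from the text above)
cores = 4096
neurons_per_core = 256

total_neurons = cores * neurons_per_core
print(total_neurons)  # → 1048576, i.e. about 1 million neurons
```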




Figure – TrueNorth chip


Now, this is a relatively new concept of computing. A typical human brain contains about 86 billion neurons and on the order of 10^15 synapses.

When we perform a computational task in our brain, we consume only a tiny fraction of our entire neuron count. This is why the human brain is extremely power efficient: it consumes about 20 watts of power while being capable of roughly one exaFLOP (an exaFLOP is one quintillion, or 10^18, floating-point operations per second, i.e. 1,000 petaFLOPS). To put this into perspective, the world's fastest supercomputer, IBM Summit, consumes about 13 megawatts of power and is capable of 200 petaFLOPS. The human brain, at 20 watts, delivers roughly 1 exaFLOP, five times the computational capacity of IBM Summit.
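The efficiency gap is easier to appreciate in FLOPS per watt. A quick back-of-the-envelope calculation, using the brain figures above and assuming Summit draws roughly 13 MW:

```python
# Rough energy-efficiency comparison, in FLOPS per watt.
# Brain figures are from the text above; Summit's power draw is
# assumed to be roughly 13 MW.
brain_flops = 1e18        # ~1 exaFLOP
brain_watts = 20
summit_flops = 200e15     # 200 petaFLOPS
summit_watts = 13e6       # ~13 megawatts (assumed)

brain_eff = brain_flops / brain_watts
summit_eff = summit_flops / summit_watts

# The brain comes out millions of times more efficient per watt:
print(f"{brain_eff / summit_eff:.1e}")
```

So the brain's raw 5x speed advantage understates the gap; per watt, the difference is on the order of millions.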


Figure – IBM Summit


Neuromorphic computers, which use neuromorphic computing, are modeled directly on the human brain. They use a special artificial-neural-network methodology called Spiking Neural Networks (SNNs). These are not to be confused with software-based architectures such as Convolutional Neural Networks (CNNs), Recurrent Neural Networks (RNNs), or Generative Adversarial Networks (GANs).
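The basic unit of an SNN is a neuron that integrates input over time and emits a discrete spike when its membrane potential crosses a threshold. A minimal leaky integrate-and-fire (LIF) sketch illustrates the idea; the parameter values here are purely illustrative and do not correspond to any particular chip:

```python
# Minimal leaky integrate-and-fire (LIF) neuron -- a sketch of the kind of
# unit an SNN is built from. Parameters are illustrative, not hardware values.
def simulate_lif(input_current, threshold=1.0, leak=0.9, v_reset=0.0):
    """Return the list of time steps at which the neuron spikes."""
    v = 0.0
    spikes = []
    for t, i in enumerate(input_current):
        v = leak * v + i          # leak the potential, then integrate input
        if v >= threshold:        # fire when the potential crosses threshold
            spikes.append(t)
            v = v_reset           # reset after the spike
    return spikes

# A constant input drives periodic spiking:
print(simulate_lif([0.3] * 20))  # → [3, 7, 11, 15, 19]
```

Unlike a CNN or RNN, information here is carried by the timing of spikes rather than by continuous activation values, which is what makes SNNs a natural fit for event-driven neuromorphic hardware.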

Neuromorphic chips, which power neuromorphic computers, may not replace conventional processing chips such as CPUs, GPUs, or application-specific ICs. However, neuromorphic computers have the ability to complement existing computers that perform deep learning for artificial intelligence. From IBM's TrueNorth to Intel's Loihi to the University of Manchester's SpiNNaker machine, every company or institution is working on its own unique solution for neuromorphic computing chips.

Limitations:
While current-day computers can do many things that humans can do, they lack in at least two aspects: machine reasoning and transfer learning. Léon Bottou, an expert in the field, defined machine reasoning as:

"Algebraically manipulating previously acquired knowledge in order to answer a new question."

Transfer learning refers to the ability to transfer learned experiences from one context to another.

There is a third angle: physical dimensions and energy consumption. Supercomputers represent the highest computing speeds, and current versions reach speeds in PFLOPs (10^15 floating-point operations per second). But they are bulky machines housed in dedicated buildings and need power in the megawatt range, while the human brain consumes about 20 W.
