Neuromorphic computing will revolutionize the edge

Biomimicry, the scientific art of copying natural structures, is not a new idea. For decades we have tried to copy biological brains to make efficient computers, slightly discouraged by the fact that we don’t know exactly how biological intelligence works. Armed with our best guesses, we have developed models of neurons and spiking neural networks based on the human brain, and are now trying to build them in silicon. Silicon mimics typically use simplified versions of the neuron, but they can still offer distinct advantages for advanced applications that require fast, power-efficient decision-making.
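
To make the idea concrete, here is a minimal sketch in Python of the kind of simplified neuron model referred to above: a leaky integrate-and-fire unit that accumulates input and emits a spike when its membrane potential crosses a threshold. The time constant, threshold and input values are illustrative assumptions, not parameters of any particular neuromorphic chip.

```python
import numpy as np

def lif_neuron(input_current, dt=1.0, tau=20.0, v_rest=0.0,
               v_thresh=1.0, v_reset=0.0):
    """Simulate a single leaky integrate-and-fire neuron.

    input_current: 1-D array of input drive at each time step.
    Returns the membrane potential trace and the spike times.
    (All parameter values here are illustrative assumptions.)
    """
    v = v_rest
    potentials, spikes = [], []
    for t, i_in in enumerate(input_current):
        # The potential leaks back toward rest while integrating the input.
        v += (dt / tau) * (v_rest - v) + i_in * dt
        if v >= v_thresh:
            spikes.append(t)   # emit a spike
            v = v_reset        # reset after firing
        potentials.append(v)
    return np.array(potentials), spikes

# Drive the neuron with a constant current and see when it fires.
trace, spike_times = lif_neuron(np.full(200, 0.06))
print(f"{len(spike_times)} spikes, first at steps {spike_times[:5]}")
```

The appeal of this style of model at the edge is that the neuron only produces output (a spike) when something happens, so downstream computation can stay idle most of the time.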

ABI Research reports that 4.6 billion sensors will ship in 2027, embedded in appliances, robots and smart home devices, up from 1.8 billion in 2021. These additional sensors will support existing and new functions in the future, generating ever more sensor data that will need to be dealt with. While the vast majority of smart home appliances and devices will have internet connections by 2027, the cloud may not be the best place to process this data: hosting and processing data in the cloud costs money, adds latency, and raises privacy concerns.

The best bet for processing sensor data in real time, closer to the sensor, may well be neuromorphic computing. Demonstrations of neuromorphic computing systems have proven the technology’s value for ultra-fast, ultra-low-power decision-making at the edge. Biomimicry in computing and neuromorphic computing are poised to bring a whole new level of intelligence to edge devices, making it possible to add decision-making power to devices with extreme limits on power consumption and strict demands on speed. As advanced networks and specialized hardware continue to develop, the effects will become even more pronounced.

Neuromorphic’s competitor, deep learning (the paradigm that powers most mainstream AI today), is growing rapidly. Today, it is easily possible to run small deep learning applications, such as keyword detection and basic image processing, on a microcontroller that costs less than US$1. But neuromorphic concepts go even further, squeezing into even tinier energy budgets. Will these technologies compete or co-exist at the edge? The most likely mid-term scenario is coexistence: with millions of use cases at the edge, there are millions of niches, and some may be better suited to neuromorphic computing for technical or business reasons.
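
For a sense of scale, the sketch below shows roughly what such a sub-dollar-microcontroller workload looks like: a tiny keyword-spotting classifier defined in Keras and shrunk with TensorFlow Lite post-training quantization so it can fit in a few hundred kilobytes of flash. The input shape, layer sizes and keyword set are illustrative assumptions, not a specific product’s configuration.

```python
import tensorflow as tf

# A deliberately tiny keyword-spotting classifier: the input is a 49x40
# spectrogram (e.g. MFCC features), the output is one of a few keywords.
# Shapes, layer sizes and labels are illustrative assumptions.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(49, 40, 1)),
    tf.keras.layers.Conv2D(8, (3, 3), activation="relu"),
    tf.keras.layers.MaxPooling2D((2, 2)),
    tf.keras.layers.Conv2D(16, (3, 3), activation="relu"),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(4, activation="softmax"),  # e.g. yes/no/up/down
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")

# Post-training quantization shrinks the model toward the few hundred
# kilobytes of flash and RAM available on a cheap microcontroller.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
tflite_model = converter.convert()
print(f"Quantized model size: {len(tflite_model) / 1024:.1f} KB")
```

In practice the model would be trained on labeled audio before conversion; the point of the sketch is simply how small these conventional deep learning workloads already are, which is the baseline neuromorphic approaches aim to undercut on energy.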

Although the demise of Moore’s Law has been circumvented somewhat by accelerated and domain-specific computing, there remains a delicate balance between computing architecture flexibility and performance, especially for rapidly evolving workloads like AI. Taking inspiration from the most efficient computer ever known, the human brain, and using the results of millions of years of evolution as a starting point seems like a safe bet.

Sherry J. Basler