“Nanomagnetic” computing can deliver low-power AI
Researchers have shown that it is possible to achieve artificial intelligence using tiny nanomagnets that interact like neurons in the brain.
The new method, developed by a team led by researchers at Imperial College London, could significantly reduce the global energy cost of artificial intelligence (AI), which is currently doubling every 3.5 months.
In an article published today in Nature Nanotechnology, the international team has produced the first evidence that arrays of nanomagnets can be used to perform AI-type processing. The researchers showed that the nanomagnets can be used for "time series prediction" tasks, such as predicting and regulating insulin levels in diabetic patients.
Artificial intelligence that uses "neural networks" aims to reproduce the functioning of certain parts of the brain, where neurons talk to each other to process and retain information. Much of the math used to power neural networks was originally invented by physicists to describe how magnets interact, but at the time it was too difficult to use magnets directly because researchers didn't know how to feed in data and read out the results.
Instead, software running on traditional silicon-based computers was used to simulate the magnetic interactions, in turn simulating the brain. Now the team has been able to use the magnets themselves to process and store data, eliminating the middleman of software simulation and potentially offering huge energy savings.
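In software form, this magnet-inspired processing is often cast as "reservoir computing", in which a fixed dynamical system transforms an input signal and only a simple linear readout is trained. The sketch below is an illustrative assumption about that general scheme, not the paper's exact method; in nanomagnetic computing, the physical magnet array would replace the simulated reservoir.

```python
import numpy as np

rng = np.random.default_rng(1)

# A fixed random "reservoir" of 50 units stands in for the magnet array.
n_res = 50
W_in = rng.normal(scale=0.5, size=n_res)              # input weights (untrained)
W = rng.normal(scale=1.0 / np.sqrt(n_res), size=(n_res, n_res))  # internal weights (untrained)

def run_reservoir(inputs):
    """Drive the reservoir with a signal and record its states."""
    states, x = [], np.zeros(n_res)
    for u in inputs:
        x = np.tanh(W @ x + W_in * u)  # reservoir update
        states.append(x.copy())
    return np.array(states)

# Task: one-step-ahead prediction of a sine wave.
t = np.arange(300)
signal = np.sin(0.1 * t)
X = run_reservoir(signal[:-1])
y = signal[1:]                         # targets: the next value in the series

# Only this linear readout is trained, here by ridge regression.
ridge = 1e-6
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ y)
pred = X @ W_out
print(float(np.mean((pred - y) ** 2)))  # mean-squared prediction error
```

The key point is the division of labour: all the heavy nonlinear transformation happens in the untrained reservoir (in hardware, the magnets themselves), so training touches only a cheap linear layer.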
Nanomagnets can come in different "states", depending on the direction of their magnetization. Applying a magnetic field to an array of nanomagnets changes the state of each magnet according to the properties of the input field, but also according to the states of the surrounding magnets.
The team, led by researchers from Imperial's Department of Physics, then devised a technique to count the number of magnets in each state once the field has passed, giving the "answer".
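To make the input-and-readout idea concrete, here is a deliberately simplified toy model, not the team's actual device physics: a chain of binary "magnets" whose next state depends on an applied field and on the pull of neighbouring magnets, with the answer read out by counting states. The chain layout, coupling strength, and field value are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# A 1-D chain of binary nanomagnets: +1 points "up", -1 points "down".
n_magnets = 64
spins = rng.choice([-1, 1], size=n_magnets)

def apply_field(spins, field, coupling=0.5):
    """One field pulse: each magnet aligns with the applied field
    plus the combined pull of its two neighbours."""
    neighbours = np.roll(spins, 1) + np.roll(spins, -1)
    return np.sign(field + coupling * neighbours)

def read_out(spins):
    """The 'answer': how many magnets ended up in each state."""
    return {"up": int(np.sum(spins == 1)), "down": int(np.sum(spins == -1))}

spins = apply_field(spins, field=0.8)
print(read_out(spins))
```

Because each magnet's update mixes the input field with its neighbours' states, the final up/down counts depend nonlinearly on the input, which is what lets the array act as a processor rather than a mere memory.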
Study co-first author Dr Jack Gartside said: "For a long time we've been working on the problem of how to enter data, ask a question, and get an answer out of magnetic computing. Now that we've proven it can be done, it paves the way for eliminating the computer software that does the energy-intensive simulation."
Co-first author Kilian Stenning added: “The way magnets interact gives us all the information we need; the laws of physics themselves become the computer.”
Team leader Dr Will Branford said: "It was a long-term goal to make hardware based on the software algorithms of Sherrington and Kirkpatrick. It was not possible to use the spins on the atoms in conventional magnets, but by scaling the spins up into nano-patterned arrays we were able to achieve the necessary control and readout."
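The Sherrington-Kirkpatrick mathematics Dr Branford mentions also underlies the Hopfield network, in which pairwise couplings between binary "spins" store and recall patterns. A minimal sketch, assuming the standard Hebbian learning rule rather than anything specific to the paper:

```python
import numpy as np

def train_couplings(patterns):
    """Hebbian rule: coupling J[i, j] sums p_i * p_j over stored patterns,
    playing the role of the pairwise magnetic interactions."""
    n = patterns.shape[1]
    J = patterns.T @ patterns / n
    np.fill_diagonal(J, 0)  # no self-interaction
    return J

def recall(J, state, steps=10):
    """Let the 'spins' relax under their couplings until stable."""
    for _ in range(steps):
        state = np.sign(J @ state)
    return state

pattern = np.array([[1, -1, 1, -1, 1, -1, 1, -1]])
J = train_couplings(pattern)

noisy = pattern[0].copy()
noisy[0] = -noisy[0]          # flip one "magnet" to corrupt the pattern
recovered = recall(J, noisy)
print(np.array_equal(recovered, pattern[0]))  # → True
```

The stored pattern is an energy minimum of the coupled-spin system, so relaxing the spins corrects the flipped one, which is exactly the magnet-derived dynamics that neural-network software normally has to simulate.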
Reduced energy cost
AI is now being used in a variety of settings, from voice recognition to self-driving cars. But training an AI to perform even relatively simple tasks can require huge amounts of energy. For example, training an AI to solve a Rubik's cube required the energy equivalent of two nuclear power plants running for an hour.
Much of the energy used to achieve this in conventional silicon-chip computers is wasted in inefficient transport of electrons during processing and memory storage. However, nanomagnets do not rely on the physical transport of particles like electrons, but rather process and transfer information in the form of a “magnon” wave, where each magnet affects the state of neighboring magnets.
This means that much less energy is wasted and that processing and storing information can be done together, rather than being separate processes as in conventional computers. This innovation could make nanomagnetic computing up to 100,000 times more efficient than conventional computing.
The team will next teach the system using real-world data, such as ECG signals, and hope to turn it into a real computing device. Eventually, magnetic systems could be integrated into conventional computers to improve the energy efficiency of intense processing tasks.
Their energy efficiency also means they could be powered by renewables and used to do “AI at the edge” – processing data where it’s collected, such as weather stations in Antarctica, rather than sending it back to big data centers.
It also means they could be used on wearable devices to process biometric data about the body, such as predicting and regulating insulin levels for people with diabetes or detecting abnormal heartbeats.