# Will probabilistic computing eclipse quantum computing?

Conventional computers have made impressive progress, but there are still problems these machines cannot solve. Even with innovations capable of meeting today's challenges, inherent limits in reasoning, context, and scalability continue to instill a sense of uncertainty.

The global scientific community has focused its efforts on applying AI and machine learning to real-world problems. However, many believe that laying the groundwork with fundamental AI research, rather than focusing only on incremental improvements, may be the best course of action moving forward.

To address these problems, several breakthroughs have been made in engineering computers that use the laws of quantum physics to recognize patterns in complex problems. However, such computers are still in their early stages of development: they are sensitive to their environment and require extremely low temperatures to operate.

In one such breakthrough, researchers at Tohoku University in Japan, who have already built advanced computers for calculating complex data, are now investigating something new and different: the concept of “probabilistic computing”. Rather than relying on quantum effects, the team is developing computers that harness physical randomness to tackle the problem of pattern recognition.

**Quantify uncertainty and interpret complex data**

A research paper, titled “Local bifurcation with spin-transfer torque in superparamagnetic tunnel junctions” and published in the open-access journal *Nature Communications*, describes a breakthrough that scientists say could serve as the basis for designing more sophisticated computers, ones able to quantify uncertainty and interpret complex data.

The team discovered a mathematical description of what happens in tiny magnets when electric current and magnetic fields are applied to them.

Probabilistic computers could operate at room temperature and infer answers from complex inputs. The task could be as simple as inferring information about a person from their buying behavior. Instead of returning a single, discrete result, such a computer would identify patterns and provide a good estimate of what the result might be.
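To make the idea of "an estimate rather than a single answer" concrete, here is a minimal sketch of probabilistic inference on buying behavior. The function name and the data are hypothetical; the point is only that the output is a probability with an uncertainty range, not one discrete result.

```python
import random

def probabilistic_guess(observations, trials=10_000):
    """Toy sketch: instead of one discrete answer, return an estimate
    of the purchase probability together with its uncertainty,
    obtained by resampling (bootstrap) the observed outcomes."""
    # observations: list of 0/1 outcomes (e.g. "bought" / "did not buy")
    n = len(observations)
    p_hat = sum(observations) / n
    # Resample the data many times to see how much the estimate varies.
    resampled = sorted(
        sum(random.choice(observations) for _ in range(n)) / n
        for _ in range(trials)
    )
    low, high = resampled[int(0.025 * trials)], resampled[int(0.975 * trials)]
    return p_hat, (low, high)

data = [1, 0, 1, 1, 0, 1, 1, 0, 1, 1]  # hypothetical buying history
estimate, interval = probabilistic_guess(data)
print(f"estimated purchase probability ≈ {estimate:.2f}, 95% interval ≈ {interval}")
```

The interval, not the point estimate, is what distinguishes this style of computing: the answer comes with a built-in statement of how confident the machine is.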

**From bits to p-bits**

Among several ways to build such computers, researchers are studying devices called “magnetic tunnel junctions”, which consist of two layers of magnetic metal separated by an ultrathin insulator. When electrons tunnel through the insulating layer under an applied electric current and magnetic field, these nanomagnetic devices are thermally activated.

Depending on their spin, electrons can cause fluctuations inside the magnets. These fluctuations are called **p-bits**, an alternative to the binary **on/off** or **0/1 bits** of classical computers, and they can form the basis of probabilistic computation.
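A common way the literature models a p-bit (this sigmoid model is a standard abstraction, not the device physics from the Tohoku paper) is as a bit that fluctuates randomly between 0 and 1, with an input that biases how often each value appears:

```python
import math
import random

def p_bit(input_signal: float) -> int:
    """One sample from an idealized p-bit: the output fluctuates
    between 0 and 1, and the input tunes the average via a sigmoid."""
    p_one = 1.0 / (1.0 + math.exp(-input_signal))
    return 1 if random.random() < p_one else 0

random.seed(0)
# With zero input the p-bit is unbiased: about half the samples are 1.
unbiased = sum(p_bit(0.0) for _ in range(10_000)) / 10_000
print(unbiased)  # ≈ 0.5

# A strong positive input pins it near 1; a strong negative one near 0.
biased = sum(p_bit(+5.0) for _ in range(10_000)) / 10_000
print(biased)  # ≈ 0.99
```

Unlike a classical bit, reading a p-bit twice can give different answers; the useful quantity is the statistics of many readings, which is what makes the fluctuating magnetization of a tunnel junction a natural hardware implementation.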

To put p-bits to work in probabilistic computers, researchers must first be able to describe the physics that takes place inside magnetic tunnel junctions.

The approach is based on the Néel-Arrhenius law, extended to include spin-transfer torque (STT), applied to superparamagnetic tunnel junctions that are highly sensitive to external disturbances. The researchers determined the relevant switching exponents through measurements such as nanosecond STT switching, homodyne-detected ferromagnetic resonance, and random telegraph noise.

The paper states: “The results demonstrate the ability of the superparamagnetic tunnel junction as a useful tester for statistical physics as well as sophisticated probabilistic hardware engineering with a rigorous mathematical basis.”

Professor Shun Kanai from the Electrical Communication Research Institute of Tohoku University said: “We have experimentally clarified the ‘switching exponent’ which governs the fluctuation under the disturbances caused by the magnetic field and spin-transfer torque in magnetic tunnel junctions. This gives us the mathematical basis to implement magnetic tunnel junctions in the p-bit to design sophisticated probabilistic computers. Our work has also shown that these devices can be used to study unexplored physics related to thermally activated phenomena.”

**The big “if”**

As with all big leaps, this one comes with its own set of problems. P-bits aside, quantum computers are expected to run 158 million times faster than the supercomputers currently in use, completing in four minutes a task that would take a traditional supercomputer 10,000 years.

For this to happen, the physics challenges must first be overcome, and doing so would make quantum computing an extremely expensive undertaking for all but large corporations and the best-funded research institutes.

Large companies such as Google and IBM are already trying to build these machines, competing to deliver the first practical general-purpose quantum computers. Google, in collaboration with the University of Waterloo, Alphabet’s X, and Volkswagen, is building TensorFlow Quantum (TFQ), an open-source library for quantum machine learning applications. TFQ enables rapid prototyping of quantum ML models and lets research communities collaboratively control and model natural or artificial quantum systems.

The role of probability in quantum computing was anticipated early on by the late Nobel laureate and physicist Richard Feynman, and researchers are now aiming to implement that vision. In short, probabilistic computing adds probabilities, while quantum computing adds complex probability amplitudes. The theoretical promise of p-bits is unequivocal, but even so, the practical hurdles seem unavoidable.
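The distinction between adding probabilities and adding amplitudes can be shown in a few lines. In this toy example (the numbers are illustrative, not drawn from any real system), two indistinguishable paths lead to the same outcome:

```python
import cmath

# Probabilistic computing: probabilities of alternative paths simply add.
p1, p2 = 0.25, 0.25
print(p1 + p2)  # 0.5

# Quantum computing: complex amplitudes add first, then the sum is
# squared, so paths can cancel (interfere) instead of accumulating.
a1 = cmath.rect(0.5, 0.0)        # amplitude 0.5, phase 0
a2 = cmath.rect(0.5, cmath.pi)   # amplitude 0.5, opposite phase
print(abs(a1 + a2) ** 2)  # ≈ 0: destructive interference

a3 = cmath.rect(0.5, 0.0)        # same phase as a1
print(abs(a1 + a3) ** 2)  # 1.0: constructive interference
```

Probabilities can only accumulate, while amplitudes can cancel; this interference is exactly what a p-bit machine gives up, and it is why probabilistic computing is a complement to quantum computing rather than a drop-in replacement.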