Microsoft Research has announced a major breakthrough in its pursuit of quantum computing – the basis for a new kind of qubit, one that has never left the theoretical world before…and still hasn’t. Microsoft ultimately still hasn’t produced devices based on its new qubit design, but it is adding credence to their feasibility with evidence produced by immense simulations run on its Azure Quantum cloud infrastructure. Microsoft’s quantum computing research focuses on a special and exotic type of qubit – the topological qubit – which the company has presented as its vehicle toward the future of quantum computing since 2016.
Despite Microsoft’s investment in the field, relatively little has been heard from the company on the subject. Meanwhile, Microsoft’s giant tech competitors Google and IBM, as well as much smaller companies like Rigetti Computing and IonQ, have already deployed quantum computing systems, while Microsoft has not. So you would think the two-trillion-dollar Microsoft is dragging its feet in the race for scalable quantum computing.
The road less traveled
However, Microsoft looks like it chose to tackle a type of qubit its competitors wouldn’t. Topological qubits were initially claimed to exist in a 2018 Nature publication – a claim later disproved, with the original researchers retracting the article “for insufficient scientific rigor in our original manuscript.”
Microsoft isn’t just demonstrating that topological qubits are on the verge of becoming a reality: the company says they’re currently the only valid bet for quantum computing that’s sustainable, scalable (to the tune of millions of qubits), and ultimately meaningful.
“Today’s qubits will not be the basis of tomorrow’s quantum computers,” Microsoft Distinguished Engineer Chetan Nayak told Ars Technica. “The qubits we have today are very interesting, very impressive – you can learn a lot and do a lot of research and make good incremental progress. But some sort of new idea is going to be needed to make a commercial-scale quantum computer.”
The world record for the highest number of qubits on a single device, IBM’s Eagle, currently stands at 127 addressable qubits – a far cry from the 1 million qubit figure that Microsoft predicts will be needed. And because IBM uses transmon-based qubits, the company’s devices must be cooled to near absolute zero (-273.15 ºC) to protect the qubits from environmental interference.
Another thing to note is that because non-topological qubits are particularly susceptible to decoherence, these quantum architectures typically include additional qubits whose sole function is to provide a measure of error correction, meaning they are not directly used in the calculations. This is inefficient, especially considering the current difficulty of scaling up qubit counts. Hence Microsoft’s choice to go all in on a different design. But what exactly are topological qubits?
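To get a feel for why this overhead matters, here is a back-of-the-envelope sketch. The numbers are illustrative textbook assumptions for surface-code error correction, not figures from the article or from Microsoft or IBM:

```python
# Illustrative only: rough surface-code overhead arithmetic. The 2*d^2
# scaling and the chosen code distance are common textbook assumptions,
# not figures from Microsoft or IBM.
def physical_qubits_needed(logical_qubits: int, code_distance: int) -> int:
    """Surface-code schemes use roughly 2*d^2 physical qubits
    (data + measurement qubits) per logical qubit at code distance d."""
    return logical_qubits * 2 * code_distance ** 2

# A modest algorithm needing 1,000 logical qubits at code distance 17:
print(physical_qubits_needed(1_000, 17))  # 578000 physical qubits
```

Under these assumptions, a 127-qubit device like Eagle cannot host even a single well-protected logical qubit – which is the scaling argument behind Microsoft’s bet on qubits that resist decoherence natively.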
Absence of proof ≠ Proof of absence
The first thing to remember about topological qubits is that they still haven’t materialized. Instead, they are theorized to exist (as we’ve covered and explored in more detail here) as Majorana zero mode pairs (MZMs), a special type of quasiparticle that behaves as if it were only half an electron. These MZMs are predicted to appear at the surface of superconducting materials and to exhibit extreme resistance to environmental noise (such as heat, stray subatomic particles, or magnetic fields). Left unchecked, or not designed against, this environmental noise leads to decoherence – the process by which qubits fall out of superposition and reveal their value. When qubits reveal their value too soon, an error occurs before the calculation is complete.
Microsoft’s topological qubit design includes a U-shaped wire with a Majorana zero mode at each end, providing physical separation, positioned close to a quantum dot. This quantum dot serves as the control mechanism: its capacitance changes each time it interacts with one of the Majorana zero modes, allowing the qubit to be measured. This is the part Microsoft still hasn’t built: its fabricated devices don’t yet include the quantum dot.
The resilience of topological qubits comes from the fact that the two MZMs (in Microsoft’s design, one at each end of the U-shaped wire) jointly encode the quantum information. Since the information is only accessible by looking at the states of the two quasiparticles simultaneously, the qubit state only decoheres if both MZMs are affected and “forced” to reveal their contents. And depending on the technical design, researchers can tune the distance between the MZMs, creating a separation that reduces the risk of both quasiparticles decohering.
To better visualize this, imagine you’re a world-class villain and you write down the password to your doomsday device on a piece of paper. You then tear it in half and give one half to each of two trusted subordinates (before forgetting the password yourself forever). Whatever happens, neither of them alone can divulge the password, regardless of the methods applied to extract the information. In quantum terms, the information has become non-local. The only way for someone to recover your full password would be to take both pieces and join them together. In an extremely simplified way, this is what gives MZMs their resilience to decoherence.
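The villain analogy has a classical computing analogue: XOR secret sharing, where a secret is split into two shares, each individually indistinguishable from random noise. This toy sketch (purely illustrative – it has nothing to do with Microsoft’s actual hardware) captures the same idea of information that only exists when both halves are combined:

```python
import secrets

# Toy analogue of non-local information storage: XOR secret sharing.
# Either share alone is uniformly random and reveals nothing about the
# secret; only combining both recovers it - loosely like a qubit state
# spread across two MZMs.
def split(secret: bytes) -> tuple[bytes, bytes]:
    share_a = secrets.token_bytes(len(secret))               # random half
    share_b = bytes(s ^ a for s, a in zip(secret, share_a))  # secret XOR random
    return share_a, share_b

def combine(share_a: bytes, share_b: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(share_a, share_b))

password = b"doomsday-device-1234"
half_a, half_b = split(password)
assert combine(half_a, half_b) == password  # both halves together recover it
```

Reading either share on its own yields random bytes, just as probing a single MZM reveals nothing about the encoded qubit state.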
To arrive at the final superconducting wire design that enables all of this, Microsoft Research had to simulate the materials and their shape across 23 adjustable parameters. This step requires huge amounts of computing power, but Microsoft has one of the most powerful computing networks in the world with Azure. Additionally, the simulations allowed the Microsoft team to quickly iterate on both materials and device geometry, a process that would have been unfeasible with a hands-on materials engineering approach. According to Nayak, “If you were to sort [the materials] experimentally through trial and error, you would never be able to optimize all of these parameters in a reasonable amount of time.”
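The shape of that search problem can be sketched in a few lines. Everything here is hypothetical – the objective function, bounds, and search strategy are invented for illustration; the article only tells us Microsoft tuned 23 adjustable parameters in large-scale simulations:

```python
import random

# Hypothetical sketch of simulation-driven design search. The toy
# objective below stands in for an expensive materials simulation;
# only the parameter count (23) comes from the article.
N_PARAMS = 23

def simulate_device(params: list[float]) -> float:
    """Stand-in for a costly simulation; returns a quality score
    (higher = better, peaking when every parameter is 0.5)."""
    return -sum((p - 0.5) ** 2 for p in params)

def random_search(trials: int = 10_000, seed: int = 0):
    """Evaluate many random candidates and keep the best one."""
    rng = random.Random(seed)
    best_params, best_score = None, float("-inf")
    for _ in range(trials):
        candidate = [rng.random() for _ in range(N_PARAMS)]
        score = simulate_device(candidate)
        if score > best_score:
            best_params, best_score = candidate, score
    return best_params, best_score

params, score = random_search()
```

Even this naive approach needs thousands of simulation runs in a 23-dimensional space – which is why Nayak argues the search is hopeless as a physical trial-and-error process but tractable on cloud-scale compute.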
In the end, Microsoft settled on aluminum as the superconducting wire and indium arsenide as the semiconductor around it, and the company is manufacturing the devices itself. This new data- and materials-driven approach has been the glue that holds Microsoft’s approach to quantum computing together.
“We’re now driven by simulation-based designs, not just someone tossing ideas around in a boardroom,” Nayak said. “And now we have the unique growth and manufacturing technologies to bring these ideas to life. It doesn’t matter if you have the best designs in the world – if you can’t make them, they just stay on paper.”
It seems that Microsoft’s intention to be a hardware vendor goes well beyond Xbox and its device division – the company wants to be the one that provides the foundational hardware for quantum computing systems, both for on-premises installations and in cloud environments. Microsoft’s quantum hardware could very well end up powering a hypothetical Apple “iQuantum” product. In theory, of course.
The computing of the future
Microsoft has high expectations for the computational density of machines powered by its future topological qubits: the company claims that one million of these qubits could fit on a wafer smaller than a credit card’s security chip. It’s the transistor revolution all over again, but with qubits.
Although Microsoft still hasn’t delivered a working quantum computing product, we must remember that this is one of the most complex fields imaginable – a nightmarish intersection of theoretical physics, practical engineering, and pure economics. But there’s also a critical factor: the company risks losing ground in a market estimated to reach $76 billion by 2030. Then again, should Microsoft actually deliver a truly scalable quantum computing system before anyone else, the market won’t care who moved through this space first.
Experts in the field are even more aware of the weaknesses of the status quo in quantum computing systems. One of the reviewers of the since-retracted 2018 Nature publication, Marco Valentini, believed at the time that Majorana modes would eventually be created, detected, and exploited as qubits – but not as they were presented in the retracted paper.
The same paper was investigated by an independent panel of experts, who found no evidence of data falsification. In the end, it came down to the most common cause of error: human failure. The original analysis fell prey to confirmation bias, as the report’s authors noted, writing that “the research program on which the authors embarked is particularly vulnerable to self-deception, and the authors did not guard against this.”
Microsoft knows what’s at stake with its choice of qubits – and was already deep into its research when the drama surrounding the paper exploded. Perhaps that’s why Microsoft is keen to defend its results: the company’s research wing set up a separate team precisely to analyze the output data and guard against confirmation bias. And the company also submitted its research data to “an expert panel of independent consultants,” which sounds a lot like inviting prominent peers in the field to constructively review the researchers’ work. Microsoft is continuing its topological qubit research while trying to insulate its efforts from previous flaws.
“It would be irresponsible for the physics community to decide now that we already know the only way,” Ady Stern of the Weizmann Institute of Science in Rehovot, Israel, told Quanta Magazine. “We’re on page 10 of a thriller, and we’re trying to guess how it’s going to end.”
With its topological qubits, Microsoft aims to be the Blu-ray of the quantum format war. And maybe it will be – the company certainly seems convinced of the physics behind its calculated choice. Time, as always, will tell.