Quantum computing remains woefully out of reach

Quantum computing is one of the most exciting (and trendy) areas of research right now, but ask any scientist how close quantum computers are to success and you’ll get the classic scientific answer: “five to 10 years.”

So if a new quantum reality is still potentially a decade away, how far are we from personal quantum technology in our computer or laptop?

Beijing, December 2020: China takes another big step in quantum computing with a demonstration of overwhelming computational speed-up. Credit: Xinhua News Agency/Contributor/Getty

Although there are still very big hurdles to overcome in quantum computing, some research groups are already connecting very small quantum processors to traditional computers, and the future (however many years away it may be) looks bright.

“With quantum computing, you are tapping into an otherwise untapped physical phenomenon,” says Dr Andrew Horsley, CEO of Quantum Brilliance, an Australian team working on diamond-based quantum technology.

“With quantum computing, you are tapping into an otherwise untapped physical phenomenon”

Dr. Andrew Horsley, CEO of Quantum Brilliance

“The last time we did it was electricity – that’s a pretty big change in the technological underpinnings of society.”

One of the reasons quantum computing has been so difficult to get started is that the architecture that supports it had to be built entirely from scratch.

Quantum computers perform calculations using “qubits”, the quantum equivalent of classical “bits”. But instead of being only on or off (1 or 0 in binary), a qubit can be in a mixture of 0 and 1. Imagine a qubit as a globe: its state is a point on the surface, with latitude and longitude describing the particular mixture of 0 and 1.
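The globe analogy can be sketched in a few lines of code. This is a minimal illustration (not from the article): a qubit state is stored as two amplitudes, the “latitude” angle sets the mixture of 0 and 1, and measuring collapses the mixture to a single bit. The function names here are hypothetical, chosen for readability.

```python
import math
import random

# A qubit state is alpha*|0> + beta*|1>, with alpha^2 + beta^2 = 1.
# The "latitude" angle theta on the globe sets the mixture
# (phase/"longitude" is ignored in this simplified sketch).
def qubit(theta):
    return (math.cos(theta / 2), math.sin(theta / 2))

def measure(state, rng):
    # Measurement yields 0 with probability alpha^2, else 1,
    # collapsing the mixture to a definite bit.
    alpha, beta = state
    return 0 if rng.random() < alpha ** 2 else 1

rng = random.Random(42)
plus = qubit(math.pi / 2)  # the "equator": an equal mixture of 0 and 1
counts = [0, 0]
for _ in range(10000):
    counts[measure(plus, rng)] += 1
print(counts)  # roughly [5000, 5000]
```

At the “poles” (theta = 0 or pi) the qubit behaves exactly like a classical bit; everywhere in between, repeated measurements return a weighted mix of both outcomes.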

Engineer Hannah Wagenknecht shows a wafer with photonic chips considered the “building block” for integrating quantum technology into everyday products. September 2021. Credit: Thomas Kienzle/AFP/Getty

Unfortunately, the price of adding quantum behaviour to these bits is far more room for noise – random variations that can change the value of the result. Without specialised error correction, this can mean the qubit returns the wrong value. Even Google’s 2019 quantum supremacy demo was 99% noise and just 1% signal.
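The redundancy idea behind error correction can be sketched with a classical repetition code. This is a deliberate simplification (not from the article): real quantum codes are far more involved because quantum errors are continuous and states can’t be copied, but the principle of trading extra physical bits for a more reliable logical bit is the same.

```python
import random

def encode(bit):
    # One logical bit becomes three physical bits.
    return [bit, bit, bit]

def noisy(bits, p, rng):
    # Each physical bit flips independently with probability p.
    return [b ^ (rng.random() < p) for b in bits]

def decode(bits):
    # Majority vote recovers the logical bit.
    return 1 if sum(bits) >= 2 else 0

rng = random.Random(0)
p = 0.1  # per-bit error rate
trials = 100000
raw_errors = sum(noisy([0], p, rng)[0] != 0 for _ in range(trials))
coded_errors = sum(decode(noisy(encode(0), p, rng)) != 0 for _ in range(trials))
print(raw_errors / trials, coded_errors / trials)  # ~0.10 vs ~0.028
```

With a 10% per-bit error rate, the encoded version fails only when two or more of the three bits flip (probability 3p²(1−p) + p³ ≈ 0.028), which is why redundancy helps even though each physical bit is just as noisy as before.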

Although our systems are slowly improving over time, we are still in what is known as the noisy intermediate-scale quantum (NISQ) era. This means that although we have the technology to build systems of up to a few hundred qubits, the systems are still incredibly sensitive to their surroundings and can lose their “coherence” after just a few seconds of work.

The physical materials we use to make quantum computers don’t help at all.

“In a classical system, hard drives are made of magnetic memory, and the magnetic material has the property of retaining a memory of its state for a long time…nature actually does a lot of error correction for us for free,” explains Tom Stace, Deputy Director of the ARC Centre of Excellence in Engineered Quantum Systems in Queensland.

“For quantum computers, we don’t have an analogous substance. We don’t have something that preserves quantum states indefinitely. So we really need to design something like a quantum magnet, because something that can store quantum information indefinitely doesn’t exist in nature in any form.”

There have been several approaches to this problem.

Groups like IBM and Google use “superconducting transmon” qubits made of materials like niobium and aluminium on silicon. Some teams, like the Quantum Group at the University of New South Wales, use silicon and phosphorus atoms to do “semiconductor” quantum computing, while Quantum Brilliance uses nitrogen atoms inside a diamond lattice to create its diamond-based qubits.

Quantum Brilliance is one of the few companies in the world already able to provide quantum computing systems for customers to operate on-premises today. Credit: Quantum Brilliance

Each of these technologies has its advantages and disadvantages.

Superconducting qubits are better developed and more precise, but still have high noise-to-signal ratios compared to traditional computers. These qubits are also more easily able to “interact” with each other, which is an important way for this technology to evolve. However, this makes them incredibly complex machines. As Stace describes it, these are qubits “all the way down”, requiring many physical qubits to perform error correction so that a smaller number of logical qubits can carry out the actual computation.

Then there’s the issue of temperature – being a superconductor means the machine has to be kept near absolute zero. Even if you could put that in a laptop, the energy cost of running it means you probably wouldn’t want to.

Silicon and diamond aren’t as advanced as their superconducting counterparts when it comes to noise and error correction, but they do have the advantage of not needing super-cold temperatures.

Silicon qubits have recently achieved over 99% accuracy, which is an exciting milestone. This means that researchers can start implementing error correction in the same way superconducting qubits do. But this technology still requires cool temperatures, and systems so far contain only two qubits, so there’s still a long way to go before it’s likely ready for large-scale manufacturing.

Then there are diamond-based qubits. The technology itself has similar issues to silicon – although diamond qubits can work at room temperature – but Quantum Brilliance is apparently much further along, with the team soon to deliver a quantum chip to the Pawsey Supercomputing Centre in Perth.

“Diamond is one of the most widely used quantum technologies, but it’s primarily used for sensing,” says Horsley. “It’s at room temperature, it’s a very simple system, very efficient.

“The challenge has been to scale it beyond a handful of qubits.”

The reason they are difficult to scale is the particular way they are made. In a process known as shotgun implantation, nitrogen atoms or electrons are fired at a piece of synthetic diamond to create what is called a “nitrogen vacancy centre”. The problem is that, of the many atoms fired, researchers might get only one or two in the right place to use as a qubit.

Instead, the Quantum Brilliance team is working on a system where they implant a nitrogen vacancy, then grow more diamond, then implant another nitrogen vacancy, and so on.

“The challenges you would have anticipated having to solve at the time would be kind of inconceivable, but nonetheless, we solved them 80 years later. I think anything is possible.”

Tom Stace, ARC Center of Excellence in Engineered Quantum Systems, Queensland.

They have big plans for a 50-qubit system built this way, which would make the technology useful to pair with a typical computer to speed up time-consuming, CPU-intensive tasks like speech-to-text conversion, especially where it wouldn’t be easy to access the cloud for extra processing power.

That goal is still a long way off — despite exciting new software to connect the quantum system to the classical system, the box going to Pawsey’s center has only two lone qubits.

La Trobe, RMIT University and the Australian-German quantum computing hardware company, Quantum Brilliance, are designing a diamond computer chip. 2022. Credit: La Trobe University

“The really exciting thing about it is less its computing power, [and] more that we were able to take a really complex set of tabletop systems, put it in a box, ship it 3,500 kilometres and run it in a supercomputing centre there,” says Horsley.

“I’m very curious – what are all the strange things [that will stem from] just having a box in their facilities?”

Quantum computing – don’t give up

Although there are some very big technological hurdles to overcome before quantum computing is likely to be a part of our lives, Stace and Horsley suggest we shouldn’t put aside our dreams of owning a personal quantum laptop.

“If you go back to the 1940s, and people were inventing the first serious digital computers, you couldn’t even ask the question, ‘Will we have a laptop?’ because no one could even conceive of that as a thing to have,” says Stace.

“The challenges you would have anticipated having to solve at the time would be kind of inconceivable, but nonetheless, we solved them 80 years later. I think anything is possible.”
