Quantum computing is tantalisingly out of reach

Quantum computing is one of the most exciting (and frenetic) research fields right now, but if you ask a scientist how far quantum computers are from being genuinely useful, you’ll get the classic scientist’s “five to 10 years” answer.

So, if a new quantum reality is still perhaps a decade away, how far are we from personal quantum technology in our desktop or laptop?

Quantum Laptop
Beijing, December 2020: China reaches a new milestone in quantum computing speed. Credit: Xinhua News Agency / Contributor / Getty

Although there are still some very big hurdles to overcome in quantum computing, some research groups are already connecting very small quantum processors to traditional computers, and the future (however many years away it may be) looks bright.

“With quantum computing, you are taking advantage of an otherwise unused physical phenomenon,” says Dr Andrew Horsley, CEO of Quantum Brilliance, an Australian team working on diamond-based quantum technology.

“With quantum computing, you take advantage of an otherwise unused physical phenomenon”

Dr Andrew Horsley, CEO of Quantum Brilliance

“The last time we did that was electricity – that’s a pretty big step in the technological fundamentals of society.”

One of the reasons why quantum computing has been so difficult to get off the ground is that the architecture supporting it has had to be built entirely from scratch.

Quantum computers make calculations using “qubits”, the quantum equivalent of computing “bits”. But instead of being either on or off (1 or 0 in binary), a qubit can be in a mixture of 0 and 1. Imagine a qubit as a globe, where the qubit’s state is a point on its surface: the point’s longitude and latitude describe the particular mixture of 0 and 1.
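This globe picture is known as the Bloch sphere. As a minimal sketch (not from the article), here is how latitude and longitude map to a mixture of 0 and 1 in plain Python; the function name and the degree-based convention are illustrative choices:

```python
import math

def qubit_state(latitude_deg, longitude_deg):
    """Return the (alpha, beta) amplitudes for a point on the Bloch sphere.

    Latitude 90 (the north pole) is a pure 0; latitude -90 (the south pole)
    is a pure 1. The longitude sets the relative phase between the two.
    """
    theta = math.radians(90 - latitude_deg)  # polar angle measured from the "0" pole
    phi = math.radians(longitude_deg)
    alpha = math.cos(theta / 2)                                          # weight on 0
    beta = complex(math.cos(phi), math.sin(phi)) * math.sin(theta / 2)   # weight on 1
    return alpha, beta

# A point on the equator is an even mixture: 50% chance of reading 0, 50% of 1.
alpha, beta = qubit_state(0, 0)
print(round(abs(alpha) ** 2, 3), round(abs(beta) ** 2, 3))  # 0.5 0.5
```

Squaring the size of each amplitude gives the probability of measuring 0 or 1, which is why the poles behave like ordinary bits and every other point is a genuine mixture.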

Engineer Hannah Wagenknecht shows a wafer with photonic chips considered the “building blocks” for bringing quantum technology into everyday products. Sept 2021. Credit: Thomas Kienzle / AFP / Getty

Unfortunately, the price of adding quantum behaviour to these bits is much more room for noise – random variations that can change the value of the result. Without special error correction, this can mean that a qubit returns the wrong value. Even Google’s 2019 demonstration of quantum supremacy was 99% noise and only 1% signal.

Although these systems are slowly improving over time, we are still in what is known as the noisy intermediate-scale quantum (NISQ) era. This means that even though we have the technology to build systems of up to a few hundred qubits, those systems are still incredibly sensitive to their environment and can lose “coherence” after just a few seconds of work.

The actual physical materials we use to make quantum computers don’t do us any favours, either.

“In a classical system, hard drives are made of magnetic memory, and magnetic material has the property that it retains a memory of its state for a long time … nature actually does a lot of error correction for us for free,” says Tom Stace, Deputy Director of the ARC Centre of Excellence for Engineered Quantum Systems in Queensland.

“For quantum computers, we don’t have an analogous material. We don’t have anything that preserves quantum states indefinitely. So we really have to engineer something that looks like a quantum magnet, because a thing that is capable of storing quantum information indefinitely does not exist in nature in any form.”

There have been several approaches to this problem.

Groups such as IBM and Google use superconducting “transmon” qubits made of materials such as niobium and aluminium on silicon. Other teams, such as the quantum group at the University of New South Wales, use silicon and phosphorus atoms to make “semiconductor” qubits, while Quantum Brilliance uses nitrogen atoms inside a diamond lattice to create its diamond-based qubits.

Quantum Brilliance is one of only a few companies worldwide capable of delivering quantum computer systems for customers to operate on-site today. Credit: Quantum Brilliance

Each of these technologies has its advantages and disadvantages.

Superconducting qubits are better developed and more accurate, but still have high noise-to-signal ratios compared to traditional computers. These qubits are also more easily able to “interact” with each other, which is an important way for the technology to scale. However, this makes them incredibly complicated machines. As Stace describes it, it’s qubits all the way down: physical qubits in one section of the machine perform error correction so that logical qubits in another section can do the actual calculation.
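The idea of spending extra physical resources to protect one logical value can be illustrated with the simplest classical analogue: a three-bit repetition code with majority voting. This is only a toy sketch – real quantum machines use far more sophisticated quantum codes – but it shows why redundancy beats noise:

```python
import random

def encode(logical_bit):
    # One logical bit is stored redundantly across three "physical" bits.
    return [logical_bit] * 3

def apply_noise(bits, flip_prob):
    # Each physical bit may flip independently -- the "noise" in the article.
    return [b ^ 1 if random.random() < flip_prob else b for b in bits]

def decode(bits):
    # Majority vote recovers the logical bit as long as at most one bit flipped.
    return 1 if sum(bits) >= 2 else 0

random.seed(1)
trials = 10_000
raw_errors = sum(apply_noise([0], 0.1)[0] for _ in range(trials))
corrected_errors = sum(decode(apply_noise(encode(0), 0.1)) for _ in range(trials))
print(raw_errors / trials, corrected_errors / trials)  # roughly 0.10 vs 0.03
```

With a 10% chance of any single flip, the encoded version only fails when two or more of the three bits flip, so the logical error rate drops to about 3%. Quantum error correction follows the same logic, but must do it without ever directly reading the protected qubits.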

Then there is the issue of temperature – being superconducting means the machine must be kept close to absolute zero. Even if you could fit that into a laptop, the energy cost of running it means you probably wouldn’t want to.

Silicon and diamond are not as advanced as their superconducting counterparts when it comes to noise and error correction, but they have the advantage of not having to be kept at supercold temperatures.

Silicon qubits recently achieved more than 99% accuracy, which is an exciting milestone. It means that researchers can begin to implement error correction in the same way that superconducting qubits do. But this technology still requires cold temperatures and has only had two qubits in a system, so there is still a long way to go before it is likely to be ready for scale manufacturing.

Then there are diamond-based qubits. The technology itself has similar problems to silicon – although diamond qubits can run at room temperature – but Quantum Brilliance is apparently much further ahead, with the team soon making a quantum chip available to the Pawsey Supercomputing Centre in Perth.

“Diamond is one of the most widely used quantum technologies, but for the most part it is only used for sensing,” says Horsley. “It’s room temperature, it’s a very simple system, it’s very high performance.

“The challenge is scaling past a handful of qubits.”

The reason it is difficult to scale is the particular way the qubits are made. In a process known as shotgun implantation, nitrogen atoms or electrons are fired into a piece of synthetic diamond to create something called a “nitrogen-vacancy centre”. The problem is, despite the many atoms fired in, the researchers may get only one or two of these sitting at the right depth to be used as a qubit.

Instead, the Quantum Brilliance team is working on a system where they implant one nitrogen vacancy, then grow more diamond, then implant another nitrogen vacancy, and so on.

“The challenges you would have had to solve then would have been unimaginable, but we solved them 80 years later. I think anything is possible.”

Tom Stace, ARC Centre of Excellence for Engineered Quantum Systems, Queensland

They have grand plans for a 50-qubit system built in this way, which would make the technology useful for pairing with a classical computer to speed up time-sensitive, processor-hungry tasks like speech-to-text conversion, especially where there is no easy access to the cloud and its additional processing power.

This goal is still quite far away – despite exciting new software to connect the quantum system to the classical one, the box going to the Pawsey Centre has only two solitary qubits.

La Trobe University, RMIT University and Australian-German quantum computing hardware company Quantum Brilliance are making diamond computing chips. 2022. Credit: La Trobe University

“The really exciting thing about it is less its computing power, [and] more that we could then take a really complex set of tabletop systems, put that in a box and ship it 3,500 miles away and run it in a supercomputing centre there,” says Horsley.

“I’m very curious – what strange things [will stem from] just having a box in their facility?”

Quantum computing – don’t give up

Although there are some very big technological hurdles to overcome before quantum computing is likely to be part of our lives, both Stace and Horsley suggest not giving up on dreams of owning a personal quantum laptop.

“If you go back to the 1940s, when people invented the first serious digital computers, you wouldn’t even be able to ask the question, ‘Will we have a laptop?’ because no one could even conceive of it as something to have,” says Stace.

“The challenges you would have had to solve then would have been unimaginable, but we solved them 80 years later. I think anything is possible.”
