Paul Benioff (born 1930) is a US physicist who in 1980 imagined the feats computing might achieve if it could harness quantum mechanics, where a quantum is the smallest possible amount of a physical property that can be involved in an interaction – it’s basically the world of atoms and sub-atomic particles.
Benioff’s imagination helped give rise to the phrase ‘quantum computing’, a term that heralds how the storage and manipulation of information at the sub-atomic level would usher in computing feats far beyond those of ‘classical’ computers.
Benioff was, coincidentally, thinking along the same lines as Russian mathematician Yuri Manin (born 1937), who was outlining a similar, still-vague concept at the time. Since then, many others have promoted the potential of computing grounded in the concept of ‘superposition’, whereby matter can exist in different states at the same time.
Quantum computing is built on manipulating the superposition of the qubit, the name of its computational unit. While a computational unit in classical computing can only be 0 or 1, a qubit in superposition is said to occupy the ‘basis states’ of 0 and 1 at the same time. This characteristic, on top of the ability of qubits to be correlated with other qubits that are not physically connected (a characteristic known as entanglement), is what proponents say gives quantum computers the theoretical ability to evaluate millions of possibilities in seconds, something far beyond the power of the transistors powering classical computers.
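In notation, a qubit’s state is a weighted combination of the two basis states, and measuring it yields 0 or 1 with probabilities given by the squares of those weights. The toy Python sketch below illustrates only this probability rule – it is not how real quantum hardware works, and the function names are ours for illustration. It prepares an equal superposition and ‘measures’ it many times:

```python
import random

# A single qubit is described by two complex amplitudes (alpha, beta)
# for the basis states 0 and 1, with |alpha|^2 + |beta|^2 = 1.

def measure(alpha: complex, beta: complex) -> int:
    """Collapse the qubit: return 0 with probability |alpha|^2, else 1."""
    return 0 if random.random() < abs(alpha) ** 2 else 1

# Equal superposition: alpha = beta = 1/sqrt(2),
# so each outcome has probability 1/2.
alpha = beta = 2 ** -0.5

counts = {0: 0, 1: 0}
for _ in range(10_000):
    counts[measure(alpha, beta)] += 1

print(counts)  # roughly half the measurements yield 0, half yield 1
```

The advantage proponents describe does not come from this randomness alone, but from interference and entanglement across many qubits at once – effects this single-qubit sketch cannot capture.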
In 2012, US physicist and academic John Preskill (born 1953) coined the term ‘quantum supremacy’ to describe how quantum machines one day could make classical computers look archaic.
In October last year, a long-awaited world first arrived. NASA and Google claimed to have attained quantum supremacy when something not “terribly useful” was computed “in seconds what would have taken even the largest and most advanced supercomputers thousands of years”. The pair conceded that their computation on a 53-qubit machine meant they were only able “to do one thing faster, not everything faster”. Yet peers at IBM dismissed the claim as “grandiosity” anyway, saying one of IBM’s supercomputers could have done the same task in two-and-a-half days.
Nonetheless, most experts agreed the world had edged closer to the transformative technology. Hundreds of millions of dollars are pouring into research because advocates claim quantum computing promises simulations, searches, encryptions and optimisations that will lead to advancements in artificial intelligence, communications, encryption, finance, medicine, space exploration, even traffic flows.
No one questions that practical quantum computing could change the world. But the hurdles are formidable to accomplish a leap built on finicky qubits in superposition, entanglement and ‘error correction’, the term for overcoming ‘decoherence’, whereby qubits slip out of their quantum states – errors that cannot be directly inspected while the qubits remain in superposition. There is no knowing when, or if, a concept reliant on mastering so many tricky variables will eventuate. While incremental advancements will be common, the watershed breakthrough could prove elusive for a while yet.
To be clear, quantum computing is expected to be designed to work alongside classical computers, not replace them. Quantum computers are large machines that require their qubits to be kept near absolute zero (minus 273 degrees Celsius), so don’t expect them in your smartphones or laptops. And rather than the large number of relatively simple calculations done by classical computers, quantum computers are only suited to a limited number of highly complex problems with many interacting variables. Quantum computing would come with drawbacks too. The most-flagged disadvantage is the warning that a quantum computer could quickly crack the encryption that protects classical computers. Another concern is that quantum computing’s potential would add to global tensions if one superpower gains an edge. The same applies in the commercial world if one company dominates. Like artificial intelligence, quantum computing has had its ‘winters’ – when its challenges smothered the excitement and research dropped off.
That points to the biggest qualification to today’s optimism about quantum computing: that it might take a long time to get beyond today’s rudimentary levels, where quantum machines are no more powerful than classical supercomputers and can’t do anything practical. But if quantum computing becomes mainstream, a new technological era would have begun.
For the full version of this article and to view sources, go to: magellangroup.com.au/insights/