Paul Benioff (born 1930) is a US physicist who in 1980 imagined the feats computing might achieve if it could harness quantum mechanics, where the word quantum refers to the tiniest amount of something needed to interact with something else; it's basically the world of atoms and sub-atomic particles.
Benioff's imagination helped give rise to the phrase "quantum computing", a term that heralds how the storage and manipulation of information at the sub-atomic level would usher in computing feats far beyond those of "classical" computers.
Benioff was, coincidentally, thinking along similar lines to Russian mathematician Yuri Manin (born 1937), who was outlining a related concept at the same time. Since then, many others have promoted the potential of computing grounded in the concept of "superposition", whereby matter can be in different states at the same time.
Quantum computing is built on manipulating the superposition of the qubit, the name of its computational unit. When in superposition, a qubit is said to be in the "basis states" of 0 and 1 at the same time, whereas a computational unit in classical computing can only be 0 or 1. This characteristic, on top of the ability of qubits to influence one another even when not physically connected (a characteristic known as entanglement), is what proponents say gives quantum computers the theoretical ability to evaluate millions of possibilities in seconds, something far beyond the power of the transistors powering classical computers.
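The arithmetic behind superposition and entanglement can be sketched, purely for illustration, on a classical machine: a qubit's state is a pair of complex "amplitudes" over the basis states, and measurement probabilities are the squared magnitudes of those amplitudes. The snippet below is a minimal NumPy sketch of that bookkeeping; the variable names are ours, not any standard API.

```python
import numpy as np

# A qubit's state is a 2-vector of complex amplitudes over the
# basis states |0> and |1>.
ket0 = np.array([1, 0], dtype=complex)   # basis state |0>
ket1 = np.array([0, 1], dtype=complex)   # basis state |1>

# An equal superposition of |0> and |1>: both basis states "at once".
plus = (ket0 + ket1) / np.sqrt(2)

# Measurement probabilities are squared magnitudes of the amplitudes.
p0 = abs(plus[0]) ** 2   # probability of reading 0
p1 = abs(plus[1]) ** 2   # probability of reading 1

# A two-qubit entangled (Bell) state, (|00> + |11>) / sqrt(2):
# measuring one qubit fixes the other's outcome, with no physical link.
bell = np.zeros(4, dtype=complex)
bell[0] = bell[3] = 1 / np.sqrt(2)

print(p0, p1)          # 0.5 0.5
print(abs(bell) ** 2)  # only 00 and 11 ever occur, each with probability 0.5
```

Note the scaling that excites proponents: simulating n qubits classically requires tracking 2**n amplitudes, which is why classical machines cannot keep up beyond a few dozen qubits.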
In 2012, US physicist and academic John Preskill (born 1953) devised the term "quantum supremacy" to describe how quantum machines one day could make classical computers look archaic.
In October 2019, a long-awaited world first arrived. Google and NASA claimed to have attained quantum supremacy when something not "terribly useful" was computed "in seconds what would have taken even the largest and most advanced supercomputers thousands of years". The pair were modest, acknowledging that their computation on a 53-qubit machine meant they were only able "to do one thing faster, not everything faster". Yet peers at IBM dismissed the claim as "grandiosity" anyway, saying one of IBM's supercomputers could have done the same task in two-and-a-half days.
Nonetheless, most experts agreed the world had edged closer to the transformative technology. Hundreds of millions of dollars are pouring into research because advocates claim quantum computing promises simulations, searches, encryptions and optimisations that will lead to advancements in artificial intelligence, communications, encryption, finance, medicine, space exploration, even traffic flows.
No one questions that practical quantum computing could change the world. But the hurdles are formidable to accomplish a leap built on finicky qubits in superposition, entanglement and "error correction", the term for overcoming the "decoherence" caused by errant qubits, whose errors can't be detected directly because measuring a qubit destroys its superposition. There's no knowing when, or if, a concept reliant on mastering so many tricky variables will eventuate. While incremental advancements will be common, the watershed breakthrough could prove elusive for a while yet.
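Quantum error correction itself can't be demonstrated on a classical machine, but the intuition behind redundancy-based correction can be: the classical three-bit repetition code below is a simplified analogue of the codes quantum researchers build on. It is a sketch under that analogy only; the function names are ours, and real quantum codes must work around the facts that qubits cannot be copied and cannot be measured without collapsing their superposition.

```python
import random

def encode(bit):
    """Classical 3-bit repetition code: store the logical bit three times."""
    return [bit, bit, bit]

def noisy_channel(codeword, flip_prob=0.1):
    """Flip each physical bit independently with probability flip_prob."""
    return [b ^ (random.random() < flip_prob) for b in codeword]

def decode(codeword):
    """Majority vote recovers the logical bit if at most one bit flipped."""
    return int(sum(codeword) >= 2)

random.seed(0)
sent = 1
received = noisy_channel(encode(sent))
assert decode(received) == sent  # with this seed, no flips occur
```

Quantum codes replace the copying step with entanglement across several physical qubits and replace the majority vote with indirect "syndrome" measurements that reveal where an error occurred without reading the protected data, which is part of why the engineering is so hard.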
To be clear, quantum computing is expected to work alongside classical computers, not replace them. Quantum computers are large machines whose qubits must be kept near absolute zero (minus 273 degrees Celsius), so don't expect them in your smartphones or laptops. And rather than the large number of relatively simple calculations done by classical computers, quantum computers are only suited to a limited number of highly complex problems with many interacting variables. Quantum computing would come with drawbacks too. The most flagged disadvantage is the warning that a quantum computer could quickly crack the encryption that protects classical computers. Another concern is that quantum computing's potential would add to global tensions if one superpower gains an edge. The same applies in the commercial world if one company dominates. Like artificial intelligence, quantum computing has had its "winters", when its challenges smothered the excitement and research dropped off.
That points to the biggest qualification about today's optimism about quantum computing: it might take a long time to get beyond today's rudimentary levels, where quantum machines are no more powerful than classical supercomputers and can't do practical things. But if quantum computing becomes mainstream, a new technological era would have started.
For the full version of this article and to view sources, go to: magellangroup.com.au/insights/