The invention of the computer changed everything. Now, with the advancements in quantum computing, we are yet again on the brink of another revolution.
In this article, we are rewinding the clock to look at some of the major milestones in the history of quantum technology.
We’ll show how society as a whole can benefit from these advancements, from artificial intelligence to improved drug design.
So hop on as we skim through some of the most outstanding achievements in human technology — and take a glimpse at what is yet to come.
Quantum mechanics was developed in the early 1900s, but the idea of quantum computing emerged only decades later, toward the end of the century.
Richard Feynman’s keynote speech of 1981 is often considered the starting point.
In his now-famous presentation, he argued that since nature works on a quantum level, a quantum computer would have the ability to simulate physical phenomena that a classical computer could not.
In 1992, David Deutsch and Richard Jozsa proposed one of the first quantum algorithms. Although it has little practical use, it was among the first examples of an algorithm that is exponentially faster than any deterministic classical algorithm.
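The idea behind the Deutsch-Jozsa algorithm can be illustrated with a small classical simulation of the quantum circuit. The sketch below, written with plain Python lists as the state vector, assumes the standard textbook formulation: a phase oracle marks inputs where f(x) = 1, and a single measurement after Hadamard gates reveals whether f is constant or balanced — something a deterministic classical algorithm needs exponentially many queries to guarantee.

```python
from math import sqrt

def hadamard_all(state):
    """Apply a Hadamard gate to every qubit (a fast Walsh-Hadamard transform)."""
    n = len(state)
    h = 1
    while h < n:
        for i in range(0, n, h * 2):
            for j in range(i, i + h):
                a, b = state[j], state[j + h]
                state[j], state[j + h] = (a + b) / sqrt(2), (a - b) / sqrt(2)
        h *= 2
    return state

def deutsch_jozsa(f, n_qubits):
    """Probability of measuring |0...0>: 1.0 if f is constant, 0.0 if balanced."""
    dim = 2 ** n_qubits
    # Step 1: uniform superposition (Hadamards applied to |0...0>)
    state = [1 / sqrt(dim)] * dim
    # Step 2: phase oracle flips the sign of every amplitude where f(x) == 1
    state = [(-amp if f(x) else amp) for x, amp in enumerate(state)]
    # Step 3: Hadamard every qubit again, then "measure" the all-zeros outcome
    state = hadamard_all(state)
    return state[0] ** 2

print(deutsch_jozsa(lambda x: 0, 3))      # constant f -> probability ~1.0
print(deutsch_jozsa(lambda x: x & 1, 3))  # balanced f -> probability ~0.0
```

One quantum query to the oracle settles the question with certainty, which is the exponential separation the text refers to.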
In 1994, Peter Shor introduced a quantum algorithm for finding the prime factors of an integer. With the ability to swiftly factor even large numbers, a task notoriously hard for classical computers, Shor’s algorithm poses a significant threat to cybersecurity and cryptography.
In 2001, researchers at IBM demonstrated Shor's algorithm by factoring the number 15 on a quantum computer with seven qubits. Independent teams have since performed further demonstrations, factoring numbers as large as 21.
As these modest results suggest, Shor’s algorithm can only work its full magic on a fault-tolerant quantum device, so the threat it poses won’t be realized in today's noisy intermediate-scale quantum (NISQ) era.
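To see why factoring 15 was the natural first demonstration, consider the reduction at the heart of Shor's algorithm: if you know the period r of a modulo N, the factors of N usually fall out from two gcd computations. The sketch below is purely classical; the quantum part of Shor's algorithm replaces the brute-force period search with an exponentially faster quantum subroutine. The choice a = 7 is an assumption for illustration (any a coprime to 15 with a usable period works).

```python
from math import gcd

def classical_period(a, n):
    """Find the order r of a modulo n: the smallest r > 0 with a**r % n == 1.
    This is the step a quantum computer accelerates; here it is brute force."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_factor(n, a):
    """Classical sketch of Shor's reduction: recover factors of n from the
    period of a mod n. Assumes gcd(a, n) == 1 and an even, nontrivial period."""
    r = classical_period(a, n)
    if r % 2 == 1:
        raise ValueError("odd period, pick another a")
    y = pow(a, r // 2, n)  # a^(r/2) mod n
    return gcd(y - 1, n), gcd(y + 1, n)

print(shor_factor(15, 7))  # -> (3, 5), echoing the 2001 IBM demonstration
```

For N = 15 and a = 7, the powers of 7 mod 15 cycle 7, 4, 13, 1, so the period is 4, and gcd(7² − 1, 15) and gcd(7² + 1, 15) yield the factors 3 and 5.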
The development of quantum hardware also began in the late 20th century.
One of the main questions was, and still is, how best to physically build a quantum computer.
Many different physical implementations have been proposed: companies such as IBM and IQM use superconducting qubits, while photonic chips are used, for example, in China.
In 1999, researchers Yasunobu Nakamura and Jaw-Shen Tsai demonstrated for the first time that a superconducting circuit could be used as a qubit. (The device IBM used in 2001 to run Shor’s algorithm, by contrast, relied on nuclear magnetic resonance.)
The first commercial quantum device, running on superconducting technology, was developed in 2011 by D-Wave.
This so-called quantum annealer was designed to solve specific optimization problems. Although there was some debate about whether the device exhibited genuine quantum effects, soon after the announcement D-Wave sold its first quantum computer to aerospace and defense company Lockheed Martin for a rumored $10 million.
Later in the same decade, hardware started to improve considerably as big corporations such as Google began investing in quantum technology.
The number and quality of qubits have been steadily improving, and sure enough, in 2019, Google announced that it had achieved quantum supremacy with its 53-qubit Sycamore processor.
Google claimed to have performed a series of operations in 200 seconds that would take a supercomputer 10,000 years to complete.
The exact numbers, however, are debatable: IBM suggested a technique that could cut the supercomputer's time from 10,000 years down to 2.5 days.
Nonetheless, Google's demonstration was an impressive milestone.
In 2021, China announced it had also reached quantum supremacy with two separate devices working with different technologies.
In the first demonstration, using a photonic approach, they claimed to be capable of solving sampling problems up to 10²³ times faster than a supercomputer.
The second demonstration indicated speed-ups of up to 1000 times compared to a supercomputer, using a 66-qubit superconductor-based computer.
With hardware developing rapidly, companies have started to look for practical applications for quantum devices, of which one of the first might be quantum-enhanced machine learning.
Quantum-enhanced machine learning refers to quantum algorithms that solve machine-learning tasks with improved techniques and speed, which could in turn advance artificial intelligence more broadly.
Also, as Feynman argued in 1981, quantum devices are excellent for simulating quantum systems, that is, systems that abide by the laws of quantum mechanics.
Precisely simulating the workings of atoms, molecules, and their interactions has enormous implications in drug design and the field of medicine altogether.
Quantum devices can also, as it turns out, speed up the solution of the partial differential equations (PDEs) that describe much of classical physics, enabling even exponential speed-ups in simulations at the heart of science, engineering, and R&D.
Here at Quanscient, we are focusing on exactly this, and we already have preliminary quantum algorithms currently being validated and tested with quantum devices.
Although the fault-tolerant era of quantum computing may still be years away, our goal is to deliver significant speed-ups with these algorithms within the next 2 to 4 years, especially in fluid dynamics and electromagnetics.