IBM has announced it’s surmounted one of the biggest hurdles on the road toward creating the world’s first truly usable quantum computer.
A number of analysts have predicted that the jump from traditional computing to quantum chips could be on par with the revolution we saw when the world moved from vacuum tubes to integrated circuits back in the early sixties.
The reason for this increased power is that quantum computers can process vastly more calculations at once than traditional CPUs, because where a classical transistor can only ever sit in one of two states, on or off, a quantum bit can effectively occupy both at the same time.
How is that possible? Well, while the specifics of the mechanism involve a bit more math than I could sit through in college, at its essence the computer is taking advantage of a quantum phenomenon known as “superposition,” in which a particle isn’t pinned to one definite state but can exist in a blend of several states at once.
In short, this means that, at least in theory, every quantum bit (or “qubit”) added to a machine doubles the number of states it can juggle at once, so its capacity grows exponentially rather than linearly. This has made the race to create the world’s first true quantum computer a bit of a Holy Grail moment for big chip makers, who have found themselves inching closer to maxing out Moore’s Law as 22-nanometer transistors shrink to 14nm, and 14nm tries to make the jump to 10nm.
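To make that scaling concrete, here’s a minimal sketch in plain Python with NumPy, an illustration only rather than anything IBM actually runs: a single qubit’s state is a pair of complex amplitudes instead of a lone 0 or 1, and each additional qubit doubles the number of amplitudes a classical machine would need to keep track of.

```python
# Minimal illustration (not IBM's hardware or software): a qubit's state is a
# pair of complex amplitudes, and each extra qubit doubles the bookkeeping.
import numpy as np

# A classical bit is 0 or 1. A qubit in an equal superposition carries
# amplitudes for |0> and |1> at the same time.
qubit = np.array([1, 1], dtype=complex) / np.sqrt(2)

# Squared magnitudes give the odds of reading 0 or 1 when we finally measure.
print(np.abs(qubit) ** 2)  # [0.5 0.5]

# Two qubits together are described by the tensor (Kronecker) product of
# their states: already 4 amplitudes, and it keeps doubling from there.
two_qubits = np.kron(qubit, qubit)
print(two_qubits.size)  # 4

for n in (10, 20, 30):
    print(n, "qubits ->", 2 ** n, "amplitudes to track classically")
```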
So far we’ve seen just one company pull out in front of the herd with its own entry, D-Wave, which first debuted all the way back in 2013. Unfortunately for futurists, the D-Wave machine is more a proof of concept that quantum computing is at least possible than a system that’s meaningfully quicker than what we have to work with today.
IBM has found a way for quantum computers to check their own errors, a key step toward overcoming quantum decoherence.
Now though, according to a statement released by IBM Research, it seems Big Blue may have found a way around one of the biggest obstacles in quantum computing by tackling a problem known as “quantum decoherence.”
Decoherence is the stumbling block quantum computers run into when there’s too much “noise” around a chip, whether from heat, radiation, or defects in the material itself. The systems that support quantum chips are incredibly sensitive pieces of machinery, and even the slightest bit of interference can make it impossible to know whether or not the computer successfully figured out that two plus two equals four.
IBM’s answer is to up the number of qubits from two to four and arrange them on a square lattice grid, where some qubits hold data while the others constantly check their neighbors for mistakes. That layout lets the chip spot both kinds of quantum errors, bit flips and phase flips, at the same time, rather than only one or the other.
In layman’s terms, this means that researchers can accurately track errors in a qubit’s quantum state without destroying the result simply through the act of observing it.
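For a feel of how an error check can work without reading out the protected data, here’s a toy simulation of the simplest textbook version of the idea, the three-qubit bit-flip code, again in plain Python. It’s only a sketch under simplifying assumptions: IBM’s actual demonstration used four superconducting qubits on a square lattice and also catches phase-flip errors, which this little repetition code can’t. But the core trick is the same: measuring parities between qubits reveals where an error hit without ever looking at, and therefore disturbing, the encoded information.

```python
# Toy simulation of the three-qubit bit-flip code (an illustration only;
# IBM's four-qubit lattice also detects phase-flip errors, which this can't).
import numpy as np

I = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)   # bit-flip operator
Z = np.array([[1, 0], [0, -1]], dtype=complex)  # building block for parity checks

def kron(*ops):
    """Tensor product of single-qubit operators, qubit 0 leftmost."""
    out = np.array([[1]], dtype=complex)
    for op in ops:
        out = np.kron(out, op)
    return out

# Encode a|0> + b|1> redundantly as a|000> + b|111>.
a, b = 0.6, 0.8
state = np.zeros(8, dtype=complex)
state[0b000], state[0b111] = a, b

# Noise strikes: a bit-flip error lands on qubit 1 (the middle qubit).
state = kron(I, X, I) @ state

# Parity checks Z0Z1 and Z1Z2 ask whether neighboring qubits agree (+1) or
# disagree (-1). This "syndrome" can be read without learning, and hence
# without disturbing, the encoded amplitudes a and b. (Here the syndrome is
# deterministic, so expectation values stand in for actual measurements.)
s01 = int(round(np.real(state.conj() @ kron(Z, Z, I) @ state)))
s12 = int(round(np.real(state.conj() @ kron(I, Z, Z) @ state)))
print("syndrome:", s01, s12)  # -1 -1, which points at qubit 1

# Use the syndrome to decide which qubit (if any) to flip back.
syndrome_to_qubit = {(1, 1): None, (-1, 1): 0, (-1, -1): 1, (1, -1): 2}
bad = syndrome_to_qubit[(s01, s12)]
if bad is not None:
    ops = [I, I, I]
    ops[bad] = X
    state = kron(*ops) @ state

print("recovered amplitudes:", state[0b000].real, state[0b111].real)  # 0.6 0.8
```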
“Quantum computing could be potentially transformative, enabling us to solve problems that are impossible or impractical to solve today,” said Arvind Krishna, senior vice president and director of IBM Research, in a statement.
While that may not sound huge, it’s still a big step in the right direction for IBM. The company believes the quantum revolution could be a potential savior for the supercomputing industry, a segment that is projected to be hardest hit by the imminent slowdown of Moore’s trajectory.
Other possible applications up for grabs include solving complex physics problems beyond our current understanding, testing drug combinations by the billions at a time, and creating unbreakable encryption through the use of quantum cryptography.