What the Hell Is a Quantum Computer and How Excited Should I Be?

Image: liquidtransistorresonancyvalley by artist Brenna Murphy. Used with permission. 

They will never sit on your desk, and they will most certainly never fit in your pocket. Today, they’re fragile and need to be kept at temperatures close to absolute zero. Quantum computers aren’t much like the desktop PCs we’re all so familiar with—they’re a whole new kind of machine, capable of calculations so complex that it’s like upgrading from black-and-white to a full color spectrum.

Lately, you’ve been hearing a lot about quantum computing. There are news stories about how it “could change the world” and “open new dimensions.” Universities are hyping up their quantum microchip prototypes, demonstrations of quantum mechanical ideas in silicon, and other devices and theories. But come on, how does it work? What does it do? Who’s doing it? And, most importantly, why should you care?

Despite what you’ve heard, quantum computing right now is more or less where classical computing was in the ‘50s, when room-sized hulks ran on vacuum tubes. But it could revolutionize computing. Potentially. Maybe.

Before you learn what a quantum computer is and why it matters, let’s break down the mathematical theory of quantum mechanics. It may sound esoteric, but the rules of quantum mechanics govern the very nature of the particles that make up our universe, including those of your electronics and gadgets.

When one thing is two things at the same time

In our universe, we are used to a thing being one thing. A coin, for example, can be heads, or it can be tails. But if the coin followed the rules of quantum mechanics, it would behave like a coin still flipping in midair. Until it lands and we look at it, we don’t know if it’s heads or tails. Effectively, it’s both heads and tails at the same time.

We do know one thing about this coin: there is a probability for the flipping coin to come up either heads or tails. So the coin isn’t heads, it isn’t tails, it’s—for example—20 percent likely to be heads and 80 percent likely to be tails. Scientifically speaking, how can a physical thing be like this? How do we even begin to describe it?

The most mind-boggling part of quantum mechanics is that, for some reason, particles like electrons seem to act like waves, and light waves act like particles. Particles have a wavelength. The most basic experiment demonstrating this fact is the double-slit experiment:

If you put a pair of parallel slits in a partition between a beam of particles and a wall, and put a detector on the wall to see what happens, a strange pattern of stripes appears. It’s called an interference pattern.

Like waves, the particle-waves that travel through one slit interfere with those that travel through the other slit. If the peak of one wave aligns with the trough of another, they cancel out and nothing shows up. If a peak aligns with another peak, the signal at the detector is even brighter. (This interference pattern appears even if you send only one electron at a time.)
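In the language of waves, the brightness at any spot on the wall comes from adding the two slits’ wave amplitudes first and squaring afterward. A sketch of the standard textbook relation:

```latex
% Amplitudes from the two slits add, and the detector
% registers the square of the total:
I \;\propto\; |\psi_1 + \psi_2|^2
  \;=\; |\psi_1|^2 + |\psi_2|^2 + 2\,\mathrm{Re}\!\left(\psi_1^{*}\psi_2\right)
```

The cross term at the end is the interference: negative where a peak meets a trough (cancellation), positive where two peaks meet (extra brightness).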

If we were to describe one of these wave-like particles (before they hit the wall) as a mathematical equation, it would look like the mathematical equation describing our coin (before it hits the ground and lands on heads or tails).

These equations can look kind of scary, like this:
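A representative example (one illustrative way to write such a state, using our coin’s heads and tails as the possible outcomes) is:

```latex
% The coin's quantum state: a "superposition" of both outcomes.
% alpha and beta are complex numbers called amplitudes.
|\psi\rangle = \alpha\,|\mathrm{heads}\rangle + \beta\,|\mathrm{tails}\rangle,
\qquad |\alpha|^2 + |\beta|^2 = 1
```

Here α and β are complex numbers called amplitudes; squaring their sizes gives the probabilities. For the coin above, |α|² = 0.2 and |β|² = 0.8.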

But all you need to know is that this kind of equation lists the particle’s possible properties without saying which one you’ll get. (We don’t know that yet.) You can use the equation to find the probabilities of measuring each of those properties.

And because this math involves complex numbers—those containing the square root of -1, or i—it doesn’t just describe the probability of a coin being heads or tails, it describes an advanced probability, which could include the way the face of the coin will be rotated.

From all this crazy math, we get a couple of crazy things. There’s superposition—the midair coin being heads and tails at the same time. There’s interference—probability waves overlapping and cancelling each other out. And there’s entanglement, which is like if we tied a bunch of coins together, changing the probability of certain outcomes because they’re, well, entangled now. These three crazy things are exploited by quantum computers to make whole new kinds of algorithms.

How a quantum computer works

“In some sense we’ve been doing the same thing for 60 years. The rules we use to compute have not changed—we’re stuck with bits and bytes and logic operations,” Martin Laforest, Senior Manager of Scientific Outreach at the Institute for Quantum Computing at the University of Waterloo in Canada, tells Gizmodo. But that is all about to change. “Quantum computers turn the rules of computers on their heads.”

Traditional computers do their computation using bits, which can be stored as electrical charges in processors or even as tiny pits pressed into CDs. A bit has only two choices, which we represent as one and zero. Anything with two choices you can pick from is a bit. All computing is done by setting and relating bits, with operations like “if this bit is a zero and this bit is a one, make this third bit a one; otherwise make it a zero,” and so on and so forth.
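In code, such a rule is just ordinary logic. A minimal sketch in Python (the function name is ours):

```python
# A classical logic operation: every bit is definitely 0 or 1.
def third_bit(bit_a: int, bit_b: int) -> int:
    """If bit_a is 0 and bit_b is 1, return 1; otherwise return 0."""
    return 1 if (bit_a == 0 and bit_b == 1) else 0

print(third_bit(0, 1))  # -> 1
print(third_bit(1, 1))  # -> 0
```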

The qubit, short for quantum bit, is like a regular bit, but it’s both a zero and a one at the same time (before you look at it). It’s that coin flipping in midair. A quantum computer is like flipping multiple coins at the same time—except while these coins are flipping, they obey the wacky rules of superposition, interference and entanglement.

A quantum computer first bestows the qubits with this quantum mechanical version of probability: a description of what will happen once you actually peep at a qubit. (Once you peep at the mysterious qubit, though, it stops being mysterious and becomes a defined bit.) Quantum computations are made by preparing the qubits (adding weights to a coin before you flip it, to manipulate the probability of the outcome), then interacting them together (flipping a bunch of entangled coins at once), and then measuring them (which causes the coins to stop flipping and produces the final value). If done properly, all of this mid-air interaction should result in a best answer (the value) to whatever question you’ve asked the computer.
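You can mimic a single weighted qubit with a few lines of NumPy. This is a toy sketch of the prepare-then-measure step under our coin analogy, not a model of real hardware; the variable names are ours:

```python
import numpy as np

rng = np.random.default_rng()

# Prepare: a qubit's state is two complex amplitudes, one per outcome.
# Here we "weight the coin": 20% chance of 0 (heads), 80% of 1 (tails).
state = np.array([np.sqrt(0.2), np.sqrt(0.8)], dtype=complex)

# Measure: the probability of each outcome is the squared size of its
# amplitude. Peeking collapses the qubit into a plain, defined bit.
probs = np.abs(state) ** 2
outcome = rng.choice([0, 1], p=probs)
print(f"probabilities: {probs}, measured bit: {outcome}")
```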

Quantum computing is special. As we said before, because its math uses complex numbers, it computes a special version of probability—not just heads vs. tails but also the orientation of the coin. So as you throw these coins up in the air, they bump into each other with their different sides and orientations, and some of that bumping changes the probability of which side comes up in the end. Sometimes they bump into each other and cancel each other out, making certain outcomes less likely. Sometimes they push each other along, making certain outcomes more likely. All of this is interference behavior.

“The idea with a quantum computer is that you take this phenomenon and exploit it on a massive scale,” said Scott Aaronson, a theoretical computer scientist at the University of Texas at Austin. “The idea is to choreograph a pattern of interference” so that everything cancels out except for the answer you were looking for. You want the coins to interfere in the air.
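You can watch that cancellation happen in the same toy NumPy style. Apply the Hadamard operation (a real quantum gate that acts like a fair quantum coin flip) twice, and the amplitudes for one outcome cancel entirely, something ordinary probabilities can never do. A minimal sketch, not a hardware simulation:

```python
import numpy as np

# Hadamard: the quantum version of a fair coin flip.
H = np.array([[1, 1],
              [1, -1]]) / np.sqrt(2)

zero = np.array([1.0, 0.0])   # a qubit that definitely reads 0

once = H @ zero               # one flip: 50/50 between 0 and 1
twice = H @ once              # "flip" again: the amplitudes interfere

print(np.abs(once) ** 2)      # -> [0.5 0.5]
print(np.abs(twice) ** 2)     # -> [1. 0.], the 1 outcome canceled out
```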

To the observer, the answer just looks like the output of regular bits. The quantum mechanics happens in the background.

What you can do with it, from chemistry to encryption

Famed physicist Richard Feynman is credited with dreaming up the first quantum computer in a 1982 paper—a computer that could use quantum mechanics to solve certain problems. But it was as if someone had invented a new way of notating music with no instrument to play it on and no compositions written for it. It wasn’t until mathematicians began devising algorithms for this computer to run that it became a more reasonable dream to pursue. Theorists wrote the compositions (the algorithms), while physicists worked on building the instruments (the physical quantum computers).

But okay, now you just have these weird quantum bits whose output you can’t guess beforehand. You have to figure out how to use them. Today, there are several areas where researchers think a quantum computer could solve certain problems better than a classical one.

Most obviously, you can use these quantum bits to create simulations of other things that follow the crazy rules of quantum mechanics: namely, atoms and molecules. Scientists can use qubits to model entire molecules and their interactions. This could help drug companies devise new medicines, or create new materials with desired properties, before ever setting foot in a lab.

Scientists have already been able to model these molecules using classical computers, but a quantum computer offers a huge speedup. Fully representing the behavior of the caffeine molecule, including the relevant quantum mechanical rules of its individual particles, might take 160 qubits, explained Robert Sutor, vice president of Cognitive, Blockchain, and Quantum Solutions at IBM. Doing so with a classical computer to that level of detail would require around 10^48 bits, approaching the number of atoms in planet Earth (between 10^49 and 10^50).
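The arithmetic behind that comparison: fully describing n qubits on a classical machine takes roughly 2^n numbers (the amplitudes), and for n = 160 that is

```latex
2^{160} \approx 1.5 \times 10^{48}
```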

IBM has already modeled the far lighter beryllium hydride molecule using a six-qubit quantum computer. Researchers at Lawrence Berkeley National Laboratory determined all of the energy states of a hydrogen molecule with their own two-qubit quantum computer.

There are other algorithms that researchers think might provide some sort of speedup over classical computers. Grover’s algorithm, for example, can speed up searching through unsorted data. Some researchers are working on using quantum computing in artificial intelligence, or in optimization problems such as “find the biggest mountain in this mountain range” and “find the fastest route between these two points separated by several rivers crossed by several bridges.”
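For a sense of Grover’s speedup: to find one marked item among N unsorted items, a classical computer needs about N/2 looks on average, while Grover’s algorithm needs only on the order of the square root of N queries:

```latex
\text{classical search: } \sim N/2 \text{ queries}
\qquad
\text{Grover's algorithm: } \sim \tfrac{\pi}{4}\sqrt{N} \text{ queries}
```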

But perhaps the most talked-about quantum algorithm is something called Shor’s algorithm, which could change the way almost all of our data is encrypted.

Devised by Peter Shor in 1994, its purpose is to factor numbers into primes. I literally mean the factoring you learned in elementary school, the way you can break 15 into its factors, 3 and 5. Multiplying numbers together is a simple computational task, but breaking big numbers into their factors takes far longer. Modern cryptography is built on this asymmetry, so lots of your data is, in its most simplified form, encrypted “securely” by converting things into numbers, multiplying them together, and associating them with a “key”—instructions on how to factor them. RSA encryption is used almost everywhere, from passwords to banking to your social media. But if a quantum computer comes along that can run Shor’s algorithm and break the encryption, then that old encryption method is no longer secure.
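To see the asymmetry Shor’s algorithm attacks, here is a naive classical factorer in the same toy Python style (fast for small numbers, hopeless for the big ones used in RSA):

```python
def factor(n: int) -> tuple[int, int]:
    """Naive trial division: try every candidate factor up to sqrt(n)."""
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d, n // d   # found a factor pair
        d += 1
    return n, 1                # no divisor found: n is prime

print(factor(15))  # -> (3, 5): elementary-school easy
# For an RSA-sized number hundreds of digits long, this loop would need
# roughly 10^300 steps; Shor's algorithm on a large quantum computer
# would finish in polynomial time.
```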

According to everyone I spoke with, breaking RSA encryption is decades away, but scientists are already well on their way toward post-quantum cryptography, new math that can be used for encoding data. The idea is that encryption built on this new math would be no easier to break with a quantum computer. Meanwhile, other researchers are scrambling to break the popular RSA encryption system with quantum computers before a hacker does.

“I suppose on that level, it’s like the Cold War,” said Stephan Haas, University of Southern California theoretical physicist. “You’re getting nuclear weapons because the other guy is getting nuclear weapons.”

Here’s a physical qubit

Scientists needed transistors, teeny electrical switches, to store bits and make regular computers. Similarly, they need hardware that can store a quantum bit. The key to producing a quantum computer is finding a quantum system that folks can actually control—actually set the probabilities and orientations of those flipping coins. This can be done with atoms trapped by lasers, with photons, and with other systems. But at this point, most everyone in the industry who’s presented a quantum computer has done so with superconductors—ultra-cold pieces of specially fabricated electronics.

They look like teeny microchips. Except these microchips get placed into room-sized refrigerators cooled to temperatures just above absolute zero.

An 8-qubit quantum processor from Lawrence Berkeley National Labs. Photo: Ryan Mandelbaum/Gizmodo.

These superconducting qubits stay quantum for a long time while performing quantum computing operations, explained Irfan Siddiqi of the University of California, Berkeley. Other types of systems can stay quantum even longer, he said, but are slower.

There are three kinds of qubits made from these electronics. They’re called flux, charge, and phase qubits, differing in the specifics of their construction and their physical properties. All of them rely on something called a Josephson junction in order to work.

A Josephson junction is a tiny piece of non-superconducting material sandwiched between superconducting wires, wires in which electrons travel without any resistance and show off obvious quantum effects at larger scales. Manipulating the current through the wires lets physicists set up qubits in these systems. As of today, these systems are very fragile: any sort of noise can collapse them into classical bits, and every additional qubit adds more complexity. The biggest quantum computers today have fewer than 20 qubits, with one exception: the D-Wave computer, whose 2,000 qubits operate on a separate, more specialized principle that we’ll dig into later.

Actually performing calculations with these qubits can be a challenge. Regular computers have error correction, or built-in redundancies: multiple bits perform the same function in case one of them fails. For quantum computers to do this, they need extra qubits built into the system specifically to check for errors. But the nature of quantum mechanics makes error correction harder than it is in classical computers. It could take around two thousand physical qubits working in tandem, in fact, to create one reliable “error-corrected” qubit resistant to messing up. But we’re getting closer. “There’s a lot of healthy progress that wouldn’t have been imaginable two years ago,” said Debbie Leung, a faculty member at the Institute for Quantum Computing at the University of Waterloo.
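The classical version of that redundancy is easy to sketch: a majority-vote repetition code, shown below in toy Python. Quantum error correction is far more involved, in part because you can’t simply copy an unknown qubit, but the flavor is similar.

```python
from collections import Counter

def encode(bit: int, copies: int = 3) -> list[int]:
    """Redundancy: store the same bit several times."""
    return [bit] * copies

def decode(noisy: list[int]) -> int:
    """Majority vote recovers the bit despite a minority of flips."""
    return Counter(noisy).most_common(1)[0][0]

codeword = encode(1)     # [1, 1, 1]
codeword[0] ^= 1         # noise flips one copy -> [0, 1, 1]
print(decode(codeword))  # -> 1: the error is corrected
```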

“A quantum computer will always have errors,” said Laforest. Thankfully, modeling molecules doesn’t need quite the same level of accuracy, said Siddiqi, which is why researchers have plowed forward with these types of simulations in few-qubit systems.

Better qubits and further research continue to bring us closer to the threshold where few-qubit processors become genuinely useful. “Now we’re at the junction where the theoretical demand versus the reality of experiments are converging together,” said Laforest.

Who’s doing it

Universities, national labs, and companies like IBM, Google, Microsoft, and Intel are pursuing qubit setups in logic circuits similar to regular bits, all with fewer than 20 qubits so far. Companies are simultaneously simulating quantum computers with classical computers, but around 50 qubits is seen as the limit of what’s classically simulable—IBM recently simulated 56 qubits, which took 4.5 terabytes of memory on a classical computer.

Each company we spoke to has a slightly different approach to developing their superconducting machines. Sutor from IBM told Gizmodo the company is taking a long-term approach, hoping to one day release a general-purpose quantum computer that classical computers rely on, when needed, through the cloud. Intel has just entered the race with their 17-qubit processor released in October. Microsoft showed off their consumer-facing software suite to Gizmodo, and described a similar long-term goal for quantum computing involving scalable hardware.

Rumors are swirling that before the end of this year, Google will unleash a quantum computer that achieves “quantum supremacy” with 49 or 50 qubits. Quantum supremacy simply means demonstrating a single algorithm for which a quantum computer always wins, and for which no classical workaround can be found to solve the same problem. This is just one milestone, though.

“It will probably be a contrived task, something not classically important,” said Aaronson. Still, he said, “I think at that point it raises the stakes for the skeptics, for the people who have said and continue to say that it’s a pipe dream.” The other companies seemed to agree and stressed their long-term goals for quantum computing. Google did not respond to a request for comment.

While 2017 seems to have landed in the middle of a sort of quantum boom, everyone I spoke to was realistic about just how far quantum computing is from a consumer-facing product. “Looking at 2020, 2021 we’ll start seeing the advantage for real users, corporations, and scientific research,” Sutor said.

But one controversial company, D-Wave, is instead doing a different kind of quantum computing called adiabatic quantum computing. Rather than just a dozen to a few dozen qubits, it has announced a computer with 2,000. And rather than relying on quantum logic circuits like the rest of the pack, its computer solves one type of problem—optimization problems, like finding the best solution from a range of okay solutions, or finding the best taxi route from point A to point B while staying as far as possible from other taxis. These kinds of problems are potentially useful in finance.

Unlike the competitors, D-Wave doesn’t need its qubits to be error-corrected. Instead, it compensates for errors by running the algorithm many times per second. “Is it a general purpose machine that could run any problem? No,” Bo Ewald, D-Wave’s president, told Gizmodo. “But there aren’t any computers that can run these problems anyway.”

At this point, people agree that D-Wave’s computer is a quantum computer, but they’re unsure whether it’s yet better than a classical computer at the same problems (some of its users report beating classical algorithms, said Ewald). But Ewald just wants to get quantum computers in front of people now. “If you want to get started with real-world quantum computing today, this is how you do it,” he said. NASA, Google, and Los Alamos National Lab have all purchased models or computing space.

The outlook

Everyone, even Ewald at D-Wave, agrees that we’re far from seeing quantum computers used in everyday life—there’s a lot of excitement but we’re still in the early days. There are hordes of challenges, like error correction. Then comes the related problem of transmitting quantum information between distant computers or storing quantum information long term in memory.

I asked Aaronson whether he thought some startup or some secret effort might come along out of nowhere and present a super advanced model—he said probably not. “We know who the best scientists are and we’d expect them to be vacuumed up the way physicists were in the Manhattan Project,” he said. “I think it remains a very healthy field, but at the same time it’s true that actually building a useful quantum computer is a massive technological undertaking.” You can’t just build one in your garage.

So no, you cannot own a quantum computer now, nor is it likely that you will ever own a quantum computer. It’s more likely that when your classical computer needs quantum help, you won’t notice it working. You may hear about some benefits of quantum computing in the next few years, like biochemical advances, but other advantages could be 20 years down the line. And overall, there’s no proof a quantum computer is any better than a classical computer. Yet.