
Sunday, 1 December 2013

Quantum leap in computing

Researchers are making real progress in building the next breed of supercomputers, says Philip Ball.
When people first hear about quantum computers, a common response is “where and when can I get one?” But that’s the wrong question, and not just because you’ll be disappointed with the answer. Quantum computers are often said to promise faster, bigger, more multi-layered computation—but they are not, and might never be, an upgrade of your laptop. They’re just not that sort of machine. So what are they, and why do we want them?

You could argue that your laptop is already a quantum computer, because the laws of quantum physics govern the ways that electricity passes through it. In part that’s just pointing out that, ultimately, quantum physics governs all the properties of materials at the atomic scale. What’s more, as the scale of electronics shrinks, strange quantum effects that don’t usually manifest in the everyday world, such as the ability of electrons to leap through walls, are becoming important. This “quantum tunnelling,” for example, is the basis of flash memory.

Real quantum computers go far beyond any of that, though. In the end, all of today’s computers work using old-fashioned binary logic: by encoding information in strings of 1s and 0s, values known as “bits.” These are then manipulated in “logic gates,” devices built from electronic components such as transistors. A particular set of input bits prompts the gate to produce another set of output bits. That’s what computation is; the rest is a question of building software and interfaces that turn these bits into a letter to Mum glowing on the screen.
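To make the idea concrete, here is a minimal sketch in Python (an illustration of classical gate logic in general, not of any machine mentioned in this article) of a half-adder, a small circuit that adds two one-bit numbers using an XOR gate and an AND gate:

```python
# Classical computation in miniature: a half-adder combines an XOR gate
# (which gives the sum bit) with an AND gate (which gives the carry bit)
# to add two one-bit numbers. Illustrative sketch only.
def half_adder(a: int, b: int) -> tuple[int, int]:
    return a ^ b, a & b  # (sum bit, carry bit)

print(half_adder(1, 1))  # (0, 1): 1 + 1 = 10 in binary
```

Chain enough gates like this together and you have an adder for numbers of any size; that, at bottom, is all a classical processor does.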

Quantum computers will also use 1s and 0s, but with a crucial difference. As well as taking just one or the other of these values, a quantum bit, or “qubit,” can hold any mixture of them—it can, counter-intuitively, be in two states at once. While a regular bit can only be 1 or 0, a qubit can be simultaneously 1 and 0, or mostly 1 with a tiny bit of 0, and so on. These mixtures are called superpositions, and they are a fundamental feature of objects that obey quantum rules. A photon of light, for example, can be polarised either vertically or horizontally, or can be in a superposition of both polarisations.
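In the standard textbook notation (a general statement about qubits, not anything specific to the hardware discussed below), a qubit’s state is written as a weighted combination of the two basis states:

$$|\psi\rangle = \alpha\,|0\rangle + \beta\,|1\rangle, \qquad |\alpha|^2 + |\beta|^2 = 1,$$

where reading out the qubit yields 0 with probability $|\alpha|^2$ and 1 with probability $|\beta|^2$.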

That gives qubits access to a vast range of states, so you can encode much more information in them. It enables quantum computers to perform many calculations simultaneously, where a classical computer can do only one at a time with any given set of bits. This is (at least according to one view—there are others) why a quantum computer would be so much faster.
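A toy simulation makes the point about the range of states (an illustrative sketch in Python, not how a real quantum machine is programmed): two classical bits hold exactly one of four values at a time, but two qubits are described by four amplitudes at once.

```python
import numpy as np

# Illustrative state-vector sketch: n qubits are described by 2**n
# complex amplitudes, whereas n classical bits hold exactly one of
# the 2**n possible values at any moment.
H = np.array([[1, 1],
              [1, -1]]) / np.sqrt(2)   # Hadamard gate: creates a superposition

state = np.zeros(4)
state[0] = 1.0                         # two qubits initialised to |00>
state = np.kron(H, H) @ state          # a Hadamard applied to each qubit

print(state)  # [0.5 0.5 0.5 0.5]: equal superposition of 00, 01, 10 and 11
```

The catch, of course, is that the exponential bookkeeping that cripples this classical simulation as qubits are added is exactly what quantum hardware gets for free.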

So where’s the catch? Quantum phenomena, such as superpositions, are generally very delicate. They get easily disrupted or destroyed by disturbances from the surrounding environment, particularly the randomising effects of heat. To create superpositions usually requires very low temperatures. The fragility of quantum effects means that, while the question of what you could do with a quantum computer has been explored extensively by physicists and mathematicians, actually building a device that can do any of it is taxing electrical engineers and applied physicists to the limit.

Now there are signs of real progress. The community was set buzzing two years ago when a Canadian company called D-Wave (“the world’s first commercial quantum computing company”) announced that it had created the first practical quantum computer: a black box, if you will, that could actually solve problems. But several researchers questioned whether D-Wave’s device was really a true quantum computer at all, or just a fancy box of tricks that made token nods towards quantum effects. It employs an approach called “quantum annealing,” which differs from the mainstream approaches to quantum computing and for which any real advantage over classical computing has yet to be demonstrated.

At Raytheon BBN Technologies in Cambridge, Massachusetts, researchers are convinced that they are closing in on the real thing. Conveniently close to Harvard and MIT, BBN was founded in 1948 and was intimately involved in the development of the earliest military networks that became the internet. In 2009 the company became a subsidiary of the US defence contractor Raytheon.

It has been seeking to develop so-called quantum information technologies since 2001, when the company’s researchers devised an optical telecommunications network that exchanged light signals, with information encoded in superpositions of photon states, between its headquarters and nearby Harvard and Boston universities. Such networks, which could be immune to eavesdropping, have now been developed in many places around the world.

But the quantum computer, which actually does number-crunching, is a bigger challenge. To make qubits, BBN uses the same fundamental circuit components as D-Wave does. Called “superconducting Josephson junctions,” these are metal contacts cooled so deeply that they have become superconductors (that is, they have no electrical resistance), electrically connected to each other through a thin barrier of insulating material. 

Superconductivity is itself a quantum mechanical effect, which is why it requires low temperatures, and the superconducting current can flow in distinct quantum states. A Josephson junction helps filter out all but two of these states, which correspond to the binary 1s and 0s. It is possible to manipulate these states, for example to create specific superpositions, using pulses of microwave radiation. That’s the physical basis of BBN’s qubit circuits, which have to be cooled to within a daunting 50 thousandths of a degree of absolute zero—the coldest possible temperature.
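Schematically, and in the idealised textbook picture only (ignoring noise and decay), a resonant microwave pulse applied for a time $t$ rotates the qubit between its two states:

$$|\psi(t)\rangle = \cos\!\left(\tfrac{\Omega t}{2}\right)|0\rangle - i\,\sin\!\left(\tfrac{\Omega t}{2}\right)|1\rangle,$$

so the length of the pulse dials in the superposition: $\Omega t = \pi/2$ gives an equal mixture of 0 and 1, while $\Omega t = \pi$ flips the qubit outright. The rate $\Omega$, the Rabi frequency, is set by the strength of the microwave drive.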

Even then, the superpositions don’t last long. Yet to do practical quantum computing, they need only survive for as long as it takes to manipulate them in quantum logic gates. In recent years, says Zachary Dutton, lead scientist of Raytheon BBN’s Quantum Information Processing group, the length of time that superpositions can last has increased dramatically, and is now at a level—tens to hundreds of microseconds—where the devices can actually perform logic processing.
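To get a feel for the arithmetic: if a single gate operation takes about 50 nanoseconds (an illustrative figure, not one quoted by BBN), a 100-microsecond lifetime leaves room for roughly

$$N_{\text{ops}} \approx \frac{T_{\text{coherence}}}{t_{\text{gate}}} = \frac{100\,\mu\text{s}}{50\,\text{ns}} = 2000$$

operations before the superposition degrades, which is why those recent gains matter so much.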

At present the BBN team, who are collaborating with computer giant IBM, don’t have anything even vaguely like a quantum computer. Rather, they are focusing on getting very small systems—currently three qubits, but soon to be eight—to work well enough that they can be assembled into large-scale circuits. “If you looked at a circuit diagram of a quantum computer,” says Dutton, “this would be a little piece of it.” The extreme cooling “needn’t be a showstopper,” he adds, because refrigeration technologies have advanced so much in recent years—for example, they no longer need constant refilling with a coolant such as liquid helium.

What would you use a quantum computer for? Christopher Monroe, a quantum computing researcher at the University of Maryland, says that the first demonstrations will probably solve “some esoteric physics problem,” not provide a general-purpose computer. There are, however, some important possible uses that anyone can appreciate. Fast factorising of huge numbers is one such, since widely used public-key encryption methods, such as RSA, rely on the difficulty of doing this with classical computers. Quantum computers would change the whole game in data security.
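To see why factorising is the sticking point, consider the crudest classical method, trial division (a toy sketch in Python; a real quantum computer would instead run Shor’s algorithm, which factorises efficiently):

```python
# Toy classical factoriser: trial division takes on the order of
# sqrt(N) steps, which for the hundreds-of-digits numbers used in
# RSA keys is hopelessly slow. Illustrative sketch only.
def smallest_factor(n: int) -> int:
    f = 2
    while f * f <= n:
        if n % f == 0:
            return f
        f += 1
    return n  # n itself is prime

print(smallest_factor(2021))  # 43, since 2021 = 43 * 47
```

Double the number of digits in the key and the work for this approach grows astronomically; for a quantum factoriser it would grow only modestly.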

For scientific research, one of the most appealing applications would be to perform computer simulations of molecules and materials. These are governed by quantum rules, and classical computers are forced to solve the equations by laborious and merely approximate mathematical methods. Quantum computers, in contrast, could map such quantum behaviour directly and exactly into their algorithms, so that simulations which currently take days might be possible in seconds, helping to make better predictions of the properties of new drugs and materials.
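The root of the difficulty is exponential bookkeeping: describing $n$ interacting two-state quantum particles classically means tracking

$$2^n$$

complex amplitudes, so a modest system of 50 such particles already demands about $10^{15}$ numbers, while a quantum computer with $n$ qubits holds that state natively.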

Currently, the most taxing computation problems are tackled by massive, expensive supercomputers housed in a few specialised centres and leased to users. That’s what the initial market for quantum computers will look like too, says Dutton—not really a market at all, but a highly centralised oligopoly. But of course all computers used to be like this: huge mainframes dedicated to recondite problems. 

Mindful of IBM founder Thomas Watson’s (possibly apocryphal) prediction in 1943 that this was what computers would always be—Watson is said to have forecast a world market for perhaps five of them in total—it would be an unwise prophet who attempted to forecast where quantum computers might be decades down the line.
