On the outskirts of Santa Barbara, California, between the orchards and the ocean, sits an inconspicuous warehouse, its windows tinted brown and its exterior painted a dull gray. The facility has almost no signage, and its name doesn’t appear on Google Maps. A small label on the door reads “Google AI Quantum.” Inside, the computer is being reinvented from scratch.
In September, Hartmut Neven, the founder of the lab, gave me a tour. Neven, originally from Germany, is a bald fifty-seven-year-old who belongs to the modern cast of hybridized executive-mystics. He talked of our quantum future with a blend of scientific precision and psychedelic glee. He wore a leather jacket, a loose-fitting linen shirt festooned with buttons, a pair of jeans with zippered pockets on the legs, and Velcro sneakers that looked like moon boots. “As my team knows, I never miss a single Burning Man,” he told me.
In the middle of the warehouse floor, an apparatus the size and shape of a ballroom chandelier dangled from metal scaffolding. Bundles of cable snaked down from the top through a series of gold-plated disks to a processor below. The processor, named Sycamore, is a small, rectangular tile, studded with several dozen ports. Sycamore harnesses some of the weirdest properties of physics in order to perform mathematical operations that contravene all human intuition. Once it is connected, the entire unit is placed inside a cylindrical freezer and cooled for more than a day. The processor relies on superconductivity: at ultracold temperatures, its electrical resistance vanishes entirely. When the temperature surrounding the processor is colder than the deepest void of outer space, the computations can begin.
Classical computers speak in the language of bits, which take values of zero and one. Quantum computers, like the ones Google is building, use qubits, which can take a value of zero or one, and also a complex combination of zero and one at the same time. Qubits are thus exponentially more expressive than bits: each qubit added to a machine doubles the number of states it can hold in combination, so that fifty qubits can encode more than a quadrillion states at once. But, because of this elemental change, everything must be redeveloped: the hardware, the software, the programming languages, and even programmers’ approach to problems.
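The superposition described above can be mimicked, at toy scale, on an ordinary computer. A minimal sketch in Python, assuming the standard textbook representation of a qubit as a pair of complex amplitudes (the names here are illustrative, not Google's):

```python
import math

def qubit(alpha: complex, beta: complex):
    """A single qubit as two complex amplitudes, normalized so that
    |alpha|^2 + |beta|^2 = 1."""
    norm = math.sqrt(abs(alpha) ** 2 + abs(beta) ** 2)
    return (alpha / norm, beta / norm)

def probabilities(state):
    """Measuring the qubit yields 0 with probability |alpha|^2
    and 1 with probability |beta|^2."""
    alpha, beta = state
    return abs(alpha) ** 2, abs(beta) ** 2

# An equal superposition: the qubit is neither zero nor one
# until it is measured, when each outcome occurs half the time.
plus = qubit(1, 1)
p0, p1 = probabilities(plus)
```

Simulating one qubit this way is trivial; the difficulty is that a classical simulation of n qubits must track 2^n such amplitudes, which is why a real quantum processor is needed at scale.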
On the day I visited, a technician—whom Google calls a “quantum mechanic”—was working on the computer with an array of small machine tools. Each qubit is controlled by a dedicated wire, which the technician, seated on a stool, attached by hand.
The quantum computer before us was the culmination of years of research and hundreds of millions of dollars in investment. It also barely functioned. Today’s quantum computers are “noisy,” meaning that they fail at almost everything they attempt. Nevertheless, the race to build them has attracted as dense a concentration of genius as any scientific problem on the planet. Intel, I.B.M., Microsoft, and Amazon are also building quantum computers. So is the Chinese government. The winner of the race will produce the successor to the silicon microchip, the device that enabled the information revolution.
A full-scale quantum computer could crack our current encryption protocols, essentially breaking the Internet. Most online communications, including financial transactions and popular text-messaging platforms, are protected by cryptographic keys that would take a conventional computer millions of years to decipher. A working quantum computer could presumably crack one in less than a day. That is only the beginning. A quantum computer could open new frontiers in mathematics, revolutionizing our idea of what it means to “compute.” Its processing power could spur the development of new industrial chemicals, addressing the problems of climate change and food scarcity. And it could reconcile the elegant theories of Albert Einstein with the unruly microverse of particle physics, enabling discoveries about space and time. “The impact of quantum computing is going to be more profound than any technology to date,” Jeremy O’Brien, the C.E.O. of the startup PsiQuantum, said recently. First, though, the engineers have to get it to work.
Imagine two pebbles thrown into a placid lake. As the stones hit the surface, they create concentric ripples, which collide to produce complicated patterns of interference. In the early twentieth century, physicists studying the behavior of electrons found similar patterns of wavelike interference in the subatomic world. This discovery led to a moment of crisis, since, under other conditions, those same electrons behaved more like individual points in space, called particles. Soon, in what many consider the most bizarre scientific result of all time, the physicists realized that whether an electron behaved more like a particle or more like a wave depended on whether or not someone was observing it. The field of quantum mechanics was born.
In the following decades, inventors used findings from quantum mechanics to build all sorts of technology, including lasers and transistors. In the early nineteen-eighties, the physicist Richard Feynman proposed building a “quantum computer” to obtain results that could not be calculated by conventional means. The reaction from the computer-science community was muted; early researchers had trouble getting slots at conferences. The practical utility of such a device was not demonstrated until 1994, when the mathematician Peter Shor, working at Bell Labs in New Jersey, showed that a quantum computer could factor large numbers quickly enough to crack some of the most widely used encryption standards. Even before Shor published his results, he was approached by a concerned representative of the National Security Agency. “Such a decryption ability could render the military capabilities of the loser almost irrelevant and its economy overturned,” one N.S.A. official later wrote.
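Shor's insight was to reduce factoring to finding the period of modular exponentiation, a step a quantum computer can perform exponentially faster than any known classical method. The surrounding arithmetic, though, is entirely classical, and can be sketched for tiny numbers; this brute-force toy (illustrative names, and no quantum speedup) shows the shape of the idea:

```python
from math import gcd

def find_period(a: int, n: int) -> int:
    # Smallest r > 0 with a^r = 1 (mod n). Brute force here;
    # this is the step a quantum computer accelerates.
    r, value = 1, a % n
    while value != 1:
        value = (value * a) % n
        r += 1
    return r

def factor(n: int, a: int = 2) -> int:
    # Shor's classical post-processing: given the period r of a mod n,
    # gcd(a^(r/2) - 1, n) is often a nontrivial factor of n.
    assert gcd(a, n) == 1
    r = find_period(a, n)
    if r % 2:
        raise ValueError("odd period; try a different a")
    x = pow(a, r // 2, n)
    f = gcd(x - 1, n)
    if f in (1, n):
        raise ValueError("trivial factor; try a different a")
    return f
```

For n = 15 and a = 2, the powers of two mod fifteen cycle 2, 4, 8, 1, so the period is four, and gcd(2² − 1, 15) = 3 recovers a factor. The encryption standards Shor threatened rest on the assumption that no one can do this for numbers hundreds of digits long.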
Shor is now the chair of the applied-mathematics committee at the Massachusetts Institute of Technology. I visited him there in August. His narrow office was dominated by a large chalkboard spanning one wall, and his desk and his table were overflowing with scratch paper. Cardboard boxes sat in the corner, filled to capacity with Shor’s scribbled handiwork. One of the boxes was from the bookseller Borders, which went out of business eleven years ago.
Shor wears oval glasses, his belly is rotund, his hair is woolly and white, and his beard is unkempt. On the day I met him, he was drawing hexagons on the chalkboard, and one of his shoes was untied. “He looks exactly like the man who would invent algorithms,” a comment on a video of one of his lectures reads.
An algorithm is a set of instructions for calculation. A child doing long division is following an algorithm; so is a supercomputer simulating the evolution of the cosmos. The formal study of algorithms as mathematical objects began only in the twentieth century, and Shor’s research suggests that there is much we don’t understand. “We are probably, when it comes to algorithms, at the level the Romans were vis-à-vis numbers,” the experimental physicist Michel Devoret told me. He compared Shor’s work to the breakthroughs made with imaginary numbers in the eighteenth century.
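Even the schoolchild's procedure can be written out as a program. A sketch of long division in Python (the function name is mine, not anything from the article):

```python
def long_division(dividend: int, divisor: int):
    # Digit-by-digit long division, as taught by hand: bring down
    # one digit at a time, record a quotient digit, and carry the
    # remainder forward to the next column.
    quotient_digits = []
    remainder = 0
    for digit in str(dividend):
        remainder = remainder * 10 + int(digit)
        quotient_digits.append(remainder // divisor)
        remainder %= divisor
    quotient = int("".join(map(str, quotient_digits)))
    return quotient, remainder
```

The point is not the arithmetic but the form: a finite, mechanical recipe that any patient follower, human or machine, can execute.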
Shor can be obsessive about algorithms. “I think about them late at night, in the shower, everywhere,” he said. “Interspersed with that, I scribble funny symbols on a piece of paper.” Sometimes, when a problem is especially engrossing, Shor will not notice that other people are talking to him. “It’s probably very annoying for them,” he said. “Except for my wife. She’s used to it.” Neven, of Google, recalled strolling with Shor through Cambridge as he expounded on his latest research. “He walked right through four lanes of traffic,” Neven said. (Shor told me that both of his daughters have been diagnosed with autism. “Of course, I have some of those traits myself,” he said.)