Over the Quantum Cliff

How will we live once quantum computers spill all our secrets?
essay

Nick Veasey, VW Camper Family - Grey, 2024.

Courtesy of the artist.

It may happen in 10 years. It may happen in one year. It may have already happened in a secret corporate or government facility. But whatever the timeline, a cataclysm is almost certainly coming. Someday soon, quantum computers will likely become powerful and stable enough to decrypt most of the documents and communications that have relied on encrypted security protocols for decades, including intimate chats, medical records, banking credentials, and state secrets.

The scenario unfolds as follows: One day, decades of your private digital life will be rendered public information, without your knowledge or consent. First, a handful of wealthy states and big tech companies will have access to your data. Then, as quantum computers become cheaper and your decrypted data gets repackaged and sold, other commercial interests will join in on the feast. Ultimately, given the ubiquity of datafied global networks, there’s no telling how many people will have access to your secrets. It’s easy to imagine a world a decade or two from now in which they’re simply available to anyone, in searchable public archives as easy to find as your birthday or a photo of your face. And, unless you’re a corporate executive, federal policymaker, or cybersecurity professional, it’s likely that nobody will have bothered to warn you about it.

We’re either hurtling or inching toward that future, depending on who you ask. In 2024, researchers at Shanghai University published a paper demonstrating success using quantum computers to break elements of RSA and AES encryption, both longstanding backbones of secure digital communications, including web browsing, private email, secure chat, cryptocurrency, and VPN internet traffic.

News headlines proclaimed that the Chinese computer scientists had broken “military-grade” encryption tools, raising alarms about the potentially calamitous implications for cybersecurity and geopolitics. But other commentators pushed back on these claims, arguing that the accomplishment was little more than a proof of concept, an incremental step in a long process that could take years or even decades to become a threat. RSA Security, the company behind RSA standards, rushed to reassure the millions of customers who rely upon its encryption, pointing out that quantum computers were still in the early stages of development, and catastrophe was still a long way off.

In the year or so since then, things have accelerated. For one thing, computer scientists at Google published research showing that “quantum computers may crack RSA encryption” with only 5 percent of the processing power previously anticipated. For another, quantum processing power is growing far faster than many had hoped or feared. At the time of writing, for instance, the latest record was set by Caltech researchers, who created a quantum computer six times as big and 10 times as stable as the previous record-holder, announced just a year earlier.

Computer security professionals have seen this threat coming for years. It’s been called many things. In our book, The Secret Life of Data, we used the commonly invoked term “the quantum cliff.” Others, such as a recent Wired article, call it “Q-Day.” Snarkier folks—especially those who still think it might be a tempest in a teapot—refer to it as “Y2Q,” a snide reference to the furor over the “Y2K bug” in 2000, which turned out to be a massive anticlimax (thanks in part to all the systems engineers who saw it coming and took adequate steps to prevent it).

From a technical standpoint, the quantum cliff is actually a much thornier problem than Y2K, because there’s no “patch” that can be applied to fix it, and the number of affected systems is colossally larger and more diverse. There are new, quantum-resilient encryption protocols, but because encryption standards aren’t forward-compatible, there’s no path to protecting everything that was previously encrypted under long-held standards like RSA and AES. And there’s currently no easy way to upgrade all of the computer and networking systems that currently serve as the infrastructure for global modern life.

Without any clear means of prevention, we are left with only the option of preparation. Yet, while governments and corporations have spent much of the last decade planning and strategizing for a post-quantum cliff future, the rest of us have been left in the dark.

To the extent that the media have covered the quantum cliff at all, the coming cataclysm tends to be treated as a cybersecurity or national security story. To be sure, these are pressing concerns, and the potential for economic, political, or even military disaster is very real. But we shouldn’t ignore the equally transformative social and cultural consequences for everyday people.

Privacy and secrecy are integral to the way we live our lives, and essential to our independence and autonomy as members of a democracy. Having our private lives exposed to our families, our neighbors, and our employers could be disastrous at the human scale, potentially destroying relationships, communities, and careers. Without any clear guidance from the tech sector or the government, it’s clear that the public at large will have to fend for ourselves if we want to avoid the worst-case scenario.

*

In order to understand why the quantum cliff is such a big deal, it’s important to understand some of the core technologies at work—beginning with quantum computers. A normal digital computer, like the one we used to write this article, and the one you may be reading it on, uses a microprocessor, which is basically a wafer of silicon with billions of minuscule transistors embedded in it. Each transistor is a tiny on/off switch, which either allows electrical signals to flow through it or blocks them. In the binary language of computer code, an open switch counts as a “1,” while a closed switch counts as a “0,” and the combination of ones and zeros is used to represent and process data. For instance, in ASCII, the standard system used to encode letters like the ones in this sentence, a capital F is rendered as 01000110. That’s known as an “eight-bit code” because there are eight ones and zeros required to store the information.
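As a quick illustration (a Python sketch of our own, not part of the essay), the standard library can render that same capital F as its eight-bit ASCII pattern:

```python
# Render a character as the eight-bit binary code described above.
code = ord("F")              # ASCII value of capital F: 70
bits = format(code, "08b")   # zero-padded, eight binary digits
print(code, bits)            # 70 01000110
```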

One of the reasons digital computers are powerful is that each bit you add doubles the number of states the machine can represent, so capacity grows exponentially, by powers of two. One bit can appear in two different permutations (on or off), while two bits offer four permutations, three bits offer eight, and so on; when scaled up to billions of bits, the number of permutations is two raised to the power of billions. (Even though modern encryption systems typically use keys of only 2,048 bits, 2^2048 is still such an incredibly large number that, for most practical purposes, it is functionally indistinguishable from infinity.)
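That doubling is easy to check directly. This small sketch (ours, not the essay's) counts the permutations and sizes up the state space behind a 2,048-bit key:

```python
# Each added bit doubles the number of distinct permutations.
for n in (1, 2, 3, 8):
    print(n, "bits ->", 2 ** n, "permutations")

# 2**2048, the state space behind a 2,048-bit key, has 617 decimal
# digits -- far too many values for any brute-force search.
print(len(str(2 ** 2048)))   # 617
```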


But quantum computers put digital computers to shame. Instead of tiny transistors, the elements of a quantum processor are energetic particles, such as electrons or photons, linked through the “superposition” and “entanglement” that characterize quantum physics at subatomic scales. These particles are held in place by lasers or electromagnetic fields. Each particle acts as a “qubit” (a quantum bit), which can represent a one, a zero, or a superposition of one and zero at the same time.

This does not mean that quantum computers simply count in a bigger base. Rather, because qubits can be placed in superposition and entangled with one another, a collection of n qubits can encode 2^n states simultaneously, which means quantum computers can represent very large state spaces in a much more efficient way, and can therefore theoretically process certain kinds of information exponentially faster than digital computers. For instance, a 1,000-qubit computer can represent roughly 10^301 states at once, while the fastest digital processor in existence today has only about 10^12 transistors.

The ability of a single processor to perform calculations faster than all of the digital computers in the world put together—gazillions of times over—is what makes quantum computers such a serious threat to cybersecurity. Encryption is essentially just really difficult math.

RSA, for example, uses an “asymmetric” model in which there are two different keys: a public key for encryption (you’d use this to send us a secret email) and a private key, kept secret, for decryption (we’d use this to read the secret email). The public key is created by taking two really large prime numbers, multiplying them together, and then subjecting the product to a bunch of other mathematical operations. The private key comes from additional mathematical gymnastics applied to the same prime numbers, calculations that are fantastically tricky to reverse engineer.
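To make the “mathematical gymnastics” concrete, here is a toy sketch of the RSA arithmetic using tiny textbook primes (real keys use primes hundreds of digits long; these values are purely illustrative):

```python
# Toy RSA: tiny primes, for illustration only -- trivially breakable.
p, q = 61, 53                # the two secret primes
n = p * q                    # public modulus: 3233
phi = (p - 1) * (q - 1)      # 3120, used to derive the private key
e = 17                       # public exponent (coprime with phi)
d = pow(e, -1, phi)          # private exponent: modular inverse of e

message = 65                 # a message encoded as a number < n
cipher = pow(message, e, n)  # encrypt with the public key
plain = pow(cipher, d, n)    # decrypt with the private key
print(cipher, plain)         # 2790 65
```

Encryption and decryption are both just modular exponentiation; the security rests entirely on how hard it is to recover p and q from n.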

The only way to break this kind of encryption is to figure out the original prime numbers. Even the fastest digital computers are phenomenally bad at this, and it would take millions of years of nonstop number crunching for your laptop to find the answer. Even accounting for Moore’s Law, which has accurately predicted that digital computers would double in power every 18 months for the past 60 years, it would still take centuries to develop a computer fast enough to crack RSA. That’s why it’s still in use, decades after its invention.
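To see why factoring is the whole game, here is a brute-force sketch (ours, using a toy modulus): trial division recovers the secret primes from a small n instantly, but the number of candidates to check grows explosively with key size.

```python
# Recover the secret primes behind a toy RSA-style modulus by
# trial division. Trivial at this scale; hopeless at 2,048 bits.
def factor(n):
    f = 2
    while f * f <= n:
        if n % f == 0:
            return f, n // f
        f += 1
    return n, 1  # n itself is prime

print(factor(3233))   # (53, 61)
```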

But if there’s a quantum computing equivalent to Moore’s Law, and speeds triple every year or two (a benchmark that’s been beaten consistently over the past decade), breaking an RSA-encrypted file could, within the foreseeable future, take mere days, hours, or even milliseconds.

*

You might be thinking, Okay, that sounds pretty bad. But surely my secrets aren’t really in any danger, right? For one thing, nobody cares about little old me. Big tech companies and shadowy government intelligence operations are much more likely to target high-value secrets, like spaceship engine plans, nuclear codes, and the algorithms underpinning Bitcoin. And for another thing, even if they wanted to hack my private chats with my most recent Tinder date, they’d have to find them first, by stealing my phone or breaking into the dating app server. There’s no way it would be worth the hassle.

Those are very reassuring thoughts. Unfortunately, they’re dead wrong. The truth is that big tech companies and shadowy government intelligence operations probably already have your private chats with your Tinder date, and they’re holding onto them, waiting for the day they’ve got a computer strong enough to decode them—a widely recognized strategy called “Harvest Now, Decrypt Later.”

There are three reasons for this—two technical and one economic. First, your private chats might be sitting on your phone or on a secure server somewhere, but in order to get from your phone to your date’s and back, they had to travel over the “open internet,” the same quasi-public communications infrastructure used for everything from email to Zoom calls to your fitness tracker. The open internet is a universally accessible network, which means a relatively small number of central nodes can “see” online traffic from all over the world and can capture and store that traffic with the click of a button. What currently keeps your communications secret is that they’re encrypted before they’re sent and decrypted when they reach their intended destination. Anyone who intercepts and downloads them can see only a garbled version, not your actual messages—at least until they have the means to decrypt them.

This leads us to the second technical factor: The price of data storage has fallen precipitously over the last seven decades, and continues to fall at an exponential rate. In 1956, a terabyte of digital storage (then an unthinkably large amount of data) would have cost $87.59 billion. By the year 2000, that figure had fallen to $6,120. Today, the cost has fallen another 99 percent, to around $60, and popular tech blogs frequently offer consumer-friendly articles assessing whether a terabyte is “enough” information for a typical digital consumer to store their own files, such as photos, videos, and computer backups.

This, in turn, leads us to the economic factor underpinning the “Harvest Now, Decrypt Later” strategy employed by big tech companies and governments. Even if they assume that 99.9 percent of the encrypted traffic they hoover up from the open internet is worthless to them, the cost of collecting and storing it over time is so cheap that it’s still economically worthwhile to save it all, because that remaining 0.1 percent of traffic will be so valuable once it’s decrypted with quantum computers that it outweighs the costs. Your private chats, which they’ll decrypt along with everything else they collect, will simply be a form of collateral reputational damage.

If this scenario sounds far-fetched, consider that data brokers, the shadowy companies that currently track your online and offline behaviors to repackage and sell the data to advertisers, government agencies, and others, already do this. We know this, in part, because we interviewed the co-founder of location data seller Foursquare for The Secret Life of Data, and he told us point-blank that “sometimes, it’s easier just to store more stuff than it is to take the risk of deleting stuff.”


Similarly, we have good reason to believe that governments have been practicing “Harvest Now, Decrypt Later” for decades, because it’s in their national security interests. When federal intelligence contractor Edward Snowden exfiltrated government secrets in 2013 and shared them with the world, one of the files he shared was related to an NSA program entitled “Penetrating Hard Targets,” dedicated to building a “cryptologically useful quantum computer.” Combined with the NSA’s other large-scale internet surveillance projects, this strongly suggests that decrypting secure communications on the open internet is one of the agency’s primary areas of research and development.

While mass warrantless surveillance by government agencies and intrusive tracking and targeting by commercial organizations are bad enough, there are other, less savory actors who may exploit the quantum cliff to do even more harmful (and less legal) things. Terrorists might use quantum decryption to infiltrate or disable energy grids, transportation networks, and other forms of critical infrastructure that haven’t yet adopted next-generation encryption standards. Industrial espionage could run rampant, enabling pirates to plunder long-held trade secrets and cutting-edge innovations. Blackmailers, stalkers, identity thieves, financial scammers, and abusive trolls—who already thrive on our publicly available data such as social media posts and commercial data broker profiles—will have a field day once our private, encrypted data is added to the mix.

Ultimately it’s possible, and perhaps even likely, that our decrypted data will eventually become publicly archived and searchable by anyone, in perpetuity. Combined with generative AI tools like ChatGPT, this means that anyone with access to the internet would be able to request a thorough accounting of anyone else’s historical behavior, speech, health and medical records, finances, purchases, employment, travel, sex life, drug and alcohol use, religious observances, and even subtler patterns of our lives (person X always drinks after going to church; person Y lies much more to their sister than to their brother).

Much ink and many pixels have been devoted over the past few decades to the “death of privacy,” but this is something else entirely. Think of it as the death of secrecy.

*

What can be done? Businesses and governments that have known about the quantum cliff for decades have had time to prepare, and thus offer some clues, if not much hope, for the rest of us.

While nobody has come up with a way to retroactively protect all the stuff previously encrypted with RSA and AES against future quantum decryption, there is a whole new set of “post-quantum cryptography” (PQC) protocols that can be used proactively to secure data now and in the future. Congress passed a law in 2018, the National Quantum Initiative Act, that created a new federal office dedicated to preparing for the potentially beneficial and harmful impacts of quantum computing. And the National Institute of Standards and Technology has since selected four PQC algorithms as official standards to replace vulnerable forms of encryption.

Industries are working to adopt these new encryption algorithms. For instance, a consortium of major financial institutions developed an initiative called “Project Leap,” which aims to use PQC in place of traditional encryption in payments and communications. But as most academic research and business journalism have emphasized, there’s only so much preparation that can be done. We can’t know when or how hard Q-Day will hit, and many of its potential threats to national security and business simply can’t be anticipated.

The rest of us are even less well-prepared. Most people have no idea that a lifetime of secrets is likely to be exposed at some point in the not-so-distant future, and neither government agencies nor the popular press are bothering to tell us about it, let alone advise us what to do. Meanwhile, we send digital payments, message loved ones, check health scans, stream pornography, and talk shit about our bosses in group chats—assured by the good folks at Apple, Google, Meta, Amazon, PayPal, and elsewhere that their encrypted apps and networks protect our private lives.

How will things change when we find out we were wrong? How can we live in a society without secrets, or at least without the guarantee that they’ll remain secret? Every relationship, no matter how healthy, relies on a certain amount of secrecy. Think about it: You don’t really want to know whether your lover thinks you look good in that new sweater, or what your 16-year-old was actually up to when they said they were doing homework with a friend. And you definitely don’t want your boss to know what you were up to last spring when you said you were in bed with a cold.

We even thrive on keeping secrets from ourselves. Think about the sick feeling you get in the pit of your stomach when you take a bad selfie, then hastily delete it before anyone else can see what’s on your screen. Imagine that same feeling, but in every aspect of your life, all the time. None of us would willingly grant others the power to comb through the digital minutiae of our lives, reassemble it all, and shove it back in our faces.

A culture without secrets would amplify a culture of paranoia, a culture of bullying, a culture of self-hatred. A culture in which, in addition to oppressing us based on our race, gender, age, sexuality, or level of physical ability, we’d each be targeted based on our idiosyncratic weaknesses and peculiarities. A culture in which every private action, utterance, and inclination could be harvested for profit, or exploited for power.

Just like governments and businesses, we need to prepare for this new reality. We can’t prevent it from happening, but we can start to make changes in the way we live our lives, so that with any luck, when Q-Day comes, it’ll seem like less of a cataclysm and more of an anti-climax.

What does that mean in practice? As with solving any problem, the first step is to acknowledge we have one. We need to talk about data, power, and surveillance the same way we do about climate change, or pandemics, or the economy—as matters of shared concern, with the rigor and nuance they demand. We must understand that the struggle over who has access to data is a ubiquitous and powerful aspect of our environment, one that we can’t easily escape and shouldn’t risk ignoring. It should be reflected across every facet of culture: in our everyday conversations, daily news articles, and sit-com plots.


On the flip side, we also need to understand that the digital world we’ve constructed for ourselves is neither inevitable nor all-encompassing. Just because we can’t unplug from the network doesn’t mean that it controls or contains everything that matters. Hard drives fail. Websites crash. Whole power grids go down. Our datafied world is unstable and ever-changing, and that gives us a lot of collective power to shape its future, and a lot of freedom to carve out spaces of intimacy and secrecy beyond its limits.

Putting those two observations together, we need to understand that personal secrecy and data security aren’t the same thing. Just because a database tracks your sleep, your steps, your messages, and your transactions doesn’t mean it knows you. Only other human beings can do that, and only once we let them into the secret places within us that data can’t reach. The more we can intuitively differentiate between these two concepts, the better we can exert real cultural and social power in a data-saturated world.

This will require letting go of some ingrained digital habits. We’ll need to get better at understanding that someone “liking” your post on social media isn’t the same as them liking you as a person, or that a personal achievement, whether climbing a mountain or getting a raise, can be just as satisfying even if it isn’t recorded and shared. More broadly, we’ll need to psychologically distance our authentic selves from the digital puppets constructed from our data: our social media personas, our fitness-tracker profiles, even our bank balances and school grades. We’ll need to reinhabit our bodies, renew our relationships, and reinvest in our communities. It’s a tall order, to be sure, but it’s better to do it proactively, before Q-Day blows everything apart.

Seen from this perspective, the looming threat of the quantum cliff might even be seen as potentially liberating. Once upon a time, only certain powerful institutions had the ability to decide what was secret and what wasn’t (think of all the speculation around the JFK assassination or Area 51). From Cold War-era spycraft to twenty-first-century surveillance capitalism, secrecy has been weaponized as a form of social control, and our personal data have become the keys to the shackles that bind us. Governments and corporations know more about us than we know about them or one another, and as long as we believe in the absolute power of data, we remain under their thumbs. In a post-Q-Day world, however, secrecy will depend less on the power to restrict and shape the flow of information, and more on the power to create narratives (e.g. whether it’s ethically appropriate to “out” a public figure with a secret Grindr profile) and build coalitions (e.g. the #MeToo movement). Secrecy will become less technical and more cultural. We can have more agency in choosing what data matters, and when.

This leads us to our final point: In order to prepare for the quantum cliff, and for a world without technical secrecy, what we need most is to be kind to one another (this will be the subject of our next book, tentatively called Data Kindness). Despite the overwhelming toxicity of contemporary online culture, which is rife with doxxing, flame wars, and deepfakes, we still have many social mores in which we choose to ignore publicly available data in the interest of preserving each other’s dignity and the integrity of our relationships. Most of us don’t peep through our neighbors’ windows as we’re walking down the street. We don’t scroll through our coworkers’ emails when they’re away from their desks. And we don’t obsessively track our partners’ cell phones even if they’ve shared their location with us. We consider those kinds of behaviors, even though they’re technologically feasible, to be violations of unspoken social rules, or, to put it plainly, creepy.

We developed these unspoken social rules for ourselves, because nobody with power ever bothered to spell out how we should treat one another's data. So we extended some of our existing social norms into the digital sphere, carrying over expectations of basic decency, even as the platforms encouraged us to share more and more in an unregulated free-for-all. In the U.S., we still have no comprehensive federal data privacy law, both because big tech companies have lobbied against one, and because the wannabe authoritarians in our government have long fantasized about a total surveillance state. Our technology is not designed to keep our secrets very well, and our economic system rewards people for voluntarily relinquishing their own secrets and one another’s. And yet, most of us choose not to be creepy. Even without specialized expertise in data security or well-defined policy goals, we intuitively and collectively understand how to take care of ourselves and one another—and we limit our data collection and sharing as a basic duty of care.

We’ll need to do the same thing, on a grander scale, with the vastly larger and more sensitive volume of data that will be revealed after the quantum cliff. We may no longer be able to keep our secrets safe from others, but we can keep other people’s secrets safe from ourselves. It may sound like an impossibly steep climb, but history demonstrates that solutions to problems like these are always developed from the bottom up, not the top down.

If and when Q-Day arrives and those secrets are suddenly revealed, we can choose to ignore them. We can look away from our screens, and into one another’s eyes. We can finally free ourselves from the shackles of surveillance society, and choose to accept others based not on what the data unwittingly reveals about them, but rather on what parts of their lives and hearts they choose to share with us. Damn the data, we can choose to be kind. ♦
