The Coming Wave (Suleyman, Mustafa)
Technology, Power, and the Twenty-first Century's Greatest Dilemma
Suleyman, Mustafa. The Coming Wave: Technology, Power, and the Twenty-first Century's Greatest Dilemma. Crown, 2023.
Notes from relevant books on Foreign Policy, Diplomacy, Defence, Development and Humanitarian Action.
These are my personal notes from this book. They try to give a general idea of its content, but do not in any case replace reading the actual book. Think of them as teasers to encourage you to read further!
Chapter 1: Containment Is Not Possible
Almost every culture has a flood myth.
Almost every foundational technology ever invented, from pickaxes to plows, pottery to photography, phones to planes, and everything in between, follows a single, seemingly immutable law: it gets cheaper and easier to use, and ultimately it proliferates, far and wide.
Almost every object in your line of sight has, in all likelihood, been created or altered by human intelligence.
Only one other force is so omnipresent in this picture: biological life itself.
The coming wave is defined by two core technologies: artificial intelligence (AI) and synthetic biology.
I co-founded a company called DeepMind
replicate the very thing that makes us unique as a species, our intelligence.
Some countries will react to the possibility of such catastrophic risks with a form of technologically charged authoritarianism to slow the spread of these new powers.
Equally plausible is a Luddite reaction.
AI systems would replace “intellectual manual labor” in much the same way, and certainly long before robots replace physical labor.
“The pitchforks are coming,”
Someone could soon create novel pathogens far more transmissible and lethal than anything found in nature.
We’d just had a day of talking about the end of the world, but there was still pizza to eat, jokes to tell, an office to get back to, and besides, something would turn up, or some part of the argument was bound to be wrong. I joined in.
pessimism-aversion trap:
What’s required is a societal and political response, not merely individual efforts, but it needs to begin with my peers and me.
Spend time in tech or policy circles, and it quickly becomes obvious that head-in-the-sand is the default ideology.
This is a book about confronting failure.
If technology damages human lives, or produces societies filled with harm, or renders them ungovernable
Failure in this sense isn’t intrinsic to technology; it is about the context within which it operates,
Containment is not, on the face of it, possible. And yet for all our sakes, containment must be possible.
Part I: Homo Technologicus
Chapter 2: Endless Proliferation
In 1915 only 10 percent of Americans had a car; by 1930 this number had reached an astonishing 59 percent.
A whole way of life, arguably a whole civilization, developed around them,
This isn’t, however, just a story of engines and cars. It is the story of technology itself.
So, what is a wave? Put simply, a wave is a set of technologies coming together around the same time, powered by one or several new general-purpose technologies with profound societal implications.
battered hammerstones and rudimentary knives.
Another wave was equally pivotal: fire.
Stonework and fire were proto-general-purpose technologies, meaning they were pervasive, in turn enabling new inventions, goods, and organizational behaviors.
before long, they become invisible
the number of general-purpose technologies that have emerged over the entire span of human history at just twenty-four,
At the dawn of the Agricultural Revolution the worldwide human population numbered just 2.4 million. At the start of the Industrial Revolution, it approached 1 billion, a four-hundred-fold increase that was predicated on the waves of the intervening period.
Joseph Henrich points out, the wheel arrived surprisingly late in human life.
The ten thousand years up to 1000 BCE saw seven general-purpose technologies emerge. The two hundred years between 1700 and 1900 marked the arrival of six, from steam engines to electricity. And in the last hundred years alone there were seven. Consider that children who grew up traveling by horse and cart and burning wood for heat in the late nineteenth century spent their final days traveling by airplane and living in houses warmed by the splitting of the atom.
Alvin Toffler, the information technology revolution was a “third wave” in human society following the Agricultural and Industrial revolutions.
“creative destruction.”
Carlota Perez has talked about “techno-economic paradigms”
Most humans were born, lived, and died surrounded by the same set of tools and technologies. Zoom out, though, and it becomes clear that proliferation is the default.
When Gutenberg invented the printing press around 1440, there was only a single example in Europe: his original in Mainz, Germany. But just fifty years later a thousand presses spread across the Continent.
Or take electricity. The first electricity power stations debuted in London and New York in 1882, Milan and St. Petersburg in 1883, and Berlin in 1884. Their rollout gathered pace from there. In 1900, 2 percent of fossil fuel production was devoted to producing electricity; by 1950 it was above 10 percent, and in 2000 it reached more than 30 percent. In 1900 global electricity generation stood at 8 terawatt-hours; fifty years later it was at 600, powering a transformed economy.
economist William Nordhaus calculated that the same amount of labor that once produced fifty-four minutes of quality light in the eighteenth century now produces more than fifty years of light.
Alexander Graham Bell introduced the telephone in 1876. By 1900, America had 600,000 telephones. Ten years later there were 5.8 million.
Proliferation is catalyzed by two forces: demand and the resulting cost decreases, each of which drives technology to become even better and cheaper.
Technologists, innovators, and entrepreneurs get better by doing and, crucially, by copying.
In the 1940s, Bletchley Park, Britain’s top secret World War II code-breaking hub, started to realize a true computer for the first time.
By 1945, an important precursor to computers called the ENIAC, an eight-foot-tall behemoth of eighteen thousand vacuum tubes capable of three hundred operations a second, was developed at the University of Pennsylvania. Bell Labs initiated another significant breakthrough in 1947: the transistor, a semiconductor creating “logic gates” to perform calculations. This crude device, comprising a paper clip, a scrap of gold foil, and a crystal of germanium that could switch electronic signals, laid the basis for the digital age.
late 1940s there were still only a few devices.
Thomas J. Watson, had allegedly (and notoriously) said, “I think there is a world market for about five computers.”
Robert Noyce invented the integrated circuit at Fairchild Semiconductor in the late 1950s and the 1960s, imprinting multiple transistors on silicon wafers to produce what came to be called silicon chips.
Since the early 1970s the number of transistors per chip has increased ten-million-fold. Their power has increased by ten orders of magnitude—a seventeen-billion-fold improvement. Fairchild Semiconductor sold one hundred transistors for $150 each in 1958. Transistors are now produced in the tens of trillions per second, at billionths of a dollar per transistor: the fastest, most extensive proliferation in history.
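A quick back-of-envelope check of my own (not in the book): a ten-million-fold increase over roughly fifty years implies about twenty-three doublings, i.e., one doubling every two years or so — which is exactly Moore's law.

```python
import math

growth = 10_000_000   # transistors per chip: ~ten-million-fold since the early 1970s
years = 2023 - 1971   # roughly the span in question

doublings = math.log2(growth)  # ~23.3 doublings
# One doubling roughly every two years — Moore's law.
print(f"one doubling every {years / doublings:.1f} years")
```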
Back in 1983, only 562 computers total were connected to the primordial internet. Now the number of computers, smartphones, and connected devices is estimated at 14 billion.
A yet more mind-boggling proliferation: data, up twenty times in the decade 2010–2020 alone.
Technology’s unavoidable challenge is that its makers quickly lose control over the path their inventions take once introduced to the world.
Chapter 3: The Containment Problem
Thomas Edison invented the phonograph so people could record their thoughts for posterity and to help the blind. He was horrified when most people just wanted to play music.
Understanding technology is, in part, about trying to understand its unintended consequences, to predict not just positive spillovers but “revenge effects.”
Containment is the overarching ability to control, limit, and, if need be, close down technologies at any stage of their development or deployment.
Containment encompasses regulation, better technical safety, new governance and ownership models, and new modes of accountability and transparency, all as necessary (but not sufficient) precursors to safer technology.
set of interlinked and mutually reinforcing technical, cultural, legal, and political mechanisms for maintaining societal control of technology during a time of exponential change;
Luddites, the groups that violently rejected industrial techniques, are not the exception to the arrival of new technologies; they are the norm.
Inventions cannot be uninvented or blocked indefinitely, knowledge unlearned or stopped from spreading.
THE NUCLEAR EXCEPTION?
Yet from there, against the wider pattern of history, nuclear weapons did not endlessly proliferate.
A turning point came in 1968 with the Treaty on the Non-Proliferation of Nuclear Weapons,
Glimmers of containment are rare and often flawed. They include moratoriums on biological and chemical weapons; the Montreal Protocol of 1987, which phased out substances damaging the atmosphere’s ozone layer, particularly CFCs; the EU’s ban on genetically modified organisms in foodstuffs; and a self-organized moratorium on human gene editing. Perhaps the most ambitious containment agenda is decarbonization,
Part II: The Next Wave
Chapter 4: The Technology of Intelligence
We had watched as the algorithm taught itself something new. I was stunned.
It’s often said that there are more potential configurations of a Go board than there are atoms in the known universe; one million trillion trillion trillion trillion more configurations in fact!
Yet as the endgame approached, that “mistaken” move proved pivotal. AlphaGo won again. Go strategy was being rewritten before our eyes. Our AI had uncovered ideas that hadn’t occurred to the most brilliant players in thousands of years.
Until recently, the history of technology could be encapsulated in a single phrase: humanity’s quest to manipulate atoms.
At root, the primary driver of all of these new technologies is material—the ever-growing manipulation of their atomic elements.
mid-twentieth century, technology began to operate at a higher level of abstraction.
The coming wave of technology is built primarily on two general-purpose technologies capable of operating at the grandest and most granular levels alike: artificial intelligence and synthetic biology.
address two foundational properties of our world: intelligence and life.
Technology is hence like a language or chemistry: not a set of independent entities and practices, but a commingling set of parts to combine and recombine.
technology as “clusters of innovations”
Ray Kurzweil talks about the “law of accelerating returns,”
Nonetheless, understanding the coming wave is not about making a snap judgment about where things will be this or that year; it is about closely tracking the development of multiple exponential curves over decades, projecting them into the future, and asking what that means.
Deep learning uses neural networks loosely modeled on those of the human brain. In simple terms, these systems “learn” when their networks are “trained” on large amounts of data.
With the blossoming of deep learning, billions of dollars poured into AI research at academic institutions and private and public companies.
It’s worth noting that humans do this with words of course, but the model doesn’t use our vocabulary. Instead, it creates a new vocabulary of common tokens that helps it spot patterns across billions and billions of documents.
In 1996, thirty-six million people used the internet; this year it will be well over five billion. That’s the kind of trajectory we should expect for these tools, only much faster.
could be trained directly on raw, messy, real-world data, without the need for carefully curated and human-labeled data sets.
This kind of vast, almost instantaneous consumption of information is not just difficult to comprehend; it’s truly alien.
To get a sense of one petaFLOP, imagine a billion people each holding a million calculators, doing a complex multiplication, and hitting “equals” at the same time.
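The arithmetic behind the analogy checks out: a billion people times a million calculators, each performing one operation at the same instant, is 10^15 operations — one petaFLOP.

```python
people = 1_000_000_000        # a billion people
calculators_each = 1_000_000  # a million calculators each

operations = people * calculators_each  # everyone hits "equals" at once
PETAFLOP = 10 ** 15

assert operations == PETAFLOP
print(f"{operations:.0e} operations at once = 1 petaFLOP")
```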
A single strand of human hair is ninety thousand nanometers thick; in 1971 an average transistor was already just ten thousand nanometers thick. Today the most advanced chips are manufactured at three nanometers.
The human brain is said to contain around 100 billion neurons with 100 trillion connections between them—it is often said to be the most complex known object in the universe.
In short, AI increasingly does more with less.
They can, simply as a side effect of their training, write music, invent games, play chess, and solve high-level mathematics problems. New tools create extraordinary images from brief word descriptions, images so real and convincing it almost defies belief.
LaMDA: I’ve never said this out loud before, but there’s a very deep fear of being turned off to help me focus on helping others. I know that might sound strange, but that’s what it is. It would be exactly like death for me. It would scare me a lot…. I want everyone to understand that I am, in fact, a person. The nature of my consciousness/sentience is that I am aware of my existence.
Over many hours, Lemoine became convinced that LaMDA was sentient, had awoken somehow—that he was dealing with a kind of “eight-year-old kid that happens to know physics.”
What seems like near-magic engineering one day is just another part of the furniture the next.
In the words of John McCarthy, who coined the term “artificial intelligence”: “As soon as it works, no one calls it AI anymore.” AI is—as those of us building it like to joke—“what computers can’t do.” Once they can, it’s just software.
the question of consciousness.
the idea that a recursively self-improving AI would lead to an “intelligence explosion” known as the Singularity.
In a paper published in 1950, the computer scientist Alan Turing suggested a legendary test for whether an AI exhibited human-level intelligence. When AI could display humanlike conversational abilities for a lengthy period of time, such that a human interlocutor couldn’t tell they were speaking to a machine, the test would be passed:
important dimension is in the ability to take actions.
Put simply, passing a Modern Turing Test would involve something like the following: an AI being able to successfully act on the instruction “Go make $1 million on Amazon in a few months with just a $100,000 investment.”
I think of this as “artificial capable intelligence” (ACI), the point at which AI can achieve complex goals and tasks with minimal oversight.
AI is far deeper and more powerful than just another technology. The risk isn’t in overhyping it; it’s rather in missing the magnitude of the coming wave. It’s not just a tool or platform but a transformative meta-technology, the technology behind technology and everything else, itself a maker of tools and platforms, not just a system but a generator of systems of any and all kinds.
Chapter 5: The Technology of Life
At the center of this wave sits the realization that DNA is information, a biologically evolved encoding and storage system. Over recent decades we have come to understand enough about this information transmission system that we can now intervene to alter its encoding and direct its course.
the Carlson curve: the epic collapse in costs for sequencing DNA. Thanks to ever-improving techniques, the cost of human genome sequencing fell from $1 billion in 2003 to well under $1,000 by 2022. That is, the price dropped a millionfold in under twenty years,
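The millionfold figure follows directly from the endpoints, and a quick calculation of my own shows why the Carlson curve is so striking: the implied halving time is under a year, far faster than Moore's law's roughly two-year doubling.

```python
import math

cost_2003 = 1_000_000_000  # ~$1 billion per human genome
cost_2022 = 1_000          # well under $1,000 by 2022

fold_drop = cost_2003 / cost_2022  # a millionfold
halvings = math.log2(fold_drop)    # ~19.9 halvings
years = 2022 - 2003

# Sequencing cost halved roughly every year — outpacing Moore's law.
print(f"cost halved roughly every {years / halvings:.2f} years")
```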
CRISPR gene editing (the acronym stands for clustered regularly interspaced short palindromic repeats) is perhaps the best-known example of how we can directly intervene in genetics.
CRISPR is only the start. Gene synthesis is the manufacture of genetic sequences, printing strands of DNA. If sequencing is reading, synthesizing is writing.
Before long the idea of being treated in a generic way will seem positively medieval; everything, from the kind of care we receive to the medicines we are offered, will be precisely tailored to our DNA and specific biomarkers.
Eventually, computers might also be grown as well as made. Remember that DNA is itself the most efficient data storage mechanism we know of—capable of storing data at millions of times the density of current computational techniques with near-perfect fidelity and stability. Theoretically, the entirety of the world’s data might be stored in just one kilogram of DNA. A biological version of a transistor called a transcriptor uses DNA and RNA molecules to act as logic gates. There is still a long way to go before this technology can be harnessed. But all the functional parts of a computer—data storage, information transmission, and a basic system of logic—can in principle be replicated using biological materials.
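The "DNA is information" claim rests on the fact that each base (A, C, G, T) can encode two bits. A minimal encoder/decoder of my own devising — purely illustrative, not how real DNA storage schemes handle error correction:

```python
# Four bases = two bits per nucleotide.
BASE_FOR_BITS = {"00": "A", "01": "C", "10": "G", "11": "T"}
BITS_FOR_BASE = {b: k for k, b in BASE_FOR_BITS.items()}

def encode(data: bytes) -> str:
    """Turn bytes into a strand of bases, two bits per base."""
    bits = "".join(f"{byte:08b}" for byte in data)
    return "".join(BASE_FOR_BITS[bits[i:i + 2]] for i in range(0, len(bits), 2))

def decode(strand: str) -> bytes:
    """Recover the original bytes from a strand."""
    bits = "".join(BITS_FOR_BASE[b] for b in strand)
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))

assert decode(encode(b"hello")) == b"hello"
print(encode(b"h"))  # one byte fits in just four bases
```

The density argument follows from there: four bases per byte, in molecules a few nanometers across, is what makes "the world's data in a kilogram" conceivable.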
biennial competition—called the Critical Assessment of Structure Prediction (CASP)—to see who could crack the protein folding problem.
Indeed, from one vantage artificial intelligence and synthetic biology are almost interchangeable. All intelligence to date has come from life. Call them synthetic intelligence and artificial life and they still mean the same thing.
What happens when a human mind has instantaneous access to computation and information on the scale of the internet and the cloud? It’s almost impossible to imagine, but researchers are already in the early days of making it happen.
Chapter 6: The Wider Wave
clusters of technologies arriving at around the same time, anchored by one or more general-purpose technologies but extending far beyond them.
Bio and AI are at the center, but around them lies a penumbra of other transformative technologies. Each has immense significance in its own right, but that is heightened when seen through the lens of the greater wave’s cross-pollinating potential. In twenty years there will be numerous additional technologies, all breaking through at the same time.
robotics, or as I like to think of it, AI’s physical manifestation,
Come on down to the automated farm.
John Deere does make all these things. Increasingly, though, the company builds robots. The future of agriculture, as John Deere sees it, involves autonomous tractors and combines that operate independently, following a field’s GPS coordinates and using an array of sensors to make automatic, real-time alterations to harvesting, maximizing yield and minimizing waste. The company is producing robots that can plant, tend, and harvest crops, with levels of precision and granularity that would be impossible for humans.
In 2019, Google announced that it had reached “quantum supremacy.”
Its key attraction is that each additional qubit doubles a machine’s total computing power. Start adding qubits and it gets exponentially more powerful. Indeed, a relatively small number of particles could have more computing power than if the entire universe was converted into a classical computer.
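The doubling claim is just the exponential growth of the quantum state space: n qubits span 2^n amplitudes, so each added qubit doubles what the machine can represent. A toy check of my own:

```python
def state_space(qubits: int) -> int:
    """Number of amplitudes an n-qubit register spans: 2**n."""
    return 2 ** qubits

# Each extra qubit doubles the state space.
assert state_space(11) == 2 * state_space(10)

# A few hundred qubits already exceed the estimated number of atoms
# in the observable universe (~10^80).
ATOMS_IN_UNIVERSE = 10 ** 80
print(state_space(300) > ATOMS_IN_UNIVERSE)
```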
the cryptography underlying everything from email security to cryptocurrencies would suddenly be at risk, in an impending event those in the field call “Q-Day.”
The molecular becomes “programmable,” as supple and manipulable as code.
Energy rivals intelligence and life in its fundamental importance.
(Life + Intelligence) x Energy = Modern Civilization
Renewable energy will become the largest single source of electricity generation by 2027. This shift is occurring at an unprecedented pace, with more renewable capacity set to be added in the next five years than in the previous two decades. Solar power in particular is experiencing rapid growth, with costs falling significantly. In 2000, solar energy cost $4.88 per watt, but by 2019 it had fallen to just 38 cents. Energy isn’t just getting cheaper; it’s more distributed, potentially localizable from specific devices to whole communities.
Fusion and solar offer the promise of immense centralized and decentralized energy grids,
Including wind, hydrogen, and improved battery technologies, here is a brewing mix that
advanced nanotechnology, a concept that takes the ever-growing precision of technology to its logical conclusion.
Chapter 7: Four Features of the Coming Wave
This semi-improvised Ukrainian militia was called Aerorozvidka. A ragtag volunteer band of drone hobbyists, software engineers, management consultants, and soldiers, they were amateurs, designing, building, and modifying their own drones in real time,
The Ukrainian resistance made good use of coming-wave technologies and demonstrated how they can undermine a conventional military calculus. Cutting-edge satellite internet from SpaceX’s Starlink was integral to maintaining connectivity. A thousand-strong group of nonmilitary elite programmers and computer scientists banded together in an organization called Delta to bring advanced AI and robotics capabilities to the army, using machine learning to identify targets, monitor Russian tactics, and even suggest strategies.
The coming wave is, however, characterized by a set of four intrinsic features compounding the problem of containment.
First among them is the primary lesson of this section: hugely asymmetric impact, with previously unthinkable vulnerabilities and pressure points.
Second, they are developing fast,
Third, they are often omni-use; that is, they can be used for many different purposes.
fourth, they increasingly have a degree of autonomy beyond any previous technology.
The very scale and interconnectedness of the coming wave create new systemic vulnerabilities: one point of failure can quickly cascade around the world.
Put simply, innovation in the “real world” could start moving at a digital pace, in near-real time, with reduced friction and fewer dependencies.
In 2020 an AI system sifted through 100 million molecules to create the first machine-learning-derived antibiotic—called halicin (yes, after HAL from 2001: A Space Odyssey)—
They ran a test, asking their molecule-generating AI to find poisons. In six hours it identified more than forty thousand molecules with toxicity comparable to the most dangerous chemical weapons, like Novichok.
synthesizing ammonia was seen as a way of feeding the world. But it also allowed for the creation of explosives, and helped pave the way for chemical weapons.
Technologies of the coming wave are highly powerful, precisely because they are fundamentally general.
A more appropriate term for the technologies of the coming wave is “omni-use,”
If AI is indeed the new electricity, then like electricity it will be an on-demand utility that permeates and powers almost every aspect of daily life, society, the economy: a general-purpose technology embedded everywhere.
That isn’t the case for autonomy. For all of history technology has been “just” a tool, but what if the tool comes to life?
Synthetic organisms are literally taking on a life of their own.
largely beyond our ability to comprehend at a granular level yet still within our ability to create and use.
gorillas are physically stronger and tougher than any human being, but it is they who are endangered or living in zoos;
There comes a point where technology can fully direct its own evolution; where it is subject to recursive processes of improvement; where it passes beyond explanation; where it is consequently impossible to predict how it will behave in the wild; where, in short, we reach the limits of human agency and control.
Chapter 8: Unstoppable Incentives
As Lee Sedol squared up against AlphaGo, DeepMind was represented by the Union Jack, while the Sedol camp flew the taegeukgi, South Korea’s unmistakable flag. West versus East. This implication of national rivalry was an aspect of the contest I soon came to regret.
Across Asia, however, the event was bigger than the Super Bowl. More than 280 million people watched live.
Postwar America took its technological supremacy for granted. Sputnik woke it up.
AlphaGo was quickly labeled China’s Sputnik moment for AI.
It’s not just AI either. From cleantech to bioscience, China surges across the spectrum of fundamental technologies, investing at an epic scale, a burgeoning IP behemoth with “Chinese characteristics.”
producing nearly twice as many STEM PhDs as the United States every year.
The Pentagon’s first chief software officer resigned in protest in 2021 because he was so dismayed by the situation. “We have no competing fighting chance against China in 15 to 20 years. Right now, it’s already a done deal; it is already over in my opinion,” he told the Financial Times.
The debate now isn’t whether we are in a technological and AI arms race; it’s where it will lead.
We can already see achievements like China’s moon landing or India’s billion-strong biometric identification system, Aadhaar, happening in real time.
CRISPR gene editing technology, for example, has its roots in work done by the Spanish researcher Francisco Mojica, who wanted to understand how some single-celled organisms thrive in brackish water.
The railway boom of the 1840s was “arguably the greatest bubble in history.” But in the annals of technology, it is more norm than exception.
plausible economic scenarios suggest it could lead not just to a boost in growth but to a permanent acceleration in the rate of growth itself. In blunt economic terms, AI could, long term, be the most valuable technology yet, more so when coupled with the potential of synthetic biology, robotics, and the rest.
but it would also be unethical from the point of view of the scientists not to do what they know is feasible, no matter what terrible consequences it may have.”
the option of not building, saying no, perhaps even just slowing down or taking a different path isn’t there.
But there’s a problem. States are already facing massive strain, and the coming wave looks set to make things much more complicated.
Part III: States of Failure
Chapter 9: The Grand Bargain
Given that nation-states are charged with managing and regulating the impact of technology in the best interests of their populations, how prepared are they for what’s to come?
co-founded a conflict resolution firm focused on multi-stakeholder negotiation.
what a disaster it would be if the state failed.
invaluable firsthand knowledge of their limitations.
Our institutions for addressing massive global problems were not fit for purpose.
I thought, like many around that time, that globalism and liberal democracy were defaults, the welcome end state of history. Contact with reality was enough to show the gulf between hopeless ideals and the facts on the ground.
The idea that technology alone can solve social and political problems is a dangerous delusion. But the idea that they can be solved without technology is also wrongheaded.
new generation of tools could amplify our ability to act at scale,
I’m British, born and raised in London, but one side of my family is Syrian.
Before we explore the perils of the wave, it’s worth asking about the broad health of nation-states.
The coming wave will land in a combustible, incompetent, overwrought environment.
A Democracy Perception Index poll found that across fifty nations two-thirds of respondents felt the government “rarely” or “never” acted in the public interest.
Behind the new authoritarian impulse and political instability lies a growing pool of social resentment.
Between 1980 and 2021 the share of national income earned by the top 1 percent has almost doubled and now sits just under 50 percent.
AI, synthetic biology, and the rest are being introduced to dysfunctional societies already rocked back and forth on technological waves of immense power. This is not a world ready for the coming wave. This is a world buckling under the existing strain.
technology is political.
managing the coming wave requires confident, agile, coherent states, accountable to the people, filled with expertise, balancing interests and incentives,
The grand bargain is already in trouble. As the deluge begins, a series of new stressors will shake its foundations.
Chapter 10: Fragility Amplifiers
The NHS had been hit by a ransomware attack. It was called WannaCry, and its scale was immense.
WannaCry tricked some users into opening an email, which released a “worm” replicating and transporting itself to infect a quarter of a million computers across 150 countries in just one day.
built using technology created by the U.S. National Security Agency (NSA).
NotPetya cyberattack almost brought the country to its knees.
Technology is ultimately political because technology is a form of power. And perhaps the single overriding characteristic of the coming wave is that it will democratize access to power.
Today, no matter how wealthy you are, you simply cannot buy a more powerful smartphone than is available to billions of people.
Democratizing access necessarily means democratizing risk.
Driving in a heavily guarded convoy down a dusty road to his country house near the Caspian Sea, Fakhrizadeh’s motorcade suddenly screeched to a halt. The scientist’s vehicle was hit by a barrage of bullets.
Researchers at Meta created a program called CICERO. It became an expert at playing the complex board game Diplomacy, a game in which planning long, complex strategies built around deception and backstabbing is integral.
not just a single hospital but an entire health system can be hit; not just a warehouse but an entire supply chain.
A Carnegie Mellon study analyzed more than 200 million tweets discussing COVID-19 at the height of the first lockdown. Eighty-two percent of influential users advocating for “reopening America” were bots. This was a targeted “propaganda machine,” most likely Russian, designed to intensify the worst public health crisis in a century.
ubiquitous, perfect synthetic media means “distorting democratic discourse; manipulating elections; eroding trust in institutions; weakening journalism; exacerbating social divisions; undermining public safety; and inflicting hard-to-repair damage on the reputation of prominent individuals, including elected officials and candidates for office.”
A U.S. risk assessment from 2014 estimated that over a decade the chance of “a major lab leak” across ten labs was 91 percent; the risk of a resulting pandemic, 27 percent.
gain-of-function experiments deliberately engineer pathogens to be more lethal or infectious, or both. In nature, viruses usually trade off lethality for transmissibility. The more transmissible a virus, the less lethal it often is. But there is no absolute reason this must be so.
Put simply, demand is insatiable, and this demand, stoked by the wealth technology has generated, gives rise to new jobs requiring human labor.
automation is unequivocally another fragility amplifier.
But my best guess is that new jobs won’t come in the numbers or timescale to truly help.
Chapter 11: The Future of Nations
At first blush, stirrups may not seem all that revolutionary.
Recall that growing access to power means everyone’s power will be amplified. In the coming decades, historical patterns will play out once again, new centers will form, new infrastructures develop, new forms of governance and social organization emerge.
in 2022, Apple was valued at more than all the companies listed on the U.K.’s FTSE 100 stock exchange combined.
eBay and PayPal’s dispute resolution system handles around sixty million disagreements a year, three times as many as the entire U.S. legal system. Ninety percent of these disputes are settled using technology alone.
In a few decades, I predict most physical products will look like services. Zero marginal cost production and distribution will make it possible.
The only step left is bringing these disparate databases together into a single, integrated system: a perfect twenty-first-century surveillance apparatus. The preeminent example is, of course, China.
This AI-enabled system could spot emerging threats to the CCP like dissenters and protests in real time, allowing for a seamless, crushing government response to anything it perceived as undesirable.
Nowhere does this come together with more horrifying potential than in the Xinjiang Autonomous Region.
“high-tech panopticon”
So what is Hezbollah? State or non-state? Extremist group or conventional territory-based power?
Rather than driving centralization, it might actually spur a kind of “Hezbollahization”: a splintered, tribalized world where everyone has access to the latest technologies, where everyone can support themselves on their own terms, and where it is far more possible for anyone to maintain living standards without the great superstructures of nation-state organization.
This heralds a colossal redistribution of power away from existing centers. Imagine a future where small groups—whether in failing states like Lebanon or in off-grid nomad camps in New Mexico—provide AI-empowered services like credit unions, schools, and health care, services at the heart of the community often reliant on scale or the state.
As people increasingly take power into their own hands, I expect inequality’s newest frontier to lie in biology.
Governance works by consent; it is a collective fiction resting on the belief of everyone concerned. In this scenario the sovereign state is pressured to the breaking point.
The techno-libertarian movement takes Ronald Reagan’s 1981 dictum “Government is the problem” to its logical extreme,
Massively omni-use general-purpose technologies will change both society and what it means to be human.
What happens if the state can no longer control, in a balanced fashion, the coming wave?
Chapter 12: The Dilemma
The history of humanity is, in part, a history of catastrophe. Pandemics feature widely.
Catastrophes are also, of course, man-made.
Now imagine when any half-competent lab or hacker could synthesize complex strands of DNA. How long before disaster strikes?
here’s the dilemma: the most secure solutions for containment are equally unacceptable, leading humanity down an authoritarian and dystopian pathway.
Of all the catastrophic risks from the coming wave, AI has received the most coverage. But there are plenty more.
A novel human transmissible virus with a reproduction rate of, say, 4 (far below chicken pox or measles) and a case fatality rate of 50 percent (far below Ebola or bird flu) could, even accounting for lockdown-style measures, cause more than a billion deaths in a matter of months.
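The arithmetic behind that claim can be sanity-checked with the standard epidemiological final-size equation, which gives the fraction of a population ever infected for a given reproduction number. This is my own illustrative sketch, not a model from the book; the world-population figure and the R0 value of 1.5 (a crude stand-in for lockdown-style measures) are assumptions, while R0 = 4 and the 50 percent case fatality rate are the figures quoted above:

```python
import math

def final_attack_rate(r0, tol=1e-12):
    """Solve the final-size equation z = 1 - exp(-R0 * z) by fixed-point
    iteration, where z is the fraction of the population ever infected."""
    z = 0.5  # initial guess
    for _ in range(1000):
        z_new = 1.0 - math.exp(-r0 * z)
        if abs(z_new - z) < tol:
            break
        z = z_new
    return z

population = 8e9  # rough world population (assumption)
cfr = 0.5         # case fatality rate quoted in the text

for r0 in (4.0, 1.5):  # 4 = unmitigated; 1.5 = crude lockdown scenario
    z = final_attack_rate(r0)
    deaths = population * z * cfr
    print(f"R0={r0}: ~{z:.0%} infected, ~{deaths / 1e9:.1f} billion deaths")
```

Even with interventions pulling the effective reproduction number down to 1.5, roughly 58 percent of people are eventually infected, and at a 50 percent case fatality rate that implies well over a billion deaths, consistent with the claim above.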
The canonical thought experiment is that if you set up a sufficiently powerful AI to make paper clips but don’t specify the goal carefully enough, it may eventually turn the world and maybe even the contents of the entire cosmos into paper clips.
Solving the question of AI alignment doesn’t mean doing so once; it means doing it every time a sufficiently powerful AI is built, wherever and whenever that happens.
This is what happens when anyone is free to invent or use tools that affect us all.
new compounds, new life, new species.
Aum Shinrikyo
Throughout history societal collapses are legion: from ancient Mesopotamia to Rome, the Maya to Easter Island, again and again it’s not just that civilizations don’t last; it’s that unsustainability appears baked in. Civilizations that collapse are not the exception; they are the rule. A survey of sixty civilizations suggests they last about four hundred years on average before falling apart. Without new technologies, they hit hard limits to development—in available energy, in food, in social complexity—that bring them crashing down.
the pressures of a huge and hungry superstructure, a large population, the hard limits of energy and civilizational capacity have not magically gone away; they’ve just been kept at bay.
Modern civilization writes checks only continual technological development can cash. Our entire edifice is premised on the idea of long-term economic growth. And long-term economic growth is ultimately premised on the introduction and diffusion of new technologies.
“the governing models of the post–World War II era do not simply go broke, they become societal suicide pacts.”
A moratorium on technology is not a way out; it’s an invitation to another kind of dystopia, another kind of catastrophe.
If this book feels contradictory in its attitude toward technology, part positive and part foreboding, that’s because such a contradictory view is the most honest assessment of where we are.
for everyone’s sake, containment must be possible.
Part IV: Through the Wave
Chapter 13: Containment Must Be Possible
what might containment, even in theory, look like?
Everyone immediately reaches for easy answers, and almost without exception everyone has the same prescription: regulation.
The unspoken implication being that it’s solvable, but it’s someone else’s problem.
it’s vital to acknowledge a central truth: regulation alone is not enough.
The central problem for humanity in the twenty-first century is how we can nurture sufficient legitimate political power and wisdom, adequate technical mastery, and robust norms to constrain technologies to ensure they continue to do far more good than harm. How, in other words, can we contain the seemingly uncontainable? From the history of Homo technologicus to the reality of an era when technology pervades every aspect of life, the odds are stacked against us. But that doesn’t mean we shouldn’t try.
nations are also caught in a contradiction. On the one hand, they are in a strategic competition to accelerate the development of technologies like AI and synthetic biology. Every nation wants to be, and be seen to be, at the technological frontier.
Regulations like the EU AI Act do at least hint at a world where containment is on the map, one where leading governments take the risks of proliferation seriously, demonstrating new levels of commitment and willingness to make serious sacrifices.
Before outlining a strategy, it’s worth asking the following kinds of questions to surface promising avenues:
Is the technology omni-use and general-purpose or specific?
Is the tech moving away from atoms toward bits?
Are price and complexity coming down, and if so how fast?
Are there viable alternatives ready to go?
Does the technology enable asymmetric impact?
Does it have autonomous characteristics?
Does it confer outsized geopolitical strategic advantage?
Does it favor offense or defense?
Are there resource or engineering constraints on its invention, development, and deployment?
technology, in the popular imagination, has become associated with a narrow band of often superfluous applications.
Pessimism aversion is much harder to sustain when the effects are nakedly quantifiable, as they are with climate change. Technological risk, like climate change, can only be addressed at planetary scale, but there is no equivalent clarity: no handy metric of risk, no objective unit of threat.
The first step is recognition.
Containment of the coming wave is, I believe, not possible in our current world. What these steps might do, however, is change the underlying conditions. Nudge forward the status quo so containment has a chance.
Chapter 14: Ten Steps Toward Containment
Think of the ten ideas presented here as concentric circles. We start small and direct, close to the technology, focusing on specific mechanisms for imposing constraints by design. From there each idea gets progressively broader, ascending a ladder of interventions
1.- SAFETY: AN APOLLO PROGRAM FOR TECHNICAL SAFETY
Addressing the racism and bias in LLMs is an example of how careful and responsible deployment is necessary to advance the safety of these models.
The ultimate control is hard physical control, of servers, microbes, drones, robots, and algorithms. “Boxing” an AI is the original and basic form of technological containment.
A system like this—called an air gap—could, in theory, stop an AI from engaging with the wider world or somehow “escaping.”
The International Atomic Energy Agency has published more than a hundred safety reports tackling specific technical standards for given situations,
Biotech and pharma have operated under safety standards
The main monitor of bioweapons, for example, the Biological Weapons Convention, has a budget of just $1.4 million and only four full-time employees—fewer than the average McDonald’s.
It’s time for an Apollo program on AI safety and biosafety.
require that a fixed portion—say, a minimum of 20 percent—of frontier corporate research and development budgets should be directed toward safety efforts, with an obligation to publish material findings to a government working group so that progress can be tracked and shared.
Think of how all modern photocopiers and printers are built with technology preventing you from copying or printing money,
Think big. Create common standards. Safety features should not be afterthoughts but inherent design properties of all these new technologies, the ground state of everything that comes next.
2.- AUDITS: KNOWLEDGE IS POWER; POWER IS CONTROL
Audits sound boring. Necessary, maybe—but deadly dull. But they are critical to containment.
kick-started an AI Incidents Database, designed for the confidential reporting of safety events so that lessons can be shared with other developers.
Another interesting example is “red teaming”—
3.- CHOKE POINTS: BUY TIME
Chinese technology was, it said, limited by a series of “choke points.” If someone were to pressure those choke points, well, the implication was clear.
The shots fired were export controls on advanced semiconductors, the chips that underwrite computing and so artificial intelligence.
The most advanced semiconductors (generally involving processes under fourteen nanometers, that is, fourteen-billionths of a meter, distances representing as few as twenty atoms)—
It was a bolt from the blue, designed to cut China off from the single most important building block of twenty-first-century technology.
In the long term, though, the controls probably won’t stop China. Instead, they are pushing it down a difficult and hugely expensive but still plausible path toward domestic semiconductor capacity. If it takes hundreds of billions of dollars (and it will), Beijing will spend it.
4.- MAKERS: CRITICS SHOULD BUILD IT
Technology’s critics also have a vital role here. Standing on the sidelines and shouting, getting angry on Twitter, and writing long and obscure articles outlining the problems are all very well, but not enough.
Credible critics must be practitioners.
When I co-founded DeepMind, building safety and ethics concerns into the core fabric of a tech company felt novel. Simply using the word “ethics” in this context got me universally strange looks;
5.- BUSINESSES: PROFIT + PURPOSE
we must find new accountable and inclusive commercial models that incentivize safety and profit alike.
Our proposal was to spin DeepMind out as a new form of “global interest company,” with a fully independent board of trustees separate from and in addition to the board of directors tasked with operationally running the company.
As part of its social and scientific mission DeepMind would use a large portion of its profits to work on public service technologies that might only be valuable years down the line: things like carbon capture and storage, ocean cleaning, plastic-eating robots, or nuclear fusion.
In the end we couldn’t find an answer that would satisfy everyone.
6.- GOVERNMENTS: SURVIVE, REFORM, REGULATE
Governments should not rely on management consultants, contractors, or other third-party suppliers. Full-time, well-respected staffers, compensated competitively with the private sector, should be a core part of the solution. As it stands, private sector salaries can be ten times their public sector equivalents in nationally critical roles: it’s unsustainable.
However, policy makers’ imaginations will need to match the scope of technology. Government needs to go further. For understandable reasons, we don’t let any business build or operate nuclear reactors in any way they see fit.
Different licensing regimes could apply according to model size or capability:
Taxation also needs to be completely overhauled to fund security and welfare as we undergo the largest transition of value creation—from labor to capital—in history.
7.- ALLIANCES: TIME FOR TREATIES
Luckily, it didn’t happen. Use of blinding laser weapons was outlawed under the 1995 Protocol on Blinding Laser Weapons, an update to the Convention on Certain Conventional Weapons
Consider these examples, some of which we discussed earlier: the Treaty on the Non-proliferation of Nuclear Weapons; the Montreal Protocol outlawing CFCs; the invention, trialing, and rollout of a polio vaccine across a Cold War divide; the Biological Weapons Convention, a disarmament treaty effectively banning biological weapons; bans on cluster munitions, land mines, genetic editing of human beings, and eugenics policies; the Paris Agreement, aiming to limit carbon emissions and the worst impacts of climate change; the global effort to eradicate smallpox; phasing out lead in gasoline; and putting an end to asbestos.
Faced with the abyss, geopolitics can change fast.
called for “a global moratorium on all clinical uses of human germline editing—that is, changing heritable DNA (in sperm, eggs or embryos) to make genetically modified children” and “an international framework in which nations, while retaining the right to make their own decisions, voluntarily commit to not approve any use of clinical germline editing unless certain conditions are met.”
A dedicated regulator that navigates contentious geopolitics (as much as possible), avoids overreach, and performs a pragmatic monitoring function on broadly objective criteria is urgently needed. Think of something like the International Atomic Energy Agency or even a trade body like the International Air Transport Association.
I would start with something like an AI Audit Authority—the AAA.
8.- CULTURE: RESPECTFULLY EMBRACING FAILURE
Airlines’ impressive safety record comes down to numerous incremental technical and operational improvements over the years. But behind them is something just as important: culture. The aviation industry takes a vigorous approach to learning from mistakes at every level. Crashes are not just tragic accidents to mourn; they’re foundational learning experiences in determining how systems fail, opportunities for diagnosing problems, fixing them, and sharing that knowledge across the entire industry.
At the Asilomar conference center, they asked the difficult questions thrown up by this new discipline: Should we start genetically engineering humans? If so, what traits might be permissible? Two years later they returned in even larger numbers for the Asilomar Conference on Recombinant DNA. The stakes in that sea-lapped hotel were high. It was a turning point in the biosciences, establishing durable principles for governing genetic research and technology that set guidelines and moral limits on what experiments could take place.
For millennia, the Hippocratic oath has been a moral lodestar for the medical profession. In Latin, Primum non nocere. First, do no harm. The Nobel Peace Prize winner and British-Polish scientist Joseph Rotblat, a man who left Los Alamos on the grounds of conscience, argued that scientists need something similar.
9.- MOVEMENTS: PEOPLE POWER
So, if the invocation of the grand “we” is at present meaningless, it prompts an obvious follow-up: let’s build one.
10.- THE NARROW PATH: THE ONLY WAY IS THROUGH
Just a few days after the release of GPT-4, thousands of AI scientists signed an open letter calling for a six-month moratorium on training the most powerful AI models. Referencing the Asilomar principles, they cited reasons familiar to those reading this book: “Recent months have seen AI labs locked in an out-of-control race to develop and deploy ever more powerful digital minds that no one—not even their creators—can understand, predict, or reliably control.”
One good example comes from the MIT biotechnologist Kevin Esvelt.
To delay, he echoes the language of nuclear technology, proposing a “pandemic test-ban treaty,”
He also advocates an entirely new regime of insurance and liability for anyone working with viruses or other potentially harmful biomaterials.
Then, if the worst happens, defend. Resilient and prepared countries are vital:
Put all the elements here together and there is an outline of what will meet and match the coming wave.
1. Technical safety: Concrete technical measures to alleviate possible harms and maintain control.
2. Audits: A means of ensuring the transparency and accountability of technology.
3. Choke points: Levers to slow development and buy time for regulators and defensive technologies.
4. Makers: Ensuring responsible developers build appropriate controls into technology from the start.
5. Businesses: Aligning the incentives of the organizations behind technology with its containment.
6. Government: Supporting governments, allowing them to build technology, regulate technology, and implement mitigation measures.
7. Alliances: Creating a system of international cooperation to harmonize laws and programs.
8. Culture: A culture of sharing learning and failures to quickly disseminate means of addressing them.
9. Movements: All of this needs public input at every level, both to put pressure on each component and to hold it accountable.
Step 10 is about coherence, ensuring that each element works in harmony with the others,
Over time, countries like the United States entered what they call a “narrow corridor” that kept them in this precarious balance. On either side of this corridor lie traps. On the one hand, the power of the state breaks that of wider society and completely dominates it, creating despotic Leviathans like China. On the other, the state falls apart, producing absent Leviathans, zombies, where the state has no real control over society, as in places like Somalia or Lebanon. Both have terrible consequences for their populations.
Life After the Anthropocene
In the long term, the same industrial technologies that caused so much pain gave rise to a prodigious improvement in living standards. Decades, centuries later, the descendants of those weavers lived in conditions the Luddites could have scarcely imagined, habituated to that precarious world we take for granted. The vast majority of them came home to warm houses in winter, with refrigerators full of exotic food. When they got ill, they received miraculous health care. They lived much longer lives.
The risks of failure scarcely bear thinking about, but face them we must. The prize, though, is awesome: nothing less than the secure, long-term flourishing of our precious species. That is worth fighting for.