sourceType:: book author:: mustafa suleyman sourcePublication:: ref:: noteTitle:: the coming wave; mustafa suleyman. (book)
the coming wave - technology, power, and the 21st century’s greatest dilemma; mustafa suleyman. (book)
Containment is not possible
^c4f821
summary:: we need to contain AI, but it seems we cannot.
some countries will resort to techno-charged authoritarianism to slow the spread. massive surveillance and personal life intrusion. a drift toward dystopian surveillance.
or a Luddite reaction: bans, boycotts. but this is a risk, too. techno-stagnant societies are historically unstable and prone to collapse; a lost capacity to solve problems and progress.
Endless Proliferation
summary:: humans and humanity are a product of our technology, and of the unleashing of new abilities that tech provides. general-purpose technologies proliferate, usually exponentially.
General-purpose waves, the rhythm of history
re: Ford’s ramping up of automobile production. in 1915 only 10% of americans had a car; by 1930: 59%.
a whole way of life, arguably a whole civilization, developed around [automobiles], from sprawling suburbs to industrial farms, drive-thru restaurants to car mod culture. Vast highways were built, sometimes right through cities, severing neighborhoods but connecting far-flung regions. The previously challenging notion of moving from place to place in search of prosperity or fun became a regular feature of human life. […engines] were driving history.
over time, demand for new products and services grows; competition results in cheaper versions with more features. this drives more demand for the underlying technology, which itself becomes cheaper and easier. costs fall, capabilities rise, experiments are tried - repeat, repeat.
inescapable evolutionary nature of technology. (similar:: ref_the-nature-of-technology_w-brian-arthur)
we are not just the creators of our tools, we are a product of them, biologically, anatomically (cooking food, brought more energy and freed humans to fashion tools and build complex social networks) ^6c8d11
the campfire was the central hub of human life, establishing communities, relationships and labor organization.
large populations (enabled by technology) give rise to greater specialization (variation-between-individuals-is-required-for-societal-success). artisans, scholars - livelihoods not tied to land and subsistence. more inventors, more reasons to invent. these mean more people - another feedback loop. cities are especially centers of technological development.
Proliferation is the default
Gutenberg’s press: as demand soared, costs plummeted. the printing press’s introduction in the 15th century caused a 340-fold decrease in the price of a book (further driving adoption and thus more demand) ^e01b2a
electricity: in 1900, 2% of fossil fuel was for electricity, in 1950 10%+, 2000 30%+. in 1900, global electricity was at 8 terawatt-hours, 50 years later 600 and a transformed economy.
the same amount of labor that produced 54 minutes of quality light in the 1700s now produces more than 50 years of light. a 21st-century person has access to 438,000x more light-hours per year than a 1700s person.
Bell Labs created the transistor in 1947. it’s a semiconductor switch - the building block of logic gates for performing calculations. ^d33ac3
since the 1970s the number of transistors per chip has increased 10 million-fold. their power increased by 10 orders of magnitude: a 17-billion-fold improvement. ^0acebe
1958: one hundred transistors were sold for $150, now tens of trillions are produced per second at billionths of a dollar per transistor. Fastest, most extensive proliferation in history. (similar:: dyson-sphere-program - when you play a factorio-like game, you get a visceral feeling for what this means, sped up through time.)
The Containment Problem
Revenge effects
trying to understand a technology’s unintended consequences. There are the “positive” spillover effects, but what about the bad “revenge effects”? any technology has the potential to go wrong in entirely unanticipated ways
Containment is about control: stopping research directions, denying a bad actor, etc.
Containment is the foundation
the balance of power is not between competing actors, but between humans and our tools.
Containment encompasses regulation, better technical safety, new governance and ownership models, and new modes of accountability and transparency, all as necessary (but not sufficient) precursors to safer technology. It’s an overarching lock uniting cutting-edge engineering, ethical values, and government regulation. Containment shouldn’t be seen as the final answer to all technology’s problems; it is rather the first, critical step, a foundation on which the future is built.
Have we ever said no?
Nation states have often pushed back on technological advancements. Ottoman empire against the printing-press, Pope Urban II with the crossbow, Elizabeth I with a knitting machine. [most of these were about preserving hegemony - of the state, of guilds, of treasure, etc.]
luddites are not the exception, they are the norm.
people are worried about their livelihoods, ways of life, futures for their children. they will resort to physical violence.
but the luddites rarely succeed. where there is demand, technology always breaks out and finds users. and the proliferation cycle begins, all but unstoppable.
it’s not that the containment problem hasn’t been recognized in the past: the solutions have just never been enough.
The nuclear exception
not only are nukes hard and costly to build, but they are scary.
the cold calculation of mutually-assured destruction has hemmed the problem in.
similar:: nuclear-near-misses
Humanity has tried to say no and only partially succeeded. nukes are the most contained tech in history and are barely at bay.
The Technology of Intelligence
summary:: AI scaling is not beholden to any known technological trend line and has already scaled in unparalleled ways. genuinely capable AI is inevitable over the next handful of years - the existence of systems that can generate any tool, system, or platform completely autonomously will mark a turning point in human history
Alphago and the beginning of the future
AlphaGo, trained on human knowledge and then practicing on itself, beat Sedol 4-1. AlphaZero started with zero human knowledge and merely played itself millions of times. and it trounced AlphaGo.
With just a day of training, AlphaZero learned more about Go than what the entirety of human experience could teach it.
From atoms, to bits, to genes
our control over atoms is precise and complex. we can control material down to nearly the fundamental level.
in the mid-twentieth century we realized information is a core property of the universe - whether encoded in binary format or in DNA.
Bits (and then genes) supplanted atoms as the building blocks of invention. But we had reached a limit of abstraction and complexity - until recently.
We’re now breaching those limits, beginning to operate on the foundational levels of intelligence (rivaling our own) and life (that we engineer).
This is a phase transition, not a mere tool upgrade. ^108b61
A cambrian explosion
The more technologies there are, the more they can in turn become components of other new technologies, so that, in the words of the economist [[w-brian-arthur]]❌, ^2a9ce5
“the overall collection of technologies bootstraps itself upward from the few to the many and from the simple to the complex.” Technology is like chemistry: commingling sets of parts to combine and recombine.
Along with evolutionary superclusters, the coming wave also has speed: the “law of accelerating returns” feedback loops (from:: [[ray-kurzweil]]❌). i.e., materials work of greater complexity and precision creates more powerful chips, which can in turn help design even better chips through AI-assisted design.
Autocomplete everything: the rise of large language models
LLMs leverage the fact that language - as data - comes in sequential order, where each unit of data is related to the unit before it. Using this, the model predicts what should come next. The algorithm must “know where to look” for important signals in a given sentence - this is “attention”. It does this by tokenizing groups of letters and building up its own vocabulary of tokens and their sequential relationships across billions of documents. This is how it learns which words to pay attention to, and thus how to “understand” what is important about a sentence - all in the service of predicting what’s next.
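A minimal, illustrative sketch of those ideas in Python/numpy - whitespace tokenization, causal attention, and scoring the next token. Everything here (the toy tokenizer, random embeddings, identity projections) is an assumption for illustration; it is not how any production LLM is actually built or trained.

```python
# Toy sketch: tokenize -> causal self-attention -> score candidate next tokens.
# Embeddings are random stand-ins for what a real model would learn from data.
import numpy as np

rng = np.random.default_rng(0)

def tokenize(text, vocab):
    # Real models learn sub-word tokens; here we just split on whitespace.
    return [vocab.setdefault(w, len(vocab)) for w in text.lower().split()]

def causal_attention(x):
    # x: (seq_len, d) token vectors. Queries/keys/values are the vectors
    # themselves in this toy; real models use learned projection matrices.
    d = x.shape[-1]
    scores = x @ x.T / np.sqrt(d)                      # how much each token attends to each other
    mask = np.tril(np.ones_like(scores, dtype=bool))   # a position only sees itself and earlier tokens
    scores = np.where(mask, scores, -1e9)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ x                                 # each position: weighted mix of what it attended to

vocab = {}
tokens = tokenize("the coming wave of technology", vocab)
embeddings = rng.normal(size=(len(vocab), 8))          # stand-in embedding table
context = causal_attention(embeddings[tokens])

# "Predict" what comes next: score every vocabulary entry against the last position.
logits = embeddings @ context[-1]
print("toy next-token guess:", max(vocab, key=lambda w: logits[vocab[w]]))
```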
We’re just scratching the surface of these kinds of tools. in 1996, 36 million people used the internet. this year it will be over 5 billion. We should expect a similar - though much faster - trajectory for these tools.
Brain-scale models
A key piece of the LLM revolution is that it obviated the need for human-labelled data: very large models could be trained on messy, real-world data.
A human, reading at 200 words per minute over a lifetime of doing nothing else for 24 hours a day would read about 8 billion words.
A typical LLM is trained on trillions of words over the course of months. many orders of magnitude more than an unrealistic human lifetime. This is truly alien.
How to understand one petaFLOP: a billion people, each person holding one million calculators doing complex multiplication, all hitting = at the same time.
The amount of compute used to train the best AI models is 10 billion petaFLOPS. (and this is up from two petaFLOPS ten years ago).
Google’s PaLM: if you had a drop of water for every FLOP it used during training, it would fill the pacific.
This kind of exponential increase far outstrips Moore’s Law or any other technological trajectory we’ve ever seen.
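A quick back-of-the-envelope check of the note’s figures (taking “petaFLOPs” as a count of total training operations, which is an assumption on my part):

```python
# Back-of-the-envelope arithmetic using only the figures quoted in the note.
import math

PETA = 1e15
compute_then = 2 * PETA      # "two petaFLOPs ten years ago"
compute_now = 10e9 * PETA    # "10 billion petaFLOPs" for today's largest training runs
# (The note's petaFLOP analogy: a billion people x a million calculators = 1e15 operations.)

growth = compute_now / compute_then
print(f"growth in training compute over ~10 years: {growth:.1e}x")      # ~5e9x

moore = 2 ** (10 / 2)        # Moore's-law-style doubling every 2 years, for comparison
print(f"Moore's law over the same decade: ~{moore:.0f}x")               # ~32x

# Implied doubling time, assuming smooth exponential growth over the decade.
doubling_months = 12 * 10 * math.log(2) / math.log(growth)
print(f"implied doubling time: ~{doubling_months:.1f} months")          # a few months, not two years
```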
While hardware is hitting limits at the laws of physics - there’s only so far you can shrink a transistor - with AI training you can just connect larger and larger arrays of chips into massively parallel supercomputers.
There’s no end in sight to the scaling potential.
Capabilities: a modern turing test
ACI: “artificial capable intelligence”. ask an AI to “make $1 million on Amazon in a few months with a $100,000 investment”. This is entirely doable with some human intervention within the next year, entirely autonomously within five.
So rather than get too distracted by consciousness, focus the debate around near-term capabilities and how they’ll evolve in the next few years. The capabilities we’ve seen so far will be dwarfed by the ability to complete complex, multi-step, end-to-end tasks fully autonomously.
You don’t need superintelligence or consciousness to entirely upset the global economy and social order - you just need genuinely capable intelligence. not a tool or a platform itself, but a maker of tools and platforms and systems of any kind.
This is a turning point in human history.
The Technology of Life
DNA is information. we can now directly alter this information as an encoding.
FDA Approves World’s First Crispr Gene-Editing Drug for Sickle-Cell Disease
DNA scissors: the crispr revolution
The Carlson curve: the collapse in DNA sequencing costs. the cost of human genome sequencing fell from $1 billion in 2003 to under $1000 by 2022. a millionfold drop. one thousand times faster than Moore’s law.
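A quick sanity check of that comparison using the note’s own numbers; the two-year doubling period used as the Moore’s-law baseline is an assumption:

```python
# Carlson curve vs. Moore's law, using the figures quoted in the note.
cost_2003, cost_2022 = 1e9, 1e3            # dollars per human genome
years = 2022 - 2003

drop = cost_2003 / cost_2022
print(f"cost drop: {drop:.0e}x")                                        # 1e6 -> the "millionfold drop"

moore_equivalent = 2 ** (years / 2)        # what a 2-year doubling alone would deliver
print(f"Moore's law over {years} years: ~{moore_equivalent:.0f}x")      # ~724x

print(f"sequencing improved ~{drop / moore_equivalent:.0f}x faster")    # ~1400x, i.e. "a thousand times faster"
```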
DNA printers: synthetic biology comes to life
we can now print millions of pieces of DNA at once.
DNA Script is one such company that trains and adapts enzymes to build completely new molecules - synthetic biology. A field with the ability to read, edit and write the code of life.
Biological creativity unleashed
bioinformatics and computational biology let us understand the whole, unique organism: personalized medicine. Soon, the idea of being treated generically will seem medieval.
Computers may eventually be grown rather than made. DNA is the most efficient data storage mechanism known. Millions of times the density of current digital techniques with near perfect fidelity and stability.
The entirety of the world’s data could be stored in one kilogram of DNA.
A biological transistor (a “transcriptor”) uses DNA and RNA as logic gates.
This is a way off, but in principle possible purely biologically.
AI in the age of synthetic life
Knowing a DNA sequence isn’t enough; you need to know how it folds. brute-forcing this could take longer than the age of the universe.
But AI has cracked this. the team that solved this did so not by domain knowledge in biology, but just with expertise and capability in machine learning.
Used to take a team weeks or months to know a protein’s shape and folding; now it takes seconds.
artificial intelligence and synthetic biology can almost be seen as interchangeable. both fields are about recreating, re-engineering the foundational concepts of humanity: life and intelligence.
The Wider Wave
waves don’t just bring their own developments, but downstream effects: steam-driven factories, steam-driven trains carrying people to new places. software businesses, down to everything that relies on computing.
Robotics comes of age
like AI, starts with human supervision, ends up with self-directed AI, eventually generalizing to new settings.
Micah Xavier Johnson - the Dallas shooter - was killed by a police robot in 2016, on the orders of police chief David Brown. the first time a robot used targeted lethal force in the US.
Robotics slowly working its way into society.
Quantum supremacy
Chemistry and biology become fully legible for the first time.
costly, tricky lab work becomes easy. new batteries and drugs become more realizable. the molecular world becomes as programmable as code.
The next energy transition
(Life + Intelligence) * Energy = Modern Civilization
nuclear fusion: clean, abundant energy. the release of energy when isotopes of hydrogen collide and fuse to form helium. the holy grail of energy production.
National Ignition Facility in Livermore, California. inertial confinement - compressing hydrogen-rich material with lasers and heating them to 100 million degrees. Creates a fleeting fusion reaction. in 2022 they got their first net-gains (more energy created than used by the lasers).
With capital flowing into fusion startups (plus international collaboration), it’s now “when, not if”. A future of clean and limitless energy is looking real.
The wave beyond the wave
Nanotech: manipulating atoms individually. logical conclusion of bits/atoms relationship. nanomachine speed is far beyond anything.
an atomic-scale nanomotor rotates at 48 billion RPMs. at scale, would power a Tesla with 12 grains of sand worth of material volume.
Previous tech waves were about reducing the cost of broadcasting information. this one reduces the cost of acting on it. Qualitatively different. sequencing -> synthesis. reading -> writing. editing -> creating. imitating conversations -> leading them.
Even harder to centralize and oversee. An entire break from previous historical patterns.
Four Features of the Coming Wave
^7e208e
Much of Ukrainian resistance equipment was crowdsourced and crowdfunded. drone hobbyists, software engineers and soldiers were able to modify and deploy their drones in real-time, like a start-up.
Allowed a small, agile entity to defy conventional military calculus. ^e004b1
Containment problem compounded by new technology’s characteristics: ^86bcef
- hugely asymmetric impact. new approaches for unthinkable vulnerabilities of dominant powers
- quickly developing via hyper-evolution. incredible speed
- omni-use / general / multi-purpose
- autonomy beyond anything seen
Asymmetry: a colossal transfer of power
re: Ukraine vs Russia ([[]]❌) represents a shift from traditional military power to anyone able to deploy certain tech. no reason why one person couldn’t operate a network like this.
Network scales make containment impossible. we already live in a globally-interlinked world - soon, a single point in that system could alter everything.
Hyper-evolution: endless acceleration
There are massive software evolutions, but consider even just Moore’s law over the next decade: in 10 years, $1 will buy 100x the compute of today. This compounds, as everything is reducible to compute. W. Brian Arthur, again: [[]]❌
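A one-line check that the note’s “$1 buys 100x the compute in 10 years” is just the familiar Moore’s-law cadence restated:

```python
# Implied doubling time for price-performance, from the note's 100x-in-10-years figure.
import math
print(f"doubles every ~{10 * math.log(2) / math.log(100):.1f} years")   # ~1.5 years, the classic ~18-month cadence
```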
Omni-use: more is more
^8479fd
Automated drug discovery. re: pharmaceutical AI advancements: to date, 18 clinical assets have been derived with help from AI tools.
At launch, the PlayStation 2 was regarded by the US Department of Defense as powerful enough to potentially help hostile militaries (who were traditionally denied access to such hardware).
Omni-use technologies have wider societal spillover than narrow tech. Electricity is an on-demand resource that permeates everything - daily life, society, economy. What happens when AI is the same? A general-purpose tech embedded everywhere. This is not containable.
Autonomy and beyond: will humans be in the loop?
Coming-wave technologies are mostly beyond our comprehension - yet we can still create and use them. autonomy means not necessarily being able to predict what they are going to do.
The gorilla problem
Stuart Russell’s “gorilla problem”: gorillas are physically stronger and tougher than humans, yet they are either endangered or living in zoos. our intellect allows us to contain them. Something smarter than us could contain us.
Unstoppable Incentives
In the Alphago v Sedol competition, Alphago had a British flag, Sedol a Korean one. The impact of this “turf war” appearance was largely an accident.
In Asia, everyone was watching - bigger than the super bowl. ^68f320
The challenger, a Western firm, London based, American owned, had just marched into an ancient, iconic, cherished game, literally put its flag in the turf, and obliterated the home team.
This got Asia’s attention. Underscored the arms race.
Macro-drivers behind tech development and spread:
- great power competition. innovation is power. keep up with the Joneses
- global research ecosystem rewarding publication and pursuit of new ideas
- immense financial gains from tech and the need to tackle global social challenges
- ego
National pride, strategic necessity
With Sputnik, the USSR was first to space. it was a crisis for America - a technological Pearl Harbor. science and tech became a national priority from education to government to the private sector.
Enormous investment led to Apollo and the US putting the first person on the moon. USSR nearly destroyed itself economically to keep up.
Sputnik put the USA on a path to superpower status in rocketry and computing (with all their downstream effects). Something similar is happening in China right now wrt AI - kicked off by AlphaGo [[]]❌.
China’s New Generation Artificial Intelligence Development Plan:
By 2030, China’s AI theories, technologies, and applications should achieve world-leading levels, making China the world’s primary AI innovation center
A top-down approach means China can marshal this kind of strategy with the full force of the state.
The US is losing its strategic lead. Most in the West don’t appreciate the extent. China is ahead in green energy, 5G, AI and on track for quantum and biotech soon.
The Pentagon’s chief software officer resigned, saying the US has no chance against China in 15-20 years; it’s a done deal.
The arms race
A country could feasibly clamp down on innovation at home, but it cannot do the same to its geopolitical rivals.
It doesn’t even matter if you’re “not” in an arms race: the common line of thinking is that you have to assume the other side thinks you are, and therefore you must race to keep up with their race. it’s self-fulfilling. [[moloch]]❌
in the 1950s, Khrushchev bluffed (ICBM tests and Sputnik), the US thought there was a missile gap. it went crazy, bringing nukes and ICBMs forward by decades. Turned out the US was ahead 10-1 all along.
We’re already in the arms race. It’s too late now.
Knowledge just wants to be free
we’re in an age of globe-spanning, open, system of developing knowledge and technology. nearly impossible to steer, govern, or shut-down (what about “malinformation” and scientific research suppression during covid? covid-information-research-suppression. That seemed to work pretty well.)
Openness saturates research culture. Innovations diffuse fast, far and disruptively. Often, papers are announced on twitter with social media influence in mind. Modern research’s optimization for curiosity and sharing works against containment. ^fa53e7
Research at the frontier is hard to predict. For instance, GPUs were designed to deliver realistic graphics for video games but ended up being [[|omni-use]]❌ good for AI. This was basically just lucky - especially for NVIDIA (1,000% share price rise during AI boom).
The $100 trillion opportunity
Railways in England: connections meant regions and cities boomed; tourism, trade and family life were transformed. The profit motives of developers and investors drove the craze.
[[|Curiosity and sharing]]❌ alone don’t propel breakthroughs into the hands of billions. Raw science has to be converted to desirable products to satisfy customers. Most technology is made to earn money
And this potential for profit is built off something even more primitive: raw demand. people need and want the fruits of technology - whether for subsistence or flourishing.
Go back just a few hundred years and economic growth was almost nonexistent. Living standards stagnated for centuries at unfathomably worse levels than today. In the last two hundred years, economic output is up more than three hundred times. Per capita GDP has risen at least thirteenfold over the same period. At the beginning of the nineteenth century, almost everyone lived in extreme poverty; now, globally, it sits at around 9 percent. Exponential improvements in the human condition, once impossible, are routine.
^9ee0e1
In a developed economy, people work less for higher rewards. in Germany, annual working hours decreased by 60% since 1870 ^9d0b38
What we see around us is the product of human intelligence put into direct pursuit of monetary gain (or just material gain. “monetary” makes it sound like the point is money.) ^0e1a91
The coming wave is the greatest economic prize in history. To contain it would mean containing global capitalism - somehow convince other people to leave this opportunity for wealth and power on the table.
PwC forecasts AI will add $15 trillion to the global economy by 2030.
McKinsey forecasts $4 trillion from biotech. Boosting robotic installations would mean another $5 trillion unleashed. Few other sources of growth come close.
Generalist AI would not only foster a cycle of growth, but would permanently increase the rate of growth going forward. AI could be the most valuable tech ever (not to mention coupling with synthetic biology and robotics)
Global challenges
Malthus in 1798 said the capacity of agriculture would be outstripped by a fast-growing population and lead to societal collapse. This is true for static yields, but it doesn’t anticipate human ingenuity. now, yields are 16x what they were in the 13th century. The same soil and geography - but technological advancements. 1kg of grain is 98% cheaper (by labor) to produce than in the early 1800s.
today, 10% of the world is undernourished. in 1945 the population was a third of today’s and undernourishment was still at 50%.
Ammonia, cement, plastics and steel: Vaclav Smil calls these the pillars of modern civilization. They require massive amounts of fossil fuels - but without them modern life stops.
The average tomato requires 5 tablespoons of oil to produce.
Imagine this (positive) future world: modified phytoplanktons and trees that help deal with carbon, personalized drugs, AI-generated fertilizer compounds and weather prediction.
Technology isn’t a downside-free panacea, but meeting this century’s challenges without new technologies is impossible.
Ego
Engineers are often curiosity-driven above all else; almost as a moral imperative.
“When you see something that is technically sweet, you go ahead and do it, and you argue about what to do about it only after you have had your technical success.”
- Robert Oppenheimer
“What we are creating now is a monster whose influence is going to change history, provided there is any history left, yet it would be impossible not to see it through, not only for military reasons, but it would also be unethical from the point of view of the scientists not to do what they know is feasible, no matter what terrible consequences it may have”
- John von Neumann
Technologists often aspire to a mythological, heroic image of a founder single-handedly building an empire despite a hostile and ignorant world.
“if we don’t do it, someone else will come along and do it instead”. Everything leaks, is copied, is improved by others.
Technology is an indispensable mega-structure suffusing every aspect of daily life and the economy. has been for some time.
The Grand Bargain
The promise of the state
The nation state, the central unit of the political world order, offers benefits that supposedly outstrip the risks of centralized power. monopoly on power is the best way to enable peace and prosperity. That’s the bargain. [except it’s bullshit to call it a bargain, because it’s not like you’re allowed to say no. it’s more like extortion.]
Checks and balances are supposed to keep a dystopian centralized power out of everyday life to a degree, while still allowing enough intervention to maintain order.
Lessons from Copenhagen: politics is personal
local government, UN negotiations, and nonprofit work showed the author the limitations of government and politics. at the Copenhagen climate negotiations, where everyone was “on the same team”, every suggestion became a problem and a spiraling argument. maximum divergence on all things.
Local London politics was the same - excuses, blame, media spinning, even when legal responsibility was clear.
Technology alone cannot solve social and political problems, but attempting to do so without technology is also wrongheaded.
the tech industry has an influential minority that welcomes the demise of the nation state. the author thinks this would be a disaster (cites Syrian state failure and how it could “happen here”.)
Fragile states
Democracies require trust. Most of all that power-holders won’t abuse their dominance.
But trust has collapsed recently - in America and more broadly. Across the Obama, Trump and Biden administrations, trust in government has fallen below 20%. 1 in 5 believe “army rule is good”. 85% of Americans feel the country is headed in the wrong direction. The political duopoly doesn’t address customer complaints. this general sentiment extends to media, science and general “expertise”.
across 50 democratic nations, 2/3 felt the government “rarely” or “never” acted in public interest. This breeds negativity and apathy, voter turnout craters.
Globally, there are energy shortages, inflation, bad incomes, trust breakdowns, populism waves, lack of interest in the traditional answers of the left and right. Even while we’re experiencing the highest living standards the world has known. [[everyone-is-unhappy-but-things-have-never-been-better-why]]❌
Technology is political: the wave’s challenge to states
Technology is a driver in the demise of the nation state on the world stage. impossible to contain, strategically critical, relied on by billions, global, fast-developing. A prime actor in global politics.
This coming wave is being introduced to a global political stage that is already on the back foot.
technology doesn’t necessarily cause behaviors or outcomes, but does guide or circumscribe potentials.
Technology and the modern state evolved symbiotically (or did it?): ^fa48d2
- Writing: admin/accounting tool to keep track of debts, law, taxes, ownership
- Clock: produced set times in monasteries, then mercantile cities, then across nations; helped to create common and larger social units.
- Printing press: produced a standardized language from dialects, created a national imagined community; a unified people “behind the nation state”. fixed knowledge, geography and history in place. promulgated ideology, law, culture.
- TV/Radio: supercharged printing. Produced national and international commonality and shared experience (like the World Cup or presidential speeches).
War created the state and the state created war. Weapons are technologies central to the power wielded by nation states.
By the [[hundred-years-war]]❌, offensive capability gave advantage to whoever could deploy capital-intensive cannons. Over time, the state concentrated this lethal power into its own hands.
The last century’s political clash (of communism vs capitalism I guess?) saw the modern, liberal democratic state emerge as the dominant global force in the 20th century. It provided a few defining functions:
- provision of security
- great violent power concentrated in the center + sensible checks and balances on this power
- welfare redistribution
- frameworks of technological innovation and regulation
- socioeconomic legal architecture of globalization
The author sees two directions next, both of which topple the balance of the state:
- some democratic states will erode from within, increasingly fractious and unstable. zombies who are degraded and dysfunctional ^601cdf
- unthinking adoption of coming wave tech, creating supercharged leviathan authoritarianisms ^8a88b7
What is needed to navigate this historic disruption and extract value without coming to ruin is: states that work extremely well.
Fragility Amplifiers
National emergency 2.0: uncontained asymmetry in action
WannaCry ransomware attack on Microsoft Windows systems: up to $8 billion in damage and deeper implications: exposing cyber vulnerabilities in institutions we take for granted (NHS, Deutsche Bahn, Telefonica, FedEx, Hitachi, Chinese Ministry of Public Security).
WannaCry was built on an exploit developed by the NSA and later leaked: “the keys to the kingdom, designed to undermine the security of a lot of major government and corporate networks both here and abroad”.
Even after patches, in 2017 a new attack using the same exploit emerged (“NotPetya”), targeted at Ukrainian national infrastructure; attributed to Russia. Massive swaths of basic infrastructure were frozen, from ATMs and mobile phones to power plants and savings banks.
Uncontained asymmetry in action: one of the world’s most failed and corrupt states acquires advanced tech from the world’s most technologically advanced state and uses it to attack and paralyze contemporary infrastructure. A total containment failure.
Surprisingly though, the fix didn’t come from the state itself. It came from a private company - Microsoft - and a lone individual affiliated with neither the state nor Microsoft. Neither the attack nor the fix respected national boundaries or established institutions.
Subsequent attacks will be magnitudes more advanced - AI learning on the fly and self-replicating beyond control. The nation state obviously cannot remain the sole recognized arbiter of security.
And providing security is one of the state’s core obligations under the grand bargain.
The plummeting cost of power
Technology is political because it is power. the coming wave will democratize access to power. Power will be amplified for anyone with goals: either good or ill.
The poorest person and the wealthiest billionaire both have the same state of the art smartphone. In the next decade, ACIs will follow the same trend. Everyone will have a world class team of experts in their corner.
Democratizing access means democratizing risk.
This is a critical threshold in the history of humans. This is the challenge the nation state faces.
Robot with guns: the primacy of offense
armed (physical) robots and maliciously-deployed AIs will reduce the barriers to violence.
Corporate AIs could ingest all legal code and find loopholes or arbitrage opportunities. To mire competitors with nuanced lawsuits, attack them through automated trading and disinformation, engineer banking runs or boycotts, etc.
Meta created CICERO, which is expert at playing Diplomacy - a game that requires deception and complex social strategies.
In general, complex and difficult-to-trace offense initiated by non-state bad actors will be empowered. This degrades one of the core pillars of the state: providing security. If the state can’t reliably provide security, electricity, schools, transportation, etc. what’s the point of complying with the grand bargain?
There has always been a delicate dance between offensive and defensive capabilities; countermeasures quickly arise for each new kind of attack. But now, powerful, asymmetric, omni-use tech will reach the hands of those who wish to damage the state. The [[]]❌ strongly favor offense at the moment. Defense may catch up, but there will be a dangerous gap at first.
The misinformation machine
As deepfakes, phishing attacks, manipulations along fault lines like race and other sects, scams, etc become prevalent, trust is damaged and fragility is amplified. Rich synthetic histories and supercharged conspiracy theories will be easy to generate; individuals won’t have the time nor ability to verify a fraction of what they see.
State-sponsored info assaults
Russia, China, the CIA all engage in info ops.
A Brookings Institution report says of a synthetic media “Infopocalypse”:
Ubiquitous, perfect synthetic media means “distorting democratic discourse; manipulating elections; eroding trust in institutions; weakening journalism; exacerbating social division; undermining public safety; and inflicting hard-to-repair damage on the reputation of prominent individuals, including elected officials and candidates for office”
Leaky labs and unintended instability
Biosafety Level 4 (BSL-4) labs have the highest containment standards in the world.
Still, there have been very many accidents: the 1977 Russian flu, the 1979 Soviet anthrax release, a 2007 UK leaking pipe that caused a foot-and-mouth outbreak, a 2021 smallpox near-miss near Philadelphia. SARS has escaped in Singapore, Taiwan, and China, including four times from the same lab in Beijing.
Most biosecurity officers don’t report accidents publicly, promptly, or at all beyond their institution. a 2014 US risk assessment estimated that, over a given decade across 10 labs, the chance of a major lab leak was 91%; the risk of a resulting pandemic: 27%!
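A hedged sketch of how those aggregate figures could compose. The independence assumption and the derivation below are mine, not the risk assessment’s:

```python
# Assuming 10 identical, independent labs (my assumption, not the source's).
leak_any = 0.91       # chance of >= 1 major leak across 10 labs in a decade (from the note)
pandemic = 0.27       # chance of a resulting pandemic in that decade (from the note)
n_labs = 10

# Per-lab, per-decade leak probability p solving 1 - (1 - p)^10 = 0.91:
p_per_lab = 1 - (1 - leak_any) ** (1 / n_labs)
print(f"implied per-lab decade risk: {p_per_lab:.0%}")               # ~21%

# If 27% is the decade-level pandemic risk, the chance a leak escalates is roughly:
print(f"escalation risk given a leak: ~{pandemic / leak_any:.0%}")   # ~30%
```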
There is indication that covid-19 has been genetically altered, and growing circumstantial evidence that it leaked from Wuhan. sars-cov-2-lab-leak-hypothesis. FBI and US Dept of Energy believe this is the case. CIA is undecided
The automation debate
Broadly, technology has tended to create new jobs as it destroyed old ones. These new ones tended toward cognitive, white-collar work. Factories closed and demand for lawyers, designers and influencers boomed. But if the next wave of job-displacing systems itself displaces white-collar jobs, there won’t be anywhere up the “cognitive ladder” for people to turn next.
Some maintain that new technology always creates demand for new labor: making companies more productive generates more money which flows back into the economy. Demand is insatiable, basically - and demand stoked by technology-driven wealth gives rise to new human-labor-requiring jobs. There is not “one lump of labor” to go around. Instead: future is billions of people working in high-end jobs we can’t even fathom yet.
But whether it’s awful or wonderful in the long-run, most accept that automation will produce significant medium-term disruption. Hundreds of millions will need to transition to different work and potentially re-skill, which involves political ramifications through stressors: underemployment, broken government finances, insecure and angry populations.
Difference this time is the revolution won’t be limited to industrial niches, it will be everywhere. everyone will experience the “falling cost of power”.
Power is redistributed and reinforced across all of society. Omni-use tech will be found in every sector at all levels, subcultures and corners of the world. producing trillions of dollars in new economic value while destroying other sources of wealth. Some will gain a lot, others will lose everything - an enormous shake-up. Militarily it empowers militias and nation-states alike.
Rather than amplifying specific fragilities, in the long term it alters the foundations of society. A redistribution of power that leaves the already fragile state further shaken and the grand bargain precarious.
The Future of Nations
Delicate balances of power will be pushed and broken, with the nation state subject to both extreme centralization and extreme fragmentation. a long-term macro trend toward instability will play out over decades in this post-sovereign world.
The first result will be concentrations of power and wealth that reorder society.
returns on intelligence will compound exponentially. A select few artificial intelligences that we used to call organizations will massively benefit from a new concentration of ability - probably the greatest such concentration yet seen. Recreating the essence of what’s made our species so successful into tools that can be reused and reapplied over and over, in myriad different settings, is a mighty prize, which corporations and bureaucracies of all kinds will pursue, and wield. How these entities are governed, how they will rub against, capture, and reengineer the state, is an open question. That they will challenge it seems certain.
The stirrup
The stirrup was a simple innovation that changed hundreds of millions of lives and altered the course of human society. By fixing riders to their horses, this tiny triangle made it possible for cavalry charges to break the strongest lines of infantry holding shields (for a rider, charging into a shielded soldier previously meant getting knocked off your horse) and pushed the balance of power in favor of offense. But raising and maintaining horses and cavalry units is expensive (and lucrative for those who did it); the elites created out of the cavalry industry promised their arms to the king in exchange for wealth and status. This expansive network of obligations - lords, jousting, blacksmiths, castles - became feudalism, the dominant political form of society for a thousand years.
Concentrations: the compounding returns on intelligence
The most powerful forces in the world are not individual intelligences (whether natural or AI), but groups of people coordinating to achieve shared goals. Companies, organizations, militaries, markets. They collect and process huge amounts of data for specific goals and build mechanisms to improve their success with that goal.
The services and responsibilities of a big tech company like Google span a swath of services that deeply entwine with the lives of many in a way second only to the nation state. “Googlization” enables massive sections of the economy and the human experience. This isn’t unique to Google: the combined revenues of Fortune’s Global 500 are 44% of world GDP, and their total profits would rank 7th in the world if compared against national GDPs. Companies already control the biggest chunks of all AI infrastructure and IP.
The frontier of this wave is in private companies, not government or academia like previous waves.
eBay and PayPal’s dispute resolution system handles 3x as many cases as the whole US legal system: 60 million disagreements a year, 90% of them settled using tech alone.
As private interests step into the places of overstretched governments, a process similar to the East India Company will play out. The companies who can take advantage of the wave will extend their reach and see colossal gains.
With the combination of AI and synthetic bio, nearly everything can be migrated to the cloud, 3D-printed or bio-produced, made in-house or nearby through conversation with an AI agent. Zero marginal cost production and distribution. The players (with the up-front resources available to make this investment) who achieve this will be accelerated and will centralize and their scale will rival nation states. [[what-is-the-role-of-self-hosted-in-the-cloud-future]]❌ They likely become impossible to compete with - neither by other private companies nor states. Especially if they achieve AGI or quantum supremacy.
Surveillance: rocket fuel for authoritarianism
20th century government dystopias wanted complete hegemony: planned economies, controlled information ecosystems and obedient populations. But the task was always too complex.
The coming wave may make this tractable in a way that produces an altogether new sort of horrific state.
China is already on this path. facial-recognition-enabled CCTV cameras: Fujian Province alone holds 2.5 billion facial images, and officials are candid about the purpose: “controlling and managing people”. The Ministry of Public Security wants to stitch together all the scattered databases and surveillance systems into a single AI-enabled whole - a system that could respond decisively in real time to anything it considered a threat. This is at its worst in the Xinjiang Autonomous Region, where it is used to repress the Uighurs.
And China exports this tech to other places, including the US. Even though it’s banned, through accident and oversight more than 100 US towns have surveillance tech developed for use on the Uighurs.
Fragmentations: power to the people
Hezbollah operates as a state within a state. The best-armed non-state actor in the world. It’s also a major mainstream political force in the contemporary Lebanese government. It operates schools, hospitals, infrastructure, microcredit lending. Whole districts are essentially run by Hezbollah.
The coming wave’s democratization of power may instigate a “Hezbollahization” where everyone can support themselves on their own terms and maintain quality of life without the nation state superstructure.
Unbundling of the authority and service embodied by the state.
Education, medicine and other fields that rely on huge social and financial infrastructure could be trimmed and localized (especially using AI to tailor experiences to the individual student).
Security may not be provided by a single nation state umbrella but rather as an ad hoc basis for cyber and physical protection (AI hackers and drones used by private security groups, once defensive capabilities are available to all).
Redistributing real power away from the center means various communities can live as they wish, from political secessionists to corporate luxury parks, atheistic communists to religious terrorist organizations. The AI assistants and services used to facilitate these viable, self-organizing societies may be provided by profit-making groups (like Hezbollah or an off-grid hacker collective) or by the communities themselves. “Setting up a school” is a daunting task; asking an AI to do it for you is easy.
The disenfranchised will simply re-enfranchise themselves - on their own terms
Peter Thiel and other “sovereign individual” technologists see this as the goal itself.
When northern Italy was a patchwork of small city-states, it gave us the Renaissance, yet also constant internecine war. Imagine constant conflict with tomorrow’s military technology…
The coming wave of contradictions
Extreme centralization + extreme decentralization. Every individual, business, community will have its own AI, bio and robotics capability. If every AI aims to fulfill the goals of its owner and many of them are in direct conflict, what happens?
The internet (the last wave) already does this: centralizes a few key nodes while also individually empowering billions of individuals.
The Dilemma
Catastrophe: the ultimate failure
The history of humanity is a history of catastrophes. Scientific and technological improvements compound risks.
Varieties of catastrophe
Solving AI alignment doesn’t mean doing it once, it means doing it every single time without fail. Same for lab leaks.
Cults, lunatics, and suicidal states
Aum Shinrikyo (Supreme Truth) founded in the 1980s in Japan as a doomsday cult under the leadership of Shoko Asahara with a peak membership of 40,000-60,000 (including dozens of well-trained scientists) and $1 billion in assets. They wanted to hasten the apocalypse ([[accelerationists]]❌). The group carried out a number of chemical weapon and other attacks, killing a handful and injuring thousands. They were a rare combination of very well-organized and highly motivated.
As tools (of destruction) democratize and commoditize, groups like this will have greater adaptability and capability, will make fewer mistakes, and will less often have luck run against them.
When unitary governments and populations are threatened, the reaction is to tighten the grip on power.
The dystopian turn
Some will say centralize 100%, build the panopticon. The door to dystopia is cracked open by a desire for perfect safety.
This is rare in the west, but consider the early days of covid: compliance was nearly universal; everyone was “doing their part”. But before long, people began to say enough was enough. the risk of overreach wasn’t worth the extra protection (from something that was getting less dangerous) ^26f13d
A repressive surveillance society is just another failure mode. If dystopia is the only answer to catastrophe, it’s no answer at all. Between [[|zombie states]]❌ and [[|authoritarian dystopias]]❌ there is another possibility, the worst of both worlds: slapdash repressive surveillance and control systems that still don’t create a watertight system.
Stagnation: a different kind of catastrophe
Modern civilization writes checks only continual technological development can cash. Our entire edifice is premised on the idea of long-term economic growth. And long-term economic growth is ultimately premised on the introduction and diffusion of new technologies.
- consuming more for less
- more public service without more tax
- degrading the environment while life keeps getting better
All of these require technological advancement. without new technologies, sooner or later everything stagnates and likely subsequently collapses.
A progress standstill would require a two- or threefold productivity improvement just to tread water, and that would leave the worst off (the vast majority of the world’s population experiences child mortality at 12x the rate of developed countries) frozen exactly where they are.
Containment Must Be Possible
The price of scattered insights
Regulation is way slower than product innovation. Ring doorbells: by the time regulation caught up to the fact that neighborhoods were now surveilled spaces, Ring already had an extensive network and stored data from around the world.
The discussion of responses to technology happen across scattered blogs, communities, schools, etc. Not unified at all.
Governments fight the last war, the last pandemic or tech wave; regulators regulate for stuff they can anticipate. But we’re entering an age of surprises.
Regulation is not enough
Regulating hyper-evolutionary, omni-use tech is monstrously challenging. Consider a relatively straightforward field: motorized transport. There’s no single regulator; it’s a complex patchwork of federal, state, and community rules, private companies, insurance, traffic, roads, emissions, safety, etc. It has evolved over decades and is quite good, but still there are 1.35 million deaths a year. “We” have decided this cost/reward is worth it. This “we” consensus is a social norm that takes time to develop - time we don’t have for the coming wave.
Nations are caught in a contradiction: regulate and contain these technologies so that they are not unseated as the ultimate power, but also accelerate development of strategic tech for national pride, security and survival. In China, this takes the form of a highly constrained civilian path and a free-rein military/industrial path.
Containment revisited: a new grand bargain
Containment is not a magic box, but a set of guardrails.
Viewing the four features (asymmetry, hyper-evolution, omni-use, autonomy) through the lens of containment:
- Is the technology omni-use and general-purpose, or specific?
	- narrow-scoped and domain-specific is preferred
- Is the tech moving away from atoms toward bits?
	- the more dematerialized, the harder it is to control hyper-evolution
- Are price and complexity coming down, and if so how fast?
	- threats take on a wider nature when they proliferate easily
- Are there viable alternatives ready to go?
	- safer alternatives make it easier to phase out use of bad things
- Does the technology enable asymmetric impact?
	- the risk of surprise and exploited vulnerabilities is greater
- Does it have autonomous characteristics?
	- the more a tech requires human intervention, the lower the risk of loss of control
- Does it confer outsized geopolitical strategic advantage?
	- saying no becomes harder
- Does it favor offense or defense?
	- orienting development toward defense tends toward containment
- Are there resource or engineering constraints on its innovation, development, and deployment?
	- choke points in the supply chain or talent pool help containment
Before the flood
Control and containment have rarely worked before. To succeed now would take something drastically different from what has been tried in the past: a novel approach to an all-encompassing program of safety, ethics, regulation and control.
Ten Steps Toward Containment
1. Safety: an apollo program for technical safety
Up-close, in the lab, on the ground. Technical fixes alone won’t solve it, but they’re the first item and the closest to the metal.
Containment isn’t just a magic box, but some boxes should be part of the solution.
Encourage and incentivize this work directly.
Proposal: 20% of frontier corporate r&d budgets should be towards safety efforts, published to a government working group.
2. Audits: knowledge is power; power is control
External scrutiny is essential.
APIs that let others use foundational AI should not be fully open; they should come with KYC checks, like in the banking industry.
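A minimal sketch of what a KYC-gated model API could look like in spirit. The registry, names, and flow below are illustrative assumptions, not anything the author or any provider actually specifies:

```python
# Toy illustration of KYC-gated API access: only identity-verified customers
# get model completions, and every call leaves an audit trail.
from dataclasses import dataclass

@dataclass
class Customer:
    api_key: str
    identity_verified: bool      # e.g. passed a banking-style KYC review
    declared_use: str            # stated, auditable purpose

VERIFIED_CUSTOMERS = {
    "key-123": Customer("key-123", identity_verified=True, declared_use="drug discovery"),
}

def complete(api_key: str, prompt: str) -> str:
    customer = VERIFIED_CUSTOMERS.get(api_key)
    if customer is None or not customer.identity_verified:
        raise PermissionError("access denied: caller has not passed KYC verification")
    print(f"[audit] {customer.api_key} ({customer.declared_use}): {prompt!r}")
    return "<model output would go here>"

print(complete("key-123", "summarize this safety report"))
```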
the author advocates for enforcing coordination with legislation when it can’t be achieved voluntarily in collaboration with technology producers: safeguards, encrypted back doors, entry systems controlled by a judiciary or publicly-sanctioned body.
3. Choke points: buy time
Use choke points to check speed of development
The US Commerce Department imposed export controls on advanced semiconductors to China on October 7, 2022. This will slow down some development in China (notwithstanding any other, more complex effects it will have). At the moment, tech is driven by the power of incentives, not by the pace of containment.
The lion’s share of AI compute runs on NVIDIA’s GPUs, designed in the US. Most of its chips are manufactured by TSMC in Taiwan, in a single facility. TSMC’s machinery comes from one supplier: the Dutch firm ASML. This is a three-company ecosystem and a choke point. The chips they produce cost up to $10 billion per kilogram. Other choke points:
- Cloud computing
- Limited number of fiber optic cables for global data traffic
- Rare earths and other key minerals (80% of the high-purity quartz essential to solar panels and silicon chips comes from one mine in North Carolina)
- DNA synthesizers
- Quantum computers
- Skills: the number of people working on coming-wave tech is no more than 150,000
4. Makers: critics should build it
Credible critics should be practitioners. not just observing and criticizing, but showing the way.
5. Businesses: profit + purpose
Profit drives the coming wave and no solution that ignores or attempts to curtail that will work.
The author tried to create an AI ethics advisory council at Google with an intellectually diverse group of stakeholders. But a woke mob complained about a conservative member’s past anti-trans and anti-LGBTQ comments. All sorts of cancellation calls ensued, including calls to remove academic funding from other board members - eventually the whole initiative fell apart. ^5348a2
New kinds of corporations and experimental governance models are needed
6. Governments: survive, reform, regulate
Richard Feynman: “What I cannot create, I do not understand”. This is true for governments, too. Governments need to get their hands dirty: ownership gives control; accountability is enabled by understanding.
Governments should not rely on contractors and consultants but should use respected, full-time staffers making competitive salaries (private sector salaries can be 10x public sector in critical roles).
AI and bio labs should be licensed and governed - sophisticated AI systems and DNA synthesizers and quantum computers should only be built by licensed, certified developers who are bound by safety and accountability standards. Like launching a rocket into space requiring FAA approval.
Fiscal policy as redistribution and decelerant: labor is taxed at 25%, equipment and software at 5%, allowing capital to reproduce itself in the name of flourishing businesses. This should switch to a focus on taxing capital, which would redistribute toward those affected as well as slow down the transition.
Governments should decide who is able to design, develop and deploy the technology of the coming wave (says the author…).
7. Alliances: time for treaties
Diplomats play an underrated role in containing technology. We need techno-diplomats.
We need bilateral initiatives between countries, but also: a new kind of global institution devoted to technology. Some lender of last resort group who can put their hands up when asked “who contains technology?”
Start with an audit/fact-finding organization that would increase global transparency.
8. Culture: respectfully embracing failure
Too often, failures are hidden and covered up. They should be shared and used as learning opportunities. Best practices should not be corporate secrets.
When a risk is found or a lab leaks, the party needs to broadcast it, and everyone else should listen, learn and support.
This is the attitude of airline companies and it has saved countless lives there.
Certain bounds should not be crossed by individual researchers or groups - recursive self-improvement of AI, autonomy, etc. Researchers on the ground need to buy-in to this mindset.
9. Movements: people power
There is no real “we” nor any lever “we” could pull around technology. Not even the president of the united states can “stop the internet”.
A civil society movement has begun to spring up around technology containment. We need more of this and industry leaders should encourage, not hinder this. Concrete proposal: hold a lottery to choose population sample to debate and propose management of these technologies.
10. The narrow path: the only way is through
| Step | What it covers |
| --- | --- |
| 1. Technical safety | Concrete technical measures to alleviate possible harms and maintain control |
| 2. Audits | A means of ensuring the transparency and accountability of technology |
| 3. Choke points | Levers to slow development and buy time for regulators and defensive technologies |
| 4. Makers | Ensuring responsible developers build appropriate controls into technology from the start |
| 5. Businesses | Aligning the incentives of the organizations behind technology with its containment |
| 6. Governments | Supporting governments, allowing them to build technology, regulate technology, and implement mitigation measures |
| 7. Alliances | Creating a system of international cooperation to harmonize laws and programs |
| 8. Culture | A culture of sharing learning and failures to quickly disseminate means of addressing them |
| 9. Movements | All of this needs public input at every level, including to put pressure on each component and make it accountable |
Safe, contained technology is a delicate balance, like a liberal democracy. For example: each increase in state capacity requires a corresponding increase in social capacity to counterbalance it - the “shackled Leviathan”.
The author says:
As a young twentysomething, I started out from a privacy maximalist position, believing spaces of communication and work completely free from oversight were foundational rights and important parts of healthy democracy. Over the years, though, as the arguments became clearer and the technology more and more developed, I’ve updated that view. It’s just not acceptable to create situations where the threat of catastrophic outcomes is ever present. Intelligence, life, raw power - these are not playthings, and should be treated with the respect, care, and control they deserve. Technologists and the general public alike will have to accept greater levels of oversight and regulation than have ever been the case before.
Some measure of anti-proliferation is necessary. And yes, let’s not shy away from the facts; that means real censorship, possibly well beyond national borders. There are times when this will be seen - perhaps rightly - as unbridled U.S. hegemony […]. Complete openness will push humanity off the narrow path. On the other side of the ledger, though, as should also be clear, complete surveillance and complete closure are inconceivable, wrong and disastrous. Overreach on control is a fast track to dystopia. It too has to be resisted.
The narrow path must be walked forever from here on out. ^4b77be
Life After the Anthropocene
Luddites: the power loom (1785) could be operated by a single child, producing as much fabric as 3.5 highly skilled and well-paid human weavers. Mechanization meant weavers’ wages more than halved over a few decades while the price of food increased. Men lost out to women and children in factory work.
English Midlands 1807, 6,000 weavers protested, then got more violent in the 1810s. Inspired by a mythical figure called Ned Ludd. They often violently attacked factories and destroyed machinery. Demonstrations were countered by draconian laws and counter-militias. Didn’t work. The number of automatic looms went from a few thousand to a quarter of a million. The Luddites could not contain proliferation.
The same industrial revolution technologies the Luddites fought meant that their descendants would have lives unimaginable in the early 1800s: warm houses in the winter, fridges full of food, miraculous health care, longer lives.
We will likely live in a time when the majority of our daily interactions are with AIs, not other people. We already spend more time looking at screens than at other human faces. We will have personal intelligences deeply entwined in the whole of our lives: work, home life, culture, politics, economy, friendship, fun, love. Factories will grow their own outputs locally, drones and robots will be ubiquitous, the human genome will be elastic, life spans much longer. Some will disappear into virtual worlds. The social contract and governance will shift dramatically.
The Luddite reaction is natural, but won’t work; never did, never will against technological proliferation.
The only option is Containment.
Is a distributed, peer-to-peer OS with applications at the core, changing the way people use cloud compute, a phase transition or merely a massive upgrade?