How I Learned to Stop Worrying and Love the Ooze
Less-rough thoughts, and there's still a two-by-two
I’ve been thinking about how we should think about our changing world as we move into the algorithm-climate-security era, and what the “correct” state-society response to it should be. To my surprise, Henry Farrell and Cosma Shalizi’s concept of shoggoths, from their article “Behold the AI Shoggoth” in The Economist, helped. Shoggoths were first described by H. P. Lovecraft as massive, bloblike slaves that rebelled against their masters, but in Farrell and Shalizi’s telling, shoggoths represent vast, inhuman distributed systems of information processing. They have no human-like “agenda” or “purpose,” but instead “an implacable drive … to expand, to entrain more and more of the world within their spheres.” Over the past two centuries, markets, bureaucracies and democracies have played a larger and larger role in the ways we see the world and in how we see ourselves. We have lived among shoggoths for centuries without knowing it.
Each shoggoth translates vast bodies of knowledge, makes them intelligible, summarises the unsummarisable. The market economy uses the price mechanism to make sense of a large body of tacit knowledge about supply and demand; bureaucracies help the state "see" the world through standards; and then there’s democracy, a massive churn of information transmitted through institutions such as elections and opinion polls to help political elites understand the public. Collectively, these shoggoths have shaped our development as a society.
In our new era of securitised global value chains, the market-bureaucracy-democracy constellation is changing form. But there is more. We have new shoggoths – the terraform and synthetic intelligences.
The terraform shoggoth describes how humanity responds to the provocations of climate change. Climate, once assumed to be unchanging and predictable, now behaves like a random event generator of unprecedented fires and floods. The uncertainty before us is not just about the energy transition – the flow, which may be achievable in the decades to come – but also about how we adapt to a stock of two centuries’ worth of greenhouse gases in the atmosphere. Atmospheric carbon dioxide, to name only the most important of these gases, is now 50% higher than it was before the Industrial Revolution. Until and unless there is a way to remove these gases at scale, it becomes crucial not just to sense and model the planet and near space as a vast information system and summarise it in a few key variables, but also to act on it; rinse and repeat. In other words, we start terraforming. As Benjamin Bratton notes in “The terraforming”, the term usually refers to transforming the ecosystems of other planets and their satellites to make them capable of supporting Earth-like life. But the consequences of climate as a random event generator of fires and floods loom not ahead but right now, here, and we need to terraform Earth if it is to remain viable for our existence.
Synthetic intelligences are collective information systems that condense impossibly vast bodies of human knowledge from a range of sources (text, the internet, audio, media and so on) and perform stochastic summaries to generate predictions. Just as many people cannot resist describing markets as having personalities or an intelligence, their synapses made from the billions of decisions people make to buy and sell, humans can’t help seeing synthetic intelligences as human or god. This is wrong in the same way. Rather than fear post-human intelligences, it is more useful to consider how synthetic intelligences modestly-to-substantially transform the market-bureaucracy-democracy-terraform shoggoths.
Friend or Foe?
The prevailing view appears to treat shoggoths as foes, driven by the fear that they are out of our control and that we will be crushed by them – and with good reason. The 1997 Asian Financial Crisis, for instance, wiped out decades of middle-class building in a few months! But shoggoths can also be friends. The Berkeley economist Brad DeLong points out in his 2022 book “Slouching towards Utopia” that they massively empower us, individually and collectively, making us so much more productive and intelligent. They have built the modern world, and it is now impossible to imagine unwinding them and living without them, even in some pastoral, noble-savage fantasy. In doing so, DeLong puts a finger on what the correct state-society response in this algorithmic-climate-security-weird era might be.
Governance and trust
“Correct” here is of course a subjective concept. In my opinion, the correct response is a productive mix of high governance and high trust in the future.
High governance refers to Science (foundational breakthroughs) and Scale (a deep industrial base, a computational structure to monitor and geoengineer the planet) enabled by technocratic, competent States. This is easier to imagine because we are building off an East Asian industrial model that broadly works and is popular with the technocratically inclined. But it fails to create high trust in the future, as evidenced by East Asian levels of burnout. While arguably the best response in the era of global value chains, turbocharging breakneck growth in a generation, the industrial-policy-first approach also led these societies into a social dead-end popularly called involution: a mixture of hyper-competition, low fertility, and general burnout. It’s supposed to be our golden age, but instead we got a billion-person rat-race.
Involution happens when congestion effects overcome the agglomeration benefits of networks. Historically, societies respond by opening up new frontiers: in the physical world, imperial colonies or the exploration of outer space; in inner worlds, mythologies around the afterlife, psychology, the internet; balance-sheet inventions like the double-entry ledger. It can also happen through a brute reversion of agglomeration: the Chinese government’s regulation of financialisation and breaking-up of financial risk in the Chinese economy, its clampdown on the internet economy, and its measures to cool real estate in tier-1 cities, accompanied by expanding high-speed-rail networks to encourage the movement of industry and population inland to tier-2 and tier-3 cities. This brute reversion is politically saleable as long as there are new frontiers; it is less saleable if there are none. You may say this is only happening in East Asia, but remember, East Asian megacities are not just technological vanguards, they are cultural ones too!
The challenge is to be simultaneously high governance and high trust, a narrow corridor of popular legitimacy and technocratic competency. And this is where many societies stumble. In “The revolt of the public”, Martin Gurri argues that the internet has empowered the public to challenge traditional authorities, eroding trust in institutions and fuelling populist revolts. While effective at tearing down, these revolts struggle to build stable solutions. Gurri suggests this has created an opening for political entrepreneurs like Trump, skilled at using digital platforms to tap into public discontent and present themselves as authentic outsider alternatives. These figures are adept at channelling outrage but often fail to translate it into effective governance once in power, contributing to a downward spiral of further institutional destabilisation. Despite specific national programmes to generate leaders, such as France’s grandes écoles, many countries end up with an incestuous meritocracy; the public revolts, and political entrepreneurs cash in. The much harder part is to create a new elite, and to reform the moral foundations of the public – what I tentatively term “Succession”.
How I learned to stop worrying and love the ooze
The East Asian approach of Science x Scale x State x Succession is responding to securitised global value chains, but it remains ultimately inadequate. Outcomes display high governance but low trust in the future, and my guess is that the default East Asian approach to the terraform and synthetic intelligences would lead to more of the same. An alternative approach – a better approach – might be to use the new challenges of the terraform and synthetic intelligences to do away with mass politics in which attention economies reward extremes, to shrink the public to something closer to Dunbar scales, and to solve for high-governance, high-trust outcomes.
For the past few years, econ, crypto and tech twitterati arguments about global value chains went something like this. Territory is not just about land; states can use rules and laws to create new territories in the cloud – SWIFT, for example, governs global financial flows. Technology extends the high-resolution global reach of these rules and laws. In response to the high-resolution reach of the state, a rising chorus called for software to eat the world and unbundle the nation-state. And for a while, it seemed to work: digital platforms seemed set to hollow out the state, and coinholders dreamt of their cloud nation of choice. But as Henry Farrell and Abraham Newman – who coined the term weaponised interdependence – catalogue in Underground Empire, the debate has been settled in favour of the state, specifically America.
Cloud meets land, cloud meets man
Farrell and Newman’s Underground Empire narrates how America unknowingly built up a largely invisible (hence underground) empire of networks on which globalisation rests. These are mostly in “software” – finance, intellectual property and the like – and much less in “hardware” – production. Hence the US can sanction selected countries (e.g. Iran, Russia) using finance “software” (price caps, shipping insurance, dollar clearing, dollar financing and so on) or intellectual property “software” (e.g. China, with the “small yard, high fence” approach to semiconductors). To use a gaming analogy, how did America end up with god mode – a cheat code that makes video game players invincible – over networks such as the internet, financial infrastructure and global supply chains? There are three main reasons.
The first is that the cloud doesn’t really live in the clouds; it sits in data centres, most of which are on American soil. On/off ramps facilitate the conversion of Eurodollars and stablecoins into US dollars, putting the global financial nervous system under American jurisdiction. And because semiconductors were invented in America, the US can impose export controls on foreign technology companies like ASML and Huawei, because their products use American intellectual property, even if only indirectly.
The second is that the theoretical beauty of networks – their decentralised nature promising resilience and equity – is quickly undermined by human incentives. The decentralised hope was for thousands of decentralised autonomous organisations, or DAOs, like Tornado Cash: so-called unstoppable code running without human oversight that would, as Balaji Srinivasan puts it in The Network State, “sweep sanctions and sovereign authority away”. But redundancy is expensive, and networks quickly converge on a few central nodes for efficiency. And so the crypto economy grew into an oligopoly dominated by a few intermediaries like the digital currency platform Circle. When faced with being shut down by the government, they chose to cut off their connections to Tornado Cash and keep making profits.
The third is to be present at the creation. Although China may be the second-largest economy today, its rule-making power is disproportionately small relative to its current economic power because it was not present at the creation, when the rules of these networks were written.
This leads to a few observations about the relationship between Science, Scale, and State. Take a relatively new field, like electric vehicles (EVs). The State needs to shape new rules for new sectors like EVs, entrench its intellectual property (Science), and utilise its deep production base to outsell its competitors (Scale), setting new global standards by default. You see this in China’s dual circulation approach, with its EVs outselling competitors not only in traditional markets in the West, but also in the Rest.
Scale also refers to a growing set of countries who fear being next in line to be excised, which brings up an interesting conundrum: America’s underground empire works far better when it is … underground. The more securitisation happens, the more the Rest will work out de-Americanised alternatives. In “A new non-alignment for the Polycrisis”, Tim Sahay summarised the search for alternatives in five buckets: access to markets; access to financing, especially when interest rates are high; access to technologies; access to military hardware; and access to commodities or energy. If you consider these buckets, the emerging matrix of bilateral or minilateral deals, such as between the Gulf states and India, starts to make a lot more sense.
A more careful consideration of Science x Scale also points to the true underpinnings of state power in this context: where a state can deploy god mode matters more than yet another power index. For example, India is a rising economic power, but where in Science x Scale does it exercise god mode? How does India translate its moment in the diplomatic sun into real assets in industrial and science policy? State here perhaps still refers to the state capacity we see in East Asian states enabling Science x Scale, but the setup might be quite different when we get to the next section on the terraform and synthetic intelligences.
High governance in an era of low trust and new shoggoths
It is later in the day than you think, and the East Asian state response function for climate change is already moving beyond transition into adaptation. There is a clear default technocratic impulse: the environment and the economy can be engineered to solve problems. Tokyo hardens its underground tunnels against floods under its Tokyo Resilience Project, and Singapore creates artificial islands to buffer against rising sea levels. And then there’s China. Giant solar and wind farms in peripheral provinces feed energy via long-distance, high-voltage lines to the booming coastal cities. Agricultural innovations like farmland reform and GMOs for food security. Sponge cities, coastal fortification, and massive water projects like the South-North Water Diversion or the Yarlung Tsangpo dam remind the neighbourhood that China is the original hydraulic civilisation.
The State x Science x Scale impulse to terraform is not unexpected. It taps into an ideology, echoing deep history, that fixing disasters creates state legitimacy. It is conceivable that in the future a vast sensing system – satellites in outer space connected to sensors in the forests, farms, cities and oceans – monitors and models emissions and responds: with solar geoengineering, with rain seeding to create atmospheric rivers, with central bank digital currency sanctions to punish unauthorised carbon violations, and with the relocation of immense human flows just as rivers are relocated, all within a vast climate governance stack. Only a few countries will have the capability to do this, China chief among them in East Asia. China's climate governance stack is likely to be extended, exported and integrated as countries clamour for security from climate change. Jacob Dreyer asks in “Who gets to be Chinese” whether, if other countries use Chinese technologies, knowledge, communication networks, new ways of organising ... “is China then just a place, or the recipe to the future?” Who gets to be Chinese in this future?
This all sounds too convenient. Because climate change is also a type of war, waged by nature on states. Since war makes the state, climate also makes the state. The climate governance stack, as it is already emerging, is dominated by state-owned enterprises; their role can only grow stronger as all under heaven descends into chaos in the years to come. We return to Jacob Dreyer, this time in “China’s Soviet Shadow”, where he notes that the logic of GDP, price signals and profit is being discarded in favour of a climate/war economy, echoing the Soviet Union's centrally planned economy, which ultimately led to ecological collapses in the Aral Sea and elsewhere. What might the intended and unintended consequences of the terraform shunting the market aside be? I will return to this towards the end of this essay.
Moving on to synthetic intelligences, the focus on consciousness reflects the difficult relationship humans have with intelligence. In human history, intelligence has been used as a fig leaf for domination and destruction. Determining someone's level of intelligence fixes not just what we think they can do, but also what can be done to them. Stephen Cave warns in “Intelligence: a history” that those deemed less intelligent have been colonised, enslaved, sterilised and murdered. What does it feel like when natural humans are not automatically at the top of the chain? No wonder synthetic intelligences push all our buttons. As a start, Tyler Cowen observes in “AI’s greatest danger? The humans who use it” that AI will disrupt power relations in the "wordcel class", whose jobs deal with words and symbols. For centuries, the West has awarded high status to these "ideas people”. It is hard to predict how ideas people will respond to a reversal in status.
Our concerns should go beyond the ideas people, as there is life after language. Synthetic intelligences will imbibe video, audio and time-series data, and in parallel become embodied in different species of robots. Synthetic intelligences will begin to live in real time. Our time, not in time suspension, knowing only their training data. Venkatesh Rao notes in “Massed Muddler Intelligence” that this is when these synthetic intelligences achieve agency, with real-time feedback loops and network learning. We are going to get a new public, composed not just of humans, but also of avatars speaking for the weather system, immortals speaking for once-dead generations, digital prophets and strongmen whispering to ecstatic crowds, cities that have a visible voice. This is not as fantastical as you might think. Corporations are not human but they exist, and lawyers speak for them. We can catch glimpses of this new public in games today. Our challenge then is to invent a new sort of governance as the new public emerges, and humans fear displacement as the market, bureaucracy and democracy begin to take non-human voices into account. Involution, the struggle over status and other scarce resources, will likely intensify. The Chinese political scientist Ren Jiantao laments in “An ounce of prevention: technological revolution and great changes to state governance” that while China is undergoing a double transition, from pre-modern to modern and from modern to post-modern, there is severe under-imagination of the post-modern era.
High governance x high trust
History can be useful here. The answer is not to move towards ever higher levels of governance, like supranational states or planetary politics, or even into the cloud. Nations and states are not going away, because of the threat of violence. Friedrich Engels predicted that the “government of persons” would be replaced by the “administration of things”. But greatness in history resides in the refusal to abdicate to vast impersonal forces; its defining elements are – and must continue to be – created by human beings.
The Korean-born philosopher Byung-Chul Han observes how today’s euphoria about synthetic and digital everything strongly resembles the euphoria about statistics in the eighteenth century – which did not last for long. Before long, resistance mounted against statistical reason – above all, on the part of Romanticism. The fundamental affect of Romanticism is horror at everything average and normal. The singular, improbable and sudden stand opposed to what is merely probable in statistical terms. Romanticism cultivated the outlandish, the abnormal and the extreme in order to counter statistical normality. LLMs are applied statistics, and just as with the arrival of statistics the first time, we see a romantic pushback happening today. This pushback explains the thirst for civilisational renaissance and for culture wars.
Civilisational wars are fought between peoples who know who they are; today’s culture wars happen between peoples who do not know who they are.
My suspicion is we need to reform the Scale of the new public. Christina Lu, in “The algorithmic internet: culture, capture and corruption”, sums up how the uncritical drive for Scale on the internet scrambles our mental models, leading to context collapse as audiences are flattened into a single context on platforms like Twitter, encouraging rage and breeding apathy. At the same time, by making once-ephemeral conversations enduring and asynchronous, the internet perpetuates a constant sense of crisis by warping our sense of time. The solution is not unwinding the Scale – a reactionary smashing of data centres and smartphones – but reforming the Scale, acknowledging it is now a new public of humans and synthetic intelligences. This can be through new physical-digital structures of different levels of closeness – from the intimate to the planetary – that permit communal sense-making: make mountains and rivers and oceans talk. Tap into worship. Make them sing. Tap into climate change trauma. Widen the union by tapping into refugees. Expand the tent of national identity. And in doing so, synthetic intelligences can help us shrink the public square. Undo the excesses of the attention economy. Our reptilian brains were not designed to talk to millions. Go back to smaller numbers, where we have no need for mass politics in which attention economies reward extremes. It is a different sort of governance, where instead of a vast public square we have more scenes, many scenes. Many fragmented identities that need not cohere, because that’s the way we are. This may bring a sort of Westphalian peace after the thirty years of algorithmic internet war.
Succession
Evolution crept up on us in the past two centuries without us really noticing. John Maynard Smith and Eörs Szathmáry, in “The major transitions in evolution”, noted that in the history of life there have been several key transitions as life increased in complexity: eukaryotic cells through asexual reproduction, then multicellular plants and animals through sexual reproduction, to cooperation within insect and animal societies via chemical signals, on to human societies scaling because of language and writing. In each transition, individual entities gave up their ability to replicate independently as they became part of a larger, more complex whole – a liver does not seek to reproduce itself, but humans do. Shoggoths – the market, state, democracy, the terraform, synthetic intelligences – already enmesh human society and are the next transition of evolution, i.e. Succession. Declining demographics point to a future of fewer – but not zero – humans in the next transition of evolution. This is not a lament that “nothing human makes it through in the near future”. The next phase is here, and humans are part of it. The way forward is a fairer, more balanced ecosystem of shoggoths, e.g. the terraform counteracting the overdominance of the market. An n-body problem where we – humans and synthetic intelligences – move from one dynamic equilibrium to another, co-evolving in a time of no time. But this is a future essay for another day.