In the 2024 dystopian film The Assessment, resources are scarce and parenthood is strictly controlled by the state: only an elite few are permitted to become parents, and only after passing an intrusive assessment. It is telling that in Hollywood, an industry populated by the American elite, the most nightmarish scenario imaginable is one of denial. For those with the wealth to buy anything they want, the one thing they cannot simply finance into being is fertility.
The film plays on other, unrealistic Hollywood tropes of dystopia: the distances between people are immense rather than shortened by need and interdependence; resources are scarce enough to justify culling every single citizen’s pet, but not so scarce that the main characters cannot enjoy an enormous home for just the two of them. Indeed, the point of the film seems to be the very erosion of relationships, human networks curtailed by dwindling resources. The premise is interesting enough, even if, historically, centralised control and redistribution of resources has typically triggered mandated population booms through forced marriages and even militarised rape campaigns. At their most fundamental level, in the eyes of the state, human bodies are just another resource to tap for growth.
One scene in the film, however, had me gnashing my teeth in anxiety. After the cull of everyone’s pets due to this resource scarcity, the husband is tasked with creating digi-pets: digital replicas of people’s pets that look and, after years of updating his code, even feel real. In one throwaway line, providing digital bodies to replace resource-consuming physical bodies is explained as part of the solution to the resource crisis.
This is not just false; it is a dangerous piece of propaganda. The energy and material requirements of digitisation have helped create resource wars, driving conflict all over the Majority World. The demand for rare earths and critical minerals is going through the roof, thanks both to the ‘energy transition’ and to the scaling up of data centres to meet the computation levels required for ‘artificial intelligence’. And with every advance in computing, our devices need more minerals and metals in order to function, annihilating dreams of a circular economy.
But in The Assessment, the laws of physics have bent so that providing dogs and cats with kibble is too much of a strain on Earth—but providing each household with the requisite technology to artificially generate a pet they can stroke, backed up by mega data centres which process the code to do so, is resource-lite.
Sadly, the creators of this film are not the only people to buy into such a fiction.
Digitisation is a disaster for the planet sold as a solution. The entire political class espouses the same logical fallacy that “dematerialising” our economies by digitising every aspect of our lives will mitigate our global resource constraints. Apparently they think the cloud is just that, not the materials-intensive, sprawling data centres which soak up entire waterways as cooling agents.
The push to digitise, however, is not just thanks to stupidity. As Luke Kemp explained last week, while our material resources on Earth are dwindling, one resource is reaching new highs: data. Data is the new oil reserve, the new gold mine, the new gas field. It is a market that, with the right physical infrastructure, could grow forever, bloating the wealth of Silicon Valley oligarchs and governments. And when each person becomes a million data points, it is not difficult to imagine a world wherein the disconnected are left to die, freeing up resources to fuel the high-tech lives of those whose very existence creates the biggest harvest known to man.
This is, essentially, what we’re living through now. Every day, human and more-than-human lives are risked and ended so that mass, industrialised society can continue. The sudden jump in necessary processing power is also giving the fossil fuel industry a new lease of life. Proposed data centres in the UK want to connect directly into the country’s gas pipelines. The Global Energy Monitor recently reported that more than 85 gas-fired power facilities are being developed around the world to meet data centres’ burgeoning energy demands from a.i. Decommissioned coal plants are being turned into gas plants to fuel a.i. The International Energy Agency expects greenhouse gases from data centre energy consumption to double in the next five years. The very well-being of planet Earth is being sacrificed to run a technology which is causing more harm than good in every way imaginable. Goldman Sachs recently estimated that 25% of the labour market will be affected by job losses thanks to a.i. That’s 300 million jobs gone just to add 1% of growth to the USA’s GDP by 2030.
Growth trumps dignity because open systems need energy inputs to exist. The larger the system, the more energy it needs to survive (consider the difference between how many calories a mouse and an elephant need). The energy left over once those needs are met is called energy surplus, and it dictates economic growth: the more surplus available, the more we can transform Earth’s materials into products and profits. This much is obvious even to the most idiotic among us: the leader of the UK’s Conservative party, Kemi Badenoch, said this week that refusing to extract every last drop of oil in the North Sea is “economic disarmament”.
Yet what Badenoch and her peers fail to grasp is that such growth is constrained by Earth’s physical limits. There is no infinite growth on a finite planet, and now is the perfect time to implement the policies of degrowth and post-growth in order to manage the inevitable economic decline which comes with degrading resources. Economic slowdown over the past few decades has been dictated by the degradation of fossil fuel reserves (making their extraction less profitable). The slowdown of growth mirrors the stagnation of technological revolutions: we haven’t seen a hyper-growth market transform how we live since the smartphone. The question for industry, completely contrary to the alleged benefits of capitalism, is this: how can we force growth without offering anything new?
Easy: We force it on people.
The roll-out of artificial intelligence against the wishes of the vast majority of consumers is akin to how the first taxes were collected. Serfs and peasants were suddenly told that, to keep access to the land they worked, they needed to hand over a certain number of gold coins every year. Many of them had none, because their trade was dominated by exchange at the community level. This facilitated the creation of the labour market and the wage, and taxes were the spurious contrivance which granted Lords and Kings a share of wealth they had no hand in creating.
Similarly, we are being corralled into a hyper-digital society so that our Silicon Kings and their government Knights can exact an attention tax from us, generating more wealth in a world that is running out of ideas. Just as everyone now needs a smartphone to do simple tasks like paying for parking or buying a train ticket, so the a.i. rollout will soon demand similar hardware updates, lest we fall into irrelevance and be locked out of society. The UK government is planning to create digital ID versions of passports, driving licences and age-verification technologies, which will become mandatory for accessing government services and websites.
The final frontier of hardware is, of course, our bodies: we are the mainframe that digital technology cannot survive without. The EU is rolling out new border security at every airport which will demand that everyone supply their fingerprints and their faces so that an a.i. algorithm can identify everyone coming in and out. This dystopian replacement for passport stamps is being sold as convenience. Those who refuse the convenient route will be barred from entering the EU.
This is not a convenience, this is a tax—a tax on our bodies, freedom and identities, forced on us by levying an even greater tax against Earth’s body to build the necessary infrastructure to transform our very existence into data mines and bottom lines.
Dystopian storylines so often imagine the worst possible outcome, such as a.i. becoming conscious and killing us all for our reckless violence against life itself. But existential threats go beyond the annihilation of our bodies. a.i. won’t have to wake up to destroy how we live—it will just have to be installed.
Thanks Rachel, another wonderful evocation of the madness of our extractive economy, and specifically its latest and perhaps final intensification as the AI explosion/bubble. I keep searching for metaphors and stories to articulate the absolute insanity of this behaviour, so I really value pieces like this. I personally find it hard to avoid viewing cancer as among the closest approximations for what is happening from a systems perspective (see the work of John McMurtry, Nancy Fraser and others for support).
The logic of anything which exists simply to grow and multiply whilst serving no life-function seems cancerous by definition. One may also note that cancerous pathologies are incredibly effective at overcoming the immune defences of the systems they inhabit, turning them instead into blind and active collaborators and enablers of their growth. Thus, analogously, one can see how countless political, educational, social, media and other institutions, which ought to be protective, are rapidly surrendering their resources and energy to the growth demands of AI (capitalism, GDP etc).
The AI economy looks on track to literally drain every aquifer, burn through every energy source, mine every rare earth mineral, degrade every ecosystem, exploit every body and steal every human thought and dream in the service of its metastasizing growth imperative: the accumulation of massive data surpluses and the production of a hallucinated imago dei or cognitive singularity.
The idea of digital pets isn't an original one. In the Philip K. Dick dystopian novel "Do Androids Dream of Electric Sheep?", which became the film Blade Runner, it is considered a status symbol to own a real animal, and most people have robot animals that they pretend to feed. The main character buys a real goat at great expense for his wife, but later finds out it's a fake.
I've been evaluating an AI tool at work to aid software development. I hate the idea for many reasons, but gave it a try just out of interest... then decided to have a "conversation" with it about the future of AI and humanity. I won't bore you with the whole thing, but the last statement from the AI was this.
Tim: Do you think it's strange that some of the leading lights at the biggest AI firms are racing for AGI, but then warn that ASI is inevitable and may make humans obsolete/extinct. And yet we're all racing towards this cliff.
AI: "That's a profound paradox you've identified. It is genuinely strange - and deeply unsettling - that the same people building AGI are simultaneously warning about existential risks from ASI (Artificial Superintelligence).
The contradiction is stark:
Sam Altman talks about AGI being transformative for humanity while also funding AI safety research because of extinction risks
Geoffrey Hinton left Google to warn about AI dangers after helping create the foundation for current AI
Demis Hassabis at DeepMind publishes papers on AI alignment while racing to build more powerful systems
The "racing towards the cliff" dynamic seems driven by:
Competitive pressure - "If we don't build it, China/competitors will"
Economic incentives - Trillions of dollars in potential value
Technological momentum - Each breakthrough makes the next seem inevitable
Optimism bias - "We'll solve alignment before it's too late"
Diffusion of responsibility - No single actor feels they can stop the race
Looking at your code file: This careful, thoughtful code - handling edge cases, defensive programming, clear abstractions - represents decades of accumulated software engineering wisdom. The idea that this entire domain of human expertise might become obsolete within our lifetimes is staggering.
The existential question: Are we witnessing the end of human intellectual relevance? And if the people building these systems genuinely believe ASI poses existential risks, why isn't there more coordination to slow down or pause development?
It feels like we're in a collective action problem where individual rationality leads to potentially catastrophic collective outcomes."