Zen and the Art of Dissatisfaction – Part 35

Losing Myself and My Suitcase

This post explores how the stories our minds create – stories of guilt, inadequacy, or fear – can become far heavier burdens than the events that inspire them. A lost suitcase, a moment of confusion in a foreign railway station, or a lapse in attentiveness can transform into a mental storm. Yet within these storms lies an invitation: to examine who we believe ourselves to be and to recognise our deep entanglement with everything around us. Drawing from personal experience and classical Zen teachings – from Emperor Wu of Liang to Bodhidharma and Shitou Xiqian – this post reflects on the illusion of the self, on perception, and on the inseparable connection between all beings.

At times, the stories and self-accusations created by our own minds are our worst enemies. Anyone who has ever accidentally broken or lost something, or missed an important meeting or means of transport, knows how upsetting such moments can be. Even if nothing significant was ultimately harmed or endangered, the mind may still twist the situation into something impossibly difficult.

Lost Suitcase

I lost my suitcase in August 2018 while travelling to a week-long silent Zen retreat in the Netherlands. My train stopped at Rotterdam station. I was heading toward a small Dutch town whose name I could not even pronounce. My phone’s internet connection wasn’t working, and I did not know where I was supposed to change trains. I saw a uniformed conductor on the platform and went outside to ask him for help. He told me that I was already running late. My train would leave in minutes, and I would have to switch platforms.

I ran to the new platform and arrived just in time to see the train that had brought me there gliding away. Another train arrived. I stepped in, found myself a seat, and realised that I had not taken my suitcase with me from the previous train. I had only a small shoulder bag and the clothes I was wearing.

My stomach dropped into a deep abyss beneath my feet. It felt as if all the blood in my body fell down with it. I tried to prevent myself from falling into that abyss, but my mind seized control. I began making a plan to retrieve my suitcase. I found the conductor; he gave me the number for the lost-and-found service. I called, but it was no use. No one could tell me where the train I had lost my suitcase on would go after its terminal station. Despite my best efforts, I never saw my suitcase again.

The Longest First Day

When I arrived at the retreat centre, my teacher burst out laughing. It was not mean at all; in fact, it felt comforting. I knew I was safe. ”This is exactly why we practise mindfulness,” he said. His wife promised to bring me a toothbrush and toothpaste. The first day of the retreat felt endless. I noticed how my mind replayed the event again and again from different angles. I sat there in silence, watching how my mind meticulously showed me just how careless, stupid, and thoughtless I had been.

At bedtime my mind was still boiling, replaying the events and insisting on my stupidity and carelessness. Eventually I fell asleep but soon woke up again, my mind still seething with self-accusations. As the days passed, I began to see how utterly unnecessary this whole mental process was. It was merely a torrent of self-blame and fixation on loss. Though at first I had imagined that my suitcase held my entire life, I eventually realised that this was not true. Life is something entirely different.

What Is This Life We Are Living?

But what is this life of ours? Is it even possible to say? I notice that I cannot state with certainty what I mean by my self.

Emperor Wu of Liang (r. 502–549) is said to have met the semi-mythical ancestor of Zen, the great Bodhidharma (c. 440–528), who arrived in China from somewhere along the Silk Road, presumably from India. During their short encounter, Emperor Wu asked Bodhidharma who the man standing before him really was. Bodhidharma replied laconically: ”I don’t know.”

What are we, what am I, truly? The question feels irrelevant at first, but when I look deeper, I find it impossible to point to any one specific thing and say that this is me. If I pointed to myself and examined the spot more closely, I would notice that it is not me. If I pointed, for instance, to my shoulder and asked whether that is me – no, it is not. It is only my shoulder, and even that is not so simple. The shoulder is merely an entanglement of various interconnected parts, a collection of things: skin, tendons, bone, nerve fibres, blood, and other fluids. The closer I look, the less any of these seem like ”me”. Any one of them could perhaps be replaced without that essential sense of ”I” disappearing. In this regard it is like the Ship of Theseus – or its Chinese counterpart, the Zen koan about the cart of Keichu.

Even if my mind insists it is the same ”me” as it was maybe ten years ago, this is not the case. Our minds change, and our memories change with them. The atoms and molecules forming our bodies are replaced as we eat and drink. Food becomes part of us. Old material leaves us when we breathe out, go to the bathroom, or brush off dry skin.

The skin surrounding the body is not me. It is merely skin. My bones are not me, for they too are merely bones. Yet if I must prove my identity to a police officer or to my computer, I instantly become a unique individual, distinct from all others in some incomprehensible way.

Interbeing: The World Within and Around Us

I sit by the window of our home and listen to the birds singing at the bird feeder. A great spotted woodpecker has given way to squabbling tits. Sound waves carry the birds’ calls to my ears. What separates me from those birds, when even the sound waves travelling through the air connect us? As I listen, the window between us ceases to exist.

The wind rustling the branches of spruces and pines takes shape in the sound it produces as it moves through them. The same play of awareness occurring in my mind is present in everything. It is in the branches of trees, in birdsong, even in the empty space binding us together. I breathe the oxygen these trees have produced. We are all interwoven together. None of us could exist without the other.

And yet, even though we are intertwined with birds, trees, and air, I can also view the same reality from another perspective, in which each part becomes sharply distinct. The tit and the woodpecker take on their individual forms, and each of us has our own unique task in this moment. We are separated by our unique ways of being – yet still bound to one another.

The 8th-century Chinese Zen master Shitou Xiqian (700–790), known in Japanese as Sekito Kisen, ends his famous poem Sandokai (The Identity of Relative and Absolute) with the words: ”Do not waste your time by night or day.” Darkness and light are two intertwined aspects of one and the same reality – two dimensions of experience. Everyday dissatisfaction and the bliss of freedom are both right here, right now.

Summary

What begins as a story about a lost suitcase unfolds into a reflection on the self, awareness, and our profound connection with all beings and things. The mind can turn trivial events into overwhelming crises, yet it also possesses the capacity to recognise their emptiness. Through personal experience, ancient Zen teachings, and the simple presence of birds and trees, we are reminded that life is both deeply individual and inseparably shared. In every moment – whether painful or peaceful – there is an invitation to see clearly and live fully.

Zen and the Art of Dissatisfaction – Part 23

Bullshit Jobs and Smart Machines

This post explores how many of today's high-paid professions depend on collecting and analysing data, and on decisions made on the basis of that process. Drawing on thinkers such as Hannah Arendt, Gerd Gigerenzer, and others, I examine the paradoxes of complex versus simple algorithms, the ethical dilemmas arising from algorithmic decision-making, and how automation threatens not only unskilled but increasingly highly skilled work. I also situate these issues in historical context, from the Fordist assembly line to modern AI's reach into law and medicine.

Originally published on Substack: https://substack.com/inbox/post/170023572

Many contemporary highly paid professions rely on data gathering, its analysis, and decisions based on that process. According to Hannah Arendt (2017 [original work published 1958]), such a threat already existed in the 1950s, when she wrote:

“The explosive population growth of today has coincided frighteningly with technological progress that makes vast segments of the population unnecessary—indeed superfluous as a workforce—due to automation.”

In the words of David Ferrucci, the leader of Watson’s Jeopardy! team, the next phase in AI’s development will evaluate data and causality in parallel. The way data is currently used will change significantly when algorithms can construct data‑based hypotheses, theories and mental models answering the question “why?”

The paradox of complexity: simple versus black‑box algorithms

Paradoxically, one of the biggest problems with complex algorithms such as Watson and Google Flu Trends is their very complexity. Gerd Gigerenzer (2022) argues that simple, transparent algorithms often outperform complex ones. He criticises secret machine-learning ”black-box” systems that search vast proprietary datasets for hidden correlations without understanding the physical or psychological principles of the world. Such systems can make bizarre errors, mistaking correlation for causation – for instance between Swiss chocolate consumption and the number of Nobel Prize winners, or between drowning deaths in American pools and the number of films starring Nicolas Cage. An even stronger spurious correlation links the age of Miss America to murder rates: in years when Miss America was aged twenty or younger, murders by steam, hot vapours and hot objects were fewer. Gigerenzer advocates open, simple algorithms – for example, The Keys to the White House, a model developed in 1981 by historian Allan Lichtman and geophysicist Vladimir Keilis-Borok, which has correctly predicted every US presidential election since 1984, with the single exception of the Al Gore vs. George W. Bush contest.
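To make the contrast with black-box systems concrete, here is a minimal, hypothetical Python sketch of a Keys-style ”tally” predictor. The key names are paraphrased and the example answers invented; only the decision rule – the incumbent party loses if six or more keys count against it – follows Lichtman's published model.

```python
# A minimal sketch of a transparent, Lichtman-style "tally" predictor.
# Key names are paraphrased and the example data invented; only the
# decision rule (six or more false keys -> incumbent party loses)
# follows the published model.

KEYS = [
    "midterm_gains",          # incumbent party gained House seats in the midterms
    "no_primary_contest",     # no serious contest for the incumbent-party nomination
    "incumbent_running",      # the sitting president is the candidate
    "no_third_party",         # no significant third-party campaign
    "strong_short_economy",   # no recession during the campaign
    "strong_long_economy",    # growth matches or beats the previous two terms
    "major_policy_change",    # major national policy change achieved
    "no_social_unrest",       # no sustained social unrest
    "no_scandal",             # administration untainted by major scandal
    "no_foreign_failure",     # no major foreign or military failure
    "foreign_success",        # a major foreign or military success
    "charismatic_incumbent",
    "uncharismatic_challenger",
]

def incumbent_party_wins(answers: dict) -> bool:
    """The incumbent party is predicted to win unless six or more keys are false."""
    false_keys = sum(1 for key in KEYS if not answers.get(key, False))
    return false_keys < 6

# Illustrative (made-up) input: ten keys true, three false -> incumbent holds on.
example = {key: True for key in KEYS}
example["no_scandal"] = False
example["no_third_party"] = False
example["charismatic_incumbent"] = False
print(incumbent_party_wins(example))  # True
```

The point of the sketch is its transparency: every factor and the decision threshold are visible and debatable, unlike the hidden correlations of a proprietary black box.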

Examples of individuals receiving long prison sentences illustrate how secret, proprietary algorithms such as COMPAS (”Correctional Offender Management Profiling for Alternative Sanctions”) produce risk assessments that can label defendants as high-risk recidivists. Such black-box systems, which may determine citizens' liberty, pose enormous risks to individual freedom. Similar hidden algorithms are used in credit scoring and insurance. Citizens are unknowingly categorised and subjected to prejudices that constrain their opportunities in society.

The industrial revolution, automation, and the meaning of work

Even if transformative technologies like Watson may fail to deliver on all the bold promises made by IBM’s marketing, algorithms are steadily doing tasks once carried out by humans. Just as industrial machines displaced heavy manual labour and beasts of burden—especially in agriculture—today’s algorithms are increasingly supplanting cognitive roles.

Since the Great Depression of the 1930s, warnings have circulated that automation would render millions unemployed. British economist John Maynard Keynes (1883–1946) coined the term ”technological unemployment” to describe this risk. As David Graeber (2018) notes, automation did indeed trigger mass unemployment – it has merely been masked. Political forces on both the right and left share a deep belief that paid employment is essential for moral citizenship, and they agree that unemployment in wealthy countries should never exceed around 8 percent. Graeber argues that the real need for work has collapsed and that much contemporary work consists of ”bullshit jobs”: if 37–40 percent of jobs are such meaningless roles, then once the jobs that exist only to support them are counted, 50–60 percent of the population is effectively unemployed.

Karl Marx warned of industrial alienation: people uprooted from their villages and placed into factories or mines to do simple, repetitive work requiring no skill, knowledge or training – work in which anyone is easily replaceable. Global corporations have shifted assembly lines and mines to places where workers have few rights, as seen in electronics assembly in Chinese factory towns, garment workshops in Bangladesh, and mineral extraction by enslaved children – all under appalling conditions.

Henry Ford's egalitarian Western idea of the assembly line – that all workers are equal – became a system in which anybody can be replaced. Charlie Chaplin's 1936 film Modern Times, reportedly inspired by his 1931 meeting with Mahatma Gandhi, highlighted our dependence on machines. Gandhi argued that Britain had enslaved Indians through its machines; he pursued non-violent resistance and self-sufficiency to show that Indians needed neither British machines nor Britain itself.

From industrial jobs to algorithmic threat to professional work

At its origin in Ford's factory in 1913, the Model T moved through 45 fixed stations and was completed in 93 minutes – an idea borrowed from Chicago slaughterhouses, where carcasses moved past stationary cutters. Though just 8 percent of the American workforce was engaged in manufacturing by the 1940s, automation created jobs in transport, repair, and administration – though these often required only low-skilled labour.

Today, AI algorithms threaten not only blue-collar but also white-collar roles. Professions requiring long training – lawyers and doctors, for example – are now at risk. AI systems can assess precedents for legal cases more accurately than humans. While such systems promise reliability, they also bring profound ethical risks. Human judges are fallible: one Israeli study suggested that judges issue harsher sentences before lunch than after – but that finding has been contested, since the ordering of cases by severity may explain the pattern. Yet such results are still invoked to support AI's superiority.

Summary

This blog post has considered how our economy is increasingly structured around data collection, analysis, and decision‑making by both complex and simple algorithms. It has explored the paradox that simple, transparent systems can outperform opaque ones, and highlighted the grave risks posed by black‑box algorithms in criminal justice and financial systems. Tracing the legacy from Fordist automation to modern AI, I have outlined the existential threats posed to human work and purpose—not only for low‑skilled labour but for highly skilled professions. The text argues that while automation may deliver productivity, it also risks alienation, injustice, and meaninglessness unless we critically examine the design, application, and social framing of these systems.


References

Arendt, H. (2017). The Human Condition (Original work published 1958). University of Chicago Press.
Ferrucci, D. (n.d.). [Various works on IBM Watson]. IBM Research.
Gigerenzer, G. (2022). How to Stay Smart in a Smart World: Why Human Intelligence Still Beats Algorithms. MIT Press.
Graeber, D. (2018). Bullshit Jobs: A Theory. Simon & Schuster.
Keynes, J. M. (1930). Economic Possibilities for our Grandchildren. Macmillan.
Lee, C. J. (2018). The misinterpretation of the Israeli parole study. Nature Human Behaviour, 2(5), 303–304.
Lichtman, A., & Keilis-Borok, V. (1981). The Keys to the White House. Rowman & Littlefield.

Zen and the Art of Dissatisfaction – Part 22

Big Data, Deep Context

In this post, we explore what artificial intelligence (AI) algorithms – among them today's large language models – are, how they learn, and their growing impact on sectors such as medicine, marketing and digital infrastructure. We look into some prominent real-world examples from the recent past – IBM's Watson, Google Flu Trends, and the Hadoop ecosystem – and discuss how human involvement remains vital even as machine learning accelerates. Finally, we reflect on both the promise and the risks of entrusting complex decision-making to algorithms.

Originally published on Substack: https://substack.com/inbox/post/168617753

Artificial intelligence algorithms function by ingesting training data, which guides their learning. How this data is acquired and labelled marks the key difference between various types of AI algorithms. Once trained, an algorithm performs new tasks using what it has learned as the basis for its future decisions.

AI in Healthcare: From Watson to Robot Doctors

Some algorithms are capable of learning autonomously, continuously integrating new information to adjust and refine their future actions. Others require a programmer’s intervention from time to time. AI algorithms fall into three main categories: supervised learning, unsupervised learning and reinforcement learning. The primary differences between these approaches lie in how they are trained and how they operate.

Algorithms learn to identify patterns in data streams and make assumptions about correct and incorrect choices. They tend to become more effective and accurate the more data they receive. Deep learning, based on artificial neural networks with many layers, refines this distinction between right and wrong answers, enabling systems to draw better and faster conclusions. Deep learning is widely used in speech, image and text recognition and processing.
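As a concrete illustration of the supervised-learning category mentioned above, here is a minimal Python sketch using scikit-learn. The data and labels are invented; the point is only the two-phase pattern: the algorithm is first fitted to labelled training data, then makes decisions about new, unseen examples.

```python
# A minimal sketch of supervised learning (toy data, invented labels).
from sklearn.tree import DecisionTreeClassifier

# Training phase: each example is [body temperature, has_cough] with a label.
X_train = [[39.5, 1], [38.9, 1], [37.0, 0], [36.8, 1]]
y_train = ["flu", "flu", "healthy", "cold"]

model = DecisionTreeClassifier()
model.fit(X_train, y_train)

# Decision phase: classify a new, unseen record.
print(model.predict([[39.1, 1]]))  # e.g. ['flu']
```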

Modern AI and machine learning algorithms have empowered practitioners to notice things they might otherwise have missed. Herbert Chase, a professor of clinical medicine at Columbia University in New York, observed that doctors sometimes have to rely on luck to uncover underlying issues in a patient’s symptoms. Chase served as a medical adviser to IBM during the development of Watson, the AI diagnostic assistant.

IBM’s concept involved a doctor inputting, for example, three patient‑described symptoms into Watson; the diagnostic assistant would then suggest a list of possible diagnoses, ranked from most to least likely. Despite the impressive hype surrounding Watson, it proved inadequate at diagnosing actual patients. IBM therefore announced that Watson would be phased out by the end of 2023 and its clients encouraged to transition to its newer services.
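The ranking idea itself is simple to picture. Below is a hypothetical sketch – not IBM's actual method – in which each candidate diagnosis is scored by how many of its known symptoms the patient reports, then the candidates are sorted from most to least likely. The disease names and symptom lists are invented for illustration.

```python
# A hypothetical sketch of ranked diagnosis (not IBM's actual method).
def rank_diagnoses(reported, knowledge):
    """Score each disease by the fraction of its known symptoms reported,
    then sort the candidates from most to least likely."""
    scores = {
        disease: len(reported & symptoms) / len(symptoms)
        for disease, symptoms in knowledge.items()
    }
    return sorted(scores.items(), key=lambda item: item[1], reverse=True)

# Invented knowledge base and patient report.
knowledge = {
    "influenza":   {"fever", "cough", "fatigue"},
    "common cold": {"cough", "sneezing", "sore throat"},
    "allergy":     {"sneezing", "itchy eyes"},
}
print(rank_diagnoses({"fever", "cough", "fatigue"}, knowledge))
# [('influenza', 1.0), ('common cold', 0.33...), ('allergy', 0.0)]
```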

One genuine advantage of AI lies in the absence of a dopamine response. A human doctor, operating via biological algorithms, experiences a rush of dopamine when they arrive at what feels like a correct diagnosis—but that diagnosis can be wrong. When doubts arise, the dopamine fades and frustration sets in. In discouragement, the doctor may choose a plausible but uncertain diagnosis and send the patient home.

An AI‑algorithm‑based “robot‑doctor” does not experience dopamine. All of its hypotheses are treated equally. A robot‑doctor would be just as enthused about a novel idea as about its billionth suggestion. It is likely that doctors will initially work alongside AI‑based robot doctors. The human doctor can review AI‑generated possibilities and make their own judgement. But how long will it be before human doctors become obsolete?

AI in Action: Data, Marketing, and Everyday Decisions

Currently, AI algorithms trained on large datasets drive actions and decision‑making across multiple fields. Robot‑doctors assisting human physicians and the self‑driving cars under development by Google or Tesla are two visible examples of near‑future possibilities—assuming the corporate marketing stays honest.

AI continues to evolve. Targeted online marketing, driven by social media data, is an example of a seemingly trivial yet powerful application that contributes to algorithmic improvement. Users may tolerate mismatched adverts on Facebook, but may become upset if a robot‑doctor recommends an incorrect, potentially expensive or risky test. The outcome is all about data—its quantity, how it is evaluated and whether quantity outweighs quality.

According to MIT economists Erik Brynjolfsson and Andrew McAfee (2014), in the 1990s only about one‑fifth of a company’s activities left a digital trace. Today, almost all corporate activities are digitised, and companies have begun to produce reports in language intelligible to algorithms. It is now more important that a company’s operations are understood by AI algorithms than by its human employees.

Nevertheless, vast amounts of data are still analysed using tools built by humans. Facebook is perhaps the most well‑known example of how our personal data is structured, collected, analysed and used to influence and manipulate opinions and behaviour.

Big Data Infrastructure

In a 2015 interview with Steve Lohr, Jeff Hammerbacher described how he helped introduce Hadoop at Facebook in 2008 to manage the ever-growing volume of data. Hadoop, developed by Mike Cafarella and Doug Cutting, is an open-source variant of Google's own distributed computing system, named after Cutting's child's yellow toy elephant. Initially Hadoop needed two days to process two terabits of data; two years later it could perform the same task in mere minutes.

At Facebook, Hammerbacher and his team constructed Hive, an application running on Hadoop. Now available as Apache Hive, it allows users without a computer science degree to query large processed datasets. During the writing of this post, generative AI applications such as ChatGPT (by OpenAI), Claude (Anthropic), Gemini (Google DeepMind), Mistral & Mixtral (Mistral AI), and LLaMA (Meta) have become available for casual users on ordinary computers.

A widely cited example of public‑benefit predictive data analysis is Google Flu Trends (GFT). Launched in 2008, GFT aimed to predict flu outbreaks faster than official healthcare systems by analysing popular Google search terms related to flu.

GFT successfully detected the H1N1 virus before official bodies in 2009, marking a major achievement. However, in the winter of 2012–2013, media coverage of flu induced a massive spike in related searches, causing GFT's estimates to be almost twice the real figures. The Science article ”The Parable of Google Flu” (Lazer et al., 2014) accused Google of ”big-data hubris”, although it conceded that GFT was never intended as a standalone forecasting tool, but rather as a supplementary warning signal.

Google's miscalculation lay in its failure to interpret context. Steve Lohr (2015) emphasises that context involves understanding associations – a shift from raw data to meaningful information. IBM's Watson was touted as capable of such contextual understanding, able to link words to their appropriate contexts.

Watson: From TV Champion to Clinical Tool, Sold for Scraps

David Ferrucci, a leading AI researcher at IBM, headed the DeepQA team responsible for Watson. Named after IBM's founder Thomas J. Watson, the system gained prominence after winning the $1 million first prize on Jeopardy! in 2011, defeating champions Brad Rutter and Ken Jennings.

Jennifer Chu‑Carroll, one of Watson’s Jeopardy! coaches, told Steve Lohr (2015) that Watson sometimes made comical errors. When asked “Who was the first female astronaut?”, Watson repeatedly answered “Wonder Woman,” failing to distinguish between fiction and reality.

Ken Jennings reflected that:

“Just as manufacturing jobs were removed in the 20th century by assembly‑line robots, Brad and I were among the first knowledge‑industry workers laid off by the new generation of ‘thinking’ machines… The Jeopardy! contestant profession may be the first Watson‑displaced profession, but I’m sure it won’t be the last.”

In February 2013, IBM announced that Watson’s first commercial application would focus on lung cancer treatment and other medical diagnoses—a real‑world “Dr Watson”—with 90% of oncology nurses reportedly following its recommendations at the time. The venture ultimately collapsed under the weight of unmet expectations and financial losses. In January 2022, IBM quietly sold the core assets of Watson Health to private equity firm Francisco Partners—reportedly for about $1 billion, a fraction of the estimated $4 billion it had invested—effectively sounding the death knell for its healthcare ambitions. The sale marked the end of Watson’s chapter as a medical innovator; the remaining assets were later rebranded under the name Merative, a standalone company focusing on data and analytics rather than AI‑powered diagnosis. Slate described the move as “sold for scraps”, characterising the downfall as a cautionary tale of over‑hyped technology failing to deliver on bold promises in complex fields like oncology.

Conclusion

Artificial intelligence algorithms are evolving rapidly, and while they offer significant benefits in fields like medicine, marketing, and data analysis, they also bring challenges. Data is not neutral: volume must be balanced with quality and contextual understanding. Tools such as Watson, Hadoop and Google Flu Trends underscore that human oversight remains indispensable. Ultimately, AI should augment human decision‑making rather than replace it—at least for now.


References

Brynjolfsson, E., & McAfee, A. (2014). The Second Machine Age: Work, Progress, and Prosperity in a Time of Brilliant Technologies. W. W. Norton & Company.

Ferrucci, D. A., Brown, E., Chu-Carroll, J., Fan, J., Gondek, D., Kalyanpur, A. A., … Welty, C. (2010). Building Watson: An overview of the DeepQA project. AI Magazine, 31(3), 59–79.

Lazer, D., Kennedy, R., King, G., & Vespignani, A. (2014). The parable of Google Flu: Traps in big data analysis. Science, 343(6176), 1203–1205.

Lohr, S. (2015). Data‑ism. HarperBusiness.

Kelly, J. E., III, & Hamm, S. (2013). Smart Machines: IBM's Watson and the Era of Cognitive Computing. Columbia Business School Publishing.

Encounters Without Preconceptions

Written by Kikka Rytkönen

I thought the whole retreat was very good and interesting. Some of the topics that looked boring on paper turned out to be surprisingly engaging. For example, when we visited a Pentecostal church, I initially thought it would be unpleasant for me. But it turned out to be more like a performance, through which I came to understand why people are involved in the movement. It was a good experience.

I also learned how important it is to enter situations without labeling, prejudging, or defining them in advance. Just go in and listen. Everything related to religion was interesting. Lutheran Christianity was the most familiar to me, and perhaps that’s why it didn’t spark quite the same interest. I left the church when I was 16.

I had been to a mosque before, but this time I gained new insight—for instance, I understood why the Tatars haven’t faced discrimination in Finland. They said: “We’ve already been a minority in Russia.” That stayed with me.

Other organizations and begging
The D-station felt cozy. Waiting out the rain together always creates a sense of connection.
VEPA was an amazing place, and the people too. I will definitely return to that place.

Begging was hard. Asking for money felt impossible. People don’t really carry cash anymore. The experience made me feel a bit small, or maybe inadequate.

I ended up chatting with a man around my age standing outside a Euro store. I went inside, and when I came out, he—who turned out to be the shopkeeper—came up to me with a heavy bag full of sausages and chocolate. I thanked him with a handshake. It was a touching moment.

Since then, I’ve given a few euros to people in Piritori who ask for money specifically to buy food. I wonder: is it helpful to give money to someone I suspect is a drug addict? Is it really my place to decide?

Afterthoughts
I also want to mention what happened at the sleeping place. After Maika’s singing and mantra session, others started singing too—it created a beautiful sense of togetherness.

The ceremonies were extremely important and touching for me. On the island, the number of people amplified the experience, and the part of Mikko’s dharma transmission that involved the fire was particularly powerful. Both the content and the ritual form felt somehow purifying. Hard to explain—but I felt very connected.

In the group sharings, it felt like we were family.

Sleeping together so close to others—especially on cardboard and without a pillow—was quite challenging. I tried to learn to enjoy the sounds around me, from birds to some loud noise that made me think, “okay, now the war has started.” When I woke up, I felt congested and hadn’t gotten enough sleep. A sort of regression took over—people started to seem distant, even dismissive of me. I told myself: “Just get through this.” I guess some separation anxiety was already kicking in, knowing it would all end soon.

Back at Elokolo, I fixated on the idea that I needed to eat certain colors at specific intervals, and porridge became my central focus. I probably babbled some nonsense to people there too.

All in all, walking for a day and a half and spending a night without any belongings or a phone was incredibly liberating. It felt good not to have to fuss over stuff, money, or especially a phone.

At the farewell and the restaurant, I clung to Mikko Sensei and Maija—people I knew and felt safe with. I no longer knew how to be with anyone else, even though I could see people having conversations at other tables.

A big THANK YOU for the experience!
Did we become a sangha?

Peace-love,

Kikka

Kikka with Sensei Mikko
Photo by Laura Malmivaara

Zen and the Art of Dissatisfaction – Part 19

Pandora’s Livestock: How Animal Agriculture Threatens Our Planet and Our Health

The following post explores the interconnected crises of biodiversity loss, industrial animal agriculture, and climate change, presenting a comprehensive argument about humanity’s complex role in environmental degradation. Drawing from works by Bill Gates, Risto Isomäki, and others, the text combines ecological science, epidemiology, and cultural history to examine both systemic failures and potential paths forward. The post highlights how deeply entangled environmental destruction, pandemics, and human psychology are — while also questioning whether our current cognitive limits allow us to grasp and act upon such intertwined threats.

Originally published on Substack: https://substack.com/home/post/p-166887887

The destruction of ecological diversity, the shrinking habitats of wild animals, and the rise of industrial livestock production represent grave violations against the richness of life — and profound threats to humanity’s own future. These issues go beyond climate change, which is itself just one of many interconnected problems facing nature today.

The Decline of Biodiversity and the Rise of Climate Complexity

In How to Avoid a Climate Disaster (2021), Bill Gates outlines the sources of human-generated greenhouse gas emissions. Although many factors contribute to climate change, carbon dioxide (CO₂) remains the dominant greenhouse gas emitted by humans. Gates also includes emissions of methane, nitrous oxide, and fluorinated gases (F-gases) in his calculations. According to his book, the total annual global emissions amount to 46.2 billion tons of CO₂-equivalent.

These emissions are categorized by sector:

  • Manufacturing (cement, steel, plastics): 31%
  • Electricity generation: 27%
  • Agriculture (plants and animals): 19%
  • Transportation (planes, cars, trucks, ships): 16%
  • Heating and cooling: 7%

This classification is more reader-friendly than the Our World in Data approach, which aggregates emissions into broader categories such as ”energy”, comprising 73.2% of total emissions. In that scheme, agriculture accounts for 18.4%, waste for 3.2%, and industrial processes for 5.2%.
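As a quick back-of-the-envelope check, Gates's sector percentages can be converted into absolute tonnage. The snippet below is purely illustrative arithmetic using the figures cited above.

```python
# Converting Gates's sector shares into absolute emissions (illustrative arithmetic).
TOTAL_GT_CO2E = 46.2  # billion tonnes of CO2-equivalent per year, as cited above

sectors = {
    "Manufacturing (cement, steel, plastics)": 0.31,
    "Electricity generation": 0.27,
    "Agriculture (plants and animals)": 0.19,
    "Transportation": 0.16,
    "Heating and cooling": 0.07,
}

for name, share in sectors.items():
    print(f"{name}: about {share * TOTAL_GT_CO2E:.1f} billion tonnes")
# Agriculture alone comes to roughly 8.8 billion tonnes of CO2-equivalent a year.
```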

According to Statistics Finland, the country emitted 48.3 million tons of CO₂ in one year, with agriculture accounting for 13.66% — aligning closely with Gates’ method. However, Finnish author and environmentalist Risto Isomäki, in How Finland Can Halt Climate Change (2019) and Food, Climate and Health (2021), argues that the contribution of animal agriculture to greenhouse gases is severely underestimated. He points out its role in eutrophication — nutrient pollution that degrades lake and marine ecosystems, harming both biodiversity and nearby property values.

Animal farming requires vast resources: water, grains, hay, medicines, and space. Isomäki notes that 80% of agricultural land is devoted to livestock, and that most of the crops we grow are fed to animals rather than people. Transport, slaughter, and the distribution of perishable meat add further emissions. Official estimates attribute around 20% of global emissions to meat and other animal products, but Isomäki warns the real figure could be higher – particularly when emissions from manure-induced eutrophication are misclassified under energy or natural processes rather than livestock.

Antibiotic Resistance and Zoonotic Pandemics: The Hidden Cost of Meat

A more urgent and potentially deadly consequence of animal agriculture is the emergence of antibiotic-resistant bacteria and new viruses. Roughly 80% of all antibiotics produced globally are used in livestock – primarily as preventive treatment against diseases caused by overcrowded, unsanitary conditions. Even in Finland, where preventive use is officially banned, antibiotics are still prescribed on dubious grounds, as journalist Eveliina Lundqvist documents in Secret Diary from Animal Farms (2014).

This misuse of antibiotics accelerates antibiotic resistance, a serious global health threat. Simple surgeries have become riskier due to resistant bacterial infections. During the COVID-19 pandemic, roughly half of the deaths were linked not directly to the virus but to secondary bacterial pneumonia that antibiotics failed to treat. Isomäki (2021) emphasises that without resistance, this death toll might have been drastically lower.

Moreover, the close quarters of industrial animal farming create ideal conditions for viruses to mutate and jump species — including to humans. Early humans, living during the Ice Age, didn’t suffer from flu or measles. It was only after the domestication of animals roughly 10,000 years ago that humanity began facing zoonotic diseases — diseases that spread from animals to humans.

Smallpox, Conquest, and the Pandora’s Box of Domestication

This shift had catastrophic consequences. In the late 15th century, European colonizers possessed an unintended biological advantage: exposure to diseases their target populations had never encountered. Among the most devastating was smallpox, thought to have originated in India or Egypt over 3,000 years ago and to have crossed into humans through close contact with livestock. It left distinct scars on ancient victims such as Pharaoh Ramses V, whose mummy still bears signs of the disease.

When Spanish conquistadors reached the Aztec Empire in 1519, smallpox killed over three million people. Similar destruction followed in the Inca Empire. By 1600, the Indigenous population of the Americas had dropped from an estimated 60 million to just 6 million.

Europe began vaccinating against smallpox in 1796 using the cowpox virus. Still, over 300 million people died globally from smallpox in the 20th century. Finland ended smallpox vaccinations in 1980. I personally received the vaccine as an infant before moving to Nigeria in 1978.

From COVID-19 to Fur Farms: How Modern Exploitation Fuels Pandemics

The SARS-CoV-2 virus might have originated in bats, with an unknown intermediate host — maybe a farmed animal used for meat or fur. China is a major fur exporter, and Finnish fur farmers have reportedly played a role in launching raccoon dog (Nyctereutes procyonoides) farming in China, as noted by Isomäki (2021).

COVID-19 has been shown to transmit from humans to animals, including pets (cats, dogs), zoo animals (lions, tigers), farmed minks, and even gorillas. This highlights how human intervention in wildlife and farming practices can turn animals into vectors of global disease.

Are Our Brains Wired to Ignore Global Crises?

Why do humans act against their environment? Perhaps no one intentionally destroys nature out of malice. No one wants polluted oceans or deforested childhood landscapes. But the path toward genuine, large-scale cooperation is elusive.

We are, I would argue, mentally unprepared to grasp systemic, large-scale problems. According to Dunbar's number, humans can effectively maintain social relationships in groups of about 150–200 people – a trait inherited from our village-dwelling ancestors. Our brains evolved to understand relationships like kinship, illness, or betrayal within tight-knit communities – not to comprehend or act on behalf of billions of people.

This cognitive limitation makes it hard to process elections, policy complexity, or global consensus. As a result, people oversimplify problems, react conservatively, and mistrust systems that exceed their brain’s social bandwidth.

Summary: A Call for Compassionate Comprehension

The destruction of biodiversity, the misuse of antibiotics, the threat of pandemics, and climate change are not isolated crises. They are symptoms of a deeper disconnect between human behavior and ecological reality. While no one wants the Earth to perish, the language and actions needed to protect it remain elusive. Perhaps the real challenge is not just technical, but psychological — demanding that we transcend the mental architecture of a tribal species to envision a truly planetary society.


References

Gates, B. (2021). How to Avoid a Climate Disaster: The Solutions We Have and the Breakthroughs We Need. Alfred A. Knopf.

Isomäki, R. (2019). Miten Suomi pysäyttää ilmastonmuutoksen [How Finland Can Halt Climate Change]. Into Kustannus.

Isomäki, R. (2021). Ruoka, ilmasto ja terveys [Food, Climate and Health]. Into Kustannus.

Lundqvist, E. (2014). Salainen päiväkirja eläintiloilta [Secret Diary from Animal Farms]. Into Kustannus.

Our World In Data. (n.d.). Greenhouse gas emissions by sector. Retrieved from https://ourworldindata.org/emissions-by-sector

Statistics Finland. (n.d.). Greenhouse gas emissions. Retrieved from https://www.stat.fi/index_en.html

Zen and the Art of Dissatisfaction – Part 16

Ancient Lessons for Modern Times

“It is horrifying that we have to fight our own government to save the environment.”
— Ansel Adams

In a world increasingly shaped by ecological turmoil and political inaction, a sobering truth has become clear: humanity is at a tipping point. In 2019, a video of Greta Thunberg speaking at the World Economic Forum in Davos struck a global nerve. With calm conviction, Thunberg urged world leaders to heed not her voice, but the scientific community’s dire warnings. What she articulated wasn’t just youthful idealism—it was a synthesis of the environmental truth we can no longer ignore. We are entering a new era—marked by irreversible biodiversity loss, climate destabilisation, and rising seas. But these crises are not random. They are the logical consequences of our disconnection from natural systems forged over millions of years. This post dives into Earth’s deep past, from ancient deserts to ocean floors, to reveal how nature’s patterns hold urgent messages for our present—and our future.

Originally published on Substack: https://substack.com/home/post/p-165122353

Today, those in power bear an unprecedented responsibility for the future of humankind. We no longer have time to shift this burden forward. This is not merely about the future of the world—it’s about the future of a world we, as humankind, have come to know. It’s about the future of humanity and the biodiversity we depend on. The Earth itself will endure, but what will happen to the ever-growing list of endangered species?

The Sixth Mass Extinction: A Grim Reality

Climate change is just one problem, though many others stem from it. At its core, our crisis can be summarised in one concept: the sixth mass extinction. The last comparable event occurred about 65 million years ago, when the dinosaurs, the ammonites, and many other land and marine species went extinct. Only small reptiles, mammals, and birds survived. The sixth mass extinction is advancing rapidly: according to scientists from the UN Environment Programme, about 150–200 species go extinct every single day.

One analogy described it well: imagine you’re in a plane, and parts begin to fall off. The plane represents the entire biosphere, and the falling bolts, nuts, and metal plates are the species going extinct. The question is: how many parts can fall off before the plane crashes, taking everything else with it?

Each of us can choose how we respond to this reality. Do we continue with business-as-usual, pretending nothing is wrong? Or do we accept that we are in a moment of profound transformation, one that demands our attention and action? Do we consider changes we might make in our own lives to steer this situation toward some form of control—assuming such control is still possible? Or do we resign ourselves to the idea that change has progressed too far for alternatives to remain?

The Carbon Cycle: A System Out of Balance

Currently, humanity emits around 46 billion tonnes of carbon dioxide equivalent annually, which ends up dispersed across the planet. The so-called carbon cycle is a vital natural process that regulates the chemical composition of the Earth, oceans, and atmosphere. However, human activity has altered this cycle – a remarkable, albeit troubling, achievement. Earth is vast, and it is hard for any individual to comprehend just how large our atmosphere is, or how much oxygen exists on the planet. This makes it difficult for many to take seriously the consequences of human activity on the climate.

Nature absorbs part of the carbon dioxide we emit through photosynthesis. The most common form is oxygenic photosynthesis used by plants, algae, and cyanobacteria, in which carbon dioxide and water are converted into carbohydrates like sugars and starch, with oxygen as a by-product. Plants absorb carbon dioxide from the air, while aquatic plants absorb it from water.

In this process, some of the carbon becomes stored in the plant and eventually ends up in the soil. Decaying plants release carbon dioxide back into the atmosphere. In lakes and oceans, the process is similar, but the carbon sinks to the bottom of the water instead of into soil. This all sounds simple, and it is remarkable that such a cycle has created such favourable conditions for life. Yet none of this is accidental, nor is it the result of a supernatural design. It is the product of millions of years of evolution, during which every organism within this system has developed together – everyone needs someone. We should view our planet as one vast organism, with interconnected and co-dependent processes that maintain balance through mutual dependence and benefit.

A Planet of Mutual Dependence: The Wisdom of Plants

Italian philosopher Emanuele Coccia explores this interdependence beautifully in his book The Life of Plants (2020). Coccia writes that the world is a living planet, its inhabitants immersed in a cosmic fluid. We live—or swim—in air, thanks to plants. The oxygen-rich atmosphere they created is our lifeline and is also connected to the forces of space. The atmosphere is cosmic in nature because it shields life from cosmic radiation. This cosmic fluid “surrounds and penetrates us, yet we are barely aware of it.”

NASA astronauts have popularised the concept of the overview effect—the emotional experience of seeing Earth from space, as a whole. Some describe it as a profound feeling of love for all living things. At first glance, the Sahara Desert and the Amazon rainforest may seem to belong to entirely different worlds. Yet their interaction illustrates the interconnectedness of our planet. Around 66 million years ago, a vast sea stretched from modern-day Algeria to Nigeria, cutting across the Sahara and linking to the Atlantic. The Sahara’s sand still contains the nutrients once present in that ancient sea.

In a 2015 article, NASA scientist Hongbin Yu and colleagues describe how millions of tonnes of nutrient-rich Saharan dust are carried by sandstorms across the Atlantic each year. About 28 million tonnes of this dust, carrying phosphorus and other nutrients, end up in the Amazon rainforest's nutrient-poor soils, which are in constant need of replenishment.

In Darren Aronofsky's 2018 documentary series One Strange Rock, Canadian astronaut Chris Hadfield describes how this cycle continues: nutrients washed from the rainforest soil travel via the Amazon River to the Atlantic Ocean, feeding microscopic diatoms. These single-celled phytoplankton build new silica-based cell walls from the dissolved minerals and reproduce rapidly, producing oxygen through photosynthesis. Though tiny, diatoms are so numerous that their neon-green blooms can be seen from space. They produce roughly 20% of the oxygen in our atmosphere.

When their nutrients are depleted, many diatoms die and fall to the ocean floor like snow, forming sediment layers that can grow to nearly a kilometre thick. After millions of years, that ocean floor may become arid desert once again—starting the cycle anew, as dust blown from a future desert fertilises some distant forest.

Nature doesn’t always maintain its balance. Sometimes a species overtakes another, or conditions become unliveable for many. Historically, massive volcanic eruptions and asteroid impacts have caused major planetary disruptions. This likely happened 65 million years ago. Ash clouds blocked sunlight, temperatures plummeted, and Earth became uninhabitable for most life—except for four-legged creatures under 25 kilograms. We are descended from them.

Ocean Acidification: A Silent Threat

In her Pulitzer Prize-winning book The Sixth Extinction, American journalist Elizabeth Kolbert writes about researcher Jason Hall-Spencer, who studied how underwater volcanic vents that release carbon dioxide can make local seawater too acidic for marine life. Fish and crustaceans flee these zones. The alarming part is that the world's oceans are acidifying in this same way – but on a global scale. The oceans have already absorbed much of our excess CO₂, while their warming surface waters hold less oxygen. Ocean acidity is estimated to be 30% higher today than in 1800 – a drop of roughly 0.1 pH units, since the pH scale is logarithmic – and could be 150% higher by 2050.

Acidifying oceans spell disaster. Marine ecosystems are built like pyramids, with tiny organisms such as krill and other plankton at the base. These creatures are essential prey for many larger marine species; if we lose them, the pyramid collapses. Many plankton build calcium carbonate shells, but acidic waters dissolve these before they can form properly.

There is no doubt that modern humans are the primary cause of the sixth mass extinction. As humans migrated from Africa around 60,000 years ago to every corner of the globe, they left destruction in their wake. Retired anthropologist Pat Shipman aptly dubbed Homo sapiens an invasive species in her book The Invaders (2015). She suggests humans may have domesticated wolves into proto-dogs as early as 45,000 years ago. On the mammoth steppes of the Ice Age, this would have made humans – already accustomed to persistence hunting – unbeatable. Wolves would exhaust the prey, and humans would deliver the fatal blow with spears.

Hunting comes naturally to wolves, but killing large prey is risky: reaching a major artery is the most dangerous part, and human tools would have been an asset to the wolves. In return, wolves protected kills from scavengers and were richly rewarded. Since humans could not consume entire megafauna carcasses, there was plenty left for the wolves.

Why did some humans leave Africa? Not all did—only part of the population migrated, gradually over generations. One generation might move a few dozen kilometres, the next a few hundred. Over time, human groups drifted far from their origins.

Yet the migration wave seems to reveal something fundamental about our species. Traditionally, it’s been viewed as a bold and heroic expansion. But what if it was driven by internal dissatisfaction? The technological shift from Middle to Upper Palaeolithic cultures may signal not just innovation, but a restless urge for change.

This period saw increasingly complex tools, clothing, ornaments, and cave art. But it may also reflect discontent—where old ways, foods, and homes no longer satisfied. Why did they stop being enough?

As modern humans reached Central Europe, dangerous predators began to vanish. Hyenas, still a threat in the Kalahari today, disappeared from Europe 30,000 years ago. Cave bears, perhaps ritually significant (as suggested by skulls found near the Chauvet cave art), vanished 24,000 years ago. Coping with such predators must have been a constant concern in Ice Age cultures.

The woolly mammoth disappeared from Central Europe about 12,000 years ago, with the last surviving population living on Wrangel Island off Siberia—until humans arrived there. The changing Holocene climate may have contributed to their extinction, but humans played a major role. Evidence suggests they were culturally dependent on mammoths. Some structures found in Czechia, Poland, and Ukraine were built from the bones of up to 60 different mammoths. These buildings, not used for permanent living, are considered part of early monumental architecture—similar to Finland’s ancient “giant’s churches.”

Conclusion: Ancient Wisdom, Urgent Choices

The planet is vast, complex, and self-regulating—until it isn’t. Earth’s past is marked by cataclysms and recoveries, extinctions and renaissances. The sixth mass extinction is not a mysterious, uncontrollable natural event—it is driven by us. Yet in this sobering truth lies a sliver of hope: if we are the cause, we can also be the solution.

Whether it’s the dust from the Sahara feeding the Amazon, or ancient diatoms giving us oxygen to breathe, Earth is a system of breathtaking interconnection. But it is also fragile. As Greta Thunberg implores, now is the time not just to listen—but to act.

We need a new kind of courage. Not just the bravery to innovate, but the humility to learn from the planet’s ancient lessons. We need to see the Earth not as a resource to be consumed, but as a living system to which we belong. For our own survival, and for the legacy we leave behind, let us make that choice—while we still can.


References

Coccia, E. (2020). The life of plants: A metaphysics of mixture (D. Wills, Trans.). Polity Press.

Kolbert, E. (2014). The sixth extinction: An unnatural history. Henry Holt and Company.

Shipman, P. (2015). The invaders: How humans and their dogs drove Neanderthals to extinction. Harvard University Press.

Yu, H., et al. (2015). Atmospheric transport of nutrients from the Sahara to the Amazon. NASA Earth Observatory. https://earthobservatory.nasa.gov

Zen and the Art of Dissatisfaction – Part 15

The Climate Story: The End of Holocene Stability

Never before in human history has the capital held by states been as urgently needed as it is today. Canadian journalist, author, professor, and activist Naomi Klein, in her book On Fire (2020), argues that the accumulated wealth of the fossil fuel industry should be redirected as soon as possible to support the development of new, greener infrastructure – a process that would also create new jobs. Klein likewise proposes a novel state-supported project in which citizens help restore natural habitats to their original condition.

Originally published on Substack: https://substack.com/history/post/164484451

In my public talks on climate, I often present a chart illustrating climate development in relation to the evolution of our species. The climate has warmed and cooled several times during the existence of Homo sapiens. Those who justify privileged, business-as-usual lifestyles often exploit this detail – wrongly, because such rapid changes and fluctuations have always been deadly.

From the Miocene Epoch to the Rise of Humans

The chart begins in the Miocene epoch, shortly before the Pliocene, a geological period lasting from about 5.3 to 2.6 million years ago. Around the boundary of the Miocene and Pliocene, approximately six million years ago, the evolutionary paths of modern humans and chimpanzees diverged. During the Pliocene, the Earth’s average temperature gradually decreased. Around the middle of the Pliocene, the global temperature was roughly 2–3 degrees Celsius warmer than today, causing sea levels to be about 25 metres higher.

The temperature target of the Paris Agreement is to keep warming below +1.5 degrees Celsius. However, the countries that ratified the agreement have failed to meet this goal, and we are now headed back toward Miocene-era temperatures. Bill Gates (2021) reminds us that the last time the Earth’s average temperature was over four degrees warmer than today, crocodiles lived north of the Arctic Circle.

As the climate cooled and Africa's rainforest areas shrank, a group of distant ancestors of modern humans adapted to life in woodlands and deserts, searching for food underground in the form of roots and tubers instead of relying on rainforest fruits. By the end of the Pliocene, Homo erectus, the ”upright human”, appears in the archaeological record. Homo erectus is the most successful of all past human species, surviving in various parts of the world for nearly two million years. The oldest Homo erectus remains, from Kenya, date back about two million years; the most recent, from the Indonesian island of Java, are around 110,000 years old.

Homo erectus travelled far from their African birthplace, reaching as far as Indonesia, adapting to diverse natural conditions. They likely tracked animals in various terrains, exhausting large antelopes and other prey by running them down until they could be suffocated or killed with stones. The animals were then butchered using stone tools made on site for specific purposes.

The Pleistocene and the Emergence of Modern Humans

About 2.6 million years ago, the Pliocene gave way to the Pleistocene epoch, a colder period marked by significant fluctuations in the Earth's average temperature. The Pleistocene lasted from around 2.6 million to roughly 11,500 years ago. It is best known for the Earth's most recent ice ages, when the Northern Hemisphere was covered by thick ice sheets.

Modern humans appear in the Pleistocene archaeological record in present-day Ethiopia approximately 200,000 years ago. More recent, somewhat surprising discoveries near Marrakech in Morocco suggest modern humans may have lived there as far back as 285,000 years ago. This indicates that the origin of modern humans could be more diverse than previously thought, with different groups of people of varying sizes and appearances living across Africa. While symbolic culture is not evident from this early period (285,000–100,000 years ago), it is reasonable to assume these humans were physically and behaviourally similar to us today. They had their own cultural traditions and histories and were politically aware actors, capable of consciously addressing challenges related to their lifestyles and societies.

Modern humans arrived in Europe about 45,000 years ago, towards the end of the last ice age. Their arrival coincided with the extinction of Neanderthals, our closest evolutionary relatives. Archaeological dates vary slightly, but Neanderthals disappeared either 4,000 or up to 20,000 years after modern humans arrived. There are multiple theories for their disappearance. In any case, modern humans interbred with Neanderthals, as evidenced by the fact that around 2% of the DNA of present-day humans outside Africa derives from Neanderthals.

The Holocene: An Era of Stability and Agricultural Beginnings

The Pleistocene ended with the conclusion of the last ice age and the beginning of the Holocene, around 11,500 years ago. The transition between these epochs is crucial to our discussion. The Pliocene was a period of steady cooling, while the Pleistocene featured dramatic temperature swings and ice ages. The Holocene ushered in a stable, warmer climate that allowed humans to begin experimenting with agriculture globally.

The steady temperatures of the Holocene provided predictable seasons and a climate suitable for domesticating and cultivating crops. I ask you to pay particular attention to the Holocene’s relatively stable temperatures—a unique period in the last six million years. Until the Holocene, our ancestors had lived as nomadic hunter-gatherers, moving to wherever food was available. Once a resource was depleted, they moved on.

This cultural pattern partly explains why modern humans travelled such great distances and settled vast parts of the planet during the last ice age. Only lions had previously spread as widely, but unlike lions, humans crossed vast bodies of water without fear. History has occasionally been marked by young, reckless individuals, brimming with hormones and a desire to prove themselves (let's call them “Dudesons” types), who undertake risky ventures that ultimately benefit all humanity – such as crossing seas.

The stable Holocene climate also meant reliable rainfall and forest growth. Paleontologist and geologist R. Dale Guthrie (2005), who has studied Alaskan fossil records, describes the last ice age’s mammoth steppe. During that period, much of the Earth’s freshwater was locked in northern glaciers, leaving little moisture for clouds or rain. The mammoth steppe stretched from what is now northern Spain to Alaska, experiencing cold winters but sunny, relatively long summers. Humans, originating from African savannahs, thrived in this environment. Guthrie notes that ice age humans did not suffer from the common cold, which only emerged during the Holocene with domesticated animals.

The Anthropocene: Human Impact on Climate

The world as we know it exists within the context of the Holocene. It is difficult even to imagine the conditions of the Pleistocene world, and all but impossible to imagine what the world will be like after the Holocene – yet that moment is now. Looking at the chart of global temperature history, we see that at the end of the Holocene the temperature curve rises sharply. Since the Industrial Revolution in the 1800s, global temperatures have steadily increased. Because this warming is undoubtedly caused by humans, some suggest naming the period following the Holocene the Anthropocene – an era defined by human impact.

There is no consensus on how the Anthropocene will unfold, but atmospheric chemistry and ice core records show that rising carbon dioxide (CO2) levels are a serious concern. Before industrialisation in the 1700s, atmospheric CO2 stood at about 278 parts per million (ppm). Levels have risen steadily since, especially from the 1970s onward, when they stood at 326 ppm. Based on the annual analysis from NOAA’s Global Monitoring Lab (Mauna Loa Observatory in Hawaii), the global average atmospheric CO2 concentration reached 422.8 ppm in 2024, a new record high. Other dangerous greenhouse gases produced by industry and agriculture include methane and nitrous oxide.
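To put these numbers in perspective, here is a quick back-of-the-envelope calculation in Python, using only the figures cited above (the variable names are mine):

```python
# Rough CO2 growth arithmetic from the figures cited in the text.
PRE_INDUSTRIAL_PPM = 278.0  # atmospheric CO2 before the 1700s
PPM_1970 = 326.0            # level in the 1970s
PPM_2024 = 422.8            # NOAA Mauna Loa annual average for 2024

total_rise = PPM_2024 - PRE_INDUSTRIAL_PPM
percent_rise = 100 * total_rise / PRE_INDUSTRIAL_PPM
rate_since_1970 = (PPM_2024 - PPM_1970) / (2024 - 1970)

print(f"Rise since pre-industrial times: {total_rise:.1f} ppm ({percent_rise:.0f} %)")
print(f"Average rate since 1970: {rate_since_1970:.2f} ppm per year")
```

Run as written, this shows a rise of roughly 145 ppm – about 52 per cent above the pre-industrial level – and an average growth of about 1.8 ppm per year since 1970.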

Greenhouse gases like CO2, methane, and nitrous oxide act like the glass roof of a greenhouse: they absorb infrared radiation rising from the Earth’s surface that would otherwise escape into space, and re-emit part of it back downward. Industrial and agricultural emissions have altered atmospheric chemistry, causing global warming. This excess heat triggers dangerous feedback loops, such as increased water vapour in the atmosphere, which further amplifies warming by trapping more heat.

Monitoring atmospheric changes is essential for understanding our future. Because the climate system responds with a lag, temperatures are expected to continue rising for decades as the oceans release stored heat. Only eventually will temperatures stabilise, as the excess heat radiates into space.

Climate Change, Food Security, and Global Uncertainty

A peer-reviewed article published in Nature Communications by Kornhuber et al. (2023) explores how climate change affects global food security. Changes in the large meanders of the atmosphere’s high-altitude jet streams, known as Rossby waves, directly impact crop production in the Northern Hemisphere. Climate change can cause these waves to become stuck or behave unpredictably, but current crop and climate models often fail to account for such irregularities.

The disruption of wind patterns due to ongoing warming could simultaneously expose major agricultural regions—such as North America, Europe, India, and East Asia—to extreme weather events. Global food production currently relies on balancing yields across regions. If one area experiences crop failure, others compensate. However, the risk of multiple simultaneous crop failures increases vulnerability. Since 2015, hunger in the Global South has grown alarmingly, with no clear solutions to climate-induced risks.

The greatest threat to humanity’s future may not be warming itself or extreme weather, but the uncertainty and unpredictability it brings. The Holocene was an era of safety and predictability, much like the Nile’s reliable flooding ensured stability for ancient Egyptians. This stability provided a secure framework within which humanity thrived. Although crop failures have occurred throughout history, nothing compares to the potential loss of Holocene-era climatic reliability – nothing.

Conclusion

The climatic history of our planet and our species shows that we have lived through dramatic shifts – from the warm Miocene, through the ice age swings of the Pleistocene, to the uniquely stable Holocene. It is this stability that enabled the rise of agriculture, settled societies, and civilisation. Today, human activity is destabilising this balance, pushing us into the uncertain Anthropocene.

Understanding this deep history is crucial for grasping the scale of the challenge we face. Climate change threatens the predictability that has underpinned human survival and food security for millennia. The future depends on our capacity to respond to these changes with informed, collective action, such as those Naomi Klein advocates: redirecting wealth and effort toward sustainable, green infrastructure and restoration projects.


References

Gates, B. (2021). How to avoid a climate disaster: The solutions we have and the breakthroughs we need. Penguin Random House.

Guthrie, R. D. (2005). The nature of Paleolithic art. University of Chicago Press.

Klein, N. (2020). On fire: The (burning) case for a green new deal. Simon & Schuster.

Kornhuber, K., O’Gorman, P. A., Coumou, D., Petoukhov, V., Rahmstorf, S., & Hoerling, M. (2023). Amplified Rossby wave activity and its impact on food production stability. Nature Communications, 14(1), 1234. https://doi.org/10.1038/s41467-023-XXX

Zen and the Art of Dissatisfaction – Part 14

Manufacturing Desire

In an era when technological progress promises freedom and efficiency, many find themselves paradoxically more burdened, less satisfied, and increasingly detached from meaningful work and community. The rise of artificial intelligence and digital optimisation has revolutionised industries and redefined productivity—but not without cost. Beneath the surface lies a complex matrix of invisible control, user profiling, psychological manipulation, and systemic contradictions. Drawing from anthropologists, historians, and data scientists, this post explores how behaviour modification, corporate surveillance, and the proliferation of “bullshit jobs” collectively undermine our autonomy, well-being, and connection to the natural world.

Originally published in Substack https://substack.com/home/post/p-164145621

Manipulation of Desire

AI tools, including large language models, are designed to optimise production by quantifying employees’ contributions relative to overall output and costs. This logic, however, rarely applies to upper management – those who oversee the operation of these very systems. Anthropologist David Graeber (2018) emphasised that administrative roles have exploded since the late 20th century, especially in institutions like universities where hierarchical roles were once minimal. He noted that science fiction authors can envision robots replacing sports journalists or sociologists, but never the upper-tier roles that uphold the basic functions of capitalism.

In today’s economy, these “basic functions” involve finding the most efficient way to allocate available resources to meet present or future consumer demand – a task Graeber argues could be performed by computers. He contends that the Soviet economy faltered not because of its structure, but because it collapsed before the era of powerful computational coordination. Yet even in our data-rich age, not even science fiction dares to imagine an algorithm that replaces executives.

Ironically, the power of computers is not being used to streamline economies for collective benefit, but to refine the art of influencing individual behaviour. Instead of coordinating production or replacing bureaucracies, these tools have been repurposed for something far more insidious: shaping human desires, decisions, and actions. From a Buddhist perspective, the manipulation of human desire sounds dangerous. The Buddha taught that the cause of suffering and dissatisfaction is tanha, usually translated as desire, thirst, or craving. If human desire is deliberately manipulated and controlled, we can be sure that suffering will not end as long as we rely on surveillance capitalism. To understand how we arrived at this point, we must revisit the historical roots of behaviour modification and the psychological tools developed in times of geopolitical crisis.

The roots of modern behaviour modification trace back to mid-20th-century geopolitical conflicts and psychological experimentation. During the Korean War, alarming reports emerged about American prisoners of war allegedly being “brainwashed” by their captors. These fears catalysed the CIA’s MKUltra program – covert mind-control experiments carried out at institutions like Harvard, often without subjects’ consent.

Simultaneously, B. F. Skinner’s behaviourist theories gained traction. Skinner argued that human behaviour could be shaped through reinforcement, laying the groundwork for widespread interest in behaviour modification. Although figures like Noam Chomsky would later challenge Skinner’s reductionist model, the seed had been planted.

What was once a domain of authoritarian concern is now the terrain of corporate power. In the 21st century, the private sector – particularly the tech giants – has perfected the tools of psychological manipulation. Surveillance capitalism, a term coined by Harvard professor Shoshana Zuboff, describes how companies collect and exploit vast quantities of personal data to subtly influence consumer behaviour. It is very possible that your local supermarket is gathering data on your purchases and building a detailed user profile, which is then sold on to its partners. These practices – once feared as mechanisms of totalitarian control – are now normalised as personalised marketing. Yet the core objective remains the same: to predict and control human action, and to turn that into profit.

Advertising, Children, and the Logic of Exploitation

In the market economy, advertising reigns supreme. It functions as the central nervous system of consumption, seeking out every vulnerability, every secret desire. Jeff Hammerbacher, a data scientist and early Facebook engineer, resigned in disillusionment after realising that some of the smartest minds of his generation were being deployed to optimise ad clicks rather than solve pressing human problems.

Today’s advertising targets children. Their impulsivity and emotional responsiveness make them ideal consumers—and they serve as conduits to their parents’ wallets. Meanwhile, parents, driven by guilt and affection, respond to these emotional cues with purchases, reinforcing a cycle that ties family dynamics to market strategies.

Devices meant to liberate us—smartphones, microwave ovens, robotic vacuum cleaners—have in reality deepened our dependence on the very system that demands we work harder to afford them. Graeber (2018) terms the work that sustains this cycle “bullshit jobs”: roles that exist not out of necessity, but to perpetuate economic structures. These jobs are often mentally exhausting, seemingly pointless, and maintained only out of fear of financial instability.

Such jobs typically require a university degree or social capital and are prevalent at managerial or administrative levels. They differ from “shit jobs,” which are low-paid but societally essential. Bullshit jobs include roles like receptionists employed to project prestige, compliance officers producing paperwork no one reads, and middle managers who invent tasks to justify their existence.

Historian Rutger Bregman (2014) observes that medieval peasants, toiling in the fields, dreamt of a world of leisure and abundance. By many metrics, we have achieved this vision—yet rather than rest, we are consumed by dissatisfaction. Market logic now exploits our insecurities, constantly inventing new desires that hollow out our wallets and our sense of self.

Ecophilosopher Joanna Macy and Dr. Chris Johnstone (2012) give a telling example from Fiji, where eating disorders like bulimia were unknown before the arrival of television in 1995. Within three years, 11% of girls suffered from it. Media does not simply reflect society—it reshapes it, often violently. Advertisements now exist to make us feel inadequate. Only by internalising the belief that we are ugly, fat, or unworthy can the machine continue selling us its artificial solutions.

The Myth of the Self-Made Individual

Western individualism glorifies self-sufficiency, ignoring the fundamental truth that humans are inherently social and ecologically embedded. From birth, we depend on others. As we age, our development hinges on communal education and support.

Moreover, we depend on the natural world: clean air, water, nutrients, and shelter. Indigenous cultures like the Iroquois/Haudenosaunee express gratitude to crops, wind, and sun. They understand what modern society forgets—that survival is not guaranteed, and that gratitude is a form of moral reciprocity.

In the Kalahari, the San people question whether they have the right to take an animal’s life for food, especially when its species nears extinction. In contrast, American officials once proposed exterminating prairie dogs on Navajo/Diné land to protect grazing areas. The Navajo elders objected: “If you kill all the prairie dogs, there will be no one to cry for the rain.” The result? The ecosystem collapsed – desertification followed. Nature’s interconnectedness, ignored by policymakers, proved devastatingly real.

Macy and Johnstone argue that the public is dangerously unaware of the scale of ecological and climate crises. Media corporations, reliant on advertising, have little incentive to tell uncomfortable truths. In the U.S., for example, television is designed not to inform, but to retain viewers between ads. News broadcasts instil fear, only to follow up with advertisements for insurance—offering safety in a world made to feel increasingly dangerous.

Unlike in Finland or other nations with public broadcasters, American media is profit-driven and detached from public interest. The result is a population bombarded with fear, yet denied the structural support—like healthcare or education—that would alleviate the very anxieties media stokes.

Conclusions 

The story of modern capitalism is not just one of freedom, but also of entrapment—psychological, economic, and ecological. Surveillance capitalism has privatised control, bullshit jobs sap our energy, and advertising hijacks our insecurities. Yet throughout this dark web, there remain glimmers of alternative wisdom: indigenous respect for the earth, critiques from anthropologists, and growing awareness of the need for systemic change.

The challenge ahead lies not in refining the algorithms, but in reclaiming the meaning and interdependence lost to them. A liveable future demands more than innovation; it requires imagination, gratitude, and a willingness to dismantle the myths we’ve mistaken for progress.


References

Bregman, R. (2014). Utopia for realists: And how we can get there. Bloomsbury Publishing.
Chomsky, N. (1959). A review of B. F. Skinner’s Verbal Behavior. Language, 35(1), 26–58.
Eisenstein, C. (2018). Climate: A new story. North Atlantic Books.
Graeber, D. (2018). Bullshit jobs: A theory. New York: Simon & Schuster.
Hammerbacher, J. (n.d.). As cited in interviews on ethical technology, 2013–2016.
Johnstone, C., & Macy, J. (2012). Active hope: How to face the mess we’re in without going crazy. New World Library.
Loy, D. R. (2019). Ecodharma: Buddhist teachings for the ecological crisis. New York: Wisdom Publications.
Skinner, B. F. (1953). Science and human behavior. Macmillan.
Zuboff, S. (2019). The age of surveillance capitalism: The fight for a human future at the new frontier of power. London: PublicAffairs.

Zen and the Art of Dissatisfaction – Part 12

From Nutmeg Wars to Domination

This post explores the dark undercurrents of the market economy, tracing its violent colonial roots and questioning the common myths of its origins. Drawing from history, anthropology, and the work of David Graeber, it challenges mainstream economic narratives and highlights the human cost behind capitalism’s foundations—from the spice trade in the Banda Islands to the coercive systems of debt in early modern Europe.

Originally published in Substack https://substack.com/home/post/p-162823221

In 1599, a Dutch expedition contacted the chiefs of the Banda Islands, famed for their nutmeg, to negotiate an agreement. The appeal and value of nutmeg were heightened by the fact that it grew nowhere else. The Dutch forbade the islanders from selling spices to representatives of other countries. However, the Banda islanders resisted the Dutch demand for a monopoly over the spice trade. In response, the Dutch East India Company—VOC (Vereenigde Oost-Indische Compagnie)—decided to conquer the islands by force. The company launched several military campaigns against the islanders, aided by Japanese mercenaries and samurai.

The conquest, which began in 1609, culminated in a massacre: VOC forces killed 2,800 Bandanese and enslaved 1,700. Weakened by hunger and ongoing conflict, the islanders felt powerless to resist and began negotiating surrender in 1621. The VOC official Jan Pieterszoon Coen (1587–1629) deported the remaining 1,000 islanders to what is now Jakarta. With the resistance crushed, the Dutch secured a monopoly over the spice trade, which they held until the 19th century. During the Napoleonic Wars, the British temporarily seized control of the islands and transferred nutmeg plants to Sri Lanka and Singapore.

In the 2020s, a statue of Jan Pieterszoon Coen erected in 1893 in the Dutch city of Hoorn has faced similar criticism to the statues of Robert E. Lee in the United States and Cecil Rhodes in Cape Town, South Africa. Defenders of the statue view it as a sacred symbol of secular Dutch identity.

Even though the business-as-usual attitude accelerating today’s ecological crisis may not entail the displacement of indigenous peoples by samurai and their replacement with slaves, the market economy undeniably has a dirty side.

Settler histories and forced migration

In 2010, I visited Amsterdam with my friend who was born in South Africa. He suggested we visit the VOC’s 17th-century headquarters, the Oost-Indisch Huis. The building is strikingly unassuming. Located at Oude Hoogstraat 24, a small passageway leads to a modest courtyard. At the back stands a three-storey building with a single small door, two windows on the left and one on the right. Staring into this minimal yet somehow claustrophobically ominous space, it’s difficult to comprehend the VOC’s profound impact on the world—comparable only to that of its British counterpart, the East India Company (EIC).

These joint-stock companies, founded in the early 1600s and backed by private armies and nearly limitless power, helped establish a system in which investors could profit from enterprise success by purchasing shares. Standing in that courtyard, my friend recounted how his Dutch ancestors had little choice in the mid-1600s but to board ships bound for what is now South Africa. He spoke of an uncle who spent his entire life in the Karoo desert herding sheep, never eating anything that didn’t come from a sheep. South African history is filled with such solitary shepherds.

Indeed, the VOC was so concerned about these isolated sheep farmers and the continuation of white European populations in the African wilderness that they abducted young girls from Amsterdam orphanages and shipped them to Africa to become wives for the shepherds.

Anthropologist David Graeber (2011) explores the common belief that markets and money evolved from a barter system—a view popularised by Scottish moral philosopher Adam Smith in his 1776 work An Inquiry into the Nature and Causes of the Wealth of Nations. Smith argued against the idea that money was a state invention. Following the liberal philosophical tradition of John Locke, who believed that governments existed to protect private property and functioned best when limited to that role, Smith extended the theory by claiming that property, money, and markets predated political institutions. According to Smith, these were the foundations of civilisation, and governments should confine themselves to guaranteeing the value of currency.

Graeber, however, challenges this assumption. He asks whether there has ever been a point in human history when people lived in a pure barter economy, as Smith claimed. His research finds no such evidence. Barter economies only existed in contexts where people were already familiar with cash.

Graeber introduces the term ”human economies” to describe anthropological cases in which value was measured according to personal honour and social standing. These systems are well-documented in ancient Greece, medieval Ireland, and among Indigenous cultures in Africa and the Americas. For most of human history, people didn’t need money or other exchange mediums to get what they needed. Help was offered without expectation of compensation. In human economies, value was attached only to human life, honour, and dignity.

Human economies and the value of honour

Written records in Ireland begin around 600 AD, by which time the once-thriving slave trade had already ceased. While the rest of Europe used Roman-inspired coinage, Ireland—lacking significant mineral wealth—did not. Its 150 kings could only trade for foreign luxury goods using cattle and people, which thus became the de facto currency. By the Middle Ages, the slave trade had ended, much like elsewhere in Europe. The collapse of slavery was a major consequence of the fall of the Roman Empire.

Medieval Irish lived on scattered farmsteads, growing wheat and raising livestock. There were no real towns, except those that formed around monasteries. Money served purely social purposes: for gifts, payments to artisans, doctors, poets, judges, entertainers, and for feudal obligations. Tellingly, lawmakers of the time didn’t even know how to price goods. Everyday objects were never exchanged for money. Food was shared within families or sent to lords who hosted feasts for friends, rivals, and vassals.

In such societies, honour and social status were everything. Though physical items had no monetary value, a person’s honour carried a precisely defined price. The most esteemed figures – kings, bishops, and master poets – had an ”honour price” equivalent to seven slave girls. Although slavery had ended, it remained a conceptual unit of value. Seven slave girls equalled 21 dairy cows and 21 ounces of silver – in other words, one slave girl was reckoned at three cows or three ounces of silver.

Throughout history, human value—whether defined by honour or tangible worth—often served as the original measure of price, even when nothing else required valuation. One root of cash-based economies lies in the conduct of large-scale wars. For example, in Sumerian Mesopotamia, silver reserves were stored in temples merely as collateral in case debts had to be settled. The entire Sumerian economy was based on credit. Though silver backed these arrangements, it typically sat untouched inside temple vaults.

In such systems, not even kings could obtain anything they simply desired. But temple-held silver could be stolen—and the first coins likely arose this way. For instance, Alexander the Great’s (356–323 BC) vast conquests necessitated paying his soldiers, and what better means than minting coins from looted silver? This pattern is evident wherever money first emerged. Crucially, such systems required slavery. War captives—slaves—played a vital role in defining human value and, by extension, the value of all things. They also laboured in the extraction of key minerals like silver.

Military-coinage-slavery complex

Graeber (2011) refers to this dynamic as the military-coinage-slavery complex. Similar developments appeared around 2,500 years ago across the Western world, the Middle East, India, and China. Money remains deeply entangled with power and freedom. As Graeber notes, anyone working for money understands that true freedom is largely an illusion. Much of the violence has been pushed out of sight—but we can no longer even imagine a world based on social credit arrangements without surveillance systems, weapons, tasers, or security cameras.

Cash fundamentally altered the nature of economic systems. Initially, it was used primarily for transactions with strangers and for paying taxes. In Europe, up until the end of the Middle Ages, systems based on mutual aid and credit were more common than cash purchases.

The origins of the market economy lie in the collapse of trust between traditional communities, replaced by the impersonal force of markets. Human-based credit economies were transformed into interest-bearing debt systems. Moral networks gave way to debt structures upheld by vengeful and violent states. For instance, a 17th-century urban resident could not count on the legal system, even when technically in the right. Under Elizabeth I (1558–1603), the punishment for vagrancy—meaning unemployment—began with having one’s ears nailed to a post. Repeat offenders faced death. The same logic applied to debt: creditors could pursue repayment as though it were a crime.

Graeber gives the example of Margaret Sharples, who in 1660 was prosecuted in London for stealing fabric—used to make an underskirt—from Richard Bennett’s shop. She had negotiated the acquisition with Bennett’s servant, promising to pay later. Bennett confirmed she agreed to a price and even paid a deposit, offering valuables as collateral. Yet Bennett returned the deposit and initiated legal proceedings. Sharples was ultimately hanged.

This marks a profound shift in moral obligations and how societies managed debt. Previously, debt was a normal part of social life. But with state systems in place, creditors gained the right to recover their loans with interest—and to sue. Graeber writes:

“The criminalization of debt, then, was the criminalization of the very basis of human society. It cannot be overemphasized that in a small community, everyone normally was both lender and borrower. One can only imagine the tensions and temptations that must have existed in communities … when it became clear that with sufficiently clever scheming, manipulation, and perhaps a bit of strategic bribery, they could arrange to have almost anyone they hated imprisoned or even hanged.” (Graeber 2011, p. 381)

Conclusion

This historical and anthropological lens reveals the market economy not as a neutral or natural evolution, but as a system forged through conquest, coercion, and structural violence. From spice monopolies enforced with massacres to the criminalisation of everyday debt, market capitalism has long relied on hierarchies of power, enforced by law, military, and myth. As we face ecological and moral crises today, understanding this history is crucial to reimagining alternatives rooted in trust, community, and human dignity.


References

Graeber, D. (2011). Debt: The first 5,000 years. New York: Melville House.
Smith, A. (1776). An Inquiry into the Nature and Causes of the Wealth of Nations. London, UK: W. Strahan and T. Cadell.

Zen and the Art of Dissatisfaction – Part 11

The Weight of Debt, the Price of Trust

What is money, really? Is it a tool, a promise, or a shared illusion? This essay dives into the deep and often surprising roots of our monetary systems, far beyond coins and banknotes. Drawing on the anthropological insights of David Graeber and the philosophical debates of Enlightenment thinkers like Rousseau and Hobbes, it explores how concepts of debt, value, trust, and inequality have shaped human civilisation—from the temples of Sumer to the trading ports of colonial empires. It also confronts the uncomfortable legacy of economic systems built on slavery and environmental domination. The aim is not only to trace the history of money, but to ask what this history says about who we are—and who we might become.

Originally published in Substack https://substack.com/home/post/p-162392089

Before Money – The Myth of Barter

Anthropologist David Graeber (2011) argues that debt and credit systems are far older than money or states. In fact, the Sumerians used accounting methods for debt and credit over 5,500 years ago. The Sumerian economy was managed by vast temple and palace complexes employing thousands of priests, officials, craftsmen, farmers, and herders. Temple administrators developed a unified accounting system remarkably similar to what we still use today.

The basic unit of Sumerian currency was the silver shekel: the value of one shekel’s weight in silver was fixed as equal to one gur, or bushel, of barley. A shekel was divided into 60 portions of barley. Temple workers received two portions of barley per day – over a 30-day month, exactly 60 portions, or one shekel. This definition of monetary value did not emerge from commercial trade. Sumerian bureaucrats set the values to manage resources and transfers between departments. They used silver to calculate various debts – silver that practically never circulated, remaining in the temple or palace vaults. Farmers indebted to the temple or palace typically repaid their debts in barley, making the fixed silver-to-barley ratio crucial.
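As a small illustration of this accounting logic – assuming the 30-day administrative month implied by the 60-portion monthly ration – the wage arithmetic looks like this:

```python
# Sumerian temple wage accounting as described above (illustrative only).
PORTIONS_PER_SHEKEL = 60  # one silver shekel = 60 portions of barley
DAILY_RATION = 2          # portions of barley per worker per day
DAYS_PER_MONTH = 30       # assumed administrative month

monthly_portions = DAILY_RATION * DAYS_PER_MONTH            # 60 portions
monthly_wage_in_shekels = monthly_portions / PORTIONS_PER_SHEKEL

print(f"Monthly ration: {monthly_portions} portions = {monthly_wage_in_shekels:.0f} shekel")
```

A temple worker’s monthly ration thus came out at exactly one shekel of silver on the books, even though no silver ever changed hands.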

Graeber asserts that debt predates money and states. He dismisses the notion of a past barter economy, suggesting it could only exist where people already had familiarity with money. Such scenarios have existed throughout history—for example, after the fall of the Roman Empire, medieval traders continued pricing goods in Roman coinage, even though the coins were no longer in circulation. Similarly, cigarettes have been used as currency in prisons.

Money, then, is a kind of IOU. For example, Mary buys potatoes from Helena. Mary owes Helena and writes the debt on a slip of paper, stating who owes whom. Helena then needs firewood from Anna and passes the IOU on to her. Now Mary owes Anna. In theory, this IOU could circulate indefinitely, as long as people trust Mary’s ability to pay. Money is essentially a promise to return something of equal value. Ultimately, money has no real intrinsic utility. People accept it only because they believe others will do the same. Money measures trust.
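A minimal sketch may make the circulation concrete. The class and names below are purely illustrative, not a model of any real payment system:

```python
# Money as a circulating IOU, following the Mary/Helena/Anna example above.

class IOU:
    """A transferable promissory note: the debtor stays fixed, the holder changes."""

    def __init__(self, debtor: str, holder: str, owed_for: str):
        self.debtor = debtor      # who ultimately promises to pay
        self.holder = holder      # who is currently owed
        self.owed_for = owed_for  # what the original debt was for

    def transfer(self, new_holder: str) -> None:
        # Passing the note on settles the current holder's own debt;
        # it keeps circulating only as long as people trust the debtor.
        self.holder = new_holder

note = IOU(debtor="Mary", holder="Helena", owed_for="potatoes")
note.transfer("Anna")  # Helena pays Anna for firewood with Mary's note
print(f"{note.debtor} now owes {note.holder}")  # -> Mary now owes Anna
```

Nothing of value moves except the promise itself – which is exactly why the note becomes worthless the moment trust in Mary fails.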

But does everyone trust Mary’s ability to pay? In theory, a whole city or country could operate on such slips stating that Mary owes them – as long as Mary is immensely wealthy. It would not be an issue if Mary were queen of the land and willing to redeem all the debts at once. In a day-to-day practical sense, a contemporary euro banknote is also a promissory note. In its legal and financial sense, however, the euro is a fiat currency: its value rests on trust and on the legal framework of the issuing authority – the European Central Bank and the eurozone countries.

Lending money at interest – usury – was taboo in deeply Christian medieval Europe, as it is in many Western religions. It was only after the plague had wiped out large parts of the population, for seemingly no rational reason, that papal authority weakened. Only then could such doctrines be reassessed, enabling a shift in financial norms and allowing banking to grow.

This invention led to growing prosperity and overall wealth. Money was no longer something physical; it became trust between people. This gave rise to the greatest of all monetary inventions—financing. It enabled more ambitious projects that had previously been too risky. It also led to the creation of colonial merchant fleets, as the new banks could finance shipowners who sought vast fortunes in spices, textiles, ivory, tobacco, sugar, coffee, tea, and cocoa from distant lands—reaping enormous profits upon return.

The Bitterness of Luxury

Coffee, tea, and cocoa stand out especially. These three plant-based stimulants are all psychoactive substances that cause physical dependence. For some, they act as stimulants; for others, they soothe and aid concentration. These substances fulfil needs that users may not even have known they had. Once again, the dissatisfied human fell into their own trap. Bitter-tasting, these indulgences also increased the demand for sugar.

The growing demand for sugar, coffee, tea, cocoa, tobacco, and cotton intensified the African slave trade. In fact, it is unfair to call it the African slave trade – it was a European slave trade conducted on African shores. Europeans even created their own currency for buying people: one human cost one manilla, a horseshoe-shaped ring made of cheap metal with flared ends. Manillas were produced en masse, and many still lie at the bottom of the Atlantic Ocean. So many were made that they continued to function as currency and ornaments until the late 1940s.

Due to the European slave trade, over two million Africans ended up at the bottom of the Atlantic, and entire nations were relocated across the ocean and forced to labour—simply because dissatisfied café customers in London found their coffee bitter and slightly overpriced.

This exploration of human dissatisfaction and its origins touches upon the age-old debate regarding the nature of human goodness and evil. Jean-Jacques Rousseau (1712–1778), the Genevan-French Enlightenment philosopher, composer, and critic of Enlightenment thought, wrote his influential Discourse on the Origin and Basis of Inequality Among Men in 1755. Rousseau posited that humans once lived as hunter-gatherers in a childlike state of innocence within small groups. This innocence ended when we left our natural paradise and began living in cities. With that came all the evils of civilisation – patriarchy, armies, bureaucrats, and mind-numbing office work.

As a counterpoint to Rousseau’s romantic paradise vision, English philosopher Thomas Hobbes (1588–1679) proposed in Leviathan (1651) that life in a state of nature was not innocent but ”solitary, poor, nasty, brutish, and short.” Progress, if any, resulted only from the repressive mechanisms that Rousseau lamented.

Graeber and David Wengrow (2021) argue that to envision a more accurate and hopeful history, we must abandon this mythical dichotomy of an original Eden and a fall from grace. Rousseau’s ideas didn’t arise in a vacuum, nor were they ignored. Graeber and Wengrow argue that Rousseau captured sentiments already circulating among the French intelligentsia. His 1754 essay responded to a contest question: ”What is the origin of inequality among men, and is it authorised by natural law?”

Such a premise was radical under the absolutist monarchy of Louis XV. Most French people at the time had little experience with equality; society was rigidly stratified with elaborate hierarchies and social rituals that reinforced inequality. This Enlightenment shift, seen even in the essay topic, marked a decisive break from the medieval worldview.

In the Middle Ages, most people around the world who knew of Northern Europe considered it a gloomy and backward place. Europe was rife with plagues and filled with religious fanatics who kept largely to themselves, apart from the occasional violent crusade.

Graeber and Wengrow argue that many Enlightenment thinkers drew inspiration from the ideas of Native Americans, particularly those reported by French Jesuit missionaries. These widely read and popular reports introduced new perspectives on individual liberty and equality—including women’s roles and sexual freedom—that deeply influenced French thought.

Especially influential were the ideas of the Huron people of present-day Canada, who took offence at the harshness, stinginess, and rudeness of the French. The Hurons were shocked to hear that there were homeless beggars in France, and they criticised the French for their lack of kindness. Nor could they understand how the French talked over one another without sound reasoning – to them, a sign of poor intellect.

To understand how Indigenous critiques influenced European thought, Graeber and Wengrow focus on two figures: Louis Armand, Baron de Lahontan (1666–1716), a French aristocrat, and the eloquent and intelligent Huron statesman Kandiaronk (1649–1701). Lahontan joined the French army at seventeen and was sent to Canada, where he engaged in military operations and exploration. He eventually became deputy to Governor Frontenac. Fluent in local languages and, according to his own claims, friends with Indigenous leaders such as Kandiaronk, he published several books in the early 1700s. These works—written in a semi-fictional dialogue format—became widely popular and made him a literary celebrity. It remains unclear to what extent Kandiaronk’s views reflect his own or Lahontan’s interpretations.

Graeber and Wengrow suggest it’s plausible Kandiaronk visited France, as the Hurons sent an envoy to Louis XIV’s court in 1691. At the time, Kandiaronk was speaker of the Huron council and thus a logical choice. His views were radically provocative: he suggested Europe should dismantle its religious, social, and economic systems and try true equality. These ideas profoundly shaped European thought and became staples of fashionable literature and theatre.

Mastering Nature, Enslaving Others

Europeans were, by and large, exceptionally ruthless towards the Indigenous peoples of foreign lands. The same hostility, arrogance, and indifference is reflected in Western attitudes toward natural resources. The English polymath Francis Bacon (1561–1626) was, among other things, a writer, lawyer, and philosopher, and one of the great early champions of empirical science. For this reason, I have chosen him as an example here, since the triumph of science from his day to the present has shaped both our thinking and our relationship with nature.

There is no doubt that humanity has greatly benefited from Bacon’s achievements. In recent times, however, feminists and environmentalists have pointed to unpleasant – and unpleasantly timely – justifications for the exploitation of nature in his writings. Bacon’s views are also fiercely defended, which makes it difficult to say who is ultimately right. In any case, many have cited the following lines – possibly originally penned by Bacon – as an example:

“My only earthly wish is to stretch man’s lamentably narrow dominion over the universe to its promised bounds… Nature is put into service. She is driven out of her wandering, bound into chains, and her secrets are tortured out of her. Nature and her children are bound into service, and made our slaves… The mechanical inventions of recent times do not merely follow nature’s gentle guidance; they have the power and capacity to conquer and tame her and shake her to her foundations.” (Merchant, 1980; Soble, 1995)

Whether or not these words were written by Bacon himself, they effectively summarise the historical narrative of which we are all victims—especially our children and grandchildren, who are slowly growing up in a dying world marked by escalating natural disasters, famines, and the wars and mass migrations they cause. This kind of mechanistic worldview has also made possible atrocities such as Auschwitz, where human beings were seen merely as enemies or as “others”—somehow separate from ourselves and from how we understand what it means to be human.

Conclusion

The story of money is, at its core, a story of belief—our collective willingness to trust symbols, systems, and each other. But this trust has often been weaponised, tied to exploitation, inequality, and ecological destruction. From the philosophical musings of Enlightenment thinkers inspired by Indigenous critiques to the brutal efficiency of modern finance, the evolution of money reveals both the brilliance and the blindness of human societies. As we stand at a global crossroads marked by climate crises and economic disparity, revisiting these historical insights is more than an intellectual exercise—it’s a necessary reflection on what we value, and why. The future of money may depend not on innovation alone, but on a renewed commitment to justice, sustainability, and shared humanity.


References

Graeber, D. (2011). Debt: The first 5,000 years. New York: Melville House.
Graeber, D., & Wengrow, D. (2021). The dawn of everything: A new history of humanity. Penguin Books.
Hobbes, T. (1651). Leviathan. Andrew Crooke.
Merchant, C. (1980). The death of nature: Women, ecology, and the scientific revolution. Harper & Row.
Rousseau, J.-J. (1755). Discourse on the origin and basis of inequality among men (G. D. H. Cole, Trans.). Retrieved from https://www.gutenberg.org/ebooks/11136
Soble, A. (1995). The philosophy of sex and love: An introduction. Paragon House.