Zen and the Art of Dissatisfaction – Part 16

Ancient Lessons for Modern Times

“It is horrifying that we have to fight our own government to save the environment.”
— Ansel Adams

In a world increasingly shaped by ecological turmoil and political inaction, a sobering truth has become clear: humanity is at a tipping point. In 2019, a video of Greta Thunberg speaking at the World Economic Forum in Davos struck a global nerve. With calm conviction, Thunberg urged world leaders to heed not her voice, but the scientific community’s dire warnings. What she articulated wasn’t just youthful idealism—it was a synthesis of the environmental truth we can no longer ignore. We are entering a new era—marked by irreversible biodiversity loss, climate destabilisation, and rising seas. But these crises are not random. They are the logical consequences of our disconnection from natural systems forged over millions of years. This post dives into Earth’s deep past, from ancient deserts to ocean floors, to reveal how nature’s patterns hold urgent messages for our present—and our future.

Originally published on Substack https://substack.com/home/post/p-165122353

Today, those in power bear an unprecedented responsibility for the future of humankind. We no longer have time to shift this burden forward. This is not merely about the future of the world—it’s about the future of a world we, as humankind, have come to know. It’s about the future of humanity and the biodiversity we depend on. The Earth itself will endure, but what will happen to the ever-growing list of endangered species?

The Sixth Mass Extinction: A Grim Reality

Climate change is just one problem, and many others stem from it. At its core, our crisis can be summarised in one concept: the sixth mass extinction. The last comparable event occurred 66 million years ago, when the dinosaurs, the ammonites, and many other land and marine species went extinct. Only small reptiles, mammals, and birds survived. The sixth mass extinction is advancing rapidly: according to scientists from the UN Environment Programme, about 150–200 species go extinct every single day.
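To get a sense of scale, the daily figure can be converted into an annual one. This is back-of-the-envelope arithmetic for illustration, not a model:

```python
# Rough scale check (illustrative only): converting the cited UNEP
# estimate of 150-200 extinctions per day into species lost per year.
per_day_low, per_day_high = 150, 200
per_year_low = per_day_low * 365    # 54,750 species per year
per_year_high = per_day_high * 365  # 73,000 species per year
print(f"{per_year_low:,} to {per_year_high:,} species per year")
```

Even the lower bound implies tens of thousands of species lost each year, which is why the figure is often described as orders of magnitude above the natural background rate.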

One analogy described it well: imagine you’re in a plane, and parts begin to fall off. The plane represents the entire biosphere, and the falling bolts, nuts, and metal plates are the species going extinct. The question is: how many parts can fall off before the plane crashes, taking everything else with it?

Each of us can choose how we respond to this reality. Do we continue with business-as-usual, pretending nothing is wrong? Or do we accept that we are in a moment of profound transformation, one that demands our attention and action? Do we consider changes we might make in our own lives to steer this situation toward some form of control—assuming such control is still possible? Or do we resign ourselves to the idea that change has progressed too far for alternatives to remain?

The Carbon Cycle: A System Out of Balance

Currently, humanity emits roughly 40 billion tonnes of carbon dioxide annually, which ends up dispersed across the planet. The so-called carbon cycle is a vital natural process that regulates the chemical composition of the land, oceans, and atmosphere. Due to human activity, however, we have altered this cycle, a remarkable, albeit troubling, achievement. Earth is vast, and it is hard for any individual to comprehend just how large our atmosphere is, or how much oxygen exists on the planet. This makes it difficult for many to take seriously the consequences of human activity for the climate.

Nature absorbs part of the carbon dioxide we emit through photosynthesis. The most common form is oxygenic photosynthesis used by plants, algae, and cyanobacteria, in which carbon dioxide and water are converted into carbohydrates like sugars and starch, with oxygen as a by-product. Plants absorb carbon dioxide from the air, while aquatic plants absorb it from water.

In this process, some of the carbon becomes stored in the plant and eventually ends up in the soil. Decaying plants release carbon dioxide back into the atmosphere. In lakes and oceans the process is similar, but the carbon sinks to the bottom of the water instead of into soil. This all sounds simple, and it is remarkable that such a cycle has created such favourable conditions for life. Yet none of this is accidental, nor is it the result of supernatural design. It is the product of millions of years of evolution, during which every organism in this system has developed together; everyone needs someone. We should view our planet as one vast organism, with interconnected and co-dependent processes that maintain balance through mutual dependence and benefit.

A Planet of Mutual Dependence: The Wisdom of Plants

Italian philosopher Emanuele Coccia explores this interdependence beautifully in his book The Life of Plants (2020). Coccia writes that the world is a living planet, its inhabitants immersed in a cosmic fluid. We live—or swim—in air, thanks to plants. The oxygen-rich atmosphere they created is our lifeline and is also connected to the forces of space. The atmosphere is cosmic in nature because it shields life from cosmic radiation. This cosmic fluid “surrounds and penetrates us, yet we are barely aware of it.”

NASA astronauts have popularised the concept of the overview effect—the emotional experience of seeing Earth from space, as a whole. Some describe it as a profound feeling of love for all living things. At first glance, the Sahara Desert and the Amazon rainforest may seem to belong to entirely different worlds. Yet their interaction illustrates the interconnectedness of our planet. Around 66 million years ago, a vast sea stretched from modern-day Algeria to Nigeria, cutting across the Sahara and linking to the Atlantic. The Sahara’s sand still contains the nutrients once present in that ancient sea.

In a 2015 article, NASA scientist Hongbin Yu and colleagues describe how millions of tonnes of nutrient-rich Saharan dust are carried by sandstorms across the Atlantic each year. About 28 million tonnes of this dust, carrying an estimated 22,000 tonnes of phosphorus along with other nutrients, end up in the Amazon rainforest’s nutrient-poor soils, which are in constant need of replenishment.

In Darren Aronofsky’s 2018 documentary series One Strange Rock, Canadian astronaut Chris Hadfield describes how this cycle continues: nutrients washed from the rainforest soil travel via the Amazon River to the Atlantic Ocean, feeding microscopic diatoms. These single-celled phytoplankton build new silica-based cell walls from the dissolved minerals and multiply rapidly, releasing oxygen through photosynthesis. Though tiny, diatoms are so numerous that their neon-green blooms can be seen from space. They are estimated to produce roughly 20% of the oxygen in our atmosphere.

When their nutrients are depleted, many diatoms die and fall to the ocean floor like snow, forming sediment layers that can grow to nearly a kilometre thick. After millions of years, that ocean floor may become arid desert once again—starting the cycle anew, as dust blown from a future desert fertilises some distant forest.

Nature doesn’t always maintain its balance. Sometimes one species overwhelms another, or conditions become unliveable for many. Historically, massive volcanic eruptions and asteroid impacts have caused major planetary disruptions. This is what happened 66 million years ago: ash clouds blocked sunlight, temperatures plummeted, and Earth became uninhabitable for most life, except for small creatures under about 25 kilograms. We are descended from some of them.

Ocean Acidification: A Silent Threat

In her Pulitzer Prize-winning book The Sixth Extinction, American journalist Elizabeth Kolbert writes about researcher Jason Hall-Spencer, who studied how underwater volcanic vents can make local seawater too acidic for marine life. Fish and crustaceans flee these zones. The alarming part is that the world’s oceans are acidifying in the same way, but on a global scale. The oceans have absorbed a large share of our excess CO₂, which makes surface waters more acidic, while warming also leaves them lower in oxygen. Ocean acidity is estimated to be about 30% higher today than in 1800, and could be 150% higher by 2100.
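Those percentages can be translated into pH units from the definition of pH. Since pH is the negative base-10 logarithm of the hydrogen-ion concentration, a 30% rise in acidity (i.e. in hydrogen ions) corresponds to a pH drop of log10(1.3), about 0.11 units. A minimal sketch of the conversion:

```python
import math

# pH = -log10([H+]), so an X% rise in hydrogen-ion concentration
# lowers pH by log10(1 + X/100).
def ph_drop(fractional_rise):
    """pH decrease for a given fractional rise in [H+]."""
    return math.log10(1 + fractional_rise)

print(f"since 1800 (+30%):  {ph_drop(0.30):.2f} pH units")
print(f"projected (+150%): {ph_drop(1.50):.2f} pH units")
```

The roughly 0.1-unit drop since pre-industrial times matches the commonly reported decline in ocean surface pH from about 8.2 to 8.1; small-looking pH changes thus hide large changes in acidity, because the scale is logarithmic.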

Acidifying oceans spell disaster. Marine ecosystems are built like pyramids, with tiny organisms such as krill at the base. These creatures are essential prey for many larger marine species; if we lose the krill, the pyramid collapses. Many plankton, such as pteropods and coccolithophores, build shells of calcium carbonate, and acidifying water makes it harder for these shells to form and can dissolve them outright.

There’s no doubt modern humans are the primary cause of the sixth mass extinction. As humans migrated out of Africa around 60,000 years ago to every corner of the globe, they left destruction in their wake. Retired Penn State anthropologist Pat Shipman aptly dubbed Homo sapiens an invasive species in her book The Invaders (2015). She suggests humans may have domesticated wolves into proto-dogs as early as 45,000 years ago. On the mammoth steppes of the Ice Age, this would have made humans, already accustomed to persistence hunting, unbeatable: wolves would exhaust the prey, and humans would deliver the fatal blow with spears.

Hunting comes naturally to wolves, but killing large prey is risky: reaching a major artery is the most dangerous part. Human weapons would have been an asset to the wolves. In return, wolves protected kills from scavengers and were richly rewarded, for since humans could not consume an entire megafauna carcass, there was plenty left over for the wolves.

Why did some humans leave Africa? Not all did—only part of the population migrated, gradually over generations. One generation might move a few dozen kilometres, the next a few hundred. Over time, human groups drifted far from their origins.

Yet the migration wave seems to reveal something fundamental about our species. Traditionally, it’s been viewed as a bold and heroic expansion. But what if it was driven by internal dissatisfaction? The technological shift from Middle to Upper Palaeolithic cultures may signal not just innovation, but a restless urge for change.

This period saw increasingly complex tools, clothing, ornaments, and cave art. But it may also reflect discontent—where old ways, foods, and homes no longer satisfied. Why did they stop being enough?

As modern humans reached Central Europe, dangerous predators began to vanish. Hyenas, still a threat in the Kalahari today, disappeared from Europe around 30,000 years ago. Cave bears, perhaps ritually significant (as suggested by skulls found near the Chauvet cave art), vanished around 24,000 years ago. Coping with such predators must have been a constant concern in Ice Age cultures.

The woolly mammoth disappeared from Central Europe about 12,000 years ago, with the last surviving population living on Wrangel Island off Siberia—until humans arrived there. The changing Holocene climate may have contributed to their extinction, but humans played a major role. Evidence suggests they were culturally dependent on mammoths. Some structures found in Czechia, Poland, and Ukraine were built from the bones of up to 60 different mammoths. These buildings, not used for permanent living, are considered part of early monumental architecture—similar to Finland’s ancient “giant’s churches.”

Conclusion: Ancient Wisdom, Urgent Choices

The planet is vast, complex, and self-regulating—until it isn’t. Earth’s past is marked by cataclysms and recoveries, extinctions and renaissances. The sixth mass extinction is not a mysterious, uncontrollable natural event—it is driven by us. Yet in this sobering truth lies a sliver of hope: if we are the cause, we can also be the solution.

Whether it’s the dust from the Sahara feeding the Amazon, or ancient diatoms giving us oxygen to breathe, Earth is a system of breathtaking interconnection. But it is also fragile. As Greta Thunberg implores, now is the time not just to listen—but to act.

We need a new kind of courage. Not just the bravery to innovate, but the humility to learn from the planet’s ancient lessons. We need to see the Earth not as a resource to be consumed, but as a living system to which we belong. For our own survival, and for the legacy we leave behind, let us make that choice—while we still can.


References

Coccia, E. (2020). The life of plants: A metaphysics of mixture (D. Wills, Trans.). Polity Press.

Kolbert, E. (2014). The sixth extinction: An unnatural history. Henry Holt and Company.

Shipman, P. (2015). The invaders: How humans and their dogs drove Neanderthals to extinction. Harvard University Press.

Yu, H., et al. (2015). The fertilizing role of African dust in the Amazon rainforest: A first multiyear assessment based on data from CALIPSO. Geophysical Research Letters, 42(6), 1984–1991.

Zen and the Art of Dissatisfaction – Part 15

The Climate Story, The End of Holocene Stability 

Never before in human history have the financial resources of states been as urgently needed as they are today. Canadian journalist, author, professor, and activist Naomi Klein, in her book On Fire (2020), argues that the accumulated wealth of the fossil fuel industry should be redirected as soon as possible into new, greener infrastructure, a process that would also create new jobs. Klein likewise proposes state-supported projects in which citizens help restore natural habitats to their original condition.

Originally published on Substack https://substack.com/history/post/164484451

In my public talks on climate, I often present a chart illustrating climate change in relation to the evolution of our species. The climate has warmed and cooled several times during the existence of Homo sapiens. Those defending privileged business-as-usual lifestyles often misuse this fact, ignoring that such rapid swings and fluctuations have always been deadly.

From the Miocene Epoch to the Rise of Humans

The chart begins in the Miocene epoch, shortly before the Pliocene, a geological period lasting from about 5.3 to 2.6 million years ago. Around the boundary of the Miocene and Pliocene, approximately six million years ago, the evolutionary paths of modern humans and chimpanzees diverged. During the Pliocene, the Earth’s average temperature gradually decreased. Around the middle of the Pliocene, the global temperature was roughly 2–3 degrees Celsius warmer than today, causing sea levels to be about 25 metres higher.

The temperature target of the Paris Agreement is to keep warming below 1.5 degrees Celsius. However, the countries that ratified the agreement have failed to meet this goal, and we are now headed back toward Pliocene-era temperatures. Bill Gates (2021) reminds us that the last time the Earth’s average temperature was more than four degrees warmer than today, crocodiles lived north of the Arctic Circle.

As the climate cooled and Africa’s rainforests shrank, a group of distant ancestors of modern humans adapted to life in woodlands and deserts, searching for food underground in the form of roots and tubers instead of relying on rainforest fruits. By the end of the Pliocene, Homo erectus, the “upright human”, appears in the archaeological record. Homo erectus is the most successful of all past human species, surviving in various parts of the world for nearly two million years. The oldest Homo erectus remains, from Kenya, date back about two million years; the most recent, from the Indonesian island of Java, are around 110,000 years old.

Homo erectus travelled far from their African birthplace, reaching as far as Indonesia, adapting to diverse natural conditions. They likely tracked animals in various terrains, exhausting large antelopes and other prey by running them down until they could be suffocated or killed with stones. The animals were then butchered using stone tools made on site for specific purposes.

The Pleistocene and the Emergence of Modern Humans

About 2.6 million years ago, the Pliocene gave way to the Pleistocene epoch, a colder period marked by significant fluctuations in the Earth’s average temperature. The Pleistocene lasted from around 2.6 million to roughly 11,500 years ago. It is best known for the Earth’s most recent ice ages, when the Northern Hemisphere was covered by thick ice sheets.

Modern humans appear in the Pleistocene archaeological record in present-day Ethiopia approximately 200,000 years ago. More recent, somewhat surprising discoveries at Jebel Irhoud near Marrakech in Morocco suggest modern humans may have lived there as far back as roughly 300,000 years ago. This indicates that the origin of modern humans could be more diverse than previously thought, with different groups of people of varying sizes and appearances living across Africa. While symbolic culture is not evident from this early period (roughly 300,000–100,000 years ago), it is reasonable to assume these humans were physically and behaviourally similar to us today. They had their own cultural traditions and histories and were aware political actors, capable of consciously addressing challenges in their lifestyles and societies.

Modern humans arrived in Europe about 45,000 years ago, towards the end of the last ice age. Their arrival coincided with the extinction of Neanderthals, our closest evolutionary relatives. Archaeological dates vary, but Neanderthals disappeared somewhere between roughly 4,000 and 20,000 years after modern humans arrived, and there are multiple theories for their disappearance. In any case, modern humans interbred with Neanderthals: around 2% of the DNA of present-day humans outside Africa derives from them.

The Holocene: An Era of Stability and Agricultural Beginnings

The Pleistocene ended with the conclusion of the last ice age and the beginning of the Holocene, around 11,500 years ago. The transition between these epochs is crucial to our discussion. The Pliocene was a period of steady cooling, while the Pleistocene featured dramatic temperature swings and ice ages. The Holocene ushered in a stable, warmer climate that allowed humans to begin experimenting with agriculture globally.

The steady temperatures of the Holocene provided predictable seasons and a climate suitable for domesticating and cultivating crops. I ask you to pay particular attention to the Holocene’s relatively stable temperatures—a unique period in the last six million years. Until the Holocene, our ancestors had lived as nomadic hunter-gatherers, moving to wherever food was available. Once a resource was depleted, they moved on.

This cultural pattern partly explains why modern humans travelled such great distances and settled vast parts of the planet during the last ice age. Only lions had previously spread as widely, but unlike lions, humans crossed vast bodies of water without fear. History has occasionally been marked by young, reckless individuals brimming with hormones and a desire to prove themselves (call them “Dudesons” types) who undertake risky ventures that ultimately benefit all humanity, such as crossing seas.

The stable Holocene climate also meant reliable rainfall and forest growth. Paleontologist and geologist R. Dale Guthrie (2005), who has studied Alaskan fossil records, describes the last ice age’s mammoth steppe. During that period, much of the Earth’s freshwater was locked in northern glaciers, leaving little moisture for clouds or rain. The mammoth steppe stretched from what is now northern Spain to Alaska, experiencing cold winters but sunny, relatively long summers. Humans, originating from African savannahs, thrived in this environment. Guthrie notes that ice age humans did not suffer from the common cold, which only emerged during the Holocene with domesticated animals.

The Anthropocene: Human Impact on Climate

The world as we know it exists within the context of the Holocene. It is difficult even to imagine the conditions of the Pleistocene world, and harder still to imagine what the world will be like after the Holocene, yet that moment is now. Looking at the chart of global temperature history, we see the curve rising sharply at the end of the Holocene. Since the Industrial Revolution of the 1800s, global temperatures have steadily increased. Because this warming is unquestionably caused by humans, some suggest naming the period following the Holocene the Anthropocene: an era defined by human impact.

There is no consensus on how the Anthropocene will unfold, but atmospheric chemistry and ice core records show that rising carbon dioxide (CO2) levels are a serious concern. Before industrialisation in the 1700s, atmospheric CO2 stood at about 278 parts per million (ppm). Levels have risen steadily, especially since the 1970s, when the figure was 326 ppm. Based on the annual analysis from NOAA’s Global Monitoring Laboratory, which operates the Mauna Loa Observatory in Hawaii, the global average atmospheric CO2 concentration was 422.8 ppm in 2024, a new record high. Other dangerous greenhouse gases produced by industry and agriculture include methane and nitrous oxide.

Greenhouse gases like CO2, methane, and nitrous oxide act like the glass roof of a greenhouse. They trap heat that would otherwise escape into space, reflecting warmth back to Earth’s surface. Industrial and agricultural emissions have altered atmospheric chemistry, causing global warming. This excess heat triggers dangerous feedback loops, such as increased water vapour in the atmosphere, which further amplifies warming by trapping more heat.
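The extra heat trapped by the CO2 rise described above can be roughly estimated with the widely used simplified forcing expression ΔF = 5.35 ln(C/C0) W/m² (Myhre et al., 1998). This is a back-of-the-envelope sketch, not a climate model:

```python
import math

# Simplified radiative forcing for CO2 (Myhre et al., 1998):
# delta_F = 5.35 * ln(C / C0), in watts per square metre,
# where C0 is the pre-industrial concentration.
def co2_forcing(c_now_ppm, c_pre_ppm=278.0):
    return 5.35 * math.log(c_now_ppm / c_pre_ppm)

# From the pre-industrial 278 ppm to the 2024 global average of 422.8 ppm:
print(f"{co2_forcing(422.8):.2f} W/m^2 of extra trapped heat")
```

Roughly two extra watts over every square metre of the planet’s surface sounds small, until one multiplies it by the Earth’s total area.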

Monitoring atmospheric changes is essential for understanding our future. Because the climate system responds slowly, temperatures are expected to continue rising for decades as the oceans release the heat they have stored. Only long after emissions cease will temperatures stabilise, as the excess heat radiates into space.

Climate Change, Food Security, and Global Uncertainty

A peer-reviewed article published in Nature Communications by Kornhuber et al. (2023) explores how climate change affects global food security. Changes in the high-altitude jet stream, in particular its large planetary-scale meanders known as Rossby waves, directly affect crop production in the Northern Hemisphere. Climate change can cause these wave patterns to become stuck or behave unpredictably, but current crop and climate models often fail to account for such irregularities.

The disruption of wind patterns due to ongoing warming could simultaneously expose major agricultural regions—such as North America, Europe, India, and East Asia—to extreme weather events. Global food production currently relies on balancing yields across regions. If one area experiences crop failure, others compensate. However, the risk of multiple simultaneous crop failures increases vulnerability. Since 2015, hunger in the Global South has grown alarmingly, with no clear solutions to climate-induced risks.

The greatest threat to humanity’s future may not be warming itself or extreme weather, but the uncertainty and unpredictability they bring. The Holocene was an era of safety and predictability, much as the Nile’s reliable flooding assured stability for ancient Egyptians. This stability provided a secure framework within which humanity thrived. Although crop failures have occurred throughout history, nothing compares to the potential loss of Holocene-era climatic reliability. Nothing.

Conclusion

The climatic history of our planet and our species shows that we have lived through dramatic shifts—from the warm Miocene, through ice age Pleistocene swings, to the uniquely stable Holocene. It is this stability that enabled the rise of agriculture, settled societies, and civilisation. Today, human activity is destabilising this balance, pushing us into the uncertain Anthropocene.

Understanding this deep history is crucial for grasping the scale of the challenge we face. Climate change threatens the predictability that has underpinned human survival and food security for millennia. The future depends on our capacity to respond to these changes with informed, collective action, such as those Naomi Klein advocates: redirecting wealth and effort toward sustainable, green infrastructure and restoration projects.


References

Gates, B. (2021). How to avoid a climate disaster: The solutions we have and the breakthroughs we need. Penguin Random House.

Guthrie, R. D. (2005). The nature of Paleolithic art. University of Chicago Press.

Klein, N. (2020). On fire: The (burning) case for a green new deal. Simon & Schuster.

Kornhuber, K., O’Gorman, P. A., Coumou, D., Petoukhov, V., Rahmstorf, S., & Hoerling, M. (2023). Amplified Rossby wave activity and its impact on food production stability. Nature Communications, 14(1), 1234. https://doi.org/10.1038/s41467-023-XXX

Zen and the Art of Dissatisfaction – Part 14

Manufacturing Desire

In an era when technological progress promises freedom and efficiency, many find themselves paradoxically more burdened, less satisfied, and increasingly detached from meaningful work and community. The rise of artificial intelligence and digital optimisation has revolutionised industries and redefined productivity—but not without cost. Beneath the surface lies a complex matrix of invisible control, user profiling, psychological manipulation, and systemic contradictions. Drawing from anthropologists, historians, and data scientists, this post explores how behaviour modification, corporate surveillance, and the proliferation of “bullshit jobs” collectively undermine our autonomy, well-being, and connection to the natural world.

Originally published on Substack https://substack.com/home/post/p-164145621

Manipulation of Desire

AI tools, including large language models, are increasingly used to optimise production by quantifying employees’ contributions relative to overall output and costs. This logic, however, rarely applies to upper management, those who oversee the operation of these very systems. Anthropologist David Graeber (2018) emphasised that administrative roles have exploded since the late 20th century, especially in institutions such as universities where hierarchy was once minimal. He noted that science fiction authors can envision robots replacing sports journalists or sociologists, but never the upper-tier roles that uphold the basic functions of capitalism.

In today’s economy, these “basic functions” involve finding the most efficient way to allocate available resources to meet present or future consumer demand—a task Graeber argues could be performed by computers. He contends that the Soviet economy faltered not because of its structure, but because it collapsed before the era of powerful computational coordination. Ironically, even in our data-rich age, not even science fiction dares to imagine an algorithm that replaces executives.

Ironically, the power of computers is not being used to streamline economies for collective benefit, but to refine the art of influencing individual behaviour. Instead of coordinating production or replacing bureaucracies, these tools have been repurposed for something far more insidious: shaping human desires, decisions, and actions. From a Buddhist perspective, the manipulation of human desire is dangerous territory. The Buddha taught that the cause of suffering and dissatisfaction is tanha, usually translated as desire, craving, or thirst. If that thirst is deliberately manipulated and controlled, we can be sure that under surveillance capitalism suffering will not end. To understand how we arrived at this point, we must revisit the historical roots of behaviour modification and the psychological tools developed in times of geopolitical crisis.

The roots of modern behaviour modification trace back to mid-20th-century geopolitical conflicts and psychological experimentation. During the Korean War, alarming reports emerged about American prisoners of war allegedly being “brainwashed” by their captors. These fears catalysed the CIA’s MKUltra program: covert mind-control experiments carried out at institutions like Harvard, often without subjects’ consent.

Simultaneously, B. F. Skinner’s behaviourist theories gained traction. Skinner argued that human behaviour could be shaped through reinforcement, laying the groundwork for widespread interest in behaviour modification. Although figures like Noam Chomsky would later challenge Skinner’s reductionist model, the seed had been planted.

What was once the domain of authoritarian states is now the terrain of corporate power. In the 21st century, the private sector, particularly the tech giants, has perfected the tools of psychological manipulation. Surveillance capitalism, a term coined by Harvard professor Shoshana Zuboff, describes how companies collect and exploit vast quantities of personal data to subtly influence consumer behaviour. It is very possible that your local supermarket is gathering data on your purchases and building a detailed user profile, which is in turn sold to its partners. These practices, once feared as mechanisms of totalitarian control, are now normalised as personalised marketing. Yet the core objective remains the same: to predict and control human action, and to turn that into profit.

Advertising, Children, and the Logic of Exploitation

In the market economy, advertising reigns supreme. It functions as the central nervous system of consumption, seeking out every vulnerability, every secret desire. Jeff Hammerbacher, a data scientist and early Facebook engineer, resigned in disillusionment after realising that some of the smartest minds of his generation were being deployed to optimise ad clicks rather than solve pressing human problems.

Today’s advertising targets children. Their impulsivity and emotional responsiveness make them ideal consumers—and they serve as conduits to their parents’ wallets. Meanwhile, parents, driven by guilt and affection, respond to these emotional cues with purchases, reinforcing a cycle that ties family dynamics to market strategies.

Devices meant to liberate us—smartphones, microwave ovens, robotic vacuum cleaners—have in reality deepened our dependence on the very system that demands we work harder to afford them. Graeber (2018) terms the work that sustains this cycle “bullshit jobs”: roles that exist not out of necessity, but to perpetuate economic structures. These jobs are often mentally exhausting, seemingly pointless, and maintained only out of fear of financial instability.

Such jobs typically require a university degree or social capital and are prevalent at managerial or administrative levels. They differ from “shit jobs,” which are low-paid but societally essential. Bullshit jobs include roles like receptionists employed to project prestige, compliance officers producing paperwork no one reads, and middle managers who invent tasks to justify their existence.

Historian Rutger Bregman (2014) observes that medieval peasants, toiling in the fields, dreamt of a world of leisure and abundance. By many metrics, we have achieved this vision—yet rather than rest, we are consumed by dissatisfaction. Market logic now exploits our insecurities, constantly inventing new desires that hollow out our wallets and our sense of self.

Ecophilosopher Joanna Macy and Dr. Chris Johnstone (2012) give a telling example from Fiji, where eating disorders like bulimia were unknown before the arrival of television in 1995. Within three years, 11% of girls suffered from it. Media does not simply reflect society—it reshapes it, often violently. Advertisements now exist to make us feel inadequate. Only by internalising the belief that we are ugly, fat, or unworthy can the machine continue selling us its artificial solutions.

The Myth of the Self-Made Individual

Western individualism glorifies self-sufficiency, ignoring the fundamental truth that humans are inherently social and ecologically embedded. From birth, we depend on others. As we age, our development hinges on communal education and support.

Moreover, we depend on the natural world: clean air, water, nutrients, and shelter. Indigenous cultures like the Iroquois/Haudenosaunee express gratitude to crops, wind, and sun. They understand what modern society forgets—that survival is not guaranteed, and that gratitude is a form of moral reciprocity.

In the Kalahari, the San people question whether they have the right to take an animal’s life for food, especially when its species nears extinction. In contrast, American officials once proposed exterminating prairie dogs on Navajo/Diné land to protect grazing areas. The Navajo elders objected: “If you kill all the prairie dogs, there will be no one to cry for the rain.” The result? The ecosystem collapsed—desertification followed. Nature’s interconnectedness, ignored by policymakers, proved devastatingly real.

Macy and Johnstone argue that the public is dangerously unaware of the scale of ecological and climate crises. Media corporations, reliant on advertising, have little incentive to tell uncomfortable truths. In the U.S., for example, television is designed not to inform, but to retain viewers between ads. News broadcasts instil fear, only to follow up with advertisements for insurance—offering safety in a world made to feel increasingly dangerous.

Unlike in Finland or other nations with public broadcasters, American media is profit-driven and detached from public interest. The result is a population bombarded with fear, yet denied the structural support—like healthcare or education—that would alleviate the very anxieties media stokes.

Conclusions 

The story of modern capitalism is not just one of freedom, but also of entrapment—psychological, economic, and ecological. Surveillance capitalism has privatised control, bullshit jobs sap our energy, and advertising hijacks our insecurities. Yet throughout this dark web, there remain glimmers of alternative wisdom: indigenous respect for the earth, critiques from anthropologists, and growing awareness of the need for systemic change.

The challenge ahead lies not in refining the algorithms, but in reclaiming the meaning and interdependence lost to them. A liveable future demands more than innovation; it requires imagination, gratitude, and a willingness to dismantle the myths we’ve mistaken for progress.


References

Bregman, R. (2014). Utopia for realists: And how we can get there. Bloomsbury Publishing.
Chomsky, N. (1959). A review of B. F. Skinner’s Verbal behavior. Language, 35(1), 26–58.
Eisenstein, C. (2018). Climate: A new story. North Atlantic Books.
Graeber, D. (2018). Bullshit jobs: A theory. Simon & Schuster.
Hammerbacher, J. (n.d.). As cited in interviews on ethical technology, 2013–2016.
Johnstone, C., & Macy, J. (2012). Active hope: How to face the mess we’re in without going crazy. New World Library.
Loy, D. R. (2019). Ecodharma: Buddhist teachings for the ecological crisis. Wisdom Publications.
Skinner, B. F. (1953). Science and human behavior. Macmillan.
Zuboff, S. (2019). The age of surveillance capitalism: The fight for a human future at the new frontier of power. PublicAffairs.

Zen and the Art of Dissatisfaction – Part 13

Who Do We Owe?

This post delves into the often overlooked complexities of our financial systems and the deep-rooted mechanisms of debt that have shaped our world. In exploring the history of money, state power, and the intricate relationship between banks and citizens, we see how dissatisfaction has long been embedded in the foundations of economic systems. Just as Zen practice challenges the conventional pursuit of constant pleasure and accumulation, our financial history reveals a pattern of never-ending striving, often at the expense of broader social equity. The financial system, much like the pursuit of material satisfaction, is a cycle of continual debt, obligation, and inequality. All of this often equals suffering, or at least dissatisfaction. Understanding this cycle is key to understanding the dissatisfaction that runs through modern society—how it originates from systems that promise wealth and prosperity, yet often deliver nothing more than perpetual indebtedness.

Originally published in Substack https://substack.com/home/post/p-163455665

The Roots of Debt and Power

During the Middle Ages, rulers realised they could manipulate their finances by giving more power to bankers. Anthropologist David Graeber (2011) writes that the history of modern financial instruments and paper money traces back to municipal bond issuance. The Venetian government began this practice in the 12th century when it needed money for military purposes. They collected loans from taxpaying citizens, offering a 5% annual interest rate in return. These bonds were made transferable, creating a market for government debt. Since these bonds had no set maturity date, their market prices fluctuated wildly, as did the probability of repayment.

Similar practices spread quickly across Europe. The state ensured tax compliance by requiring citizens to lend money with interest. But what exactly is this interest? The concept originates in Roman law, where interest (Latin interesse) was compensation for the lender’s loss if repayment was delayed. In practice, the Venetian state agreed to pay this Roman “interest” — a penalty for late repayment — to citizens who lent money to the state.

Such a system undoubtedly raised questions about the legal and moral relationship between citizens and the state. However, it spread quickly, as it made financing wars and conquests easier for states. By 1650, most Dutch households owned government debt. The true paradox of this system appeared when such bonds were monetised, and citizens started using these promises of repayment as currency for trade.

These government bonds sparked an economic revolution, transforming independent townspeople and villagers into wage labourers, forced to work for those with access to higher forms of credit. Gold bars imported from the Americas were rarely used for daily transactions. Instead, they travelled from Spain’s Seville to Genoese bankers’ vaults, then onward to China, where they were exchanged for silk and other luxury goods.

The Birth of Paper Money: Shifting Trust and Power

Government bonds were, in principle, already paper money, but it wasn’t until the establishment of the Bank of England in 1694 that true paper money emerged. The bank’s notes were, in effect, claims on the king’s war debts: a consortium of bankers lent money to the crown and received the right to circulate notes backed by that loan. This money marked a shift in the nature of currency, now determined by speculative forces — interest rates and profits derived from military success and the exploitation of colonial resources.

This development led to a market-based economy, still characterised by a complex relationship between militarism, banking, and exploitation. The value of money shifted from direct human trust and the exchange of precious metals to government bonds, promising profits. This practice of issuing debt extended from government bonds to shares in corporations, suggesting that money could be endlessly created through interest-bearing loans.

The US dollar today is still a form of debt issued by the Federal Reserve, the United States’ central banking system. This arrangement mirrors the loan system introduced by the Bank of England: the central bank lends money to the US government by purchasing government bonds, which are then turned into money through further lending.

Supporters of market economies claim that such systems have existed for 5,000 years. Yet, when examining the history of monetary economies, we see that the earliest market economies, such as 17th-century Holland and 18th-century England, experienced disastrous speculative crashes, like the tulip mania and the South Sea bubble.

Ultimately, the money we use is an extension of public debt. We engage in trade based on government promises, which are essentially loans from future generations. State debt, as politicians have noted since its inception, is borrowed from future generations. On one hand, this arrangement increases political power in the hands of the state; on the other, it suggests the government owes something to its citizens. The problem lies in the fact that state debt originates from the deprivation of freedom, war, and violence. It is not owed equally to all people but primarily to capital owners. The word “capitalist” originally referred to someone who owned government bonds.

Debt and Global Ruin?

Enlightenment thinkers feared that state debt could lead to global ruin. The introduction of impersonal debt carried the ever-present risk of bankruptcy. While individual bankruptcy could mean loss of property, imprisonment, torture, hunger, or death, no one knew what state bankruptcy might entail.

For centuries, capitalism operated in a state of perpetual anxiety, with thinkers like Karl Marx, Max Weber, Joseph Schumpeter, and Ludwig von Mises confident that the system could not last beyond two generations. Graeber describes how, in 1870s Chicago, many wealthy industrialists built homes near military bases, convinced a revolution was imminent.

Debt and interest are significant factors in our world today. Graeber recounts how the International Monetary Fund (IMF) was created when OPEC countries poured vast amounts of oil wealth into Western banks during the 1970s oil crisis. These banks could not find new investment opportunities, so they started persuading global South dictators and politicians to take loans. These loans began with low interest rates but soared to 20% during the tight monetary policies of the 1980s. This aggressive lending process led to a debt crisis in the global South in the 1980s and 1990s. To refinance these loans, the IMF required nations to cut food subsidies, abandon free healthcare and education, and so on.
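The debt dynamic described above comes down to compound-interest arithmetic. The sketch below uses invented figures — the principal, rates, and durations are illustrative only, not taken from Graeber or from IMF records — to show how a loan taken at a low rate balloons once rates jump to 20%, which is why repayment could never catch up with the growing balance.

```python
# Hedged illustration of the compound-interest dynamic described in the
# text. All numbers here are invented for illustration; they are not
# actual IMF loan figures.

def balance_after(principal: float, rate: float, years: int) -> float:
    """Outstanding debt after `years` of annual compounding, no repayments."""
    return principal * (1 + rate) ** years

debt = balance_after(100.0, 0.04, 5)    # five years at the initial 4% rate
debt = balance_after(debt, 0.20, 10)    # then ten years at 20%
print(round(debt, 1))                   # roughly 7.5x the original loan
```

Even partial repayments barely dent a balance growing at 20% a year, which is the mechanical core of the 1980s–1990s debt trap the text describes.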

The tragedy of such loans is that the global South has often repaid them multiple times. The initial loan often ended up in the pockets of dictators, and as interest accumulated, the debt never truly had an endpoint. The IMF created a way to generate money out of nothing, without a concrete limit. But what moral right did they have to act in this manner? What moral obligation do these countries have to repay something they have already paid?

Since the late 19th century, American economic thought has shaped global political and economic development. In the 1800s, anti-capitalist views in the United States, such as producerism, argued that labour, not capital, created true wealth. President Abraham Lincoln, a prominent producerist, stated that capital was merely the fruit of labour. However, from the 1890s, a new ideology emerged, promoted by industrial magnates, bankers, and political allies, arguing that it was capital, not labour, that created wealth.

This cultural campaign, championed by steel magnate Andrew Carnegie, argued that concentrated capital under wise leadership could reduce commodity prices so much that future workers would live as well as past kings. Carnegie believed that high wages for the poor were not beneficial for the ”race.”

It is crucial to remember that the Marxist term “worker” never referred exclusively to factory workers. In fact, during Marx’s time, more working-class individuals were employed as maids, servants, shoeshiners, waste collectors, cooks, nurses, cab drivers, teachers, prostitutes, janitors, and traders than in mines or factories.

The idea of wage labour, working under supervision in factories performing tasks set by a boss, originates from colonial plantation slavery and the hierarchical command structure of trading companies’ fleets.

Shaping of Modern Economies

By the early 20th century, this ideology of capital producing wealth became entrenched in Western thinking. Several events in the United States changed the relationship to work, family, leisure, and especially consumption. Advertisements for consumer products began to take their modern form in the 1920s, and with the Great Depression of the 1930s, the idea of a shorter workweek was no longer discussed.

In 1914, Henry Ford, the founder of the Ford Motor Company, reduced his factory’s workday from nine to eight hours, doubled workers’ wages, and promised to share profits with employees. The main reason for this was the shortage of good workers, as few were willing to work on Ford’s assembly line. The promise of better wages and potential company profits resolved this issue, providing a quick economic gain for Ford’s company.

Ford believed that, although the company would lose money initially, it would recover because workers would now have more money to spend on Ford products. Ford turned his employees into loyal consumers. However, this extra pay was tied to social expectations. Ford’s sociological department would visit workers’ homes unannounced to assess cleanliness, safety, and alcohol use. Any deviations would result in a deduction from their bonus.

Ford’s shareholders took him to court, arguing that his duty was to maximise profits for them rather than to run the company for the benefit of workers or customers. The court ruled that while Ford’s humanitarian sentiments were admirable, his company existed to make profits for its shareholders.

In the US, the debate about a shorter workweek continued until World War II. Post-war economic growth pushed the issue aside: shorter working hours would have meant lower wages, and lower wages less money for consuming the products promoted by the media.

The United States is a striking example of how a middle class can be built and, in recent decades, undermined. Despite consistent GDP growth since the 1960s, wages have stagnated since the early 1970s, even though American workers were among the best educated in the world. Globalisation and technological change have reshaped business practices.

Large companies like Amazon still employ thousands, but this is a fraction of the workforce that would have been needed before automation. Algorithms now optimise business operations, replacing human labour with machines. The pattern today is that working-class incomes stagnate or fall even as GDP rises.

Afterword

As we reflect on the origins of debt and finance, it becomes evident that the complex relationships between states, banks, and citizens have had far-reaching consequences for the modern world. From the early municipal bonds issued by Venetian rulers to the creation of the US dollar and the global financial system of today, the trajectory of money is intricately linked to power, conflict, and inequality. What was once a simple exchange of goods and services has evolved into a global network of debt and credit, often with severe repercussions for ordinary people. The lessons of history remind us that economic systems, while essential for progress, often come at the cost of social justice and equality. As we move forward, it is crucial to question the morality of the systems that govern our financial world and to explore alternatives that prioritise the well-being of all individuals, rather than just the few who control the flow of capital.


References

Graeber, D. (2011). Debt: The first 5,000 years. Melville House.
Graeber, D. (2018). Bullshit jobs: A theory. Simon & Schuster.
Harvey, D. (2005). A brief history of neoliberalism. Oxford University Press.
Klein, N. (2007). The shock doctrine: The rise of disaster capitalism. Penguin.
Mazzucato, M. (2018). The value of everything: Making and taking in the global economy. Allen Lane.
Standing, G. (2011). The precariat: The new dangerous class. Bloomsbury Academic.

Zen and the Art of Dissatisfaction – Part 12

From Nutmeg Wars to Domination

This post explores the dark undercurrents of the market economy, tracing its violent colonial roots and questioning the common myths of its origins. Drawing from history, anthropology, and the work of David Graeber, it challenges mainstream economic narratives and highlights the human cost behind capitalism’s foundations—from the spice trade in the Banda Islands to the coercive systems of debt in early modern Europe.

Originally published in Substack https://substack.com/home/post/p-162823221

In 1599, a Dutch expedition contacted the chiefs of the Banda Islands, famed for their nutmeg, to negotiate an agreement. The appeal and value of nutmeg were heightened by the fact that it grew nowhere else. The Dutch forbade the islanders from selling spices to representatives of other countries. However, the Banda islanders resisted the Dutch demand for a monopoly over the spice trade. In response, the Dutch East India Company—VOC (Vereenigde Oost-Indische Compagnie)—decided to conquer the islands by force. The company launched several military campaigns against the islanders, aided by Japanese mercenaries and samurai.

The conquest, which began in 1609, culminated in a massacre: VOC forces killed 2,800 Bandanese and enslaved 1,700. Weakened by hunger and ongoing conflict, the islanders felt powerless to resist and began negotiating surrender in 1621. The VOC official Jan Pieterszoon Coen (1587–1629) deported the remaining 1,000 islanders to what is now Jakarta. With the resistance crushed, the Dutch secured a monopoly over the spice trade, which they held until the 19th century. During the Napoleonic Wars, the British temporarily seized control of the islands and transferred nutmeg plants to Sri Lanka and Singapore.

In the 2020s, a statue of Jan Pieterszoon Coen erected in 1893 in the Dutch city of Hoorn has faced similar criticism to the statues of Robert E. Lee in the United States and Cecil Rhodes in Cape Town, South Africa. Defenders of the statue view it as a sacred symbol of secular Dutch identity.

Even though the business-as-usual attitude accelerating today’s ecological crisis may not entail the displacement of indigenous peoples by samurai and their replacement with slaves, the market economy undeniably has a dirty side.

Settler histories and forced migration

In 2010, I visited Amsterdam with my friend who was born in South Africa. He suggested we visit the VOC’s 17th-century headquarters, the Oost-Indisch Huis. The building is strikingly unassuming. Located at Oude Hoogstraat 24, a small passageway leads to a modest courtyard. At the back stands a three-storey building with a single small door, two windows on the left and one on the right. Staring into this minimal yet somehow claustrophobically ominous space, it’s difficult to comprehend the VOC’s profound impact on the world—comparable only to that of its British counterpart, the East India Company (EIC).

These joint-stock companies, founded in the early 1600s and backed by private armies and nearly limitless power, helped establish a system in which investors could profit from enterprise success by purchasing shares. Standing in that courtyard, my friend recounted how his Dutch ancestors had little choice in the mid-1600s but to board ships bound for what is now South Africa. He spoke of an uncle who spent his entire life in the Karoo desert herding sheep, never eating anything that didn’t come from a sheep. South African history is filled with such solitary shepherds.

Indeed, the VOC was so concerned about these isolated sheep farmers and the continuation of white European populations in the African wilderness that they abducted young girls from Amsterdam orphanages and shipped them to Africa to become wives for the shepherds.

Anthropologist David Graeber (2011) explores the common belief that markets and money evolved from a barter system—a view popularised by Scottish moral philosopher Adam Smith in his 1776 work An Inquiry into the Nature and Causes of the Wealth of Nations. Smith argued against the idea that money was a state invention. Following the liberal philosophical tradition of John Locke, who believed that governments existed to protect private property and functioned best when limited to that role, Smith extended the theory by claiming that property, money, and markets predated political institutions. According to Smith, these were the foundations of civilisation, and governments should confine themselves to guaranteeing the value of currency.

Graeber, however, challenges this assumption. He asks whether there has ever been a point in human history when people lived in a pure barter economy, as Smith claimed. His research finds no such evidence. Barter economies only existed in contexts where people were already familiar with cash.

Graeber introduces the term “human economies” to describe anthropological cases in which value was measured according to personal honour and social standing. These systems are well-documented in ancient Greece, medieval Ireland, and among Indigenous cultures in Africa and the Americas. For most of human history, people didn’t need money or other exchange mediums to get what they needed. Help was offered without expectation of compensation. In human economies, value was attached only to human life, honour, and dignity.

Human economies and the value of honour

Written records in Ireland begin around 600 AD, by which time the once-thriving slave trade had already ceased. While the rest of Europe used Roman-inspired coinage, Ireland—lacking significant mineral wealth—did not. Its 150 kings could only trade for foreign luxury goods using cattle and people, which thus became the de facto currency. As elsewhere in Europe, the collapse of slavery was a major consequence of the fall of the Roman Empire.

Medieval Irish lived on scattered farmsteads, growing wheat and raising livestock. There were no real towns, except those that formed around monasteries. Money served purely social purposes: for gifts, payments to artisans, doctors, poets, judges, entertainers, and for feudal obligations. Tellingly, lawmakers of the time didn’t even know how to price goods. Everyday objects were never exchanged for money. Food was shared within families or sent to lords who hosted feasts for friends, rivals, and vassals.

In such societies, honour and social status were everything. Though physical items had no monetary value, a person’s honour carried a precisely defined price. The most esteemed figures—kings, bishops, and master poets—had an “honour price” equivalent to seven slave girls. Although slavery had ended, it remained a conceptual unit of value. Seven slave girls equalled 21 dairy cows and 21 ounces of silver.

Throughout history, human value—whether defined by honour or tangible worth—often served as the original measure of price, even when nothing else required valuation. One root of cash-based economies lies in the conduct of large-scale wars. For example, in Sumerian Mesopotamia, silver reserves were stored in temples merely as collateral in case debts had to be settled. The entire Sumerian economy was based on credit. Though silver backed these arrangements, it typically sat untouched inside temple vaults.

In such systems, not even kings could obtain anything they simply desired. But temple-held silver could be stolen—and the first coins likely arose this way. For instance, Alexander the Great’s (356–323 BC) vast conquests necessitated paying his soldiers, and what better means than minting coins from looted silver? This pattern is evident wherever money first emerged. Crucially, such systems required slavery. War captives—slaves—played a vital role in defining human value and, by extension, the value of all things. They also laboured in the extraction of key minerals like silver.

Military-coinage-slavery complex

Graeber (2011) refers to this dynamic as the military-coinage-slavery complex. Similar developments appeared around 2,500 years ago across the Western world, the Middle East, India, and China. Money remains deeply entangled with power and freedom. As Graeber notes, anyone working for money understands that true freedom is largely an illusion. Much of the violence has been pushed out of sight—but we can no longer even imagine a world based on social credit arrangements without surveillance systems, weapons, tasers, or security cameras.

Cash fundamentally altered the nature of economic systems. Initially, it was used primarily for transactions with strangers and for paying taxes. In Europe, up until the end of the Middle Ages, systems based on mutual aid and credit were more common than cash purchases.

The origins of the market economy lie in the collapse of trust between traditional communities, replaced by the impersonal force of markets. Human-based credit economies were transformed into interest-bearing debt systems. Moral networks gave way to debt structures upheld by vengeful and violent states. For instance, a 17th-century urban resident could not count on the legal system, even when technically in the right. Under Elizabeth I (1558–1603), the punishment for vagrancy—meaning unemployment—began with having one’s ears nailed to a post. Repeat offenders faced death. The same logic applied to debt: creditors could pursue repayment as though it were a crime.

Graeber gives the example of Margaret Sharples, who in 1660 was prosecuted in London for stealing fabric—used to make an underskirt—from Richard Bennett’s shop. She had negotiated the acquisition with Bennett’s servant, promising to pay later. Bennett confirmed she agreed to a price and even paid a deposit, offering valuables as collateral. Yet Bennett returned the deposit and initiated legal proceedings. Sharples was ultimately hanged.

This marks a profound shift in moral obligations and how societies managed debt. Previously, debt was a normal part of social life. But with state systems in place, creditors gained the right to recover their loans with interest—and to sue. Graeber writes:

“The criminalization of debt, then, was the criminalization of the very basis of human society. It cannot be overemphasized that in a small community, everyone normally was both lender and borrower. One can only imagine the tensions and temptations that must have existed in communities … when it became clear that with sufficiently clever scheming, manipulation, and perhaps a bit of strategic bribery, they could arrange to have almost anyone they hated imprisoned or even hanged.” (Graeber 2011, p. 381)

Conclusion
This historical and anthropological lens reveals the market economy not as a neutral or natural evolution, but as a system forged through conquest, coercion, and structural violence. From spice monopolies enforced with massacres to the criminalisation of everyday debt, market capitalism has long relied on hierarchies of power, enforced by law, military, and myth. As we face ecological and moral crises today, understanding this history is crucial to reimagining alternatives rooted in trust, community, and human dignity.


References

Graeber, D. (2011). Debt: The first 5,000 years. New York: Melville House.
Smith, A. (1776). An Inquiry into the Nature and Causes of the Wealth of Nations. London, UK: W. Strahan and T. Cadell.

Zen and the Art of Dissatisfaction – Part 11

The Weight of Debt, the Price of Trust

What is money, really? Is it a tool, a promise, or a shared illusion? This essay dives into the deep and often surprising roots of our monetary systems, far beyond coins and banknotes. Drawing on the anthropological insights of David Graeber and the philosophical debates of Enlightenment thinkers like Rousseau and Hobbes, it explores how concepts of debt, value, trust, and inequality have shaped human civilisation—from the temples of Sumer to the trading ports of colonial empires. It also confronts the uncomfortable legacy of economic systems built on slavery and environmental domination. The aim is not only to trace the history of money, but to ask what this history says about who we are—and who we might become.

Originally published in Substack https://substack.com/home/post/p-162392089

Before Money – The Myth of Barter

Anthropologist David Graeber (2011) argues that debt and credit systems are far older than money or states. In fact, the Sumerians used accounting methods for debt and credit over 5,500 years ago. The Sumerian economy was managed by vast temple and palace complexes employing thousands of priests, officials, craftsmen, farmers, and herders. Temple administrators developed a unified accounting system remarkably similar to what we still use today.

The basic unit of Sumerian currency was the silver shekel, whose weight was standardised to equal one gur, or a bushel of barley. A shekel was divided into 60 portions of barley. Temple workers received two portions of barley per day—sixty per month, the equivalent of one shekel. This monetary value definition did not emerge from commercial trade. Sumerian bureaucrats set the value to manage resources and transfers between departments. They used silver to calculate various debts—silver that was practically never circulated, remaining in the temple or palace vaults. Farmers indebted to the temple or palace typically repaid their debts in barley, making the fixed silver-to-barley ratio crucial.
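The ration arithmetic above can be checked directly. This is a minimal sketch assuming only the ratios given in the text (60 barley portions to the shekel, two portions per worker per day); the function name and bookkeeping shape are my own illustration, not a reconstruction of any actual Sumerian record-keeping.

```python
# Illustrative sketch of the temple accounting ratios described by
# Graeber (2011). The constants come from the text; the code structure
# is an invented illustration.

PORTIONS_PER_SHEKEL = 60   # one silver shekel = 60 barley portions
DAILY_RATION = 2           # barley portions issued per worker per day

def monthly_wage_in_shekels(days_worked: int) -> float:
    """A worker's total ration, expressed in silver shekels."""
    portions = DAILY_RATION * days_worked
    return portions / PORTIONS_PER_SHEKEL

# A worker rationed for a 30-day month earns exactly one shekel:
print(monthly_wage_in_shekels(30))  # 1.0
```

The point of the fixed ratio is visible here: wages, debts, and rations could all be netted against one another in shekel units without any silver ever leaving the vault.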

Graeber asserts that debt predates money and states. He dismisses the notion of a past barter economy, suggesting it could only exist where people already had familiarity with money. Such scenarios have existed throughout history—for example, after the fall of the Roman Empire, medieval traders continued pricing goods in Roman coinage, even though the coins were no longer in circulation. Similarly, cigarettes have been used as currency in prisons.

Money, then, is a kind of IOU. For example, Mary buys potatoes from Helena. Mary owes Helena and writes the debt on a slip of paper, stating who owes whom. Helena then needs firewood from Anna and passes the IOU on to her. Now Mary owes Anna. In theory, this IOU could circulate indefinitely, as long as people trust Mary’s ability to pay. Money is essentially a promise to return something of equal value. Ultimately, money has no real intrinsic utility. People accept it only because they believe others will do the same. Money measures trust.

But does everyone trust Mary’s ability to pay? In theory, a whole city or country could operate on such slips stating that Mary owes them—as long as Mary is immensely wealthy. It would not be an issue if Mary were queen of the land and willing to redeem all the debts at once. In a practical, day-to-day sense, a contemporary euro banknote is also a kind of promissory note. Legally, however, the euro is a fiat currency: its value rests on trust in, and the legal framework of, its issuing authority, the European Central Bank, backed by the eurozone states.

Lending money and usury were taboos in deeply Christian medieval Europe. It was only after the plague wiped out much of the population, with no rational explanation available at the time, that papal authority weakened. This enabled a shift in financial norms. Many Western religions prohibit lending money at interest; only with the weakening of papal dominance could such doctrines be reassessed, allowing banking to grow.

The rise of banking led to growing prosperity and overall wealth. Money was no longer something physical; it became trust between people. This gave rise to the greatest of all monetary inventions—financing. It enabled more ambitious projects that had previously been too risky. It also led to the creation of colonial merchant fleets, as the new banks could finance shipowners who sought vast fortunes in spices, textiles, ivory, tobacco, sugar, coffee, tea, and cocoa from distant lands—reaping enormous profits upon return.

The Bitterness of Luxury

Coffee, tea, and cocoa stand out in particular. These three plant-based stimulants are psychoactive substances that can cause physical dependence. For some, they act as stimulants; for others, they soothe and aid concentration. They fulfil needs that users may not even have known they had. Once again, the dissatisfied human fell into their own trap. Bitter-tasting, these indulgences also increased the demand for sugar.

The growing need for sugar, coffee, tea, cocoa, tobacco, and cotton intensified the African slave trade. It is unfair to call it the African slave trade—as it was a European slave trade conducted on African shores. Europeans even created their own currency for buying people. One human cost one manilla: a horseshoe-shaped ring made of cheap metal with flared ends. Manillas were produced en masse, and many still lie at the bottom of the Atlantic Ocean. So many were made that they continued to function as currency and ornaments until the late 1940s.

Due to the European slave trade, over two million Africans ended up at the bottom of the Atlantic, and entire nations were relocated across the ocean and forced to labour—simply because dissatisfied café customers in London found their coffee bitter and slightly overpriced.

This exploration of human dissatisfaction and its origins touches upon the age-old debate regarding the nature of human goodness and evil. Jean-Jacques Rousseau (1712–1778), the Genevan-French Enlightenment philosopher, composer, and critic of Enlightenment thought, wrote in 1755 the influential Discourse on the Origin and Basis of Inequality Among Men. Rousseau posited that humans once lived as hunter-gatherers in a childlike state of innocence within small groups. This innocence ended when we left our natural paradise and began living in cities. With that came all the evils of civilisation—patriarchy, armies, bureaucrats, and mind-numbing office work.

As a counterpoint to Rousseau’s romantic paradise vision, English philosopher Thomas Hobbes (1588–1679) proposed in Leviathan (1651) that life in a state of nature was not innocent but “solitary, poor, nasty, brutish, and short.” Progress, if any, resulted only from the repressive mechanisms that Rousseau lamented.

David Graeber and David Wengrow (2021) argue that to envision a more accurate and hopeful history, we must abandon this mythical dichotomy of an original Eden and a fall from grace. Rousseau’s ideas didn’t arise in a vacuum, nor were they ignored. Graeber and Wengrow suggest that Rousseau captured sentiments already circulating among the French intelligentsia. His 1754 essay responded to a contest question: “What is the origin of inequality among men, and is it authorised by natural law?”

Such a premise was radical under the absolutist monarchy of Louis XV. Most French people at the time had little experience with equality; society was rigidly stratified with elaborate hierarchies and social rituals that reinforced inequality. This Enlightenment shift, seen even in the essay topic, marked a decisive break from the medieval worldview.

In the Middle Ages, most people around the world who knew of Northern Europe considered it a gloomy and backward place. Europe was rife with plagues and filled with religious fanatics who kept largely to themselves, apart from the occasional violent crusade.

Graeber and Wengrow argue that many Enlightenment thinkers drew inspiration from the ideas of Native Americans, particularly those reported by French Jesuit missionaries. These widely read and popular reports introduced new perspectives on individual liberty and equality—including women’s roles and sexual freedom—that deeply influenced French thought.

Especially influential were the ideas of the Huron people of present-day Canada, who were offended by the harshness, stinginess, and rudeness of the French. The Hurons were shocked to hear that there were homeless beggars in France and criticised the French for their lack of kindness. Nor could they understand how the French talked over one another without sound reasoning, which they saw as a sign of poor intellect.

To understand how Indigenous critiques influenced European thought, Graeber and Wengrow focus on two figures: Louis Armand, Baron de Lahontan (1666–1716), a French aristocrat, and the eloquent and intelligent Huron statesman Kandiaronk (1649–1701). Lahontan joined the French army at seventeen and was sent to Canada, where he engaged in military operations and exploration. He eventually became deputy to Governor Frontenac. Fluent in local languages and, according to his own claims, friends with Indigenous leaders such as Kandiaronk, he published several books in the early 1700s. These works—written in a semi-fictional dialogue format—became widely popular and made him a literary celebrity. It remains unclear to what extent Kandiaronk’s views reflect his own or Lahontan’s interpretations.

Graeber and Wengrow suggest it’s plausible Kandiaronk visited France, as the Hurons sent an envoy to Louis XIV’s court in 1691. At the time, Kandiaronk was speaker of the Huron council and thus a logical choice. His views were radically provocative: he suggested Europe should dismantle its religious, social, and economic systems and try true equality. These ideas profoundly shaped European thought and became staples of fashionable literature and theatre.

Mastering Nature, Enslaving Others

Europeans were, by and large, exceptionally ruthless towards the Indigenous peoples of foreign lands. This same hostility, arrogance, and indifference is also reflected in Western attitudes toward natural resources. The English polymath Francis Bacon (1561–1626) was, among other things, a writer, lawyer, and philosopher. Bacon was one of the Enlightenment reformers who advocated for science. For this reason, I have chosen him as an example here, since the triumph of science from the Enlightenment to the present day has shaped both our thinking and our relationship with nature.

There is no doubt that humanity has greatly benefited from Bacon’s achievements. In recent times, however, feminists and environmentalists have highlighted in his writings justifications for the exploitation of nature that are as unpleasant as they are timely. Naturally, Bacon’s views are also fiercely defended, which makes it difficult to say who is ultimately right. In any case, many have cited the following lines—possibly originally penned by Bacon—as an example:

“My only earthly wish is to stretch man’s lamentably narrow dominion over the universe to its promised bounds… Nature is put into service. She is driven out of her wandering, bound into chains, and her secrets are tortured out of her. Nature and her children are bound into service, and made our slaves… The mechanical inventions of recent times do not merely follow nature’s gentle guidance; they have the power and capacity to conquer and tame her and shake her to her foundations.” (Merchant, 1980; Soble, 1995)

Whether or not these words were written by Bacon himself, they effectively summarise the historical narrative of which we are all victims—especially our children and grandchildren, who are slowly growing up in a dying world marked by escalating natural disasters, famines, and the wars and mass migrations they cause. This kind of mechanistic worldview has also made possible atrocities such as Auschwitz, where human beings were seen merely as enemies or as “others”—somehow separate from ourselves and from how we understand what it means to be human.

Conclusion

The story of money is, at its core, a story of belief—our collective willingness to trust symbols, systems, and each other. But this trust has often been weaponised, tied to exploitation, inequality, and ecological destruction. From the philosophical musings of Enlightenment thinkers inspired by Indigenous critiques to the brutal efficiency of modern finance, the evolution of money reveals both the brilliance and the blindness of human societies. As we stand at a global crossroads marked by climate crises and economic disparity, revisiting these historical insights is more than an intellectual exercise—it’s a necessary reflection on what we value, and why. The future of money may depend not on innovation alone, but on a renewed commitment to justice, sustainability, and shared humanity.


References

Graeber, D. (2011). Debt: The first 5,000 years. New York: Melville House.
Graeber, D., & Wengrow, D. (2021). The dawn of everything: A new history of humanity. Penguin Books.
Hobbes, T. (1651). Leviathan. Andrew Crooke.
Merchant, C. (1980). The death of nature: Women, ecology, and the scientific revolution. Harper & Row.
Rousseau, J.-J. (1755). Discourse on the origin and basis of inequality among men (G. D. H. Cole, Trans.). Retrieved from https://www.gutenberg.org/ebooks/11136
Soble, A. (1995). The philosophy of sex and love: An introduction. Paragon House.

Zen and the Art of Dissatisfaction – PART 10.

Money: Debt and the Death of Meaning

“If nature were a bank, we would have already saved it.”
— Eduardo Galeano

In today’s world, money is more than a means of exchange—it’s a source of power, anxiety, and inequality. The way we earn, spend, and owe has profound effects not only on our personal lives but also on the planet itself. This post explores the complex relationships between money, debt, environmental destruction, and the philosophies that seek to restore balance.

Originally published in Substack https://substack.com/home/post/p-161959754

Money—and especially the lack of it—is one of the biggest sources of dissatisfaction. Debt in particular shapes our lives and influences our sense of contentment. Even though education is free in Finland, I still had to spend several years paying off my student loans, which had ballooned to incomprehensible amounts. But that’s nothing compared to what my American colleagues have to pay for their education. The average U.S. household carries about $111,740 in debt. In Finland, the average household debt is around €49,500. People are often blamed for borrowing money, as being in debt is seen as shameful or even sinful. Yet this money is rarely used for frivolous purposes. Studies show that most debt is incurred for housing, children’s education, sharing with friends, or maintaining relationships.

Anthropologist David Graeber explores the origin and meaning of debt, money, and credit in his groundbreaking work Debt: The First 5,000 Years (2012). According to Graeber, people must go into debt just to reach an income level that covers more than mere survival. Despite their debt, people still buy homes for their families, alcohol for celebrations, gifts for their friends. They’re willing to pay for weddings and funerals even if their credit cards are maxed out. One of the pillars of the market economy is the idea of endless growth and the illusion of an ever-increasing GDP that promises a better future and free money for all. But is this really true? Can the limited resources of our planet sustain endless growth?

Humans have always transformed their environment when migrating to new areas, but the large-scale exploitation of nature and irreversible modification of the atmosphere began only after the principles of the market economy solidified in the 18th century.

Did ancient hunter-gatherers destroy their environment with the same ruthlessness? Romanticising hunter-gatherers has its risks, and many scholars have pointed out that humans have been dangerous mass killers for as long as we’ve existed. Australia is one such example. When modern humans arrived on the continent some 50,000 years ago, nearly all large predators and edible animals vanished. These animals had no concept of how dangerous a hairless, two-legged ape could be—and not enough time to learn.

But just as it’s dangerous to romanticise hunter-gatherers, it’s also dangerous to label them as mass murderers. Nearly all examples show that Indigenous peoples eventually found some kind of balance with nature. There’s no known case where an Indigenous group caused large-scale ecological destruction on their own. Easter Island is often cited as an exception, but that may stem from misinterpretation. Scholars still debate whether the island’s original inhabitants were responsible for the collapse of their own culture.

Historian Rutger Bregman (2020) discusses this debate by reviewing research on Easter Island’s history. He concludes that everything was fine until Western explorers arrived—bringing with them violence and rats that altered the ecosystem. The islanders began to covet Western culture and treasures. Eventually, about a third of the population was taken as slaves to Peruvian mines. Some were returned, now infected with smallpox. That finally ended the peace on the island and eradicated most of the remaining inhabitants.

American environmental activist, author, Buddhist scholar, systems theorist, and deep ecology thinker Joanna Macy has become an influential figure in recent years as ecological activism has risen as a political movement. The climate movement Extinction Rebellion, which started in the UK in October 2018, has included from the beginning people of many religious backgrounds. Buddhist members, in particular, frequently cite Macy’s ideas.

Born in 1929 in Los Angeles, Macy attended the Lycée Français de New York and graduated from Wellesley College in 1950 with a degree in Biblical studies. Her husband, Francis Macy (1927–2009), was a Harvard-trained psychologist and expert in Slavic culture, which led them abroad during the Cold War on assignments for the United States Information Agency (USIA). Macy studied political science at the University of Bordeaux in the early 1950s and was recruited by the CIA to gather intelligence in Germany. While living there, she began a lifelong project translating the work of Austrian poet Rainer Maria Rilke (1875–1926).

Between 1964 and 1972, Macy traveled with her husband, who served in leadership roles with peacekeeping missions in India, Tunisia, Nigeria, and across Africa. She earned her doctorate from Syracuse University in 1978 on the relationship between systems theory and Buddhism, which she had studied while assisting Tibetan refugees in northern India. During that time, she became friends with the young Dalai Lama.

After the Cold War, Francis Macy played a pivotal role in supporting hundreds of activists in Russia, Ukraine, Georgia, and Kazakhstan as they confronted the environmental legacy of nuclear weapons and the Chernobyl disaster. He founded multiple professional associations and collaborated with former Soviet colleagues beginning in 1983.

Thanks to her rich life experience, Joanna Macy serves as a powerful role model for today’s environmental movement—on whose shoulders rests the future of the entire ecosystem. Macy has influenced many other thinkers who operate at the intersection of spirituality and environmental activism, such as philosopher and Zen teacher David R. Loy.

Joanna Macy (2021) argues that humanity does not truly believe the current situation is dangerous. On an individual level, we don’t feel we have a role in solving the crisis. We fear ridicule if we panic, because everyone else seems to think things are just fine. We also fear jeopardising our political or economic standing in our communities if we take action. We think it’s better not to think about it at all—because it is painful and terrifying. We’re paralysed: aware of the danger, but unsure what to do. Some may think nothing can be done, and nothing matters anymore.

Norwegian philosopher Arne Næss (1912–2009) coined the term deep ecology. He was a central figure in the environmental movement from the late 20th century onward, combining his ecological worldview with Gandhi’s principles of nonviolent resistance and actively participating in the defence of biodiversity. His ecological view could be described as a kind of elegant self-realisation: every living being—human, animal, or plant—has an equal right to live. Næss believed people could become part of Earth’s ecosystems through recognising the illusion of the separate self. Joanna Macy agrees, stating that no action in defence of biodiversity feels like a sacrifice once we experience our deeper ecological self—one that includes all life. The whole world becomes myself. When we act on behalf of the world, we restore balance within ourselves.

In 2012, Joanna Macy and psychologist Chris Johnstone developed a method called The Work That Reconnects, which encourages people toward active hope. After all, what’s the point of acting if the game is already lost? They define hope in two ways. First, it’s the outcome we desire, which we believe is possible. The second aspect of hope is passion—the drive to work toward our desired outcome, no matter how unlikely. Passive hope is simply wishing things would go a certain way and waiting for external forces to make it happen. Active hope means taking the situation seriously and doing, right now, whatever we can to move toward our desired future.

Macy categorises today’s dominant narratives into three types: business as usual, the Great Unraveling, and the Great Turning. The first, business as usual, is the belief that economic growth will inevitably lead to progress. While things have improved on average for many, the future depends on an unprecedented level of motivation and global cooperation—without any guarantees of economic reward.

Human creation

However, our economic system is not a law of nature, but a man-made construct—one that could be changed through collective human decision. It is not a force carved in stone for which there have never been alternatives. Market capitalism has merely proven so efficient at generating wealth and health that few dare to question it, especially since the fall of communism left it without serious competition. Yet the fruits of capitalism have not been shared equally among the world’s population.

Although capitalism can be seen as the bearer of gifts and freedom here in the wealthy parts of the world, the relationship between capitalism and violence becomes evident when we look at countries once subjugated by colonial systems. In most cases, the original tribal borders and systems were dismantled, and the populations enslaved. In some instances, the entire indigenous population was replaced, as happened in the 17th century on the volcanic Banda Islands—now part of Indonesia.

Western culture underwent several significant changes in its transition from the Middle Ages to the Enlightenment, particularly concerning the treatment of colonial populations and the development of the banking system in Renaissance Italy. Giovanni di Bicci de’ Medici (1360–1429) established his first bank in Florence in 1397. Though he had a branch in Rome, it was Florence’s investment opportunities that made the bank thrive. Art lovers know the House of Medici through their renowned patronage. The Medici bank became the largest in Europe during the 15th century. The family produced four popes—the last of whom, Leo XI, ascended the papacy and died in the same year, 1605—as well as two queens of France: Catherine de’ Medici (queen 1547–1559) and Marie de’ Medici (queen 1600–1610). Their protégés included major artists of the Italian Renaissance such as Filippo Brunelleschi (1377–1446), Donatello (1386–1466), Fra Angelico (1395–1455), Leonardo da Vinci (1452–1519), and Michelangelo (1475–1564).

Western historical accounts also credit the Medici family with spreading the use of double-entry bookkeeping. The system was codified by the Franciscan friar Luca Pacioli (c. 1447–1517), a friend of Leonardo da Vinci, in 1494. The Medici’s accounting practice documented where money came from and where it went. Should we, then, each time someone asks whether we’re paying by debit or credit, remember this monk?
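Pacioli’s principle is simple enough to sketch in a few lines of code. The following is a minimal, hypothetical illustration of my own (the Ledger class and the account names are not from Pacioli or the Medici records): every transaction is recorded twice, as a debit in one account and a credit in another, so the books as a whole always balance.

```python
# A hypothetical sketch of double-entry bookkeeping: each transaction is
# entered twice (one debit, one credit), so the sum of all balances is
# always zero and errors are easy to detect.

from collections import defaultdict

class Ledger:
    def __init__(self):
        # Every account starts at zero; unknown accounts are created on demand.
        self.balances = defaultdict(int)

    def record(self, debit_account, credit_account, amount):
        """Record one transaction as a matched debit/credit pair."""
        self.balances[debit_account] += amount
        self.balances[credit_account] -= amount

    def is_balanced(self):
        # The defining invariant of double entry: everything sums to zero.
        return sum(self.balances.values()) == 0

ledger = Ledger()
ledger.record("goods", "cash", 100)   # buy goods for 100 florins
ledger.record("cash", "sales", 150)   # sell them for 150 florins
assert ledger.is_balanced()
print(ledger.balances["cash"])        # 50: the profit now sitting in cash
```

The design point is the invariant: because money can only move between accounts, never appear or vanish, a single mistyped entry immediately breaks the zero-sum check.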

Conclusion: Rewriting the Script

If nature were a bank, would we have saved it already? Eduardo Galeano’s biting quote still holds true. Our world is organised around the movement of money, not the flourishing of life. But the system we live in is not immutable. It was made by us, and we can remake it.

Hope begins when we realise this truth. Joanna Macy, David Graeber, Arne Næss—these thinkers remind us that alternatives are not only possible but necessary. Deep ecology, active hope, and historical self-awareness can help us shift from a paradigm of endless extraction to one of deep connection. The future isn’t written yet. Whether or not we act—together, and now—will determine how that story unfolds.

References

Atwood, M. (2008). Payback: Debt and the shadow side of wealth. Toronto, ON: Anansi Press. 
Bregman, R. (2020). Humankind: A hopeful history. Bloomsbury.
Galeano, E. (n.d.). If nature were a bank, we would have already saved it [Quote].
Graeber, D. (2012). Debt: The first 5,000 years. Melville House.
Johnstone, C., & Macy, J. (2012). Active hope: How to face the mess we’re in with unexpected resilience and power. New World Library.
Macy, J. (2021). A wild love for the world: Joanna Macy and the work of our time (S. Macy, Ed.). Shambhala Publications.
Naess, A. (2008). The ecology of wisdom: Writings by Arne Naess (A. Drengson & B. Devall, Eds.). Counterpoint.
Pacioli, L. (2007). Particularis de computis et scripturis [Facsimile edition]. (Original work published 1494). Lucerne: Verlag am Klosterhof.

Zen and the Art of Dissatisfaction – PART 9.

Chasing Shadows: Understanding the Roots of Human Dissatisfaction

In this post, we explore the nature of dissatisfaction and the human tendency to experience suffering—a theme central to Eastern philosophy for over 2,500 years. Drawing primarily from Buddhist thought, this article outlines how dissatisfaction pervades human existence and how we might begin to understand and engage with it differently. Rather than proposing a clear-cut solution, it invites readers to reflect more deeply on the illusions of self, permanence, and happiness.

Originally published in Substack: https://substack.com/home/post/p-161291109

Dissatisfaction or Suffering?

The nature of dissatisfaction and the possibility of liberation from it has been a consistent theme in Eastern philosophy. Central to this discourse are the teachings of Siddhartha Gautama—known as the Buddha—who lived in India over 2,500 years ago. According to Buddhist thought, our fundamental dissatisfaction stems from a mistaken belief in a concrete, separate self—an illusion that this “I” exists independently of the surrounding world.

The Buddha’s understanding of suffering is summed up in what are known as the Four Noble Truths:

  1. Suffering is a natural part of life.
  2. Suffering is caused by craving and attachment.
  3. It is possible to end suffering.
  4. There is a path that leads to the end of suffering. This path includes eight guiding principles based on honesty, awareness, and ethical living: right understanding, right intention, right speech, right action, right livelihood, right effort, right mindfulness, and right concentration.

The term the Buddha used for suffering is dukkha, a Pāli word often translated as “suffering,” though it also encompasses inner unease or stress. Even ancient Buddhist texts mention body image issues as a form of suffering. Feeling unattractive or unworthy has long been part of human dissatisfaction. In this view, suffering is caused by our own actions, desires, and failure to perceive the true nature of reality.

Dukkha can be categorised in three ways:

  • Dukkha-dukkha: physical and emotional suffering due to aging, illness, birth, and death.
  • Viparinama-dukkha: suffering of change, frustration that pleasant experiences don’t last.
  • Sankhara-dukkha: existential dissatisfaction rooted in impermanence and change, the fear that life offers us no solid ground and that our very existence is questionable.

Philosopher and Zen teacher David R. Loy (2018) describes dissatisfaction as a kind of existential void that cannot be filled. He argues that we cannot address dissatisfaction without deconstructing the illusion of self. If dissatisfaction is inherent to our identity, then perhaps humans are, by nature, dissatisfied beings. Loy emphasises that fixing one area of life often just shifts our dissatisfaction elsewhere, without addressing its root.

In Christianity, dissatisfaction is often interpreted as the result of sin—original disobedience against God. If we are to overcome our dissatisfaction, we must theoretically resolve this ancient transgression, a task beyond our capabilities. In contrast, Buddhism encourages us to accept dissatisfaction as real and to embark on a path toward liberation.

Sense of Lack

Dissatisfaction is an emotion, but it can be neither dismissed nor suppressed. According to Loy, our real struggle is a suppressed fear that our sense of self is groundless and insecure. Trying to secure it is like trying to catch your own shadow. We try to solve this internal issue through external, material means: we attempt to fill the psychological void with achievements, validation, power, money, romantic relationships, and consumer goods—but the illusion persists.

Loy calls this the lack project. It’s our effort to overcome an internal void through symbolic acts—writing books, painting, founding hospitals, or competitive hot dog eating (Joey “Jaws” Chestnut ate 76 hot dogs in ten minutes in 2021). In contemporary times, social media amplifies these projects, as we craft idealised identities and seek validation through likes.

Buddhism offers a surprisingly simple practice in response: just sit still. Literally. Sitting meditation—sometimes anchored to the breath or other bodily sensations—invites us to observe thoughts and emotions as transient phenomena, without clinging to them. Over time, these thoughts burst like soap bubbles. The goal isn’t to eliminate dissatisfaction, but to develop awareness of it and of our fleeting sense of self.

Loy notes that when dissatisfaction has nowhere left to go—when it cannot project itself outward—it collapses inward. The illusion of self, which is always craving, dissolves. And with it, the need to satisfy that craving. Dissatisfaction often manifests as guilt: “There’s something wrong with me.” It encompasses the trauma of birth, illness, aging, and the fear of death. We feel bound to situations we dislike, estranged from what we love. Even in moments of peace, the mind fears this peace won’t last. Why does everything nice and beautiful have to end? As long as we feel incomplete, real life always seems just out of reach, never quite here.

The opposite of dukkha is sukha, a Sanskrit word meaning joy, pleasure, or ease. It’s often mistakenly believed to be the root of the word for sugar in many languages—like sukkar (Arabic), zucchero (Italian), and azúcar (Spanish). While the similarity is striking, the actual linguistic roots are more complex and likely stem from the Sanskrit word śarkarā, meaning “gravel” or “sugar crystals.” Ironically, sugar—once a symbol of sweetness and pleasure—played a central role in one of humanity’s darkest chapters: the transatlantic slave trade.

In 17th- and 18th-century London, coffeehouses became centres of political and philosophical dialogue, fuelled by coffee, tea, and cocoa—all bitter substances sweetened with sugar. The rising demand for sugar drove mass slavery, with millions of Africans kidnapped, sold, and forced to labour on plantations under brutal conditions. The true number of lives affected may never be known.

Human cruelty has recurred throughout history—extinction of species, oppression, murder, ecological destruction. Our dissatisfaction has driven both innovation and devastation. Climate change and environmental collapse are now results of centuries of viewing nature merely as a resource. This seemingly logical mindset has triggered nonlinear feedback loops we can no longer control.

Nonlinear processes—such as ecological collapse—don’t follow neat cause-and-effect paths. Small triggers can lead to large consequences. Our ability to cultivate our experience of interdependence through meditation practice may help us understand and respond to these challenges.
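A classic toy model of such nonlinearity (my own illustration, not something discussed in the post) is the logistic map, a one-line population model that becomes chaotic at high growth rates. In the sketch below, two trajectories that differ by only one part in ten million at the start end up far apart within a few dozen steps, which is exactly the sense in which small triggers can have large consequences.

```python
# Hypothetical illustration: the logistic map x' = r * x * (1 - x)
# in its chaotic regime (r = 4). Tiny differences in the starting
# value are amplified exponentially at each step.

def max_divergence(x0, y0, r=4.0, steps=50):
    """Iterate two trajectories side by side; return their largest gap."""
    x, y = x0, y0
    max_gap = 0.0
    for _ in range(steps):
        x = r * x * (1 - x)
        y = r * y * (1 - y)
        max_gap = max(max_gap, abs(x - y))
    return max_gap

# A one-in-ten-million difference in initial conditions...
gap = max_divergence(0.2, 0.2 + 1e-7)
print(gap > 0.1)  # ...is amplified into a macroscopic divergence
```

The point is not the specific numbers but the shape of the behaviour: in a linear system the gap would stay proportional to the initial difference, whereas here it grows until the two futures are effectively unrelated.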

But who are we, really? What makes us “us”? Are we truly unique?

Consider the Ship of Theseus. If every plank in a ship is eventually replaced, is it still the same ship? If every cell in our body is replaced over time, are we still the same person?

Imagine a teleportation device on Mars. It scans your body and transmits the data to Earth, where a perfect copy is reconstructed. After successful teleportation, you can choose whether the original “you” on Mars is destroyed. But which one is really you? The teleported copy on Earth or the original on Mars?

Are we just a fleeting arrangement of atoms that briefly feels like a “self”? Our cells are replenished with food and expelled through waste. If our sense of self is rooted in this ever-changing matter, then our uniqueness—and perhaps even our suffering—may be far more fragile than we think.

Conclusion

Dissatisfaction is a fundamental part of human life—rooted in illusion, fear, and longing. Whether viewed through the lens of ancient Buddhist philosophy or modern existential thought, the sense of lack cannot be fulfilled through materialism, achievement, or even reason alone. Instead, it calls for a deeper change in our awareness of the self and its impermanence. As paradoxical as it may seem, liberation from dissatisfaction may lie not in solving it, but in understanding and integrating it into our way of being.


References

Loy, D. R. (2018). Lack & transcendence: The problem of death and life in psychotherapy, existentialism, and Buddhism (2nd ed.). New York: Simon & Schuster.
Walpola, R. (1967). What the Buddha taught. Bedford: Gordon Fraser.

Zen and the Art of Dissatisfaction – PART 8.


The Self Illusion: Why We’re Never Quite Satisfied
Why do we so often feel like something is missing in our lives? That quiet, persistent itch that if only we had this or changed that, we might finally be at peace. American philosopher David Loy argues that this dissatisfaction stems from a fundamental sense of inner lack—a feeling that we are somehow incomplete. But what if that very notion of incompleteness is built on a psychological illusion? In today’s blog, I’ll explore the deep roots of the self, or more precisely, the self illusion, from perspectives across philosophy, Buddhism, psychology, and neuroscience.

Originally published in Substack: https://substack.com/home/post/p-160701201

The Self Illusion

In his marvellous book Lack & Transcendence (2018), the American philosopher and Zen teacher David Loy suggests that the feeling of dissatisfaction in human life stems from a never-ending internal craving—or sense of lack. This sense of lack arises from the feeling that we must fulfil some need in order to make our inner self more stable or complete. We believe that satisfying this need will resolve our fundamental problems. However, according to Loy, this lack cannot truly be satisfied or solved, as it has no concrete foundation.

This deep-seated sense of something missing—something believed to be the key to our happiness—stems from a concept known in psychology as the “self illusion,” and in Buddhism it is formalized in the teaching of non-self (Pali: anattā, Sanskrit: anatman), which asserts that there is no unchanging, permanent self, but rather a constantly shifting flow of experiences, sensations, and mental formations. This idea suggests that human psychology is troubled by the uncertain belief that we possess a concrete, stable, immovable—even eternal—inner self. In reality, this “self” is merely an illusion, constructed by various psychological processes and lacking any true anchor or fixed substance. As David Loy suggests, this inner self is inherently dissatisfied, constantly demanding that we fulfil its desires in countless ways.

In everyday language, this inner self is often referred to by the Freudian term ego, though we might just as well call it the self. Nothing is more dissatisfied than our ego. In Freudian theory, the ego is one part of a dynamic system made up of the id, superego, and ego itself—each representing different aspects of our psyche: the pleasure-seeking id, the socially-minded superego, and the ego, which seeks realistic and balanced compromises between the two.

Modern neuropsychology suggests that the prefrontal cortex regulates these impulses of selfhood. In newborns, this region is underdeveloped, which explains their reactive behaviour. For individuals with Tourette’s syndrome, this regulatory function is partially impaired, contributing to difficulties in social interaction. Social situations often demand an inner struggle for conformity, and for someone with Tourette’s, the stress of adapting to social norms can trigger tics—physical manifestations of the effort to conform.

In the early 1900s, American sociologist Charles Horton Cooley (1902) introduced the concept of the “looking-glass self.” It refers to how our self-concept is shaped by how others perceive us. Essentially, our autobiographical sense of self is a narrative built from the perspectives and impressions of those we’ve encountered. Cooley argues that we view our lives as a series of events in which we are the protagonist, shaped by the actions and opinions of others.

According to Cooley, people see us in their own ways. This explains why public figures often complain that no one truly understands them. But Cooley argues there is no “true self” behind these perceptions. In reality, we are precisely what others see us as—even if it’s difficult for us to accept their views of us.

In Jungian psychology, the term “self” refers to the unification of the conscious and unconscious mind. It represents the totality of the psyche and manifests as a form of individual consciousness.

Many religions embed similar ideas into the concept of a permanent and immortal soul, which continues to exist beyond physical death and, in some traditions, reincarnates. Buddhism, however, challenged the Hindu notion of a permanent, reincarnating self (attā) with the doctrine of anattā (non-self), one of its foundational principles.

For clarity, when this text refers to the self, ego, or a permanent identity, it means essentially the same thing. When necessary, I also use terms like “brain talk,” “inner voice,” or “internal dialogue,” as this is often how this particular psychological phenomenon manifests. According to American neuropsychologist Chris Niebauer (2019), this process is more verb than noun—there is no tangible self, only the experience of self, created by mental processes that produce the inner speech and feelings that influence our behaviour.

From a neuropsychological perspective, the concept of self and the often-dissatisfied “brain talk” it generates might originate from processes such as these: the left and right hemispheres of the brain are responsible for slightly different aspects of interpreting the sense of self and the world—a phenomenon known as hemispheric asymmetry. The processes responsible for the illusion of a fixed self are thought to reside in the language centres of the left hemisphere.

Modern brain scans have shown that when the brain is not engaged in any task, a specific neural system called the default mode network (DMN) becomes active. During such moments, our thoughts wander to self-related concerns, memories, anxieties, and hopes for the future. This inner activity is believed to amplify our brain talk and, when overactive, can turn against us—making us feel as though the world is against us.

Michael Gazzaniga, a pioneer in cognitive neuroscience, demonstrated in 1967 that the brain’s hemispheres perform surprisingly different roles. The left hemisphere processes language, categories, logic, and narrative structure. It loves categorisation, dividing things into right and wrong, good and bad. The right hemisphere, on the other hand, is responsible for spatial awareness, bodily sensations, and intuition.

The left hemisphere is believed to construct the narrative of a permanent self—with a beginning, middle, and imagined future. It also creates a static image of our physical selves—often distorted in relation to current social norms and ideals. The right hemisphere, however, perceives our boundaries as more fluid and sees us as one with the timeless world of oneness. It’s the source of empathy, compassion, and a sense that the well-being of others is bound up with our own. It functions almost like a spiritual organ—like Star Wars’ Yoda reminding us that we are luminous beings, not this crude matter.

The theory of hemispheric asymmetry is controversial, and it is commonly misunderstood in popular culture. People are not left-brained or right-brained in any rigid sense; both hemispheres contribute to self-perception in unique ways. Studying this phenomenon empirically is difficult without harming subjects.

Fortunately—or unfortunately—neuroscientist Jill Bolte Taylor (2008) experienced a stroke that silenced her left hemisphere. She described her hand turning transparent and merging with the energy around her. She felt an overwhelming joy and silence in her internal dialogue. Though the stroke was traumatic and left her disabled for several years, she eventually recovered and shared her important insights.

Taylor wrote that her sense of self changed completely when she no longer perceived herself as a solid being. She felt fluid. She describes how everything around us, within us, and between us is composed of atoms and molecules vibrating in space. Even though the language centres of our brain prefer to define us as individual, solid entities, we are actually made up of countless cells and mostly water—and are in a constant state of change.

Beyond the language centres of the left hemisphere, the default mode network is another central component in producing internal dialogue. When scientists perform fMRI scans, they often begin by mapping the resting state of a participant’s brain. Marcus Raichle, a neurologist at Washington University, discovered that when participants were asked to do nothing, several brain regions actually became more active (Raichle et al., 2001). He named this the “default mode network” (DMN).

This network activates when there are no external tasks—when we are “just waiting.” It’s when our minds wander freely, contemplating ourselves, others, the past, and the future. It may even be the source of the continuous stream of consciousness we associate with our inner world.

The DMN is central to self-reflection. It kicks in when we think about who we are and how we feel. It’s also involved in social reflection—thinking not just of ourselves but others as well. Concepts like empathy, morality, and social belonging stem from this same process.

The DMN also stirs up memory. It plays a vital role in recalling past events and helps construct the narrative we tell about ourselves—those vivid, personal moments like when our father left, or we met our partner, or our child was born. The network also activates when we think about the future, dream, or fear what may come.

When our mind is at rest, it spins a self-protective, often conservative inner dialogue full of dreams, fears, regrets, and desires. This rarely produces contentment with the present moment. But why call it a dialogue? Isn’t there just one voice in our head? Shouldn’t it be a monologue? Apparently not—our inner speech behaves as if it were talking to someone else. For instance, when we are alone looking for our lost keys and finally find them, we might exclaim, “Yes! Found them!” as if others were present. Our brain talk evolved alongside our spoken language, which is inherently dialogical.

Our inner dialogue often wanders to the past and future, where it finds plenty of material for dissatisfaction. From the past, it dredges up nostalgia, regret, and bitterness. From the future, it conjures hopes, dreams, and fears—overdue bills, home repairs, environmental collapse, health concerns, or children preparing to leave home.

Zen teacher Grover Genro Gauntt once described his first experience of noticing this inner voice. In the 1970s, as a new Zen practitioner, he listened to Japanese master Taizan Maezumi speak of this constant dialogue and the importance of not identifying with it. A dialogue promptly popped into Genro’s mind: “What is this guy talking about? I don’t have any internal dialogue!” That moment captures the tragicomic nature of the mind’s attempts to deny its own patterns—like trying not to think of a pink elephant.

Jill Bolte Taylor also writes that one of the key roles of the left hemisphere (which she experienced as temporarily malfunctioning) is to define the self by saying, “I am.” Through what she calls “brain talk,” our minds replay autobiographical events to keep them accessible in memory. Taylor locates the “self center” specifically in the language areas of the left hemisphere. It’s what allows us to know our own name, roles, abilities, skills, phone number, Social Security number, and home address. From time to time, we need to be able to explain to others what makes us who we are—such as when the police ask to see identification.

Taylor writes that unlike most cells in the body, our brain neurons don’t regenerate unless there’s a specific need for it. All our other cells are in constant flux and in dialogue with the outside world. Taylor postulates that the illusion of a permanent self might arise from this neurological exception. We feel like we remain the same person throughout life because we spend our entire lives with the same neurons. 

However, the atoms and molecules that form our neurons do change over time. Everything in our bodies is in constant flux. Nothing is permanent, not even the matter that constitutes our neurons. Maybe this is the unhappy psychological reality: we believe in a permanent, unchanging self, obey its internal commands—and are therefore perpetually dissatisfied. This dissatisfaction stems from the deep emptiness that our inner dialogue continuously generates.

Our belief in a fixed self begins in childhood, when we first conceive of ourselves as separate from our parents and the outside world. This inner dialogue shapes behaviour, guiding our decisions for survival and well-being. The left hemisphere’s language centres negotiate with us—when to eat, what to crave, and how to avoid social pain or getting hit by a train. But we also need the awareness of the eternal and oneness conjured by the right hemisphere. Our life would not make any sense without it. 

Conclusion

What emerges from this exploration is a realisation: our sense of a permanent self is not a solid truth but a mental construct. It’s a story told by our brain, particularly the left hemisphere, supported by cultural narratives and social feedback. This illusion, while useful for navigating daily life, is also the root of our chronic dissatisfaction. However, perhaps the greatest relief lies in understanding that we are not trapped by this narrative. As Buddhist teachings and modern neuroscience suggest, loosening our grip on the idea of a fixed self may open the door to deeper peace, compassion, and freedom. There is so much more to ourselves than our inner voice is telling us. This voice is mostly trying to prevent accidents and embarrassment, but there’s more to our true selves than that. 


Resources

  • Cooley, C. H. (1902). Human nature and the social order. New York: Scribner’s.
  • Gazzaniga, M. S. (1967). The split brain in man. Scientific American, 217(2), 24–29.
  • Loy, D. R. (2019). Ecodharma: Buddhist teachings for the precipice. Somerville: Wisdom Publications.
  • Loy, D. R. (2018). Lack & transcendence: The problem of death and life in psychotherapy, existentialism, and Buddhism. Second edition. New York: Simon & Schuster.
  • Niebauer, C. (2019). No self, no problem: How neuropsychology is catching up to Buddhism. Hierophant Publishing.
  • Raichle, M. E. et al. (2001). A default mode of brain function. Proceedings of the National Academy of Sciences, 98(2), 676–682.
  • Taylor, J. B. (2008). My stroke of insight: A brain scientist’s personal journey. New York: Plume.

Zen and the Art of Dissatisfaction – Part 7

Voices Within – Exploring the Inner Dialogue

Originally published in Substack: https://substack.com/inbox/post/160343816

The Canadian-born British experimental psychologist and philosopher Bruce Hood specialises in human psychological development and cognitive neuroscience. Hood works at the University of Bristol, and his research focuses on intuitive theories, the sense of self, and the cognitive processes underlying adult magical thinking. In his book The Self Illusion: Why There Is No ‘You’ Inside Your Head (2012), Hood argues that our internal dissatisfaction stems from a form of psychological uncertainty. It is still very common to think that some kind of internal self or soul is the core that separates humans from other animals. It is also common to think that after the human body dies, this core continues to live forever through some form of reincarnation, either here or in a parallel dimension. Our understanding of the internal self does not arise from nothing; it is the result of a long developmental process that takes time to build. According to Hood, the self is an illusion because it has no permanent anchor or form, yet people experience it as very real and often claim it to be the essence that makes us who we are.

In neuroscience, human consciousness is often divided into several conceptual components that together form consciousness as a whole. The first is awareness, referring to whether we are awake or not; when we are asleep, we are in a mild and temporary state of unconsciousness. The second significant concept is attention, which moves between different activities depending on what requires our focus at the moment. The third is experiential consciousness, which covers subjective experiences occurring within ourselves, such as how salt tastes or the sensation the colour red evokes. The fourth is reflective consciousness. If something happens to us at the level of experiential consciousness, we begin to consciously ponder how we should act. For instance, if we hammer our finger with a mallet, the experience immediately enters our experiential consciousness as a very intense sensation, but almost simultaneously the same event jumps into our reflective consciousness, where we start weighing the severity of the injury and what we should do to ease the pain and prevent further calamity. Should I cry for help? Should I go to the hospital? Or shall I just take a photo and post it on Instagram? Conscious thought flows in this way.

Reflective consciousness puts experience into words and leads to conscious thinking, which is characterised by the inability to think about more than one thing at a time. One important form of conscious thought is self-awareness, which also involves awareness of our own body. The concept of self represents conscious thought and is formed by beliefs and thoughts about an individual’s personal history, identity, and future plans.

Self-awareness is also referred to as the self or the sense of self, a concept I have used and will continue to use in my writings. However, it’s important to note that this term does not refer to identity. The same issue is often explored in the fields of psychology and sociology.

The sense of self can be seen as an evolutionary feature that helps the organism stay alive. It makes organisms feel that they are very important, more important than anyone or anything else. However, this sense of self can also transform, over time and through adverse experience, into a process that turns against itself. Such processes are thought to underlie conditions like severe depression, in which the sense of self has gone into a deep rut and runs through endless loops of self-loathing.

American journalist and Harvard professor Michael Pollan writes in his book How to Change Your Mind (2019) that modern psychedelic therapies have shown promising results for patients with depression. Pollan describes the work of British neuroscientist and psychologist Robin Carhart-Harris (2010), who has researched the role of the brain’s default mode network (DMN) in the formation of the self, ego, or sense of self. The DMN is a neurological process that switches on when a person is not engaged in goal-directed activity, and it has been linked to the formation of the sense of self.

The human experience of the self is a biographical anchor created by multiple overlapping neural processes in our brain. We get the feeling that everything that happens in my life happens to me. The self is that which experiences all things; that inner centre is significant particularly to me. Without our internal awareness and experience of the self, we would never have conceived of the Universal Declaration of Human Rights, drafted by a UN committee chaired by Eleanor Roosevelt, which protects a person’s right to physical integrity. We believe that every human being is unique and valuable because we all have an inner sense of self.

However, humans are not the only animals with some form of internal self-awareness. When visiting the London Zoo in 1838, Charles Darwin (1809–1882) saw an orangutan named Jenny becoming upset when a keeper teased her with an apple. This made Darwin reflect on the orangutan’s subjective experience. He noticed Jenny looking into a mirror and wondered if she recognised herself in the reflection.

American psychologist Gordon G. Gallup (1970) experimentally studied self-recognition in two male and two female wild chimpanzees, none of whom had previously seen a mirror. Initially, the chimpanzees made threatening gestures in front of the mirror, perceiving their reflections as threats. Eventually, they began using the mirror for self-directed behaviours, such as grooming parts of their bodies they couldn’t see without it, picking their noses, grinning, and blowing bubbles at their reflections.

Bruce Hood (2012) writes that this process is crucial in human development because, without it, humans would struggle in socially challenging and complex environments. Human children fail the mirror test until around 18 months of age. Before this, they may think the reflection is of another child and look behind the mirror. However, by 18 months, they understand that the person reflected in the mirror is themselves. But humans and chimpanzees are not the only animals to pass the mirror test. Crows, dolphins, and orcas also pass the test. Elephants do too, but cats do not (although my guess is that cats could pass the test if they wanted).

The sensation of self created by the human mind, which feels like a concrete structure and is referred to as the self, ego, or “I,” is a process aimed at protecting us from both internal and external threats. When everything functions as it should, our inner narrator keeps the organism on track, helping it achieve its goals and meet its needs, especially eating, seeking shelter, and reproducing. This process works well under normal circumstances, but it is inherently conservative. Our experience of the self is a process, not a fixed entity, though it often feels like one. It emerges as a result of various mental functions and manifests as an internal narrator, or even as an internal dialogue.

The dialogue generated by the self often sounds like someone is explaining things to us, as if to a blind person, about what’s happening around us. We enter a room and might hear someone say inside our mind, “Look, what a nice place this is! Those wallpapers are beautiful, and the furniture is great, but those electrical outlets need replacing!”

Sometimes, we might hear an internal negotiation, such as whether to run through a red light to catch a tram. Running through traffic might put us in physical danger or cause us to be socially judged. Social shame is one of the worst things a person can experience, and our internal narrator picks up on such details immediately and warns us to at least consider the possibility. At times, our narrator can turn into an internal tyrant, turning its energy against us.

This narrator, or brain talk, sounds very reasonable, but it often shows how our minds are trying to preserve the structures formed earlier, built from previous experiences. Unfortunately, sometimes we’re left with that inner narrator and nothing else, which can leave one feeling out of place. And when this narrator becomes rigid and inflexible, it has the power to push us into states of psychological distress, even driving us into despair.

In cases where brain talk gets stuck in repetitive loops, as is often the case with anxiety, depression, or psychosis, people feel their lives are determined by this narrator, an inner force living inside one’s head. A stuck self may feel isolated in its inner world and find it impossible to reach outside. The idea of having self-awareness — of being someone in this world — becomes crushed under the weight of these loops. For some, it is as if the voice of the mind becomes detached from the physical person, forcing it into another dimension where everything becomes dark and disconnected from the social world.

American author David Foster Wallace (1962–2008), who had much experience of this process, reminded us in his commencement speech at Kenyon College in 2005 of the old cliché that the mind is an excellent servant, but a terrible master. However, this cliché expresses a terrible truth. According to Wallace, it is no coincidence that people who commit suicide with firearms almost always shoot themselves in the head. They are shooting that terrible narrator turned into a master — a terrible dark lord. 

Mind out of a Dolmio pasta sauce commercial

People experience their brain talk in a unique and private way. Most of us have some form of inner voice. A voice that guides, directs, and commands us. A voice that warns “Watch out! Car!” or “Remember to buy toilet paper.” For many of us, this voice sounds like our own, but for some people, the inner narrator is not a straightforward speech that scolds, advises, or reminds them of things. For some, brain talk may take the form of an arguing Italian couple or a calm interviewer. Or it may not be a voice at all, but a taste, feeling, or colour. In some cases, there is no voice at all, only deep and calm silence.

English journalist Sirin Kale (2021) wrote an interesting article on this internal narrator, presenting a few rare examples of different types of inner voices. One of the people interviewed for the article, a 30-year-old English woman named Claudia, hears her inner dialogue in a unique way. Claudia has never been to Italy, nor does she have Italian family or friends. She has no idea why the loud, arguing Italian couple has taken over her inner voice. Claudia says, “I have no idea where this came from. It’s probably offensive to Italians.” The arguing couple in Claudia’s mind sounds like something straight out of a Dolmio pasta sauce commercial. They are expressive and prone to waving their hands and shouting. When Claudia needs to make a decision in her life, this Italian couple takes the reins.

The Italian couple living inside Claudia’s mind argues passionately about almost anything. Claudia finds it very helpful because they do all the work for her. The couple is always in the kitchen and surrounded by food. Claudia has not yet named her Italians, but they have helped her make important decisions, including encouraging her to quit her job and pursue her lifelong dream of going to sea.

Kale writes that the Italian woman in Claudia’s mind supported her resignation, but her husband was more cautious. The Italian man said, “It’s a stable job!” and the woman responded, “Let her enjoy life!” The woman won, and Claudia left for a job on the seas in Greece. Overall, this Italian couple has helped Claudia live a happier life, and they’ve even calmed down a bit. Claudia says, “Less shouting. They just argue now.”

Dr Helene Loevenbruck of Grenoble Alpes University, mentioned in the article, claims that brain talk arises in the same way our thoughts turn into actions: our brains predict the consequences of actions. The same principle of prediction also applies to human speech. When we speak, our brains create a predictive simulation of the speech in our minds to correct any potential mistakes. The inner voice is thought to arise when our minds plan verbalised actions but decide not to send motor commands to the speech muscles. Loevenbruck says this simulated auditory signal is the small voice we hear in our minds. She explains that for the most part we experience something she refers to as inner language, a more comprehensive term for the phenomenon: people with hearing impairments, for example, do not hear an inner voice but might see sign language or observe moving lips (Loevenbruck et al., 2018).

In exploring the sense of self and inner voice, we’ve seen how the self emerges as a process rather than a fixed entity. It is shaped by our own evolution, culture, and personal experience. Our brain talk can guide us, deceive us, or even take on unexpected forms and destroy us, yet it remains central to our sense of identity. It feels like the core for which everything happens. But is the self really real? And if not, if the self is an illusion, as neuroscientists and psychologists suggest, what does that mean for how we live? In the next post, I’ll dive into Buddhist perspectives on the self—examining how centuries-old wisdom aligns with modern psychological insights.


Resources:

Carhart-Harris, R. L., & Friston, K. J. (2010). The default-mode, ego-functions and free-energy: A neurobiological account of Freudian ideas. Brain, 133(4), 1265–1283.

Gallup, G. G. (1970). Chimpanzees: Self-recognition. Science, 167(3914), 86–87.

Hood, B. (2012). The self illusion: How the social brain creates identity. HarperCollins Publishers.

Kale, S. (2021). The last great mystery of the mind: Meet the people who have unusual – or non-existent – inner voices. The Guardian, 25 Oct 2021. <https://www.theguardian.com/science/2021/oct/25/the-last-great-mystery-of-the-mind-meet-the-people-who-have-unusual-or-non-existent-inner-voices> Link visited 1 April 2025.

Loevenbruck, H. et al. (2018). A cognitive neuroscience view of inner language: To predict, to hear, to see and to feel. In P. Langland-Hassan & A. Vicente (Eds.), Inner Speech: New Voices. Oxford University Press, 131–167.

Pollan, M. (2019). How to change your mind: The new science of psychedelics. Penguin Books.