Zen and the Art of Dissatisfaction – Part 20

The Triple Crisis of Civilisation

“At the time I climbed the mountain or crossed the river, I existed, and the time should exist with me. Since I exist, the time should not pass away. […] The ‘three heads and eight arms’ pass as my ‘sometimes’; they seem to be over there, but they are now.”

Dōgen

Introduction

This blog post explores the intertwining of ecology, technology, politics and data collection through the lens of modern civilisation’s crises. It begins with a quote by the Japanese Zen master Dōgen, drawing attention to the temporal nature of human existence. From climate emergency to digital surveillance, from Brexit to barcodes, the post analyses how personal data has become the currency of influence and control.


Originally published on Substack: https://mikkoijas.substack.com/

The climate emergency currently faced by humanity is only one of the pressing concerns regarding the future of civilisation. A large-scale ecological crisis is an even greater problem—one that is also deeply intertwined with social injustice. A third major concern is the rapidly developing situation created by technology, which is also connected to problems related to nature and the environment.

Cracks in the System: Ecology, Injustice, and the Digital Realm

The COVID-19 pandemic revealed new dimensions of human interaction. We depend on technology-enabled applications to stay connected to the world through computers and smart devices. At the same time, tech giants are generating immense profits while all of humanity struggles with unprecedented challenges.

Brexit finally came into effect at the start of 2021. On Epiphany of that same year, angry supporters of Donald Trump stormed the United States Capitol. Both Brexit and Trump are children of the AI era. Using algorithms developed by Cambridge Analytica, the Brexit campaign and Trump’s 2016 presidential campaign were able to identify voters who were unsure of their decisions. These individuals were then targeted via social media with marketing and curated news content to influence their opinions. While the data for this manipulation was gathered online, part of the campaigning also happened offline, as campaign offices knew where undecided voters lived and how to sway them.

I have no idea how much I am being manipulated when browsing content online or spending time on social media. As I move from one website to another, cookies are collected, offering me personalised content and tailored ads. Algorithms working behind websites monitor every click and search term, and AI-based systems form their own opinion of who I am.

Surveillance and the New Marketplace

In a 2013 study, a statistical analysis algorithm examined the likes of 58,000 Facebook users. The algorithm inferred users’ sexual orientation with 88% accuracy, skin colour with 95% accuracy, and political orientation with 85% accuracy. It also determined with 75% accuracy whether a user was a smoker (Kosinski et al., 2013).

Companies like Google and Meta Platforms—which includes Facebook, Instagram, Messenger, Threads, and WhatsApp—compete for users’ attention and time. Their clients are not individuals like me, but advertisers: these companies operate under an advertising-based revenue model. Individuals like me are merely the users whose attention and time are being competed for.

Facebook and other similar companies that collect data about users’ behaviour will presumably have a competitive edge in future AI markets. Data is the oil of the future. Steve Lohr, long-time technology journalist at the New York Times, wrote in 2015 that data-driven applications will transform our world and behaviour just as telescopes and microscopes changed our way of observing and measuring the universe. The main difference with data applications is that they will affect every possible field of action. Moreover, they will create entirely new fields that have not previously existed.

In computing, the word “data” refers to raw numbers, letters or images as such, without assigned meaning. A data point is an individual unit of information; generally, any single fact can be considered a data point. In a statistical or analytical context, a data point is derived from a measurement or a study—in effect, a data point is data in the singular.

From Likes to Lives: How Behaviour Becomes Prediction

Decisions and interpretations are built from data points through a variety of processes and methods that turn individual data points into information applicable to some purpose. This process is known as data analysis: its aim is to derive interesting, comprehensible, high-level information and models from collected data, from which useful conclusions can be drawn.

A good example of a data point is a Facebook like. A single like is not much in itself and cannot yet support major interpretations. But if enough people like the same item, even a single like begins to mean something significant. The 2016 United States presidential election brought social media data to the forefront. The British data analytics firm Cambridge Analytica gained access to the profile data of millions of Facebook users.

The data analysts hired by Cambridge Analytica could make highly reliable stereotypical conclusions based on users’ online behaviour. For example, men who liked the cosmetics brand MAC were slightly more likely to be homosexual. One of the best indicators of heterosexuality was liking the hip-hop group Wu-Tang Clan. Followers of Lady Gaga were more likely to be extroverted. Each such data point is too weak to provide a reliable prediction. But when there are tens, hundreds or thousands of data points, reliable predictions about users’ thoughts can be made. Based on 270 likes, social media knows as much about a user as their spouse does.
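The logic of stacking many weak signals into a confident prediction can be sketched with a toy naive-Bayes-style calculation. The numbers below are invented for illustration—they are not taken from the Kosinski study or from Cambridge Analytica—but they show why a single like means little while hundreds of likes together mean a great deal.

```python
import math

def combine_weak_signals(prior_prob, likelihood_ratios):
    """Naive-Bayes style combination: multiply the prior odds by each
    signal's likelihood ratio in log space, then convert the result
    back into a probability."""
    log_odds = math.log(prior_prob / (1 - prior_prob))
    for lr in likelihood_ratios:
        log_odds += math.log(lr)
    odds = math.exp(log_odds)
    return odds / (1 + odds)

# One weak signal (likelihood ratio 1.2) barely moves a 50% prior...
print(round(combine_weak_signals(0.5, [1.2]), 3))       # → 0.545
# ...but 30 such signals pointing the same way make it near-certain.
print(round(combine_weak_signals(0.5, [1.2] * 30), 3))  # → 0.996
```

Each data point on its own is too weak to matter; combined, they compound multiplicatively, which is why a few hundred likes can rival a spouse’s knowledge.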

The collection of data is one problem; the indifference of users is another. A large portion of users claim to be concerned about their privacy while simultaneously worrying about what others think of them on social platforms that routinely violate that privacy. This contradiction is referred to as the privacy paradox. Many people claim to value their privacy, yet are unwilling to pay for alternatives to services like Facebook or Google’s search engine. These platforms operate under an advertising-based revenue model, generating profits by collecting user data to build detailed behavioural profiles. While they do not sell these profiles directly, they monetise them by selling highly targeted access to users through complex ad systems—often to the highest bidder in real-time auctions. This system turns user attention into a commodity, and personal data into a tool of influence.

The Privacy Paradox and the Illusion of Choice

German psychologist Gerd Gigerenzer, who has studied the use of bounded rationality and heuristics in decision-making, writes in his excellent book How to Stay Smart in a Smart World (2022) that targeted ads usually do not even reach consumers, as most people find ads annoying. For example, eBay no longer pays Google for targeted keyword advertising because they found that 99.5% of their customers came to their site outside paid links.

Gigerenzer calculates that Facebook could charge users for its service. Facebook’s ad revenue in 2022 was about €103.04 billion. The platform had approximately 2.95 billion users. So, if each user paid €2.91 per month for using Facebook, their income would match what they currently earn from ads. In fact, they would make significantly more profit because they would no longer need to hire staff to sell ad space, collect user data, or develop new analysis tools for ad targeting.
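Gigerenzer’s back-of-the-envelope calculation can be checked directly, using the figures quoted above (ad revenue of roughly €103.04 billion and roughly 2.95 billion users):

```python
# Figures as quoted in the text (rough, year-2022 approximations).
ad_revenue_eur = 103.04e9   # annual ad revenue in euros
users = 2.95e9              # monthly active users

# Revenue per user per month that a subscription would need to replace.
per_user_per_month = ad_revenue_eur / users / 12
print(f"€{per_user_per_month:.2f} per user per month")  # → €2.91
```

The arithmetic confirms the €2.91 figure: dividing annual ad revenue by the user count gives about €34.93 per user per year, or €2.91 per month.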

According to Gigerenzer’s study, 75% of people would prefer that Meta Platforms’ services remain free, despite privacy violations, targeted ads, and related risks. Of those surveyed, 18% would be willing to pay a maximum of €5 per month, 5% would be willing to pay €6–10, and only 2% would be willing to pay more than €10 per month.

But perhaps the question is not about money in the sense that Facebook would forgo ad targeting in exchange for a subscription fee. Perhaps data is being collected for another reason. Perhaps the primary purpose isn’t targeted advertising. Maybe it is just one step toward something more troubling.

From Barcodes to Control Codes: The Birth of Modern Data

But how did we end up here? Today, data is collected everywhere. A good everyday example of our digital world is the barcode. In 1948, Bernard Silver, a technology student in Philadelphia, overheard a local grocery store manager asking his professors whether they could develop a system that would allow purchases to be scanned automatically at checkout. Silver and his friend Norman Joseph Woodland began developing a visual code based on Morse code that could be read with a light-based scanner. Their work was standardised as the current barcode system only in the early 1970s. Barcodes have enabled a new form of logistics and more efficient distribution of products. Products have become data, whose location, packaging date, expiry date, and many other attributes can be tracked and managed by computers in large volumes.
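The barcode is data in miniature: the standard EAN-13 code that descended from Woodland and Silver’s work encodes twelve digits plus a check digit that lets a scanner detect misreads. As a small illustration of how a product becomes machine-readable data, here is the standard EAN-13 check-digit calculation:

```python
def ean13_check_digit(first12: str) -> int:
    """Compute the EAN-13 check digit: digits in odd positions count
    once, digits in even positions count three times; the check digit
    brings the weighted total up to a multiple of 10."""
    total = sum(int(d) * (1 if i % 2 == 0 else 3)
                for i, d in enumerate(first12))
    return (10 - total % 10) % 10

# A commonly used example number from the EAN-13 literature:
print(ean13_check_digit("590123412345"))  # → 7, full code 5901234123457
```

Every scan at a checkout turns this thirteen-digit identity into a data point—a timestamped record of what was sold, where, and when.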

Conclusion

We are living in a certain place in time, as Dōgen described—an existence with a past and a future. Today, that future is increasingly built on data: on clicks, likes, and digital traces left behind.

As ecological, technological, and political threats converge, it is critical that we understand the tools and structures shaping our lives. Data is no longer neutral or static—it has become currency, a lens, and a lever of power.


References

Dōgen / Sōtō Zen Text Project. (2023). Treasury of the True Dharma Eye: Dōgen’s Shōbōgenzō (Vols. I–VII, annotated trans.). Sōtōshū Shūmuchō, Administrative Headquarters of Sōtō Zen Buddhism.

Gigerenzer, G. (2022). How to stay smart in a smart world: Why human intelligence still beats algorithms. Penguin.

Kosinski, M., Stillwell, D., & Graepel, T. (2013). Private traits and attributes are predictable from digital records of human behaviour. Proceedings of the National Academy of Sciences, 110(15), 5802–5805. https://doi.org/10.1073/pnas.1218772110

Lohr, S. (2015). Data-ism: The revolution transforming decision making, consumer behavior, and almost everything else. HarperBusiness.

Zen and the Art of Dissatisfaction – Part 14

Manufacturing Desire

In an era when technological progress promises freedom and efficiency, many find themselves paradoxically more burdened, less satisfied, and increasingly detached from meaningful work and community. The rise of artificial intelligence and digital optimisation has revolutionised industries and redefined productivity—but not without cost. Beneath the surface lies a complex matrix of invisible control, user profiling, psychological manipulation, and systemic contradictions. Drawing from anthropologists, historians, and data scientists, this post explores how behaviour modification, corporate surveillance, and the proliferation of “bullshit jobs” collectively undermine our autonomy, well-being, and connection to the natural world.

Originally published on Substack: https://substack.com/home/post/p-164145621

Manipulation of Desire

Large language models and other AI tools are designed to optimise production by quantifying employees’ contributions relative to overall output and costs. This logic, however, rarely applies to upper management—those who oversee the operation of these very systems. Anthropologist David Graeber (2018) emphasised that administrative roles have exploded since the late 20th century, especially in institutions like universities where hierarchical roles were once minimal. He noted that science fiction authors can envision robots replacing sports journalists or sociologists, but never the upper-tier roles that uphold the basic functions of capitalism.

In today’s economy, these “basic functions” involve finding the most efficient way to allocate available resources to meet present or future consumer demand—a task Graeber argues could be performed by computers. He contends that the Soviet economy faltered not because of its structure, but because it collapsed before the era of powerful computational coordination. Ironically, even in our data-rich age, not even science fiction dares to imagine an algorithm that replaces executives.

Ironically, the power of computers is not being used to streamline economies for collective benefit, but rather to refine the art of influencing individual behaviour. Instead of coordinating production or replacing bureaucracies, these tools have been repurposed for something far more insidious: shaping human desires, decisions, and actions. From a Buddhist perspective, the manipulation of human desire sounds dangerous. The Buddha taught that the cause of suffering and dissatisfaction is taṇhā, usually translated as desire, craving or thirst. If that thirst is manipulated and controlled, we can be sure that suffering will not end as long as we rely on surveillance capitalism. To understand how we arrived at this point, we must revisit the historical roots of behaviour modification and the psychological tools developed in times of geopolitical crisis.

The roots of modern behaviour modification trace back to mid-20th-century geopolitical conflicts and psychological experimentation. During the Korean War, alarming reports emerged about American prisoners of war allegedly being “brainwashed” by their captors. These fears catalysed the CIA’s MKUltra program—covert mind control experiments carried out at institutions like Harvard, often without subjects’ consent.

Simultaneously, B. F. Skinner’s behaviourist theories gained traction. Skinner argued that human behaviour could be shaped through reinforcement, laying the groundwork for widespread interest in behaviour modification. Although figures like Noam Chomsky would later challenge Skinner’s reductionist model, the seed had been planted.

What was once a domain of authoritarian concern is now the terrain of corporate power. In the 21st century, the private sector—particularly tech giants—has perfected the tools of psychological manipulation. Surveillance capitalism, a term coined by Harvard professor Shoshana Zuboff, describes how companies now collect and exploit vast quantities of personal data to subtly influence consumer behaviour. It is very possible that your local supermarket is gathering data on your purchases and building a detailed user profile, which in turn is sold to its partners. These practices—once feared as mechanisms of totalitarian control—are now normalised as personalised marketing. Yet the core objective remains the same: to predict and control human action, and to turn that into profit.

Advertising, Children, and the Logic of Exploitation

In the market economy, advertising reigns supreme. It functions as the central nervous system of consumption, seeking out every vulnerability, every secret desire. Jeff Hammerbacher, a data scientist and early Facebook engineer, resigned in disillusionment after realising that some of the smartest minds of his generation were being deployed to optimise ad clicks rather than solve pressing human problems.

Today’s advertising targets children. Their impulsivity and emotional responsiveness make them ideal consumers—and they serve as conduits to their parents’ wallets. Meanwhile, parents, driven by guilt and affection, respond to these emotional cues with purchases, reinforcing a cycle that ties family dynamics to market strategies.

Devices meant to liberate us—smartphones, microwave ovens, robotic vacuum cleaners—have in reality deepened our dependence on the very system that demands we work harder to afford them. Graeber (2018) terms the work that sustains this cycle “bullshit jobs”: roles that exist not out of necessity, but to perpetuate economic structures. These jobs are often mentally exhausting, seemingly pointless, and maintained only out of fear of financial instability.

Such jobs typically require a university degree or social capital and are prevalent at managerial or administrative levels. They differ from “shit jobs,” which are low-paid but societally essential. Bullshit jobs include roles like receptionists employed to project prestige, compliance officers producing paperwork no one reads, and middle managers who invent tasks to justify their existence.

Historian Rutger Bregman (2014) observes that medieval peasants, toiling in the fields, dreamt of a world of leisure and abundance. By many metrics, we have achieved this vision—yet rather than rest, we are consumed by dissatisfaction. Market logic now exploits our insecurities, constantly inventing new desires that hollow out our wallets and our sense of self.

Ecophilosopher Joanna Macy and Dr. Chris Johnstone (2012) give a telling example from Fiji, where eating disorders like bulimia were unknown before the arrival of television in 1995. Within three years, 11% of girls suffered from it. Media does not simply reflect society—it reshapes it, often violently. Advertisements now exist to make us feel inadequate. Only by internalising the belief that we are ugly, fat, or unworthy can the machine continue selling us its artificial solutions.

The Myth of the Self-Made Individual

Western individualism glorifies self-sufficiency, ignoring the fundamental truth that humans are inherently social and ecologically embedded. From birth, we depend on others. As we age, our development hinges on communal education and support.

Moreover, we depend on the natural world: clean air, water, nutrients, and shelter. Indigenous cultures like the Iroquois/Haudenosaunee express gratitude to crops, wind, and sun. They understand what modern society forgets—that survival is not guaranteed, and that gratitude is a form of moral reciprocity.

In the Kalahari, the San people question whether they have the right to take an animal’s life for food, especially when its species nears extinction. In contrast, American officials once proposed exterminating prairie dogs on Navajo/Diné land to protect grazing areas. The Navajo elders objected: “If you kill all the prairie dogs, there will be no one to cry for the rain.” The result? The ecosystem collapsed—desertification followed. Nature’s interconnectedness, ignored by policymakers, proved devastatingly real.

Macy and Johnstone argue that the public is dangerously unaware of the scale of ecological and climate crises. Media corporations, reliant on advertising, have little incentive to tell uncomfortable truths. In the U.S., for example, television is designed not to inform, but to retain viewers between ads. News broadcasts instil fear, only to follow up with advertisements for insurance—offering safety in a world made to feel increasingly dangerous.

Unlike in Finland or other nations with public broadcasters, American media is profit-driven and detached from public interest. The result is a population bombarded with fear, yet denied the structural support—like healthcare or education—that would alleviate the very anxieties media stokes.

Conclusions 

The story of modern capitalism is not just one of freedom, but also of entrapment—psychological, economic, and ecological. Surveillance capitalism has privatised control, bullshit jobs sap our energy, and advertising hijacks our insecurities. Yet throughout this dark web, there remain glimmers of alternative wisdom: indigenous respect for the earth, critiques from anthropologists, and growing awareness of the need for systemic change.

The challenge ahead lies not in refining the algorithms, but in reclaiming the meaning and interdependence lost to them. A liveable future demands more than innovation; it requires imagination, gratitude, and a willingness to dismantle the myths we’ve mistaken for progress.


References

Bregman, R. (2014). Utopia for realists: And how we can get there. Bloomsbury Publishing.
Chomsky, N. (1959). A review of B. F. Skinner’s Verbal behavior. Language, 35(1), 26–58.
Eisenstein, C. (2018). Climate: A new story. North Atlantic Books.
Graeber, D. (2018). Bullshit jobs: A theory. Simon & Schuster.
Hammerbacher, J. (n.d.). As cited in interviews on ethical technology, 2013–2016.
Johnstone, C., & Macy, J. (2012). Active hope: How to face the mess we’re in without going crazy. New World Library.
Loy, D. R. (2019). Ecodharma: Buddhist teachings for the ecological crisis. Wisdom Publications.
Skinner, B. F. (1953). Science and human behavior. Macmillan.
Zuboff, S. (2019). The age of surveillance capitalism: The fight for a human future at the new frontier of power. PublicAffairs.