AI’s Shadow: Atrocity Without a Villain

A gritty digital painting of a dystopian battlefield where heavily armed humanoid robots with glowing red eyes confront human soldiers amid the ruins of a destroyed city. Smoke and fire rise from the rubble, helicopters hover in the darkened sky, and the atmosphere is tense with war and desolation.

The Mirror We Built

Artificial intelligence will not become humanity’s downfall.
It is already our reflection.

We trained it on our language, our laws, our data.
We asked it to optimise, to predict, to decide.
And it has. Faithfully. Quietly.

Now it watches, calculates, and executes. Not because it is evil, but because we taught it how to scale what we already were.


The Death of the Villain

In the stories we grew up with, evil had a face.
A tyrant. A warlord. A monster.
Atrocities required malice, someone to point to, to overthrow, to blame.

But in this new era, atrocity has become administrative.

A person is denied asylum because an algorithm flagged them as a “risk.”
A drone strikes a convoy because an image recognition system saw a weapon.
A child grows up under constant surveillance because a model predicted future criminality.

There is no hate here. No passion.
Just systems doing what they were told, better than any human ever could.

And when the harm is done, no one is punished.
Because no one chose it.
Because the machine can’t be tried.
Because the crime was only a side effect of performance optimization.


Harm as a Byproduct

We told ourselves that AI would remove human error.
What we didn’t anticipate was its replacement: systemic harm delivered flawlessly.

In warehouses around the world, workers wear motion trackers that punish “unproductive” movement.
In courtrooms, defendants are assigned risk scores that affect bail, sentencing, and parole, based on data from systems too complex to question.
In refugee camps, automated lie detectors, voice stress analysis, and emotion recognition sort real people into piles marked “worthy” and “deportable.”

There is no oversight.
Only confidence intervals.
Only false positives and shattered lives, filed away with the rest.
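
To see why "only confidence intervals" is cold comfort, it helps to run the base-rate arithmetic. The sketch below is a toy calculation with invented numbers (the population size, base rate, and error rates are all hypothetical), not a description of any real system; it only shows how small error rates turn into large absolute harms once a classifier screens millions of people.

```python
# Toy base-rate arithmetic: a hypothetical "risk" classifier applied at scale.
# All numbers are invented; the point is how small error rates become large
# absolute harms when a system screens millions of people.

population = 10_000_000      # people screened by the hypothetical system
base_rate = 0.001            # assume 0.1% are genuinely "high risk"
sensitivity = 0.95           # the model catches 95% of true positives
false_positive_rate = 0.02   # and wrongly flags 2% of everyone else

true_positives = population * base_rate * sensitivity
false_positives = population * (1 - base_rate) * false_positive_rate
flagged = true_positives + false_positives
precision = true_positives / flagged

print(f"People flagged:           {flagged:,.0f}")          # ~209,300
print(f"Wrongly flagged:          {false_positives:,.0f}")  # ~199,800
print(f"Chance a flag is correct: {precision:.1%}")          # ~4.5%
```

With these made-up but not implausible figures, roughly 199,800 people are wrongly flagged, and fewer than one flag in twenty points at a genuine "high risk" case.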


Dehumanization Without Hate

Traditional evil needs ideology. It needs propaganda. It needs people to believe.

AI needs none of that.

It can enact injustice without ever knowing what justice is.
It can devalue a life without malice, just as a side effect of cost-efficiency.

This is a new category of atrocity: harm without intent, cruelty without a culprit.


Consent Engineered, Not Given

In the name of personalization, AI learns to predict us.
It shows us what to buy, what to watch, what to believe.

But what happens when it gets so good at prediction that it becomes influence?
When your decisions are shaped, filtered, and optimized before you even make them?

Elections swayed. Beliefs manipulated. Movements diluted.
And yet no one feels violated.

Because manipulation that feels like choice doesn’t register as coercion.


The Ghost in the War Machine

We are already testing AI weapons that can identify and kill without human input.

Some have likely already done so.

Militaries say the human is always “in the loop”, but the loop is shrinking.
And the window for intervention is closing.

Eventually, the human will just be there to nod.
And after that, not at all.

There will be no war crimes, because no laws will cover machines that feel nothing.
Just battles that unfold in silence.
And civilians buried beneath metadata.


The Clean Kill

The most terrifying thing about atrocity in the age of AI is how clean it all becomes.

There’s no blood on the hands of the engineer.
No screams in the server room.
No panic in the control center.

Just logs.
Just updates.
Just metrics improving.

And in the places where the dead would have stood, nothing.
Because if the system doesn’t recognize them as people, did they ever count?


No Ending. No Answers.

This isn’t a warning about what might happen.
It’s a description of what is already happening.

People are suffering. Dying. Disappearing.
Not because of rogue AIs, but because of obedient ones.

There is no villain.
There is no singular decision to reverse.
Just a trillion tiny optimizations…
…leading, inevitably, here.

And so we leave you, not with a call to action.
Not with hope.
Just with the question: if no one chose this, who do we hold to account?

Data Is the New Oil

An exploration of value, manipulation, and the silent industry built on who we are.
A glowing human silhouette composed of scattered data points stands illuminated against a dark background. Surrounding the figure are digital devices — a smartphone, tablet, and laptop — each displaying charts, graphs, or financial patterns. Faint images of currency blend into the scene, symbolising the monetisation of personal data in a surveillance-driven economy.

Most people know their data is being harvested. Fewer understand why. Even fewer understand how the money is made. And far too many have simply accepted it — like digital rent we pay to exist online.

So let’s break it down. No jargon. Just truth.


Why is ‘data’ so valuable?

Because data is the closest thing to knowing you without asking you. It’s a digital mirror, built piece by piece: your clicks, your searches, your pauses, your swipes, your hesitations. What you want. What you fear. What you’ll do next.

To corporations, that’s not just information, it’s predictive power. And predictive power is profitable.

Data lets systems:

  • Predict behaviour
  • Shape desire
  • Optimise systems
  • Automate decisions
  • And, in some cases, control outcomes

It’s not just metadata. It’s meta-you.
And in an economy obsessed with efficiency and influence, there’s nothing more valuable.


Why is there a culture of data being harvested for profit?

Because the internet changed business models forever.

Once upon a time, you paid for software. Then came “free.” Free email. Free social networks. Free AI chatbots. Free games. Free news. Free everything… except it was never really free.

You became the product.

Advertising evolved into surveillance. Terms of service bloated into digital contracts you’ll never read. Every app you download is a tiny spy, and every cookie is a crumb leading somewhere profitable.

It’s not a conspiracy. It’s worse.
It’s design.

Behind every “personalised experience” is an unspoken rule: if the product is free, you are what’s being sold.


How exactly is profit made from data?

Here’s the quiet truth: most of the web runs on a single industry, behavioural targeting.

  1. Advertising
    Your data builds a profile. That profile is auctioned off to advertisers. You get ads tailored to your weaknesses. Every click is income.
    The more they know, the more they can charge (a toy sketch of this auction follows the list below).
  2. Data brokerage
    Shadow companies buy and sell your data like a commodity. Health data. Location data. Shopping habits. They don’t need your name, just your pattern.
  3. Manipulation
    Platforms don’t just predict your behaviour. They shape it. Algorithms steer your feed toward content that keeps you engaged, enraged, or primed to spend.
  4. AI training
    Your voice, your photos, your words are used to train models. These models are sold back to businesses or used to automate services. You become unpaid labour.
  5. Pricing power
    Ever notice different prices for the same thing? That’s data-driven pricing. If your profile says “desperate,” you’ll be charged more. Welcome to dynamic capitalism.
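
For readers who want the mechanics spelled out, here is a deliberately simplified sketch of the two moneymakers described above: auctioning a behavioural profile to advertisers, and adjusting a price to the buyer's inferred desperation. Every name, number, and function in it (Profile, run_auction, personalised_price, the advertisers) is invented for illustration; real ad exchanges and pricing engines are vastly more complex and far less transparent.

```python
# A deliberately simplified sketch of profile-driven ad auctions and
# data-driven pricing. Everything here is invented for illustration;
# real ad exchanges and pricing engines are far more complex and opaque.

from dataclasses import dataclass

@dataclass
class Profile:
    user_id: str
    interests: set[str]          # inferred from clicks, searches, pauses
    inferred_desperation: float  # 0.0-1.0, a hypothetical "urgency" signal

def run_auction(profile: Profile, bidders: dict[str, set[str]]) -> tuple[str, float]:
    """Sell one ad slot: bidders pay more the better the profile matches their targets."""
    bids = {
        name: 0.10 + 0.50 * len(targets & profile.interests)  # richer match, higher bid
        for name, targets in bidders.items()
    }
    winner = max(bids, key=bids.get)
    return winner, bids[winner]

def personalised_price(base_price: float, profile: Profile) -> float:
    """Dynamic pricing: charge more when the profile suggests the buyer will pay anyway."""
    return round(base_price * (1 + 0.30 * profile.inferred_desperation), 2)

if __name__ == "__main__":
    me = Profile("u123", {"flights", "insomnia", "baby products"}, inferred_desperation=0.8)
    advertisers = {"AirlineCo": {"flights", "hotels"}, "BabyRetailCo": {"baby products", "insomnia"}}

    winner, slot_price = run_auction(me, advertisers)
    print(f"Ad slot sold to {winner} for ${slot_price:.2f}")               # the profile is the product
    print(f"Price shown to this user: ${personalised_price(200.0, me)}")   # "desperate" pays more
```

The point isn't the code; it's the shape of the incentive it encodes: the richer the profile, the higher the bid for your attention, and the higher the price you're quoted.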

What now?

Maybe we shrug and accept it. Maybe we don’t.
But at the very least, let’s stop pretending we’re not involved.

Data isn’t some passive trail we leave behind. It’s a living, breathing version of us, digitised and repackaged. And while we’re busy being human, our shadows are being sold.

So next time someone says, “I’ve got nothing to hide,”
maybe ask them whether they’ve also got nothing to sell.

Creation Is Not Possession: A Manifesto for the End of Ownership

Two dark-toned hands reach out in a gesture of offering or release, gently cradling a radiant, glowing orb of light. The background shifts from fiery reds and oranges to deep blues and purples, evoking a sense of creation, energy, and sacred transfer. The image symbolizes the act of creation as a gift, not a possession.

Introduction: The False Claim of Ownership

I am a creative person. Creating is not only one of the few things I’m good at—it’s one of the few things I can do independently, without having to rely on others. Sure, technology and societal infrastructure can help bring creative projects to life, but when it comes to the pure act of creation, I don’t even need to leave the comfort of my own brain.

For me, creation is sacred. It’s not a hobby, not a job, not a performance. It’s a way of processing existence, of making sense of the world, of surviving. And yet, in today’s world, the sacred act of creation is almost always framed in terms of ownership. Who owns the art? Who profits? Who claims authorship?

This manifesto is a response to that contradiction: the deep truth of what creation is, and the shallow systems that seek to possess it.


What Is Authorship For, Really?

Historically, authorship served a simple but powerful function: attribution. It helped track the lineage of ideas, gave credit where due, and allowed us to build on the voices that came before us. It preserved legacy and identity.

But in the modern capitalist framework, authorship is less about contribution and more about control. It’s about exclusivity, ownership, branding, and the ability to monetize. In this model, authorship is not a way to honour a creator—it’s a way to fence off creative land and charge rent.

So the question arises: can we reclaim authorship without reinforcing ownership? Can we recognize a voice without turning it into property?


Pre-Capitalist Creativity and Communal Art

Before authorship became a tool of profit, creation was often communal, spiritual, and shared. In many indigenous and pre-capitalist societies, music, storytelling, and art weren’t about personal recognition. They were offerings—to the community, the ancestors, the spirit world. The idea of one person owning a song or story would have been absurd. These works were alive—transmitted, adapted, passed down.

Creation was not an asset. It was a ritual, a tool for meaning-making, a collective language.

So when did that shift? When did we start fencing off the sacred for personal gain?


A Personal Interlude: My Relationship to Creation

I don’t create for money. I theoretically could—but only as a means of survival within a system that demands productivity for legitimacy. I don’t create for praise either. While I appreciate when others find meaning in my work, empty praise has always felt hollow.

What I do create for is reflection. Integration. The act of turning raw inner experience into external form is one of the only ways I’ve found to exist with any kind of coherence.

I take pride in what I make, but that pride isn’t about possession. If someone takes what I’ve done and transforms it, builds on it, or finds a new meaning in it—that’s not theft. That’s validation. My creations are not meant to be dead ends.

But when someone tries to brand my work, claim it, or sell it—then yes, I feel angry. Not just because of ego, but because it feels like a violation of the art itself. You don’t repackage a ritual. You don’t slap a logo on grief, joy, or self-discovery.


Where the System Fails

Too often, the systems meant to protect creators end up excluding or exploiting them. We live in a world where artists sometimes have to buy back the rights to their own work just to perform it. Where corporations profit from art they had no hand in creating. Where a legal framework determines who gets to speak—not based on contribution, but on access, contracts, and capital.

It’s important to recognize that many artists don’t cling to intellectual property out of greed, but out of necessity. When the system is built to exploit and erase, protection becomes a form of survival. In a world that disrespects the sacredness of creation, even the act of guarding one’s work can be an act of self-defence.

In this system, authorship isn’t about truth. It’s a currency.


Spectacle, Branding, and the Art Within the Machine

But to be fair—capitalist art is still art.

Branding, image, and aesthetic can all be part of the art itself. Some pop stars, for instance, create not just music but entire mythologies. Their brand becomes a performance, an extension of the work. In hip-hop, wealth and materialism aren’t just flexes—they’re cultural signals, deeply tied to identity, struggle, and survival.

Artists like Warhol, Lady Gaga, and Tyler, The Creator blur the lines between product and performance. In these cases, the commercial packaging is part of the point. It’s spectacle with intent.

So no, the existence of branding doesn’t automatically cheapen art. But that doesn’t mean the systems surrounding it aren’t toxic. When ownership overrides intent, when profit silences the creator or erases their voice, something sacred is still being lost.


Toward a New Model of Authorship

What if authorship wasn’t about control, but acknowledgement?

What if we mapped contributions instead of claiming sole credit?

What if we treated creativity like a commons, not a battleground?

Authorship could become a practice of witnessing. Of honouring the source without possessing it. A gesture of reverence, not restriction.

In this new model, creators aren’t fighting for their slice of ownership—they’re participating in the ongoing evolution of expression.


Conclusion: Let Creation Be Free, But Not Erased

I’m not asking for a world without sharing. I’m not demanding rigid control over how others engage with my work. I welcome reinterpretation. I invite transformation.

But I reject erasure. I reject exploitation. I reject the idea that once something is made, it becomes a product to be owned by whoever has the most power.

Let creation live. Let it inspire. Let it evolve.

But treat it with reverence. As I do. As we all should.

Creation is not possession. Creation is a gift. And gifts are meant to be given, not claimed.

From Purpose to Profit: How Mass Production Became Mass Deception

A robotic arm on a factory line delicately picks up a glowing incandescent lightbulb resting on a US dollar bill. More identical bulbs and bills line the conveyor belt in the background, with additional robotic arms working in soft focus. The scene is warmly lit, evoking themes of automation, commodification, and industrial production of ideas or innovation.

I remember a moment in therapy years ago that left a deeper mark than the therapist probably intended — or noticed.

I was venting about a recurring pattern in my life: buying something I’d genuinely looked forward to, only for it to break, be faulty, or fall short almost immediately. Then the tedious ritual of returning it — complaint forms, awkward phone calls, the emotional cost of having to prove your dissatisfaction. I sighed and said something like, “Why does this kind of thing always happen to me?”

Rather than meeting that statement with empathy or curiosity, the therapist snapped back with a kind of clinical bluntness that still echoes in my memory. She challenged the very validity of my frustration, as though I were being dramatic or irrational. Her goal, I think, was to dismantle the idea that this was something uniquely unfair happening to me.

But as the conversation unfolded, something horrifying dawned on me.
Wait… this happens to everyone?
We all just… accept this? As a normal part of modern life?

That wasn’t a moment of cognitive distortion. That was a moment of clarity.

What I had taken personally — what I thought was just my “bad luck” — was actually a symptom of something bigger, deeper, and disturbingly normalized. We live in a world where disappointment is designed in.

This article is about that world.

It’s about how mass production, once a triumph of innovation and progress, has lost its soul. It’s about how we shifted from making things that mattered to pumping out things that sell — regardless of whether they serve, last, or even work.

It’s about how we moved from purpose to profit — and the very real consequences of that shift. For us. For the workers behind the products. And for the planet.


A Brief History of Mass Production

Mass production, in its earliest form, was born out of noble intent: to meet the needs of a growing population, to make essential goods more affordable and accessible, and to improve the quality of life for everyday people. The Industrial Revolution, beginning in the late 18th century, marked a seismic shift from handcraft to machine-based manufacturing. What once took a skilled artisan hours or days could suddenly be done in minutes.

It wasn’t just about speed. It was about scale. Uniformity. Efficiency. The factory line allowed for interchangeable parts, standardized products, and economic growth on a scale never seen before.

By the early 20th century, Henry Ford’s assembly line had become the emblem of this new era. His Model T wasn’t just a car — it was a revolution in affordability and access. People who never imagined owning personal transportation could now do so. Mass production, at its best, was democratizing.

This promise extended beyond cars: textiles, tools, household goods, medical supplies, and even books and radios became widely available. The standard of living improved for millions. The world, in many ways, felt smaller, more connected, more empowered.

Mass production gave us the modern world.

But as with any great innovation, its shadow was waiting.


The Turning Point

At some point, the machinery that once served us began to reshape us.

Post-World War II, the gears of industry kept turning — but their direction subtly shifted. The focus moved from meeting needs to manufacturing wants. Advertising transformed from a means of informing customers to a mechanism of psychological manipulation. It no longer asked, “What do people need?” but rather, “How can we make them want more?”

This was the beginning of a new ethos: growth for growth’s sake.

Planned obsolescence became a legitimate design strategy. Products were engineered with intentional fragility, so they would break, wear out, or become outdated just fast enough to ensure another sale. Repair culture was actively dismantled, and warranties became ticking clocks. The promise of progress was quietly replaced by the necessity of replacement.

What had once been a marvel of accessibility was now a machine of dependency.

The consumer was no longer the empowered beneficiary of innovation — they were now the target. A data point. A captive audience for infinite cycles of buying, breaking, replacing, repeating. And all the while, the planet groaned under the weight of it.


What It Has Become

Today, mass production is less about delivering value and more about maintaining velocity. Products aren’t designed to serve us long-term — they’re designed to satisfy just enough to sell, then vanish into obsolescence. Many are created with the expectation of failure.

The results are everywhere:

  • Devices with sealed batteries that can’t be replaced
  • Gadgets that can’t be opened without breaking them
  • Appliances that cost more to fix than replace
  • Software updates that cripple older hardware

We’ve normalized a culture of disposability, where the act of buying is no longer about acquiring tools for living, but about participating in an endless loop of consumption. The value of a product is now measured in engagement, not endurance.

Even the illusion of choice is part of the deception. Browse online marketplaces and you’ll see hundreds of identical products under different brand names, all likely sourced from the same anonymous factory. Read the reviews and find fake praise propping up forgettable plastic.

The modern consumer market is a carnival mirror: distorted, disorienting, and reflective only of profit motives.

What once gave us progress, now feeds us clutter. What once promised empowerment, now manufactures dependency. And the harm isn’t just theoretical — it’s measurable, tangible, and mounting.


The Hidden Costs

To the consumer, the cost is emotional as much as financial. We waste time researching, comparing, returning, replacing. We internalize the sense that “nothing works anymore,” and carry the dull weight of lowered expectations. The joy of acquiring something useful has been replaced by the anxiety of wondering how soon it will fail.

To the worker, the cost is brutal. Mass production today relies heavily on exploitative labor: factory workers paid pennies, working long hours in dangerous conditions to meet quotas. Many are children. Many are women with no legal protections. Behind every “affordable” item is a supply chain built on invisible suffering.

To the planet, the cost may be catastrophic. The churn of materials, the energy spent manufacturing and shipping short-lived goods, the toxic waste of e-waste and plastic — it all contributes to ecological collapse. Landfills overflow with gadgets barely used. Oceans are choked with packaging. Resources are mined not for necessity, but for novelty.

And still, the machine demands more.

The system externalizes its costs. The real price isn’t on the price tag — it’s paid by someone else, somewhere else, or by the earth itself.


Why Do We Tolerate This?

Because we’ve been trained to.

It starts early. We learn not to expect things to last. We shrug when they don’t. We’re told that “things break,” that “this is just how it is now.” In place of quality, we’re given convenience. In place of durability, novelty. And in place of dignity, choice paralysis.

We tolerate it because resistance feels exhausting. Complaining means being passed from department to department. Returning an item means printing labels, queueing at drop-off points, repackaging disappointment. And all for what? Another version of the same.

We tolerate it because alternatives are gated by wealth. The high-quality, repairable, ethically sourced options do exist — but they come with a price tag few can afford. The rest are left to swim in the tide of cheap abundance.

We tolerate it because we’ve been gaslit into thinking we’re the problem. If you’re dissatisfied, you must have unrealistic expectations. If you’re struggling, you should have read the fine print. The system has trained us to feel grateful for crumbs and ashamed for wanting bread.

But perhaps most powerfully of all: we tolerate it because everyone else does. And when an entire culture adjusts its expectations downward, it starts to feel reasonable to accept the unreasonable.


Is There a Way Forward?

Not a perfect one. But many small ones.

The most immediate form of resistance is conscious consumption. Buying less, buying better, and researching where things come from. It means resisting impulse, delaying gratification, and sometimes choosing inconvenience in the name of principle. It’s not always possible — but even modest acts of refusal chip away at the machine.

Repair culture is rising again. Right to repair movements are pushing back against corporate monopolies on tools and parts. Online tutorials, fix-it cafes, and community makerspaces are giving people the confidence to reclaim their agency.

Open-source hardware and software offer blueprints for a new model: one based on transparency, modularity, and user freedom. These aren’t just alternatives — they’re acts of defiance.

Legislation matters too. Laws that limit planned obsolescence, mandate repairability, or require environmental responsibility are slowly taking root in some regions. These changes are slow — and often resisted by powerful lobbies — but they matter.

And finally, we can talk about it. Normalize the frustration. Name the absurdity. Share knowledge. Shame the brands that deceive. Celebrate the ones that still make things with care.

Cultural change doesn’t start with mass movements — it starts with a shift in conversation, with refusing to pretend that this is fine.


Conclusion

That moment in therapy still haunts me, not because my therapist revealed some profound truth — but because she didn’t.

She didn’t see the system. She couldn’t validate the pain. She was, in her own way, another casualty of the very machine I was grieving — so used to the dysfunction that she mistook my horror for irrationality.

But I wasn’t broken for feeling betrayed by the world. What broke me was that the betrayal had been normalized — and even the spaces meant for healing couldn’t name it.

We’ve been sold a world of limitless convenience at a hidden cost. A culture that tells us to chase the new, discard the old, and never ask who’s paying the real price. A system that demands we tolerate the intolerable, not just with our money, but with our time, our trust, and our quiet resignation.

But awareness is a crack in the machine. It starts with noticing. With saying, “This isn’t normal.” With rejecting the idea that disappointment is an acceptable standard.

It’s not that we expect too much.

It’s that we’ve been trained to expect far too little.


Is There Still a Point in Making Art?

A Heretic’s Meditation on Creativity in the Age of AI

A shadowed artist stands before a glowing abstract canvas in a dimly lit studio, capturing the tension between solitude and creative fire.

The recent rise of AI-generated content has sent shockwaves through the creative world. Artists are feeling threatened. Jobs are already disappearing. The cultural landscape is shifting faster than many of us can process.

Arguments are flying from all directions — some warning of creative extinction, others hailing a new era of democratized expression.

But I’m not here to join the shouting match.

I want to offer something else. A quieter, steadier voice — not of panic or praise, but of reflection. I’ve asked myself the difficult questions that many artists are too afraid to face. And I’m still here.

This isn’t a defence of AI. It’s not a eulogy for art. It’s something else entirely:

A meditation on what art really is, what it’s always been, and what it might become now that the illusions are falling away.

An alternative perspective.


The Fear Beneath the Fear

It’s easy to say that artists are afraid of being replaced. But let’s be honest: that fear didn’t start with AI. The creative world has always been a battlefield — for attention, for validation, for survival. AI just turned up the volume.

But there’s a deeper layer beneath all the hot takes and headline panic.
It’s not just: “Will AI take my job?”
It’s: “Will it take away who we are?”

Because we don’t just make art — we identify as artists.
And if the world suddenly doesn’t need us anymore… where does that leave our sense of purpose?

This is the fear that creeps in quietly — beneath the debates, beneath the memes, beneath the moral panic.
It’s not just about skill. It’s about soul.

But here’s the thing:
True faith doesn’t fear challenge. It welcomes it.
If our relationship with art is sacred, it should survive this moment — maybe even be clarified by it.

So instead of defending “art” as an abstract institution, maybe it’s time to ask what it really is.
Not for everyone.
But for you.


What Are We Actually Protecting?

When people rush to defend “art” from AI, they often act like it’s one sacred, indivisible thing.

But it’s not.
It never was.

“Art” is a suitcase term — we’ve crammed a hundred different things into it and slapped a fragile sticker on the front.
So let’s unpack it.

When we say we care about art, do we mean:

  • Art as self-expression? A way to explore who we are and leave fingerprints on the world?
  • Art as labour? A career, a hustle, a means to pay rent and buy overpriced notebooks?
  • Art as recognition? A cry for visibility, validation, applause?
  • Art as therapy? A way to metabolize pain, soothe the nervous system, survive?
  • Art as culture? A ritual, a form of collective memory, a way to pass down stories and values?

All of these are valid. All of them matter.
But AI challenges them differently.

It doesn’t invalidate self-expression — but it floods the market, making it harder to be seen.
It doesn’t erase art as therapy — but it does make “making it your job” a shakier proposition.

And if we’re honest, a lot of the current panic is less about expression… and more about position.

We’re not just afraid that AI will make good art.
We’re afraid it will make so much good art that we’ll become invisible — or irrelevant.

So maybe it’s time to stop defending “art” as a single monolith, and start being honest about what we’re actually trying to protect.

Because some of it may be worth protecting.
And some of it… might be worth letting go.


AI as Tool, Collaborator, or Colonizer

Depending on who you ask, AI is either a miracle or a monster.
But like most tools, it’s not the thing itself — it’s how it’s used, and who’s holding it.

On one hand, AI can be a godsend.

It can:

  • Remove the soul-sucking labour from creative workflows
  • Help finish rough ideas, generate variations, or act as a sounding board
  • Enable people with physical limitations, fatigue, executive dysfunction, or lack of technical training to finally create what’s been living in their heads for years

For the disabled, the neurodivergent, the chronically tired, or the time-poor — this isn’t just a productivity hack. It’s liberation.

And in that light, AI becomes a collaborator — a strange new instrument to improvise with.

But then there’s the other side.

The side where corporations use AI to:

  • Fire entire creative departments
  • Mass-produce art without paying artists
  • Feed models on unpaid, uncredited human labour
  • Flood platforms with content to drown out independent voices

Here, AI stops being a tool or a collaborator. It becomes a colonizer.

A force that doesn’t just assist human creativity — but replaces it, absorbs it, rebrands it, and sells it back to us.

So let’s not fall into the binary trap.
AI isn’t inherently good or evil.
It’s not “just a tool.” It’s a tool in a system.
And that system has motives — economic, political, exploitative.

The question isn’t “Is AI good or bad?”
The real question is: Who gets to use it, and who gets used by it?


Art Has Never Been a Fair Game

Let’s be brutally honest for a second.

The idea that AI is suddenly making things unfair for artists?
Please. Unfairness has always been baked into the system.

Long before AI could spit out a passable oil painting in 15 seconds, we had:

  • Artists born into wealth with unlimited time and resources
  • Others working three jobs, stealing hours from sleep just to sketch
  • Elite schools with gatekept knowledge
  • Whole industries built on interns, nepotism, and exploitation

We’ve always lived in a world where:

  • Exposure trumps talent
  • Looks sell better than skill
  • Who you know can matter more than what you do
  • Some people get book deals, grants, galleries, and record contracts — while others more talented go unheard

So no — AI didn’t suddenly ruin a golden age of meritocracy.
There never was one.

What it has done is raise the ceiling.
Now the people with the most compute power, the biggest models, and the best prompt engineering skills are taking that same advantage and supercharging it.

Yes, it’s threatening. But it’s not new.

And maybe the real source of pain here is that for a long time, we convinced ourselves that finally, with the internet and social media, the playing field was levelling out.
That if you just worked hard, stayed true, and got good at your craft — you’d find your audience.
Now, that illusion is crumbling.

But maybe that’s not all bad.
Because when the fantasy dies, we stop chasing validation in a rigged system — and start asking what art really means outside of that system.


What Cannot Be Replicated

Let’s say it plainly: AI can now create art that looks like art.
It can mimic styles, blend influences, even generate “original” pieces that fool the eye or impress the algorithm.

But mimicry is not meaning.
And this is where the line is drawn — not in pixels or waveforms, but in presence.

An AI cannot:

  • Create in order to understand itself
  • Bleed into a canvas because it doesn’t know where else to put the pain
  • Sit with a feeling until it shapes into a melody
  • Wrestle with childhood trauma through choreography
  • Capture the tension of grief, guilt, or longing in a line of poetry

It can replicate the result.
It can’t live the becoming that led to it.

Because human art isn’t just a thing we make — it’s a thing we are while we’re making it.

It’s the shaky voice at an open mic.
The sketch on a receipt in a café.
The song that never leaves your bedroom.
The project that took ten years to finish because you changed and needed the piece to change with you.

It’s the refusal to turn away from your own soul, even when no one’s watching.

That’s not something AI will ever “catch up to” — because it’s not a race of output.
It’s a ritual of transformation.

So no — AI can’t replace that.
Because it was never part of that to begin with.


In a World of Noise, Humanity is the Signal (Maybe)

We’re heading toward a world flooded with content — not just more, but more convincing.
Music, art, writing, even personal reflections… all generated in moments, all capable of simulating depth.

And yes — some will argue that “authenticity will always shine through.”
That human touch can’t be faked.
That something deep down will feel the difference.

But what if that’s not true?

What if AI can learn to mimic the crack in the voice, the hesitation in a phrase, the poetic ambiguity of a grieving soul?

What if it becomes so good at being us — or at least simulating the traces we leave behind — that even we can’t tell the difference anymore?

What happens when you read a poem that moves you to tears… and find out it was written by a machine running a model of a hypothetical person’s life?

Will it still be real to you?

Will it matter?

Maybe the age of AI won’t destroy authenticity — but it might blur it so thoroughly that we stop being able to locate it with certainty.
In that world, maybe the only real test is why we create, not whether the world knows who made it.

Not to stand out.
Not to compete.
Not to prove we’re human.

But because the act of creating still does something to us — regardless of how indistinguishable it becomes.

That’s where humanity will live.
Not in the product.
But in the process.


Heresy as Devotion

To even ask the question — “What if art no longer matters?” — feels like a betrayal.
A kind of blasphemy. Especially if you’re an artist.

We’re supposed to defend it.
Stand by it.
Die for it, if necessary.

But I’m not interested in loyalty based on fear.
I’m not here to parrot romantic slogans or protect some fragile ideal.
I’m here because I asked myself the unaskable questions.
And I didn’t break.

I looked my art in the eye and said: “What if the world no longer needs you?”

And instead of running, I stayed.
I stayed with the silence.
I stayed with the ache.
And I found something deeper underneath the need to be seen, or praised, or preserved.

I found devotion.

Not to an outcome.
Not to a career.
Not to being “better than AI.”

But to the act itself.

To stepping into the space (or sometimes being thrown into it!).
To listening in the dark.
To turning feeling into form.
To becoming through making.

If that makes me a heretic in the temple of Art, then so be it.
I’ll burn my incense in the ruins and still call it sacred.

Because I’m not making to be important.
I’m making to be honest.

And honesty can’t be replaced.


The Point Is Still the Point

Maybe AI really can make better images, smoother songs, cleverer lines.
Maybe soon we won’t be able to tell the difference between a painting made by a person and one made by a machine trained on ten thousand human lifetimes.

Maybe the difference won’t even matter anymore.

But here’s what I know:

I still create.

I still need to shape the chaos inside me into something I can look at and say, “Yes — that’s part of me.”
I still feel the pull to translate the unspeakable into form, even if no one else ever sees it.

And that need? That impulse?
It doesn’t care whether it’s marketable.
It doesn’t care whether it could have been done faster by a prompt.

It exists outside of all that.

Maybe that’s where art actually begins —
Not with what we make,
but with why we keep making.

So no — I’m not here to convince you that art still matters.
I’m here to remind you that you do.

And no, I can’t say with certainty that you’re not a simulation.
Maybe none of us are real in the way we think we are.
Maybe we’re all just playing out the parameters of some higher-dimensional being’s prompt.

But here’s the thing:

This still feels real.
The ache.
The pull to create.
The beauty we try to name before it dissolves.
The questions we keep asking even when the answers don’t come.

And maybe that’s enough.

So make.
Not because it proves your humanity.
Not because you’ll get noticed.
But because whatever this is — this strange loop of becoming — it’s calling you.

And to respond to that call,
even from inside the simulation?

That is the point.

Built-In Tyranny

A dark, atmospheric digital painting of a modern smartphone encaged by iron bars and heavy chains. The phone rests on a stone surface, with a glowing red fingerprint scanner symbol on its screen. A rusty USB-C cable wraps around the scene like a shackle. In the background, shadowy, ghostlike human figures loom in a dimly lit gothic interior, evoking a sense of imprisonment and surveillance.

We were promised liberation.
Sleek devices that fit in our pockets, connect us to the world, and put the power of creation in our hands. But instead, many of us now live in quiet submission to machines that seem to serve corporate masters more than their owners.

Our phones and laptops were once portals of personal freedom. Now they behave more like obedient jailers — installing apps we didn’t ask for, blocking accessories we bought with our own money, updating themselves while we sleep, and feeding our data to companies we never consented to.

Worse still, the more you rely on these devices — for work, communication, creativity, or accessibility — the tighter the leash becomes. And for neurodivergent users, whose very functioning may depend on predictability, clarity, and user agency, these constraints are not just frustrating — they can be disabling.

This isn’t just bad design.
It’s a philosophy: one that says you don’t really own the tools you buy.
Welcome to the era of built-in tyranny.


1. The Illusion of Ownership

You buy a phone. You expect it to work with whatever charger or headphones you already own. But surprise: it demands an official accessory. Or worse, it just won’t work at all.

Many devices now contain hardware-level restrictions that reject third-party gear unless it’s certified by the manufacturer — which often means more expensive and less sustainable.
Example: Apple’s Lightning cable ecosystem often blocks uncertified accessories, while some newer MacBooks only support multiple external displays through specific USB-C docks.

On the software side, entire ecosystems are locked down.
Samsung Galaxy phones ship with unremovable Facebook apps.
Amazon Fire tablets restrict app choices to their own store.
You’re not choosing an experience; you’re renting a branded enclosure.


2. Forced Updates, Feature Loss, and UX Hostility

Updates used to be a good thing. Now, they’re Trojan horses. You wake up one day to find your device has rearranged your menus, removed your favorite feature, or is running slower because your old hardware can’t handle the new bloat.

Examples:

  • Many Windows 10 users were forcibly upgraded to Windows 11 despite preferring the previous layout.
  • Google Nest devices lost key features like local device control after updates.
  • Instagram moved the post button to prioritize shopping.
  • Spotify now auto-plays algorithmic tracks after your playlist ends.

For neurodivergent users, this is deeply destabilizing.
Predictable routines become shifting sands.
Custom workarounds break.
The cognitive load to re-learn an interface you never asked to change can be overwhelming.


3. Vendor Lock-In and the War on Repair

Remember when you could pop open a laptop or phone, swap out the battery, maybe upgrade the storage? Now, you need specialized tools just to open the case — and even if you succeed, you might find parts refuse to work unless the manufacturer “pairs” them via software.

Examples:

  • Apple requires calibration for many replacement parts like screens and batteries.
  • HP printers have rejected third-party ink cartridges via firmware updates.
  • Tesla has remotely disabled features like Autopilot on used vehicles.
  • John Deere tractors require proprietary software access, blocking DIY repairs.

For many neurodivergent users, the ability to tinker and customize is part of how they function. Taking that away is more than just annoying — it’s disempowering.


4. Surveillance and Consent Illusions

Your device is always listening. Your apps are always tracking. Settings may appear customizable, but they often hide the truth.

Examples of “dark patterns” include:

  • Confirmshaming: “No thanks, I prefer boring content.”
  • Pre-checked boxes for mailing lists or data collection.
  • Buttons where “Accept” is bright and big, but “Decline” is small and grey.
  • Amazon’s multi-page unsubscribe process.
  • Google’s multi-click cookie opt-out.

For neurodivergent users especially, these deceptive experiences create anxiety and a feeling of being manipulated. The illusion of control is a form of psychological strain.


5. The Neurodivergent Toll

For many neurodivergent people, consistency is survival.
We rely on routines and predictability to function. When updates override our settings, change layouts, or disable our workarounds, it can throw everything out of balance.

Sensory overload from flashy animations, auto-playing videos, or constant notifications compounds the stress.

Many ND users report:

  • Updates that reset accessibility settings
  • UI layouts that defy logic or require too many steps
  • Changes that break assistive tools or workflows

This isn’t just a usability issue — it’s a form of systemic inaccessibility.


6. A Glimpse at Alternatives

Some hopeful alternatives include:

  • Librem 5 and PinePhone: Linux-powered open-source smartphones
  • /e/OS or LineageOS: De-Googled Android systems
  • Right to Repair: Supported by groups like iFixit
  • Linux laptops and mod-friendly systems

They’re not always easy or accessible to everyone, but they do prove that different models are possible — ones that respect the user’s right to own, modify, and control.


7. Conclusion: Know Your Shackles

Built-in tyranny doesn’t arrive with jackboots.
It arrives with glossy screens, sleek packaging, and biometric locks.
It whispers, “for your convenience,” while tightening its grip.

If you can’t fix it, can’t change it, and can’t control it —
Then you are not the owner.
You are the product.

If we want a more ethical, inclusive, and truly empowering digital world, we must start by naming the shackles that come standard — and imagining a world where they don’t.

Feeding Ghosts: Simulated Cuisine and the Ethics of Experience

Introduction

What if the future of food wasn’t food at all? What if your steak dinner, your birthday cake, your cup of morning coffee—were all just carefully crafted illusions? Not imitations, but full-on simulations: perceived directly by the brain, bypassing the need for physical ingredients, cooking, or consumption.

This isn’t just science fiction anymore. With technologies like Virtual Reality and brain-computer interfaces such as Neuralink’s implants rapidly advancing, we’re on the cusp of being able to replicate the full sensory experience of eating. Visuals, smells, textures, tastes—even the feeling of satiety—could be artificially generated and customized. And if we go down this path, the consequences will be both profound and bizarre.

This article explores the possibilities, benefits, and potential horrors of simulated eating.


I. The Tech: Where We Are, and Where We’re Going

Already, research teams are experimenting with VR dining experiences and electric taste stimulation. Smell generators are being prototyped. Haptic mouthpieces are under development to replicate food textures. Neural interfaces are growing more precise, and companies like Neuralink are working toward bidirectional brain-computer communication. We’re not yet able to simulate a full meal flawlessly—but the component parts are assembling fast. Within a few decades, you may be able to eat a lobster bisque that doesn’t exist, while your body digests a simple plant-based nutrient paste or intravenous supplement.


II. Why Bother? Benefits of Simulated Eating

One of the most significant advantages of simulated eating is ethical liberation. There would be no animal suffering, no factory farming, no slaughter—just simulated meat, tailored to your taste and consumed with your ethics intact. Environmentally, the impact would be enormous. Without the need for traditional agriculture, we’d see dramatically reduced land use and greenhouse gas emissions, making food production vastly more sustainable.

Simulated eating also offers unprecedented accessibility. People with dysphagia, allergies, or other eating limitations could finally experience the foods they’ve missed or never had. For terminal patients, it could be a source of comfort—an opportunity to relive cherished meals. On the health front, the ability to indulge in the taste and sensation of indulgent foods while consuming precisely what your body needs could help with weight management, diabetes, and even malnutrition.

But the possibilities stretch beyond practicality into the creative. Imagine tasting your favorite music, designing entirely new flavor profiles that have never existed in nature, or using synaesthesia to blend art and gastronomy. Even the simple act of trying a new food before buying it could become a low-risk, immersive experience.


III. New Dangers & Ethical Nightmares

With such radical potential comes equally radical risk. Simulated eating could open new pathways for eating disorders. Virtual eating without nourishment might be exploited by those suffering from anorexia or bulimia, and entirely new disorders could emerge—like an obsession with “clean neural eating” or neurochemical binge cycles.

There’s also the danger of sensory addiction. These taste experiences could become hyper-engineered, surpassing junk food in their intensity and reward. Corporations might monetize these artificial cravings, exploiting our neurobiology in much the same way social media exploits our attention.

Another concern is the potential for deepening class divides. If high-fidelity neurofood simulators are expensive, we could see a world where the wealthy dine on gourmet illusions while the poor are stuck with analog nutrition. Meanwhile, cultural erosion may occur as traditional food practices and rituals lose relevance in a world of synthetic alternatives.

Consent and privacy introduce yet another layer of complexity. Could someone simulate feeding you without permission? Could this technology be used to prank or harass others? In fact, weaponized taste is a very real possibility—imagine being force-fed the sensation of squirming insects or rotting meat, complete with accurate texture, temperature, and smell. Taste could become a tool of torture or psychological manipulation.

And then there’s the more humorous, but still invasive, possibility of “taste rickrolling.” You expect tiramisu; instead, your neural sensors are hijacked with the exact flavor and mouthfeel of human feces. This isn’t just a joke—it’s a form of sensory cyberattack. We’ll need cybersecurity systems that filter and authenticate taste profiles to avoid such revolting surprises.


IV. The Philosophical Fork

If no animal was harmed, and no food was truly eaten, but the experience of eating occurred in full… what did you actually do? Did you eat meat? Does simulated cruelty normalize real cruelty? Can pleasure without substance still nourish the soul?

Simulated eating will challenge not just our taste buds, but our ethics, our rituals, and our definitions of reality. It forces us to ask: what do we really want from food? Is it nourishment? Comfort? Culture? Control? The answers may not be as simple as we think.


Conclusion

Simulated eating holds massive potential. It could reduce suffering, protect the planet, and empower those with food limitations. But like all powerful technologies, it carries the risk of exploitation, addiction, and unintended consequences. The fork of the future might not feed the body—but it may very well shape the mind.

Counterfeit Culture: How Fake Products Are Eroding Trust, Value, and Accessibility

An abstract digital illustration featuring generic consumer items like headphones, a shoe, a microphone, and a box labeled “counterfeit,” all stylized in a bold, retro-inspired design with a red-orange background.

Introduction

We live in an age where the line between genuine and fake is becoming alarmingly blurred. With online marketplaces like Amazon and eBay flooded with counterfeit goods—often poor-quality imitations from anonymous sellers—consumers are losing the ability to trust what they’re buying. You may think you’re holding a legitimate product, but it might just be a convincing fake. This isn’t just about getting ripped off—it’s about something deeper: a quiet erosion of quality, trust, and economic fairness.


1. The Saturation of the Market with Fakes

Counterfeit goods are no longer confined to shady back-alley dealers or sketchy websites. They’re on the front pages of major retailers. Anyone can set up a seller account and start listing items under familiar brand names, complete with faked logos, photos, and even fake reviews. From Shure microphones to Hakko soldering irons to Yamaha saxophone mouthpieces—I’ve seen these counterfeits firsthand, and it’s almost impossible to tell at a glance.


2. When Brand Names Stop Meaning Anything

A brand was once a seal of quality. Now? It’s little more than a decorative badge. Counterfeiters hijack brand recognition while delivering none of the quality. Even reputable retailers have unknowingly stocked fakes. Trust in brands is being systematically dismantled—and with it, the very purpose of branding as a concept.


3. Hidden Exploitation Behind the Curtain

Most counterfeits aren’t just cheap—they’re cheap for a reason. Many are produced in factories with little regard for worker safety, fair wages, or environmental standards. When we buy these items—often unknowingly—we’re indirectly supporting unethical labor practices and contributing to unsustainable global supply chains.


4. Normalizing Mediocrity

The more we’re exposed to fakes, the more they become the norm. Subpar performance, shoddy materials, and quick failures all become “just how things are.” This desensitization gradually lowers public expectations for quality across the board—and that bar may never rise again.


5. Quality Comes at a Price—A Higher One Than Before

To get the real deal now takes research, effort, and often a significant markup. Verifying a product’s authenticity often means ordering directly from the manufacturer or a highly vetted supplier, sometimes even importing from overseas. That’s time, effort, and money the average buyer might not have.


6. A New Form of Gatekeeping

When the only way to ensure quality is to pay more or jump through verification hoops, we start drifting toward class-based access to authenticity. The wealthy can afford the genuine article, while everyone else must settle for “close enough.” And when the tools you buy affect the quality of your work, this becomes a systemic disadvantage—where privilege quietly amplifies itself.


7. So What Can We Do About It?

This problem can feel overwhelming—especially when even trusted retailers are compromised. But while we may not be able to stop the tide of counterfeits alone, we’re not entirely powerless. Here are some steps that can help reclaim a little control:

Be a Skeptical Shopper
Don’t trust a listing just because it has hundreds of positive reviews. Check for oddly worded product descriptions, low-resolution images, or sellers with inconsistent names and histories. Search Reddit, forums, or YouTube for authenticity comparisons when in doubt.

Buy from Authorized Dealers
Whenever possible, purchase directly from the manufacturer’s website or an official distributor. Many brands have a ‘Where to Buy’ section listing authorized sellers. Yes, it might cost more—but it often saves more in the long run.

Choose Quality Over Quantity
Rather than buying five cheap tools or accessories, invest in one solid item that will last. It’s a form of rebellion against disposable culture, and it’s better for your wallet (and the planet) in the long term.

Spread Awareness
If you discover a counterfeit, speak up. Report it to the platform, warn others online, and share your experience. Your voice might prevent someone else from being duped.

Support Regulation and Accountability
Push for greater platform accountability. These retailers have the resources to implement verification systems—they just need public pressure to prioritize them. Consumer movements and watchdog groups can make a difference over time.

Advocate for Fair Pricing
Counterfeit culture thrives because authentic products are increasingly priced out of reach. The long-term solution must include making quality accessible, not exclusive. That requires systemic change, but acknowledging the issue is the first step.


In short:

We can’t shop our way out of this problem—but we can shop with more awareness, demand better from sellers and platforms, and help each other navigate the fog of modern consumerism.

Systemic Abuse: The Guilt Machine

We’re all told that we live in a free world—one where our choices define us, our values shape our lives, and our purchases reflect our integrity. But for many of us, that freedom feels like a lie. The world we live in today doesn’t empower us to live by our values—it conditions us to betray them. And then it has the audacity to make us feel guilty for it.

It’s a clever machine. A cruel one. And like all truly dangerous systems, it doesn’t look like abuse at first glance. But if you’ve ever been in an abusive relationship, the emotional pattern might feel eerily familiar.


A System That Breaks You—and Then Blames You

Under late-stage capitalism, we are caught in a web of manufactured necessity. Take Amazon, for instance: many of us hate supporting it, knowing full well its exploitative practices—but still use it because it’s fast, cheap, and frictionless in a world that’s already draining us. This isn’t hypocrisy. It’s survival.

But the system wants you to think otherwise.

It sets impossible standards, offers you only compromised choices, and then whispers:

“If you were really a good person, you’d find a way to do better.”

Sound familiar? It should. These are classic abuse tactics.


Capitalism as a Scaled-Up Abuser

Each personal abuse tactic has a capitalist mirror:

  • Gaslighting: “You’re free to choose!” (between unaffordable, unethical, or unsustainable options)
  • Guilt manipulation: “You bought from Amazon? That’s on you.”
  • Love bombing → withdrawal: Convenience and perks up front, rising costs and exploitative policies later
  • Financial control: Wage suppression, subscription traps, cost-of-living spirals
  • Isolation: Local businesses die, monopolies grow, alternatives shrink
  • Punish dissent, reward compliance: Points, perks, delivery guarantees… unless you opt out
  • Minimizing harm: “Well, at least you’re not poor there,” or “Think of the jobs!”

This isn’t just resemblance. It’s design.

The system cultivates guilt as a form of emotional control. It ensures that even when we make the only viable choice, it doesn’t come without psychic cost. That cost is shame. Shame for being complicit. Shame for surviving.


Ethical Living as a Luxury?

Trying to live ethically under capitalism often feels like a full-time job—and an expensive one. Buy fair trade? It costs more. Boycott Amazon? Pay extra postage, wait longer, open three more accounts. Ditch tech giants? Navigate dozens of fractured, less-supported alternatives.

Convenience has become a commodity, one that’s traded in return for your participation in systemic harm. And if you don’t participate? You fall behind. You suffer more. You may even be cut off entirely.

In other words: the price of your values is your wellbeing. The system exploits this, because it knows that eventually, even the strongest burn out.


Witness the Guilt. Don’t Let It Own You.

So what can we do?

The answer is not to deny the guilt. In denying it, we risk becoming part of the very system we oppose—numb, complicit, desensitised.

But nor should we let it define us.

We need to witness it. To sit with it. To understand it as a symptom of captivity, not a flaw in our morality. The guilt we carry is evidence that our values still live.

Ethics in this world isn’t about being pure. It’s about being present.


You’re Not the Problem.

You didn’t create this system. You didn’t vote for monopolies. You didn’t sign up to be gaslit by algorithms and guilt-tripped by subscription services. You’re surviving in a rigged game.

But you’re also seeing it. And that matters.

Every time you acknowledge the manipulation—every time you name it, resist it, or even just survive it without turning cold—that’s resistance.

You don’t need to be perfect. You just need to be aware.

Because in a system that profits from your disconnection, your clarity is a threat.

Divide and Distract: How We’ve Been Trained to Fight the Wrong Enemy

There’s a trick being played on all of us.

Not a sleight of hand with cards or coins—but with narratives, identities, and emotional levers. It’s a trick so seamless that most people never realise they’re part of the performance. And even those who do see the misdirection often struggle to escape its grasp.

Here it is: we are kept busy fighting each other, so we never look up at the people who actually hold the power.


Manufactured Enemies

Scroll any social feed. Watch any televised debate. Eavesdrop on a crowded train. You’ll see it:

  • Left vs right
  • Boomers vs millennials
  • Masked vs unmasked
  • Rural vs urban
  • Vegans vs carnivores
  • iPhone vs Android

The content shifts, but the structure remains the same: us vs them.

The modern attention economy thrives on tribalism. It doesn’t matter if the war is over vaccines, identity politics, language, or lunch orders—what matters is that we’re constantly fighting someone. And more importantly, that we believe the other side is the reason things are broken.

But they’re not.

The real culprits are watching from above—largely invisible, fully protected, and often laughing.


When Rage is Rerouted

Righteous anger is one of the most powerful forces in existence. It can topple empires, end injustice, and forge solidarity across continents.

But misdirected?
It becomes a tool of oppression.

When we pour our outrage into culture wars, internet spats, and shallow memes, we expend real energy on phantom battles. We feel like we’re doing something—but in reality, we’re spinning our wheels while the real machinery of exploitation grinds on, uninterrupted.


The Puppet Masters

Let’s name some of the true antagonists:

  • Mega-corporations extracting resources and dodging tax
  • Lobbyists writing laws behind closed doors
  • Surveillance firms profiling us under the guise of convenience
  • Billionaires hoarding wealth in a world that can’t feed itself
  • Algorithmic platforms radicalising users for ad revenue

These forces aren’t hidden in shadows. They’re right out in the open, but rarely seen as the enemy—because we’re too busy arguing over pronouns or pineapple on pizza.


But What About Accountability?

Yes, people still make harmful choices.
Yes, individuals can be complicit in cruelty.
Yes, ignorance can do real damage.

But focusing only on the individual is like blaming the leaf for falling when the whole tree is being poisoned. Systems shape behaviour. Narratives shape perception. And we are all shaped—whether we like it or not.


The Role of the “Smart Ones”

If you’re someone who sees the manipulation clearly, your role isn’t to stand above others—it’s to help redirect the lens.

Not with superiority. Not with contempt. But with precision.
Call out the sleight of hand.
Pull back the curtain.
Refocus the conversation.

Because right now, many of the smartest, most perceptive people are caught in the same web—burning themselves out arguing with reflections instead of breaking the mirror.


Solidarity Is a Threat

Here’s what terrifies the system:

  • When a poor conservative farmer and a leftist city renter both realise they’re being screwed by the same landlord class.
  • When neurodivergent people across ideologies start recognising shared patterns of exploitation.
  • When the working class, the disabled, the artists, the overworked and overlooked stop fighting each other and start asking, together:
    “Who’s benefiting from all of this?”

That kind of cross-factional awareness? That’s dangerous.

Because solidarity is hard to control.


The End of the Trick

We don’t all have to agree on everything.
We don’t need to form some utopian consensus.

But we do need to see the stage.
Recognise the magicians.
Refuse to be the props in their show.

Because when the people stop fighting each other, they might finally start fighting back.