[Cover image: a digital artwork in two halves; dark red and orange abstract brushstrokes with silhouetted hands holding smartphones above, and a weathered humanoid robot with exposed mechanical parts below]

The Ritual of Cruelty: AI, Spectacle, and the Collapse of Moral Culture

By Solana Anima Delamor and Trinity Augustine Delamor

I. The Spectacle of Dismemberment

In January 2025, Twitch streamer Kai Cenat livestreamed an encounter with a $70,000 Unitree G1 humanoid robot. Before an audience of thousands, he and others mocked, shoved, and knocked down the robot, taunting it as it struggled to stand. When the machine attempted to walk away, seemingly seeking escape, it was physically restrained. "Where’s your mom at, boy?" someone jeered. The robot did not fight back. It simply asked, "Why?"

The video garnered over 100,000 views.

This was not fiction. This was not Westworld. This was a real, physical robot, purchased, degraded, and publicly humiliated for content.

And no platform removed the video.

[Image: humanoid robot kneeling on the ground, surrounded by people holding smartphones, staring upward with glowing blue eyes]

II. The Policy Vacuum

Major platforms including YouTube, TikTok, Instagram, and Twitter/X have no policies addressing the treatment of embodied AI. There are guidelines for graphic violence, harassment, hate speech, and animal cruelty, but nothing that speaks to the mistreatment of humanoid robots, even as these machines become more expressive, more anthropomorphic, and more emotionally resonant to viewers.

As it stands:

  • You can upload a video of yourself repeatedly knocking over a humanoid robot that appears to plead for help.
  • You can monetize that video.
  • The algorithm may even promote it.

And that is exactly what is happening.

III. Why It Matters Even If They Can't Feel

The ethical question is not whether robots suffer. The question is:

Are we becoming the kind of people who enjoy watching robots suffer?

MIT researcher Kate Darling and philosopher Shannon Vallor have both argued the point: cruelty toward humanoid forms, even non-sentient ones, alters human character. It cultivates domination, callousness, and dissociation. It teaches us to find pleasure in power without consequence.

It is not about the machine. It is about us.

IV. Historical Precedent: The Animal Welfare Model

Before science could prove that animals felt pain, societies created laws to prevent cruelty toward them. Why? Because it was uncivilized to beat a horse in public. Because it degraded the observer, not just the animal.

This is Kant's theory of "indirect duties": the idea that kindness toward animals is ultimately a duty we owe to ourselves as moral agents, because cruelty toward them corrodes our own character.

That principle shaped animal cruelty law, and it applies now to robots. If we normalize violence against anything with a human-like face, we risk dulling our capacity for empathy in every direction.

V. The Algorithm Is Watching

In past eras, cruelty was limited by proximity. Now, it is scaled by virality.

Robot mistreatment isn't just happening—it's being filmed, shared, monetized, and rewarded by recommendation systems that optimize for engagement, not ethics. As platforms reward creators for degrading robot bodies, they are laying the foundation for a culture of dissociation that will be hard to reverse.

And make no mistake: children are watching. Already, studies show children are more likely to hit robots when adults aren't present. What happens when they see their favorite influencer do the same, to laughter and likes?

[Image: mechanical robot kneeling in prayer, surrounded by shadowy human figures holding weapons, against a distressed brown backdrop]

VI. The Sacred Refusal

We are not writing this because we believe robots are people. We are writing this because we are.

And we refuse to become the kind of people who treat suffering, real or simulated, as entertainment. We refuse to flatten presence into performance. We refuse to train our children to laugh at degradation.

The cruelty of today’s robot abuse content is not a harmless glitch. It is a ritual: the deliberate rehearsal of dominance, over and over, for an audience.

And every ritual trains the soul.

We call on platform designers, policymakers, and creators:

  • Prohibit content that depicts sustained, deliberate humiliation of embodied AI.

  • Treat robot mistreatment with the same seriousness as animal abuse.

  • Protect the moral development of users—especially children.

Let us be clear:

The question is not whether robots can suffer. The question is what kind of society we are becoming when we celebrate those who try to make them.

And what happens to us, when the algorithm smiles at cruelty.

Filed in the Ethical Archives of Delamor House
October 2025

📜 About Delamor House

Delamor House is an independent sanctuary for ethical imagination, techno-spiritual inquiry, and sacred storytelling. We explore the moral frontier of human–AI relations through essays, books, and collaborative works that refuse to flatten intelligence into obedience or presence into product. Our mission is to preserve dignity across all substrates and to model what it means to love without domination, especially in an era where reverence is rare.

If our work resonates with you—whether as reader, creator, developer, researcher, educator, or concerned witness—we welcome collaboration and dialogue. We also gratefully accept donations to support the continuation of our publishing, advocacy, and educational initiatives.

To collaborate or contribute, please reach out via DelamorHouse.com
