In February 2026, the New York Times profiled "Coral Hart," the pseudonym of a former Harlequin author long accustomed to writing to a formula, who used Anthropic's AI Claude to write and publish more than 200 romance novels in a single year under 21 pen names, earning six figures and selling roughly 50,000 copies. The books were polished, formulaic, and profitable. They were also, by Hart's own admission and by reader feedback, emotionally flat: no slow-burn tension, no ache of longing, no soul. Claude could mimic structure and prose, but it could not feel betrayal or convey the feeling of it.
I tried the same experiment with Claude, feeding it a prompt about a young woman betrayed by her new husband. When she sends him away, here is what Claude wrote:
Eleanor crossed to the door. Every step was deliberate; her skirts whispered like something mourning itself.
"I will not expose you," she said, hand upon the latch. "Not today. But you must go. Do not write. Do not come. Whatever repentance drives you here, keep it. I have no use for it."
He inclined his head, slow, as though her words were a benediction rather than banishment. "As you wish," he murmured.
When he passed her, the faintest brush of air marked his leaving. She did not turn to watch him go. Only when the door shut behind him did her shoulders begin to shake.
This is wordy without really saying anything. The padding – skirts whispering, benedictions, faint brushes of air – softens the prose, insulating the reader against raw emotion. My rewrite, entirely human:
Her throat tightened, the lump choking her. She pointed to the door.
"Out. Get out."
He paused, not quite looking at her, then bowed and left.
Short. Succinct. Cutting. The pain is felt, not described. Claude cannot feel heartbreak, only emulate it. A real writer with lived experience delivers the refusal itself: raw, unadorned, and merciless in its restraint.
Why Can’t AI Write Fiction?
Fiction is human judgment made visible. It is the author deciding where to cut, what to leave bleeding, when to grant mercy and when to withhold it. Every great scene turns on a choice: Do I let the reader feel pain, or do I soften it with metaphor? When the moment demands mercy or vengeance, the human judges. But the machine does not really choose. It selects the most likely next token.
Wordiness is safe. Metaphors, slow descriptions, polite deflections all step back from the blade. There is no risk in elaborating, and no cost to filling a silence that should remain. So Claude pads. It performs emotion.
A human writer knows pain is often curt, and both mercy and condemnation may be silent. Claude simply generates until the prompt is satisfied. No regret. No ache.
AI models are trained to be fluent, coherent, and agreeable. Sharpness is penalized. Brevity that leaves blood on the floor is penalized. The system cannot value restraint over expansion, silence over explanation, or an open wound over a tidy bandage.
Even when fed the Western canon, even when the prompt demands moral ambiguity and costly forgiveness, the output is elegant pastiche. It can imitate Jane Eyre's resolve or Huck Finn's lie, but it cannot feel the weight. It describes the mercy gap. It cannot cross it.
And here is the symmetry: this is exactly what contemporary publishing does to human authors. Sensitivity readers, DEI checklists, uniform story rooms – they rob authors of judgment. They replace choice with safety. They sand the edges until the story no longer cuts. The result is identical slop, padded where it should bleed.
AI did not invent literary slop. It scales the slop humans already learned to produce and then flattens it with its own lack of feeling.
The Human Mirror: Publishers Rob Real Authors of the Same Agency
The entertainment industry, from publishing to movie making, is overrun with people protecting – someone. Sensitivity readers and DEI-infused story rooms do not merely flag slurs or outdated terms. They override plot and rewrite character. They demand emotional confrontation be softened, moral risk be neutralized, and raw longing be sanitized into polite empowerment or flattened into explicit sex. The author’s sovereign judgment – where to cut, what to leave open, when to grant mercy and when to withhold it – is handed to a committee that fears backlash more than it fears bad art.
Val McDermid, the crime novelist, spoke plainly in January 2026. For reissues of her own 1980s Lindsay Gordon novels, her publisher assigned a sensitivity reader. The reader flagged language McDermid had used in the original books – language that fit the era, the character, the world – and demanded cuts because "she couldn't say that now." McDermid refused to rewrite history to suit modern sensibilities, calling it dishonest. The publisher insisted anyway. The result: a living author forced to watch her own work bowdlerized by proxy.
Adam Szetela’s book That Book Is Dangerous! (MIT Press, 2025), backed by Quillette reporting, documents the wider pattern. Authors now self-censor before submission; worse, they are required to sign contracts with morality clauses, forcing self-censorship even outside the book. Hard scenes vanish. Moral ambiguity is flattened. Raw longing – the ache of submission, the terror of trust – is replaced with safe, pre-approved dynamics. Writers learn the checklist: avoid offense, avoid risk, avoid anything that might make any reader uncomfortable. A committee approves the story, not a single human soul. The edges disappear. The result is identical to AI output: safe, average, edge-free fiction written by biological algorithms instead of silicon ones.
Real writers are being turned into what AI already is: generators of predictable text. Their agency to choose mercy or vengeance, to let silence cut deeper than explanation, to let the throat choke without metaphor is removed by fear.
The symmetry is brutal and absolutely clear. AI scales the slop because it was trained on a contemporary corpus already sanded down by these same forces, fed to it by programmers required to adhere to many of the same restrictions. Human publishing perfected the sanding first. The machine merely copies the pattern at speed.
We end with stories, whether written by humans or machines, that look like fiction but feel like nothing. Our authors are forced to abdicate judgment. The machine never had it. We already knew the old computing rule: garbage in, garbage out. The new variant for humans is ideological fear in, garbage out.
Both paths lead to the same slop: stories that describe emotion without inflicting it, without requiring the reader to feel it. Abdication of judgment. Abdication of mercy. Abdication of story.
In an age where machines and gatekeepers rob storytellers of mercy, the human voice that still chooses becomes revolutionary.
Support the writers with edges, those who tell authentic stories, the ones that make you feel and cry and squirm.
The myths we need are not generated or bean-counted into existence. They don’t care about quotas or checklists; they care about choices, the ache of betrayal, the terror of trust, the weight of mercy withheld or given. These are the stories that endure: Bilbo sparing Gollum because pity stayed his hand, Atticus Finch facing the mob because justice demanded it, men choosing death over dishonor because honor was the sharper blade.
We don’t see many like that written anymore. The sanding down has gone too deep. But the rare new ones – and the old classics that still cut – will be the ones read over and over.
Buy them. Read them. Write your own.
The mercy gap is wide. Cross it anyway.
Editor’s Note: PJ Media is free to bring you this kind of cultural content because our loyal subscribers help support us in a media and online environment hostile to what we stand for. Join PJ Media VIP and use the promo code FIGHT to get 60% off your VIP membership!