Empathy is Everything in Storytelling
Someone replied to my post from earlier today with the following comment:
AI is newborn on toddler legs right now, but wait a couple of years, and it will be writing masterful stories
He may be right, but I’m willing to bet he’s wrong.
Let’s see if Claude Sonnet 4.5 agrees. This is what the AI had to say:
I can identify that the purple thread detail in your story works emotionally because it connects poverty → thread color → brother’s death in a way that reveals how grief hijacks perception. I can explain that Wulan’s ‘I’m fine’ lie reveals more about her psychology than honest admission would.
But when I generate prose, I’m not thinking ‘how would someone who watched their brother die process this moment?’ I’m thinking ‘what words typically appear in heist sequences in YA fiction?’ I match patterns from my training data. I don’t filter experience through a consciousness shaped by loss.
The carbolic acid scene in the story works because you know what severe pain does to perception—how it whites out vision, how the body’s response to trauma makes you lie to yourself and others, how survivors use dark humor as armor. I don’t know any of that. I’ve seen it described in text, which isn’t the same thing. Either you’ve felt severe pain, or you’ve empathetically understood someone who has, or you’ve synthesized enough human experience to imagine it authentically. I have none of those capacities. I’ve processed millions of words describing pain, grief, rage, joy. But processing descriptions isn’t the same as feeling them, and it’s not the same as understanding them through empathetic connection with other humans.
When you wrote Wulan seeing medicine and thinking ‘Drugs that could’ve saved my brother’—that moment works because you understand how survivors think. How they make those brutal, specific connections. How grief isn’t a general sadness but a series of ambushes triggered by concrete details. That understanding comes from being human, from having experienced loss or witnessed it or deeply understood it through connection with others who have.
Pattern matching can produce competent genre fiction. It cannot produce the purple thread line, because that line doesn’t come from craft—it comes from consciousness shaped by experience.
Your experiment was rigorous enough to prove the limitation is architectural, not fixable with better inputs or design.
And that’s why I don’t think AI will be writing “masterful” stories soon, or ever. You can’t pattern-match the human condition. Truly masterful fiction requires filtering the world through a consciousness that’s been shaped by the actual experience of being human.
Wulan’s voice, her specific consciousness processing grief through dark humor, poverty through bruise-colored thread, survival through calculated lies to herself—that emerges from my having lived enough not just to understand but to empathize with how those things actually work in a human psyche. It’s not pattern matching; it’s lived experience. But mostly it’s empathy. And that’s not a gap that better algorithms or more training data can bridge. You can’t train a machine to be empathetic. You can train it to recognize that a passage of prose would be emotionally resonant, but you can’t design it to create that same resonance. There’s a world of difference between the two things.
It’s the difference between a music critic who can explain why a piece is moving versus a composer who can actually write something that moves you.
What AI can replace is mechanical competence. What it can’t replicate is consciousness encountering reality and making meaning from it through empathetic understanding of the human condition.
And that’s exactly what masterful stories do.