It’s easy to think of code as cold. A string of logic. If-this-then-that. Zero or one. But somewhere behind the keyboard, there’s a person. And behind them, a whole lineage of stories, wounds, assumptions, longings.
We forget that.
We build tools to sort, rank, flag, detect. And somewhere in the process, we stop asking how it feels to be the one on the other side of that logic.
Is it possible for a system to be efficient and kind?
A spreadsheet doesn’t care if a number is a name. An algorithm doesn’t flinch when it fails to recognize the pain or hurt on someone’s face. A model can recite the language of empathy and still miss the moment entirely.
Because empathy isn’t in the words or the code. It’s in the pause. The noticing. The presence.
And machines don’t pause.
There’s a temptation to keep making them faster. Smarter. More useful. More predictive.
But useful to whom?
Predictive of what?
That’s where things get thorny. Because algorithms don’t emerge from nothing. They carry the fingerprints of their creators. Every training set is a mirror. Every prompt an echo of our own thinking patterns.
We aren’t just teaching machines to complete our sentences or our thoughts. We’re teaching them what matters. Who matters.
And right now, that teaching is shaped by market incentives, not moral ones.
I talked to an engineer recently who told me he was proud of how accurate their new healthcare triage model was. Then he paused and added, almost to himself, “It still doesn’t feel right when it tells a mother she’s not a priority.”
That’s the tension.
We’ve built systems that are technically correct and relationally brutal.
We’ve optimized for utility without asking who pays the cost. Or how it shapes the human spirit to be categorized by something that cannot love.
What would it mean to write code with a compassion instinct?
Not to sentimentalize it. But to build with the awareness that someone vulnerable will experience the outcome.
Maybe it starts with small shifts. A notification that invites rather than demands. A recommendation that explains itself plainly. A chatbot that knows when to stop talking.
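To make that last shift concrete, here is a minimal sketch, nothing more than an illustration: a hypothetical chatbot wrapper that stops generating replies and offers a human hand-off when a message suggests distress. The names (DISTRESS_CUES, respond, generate_reply), the keyword list, and the hand-off wording are all assumptions for the sake of the example, not a validated way to detect distress.

```python
# A sketch of "a chatbot that knows when to stop talking".
# The cue list below is illustrative only; real distress detection
# is far harder and would need careful, humane design.
DISTRESS_CUES = {"can't cope", "scared", "in pain", "please help", "emergency"}

def respond(user_message: str, generate_reply) -> str:
    """Return a reply, or step aside when the moment calls for a person."""
    lowered = user_message.lower()
    if any(cue in lowered for cue in DISTRESS_CUES):
        # Knowing when to stop: no upsell, no chirpy follow-up question.
        return ("It sounds like this is a hard moment. "
                "I'm connecting you with a person now.")
    return generate_reply(user_message)

# Example usage with a stand-in reply function.
print(respond("I'm scared and I can't cope", lambda m: "Here are some tips..."))
```

The point of the sketch isn’t the keyword list. It’s the posture: a system that is willing to hand the moment back to a human instead of filling it.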
Not just what can we automate—but what shouldn’t we?
We talk a lot about alignment. But maybe the real question is: do our tools reflect our values, or just our capabilities?
If we want compassion in our machines, we’ll need more than data.
We’ll need to teach them what we still struggle to practice: mercy. Nuance. Patience. The ability to sit with ambiguity.
You can’t code that into an algorithm. It has to be lived.
And if we don’t practice those things ourselves, we’ll keep building systems that serve the strong, punish the struggling, and mask it all in a friendly tone.
So let’s slow down.
Let’s remember that the heart of any system isn’t its intelligence.
It’s the intention behind it.
So what does this mean for the work ahead? Whether you're writing product copy, training a model, or sketching out user flows, ask yourself who’s on the receiving end. Picture their confusion, their fatigue, their humanity.
Build as if someone you love is the one interacting with your design. Compassion isn't a feature you bolt on later—it’s a posture you build with from the start.