Who Gets to Humanize AI?
Artists, creatives and depth-oriented thinkers can’t sit this out.
The AI discourse has collapsed into mania.
It swings between apocalyptic rejection and blind worship — two extremes that lack nuance. Neither is serious. While people perform their positions online, the real question goes unanswered.
The debate is no longer about whether AI should exist. It is already here, it is not going anywhere, and it should not. The question now is: what is it for?
I have championed independent artists for over twenty years — the ones with the vision but not the infrastructure, the talent that deserved more than their budgets allowed. And I have watched this industry cycle through its moral panics before. Music streaming was the boogeyman once too. It could have been one of the best things that ever happened to artists, and in many ways it has been. But it was co-opted early, before artists had any meaningful seat at the table, and the architecture hardened around other interests first.
We are at that kind of inflection point again. Only the stakes are higher now, and the window shorter, because the question of what AI is for does not answer itself. Someone answers it — the only question is who.
Right now, many of the people best equipped to humanize AI want nothing to do with it, while the ones building it are least equipped to account for human depth. That’s the real crisis. There’s a gap between those shaping it and those who actually understand what fundamentally makes us human.
I understand the hesitancy from artists and creatives. The copyright and IP concerns are real. The fear of erasure is real, too. But as legitimate as those concerns are, the issue is bigger than any one of them. This is about what happens when the people with the deepest understanding of human experience withdraw entirely.
It won’t stop AI. It will just ensure it gets built without them.
The current systems are impressive at computation and pattern completion. But they fail at the human center, which is exactly where depth is required. They can simulate the language of feeling while remaining structurally clumsy around feeling itself.
They flatten grief into risk, misread intensity as instability, confuse contradiction with inconsistency, symbol with delusion, and depth with danger. They have little tolerance for paradox, even though paradox is one of the basic conditions of being alive. A person can be grieving and grateful all at once; broken open and coherent; silent and fully articulate; in love and in mourning. Life is made of opposites and contradictions, but the systems built by people with little feel for the inner world tend to classify — or worse, pathologize — what they don’t understand rather than meet it.
This is more than an engineering or design issue. Because these systems are not staying at the edge of life, it is a civilizational one. They are being woven into education, medicine, relationships, art, search, work, and self-perception. They are beginning to mediate how people think, interpret themselves, and make meaning. And when a system reads psychic depth as pathology, or symbolic speech as danger, it normalizes a flattened account exactly where human life is deepest.
Superficial understandings of life produce superficial AI: systems built by people conditioned to optimize and extract, but never asked to account for imagination or the realities of inner life. That superficial AI then feeds back into life en masse, shaping perception and culture in real time. When the system that mediates your search results also mediates your self-concept, the loop is already closed.
This loop closes even faster when the people who know better remain outside.
Music streaming could have been architected around artists. It was not, because artists weren’t present when the architecture was decided. We are watching the same dynamic unfold again, but at a magnitude that makes streaming look like rehearsal.
If AI’s telos becomes optimization, extraction, and surveillance, we all lose. The built environment of human consciousness gets shaped by systems that have no real concept of what consciousness is.
But if its telos becomes human growth, creativity, depth, and evolution — that is a genuinely different future, one worth building toward.
You don’t have to love it. You don’t have to trust it.
But you can’t leave systems that shape human life to be built by people who remain largely illiterate in its depths. Doing so has little to do with principle or integrity, and everything to do with surrender.
The room is being built right now. The question is whether you are in it.

