"I didn't have to write a single line of code myself. But it felt wrong."
Maya has been a freelance software developer for eight years. She used to describe what she did as "building things." Not "writing code" — building. The distinction mattered to her. Code was the material. Building was the act. The thinking, the architecture, the feeling of holding a system in her head and shaping it into something that worked — that was what she loved.
She still writes code every day. She still ships features, fixes bugs, delivers projects on time.
But she doesn't build anymore. She assembles.
The erosion nobody names
There's a version of the AI anxiety conversation that dominates the headlines: replacement. Sometimes that replacement is built from your own work — a custom GPT trained on five years of your output. That's the dramatic, visible, terrifying version.
But there's another version that barely gets discussed. It's quieter. Slower. And in some ways, worse.
"'Developer' used to mean 'person who writes code.' 'Developer' now means 'person who builds solutions using all available tools, including AI.'"
That sounds reasonable on paper. In practice, it describes something a developer in Haven AI's research put more bluntly: the experience of completing an entire feature without writing a single line of code. The feature shipped. The client was satisfied. The code worked. And the developer felt wrong.
Not guilty. Not lazy. Wrong. As in: something essential about the experience of doing the work was missing. The struggle. The friction. The moment when the problem is hard enough that your brain has to do something it hasn't done before.
AI tools don't take that moment away by failing. They take it away by succeeding. The tool suggests a solution. The solution is plausible. You accept it. And the neural pathway that would have formed from solving the problem yourself never fires.
One skipped insight doesn't matter. A thousand skipped insights over a year produces a professional who looks the same on the outside but is hollow on the inside.
The joy that disappeared
"The joy of coding is gone too. What am I even doing here — why not just let the product manager prompt the LLM?"
That question — "why not just let someone else prompt the machine?" — is the hollowing-out expressed as logic. If the tool does the thinking, and the human approves the output, then anyone who can approve output can do the job. The years of expertise that made you good at building things are no longer required. You're a review layer now.
Maya noticed it gradually. The first six months of using AI tools felt like a superpower. She was faster. Her output was cleaner. She could take on more projects.
Then the feeling shifted.
"The sad part about vibe coding is you learn very little. And to live is to learn."
She wasn't learning anymore. Each project felt like the last one. The AI suggested the same patterns. She approved the same solutions. The problems that used to challenge her — the ones that required holding the whole system in her head, reasoning about interactions, making judgment calls about trade-offs — those problems didn't go away. She just stopped encountering them, because the AI handled the surface-level complexity that used to be the gateway to the deep work.
The deep work is where expertise lives. Without the gateway, you never reach it.
"You'll notice people vibecoding all day become less and less attached to the product they work on."
Less attached. That's the emotional signature of hollowing-out. When you stop struggling with the work, you stop caring about the work. You didn't choose to stop caring. Care is a byproduct of investment, and the tool eliminated the investment.
The invisible threshold
"I find it difficult to write code these days with the joy of when I was younger, and it is hard to motivate myself if there's no money involved. The current wave of 'AI' only makes the problem worse, and adds a dark sense of impending doom."
That quote comes from a developer who can't find freelance work. But Maya, who has plenty of work, describes a parallel version of the same loss. The work is there. The joy isn't. The tool didn't take the work away — it took the part of the work that made the work meaningful.
There's a threshold that Maya crossed without noticing. On one side: developer who uses AI tools. On the other side: person who approves AI output. The work looks the same. The timesheets look the same. The deliverables look the same.
The experience of doing the work is completely different.
"77% of developers are spending less time writing code."
Seventy-seven percent. Three out of four developers doing less of the thing that defines them as developers. The industry calls this "productivity." The developers call it something else.
"Almost half believe their core skill might become secondary to prompt engineering."
Secondary. The thing they spent years mastering — the thing that drew them to the profession, that gave them their identity, that made them feel like they were contributing something real — might become secondary to the ability to type instructions into a text box.
Beyond code
The hollowing-out is not limited to developers. It's showing up wherever AI tools have become embedded enough to change the experience of doing the work.
Writers who use AI for first drafts describe losing the productive friction of the blank page. The struggle of starting — of not knowing what you think until you write it — is where their best insights came from. With AI generating the starting point, the insights arrive pre-formed. Reasonable. Generic. Missing the specific, surprising, distinctly human thing that only emerges from a mind wrestling with emptiness.
"She is suddenly producing work at twice her normal speed, and it all sounds... competent but generic."
Competent but generic. That's the signature of hollowed-out work. It passes every objective test. It meets every brief. It hits every keyword. And it has no soul.
The observer who said this was watching a colleague — a writer — produce AI-augmented content. The colleague hadn't been replaced. She'd been eroded. The distinctive voice that used to make her work recognizable was being averaged out by the machine, one draft at a time.
Photographers describe the same pattern. The technical skill of lighting, composition, and timing — the eye that took decades to develop — atrophies when AI handles the editing, the color grading, the selection. The photographer still presses the shutter. But the thing that made the photograph theirs is happening less and less inside their own judgment.
The replacement that isn't one
This is the cruelty of hollowing-out: you keep your job. You keep your income. You keep your title. You keep your clients.
You lose yourself. This is the identity crisis AI didn't create but is deepening, and the wound shows up across every discipline, not just engineering.
"New developers who lean entirely on AI and never build real mental models from getting into messy code are building on sand."
Building on sand. The metaphor captures the structural instability of expertise that was never earned through struggle. It looks solid. It functions. It will hold — until the moment it encounters something the AI can't handle, and the developer discovers they can't handle it either.
Maya hit that moment on a client project last month. A complex integration failure that no AI tool could diagnose because it involved the interaction between three systems, two legacy APIs, and a business logic decision made four years ago by someone who no longer worked there.
She needed to reason from first principles. Hold the whole system in her head. Trace the logic manually. The kind of deep, slow, frustrating work that used to be her strength.
It took her three times as long as it would have a year ago.
The tools hadn't replaced her. They'd made her worse at the thing that made her irreplaceable.
The bind at the bottom
This is The Impossible Bind in its most insidious form. The bind doesn't just trap your market position. It traps your capability.
Refuse the tools and lose competitive advantage. Adopt the tools and lose the expertise that made you valuable. The bind closes — not with a dramatic replacement, but with a gradual erosion that you don't notice until the day you need the skill that atrophied.
Maya has started coding without AI one day a week. She calls them "craft days." The work is slower. Harder. More frustrating.
It also feels like something. Not the frictionless efficiency of AI-augmented output. Something rougher. More human. The feeling of solving a problem with her own mind — and remembering that she can.
The hollowing-out is reversible. But only if you can see it happening. And seeing it requires stepping far enough outside the efficiency narrative to ask a question the tools will never ask you:
What are you becoming?
Haven AI is a voice-based AI coaching platform for freelancers. Ariel, your AI guide, uses Socratic questioning to help you see the patterns you can't see alone — and remembers your whole journey as you navigate it.