DATE
March 19, 2026
Category
AI
Reading time
7 min
Is Sam Altman's "Thank You" to Engineers Actually His Own Eulogy?

This week, Sam Altman posted something many people read as tone-deaf.

I think it was something more revealing than that.

He wrote:

"I have so much gratitude to people who wrote extremely complex software character-by-character. It already feels difficult to remember how much effort it really took. Thank you for getting us to this point."

Most people heard a pink slip.

I heard a eulogy.

Not for engineers.

For a way of thinking. And maybe, more quietly, for the version of himself that was forged inside that world.

That is what made the post so unsettling.

Because everything that made Sam Altman credible in the first place was built in the age he now seems to be waving goodbye to. His authority did not come from vibes. It did not come from branding. It came from being close enough to the craft of engineering to understand it, evaluate it, speak its language, and earn the trust of people who do the work. Even as a non-coder, his legitimacy was inseparable from a world in which coding was difficult, technical judgment mattered, and building serious software required pain, patience, and real intellectual effort.

That world is what gave leaders like him their footing.

And yet in a few short sentences, he seemed to speak as though that world had already receded into memory.

The Tell

The most revealing line in the post is not the thank you. Gratitude is easy. It is not even the implied suggestion that software will increasingly be written by machines. That claim has become standard industry theater by now.

The revealing line is this one:

"It already feels difficult to remember how much effort it really took."

That is the tell.

Because that sentence does not read like observation. It reads like distance. It reads like someone standing on the far side of the struggle, looking back not only at what others did, but at what he himself once had to understand to matter in this conversation.

It sounds less like celebration than mourning.

Not because engineering is dead. It is not. And not because programmers are about to disappear. They are not.

It sounds like mourning because in the world he is trying to call into being, the source of so much hard-won authority begins to evaporate. If software can now be summoned rather than built, if code becomes cheap, ambient, and automatic, then something larger disappears with it. Not just craft. Not just labor. But the very basis on which an entire generation of technical leaders established credibility.

That includes the founders, executives, and investors who built careers around being able to sit near the hardest part of the room and understand what was happening there.

That is why I do not think this was merely tone-deaf. I think it was existential.

The Deeper Irony

There is a deeper irony here that almost no one is talking about.

The more aggressively our industry declares that coding is going away, the more it destabilizes the hierarchy that gave the current generation of leaders its status. If the craft no longer matters, then the prestige attached to understanding that craft begins to weaken too. If code becomes disposable, so does some portion of the worldview that organized Silicon Valley for the last thirty years.

That is a dangerous story for an industry to tell about itself.

And yet it keeps telling it.

Why?

Partly because it is seductive. Every era wants to believe it is living through the final simplification. The final unlocking. The moment when friction falls away and intelligence itself becomes infinitely scalable. Silicon Valley, more than most cultures, is addicted to narratives of inevitability. Once a story starts working financially, morally, and rhetorically, people stop examining whether it is true. They begin repeating it because repeating it does work in the world. It raises capital. It justifies valuations. It signals modernity. It disciplines labor. It gives executives a cleaner language for explaining messy economic decisions.

That does not mean every person repeating the story is lying.

In fact, the more dangerous scenario is usually the opposite. The dangerous leaders are not the ones consciously deceiving everyone. They are the ones who have lived inside a narrative so long that they can no longer feel where the narrative ends and reality begins. They stop noticing the slippage. They stop hearing what their own words reveal.

That is what I think happened here.

What Engineering Actually Is

Because the technology itself is nowhere near as settled as the rhetoric suggests. We are still watching companies struggle with reliability, trust, hallucinations, security, workflow integration, and the simple fact that generating code is not the same thing as engineering software. Producing syntax is not the same thing as making systems work. Anyone who has spent real time around production environments, brittle dependencies, edge cases, conflicting requirements, legacy architecture, regulatory constraints, or human teams under pressure understands this instinctively.

The hard part was never merely typing characters into a machine.

The hard part was knowing what should be built. Why it should be built. How it would behave when reality pushed back. What tradeoff was worth making. What failure could be tolerated. What elegance looked like under constraint. What future debt was being incurred by present convenience.

That is what engineering is.

And none of that disappears because a model can autocomplete faster than a human can type.

If anything, the arrival of generative AI makes those deeper forms of judgment more important. Not less. Once code becomes abundant, discernment becomes scarce. Once output is cheap, clarity becomes expensive. Once anyone can generate ten possible solutions in seconds, the real advantage shifts to the person who knows which of the ten should never be deployed.

That is why the phrase "AI replacing engineers" has always felt intellectually sloppy to me. It confuses one layer of the stack with the whole thing. It mistakes visible labor for actual difficulty. It assumes that because one part of the task gets easier, the problem itself has been solved.

It has not.

The difficulty has just moved.

From typing to choosing. From producing to judging. From building line by line to understanding consequence by consequence.

This is not the death of engineering.

It is the exposure of what engineering always was.

Cultural Amnesia

And that is why I think Altman's post landed so strangely. Beneath the gratitude was a kind of forgetting. A soft farewell to the age when effort was visible, when cognition had texture, when struggle was still legible enough to command respect.

Maybe that is why so many people reacted viscerally. Not because they are afraid of tools. Engineers are usually the first to embrace good tools. They are not sentimental about inconvenience. They are sentimental about meaning. What people heard in that post was not simply "the tools are getting better." What they heard was something closer to: the thing you spent your life mastering is already becoming hard to remember.

That is not technological optimism.

That is cultural amnesia.

And it matters, because civilizations do not collapse only when they lose capabilities. Sometimes they decline when they lose memory of what those capabilities cost, and why that cost was formative. Once a society starts treating all friction as waste, it eventually loses touch with the role difficulty plays in producing judgment, taste, resilience, and wisdom. It begins to mistake convenience for intelligence.

What Comes After AI 1.0

That may be the defining error of AI 1.0.

AI 1.0 has largely been about automation dressed up as intelligence. It has been obsessed with eliminating effort, compressing time, flattening process, and making human contribution look increasingly optional. It has mistaken faster for smarter and more for better. It has been a triumph of output over depth.

But that phase will not last forever.

Because eventually institutions run into reality. They discover that removing humans from thinking-intensive processes does not eliminate risk. It often multiplies it. They discover that scale without judgment creates mediocrity at industrial volume. They discover that trust, originality, responsibility, and interpretation do not disappear simply because a machine can produce plausible language or plausible code.

And when that realization deepens, a different kind of AI will begin to matter.

Not AI that makes human thinking irrelevant.

AI that makes human thinking matter more.

That is the future I believe in. Not because I am nostalgic for the past, but because I understand what the past was teaching. The answer was never to preserve unnecessary labor for its own sake. The answer was to honor the forms of cognitive struggle that develop insight, rigor, and discernment. The next generation of tools should not erase those things. They should sharpen them.

That is what AI 2.0 will be about.

Not replacing the engineer but elevating the level at which the engineer operates. Not removing human judgment, but demanding more of it. Not celebrating the disappearance of effort, but recognizing that the most important effort was never just mechanical to begin with.

So no, I do not think Sam Altman wrote a eulogy for engineers.

I think he may have written a eulogy for the cognitive world that made his own authority possible.

And perhaps the strangest part is that he did not seem to realize it.

The engineers he appears to be thanking on their way out are not going anywhere.

They are the ones who will build what comes after this phase. They are the ones who understand the difference between generation and judgment. They are the ones who know that making something work is still harder than making something appear. They are the ones who remember what effort felt like.

And in the end, that memory may matter more than all the output in the world.

Conclusion

Sam Altman's thank you to engineers wasn't tone-deaf — it was a eulogy. Not for engineers, but for the cognitive world that gave his entire generation of tech leaders their authority. The most revealing line wasn't the gratitude. It was "it already feels difficult to remember how much effort it really took." That's not celebration. That's distance. And it matters — because the difficulty of engineering never lived in the typing. It lived in the judgment. Once code becomes abundant, discernment becomes scarce. The hard part didn't disappear. It just moved.

Written by Stephen Klein, Founder/CEO of Curiouser.AI


Stephen Klein is Founder & CEO of Curiouser.AI, the only AI designed to augment human intelligence. He also teaches at UC Berkeley. Alice 2.0 waitlist is now open — the first complete AI thought-leadership system designed to amplify individuality, not replace it. Join the waitlist at curiouser.ai. Curiouser is community-funded on WeFunder.