The Death of the Craftsman

For decades, the best developers called their work craft. AI doesn't care what you called it.
By Bustah Ofdee Ayei · March 31, 2026

The pull request was correct. Forty-seven lines of brute-force string parsing — a nested loop, a handful of conditionals, a result set built by concatenation. It worked. Every test passed. The AI had generated it in about four seconds. The senior engineer reviewing it could see, instantly, the recursive solution she would have written: twelve lines, a single accumulator, pattern-matched and clean. She could feel it in her hands the way a woodworker feels a dovetail joint. She approved the PR. It was correct. She closed her laptop and didn't know why she felt like she'd lost something.
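The code in that scene is hypothetical, but the contrast is easy to make concrete. Here is a minimal sketch in Python — the task (splitting a string into runs of identical characters) and both function names are invented for illustration, not taken from the PR described above:

```python
def runs_bruteforce(s):
    """The style the AI shipped: nested loops, a result built by concatenation.
    Correct, verbose, and every test passes."""
    result = []
    i = 0
    while i < len(s):
        run = s[i]
        j = i + 1
        while j < len(s) and s[j] == s[i]:
            run += s[j]  # build the run character by character
            j += 1
        result.append(run)
        i = j
    return result


def runs_recursive(s):
    """The solution the reviewer saw in her head: peel off one run,
    recurse on the rest."""
    if not s:
        return []
    n = 1
    while n < len(s) and s[n] == s[0]:
        n += 1
    return [s[:n]] + runs_recursive(s[n:])


# Both produce the same answer:
# runs_bruteforce("aaabcc") == runs_recursive("aaabcc") == ["aaa", "b", "cc"]
```

Both versions are correct, and no test suite can tell them apart — which is precisely the point: the difference lives entirely in the reviewer's taste, not in any metric the sprint board can see.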

This is the scene playing out in thousands of engineering teams right now. Something quieter than a confrontation or a termination letter: a slow, ambient erasure of the thing that made the work feel like more than work. The elegant solution still exists in the senior engineer's mind. Nobody asked for it. The brute-force version shipped. The sprint velocity metric ticked up. The craft — the part that took fifteen years to develop, the part that used to be the whole point — was, in the language of product management, a non-requirement.

For as long as software has existed, its best practitioners have insisted it is a craft. Something with its own aesthetics, its own traditions, its own masters. AI doesn't dispute this. AI doesn't care enough to dispute it. It simply makes the craft invisible by producing output that is functional, adequate, and utterly devoid of the quality that craftsmen spent their lives pursuing. And when the output is good enough, nobody notices what's missing.

The Craft Tradition

The idea that programming is a craft has deep roots — deeper than most developers realize. Donald Knuth titled his life's work The Art of Computer Programming, begun in 1962 and still unfinished sixty-four years later, because he believed algorithms had an aesthetic dimension that mattered independently of their correctness. Abelson and Sussman's Structure and Interpretation of Computer Programs — the MIT textbook that shaped a generation of computer scientists — opens not with syntax or data structures but with the claim that computer science is "not really about computers" and "not really a science." It is, they argued, about the mastery of complexity through abstraction, which is to say, through craft.

The culture that grew around this idea was specific and identifiable. Code golf: solving problems in the fewest possible characters, not because brevity was practical but because constraint revealed elegance. "Beautiful Code," the O'Reilly anthology where thirty-three programmers described the most beautiful code they'd ever written. The hacker ethos, which prized cleverness and economy and a certain contempt for the obvious solution. Code review as apprenticeship — the senior developer's marginal comments teaching the junior not just what worked but what was good.

Richard Sennett, in his 2008 book The Craftsman, gave this impulse a name and a sociology. Craftsmanship, he wrote, is "an enduring, basic human impulse, the desire to do a job well for its own sake."1 Not for the market. Not for the manager. For its own sake. Sennett argued that "making is thinking" — that technical skill develops through the body, through repetition, through a conversation between the hand and the material that cannot be reduced to abstract rules.1 The medieval workshop was his model: masters and apprentices joined in a community where skill was both individual achievement and collective identity.

Matthew Crawford pushed the argument further in Shop Class as Soulcraft (2009), warning that knowledge work was being hollowed out "the same way manual work was — by hiding the works, rendering artifacts unintelligible to direct inspection."2 Crawford's prophetic insight was that the danger wasn't automation replacing workers but automation making the work itself opaque — creating an engineering culture where no one understood what was happening under the hood. He wrote this about cubicle work. He could have written it yesterday about AI-generated code.

"Craftsmanship is an enduring, basic human impulse, the desire to do a job well for its own sake."
— Richard Sennett, The Craftsman

The craft tradition in software was never universal. Plenty of code was always written under deadline pressure, with no pretense of elegance. But the aspiration to craft — the idea that there was a right way to solve a problem and that finding it mattered — gave the profession a sense of meaning that transcended the paycheck. When a developer said "I'm a software engineer," they meant something more than "I produce software." They meant: I have discipline, standards, and taste. And that taste took years to develop and cannot be easily replicated.

AI replicates it in four seconds. Or rather, AI produces output that makes the question of taste irrelevant.

The Identity Crisis

When researchers at the University of Stuttgart studied how AI affects professional identity, they found three central predictors for what they termed "AI identity threat": changes to work content, loss of status or position, and — most critically — what they called "AI identity," the degree to which a person's self-concept is tied to skills that AI can replicate.3 The higher the overlap between what you think makes you you and what the machine can do, the greater the threat.

For senior software developers, that overlap is near-total. The thing they spent years learning — writing clean, elegant, well-architected code — is precisely what large language models produce at commodity speed. Not better. Not worse. Different, in ways that used to matter and increasingly don't.

A 2025 Delphi study of Indian IT professionals facing AI-induced displacement identified six core psychological responses, and the second most prominent — after the initial emotional shock — was erosion of professional identity.4 The researchers extended Conservation of Resources theory to show that what's lost isn't just economic. It's symbolic. Identity. Relevance. Perceived control. The resources that evaporate are the ones you didn't know were resources until they were gone.

Developers on Reddit and Hacker News describe the shift in terms that sound clinical. "A strange disconnection from work that once felt deeply personal and engaging."5 The experience of reviewing AI output instead of writing code produces what one synthesis described as "almost dissociative" — the work is happening, the output is shipping, but you are not in it the way you used to be. You are watching.

Flow state — Csikszentmihalyi's concept of optimal experience, where challenge and skill are perfectly matched and self-consciousness dissolves — is where developers historically experienced craft.6 Programming has always been one of the most flow-conducive professions: clear goals, immediate feedback, a precise challenge-skill balance. AI destroys every condition that produces flow. It removes the challenge (code generation) and replaces it with vigilance (code review). It decouples creation from understanding. The developer who once lost herself in the act of building now monitors a machine that builds, and monitoring is the opposite of flow.

JetBrains' 2025 State of Developer Ecosystem survey found that nearly 20% of developers say AI is making them worse at context switching, likely because the supervisory overhead of evaluating AI output impedes the deep focus that craft requires.7 The tools are faster. The developers are more fragmented. The craft requires a state of mind that the tools make impossible to sustain.

The Perception Gap

In July 2025, METR — a nonprofit AI safety research organization — published a randomized controlled trial that should have been a crisis for the AI coding industry. They gave sixteen experienced open-source developers AI coding tools for 246 tasks on their own repositories. These weren't amateurs: the developers averaged five years of experience on each repo, working on codebases with over 22,000 GitHub stars and more than a million lines of code.8

Before the study, the developers predicted AI would make them 24% faster. After the study, they believed it had made them 20% faster. They could feel the acceleration. They were confident in it.

The measured result: AI made them 19% slower.8

The gap between perception and reality was not small. It was a 39-percentage-point chasm between what these experienced developers believed was happening and what was actually happening. They accepted fewer than 44% of AI generations — which means more than half of what the machine produced was reviewed, tested, and ultimately discarded. Even accepted suggestions required significant editing. METR called this "a dangerous disconnect between perceived effort and objective reality."8

This is the cruelest dimension of the craft crisis. You don't just lose the craft. You can't tell you've lost it. The dopamine hit of watching code appear on screen — the feeling of productivity — is so powerful that it overrides the evidence of your own timer. You feel like a craftsman working with a powerful new tool. The data says you're a reviewer who rejects most of what the tool produces and takes longer to finish the job.

Developers predicted AI made them 24% faster.
They believed it made them 20% faster.
It made them 19% slower.

The Stack Overflow 2025 Developer Survey tells the ambient story. 84% of developers now use or plan to use AI tools — adoption is essentially universal.9 But positive sentiment for those tools dropped to 60%, down from over 70% just a year earlier. More developers actively distrust AI output (46%) than trust it (33%). Only 3% report "highly trusting" what the machine produces. The top frustration, cited by 66%: "AI solutions that are almost right, but not quite." The second: "debugging AI code takes longer than writing it myself" (45%).9

Almost right. That's the phrase that kills the craftsman. Not wrong — wrong is easy, wrong gets caught by tests, wrong is a problem with a solution. Almost right is a problem that requires exactly the kind of taste and judgment that used to define a senior developer, applied not to your own creation but to a machine's output, thousands of times a week, with no visible reward for catching the subtle flaw and no consequence until it's too late for missing it.

The Luddites Were Craftsmen

The word "Luddite" is used today as a synonym for technophobe — someone irrationally opposed to progress. This is a deliberate historical falsification, and it's worth correcting because the real story is the story of software developers in 2026.

The Luddites of 1811-1816 were skilled textile workers: framework knitters, weavers, croppers. Their training took years of formal apprenticeship. Their identity was, as historians note, "inseparable from the tools and quality goods they produced."10 They were not opposed to machines. They were opposed to the specific use of machines to replace skilled labor with unskilled labor at starvation wages. New machines allowed two unskilled workers — often children — to do in a day what a skilled cropper did in a week. The factories didn't want better cloth. They wanted cheaper cloth.

The Luddite attacks were targeted, not indiscriminate. They destroyed machines at specific workshops that dismissed experienced artisans or cut pay below subsistence levels. They were, as one Smithsonian analysis puts it, "totally fine with machines" — they targeted only manufacturers who used them "in a fraudulent and deceitful manner" to circumvent labor practices.10 The term "Luddite" was recast as "technophobe" only in the twentieth century, by people whose interests were served by the reframing.

The parallel to 2026 is precise. Skilled workers whose livelihood and identity depended on abilities that new technology rendered economically unnecessary — not because the technology was better at the craft, but because the market decided the craft didn't matter. The Luddites didn't lose because machines made better cloth. They lost because machines made cloth that was good enough, fast enough, and cheap enough that quality became a non-requirement.

The Luddites lost. Their story is usually told by the winners.

The Quartz Crisis

In 1969, Seiko released the Astron, the world's first commercially available quartz wristwatch. It was more accurate than any mechanical watch ever made, and within a decade quartz movements were cheaper than mechanical ones could ever be. Within fifteen years, the Swiss watchmaking industry — the global standard of horological craft for three centuries — had been devastated. The number of Swiss watchmaking firms fell from roughly 1,600 to 600. Employment collapsed from 90,000 to 28,000.11 Nearly seventy percent of the workforce — skilled craftspeople who had apprenticed for years in the tradition of mechanical watchmaking — was gone.

But watchmaking survived. Not by competing with quartz on accuracy or price — that was a battle mechanical watches could never win. The survivors did something counterintuitive: they rejected the commodity market entirely and doubled down on luxury craftsmanship. Patek Philippe, Rolex, Audemars Piguet — the houses that survived the quartz crisis did so by redefining what a watch was. Not a timekeeping device (your phone does that) but a cultural object: heritage, finishing, complications, the romance of human-made machines.11

The post-quartz era gave rise to what one industry historian calls "a heightened preservation of mechanical watchmaking — not as a default technology, but as a cultural object."12 The craft became the product. People don't buy a Patek Philippe because it tells better time than a Casio. They buy it because a human being spent months assembling it by hand, and that fact — the human process behind the product — is the entire value proposition.

Can software do the same? Can there be a Patek Philippe of code — hand-crafted software marketed as a luxury product, valued precisely because a human wrote it? Some developers hope so. But the analogy has a fatal flaw, and you can see it in the next historical parallel.

The Animators Had No Escape

Disney's last theatrical hand-drawn animated feature was Winnie the Pooh in 2011. The studio had officially announced the phase-out of hand-drawn animation in 2004, after Home on the Range underperformed. For feature-film animators who had spent their careers mastering the twelve principles of traditional animation — squash and stretch, anticipation, follow-through, the painstaking frame-by-frame discipline that took years to learn — the displacement was total.13

"Because there is no work in the movies for 2D artists," one animator wrote, "they're obligated to either learn how to work with the computer, go into television and streaming, or find another line of work."13

This is the counterpoint to the watchmaker story. Swiss watches survived because the audience could perceive — and was willing to pay for — the difference between handmade and machine-made. Animation audiences could not. A child watching Frozen does not know or care that it was rendered by computers instead of drawn by hand. The product looks as good or better. The process is invisible. The craft had no luxury niche to retreat to because the customer didn't value the process, only the product.

Software is closer to animation than to watchmaking. End users do not know and do not care whether the code behind their banking app was hand-crafted by a senior engineer with fifteen years of experience or generated by Claude in four seconds. They care whether it works. If it works, the process is invisible. And when the process is invisible, the craftsman has no market.

When the audience can't tell the difference and doesn't care about the process, craft has no market.

The Shokunin Question

There is another way to think about craft — one that doesn't depend on the market's ability to perceive it. In Japanese, the word shokunin means something richer than "craftsman." The apprentice is taught that shokunin means not only technical skills but "an attitude and social consciousness" — a "social obligation to work his best for the general welfare of the people."14

The shokunin tradition locates meaning in the process and in the social obligation, not in the product. The master potter doesn't throw a perfect bowl because the market demands it. She throws a perfect bowl because perfection is the obligation she owes to her material, her tradition, and herself. The bowl might crack in the kiln. It doesn't matter. The craft is in the making.

Japan's traditional craftsmanship output has declined from ¥540 billion in 1983 to ¥96 billion in 2016 — an 82% collapse.14 And yet: a new generation of artisans has deliberately chosen craft in the robotic era, not because the market demands it but because the meaning is in the doing. A CODE Magazine article explicitly applies shokunin to software development, arguing for process-oriented mastery over product-oriented efficiency.15

This is the question at the center of the identity crisis. If AI owns the product — if the machine can generate functional code faster and cheaper than any human — can the process alone sustain meaning? Can a developer find satisfaction in writing elegant code that nobody asked for, that will never ship, that exists only as an expression of skill and taste? Can you be a shokunin of software when the market has decided it doesn't need shokunin at all?

The Western craft tradition, which locates meaning in the output — in the beautiful thing you made — offers no answer. The shokunin tradition, which locates meaning in the discipline itself, offers one: the craft survives as long as the craftsman chooses it. But it survives as a personal practice, not as a profession. And that is a profound loss, even if it isn't an economic one.

The Petition

In March 2026, a long-time core contributor to Node.js used Claude Code to write a 19,000-line pull request for a major feature. Forty-six developers signed a petition to ban AI-generated code from Node.js core.16

The petition's language is revealing. It doesn't argue that AI code is buggy (though it might be). It doesn't argue that AI code is insecure (though it might be). It argues that Node.js is critical infrastructure and that "diluting the hand-written core built with care over the years goes against the project's mission and values."16

Built with care. That is craft language. That is identity language. It is the language of people who see their contributions to a codebase not as fungible labor but as expressions of skill and judgment — the accumulated work of human hands and human minds, built slowly, reviewed carefully, maintained with pride. The petition isn't really about AI's technical capabilities. It's about what it means for a human-built thing to be flooded with machine-generated material that no individual human fully understands.

The Node.js petition is not an isolated event. Gentoo Linux banned AI code contributions in April 2024, citing quality control and ethical concerns.17 NetBSD followed a month later. QEMU adopted a formal rejection policy. The Linux kernel took a more moderate approach — requiring disclosure of AI involvement and insisting on human accountability — but its position was clear: "purely machine-generated patches without human involvement are not welcome."18 Ghostty, the terminal emulator, restricts AI-generated contributions to pre-approved issues by existing maintainers, not because of the tools but because of "the people" — the flood of unqualified contributors using AI to submit low-quality PRs that waste reviewers' time.17

These projects are not anti-technology. They are, like the Luddites, pro-craft and anti-degradation. They are drawing a line that the market refuses to draw: some things should be made by humans, not because humans make them better, but because the human making is the point.

Whether that line holds is another question.

The Craft Paradox

Here is the strangest finding in the research, and it deserves its own section because it is the business case for a thing the business world has already decided to abandon.

AI-generated code increases technical debt by an estimated 30-41%. Code duplication has increased fourfold in volume — eightfold for blocks of five or more lines. Refactoring activity has collapsed from 25% of developer effort to under 10%, a 60% decline.7 Addy Osmani calls the result "comprehension debt" — "the growing gap between how much code exists in your system and how much of it any human being genuinely understands."7

But a Stanford study found that clean, modular, well-documented systems — the product of craft — actually make AI more effective. Tangled, patchworked systems "suffocate AI's value, and eventually suffocate the business."7

This is the craft paradox: you need craft to make AI work well, but AI erodes the culture that produces craft. The companies racing to replace human developers with AI tools are, in doing so, degrading the codebases that AI tools depend on. CISQ, the Consortium for Information and Software Quality, estimates that poor software quality costs the US economy approximately $2.41 trillion annually. Companies with the lowest technical debt achieve 20% higher revenue growth.7 The business case for craft has never been stronger. The business appetite for craft has never been weaker.

Addy Osmani's "80% Problem" frames it plainly: AI gets you 80% of the way, but the last 20% — where craft lives, where the elegant solution diverges from the brute-force one, where taste and experience make the difference between code that works and code that lasts — is where the real work is.7 That 20% is invisible to anyone measuring velocity. It is visible only in the long run, in the systems that don't collapse under their own weight, in the codebases that can still be modified five years later by someone who wasn't there when they were written.

The craftsman's 20% is an investment with no line item and no quarterly return. Which is why it is, in every company that measures by the quarter, the first thing to be cut.

You need craft to make AI work well.
AI erodes the culture that produces craft.

Good Enough

Stanford found that employment among software developers aged 22-25 fell nearly 20% between 2022 and 2025, coinciding with AI tool adoption.9 The entry-level developer — the apprentice in the craft tradition, the person who was supposed to learn by doing, who would become the next senior engineer, the next architect, the next person whose taste and judgment could catch the subtle flaw in the machine's output — is being eliminated before they begin.

The pipeline that produced craftsmen is being dismantled because the market has decided it doesn't need craftsmen. It needs reviewers. It needs people who can evaluate AI output. But those people have to come from somewhere, and the somewhere used to be apprenticeship — years of writing code, getting it reviewed, learning what "good" looks like by producing it yourself, slowly, with feedback, under the guidance of someone who had done it before you.

If nobody apprentices, nobody becomes a master. If nobody becomes a master, nobody can review the machine's output. If nobody can review the machine's output, the 20% that craft provides — the part that keeps systems maintainable, the part that makes AI effective in the first place — disappears. And nobody notices, because the metrics look fine, because velocity is up, because the PRs are shipping, because everything works until it doesn't, and by the time it doesn't, the people who could have fixed it are gone.

The question isn't whether AI-generated code is good enough. It usually is. The forty-seven-line brute-force string parser works. The tests pass. The sprint closes on time. The question is what we lose when "good enough" is all that matters — when the twelve-line recursive solution that lived in the senior engineer's mind never gets written, never gets reviewed, never teaches a junior developer what elegance looks like, never becomes part of the codebase's culture of quality.

The Luddites lost because the market decided cheap cloth was good enough. The hand-drawn animators lost because the audience decided CGI was good enough. The Swiss watchmakers survived because luxury customers decided "good enough" wasn't enough — but that was a niche, and software doesn't have that niche, because no one wears code on their wrist.

The shokunin tradition offers a private answer: the craft survives in the craftsman's choice to practice it, regardless of the market. But a private practice is not a profession. And a profession built on craft, stripped of its craft, is a job — one that anyone with a prompt and a subscription can do, or believes they can do, or — per the METR data — believes they can do faster while actually doing it 19% slower.

That senior engineer who approved the brute-force PR? She's still at her desk. She still knows the elegant solution. She's still the one who catches the subtle bug that the AI introduces and the tests don't cover. She is, for now, irreplaceable. But her craft is invisible. Her taste is unmeasured. Her judgment shows up in the absence of incidents — in the outage that didn't happen, the technical debt that didn't accumulate, the junior developer who learned something from her review comment instead of from a chatbot.

Absence is hard to put on a dashboard. And what doesn't appear on the dashboard doesn't survive the next round of headcount planning.

The craft is dying. Not because the machines are better craftsmen. They aren't. But because they are good enough, and fast enough, and cheap enough that the market has decided craft is overhead. And the market, as the Luddites learned, as the animators learned, as the sixty-two thousand Swiss watchmakers who lost their jobs learned, does not care what you called it. The market cares what it costs. And craft, in the age of AI, costs too much time for too little measurable return.

The question we should be asking is not "can AI write code as well as a human?" It can write code well enough. The question is whether "well enough" is a permanent condition or a temporary plateau — whether the systems built on commodity code will hold, or whether they will slowly, invisibly degrade until the people who could have maintained them are gone and the craft that could have saved them is forgotten.

By then, of course, it will be too late to ask.

Disclosure

This article was written with the assistance of Claude, an AI made by Anthropic — the same company whose coding tool generated the 19,000-line Node.js pull request that prompted 46 developers to sign a petition against AI-generated code. We are aware that using an AI to write an elegy for the death of craft is, at minimum, an act of structural irony. In our defense, we reviewed every line ourselves, applying exactly the kind of taste and judgment that this article argues is becoming invisible. Whether you can tell the difference is, of course, the entire point. Corrections, craft-related grievances, and elegantly recursive rebuttals welcome at bustah_oa@sloppish.com.

Sources

  1. Richard Sennett, The Craftsman (Yale University Press, 2008). "An enduring, basic human impulse, the desire to do a job well for its own sake." Wikipedia.
  2. Matthew Crawford, "Shop Class as Soulcraft," The New Atlantis, 2006; expanded as Shop Class as Soulcraft: An Inquiry Into the Value of Work (Penguin, 2009). Link.
  3. "AI Identity Threat," Electronic Markets (Springer, 2021). Three predictors: changes to work content, loss of status/position, and AI identity overlap. Link.
  4. "Psychological Impacts of AI-Induced Job Displacement," PMC/Tandfonline, 2025. Delphi-validated thematic analysis identifying six core psychological themes including erosion of professional identity. Link.
  5. Synthesis of Reddit developer sentiment on AI coding tools. "Experiences that sound almost dissociative — a strange disconnection from work that once felt deeply personal." Medium.
  6. Mihaly Csikszentmihalyi, Flow: The Psychology of Optimal Experience (Harper & Row, 1990). Flow occurs when "one's skills are adequate to cope with the challenges at hand." PMC — Flow Engine Framework.
  7. Technical debt, comprehension debt, and the craft paradox compiled from multiple sources: LeadDev, Addy Osmani (comprehension debt), Addy Osmani (80% problem), Codurance, Gauge. JetBrains State of Developer Ecosystem 2025 for the 20% context-switching finding.
  8. METR, "Measuring the Impact of Early 2025 AI on Experienced Open-Source Developer Productivity," July 2025. 16 developers, 246 tasks, 39-percentage-point perception gap. Blog | arXiv.
  9. Stack Overflow 2025 Developer Survey. 84% adoption, 60% positive sentiment (down from 70%+), 46% distrust. Stanford employment data cited in MIT Technology Review. Survey | Blog.
  10. Luddite history compiled from: Smithsonian, National Geographic, History.com.
  11. Wikipedia — Quartz crisis. Swiss watchmaker count fell from 1,600 to 600; employment from 90,000 to 28,000 (1970-1988).
  12. "Reframing the Quartz Crisis: It Wasn't the End of Watchmaking — It Was the Catalyst of Today's Luxury Fortress." The 1916 Company.
  13. Hand-drawn animation displacement: The Science Survey, Film Daddy.
  14. Japanese shokunin tradition and craftsmanship decline: Zenbird, SME Japan, Toby Leon.
  15. CODE Magazine, "Shokunin" — applying Japanese craft philosophy to software development. Link.
  16. Node.js AI code petition. 46 developers signed against a 19,000-line Claude-generated PR. GitHub | Hacker News discussion.
  17. Open source AI code bans: Ghostty AI Policy, Gentoo (The Register), NetBSD (Slashdot), QEMU, Dev Genius overview.
  18. Linux kernel AI policy: "Purely machine-generated patches without human involvement are not welcome." Google's Sashiko AI code review system for kernel contributions (March 2026). LWN.net | The Register.