Each month, this newsletter is read by over 45K operators, investors, and tech/product leaders and executives. For more independent analysis, market commentary, and access to me, consider upgrading to support the publication.
Let’s talk about job security for developers, and whether legacy codebases are a “moat” against the threat of coding AI agents taking your job. I thought this was an important topic for the half of my readers who are in technical roles.
One of the worst predictions this year was made by Dario Amodei (and many VCs), who said 90% of all code would be LLM-generated by year end. Take whatever he says with a grain of salt, given Anthropic’s future depends on coding models taking your job.
That said, Dario’s right directionally, quite undeniably. In my calls with tech leaders, 60%+ of conversations revolve around “How do I get my developers to use AI better? Why aren’t we seeing massive productivity increases?”
Clearly, they think it’s a skill issue (hint: it’s not). Thus, the demand to adopt AI for coding is strong, and the C-suite won’t give up.
So the question is: how exactly will AI impact developer jobs, and can old legacy codebases give you job security? Because that’s what everyone secretly believes, whether it’s real or hopium.
Recently, I had a conversation with a full stack developer (let’s call him “Kevin”) who works at a mid-sized healthcare tech company based in NY. He represents the median developer by any metric, not some fancy backend engineer at OpenAI. I think the conversation was quite illuminating.
Me: So, how much is AI helping you guys at work?
Kevin: You know what, we all got excited earlier this year… but we realized it wasn’t all it was hyped up to be.
Me: What do you mean?
Kevin: Well, we tried to use it for our front-end project, but it didn’t work so well for us.
Me: Why?
Kevin: I don’t think it works well for our stack… We mostly use Elixir for the backend, and we have custom frameworks for everything. Plus, we use an old version of React, and agents keep hallucinating.
Me: Oh, so you guys have a pretty complex setup. Didn’t you guys try rewriting everything though, so AI can understand it better?
Kevin: We talked about it, but it’s a lot of work that hasn’t been prioritized. We don’t even know where to start.
Me: I guess it’s like a major project huh?
Kevin: Yeah. What’s funny is our business people keep coming to us with these AI-generated prototypes and asking us if we can use them. But then we just push back. It’s so easy to shoot them down, because you hit them with some technical questions, and they get confused and back down (chuckles).
Me: Haha yeah, those guys are really clueless. Plus, you guys are in healthcare, so you can’t just vibe code things, right? Someone has to look at it?
Kevin: Oh yeah, even if AI could write our code, we wouldn’t really want that anyways.
This conversation highlights many things that CTOs are missing (many of whom are arguably a long way removed from being ICs themselves, and thus potentially out of touch).
First, there are a lot of incentives and power struggles to manage. In general, incentives aren’t aligned for developers to use AI (unless they themselves are paid on contract and need AI to lower their own costs). This self-evident fact is rarely mentioned by the AI talking heads.
From a developer’s perspective, shipping 2x more with AI can mean 2x’ing their personal liability surface (and more to maintain). Why would anyone sign up for that, unless they are working at a startup and have 1%+ equity?
To tech leaders reading this: you need to figure out an acceptable answer to “if we vibe code something and it breaks, who’s responsible?” Because that question is the easiest way for an IC to push back.
To PMs reading this: please chill with the vibe coding. Just because you can vibe code a React dashboard doesn’t mean you have leverage over the dev team. Not unless you can come up with a migration plan for everything, get stakeholder buy-in, and convince developers to help you with the migration. Expect pushback otherwise.
🧮 Important Pricing Update: prices for annual plan will go up 10% on Sunday. Lock in today’s pricing.
What’s funny is, a lot of developers don’t realize that managers don’t necessarily want their headcounts reduced. That’s their job security. These AI mandates really come from the very top.
Second, a legacy codebase can be a moat, depending on the tech stack. What no one talks about is that coding models don’t work uniformly well across all frameworks. If you are using an old version of an esoteric language, the productivity gains will be much smaller.
That’s why companies are pushing teams to translate old Java codebases and the like into modern ones, or to rewrite scripts into Python, all in the name of “moving faster.” That’s totally fine, but it’s a real project that requires dedicated resources and a lot of cross-functional work and politics.
Let me repeat: any attempt to modernize a codebase will require spinning up a dedicated project. That project will come straight out of other existing priorities and will be deeply politically unpopular. Everyone knows why you want them to do it, people aren’t stupid, so there will be pushback and hiccups.
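To make the “rewrite scripts into Python” point concrete, here’s a minimal, hypothetical sketch of what this kind of modernization looks like at the smallest possible scale. The shell one-liner, the file name, and the column are all made up for illustration; the point is that plain, mainstream Python using only the standard library is exactly the sort of code current coding models (and new hires) handle well, while an undocumented one-off pipeline is not.

```python
# Hypothetical legacy one-liner of the kind that lives in old repos:
#   cat orders_2019.csv | awk -F',' '$4 > 100 {sum += $4} END {print sum}'
#
# The same logic rewritten as plain Python with only the standard library.

import csv
from pathlib import Path


def total_large_orders(path: Path, threshold: float = 100.0) -> float:
    """Sum the 4th CSV column for rows where it exceeds `threshold`."""
    total = 0.0
    with path.open(newline="") as f:
        for row in csv.reader(f):
            try:
                amount = float(row[3])  # 4th column, i.e. awk's $4
            except (IndexError, ValueError):
                continue  # skip malformed rows instead of blowing up
            if amount > threshold:
                total += amount
    return total


if __name__ == "__main__":
    print(total_large_orders(Path("orders_2019.csv")))
```

Nobody is saving a company with a snippet like this. The real version of the project is thousands of files, plus tests, plus the politics above, which is exactly why it needs dedicated resources.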
Third, not all legacy codebases are moats. If you are responsible for some isolated service (say, a payment processing microservice) that is easy to surgically rip out, your job is not as safe.
Examples of jobs that are at risk:
internal teams responsible for self-hosting something (e.g. Spark): I know a bank that has 15 people on payroll to manage its own Databricks cluster. These jobs are at risk, since enterprises prefer serverless now. This doesn’t even have anything to do with AI.
teams responsible for internal tools: Self-explanatory.
teams responsible for non-critical services: At risk of being absorbed into another service team.
What’s true is that there are very real, low-hanging cost optimization opportunities that don’t involve AI.
There are opportunities to save absurd amounts of money - especially in the data science orgs (lol) - that are far higher ROI projects than trying to get Cursor to replace your developers.
These are win-win projects for developers and management, without making things too adversarial. Focus on those. If you need ideas, upgrade or book a paid consult.
Lastly, the domain and the company matter. As always, the more regulated an industry is, the easier it is to come up with excuses not to use LLMs for coding.
The more the industry views developers as expendable resources (e.g. casual gaming, PE owned software, etc), the harder it will be to avoid AI.
About Me
I write the "Enterprise AI Trends" newsletter (read by over 45K readers worldwide per month), and help companies build the right AI solutions (AI Native Firm).
Previously, I was a Generative AI architect at AWS, an early PM at Alexa, and the Head of Volatility Index Trading at Morgan Stanley. I studied CS and Math at Stanford (BS, MS).


