The one who succeeds with AI isn't who you think
Everyone assumes the best programmer gets the best AI output. I used to think so too. But after a year of watching people use these tools in production, I've seen the opposite.
The speed is proven — but faster at what?
A 4-5x boost in coding speed is well established. Maybe even 10x. That's not controversial anymore.
But a programmer with 10x speed still delivers within the same working hours, the same scope, the same understanding. The coding got faster. The outcome didn't change. Speed without direction is faster waste.
Three things that must be present
The people who get extraordinary results have three things simultaneously:
Technical understanding — they don't need to code, but they need to understand what's happening.
History — they know the domain, the systems, the context. They've seen what's worked and what's broken.
Business outcome understanding — they care about the result. Not the code, not the tool — but what actually happens for the customer, the business, the end user.
The third factor is the multiplier. Without it, the other two produce faster waste. With it, the same tool, the same LLM — becomes something entirely different.
The developer who understands the business
The best AI practitioner I've seen is the developer who has all three: they code, they know the system, and they care about the business outcome. Take the operations person who built a diagnostic tool in ten minutes that replaced a full day of work. Not because the tool was magical, but because they knew which question to ask. They'd lived with the problem for years and cared about the outcome.
That's the ultimate combination. Technical skill, history, and business outcome in one person.
But those people are rare. Which leads to a more important question.
But what about the other smart people?
There are people in leadership positions with minimal programming background who get better results from AI in one afternoon than a developer got in a week. Not because they were better prompters — but because they knew exactly what they wanted to achieve.
Project managers who understand the delivery. Business developers who know where the friction sits. Consultants who have done twenty migrations and know which questions always come up. They have the business outcome. They have the history. The technical understanding doesn't need to be deep — it just needs to be sufficient.
And their raw material isn't code. It's text. Transcriptions. Conversations. Meetings. Customer dialogues. That's their code. When you start treating that text with the same rigor that developers treat code — with context windows, with domain knowledge, with structure — something entirely different happens.
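What "treating text with the rigor of code" could look like in practice is easy to sketch: impose structure on the raw material before it ever reaches a model, rather than pasting a transcript into a chat box. A minimal illustration, where the function name and the example strings are all invented:

```python
# Sketch: give transcript text the same structure developers give code.
# Context first, raw material second, then one precise question.
# build_briefing and all example strings are hypothetical.

def build_briefing(transcript: str, domain_notes: str, question: str) -> str:
    """Assemble a structured prompt from a transcript, the expert's
    domain knowledge, and the one question they actually care about."""
    return "\n\n".join([
        "## Domain context (what the expert already knows)",
        domain_notes.strip(),
        "## Raw material (meeting transcript, unedited)",
        transcript.strip(),
        "## Task",
        question.strip(),
    ])

briefing = build_briefing(
    transcript="Customer says onboarding stalls at the credit check step...",
    domain_notes="Credit checks are outsourced; the SLA is 24h but often slips.",
    question="Where is the friction for the customer, and what would remove it?",
)
```

The point isn't the code; it's that the domain notes and the question are exactly what only the experienced person can supply.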
The entry point is verbal
I've written about this before — about messy and crisp, about how voice input carries more information than any written prompt. When you talk freely about a problem you've lived with for twenty years, the associations come, the nuances — everything that gives AI something real to work with. Messy input isn't sloppiness. It's raw material that only someone with experience can produce.
The tools already exist. Teams records meetings. Your phone records calls. Yet most organizations have recording turned off by default, not for technical reasons but out of habit. Customer service records every routine call, while the strategy meeting that gathers the organization's most expensive hours vanishes without a trace.
And yes — Teams has its own meeting summary. But it's bland. A generic recap that doesn't understand what mattered and what was a tangent. Without you spending time reflecting on what actually happened — what you heard, what you interpreted, what deviated — the summary becomes a flat list that nobody acts on. The tool does the work. But only if you do yours first.
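One way to "do yours first" is to capture your own reflections as structured input before asking for any summary, so the model has a lens instead of a flat transcript. A hypothetical sketch; the field names and prompt wording are invented for illustration:

```python
# Hypothetical template: anchor the summary in your reflections
# (what you heard, how you interpreted it, what deviated)
# rather than asking for a generic recap of the transcript.

REFLECTION_TEMPLATE = """\
What I heard: {heard}
How I interpreted it: {interpreted}
What deviated from expectations: {deviated}

Using the reflections above as the lens, summarize the transcript below.
Flag anything in the transcript that contradicts the reflections.

Transcript:
{transcript}"""

prompt = REFLECTION_TEMPLATE.format(
    heard="Sales wants the migration done before Q3.",
    interpreted="The deadline is political, not technical.",
    deviated="Nobody mentioned the data-quality audit we agreed on last time.",
    transcript="(paste meeting transcript here)",
)
```

The template is trivial; the reflections are not. They are the work only the person in the room can do.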
The investment nobody has made
It's natural to start with the development department. That's where the tools land first, where speed is easiest to measure. But 12-18 months into the hype, a different pattern emerges.
The business power users have Copilot, sure. They have Teams summaries. But that's not what I'm talking about. Nobody has invested in showing them what happens when their twenty years of experience meets these tools for real — not as a generic summary, but as systematic treatment of text with the same rigor developers treat code. The developers got the investment. The business got a Teams license and a hope.
But the day they start — the day someone actually puts the tool in the hands of the person who understands the business, who has lived with the customers, who knows where the friction sits — we're not talking about 4-5x faster. We're talking about world class. Ten times more value. Not because the tool got better, but because the right person finally got to use it.
The one who succeeds with AI isn't who you think. That's worth thinking about.
In series 16, I wrote that it doesn't start with the prompt — it starts with you. This is the follow-up: which "you" is it?
See also: You can't validate what you don't understand (series 6) and One person, a thousand deliveries (series 7).
Mindtastic on why domain knowledge multiplies AI output: Domain knowledge multiplies AI output.