The 90% Problem
Projects get to 90% done and stay there. Not because people are slow — because nobody owns the outcome, only the tasks.
The CMO announces the team can now use AI for drafting.
Drafts start arriving faster. What took a writer two days now takes two hours. Production accelerates visibly. The team feels the difference immediately.
But the draft still sits in review for three weeks.
The bottleneck wasn't writing speed. It was decision clarity. And AI didn't touch that.
This matters because the assumption driving most AI adoption in marketing is subtly wrong.
AI accelerates production. It doesn't accelerate execution.
Those are different things.
Production is the creation of output: drafts, campaigns, content, assets. AI is genuinely good at this. Faster, cheaper, higher volume than before.
Execution is how work moves through an organization: decisions that close, handoffs that hold, priorities that don't collide, campaigns that actually finish. AI doesn't touch any of this.
The confusion is understandable. Production is visible. You can measure it. Drafts per week. Assets created. Campaigns launched. When AI makes these numbers go up, it looks like progress.
Execution is mostly invisible. It shows up in how long things take, how often work loops back, how much of what gets started actually finishes properly. Harder to measure. Harder to see. But far more consequential.
When teams adopt AI without fixing execution first, they get faster production feeding into the same broken system. The drafts arrive quickly. Then they wait.
Three patterns show up consistently when AI lands in a team with unclear execution.
Unclear positioning produces more content, faster. The team uses AI to scale content production. Ten blog posts a month becomes thirty. Social output doubles. Email sequences multiply.
But the positioning was already vague. The target audience was already broad. The differentiation was already weak.
Now there's three times as much content saying the same unclear thing to the same undefined audience. The volume increases. The impact doesn't.
Approval bottlenecks don't move. AI drafts faster. The draft still goes to the same four people for review. The same stakeholder still weighs in late. The same decision still doesn't close on time.
The writing step shortened from two days to two hours. The review step stayed at three weeks. Total time to publish: slightly shorter. Not meaningfully different.
Fragmented priorities produce more of everything. The team has seventeen active initiatives. AI helps them produce content for all seventeen faster.
But attention is still fragmented. Nothing gets the depth it needs. Campaigns still stall halfway. Work still sits at ninety percent for weeks.
AI didn't create the priority problem. But it made the team more productive at operating within it. More output. Same fragmentation. Same results.
Production speed is measurable and immediate. You can see it. A dashboard shows assets created, posts published, campaigns launched.
Execution clarity is slower to measure. It shows up in deal velocity, pipeline predictability, how often work actually closes. These move over months, not days.
When AI makes the fast thing faster, it's easy to interpret that as progress. The numbers go up. The team feels more productive. Leadership sees output increasing.
But if the output isn't connected to outcomes, faster production just means arriving at the wrong destination sooner.
This is why teams that adopt AI without fixing execution first often feel busier than before without seeing results improve. They're producing more. But production isn't the constraint.
Four things need to be in place for AI to actually accelerate execution rather than just production.
Positioning clarity. If you don't know precisely who you're talking to and why they should care, AI will help you say the unclear thing more often and in more formats. Clarity of message before scale of production.
Decision ownership. If reviews loop and approvals stall, AI drafts will queue behind the same bottleneck as everything else. The constraint isn't creation speed. It's how decisions close.
Priority discipline. If seventeen initiatives run simultaneously, AI helps you create content for all seventeen without any of them getting proper attention. Fewer things running in parallel means each one can actually finish.
Workflow clarity. If handoffs are unclear and work stalls between people, faster drafting just moves work to the next stall point sooner. The system needs to support finishing, not just starting.
None of these are AI problems. They're execution problems. And they need to be resolved before AI amplifies them.
A team fixes its priority problem first. Seventeen initiatives become three. Everyone knows what matters this quarter and why.
Then they clarify decision ownership. One person owns approvals for each content type. Reviews have deadlines. Decisions close.
Then they map their workflow. Handoffs are explicit. Everyone knows who owns each step and what triggers the next one.
Then they add AI for drafting.
Now AI works. Drafts arrive faster and move through a system designed to close them. Content ships.
Campaigns finish properly. The team isn't just producing more. They're completing more.
AI didn't fix the system. The system was fixed first. AI made it faster.
The sequence matters more than the tool.
Before scaling AI adoption, one question is worth asking honestly:
Can your team finish things consistently without AI?
Not perfectly. Not without friction. But consistently.
Does work generally complete? Do campaigns close properly? Do decisions get made?
If the answer is mostly yes, AI will likely help. The constraint is probably production speed, and AI addresses that directly.
If the answer is mostly no, or "it depends on who's involved," or "some things finish but a lot doesn't," then AI will accelerate the symptom without touching the cause.
Faster production into a broken execution system doesn't create better outcomes. It creates more visible evidence of the underlying problem.
The teams that get the most from AI aren't the early adopters. They're the teams that did the structural work first.
They know what they're trying to say and to whom. They have decision ownership. They operate from clear priorities. Their workflows support finishing, not just starting.
When AI arrives in that environment, it genuinely accelerates execution. Drafts move fast because the system around them moves fast. Production speed and execution speed align.
The teams still struggling with AI aren't struggling because of the tool. They're struggling because the system it landed in wasn't ready for more speed.
More speed into an unclear system doesn't create clarity. It creates more of what was already happening, faster.
A Diagnostic Sprint identifies where execution is fragile before AI makes it more visible. Often, the structural gaps were already there. AI just accelerated them into plain sight.
The output isn't an AI implementation plan. It's clarity on what needs to work in your system before production speed becomes an advantage rather than a liability.