Most executives feel behind on AI — not because they aren't trying, but because nothing they've tried has actually changed how the work gets done. The gap between what's being sold and what's actually working is the defining challenge of AI adoption right now. And it has almost nothing to do with the technology.

The SaaS model is built for scale. Vendors build for the largest possible audience, which means they build for the median use case. A feature that works for 80% of customers gets shipped. For decades, that was an acceptable tradeoff. AI has changed its economics entirely.

Why most AI tools don't actually get used

Custom tools that once required large engineering teams and 18-month timelines can now be built in weeks, fitted precisely to a specific workflow, and improved continuously based on how the team actually uses them. The case for accepting a generic solution — for bending your operation to fit a vendor's product — is weaker than it has ever been. But most organizations don't know this yet. They are still buying platforms. Still running pilots on tools designed for someone else's process. Still wondering why adoption is low.

When an AI tool doesn't get used, the instinct is to blame the rollout. Better training. A stronger change management plan. More executive sponsorship. Sometimes that's right. But more often, the tool itself is the problem. Generic AI tools ask your team to change how they work. They sit adjacent to the workflow instead of inside it. And so people default back to the old way of working — a rational response to a tool that wasn't built for them.

The tools that get used are the ones that show up where the work already happens — inside the systems your team lives in, at the moment a decision needs to be made. They don't ask for adaptation. They adapt.
Key takeaways
  • The case for generic SaaS AI has never been weaker. Custom tools that once required 18-month timelines can now be built in weeks, fitted precisely to your workflow.
  • When AI tools fail to get adopted, the tool is usually the problem, not the team. Generic tools sit adjacent to the workflow; custom tools sit inside it.
  • Start with one recurring decision that has a measurable impact. One decision done well creates more momentum than ten workflows done adequately.
  • Measure behavior change, not implementation. The right question at six months is not "is the tool live?" but "has anything changed about how this decision gets made?"

The playbook: what actually works

The executives making the most progress right now are not launching AI strategies. They are solving specific problems. Pick one recurring decision in your business — something that happens weekly or monthly, something where the quality of the call has a measurable impact. Understand what information feeds it. Understand where the time goes. Then build something that improves exactly that, and nothing else. One decision done well creates more momentum than ten workflows done adequately.

Fit the tool to the workflow, not the other way around. The best AI tools are nearly invisible. They don't add steps — they remove them. They don't require your team to learn a new system. They surface the right information inside the systems your team already uses. This is the standard worth holding vendors to: not "does this tool have the feature we need?" but "does this tool fit how we actually work?"

Implementation is not the milestone. Behavior change is. The right question six months after a deployment is not "is the tool live?" It is "has anything changed about how this decision gets made?"

The quiet thesis

The organizations seeing real results from AI share a few traits. They are skeptical of platforms and specific about problems. They evaluate tools based on workflow fit, not feature lists. They treat adoption as the outcome, not the milestone. Custom AI is no longer expensive or slow to build. The organizations that recognize this — and act on it — will have tools their competitors can't replicate from a SaaS catalog. The opportunity is real. So is the cost of waiting.