The AI Question Everyone’s Getting Wrong
“Are we using AI yet?”
It’s the question bouncing around every L&D team, every conference hallway, every budget meeting. Leaders are scrambling to answer it.
Wrong question.
The question that actually predicts success or failure: Are we ready for AI?
Most organizations aren’t. And the gap is costing them.
Where the Time Actually Goes
We recently worked with a global pharmaceutical company running an eight-week content development cycle. Three of those weeks (more than a third of the cycle) were spent just gathering content from subject matter experts before any actual development could begin.
SMEs spent countless hours translating their knowledge into instructional formats. Manually combing through documents. Going back and forth with learning managers. All before a single piece of training got built.
They bought AI tools expecting acceleration. They got inconsistent outputs and frustrated teams instead.
But the tools didn’t fail; the foundation just wasn’t there. No content standards. No clear source hierarchies. No agreement on what “good” looked like.
AI can’t fix that. It just produces inconsistency faster. Too often, AI adoption means tool adoption: a band-aid on broken processes.
From Authoring to Orchestration
The organizations seeing real results have stopped thinking about AI as an authoring tool. They think about it as orchestration.
The difference matters.
Authoring means prompting, getting content, then manually reviewing and processing it in disconnected workflows. Tool A for this. Tool B for that. Humans cleaning up after machines.
Orchestration means AI handles the heavy lifting — extracting information, drafting content, generating variations — while humans provide strategic oversight at the right moments. The workflow is designed. The checkpoints are intentional. The output is consistent.
That pharmaceutical company? Their SMEs weren’t hired to create training content. But because they hold the knowledge, they’ve been stuck filtering and formatting for development teams. When AI takes over that extraction work, SMEs go back to what they’re actually supposed to do.
The CEO has a specific mandate: reduce time to market for new products. That directive flows into everything, including how fast training gets built. When you shorten the time SMEs spend gathering content, training stops being a bottleneck. It becomes a competitive advantage.
What Readiness Actually Looks Like
Everyone wants to talk about AI capabilities. Nobody wants to talk about the unglamorous work that makes those capabilities useful.
Standardization before automation. Human validation at key checkpoints — not just at the end. Measuring beyond speed to track what actually matters. Connecting AI to your organizational knowledge instead of relying on generic public training data. Governance that bridges IT and L&D.
Organizations that build these foundations see development cycles cut in half. SMEs freed for higher-value work. Teams producing significantly more with the same resources.
The difference? They made sure they were ready before they tried to scale.
We put it all in a free guide: The AI Integration Framework for L&D Leaders.
It walks through what needs to be in place before AI integration actually works: the questions to ask and the stages to build.
Because the technology isn’t the hard part. Readiness is.