The prompting trap
2.5 hrs
wasted per knowledge worker, per week, on tasks AI could handle
McKinsey Global Institute
$9,750
per employee per year in recoverable productivity
At $75/hr fully loaded
Everyone is teaching prompting. Workshops, webinar series, certification programs. It has become the default answer to "how do we upskill on AI."
But prompting is to AI what typing is to writing. You wouldn't send your team to a typing workshop and call it a communications program. Yet that's what most prompt courses deliver: a faster way to interact with a tool, without addressing whether the tool is being applied to the right problem.
McKinsey estimates the average knowledge worker spends 2.5 hours per week on tasks AI could handle. At $75/hour, that's $9,750 per employee per year in recoverable time. Prompt workshops aren't capturing it because they aren't changing what people do with their working hours.
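The arithmetic behind that figure is simple enough to check. A quick sketch, assuming the 2.5 hours/week and $75/hour fully loaded rate cited above:

```python
# Recoverable productivity per employee, per year
HOURS_PER_WEEK = 2.5   # time McKinsey estimates AI could handle
WEEKS_PER_YEAR = 52
HOURLY_RATE = 75       # fully loaded cost per hour

annual_value = HOURS_PER_WEEK * WEEKS_PER_YEAR * HOURLY_RATE
print(f"${annual_value:,.0f} per employee per year")  # → $9,750 per employee per year
```

Scale it by headcount and the stakes of a training program become obvious: a 200-person team is leaving roughly $1.95M on the table.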
What building looks like for non-technical people
None of the examples below require code. They're custom GPTs and automated workflows, tuned to a specific role and a recurring task.
When you tell someone "you're going to learn to build AI tools," they picture software engineering. When you show them it takes two hours and zero code, the resistance disappears. The afternoon project becomes the thing they show their manager the next morning.
- HR coordinator: Builds a custom GPT that drafts job descriptions in the company's voice and tone
- Marketing lead: Builds one that turns campaign data into exec summaries for the Monday standup
- Ops manager: Builds one that generates weekly status reports from project notes, replacing a 90-minute Friday task
- Sales team: Connects their CRM to an AI workflow that drafts follow-up emails from call notes
- Finance team: Automates monthly variance commentary that used to mean pulling data from three different systems
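To demystify what these builds amount to under the hood: a no-code platform is mostly assembling context into a prompt and handing it to a model. Here is a minimal sketch of the "call notes to follow-up email" pattern, with hypothetical names throughout (the function and its parameters are illustrative, not any platform's API):

```python
def build_followup_prompt(contact_name: str, call_notes: str,
                          tone: str = "friendly, concise") -> str:
    """Assemble the prompt a workflow tool would send to an AI model.

    Hypothetical helper -- no-code platforms hide exactly this step
    behind a drag-and-drop interface.
    """
    return (
        f"Draft a {tone} follow-up email to {contact_name}.\n"
        f"Base it only on these call notes:\n{call_notes}\n"
        "Keep it under 150 words and end with one clear next step."
    )

prompt = build_followup_prompt("Dana", "Discussed Q3 renewal; wants pricing by Friday.")
```

The point isn't that your sales team should write this function; it's that the "AI workflow" they configure in an afternoon is doing nothing more exotic than this.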
Your team can already prompt. Can they build something that saves them two hours every week?
The two moments that convert skeptics
The first is the time savings shock. A task that took two hours takes ten minutes. The report they write every Friday, done before the coffee gets cold. You can see it on their faces.
The second is ownership. "I built this." They didn't watch a demo. They sat down, made something with their own hands, and it worked. That's the moment skeptics convert.
These two moments explain a pattern. AI tools that IT rolls out with a company-wide email see single-digit daily usage. Teams that build their own custom assistants show 70%+ weekly engagement.
The difference is ownership. You can't get ownership from a platform license.
Why this changes the adoption math
When people build their own tools, they use them. When you hand them someone else's tool, they abandon it in weeks.
Instructor-led training sees 85-95% completion rates, compared to 5-15% for self-paced platforms. But completion isn't the goal. The goal is people doing something different on Monday.
The programs that produce behavior change are the ones where people leave with something they built. Something they'll open again on Tuesday morning.