I’ve been building an AI course for months, and I still can’t explain why someone should pay for it.
ChatGPT has read every programming tutorial ever written. It can debug code faster than I can type it. Yet last week, I discovered a YouTuber charging $200 per hour for basic developer mentorship. Meanwhile, I’m sitting here with my half-finished AI assistant course, wondering what the hell I’m supposed to be teaching that a language model can’t explain better.
That’s when I stumbled into research that made everything click. The data reveals something extraordinary: while AI gets smarter, human-guided learning is becoming _more_ valuable, not less. And the reason exposes the biggest blind spot in how we think about expertise.
The Knowledge Gap That Shouldn’t Exist
Here’s what doesn’t make sense: if AI has perfect access to all human knowledge, why are completion rates for AI-driven courses catastrophically low while human mentorship commands premium prices?
The numbers are stark:
- Self-paced MOOCs: 5–15% completion rates¹
- Traditional classroom learning: 70%+ completion rates²
- Human mentorship programs: 15–38% better promotion and retention rates³
Perhaps people are searching for a different kind of knowledge, something they can't get from AI. The foundational models' training datasets are enormous; Anthropic, for example, reportedly scanned millions of physical books to feed its models.
But there is still something that remains yours: your perspective.
Mapping the Invisible Knowledge
What I wish I’d known is that expertise isn’t information — it’s pattern recognition from lived experience. And 95% of that experience is completely invisible to AI.
The philosopher Michael Polanyi first captured this in 1966 with a simple insight: “we can know more than we can tell.” He used facial recognition as the perfect example: one knows a person’s face and can recognize it among a thousand, indeed a million, yet one usually cannot tell how one recognizes that face, so most of this knowledge cannot be put into words.
But it wasn’t until sociologist Harry Collins expanded on Polanyi’s work that we got a framework for understanding why human expertise remains valuable in the AI age. Collins identified three distinct types of tacit knowledge that AI fundamentally cannot access:
Relational Tacit Knowledge: Knowledge that is tacit because of the way people relate to each other. Sometimes people do not make their knowledge explicit because they do not know how to do so; other times, because they do not want to.
Somatic Tacit Knowledge: Knowledge that is tacit because it has to do with the human body. Think of executing a tennis backhand, playing a guitar, or riding a bicycle. This knowledge is tacit because it has to do with the embodied nature of the skill involved.
Collective Tacit Knowledge: Knowledge that is tacit because it is embedded in our social environment, and we do not know how to make it explicit in a way that doesn’t involve socialising.
This is perhaps the most relevant for AI’s limitations in education. Collins also talks about this in the context of bike riding; he points out that bicycling is somatic tacit knowledge, but when we zoom out, we realise that a good bit of it is collective tacit knowledge as well: Negotiating traffic is a problem that is different in kind to balancing a bike, because it includes understanding social conventions of traffic management and personal interaction.
What makes Collins’ framework so powerful is his observation about the paradox of learning: When it comes to human behaviour, the ease with which we pick up the three forms of tacit knowledge is reversed! It is easiest for us to pick up collective tacit knowledge — we begin socialisation from birth; we don’t even notice when we absorb social context from our surroundings.
This is exactly why human-guided learning works where AI fails. We unconsciously transmit the social and contextual knowledge that makes information actionable, while AI can only deliver the explicit content.
The Tacit Knowledge Economy
Here’s what I’m seeing happen: as AI handles more explicit knowledge work, the _implicit_ knowledge — the stuff that only comes from lived experience — becomes exponentially more valuable.
The data backs this up:
- People with mentors get promoted 5x more often⁴
- 89% of employees with mentors say colleagues value their work (vs. 75% without)⁵
- Companies with culturally diverse leadership teams are 33% more profitable⁶
Don’t get me wrong: this doesn’t mean AI can never learn these things, but that’s a conversation for another time.
What This Means Right Now
I’m still learning this, but here’s what’s working for me: instead of trying to compete with AI’s information processing, I’m building systems that amplify my pattern recognition and contextual judgment.
The opportunity is massive because most experts don’t realize their tacit knowledge has value. They assume that because AI can answer technical questions, their experience is worthless. But the data shows the opposite: human guidance is becoming a premium service as AI commoditizes information.
If you’re building expertise in any field, start documenting your invisible knowledge now:
- What patterns do you recognize that others miss?
- What context clues guide your decisions?
- What relationship dynamics do you navigate that aren’t in textbooks?
- What embodied intuition do you trust even when you can’t explain it?
This isn’t about being anti-AI. The future belongs to people who can scale their judgment through AI, not those who try to compete with it on information processing.
That’s It Folks
I’m sure there are better frameworks for thinking about this, and I’m definitely still figuring out how to monetize tacit knowledge effectively. But if you’re starting from where I was six months ago — wondering if AI is making your expertise irrelevant — this perspective will save you the existential crisis I went through.
The 95% invisible expertise you’ve developed? That’s not becoming worthless. It’s still the most valuable part of what you do.
Now I just need to figure out how to explain that to people who want to buy my AI course. But hey, that’s a tacit knowledge problem for another day.
Still mapping my own invisible expertise and probably getting it wrong half the time. But if this resonates, I’d love to hear what patterns you recognize in your field that AI still can’t replicate.
Data References
1. Sellcoursesonline.com (2023). “100+ Incredible Online Learning Statistics in 2024.” The average completion rate for MOOCs is 15%, while in a few cases, the course completion rates approach 40%.
2. Research.com (2025). “50 Online Education Statistics: 2025 Data on Higher Learning & Corporate Training.” Online course completion rates sit at 60.4%, a figure that lags behind traditional course completion rates of 70.6%.
3. Forbes (2022). “Mentoring Statistics 2024 — Everything You Need to Know.” The same study found that mentoring programs also dramatically improved promotion and retention rates for minorities and women — 15% to 38% as compared to non-mentored employees.
4. Mentorloop (2024). “Mentoring Statistics 2025: Tomorrow’s Blueprint.” Mentees were promoted five times more often than those not in the program.
5. MentorcliQ (2025). “40+ Definitive Mentorship Statistics and Research for 2025.” According to a CNBC/SurveyMonkey study, 89% of employees with mentors say their colleagues value their work, compared to just 75% of those without mentors.
6. McKinsey & Company (2020). “Diversity Wins: How Inclusion Matters.” Companies that have culturally diverse leadership teams are 33% more profitable.