The Existential Crisis of Modern Education
- James Purdy

Key Takeaways
- The fundamental question isn't how to use AI in education but whether traditional educational institutions are still necessary: Without radical transformation, schools risk becoming obsolete relics rather than engines of human development
- Critical human skills are being neglected: Education must urgently focus on six areas where humans still outperform AI: critical thinking, creative work, human connection, complex assessment, context-dependent judgment, and authentic research
- The mass exodus is already happening: Students are increasingly choosing alternative pathways over traditional secondary and post-secondary education, with 80% reporting that AI tools in universities don't meet expectations and only 5% aware of institutional guidelines
I spend considerable time in educational circles, reading, researching, and talking with colleagues about AI in education (online and off). The conversations are always earnest and well-intentioned. Educators discuss important issues like how AI is changing pedagogy, how AI tools can be used safely in classrooms, which specific tools to recommend, and how teachers can develop AI literacy. These are all important topics that deserve attention and resources.
Yet I increasingly feel that educators are missing the point entirely. While we debate the fine details of AI integration and safety protocols, we're avoiding the fundamental question that should drive every discussion: Do we actually need secondary and post-secondary education in the world we now inhabit? If the answer is yes, why? And what unique value does the education industry bring that can't be replicated, automated, or delivered more effectively through other means?
The data reveals a stark disconnect between educational institutions and the people they claim to serve. Students are already immersed in AI-powered learning experiences outside of school, yet educational institutions remain largely unprepared. Key statistics paint a troubling picture: 80% of students report that AI in universities isn't meeting their expectations, while only 5% are fully aware of their institution's AI guidelines. Meanwhile, two-thirds of teens have heard of ChatGPT, but schools actively discourage its use.
This isn't a temporary technological disruption that schools can wait out. There's a double-edged sword emerging: students who use AI thoughtfully learn at an accelerated rate and gain a massive boost to their potential, while those who use AI to replace their problem-solving abilities are losing critical thinking skills through cognitive offloading. Research confirms a significant negative correlation between over-reliance on AI tools and critical thinking abilities, with younger participants exhibiting higher dependence and lower critical thinking scores. Ironically, schools are mostly blind to both issues. The very generation that education claims to serve is developing cognitive patterns and learning preferences that traditional institutions aren't equipped to address.
The crisis runs deeper than policy gaps or technological catch-up. Students are fundamentally questioning whether the promise of education (improved life outcomes through institutional learning) remains viable when AI can provide personalized instruction, immediate feedback, and adaptive learning experiences without the bureaucratic constraints of traditional schooling.
What follows examines why this educational exodus is accelerating, what students are finding outside traditional institutions, and whether education as we know it can adapt fast enough to remain relevant.
The Student Migration to AI-Powered Learning
- According to a 2024 Pew Research poll, roughly half of Americans use AI at least several times a week.
- Virtually all Americans use products that contain AI (though two-thirds don't realize it).
- Research shows students using AI-supported learning systems can double their learning outcomes in less time.
- 80% of students say AI in universities is not fully meeting their expectations.
Students aren't waiting for schools to figure out AI; they're already immersed in AI-powered learning experiences that adapt to their individual needs, provide instant feedback, and deliver personalized instruction without institutional constraints.
The Cognitive Revolution Is Happening Outside Classrooms
Recent research reveals a fundamental shift in how students approach learning and problem-solving. A comprehensive study published in the journal Societies surveyed 666 participants across diverse age groups and found a significant negative correlation between frequent AI tool usage and traditional critical thinking abilities. Crucially, younger participants exhibited higher dependence on AI tools and lower scores on conventional critical thinking assessments compared to older participants.
But this isn't necessarily the crisis educators assume it is. As researcher Gerlich noted, "Many participants suspected that AI was hampering their ability to think critically," yet they continued using these tools because they were finding genuine value. One participant explained: "I find myself using AI tools for almost everything, whether it's finding a restaurant or making a quick decision at work."
This represents a generational shift in cognitive strategy, not cognitive decline. Students are developing what researchers call "collaborative intelligence": the ability to work effectively with AI systems to achieve better outcomes than either humans or machines could produce alone.
The Skills Gap That Schools Created
Traditional education has focused on developing skills that AI now handles more efficiently. Meanwhile, it has largely ignored the six areas where humans still significantly outperform AI: critical thinking in ambiguous contexts, creative work requiring genuine originality, human connection and empathy, complex assessment involving multiple variables, context-dependent judgment, and authentic research requiring synthesis across domains.
Research published in Scientific Reports demonstrates this paradox clearly. While AI chatbots generally outperformed human participants in standard creativity tasks like the Alternate Uses Task, "the best human ideas still matched or exceeded those of the chatbots." This suggests that exceptional human performance remains irreplaceable, but educational institutions aren't yet cultivating these peak capabilities.
As public intellectual Yuval Noah Harari has observed, humans could become "more and more cognitively idle with increasing AI automation, leading to programmed thinking and societal stagnation." Yet rather than addressing this challenge directly, most schools continue teaching skills that students can now outsource to AI.
Institutional Identity Crisis
When students can access personalized tutoring, instant feedback, and adaptive learning through AI tools, schools struggle to articulate what unique value they provide.
The response has been largely defensive: banning tools, creating policies, and trying to preserve traditional methods rather than asking the harder question of why students should choose institutional learning over AI-powered alternatives. This defensive posture signals to students that educators themselves aren't confident about what they offer that can't be replicated or improved upon by technology.
The fundamental problem isn't just that schools lack AI guidelines. It's that they lack a compelling answer to why students need them at all in an AI-integrated world. Educators may worry that over-reliance on AI erodes critical thinking skills, but institutions have failed to demonstrate that their current methods develop stronger critical thinking than thoughtful AI-assisted learning does. Students recognize this uncertainty and are making their own choices about where to invest their learning time and energy.
Professional Preparation Problem
Perhaps most damaging to institutional credibility is the gap between what schools teach and what workplaces expect. According to recent surveys, nearly all employers say they expect, or will soon expect, employees to possess AI competencies and literacies, and many are willing to pay a premium for those skills.
Students recognize this disconnect. They see internships requiring AI tool proficiency while their universities treat using ChatGPT like plagiarism. They hear about industries being transformed by AI-assisted workflows while their professors stubbornly assign handwritten essays as the pinnacle of authentic learning.
Microsoft and Carnegie Mellon University research found that workers who most trusted AI assistants actually thought less critically about those tools' conclusions, but they also completed tasks more efficiently and with greater accuracy in many contexts. Students are learning to navigate this trade-off outside of educational settings, often without the guidance that schools could provide if they embraced rather than resisted these tools.
Acceleration Factor
The rate of change is accelerating beyond educational institutions' capacity to adapt. IBM's Watson, for example, has been used to analyze medical datasets and help doctors diagnose complex illnesses by identifying patterns that might elude human scrutiny. Students pursuing healthcare careers need to understand these tools, yet most medical education programs barely acknowledge their existence.
In creative fields, works like "Daddy's Car," a song composed by Sony's AI program after analyzing Beatles songs, raise fundamental questions about the boundary between authentic human creativity and machine-assisted creation. Art students are exploring these boundaries on their own, often in direct contradiction to their instructors' guidance.
The research from NYU's School of Professional Studies captures this tension: "The uniqueness of human thinking, with its intuitive leaps and emotional nuances, risks being overshadowed by machine-generated algorithms." Yet students are discovering that the most innovative work often emerges from thoughtful human-AI collaboration, not from avoiding AI entirely.
Trust Deficit
Educational institutions face a credibility crisis that extends beyond AI policy. When schools ban tools that students know are valuable, when policies change every semester without clear rationale, and when institutional guidelines contradict workplace realities, students lose faith in educational judgment and use whatever tools they want anyway.
As one researcher notes: "Over-reliance on AI could lead to a loss of critical thinking skills and judgment among individuals. The inability to discern between AI-generated and human-generated insights can be detrimental in situations requiring independent decision-making." This is precisely the kind of nuanced guidance students need. Instead, they often encounter blanket prohibitions or vague warnings about "academic integrity."
The result is what researchers call "cognitive offloading" without guidance — students become dependent on AI tools without developing the critical evaluation skills necessary to use them effectively. Schools create the very problem they claim to prevent by refusing to teach responsible AI use.
Where Students Are Going Instead
Students are increasingly finding learning communities outside traditional institutions. Online platforms offer AI-integrated learning experiences that adapt to individual needs and learning styles. Professional communities provide mentorship and skill development aligned with industry realities. Entrepreneurial ventures allow students to apply AI tools in real-world contexts while developing practical expertise.
Research shows that students using AI-supported learning systems can double their learning outcomes in less time than traditional instructional methods. When faced with a choice between inefficient institutional constraints and effective external alternatives, more students are choosing efficiency.
This trend will accelerate. Educational institutions that continue to resist rather than guide AI integration risk losing their most motivated and capable students to alternative pathways that embrace technological reality while developing the distinctly human capabilities that remain irreplaceable.
AI Didn't Break Education, It Exposed What Was Already Broken
AI hasn't caused the crisis in education. It has only exposed it. The educational system as we know it was invented roughly 150 years ago to create a population ready for industrialization and colonization. Both of those industrial-era purposes are clearly misaligned with the needs of today’s learners.
For generations, schools maintained a monopoly on information access and credentialing. If you wanted to learn to read and write, you went to school. If you wanted to practice a profession, you needed institutional validation. But with AI and the internet, we now live in an era of information abundance with multiple pathways to professional competency that bypass traditional educational institutions entirely.
Fundamentally, little has changed in the form and function of educational institutions in the last 150 years. If schools and universities want to stay relevant in this new, AI-powered age, they'll have to find a new sense of purpose beyond "just trying to keep up" or "working with new technologies."
This isn't an argument against education. It's quite the opposite. I am deeply pro-education, despite what this analysis might suggest. The goal isn't to tear down educational institutions but to help them understand that their survival depends on rediscovering their essential purpose in a world where information scarcity and credentialing monopolies no longer exist.
The institutions that will thrive won't be those that resist technological change or cling to outdated models. They'll be the ones bold enough to ask fundamental questions about human development, community building, and the cultivation of wisdom rather than mere knowledge transfer. AI offers education a chance not just to weather this transformation, but to emerge as something far more valuable than what it replaced.
- Ryan James Purdy
References
Ally, M., & Mishra, S. (2024). Developing policy guidelines for artificial intelligence in post-secondary institutions. Commonwealth of Learning. https://www.col.org/resources/developing-policy-guidelines-for-artificial-intelligence-in-post-secondary-institutions/
Digital Education Council. (2024, March 18). EU AI Act: What it means for universities. https://www.digitaleducationcouncil.com/post/eu-ai-act-what-it-means-for-universities
EdWeek Research Center. (2024, February). Educator perspectives on AI in education. Education Week.
Gerlich, R. N. (2025, January 3). AI tools in society: Impacts on cognitive offloading and the future of critical thinking. Societies, 15(1), 6. https://doi.org/10.3390/soc15010006
Guo, L., & Lee, K. (2023). The effects of over-reliance on AI dialogue systems on students' cognitive abilities: A systematic review. Smart Learning Environments, 10, 40. https://doi.org/10.1186/s40561-024-00316-7
Intelligent.com. (2024, February 27). 4 in 10 college students are using ChatGPT on assignments.
Koivisto, M., & Grassini, S. (2023). Best humans still outperform artificial intelligence in a creative divergent thinking task. Scientific Reports, 13, 13865. https://doi.org/10.1038/s41598-023-40858-3
Lee, H.-P., Sarkar, A., Tankelevitch, L., Drosos, I., Rintel, S., Banks, R., & Wilson, N. (2025). The impact of generative AI on critical thinking: Self-reported reductions in cognitive effort and confidence effects from a survey of knowledge workers. In CHI Conference on Human Factors in Computing Systems (CHI '25), April 26–May 1, 2025, Yokohama, Japan. ACM. https://www.microsoft.com/en-us/research/wp-content/uploads/2025/01/lee_2025_ai_critical_thinking_survey.pdf
NYU School of Professional Studies. (n.d.). Thinking with AI - pros and cons: Language, logic, and loops. https://www.sps.nyu.edu/homepage/metaverse/metaverse-blog/Thinking-with-AI-Pros-and-Cons-Language-Logic-and-Loops.html
Pew Research Center. (2024). AI usage among Americans survey.
Princeton University Library. (2025). Disclosing the use of AI – research guides. https://libguides.princeton.edu/generativeAI/disclosure
Purdy, R. J. (2025). Stop-gap AI policy guide: Secondary & post-secondary edition: Seven meetings to build a practical AI policy. Purdy House Publishing.
Royce, C. A., & Bennett, V. (2025). To think or not to think: The impact of AI on critical-thinking skills. NSTA Blog. https://www.nsta.org/blog/think-or-not-think-impact-ai-critical-thinking-skills
University of California, Santa Barbara, Writing Program. (2024). AI policy: Responsible integration of AI writing technology. https://www.writing.ucsb.edu/resources/faculty/ai-policy
Wu, C., Zhang, H., & Carroll, J. M. (2024). AI governance in higher education: Case studies of guidance at Big Ten universities. arXiv preprint arXiv:2409.02017. https://arxiv.org/abs/2409.02017