CBC Report Exposes Canada's AI Education Crisis: Teachers "Struggling on Their Own" While Children Face Real Legal Risks

  • Writer: James Purdy
  • Nov 28, 2025
  • 5 min read

Key Takeaways

 A new CBC investigation reveals that Canadian teachers are "struggling on their own" in the absence of policies and frameworks, with the provincial responses that do exist described as "a mishmash" that's "all over the place" rather than meaningful guidance.

 Only four of Canada's 13 provinces and territories have developed any AI guidelines, creating dangerous inconsistencies: Alberta's school boards association leaves implementation to individual districts, Ontario boards apply conflicting rules, and federal legislation has been stalled since 2022 with no education-specific provisions for minors.

 Real compliance violations are happening in classrooms daily as students access commercial AI platforms without proper safeguards, while teachers make decisions without training or clear legal frameworks to protect themselves or their students.

 The solution requires a cross-jurisdictional approach that works regardless of future legislation, rather than waiting for governments to coordinate policies that may never come or may already be outdated by the time they arrive.

When Personal Experience Meets Policy Reality

Three weeks ago, my young daughter came home from school and casually mentioned that she had used ChatGPT in class on a Chromebook. Intrigued, I asked what she had done. She explained that she had gone to ChatGPT through Google, asked it some questions using the voice feature, and then the lesson ended and the teacher took the computers away.

While this sounds innocuous, I can identify at least four potential legal violations in this simple classroom interaction: a minor using a commercial AI platform that requires users to be 13+, voice data being collected and stored by OpenAI, no parental consent for AI usage during school hours, and probable violations of school technology policies that haven't been updated to address AI tools.

This isn't just a policy problem - it's a crisis that's harming both teachers and children right now. And according to a damning new CBC investigation, this confusion isn't isolated to one classroom or one province. It's a nationwide failure of governance that's leaving educators to navigate legal minefields without maps.

CBC's Investigation Reveals the Scope of Canada's Policy Failure

The CBC's recent reporting on AI in Canadian classrooms lays bare how deep the problem runs. Heidi Yetman, former President of the Canadian Teachers' Federation, told CBC that teachers are "struggling on their own because there are no policies and frameworks put in place." The provincial responses that do exist? She describes them as "a mishmash" that's "all over the place," and notes they "aren't specific enough and don't offer true, meaningful education for teachers on potential and pitfalls of AI in the classroom."

The CBC investigation confirms what many of us suspected but couldn't quantify: Canada's approach to AI in education is a disaster. Their research shows that of Canada's 13 provinces and territories, only four have developed any AI frameworks or guidelines at all. British Columbia, New Brunswick, Quebec, and Ontario have issued some guidance, while the other nine provinces and territories have essentially left teachers to figure it out themselves.

But even where guidance exists, the inconsistencies are staggering. In Alberta, the Alberta School Boards Association released guidance focused on "ethical and responsible use" but left implementation largely to individual districts. The result? Some districts, like Northern Gateway Public Schools, have developed detailed policies emphasizing teacher autonomy, while others have nothing at all. The Alberta Teachers' Association has been conducting multi-year research projects with the University of Alberta, yet teachers report still feeling unprepared for practical classroom decisions.

Ontario presents an even more chaotic picture. The CBC noted that school boards across the province are implementing wildly different approaches. The Ottawa Catholic School Board has established detailed policies including specific rules about AI input limits, while the Toronto District School Board - Canada's largest - has barely begun addressing the issue. Some boards require parental consent before AI tools are used, others mandate teacher approval, and many have no clear guidelines on AI's role in academic assessment at all.

At the federal level, the situation borders on absurd. Canada's proposed Artificial Intelligence and Data Act (AIDA), part of Bill C-27, languished in parliamentary committee after its introduction in 2022, died on the Order Paper when Parliament was prorogued in early 2025, and must now be reintroduced from scratch. Even if it eventually passes, AIDA focuses on commercial "high-impact" AI systems and includes virtually no provisions for education, particularly concerning minors. As the Canadian Teachers' Federation notes, federal legislation "does not include explicit provisions specific to the unique risks of AI in education, especially concerning minors."

The real-world consequences are already devastating. The CBC report doesn't mention it, but the December 2024 PowerSchool data breach exposed sensitive information on 1.5 million students in Toronto alone and affected over 80 school boards across seven provinces. It happened in a system where 76% of Canadian students use educational technology platforms, yet few boards have governance frameworks robust enough to ensure vendor accountability. Teachers are being asked to evaluate AI tools and protect student data without the training, policies, or legal frameworks needed to do either effectively.

Why This Crisis Demands Immediate Action

The CBC's findings should be a wake-up call for anyone who thinks Canada can wait for perfect federal coordination or comprehensive provincial policies. Teachers are making AI decisions right now. Students are using these tools daily. The regulatory framework the CBC investigated isn't just inadequate - it's actively harmful to both educators and children.

What we need isn't another government committee or provincial working group. What we need is a practical compliance framework that works regardless of which level of government eventually sorts out its jurisdiction. That means establishing best practices based on emerging global standards that will align with future legislation, rather than trying to predict what that legislation will look like.

I've spent the past year developing what I believe is the first cross-jurisdictional AI compliance framework specifically designed for K-12 education. This includes practical vendor evaluation tools, the first quantified human-AI oversight rule, and implementation systems that can function immediately without waiting for perfect policies. The framework addresses exactly the gaps the CBC identified - giving teachers clear guidance while protecting students and schools from legal and privacy risks.

"Stop-Gap AI Compliance: Primary and Secondary Edition" provides these tools in a format that educators can implement today, while broader policy frameworks catch up to classroom reality. 

I'm actively seeking partners for micro-pilot programs across Canada - collaborations as short as one day with four simple actions - to help demonstrate that practical solutions can close these legal liability gaps immediately. Contact me directly for more information. 

The CBC investigation shows us the cost of inaction. Teachers struggling alone. Students at risk. Policies that are "all over the place." The question isn't whether we should wait for better government coordination. The question is whether we're going to provide practical solutions now, while children like my daughter continue using AI tools in classrooms where teachers lack the frameworks they desperately need.

Ryan James Purdy is an AI governance specialist and author focused on helping educational institutions navigate AI adoption safely and effectively. His work provides the first cross-jurisdictional compliance framework specifically designed for K-12 education. He can be reached directly for consultation on AI governance challenges and micro-pilot partnerships.

References

CBC News. (2025, October). "Canadian teachers want lessons on how to ethically incorporate AI into classrooms." https://www.cbc.ca/news/education-teachers-ai-training-1.7597806

Canadian Teachers' Federation. (2025, August). "Artificial Intelligence in Public Education." https://www.ctf-fce.ca/take-action/ai-in-public-education/

Center for Democracy and Technology. (2025, October). "Schools' Embrace of AI Connected to Increased Risks." Education Week. https://www.edweek.org/technology/rising-use-of-ai-in-schools-comes-with-big-downsides-for-students/2025/10

Government of Canada. (2024). "The Artificial Intelligence and Data Act (AIDA) Companion Document." https://ised-isde.canada.ca/site/innovation-better-canada/en/artificial-intelligence-and-data-act-aida-companion-document




 
 
 
