Teaching in the age of AI shortcuts

Students will use AI. Here’s what it takes to ensure it strengthens their thinking instead of replacing it.

March 02, 2026
Graphic by: Moor Studio

In October 2025, a photo from a University of Illinois data science class went viral as a symbol of students’ overreliance on AI. Two professors sent a warning to more than 100 students after discovering that far more people were checking in for attendance through an online tool than were actually present in the lecture hall. Apology emails arrived, and the viral picture revealed a glaring symptom of AI use: almost every email began with the exact same phrase, “I sincerely apologize.”

In a recorded video of the lecture, students gasp, laugh and yell: “ChatGPT!” The moment ricocheted across social media — millions of views, thousands of comments and a familiar refrain from academics everywhere: This is what teaching looks like now.  

I understand the concern. When every apology, essay or problem-set explanation starts to sound indistinguishable and insincere, the fear is no longer just about plagiarism; it is about the erosion of student thinking.

Still, AI isn’t going away — in classrooms or in the broader world our students are preparing to enter. Recent research, including a 2024 systematic review of AI dialogue systems in education led by Chunpeng Zhai and published by Springer Nature, suggests that overreliance on AI-generated content can weaken critical thinking, analytical reasoning and decision-making. But that same body of work also shows that AI can support learning when it’s structured to promote reflection rather than replacement.

For the past two years, my colleagues and I at the University of Toronto — and now through AXL, a Canadian venture studio focused on co-creating human-centric AI — have been working on LearnAid, an AI system built specifically for teaching and learning. LearnAid is now used at the University of Toronto (U of T) by more than 3,500 students in the computer science department, with plans to expand the platform to other areas of study.

It is designed around a simple principle: help students learn with AI, rather than use AI to replace their thinking. Based on that experience, here are three lessons about using AI effectively in classrooms.

AI works best when it slows students down  

One of the reasons those “I sincerely apologize” emails struck a nerve is that they represented a familiar pattern: students increasingly use AI not to understand material but to avoid engaging with it in the name of saving time. Those time savings come at the expense of learning.

LearnAid was built to enforce the opposite. Instead of providing final answers, it asks students to reflect on the question and walks them through the problem step by step, engaging their problem-solving skills. The model is simple but powerful: students must attempt the problem first, and articulate what they do and don’t understand, before receiving help.

In the classroom deployments we ran at U of T, students interacted with LearnAid more than 10,000 times, with clear evidence of deeper engagement patterns compared to open-ended chatbots.  

AI must align with the course — not the Internet  

Another challenge with generic AI tools is that they collapse context. A large language model doesn’t know your syllabus, your assignment policies or the concepts you emphasized last week.  

In contrast to general-purpose platforms like ChatGPT that source materials from the Internet, LearnAid can be customized to specific course materials such as lecture slides, assignment guidelines and worked examples. Instructors can even specify rules about how direct the AI’s hints are allowed to be.  

The result is that students stop getting confused by conflicting explanations and instead receive guidance that reinforces core course concepts.  

AI can improve equity — if you design for it  

One pattern that emerged from our deployments surprised us: female students, representing half of the course population, used LearnAid nearly twice as often as male students. Research has found that women participate in whole-class discussions less often than men, so some women may feel more comfortable using a tool that doesn’t require interaction in a large classroom.   

Classrooms are not equally accessible environments. Office hours can be intimidating. Help requests might spike after students have left the classroom.  

AI — when built to support thinking rather than bypass it — can reduce those barriers. It can make high-quality academic support available 24/7, without reinforcing patterns of over-reliance. This is one of the most promising benefits of AI in education. 

AI is here. The question is how we use it 

The viral apology emails were a symptom of a deeper shift: students now have access to tools that can simulate thinking without requiring it. If we respond only with prohibition, we will keep repeating the same cycle of new tools, new restrictions and new workarounds. 

But if we integrate AI thoughtfully — anchored in pedagogy, aligned with course content and designed to promote cognitive effort — we can help students build the skills that will matter most in an AI-integrated world: critical thinking, problem-solving and the ability to verify and challenge AI itself.  
