California went big on AI in universities. Canada should go smart instead
Learning requires struggle, persistence, iteration and deep focus.

The California State University system recently announced the largest-ever deployment of ChatGPT Edu, rolling out AI access to more than 460,000 students and 63,000 faculty and staff across the system. The move is being celebrated as a milestone in higher education’s embrace of artificial intelligence. But there’s a risk in rushing to integrate these tools without first considering what we know about how people actually learn.
The promise of generative AI is that it makes things easier. In some cases, much easier. With a few keystrokes, students can generate essays, summarize research papers or complete problem sets that once required hours of work. This frictionless efficiency is intoxicating — but it comes at a cost.
Learning is not frictionless. It requires struggle, persistence, iteration and deep focus. The risk of a too-hasty full-scale AI adoption in universities is that it offers students a way around that struggle, replacing the hard cognitive labour of learning with quick, polished outputs that do little to build real understanding.
For years as a university professor, I’ve watched students’ tolerance for struggle decline. When I started teaching large first-year science courses, I would ask students anonymously how long they typically spent on a difficult problem before giving up. Year after year, the amount of time dropped.
Now, in an age where solutions are just a prompt away, it’s tempting to not even attempt a problem before turning to AI. This is not a failure of individual students. It’s a reflection of an environment where immediate answers are always within reach. Why wrestle with something difficult when an algorithm can do it for you in seconds?
Writing offers another example. Constructing an argument, refining an idea, wrestling with structure and clarity: this is the real intellectual work of writing. The finished product is important, but the process is where the learning happens. Offloading that to AI doesn’t just remove effort; it removes the very thinking that makes the challenge of writing a valuable experience for a university student.
The biggest danger of AI in education is not that students will cheat. It’s that they will miss the opportunity to build the skills that higher education is meant to cultivate. The ability to persist through complexity, to work through uncertainty, to engage in deep analytical thought — these are the foundations of expertise. They cannot be skipped over.
This is why universities need a framework for AI integration that prioritizes learning over efficiency. A recent whitepaper on generative AI in higher education, which I co-authored with Danny Liu at the University of Sydney in collaboration with colleagues from universities across the Pacific Rim, outlines a structured approach to AI adoption. Our framework is built around five key principles: culture, rules, access, familiarity and trust.
In short, these principles provide a structured approach to AI adoption, ensuring it is integrated in ways that support student learning rather than undermine it. Culture shapes how AI aligns with a particular higher education institution’s values. Rules establish clear policies on when, how and if AI should be used. Access must be equitable, ensuring students and faculty receive not just the technology itself, but the guidance to use it effectively. Familiarity means structured opportunities to engage with AI critically rather than relying on it blindly. And trust is built through transparency, ethical safeguards and ongoing evaluation of AI’s impact on student learning.
Here’s what that could look like in practice. At UBC, we’re taking a measured but purposeful approach: giving faculty the tools and support they need to experiment; building programming that fosters confidence in using the tools effectively and ethically; and listening to our various communities (faculty, staff and students) about what they need and what their concerns are.
California State University’s AI rollout reflects a common assumption: that if AI is good, more AI is better. But more isn’t always better in higher education. Canadian universities have an opportunity to take a more thoughtful approach, one that embraces AI while preserving the struggle of learning. Because real learning comes from wrestling with ideas, not just arriving at answers, and the struggle is what makes it worthwhile.