
Canadian universities score big in new sustainable development rankings

But, as with all university rankings, the question remains: what does it really mean?


Each fall, the U.K.-based publication Times Higher Education (THE) announces its World University Rankings, evaluating more than 1,000 global universities. And each year a handful of Canadian universities reliably make it into the top 100, but none ever seems to crack the top 10 – last year’s highest-ranked Canadian school was the University of Toronto, in 21st place.

So, this April, when THE released its inaugural University Impact Rankings, which evaluate universities’ contributions to 11 of the United Nations’ 17 sustainable development goals (SDGs), Canada’s performance was a nice surprise. Three Canadian universities landed in the top 10: McMaster University at number two, the University of British Columbia in third and Université de Montréal at number seven. Overall, 10 Canadian universities placed in the list of more than 450, with most in the top 100, and all in the top half.

“I think Canadian universities have genuinely taken a lead in this area, and it shows in the quality of the data they put forward,” says Duncan Ross, THE’s data and analytics director. “We can see they’ve been invested in social impacts and responsibility for a long time.”

The new rankings are partly a response to longstanding criticism of THE’s flagship rankings, first published in 2004, which have come under fire for focusing too narrowly on research productivity and institutional reputation, rather than taking a more holistic view encompassing factors such as teaching quality or, indeed, social impact. The response from many critics to the new rankings has been more positive – but cautiously so.

Edmund Adam is a PhD candidate in higher education at the Ontario Institute for Studies in Education of the University of Toronto whose work focuses on how rankings influence institutional policy. He says that while he hopes the impact rankings show that THE is taking a broader view of institutional excellence, he remains wary. “It fits right into their business model, which is providing consultancy. So, they’re expanding their client base, basically,” he says.

Bahram Bekhradnia is president of the U.K.’s Higher Education Policy Institute. Like Mr. Adam, he believes that the impact rankings are a positive development, but he casts doubt on the methodology. “Like all rankings, they suffer from self-reporting,” he says. “You depend on all universities using the same definitions and data quality, and there’s no way to do that internationally.”

As an example, Mr. Bekhradnia points to SDG 10, “Reduce Inequalities.” Among other things, it measures an institution’s commitment to recruiting staff and students from under-represented groups. “In some countries, it might be mandated to collect data on staff with disabilities,” he says. “While in others it’s an ad-hoc survey. And there will be differences in what is even considered to be a disability.”

Mr. Bekhradnia also says that since universities are ranked only on their top three SDGs (plus the mandatory SDG 17, which evaluates the strength of global partnerships), the rankings don’t compare like-to-like. Universities are ranked on their broad commitment to sustainability in general, rather than directly comparable goals. For example, McMaster’s top three SDGs were SDG 3 (good health and well-being), 8 (decent work and economic growth) and 11 (sustainable cities and communities). Third-ranked UBC also saw SDG 11 among its top scores, but its ranking was otherwise determined by its high score on SDG 9 (industry, innovation and infrastructure) and 13 (climate action).

For THE’s Mr. Ross, however, this is a strength, ensuring that social impact is evaluated against unique local needs rather than something defined from the company’s London headquarters. “Universities prioritized SDGs they were most involved in,” he says, “and that leads us to this framework. In part that was to stop us being dominated by Western global universities.”

While Western universities are well-represented (all but one of the top 10 are European or North American), it’s true that the results don’t look much like THE’s flagship rankings. The Ivy League, which dominates that top 10, is nowhere to be seen. Neither is China’s comparable C9 League, and few institutions in the League of European Research Universities are represented. Most, in fact, didn’t participate – which leads to another criticism, that the performance from relative underdogs is only thanks to the omission of certain usual suspects.

The day the rankings were released, Alex Usher, president of Toronto-based Higher Education Strategy Associates, wrote on his blog that “these new rankings do showcase a new group of ‘top’ institutions. But to some extent, that’s a function of most of the usual ‘top’ universities simply refusing to play in this sandbox.”

Mr. Ross counters that many universities didn’t participate due to the short timeline for data collection, and “an inevitable desire to see how it shakes out.” He anticipates higher participation next year, when all 17 of the UN’s SDGs will be included.

Patrick Deane, president of second-place McMaster, believes that while the rankings may not be infallible, the data-collection process alone was invaluable. “It gave us the chance to do an internal stock-taking,” he says. “The revelation that comes through self-study has absolutely reinforced for us a sense of where our energies have been committed, where we’re succeeding and where we can improve. … All universities should be careful the use they make of rankings. But so long as you’re circumspect about them, I think they have a lot to tell us.”
