Ontario university strategic mandate agreements: a train wreck waiting to happen
Where the plan goes off track is with the system-wide metrics it uses to assess research excellence and impact.
Ontario universities currently find themselves in the second phase of the province’s strategic mandate agreements. This second phase, or SMA2, runs from 2017 to 2020 and is part of a planned three-phase process that began with phase 1, from 2014 to 2017. The initiative is championed by the Ministry of Advanced Education and Skills Development and was initially signed onto by 21 Ontario universities with the stated aim of building on current strengths and helping drive system-wide objectives and government priorities.
The agreement is designed to collect quantitative information grouped under the following broad themes: a) student experience; b) innovation in teaching and learning excellence; c) access and equity; d) research excellence and impact; and e) innovation, economic development and community engagement. The collection of system-wide data is not a bad idea on its own. For example, metrics such as student retention between first and second year, the proportion of expenditures on student services, graduation rates, the number and proportion of Indigenous students, first-generation students and students with disabilities, and graduate employment rates can all be helpful.
Where the plan goes off track is with the system-wide metrics used to assess research excellence and impact: 1) Tri-council funding (total and share by council); 2) number of papers (total and per full-time faculty); and 3) number of citations (total and per paper). A tabulation of our worth as scholars is simply not possible through narrowly conceived, quantified metrics that merely total up research grants, peer-reviewed publications and citations. Such an approach perversely disincentivizes time-consuming research, community-based research, Indigenous research, innovative lines of inquiry and alternative forms of scholarship. It effectively displaces research that “matters” with research that “counts” and puts a premium on simply doing what counts, as fast as possible.
How so? Because as universities are inevitably benchmarked against one another on league tables, misguided conclusions about quality will be drawn, creating intense pressure to excel on these narrowly conceived performance indicators. A recent report by the Conference Board of Canada, entitled Beyond Citations: Knowledge Mobilization, Research Impact, and the Changing Nature of Academic Work, concluded that “numbers alone, via citation metrics, evaluation framework scores, or even funds raised, paint only half a picture of research impact.”
Danger ahead
Even more alarming – and what is hardly being discussed – is how these damaging and limited terms of reference will be amplified when the agreement enters its third phase, SMA3, from 2020 to 2023. In this third phase, the actual funding allotments to universities will be tied to their performance on the agreement’s extremely deficient metrics. Here we can confidently predict that the assessment scheme will completely lose its way, as we have already witnessed in countries with similar national performance-based funding schemes: the Research Excellence Framework in the U.K., Excellence in Research for Australia, and the Performance-Based Research Fund in New Zealand.
The one-size-fits-all “factory model” of knowledge creation, dissemination and measurement does a disservice to us all. First, not all researchers require funding to carry out their important and potentially ground-breaking work. Low-cost research methods should not be devalued or indirectly categorized as lesser. Second, to meet the demands of an audit culture set on counting quantity over quality, researchers will almost certainly experience greater pressure to “game” results through p-hacking (data dredging) or by “salami-slicing” research into least publishable units.
Finally, it is beyond dispute that mere citation counts are not an accurate measure of research quality. As an illustration, consider the recent example of the article “The case for colonialism,” published in Third World Quarterly and since withdrawn. Its citation count would have gone through the roof, but largely because scholars were citing it to critique it and dispute its claims and dubious quality.
As Yves Gingras, who holds the Canada Research Chair in the History and Sociology of Science at Université du Québec à Montréal, recently noted in his book, Bibliometrics and Research Evaluation: Uses and Abuses, both the Council of Canadian Academies and the Higher Education Funding Council for England have concluded that “mapping research funding allocation directly to quantitative indicators is far too simplistic, and is not a realistic strategy.”
The Government of Ontario needs to rethink its SMA3 strategy now to avert the damage this misguided policy will cause. Now is also the time for Ontario university administrators to speak up and denounce the agreement before any further harm results.
Marc Spooner is a professor in the faculty of education at the University of Regina.