Ontario university strategic mandate agreements: a train wreck waiting to happen
Where the plan goes off track is the system-wide metrics it uses to assess research excellence and impact.
Ontario universities currently find themselves in the second phase of the province’s strategic mandate agreements. This second phase, or SMA2, runs from 2017 to 2020 and is part of a planned three-phase process that began with phase 1, from 2014 to 2017. The initiative is championed by the Ministry of Advanced Education and Skills Development and was initially signed onto by 21 Ontario universities with the stated aim of building on current strengths and helping to drive system-wide objectives and government priorities.
The agreement is designed to collect quantitative information grouped under the following broad themes: a) student experience; b) innovation in teaching and learning excellence; c) access and equity; d) research excellence and impact; and e) innovation, economic development and community engagement. The collection of system-wide data is not a bad idea on its own. For example, metrics such as student retention between first and second year, the proportion of expenditures on student services, graduation rates, the number and proportion of Indigenous students, first-generation students and students with disabilities, and graduate employment rates can all be helpful.
Where the plan goes off track is with the system-wide metrics used to assess research excellence and impact: 1) Tri-council funding (total and share by council); 2) number of papers (total and per full-time faculty); and 3) number of citations (total and per paper). A tabulation of our worth as scholars is simply not possible through narrowly conceived, quantified metrics that merely total up research grants, peer-reviewed publications and citations. Such an approach perversely disincentivizes time-consuming research, community-based research, Indigenous research, innovative lines of inquiry and alternative forms of scholarship. It effectively displaces research that “matters” with research that “counts,” and puts a premium on simply doing whatever counts as fast as possible.
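To make plain how thin this tabulation is, here is a minimal sketch of the arithmetic in Python (the numbers and variable names are invented for illustration and do not come from any SMA document): the entire assessment of “research excellence and impact” reduces to a funding share and two ratios.

```python
# Minimal sketch of the SMA2 research metrics (hypothetical figures,
# not an official calculation): three numbers are all that is counted.

faculty = 1_000            # full-time faculty members (assumed)
papers = 2_400             # peer-reviewed publications (assumed)
citations = 31_200         # total citations (assumed)
tri_council = 52.0e6       # tri-council funding received, CAD (assumed)
system_envelope = 2.1e9    # system-wide tri-council funding, CAD (assumed)

print(f"Papers per faculty:  {papers / faculty:.2f}")
print(f"Citations per paper: {citations / papers:.2f}")
print(f"Funding share:       {tri_council / system_envelope:.1%}")
```

Nothing in these figures registers whether the work was rigorous, original or useful, only that it occurred and was counted.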
How so? Because as universities are inevitably benchmarked and compared with one another on league tables, misguided conclusions about quality will be drawn, creating intense pressure to excel on these narrowly conceived performance indicators. A recent report by the Conference Board of Canada, entitled Beyond Citations: Knowledge Mobilization, Research Impact, and the Changing Nature of Academic Work, concluded that “numbers alone, via citation metrics, evaluation framework scores, or even funds raised, paint only half a picture of research impact.”
Danger ahead
Even more alarming – and what is hardly being discussed – is how these damaging and limited terms of reference will be amplified when the agreement enters its third phase, SMA3, from 2020 to 2023. In this third phase, the actual funding allotments to universities will be tied to their performance on the agreement’s extremely deficient metrics. Here we can confidently predict that the assessment scheme will completely lose its way, as we have already witnessed in countries with similar national performance-based funding schemes, such as the Research Excellence Framework in the U.K., Excellence in Research for Australia, and the Performance-Based Research Fund in New Zealand.
The one-size-fits-all “factory model” of knowledge creation, dissemination and measurement does a disservice to us all. First, not all researchers require funding to carry out their important and potentially ground-breaking work; low-cost research methods should not be devalued or indirectly categorized as lesser. Second, to meet the demands of an audit culture bent on rewarding quantity over quality, researchers will almost certainly face greater pressure to “game” results, whether through p-hacking (also called data dredging) or by “salami-slicing” research into least publishable units, as the sketch below illustrates.
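A minimal simulation makes the p-hacking incentive concrete (a hypothetical setup in Python, not drawn from any cited study): run enough hypothesis tests on pure noise and some will cross the conventional p < 0.05 threshold by chance alone.

```python
# Simulation of "data dredging" (assumed setup, illustration only):
# run 20 hypothesis tests on pure noise and count the "significant" hits.
import math
import random

random.seed(42)

def two_sample_z(n=100):
    """Z statistic for two groups drawn from the SAME distribution (no real effect)."""
    a = [random.gauss(0, 1) for _ in range(n)]
    b = [random.gauss(0, 1) for _ in range(n)]
    mean_a, mean_b = sum(a) / n, sum(b) / n
    var_a = sum((x - mean_a) ** 2 for x in a) / (n - 1)
    var_b = sum((x - mean_b) ** 2 for x in b) / (n - 1)
    return (mean_a - mean_b) / math.sqrt(var_a / n + var_b / n)

tests = [two_sample_z() for _ in range(20)]
significant = [z for z in tests if abs(z) > 1.96]  # naive p < 0.05 cutoff
print(f"'Significant' findings from pure noise: {len(significant)} of 20")
```

On average, about one of the 20 noise-only tests will clear the threshold despite there being no real effect anywhere, which is exactly the false-positive rate the 5 percent cutoff implies; a researcher pressed to publish need only report that one.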
Finally, it is beyond dispute that mere citation counts are not an accurate measure of research quality. As an illustration, consider the recent example of the article “The case for colonialism,” published in Third World Quarterly and since withdrawn. Such an article’s citation count would have been through the roof, but largely because scholars were citing it to critique its claims and dispute its dubious quality.
As Yves Gingras, who holds the Canada Research Chair in the History and Sociology of Science at Université du Québec à Montréal, recently noted in his book Bibliometrics and Research Evaluation: Uses and Abuses, both the Council of Canadian Academies and the Higher Education Funding Council for England have concluded that “mapping research funding allocation directly to quantitative indicators is far too simplistic, and is not a realistic strategy.”
The Government of Ontario needs to rethink its SMA3 strategy now, to avert the damage that will follow from this misguided policy. Now is also the time for Ontario university administrators to speak up and denounce this agreement before further harm results.
Marc Spooner is a professor in the faculty of education at the University of Regina.
5 Comments
DON’T COUNT YOUR CHICKENS
This genre of article appears periodically, decade after decade. The underlying premise is that at some time systems to assess research excellence were actually on track, so that, by the time an article like this is written, the systems must have gone “off track.” In the 1990s, Canadians for Responsible Research Funding (CARRF) did not agree with this “good old days” argument. In general, its members recognized that, from the “start” (to the extent that there was ever a clear starting point, such as immediately post-WWII), assessment had always been off track. CARRF offered diagnoses of the problem and suggestions (e.g. 1, 2). All to no avail.
In those times, the Canadian problem was more serious than in the lavishly funded USA. However, there is now growing disquiet south of the border, and it’s not just Trump. Addressing “systematic problems in the research enterprise,” an article entitled “Rescuing US biomedical research from its systemic flaws” has received much attention (3). More fire and fury to come, but do not count on change any time soon!
1. Forsdyke DR (1993) On giraffes and peer review. FASEB J 7:619-621.
2. Forsdyke DR (2000) Tomorrow’s Cures Today? How to Reform the Health Research System. Harwood Academic, Amsterdam.
3. Alberts B, Kirschner MW, Tilghman S, Varmus H (2014) Rescuing US biomedical research from its systemic flaws. Proc Natl Acad Sci USA 111:5773-5777.
It is not only the effectiveness and relevance of external funding that varies widely by discipline. Citation counts do as well. Different disciplinary cultures, not quality of research, account for this. If this is not corrected for, this train will be a wreck before it leaves the station.
“Not everything that can be counted counts.
Not everything that counts can be counted.”
William Bruce Cameron
Feel free to use whatever metrics you wish, or no metrics at all, as long as you are willing to skip gov’t money. If, on the other hand, you wish someone to give you money, you will find that the person giving you money might have some specific expectations in exchange for that money.
Rather like every prof when s/he goes to the store to buy something.
Sounds like you need to take your own advice, Doc, and Read The F Manual – AGAIN!
In fact, maybe just try reading this piece again…
Or if that’s too much, here’s a Coles Notes version:
Universities contribute countless social benefits and are economic engines, and taxpayers are better served by meaningful & innovative scholarship, not tired, wasteful, and debunked managerial tick-box schemes.