In my opinion

It’s time to change the way research grants are awarded in Canada

SSHRC’s peer-review system is protracted, opaque and unaccountable.


I failed to get a Social Sciences and Humanities Research Council Insight Grant this year. Disappointing, but not surprising, since nearly 80 percent of applicants came up short. Academics often experience failure in publishing and grantsmanship. It’s part of the job. In this case, though, it’s how I failed, and what this says about accountability and peer review at SSHRC, that warrants consideration.

Like others, I found the SSHRC application process especially protracted. Long and short versions of the research proposal are combined with pages of forms and many other requirements. Final page count for the entire application: 40. Total hours invested: 70 to 80.

Early on, I often asked myself, why do this? Since my last SSHRC research grant, I had supported my work with funds from visiting fellowships and internal research grants. None of these required 70 hours to apply. My graduate students are funded from faculty sources, not supervisor research grants. A rational move would be to skip the whole exercise.

I didn’t for a couple of reasons. My university, like others, has moved to link internal research funding to SSHRC/Tri-Council applications. Don’t apply for a SSHRC grant and you are not eligible for some pots of intra-university funding.

More important, my proposed research was labour intensive: a historical account of cigarette smoking in youth media after 1945. This would mean examining films, TV shows, comic books and teen magazines. It would require research assistants and funding for archival visits. Tobacco companies, involved in litigation, had for years hired university historians to do similar work. It was time, I thought, for a non-aligned scholar to do so too.

SSHRC seemed the best option, so I spent much of August and September dealing with training plans, budget justifications, and a capricious online application system.

The following April I learned that my proposal was unsuccessful. Six weeks later, SSHRC sent me a package with two reports from anonymous external reviewers, most likely experts in tobacco history. Both offered unqualified praise and support for the project (25 of 26 scores were in the highest possible category). I couldn’t recall getting a more positive set of peer reviews in my 15 years in academe.

On another page was my score from the SSHRC selection committee, which was in the bottom 35 percent of applicant scores. The committee as a whole did not produce this score. Instead, SSHRC uses a “triage” method to adjudicate files. Initially, each application goes to two committee members. If at this stage one’s combined tally falls within the bottom 35 percent of scores, then it’s effectively game over; the application does not go to the entire committee.

Incredibly, no comments are provided to explain this score, not from the two relevant committee members or anyone else at SSHRC. Weak publishing record? Budget inflated? Project too ambitious? I have no idea. The policy is for no written feedback to be sent to the “Group of 35 Percent.”

I contacted a SSHRC program officer to ask about this, to see whether I could at least be sent notes from the two committee members who had read my application. I received this response: “In terms of committee members’ notes, we ask them to destroy them as soon as the meeting is over. The conflict of interest sheet that they sign does tell them not to discuss any applications or keep any notes. We do not keep preliminary notes.”

What about the discrepancy between the external reviews and my low score? It happens, I was told. Might an unreasonably low score from a single committee member who is not required to provide any written rationale effectively scupper an application? This was unlikely since “committee members take their responsibilities seriously.”

My application had identified historians at four Canadian universities who worked for tobacco companies, some serving as expert witnesses in court. Two members of the SSHRC committee were departmental colleagues of these historians. I advised SSHRC of this before the evaluation, asking if conflict-of-interest provisions prevented them from assessing my file. I was told after the competition that conflict of interest did not factor in the assessment of my application.

That may be true. But how can I really know? SSHRC policy prevents the identification of committee members who read particular files at the triage stage. They, in turn, are not required to provide written justifications for low-score assessments. It is optional whether they consider the external reviews.

SSHRC trumpets its commitment to the peer-review process, but it’s a bizarre form of peer review. Imagine a peer-reviewed journal whose editor solicited reviews from experts which came back highly positive. The editor then rejected the paper outright without explanation to the author. If done routinely, how long would this journal remain credible?

What about the SSHRC appeal process? For that, one must show that procedural errors occurred during adjudication. But SSHRC’s procedures permit the very things presented here as most problematic: triage evaluation, take-it-or-leave-it use of external reviews, and no comments for nearly four in 10 applicants.

What’s the solution? SSHRC sees its main problems as insufficient research funds and too many applicants. The latter necessitates triage assessment and the forgoing of written feedback to many. Surely better options exist. The number of applicants could be reduced, for example, by having those with A-to-M surnames apply in even-numbered years. Or a “first cut” of grant applicants could occur at the university or college level, as exists with the SSHRC doctoral award program.

More substantively, Ottawa could adopt a variant of what happens with the Ontario Graduate Scholarship program: SSHRC could provide block funding to universities and colleges and have them administer competitions for certain grants. SSHRC would still set research funding priorities. But instead of adjudicating grant competitions, it would, in three- or five-year intervals, evaluate the research output of grant holders and their institutions. This would require fewer bureaucratic resources than the current system, freeing up money for under-funded research programs.

On its website, SSHRC promotes its adherence to “merit review”, a “transparent, in-depth and effective way to allocate public research funds.” If my experience with this “merit review” process is commonly shared, then it may be time to reassess SSHRC’s mandate and operating practices.

Daniel Robinson is a historian and associate professor in the media studies program, faculty of information and media studies, at Western University.


University Affairs moderates all comments according to the following guidelines. If approved, comments generally appear within one business day. We may republish particularly insightful remarks in our print edition or elsewhere.


  1. Catherine Rubincam / July 4, 2013 at 05:06

    Having served a 3-year term a dozen years ago on one of SSHRC’s peer review committees (acting as Chair of the committee for two of those years), I read Dr. Robinson’s article with particular interest. Alternative schemes for the allocation of grant money, such as those made here, are certainly worth thinking about, but a campaign to achieve such changes would require a lot of time and energy. In the meantime, I would suggest that he might be able to get helpful advice by seeking out a colleague who has served on a peer review committee in a subject area close to his, and asking that person to read through all the documents related to his application (i.e., the application itself, the external assessors’ reports and the scores received from SSHRC, and the publicly available information about the rate of success of applications in the relevant competition). A scholar who has participated recently in the decision-making process of one of these committees should be able to explain more fully how decisions are reached, and make specific suggestions about aspects of the application that could be improved. I am not advocating an approach to a member of the particular committee that gave Dr. Robinson’s application a low score, which would involve serious problems of confidentiality, but rather arguing that the general expertise acquired by service on a similar committee could be helpful to anyone wanting to rework an unsuccessful application. I speak from my own participation in such a process.

  2. Axel van den Berg / July 4, 2013 at 06:15

    Thanks for this piece, Daniel. I had the exact same experience with my SSHRC application this year. One evaluator gave me ONLY top marks and the other gave me good to excellent in most categories. Then the committee systematically gave me lower marks than either of them. Since I have only the evaluators’ comments, I am totally in the dark as to how to improve my chances of getting a grant. Curiously, the strong evaluation may have hurt my chances. My grant officer at the university told me that committee members “tend to ignore comments from external reviewers who express very extreme opinions, i.e. overly positive or negative.” So we’re better off with lukewarm support? This is no way to run a funding agency.

  3. David Graham / July 4, 2013 at 09:14

    Based on my own admittedly dated experience (member and chair of the SRG interdisciplinary committee, chair of the former MRCI committee, member of various SSHRC working groups, etc.), I agree with Dr Rubincam’s advice. I don’t doubt that there are problems with the current system, just as there have been problems with all the previous peer-review committee assessment systems, but I know that when I’ve been unsuccessful I’ve always had exactly the same feelings that Dr Robinson had, and so I sympathize. Conflict of interest and arbitrariness are definitely problems, as is theoretical bias of one sort or another (confirmation bias being a huge problem, as it is everywhere) but I would raise the following points:

    1. With some exceptions, it is normal for evaluators to give high scores, in part because the majority of the applications are very good; this is not very helpful to the committee, since there tends to be little variance across the evaluations to help discriminate (and with the amount of money available and the constantly rising level of demand, discriminate the committee must!). In practice, then, there will be little difference between any two adjacently ranked applications including those situated just above and just below the cut-off line. Is this optimal? Hardly, but the financial realities are implacable.

    2. This means that committee members may tend to focus on small imperfections in an application and “overweight” them in order to be able to discriminate clearly; I’m not arguing that this is the best or right way to do it, I’m just saying that this is what tends to happen in practice.

    3. The problem of rising demand cannot be dealt with as easily as Dr Robinson suggests: dividing applicants by surname does nothing to reduce the total number of applications over time (as opposed to in a single year sample). If applicants were limited to one application every five (or ten, or fifteen) years, that would do the trick, but I can imagine the reaction to such a proposal!

    4. Doing a first cut at the university level does nothing to address the fundamental issues (arbitrariness, conflict of interest, opacity): it would merely displace them, and in fact might well exacerbate them, because the assessors would be closer to the applicants and thus more likely to be personally biased. The same issues would apply in the case of a block grant scheme, alas, though with a solid performance assessment as Dr Robinson suggests, that might be an interesting approach to try.

    5. The biggest issue, in my view, is the one Dr Rubincam’s comment addresses, namely the “black box” nature of the process. Getting general advice about committee culture and the process from those most familiar with it is invaluable. Serving a term on a committee is especially enlightening! If your university provides grant facilitation service, taking advantage of it to identify fundamental issues in your application can be extremely useful (e.g., to identify flaws in logic, missing documentation, conformity to the committee’s mandate, etc.), as can getting the advice of a successful scholar in your own field (e.g., to assess how valuable and original your idea is: if the central idea is not good, treating the cosmetic blemishes is useless).

    Apologies for the length of this post, but the issues raised by Dr Robinson are not going away any time soon, and are in fact likely to continue to become even more acute (as are others, in particular the insistence of governments on providing targeted funding).

  4. Gabriella Solti / July 4, 2013 at 14:02

    Dr Rubincam’s advice is very thoughtful, and I think it is the right approach to improving the quality of research proposals and the success rate in obtaining SSHRC Insight Grants, especially for Dr. Robinson, who works at a university with an outstanding Research Office. I speak from personal experience; they helped me greatly.

    I believe Dr. Robinson’s suggestion to introduce a block-funding (quota) system could lead to an unfair situation. Currently all universities and public colleges are eligible to hold a SSHRC Insight Grant; it’s a level playing field. But what would happen if Dr. Robinson’s suggestion of a block-funding model without national competition were implemented? Would he suggest considering the size of a university’s enrollment (which has nothing to do with quality of research) in the quota calculation, or the past performance of his university in obtaining SSHRC grants? This could lead to the complete exclusion of smaller universities and colleges if they had no grants, or only insignificant ones, in the past, exactly at a time when they are hiring faculty with research degrees and becoming more and more successful in competing for large SSHRC grants. Quite a number of smaller colleges have obtained university status in recent years, and they are very ambitious. Let them take the grants if they can in a fair process, even if it means fiercer competition for larger universities.

    I am also surprised that the effort of writing a grant application and investing 70 to 80 hours of work in the process is considered a waste of time. Shouldn’t writing a proper research plan, a solid budget, a resource plan and a training plan for graduate students be part of the daily life of a researcher in academia, regardless of whether they apply for a grant or not? These are very important activities. I am an artist as well as a graduate student with three years’ experience assisting my former professor in applying for multi-year, major SSHRC Insight Grants, this year successfully. In my own practice as an artist I have to compete for every exhibition opportunity (and they don’t pay), and for even the tiniest grant or a partially paid artist residency I have to write extensive proposals. In the arts we don’t have big grants and competition for funds is far fiercer than in academia, but most of the time writing those applications is very helpful in clarifying our ideas and artistic intents, and often they propel us to realize an artistic project regardless of whether we can obtain full funding. Artists consider grant applications part of their art practice. Maybe researchers could consider their grant applications part of their research practice, something that is useful in itself and essential to research beyond its potential monetary value.

  5. Amelia / July 4, 2013 at 14:31

    In reply to Axel’s comment: I’d agree that lukewarm support is best. Mediocrity and “blending in” are key qualities for getting grants, jobs, etc. in this country. Sad, but true. Oh, and let’s not forget the magic of “networking”: you gotta know those “committee members [who] take their responsibilities seriously.” Wonder why graduates of great universities are semi-illiterate? It’s all connected…
