NSERC Discovery grants
Peer reviewers’ perspectives

Q. It looks like my institution is going to take advantage of the volatility in the U.S. right now by poaching one or more U.S.-based faculty. What do established researchers new to the Canadian funding environment need to know to be successful with Tri-Agency?
– Anonymous, Research Grants Facilitation
A. To answer this question, I’m drawing on interviews I conducted with Tri-Agency peer reviewers last year. I’ve already shared what peer reviewers for the NFRF Exploration and Transformation, CIHR Project and SSHRC Insight and Insight Development Grant competitions told me; this month, I’m focusing on my findings from interviews with 28 former NSERC Discovery peer reviewers, including at least one person from each of the competition’s 12 committees.
One important distinction between Discovery and the other major Tri-Agency competitions is that, for Discovery, you’ll need to articulate a program of research rather than an individual project. This is baseline knowledge. Yet my conversations with Discovery reviewers — some of whom also review for CIHR or SSHRC — revealed additional points of divergence that strong grant-writers can take advantage of to meet the expectations of NSERC reviewers.
Below, I’ll summarize the main themes that I heard from Discovery peer reviewers; I also have a free 26-page PDF that shares peer reviewers’ thoughts in their own words.
1. Discovery reviewers want big steps and strong legs
If you were applying to CIHR, you’d want to have pilot-project or preliminary data that you could use to show the viability of your proposed work. But NSERC Discovery peer reviewers told me they are much more interested in discovery than in incremental research — meaning that they’re excited to see a sense of open-endedness, a sense of not-yet-knowing, a sense of the possibility of unexpected as well as expected outcomes.
So that’s the big step — but what are the strong legs?
Reviewers across Tri-Agency competitions regularly told me that the best evidence of a person’s ability to do something is their having done that thing before. For the NSERC Discovery, instead of turning to pilot data to see that evidence, reviewers told me they went to the Canadian Common CV (CCV) — a web portal that allows researchers to enter their CV data once and output it in formats suitable for submission to CCV Network member organizations, including Tri-Agency.
Most NSERC Discovery peer reviewers read the CCV before they look at any other part of an application — even the summary. So, while you’ll want to describe a big step forward, you’ll also have to show that you have the capacity to make such a step — either because you have a track record of big steps, or because the culmination of your experience and expertise has positioned you to take one now.
In my opinion, the forthcoming move to a narrative CV for grant competitions will be great for Discovery applicants. The CCV is a nightmare, yet some peer reviewers told me that, when they find errors in applicants’ CCVs, they think, “if this applicant can’t do something as basic as submit a correct and up-to-date CCV, how much can I depend on their ability to pay attention to detail in their experiments?” That’s not good for applicants, and it’s not good for science either: a person’s ability to wrestle with a glitchy and frustrating submission portal is no reflection of their ability to conduct careful and meticulous science.
When you write your narrative CV instead of wrestling with the CCV website, that shift should help you to see the CV as a part of your application — a persuasive tool that needs to be aligned with the rest of your application materials.
At the same time, I’m also worried that the switch to the narrative CV won’t be good for applicants who don’t have strong writing skills, including applicants whose first language isn’t English. At the moment, that’s not a concern I’m hearing Tri-Agency acknowledge or address.
2. If the problem isn’t your CV, then it’s your methodology
When NSERC Discovery peer reviewers finish reading the CCV and get into the proposal proper, they often identify the methods section as the main point of weakness.
Peer reviewers told me about applicants:
- not justifying their choice of methods or their sample size;
- not articulating how and why their selected methods will answer their research question;
- not demonstrating the ability to implement their methods;
- not outlining the step-by-step of their planned approach;
- and, of increasing importance, not addressing equity, diversity and inclusion in both their research design and practice.
Based on this input, it seems to me that applicants are misunderstanding what it means to write for a broad, non-expert audience in the NSERC Discovery context. As a result, a surprising number of applicants are misfiring on the methods section. Remember that there are any number of methodological approaches that can address a single objective or research question. Be sure to justify the methodological choices you’ve made.
3. You can’t take a slapdash approach to equity, diversity and inclusion
Finally, when it comes to EDI in research practice — that is, the day-to-day, week-to-week operations of your team, including your trainees — peer reviewers told me that they have little patience for generic cut-and-paste statements that seem to have come from a central research office rather than from the individual applicant in their specific lab and context.
Peer reviewers want to learn about the challenges to access and inclusion that exist in an applicant’s specific discipline, niche and institution, and their strategies to address those problems. If your discipline’s problem is a leaky pipeline, then you should be able to identify where the leak is located and how you’re going to help fix it.
By avoiding boilerplate, you’ll distinguish your approach from those of other grant applicants, which may help tip your application over the line between unfundable and fundable projects.
One key issue remains, though: peer reviewers demand comprehensive EDI plans upfront, but there’s no follow-up to verify that successful applicants actually put their plans into practice. A narrative CV could perhaps serve as that check in the future, if applicants choose to include this information, and peer reviewers use it as part of their evaluation.
To learn more about how NSERC Discovery peer reviewers describe their work — including details about what they want to see in applications, and how they expect applicants to write them — please check out my free 26-page PDF.