Wasting time and money through ineffective evaluation
We need to start thinking about why a process exists and whether the proposed mechanism actually achieves that goal.
Over the last 11 months chairing our departmental research committee, I’ve gained a much broader understanding of the administrative workload that goes into decision-making at universities. Inevitably, this experience is unique and many universities will do things differently, but there are some insights I feel readers will be interested in hearing, and I would be equally keen to hear how other institutions approach similar issues. With increasing demands on our time, it is important to consider the activities and processes that consume this time and whether they are fit for purpose – often they are not.
Perfection has a cost
One of the most common “pots of money” to direct is research pump-priming. In an ideal world, applicants would provide a detailed case for support that could be evaluated in a manner not dissimilar to a national grant panel with expert review. The application process would also prime the grant for external submission – sounds brilliant, right? The problem for most universities comes when you look at who would be making the decision about what to fund. At the vast majority of universities, you would struggle to assemble a panel with the appropriate expertise to assess the proposed research in depth, and this has a huge potential to waste time and misdirect priorities. Such a flawed process also raises the question of why we should ask applicants to write detailed proposals when the likely prospect is that they will never be assessed by an expert. If the intention is to improve the chance of external success at a later date, we are better off designing ways of seeking external expert advice rather than pretending that a non-specialist panel can do the job.
Money spent awarding > money awarded
Internal university research/strategic money is rarely at the scale of external funding organizations (in amount or duration), and it is important to remember this when the process of fund disbursal is crafted. If a panel of eight professors each spend five hours reading proposals and one hour discussing them, supported by eight hours of administrator time to organize the proposals and the ranking system, you quickly find yourself in the $15K to $25K range in salary costs alone. If the total amount of funding on offer is not much more than that sum, the process is incredibly difficult to justify, especially since this back-of-the-envelope calculation does not account for the time invested by the applicants themselves.
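As a rough illustration of where that figure comes from (the hourly rates below are assumptions for the sketch, not actual salary data), the arithmetic looks something like this:

\[
\underbrace{8 \times 6\,\text{h}}_{\text{panel time}} \times \$300/\text{h} \;+\; \underbrace{8\,\text{h}}_{\text{admin time}} \times \$60/\text{h} \;\approx\; \$14{,}400 + \$480 \;\approx\; \$15\text{K}
\]

With higher fully loaded professorial rates, the same 48 panel-hours push the total toward the upper end of the range.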
Simpler is better
In light of these sorts of arguments, we have moved some of our processes to a simple, streamlined application and assessment procedure. In many cases, this consists of one or two 100- to 200-word boxes to justify research need, strategic priorities, and so on. These can be written and evaluated by more people in less time, can draw valuable strategic input from generalists, and free up people and funds to support other tasks. Moreover, the risk of making a “wrong” decision is quite low when you consider that applicants are already members of the institution and the sums involved are meant to prime things for the future rather than alter the course of a researcher’s career. Thus far, feedback on the new process has been uniformly positive from both the applicant and assessor perspectives.
Past performance predicts future success
Readers may see the above as a slightly reckless process with insufficient research oversight. To mitigate the risk of wasted investment, simplified processes such as the one described above should incorporate some form of retrospective assessment of how previously awarded funds were spent. This is relatively easy to manage internally and guards against serial offenders who do not deliver on their promises. In such a scheme, early career researchers must be given the benefit of the doubt rather than penalized for the absence of a track record. Finally, on the administrative side, this extra layer of scrutiny is applied retrospectively only to successful applicants rather than to all applicants, again saving time and avoiding unnecessary assessments.
Academics need to stop creating their own problems
Overall, we need to avoid becoming our own worst enemy with unnecessarily onerous processes. This goes well beyond pump-priming research projects and is often cured by a little thought about why a process exists and whether the proposed mechanism actually achieves that goal. Pumping time and money into making things perfect typically results in an inability to get everything done. After all, it’s better to brush your teeth badly than not at all.