

March 2011

Time to Revisit NSERC Grant Rules

By John D. Murimboh
As science and engineering faculty across the country anxiously await the results of the 2011 Discovery Grant competition, it is worth reflecting on the changes the Natural Sciences and Engineering Research Council of Canada (the program’s administrator) has introduced over the past two years. The October 2010 CAUT Bulletin highlighted problems with the program’s success rates, but a much bigger problem is the disregard of the program’s own rules by NSERC and by the executive committees that make the funding decisions.

For those not familiar with NSERC, adjudication of Discovery Grant applications is a two-step process. Members of evaluation “groups,” subdivided into “sections,” are assigned a subset of applications matching their expertise and use a rating sheet to score each application on three criteria: the excellence of the researcher, the merit of the proposal, and the training of highly qualified personnel. Grant applications with the same overall score are then grouped into funding bins.

Introduced in February 2009, two months after the 2009 competition deadline, this two-step process was supposed to increase transparency.1 However, funding recommendations (step two) are controlled by each group’s executive committee (consisting of section and group chairs) through the application of “funding principles” that are generally not published. We can nonetheless gain some insight from the chair of the chemistry evaluation group, who outlined the principles used in the 2010 funding competition (the most recent year for which results are available) in an email listserv conversation.2

Although the process described is specific to chemistry, it likely also applies to the four other evaluation groups in which only some of the applicants in quality bin “J” were funded. Discrepancies between the chair’s explanation and NSERC’s 2009–2010 Peer Review Manual are also outlined below.

One. “The principles we used were the same principles which have guided former selection committees: fund the early career researchers; we funded both the ECR as defined by NSERC as well as the first renewals.” — chair, chemistry evaluation group

The chemistry evaluation group’s tradition of funding first renewals effectively ended in the 2009 Discovery Grant competition with the elimination of the 3 + 2 funding system for early career researchers and first renewal applicants. I was one of the first to benefit from this change.

First renewal applicants are, in fact, established researchers as defined by the Peer Review Manual (p. 3). There is no mention of first renewals anywhere in the manual for the 2010 competition.

Evaluation groups may establish a different quality cutoff only for early career researchers (ibid., p. 15). It is therefore apparent that the decision to fund first renewal established researchers preferentially was neither consistent from year to year nor permitted under the NSERC rules in effect at the time. While NSERC’s desire to support first renewal applicants is commendable, perhaps the rules should be established before the competition.

Two. “Of the remaining 31 grants (all established researchers), we funded the 10 VSM (very strong/strong/moderate) and not the 21 SSS (strong/strong/strong). For myself, that was based on my view, and the view expressed to me by many members of the community, that too much weight is based on the training of HQP (highly qualified personnel) and the evaluation of the excellence of the researcher is more objective than the other categories.” — chair, chemistry evaluation group

“The combination of an applicant’s ratings for the three criteria determines the funding bin.” (Peer Review Manual, p. 21)

“Each criterion is important and has equal weight when determining the quality category for the application.” (Ibid., p. 3)

Apparently all selection criteria are equal but some are more equal than others.
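To see why this matters, consider the binning arithmetic. The sketch below is illustrative only: the numeric values attached to each rating and the bin letters are assumptions of mine, not NSERC’s published tables, and only the equal-weight combination of the three criteria is taken from the manual. Under any equal-weight scheme, a very strong/strong/moderate (VSM) application and a strong/strong/strong (SSS) application produce the same combined score and land in the same bin.

# A minimal sketch of equal-weight binning. The rating-to-number mapping and
# the bin letters are assumptions for illustration, not NSERC's published tables;
# only the equal weighting of the three criteria comes from the Peer Review Manual.

RATING_VALUE = {
    "exceptional": 6,
    "outstanding": 5,
    "very strong": 4,
    "strong": 3,
    "moderate": 2,
    "insufficient": 1,
}

BIN_LETTERS = "ABCDEFGHIJKLMNOP"  # hypothetical: highest combined score maps to bin "A"

def funding_bin(researcher: str, proposal: str, hqp: str) -> str:
    """Combine the three criterion ratings with equal weight and return a funding bin."""
    total = RATING_VALUE[researcher] + RATING_VALUE[proposal] + RATING_VALUE[hqp]
    max_total = 3 * max(RATING_VALUE.values())  # 18 under this mapping
    return BIN_LETTERS[max_total - total]

print(funding_bin("very strong", "strong", "moderate"))  # same bin ("J" under this mapping)
print(funding_bin("strong", "strong", "strong"))         # same bin ("J" under this mapping)

Preferring the VSM applications over the SSS applications within that shared bin amounts to re-weighting the criteria after the scores have already been combined.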

Three. “The Chemistry community over many years has pushed for selectivity in the awarding of grants. That is clearly evident in the average grant awarded by the Chemistry Evaluation Group: our average grant is $55,092 which is approximately ~$14,000 higher than the next highest average for an evaluation group and ~$20,000 higher than the average grant of all Evaluation Groups. We maintained this principle in deciding not to reduce the bin values. I also worried that if we cut the bin levels this year, they may erode continually over time … The option of decreasing all the grant levels for those awarded grants both last year and this year was not modeled or considered.” — chair, chemistry evaluation group

“Bin levels, budget permitting in a given competition year, are expected to be in a similar range from year to year.” (Peer Review Manual, p. 21)

Corollary: funding for each bin should be reduced if the budget is insufficient. The executive committee’s bias for higher grant levels, therefore, should be irrelevant. According to the manual, grant levels for each bin should be determined from the previous year and adjusted based on the available budget.

It is clear that even in the case of significant financial constraints, the manual contained the procedures necessary to assign funding levels without the need for intervention by the executive committee.
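For illustration, a budget-driven adjustment along these lines might look like the rough sketch below. The proportional-scaling rule, bin values, and award counts are entirely my own assumptions, not the manual’s actual procedure; the point is only that bin levels can be fitted to the available budget mechanically, without anyone deciding after the fact which bins to fund.

# A rough sketch of budget-driven bin adjustment. All numbers and the
# proportional-scaling rule are illustrative assumptions, not NSERC procedure.

def scale_bin_levels(previous_levels, awards_per_bin, budget):
    """Scale last year's bin values proportionally so the total cost fits the budget."""
    required = sum(level * awards_per_bin.get(b, 0) for b, level in previous_levels.items())
    factor = min(1.0, budget / required) if required else 1.0
    return {b: round(level * factor) for b, level in previous_levels.items()}

previous = {"H": 60_000, "I": 50_000, "J": 40_000}   # last year's bin levels (illustrative)
counts = {"H": 20, "I": 25, "J": 31}                 # awards to be made in each bin (illustrative)
print(scale_bin_levels(previous, counts, budget=3_000_000))
# roughly {'H': 48780, 'I': 40650, 'J': 32520}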

Furthermore, it would appear that all three funding principles used by the executive committee violate numerous rules set out in the manual. In addition, these funding principles seem to change arbitrarily from year to year (e.g., the treatment of first renewal applicants). While the 2010–2011 manual has been revised to reflect the preferential treatment afforded to first renewal applicants, the other violations likely remain.

Although I appreciate that executive committees should rely on funding principles appropriate to their communities, presumably these funding principles must be consistent with the Peer Review Manual. Otherwise, what is the point of establishing the rules in the first place?

At the very least, the process has the appearance of unfairness. In the worst case, it suggests the executive committees did not like the rules established by NSERC, so they simply invented their own after the grant submission and merit assessment process to achieve a predetermined outcome — increased grant sizes through a reduction in success rates.

It seems the new Discovery Grant adjudication process is actually much less transparent than the system it replaced, as the all-powerful executive committees simply override the decisions and the generally excellent work of the evaluation groups.

Those considering an appeal of grant competition results may be surprised to learn that showing NSERC has violated its own rules may not be sufficient for a “compelling demonstration of error or procedural unfairness in the review process.”

NSERC’s endorsement of the executive committees’ violations is particularly disappointing for a system that was once regarded as the fairest in the world. Given NSERC’s reluctance to abide by its own rules, perhaps it should dispense with the adjudication process and simply “sign below” to indicate it does not approve funding.

---------------------------------------------------------------
John Murimboh is a professor of chemistry at Acadia University.

1. NSERC. 2009–2010 Peer Review Manual. Retrieved October 18, 2009.

2. Chair, Chemistry Evaluation Group. (June 7, 2010). “Follow up to last e-mail: Funding of Bin J.” ccucc-l mailing list. http://www.acadiau.ca/~jmurimbo/docs.htm.

The views expressed are those of the author and not necessarily those of CAUT.
