DEB Numbers: FY 2017 Wrap-Up

Fiscal year 2017 (FY 2017) officially closed out on September 30. Now that we are past our Fall panels, we have a chance to look back and report on the DEB Core Program merit review and funding outcomes for FY 2017.

This post follows the format we’ve used in previous years. For a refresher, and lengthier discussions of the inner workings of the metrics, you can visit the FY 2016, FY 2015, FY 2014, and FY 2013 numbers.

FY 2017 Summary Numbers

The charts below all reflect DEB Core Program projects through each stage of the review process: preliminary proposals, full proposals, and awards.

DEB reviewed 1384 preliminary proposals received under the DEB Core Programs solicitation and LTREB solicitation in January 2017, about 28% of which were invited to the full proposal stage.

The preliminary proposal invitees were joined at the full proposal stage by 1) Direct submissions to DEB under the CAREER, OPUS, and RCN solicitations, and 2) Projects shared for co-review by another NSF program. Altogether, 514 full proposals were reviewed in DEB during the Fall of 2017 (this includes the OPUS, CAREER, RCN, and co-reviews).

From this pool of full proposals, DEB made awards to 119 projects. Below, we present and discuss the Division-wide success rate and some select project demographics. The demographic numbers are presented as proportions for comparison across the review stages.

Success Rate


Figure 1: DEB Core Program success rates from fiscal year 2007 through the present. Prior to fiscal year 2012, there were two rounds of full proposal competition per fiscal year. Preliminary proposals were first submitted in January 2012, initiating the 2-stage review process and leading to the fiscal year 2013 award cohort.

Calculation Notes:

Preliminary proposal success rate is calculated as the number of invitations made divided by the number of preliminary proposals submitted.

Full proposal success rate is calculated as the number of awards made, divided by the number of full proposals reviewed, including OPUS, CAREER, and RCNs.

Note that post-2012, under the preliminary proposal system, the set of full proposals reviewed is ~80% invited full proposals and ~20% CAREER, OPUS, RCN and co-reviewed proposals, the latter of which were exempt from the preliminary proposal stage.

Overall success rate is calculated as the number of awards made divided by the total number of distinct funding requests (i.e., the sum of preliminary proposals submitted plus the exempt CAREER, OPUS, RCN, and co-reviewed full proposals).
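Put concretely, the three rates can be computed from the FY 2017 counts reported above. In this sketch the invite count (388, about 28% of 1,384) and the exempt count (full proposals minus invitees) are approximations inferred from the post, not official figures:

```python
def success_rates(prelims, invites, fulls, awards):
    """Compute the three DEB success-rate metrics described above.

    CAREER, OPUS, RCN, and co-reviewed proposals skip the preliminary
    stage, so the exempt count is approximated here as fulls - invites.
    """
    exempt = fulls - invites
    return {
        "preliminary": invites / prelims,   # invitations / prelims submitted
        "full": awards / fulls,             # awards / full proposals reviewed
        "overall": awards / (prelims + exempt),  # awards / distinct requests
    }

# FY 2017 counts from the post; the invite count is approximate (~28% of 1,384).
rates = success_rates(prelims=1384, invites=388, fulls=514, awards=119)
print({name: f"{value:.1%}" for name, value in rates.items()})
```

With these inputs the full proposal success rate is roughly 23% and the overall rate lands near the ~8% discussed in earlier wrap-ups.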

Reminder: Elevated success rates (in 2009 and 2012) were due to:

  • a one-time ~50% increase in funding for FY2009 (the ARRA economic stimulus funding) without which success would have been ~13-15%; and,
  • a halving of proposal submissions in FY2012 (the first preliminary proposal deadline replaced a second full proposal deadline for FY2012), without which success would have been ~8-9%.

Individual and Collaborative Projects

As a reminder to readers: the gap between the proportion of single investigator projects at the preliminary proposal and full proposal stages is due to the single-investigator proposals in the CAREER and OPUS categories. The CAREER and OPUS proposals are not subject to the preliminary proposal stage. Their absence at the preliminary proposal stage lowers the single investigator proportion of the preliminary proposal counts relative to the historical full proposal baseline.


Figure 2: The proportion of DEB Core Program projects led by a single PI over time and at the different stages of merit review.

The proportion of collaborative proposals in our award portfolio rebounded from last year’s drop and is near the all-time high for both full proposals and awards. This is consistent with the general trend toward greater collaboration over the past decade and beyond.


Figure 3: The proportion of DEB Core Program projects with two or more different institutional participants over time and at the different stages of merit review.

Readers may notice that the collaborative and single-investigator groupings don’t sum to 100%. The remainders are intra-institutional multi-PI arrangements; such projects are certainly intellectual collaborations, but they are not a “collaborative project” per the NSF PAPPG definition (Figure 3).

Early Career Scientists

The best identifier of researcher career stage is the difference between the year that the PI obtained their Ph.D. (as self-reported by the PI) and the submission date. This “Degree Age” metric can be used as a proxy for how long each individual has been in the population of potential PIs.
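The "Degree Age" arithmetic is simple enough to state directly. A minimal sketch (the function name and example values are ours, purely illustrative):

```python
def degree_age(phd_year: int, submission_year: int) -> int:
    """Years elapsed since the PI's (self-reported) Ph.D. year
    at the time a proposal is submitted."""
    return submission_year - phd_year

# Illustrative example: a PI who earned a Ph.D. in 2010 and
# submitted a full proposal in 2017 has a degree age of 7.
print(degree_age(2010, 2017))
```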


Figure 4: Distribution of degree ages among PIs on DEB Core Program full proposal submissions.


Figure 5: Full proposal success rates for PIs on DEB Core Program proposals by degree age. Figure displays annual data and a 5-year mean for the period of the preliminary proposal system in DEB.

Gender & Predominantly Undergraduate Institution (PUI) Status


Figure 6: The representation of female PIs and predominantly undergraduate institutions in DEB Core Program proposals and awards. These two groups were noted by the community as groups of concern that would be potentially impacted by the pre-proposal system.

Concluding Thoughts

This concludes our 5th fiscal year wrap-up. This series started in 2013 to track metrics that some PIs thought would be sensitive to the implementation of preliminary proposals in 2012. However, given our move to a no-deadline model and the elimination of the preliminary proposal system, next year’s fiscal year wrap-up may look a little different. We still plan to report our funding rates, but the other metrics will change.

Spring 2017: DEB Preliminary Proposal Results

This past week, DEB completed processing all preliminary proposals submitted to the January 23rd 2017 deadline. Below is a summary of the outcomes for this year.

Panel Recommendations

The “Invite” column in the chart above reflects the panels’ recommendations while the “Total Invited” column reflects the programs’ recommendations. Each program’s final invite decision was based not only on the panel recommendation but also the availability of funds and portfolio balance.

The four DEB clusters convened 10 preliminary-proposal panels. Panelists reviewed 1,384 preliminary proposals and recommended 346 be invited for full proposal submission. We are very thankful to panelists who traveled from all over the country to participate in our merit review process. DEB program officers subsequently made adjustments for portfolio balance and invited 373 (27%) for full proposal submission.

By this time, all PIs who submitted a 2017 preliminary proposal should have heard back from DEB about the program’s recommendation (“Invite” or “Do Not Invite”). If you have not, please visit Fastlane.nsf.gov, select the “Proposal Functions” option, and then click on “Proposal Status.” If you were a co-PI, please follow up with your lead PI.

The chart below shows long-term trends in the numbers of preliminary proposals DEB has received since 2012, as well as the total invite numbers and percentages. As you can see, the numbers submitted have been decreasing and the overall invite rate has been increasing.

[Chart: DEB preliminary proposal submissions, invitations, and invite percentages, 2012–2017]


DEB Numbers: FY 2016 Wrap-Up

Fiscal year 2016 officially closed out on September 30. Now that we are past our panels in October and early November, we have a chance to look back and report on the DEB Core Program merit review and funding outcomes for FY 2016.

This post follows the format we’ve used in previous years. For a refresher, and lengthier discussions of the hows and whys of the metrics, you can visit the 2015, 2014, and 2013 numbers.

Read on to see how 2016 compares.

FY2016 Summary Numbers

The charts below all reflect proportions of DEB Core Program projects through each stage of the review process: preliminary proposals, full proposals, and awards.

In the review process leading to awards in FY2016:

DEB reviewed 1502 preliminary proposals received under the DEB Core Programs solicitation and LTREB solicitation in January 2015, about 26% of which were invited to the full proposal stage.

The preliminary proposal invitees were joined at the full proposal stage by 1) Direct submissions to DEB under the CAREER, OPUS, and RCN solicitations, and 2) Projects shared for co-review by another NSF program. Altogether 524 full proposals were reviewed in DEB during October and November of 2015.

From this pool of full proposals, DEB made awards to 133 projects (technically, these were 202 separate institutional awards but for analysis purposes we count collaborative groups once, i.e., as a single proposed project).

Below, we present and discuss the Division-wide success rate and some selected project demographics that were concerns coming into the preliminary proposal system. The demographic numbers are presented as proportions for comparison across the review stages. However, the progressive reduction in the size of the denominators from preliminary proposals (1502) to awards (133) means each step becomes more sensitive to small absolute changes.

Success Rate

Success rate is a function of the number and size of requests submitted by the research communities and appropriated funding levels. The success rate for research proposals in the DEB Core Programs (Figure 1) has stabilized (even rebounded somewhat) since the preliminary proposal process was instituted. This stabilization emerges from:

  • Stable award numbers: Since FY 2013 the number of Core Program awards has consistently been between 131 and 136 funded projects[i].
  • Reduced request numbers: The initial wave of new people “testing the waters” by submitting a preliminary proposal is subsiding. As PIs became more familiar with the process, total submissions have dropped by about 10% across the last three years. With stable award numbers, fewer submissions translate directly into a 10% increase in overall success rate from 7.3% in FY2013 to 8.1% for 2016.
Figure 1: DEB Core Program success rates from fiscal year 2007 through the present. Prior to fiscal year 2012, there were two rounds of full proposal competition per fiscal year. Preliminary proposals were first submitted in January 2012, initiating the 2-stage review process and leading to the fiscal year 2013 award cohort.


Calculation Notes:

Preliminary proposal success rate is calculated as the number of invitations made divided by the number of preliminary proposals submitted.

Full proposal success rate is calculated as the number of awards made, divided by the number of full proposals reviewed.

Note that post-2012, under the preliminary proposal system, the set of full proposals reviewed is ~80% invited full proposals and ~20% CAREER, OPUS, RCN and co-reviewed proposals, the latter of which are exempt from the preliminary proposal stage.

Overall success rate is calculated as the number of awards made divided by the total number of distinct funding requests (i.e., the sum of preliminary proposals submitted plus the exempt CAREER, OPUS, RCN, and co-reviewed full proposals).

Reminder: Elevated success rates (in 2009 and 2012) were due to:

  • a one-time ~50% increase in funding for FY2009 (the ARRA economic stimulus funding) without which success would have been ~13-15%; and,
  • a halving of proposal submissions in FY2012 (the first preliminary proposal deadline replaced a second full proposal deadline for FY2012), without which success would have been ~8-9%.

Individual and Collaborative Projects

As seen in Figure 2 below, there was little year-to-year change in the submission and funding success of single investigator projects. While the proportion of single investigator preliminary proposals increased slightly, there was a small decrease in both the full proposal and award groups. As a reminder to readers: the gap between the proportion of single investigator projects in the preliminary proposal and full proposal stages is due to the single-investigator proposals in the CAREER and OPUS categories. The CAREER and OPUS proposals are not subject to the preliminary proposal screen and make up a relatively larger portion of the full proposals.  Similarly, the absence of CAREER and OPUS proposals at the preliminary proposal stage lowers the single investigator proportion of the preliminary proposal counts relative to the historical full proposal baseline.

Figure 2: The proportion of DEB Core Program projects led by a single PI over time and at the different stages of merit review.


The proportion of collaborative proposals in our award portfolio rebounded from last year’s drop and is near the all-time high for both full proposals and awards. This is consistent with the general trend toward greater collaboration over the past decade and beyond.

Figure 3: The proportion of DEB Core Program projects with two or more different institutional participants over time and at the different stages of merit review.


Readers may notice that the collaborative and single-investigator groupings don’t sum to 100%. The remainders are intra-institutional multi-PI arrangements; such projects are certainly intellectual collaborations, but they are not a “collaborative project” per the NSF PAPPG definition (Figure 3).

Early Career Scientists

The best identifier of researcher career stage is a metric we calculate: the difference between the year the PI obtained their Ph.D. (as self-reported by the PI) and the current date. This “degree age” can be used as a proxy for how long each individual has been in the population of potential PIs.

Figure 4: Distribution of degree ages among PIs on DEB Core Program full proposal submissions.


Figure 5: Full proposal success rates for PIs on DEB Core Program proposals by degree age. Figure displays annual data and a 4-year mean for the period of the preliminary proposal system in DEB.


Little changed in the profiles of submitter and awardee degree ages from 2013 through 2016. Moreover, success rate improves slightly with degree age: success rates climb slowly from 20% for the newest Ph.D.s to 30% at 35 years post-Ph.D. Note: PIs more than 35 years post-Ph.D. comprise ~5% or less of the total PIs on proposals and awards. Although more experienced PIs have a somewhat better individual success rate, the PI population skews toward the early career group. Thus, early-, mid-, and later-career PIs wind up with similar representation in core program award decisions.

Gender & Predominantly Undergraduate Institution (PUI) Status

Another concern heading into the preliminary proposal system was that there would be unintended consequences for different categories of submitters. Two years ago we saw a small change in award numbers lead to a visually jarring drop in the representation of female PIs among DEB awards, as well as a jump in the proportion of PUI awardees. Last year, we saw the pattern reversed. Beyond this apparent negative correlation between the proportions of female PI awardees and PUI awardees, the award data simply appear noisier than they were under the old system. But, as we stated in last year’s 2015 Wrap-up post:

“We conclude that we are witnessing a small numbers effect; 131 awards is simply not a sufficiently large ‘sample’ to be representative of the population across all of the potential decision-making variables and concerns. PUIs are a minority component of the proposal pool (~18%). Female PIs are a minority component of the proposal pool (~30%). Beginning Investigators are a minority component of the proposal pool (~20%). Proposals that fall into two categories are an even smaller fraction of the proposal pool (~6%) and proposals that fit into all three are even smaller yet (~1%).”

Which now brings us to 2016.

Figure 6: The representation of female PIs and predominantly undergraduate institutions in DEB Core Program proposals and awards. These two groups were noted by the community as groups of concern that would be potentially impacted by the pre-proposal system.


Once again, we see the same pattern in the representation of female PIs and PUIs in the award portfolio: one goes up, the other down. As best as we can determine, our previous conclusion still provides the best explanation: with only 133 projects awarded, it’s a small numbers problem.

The longer-term pattern for both groups is not affected by these latest numbers. The proportion of female PIs has been increasing by 0.5–1 percentage points per year. The proportion of predominantly undergraduate institutions has held steady, with a ~4 percentage point gap from the full proposal to the award stage. The PUI gap pre-dates the preliminary proposal system, and this group was not intended to be impacted by the preliminary proposal process, so we didn’t expect change.

Moreover, we note that the proportion of PUI preliminary proposals is growing. This represents a small absolute increase but is exaggerated by the reduced total number of preliminary proposals. While there has been no corresponding change in full proposals and awards, if these preliminary proposals represent new entrants to competition then we would expect some lag in those later-stage metrics. Should this development persist, it would be interesting to see if there is any effect on the representation gap between PUIs in preliminary proposals and DEB Core Programs awards. This gap is something that would likely be worth a deeper examination in the future.

Concluding Thoughts

Since the implementation of the preliminary proposal system in January 2012, we have not seen major departures from pre-2012 trajectories across measures of our portfolio with respect to PI or institution demographics or collaborations. Four years in, we have not seen any indication of Division-wide changes, especially any worsening of submission and award proportions among any of these major groups of concern. Success rates appear stable across multiple years for the first time in recent memory. However, the overall climate for awards remains poor due to continued funding stagnation. If there is any bright side to that fact, funding woes at least appear not to have been amplified for any particular subgroup in recent years. But the limited purchasing power of program budgets means our ability to support the full diversity of the research community will continue to be tested and subject to difficult trade-offs.


[i] During the 13 years before preliminary proposals, DEB Core Program award numbers fluctuated quite a bit; they regularly rose or fell by 20-30 (and as many as 80!) awards per year.


DEB Numbers: Historical Proposal Loads

Last spring we posted on the per-person success rate and pointed out several interesting findings based on a decade of DEB data. We were seeing a lot of new PIs and, conversely, a lot of PIs who never returned after their first shot. And, the vast majority of PIs who managed to obtain funding are not continuously funded.

This post is a short follow-up to take a bigger picture look at submission rates.

Since preliminary proposals entered the scene, DEB really hasn’t seen much change in the submission pattern: 75% of PIs in any year submit one preliminary proposal and the other 25% submit two (and a small number submit three ideas in a year, if one also counts full proposals to special programs).

Before the preliminary proposals were launched, we ran some numbers on how often people tended to submit. The results were that, in the years immediately prior to preliminary proposals (~2008-2011), around 75% of PIs in a year were on a single proposal submission (25% on two or more). Fewer than 5% of PIs submitted more than two proposals in a year. Further, most PIs didn’t return to submit proposals year after year (either new ideas or re-working of prior submissions); skipping a year or two between submissions was typical. These data conflicted with the perceptions and anecdotes that “everyone” submitted several proposals every year and were increasing their submission intensity. Although recent data don’t support those perceptions, we still wondered if there might be a kernel of truth to be found on a longer time scale. What is the bigger picture of history of proposal load and submission behavior across BIO?

Well, with some digging we were able to put together a data set that lets us take a look at full proposal research grant submissions across BIO, going all the way back to 1991 when, it seems, the NSF started computerized record-keeping. Looking at this bigger picture of submissions, we can see when changes have occurred and how they fit into the broader narrative of the changing funding environment.

Total BIO full research grant submissions per year (line, right axis) and proportions of individuals submitting 1, 2, 3, 4, 5, or more proposals each calendar year from 1991 to 2014. (Note: 2015 is excluded because proposals submitted in calendar year 2015 are still being processed at the time of writing.)


1990s: Throughout the 1990s BIO received about 4000 proposals per year. This period of relative stability represents the baseline for more than a decade of subsequent discussions of increasing proposal pressure. Interestingly, the proportion of people submitting two or more proposals each year grew over this period, but without seeming to affect total proposal load; this could result from either increasing collaboration (something we’ve seen) or a shrinking PI pool (something we haven’t seen). At this time NSF used a paper-based process, so the cost and effort to prepare a proposal was quite high. Then….

2000s: In 2000, FastLane became fully operational and everyone switched to electronic submission. BIO also saw the launch of special programs in the new Emerging Frontiers division. In a single year, it became easier to submit a proposal and there were more deadlines and target dates to which one could potentially submit. The new electronic submission mechanism and new opportunities likely both contributed to increased submissions in subsequent years.

Following the switch to FastLane, from 2001 to 2005, total annual submissions grew to about 50% above the 1990s average and stayed there for a few years. This period of growth also coincided with an increasing proportion of people submitting 2+ proposals. Increasing numbers of proposals per person had only a limited effect on the total proposal load because of continued growth in collaboration (increasing PIs per proposal). Instead, the major driver of proposal increases was the increasing number of people submitting proposals. This situation was not unique to BIO.

This 2001–2005 period of rapid growth sparked widespread discussion in the scientific community about overburdening of the system and threats to the quality of merit review, as summarized in the 2007 IPAMM report.

Eventually, however, the community experienced a declining success rate because BIO budgets did not rise to match the 50% increase in proposal submissions. From 2005 to 2008, submissions per person seemed to stabilize, and submissions peaked in 2006. We interpret this as a shift in behavior in response to decreasing returns for proposal effort (a rebalancing of the effort/benefit ratio for submissions). It would have been interesting to see if this held, but….

2009/2010: In 2009 and 2010, BIO was up another ~1000 proposals over 2006, reaching an all-time high of nearly 7000 proposal submissions. These were the years of ARRA, the economic stimulus package. Even though NSF was very clear that almost all stimulus funding would go toward funding proposals that had been already reviewed (from 2008) and that we wouldn’t otherwise be able to afford, there was a clear reaction from the community. It appears that the idea of more money (or less competition) created a perception that the effort/benefit relationship may have changed, leading to more proposals.

2011: We see a drop in 2011. It is plausible that this was the realization that the ARRA money really was a one-time deal, there were still many more good proposals than could be funded, and that obtaining funding hadn’t suddenly become easier. As a result, the effort/benefit dynamic could be shifting back; or, this could’ve been a one-time off year. We can’t know for sure because…

2012: Starting in 2012 IOS and DEB, the two largest Divisions in BIO, switched to a system of preliminary proposals  to provide a first-pass screening of projects (preliminary proposals are not counted in the chart). This effectively restricted the number of full proposals in the two largest competitions in BIO such that in 2012, 2013, and 2014 the full proposal load across BIO dropped below 5000 proposals per year (down 2000 proposals from the 2010 peak). The proportion of individuals submitting 2+ full proposals per year also dropped, consistent with the submission limits imposed in DEB, IOS, and MCB. PIs now submitting multiple full proposals to BIO in a given year are generally submitting to multiple programs (core program and special program) or multiple Divisions (DEB and [IOS or MCB or EF or DBI]) and diversifying their submission portfolios.

In summary, the introduction of online and multi-institutional submissions via FastLane kicked off a decade of change marked by growth in proposal submissions and per-PI submissions to BIO. The response, a switch to preliminary proposals in IOS and DEB, caused a major (~1/3) reduction in full proposals and also a shift in the proportion of individuals submitting multiple proposals each year. In essence, the pattern of proposal submission in BIO has shifted back to what it was like in the early 2000s. However, even with these reductions, it is still a more competitive context than the 1990s baseline, prior to online submissions via FastLane.


DEB Numbers: Are aquatic ecologists underrepresented?

Editor’s note: This post was contributed by outgoing rotating Program Officer Alan Wilson and is a write-up of part of a project performed by DEB summer student Zulema Osorio during the summer of 2015.

Generalizability has been fundamental to the major advances in environmental biology and is an important trait for current research ideas proposed to NSF.  Despite its significance, a disconnect between terrestrial and aquatic ecological research has existed for several decades (Hairston 1990).

For example, Menge et al. (1990) quantitatively showed that authors heavily (~50%-65%) cite more studies from their representative habitat but that terrestrial ecologists are less likely to include citations from aquatic systems than the converse.  Failure to broadly consider relevant literature when designing, conducting, and sharing findings from research studies not only hinders future scientific advances (Menge et al. 2009) but may also compromise an investigator’s chances for funding[i] when proposing research ideas.

More recently, there have been anecdotal reports from our PI community that freshwater population or community ecology research is under-represented in NSF’s funding portfolio.  To explore the potential bias in proposal submissions and award success rates for ecological research associated with focal habitat, we compared the submissions and success rates of full proposals submitted to the core Population and Community Ecology (PCE) program from 2005-2014 that focused on terrestrial systems, aquatic systems, or both (e.g., aquatic-terrestrial linkages, modeling, synthesis).  Data about focal ecosystems were collected from PI-reported BIO classification forms.  To simplify our data analysis and interpretation, all projects (including collaboratives) were treated only once.  Also, the Division of Environmental Biology (DEB) switched to a preliminary proposal system in 2012.  Although this analysis focuses only on full proposals, the proportion of preliminary and full proposal submissions for each ecosystem type were nearly identical for 2012-2014.  Some projects (2.7% of total projects) provided no BIO classification data (i.e., non-BIO transfers or co-reviews) and were ignored for this project.  Finally, several other programs inside (Ecosystem Science, Evolutionary Processes, and Systematics and Biodiversity Science) and outside (e.g., Biological Oceanography, Animal Behavior, Arctic) of DEB fund research in aquatic ecosystems.  Thus, our findings only relate to the PCE portfolio.

In total, 3,277 core PCE projects were considered in this analysis. Means ± 1 SD were calculated for submissions and success rates across 10 years of data from 2005-2014. Terrestrial projects (72% ± 2.8% SD) have clearly dominated projects submitted to the core PCE program across all ten years surveyed (Figure 1).  Aquatic projects accounted for 17% (± 2.6% SD) of the full proposal submissions while projects that include aspects of both aquatic and terrestrial components accounted for only 9% (± 1.6% SD) (Figure 1).  The full proposal success rate has been similar across studies that focused on terrestrial or aquatic ecosystems (calculated as number of awards ÷ number of full proposal submissions; Figure 2; terrestrial: 20% ± 6.9% SD; aquatic: 18% ± 6.5% SD).  Proposal success rate dynamics for projects that focus on both ecosystems are more variable (Figure 2; 16% ± 12.7% SD), due in part to the small sample size (9.5% of the projects considered in this study).
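The per-ecosystem statistics above can be reproduced from yearly counts. The sketch below uses made-up year-by-year figures (the post does not publish them); only the formula mirrors the analysis: awards ÷ submissions per year, then a mean and standard deviation across years.

```python
from statistics import mean, stdev

# Hypothetical (submissions, awards) counts per year for one ecosystem
# type -- illustrative only; actual PCE yearly counts are not published here.
aquatic = [(55, 12), (60, 10), (58, 11), (52, 9), (57, 10)]

# Yearly success rate = number of awards / number of full proposal submissions
rates = [awards / subs for subs, awards in aquatic]
print(f"mean: {mean(rates):.1%}, SD: {stdev(rates):.1%}")
```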

Figure 1. Submission history of full proposals submitted to the core PCE program from 2005-2014 for terrestrial (brown), aquatic (blue), or both ecosystems (red). Proposals were classified based on PI-submitted BIO classification forms. Note that some projects did not provide BIO classification data. These projects were ignored for this analysis and explain why yearly relative data may not total 100%.


Figure 2. Success rate of full proposals submitted to PCE based on focal ecosystem from 2005 to 2014.


In summary, anecdotal PI concerns of fewer funded aquatic proposals in PCE are consistent with available data but are an artifact of fewer aquatic proposal submissions.  Although funding rates for all full PCE proposals have generally varied from 2005-2014 (mean: 19.9% ± 6.4% SD; range: 11%-29%) as a function of available funds and the number of proposals considered, terrestrial- and aquatic-focused research proposals have fared similarly for the past decade.  PCE, like the rest of DEB and NSF, is motivated to have a diverse portfolio and encourages ecologists from varied institutions and backgrounds to submit ideas that study interesting, important questions that will generally move the field of population and community ecology forward.



[i] Generalizability “within its own field or across different fields” is a principal consideration of the Intellectual Merit review criterion: http://www.nsf.gov/pubs/policydocs/pappguide/nsf16001/gpg_3.jsp#IIIA


Spring 2016: DEB Preliminary Proposal Results

Notices

All PIs should have received notice of the results of their 2016 DEB Core Program preliminary proposals by now. Full proposal invitation notices were all sent out by the first week of May (ahead of schedule), giving those invited PIs a solid three months to prepare their full proposals. ‘Do Not Invite’ decisions began going out immediately thereafter and throughout the rest of May.

If you haven’t heard, go to fastlane.nsf.gov and log in. Then, select the options for “proposal functions” then “proposal status.” This should bring up your proposal info. If you were a Co-PI, check with the lead PI on your proposal: that person is designated to receive all of the notifications related to the submission.

If you are the lead PI and still have not heard anything AND do not see an updated proposal status in FastLane, then email your Program Officer/Program Director. Be sure to include the seven-digit proposal ID number of your submission in the message.

Process

All told, DEB took 1474 preliminary proposals to 10 panels during March and April of 2016. A big thank you to all of the panelists who served and provided much thoughtful discussion and reasoned recommendations. Note: if you’re interested in hearing a first-hand account of the DEB preliminary proposal panel process, check out this great post by Mike Kaspari.

Panelists received review assignments several weeks prior to the panels and prepared individual written reviews and individual scores. During the panel, each proposal was discussed by the assigned panelists and then presented to the entire panel for additional discussion and assignment to a rating category. Panels were presented two recommendation options for each preliminary proposal: Invite or Do Not Invite. Following discussion, the assigned panelists prepared a panel summary statement to synthesize the key points of the panel discussion and rationale for the assigned rating.

Both the individual written reviews and the panel summary statement are released to the PI of the preliminary proposal.

As we’ve discussed previously, the final decisions on the preliminary proposals are made by the programs with concurrence of senior management. These decisions take into account the panel recommendations, especially the substance of the discussions, as well as expectations for future award-making capacity based on the availability of funds, additional expected proposal load at the full proposal stage, and portfolio balance issues.

Results

Cluster     Reviewed   Panel: Invite   Panel: Do Not Invite   Panel: No Consensus   Invited   Invite Rate
SBS         289        79              210                    0                     85        29%
EP          440        94              346                    0                     101       23%
PCE         439        122             315                    2                     110       25%
ES          306        94              212                    0                     86        28%
DEB Total   1474       389             1083                   2                     382       26%

These numbers are consistent with our goal of inviting the most promising projects while targeting a success rate of approximately 25% for the resulting full proposals that will be submitted this summer.

Big Picture

Comparing to the previous rounds of preliminary proposals…

Year          2012   2013   2014   2015   2016
Reviewed      1626   1629   1590   1495   1474
Invited       358    365    366    383    382
Invite Rate   22%    22%    23%    26%    26%

…we see that the system has recovered somewhat from the initial flood of submissions. Moreover, the invite rate, and subsequent full proposal success rate, has stabilized in a range that reasonably balances against the effort required to produce each submission.
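The invite-rate row in the table above is simply invitations ÷ submissions; a quick sketch using the table's actual numbers:

```python
# Invite rate per year, computed directly from the reviewed/invited counts
# reported for the 2012-2016 preliminary proposal rounds.
reviewed = {2012: 1626, 2013: 1629, 2014: 1590, 2015: 1495, 2016: 1474}
invited = {2012: 358, 2013: 365, 2014: 366, 2015: 383, 2016: 382}

invite_rate = {yr: invited[yr] / reviewed[yr] for yr in reviewed}
for yr, rate in sorted(invite_rate.items()):
    print(f"{yr}: {rate:.0%}")
```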


DEB Numbers: Success Rates by Merit Review Recommendation

We recently received a comment from a panelist (paraphrasing): how likely are good proposals to get funded? We’ve previously discussed differences between the funding rates we report directly to you from panels and the NSF-wide success rate numbers reported on our website.  But the commenter was interested in an even more nuanced question: to what extent do award decisions follow the outcomes of merit review? This is a great topic for a post and, thanks to our Committee of Visitors review last year, we already have the relevant data compiled. (So this is really the perfect data-rich but quick post for panel season.)

To address this question, we need to first define what a “good proposal” is.

In our two-stage annual cycle, each project must pass through review at least twice before being awarded: once as a preliminary proposal, and once as an invited full proposal.

At each stage, review progresses in three steps:

  • Three individual panelists independently read, review, and score each proposal prior to the panel. A single DEB panelist is responsible for reviewing an assigned subset of all proposals at the panel. This is the same for preliminary proposals and full proposals. Full proposals also receive several non-panelist “ad hoc” reviews prior to the panel.
  • The proposal is brought to panel where the panelists discuss the proposal and individual reviews in relation to each other and in the context of the rest of the proposals in the panel to reach a consensus recommendation. This is the same for preliminary proposals and full proposals.
  • The Program Officers managing the program take into consideration the reviews, the recommendations of the panel(s) that assessed the proposal, and their portfolio management responsibilities to arrive at a final recommendation. This is the same for preliminary proposals and full proposals.

In this case, since we are discussing the Program’s actions after peer review, we are defining as “good” anything that received a positive consensus panel recommendation. Initially, the label of “good” will be applied by the preliminary proposal panel. Then, at the full proposal panel it will receive a second label, which may or may not also be “good”. A “good” recommendation for either preliminary or full proposals includes any proposal not placed into the lowest (explicitly negative) rating category. The lowest category usually has the word “not” in it, as in “Do Not Invite” or “Not Fundable”. All other categories are considered “good” recommendations, whether there is a single positive category (e.g., “Invite”) or several ordinal options conveying varying degrees of enthusiasm (e.g., “high priority”, “medium priority”, “low priority”).

To enable this analysis, we traced the individual review scores, panel review recommendations, and outcomes for proposals from the first three years of the DEB preliminary proposal system (i.e., starting with preliminary proposals from January 2012 through full proposals from August 2014).

As we’ve reported previously, preliminary proposal invitation rates are between 20% and 30%, and between 20% and 30% of invited full proposals are funded, leading to end-to-end funding rates around 7%. But, as our commenter noted, that obscures a lot of information and your individual mileage will vary. So…

How likely are “good” proposals to get funded?

In the table below, you can see that the overall invitation rate for preliminary proposals is 23%, but the picture looks very different depending on how well a proposal performed in the panel[i].

Preliminary Proposal Outcomes by Panel Recommendation

Pre-Proposal Panel Rating   % of Proposals Receiving Rating   Not Invited   Invited   Invite Rate
High (Good)                 19%                               22            879       98%
Low (Good)                  5%                                100           141       59%
Do Not Invite               76%                               3597          74        2%
Total                       100%                              3719          1094      23%
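The per-rating invite rates in the table reduce to invited ÷ (invited + not invited); a sketch reconstructing them from the counts above:

```python
# (not invited, invited) counts for each preliminary proposal panel rating,
# taken from the table above.
outcomes = {
    "High (Good)": (22, 879),
    "Low (Good)": (100, 141),
    "Do Not Invite": (3597, 74),
}

for rating, (not_invited, invited) in outcomes.items():
    rate = invited / (invited + not_invited)
    print(f"{rating}: {rate:.0%}")
```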

This stage is a major winnowing of projects. On the one hand, we tend toward inviting most of what the panels recommend. On the other hand, the majority of preliminary proposals aren’t well rated (falling outside our working definition of “good”), and those are highly unlikely to see the full proposal stage. There is a low (2%) invite rate for proposals that the panels recommended as Do Not Invite. This is a measure of the extent to which program officers disagree with panelists and choose to take a chance on a particular idea or PI, based on their own knowledge of submission history and portfolio balance issues.

From these invitations, the programs receive full proposals. After review, programs award approximately 25% of the full proposals, but again the outcome is strongly influenced by the panel ratings.

Full Proposal Outcomes by Panel Recommendation

Full Proposal Panel Rating   % of Proposals Receiving Rating   Declined   Awarded   Funding Rate
High (Good)                  17%                               30         122       80%
Medium (Good)                23%                               115        98        46%
Low (Good)                   21%                               165        21        11%
Not Competitive              39%                               349        7         2%
Total                        100%                              659        248       27%

Program Officers are faced with a greater responsibility for decision-making at the full proposal stage. Whereas preliminary proposal panels gave the nod (High or Low positive recommendations) to only ~23% of submissions, full proposal panels put 551 of 907 proposals into “fundable” categories (Low, Medium, or High). Since this is more than twice as many as the programs could actually fund,[ii] the work of interpreting individual reviews and panel summaries and accounting for portfolio balance plays a greater role in making the final cut. Also note that these are the cumulative results of three years of decision-making by four independently managed program clusters, so “divide by 12” to get a sense of how common any result is for a specific program per year.

Ultimately, the full proposal panel rating is the major influence on an individual proposal’s likelihood of funding and the hierarchy of “fundable” bins guides these decisions:

Success rates of DEB full proposals when categorized by preliminary proposal and full proposal panel recommendations.

While funding decisions mostly ignore the preliminary proposal ratings, readers may notice an apparent “bonus” effect in the funding rate for “Do Not Invite” preliminary proposals that wind up in fundable full proposal categories. For example, of 15 preliminary proposals that were rated “Do Not Invite” but were invited and received a “Medium” rating at the full proposal stage, 10 (67%) were funded compared to 45% and 42% funding for Medium-rated full proposals that preliminary proposal panelists rated as High or Low priority, respectively.  However, this is a sample size issue. Overall the numbers of Awarded and Declined full proposals are not associated with the preliminary proposal recommendation (Chi-Square = 2.90, p = 0.235).
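A chi-square test of independence like the one reported can be sketched with a plain-Python statistic. The counts below are HYPOTHETICAL placeholders, since the actual awarded/declined breakdown by preliminary proposal rating is not tabulated in this post:

```python
def chi_square(table):
    """Pearson chi-square statistic for a list-of-rows contingency table."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    grand = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / grand
            stat += (observed - expected) ** 2 / expected
    return stat

# Rows: preproposal rating (High / Low / Do Not Invite);
# columns: (awarded, declined). HYPOTHETICAL counts for illustration only.
hypothetical = [[150, 400], [25, 70], [12, 30]]
print(f"Chi-Square = {chi_square(hypothetical):.2f}")
```

A table whose rows all have the same awarded/declined proportions yields a statistic of 0, which is the sense in which the post's small statistic (2.90, p = 0.235) indicates no association.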


Does Preliminary Proposal rating predict Full Proposal rating?

This is a difficult question to answer since there is nothing solid to compare against.

We don’t have a representative set of non-invited full proposals that we can compare to say “yes, these do fare better, the same as, or worse than the proposals that were rated highly” when it comes to the review ratings. What we do have is the set of “Low” preliminary proposals that were invited, and the small set of “Do Not Invite” preliminary proposals that were invited by the Program Officers against the panel recommendations. However, these groups are confounded by the decision process: these invites were purposely selected because the Program Officers thought they would be competitive at the full proposal stage. They are ideas we thought the panels missed or selected for portfolio balance; therefore, they are not representative of the entire set of preliminary proposals for which the panels recommended Low or Do Not Invite.

Distribution of Full Proposal Panel Ratings versus Preliminary Proposal Ratings

Pre-Proposal Panel Rating   # Recvd as Full Proposals   High   Medium   Low   Not Competitive
High                        728                         19%    24%      20%   37%
Low                         117                         10%    21%      20%   50%
Do Not Invite               62                          8%     24%      23%   45%

So, even given the active attempts to pick the best proposals out of those in the “Low” and “Do Not Invite” preliminary proposal categories, proposals invited based on “High” ratings were twice as likely to wind up in the “High” category at the full proposal stage as those invited from the Low or Do Not Invite categories. And, those invited from the Low or Do Not Invite categories were somewhat more likely to wind up in Not Competitive. Moreover, the score data presented below provide additional evidence that this process is, in fact, selecting the best proposals.


What do individual review scores say about the outcomes and different panel ratings?

We expect the full proposal review stage to be a more challenging experience than the preliminary proposal stage because most of the clearly non-competitive proposals have already been screened out. Because of this, full proposals should present a tighter grouping of reviewer scores than preliminary proposals. The distribution of average proposal scores across the two stages is shown below. We converted the “P/F/G/V/E” individual review scores to a numerical scale from P=1 to E=5, with split scores as the average of the two letters (e.g., V/G = 3.5). As a reminder, the individual reviewer scores are sent in prior to the panel, without access to other reviewers’ opinions and having access to a relatively small number of proposals. So the average rating (and spread of individual scores for a proposal) is mostly a starting point for discussion and not the end-result of the review[iii].
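The letter-to-number conversion described above (P=1 through E=5, with split scores averaged) is easy to make concrete; a small sketch:

```python
# P/F/G/V/E -> numeric scale from the text: P=1, F=2, G=3, V=4, E=5.
LETTER = {"P": 1, "F": 2, "G": 3, "V": 4, "E": 5}

def numeric_score(rating: str) -> float:
    """Convert a reviewer rating like 'E', 'V/G', or 'G/F' to a number.

    Split scores (two letters joined by '/') average the two values.
    """
    parts = rating.split("/")
    return sum(LETTER[p] for p in parts) / len(parts)

print(numeric_score("V/G"))  # 3.5, matching the example in the text
```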

Distribution of mean review scores at different points in the DEB core program review process.

The preliminary proposal scores are distributed across the entire spectrum, with the average review scores for most in the 3 to 4 range (a Good to Very Good rating). That we don’t see much in the way of scores below 2 might suggest pre-selection on the part of applicants or rating inflation by reviewers. Invitations (and high panel ratings) typically go to preliminary proposals with average scores above Very Good (4). Only a few invitations are sent out for proposals between Very Good and Good or lower.

The average scores for full proposals are more evenly distributed than the preliminary proposal scores, with a mean and median around Very Good. The eventual awards draw heavily from the Very Good to Excellent score range, and none averaged lower than Very Good/Good. And, while some full proposals necessarily performed worse than they did at the preliminary proposal stage, there are still roughly twice as many full proposals with average scores above Very Good as the total number of awards made, so there is no dearth of high-performing options for award-making.

So, what scores correspond to different panel ratings?

Average Review Score of Invited Full Proposals, by Full Proposal Panel Rating

Pre-Proposal Panel Rating   High   Medium   Low    Not Competitive   Overall
High                        4.41   4.08     3.76   3.53              3.88
Low                         4.32   4.13     3.88   3.52              3.81
Do Not Invite               4.42   4.00     3.75   3.44              3.73
Overall                     4.40   4.08     3.78   3.53              3.87

There’s virtually no difference in average full proposal scores among groups of proposals that received different preliminary proposal panel ratings (rows, above). This further supports the notion that full proposals are being assessed without bias from the preliminary proposal outcomes (which are available to full proposal panelists after individual reviews are written). There is approximately a whole letter score difference between the average scores of highly rated full proposals (between Excellent and Very Good) and Not Competitive full proposals (between Very Good and Good), and the average score for each rating is distinct.


About the Data:

The dataset used in this analysis was originally prepared for the June 2015 DEB Committee of Visitors meeting. We traced the review outcomes of preliminary proposals and subsequent full proposals over the first 3 cycles of proposal review. This dataset included the majority of proposals that have gone through the 2-stage review in DEB, but is not a complete record because preliminary proposal records are only tied to full proposals if this connection is successfully made by the PI at the time of full proposal submission. We discussed some of the difficulties in making this connection on DEBrief in the post titled “DEB Numbers: Per-person success rate in DEB”.

There are 4840 preliminary proposal records in this dataset; 1115 received invitations to submit full proposals. Of those 1115, 928 (83%) submitted full proposals and successfully identified their preliminary proposal. Full proposal records are lacking for the remaining 187 invitees; this is a combination of 1) records missing the necessary links and 2) a few dozen invitations that were never used within the window of this analysis. For full proposal calculations, we considered only those proposals that had links and had been processed to a final decision point as of June 2015 (907 records), when the data were captured.

The records followed the lead proposal of collaborative groups/projects in order to maintain a 1 to 1 relationship of all records across preliminary and full proposal stages and avoid counting duplications of review data. The dataset did not include full proposals that were reviewed alongside invited proposals but submitted under other mechanisms that bypass the preliminary proposal stage such as CAREER, OPUS, and RCN.

Data Cleaning: Panel recommendations are not required to conform to a standard format, and the choice of labels, number of options, and exact wording vary from program to program and have changed over time in DEB. To facilitate analysis, the various terms have been matched onto a 4-level scale (High/Medium/Low/Not Invite (or Not Competitive)), which was the widest scale used by any panel in the dataset; any binary values were matched to the top and bottom of the scale. Where a proposal was co-reviewed in 2 or more panels, the most positive panel rating was used for this analysis.

[i] Cases where a highly recommended preliminary proposal was Not Invited typically occurred because the project received funding anyway (either we were still waiting on our budget from the prior year and the PI re-submitted, or the same work was picked up by another funding source). So, the effective invite rate for “high priority” recommendations is ~100%. The middle “Low” priority rating was used in only a limited set of preproposal panels in the first years of preproposals; at this point, all DEB preproposal panels use two-level “Invite or Do Not Invite” recommendations.

[ii] 248 is less than what we actually funded from the full proposal panels: when CAREER, OPUS, RCN, and proposals that were not correctly linked to preproposal data are accounted for, we funded a bit over 300 core program projects in FYs 2013, 2014, and 2015: roughly 100 new projects per year.

[iii] If the program were to be purely conservative and follow the scoring exactly in making award decisions, there would have been no awards with an average score below 4.2 (Very Good+) and even then half of the proposals that averaged Very Good (4) or better would go unfunded.


DEB Numbers: FY2015 Wrap-Up

Fiscal year 2015 has come to a close. With the dust settled, we can crunch the numbers on the DEB Core Program merit review and funding outcomes.

This post follows the format we’ve used in previous years. For a refresher, and lengthier discussions of the hows and whys of the metrics, you can visit the 2014 and 2013 numbers.

Read on to see how 2015 compares.

FY2015 Summary Numbers

The charts below all reflect proportions of DEB Core Program projects through each stage of the review process: preliminary proposals, full proposals, and awards.

In the review process leading to awards in FY2015:

DEB reviewed 1590 preliminary proposals received under the DEB Core Programs solicitation and LTREB solicitation in January 2014, about 25% of which were invited to the full proposal stage.

The preliminary proposal invitees were joined at the full proposal stage by 1) Direct submissions to DEB under the CAREER, OPUS, and RCN solicitations, and 2) Projects shared for co-review by another NSF program. Altogether 510 full proposals were reviewed in DEB during October and November of 2014.

From this pool of full proposals, DEB made awards to 131 projects (technically, these were 193 separate institutional awards but for analysis purposes we count collaborative groups once, i.e., as a single proposed project).

Below, we present and discuss the Division-wide success rate and some selected project demographics that were raised as concerns coming in to the preliminary proposal system. The demographic numbers are presented as proportions for comparison across the review stages. However, the progressive reduction in size of the denominators from preliminary proposals (1590) to awards (131) means each step becomes more sensitive to small absolute changes.

Success Rate

The success rate for research proposals in the DEB Core Programs remains much as it has been since the preliminary proposal process was instituted; success rate is a function of the number and size of requests submitted by the research communities and of appropriated funding levels.

FY2015_WrapUp_9

Reminder: Elevated success rates (in grey) were due to:

  • a one-time ~50% increase in funding for FY2009 (the ARRA economic stimulus funding), without which success would have been ~13-15%; and,
  • a halving of proposal submissions in FY2012 (the first preliminary proposal deadline replaced a second full proposal deadline for FY2012), without which success would have been ~8-9%.


Individual and Collaborative Projects

As seen in the figure below, there was little year-to-year change in the submission and funding success of single investigator projects.

FY2015_WrapUp_1

While the proportion of single investigator preliminary proposals declined slightly, there was no decrease when it came to either full proposals or awards. As a reminder to readers: most of the apparent increase in the proportion of single investigator projects between the preliminary proposal and full proposal stages is because the primarily single-investigator proposals in the CAREER and OPUS categories are not subject to the preliminary proposal screen, and thus they make up a relatively larger portion of the full proposals. Similarly, the absence of CAREER and OPUS proposals at the preliminary proposal stage depresses the single investigator proportion of the preliminary proposal counts relative to the historical full proposal baseline.
The proportion of collaborative proposals in our award portfolio declined slightly from last year’s peak but is still above other prior years and doesn’t reverse the general upward trend over the past decade or so.

FY2015_WrapUp_2
Readers may notice that the collaborative and single-investigator groupings don’t sum to 100%. The remainders are intra-institutional multi-PI arrangements; such projects are intellectual collaboration to be sure, but not a collaborative project per the NSF definition.

Early Career Scientists

As we discussed in the FY2013 Wrap-up, there are several imperfect metrics for identifying early career investigators, with the “Beginning Investigator” check-box on the cover page being the most immediately visible but also the most inconsistently applied identifier. “Beginning Investigator” includes everyone who has never received federal funding, and many researchers never do, so it is not as directly related to career stage as we might like to think. For the purposes of measuring “Beginning Investigators” we use the response on the BIO Classification Form (this poses the same question as the cover page but captures a more complete record of responses than a single check box).

FY2015_WrapUp_3

According to the classification form data, beginning investigators continue to receive awards in proportion to full proposal submissions but consistently represent a smaller segment of the proposal pool at these later stages than at the preliminary proposal stage[i].

The better identifier of researcher career stage is the years since PhD of the PI.

FY2015_WrapUp_4 FY2015_WrapUp_5 FY2015_WrapUp_6

Little changed in the profile of submitter and awardee degree ages from 2013 through 2015. Success rate does not improve markedly with degree age and generally stays between 20% and 30% up through 35 years post-PhD. PIs more than 35 years post-PhD typically account for less than 5% of the total PIs on proposals and awards: too few in number to read much into the data. Early-, mid-, and later-career PIs appear to be faring equally well in core program award decisions.

Gender & Predominantly Undergraduate Institution (PUI) Status

Last year we saw what seemed like a potentially spurious but also curious pair of year-to-year changes in these groups: the proportion of PUI awardees jumped up at the same time as the proportion of female PIs slumped.

We thought these changes were spurious because the absolute numbers involved were small and potential confounding factors were numerous. That is, there are a range of factors beyond the summary rating of the panel such as nuances of the reviewer input, other demographic characteristics, additional funding, overlap with others’ awards, past productivity, etc., that weigh into each award decision. If only a few proposals satisfy any one factor, even fewer proposals will satisfy combinations of factors. Thus, a decision to make an award that boosts one aspect of portfolio diversity may come with an opportunity cost of not addressing another aspect.

We thought this was curious because, with PUIs experiencing a notable jump and female PIs a notable drop, these stood out visually from other results and related to preexisting areas of concern. And, this year we see another big change but in the opposite direction.

FY2015_WrapUp_7 FY2015_WrapUp_8

The first key thing to note is that there is no surprising up or down change in submissions of preliminary proposals or full proposals in either group since before the preliminary proposal system. That doesn’t take away, however, from the sudden appearance of negative reciprocal changes between award proportions for these two demographics seen in FY2014 and FY2015.  On the other hand, there’s no trend (just greater variation), and there’s no direct mechanism we’ve been able to identify that would lend itself to management or modification.

Because the differences emerge only at the award stage, we continue to view this as we did last year. Since we don’t see a similar effect from preliminary proposal review and invitation decisions (which yield the full proposals), it seems likely that this is directly related to the final decisions regarding use of limited funds across too many strong candidates.

We conclude that we are witnessing a small numbers effect; 131 awards is simply not a sufficiently large “sample” to be representative of the population across all of the potential decision-making variables and concerns. PUIs are a minority component of the proposal pool (~18%). Female PIs are a minority component of the proposal pool (~30%). Beginning Investigators are a minority component of the proposal pool (~20%). Proposals that fall into two categories are an even smaller fraction of the proposal pool (~6%) and proposals that fit into all three are even smaller yet (~1%).

Concluding Thoughts

Since the implementation of the preliminary proposal system in January 2012, we have not seen major departures from pre-2012 trajectories across these measures of our portfolio with respect to PI or institution demographics or collaborations. Three years in, we have not seen indications of Division-wide changes, especially any worsening of submission and award proportions among any of these major groups of concern. Success rates appear stable across multiple years for the first time in recent memory. However, the overall climate for awards remains poor due to continued funding stagnation. If there is any bright side to that fact, funding woes at least appear not to have been amplified for any particular subgroup in recent years. But the limited purchasing power of program budgets means our ability to support the full diversity of the research community will continue to be tested and subject to difficult trade-offs.



[i] Two points we shared in our post from this past spring on individual success rates come to mind: the first years of the preliminary proposal system have seen a large uptick in first-time submitters, and only 25% of individual PIs received any funding from DEB over a 9-year period examined in that post. Lots of PIs have never, and will never, receive funding from us. In the context of the result above, we see the learning curve. There is a high failure rate: especially among PIs without prior funding success, but even among those who have been successful in the past. While a successful preliminary proposal from a would-be first-time PI goes on to compete on even footing with full proposals from experienced PIs, we don’t expect the preliminary proposals from all unfunded PIs to have the same distribution across the quality spectrum as those from previously successful PIs.


Are small grants doing well in review?

In contrast to the trend of decreasing numbers of preliminary proposals, we have seen a rapid increase in the category of Small Grant preliminary proposals (these are also included in the total counts in our previous post).

DEB Small Grants   2012   2013   2014   2015
Submitted          N/A    83     95     126
Invited            N/A    20     25     29
Invite Rate        N/A    24%    26%    23%


We attribute this to growing awareness of this option to submit preliminary proposals with total budgets under $150K. Small grants came about in the second year of the preliminary proposal system in response to a long-standing desire, expressed by numerous voices in our communities, for some sort of “small” category. DEB realized the preliminary proposal system made this particularly workable: reviewers can adjust their expectations for the scope of a project relative to its expense without PIs having to prepare an extensive full budget. We added the category to our solicitation for the 2013 preliminary proposal deadline.

We’ve had lots of positive feedback on this option, but also recognize that awareness still needs to be improved among both applicants and reviewers. This year, 8% of all preliminary proposals were identified as small grants.

Small Grants are found in all four clusters and are generally on the increase, but we also think feedback, such as this post, is necessary to successfully integrate this idea into our communities and maintain enthusiasm for this option. We would not be surprised to see these numbers grow to the point where SGs make up as large a part (or larger) of the preliminary proposal pool as Predominantly Undergraduate Institutions or Beginning Investigators.

Since 2013, we’ve made 22 awards based on invited full small grant proposals (9 of 18 in 2013, 12 of 24 in 2014, and 1 of 1 in 2015 thus far[1]), for a 51% success rate at the full proposal stage. This is roughly twice the success rate of full proposals without the SG designation.
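The pooled 51% figure can be checked with a quick calculation; this is a minimal sketch using only the per-year counts quoted in the parenthetical above:

```python
# Small grant outcomes at the full proposal stage, by year:
# (awards made, invited full proposals reviewed), from the counts above.
outcomes = {2013: (9, 18), 2014: (12, 24), 2015: (1, 1)}

awards = sum(a for a, _ in outcomes.values())    # 9 + 12 + 1 = 22
reviewed = sum(n for _, n in outcomes.values())  # 18 + 24 + 1 = 43
success_rate = awards / reviewed

print(f"{awards}/{reviewed} = {success_rate:.0%}")  # 22/43 = 51%
```

Note that the denominator counts proposals actually reviewed, not invitations extended, since (per the footnote below) not every invitee submitted a full proposal.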


[1] Not everyone who received an invitation eventually submitted a full proposal (individual reasons vary). Also, one award is already based on a 2015 preliminary proposal: instead of inviting a full proposal, DEB determined the project was appropriate for the EAGER mechanism and invited the team to submit an EAGER proposal, allowing for a quick turnaround of an award.


DEB Spring 2015 Panel Update

At this point, everyone should have heard back on their DEB preliminary proposals from the spring panels. If you have not:

1) Log in to your FastLane account. The information should be accessible there. Also make sure your contact email is correct; a typo there would prevent you from receiving updates and notifications.

2) If you were a Co-PI, check with the lead PI on the preliminary proposal. The lead PI should have received the results of review.

3) Did it wind up in your spam folder?

4) If you have exhausted all of the above options and have had no other contact with your DEB Program Officer, then it’s probably a good time to send us an email.


Preliminary Proposal Panel Results

DEB panels reviewed 1495 preliminary proposals; based on the reviews and panel discussions, DEB Program Officers extended 383 invitations to submit full proposals for the August 2015 deadline. The Division-wide invitation rate for preliminary proposals was 26%. Below, we detail the results of preliminary proposal review by programmatic cluster.

Cluster                                 Invited   Not Invited   Total   Invite Rate
Systematics and Biodiversity Science         87           221     308           28%
Evolutionary Processes                      105           331     436           24%
Population and Community Ecology            107           320     427           25%
Ecosystem Science                            84           240     324           26%
Grand Total                                 383          1112    1495           26%


This is the fourth round of preliminary proposal review for the DEB core programs, a process that began in 2012. DEB extended more invitations and achieved a higher invitation rate than in prior years.

              2012   2013   2014   2015
Reviewed      1626   1629   1590   1495
Invited        358    365    366    383
Invite Rate    22%    22%    23%    26%


As we discussed in our recent post on per-person success rates, the launch of the preliminary proposal system drew in a large number of “new” applicants. We believe that wave is now passing; the decrease in the number of preliminary proposals reviewed in DEB reflects our communities realizing that preliminary proposals do not make grants easier to get.

At the same time, the number of invitations has gone up. The increase primarily reflects program management decisions: we have been able to refine our expectations for the number of proposals that will come to the full proposal panel through other mechanisms (CAREER, OPUS, RCN, and co-review).