Myths and FAQs about Project Reporting

Annual and final reports have changed quite a bit over the years. Twenty years ago, annual reports and final reports were distinct requirements (see section 340) for which you printed out a form, filled it out, and mailed it in. In the early 2000s, reporting moved online to FastLane; this allowed a degree of integration with other electronic systems and also brought about a single template that applied to both annual and final reports. Some people took this to mean that the final report was just another annual report, and others started treating each annual report like a cumulative final report. However, since 1) most reports during the FastLane era were uploaded as unstructured PDF narratives, which could be reviewed but weren’t useful for systematic analysis (data mining), and 2) both approaches provided the necessary record of activity, there was no pressure to enforce a standard either way on whether the content was annual or cumulative.

The most recent move, to Research.gov, came with a more structured, web-form-based report template that enables further integration of reporting requirements into companion systems. The structured report data improves compatibility between grant data systems across agencies (e.g., NASA data in Research.gov search results) and adds records to a database of project results that can automatically compile the cumulative outcomes of any project without the PI having to regurgitate the same items over and over in an ever-expanding list. Once we get past the rough period of adjusting to the change, structured data should be easier to enter and provide greater value for analyses of all types, including those that make the case for future investments in research.

The purpose of this post is to help you shorten the learning curve of Research.gov reporting, answer some of the common questions we hear from you, and debunk the persistent myths and old habits that no longer fit with current practice.

  1. What does DEB do with project reports?

Reports are required because we need to maintain a record regarding how our investments in research are spent. We use them to make the case to various audiences that funds are being spent on good and important work. Your Program Officers (POs) are responsible for overseeing those funds and need to be able to document that they were used for the approved purposes in an appropriate fashion. Because failure to show that grant money was used productively makes it very difficult to justify sending you more money, POs review each report and will request additional information if critical aspects are weak or missing. Failure to report at all places a flag on your file for missing a requirement of your grant that will block further funding until it is cleared. In short, reports are a necessary component of financial stewardship.

With structured data available via the Research.gov reporting form, we are also gaining the ability to better quantify research outcomes and ask new questions about our portfolio, our decision-making, and the effects of different approaches to supporting science.

  2. What should I put in my project report, and how much of each thing?

This is a really common question and a big part of the change from FastLane to Research.gov reporting. The old “upload a PDF” days imposed no limits whatsoever on reporting, and inconsistent requests for more or different information from POs in various programs encouraged a narrative-intensive, “everything and the kitchen sink” approach to the report. The result was a lot of time and effort spent by PIs to compile and present all that information, and by POs to wade through it checking whether the critical pieces were all there.

Please take this to heart: The volume of report text is not an indicator of the quality of the work. A report is also not a journal article.

What we want is an efficient description of what happened/was accomplished during the reporting period. A lot of this comes down to discrete counts of the inputs, activities, and outputs of the project and providing enough detail about each that we could verify the count if needed. The Research.gov template seeks to encourage this by providing clear fields to generate list-style report items with consistent detail requirements.

There are some places with paragraph-sized text boxes for narrative explanations. Some are optional and some are required. Responses to any of them are most helpful when clear and direct. By design, Research.gov imposes character limits on the text boxes where narratives are required or allowed.

  3. What are the most common problems that cause POs to return a report for revision?
  • Failure to list participants and collaborators who are mentioned/described in narrative sections of the report text (See Questions 4, 5, and 6).
  • Multiple typographical errors; apparent cut and paste errors (incomplete sentences or paragraphs).
  • Listing items that fall outside of the reporting period.
  4. Who should I list in the Participants section? The other collaborators section?

Across the three “Participants/Organizations” sections, please list everyone who has been engaged in the project within the previous 12 months. This includes students, volunteers, and those paid through other sources. As long as their activities were related to the objectives (Intellectual Merit or Broader Impacts) of your award, they “count”. A rule of thumb for deciding which section to report under: individual “participants” carried out the work of the objectives, “organizational partners” directly enabled the work done by the participants, and “other collaborators or contacts” include indirect supporters or beneficiaries of the work (e.g., schools at which your student conducted a demonstration). Please note that “other collaborators and contacts” are entered into a plain narrative text box, which doesn’t impose any specific structure or data requirements.

If participants worked less than a total of ~100 hours (2.5 weeks), enter “0” under Nearest Person Month Worked. (Yes, zeros count, too.) If they worked far less than 100 hours, trust your own judgment about whether to list them as participants, i.e., whether you think their participation was meaningful, or might be better listed under “other collaborators or contacts”.

  5. I have multiple sources of funding and people in my lab often work on overlapping projects. In the Participants section, what should I enter for Funding Support?

If a participant was paid from a funding source other than your current NSF award, please list that source of support. Do not enter anything if the participant was paid solely from your current NSF award or if they were a volunteer.

  6. I have an RCN or workshop award (or any other type of award that may involve dozens of participants). Do you really want them all listed as Participants?

Yes. The list of participants provides an increasingly valuable database that NSF can use to quantify the impact of its investments. A common alternative to listing participants individually in the Participants section has been to upload a PDF document under “Supporting Files”. Please keep in mind that NSF cannot search the data in those documents, except by opening them one by one, and they will generally be ignored in official analyses and comparisons. Hence, we always prefer that participants be entered individually in the Participants section.

  7. I have a collaborative award. How should my reports differ from those of my collaborators?

Obviously, you and your collaborators will have at least some shared objectives and impacts; some overlap in reports is expected. Your report should focus on the components of the project and the personnel unique to your institution.

  8. Are Annual Reports cumulative? Is the Final Report cumulative?

No and no. Current NSF Policy and the Research.gov reporting system instruct PIs to report only on the previous year of work, including for the final report. Except for “Major Goals” and “Impacts”, there should be little or no overlap from one report to the next. The Final Report should be written as an Annual Report – there’s nothing special about it other than it being the last report on a given project.

You may have done it differently in the past, or received outdated advice from a colleague, because the rules used to be different and old habits are tough to shake. But the rules were clarified, and the system changed to enforce them, when reporting moved to Research.gov.

  9. What is the Project Outcomes Report?

The Project Outcomes Report is a third reporting requirement (in addition to annual and final reports) that is due at the same time as your final report. It is a counterpart to the “public abstract” of your award. The abstract was written at the start of the project and explained what you planned to do and why it was important. The Project Outcomes Report summarizes the overall goal(s) and accomplishments of the project upon its completion. It is provided directly to the general public as a permanent record and justification for our investment of taxpayer dollars in your research. Please write it carefully, paying particular attention to the major accomplishments and general significance of your work; avoid jargon and grammatical errors. Do not cut-and-paste text from your Annual or Final Reports, because you wrote those for a very different audience.

  10. What happens if I don’t submit my report?

You and any Co-PIs will not be allowed to receive any new funding (e.g., annual increments, supplements, or new grants) or process any other actions (e.g., no-cost extensions, PI changes) until the report is submitted and approved. Your annual report is due starting 90 days before your award anniversary; your final report is due within 90 days after your award end date. After either of those 90-day windows, the report is considered “overdue” and the block is automatically put in place. Even if you aren’t overdue when you submit a report, waiting until late in the 90-day window risks delaying timely release of annual funds and possibly going overdue before we’ve had a chance to review the report, receive any needed corrections, and approve it.

  11. I submitted my Annual Report, but there’s still a block in FastLane preventing me from receiving new funds from NSF. Why?

It’s most likely that your report still needs to be approved by the managing Program Officer; new money cannot go out the door until reports have been submitted and approved. If your report has been languishing, it’s appropriate to ask the managing Program Officer to take a look at it. (Although we enjoy learning about your discoveries, annual reports can pile up when our priorities must be placed elsewhere.)

  12. Can I submit a proposal if I have an overdue report?

Yes. Be aware, however, that every time we attempt to access any of your proposals (submitted or already awarded), we’ll be redirected to a warning message on a separate screen that tells us we cannot approve any money for the proposal because of a missing or overdue report(s). We’re required to acknowledge this by clicking a “Continue” button before we’re allowed to see any of the proposal contents. The effect of those irksome messages on Program Officers is worth keeping in mind.

  13. If one of my collaborators has an overdue report on an award that I’m not associated with, what are the consequences for me?

If that collaborator is a PI/co-PI (i.e., listed on the cover page) of a proposal or award on which you are also a PI/co-PI, you will be blocked from receiving any new funds from NSF or processing any other actions on that shared proposal/award. In other words, any proposal that shares a cover-page person with a proposal or award carrying a missing or overdue report is subject to the block.

  14. Why am I being asked to submit my report in June or July when it’s not overdue until August or September (or later)?

Because we don’t want you to miss your annual funding increment. If you received an award in the last quarter of our fiscal year (the fiscal year runs from October 1 to September 30, so July, August, or September) and are scheduled to receive that grant in annual increments, then you have likely encountered this situation: a program officer calls you up and says, “Hey, can you get your report in this week?” but when you look at Research.gov it says the report won’t be past due for a month (or two or three).

All annual reports are due 90 days before the anniversary of the award: this provides the time to review and process everything in order to get your annual increment released to you by the actual anniversary. Frequently, reports are submitted much closer to the anniversary, or even late. This pushes the start of the approval process later and often pushes the release of money to after the anniversary. But if that anniversary date is late in the fiscal year, any sort of delay — even within the allowed “reporting window” — can push processing past the year-end deadline, at which point the money is no longer available to be released. That’s not a happy state of affairs for you or for us! So if your award was made and started on September 1, the report “due date” would be around June 1 of the next year, and your PO would probably be hounding you by July 1 to make sure you don’t lose your funding.

This is a little bit annoying, but it generally makes sense when the project begins immediately upon receipt of the award. However, some awardees request a “start date” for the project that is well after the actual award is made (someone receiving money in September might schedule a start date for December or January). At this point things get complicated. Following the September 1 award/January 1 start example: we need an approved report in order to release funds for the second year by the September anniversary of the award being made; otherwise we run out of time within the fiscal year to actually distribute the money. But the reporting system is, for whatever reason, blind to this and tells you to file a report based on the “start date”, so the very beginning of the “due” period falls in October, after the point at which we can still send you the money that report would have released.

So, two lessons here: 1) don’t ask for a start date way after your award is available, especially if doing so crosses the Sept./Oct. dividing line, and 2) if your PO calls and asks you to submit a report RIGHT NOW, please do it; we’re trying to give you the money we promised and not doing it can really muck things up.
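The date arithmetic above can be sketched in a few lines of Python. This is a minimal illustration of the rules described in this post; the function names and dates are ours, not part of any NSF system.

```python
from datetime import date, timedelta

def annual_report_window(anniversary: date) -> tuple[date, date]:
    """The reporting window opens 90 days before the award anniversary;
    the report is considered overdue once the anniversary passes."""
    return anniversary - timedelta(days=90), anniversary

def fiscal_year_end(d: date) -> date:
    """NSF's fiscal year runs October 1 to September 30; return the
    September 30 that closes the fiscal year containing date d."""
    return date(d.year, 9, 30) if d.month <= 9 else date(d.year + 1, 9, 30)

# The post's example: award made and started September 1.
opens, overdue = annual_report_window(date(2014, 9, 1))
print(opens)    # 2014-06-03 (the post rounds this to "June 1")
print(overdue)  # 2014-09-01
# The next increment must also clear the fiscal-year deadline:
print(fiscal_year_end(overdue))  # 2014-09-30
```

Note how a January 1 start date would shift the reporting window past September 30, which is exactly the Sept./Oct. dividing-line problem described above.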

 

Additional Reporting Resources

More detailed answers to many of these questions, with accompanying screenshots from Research.gov, are available in a PDF guide here (click to follow, but caveat emptor: there appear to have been some additional updates since the file was posted).

If you’re really into this, a long list of guides, tutorials, templates, and demonstrations related to Project Reports is available here.



DEB Numbers: Analysis of Broader Impacts

A recent paper in BioScience by an AAAS Fellow (Sean Watts), an Einstein Fellow (Melissa George), and an NSF Program Director (Doug Levey) explores how the Broader Impacts Criterion was applied and reported in DEB proposals between 2000 and 2010. A major conclusion is that activities aimed at recruiting and mentoring students from underrepresented groups are proposed more than twice as often as they are eventually reported by PIs; of all the types of broader impact activities, broadening participation is by far the toughest to achieve. This result and others are discussed in the context of a recent review of the Merit Review Criteria by the National Science Board and the resulting revisions to the Grant Proposal Guide.


What’s your DEB Story?

Sometimes it can be hard to fit what you want to tell us into your annual report. Other times, the coolest results, recognition of important research outcomes, and broader impacts only come to fruition in the years after a grant has closed and the final report has been compiled.

We’re interested in unearthing the dark data on award outcomes. Help us tell the full story of DEB funding: from personal experiences to news-making discoveries, we want to hear from you. Comment, email us, or schedule a time to talk with us to share your experiences.

Non-exhaustive list of examples:

  • Any part of the project that you couldn’t fit in your reports and want us to know about.
  • Updates on how research on a grant has influenced student careers.
  • Publications that have become modern classics.
  • Awards and recognitions.
  • Institutional legacies of innovative programs.
  • Follow-up or translational research that took your results in unexpected directions.

We’re always looking to highlight the results of research spending to illustrate the breadth of impacts of past and current awards. Short summary stories of award outcomes are passed to NSF’s Public Affairs team and may wind up on research.gov. Your responses can help us highlight the many ways in which DEB awards serve to promote the progress of science, support education, and contribute to national well-being.


International Research Experience for Undergraduates (IREU) Supplements: Notes from the field

An International Research Experience for Undergraduates (IREU) supplement is a modification of the NSF-wide Research Experience for Undergraduates (REU) program. These IREU supplements are available exclusively to PIs in DEB. Traditional REU supplements, available to NSF-funded investigators, support mentorship of undergraduate students conducting independent research.

DEB has developed a partnership with CAPES, a funding agency in the Brazilian Ministry of Education. Through this partnership, DEB-funded investigators are eligible to apply for an IREU supplement to NSF in parallel with a Brazilian colleague who can apply for undergraduate funding through CAPES. If awarded, DEB funds the US student, and CAPES funds the Brazilian student so both students have an opportunity to conduct research at home and abroad. Over the course of the supplement, the two undergraduates overlap in the laboratory or the field and conduct a veritable student exchange.

This type of experience can spark a passion for science and research in undergraduate students. While the IREU supplement opportunity is still nascent, it has already provided numerous students the opportunity to conduct international research. Furthermore, programs like this allow international funding agencies to make use of their aligned interests and provide a greater funding impact through coordination and cooperation.

We had an opportunity to catch up with one of these students and her mentor as they shared some of their experiences in the IREU program:

~~~~~~~~~~~~~~~

Kendra Avinger

Undergraduate Student, Rutgers University

US Mentor: Dr. Siobain Duffy, Rutgers University

Brazil Mentor: Dr. F. Murilo Zerbini, Federal University of Viçosa

Kendra Avinger, a DEB IREU student.

Tell us a little bit about your IREU experience thus far?

Surreal is the closest word to describe my IREU experience thus far. My mother is a second-generation Puerto Rican who raised my sister and me. The IREU experience is the type of horizon broadening opportunity she has always hoped for me.

In Brazil I am studying Begomoviruses, a group of viruses that cause disease in dicotyledonous plants, such as Tomato golden mosaic virus (TGMV) and East African cassava mosaic virus (EACMV). Begomoviruses were reported to have major agricultural impacts, with losses of 40 to 100% in southeastern Brazilian states. I am learning to collect, catalogue, amplify, and sequence samples of Begomoviruses, as well as assisting multiple graduate students in the lab with their Begomovirus projects.

Have you participated in international activities before?

This IREU is the first international activity I have been involved in, in any capacity.

How did you prepare for your trip?

Mental preparation was key for me before arriving in Viçosa, Brazil. It is important to accept the overwhelming feeling of a new culture, location, and language without allowing it to overwhelm you and, consequently, your work. To prepare academically for these differences, I enrolled in a Portuguese course tailored to Spanish speakers, as I am fluent in Spanish.

Tell us about working with your Brazilian counterpart?

My Brazilian counterpart, Hermano Pereira, and I overlapped at Rutgers for a week before I left for his University. Although our communications were filtered through two languages, we were inspired by our shared connection as young scientists.

Tell us about your mentors?

My mentor in Brazil is the phenomenal Dr. F. Murilo Zerbini. I am the first student the Duffy Lab has sent to Brazil, yet Professor Zerbini has had three students travel to Rutgers so far.

My US based mentor is Dr. Siobain Duffy. She understands that thriving graduate students are born from efficacious, confident undergraduates. She has helped me to realize that I have as much creative power as any professor, and to view myself on the same playing field. She gives me the confidence to move forward and share my own ideas.

What are your future professional/academic plans?

With my future, I hope to effect change eclectically. Public speaking, presenting, technical writing, and life sciences pull at my strong points thus far. I am not sure where I would like to end up long term, however, a short-term goal of mine after graduation is to manage a lab somewhere that is warm year-round, explore other interests, and eventually apply to graduate school.

~~~~~~~~~~~~~~~

Dr. Siobain Duffy

Assistant Professor, Rutgers University

How did you first get involved in conducting international research?

PI and IREU mentor, Siobain Duffy.

I am lucky that evolutionary virology is a very international field, not just because cataloging viral diversity is a global pursuit, but also because some of the best experimental viral evolutionary biologists are in Europe, and some of the computational tools we use on a daily basis were developed in Africa.

My closest international collaborations both came from introductions made by a mentor of mine, who is very active internationally herself. As I was starting out as an assistant professor, she invited me to collaborate on some large multi-institution proposals. Some of them succeeded, some didn’t get funded, but the emailing back and forth created the personal connections required to start research collaborations.

Science has always advanced most quickly when ideas are shared internationally, and email, online dissemination of journal articles, and VoIP (voice over internet protocol) technology have made it trivial to collaborate with people on the other side of the planet.

What advice would you give other investigators who are considering applying for an IREU supplement?

My best advice is to start everything earlier than you think you need to – visas to the US take time, and undergraduates are less likely than grad students and postdocs to have gone through these processes before. The PIs on both sides will have to walk the students through obtaining housing, making sure their health insurance is set up, making sure they have working cell phones, etc.

What advice would you give to undergraduates, who may be inspired by Kendra’s work, and who are interested in getting involved in international research?

There are many more opportunities for international research exchanges than undergrads realize! If you are working in a lab with international collaborations, ask if there is a chance to participate, and if there is time to write an IREU proposal for your work. Look for REUs with an international component! Look for post-baccalaureate programs, especially in countries where you are already proficient in the main language. A former undergraduate researcher in my lab is working in a German science lab doing exactly this.

And don’t sweat it if your research doesn’t turn international at this stage – if you stay in science, there’s a very good chance you can work and live abroad for a while.

Has the IREU supplement impacted your collaborations with international investigators?

NSF’s partner in this exchange, CAPES, has been so pleased with the project that they are funding additional Brazilian undergrads to come to my lab this fall. Because I have visited my collaborator’s lab in Viçosa, I know these students, and they already have some sense of who I am and what their time in the US will be like – which makes it much less intimidating to get on the plane. I will be looking for ways to write more IREU students into projects in the future.



Assessing the Value of the Doctoral Dissertation Improvement Grant

Caveat: This post is based on the research and analysis of Kara Shervanick, a 2013 Summer Student in DEB. She did valuable work, but her time was relatively brief for this complex information-gathering and analysis process. This work provides some context for understanding DDIG program outcomes; however, the small sample sizes limit the power of these analyses.

See our other recent posts on the DDIG program here and here.

Summary

Receipt of a DDIG award does not appear correlated with remaining in a research-oriented career over time. Having a DDIG award, however, is associated with a more successful research-oriented career on several success metrics: number of additional NSF grants, total numbers of publications and citations, and h-index. The sampled data did not provide clear evidence of gender being associated with the success of awarded or declined applicants. In the limited instances of apparent differences associated with gender, no clear mechanism emerges.

Study Background

In the summer of 2013 we conducted a retrospective analysis of the DEB DDIG program, making use of the dedicated effort of a temporary summer student intern. The goal of the project was to see if there were discernible correlations between DDIG award receipt and subsequent career trajectories.

Our student researcher set out to collect and analyze DEB and publicly available data to address the following questions about the DDIG applicant pool:

(i) Does receiving a DDIG award correlate with who leaves or remains in a research-oriented career?

(ii) Does gender correlate with who leaves or remains in a research-oriented career?

(iii) Does having a DDIG correlate with a more successful research career?

(iv) Of those in research-oriented careers, does the potential effect of receiving a DDIG award differ based on gender?

Since the total number of DDIG applicants is quite large, the data were difficult to assemble, and the time in which to complete the project was limited, the scope of the data collection and analysis was restricted to samples of the DDIG student applicants from 5 (2008), 10 (2003), 15 (1998), and 20 (1993) years ago.

The percentage awarded each year ranged from 28.6% to 32.5% and averaged 30.8% over the four years, in line with the 1983-2013 average noted in our previous post.

There was no apparent gender bias in the awards, and female students were represented in the awards in proportion to the proposal pool in each year.

Success, of course, is subjective. However, for the purposes of this study, we looked at several imperfect proxies: future awards, and publication and citation metrics.
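For readers unfamiliar with the last of these proxies: a researcher's h-index is the largest number h such that at least h of their papers have each been cited at least h times. A minimal sketch of the calculation (the function name and example data here are our own illustration, not part of the study's code):

```python
def h_index(citations):
    """Return the largest h such that at least h papers have >= h citations."""
    h = 0
    # Rank papers from most to least cited; h grows while the paper at
    # rank r still has at least r citations.
    for rank, cites in enumerate(sorted(citations, reverse=True), start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# Five papers with these citation counts yield an h-index of 3:
# three papers have 3 or more citations, but not four papers with 4 or more.
print(h_index([10, 8, 5, 3, 1]))  # → 3
```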

Sample Selection

For each year, 30 awardees were randomly selected from the list of students receiving a DDIG; these comprised the “awarded” samples.

With any such retrospective analysis, there is the problem of distinguishing correlation from causation. Receiving a DDIG award might improve the career of an individual, or the most talented people might receive awards and would have gone on to have more successful careers had there been no DDIG program. To minimize this potential bias, the "declined" comparison group was not random but consisted of the 30 declined proposals with the highest average reviewer score in each year. Students who later received DDIG support on a resubmission were excluded from consideration in the declined group for any year. Thus, the mean score of the proposals in the selected declined group is higher than the mean for all declined proposals and nearer the mean score of the awarded samples than the bulk of declined proposals.
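To make the two-group selection procedure concrete, here is a sketch in Python (the record fields and function are hypothetical illustrations of the steps described above, not the actual analysis code):

```python
import random

def select_samples(applicants, n=30, seed=0):
    """Build the 'awarded' and 'declined' comparison groups for one cohort year.

    `applicants` is a list of dicts with hypothetical keys:
      'awarded' (bool), 'score' (mean reviewer score),
      'funded_on_resubmission' (bool).
    """
    # Awarded group: a simple random sample of n awardees.
    awarded = [a for a in applicants if a["awarded"]]
    awarded_sample = random.Random(seed).sample(awarded, n)

    # Declined group: the n highest-scoring declined proposals, excluding
    # students who later received DDIG support on a resubmission.
    eligible_declined = [a for a in applicants
                         if not a["awarded"] and not a["funded_on_resubmission"]]
    declined_sample = sorted(eligible_declined,
                             key=lambda a: a["score"], reverse=True)[:n]
    return awarded_sample, declined_sample
```

Selecting the top-scoring declined proposals (rather than a random sample of them) is what pulls the comparison group's mean score toward the awarded group's, as the post notes.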

For each of the 240 student applicants (samples of 30 across two outcomes and four years), NSF award search, Web of Knowledge, publicly available webpages, and direct contact were used to identify: number of additional NSF awards, total number of publications, total number of citations, citations per paper, h-index, and current position and institution or company.

Does a DDIG award correlate with retention in research-oriented careers?

Position title and employer were considered in assigning each individual’s current position to a “research” or “non-research” category. Except for a few individuals who couldn’t be found or left the workforce entirely, nearly all of the people who left research are still in the broader environmental and scientific workforce as educators and private sector professionals.

Award status did not appear correlated with whether or not an applicant remained in a research-oriented career in any of the reviewed years. However, we do see a consistent pattern at 10, 15, and 20 years, which suggests that some small but real correlation with retention in research careers may be present. And, though the data doesn’t directly address this, losses from the research pool seem to stop after 10 years post-PhD.

Is applicant gender correlated with retention in research-oriented careers?

Only one year, 1993, showed a significant difference in this respect. There is no clear explanation for why 1993 differs so much from the other years. There were fewer female participants in the DDIG program in 1993 than in the later years, so the comparison is more sensitive to small differences. However, none of the women in this cohort were missing data on current position, and their positions were similar (private industry, government, education, administration) to those occupied by both men and women in non-research careers.

One thing we don’t see in the samples is an exodus of female scientists from the workforce entirely (regardless of funding outcome). For the most part, neither the men nor women in our sample seem to have given up research.

Does receiving a DDIG correlate with a more successful research career?

Within the limits of this analysis, it appears so.

For all measures at 10, 15, and 20 years, awardees appear to have more success than those who applied for but did not receive DDIG funding. There is a clear association between DDIG funding and positive research career outcomes.

Although we generally see larger differences in the older cohorts, this data doesn’t tell us how, if at all, these differences grow over time for individual year groups or whether the trajectories are changing from year to year. However, the small differences between awarded and declined students in the 2008 cohort suggest to us that both groups start from the same point and that the DDIG program is not simply selecting those already “ahead in the game” of academic output.

On the flip side, the lack of differences at the 5-year mark could be seen as supporting the idea that the program simply selects those who would have done well in research without the DDIG program rather than providing a direct early career boost.

Of those in research-oriented careers, does the potential effect of receiving a DDIG award differ based on gender?

For those remaining in research, there were few detectable effects of funding outcome and gender on career metrics, and none were particularly surprising. In general, for both genders, awardees were more successful than declined students, and on many of the metrics males had a slight advantage over females within a given year and funding outcome. However, the differences were neither sufficiently large nor consistent to suggest real relationships between gender, DDIG award status, and career outcomes.


Hot off the press: new edition of the Dimensions of Biodiversity Abstract Booklet

The Dimensions of Biodiversity program has just released its latest abstract booklet, available at: http://www.nsf.gov/pubs/2014/nsf14057/nsf14057.pdf

DEB's Dimensions of Biodiversity 2010-2013 abstract book cover.

This special program in the Division of Environmental Biology focuses on understanding the least-known aspects of biodiversity. It is a dynamic and multidisciplinary program that takes a broad view of biodiversity, ranging from genes through species to ecosystems. The program requires investigators to integrate genetic, taxonomic/phylogenetic, and functional aspects of biodiversity.

The 2013 awards were co-funded by the NSF Divisions of Environmental Biology and Ocean Sciences, NASA, the National Natural Science Foundation of China, and the São Paulo Research Foundation (FAPESP) of Brazil.

Check out the cool new projects that were funded in 2013, as well as the impressive outputs from previous awardees. These include numerous publications, creative outreach activities, and spotlights in high-profile media outlets.

Congratulations to the Dimensions investigators for all of their hard work and accomplishments!