Reexamining the NSF GRFP review process

Roxanne Evande
Published in SciTech Forefront
9 min read · Jun 14, 2023

BACKGROUND

In the United States, securing federal research funding is a top priority for most graduate students, and understandably so. Graduate fellowships provide financial security, prestige and a valuable network. Compared with other federal funding agencies, such as the National Institutes of Health and the Department of Energy, the National Science Foundation (NSF) offers a more competitive Graduate Research Fellowship Program (GRFP), which has awarded fellowships to several Nobel laureates and leaders in science and engineering. However, in late July 2022, two U.S. senators and the ranking member of the U.S. Senate Committee on Health, Education, Labor and Pensions sent a letter to the NSF director expressing concerns about the agency's mission and the types of grants it was funding. While the senators were primarily concerned about NSF-funded research projects with a particular political bent, they also called into question the overall NSF grant-awarding process [1].

The NSF received approximately $8.8 billion in federal funding for fiscal year 2022. As that amount was expected to increase in 2023, the senators wanted to understand the specific standards the NSF applies in reviewing grant awards and how it determines the value of the research it funds. According to the NSF GRFP, the program has two main goals: 1) to select, recognize and financially support early-career individuals with the demonstrated potential to be high-achieving scientists and engineers, and 2) to broaden participation in science and engineering by women, members of groups historically underrepresented in STEM, persons with disabilities and veterans. As policymakers scrutinize the NSF budget and grant award process, the NSF must begin examining its grant and fellowship review processes to ensure that they are in line with the overall mission of the NSF GRFP and are completely devoid of bias. In this paper, I discuss the NSF GRFP and the changes it can implement to improve its processes and remain consistent with its main goals.

Historically, in reviewing proposals, the NSF GRFP used publication records, grade point average (GPA) and research conducted to evaluate an individual's career trajectory. However, that measure of potential was riddled with problems. It excluded many members of the underrepresented groups whom the NSF GRFP claims it wants to fund. Some applicants may not have participated in undergraduate research because their university was underserved, or because they could not commit the time due to financial stressors. Consequently, such applicants are disadvantaged, because without research they cannot publish papers. While publications have always been seen as the gold standard in scientific research, some individuals have enormous opportunities to publish while others have few or none.


In 2010, the NSF GRFP decided to switch to a more holistic style of reviewing. It started reviewing proposals against two main criteria: intellectual merit and broader impacts. Reviewers were asked to consider a broad range of factors when reviewing applications, including educational, research, leadership, outreach and service records. According to Gisèle Muller-Parker, then-director of the NSF GRFP: "The goal was to move away from grades and focus more on the individual as a whole" [2]. Unfortunately, in recent years, despite those changes, applicants have reported reviewers' comments that are less than constructive and more focused on metrics that might not be realistic indicators of potential or success, such as number of publications. In April 2022, many individuals took to social media to express their joy or disappointment about their NSF GRFP outcome using the hashtag #NSFGRFP. Many former GRFP applicants shared messages of perseverance, while others brought to light an unfortunate reality: their reviewers gave critiques that displayed unconscious and implicit bias.

The comments that reviewers leave on an application can have a tremendous impact on a young individual. For example, a Ph.D. student at the University of California* shared her experience applying to the GRFP in 2020. She did not receive the award, and a reviewer criticized her for receiving a B in general chemistry and for lacking a strong publication record. Yet publication record should not be a measure of how good a scientist an applicant might become. The student who received those comments now has multiple publications, has presented at various national conferences, and has received grants and awards that recognize her potential. Any number of young scientists and engineers may have been overlooked due to circumstances beyond their control or due to their background.

RECOMMENDATIONS

In an effort to quantify instances of unfair reviewer comments, such as the one above, I collected survey data from potential and past applicants and from previous reviewers to develop a set of informed suggestions for improving the NSF GRFP review process. The surveys were completed by applicants across all outcomes: those who received the fellowship, those who received an honorable mention and those who did not receive the fellowship. Reviewers anonymously provided feedback on their training process, and their insight informed the following three main suggestions for improving the NSF GRFP application:

1. Create and publicly post an annual or biannual update of the NSF GRFP report

2. Increase transparency of the application review process

3. Improve reviewer training by creating an in-person orientation

CREATE AND PUBLICLY POST AN UPDATED NSF GRFP REPORT

According to the NSF website, the last known report of the NSF GRFP was published in 2014: Evaluation of the National Science Foundation's Graduate Research Fellowship Program. While informative, this report is now outdated. The NSF must establish a recurring, regular report with concrete recommendations to improve the review process. This will ensure that the fellowship review is modified regularly to increase equity and transparency for the scientific community. Additionally, that report drew its data only from applicants who received the award or an honorable mention. Standardizing a follow-up survey for all applicants, and using that data, would expand the evidence base and help prevent bias. The survey I conducted received responses from awardees, applicants who received an honorable mention and individuals who did not receive the fellowship (Figure 1). The feedback was similar regardless of the outcome of the application. Awardees voiced many of the same concerns as applicants who did not receive the fellowship. At least 20% of applicants mentioned that reviewers scrutinized their grades too much, even if they received the fellowship. Applicant X even stated: "Reviewers mentioned my GPA as a 'limitation', but since I had five publications, they funded me anyway". Introducing a follow-up survey after distribution of the awards would help keep the NSF GRFP report current and could additionally assist reviewers in adjusting their methods periodically. Results should be examined regularly to consider and ultimately address the concerns of applicants.

Figure 1. Pie chart representing the survey respondents' NSF GRFP outcomes.

INCREASE TRANSPARENCY OF THE APPLICATION REVIEW PROCESS

The NSF GRFP is a prestigious, dream fellowship for many doctoral students. When applicants begin the process, they visit the fellowship website to ensure that they meet the standards for the award. The current NSF GRFP solicitation states that NSF proposals are reviewed against two main criteria: intellectual merit and broader impacts. It also mentions that reviewers are encouraged to review the rest of the application holistically and to assess applicants with a balanced consideration of educational and research records, leadership, outreach, service activities and future plans, as well as individual competencies, experiences and other attributes. While the NSF might think this is more than enough information, I believe that the NSF still needs to be more transparent. Several other fellowships, such as the NIH fellowships, set out the exact rubrics that reviewers follow. This provides applicants with a guideline for what should be included in their applications. Only some applicants are privileged enough to have guidance from their institutions, previous applicants or reviewers. Universities such as Stanford, Yale, the University of Florida and Drexel have courses and workshops specifically designed to prepare students for the NSF GRFP application process. Nonetheless, the vast majority of applicants attend institutions that are underserved and do not provide these resources.

In the survey I conducted, an applicant who received the fellowship mentioned: "My PhD program had a really strong track record of successful GRFP applications and senior students were extremely willing to share their experience from their successful proposals and to offer feedback on their drafts. This was extremely helpful; it helped me in putting together a competitive application and in making the award seem achievable." Students who do not have these resources rely solely on the solicitations and the NSF GRFP website for guidance. I believe the NSF GRFP should include a set of guidelines or a rubric that applicants can see and reviewers will use. This would allow less privileged applicants to prepare ahead of time to meet the criteria, and would make it easier for reviewers to reach agreement on individual applications.

IMPROVE REVIEWER TRAINING BY CREATING AN IN-PERSON ORIENTATION

NSF GRFP reviewers are expected to have completed online training and attended an orientation webinar. However, according to an article entitled "'Dysfunctional.' NSF graduate fellowship review process draws criticism" [2], reviewers said that the NSF does not provide enough guidance on how to weigh the different review criteria.

The survey I conducted yielded similar reviewer comments. I asked reviewers whether they believed they had received adequate training in three areas: (1) reviewing the application as a whole instead of focusing on one specific area; (2) refraining from judging applicants based on their background (academic, socioeconomic, ethnic); and (3) providing constructive feedback that elevates and helps applicants moving forward. While most reviewers agreed that the training they received covered all these topics, some believed that they needed more guidance on how to judge applicants from various backgrounds. Reviewer X, for example, said: "I would enjoy discussions of what it means to support candidates from diverse backgrounds; I feel applicants too often have to display their trauma to get noticed." Reviewers also suggested having trained professionals explicitly point out biased reviewer comments that should be avoided, and having relevant, updated scenarios for training. Introducing the follow-up survey proposed above could help develop scenarios for reviewer training. Ultimately, I suggest introducing an in-person reviewer seminar to ensure that all reviewers know exactly what is expected of them. Ideally, the reviewer seminar would take the place of the webinar and should be required for all reviewers, regardless of their years of experience with the NSF GRFP. It should include skilled professionals who would enhance the knowledge of reviewers by identifying relevant scenarios and teaching them how to engage with applicants from diverse backgrounds.

Figure 2. A) Responses from reviewers asked: "Have you ever witnessed bias in the fellowship review sessions?" B) Responses from reviewers asked: "Do you believe you would benefit from a 1–2 day in-person training seminar?"

If the scenarios and the training are regularly updated to reflect the times, frequent reviewers would not find them repetitive. Transitioning to annual reviewer training would benefit all reviewers and the NSF GRFP. Figure 2B shows that 22% of reviewers believed that they would benefit from a reviewer training seminar, 22% believed that they would not benefit from the training and, interestingly, 55.5% said that they might benefit from the training. In a separate question, 55.56% of the reviewers said they had witnessed bias in fellowship reviewing (Figure 2A). Although reviewers feel they are properly trained, instances of bias are still being witnessed. This leads me to believe that an annual reviewer seminar would benefit all reviewers and the NSF GRFP. Alternatively, if attending an orientation in person becomes too expensive, the current webinar should either become hybrid or stay online but with improved training, as outlined above.

CONCLUSION

In summary, receiving an external fellowship can enhance the personal development of young researchers and provide them with resources to propel them toward future success. As a federally funded agency, the NSF should be implementing measures to ensure equity in its fellowship awards. I urge the NSF to take these suggestions on board and strive for a more inclusive and fair review process, so that the most talented, rather than the most privileged, students are awarded this fellowship.

Acknowledgements

This publication was written as part of the American Society for Biochemistry and Molecular Biology (ASBMB) Advocacy Training Program. The views reflected herein are solely those of the author.

*University name changed for discretion

[1] https://www.help.senate.gov/imo/media/doc/Burr%20Paul%20Cruz%20to%20NSF%20re%20grant%20oversight%20072922.pdf

[2] https://www.science.org/content/article/dysfunctional-nsf-graduate-fellowship-review-process-draws-criticism
