Getting to ‘Yes’: NSF Pilots Research Security Assessment for Grants
The National Science Foundation has begun piloting a grant assessment process focused on research security, called TRUST (Trusted Research Using Safeguards and Transparency), that applies risk mitigation measures to project proposals that are close to being funded and meet certain risk criteria. The process responds to congressional directives requiring NSF to scrutinize research areas that pose security concerns, while still aiming to let the agency take on risky projects.
“We want to continue to fund the best research here in the United States, and if we decline proposals because we think they’re too risky, then researchers might seek that funding elsewhere,” said Rebecca Keiser, NSF’s chief research security official, at a briefing on TRUST this month. “We also want to focus on mitigation, rather than decline, and get to ‘yes.’”
The TRUST process uses three criteria that respectively relate to risk from institutional affiliations, nondisclosures, and the research itself. Analysts in the research security office consider whether senior project personnel have active appointments or positions with certain proscribed organizations. Such organizations include:
- Those on the U.S. government’s Entity List, which are subject to export restrictions;
- Those named in Executive Order 14032, which are Chinese companies with ties to the defense or surveillance technology sectors;
- Chinese military companies as determined by the Department of Defense;
- Research institutions that present risks of improper technology transfer as determined by DOD, which currently include certain institutions in China, Russia, and Iran; and
- Foreign talent recruitment programs deemed “malign” by DOD, whose participants are automatically prohibited from receiving federal funds per a provision of the CHIPS and Science Act.
The analysts then consider any nondisclosures of appointments, activities, and sources of financial support.
Finally, a research security review team of NSF subject matter experts, program officers, and representatives from the research security office reviews the analyses of the first two criteria and assesses the third, which covers potential national security applications of the research. If the team confirms concerns from the first two criteria or finds sufficient national security risk, it negotiates a risk mitigation plan with the research institution.
Scenario discussions during this step of the review process are intended to help NSF determine exactly how to apply the third criterion concerning the potential national security applications of the research, Keiser said. She added that while some cases are obvious, such as the prospect of developing a quantum computer capable of breaking standard encryption methods, often the potential applications are less tangible, especially for the type of early-stage research that NSF typically funds.
Keiser said mitigations could include providing information to the research team about the agency’s concerns regarding national security applications, securing commitments from the research team to give up funding from potentially concerning entities, or requiring more frequent project reporting to help ensure the research does not veer into concerning areas. “It does not mean we would terminate the research at that point,” she said of this third mitigation example. “It may mean that we consult with our defense-related colleagues to see if maybe that is a research project that has to go more into that area.”
Keiser added that NSF is also weighing whether to have representatives from national security agencies observe or participate as non-voting members of the review team, or to consult them only when the need arises. She noted that there was “real concern” about bringing anyone from outside NSF into the proposal review process.
NSF is piloting the process during fiscal year 2025 on proposals in quantum information science and technology in order to assess the time and resources needed. If the pilot is successful, Keiser said, NSF will consider expanding the review process to other critical technology areas such as AI, microelectronics, and biotechnology, and eventually covering all “priority areas.” She added that the office is “very nervous” about using the case-by-case approach across all NSF grants given the volume of applications; the agency receives more than 43,000 proposals a year and funds more than 11,000.
Keiser noted that Congress asked NSF to produce a list of research areas with potential national security implications as part of fiscal year 2023 appropriations legislation. However, based on a report from JASON, an independent group of scientists that advises the government on sensitive matters, NSF advised Congress that producing such a list could be counterproductive. The report found that such a list could lead to a one-size-fits-all approach that either goes too far in restricting researchers or fails to address less obvious risks, Keiser said.
Keiser was speaking to the National Academies’ Committee on Science, Engineering, Medicine, and Public Policy, whose members probed potential pitfalls in the process. Committee member Kelvin Droegemeier, who headed the White House Office of Science and Technology Policy during the Trump administration, suggested that grant applicants would likely try to write proposals in ways that avoid triggering the mitigation plan process. As a result, concerning features of the research may only arise in their annual project reports, which are not subject to the same scrutiny, Droegemeier said.
Keiser replied by noting her office has been working to raise awareness and due diligence among NSF program officers about research security, which she said has led to “encouraging” outreach to her office flagging potential concerns. “The vast majority of the time when we’re contacted, we’re able to assure them that it’s not an issue or can be easily mitigated with increased disclosure,” she said.
Committee member Ellen Pawlikowski, a former Air Force general, asked whether NSF has any collaboration with DOD on research security, expressing concern about the administrative burden of the various new screening processes being deployed across government. Keiser said agencies have been working to harmonize their risk mitigation approaches but face various challenges, such as differing levels of risk tolerance.
DOD and the National Institutes of Health have both established decision matrix processes that describe types of researcher activity, all related to foreign ties, that can lead to implementing risk mitigation measures or rejecting the application. The Department of Energy is also currently developing risk matrices. NSF, in contrast, describes its step-by-step assessment process as a “decision tree.”
“I’m afraid if we have too much of a common approach, it might be the lowest common denominator,” Keiser said. “That being said, we have to be more transparent and share with one another for another reason: we’re funding the same PIs. If we have the same PI in the same institution and we’re putting three different types of mitigation plans on the same [PI], then that’s just not okay.”