NSF Public Outcomes Report

Since January 4, 2010, the NSF has in theory required all grantees to submit Project Outcomes Reports within ninety days of their grants’ expiration. (Since 2016, it has been 120 days.) Quoting from a Research.gov fact sheet: “The Project Outcomes Report for the General Public is a required report, written by Principal Investigators (PIs) specifically for the public, to provide insight into the outcomes of National Science Foundation (NSF)-funded research. The America COMPETES Act (ACA) of 2007, Section 7010, requires that research outcomes and citations of published documents resulting from research funded, in whole or in part, by NSF be made available to the public in a timely manner and electronic format. . . . [It is] required for new awards made or existing awards that receive funding amendments on or after January 4, 2010.”[1]

The NSF’s “Proposal and Award Policies and Procedures Guide” for 2016 states:

No later than 120 days following expiration of the grant [the 2010, 2013, and 2014 editions of the guide specify 90 days], a project outcomes report for the general public must be submitted electronically. This report serves as a brief summary, prepared specifically for the public, of the nature and outcomes of the project. This report will . . . describe the project outcomes or findings that address the intellectual merit and broader impacts of the work as defined in the NSF merit review criteria. This description should be a brief (generally, two to three paragraphs) summary of the project’s results that is written for the lay reader. Principal Investigators are strongly encouraged to avoid use of jargon, terms of art, or acronyms. [2]

An examination of data at Research.gov, however, shows that only a limited number of grantees submitted reports from 2010 through 2013. [3] Few anthropologists are willing to challenge the NSF, given that it is a major source of anthropological funding. Fortunately, the Center for a Public Anthropology does not receive NSF funding. When writing directly to the NSF director did not elicit a response, the Center and a number of student volunteers (using their home addresses) wrote letters to members of the US House Subcommittee on Research and Technology and the US Senate Subcommittee on Science and Space, which jointly control the NSF’s budget. The budget, it should be noted, was up for renewal in 2014. As the campaign started, various media (e.g., the Chronicle of Higher Education, October 24, 2014) reported on the House Subcommittee’s attempt to review NSF grants, especially in the social sciences. The Subcommittee, however, ran into various problems and was unsuccessful. Still, the NSF felt threatened.

In assessing the campaign’s effectiveness, it is instructive to compare the number of Project Outcomes Reports submitted during the three months of the campaign with the number submitted before it began. The table below shows Project Outcomes Reports submitted as a percentage of total grants awarded in 2010, 2011, 2012, and 2013:

Project Outcomes Reports Submitted as Percent of Total Awarded Grants

Year    Before the Campaign    After the Campaign
2010    6%                     85%
2011    11%                    82%
2012    3%                     82%
2013    23%                    84%


What I hypothesize occurred is this: When the House Subcommittee ran into resistance in seeking to review NSF grants, the students’ “snail mail” letters (over two thousand in all) attracted the Subcommittee’s interest and offered an alternative means for it to assert authority over the NSF. (Copies of the letters were also sent to the NSF director.) Given its failure to enforce a requirement grounded in federal law, the NSF had little choice but to address the problem of the missing Project Outcomes Reports. It could not push back against the Subcommittee as it had against the Subcommittee’s inquiry into NSF grants. Clearly, the Center and the students, working together, could not have influenced the NSF on their own. But they could provide the House Subcommittee with information it needed to push back against the NSF, information it apparently did not previously have. The House Subcommittee, given its control over the NSF’s budget, had to be listened to. The Center and the students did not.

[1] The fact sheet is available here (accessed October 5, 2017).

[2] “Proposal and Award Policies and Procedures Guide,” NSF, effective January 2016, see here. Note that the guides for 2010, 2013, and 2014 use the same or similar phrasing.

[3] See here. After recent correspondence with the NSF and further investigation, a more accurate compilation of those who have submitted their Project Outcomes Reports, based on Faculty Award Numbers, is available. Reviewing these and related data, four points stand out: (1) Compliance rates calculated with the Faculty Award Numbers, while higher than those publicly displayed at the above website, still indicate compliance is comparatively low. (2) Although the NSF asserts it has a high compliance rate for Project Outcomes Report submissions, the data available to me suggest the NSF has not conducted a systematic study of compliance rates using the Faculty Award Numbers and hence is uncertain what the actual compliance rate is. (3) Nor is it certain, from a limited examination of related data, that the NSF software, contrary to what is claimed, always bars those who have failed to submit Project Outcomes Reports from obtaining new grants. (4) The overall impression is that, while the NSF is quite professional in its grant approval process, it is at times less than professional in ensuring grantees actually carry out their research in the manner promised and provide a Project Outcomes Report as promised when they received their funding (see An Anthropology of Anthropology, 2019: 33, 187).