Searching by grant number: comparison of funding acknowledgments in NIH RePORTER, PubMed, and Web of Science.
ABSTRACT: Objective: Several publication databases now index funding agency and grant number metadata with their publication records. Librarians who are familiar with the particulars of these databases can assist investigators and administrators in gathering data for the publication summaries and metrics required for renewals of, and progress reports for, National Institutes of Health (NIH) grants. Methods: Publication lists were pulled from three main indexers of publication-associated funding information (NIH RePORTER, PubMed, and Web of Science) using iterative search strategies. All discovered variations of the cited grant number of interest were recorded and tested. The publication lists were then compared for overall coverage. Results: A total of 986 publications citing the single grant number of interest were returned for the given time frame: 920 were found in PubMed, 860 in NIH RePORTER, and 787 in Web of Science. Web of Science offered the highest number of publications not found in the other two sources (n=63). Analysis of publication funding acknowledgments uncovered 21 variations of the specific NIH award of interest that were used to report funding support. Conclusions: This study shows that while PubMed returns the most robust list of publications, variations in the format of reported funding support and in indexing practices meant that no one resource was sufficient to capture all publications citing a given NIH project grant number. Librarians looking to build grant-specific publication lists will need to use multiple resources and be aware of the most frequently reported grant number variations to identify a comprehensive list of supported publications.
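An iterative search of this kind can be sketched in code: enumerate the grant number variants found in acknowledgments and issue one PubMed query per variant via the NCBI E-utilities `esearch` endpoint, restricted to the Grant Number `[gr]` field. This is a minimal sketch, not the study's actual protocol; the award components below (activity code "R01", institute code "CA", serial "123456") and the specific variant list are hypothetical examples.

```python
# Sketch: enumerate common citation formats of one NIH award and build a
# PubMed E-utilities query for each. Award parts are hypothetical examples.
from urllib.parse import urlencode

ESEARCH = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"

def grant_variants(activity: str, institute: str, serial: str) -> list[str]:
    """Return frequently seen acknowledgment formats for one NIH award."""
    core = f"{institute}{serial}"              # e.g. CA123456
    return [
        f"{activity}{core}",                   # R01CA123456
        f"{activity} {core}",                  # R01 CA123456
        f"{activity}-{institute}-{serial}",    # R01-CA-123456
        f"{activity}{institute}-{serial}",     # R01CA-123456
        core,                                  # CA123456 (activity code dropped)
        f"{institute}-{serial}",               # CA-123456
    ]

def esearch_url(variant: str) -> str:
    """PubMed query URL restricted to the Grant Number [gr] search field."""
    params = {"db": "pubmed", "term": f"{variant}[gr]", "retmax": "500"}
    return ESEARCH + "?" + urlencode(params)

variants = grant_variants("R01", "CA", "123456")
urls = [esearch_url(v) for v in variants]
```

Fetching each URL returns the matching PubMed IDs; the per-variant result sets can then be unioned and compared against NIH RePORTER and Web of Science exports for coverage.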
Project description: In this paper, we estimate the impact of receiving an NIH grant on subsequent publications and citations. Our sample consists of all applications (unsuccessful as well as successful) to the NIH from 1980 to 2000 for standard research grants (R01s). Both OLS and IV estimates show that receipt of an NIH research grant (worth roughly $1.7 million) leads to only one additional publication over the next five years, which corresponds to a 7 percent increase. The limited impact of NIH grants is consistent with a model in which the market for research funding is competitive, so that the loss of an NIH grant simply causes researchers to shift to another source of funding.
Project description: This study analyzes funding acknowledgments in scientific papers to investigate relationships between research sponsorship and publication impact. We identify acknowledgments of research sponsors in nanotechnology papers published in the Web of Science during a one-year sample period. We examine the citations accrued by these papers and the journal impact factors of their publication titles. The results show that publications from grant-sponsored research exhibit higher impact, in terms of both journal ranking and citation counts, than research that is not grant sponsored. We discuss the method and models used, and the insights provided by this approach as well as its limitations.
Project description: We investigated the association between a U.S. National Institutes of Health (NIH) R01 applicant's self-identified race or ethnicity and the probability of receiving an award by using data from the NIH IMPAC II grant database, the Thomson Reuters Web of Science, and other sources. Although proposals with strong priority scores were equally likely to be funded regardless of race, we find that Asians are 4 percentage points and black or African-American applicants are 13 percentage points less likely to receive NIH investigator-initiated research funding compared with whites. After controlling for the applicant's educational background, country of origin, training, previous research awards, publication record, and employer characteristics, we find that black applicants remain 10 percentage points less likely than whites to be awarded NIH research funding. Our results suggest some leverage points for policy intervention.
Project description: RATIONALE: Funding decisions for cardiovascular R01 grant applications at the National Heart, Lung, and Blood Institute (NHLBI) largely hinge on percentile rankings. It is not known whether this approach enables the highest-impact science. OBJECTIVE: Our aim was to conduct an observational analysis of percentile rankings and bibliometric outcomes for a contemporary set of funded NHLBI cardiovascular R01 grants. METHODS AND RESULTS: We identified 1,492 investigator-initiated de novo R01 grant applications that were funded between 2001 and 2008 and followed their progress for linked publications and citations to those publications. Our coprimary end points were citations received per million dollars of funding, citations obtained within 2 years of publication, and 2-year citations for each grant's maximally cited paper. In 7,654 grant-years of funding that generated $3,004 million in total National Institutes of Health awards, the portfolio yielded 16,793 publications that appeared between 2001 and 2012 (median per grant, 8; 25th and 75th percentiles, 4 and 14; range, 0-123), which received 2,224,255 citations (median per grant, 1,048; 25th and 75th percentiles, 492 and 1,932; range, 0-16,295). We found no association between percentile rankings and citation metrics; the absence of association persisted even after accounting for calendar time, grant duration, number of grants acknowledged per paper, number of authors per paper, early investigator status, human versus nonhuman focus, and institutional funding. An exploratory machine learning analysis suggested that grants with the best percentile rankings did yield more maximally cited papers. CONCLUSIONS: In a large cohort of NHLBI-funded cardiovascular R01 grants, we were unable to find a monotonic association between better percentile ranking and higher scientific impact as assessed by citation metrics.
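The coprimary end point "citations received per million dollars of funding" normalizes citation counts by award size so grants of different budgets can be compared. A minimal sketch of that normalization, using invented toy numbers rather than the study's NHLBI data:

```python
# Sketch of the citations-per-million-dollars metric described above.
# The portfolio values below are hypothetical, not the study's data.
from statistics import median

def citations_per_million(total_citations: int, award_dollars: float) -> float:
    """Citations received per million dollars of grant funding."""
    return total_citations / (award_dollars / 1_000_000)

# Toy portfolio: (total citations, total award in dollars) per grant
portfolio = [(1048, 2_000_000), (492, 1_500_000), (1932, 2_500_000)]
rates = [citations_per_million(c, d) for c, d in portfolio]
print(round(median(rates), 1))  # median normalized rate across the toy grants
```

Per-grant rates like these, rather than raw citation totals, are what the study compared against percentile rankings.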
Project description: BACKGROUND: Large cross-disciplinary scientific teams are becoming increasingly prominent in the conduct of research. PURPOSE: This paper reports on a quasi-experimental longitudinal study conducted to compare bibliometric indicators of scientific collaboration, productivity, and impact of center-based transdisciplinary team science initiatives and traditional investigator-initiated grants in the same field. METHODS: All grants began between 1994 and 2004, and up to 10 years of publication data were collected for each grant. Publication information was compiled and analyzed during the spring and summer of 2010. RESULTS: Following an initial lag period, the transdisciplinary research center grants had higher overall publication rates than the investigator-initiated R01 (NIH Research Project Grant Program) grants. There were relatively uniform publication rates across the research center grants compared to dramatically dispersed publication rates among the R01 grants. On average, publications produced by the research center grants had greater numbers of coauthors but similar journal impact factors compared with publications produced by the R01 grants. CONCLUSIONS: The lag in productivity among the transdisciplinary center grants was offset by their overall higher publication rates and average number of coauthors per publication, relative to investigator-initiated grants, over the 10-year comparison period. The findings suggest that transdisciplinary center grants create benefits for both scientific productivity and collaboration.
Project description: Objective: To better support dentistry researchers in the ever-changing landscape of scholarly research, academic librarians need to redefine their roles and discover new ways to be involved at each stage of the research cycle. A needs assessment survey was conducted to evaluate faculty members' research support needs and allow a more targeted approach to the development of research services in an academic health sciences library. Methods: The anonymous, web-based survey was distributed via email to full-time researchers at the Faculty of Dentistry, University of Toronto. The survey included twenty questions inquiring about researchers' needs and behaviors across three stages of the research cycle: funding and grant applications, publication and dissemination, and research impact assessment. Data were also collected on researchers' use of grey literature to identify whether current library efforts to support researchers should be improved in this area. Results: Among library services, researchers considered support for funding and grant applications most valuable and grey literature support least valuable. Researcher engagement with open access publishing models was low, and few participants had self-archived their publications in the university's institutional repository. Participants reported low interest in altmetrics, and few used online tools to promote or share their research results. Conclusions: Findings indicate that increased efforts should be made to promote and develop services for funding and grant applications. New services are needed to assist researchers in maximizing their research impact and to increase researcher awareness of the benefits of open access publishing models, self-archiving, and altmetrics.
Project description: In 2004, there was limited tuberculosis (TB) research capacity in the country of Georgia. In response, a collaborative research training program (RTP), supported by a National Institutes of Health Fogarty International Center Global Infectious Diseases grant, was formed between a U.S. academic institution and the National Center for Tuberculosis and Lung Disease (NCTLD) and other institutions in Georgia. We sought to assess the outcomes of this RTP. The TB RTP combined didactic and mentored research training for Georgian trainees. Long-term trainees were supported for a 2-year period and with post-training career development mentoring. Metrics used to measure program performance included publications, grants received, and career advancement. From 2004 to 2015, 20 trainees participated in the program, with 15 (75%) authoring a total of 65 publications in PubMed-listed journals. The median number of publications per trainee was six (interquartile range 2-14). A total of 16 (80%) trainees remain working in the area of TB; nine were promoted to leadership positions and three to lead research units at Georgian institutions. Ten (50%) trainees were the principal investigator (PI) of a peer-reviewed external grant after Fogarty-supported training, and 40% served as research mentors. Annual TB-related research funding at the NCTLD increased from $5,000 in 2005 to ~$1.5 million in 2017. A Georgian Fogarty trainee was either PI, site PI, or coinvestigator on >90% of all research funding. We believe that the NIH Fogarty-funded TB research training grant has made critical contributions to increasing TB-related research infrastructure and capacity in Georgia, particularly at the NCTLD.
Project description: OBJECTIVE: Determine drivers of academic productivity within U.S. departments of surgery. METHODS: Eighty academic metrics for 3,850 faculty at the top 50 NIH-funded university-based and 5 outstanding hospital-based surgical departments were collected using websites, Scopus, and NIH RePORTER. RESULTS: Mean faculty size was 76. Overall, 35.3% were assistant, 27.8% associate, and 36.9% full professors. Women comprised 21.8%; 4.9% were MD-PhDs and 6.1% PhDs. By faculty rank, median publications/citations were: assistant, 14/175; associate, 39/649; and full professor, 97/2,250. General surgery divisions contributed the most publications and citations. The highest-performing subspecialties per faculty member were: research (58/1,683), transplantation (51/1,067), oncology (41/777), and cardiothoracic surgery (48/860). Overall, 23.5% of faculty were principal investigators on a current or former NIH grant, and 9.5% on a current or former R01/U01/P01. The 10 most cited faculty (MCF) within each department contributed 42% of all publications and 55% of all citations. MCF were most commonly general (25%), oncology (19%), or transplant surgeons (15%). Fifty-one percent of MCF had current or former NIH funding, compared with 20% of the rest (p<0.05); funding rates for R01/U01/P01 grants were 25.1% vs. 6.8% (p<0.05). The rate of current NIH funding among MCF correlated with higher total departmental NIH rank (p<0.05). CONCLUSIONS: Departmental academic productivity, as defined by citations and NIH funding, is driven largely by sections or divisions of research, general surgery, and transplantation surgery. MCF, regardless of subspecialty, contribute disproportionately to major grants and publications. Approaches that attract, develop, and retain funded MCF may be associated with dramatic increases in total departmental citations and NIH funding.
Project description: Some scholars add authors to their research papers or grant proposals even when those individuals contribute nothing to the research effort. Some journal editors coerce authors to add citations that are not pertinent to their work, and some authors pad their reference lists with superfluous citations. How prevalent are these types of manipulation, why do scholars stoop to such practices, and who among us is most susceptible to such ethical lapses? This study builds a framework around how intense competition for limited journal space and research funding can encourage manipulation and then uses that framework to develop hypotheses about who manipulates and why they do so. We test those hypotheses using data from over 12,000 responses to a series of surveys sent to more than 110,000 scholars from eighteen different disciplines spread across science, engineering, social science, business, and health care. We find widespread misattribution in publications and in research proposals, with significant variation by academic rank, discipline, sex, publication history, co-authors, etc. Even though the majority of scholars disapprove of such tactics, many feel pressured to make such additions, while others suggest that it is just the way the game is played. The findings suggest that certain changes in the review process might help to stem this ethical decline, but progress could be slow.
Project description: PURPOSE: Implementation science offers methods to evaluate the translation of genomic medicine research into practice. The extent to which the National Institutes of Health (NIH) human genomics grant portfolio includes implementation science is unknown. This brief report's objective is to describe recently funded implementation science studies in genomic medicine in the NIH grant portfolio and identify remaining gaps. METHODS: We identified investigator-initiated NIH research grants on implementation science in genomic medicine (funding initiated 2012-2016). A codebook was adapted from the literature, three authors coded grants, and descriptive statistics were calculated for each code. RESULTS: Forty-two grants fit the inclusion criteria (~1.75% of investigator-initiated genomics grants). The majority of included grants proposed qualitative and/or quantitative methods with cross-sectional study designs, and described clinical settings and primarily white, non-Hispanic study populations. Most grants were in oncology and examined genetic testing for risk assessment. Finally, grants lacked the use of implementation science frameworks, and most examined uptake of genomic medicine and/or assessed patient-centeredness. CONCLUSION: We identified large gaps in implementation science studies in genomic medicine in the funded NIH portfolio over the past 5 years. To move the genomics field forward, investigator-initiated research grants should employ rigorous implementation science methods within diverse settings and populations.