Building capacity in implementation science research training at the University of Nairobi.
ABSTRACT: Health care systems in sub-Saharan Africa, and globally, grapple with the problem of closing the gap between evidence-based health interventions and actual practice in health service settings. It is essential for health care systems, especially in low-resource settings, to increase capacity to implement evidence-based practices by training professionals in implementation science. With support from the Medical Education Partnership Initiative, the University of Nairobi has developed a training program to build local capacity for implementation science. This paper describes how the University of Nairobi leveraged resources from the Medical Education Partnership Initiative to develop an institutional program that provides training and mentoring in implementation science, builds relationships between researchers and implementers, and identifies local research priorities for implementation science. The curriculum covers core material in implementation science theory, methods, and experiences. The program adopts a team mentoring and supervision approach, in which fellows are matched with mentors at the University of Nairobi and partnering institutions: the University of Washington, Seattle, and the University of Maryland, Baltimore. A survey of program participants showed a high degree of satisfaction with most aspects of the program, including the content, duration, and attachment sites. A key strength of the fellowship program is its partnership approach, which leverages innovative use of information technology to offer diverse perspectives, and its team model for mentorship and supervision. As health care systems and training institutions seek new approaches to increase capacity in implementation science, the University of Nairobi Implementation Science Fellowship program can serve as a model for health educators and administrators who wish to develop their own programs and curricula.
Project description: Background: Implementation research (IR), the study of how to carry an intention into effect, is gaining increasing attention. It is an important approach for translating individual practices, policies, programmes and other technologies into solutions to public health problems. Low- and middle-income countries (LMICs) continue to experience public health problems that could be addressed using implementation research, yet these countries lag behind in prioritizing it, partly because knowledge about its value and scope has been provided in a disorganized way. This paper describes steps taken to resolve this through capacity-strengthening activities: a comprehensive implementation research training and mentorship programme informed by a needs assessment. Methods: The roll-out of the comprehensive implementation research training and mentorship was done in phases, including engaging the implementation research community through TDR Global, competency building for programme officers and ethical review board/committee members, and practical guidance on developing an implementation research proposal. Bloom's taxonomy guided the training, whilst the Kirkpatrick Model was used to evaluate the effectiveness of the capacity building. Results: The findings identified critical attributes of mentors, how mentorship should be structured, and the most effective ways of delivering mentorship. These findings were used to develop a mentorship guide in IR. The mentorship guide is to be used as a checklist for mentoring participants during trainings, as part of the package of resources in implementation research.
It is also to be used to equip review board members with knowledge of ethical issues in implementation research. Conclusion: The approach of providing comprehensive implementation research training and mentorship for programme personnel has given both potential mentors and mentees an opportunity to contribute to developing mentorship guidance for LMICs. This guidance would help address the challenges of initiating and implementing mentorship in IR.
Project description: Background: The Chinese Center for Disease Control and Prevention (China CDC) introduced the Structured Operational Research Training Initiative (SORT IT) into China to build specialized capacity and equip public health professionals with an effective tool for supporting developing countries in strengthening their operational research. This paper investigates and analyzes the implementation, outcomes and challenges of the first cycle of SORT IT in China. Main text: SORT IT China, Cycle 1 was implemented successfully and produced fruitful outputs, as shown by the 18-month follow-up of the post-training initiatives of the twelve participants, all of whom achieved the four milestones required by SORT IT. Eleven of the twelve (92%) manuscripts generated, which focused on the prevention and control of malaria, influenza, HIV/AIDS, hepatitis B, schistosomiasis, tuberculosis and Japanese encephalitis, were published in peer-reviewed international journals with impact factors ranging from 2.6 to 4.8. As of February 19, 2021, these papers had been cited 53 times, 31 of them by Science Citation Index papers with a combined impact factor of 94.827. Six senior professionals from China CDC also facilitated the whole SORT IT training scheme as co-mentors under the guidance of SORT IT mentors. The twelve participants, having gained familiarity with the SORT IT courses and training principles, are likely to become mentors for future SORT IT cycles; however, as non-native speakers of English, they also faced challenges in thoroughly understanding the modules delivered in English and in writing academic English to draft their manuscripts. Conclusion: The outcomes of the first cycle of SORT IT in China have led to studies that help narrow the knowledge gap across numerous public health challenges nationally and internationally.
It is believed that the researchers who participated will continue to apply the skills learned within their domains and help build training capacity for future operational research courses, both in China and in developing countries with similar needs.
Project description: Background: In Uganda and other resource-poor countries, relevant research findings face a tortuous path to translation into policy and routine practice. Implementation science (ImSc) research could facilitate faster translation. At present it is unclear what ImSc research capacity and training needs exist among Ugandan researchers. To assess both components, we interviewed potential trainees in Kampala, Uganda. Methods: We used a cross-sectional design to survey potential ImSc trainees who had some research training and were involved in generating or utilizing research. Using a questionnaire, we documented eligibility for ImSc training, knowledge of and interest in training, existing self-assessed confidence in initiating clinical research (SCICR) and in initiating ImSc research (SCIIR), availability for training, and preferred modes of training. We developed scores from the Likert scales and used descriptive statistics, logistic regression and ordinal logistic regression to evaluate predictors of SCIIR. Results: Between November 2016 and April 2017, we interviewed 190 participants; 60% were men, and the median age was 37 years. Among participants, 33% were faculty, 37% graduate students and 30% project staff. The majority of respondents knew about ImSc (73%) and were research-trained (80%). Only 9% reported any ImSc-related training. Previous ImSc training was associated with higher odds of an SCIIR score ≥ 75th percentile. Compared with no training, previous ImSc training was associated with higher odds of reporting ability in behaviour change theory integration (OR: 3.3, 95% CI: 1.3-8.5, p = 0.01) and in framework use in intervention design and implementation (OR: 2.9, 95% CI: 1.1-7.4, p = 0.03), accounting for age, sex and current employment. In addition, 53% of participants preferred short in-person (face-to-face) ImSc courses over a year-long training, while 33% preferred online courses.
Participants reported median availability of 6 hours per week (IQR: 4, 10) for training. Conclusion: Most participants had some understanding of ImSc research, had research training and were interested in ImSc training. Those with previous ImSc training had better skills and higher SCIIR than those without. A hybrid approach combining modular face-to-face training with online sessions would suit the preferences of most potential trainees.
Project description: Background: The field of dissemination and implementation (D&I) science has grown significantly in recent years, and demand for D&I training from researchers and implementers has increased alongside it. Research describing and evaluating D&I training opportunities, referred to here as 'capacity building initiatives' (CBIs), can provide an understanding of different training methods as well as training successes and challenges. However, to gain a more detailed understanding of the evidence base and of how D&I CBIs are reported in publications, a field-wide examination of the academic literature is required. Methods: We conducted a systematic review to identify the type and range of D&I CBIs discussed and/or appraised in the academic literature. EMBASE, Medline and PsycINFO were searched from January 2006 to November 2019. Articles were included if they reported on a D&I CBI developed by the authors of the included article or by the authors' host institution. Two reviewers independently screened the articles and extracted data using a standardised form. Results: Thirty-one articles (from a total of 4181) were included. From these, 41 distinct D&I CBIs were identified, focusing on different contexts and professions and spanning 8 countries. CBIs ranged from short courses to training institutes to components of academic programmes. Nearly half were delivered face-to-face, with the remainder delivered remotely or in a blended format. CBIs often stipulated specific eligibility criteria and strict application processes, and/or were oversubscribed. The ways in which D&I CBIs were reported and/or evaluated varied considerably. Conclusions: Increasing the number of training opportunities, and broadening their reach to a wider range of learners, would help address the recognised deficit in D&I training.
Standardised reporting of D&I CBIs would enable the D&I community to better understand findings across different contexts and scientific professions so that training gaps can be identified and overcome. More detailed examination of publications on D&I CBIs, as well as of the wider literature on capacity building, would be of significant merit to the field.
Project description: The UC San Diego Altman Clinical and Translational Research Institute Dissemination and Implementation Science Center (DISC) launched in 2020 to provide dissemination and implementation science (DIS) training, technical assistance, community engagement, and research advancement. DISC developed a program-wide logic model to inform a process evaluation of member engagement and of impact related to DISC services. The DISC Logic Model (DLM) served as the framework for a process evaluation capturing quantitative and qualitative information about scientific activities, outputs, and outcomes. The evaluation used a multimethod approach, with surveys, attendance tracking, feedback forms, documentation of grant outcomes, and promotion metrics (e.g., Twitter engagement). There were 540 DISC members at the end of the DISC's second year. Engagement was high, with nearly all members endorsing at least one scientific activity. Technical assistance offerings such as the DISC Journal Club and consultation were the most frequently used. The most common scientific outputs were grant submissions (65, 39%), formal mentoring for career awards (40, 24%), and paper submissions (34, 21%). The DLM facilitated a comprehensive process evaluation of our center. Actionable steps include prioritizing technical assistance, strengthening networking opportunities, and identifying streamlined approaches to facilitate DIS grant writing through writing workshops, "office hours", or organized writing leagues.
Project description: Background: While dissemination and implementation (D&I) science has grown rapidly, there is an ongoing need to understand how to build and sustain research capacity in individuals and institutions. There are three inter-related domains for capacity building: people, settings, and activities. Since 2008, Washington University in St. Louis has dedicated significant attention and resources to building D&I research capacity. This paper describes our process, challenges, and lessons, with the goal of informing others who may have similar aims at their own institutions. Activities: An informal collaborative, the Washington University Network for Dissemination and Implementation Research (WUNDIR), began with a small group and now has 49 regular members. Attendees represent a wide variety of settings and content areas and meet every 6 weeks for half-day sessions. A logic model organizes WUNDIR inputs, activities, and outcomes. A mixed-methods evaluation showed that the network has led to new professional connections and enhanced skills (e.g., grant and publication development). The first of four ongoing formal programs, the Dissemination and Implementation Research Core (DIRC), was our first major component of D&I infrastructure. DIRC's mission is to accelerate the public health impact of clinical and health services research by increasing the engagement of investigators in the later stages of translational research. Its aims are to advance D&I science and to equip researchers with tools for D&I research. As a second formal component, the Washington University Institute for Public Health has provided significant support for D&I research through pilot projects and a small grants program. In a third set of formal programs, two R25 training grants (one in mental health and one in cancer) support post-doctoral scholars through intensive training and mentoring in D&I science.
Finally, our team coordinates closely with D&I functions within research centers across the university. We also share a series of challenges and potential solutions. Conclusion: Our experience developing D&I research at Washington University in St. Louis shows how significant capacity can be built in a relatively short period of time. Many of our ideas and ingredients for success can be replicated, tailored, and improved upon by others.
Project description: Background: Implementation research is increasingly recognised as an important discipline that seeks to maximise the benefits of evidence-based interventions. Although capacity-building efforts are ongoing, limited attention has been paid to the contextual and health system peculiarities of low- and middle-income countries. Moreover, given the challenges encountered during the implementation of health interventions, the field of implementation research requires a creative approach to building expertise among health researchers and practitioners simultaneously. With support from the Special Programme for Research and Training in Tropical Diseases, we have developed an implementation research short course that targets both researchers and practitioners. This paper explains the course development process and reports on training evaluations, highlighting the course's relevance for inter-institutional and inter-regional capacity strengthening. Methods: The development of the implementation research course curriculum proceeded in four phases: formation of a core curriculum development team, course content development, internal reviews and pilot, and external reviews and evaluations. Five modules were developed, covering Introduction to implementation research, Methods in implementation research, Ethics and quality management in implementation research, Community and stakeholder engagement, and Dissemination in implementation research. Course evaluations were conducted using tools developed to measure participants' reactions and learning. Results: From 2016 to 2018, the IR curriculum was used to train a total of 165 researchers and practitioners, predominantly from African countries; the majority were male (57%) and researchers/academics (79.4%). Participants generally gave positive ratings (e.g. on the integration of concepts) in their reactions to the training.
Under 'learning', participants indicated improvement in their knowledge in areas such as the identification of implementation research problems and questions. Conclusion: Training researchers and practitioners together offers a dynamic opportunity for both categories of learners to acquire and share knowledge. This approach was also crucial in practically demonstrating a key characteristic of implementation research, its multidisciplinary nature, during the training sessions. Using such a model to effectively train participants from various low- and middle-income countries shows the promise of this training curriculum as a capacity-building tool.
Project description: Background: Many public health programs fail because of an inability to implement tested interventions in diverse, complex settings. The field of implementation science is engaged in developing strategies for successful implementation, but current training is primarily researcher-focused. To tackle the challenges of the twenty-first century, public health leaders are promoting a new model, Public Health 3.0, in which public health practitioners become "chief health strategists" and develop interdisciplinary skills for multisector engagement to achieve impact. This requires broad training for public health practitioners in implementation science that includes the allied fields of systems and design thinking, quality improvement, and innovative evaluation methods. At UNC Chapel Hill's Gillings School of Global Public Health, we created an interdisciplinary set of courses in applied implementation science for Master of Public Health (MPH) students and public health practitioners. We describe our rationale, conceptual approach, pedagogy, courses, and initial results to assist other schools contemplating similar programs. Methods: Our conceptual approach recognized the vital relationship between implementation research and practice. We conducted a literature review of thought leaders in public health to identify skill areas related to implementation science that are priorities for the future workforce. We also reviewed currently available training programs in implementation science to understand their scope and objectives and to assess whether any would fit these priorities. We used a design-focused implementation framework to create four linked courses that draw from multiple fields, such as engineering, management, and the social sciences, and emphasize application through case studies.
We validated the course content by mapping it to implementation science competencies in the literature. Results: To date, no other program provides comprehensive interdisciplinary skills in applied implementation science for MPH students. As of April 2018, we have offered a total of eleven sections of the four courses, with a total enrollment of 142, of whom 127 have been master's-level students in the school of public health. Using Kirkpatrick's Model, we found positive student reaction, learning, and behavior. Many students have completed practicums, master's papers, and special studies focused on applied implementation science. Conclusions: Students have found our systematically designed interdisciplinary curriculum in applied implementation science to provide a useful set of skills, and have demonstrated the capability to master this material and incorporate it into their practicums and master's papers.
Project description: Problem: Dissemination and implementation (D&I) science provides the tools needed to close the gap between known intervention strategies and their effective application. The authors report on the Mentored Training for Dissemination and Implementation Research in Cancer (MT-DIRC) program, a D&I training program for postdoctoral or early-career cancer prevention and control scholars. Approach: MT-DIRC was a 2-year training institute in which fellows attended two annual Summer Institutes and other conferences and received didactic, group, and individual instruction; individualized mentoring; and other supports (e.g., pilot funding). A quasi-experimental design compared changes in three areas: mentoring, skills, and network composition. To evaluate mentoring and D&I skills, the program collected data from fellows on their mentors' mentoring competencies, their perspectives on the importance of and satisfaction with mentoring priority areas, and their self-rated skills in D&I competency domains. Network composition data were collected from faculty and fellows for three core social network domains: contact, mentoring, and collaboration. Paired t tests (mentoring), linear mixed models (skills), and descriptive analyses (network composition) were performed. Outcomes: Mentors were rated as highly competent across all mentoring competencies, and each mentoring priority area showed a reduced gap between satisfaction and importance between 6 and 18 months after the first Summer Institute. Fellows' self-rated skills in D&I competencies improved significantly in all domains over time (range: 42.5%-52.9% increase from baseline to 18 months after the first Summer Institute).
Mentorship and collaboration networks grew over time, with the highest number of collaboration network ties for scholarly manuscripts (n = 199) in 2018 and for research projects (n = 160) in 2019. Next steps: Building on the study findings and the existing literature, mentored training of scholars is an important approach to building D&I skills and networks, and thus to better applying the vast amount of available intervention evidence to benefit cancer control.
Project description: Background: Research centers and programs focused on dissemination and implementation science (DIS) training, mentorship, and capacity building have proliferated in recent years. There has not yet been a comprehensive inventory of DIS capacity building programs (CBPs) cataloging information about activities, infrastructure, and priorities, or about opportunities for shared resources, collaboration, and growth. The purpose of this systematic review is to provide the first inventory of DIS CBPs and describe their key features and offerings. Methods: We defined DIS CBPs as organizations or groups with an explicit focus on building practical knowledge and skills to conduct DIS for health promotion. CBPs were included if they had at least one capacity building activity beyond educational coursework or training alone. A multi-method strategy was used to identify DIS CBPs. Data about the characteristics of DIS CBPs were abstracted from each program's website. In addition, a survey instrument was developed and fielded to gather in-depth information about the structure, activities, and resources of each CBP. Results: In total, 165 DIS CBPs met our inclusion criteria and were included in the final CBP inventory. Of these, 68% are affiliated with a United States (US) institution and 32% are internationally based. One CBP was identified in a low- and middle-income country (LMIC). Of the US-affiliated CBPs, 55% are embedded within a Clinical and Translational Science Award program. Eighty-seven CBPs (53%) responded to a follow-up survey.
Of those who completed the survey, the majority used multiple DIS capacity building activities, the most popular being Training and Education (n=69, 79%), followed by Mentorship (n=58, 67%), provision of DIS Resources and Tools (n=57, 66%), Consultation (n=58, 67%), Professional Networking (n=54, 62%), Technical Assistance (n=46, 52%), and Grant Development Support (n=45, 52%). Conclusions: To our knowledge, this is the first study to catalog DIS programs and synthesize the learnings into a set of priorities and sustainment strategies to support DIS capacity building efforts. There is a need for formal certification, accessible options for learners in LMICs, opportunities for practitioners, and opportunities for mid- and later-stage researchers. Similarly, harmonized reporting and evaluation measures would facilitate targeted cross-program comparison and collaboration.