research article

Developing a Skilled Clinical Trials Workforce in Patient-Oriented Research: Impact of an Innovative Training Approach

Melanie King Rosario1*, Marilynne Hebert2, Michael D. Hill3, Dean Eurich4, Mari Boesen5

 

1Career Development Platform, Alberta SPOR SUPPORT Unit, Department of Community Health Sciences, Cumming School of Medicine, University of Calgary, Calgary AB, Canada

2Department of Community Health Sciences, Cumming School of Medicine, University of Calgary and Joint Lead, Career Development Platform, Alberta SPOR SUPPORT Unit, Calgary AB, Canada

3Departments of Clinical Neurosciences, Community Health Sciences, Medicine and Radiology, Cumming School of Medicine, University of Calgary, and Joint Lead, Pragmatic Clinical Trials Platform, Alberta SPOR SUPPORT Unit, Calgary AB, Canada

4School of Public Health, University of Alberta, and Joint Lead, Career Development Platform, Alberta SPOR SUPPORT Unit, Edmonton AB, Canada

5Department of Clinical Neurosciences, University of Calgary, Pragmatic Clinical Trials Platform, Alberta SPOR SUPPORT Unit, Calgary AB, Canada


*Corresponding author: Melanie King Rosario, Program Manager, Department of Community Health Sciences, Cumming School of Medicine, University of Calgary, TRW 3rd Floor, 3280 Hospital Dr NW, Calgary AB T2N 1N4, Canada. Tel: +1-403-220-7205; Email: mrosario@ucalgary.ca


Received Date: 21 July, 2018; Accepted Date: 10 August, 2018; Published Date: 17 August, 2018

Citation: King Rosario M, Hebert M, Hill MD, Eurich D, Boesen M (2018) Developing a Skilled Clinical Trials Workforce in Patient-Oriented Research: Impact of an Innovative Training Approach. Educ Res Appl: ERCA-155. DOI: 10.29011/2575-7032/100055

1.  Abstract

This paper presents the design, implementation, and evaluation of a blended learning program to develop research capacity in the clinical trials workforce. The purpose of the program was to advance knowledge and skills, provide practical applications, and integrate patient-oriented research into clinical trials. An advisory committee of multidisciplinary stakeholders informed the design, resulting in an 18-week, 56-hour accredited program. The initial program was implemented in January 2017, with evaluation conducted through online surveys after each module and semi-structured interviews at 4-6 months post program completion. Early impacts included increased knowledge, confidence, and efficiencies; changes in personal perspectives and attitudes; and amendments to studies already underway in learners’ work environments.

2. Keywords: Blended learning; Clinical trials education; Health research capacity development; Patient-oriented research; Professional development; Program evaluation

3. Abbreviations

AI: Alberta Innovates

CIHR: Canadian Institutes of Health Research

HRCD: Health Research Capacity Development

POR: Patient-Oriented Research

SPOR: Strategy for Patient-Oriented Research

4. Introduction 

4.1.  Patient-Oriented Research - Context

Over the past two decades there has been a growing global interest in supporting public, patient, and family collaboration in health research. In Canada, a national research funder, the Canadian Institutes of Health Research (CIHR) [1], developed and funded the Strategy for Patient-Oriented Research (SPOR). This was in response to the growing need to ensure that the public and patients were involved not merely as human study subjects, but as experts in the experience of their disease area, as partners in assessing relevance and priorities for research, and as key stakeholders and end users of new medications, treatments, and interventions. The overarching goal of SPOR is to increase the quantity and quality of patient-oriented research [1]. Through long-term integration of this new approach, the aim is to shift the health research culture to become more patient-oriented.

Several journals, such as the Canadian Medical Association Journal, now require contributors to provide evidence of their patient-oriented research approach in order to be eligible for publication [2]. CIHR also identified increased competitiveness in conducting clinical trials as a key goal for SPOR [1]. Applying a patient-oriented research approach to clinical trials reflects an increasing emphasis on patient engagement, partnership, priorities, and patient outcomes.

Education and training are common interventions to create change at the individual and team levels by developing a core of well-trained individuals to increase local capacity [3]. The larger environments where these individuals and teams work, such as organizations and networks, must be considered if changes at the individual and team levels are to have maximum impact and create sustainable change [3].

Change management theory suggests that organizations change in response to natural shifts in strategic priorities [4]. The shift in health research towards patient-oriented research and patient engagement may be a larger driver of culture change than any individual education or training program. Bottom-up approaches, such as training for front-line staff, are nonetheless an integral component of a much larger ecosystem in the health research environment and the overall health system.

One of the ways the Alberta SPOR SUPPORT Unit addressed capacity development for clinical trials was through the development of the Leadership in Patient-Oriented Research: Pragmatic Clinical Trials Certificate. Stakeholders were clear about a need for such a program; however, evaluation was required to determine if the impact and outcomes reflected the initiative’s purpose and objectives.

This paper provides an overview of the program design, learner impact, and outcomes, and discusses the findings in relation to current literature in health research capacity development. This is one of the few reports on research capacity development activities that focus on clinical trials and prioritize the learning and development of the workforce. This program is also the only known educational activity that bridges clinical trials work with patient-oriented research.

4.2.  Capacity Development 

A number of terms describe the endeavour to increase capacity in the health research environment, including capacity building, capacity development, and capacity strengthening. The definition of research capacity development for the purpose of this initiative is the “process of individual and institutional development which leads to higher levels of skills and greater ability to perform useful research” [5]. The goal of the capacity development program described here was to increase skill, knowledge, utilization of relevant resources, and integration of patient-oriented research into clinical trial work.

The Commission on Health Research for Development indicated that the “strengthening of expertise in research was one of the most powerful, cost-effective and sustainable means of advancing health and development” (1997, p. 265). Condell and Begley [6] considered education and training as key interventions to strengthen research expertise. These include training activities such as workshops on ethics, study design, and research governance [7]. Whitworth [8] described such programs as intended to ensure a research-ready workforce capable of conducting daily tasks and adhering to regulatory requirements. Other forms of capacity development include mentorship, funding support, and development of partnerships [8-10].

Many Health Research Capacity Development (HRCD) programs focus on developing capacity within the researcher, faculty, and graduate student audience [8,11-13], while few target other health and research professionals who are critical members of health research and clinical trials teams. The HRCD context also extends beyond the environment of traditional research institutions, as many have noted the importance of developing research capacity in areas such as primary care and public health [9], or in global and international health research, such as in low- and middle-income countries [10,14,15]. The overarching purpose of research capacity development programs, regardless of the audience or context, is to improve the ability to conduct research, use the results effectively, and create a strong infrastructure to continue this cycle [15].

4.3.  Learning Environment 

Professional development is typically offered through periodic activities and training that are often not context-driven [16]. Many local offerings for developing the skills and knowledge to conduct clinical trials are one-time workshops or two-day conference-style activities that focus on a particular topic, rather than creating long-term linkages between skills to develop a comprehensive knowledge base. Longer-term professional development is more likely to have an impact than shorter learning and training activities; learners have more opportunity for ‘hands-on’ development linked to their daily work and are therefore more likely to demonstrate enhanced knowledge and skills [17].

A longer-term cohort-based format was selected for this program to provide time and opportunity for collaborative learning, development of networks and relationships, and to ensure adequate time to reflect on learning for integration into practice. The health research capacity development program described here is designed to approach research staff professional development in a comprehensive format. This program design extends beyond workshops, conferences and self-study modules and uses a community approach where technology provides a space for collaborative inquiry that enhances learning [16]. 

4.4.  Measurement of HRCD 

The literature on health research capacity development documents the debate over the methods required to consistently measure the outcomes of HRCD. Cooke [9] developed a framework designed to plan and measure the progress of research capacity building activities. The identification of relevant and appropriate outcomes is a crucial component of this framework. To represent the full spectrum of impact, non-traditional measurements in HRCD may be more appropriate than traditional measurements of publications, conference presentations, and grant funds [9]. However, the value of Cooke’s [9] framework extends beyond individual educational outcomes to explore a variety of components of capacity building, such as linking practitioners to research, building networks and partnerships, and supporting health system and research infrastructure.

4.5.  Context 

There is currently no direct pathway leading to employment in clinical trials in Alberta and no common training activities to provide consistency in the knowledge, ability, and practical skills of clinical trial professionals across the province. Numerous short-term or single event training opportunities exist that focus on a particular topic or area in clinical trial conduct, particularly regulatory rules. Multiple provincial stakeholders identified the need to consolidate and expand this learning, incorporate patient-oriented research strategies, and provide opportunity to integrate practical applicability with a focus on Alberta-specific contexts. 

5. Materials and Methods 

5.1.  Design

A multidisciplinary advisory group was created to inform the design, content, format, and overall structure for the Certificate program. Membership in the advisory committee included clinical trial experts and leaders, education experts, research staff (including nurses, coordinators, and managers), patients, and university administrators. Committee members participated in numerous face-to-face meetings, teleconferences, and email interactions to provide input into program design and development. 

Essential content was arranged sequentially in the order one might encounter each topic in the real-world chronology of a clinical trial. To add value for learners, experts across the province contributed a lecture or presentation for each module, with subsequent interactive components or assignments that focused on the practical application of each module topic. Patient-oriented research principles and strategies were integrated into the content, providing learners the opportunity to explore how to partner with patients at various stages of clinical trial studies. Clinical trial efficiency, accuracy, skill building, practical application, and utilization of available resources were also critical elements of the program design. 

The final framework included 56 hours of accredited original content, seven external content sources, and 24 subject matter experts. A course coordinator handled logistics for the online learning management system and for the live and in-person sessions, learner and speaker communications, and tracking of assignments and progress. Content experts were hired as moderators to follow the cohort through the duration of the program; they acted as resources in their topic areas, providing continuity in addressing questions, supplying additional content and resources, and offering feedback on assignments. The initial cohort included two moderators: one with expertise in clinical trials and one with expertise in patient-oriented research and patient engagement.

Learners were invited to participate in the initial cohort at no fee in return for completion of comprehensive online surveys each week, a more in-depth program evaluation at the mid-way and final stages of the program, and a semi-structured interview at 3-4 months post completion of the program. 

5.2.  Setting 

Learners for the program were located throughout the province, which required an accessible delivery format. A blended learning format was chosen for program delivery, with in-person sessions to begin and end the program and weekly online modules in between. Online modules included a combination of synchronous and asynchronous sessions, so that learners had regular opportunities to continue to learn together in real time while at other times completing the work at their own convenience. Online learning has demonstrated numerous advantages in overcoming geographic barriers while still providing options for interactivity and learner autonomy and control [18].

5.3.  Learner Demographics 

Learners were recruited through two sources for the initial cohort. First, advisory committee members were asked to recommend individuals who would be able to provide strong feedback. Second, information sessions were held at the two major universities in the province, and clinical trials staff were invited to attend. All learners completed an application form to gather information on their current role, level of experience, types of trials and studies experienced, and expectations for the program. Only learners with some experience or context for clinical trials were considered capable of providing meaningful feedback on the content in relation to their daily work and responsibilities.

5.4.  Data Collection 

5.4.1. Surveys 

Through weekly online evaluations, learners provided feedback on content quality, presenter efficacy, and alignment with learning objectives. A comprehensive mid-way online survey addressed program-level questions related to the timing and sequence of modules, applicability of content to individual roles and responsibilities, linkages across modules and assignments, usability of the online learning tools, and feasibility of the workload.

Evaluations included a combination of ordinal questions, where respondents selected a rating for each item, as well as open-ended questions. Following the final in-person day, learners completed an overall program evaluation to assess program value and early outcomes, such as changes in tasks or knowledge as a result of incorporating POR into their clinical trials work. 

5.4.2. Semi-structured Interviews 

Four months after program completion, semi-structured interviews were completed to determine any additional impact of the program on daily work, knowledge, skills, or perspectives. A research assistant conducted, recorded, and transcribed the interviews using an interview guide developed by MKR and MH. 

5.5.  Analysis 

Online survey responses were collected and compiled using SurveyMonkey, including quantitative feedback as well as qualitative responses from open-ended questions. This information was compiled on an ongoing basis to make program adjustments that met learner needs and to report early learner impact.

At the completion of all semi-structured interviews, the first author (MKR) reviewed all interview transcripts, consulting audio recordings as needed. The qualitative analysis involved grouping interview data into categories and themes that reflected learners’ experiences. These categories were compared with the program learning objectives to determine whether the objectives had been met. The analysis also identified concrete examples of the impact of the learning and its application on the work environment. Excerpts from learner interviews are presented below, using ‘he’ and ‘she’ randomly to ensure anonymity.
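
To make this analysis workflow concrete, the short Python sketch below illustrates one way ordinal survey ratings can be summarized and coded interview excerpts grouped into themes for comparison against learning objectives. It is a minimal illustration only; the module names, theme labels, and data are hypothetical placeholders and do not represent the actual study data or tools.

# Minimal sketch of the analysis workflow; all data and labels below are
# hypothetical placeholders, not the study's actual survey or interview data.
from collections import Counter, defaultdict
from statistics import mean

# Ordinal module ratings exported from the online survey tool (1 = poor ... 5 = excellent)
module_ratings = {
    "Regulatory requirements": [4, 5, 4, 3, 5],
    "Budget planning": [5, 4, 4, 4, 3],
}
for module, ratings in module_ratings.items():
    print(f"{module}: mean = {mean(ratings):.1f}, distribution = {dict(Counter(ratings))}")

# Interview excerpts tagged with analyst-assigned themes during transcript review
coded_excerpts = [
    {"participant": "P05", "theme": "Increased knowledge and confidence"},
    {"participant": "P02", "theme": "Changes to study processes"},
    {"participant": "P14", "theme": "Integration of patient-oriented research"},
]
by_theme = defaultdict(list)
for excerpt in coded_excerpts:
    by_theme[excerpt["theme"]].append(excerpt["participant"])

# Compare emergent themes against stated program learning objectives
learning_objectives = {
    "Increased knowledge and confidence",
    "Integration of patient-oriented research",
}
for theme, participants in by_theme.items():
    status = "maps to a learning objective" if theme in learning_objectives else "emergent theme"
    print(f"{theme} ({status}): {', '.join(participants)}")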

6. Results

Twenty-two of twenty-four learners (91.7%) completed the 56-hour Certificate program. Learners were clinical trials staff employed by one of the province’s major universities, the provincial health system, or independent/external research organizations. The fields of study and departments represented included paediatrics, mental health, neuroscience, oncology, cardiovascular, respiratory, and surgery.

6.1. Survey

At the conclusion of the program, 19 of 22 learners indicated their participation had already had an impact on their work routines through improvements in documentation, closer adherence to regulatory requirements, audit preparation, budgetary considerations, and ethics applications. In addition, 15 learners had made changes related to patient-oriented research; for example, they identified obtaining patient feedback and recruiting patient collaborators to inform aspects of the study in areas such as informed consent and recruitment strategies. Respondents who had not yet incorporated any patient-oriented principles or strategies into their work reported discussions with their team, manager, or principal investigator related to these topics.
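
The proportions reported in this section follow directly from the raw counts; the minimal Python snippet below simply reproduces that arithmetic, expressing each count relative to the 22 learners who completed the program (the denominators are shown here for illustration only).

# Reproduces the simple proportions behind the Results section; counts are taken
# from the text, and denominators relative to the 22 completers are illustrative.
enrolled, completed = 24, 22
impact_on_routines = 19   # learners reporting changes to work routines
por_changes = 15          # learners reporting patient-oriented research changes

print(f"Completion rate: {completed / enrolled:.1%}")                       # 91.7%
print(f"Impact on work routines: {impact_on_routines / completed:.1%}")     # 86.4%
print(f"Patient-oriented research changes: {por_changes / completed:.1%}")  # 68.2%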

6.2. Semi-structured Interviews

Fourteen learners participated in individual interviews, conducted four months after successful completion of the program. Twelve of the fourteen stated that when they first began working in clinical trials their primary sources of information and training were informal. They described their training as “on the job learning,” including asking colleagues, trial and error, and even ‘sink or swim.’ These responses supported the need for the program, as it was clear the current workforce was not learning its roles and responsibilities in a formalized, comprehensive, or consistent manner.

6.2.1. Increased trial knowledge and skills

Given the learners’ experiences in conducting clinical trials (Figure 1), the program developers anticipated that learners may be familiar with some of the material, particularly topics related to clinical trial conduct. However, almost every participant in the interview process described a “sharpening of their skills” (P07), an “increase in knowledge” (P05), or that the program “reinforced and solidified my ability to do my work” (P08).

Learners stated they had gained confidence in their skills and ability to conduct clinical trials work:

· I am definitely more competent in what is needed, whereas before the course I kind of had a rough idea but would need a lot of guidance; I am definitely better because of it (P04).

· Confidence was also expressed in terms of becoming more efficient now… accurate and efficient (P05).

· Increased motivation through enhancing my skills and finding a renewed enthusiasm for my work (P07).

· A transition in ability from novice to a leadership role after completing the program; he became the person on the team others go to with questions (P10).

6.2.2. Ability to apply knowledge in practice 

Building competencies to expand a skilled workforce made it imperative that the program structure bridged knowledge of clinical trial requirements with the ability to apply that knowledge to work activities. Modules focused on topics such as creating a regulatory binder, planning a budget, recruiting participants, and designing informed consent processes. 

Learners reflected on the impact of this design: 

Pragmatic and solid examples on how to do it rather than talking at a broader level was helpful… doing study close-out, writing budgets, all those sorts of things that were very hands on, each one was helpful (P09). 

Learners also described being more thorough on the job (P07), changing study processes (P02), and creating a framework for documentation that we now use in every study we run (P01).  

6.2.3. Knowledge and integration of patient-oriented research into clinical trials

Patient-oriented research was a new topic for many learners, and initial reactions to the material, reflected in early program surveys, included some reticence and anticipation of barriers and challenges in bringing it to the workplace. Learners noted patient-oriented research was not something I had been learning on the job, it was all new (P08).

Learners who had some knowledge of patient-oriented research when entering the program found:

Even more now I am trying to build patient-oriented strategies into our work, not to just involve them but to involve them in a meaningful way (P03). 

Upon program completion they were not only aware of it but felt they had the knowledge to try and implement it (P01) and were using strategies they wouldn’t have used before… but are now tailored towards patient-oriented research from the beginning (P09). During the interviews learners noted the program added valuable practical skills and strategies (P05) for implementation, including:              

Various ways to potentially integrate patients that made you think more about it within the context of what you do (P12).

These approaches enabled learners to speak with their teams and recommend patient-oriented strategies: 

I am able to suggest to the team that we incorporate patient input and now we’re really talking about that… we’re not there yet, but it’s a start (P14). 

A few learners took on the challenge of incorporating patient-oriented research and were successful in changing the way their study proceeded. Two learners described conversations with their Principal Investigators (PIs) and trial managers on how to incorporate patient-oriented research. One learner’s enthusiasm convinced the study PI to pursue patient-oriented research strategies, and together they re-developed the study design and completed ethics and grant applications that incorporated this new approach. They were now actually doing patient-oriented research (P02).

On the other hand, some learners described the intent to pursue patient-oriented research in the future but were not yet ready to explore its integration:

I’m more aware of patient-oriented research, but I don’t think there’s as much opportunity to apply it in my team just yet (P11).

(I am) now looking at ways to solicit more feedback from patients (P08).

6.2.4. The Bigger Picture 

Learners reported significant impact of the program on their thought processes, perspectives, and attitudes about both clinical trials and patient-oriented research. Learners described changes in their personal perspectives compared to prior to the program: 

I was looking at the research as a job, but now I consider myself more as a server to the community and to society; it [the program] changed the way I look at the patient and how they are involved in the research (P05). 

Reset my mindset of what we’re setting out to do and the altruism of doing clinical trials (P06). 

(Made me) more aware of the importance of my job and the bigger picture (P02). 

They also reported an awareness of the bigger picture; that is, seeing what is being done outside of their own studies and progress in the field: 

It gives you an idea of what’s going on outside your own team and what the possibilities might be (P04). 

These changes in perspective also influenced the learners’ team members. One learner described her nursing background and previously unsuccessful attempts to make changes in her work based on a personal philosophy of valuing patient experience and input:

Before this program I was basically shut down when I made suggestions to involve patients, but now I have this concrete knowledge of the concept of patient-oriented research and I can recommend ways of how we can make these changes, and I’m seeing a different response (P14).

Many of the experienced learners were also involved in the onboarding and training of new staff and described changes in the way they oriented new team members to the work: 

It’s different now when orientating staff, a slightly different way of thinking; not just ‘we’re trying to create new medicines’ or something, but we’re trying to create better involvement with the patient and community and that mindset is spreading (P12). 

I feel that I can share this with people who are just starting out and now they will have this perspective to guide them as they learn their job (P04).

Learner reflections on impact tended to be positive, as reflected in the results above. This may be due, in part, to the likelihood that those who agreed to participate in the interview were those with stories to share about the impact of the program on their work. Many learners also provided insightful critiques of the content, workload, and sequence of modules, and gave recommendations, such as that more examples or case studies would have been helpful (P01) or that live sessions in the evening were inconvenient (P05). These comments were carefully reviewed and contributed to numerous program amendments prior to admitting the next cohort.

7. Discussion

National and provincial research funding was the impetus for this initiative. Conceptually, the overall expectation was to develop capacity in health research, specifically POR; however, the practical aspects of program implementation and impact assessment were largely undefined. The funding supported development and delivery of the program and subsidized learner registration fees to reduce barriers to participation. Clinical trials staff were not typically provided with the professional development funds that may be available to other audiences, such as graduate students, clinicians, and faculty. Whether this program could be supported on a cost-recovery model through learner registration fees remains unclear. Other sources of funding, such as university and industry grants, are being pursued.

A group of learners with diverse levels of experience in clinical trial management completed this initial certificate program and contributed to an in-depth understanding of its impact. A number of barriers were identified for learners attempting to integrate patient-oriented research into their work within the timeline of the initial program. Learners were all involved in ongoing trials at various stages of the research process, and once the study design had been set, many reported difficulty in suggesting changes or amendments.

Industry-sponsored trials were thought to be much more difficult to change, as the study team often had little or no control over the details of the research. As organizations, industries, and sponsors begin to realize the significance of patient-oriented research, it may become more feasible for future learners to contribute to these changes. Learners who reported success in approaching study investigators or sponsors with suggested changes credited two factors with moving this forward. First, knowledge of patient-oriented research paired with concrete examples or plans for how to approach it was more convincing to decision-makers than abstract ideas. Second, an openness to considering these changes was paramount; a culture shift in the health research and clinical trials environment is a significant contributor to the willingness of decision-makers to consider this change.

Given the geographic spread of intended learners, the program adopted a blended learning approach, which addressed this potential challenge. In assessing program impact, learners’ comments reflected the value of this delivery model: they valued the opportunity to meet and connect in person while appreciating the large amount of content delivered online. The online format was critical to ensuring this provincial initiative was accessible to learners across the province. It also allowed a wide variety of provincial experts to be involved as guest speakers for weekly modules, as they could log in and share their knowledge from virtually any location.

As Cooke [9] described, research capacity development endeavours are largely acknowledged to occur at different levels, each with their own role in changing practice and creating impact within their respective environment. Subsequent work in this area has suggested frameworks to assess, plan, implement, and evaluate the various research development activities at each level [19]. According to Huber et al. [19], individual and team levels of impact are demonstrated through knowledge, skills, attitudes, and applying good practice. While impact of the initial training program was reported primarily at the individual level, many learners’ stories have transcended their own experience and affected the teams in which they work. Learner reports of collaborations with their principal investigator or industry sponsor, proposing changes to study design, and implementing these changes to be “more patient-oriented” suggest impact beyond the individual learning that took place.

The evidence of learning and workplace impact described in this paper is indicative of the program’s efficacy. While the program has been considered a resounding success, the data collected from the initial group of learners are potentially insufficient to explain how or why it was a success. Some of this success has been attributed to the initial design work and the meaningful engagement of key multidisciplinary stakeholders in informing the program. This aligns with the claim of Huenneke et al. [11] that involving key leadership in an advisory committee is essential to obtaining buy-in and creating support for the program and learners. The program’s success paves the way for additional capacity development initiatives that will bridge POR knowledge into other health research areas, such as big data management and knowledge translation.

Early individual impact results have been largely positive, yet the challenge of long-term assessment must be addressed. Long-term impact evaluation will require a larger alumni group, but it will also require strategic design to determine how to accurately demonstrate the kinds of outcomes that may result. Potential avenues include assessing changes in the time taken to complete studies; comparing reported issues, such as the number of revisions to ethics applications or the time taken to complete audits; and tracking staff retention and long-term uptake of patient-oriented strategies. Due to variation in institutions and organizations across the province, as well as potential confounding factors such as availability of funding, regulatory changes, and other external forces, it may be difficult to isolate the program effect within each local health research environment.

7.1.  Limitations of the Study

Pre-testing would have provided a more concrete measurement of individual increases in knowledge, which could then have been compared with a final evaluation using the same questions.

The evaluation was prepared and compiled by program staff, which may have introduced bias into the interpretation of results.

8. Conclusion

Capacity building is widely described as encompassing more than educational initiatives, including elements such as funding and partnerships. However, this program evaluation supports the assertion that educational initiatives are a key component and can create impact at all levels of change. The program’s successes extended beyond individual learning benefits, with early impacts reaching the teams, institutions, and organizations in which learners work.

The development of well-trained individuals is recognized as a key priority in ensuring local capacity can meet the demands of health research [3]. The results shown here demonstrate that this type of program can effectively develop capacity in the targeted workforce. Learners made a variety of changes in practice upon completion of the 56-hour program in areas related to both clinical trials conduct and patient-oriented research.

Long-term assessment of the impact of this program should be pursued, despite its inherent challenges. Evaluation is difficult due to the contextual nature of programs, and it may be unrealistic to isolate an individual program’s contribution to outcomes that are also influenced by a wide variety of external factors. Further research should explore how such long-term evaluation can be conducted.

9. Acknowledgements 

The authors gratefully acknowledge funding support from the Canadian Institutes of Health Research and Alberta Innovates.

10. Conflict of Interest

The authors declare no conflict of interest. 

11. Ethical Considerations 

Quality improvement or program evaluation initiatives, which are understood to be those things relating to the assessment, management or improvement of a local program, are exempt from ethics review (Article 2.5, TCPS2). 

The work reported here was designed as program evaluation and, as such, was not subject to ethics approval requirements. Learners participated voluntarily in the program. Verbal consent was obtained prior to the semi-structured interviews, including specific consent for audio recording. Submission of anonymous online surveys was considered voluntary consent.


Figure 1: Clinical trials experience of initial cohort learners.


  1. Canadian Institutes of Health Research (2017) Strategy for Patient-Oriented Research. Ottawa: Canadian Institutes of Health Research.
  2. Patrick K (2016) Realizing the vision of patient-relevant clinical research. Canadian Medical Association Journal 188: 1063.
  3. Crisp BR, Swerissen H, Duckett SJ (2000) Four approaches to capacity building in health: Consequences for measurement and accountability. Health Promotion International 15: 99-107.
  4. Al-Haddad S, Kotnour T (2015) Integrating the organizational change literature: A model for successful change. Journal of Organizational Change Management 28: 234-262.
  5. Trostle J (1992) Research capacity building in international health: Definitions, evaluations and strategies for success. Social Science & Medicine 35: 1321-1324.
  6. Condell SL, Begley C (2007) Capacity building: A concept analysis of the term applied to research. International Journal of Nursing Practice 13: 268-275.
  7. Gee M, Cooke J (2018) How do NHS organisations plan research capacity development? Strategies, strengths, and opportunities for improvement. BMC Health Services Research 18: 198.
  8. Whitworth A, Haining S, Stringer H (2012) Enhancing research capacity across healthcare and higher education sectors: Development and evaluation of an integrated model. BMC Health Services Research 12: 287-287.
  9. Cooke J (2005) A framework to evaluate research capacity building in health care. BMC Family Practice 6: 44-55.
  10. Mahmood S, Hort K, Ahmed S, Salam M, Cravioto A (2011) Strategies for capacity building for health research in Bangladesh: Role of core funding and a common monitoring and evaluation framework. Health Research Policy and Systems 9: 31.
  11. Huenneke LF, Stearns DM, Martinez JD, Laurila K (2017) Key strategies for building research capacity of university faculty members. Innovative Higher Education 42: 421-435.
  12. Ramkalawan T, Dieppe P (2008) Research capacity development and training. Journal of Health Services Research & Policy 13(3-suppl): 6-11.
  13. Golenko X, Pager S, Holden L (2012) A thematic analysis of the role of the organisation in building allied health research capacity: A senior managers’ perspective. BMC Health Services Research 12: 276-376.
  14. Boyd A, Cole DC, Cho D, Aslanyan G, Bates I (2013) Frameworks for evaluating health research capacity strengthening: A qualitative study. Health Research Policy and Systems 11: 46.
  15. Bates I, Akoto AYO, Ansong D, Karikari P, Bedu-Addo G, et al. (2006) Evaluating health research capacity building: An evidence-based tool. PLoS Medicine 3: e299.
  16. Lock JV (2006) A new image: Online communities to facilitate teacher professional development. Journal of Technology and Teacher Education 14: 663-678.
  17. Garet MS, Porter AC, Desimone L, Birman BF, Yoon KS (2001) What makes professional development effective? Results from a national sample of teachers. American Educational Research Journal 38:  915-945.
  18. Byrne E, Donaldson L, Manda-Taylor L, Brugha R, Matthews A, et al. (2016) The use of technology enhanced learning in health research capacity development: Lessons from a cross country research partnership. Globalization and Health 12: 19.
  19. Huber J, Nepal S, Bauer D, Wessels I, Fischer M, et al. (2015) Tools and instruments for needs assessment, monitoring and evaluation of health research capacity development activities at the individual and organizational level: A systematic review. Health Research Policy and Systems 13: 80.

© by the Authors & Gavin Publishers. This is an Open Access journal article published under the Creative Commons Attribution-Share Alike 4.0 International License (CC BY-SA). With this license, readers may share, distribute, and download the work, even commercially, as long as the original source is properly cited.
