Case study
Open Access

Piloting an Extracurricular Quality Improvement and Patient Safety Training Program for Interprofessional Learners

Akshay Rajaram[1][a], Mialynn Lee-Daigle[1], Malcolm Eade[1], Anna Curry[1], Robert Connelly[1], Roy Ilan[2]

Institution: 1. Queen's University, 2. Faculty of Medicine, Technion
Corresponding Author: Dr Akshay Rajaram ([email protected])
Categories: Educational Strategies, Learning Outcomes/Competency, Students/Trainees, Postgraduate (including Speciality Training), Undergraduate/Graduate
Published Date: 19/11/2019


Experts recommend that health professions students acquire knowledge and skills in quality improvement and patient safety. Educational initiatives exist, but involve minimal interprofessional contact and experiential learning. We piloted an extracurricular program combining didactic elements and projects to address these issues.

We collected demographic information and administered a post-program survey to assess the pilot’s reach and impact. We analyzed responses using simple descriptive statistics and thematically analyzed unstructured feedback.

Fifty-one students participated, including twenty-one (41%) undergraduate students, sixteen (31%) graduate students, and fourteen (27%) medical students. Nineteen (37%) participants responded to the survey. Qualitatively, themes around workshop effectiveness, program administration, project-student mismatch, and engagement and accountability emerged.

Despite limited response rates, our training program appeared to be well received. However, key issues of engagement and impact remain. Future efforts will focus on improvement in these areas and more rigorous evaluation of learning outcomes.

Keywords: Interprofessional; Quality Improvement; Patient Safety; Health Professions Education


Introduction

The Health Professions Education: A Bridge to Quality report advocated that all students and working health professionals develop and maintain proficiency in delivering patient-centred care, working in interdisciplinary teams, practicing evidence-based medicine, focusing on quality improvement, and using information technology (G. Armstrong et al., 2012). To this end, experts recommend that patient safety (PS) and quality improvement (QI) be taught to healthcare professionals, and that competencies in these areas complement clinical knowledge and skills in the delivery of safe and high-quality patient care (Wong et al., 2010).


A number of reviews describing various PS and QI curricula and their impacts on health professions students have emerged over the last decade (Wong et al., 2010; L. Armstrong, Shepherd and Harris, 2017). These reviews have found that most curricula combine didactic and experiential learning to address topics in QI, root cause analysis and systems thinking, and examine changes in student experience and knowledge (Wong et al., 2010; L. Armstrong, Shepherd and Harris, 2017). The majority of studies involved nursing or medical students, and, of the curricula focused exclusively on medical students, the majority involved fewer than 10 contact hours and did not include learners in PS or QI projects (Wong et al., 2010; L. Armstrong, Shepherd and Harris, 2017). 


Other published initiatives have included other health professions, including nutrition, pharmacy, and occupational and physical therapy students (Galt et al., 2006; Dobson et al., 2009). These programs tended to use lectures, readings and assignments to convey key concepts in QI theory, patient safety, and the disclosure and prevention of errors through system improvement (Galt et al., 2006; Dobson et al., 2009). However, these curricula were limited in duration, offered minimal interprofessional contact, and infrequently offered experiential opportunities to learn and practice QI (Galt et al., 2006; Dobson et al., 2009). 


Given these limitations, more recent work has focused on identifying barriers to curricular integration of QI and patient safety. Faculty highlighted competing demands in already overcrowded undergraduate curricula, insufficient numbers of faculty members and clinical preceptors with adequate expertise to teach and mentor, and the limited utility of the classroom setting to provide an authentic learning experience (Tregunno et al., 2014). Other barriers include insufficient time or budget, stakeholder resistance and hierarchy (de Vries-Erich et al., 2017).



To address these gaps and overcome these barriers, we developed an extracurricular initiative, the Quality Improvement Practical Experience Program (QIPEP), using the general principles of educational experiences in healthcare as a blueprint: combining didactic learning and project-based work, linking projects with health system improvement efforts, and assessing outcomes (G. Armstrong et al., 2012). We describe the structure of the pilot program, provide an overview of the demographics of the first cohort, and report on participants’ attitudes and satisfaction. In this way, we draw key lessons to help others create and launch their own similar educational initiatives.


Methods

This study was approved by the Kingston Health Sciences Centre Research Ethics Board (PAED-440-18).


Participants, Eligibility, and Selection

QIPEP spanned the academic year (September to April). All full-time Queen’s University undergraduate and graduate students were eligible to apply. Formal recruitment ran from April to June, with students submitting an electronic application form. Applications were scored by QIPEP leadership using a rubric (we would be pleased to share both with interested readers), and admission offers were made in May.


Program Components

Figure 1 provides an overview of the program and its components.


Figure 1: Pilot QI Practical Experience Program 2016-2017 Structure and Timeline


Didactic Learning

Participants completed the Institute for Healthcare Improvement (IHI) Basic Certificate in Quality and Safety in preparation for their projects (Certificates and Continuing Education, 2018). To complement the IHI modules, QIPEP offered a series of workshops focused on the practical skills students needed to apply the theoretical information. Five workshops were piloted during the year: Basics of Project Management, Process Mapping and Analysis, Data Collection and Analysis in Excel, Slide Deck Development, and Poster Development and Writing a Final Report.



Projects

Projects aligned with or built on existing healthcare improvement efforts at university-affiliated teaching hospitals. During participant recruitment, the QIPEP leadership team identified clinicians with expertise in PS and/or QI to serve as “faculty supervisors.” Prior to the official start of the program in September, participants were matched to projects and connected with their supervisors to gain an understanding of the nature and scope of the problem.


Students began their work in September with a focus on the “Plan” stage of the “Plan-Do-Study-Act” (PDSA) cycle. The “Plan” stage ran until December and gave teams time to refine the identified problem and proposed intervention, submit an ethics application, develop a formal project charter, review the relevant literature, and collect baseline data. Between January and February, the teams aimed to conduct two PDSA cycles. During this time, teams implemented their tests of change, continued to collect, analyze and interpret data, and made recommendations based on their findings. By the end of March, teams concluded any outstanding PDSA work and wrote their final reports. Teams were also encouraged to submit an abstract to relevant conferences to share their work and results with the larger PS and QI community.


Data Collection

We collected demographic information, including gender, primary degree, program, and year of study, from participants’ applications to the program. A structured survey comprising Likert-type and free-text items was emailed to all participants following the conclusion of the program (Appendix 1).



Data Analysis

We analyzed responses to Likert-based questions using simple descriptive statistics. We compiled the unstructured feedback and analyzed responses using a constant comparative approach (Glaser and Strauss, 2006). AR and MLD independently coded responses and then worked iteratively to develop a final set of themes.
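For readers unfamiliar with this kind of summary, the descriptive statistics reported below amount to tallying each Likert option and computing percentages, often collapsing "agree" and "strongly agree" into a single figure. The following sketch illustrates that calculation; the responses listed are hypothetical and are not data from this study.

```python
from collections import Counter

# Hypothetical Likert responses for a single survey item (illustration only)
responses = [
    "strongly agree", "agree", "agree", "neutral", "agree",
    "strongly agree", "disagree", "agree", "neutral", "agree",
]

counts = Counter(responses)
n = len(responses)

# Count and percentage for each response option
for option in ["strongly disagree", "disagree", "neutral", "agree", "strongly agree"]:
    pct = 100 * counts.get(option, 0) / n
    print(f"{option:>17}: {counts.get(option, 0):2d} ({pct:.0f}%)")

# Collapsed "agreed or strongly agreed" figure, the form used in the Results
agree = counts["agree"] + counts["strongly agree"]
print(f"Agree or strongly agree: {agree}/{n} ({100 * agree / n:.0f}%)")
```

This is the arithmetic behind statements such as "Fifteen (79%) respondents agreed or strongly agreed"; no inferential statistics are involved.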



Results

The program received a total of 80 applications. Fifty-one students were accepted and matched to seven projects. Thirty-eight (75%) participants were female and thirteen (25%) were male. Twenty-one (41%) participants were undergraduate students, sixteen (31%) were graduate students, and fourteen (27%) were medical students. Table 1 provides a breakdown of participants’ academic programs. Five (14%) undergraduate students were in their first year, fifteen (43%) in their second year, six (17%) in their third year, and eight (23%) in their fourth year. Table 2 describes the projects and their outcomes at the end of the year. Three projects concluded at the end of the 2016-17 year and four projects continued into the following academic year (2017-18).


Table 1: Breakdown of participants’ academic programs, by primary degree

Academic programs represented:

  • Arts & Science
  • Biomedical Computing
  • Biological and Medical Science
  • MSc – Epidemiology
  • MSc – Occupational Therapy
  • Master of Public Health
  • MSc – Chemistry
  • MSc – Clinical Psychology
  • MSc – Neuroscience
  • MSc – Physical Therapy
  • PhD – Clinical Psychology
  • PhD – Economics


Table 2: Description of QI Practical Experience Program Projects and Outcomes for 2016-17

Project: Overutilization of non-essential blood tests in the Intensive Care Unit (ICU)
Description: This initiative sought to understand why and how non-essential tests were being ordered in the ICU and to identify opportunities to curtail unnecessary testing.
Outcome: This project successfully concluded at the end of the year.

Project: Assessing pain, agitation and delirium (PAD) in Critical Care
Description: This project focused on staff behavioural change associated with pre-printed order sets designed to improve the capture of PAD data in the ICU.
Outcome: This project successfully concluded at the end of the year and results were presented at a national nursing student conference.

Project: Improving the transition of Neonatal Intensive Care Unit (NICU) infants to the community
Description: This project fostered the development of a standardized discharge form to promote effective communication between NICU and community physicians.
Outcome: This project concluded at the end of the year due to unforeseen administrative issues.

Project: Interdisciplinary Lung Cancer Clinic (ILCC) with respirologists and oncologists
Description: An ILCC was piloted to decrease time to treatment and reduce clinic visits for patients with a new diagnosis of lung cancer.
Outcome: This project continued during the 2017/18 year, but preliminary results were presented at a national conference.

Project: Improving patient flow and education in the Lung Diagnostic Assessment Program (LDAP) Clinic
Description: This project established baseline patient flow by process and value stream mapping of a typical patient encounter in the LDAP clinic and used these data to develop QI initiatives to improve patient flow and education.
Outcome: This project continued during the 2017/18 year.

Project: Improving the discharge summary process for rehabilitation services
Description: This project examined ways to reduce the time taken by medical staff to prepare, sign, and disseminate summaries for discharged patients.
Outcome: This project continued during the 2017/18 year.

Project: Improving the discharge transitions process for rehabilitation services
Description: This project involved the analysis of factors contributing to suboptimal communication in discharge planning and the development of a patient-oriented transition sheet to convey important information.
Outcome: This project continued during the 2017/18 year.


Survey Responses

Nineteen (37%) participants responded to the structured survey. Fifteen (79%) respondents agreed or strongly agreed that they had changed their way of thinking about roles and responsibilities within the healthcare system. Seventeen (89%) agreed or strongly agreed that they identified gaps in the healthcare system through completion of their QIPEP project. Eighteen (95%) agreed or strongly agreed that QI has the capability to improve overall healthcare, and seventeen (89%) agreed that they wanted to learn more about QI.


Thirteen (68%) agreed or strongly agreed that they would be able to apply the skills and knowledge learned from the program in their career. Six (32%) agreed or strongly agreed that they often felt disengaged from the project and two (11%) agreed or strongly agreed that the program was too time-consuming. Six (32%) agreed or strongly agreed that they were satisfied with the impact of their project. Fourteen (74%) agreed or strongly agreed that they would recommend participation in the program to friends and colleagues.


Survey Themes

Five themes emerged from the unstructured feedback. We grouped these into strengths (two themes) and opportunities for improvement (three themes). Strengths included the effectiveness and utility of the workshops as tools for facilitating learning and ensuring accountability, and robust program administration and communication. Opportunities for improvement included a mismatch between project needs and assigned students, maintaining participant engagement and accountability, and a protracted research ethics board (REB) approval process.


Effectiveness and utility of workshops

Respondents commented that, as a learning tool, the workshops were informative and well organized, presented in a logical manner, and complemented their projects. Multiple participants also noted that the workshops provided a dedicated time and place for teams to convene and “work together in a more accountable way.”


Robust program administration and communication 

Participants highlighted that there was strong communication between teams and site managers about projects and workshops. They also commented that the site managers were quick to address all of the administrative tasks at the very start of the program. 


Project-student mismatch

According to respondents, the mismatch between project needs and assigned individuals took many forms. Some students commented that projects were overstaffed (the average project had seven students) and that there was not enough work to go around while others cited a discordance between project needs and participants’ schedules or skills. Students also commented about the mismatch between project topics and their interests. 


Maintaining participant engagement and accountability

The disengagement of participants and the lack of accountability within certain teams was another theme arising in the responses. One graduate student commented, “I found it difficult to lead a team of undergraduate students who were not as committed to the project as I had originally hoped, and found myself completing a lot of the project myself.” Others discussed the need for improved communication within teams as well as greater faculty input and supervision as suggestions for improvement in this area.


Protracted REB Approval Process

In the context of an already compressed timeline, many participants highlighted that the long REB approval process detracted from their ability to carry out key aspects of their projects, including PDSA cycles. After raising these issues, students suggested two solutions: accepting only projects with prior ethics approval, or securing approval before the start of the program.


Discussion

We developed and launched a yearlong extracurricular initiative, QIPEP, using a combination of didactic and experiential elements to provide interprofessional students with fundamental knowledge and skills in QI and PS. Our pilot involved over 50 students representing more than 15 different academic programs. Generally, the program was well received, with participants reporting a desire to learn more about QI and a willingness to recommend the program to peers.


However, through the quantitative and qualitative responses, two key areas of focus for the future emerged. The first is engagement. We infer that the high level of participant disengagement during the pilot was likely the result of a mismatch between project needs and the number of assigned students or their skills, a lack of communication between team members, or some combination of the two. Although the workshops were designed to fill gaps in practical skills (e.g., data collection and analysis) and provide a regular forum for teams to meet, we recognize they were insufficient to address these issues. With this in mind, we initiated a plan for the following year to consult faculty supervisors early in the summer to assess the human resource needs of their projects and align them with the availability and schedules of incoming participants.


Impact is another area of focus. Due to the compressed timeline of the program, achieving milestones is critical. At our institution, QI projects must receive ethics approval, which can sometimes take weeks. Such lengthy approval processes diminished teams’ time to move forward with their projects, including collecting and analyzing data, proposing changes, and testing them. In contrast, some teams were unaffected by ethics-related delays, creating disparities in what groups were able to accomplish during the program. To ensure that all projects start on the same date and that teams have enough time to complete PDSA cycles, we are working with faculty supervisors to secure REB approval in the summer before the program officially begins.


Limitations & Future Directions

There were a number of limitations to our approach. First, we did not employ a formal program evaluation to identify changes in commonly assessed educational outcomes, including knowledge and skill acquisition. Our assessment of attitudes and satisfaction at the end of the program was also informal and leveraged a homegrown questionnaire. Linking participants’ application submissions with their post-program responses may highlight potential differences of interest. Future work should also incorporate mixed-methods approaches, including testing of knowledge and skill acquisition using validated instruments like the QI Knowledge Application Tool-Revised (Singh et al., 2014), and examine the perspectives of faculty supervisors using semi-structured interviews.

Second, our findings may not be readily applicable to other settings. Although nursing students have been the target of many curricular efforts (L. Armstrong, Shepherd and Harris, 2017), only 5% of participants in our pilot were nursing students. We hypothesize that the reasons for this low level of nursing engagement are multifactorial, with barriers including busy academic schedules and limited numbers of nursing faculty with PS and QI expertise available to mentor students and promote this work. The low response rate also limits the internal validity of our findings; however, the responses received provided important insights. Our post-program survey was sent to participants a few weeks after the final workshop and was not mandatory. We also acknowledge that such surveys are prone to selection and recall bias. Given the size of the overall program, future program evaluation may mitigate these issues by mandating in-person survey completion at key milestones during the year or at the program’s last workshop or event.


Conclusion

We piloted the QI Practical Experience Program, an extracurricular, interprofessional initiative designed to equip students with basic PS and QI knowledge and skills. Although the program was well received by students, key issues of engagement and impact remain. Future efforts will focus on improvement in these areas and more rigorous evaluation of learning outcomes.

Take Home Messages

  • We piloted an interprofessional training program in quality improvement and patient safety that was well received by participating undergraduate and graduate students. 
  • Maintaining participant engagement is critical and future work will focus on ensuring congruency between project needs and the number of assigned students and better intra-project communication. 
  • Students want their work to have an impact; to this end, future work will ensure that projects start at the beginning of the academic year to give participants sufficient time to evaluate the problem and test solutions. 

Notes On Contributors

Akshay Rajaram MMI, MD is a first-year resident physician in Family Medicine at Queen’s University (Kingston, ON). Prior to medical school (Queen’s University), he completed a Master’s degree in Management of Innovation (University of Toronto) and has strong interests in informatics, analytics, the social determinants of health, and quality improvement.


Mialynn Lee-Daigle RN, BNSc is a Queen’s University alumna. She is currently working at Hotel Dieu Grace Hospital as a registered nurse and at Audacia Bioscience as a clinical project manager in Windsor, Ontario. She has worked in clinical, research, and management capacities in the areas of hospital and community care, academia, and business.


Anna Curry PhD is a third-year medical student at Queen’s University with clinical interests in Family Medicine. She completed her PhD in cancer immunotherapy at the University of Toronto prior to beginning her medical studies. Her current research interests include quality improvement and assurance in healthcare settings.


Malcolm Eade BSc (Life Sciences, Queen’s University) served as the President of the local chapter of the Institute for Healthcare Improvement (IHI). He is now a co-founder of a technology start-up focused on developing improved chemical detection technology to help combat the Opioid Crisis.


Robert Connelly MD, MBA is Associate Professor of Pediatrics at Queen’s University and Head of the Department of Pediatrics at Kingston Health Sciences Centre. He works as a neonatologist in the Neonatal Intensive Care Unit and in the Neonatal Follow-up Clinic. He has interests in the use of handoff tools to improve patient safety.


Roy Ilan MD, MSc is an Assistant Professor of Medicine at the Technion - Israel Institute of Technology. He practices critical care medicine at the Rambam Healthcare campus in Haifa, Israel. His academic activities focus on patient safety and healthcare quality through research, education, and improvement activities.


Figure 1 was created by AR using Microsoft PowerPoint.


References

Armstrong, G., Headrick, L., Madigosky, W. and Ogrinc, G. (2012) 'Designing education to improve care', Jt Comm J Qual Patient Saf, 38(1), pp. 5-14.


Armstrong, L., Shepherd, A. and Harris, F. (2017) 'An evaluation of approaches used to teach quality improvement to pre-registration healthcare professionals: An integrative review', Int J Nurs Stud, 73, pp. 70-84.


Certificates and Continuing Education (2018). Available at: (Accessed: 13 May 2019).


de Vries-Erich, J., Reuchlin, K., de Maaijer, P. and van de Ridder, J. M. (2017) 'Identifying facilitators and barriers for implementation of interprofessional education: Perspectives from medical educators in the Netherlands', J Interprof Care, 31(2), pp. 170-17.


Dobson, R. T., Stevenson, K., Busch, A., Scott, D. J., et al. (2009) 'A quality improvement activity to promote interprofessional collaboration among health professions students', American Journal of Pharmaceutical Education, 73(4), p. 64.


Galt, K. A., Paschal, K. A., O'Brien, R. L., McQuillan, R. J., et al. (2006) 'Description and Evaluation of an Interprofessional Patient Safety Course for Health Professions and Related Sciences Students', Journal of Patient Safety, 2(4), pp. 207-21.


Glaser, B. G. and Strauss, A. L. (2006) The Discovery of Grounded Theory: Strategies for Qualitative Research. New York: Aldine Transaction.


Singh, M. K., Ogrinc, G., Cox, K. R., Dolansky, M., et al. (2014) 'The Quality Improvement Knowledge Application Tool Revised (QIKAT-R)', Acad Med, 89(10), pp. 1386-91.


Tregunno, D., Ginsburg, L., Clarke, B. and Norton, P. (2014) 'Integrating patient safety into health professionals' curricula: a qualitative study of medical, nursing and pharmacy faculty perspectives', BMJ Qual Saf, 23(3), pp. 257-64.


Wong, B. M., Etchells, E. E., Kuper, A., Levinson, W., et al. (2010) 'Teaching quality improvement and patient safety to trainees: a systematic review', Acad Med, 85(9), pp. 1425-39.


Appendix 1: Post-program structured survey 

1. Please select the site of your project.

2. Please select the option (strongly disagree, disagree, neutral, agree, strongly agree) which you feel best represents your attitude towards the following statements.

     a. I feel that the program was too time consuming.

     b. The administrative obligations associated with the program were reasonable.

     c. I will be able to apply the skills and knowledge learned from the program in my career.

     d. I am satisfied with the impact my project had.

     e. I would recommend participation in the program to friends and colleagues.

     f. There was effective communication within our group while collaborating on this project.

     g. I often felt disengaged from this project.

     h. I am satisfied with the leadership skills demonstrated by my team leader.

     i. I feel that I applied the knowledge I learned from the IHI modules I completed over the course of this project.

     j. During this QI project, I reflected on the tasks I completed and considered alternative ways of doing them.

     k. Working on this QI project caused me to question other peoples’ work habits, and consider ways to improve my own work.

     l. As a result of completing this program, I have changed my way of thinking about roles and responsibilities within the healthcare system.

    m. Doing this QI project has challenged some of my expectations of clinician behaviour.

    n. When completing this project, I discovered gaps in our healthcare system.

    o. Having participated in this program, I believe that QI has the capability to improve overall healthcare.

    p. This program has motivated me to want to learn more about QI.

3. What was especially well done in the program?

4. What suggestions for improvement do you have?


There are no conflicts of interest.
This has been published under the Creative Commons "CC BY-SA 4.0" licence.

Ethics Statement

This study was approved by the Kingston Health Sciences Centre Research Ethics Board (PAED-440-18).

External Funding

This article has not had any External Funding



Barbara Jennings - (13/05/2020) Panel Member
The authors introduce a clear rationale for introducing their pilot quality improvement (QI) and patient safety (PS) curriculum.
They also present an outline of the objectives and timeline for the curriculum in this case study, from learner recruitment and selection to evaluation with the post-program survey (the fields included in this survey are listed in the appendix). Project work was integral to the curriculum, but the organisation of the teamwork was not entirely clear and could be explained more fully in a revised version of the article.
I have some specific questions that could also be addressed in a revised version of the article:
(1) In your methods section you explain the ethical approval of your pilot study but don’t explain at what stage participants had to consent to their data being included in the evaluation and analysis. Was participation in the case study integral to completing the post-course survey?
(2) The authors note that their cohort of learners had fewer nursing students than previous comparable studies but don’t explain the mixed cohort that they recruited. The value of the patient safety programme to some of the student groups listed in Table 1 was not obvious to me in some cases. The recruitment rationale and any associations explored with respect to engagement could have been included in the discussion.
This case study and the cited literature will be of interest to colleagues who manage interprofessional learning, leadership, and patient safety strands of clinical curricula.

Possible Conflict of Interest:

For transparency, I am an Associate Editor of MedEdPublish. However, I have posted this review as a member of the review panel and so this review represents a personal, not institutional, opinion.

Ken Masters - (25/12/2019) Panel Member
The paper deals with piloting an extracurricular quality improvement and patient safety training program for Interprofessional learners.

As pilot, the paper is interesting, although there are two areas that need addressing:

• I would like to see a more detailed description of the team composition and the interprofessional learning. The description in the paper appears to sell the project short: it describes the program and then the identification of the projects. Until that stage, it appears that the students are working individually. Then there is a switch to the teams, but we are not given any information about the teams, so some important questions do need to be answered, such as:
   ◦ How were teams constructed (self-selected or randomised or discipline-specific selection, etc.), and the justification for the methods.
   ◦ How many students were in each team? (We are told only in the results that “the average project had seven students”.)
   ◦ The composition (especially by professions/degrees) of each team. (I would recommend that this information be added to Table 2, but it could be an extra table, if it makes Table 2 too unwieldy.)
   ◦ How was inter-professional work required? This, and the bullet above, are especially important to ensure that the work was truly inter-professional, and not simply students from different professions attending the same course.

• As the survey form also gathered the site of the project, it would be useful to see some results by site. I realise that the sub-groups may be too small to have strong statistical significance (especially because the sample size is so small), but clusters of agreement or disagreement may emerge and some sites may not be represented at all, and this information will be useful for implementing any changes to the course.

So, as a Pilot, the research will provide some valuable input for the institution. Even as a Pilot, however, for the paper to have value to readers, some further detail is required. In addition to a Version 2 of this paper, I look forward to another paper in which the lessons learned from this pilot are implemented, and significantly more students are enlisted (and the administration of the survey is adjusted in order to obtain a higher response rate).
Possible Conflict of Interest:

For transparency, I am an Associate Editor of MedEdPublish. However, I have posted this review as a member of the review panel with relevant expertise and so this review represents a personal, not institutional, opinion.