Case study
Open Access

Conducting a high-stakes OSCE in a COVID-19 environment

Katharine Boursicot[1], Sandra Kemp[2], Thun How Ong[3], Limin Wijaya[3], Sok Hong Goh[1], Kirsty Freeman[1], Ian Curran[1]

Institution: 1. Duke-NUS Medical School, 2. Curtin Medical School, Curtin University, 3. Singapore General Hospital
Corresponding Author: Prof Katharine Boursicot ([email protected])
Categories: Assessment, Education Management and Leadership, Learning Outcomes/Competency, Curriculum Evaluation/Quality Assurance/Accreditation, Clinical Skills
Published Date: 27/03/2020


The COVID-19 pandemic has presented significant challenges for medical schools. It is critical to ensure final year medical students are not delayed in their entry to the clinical workforce in times of healthcare crisis. However, proceeding with assessments to determine competency for graduation, and maintaining performance standards for graduating doctors, is an unprecedented challenge under pandemic conditions. This challenge is hitherto uncharted territory for medical schools and there is scant guidance for medical educators. In early March 2020, Duke-National University Singapore Medical School embraced the challenge of ensuring that competent final year medical students could complete their final year of studies and graduate on time, to enter the medical workforce in Singapore without delay. This paper provides details of how the final year clinical performance examinations were planned and conducted during the COVID-19 pandemic. The aim of the paper is to provide guidance to other medical schools in similar circumstances that need to plan and make suitable adjustments to clinical skills examinations under current pandemic conditions. The paper illustrates how it is possible to design and implement clinical skills examinations (OSCEs) that ensure the validity and reliability of high-stakes performance assessments whilst protecting the safety of all participants, minimising risk and maintaining defensibility to key stakeholders.


Keywords: OSCE; Clinical Examination; Pandemic; COVID-19; Simulation; Clinical Skills Assessment; Performance Assessment; Quality Assurance


Singapore was one of the first countries to test, diagnose and treat people infected with COVID-19 in January 2020. On 7 February 2020, the Ministry of Health (MOH) in Singapore declared ‘DORSCON Orange’ (DORSCON refers to Disease Outbreak Response System Condition and is a risk assessment) (MOH, 2020) prior to the WHO declaration of COVID-19 as a global pandemic on 11 March 2020.


Singapore has had well-developed protocols since the 2003 outbreak of severe acute respiratory syndrome (SARS) and the 2009 H1N1 influenza pandemic (MOH, 2019). One example is the practice of 'cohorting', or segregation, which was activated in early January 2020 for COVID-19, especially for healthcare teams. Healthcare workers are divided to work in 'cohorts' - small teams matched for essential skill sets - and everyone is advised to limit socialisation outside of the team. The logic is that if one healthcare worker falls ill with COVID-19, then only that team will be quarantined and the others can continue to provide care for patients. This concept is also extended to healthcare workers who are not patient-facing.


The immediate effect of DORSCON Level Orange was that all students in the three Singapore medical schools were no longer allowed to continue educational activities in clinical environments and there was advice from the MOH that gatherings of students should be limited to 50 people.


There was also a national level policy decision that medical students in their final year, who had completed all their clinical rotations and were 3 months away from graduation, should undertake their graduation level knowledge tests and clinical examinations. This was to avoid delaying graduation for current final year students and ensure those students who met the required standards could enter the workforce as newly qualified doctors, as house officers, an important addition to strained healthcare manpower resources.


We worked closely with the MOH and our healthcare partners to proceed with our final OSCE (Objective Structured Clinical Examination) (Boursicot, Roberts et al., 2018): 25 stations, with community-based real and simulated patients, so that we could contribute more medical staff to the clinical working environment.


This paper documents our experience of planning and conducting an OSCE during the COVID-19 pandemic, so that others in similar circumstances can plan and make suitable adjustments to clinical skills examinations that ensure the validity and reliability of such high-stakes assessments (Lockyer, Carraccio et al., 2017) whilst protecting the safety of all participants, minimising risk and maintaining defensibility to key stakeholders.


The following key principles were applied throughout:

  1. Strict infection control and personal hygiene: cleaning between circuits, face masks, hand sanitisation, health and travel declarations, temperature screening, solitary lunches
  2. Cohorting or segregation of all participant groups, including patients, students, faculty and healthcare institutions
  3. Social distancing of individuals and physical isolation of different cohorts
  4. Zoom-facilitated briefings
  5. Wi-Fi-enabled data gathering from the iPad-based OSCE scoring system (already in place)
  6. No large group gatherings

Planning and delivery of the OSCE

Circuit design and allocation of students, examiners, real and simulated patients, and administrative staff

We divided students into four cohorts (14 students each, total number of students 56); the same cohorts were maintained throughout the three days of the examination and each cohort was managed separately, on a different circuit with different reporting and holding rooms. Each circuit was allocated specific administrative staff, and faculty did not cross over to other circuits.


The OSCE was conducted across a three-day period and there were different examiners on each of the three days. Allocation to a circuit was according to the hospital where an examiner usually worked; we avoided allocating examiners from different sites to the same circuit. At no time were students, staff or examiners from different circuits allowed to mix, and all were also given strict instructions about avoiding meeting and socialising after hours. This was to maintain the cohorting requirements of the healthcare institutions, as well as maintaining the student cohorts.
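The cohorting constraint described above (each circuit staffed only by examiners from a single hospital, with no examiner appearing on more than one circuit) can be expressed as a simple validity check. The sketch below is illustrative only; the examiner names, hospitals and circuit labels are invented, not taken from the actual examination.

```python
# Illustrative check that a circuit allocation respects cohorting:
# every examiner on a circuit must come from the same hospital,
# and no examiner may appear on more than one circuit.
# All names and data below are invented for demonstration.

def allocation_is_segregated(allocation, home_hospital):
    """allocation: circuit name -> list of examiner names.
    home_hospital: examiner name -> hospital name."""
    seen = set()
    for circuit, examiners in allocation.items():
        hospitals = {home_hospital[e] for e in examiners}
        if len(hospitals) > 1:          # mixed hospitals on one circuit
            return False
        if seen & set(examiners):       # examiner shared across circuits
            return False
        seen |= set(examiners)
    return True

home = {"A": "SGH", "B": "SGH", "C": "CGH", "D": "CGH"}
ok   = {"circuit1": ["A", "B"], "circuit2": ["C", "D"]}
bad  = {"circuit1": ["A", "C"], "circuit2": ["B", "D"]}

print(allocation_is_segregated(ok, home))   # → True
print(allocation_is_segregated(bad, home))  # → False
```

A check of this kind would be run whenever last-minute examiner substitutions are made, so that replacements never breach the segregation rules.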


We were not permitted to use our usual clinical venue, which had been four parallel outpatient clinics, so we had to revise the plan and conduct the examinations in non-clinical buildings. We used one medical school building (teaching rooms and clinical skills rooms) for two circuits. A non-clinical building owned by the main teaching hospital nearby was used for the other two circuits: we used a floor of teaching rooms and set up beds and bedding, chairs, rest stations, and equipment for practical skills, to mirror the circuits in the medical school building. This enabled four parallel circuits.


All participant groups were cohorted. Students, examiners, external examiners, simulated patients (SPs) and community-based patients entered and left by different exits and circuits were conducted in a manner to ensure complete physical separation between circuits.


General administrative arrangements

Administration on each of the three days of the examination involved:

  1. segregated participant registration areas at entrance of building
  2. individual temperature taken on arrival, recorded, and displayed on sticker on clothing
  3. individual travel declaration about last month of travel, any encounters with sick people
  4. everyone wore masks, except for students and patients
  5. hand sanitiser outside and inside every room
  6. cohorts in different circuits directed to different toilets
  7. no re-usable bedding, gowns, etc.
  8. no non-essential touching (such as handshaking)
  9. use of hand sanitiser before and after touching patients (real or simulated)
  10. social distancing (one metre apart, except when examining patients)


Special measures

Activities during the OSCE usually conducted in groups were adapted to ensure minimal contact between the people involved in the examination:

  1. SPs kept in separate groups by circuit, staying in the room for their station all day
  2. (real) patients kept in separate groups by circuit, staying in the room for their station all day. Dedicated nurses were provided for each cohort of patients
  3. examiners allocated to one circuit, but not gathered or briefed together: after registration, they were escorted directly to their allocated room and examiners were not allowed out of the room until the end of the examination (apart from comfort breaks)
  4. examiner briefing and calibration was conducted via Zoom on iPads which were also being used for scoring
  5. students in groups of 14 maximum in a large room where they were required to sit at least one metre apart (for social distancing purposes)
  6. no congregation of any people during coffee or lunch breaks: all food and drinks (individually packaged and sealed) were delivered and consumed in the individual room or examination room. There was no sharing of food or drinks; the open buffets originally planned were cancelled


OSCE content

There were no changes made to the blueprint, station design and content, or standard setting procedures. As there was some difficulty in the recruitment of real patients, we had to replace some with simulated patients and the scoring rubric was adjusted accordingly in those stations (only two clinical examination stations).


The examination consisted of two parts:

  1. The CPX (Clinical Performance Examination) consisted of 15 x 12-minute stations, testing history taking, clinical examination and explanation/advice skills, together with clinical decision making, diagnostic acumen and management planning.
  2. The OSEPS (Objective Structured Examination of Practical Skills) consisted of 10 x 8 minute stations, mapped to the procedures dictated by the MOH document ‘Outcomes Framework’ 2018. These were ‘hybrid’ stations, where SPs were present in every station to appropriately situate the practical tasks, and engage the students in professional interactions with patients, including explanations, consent, and patient safety.

The scoring scheme was based on rating scales and rubrics, with some short checklists integrated for the technical elements of practical skills stations (Wood and Pugh, 2020). The Borderline Regression Method (Yousuf, Violato et al., 2015, Homer, Fuller et al., 2019) was used to set the passing scores. The scores of the CPX and OSEPS sections were not conjunctive.
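The Borderline Regression Method cited above can be illustrated with a short sketch. For each station, candidates' rubric scores are regressed on the examiners' global ratings, and the pass mark is the predicted score at the "borderline" rating. The data below are invented for demonstration only and are not from this examination.

```python
# Illustrative sketch of the Borderline Regression Method (BRM) for
# setting an OSCE station pass mark. At each station, examiners record
# a rubric/checklist score plus a global rating (e.g. 1 = clear fail
# ... 5 = excellent). BRM fits a least-squares line of score on rating
# and takes the predicted score at the borderline rating as the cut.

def borderline_regression(scores, ratings, borderline=2.0):
    """Return the station cut score: the regression of score on
    rating, evaluated at the borderline global rating."""
    n = len(scores)
    mean_x = sum(ratings) / n
    mean_y = sum(scores) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(ratings, scores))
    var = sum((x - mean_x) ** 2 for x in ratings)
    slope = cov / var
    intercept = mean_y - slope * mean_x
    return intercept + slope * borderline

# Hypothetical data: global ratings and rubric scores for 8 candidates
ratings = [1, 2, 2, 3, 3, 4, 4, 5]
scores  = [35, 48, 52, 60, 63, 72, 75, 88]

cut = borderline_regression(scores, ratings, borderline=2.0)
print(cut)  # → 48.875
```

Station cut scores derived this way are typically summed across stations to give an overall pass mark; because the CPX and OSEPS sections were not conjunctive, each section's total would be considered on its own terms.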

Challenges and solutions

Healthcare sector

The MOH and directors of government health providers were very supportive and enabled us to: 

  • purchase bulk orders of hand sanitiser, masks, gloves (in an environment where suppliers were only allowed to sell to hospitals)
  • obtain support for doctors to leave clinical work to be examiners

This officially declared ministry-level policy that it was a priority to proceed with the examination was important, as were clear ministry guidelines and written directives from senior management confirming that clinicians coming to the examination would not be flouting ministry cohorting rules (since examiners from different hospitals were cohorted).



In a tight time frame of only four weeks, we made a number of administrative changes to the implementation plans:

  1. Restructuring of the whole system of circuit venues and creation of maps of the direction of flow, highlighting the separate entry and exit points of each building (to ensure physical separation of cohorts).
  2. Reallocation of examiners to different circuits according to work site/hospital.
  3. Recruitment of a new group of examiners to replace nurse educator examiners for the OSEPS part of the examination: all nurses were redeployed to front line patient care, and so could not examine in our OSCE. This necessitated the recruitment of 36 new clinician examiners at four weeks’ notice.
  4. Telephone calls to reassure real patients and simulated patients about their safety, seek their consent to still participate, and recruit more where needed.
  5. Examiner training/calibration sessions had been conducted face to face until DORSCON Orange level was declared. After that, the sessions were migrated into an online environment, with the use of videoconferencing (Zoom) for discussion.
  6. Briefing at the start of each day conducted via Zoom, with each examiner at their respective OSCE station. Zoom enabled examiners on the same stations in the four parallel circuits to hold small group discussions for calibration.
  7. All medical school staff leave was cancelled and many extra examiners were on standby in case of last minute absence on the day, an increased likelihood during the pandemic.
  8. Extra SPs were recruited in the event of a no-show by real patients.
  9. Reconfirmation with external examiners (one from the UK, one from Australia, and one from Singapore) of intention to still attend the OSCE in Singapore.



We briefed the students during the month before the examinations were scheduled.

Students were also keen to avoid delays to graduation and were already mentally prepared for the examinations. The briefing to students included a senior Infectious Diseases consultant, the Dean of the medical school, the Clinical Deans, and the Vice-Dean for Education. Senior clinicians were present, all of whom were continuing to work clinically.



A major concern was bringing actual patients back for the examination. This was a multi-pronged challenge. Firstly, we were not permitted to use any inpatients or patients with any healthcare contact in the preceding two weeks. Secondly, there was initially fear amongst the patients about coming back to a healthcare facility, as there was a perception that this might increase the risk of acquiring COVID-19 infection. Thirdly, there was of course the risk that one of the patients might actually have contracted and spread the disease. On the other hand, we felt that not being able to test the students' ability to pick up physical signs would impact the validity of the examination.


We were able to overcome this by utilising a pool of very stable patients with chronic conditions, minimising the exchange of patients during the examination, and rigorously screening patients prior to their attending the examination. All patients had to fulfil these requirements:

  • no travel history and no sick family members in the two weeks prior to the examination
  • patients were telephoned twice in the two weeks prior to the examination to check on this

Patients were reassured that the examination was to be held at a location separate from the hospital, that the other people involved in the examination were to be thoroughly screened and that patient safety precautions would be paramount.

Post-OSCE reflection and conclusion

To date, there have been no reports of COVID-19 infection among those who participated in the OSCE. External examiners provided reports that commended the measures adopted and affirmed the defensibility of the examination results. The students who failed the OSCE will have a supplementary examination after two months and will have another opportunity to demonstrate their readiness for clinical practice.


In a time of outbreak, the primary concern is, and should be, for the patients; training needs necessarily take a back seat. And yet it is still our duty to train the next generation of doctors. Our planning and adaptation to COVID-19 conditions reflect the values of the medical profession and the importance we place on students being prepared for the clinical workplace. By proceeding with the examination, we believe that we have sent a powerful signal to our nascent young doctors that in their chosen profession, we stay calm and carry on with important work, while taking appropriate precautions to ensure everyone's safety.

Take Home Messages

  • It is possible to take sufficient measures in a pandemic environment to conduct clinical examinations safely.
  • These measures require the collective support of the healthcare sector, medical school academic and administrative staff, students, examiners and external examiners, simulated patients and patients.
  • Good communication and collaborative working are key factors.
  • Think imaginatively, be flexible and nimble.

Notes On Contributors

Katharine Boursicot is the Associate Dean for Assessment and Progression, Duke-National University of Singapore Medical School, Singapore.

Medical educationalist, overall academic director of OSCEs at Duke-NUS, lead for blueprinting, design of stations, standard setting, examiner training.

Main author, conceptualized, wrote and revised manuscript based on comments and suggestions from the other authors.


Sandra Kemp, Professor of Medical Education, is the Director, Learning and Teaching, Curtin Medical School, Curtin University, Perth, Australia.

An external examiner for the OSCE. Contributed to the conceptualization of the paper, reviewed and revised drafts.


Thun How Ong, Singapore General Hospital.  

Senior respiratory physician, Clinical Lead for Assessment at Duke-NUS and clinical lead for the OSCE; worked to recruit more examiners, ensured minimisation of cross-infection, and delivered online examiner briefing and calibration.

Reviewed the early drafts, with fact checking and suggestions for improvement.


Limin Wijaya, Singapore General Hospital.

Senior infectious diseases physician, Co-lead for OSCE, advised on infection control measures.

Reviewed drafts, advised on the correct national level directives.


Sok Hong Goh, Duke-National University of Singapore Medical School, Singapore.

Senior Manager, Office of Education.

Overall administrative lead, sourced required infection control supplies, new venue, cohorting of administrative staff.

Reviewed drafts, fact checked information about practical arrangements.


Kirsty Freeman, Duke-National University of Singapore Medical School, Singapore.

Lead Associate for Simulation. Worked to revise circuit plans for the OSCE and cohorting arrangements, ensured supplies of adequate equipment in the new venue, and supervised two OSCE circuits.

Reviewed draft.


Ian Curran, Vice-Dean for Education, Duke-National University of Singapore Medical School, Singapore.

Reviewed drafts, advised on national level and institutional directives.


Acknowledgements

We would like to thank everyone who was involved in the enabling, planning and delivery of these high-stakes OSCEs, especially:

Dr Mabel Yap, Director of Professional Training and Assessment Standards, Ministry of Health, Singapore.

Professor Chan Choong Meng, Group Director, Education, SingHealth.

Dr Nigel Tan, Deputy Group Director, Education (Undergraduate), SingHealth.

The members of the Clinical Performance Centre.

The members of the Department of Assessment and Progression.

All the patients and simulated patients who took part in our OSCE.

All the examiners who gave up their time to engage in this important event.


References

Boursicot, K. A. M., Roberts, T. E. and Burdick, W. P. (2018) 'Structured Assessments of Clinical Competence', in Understanding Medical Education. London: Wiley, pp. 335-345.


Homer, M., Fuller, R., Hallam, J. and Pell, G. (2019) 'Setting defensible standards in small cohort OSCEs: Understanding better when borderline regression can ‘work’.' Medical Teacher: 1-10.


Lockyer, J., Carraccio, C., Chan, M. K., Hart, D., et al. (2017) 'Core principles of assessment in competency-based medical education.' Medical Teacher, 39(6): 609-616.


MOH (2019) Being prepared for a pandemic. Singapore MOH website. (Accessed: 23 March 2020).


MOH (2020) Risk assessment raised to DORSCON Orange. Singapore MOH website. (Accessed: 23 March 2020).


Wood, T. J. and Pugh, D. (2020) 'Are rating scales really better than checklists for measuring increasing levels of expertise?' Medical Teacher, 42(1): 46-51.


Yousuf, N., Violato, C. and Zuberi, R. W. (2015). 'Standard Setting Methods for Pass/Fail Decisions on High-Stakes Objective Structured Clinical Examinations: A Validity Study.' Teaching and Learning in Medicine, 27(3): 280-291.




There are no conflicts of interest.
This has been published under the Creative Commons licence "CC BY-SA 4.0".

Ethics Statement

This case report was approved by the Vice-Dean for Education at Duke-NUS Medical School and the Director of Professional Training and Assessment Standards, Ministry of Health, Singapore.

External Funding

This article has not had any External Funding


Shaoting Feng - (03/04/2020)
This paper provides practical, detailed guidance for planning and making suitable adjustments to clinical skills examinations under current pandemic conditions. It's very enlightening! Thanks for sharing!
Richard Hays - (02/04/2020)
This is a timely paper and provides a comprehensive approach to running an OSCE with all participants on-site. The issue not addressed, but not intended to be addressed, is how a high-stakes OSCE can be managed with physical separation between some participants. It could be argued that moving people around and between health facilities is currently unwise. One model to consider is that of the Australian College of Rural and Remote Medicine (ACRRM), which offers multi-station clinical examinations remotely, because rural doctors find it difficult to leave their communities (see: Smith J, Prideaux D, Wolfe CL, Wilkinson TJ, Sen Gupta T, DeWitt D, Worley P, Hays RB, Cowie M. Developing the accredited postgraduate assessment program for Fellowship of the Australian College of Rural and Remote Medicine. Rural and Remote Health 2007; 7: 805). The ACRRM model has simulated patients in a different location to candidates and examiners, but is much smaller in scale than an OSCE in a large medical school. However, it seems to work. Would it be possible to have students and (real or simulated) patients at one site, moving around as usual, with examiners in another location watching by video? This would substantially reduce the number of clinicians gathering and interacting with students. If simulated patients (including people with stable signs who are not currently unwell) were used, the impact on a healthcare facility should be reduced. Something like this might be worth considering as tighter social distancing is introduced.
Possible Conflict of Interest:

For transparency, I am the Editor of MedEdPublish

Tripti Srivastava - (31/03/2020)
An innovative approach to conducting assessments in challenging situations. Weighing the nature of the assessment against the perceived risk factors is the most critical aspect of undertaking such an initiative. The authors have shared their experience of one such initiative, focussed on minimising the risk for students, assessors and patients.
This case study can be a guide for conducting such assessments, without going into the scientific details of the study.
Wendy Hu - (29/03/2020)
I congratulate the authors on the rapid publication of this extremely timely information, which is of great interest to educators worldwide in these "testing times". A great strength of the report is the level of contextual detail provided, allowing readers to take what is workable in their situation. While educational evaluation is yet to come, the process quality assurance, the relationship to evidence, and, vitally, the absence of infection in any participants, are key. The unusual situation created a new purpose - to ensure infection control to the highest standard - and the procedures appear to have met this aim.

The alternative of suspending live clinical examinations is, however, being faced by many medical schools and I wonder if the authors would like to explore this option in future publications? The Singaporean success in COVID control has been globally recognised, and the medical school's success is testament to the deep relationships between the healthcare, government and educational systems, which the authors acknowledge as a strength. However, in other contexts this may be lacking, and a relative lack of resources may preclude the measures described. Authors in less well-resourced contexts should also be encouraged to contribute their experiences and not see this example as best practice that they must emulate.

The publication is also testament to the flexibility of the MedEdPublish format and its fit with the information needs of practitioners and innovation-focussed researchers. If anything is to be gained from the current crisis, it is the rapid and huge increase in research and release of results based on "live" data seen in all major medical journals in recent months, to be emulated by medical and health professions education research, given our interdependency and integration with clinical practice.
P Ravi Shankar - (29/03/2020)
The authors describe conducting a high-stakes OSCE during the pandemic. Though it has not been explicitly stated, I assume this is the final clinical exam for students prior to graduation. The authors have described the logistics and preparation very well. The use of Zoom for briefing and for examiners to communicate with each other is interesting. The use of iPad for electronic contact and scoring minimized physical contact. The modality approached for separating students, examiners and patients has been highlighted. The separate packaging of food and refreshments and delivery of food to individual rooms has been mentioned. Methods to minimize physical contact have been well described.
I have a question about the external examiners. What was their exact role? Were they physically present or did they participate electronically? The class size is relatively small and I am thinking about possible challenges in scaling up the process to class sizes of 200 to 250 students. The initiative will be of special interest to educators in these challenging times though adaptations to local circumstances may be required.
Deb Halder - (27/03/2020)
Review on the Paper

The Crucial Points to focus on:

i. The title of the paper uses 'high-stakes', though it is never clarified why the OSCE is 'high-stakes'.
ii. An introduction is an inseparable part of a case study, in which the issues considered are focused, the title is analysed and the area of study is specified; the paper deliberately passes over this.
iii. In the background there is a statement that the MOH advised that gatherings of students should be limited to 50 people, but this remark is not supported by a citation, which makes the statement less valid and credible.
iv. A case study develops with many references alongside the prevailing case, but this paper uses few related references to defend why these measures should be taken, what is advised by the WHO and what needs to be followed. These points weaken the credibility of the actions described.

Few Syntactic Issues:
A few grammatical and lexical issues attract attention. For example, the very first sentence of the abstract uses 'presented' as the main verb, where 'posed' or 'created' would be better.
The first two lines under 'Planning and delivery of the OSCE' could also be more syntactically harmonious.

The Strongest Part of the Paper:

The case study is presented in detail. An order has been followed to designate the adopted measures point by point so that no important issue is missed. This detailed description of the ways and procedures makes them adoptable.
Possible Conflict of Interest:

The reviewer has no conflicts of interest.

Trudie Roberts - (27/03/2020)
Another valuable contribution to the stable of articles on managing educational activities, or 'Learning in the Time of COVID' to paraphrase Gabriel García Márquez. This event ran extremely well - I know, as I was one of the external examiners - but only because of the commitment of the clinicians, academics, administrators, patients (real and simulated) and students. As the authors comment, thinking imaginatively and flexibly, and looking at the opportunities that newer technologies offer, is one of the keys to success in these challenging times. Congratulations to these educators on this achievement.
Possible Conflict of Interest:

I was an external examiner for this assessment

Judy McKimm - (27/03/2020)
A very timely and clear description of how to be adaptive in times of rapid change. I liked the discussion of how you worked to allay patients' fears and the care taken to minimise cross infection etc. The timings are crucial aren't they for what is possible, and clear communications to everyone, plus contingency planning are vital. It would be very interesting to gather ideas about how others have tackled the practical/clinical skills assessment in these times.
Trevor Gibbs - (27/03/2020)
A very insightful and appropriate paper published at exactly the right time. As we all continue to worry and develop newer approaches to standardised methods, this paper clearly shows that, with thought, we can continue with a tried and tested assessment approach.
I look forward to hearing how the student, patient and faculty evaluations of this activity went, but I have no reservations in recommending this paper to all curriculum developers.
Possible Conflict of Interest:

For transparency, I am one of the Associate Editors of Medical Teacher

Felix Silwimba - (27/03/2020)
A very informative article on medical education in times of a pandemic. It definitely addresses the concerns of medical schools the world over.