Case study
Open Access

Disruptive Innovation: successful implementation of an electronic marking system in Objective Structured Clinical Examinations – a single institution case study

James Fraser[1], Margo Lane[2]

Institution: 1. School of Medicine, Griffith University; 2. School of Medicine, Griffith University
Corresponding Author: Dr James Fraser j_fraser@bigpond.net.au
Categories: Assessment, Education Management, Technology

Abstract

Problem: Objective Structured Clinical Examinations are commonly included in the suite of assessment tasks in health professional programs. Conducting these assessments using paper-based methods has resulted in challenges that have been identified across a range of disciplines. These challenges include laborious administrative processes, missing assessment data, and delays in data analysis and result release, all of which may limit the capacity to provide feedback to students.

Intervention: The implementation of an electronic marking system was piloted in an attempt to address these issues. The pilot included the identification and testing of the software, and evaluation of its impact on the conduct and outcomes of the assessment and on the identified challenges. The outcomes were considered through the lens of disruptive innovation.

Context: This medical program has an annual intake of more than 500 students with learning, teaching and assessment occurring at multiple distributed campuses in metropolitan, regional and rural areas of Queensland, Australia, and the United States of America. This results in a complex and resource-intensive assessment and data management process. The identified challenges were found to be compounded when the examination was delivered concurrently at several examination centres to a large cohort.

Outcomes: The electronic marking system was feasible to implement across multiple examination sites with a large cohort, and was found to be more reliable and efficient than traditional paper-based assessment systems. Examiner feedback regarding the accessibility and acceptability of electronic marking on tablet devices for clinical assessments was favourable, with enhancements to qualitative feedback and no adverse impact on student results identified.

Lessons Learned: The utilization of an electronic marking system in Objective Structured Clinical Examinations was found to be consistent with both disruptive and sustainable innovation. The opportunity to provide immediate feedback was consistent with the definition of a disruptive innovation, while the improved data management and processing outcomes were consistent with a sustainable innovation. Furthermore, electronic marking technology is likely to be transferable to other competency-based and workplace-based assessments.

Keywords: Assessment, OSCE, Digital

Introduction

In 2014, the use of an electronic marking system for Objective Structured Clinical Examinations (OSCEs) was piloted by a medical school in Queensland, Australia to address a number of challenges that had been identified in previous years. These included the complexity of the preparatory administrative processes, the need to ensure a complete data set after the OSCEs, the requirement for efficient access to assessment data for result analysis and processing for both academic and administrative staff, and the provision of meaningful and timely feedback to students.   These challenges are not unique to this program and have been described in other medical and health professional programs.1-3

In this institution, however, other factors were also relevant, including the large cohort size and the geographically distributed nature of the medical program. The medical school enrols more than 500 students per academic year into a four-year postgraduate medical program, and has clinical teaching sites in urban, regional, and rural centres, and international locations. Summative OSCEs are included in the assessment program at the end of Year 2 and Year 4, and these are conducted concurrently at a number of these sites; however, the OSCEs are not run at the international clinical sites. Prior to 2014, OSCEs were administered by traditional paper-based assessment methods. Because of the cohort size, OSCEs were run in four half-day examination sessions across both days of the weekend to allow access to appropriately equipped venues. Four equivalent papers were developed for each OSCE, and administrative preparation procedures for such a complex undertaking were both labour-intensive and time-consuming. In a resource-constrained environment this was not sustainable, and the key innovation to ensure that these high-stakes examinations remained feasible was the introduction of an electronic marking system.

The challenges with data management when utilizing paper-based assessments have been described in the literature and include identifying and remedying missing results and incomplete or inaccurate data entry.4 Addressing these tasks after the completion of an OSCE consumes an appreciable amount of academic and administrative staff time, especially when coordination must be spread across distributed examination centres. Schmitz and colleagues concluded that up to sixty percent of examination papers have missing or incomplete results, though in our experience this figure was much lower.4 However minimal, any missing data requires further clarification. This can be difficult to achieve and is a threat to the validity of the assessment item for an individual student, or for a whole cohort if the missing data rate is high. Manual data entry of results from paper-based assessment sheets is time-consuming and potentially susceptible to error, especially for a large data set. Consequently, the release of results to students may be delayed.
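
To illustrate the reconciliation burden described above, the sketch below shows the kind of post-hoc completeness audit that paper-based workflows require, and that electronic capture renders unnecessary by validating each sheet at the point of entry. The record layout, station numbers and data are hypothetical illustrations and do not describe the software used in this pilot.

```python
# Minimal sketch of a post-OSCE completeness audit, of the kind that
# paper-based workflows require after the event. All field names and
# data below are hypothetical illustrations.

RECORDS = [
    {"student_id": "S001", "station": 1, "examiner": "E01", "score": 14, "comment": "Clear history-taking."},
    {"student_id": "S001", "station": 2, "examiner": "E02", "score": None, "comment": ""},  # mark not entered
    {"student_id": "S002", "station": 1, "examiner": "E01", "score": 12, "comment": "Omitted red flags."},
]
STATIONS = {1, 2}  # hypothetical circuit of two stations

def audit(records, stations):
    """Return (records missing a score, students missing whole stations)."""
    missing_scores = [r for r in records if r["score"] is None]
    seen = {}
    for r in records:
        seen.setdefault(r["student_id"], set()).add(r["station"])
    missing_stations = {s: stations - done for s, done in seen.items() if stations - done}
    return missing_scores, missing_stations

if __name__ == "__main__":
    gaps, absent = audit(RECORDS, STATIONS)
    print(f"{len(gaps)} record(s) with no score entered")
    for student, stns in absent.items():
        print(f"{student} has no sheet at station(s): {sorted(stns)}")
```

Every gap this audit surfaces translates into staff time spent chasing examiners after the examination; an electronic system can simply refuse to accept an incomplete sheet in the first place.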

The provision of feedback on learning to students was identified by Harden as one of the important considerations for utilizing this assessment design.5 This aim had not been optimally realised at this medical school, due in part to the use of paper-based marking sheets and the associated administrative issues noted previously. Access to examiners' narrative feedback in real time using the electronic marking system makes this goal achievable. In addition, the use of electronic marking may change the nature of the feedback to students, with Denison and colleagues reporting that both the quantity and quality of narrative feedback were enhanced by the use of tablet devices in OSCE marking, particularly for poorer-performing students.6

Rogers discussed the factors that may facilitate or hinder the acceptance and implementation of new technology. These include characteristics of the technology and of its users, and how users are persuaded to adopt the technology. He noted that the uptake of an innovation is aided when the innovation provides attributes desired by the users, when the proposers and the users share similar personal characteristics, and when there is a staged implementation process.7 Christensen defined a disruptive technology as one that triggers a new practice or process, beginning with inferior performance metrics but supported by a small number of early adopters and expanding to ultimately dominate the pre-existing process. The technology is not designed to be a disruptor; rather, disruption occurs because of the application of the technology.8 Christensen and Raynor later emphasised this by recasting the term as 'disruptive innovation'.9 This is in contrast to sustainable innovation, which produces a quality improvement but does not, in essence, change the existing process.10

Christensen applied this definition to studies of higher education in the USA and concluded that, whilst there has been broad adoption of technology in educational activities, the technology has not resulted in disruptive innovation in this field. He further contended that an important marker of technology producing a disruptive innovation in higher education would be the development of personalised learning opportunities.8,9 Viewing the implementation of the electronic marking system for OSCEs through this lens allows further understanding of the scope of educational technology and its potential contribution to the goal of enhanced personalised, student-centred learning.

Project Implementation

The primary aim of this project was to evaluate whether the introduction of an electronic marking system addressed the identified challenges. A secondary aim was to assess the effect of the introduction of electronic marking in OSCEs through the lens of disruptive innovation. The OSCE examiners' views on the use of tablet devices compared with paper marking sheets were particularly relevant to both aims.

The planning, development, and implementation of the transition from paper to digital formats in high-stakes assessments such as OSCEs requires a team-based approach combining administrative, clinical, and educational expertise. The successful introduction of electronic marking requires careful attention to each aspect of Zeleny's technology support network.11 In relation to the OSCE, this is the core network of physical, organisational, administrative and cultural structures that forms the basis of the social relationships among the human participants in the preparation, delivery and finalisation of the OSCE.

The electronic marking system pilot was conducted in October 2014 during the Year 2 summative OSCE at a regional site, with eighty-five students and twenty-five examiners. Ethics approval for this project was obtained from the university human ethics review committee (University of Queensland, approval number 2014001396). The existing paper-based assessment templates were easily adapted to the software format, and the examination upload to the server and download to the tablet devices was uncomplicated and time-efficient. A demonstration of the software during the examiner briefing and a short hard-copy user guide were adequate to upskill the examiners. Paper-based examination scripts and assessment sheets were available on site in case of software or tablet device malfunctions; however, no significant problems were encountered with the electronic marking system during the pilot.
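
As a rough illustration of how a paper checklist can map onto an electronic form, the following sketch represents a station template as structured data and rejects an incomplete submission at the point of entry. The schema, item names and rating scale are invented for illustration; the source does not describe the actual format used by the pilot software.

```python
# Hypothetical sketch of an OSCE station template and entry-time
# validation. Items, scales and schema are invented; they do not
# describe the software used in this pilot.

STATION_TEMPLATE = {
    "station": "Abdominal examination",
    "items": [
        {"id": "intro",   "label": "Introduces self and gains consent", "max": 2},
        {"id": "inspect", "label": "Inspection performed correctly",    "max": 3},
        {"id": "palpate", "label": "Palpation systematic and gentle",   "max": 5},
    ],
    "global_rating": ["fail", "borderline", "pass", "good", "excellent"],
}

def validate_submission(template, marks, rating):
    """Refuse to accept a marking sheet until every item is scored in range."""
    errors = []
    for item in template["items"]:
        score = marks.get(item["id"])
        if score is None:
            errors.append(f"'{item['label']}' not scored")
        elif not 0 <= score <= item["max"]:
            errors.append(f"'{item['label']}' out of range (0-{item['max']})")
    if rating not in template["global_rating"]:
        errors.append("global rating missing or invalid")
    return errors

# An incomplete sheet is flagged immediately, during the station,
# rather than discovered days after the examination.
print(validate_submission(STATION_TEMPLATE, {"intro": 2, "palpate": 4}, "pass"))
```

The design point is that the template doubles as both the examiner's on-screen checklist and the validation rule, so the data set is complete by construction when the session ends.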

Evaluation

The use of the software had an immediate and positive effect on each of the challenges previously identified. Based on this outcome, the pilot project was extended to include the 2014 Year 4 OSCE at two examination centres (a major metropolitan site and a single rural site), and additionally the Year 2 and Year 4 extended OSCEs.

Following the successful completion of these assessments, the use of the electronic marking system was reviewed with each local team of administrative and academic staff. The 2014 OSCE results were congruent with previous years' results, and academic staff were satisfied that the introduction of the electronic marking system had no discernible impact on cohort results.

The examiners who took part in the pilot (n=165) were emailed a link to an online survey requesting their feedback regarding the acceptability and utility of the electronic marking system. The survey was anonymous and voluntary. The response rate for completed surveys was 58% (n=96). The respondents represented a diversity of age, gender and clinical backgrounds. Of this cohort, 75% were experienced examiners, defined as having examined at two or more previous School of Medicine OSCEs. When questioned about previous experience with tablet devices, 12% of respondents stated they had never used a tablet device before the OSCE, and 46% self-identified as basic users. Marking using a tablet device was reported to be easier than using paper-based checklists by 92% of respondents. These findings are consistent with those of Schmitz and colleagues, who found the software to have a usability rating of 6.5 on a 7-point Likert scale, a strong examiner preference over paper-based checklists, and lower levels of mental effort required of examiners using tablet devices.4

Outcomes

For this large, distributed medical school, the success of the electronic marking system pilot was significant in many areas. The administrative preparation time for OSCEs was reduced, with a flow-on effect of more manageable timelines for the secure distribution of the final examination material to the various examination centres. Data management was streamlined by having complete data sets, with no requirement for manual data entry or for locating missing data. The ease of results upload and download allowed early availability of results for analysis by academic staff, which in turn allowed prompt release of final results to students.
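
The downstream gain can be illustrated with a minimal aggregation sketch: once the electronic data set is guaranteed complete, per-student totals are computable as soon as a session ends. The data layout and the aggregate pass mark below are hypothetical, not the School's actual scoring rules.

```python
# Illustrative aggregation step: with a complete electronic data set,
# per-student totals are available immediately after the session.
# Record structure and pass mark are hypothetical.

from collections import defaultdict

results = [  # (student_id, station, score) tuples, hypothetical data
    ("S001", 1, 14), ("S001", 2, 11),
    ("S002", 1, 12), ("S002", 2, 16),
]
PASS_MARK = 24  # hypothetical aggregate cut score

totals = defaultdict(int)
for student, _station, score in results:
    totals[student] += score

for student, total in sorted(totals.items()):
    outcome = "pass" if total >= PASS_MARK else "review"
    print(f"{student}: {total} ({outcome})")
```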

Although not formally evaluated, it was noticeable that the use of tablet devices for marking also enhanced the quantity, quality and legibility of examiners' written comments, consistent with Denison's findings.6 This allowed immediate verbal feedback to be given to the student cohort on exiting their examination session, based on real-time analysis of the examiners' comments. These comments were also emailed to students individually after the OSCE, providing a more meaningful and specific foundation for personalised student learning.
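
A minimal sketch of this collation step follows, grouping examiners' free-text comments by student so that individual feedback can be assembled for emailing shortly after the OSCE. The data and the mail-out framing are hypothetical illustrations, not the School's actual process.

```python
# Sketch of collating examiners' free-text comments per student for
# individual feedback. Data is hypothetical; in practice the assembled
# body would feed the School's mail-out process.

from collections import defaultdict

comments = [  # (student_id, station, examiner comment)
    ("S001", "Abdominal examination", "Systematic approach; summarise findings earlier."),
    ("S001", "History taking", "Good rapport; remember to screen for red flags."),
    ("S002", "Abdominal examination", "Revise surface anatomy landmarks."),
]

feedback = defaultdict(list)
for student, station, text in comments:
    feedback[student].append(f"- {station}: {text}")

for student, lines in sorted(feedback.items()):
    body = f"Dear {student},\n\nYour examiners noted:\n" + "\n".join(lines)
    print(body, end="\n\n")
```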

At this medical school, there were no additional resource implications for the pilot project in terms of information technology support, licensing fees or tablet devices. Importantly, no data security issues were identified with the implementation of the electronic marking process.

Limitations of this project

A formal evaluation of the change in administrative preparation time and of the impacts on the other identified challenges would have been useful; however, anecdotal evidence of significant improvement in all areas was compelling enough that the School has not returned to the paper-based assessment format since this successful pilot. The electronic marking system has subsequently been used in all OSCEs in the program, as well as in other clinical skills assessments. Because tablet devices were already available within the School at the time of the pilot, the resource implications may have been smaller at this medical school than at other medical programs.

Future research directions

The successful implementation of electronic marking systems in clinical assessments leads to a number of related research questions, including the impact of electronic marking on examiner cognitive load, on case writing, and on inter-rater reliability.

Take Home Messages

The application of an electronic marking system to OSCEs fulfils the criteria for a disruptive innovation in its provision of timely, focussed feedback to students, and represents progress towards the use of technology to contribute to a personalised learning environment. The introduction of the software was intended to provide an efficient platform for the administration of clinical assessments; however, the enhancement of the feedback process aligns with Christensen's definition of disruptive innovation.8 The administrative and data management changes associated with the introduction of electronic marking, and the continuous improvements that follow, sit within the realm of improved service delivery characteristic of a sustainable innovation.

This study has demonstrated that the use of electronic marking systems for OSCEs with large, distributed cohorts is feasible and highly acceptable to examiners. It reduces administrative workload and allows early availability of data for analysis and timely release of results to students. The provision of feedback to students has been greatly enhanced by this initiative. This process may be applicable to other workplace-based or performance-based assessments in the training of health professional students.

Notes On Contributors

Dr Fraser is Associate Professor of Medical Education at Griffith University and an Emergency Medicine clinician.

Dr Lane is Associate Professor of Medical Education at Griffith University and a General Practitioner.

Acknowledgements

This case report describes work undertaken at another university, prior to the authors commencing at Griffith University.

Bibliography/References

1. Judd T, Ryan A, Flynn E, McColl G. If at first you don't succeed... Adoption of iPad marking for high-stakes assessment. Perspectives on Medical Education 2017; August.

https://doi.org/10.1007/s40037-017-0372-y   

2. Shirwaikar A. Objective structured clinical examination (OSCE) in pharmacy education - a trend. Pharmacy Practice 2015;13(4):627.

https://doi.org/10.18549/PharmPract.2015.04.627

3. Sola M, Pulpon AM, Morin V, Sancho R, Cleries X, Fabrellas N. Towards the implementation of OSCE in undergraduate nursing curriculum: A qualitative study. Nurse Education Today 2017;49:163-167.

https://doi.org/10.1016/j.nedt.2016.11.028   

4. Schmitz FM, Zimmermann PG, Gaunt K, Stolze M, Guttormsen S. Electronic rating of Objective Structured Clinical Examinations: mobile digital forms beat paper and pencil checklists in a comparative study. In: Holzinger A, Simonic K-M (eds). Information Quality in e-Health. Berlin: Springer; 2011.

https://doi.org/10.1007/978-3-642-25364-5_35   

5. Harden RM, Stevenson M, Downie WW, Wilson GM. Assessment of clinical competence using Objective Structured Examination. British Medical Journal 1975;1:447–51

https://doi.org/10.1136/bmj.1.5955.447   

6. Denison A, Bate E, Thompson J. Tablet versus paper marking in assessment: feedback matters. Perspectives on Medical Education 2016;5:108.

https://doi.org/10.1007/s40037-016-0262-8 

7. Rogers EM. Diffusion of Innovations. New York: The Free Press; 1983.

8. Christensen CM. The innovator's dilemma: When new technologies cause great firms to fail. Boston, MA: Harvard Business School Press; 1997.

9. Christensen CM, Raynor ME. The innovator's solution: Creating and sustaining successful growth. Cambridge, MA: Harvard University Press; 2003.

10. Yamagata-Lynch LC, Cowan J, Luetkehans LM. Transforming disruptive technology into sustainable technology: understanding the front-end design of an online program at a brick-and-mortar university. The Internet and Higher Education 2015;26:10-18.

https://doi.org/10.1016/j.iheduc.2015.03.002 

11. Zeleny M. High Technology and Barriers to Innovation: From Globalization to Relocalization. International Journal of Information Technology & Decision Making 2012;11(2):441-456.

https://doi.org/10.1142/S021962201240010X

Declarations of Interest

There are no conflicts of interest.


Reviews

Julie Williamson - (08/12/2017) Panel Member
This paper describes the implementation of an electronic marking scheme to an OSCE. I agree with a previous reviewer who stated that this wasn't a disruptive use of technology. Without the OSCE having been significantly impacted, it's difficult to make a case for this change being disruptive. In fact, having it work so well nearly means that it wasn't disruptive--it allowed the OSCE to run as usual. Also, there are many institutions using electronic marking for OSCEs; my own institution has been scoring OSCEs electronically for over five years. I was disappointed that the authors didn't include more detail on the technology and software they used and how it functioned, or do a critical analysis of where errors could occur. The addition of those components would have made the paper potentially useful to educators who are still paper-scoring OSCEs.
Ken Masters - (02/12/2017) Panel Member
Nice background on the issues from the education and technology perspective.

It was probably a good idea that “Paper-based examination scripts and assessment sheets were available on site in case of software or tablet device malfunctions.” This was a pilot, and so I imagine that there would have been concerns about the technology. Once this system is more widely implemented, then you could probably dispense with those, just as people no longer have paper MCQs for online exams. (Once staff are familiar with the system, they could even have copies on their personal devices as a backup).

I think, however, that the authors are too strongly pushing for this to be considered “disruptive” as it does not appear “disruptive” enough for it to warrant the concept in the title and quite the length of discussion in the introduction. This is borne out by the fact that the perceived disruption is not referred to at all after the implementation and during the evaluation of the project.

Although the authors do raise the issue of disruption in the Take-home message, the new system’s contribution is nothing new, nothing disruptive, and no real disruption occurs. Calling it a disruption does not make it one. Its contribution is a greater improvement in the speed of student feedback and apparently greater security of data retention. Considering the innovations that Christensen et al view, this is squarely in the realm of an improvement, a welcome improvement, but it is an incremental improvement, and not a disruption.

For the authors to make such a case, they would need a detailed discussion showing how the work-flow processes have had to be significantly altered, demonstrating an impact on the entire OSCE system, and, a really strong test of disruptive innovation, how new technologies and innovations have been developed as a result, and/or have had to be developed in order to accommodate the new disruption.

Perhaps this might come at a later stage, but, at this stage, the authors have described a useful, incremental innovation, and should be content with calling it that.

In addition to the detailed discussion required, I would like to have seen a lot more detail about the technical specifications of the software, backups, security, support, costs, etc. If the authors’ paper is to have value for others, then this type of information would be required.
Michael SH Wan - (24/11/2017) Panel Member
With a large intake of medical students each year and limited resources to run OSCEs across dispersed sites, implementation of the eOSCE can improve efficiency and enhance immediate feedback to students. More description of the implementation and the use of electronic devices would help other schools considering this move in the future to learn from the authors' experiences, e.g. Wi-Fi stability requirements, the number of backup devices, and the use of in-house versus commercially available eOSCE software. Although there were no additional resource implications for the pilot, some estimate of the additional budget (licensing, device costs) per student or per institution would be helpful for readers considering a move to electronic marking.
Thomas Puthiaparampil - (24/11/2017)
Interesting paper. Indeed, the successful implementation of an electronic marking system in an assessment like the OSCE is a remarkable achievement. However, many questions arise after reading through this paper. The process of electronic scoring is not made clear; I kept wondering how it is done! Is it a formative assessment or a summative assessment? Keeping a parallel printed set of OSCE papers ready must be a waste of materials!