New education method or tool
Open Access

An independent appraisal of an electronic OSCE management software system in a Psychiatry examination

Colin Fernandez[1], Allys Guerandel[1]

Institution: 1. University College Dublin
Corresponding Author: Prof Allys Guerandel ([email protected])
Categories: Assessment, Educational Strategies, Teaching and Learning, Technology, Undergraduate/Graduate
Published Date: 27/02/2019

Abstract

Background

The Objective Structured Clinical Exam (OSCE) is a widely used method of assessment in psychiatry but is expensive, requiring large amounts of staff time and an extensive paper trail. The Online Management Information System (OMIS) for OSCEs is a software system that proposes to reduce these drawbacks.

 

Aims

The aim of this study was to independently evaluate the use of OMIS through feedback from all key stakeholders involved in developing, administering and implementing an OSCE in a medical school.

 

Method

The independent appraisal of the system was done by an anonymous survey of four major participant groups in an OSCE: examiners, students, the academic team in charge of preparing and implementing the OSCE, and an independent Information Technology (IT) team from the university who appraised the software.

 

Results

Feedback from examiners, the academic team and the IT experts was positive, indicating a preference for OMIS over the typical paper format. Students mostly gave a neutral response, stating that OMIS had neither a positive nor a negative effect on them during the exam. A few participants expressed concerns over data security, difficulty with some software features, and the system not being optimised for hand-held devices.

 

Conclusion

In general, OMIS presents as a promising tool to enhance the implementation and delivery of an OSCE compared with the typical paper format of the exam.

 

Keywords: assessment; OSCE; technology; online management; feedback

Introduction

The Objective Structured Clinical Exam (OSCE) was introduced in 1975 as a method of medical examination to overcome many of the disadvantages of traditional examination methods that were open to higher levels of subjectivity (Harden & Gleeson, 1975). It is a popular method of examination, offering better validity and reliability in assessing clinical competencies when the appropriate standards are adhered to (Wass, et al., 2001) (Pugh, et al., 2014) (Hodges, et al., 2014). The OSCE offers superior reliability and feasibility over other exam formats due to its flexibility in accommodating varying numbers of students and examiners, different types of patient presentations, and varying exam lengths and durations (Patrício, et al., 2013). In psychiatry education, the OSCE is considered a valid assessment of clinical competencies with high levels of construct and concurrent validity (Hodges, et al., 1998). It appears to be the method of assessment of choice, having grown steadily in psychiatry over the last 20 years, with several national certification organisations adopting it as a method of examination (Hodges, et al., 2014). As medical education heads towards competency-based assessment with the evaluation of learning outcomes, the OSCE, in combination with standardised board examinations and direct observation of skills in a clinical setting, is considered by many as the 'gold standard' for measuring physician competence (Carraccio & Englander, 2000). It is further touted as having a beneficial impact on medical students' learning and future performance (Gormley, 2011). Its value as a valid and reliable assessment method in allied health specialties is also being explored, with positive results in nursing (Brosnan, et al., 2006) (Selim, et al., 2012) (Meskell, et al., 2015), midwifery (Smith, et al., 2012), dietetics (Farahat, et al., 2015) and physiotherapy (Ladyshewsky, et al., 2000) (Palekar, et al., 2015) amongst others.

           

Despite its many advantages, the OSCE has fallen under criticism as being expensive, requiring large amounts of staff time and often an extensive paper trail (Cusimano, et al., 1994) (Frye, et al., 1989). For a six-station OSCE, an estimated 327.5 hours of staff and faculty time are spent on administering and developing the OSCE for each rotation of students, at an estimated cost of US$6.90 per student per station (Cusimano, et al., 1994). Here in Ireland, Kropmans et al. estimate that running an OSCE of 7-12 stations approximately 11 times for 670 students in an Irish medical school would produce 9,380 assessment forms and cost on average €29,500.00 (€2.80 per form) (Kropmans, et al., 2012). Despite being an expensive exam, cost-effectiveness analyses suggest that it is still an effective and feasible method of examination, provided due consideration is given to the resources it requires (Frye, et al., 1989) (Brown, et al., 2015) (Hasle, et al., 1994). In a climate of ever-escalating costs and demands on resources and staff time, we decided to explore potential solutions to minimise these drawbacks and help preserve the OSCE as a viable method of examination in medicine. In our search, we came across the Online Management Information System (OMIS) for Objective Structured Clinical Examinations, an online OSCE management system designed to capture real-time data online without the need for physical paper assessment forms as a record (Kropmans, et al., 2012). With this system, assessment forms are created online, assessment data are captured in real time, and results are available instantly (Kropmans, et al., 2012). The system can be used on tablet devices, laptops and desktop personal computers (Kropmans, et al., 2012). OMIS purports to reduce administration costs per transaction by 70% compared with paper-based solutions; to increase the accuracy of results by eliminating human error and providing instant student feedback; to improve the validity of results through advanced psychometric analysis for pre- and post-exam standard setting; and to capture data in real time, allowing for online reporting and analysis (Qpercom, 2016). The creators of the system report high levels of satisfaction among assessors who use it, time savings and the avoidance of errors, and promote it as a possibly more effective method of identifying incompetent students than the traditional paper-trail format of an OSCE (Meskell, et al., 2015) (Kropmans, et al., 2012) (Kropmans, et al., 2015). We set out to perform our own independent evaluation of the system as a pilot project in an Irish medical school.

Objective

The aim of this study was to evaluate the use of an online management information system for OSCEs (OMIS) through feedback from a survey of all key stakeholders involved in developing, administering and implementing an OSCE in a medical school.

Methods

3.1 Setting of the study

The study was conducted during an OSCE for the Psychiatry module of an undergraduate medical degree course in the School of Medicine in University College Dublin (UCD), Ireland. The School of Medicine in UCD has a class of approximately 240 students per year, and the class completes a 6-week Psychiatry module in the 5th year of a 6-year medical programme. During the 5th year, the class is divided into 4 groups of approximately 60 students each, and each group attends one of four 6-week modules (Psychiatry, Medicine in the Community, Obstetrics & Gynaecology and Paediatrics) at a time during the course of the year. The Psychiatry module is therefore conducted four times in an academic year, and the students are assessed with an OSCE at the end of each 6-week module. In total, 4 OSCEs are run for Psychiatry in an academic year, for approximately 60 students each time.

 

3.2 Format of the OSCE

The OSCE in Psychiatry consists of a circuit of 6 stations, each marked by one examiner. The OSCE is conducted in two parallel circuits of the 6 stations to accommodate the student numbers, with 12 students completing a sitting of the OSCE at a time. Each exam runs 4 sittings of the OSCE. In some stations, a student is assessed on their interaction with, and assessment of, a simulated patient; other stations, known as 'object stations', require the student to answer questions based on a particular clinical scenario presented to them by an 'object', such as a video or a clinical vignette, rather than a patient. In the simulated patient stations, the examiners record their marks on a marking sheet checklist. In the object stations, the students record their answers on a written answer sheet that is later marked by an examiner.

 

3.3 OMIS software

The OMIS software was made available to the examiners on a tablet device (an iPad). Examiners received an individual training session on how to use OMIS and how to use an iPad before the OSCE. The paper assessment form containing the marking criteria was uploaded onto OMIS, and each student was marked using OMIS on the iPad, replacing the old paper assessment method. OMIS instantly calculates and stores the total marks as each student completes a station. Prior to the OSCE, the psychiatry tutors and administration staff created the OSCE stations and input all the necessary data required by OMIS, such as student information and the order of students during the exam.
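To illustrate what the instant calculation of marks involves in practice, the sketch below shows how an electronic marking form might total a station checklist as an examiner enters marks. This is a minimal sketch only, assuming a simple per-item scoring scheme; the class and method names are hypothetical illustrations and do not represent the actual OMIS implementation or its API.

```python
# Minimal sketch of an electronic OSCE marking form that totals a
# checklist in real time. All names are hypothetical; not the OMIS API.
from dataclasses import dataclass, field

@dataclass
class ChecklistItem:
    description: str
    max_score: int
    awarded: int = 0  # entered by the examiner during the station

@dataclass
class StationForm:
    student_id: str
    station: str
    items: list = field(default_factory=list)

    def mark(self, index: int, score: int) -> None:
        """Record a mark for one checklist item, clamped to its maximum."""
        item = self.items[index]
        item.awarded = max(0, min(score, item.max_score))

    def total(self) -> int:
        """Running total, available instantly as marks are entered."""
        return sum(item.awarded for item in self.items)

# Example: marking a hypothetical two-item psychiatry station.
form = StationForm(
    student_id="student-001",
    station="Risk assessment",
    items=[ChecklistItem("Establishes rapport", 2),
           ChecklistItem("Asks about suicidal ideation", 3)],
)
form.mark(0, 2)
form.mark(1, 3)
print(form.total())  # 5 -- no manual tallying or transcription step
```

The point of an electronic form of this kind is that the total is computed and stored at the moment of entry, removing the transcription and addition steps in which errors can arise with paper forms.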

 

3.4 Independent appraisal of the software

The independent appraisal of the software was done with anonymous survey questionnaires completed by the four major stakeholders in the OSCE: examiners, students, the academic team in charge of preparing and implementing the OSCE, and an independent Information Technology (IT) team from the university who also appraised the software. The questionnaires consisted of questions with responses recorded on a Likert scale from 1 to 5 (1 = Strongly disagree, 2 = Disagree, 3 = Neutral, 4 = Agree, 5 = Strongly agree). A few questions required 'yes', 'no' or 'don't know' answers, and some questions were left as a free-text box for responses.

 

3.5 Statistical analysis

All survey results were recorded and analysed using IBM SPSS version 20. The number of responses (n) per item of the questionnaire was recorded and tabulated in the tables of results.
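For illustration only, the tabulation reported in the results tables (the number of responses, n, and the percentage per Likert level for each item) can be expressed as in the short sketch below. The study's analysis was performed in IBM SPSS version 20; this Python equivalent is an illustrative assumption, and the sample responses in it are invented.

```python
# Sketch of the n (%) tabulation per Likert level, as in Tables 1-4.
# Illustrative only: the actual analysis was done in IBM SPSS, and the
# sample responses below are invented.
from collections import Counter

LIKERT = ["Strongly disagree", "Disagree", "Neutral", "Agree", "Strongly agree"]

def tabulate(responses):
    """Return 'n (%)' for each Likert level (coded 1-5) of one item."""
    counts = Counter(responses)
    total = len(responses)
    return {
        label: f"{counts.get(level, 0)} ({100 * counts.get(level, 0) / total:.1f})"
        for level, label in enumerate(LIKERT, start=1)
    }

# Hypothetical responses for one questionnaire item (1-5 coding).
item_responses = [5, 4, 4, 3, 5, 4, 2, 5, 4, 4]
print(tabulate(item_responses))
# {'Strongly disagree': '0 (0.0)', 'Disagree': '1 (10.0)', ...}
```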

Results

4.1 Student feedback

Table 1 shows the feedback from 64 students surveyed during the pilot project of introducing OMIS during an OSCE. In the survey, students were asked if they were aware of examiners using an iPad rather than the traditional paper assessment forms; of the 64, 58 (90.6%) were aware of this and 6 (9.4%) were not. The majority of students indicated that the examiners' use of iPads had neither a positive nor a negative effect on them during the exam. The majority were also not concerned about data protection or confidentiality. It should be noted that, although most indicated relative indifference to the use of iPads, small numbers of students did report that it affected them negatively during the exam (4; 6.3%) or had some concerns about data protection (3; 4.7%).

 

Table 1: Responses from students to the questionnaire survey

| Questionnaire statement | Strongly Disagree, n (%) | Disagree, n (%) | Neutral, n (%) | Agree, n (%) | Strongly Agree, n (%) | Total, n (%) |
|---|---|---|---|---|---|---|
| 1. Examiners using an iPad had a negative effect on me during the exam. | 33 (51.6) | 10 (15.6) | 17 (26.6) | 4 (6.3) | 0 (0.0) | 64 (100) |
| 2. Examiners using an iPad had a positive effect on me during the exam. | 22 (34.4) | 10 (15.6) | 31 (48.4) | 1 (1.6) | 0 (0.0) | 64 (100) |
| 3. Examiners using an iPad did not affect me either positively or negatively during the exam. | 0 (0.0) | 0 (0.0) | 10 (15.6) | 16 (25.0) | 38 (59.4) | 64 (100) |
| 4. I have concerns about the confidentiality of my results on an electronic device. | 37 (57.8) | 17 (26.6) | 9 (14.1) | 1 (1.6) | 0 (0.0) | 64 (100) |
| 5. I have concerns about data protection and security with my information on an electronic device. | 34 (53.1) | 18 (28.1) | 9 (14.1) | 3 (4.7) | 0 (0.0) | 64 (100) |

 

4.2 Examiner feedback

In Table 2, the feedback from a total of 16 examiners involved in the OSCE was overall positive: they found OMIS enjoyable to use and better than the paper forms, and both the software (OMIS) and the hardware (iPads) were easy to use. Almost all of the examiners did not find OMIS distracting from the task of examining, and found that it made examining an OSCE station easier than with the paper assessment forms. Regarding some of the unique features of OMIS, namely the ability to check a summary of one's marks and the marking trends of other examiners, the examiners were overall positive, but a few indicated a neutral response as they were unaware that these features existed. Of note, 2 examiners registered a negative response to the ability to record comments with OMIS on the iPads, and 1 examiner found marking with the paper assessment form easier.

 

Table 2: Responses from examiners to the questionnaire survey

| Questionnaire statement | Strongly Disagree, n | Disagree, n | Neutral, n | Agree, n | Strongly Agree, n | Total, n |
|---|---|---|---|---|---|---|
| 1. I enjoyed examining the OSCE with OMIS. | 0 | 0 | 0 | 7 | 9 | 16 |
| 2. I prefer using OMIS over the paper assessment form. | 0 | 1 | 0 | 9 | 6 | 16 |
| 3. OMIS was easy to use. | 0 | 0 | 0 | 13 | 3 | 16 |
| 4. The iPad was easy to use. | 0 | 0 | 0 | 12 | 4 | 16 |
| 5. I found using OMIS distracting from my task of examining students during the OSCE. | 5 | 10 | 0 | 1 | 0 | 16 |
| 6. OMIS made the task of examining an OSCE easier for me. | 0 | 0 | 1 | 11 | 4 | 16 |
| 7. I found the feature of being able to see a summary of my marks useful. | 0 | 0 | 3 | 8 | 5 | 16 |
| 8. I found the feature of being able to see the marking trends of other examiners useful. | 0 | 0 | 6 | 8 | 2 | 16 |
| 9. OMIS made the task of recording feedback/comments on a student's performance easier for me. | 0 | 2 | 2 | 9 | 3 | 16 |

 

4.3 Academic team feedback

In Table 3, the feedback from the 5 members of the academic team who were involved in preparing, administering and implementing the OSCE was overall very positive. All five members agreed that OMIS was enjoyable to use and showed a preference for OMIS over the typical paper method. All also agreed that OMIS and the iPads were easy to use. All members agreed that OMIS saved time both in preparing the OSCE and in tabulating results, with a reduction in the chance of errors occurring. In a similar trend to the other stakeholders, a slightly more varied response was obtained for Statement 7, concerning the summary marks feature in OMIS, where one academic indicated a neutral response. In further feedback, the academic team concluded that OMIS reduced the approximate costs of, and time spent on, preparing and implementing the OSCE by approximately 50%.

 

Table 3: Responses from the academic team to the questionnaire survey

| Questionnaire statement | Strongly Disagree, n | Disagree, n | Neutral, n | Agree, n | Strongly Agree, n | Total, n |
|---|---|---|---|---|---|---|
| 1. I enjoyed using OMIS. | 0 | 0 | 0 | 3 | 2 | 5 |
| 2. I prefer using OMIS to manage the OSCE over previous methods. | 0 | 0 | 0 | 1 | 4 | 5 |
| 3. OMIS was easy to use. | 0 | 0 | 0 | 4 | 1 | 5 |
| 4. The iPad was easy to use. | 0 | 0 | 0 | 5 | 0 | 5 |
| 5. I found using OMIS saved time in preparing the OSCE. | 0 | 0 | 0 | 4 | 1 | 5 |
| 6. OMIS made the task of managing the OSCE easier. | 0 | 0 | 0 | 4 | 1 | 5 |
| 7. I found the features of being able to see the summaries and marking trends of the examiners helpful. | 0 | 0 | 1 | 1 | 3 | 5 |
| 8. Using OMIS saved time in tabulating results after the OSCE. | 0 | 0 | 0 | 1 | 4 | 5 |
| 9. Using OMIS reduced errors while tabulating results after the OSCE. | 0 | 0 | 0 | 1 | 4 | 5 |

 

4.4 IT Team Feedback

We asked two independent IT experts attached to the university to appraise the software. Both experts agreed that OMIS was user-friendly, well designed and fulfilled its function well as an OSCE management software. The IT experts were generally unaware of any similar software to OMIS. Both registered neutral responses regarding the security of data with OMIS, and one registered a negative response regarding the confidentiality of results. One expert also gave a neutral response regarding the ability of OMIS to provide backup of stored information. We also included free-text responses on what the experts thought were the best and worst features of OMIS. One expert cited the ease of use, the efficiency and the ability to instantly calculate results as the best features of OMIS. The second expert stated that the multiplatform format of OMIS and its flexibility were its strengths. Among the worst aspects of OMIS, according to the experts, was that more information on, and examination of, its security aspects, such as encryption, was needed. To a lesser degree, one expert felt that OMIS could benefit from more styling in its appearance to appeal more to the user, and that a greater variety of OSCE forms was needed to design a wider variety of OSCE stations. They also agreed that the current version of OMIS was not optimised for tablet devices such as iPads, which could lead to difficulties for users navigating the software on a tablet rather than on a desktop or laptop device.

Table 4: Responses from the IT team to the questionnaire survey

| Questionnaire statement | Strongly Disagree, n | Disagree, n | Neutral, n | Agree, n | Strongly Agree, n | Total, n |
|---|---|---|---|---|---|---|
| 1. OMIS is a user-friendly software. | 0 | 0 | 0 | 1 | 1 | 2 |
| 2. OMIS is a well-designed software. | 0 | 0 | 0 | 2 | 0 | 2 |
| 3. OMIS fulfils its function well as an OSCE management software. | 0 | 0 | 0 | 2 | 0 | 2 |
| 4. I am aware of better OSCE management software. | 1 | 0 | 1 | 0 | 0 | 2 |
| 5. Compared to other similar software, OMIS offers more advantages. | 0 | 0 | 2 | 0 | 0 | 2 |
| 6. OMIS provides secure data protection. | 0 | 0 | 2 | 0 | 0 | 2 |
| 7. OMIS provides adequate features to maintain the confidentiality of students' results. | 1 | 0 | 1 | 0 | 0 | 2 |
| 8. OMIS is reliable in providing backup of stored information. | 0 | 0 | 1 | 1 | 0 | 2 |

Discussion

5.1 Main Findings

The main finding of this study was that the OMIS software was overall a positive enhancement to the current OSCE format of examination in psychiatry. Feedback from examiners was positive: they preferred OMIS to the typical paper assessment form, found the software easy to use, and felt it made the task of examining easier and more enjoyable. The examiners gave slightly lower scores for specific features of OMIS, largely because they were unaware of these functions of the system, but indicated that they would have found them useful had they been aware. The ability to record comments was also somewhat difficult for some examiners, possibly because the OMIS system was not optimised for use on tablet devices.

           

Students were generally neutral towards the use of OMIS during an OSCE, indicating that it did not affect them either positively or negatively. A small minority reported that it had a negative effect on them during the OSCE and had concerns about the confidentiality and security of the software. The appraisal from the two independent IT experts was favourable: they thought that OMIS was an easy-to-use and well-designed OSCE management tool. They shared some concerns over the software's security and confidentiality features but overall were unaware of any major faults with the system. Its multiplatform format and ease of use were particular strengths, but the current version was not optimised for tablet devices, which made tasks such as recording comments without a separate keyboard slightly more cumbersome for examiners.

 

5.2 Strengths and limitations of the study

To our knowledge, this study is the first independent appraisal of the OMIS software for an OSCE in psychiatry. The study offers a rounded, unbiased appraisal from four different perspectives involved in an OSCE (students, examiners, the academic team and IT experts). As highlighted in the introduction, an OSCE is often a laborious undertaking involving many different parties, with high demands on cost, time and effort in many schools. We hope that this study provides some positive findings regarding the introduction of technology such as OMIS that could help mitigate some of these factors and continue to support the ongoing use of the OSCE format as an effective method of assessing competencies in medical education.

A relative weakness of the study is that its population size is small and limited to an OSCE in psychiatry. While the format of a psychiatry OSCE is not vastly different from that of other subjects, the results are perhaps not entirely generalisable to OSCEs in other subjects. In addition, the lack of a control group (one where the examiners used the typical paper method of assessment) prevents the authors from concluding more definitively that OMIS was superior to the typical paper method; however, the inclusion of comparison statements between OMIS and the paper method in the survey questionnaire overcame this somewhat. This study did not include a cost-benefit analysis of the implementation of OMIS, as this was a pilot project, and therefore the authors cannot comment on this aspect of the addition of OMIS to the OSCE.

Conclusion

In conclusion, OMIS presents as a promising enhancement, incorporating technology into the psychiatry OSCE examination and offering many advantages over the traditional paper-based method.

Take Home Messages

OMIS presents as a promising enhancement, incorporating technology into the psychiatry OSCE examination and offering many advantages over the traditional paper-based method.

Notes On Contributors

Dr Fernandez is a Consultant Psychiatrist working in St Vincent's University Hospital, Dublin, Ireland. He has an interest in Medical Education and is actively involved in the teaching and examining of Psychiatry for medical students in University College Dublin.

Professor Guerandel is a Consultant Psychiatrist in St Vincent's University Hospital. She is Module Coordinator for the teaching of Psychiatry to medical students in University College Dublin. Prof Guerandel has a special interest in Medical Education and has introduced and investigated online resources to support and enhance teaching and assessment.

Acknowledgements

We have been using Qpercom software successfully for the last few years.

Bibliography/References

Brosnan, M., Evans, E., Brosnan, E. & Brown, G., (2006). 'Implementing objective structured clinical skills evaluation (OSCE) in nurse registration programmes in a centre in Ireland: a utilisation focussed evaluation'. Nurse Education Today, pp. 115-122. https://doi.org/10.1016/j.nedt.2005.08.003

 

Brown, C., Ross, S., Cleland, J. & Walsh, K., (2015). 'Money makes the (medical assessment) world go around. The cost components of a summative final year Objective structured clinical exam (OSCE)'. Medical Teacher, pp. 1-7.

 

Carraccio, C. & Englander, R., (2000). 'The objective structured clinical examination: a step in the direction of competency-based evaluation'. Archives of Pediatrics and Adolescent Medicine, pp. 736-741. https://doi.org/10.1001/archpedi.154.7.736

 

Cusimano, M. et al., (1994). 'A comparative analysis of the costs of administration of an OSCE (objective structured clinical examination)'. Academic Medicine, pp. 66-69. https://doi.org/10.1097/00001888-199407000-00014

 

Farahat, E. et al., (2015). 'Objective structured clinical examination (OSCE) improves perceived readiness for clinical placement in nutrition and dietetic students'. Journal of Allied Health, pp. 208-214.

 

Frye, A., Richards, B., Philp, E. & Philp, J., (1989). 'Is it worth it? A look at the costs and benefits of an OSCE for second- year medical students'. Medical Teacher, pp. 291-293. https://doi.org/10.3109/01421598909146415

 

Gormley, G., (2011). 'Summative OSCEs in undergraduate medical education'. Ulster Medical Journal, pp. 127-132.

 

Harden, R. & Gleeson, F., (1975). 'Assessment of clinical competence using objective structured examination'. British Medical Journal, p. 447. https://doi.org/10.1136/bmj.1.5955.447

 

Hasle, J.L., Anderson, D.J. & Szerlip, H.M., (1994). 'Analysis of the costs and benefits of using standardized patients to help teach physical diagnosis'. Academic Medicine, pp. 567-570. https://doi.org/10.1097/00001888-199407000-00013

 

Hodges, B., Hanson, M., McNaughton, N. & Regehr, G., (1998). 'Validation of an objective structured clinical examination in psychiatry'. Academic Medicine, pp. 910-912. https://doi.org/10.1097/00001888-199808000-00019

 

Hodges, B. et al., (2014). 'The psychiatry OSCE: A 20-year retrospective'. Academic Psychiatry, pp. 26-34. https://doi.org/10.1007/s40596-013-0012-8

 

Kropmans, T. et al., (2015). 'Back to the Future: Electronic marking using an online OSCE management information system in schools of health sciences'. Journal of Medical Students Galway, pp. 32-34.

 

Kropmans, T. et al., (2012). 'An online management information system for Objective Structured Clinical Examinations'. Computer and Information Science, pp. 38-48.

 

Ladyshewsky, R., Baker, R., Jones, M. & Nelson, L., (2000). 'Reliability and validity of an extended simulated patient case: A tool for evaluation and research in physiotherapy'. Physiotherapy theory and practice: An international journal of physiotherapy, pp. 15-25. https://doi.org/10.1080/095939800307575

 

Patrício, M.F., Julião, M., Fareleira, F. & Carneiro, A., (2013). 'Is the OSCE a feasible tool to assess competencies in undergraduate medical education?'. Medical Teacher, pp. 503-514.

 

Meskell, P. et al., (2015). 'Back to the future: An online OSCE management information system for nursing OSCEs'. Nurse Education Today, pp. 1091-1096. https://doi.org/10.1016/j.nedt.2015.06.010

 

Palekar, T., Baxi, G. & Anwer, S., (2015). 'Introducing objective structured practical examination in physiotherapy'. National Journal of Integrated Research in Medicine, pp. 66-69.

 

Pugh, D., Touchie, C., Wood, T. & Humphrey-Murto, S., (2014). 'Progress testing: is there a role for the OSCE?'. Medical Education, pp. 623-631. https://doi.org/10.1111/medu.12423

 

Qpercom, (2016). Qpercom global leader in clinical assessment software. [Online] Available at: https://www.qpercom.com/

 

Selim, A., Ramadan, F., El-Gueneidy, M. & Gaafar, M., (2012). 'Using Objective Structured Clinical Examinations (OSCE) in undergraduate psychiatric nursing education: is it reliable and valid?'. Nurse Education Today, pp. 283-288. https://doi.org/10.1016/j.nedt.2011.04.006

 

Smith, V., Muldoon, K. & Biesty, L., (2012). 'The objective structured clinical evaluation (OSCE) as a strategy for assessing clinical competence in midwifery education in Ireland: a critical review'. Nurse Education in Practice, pp. 242-247. https://doi.org/10.1016/j.nepr.2012.04.012

 

Wass, V., Jones, R. & Van Der Vleuten, C., (2001). 'Standardised or real patients to test clinical competence? The long case revisited'. Medical Education, pp. 321-325. https://doi.org/10.1046/j.1365-2923.2001.00928.x

Appendices

None.

Declarations

There are no conflicts of interest.
This has been published under Creative Commons "CC BY-SA 4.0" (https://creativecommons.org/licenses/by-sa/4.0/)

Ethics Statement

Institution: University College Dublin, Ireland. There was no requirement for an application for an ethics review as at the time the study was undertaken it was considered to be part of teaching practice.

External Funding

This paper has not had any External Funding

Reviews


Richard Hays - (14/03/2019) Panel Member
While I accept that online management systems for OSCEs have many advantages and are probably becoming the norm, a paper about just one proprietary system has major weaknesses. There are several different systems in the market, most offering similar functions. A more interesting paper would be a comparison of the systems - I recognise this would be difficult - but I do not think that academic papers should appear to be supporting a product rather than the principle that online management systems can improve management efficiency. What should other potential buyers look for in the available systems? What were the problems that other developers could avoid? Lessons learned for broader adoption would be most helpful.
Possible Conflict of Interest:

I am Editor of MedEdPublish

Sathyanarayanan Varadarajan - (13/03/2019) Panel Member
This novel and interesting paper is on evaluating the Online Management Information System (OMIS) for OSCEs as a replacement for the old paper assessment method.

The authors independently evaluated the use of OMIS through feedback from a survey of all key stakeholders involved in developing, administering and implementing an OSCE in an Irish Medical School, as a pilot project. The questionnaires consisted of questions with responses recorded on a Likert scale from 1 to 5, and were completed by the four major stakeholders in the OSCE: examiners, students, the academic team and an independent Information Technology (IT) team.

The authors highlighted the feedback of all stakeholders excellently and described in detail the feedback (both positive and negative) of all four stakeholders separately. They also discussed the strengths and limitations of the study.

The authors conclude that OMIS presents as a promising tool to enhance the implementation and delivery of an OSCE exam over the typical paper format of the exam.

This paper is an important contribution in the area of new education tools, not only to the field of psychiatry but also to all the clinical fields where the OSCE is being used as an assessment method.
Mary Anne Cordero - (07/03/2019)
I agree that the Objective Structured Clinical Exam (OSCE) is currently considered one of the popular methods of medical examination used by a number of medical colleges. It has high validity and reliability in assessing clinical competencies, given that standards in preparing OSCE stations are followed. However, as reported by a number of studies, conducting an OSCE is expensive, time consuming and requires a lot of paperwork. Thus, studies like this are very welcome, as they provide a possible avenue to address the above-mentioned drawbacks in the conduct of the OSCE.

There were some points on which I would have liked more information and detail, such as: more description of, and information about, the Online Management Information System (OMIS) as specifically used in your pilot study. Were there any limitations in its use? Were students also oriented regarding the use of OMIS in OSCE assessment? Examiners received an individual training session on how to use OMIS prior to the exam; how about the students? Were they also trained, since they also need to record or write their answers in the object stations?

Results of the study showed students' neutral response to OMIS, and a few participants expressed concerns over data security and difficulty with some software features. Can this be expounded and discussed further? Are there recommendations arising from these findings?

Since students are the recipients of this assessment method, do you think these results will significantly affect students' performance in the OSCE exam? Do the authors have any plans to evaluate this further, taking into account students' performance in a paper-based OSCE compared with using OMIS?

Lastly, I congratulate the authors for their efforts in improving the assessment methods in medical education specifically in assessing students’ clinical competence.
Junichi Tanaka - (06/03/2019)
It was a very interesting article. I think that evaluation by electronic device is better than paper, and I hope it spreads further. However, one opposing opinion raises concern about the device breaking down while an evaluation is in progress; I have not yet found an answer to this.
Felix Silwimba - (01/03/2019)
It is a good, informative medical education student assessment study that has brought to light the role of IT in improving assessments and reducing costs.