Research article
Open Access

Students designing assessment for future practice

Michaela Kelly[1], Anna Ryan[2], Margaret Henderson[1], Hannah Hegerty[3], Clare Delany[2]

Institution: 1. University of Queensland, 2. University of Melbourne, 3. Queensland Health
Corresponding Author: Dr Michaela Kelly ([email protected])
Categories: Educational Strategies, Students/Trainees, Teaching and Learning
Published Date: 08/06/2018

Abstract

Background

Multiple choice questions (MCQs) are used at all stages of medical training. However, their focus on testing students’ recall of facts rather than actively facilitating learning remains an ongoing concern for educators.  Having students develop MCQ items is a possible strategy to enhance the learning potential of MCQs.

Methods

Medical students wrote MCQs as part of a course on the medical care of vulnerable populations. Student perceptions of learning and assessment through MCQ writing were explored via surveys and focus group interviews. Survey responses were analysed using descriptive statistics and transcribed interviews were analysed thematically.

Results

Students reported that writing MCQs enhanced their learning and exam preparation and reduced their exam-related anxiety. It encouraged students to research what they did not know and to benchmark their learning against that of their peers. Students described using deep learning strategies, were motivated to write high quality MCQ items for their peers and prioritised vocational learning in the development of their questions.

Conclusion

The study suggests student-developed MCQs can enhance the learning value of MCQs as a form of assessment. It also highlights that students can be capable designers of assessment and that learning processes can be improved when students are given agency over their learning and assessment.

Keywords: student-developed MCQs; student-generated MCQs; vocational learning; peer learning; medical students

Introduction

Assessment has a powerful effect on learning, influencing both the learning approach a student adopts and the cognitive skills utilised (Combs, Gibson, Hays, Saly, & Wendt, 2007; Jensen, McDaniel, Woodard, & Kummer, 2014; Larsen, Butler, & Roediger, 2008; Wormald, Schoeman, Somasunderam, & Penn, 2009).  Multiple-choice questions (MCQs) have been favoured as an efficient and pragmatic method of assessment in the health professions (Case & Swanson, 2000; Epstein, 2007). The single-best-answer MCQ style is commonly used to assess how students apply their knowledge to specific clinical scenarios (Surry, Torre & Durning, 2017; Swanson, Holtzman, & Allbee, 2008; Tan, McAleer, & Final, 2008).

However, criticisms of MCQ assessment include the privileging of instructor perspectives, failure to acknowledge plurality of knowledge, and difficulty addressing relativities and uncertainties (Epstein, 2007). MCQs may not support the robust clinical reasoning required for future practice, instead encouraging early closure in problem-solving and reducing opportunities to consider all diagnostic possibilities (Epstein, 2007; Schuwirth, Van Der Vleuten & Donkers, 1996). Designing MCQs that promote synthesis of information and use of constructive thinking skills, at the summit of Bloom’s taxonomy, is more challenging (Albanese & Dast, 2014; Bloom et al., 1956). MCQ assessment has been associated with inferior retention of knowledge compared to assessment that requires the learner to construct an answer (Larsen, Butler & Roediger, 2008; Yu Tsao Pan et al., 2016). A further criticism is that testing via MCQs without providing feedback increases the possibility that students will acquire false knowledge (Roediger & Marsh, 2005; Schooler, Foster & Loftus, 1988).

These critiques sit uneasily with higher education goals underpinned by adult learning theory, where the focus is on providing learners with opportunities to be self-directed, to diagnose their own learning needs and to think autonomously (Knowles, Holton, & Swanson, 2015). They also challenge the role of academics in designing assessments which achieve the pedagogical goals of lifelong and self-directed learning (Boud & Falchikov, 2007; Sadler, 2010).

In this study, we combined the pedagogical concepts that inform adult learning theory with the idea that assessment should drive and sustain learning, by requiring medical students to write their own MCQs as a component of their assessment (Knowles, Holton & Swanson, 2015; Taylor & Hamdy, 2013). There is evidence that medical students can write high quality MCQs (Bottomley & Denny, 2011; Chamberlain et al., 2009; Galloway & Burns, 2015; Harris et al., 2015; Palmer & Devitt, 2007); that they find the process engaging and beneficial for learning and exam preparation (Craft et al., 2017; Fellenz, 2004; Gonzales-Cabezas et al., 2015; Gooi & Sommerfeld, 2015; Jobs et al., 2013); and that they have reported increased confidence and more reflective learning (Baerheim & Meland, 2003). These positive effects also appear to transfer to other forms of assessment (Yu Tsao Pan et al., 2016). Potential disadvantages include students adopting a minimalist approach if they perceive no reward for their effort (Jobs et al., 2013), the potential for poor retention of knowledge (Hoogerheide et al., 2018) and a lack of student favour (Palmer & Devitt, 2006). These studies highlight the need to provide guidance and feedback about the quality of MCQ items, for students to justify the correct answer, and for students to be able to identify meaning and relevance in the task.

In this study, we describe a process of integrating student-developed MCQs into a medical course which included feedback and incentives. We report students’ experiences and interpretations of this task with a specific emphasis on the type of learning they engaged in and how they made sense of the learning and assessment experience.

Methods

This investigation was conducted within the primary care clinical unit of an Australian university. Participants were third- and fourth-year medical students completing an 8-week course focusing on the medical care of vulnerable people, as part of a 4-year Doctor of Medicine (MD) program. Data collection occurred from February to September 2017 and included 106 students from three consecutive course cohorts.

Ethics approval was obtained from The University of Queensland Human Research Ethics Committee B (approval number 2017000192).

Process

Figure 1 Student-developed MCQ Model

The model of student-developed MCQs designed for this study is summarised in Figure 1. Each student was asked to develop and submit one multiple-choice question (MCQ) per week over 5 weeks of the 8-week course. Students were provided with an MCQ writing guide and checklist. This information ensured students were aware of the criteria for well-constructed MCQs and supported the development of evaluative skills (Sadler, 2010). One mark was awarded per completed question, contributing to the final written exam mark, in order to motivate students to engage in the process (Taylor & Hamdy, 2013).

Students were asked to (1) write single-best-answer style questions with a clinical scenario stem based on the topics and learning resources provided, (2) include a justification of the answer and (3) identify the learning resource(s) used. Questions were reviewed by an academic, and 10 of the submitted questions each week were selected to form a quiz for the course cohort, resulting in a total of 50 formative questions over the duration of the course. Once students completed the quizzes they could access the correct answers with justifications. From the cumulative pool of 50 formative questions, 20 were selected for inclusion in the final written exam. This meant that 25% of the written exam mark (a 5-mark credit for submitting 5 questions and 20 marks derived from the formative pool) was student-owned, with the remaining unseen written paper worth 75 marks. This aspect was included to ensure students saw the relevance of their contribution and to provide them with an opportunity to shape the learning for themselves and for the broader cohort (Taylor & Hamdy, 2013). The written paper constituted the knowledge assessment for the course and represented one third of the overall assessment weighting, with work-based and reflective practice skills representing the remaining two thirds.
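As a concrete illustration of the weighting described above, the sketch below reproduces the arithmetic of the mark allocation. The numbers come from the text; the function and variable names are hypothetical and are not part of any course software.

```python
# Illustrative sketch only: reproduces the mark weighting described in the text.
# Numbers are taken from the article; names are hypothetical.

def written_exam_breakdown(submitted_questions: int = 5,
                           pool_questions_on_exam: int = 20,
                           unseen_marks: int = 75) -> dict:
    """Return the composition of the written exam mark under this model."""
    submission_credit = submitted_questions            # 1 mark per submitted MCQ
    student_owned = submission_credit + pool_questions_on_exam
    total = student_owned + unseen_marks
    return {
        "total_marks": total,                          # 100
        "student_owned_marks": student_owned,          # 25
        "student_owned_share": student_owned / total,  # 0.25, i.e. 25%
    }

if __name__ == "__main__":
    print(written_exam_breakdown())
```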

Evaluation methods

A mixed-methods approach was used to obtain both quantitative and qualitative data. Following their written exam, students were invited to complete a survey and participate in focus groups about the student-developed MCQs. The paper survey, developed for this study, involved Likert-scale responses to statements about student learning, assessment and personal experiences. Focus groups, facilitated by an independent medical educator, explored the associated learning and assessment experiences of students.

Analysis

Survey responses were analysed with descriptive statistics using IBM SPSS statistical software. Focus group transcripts were examined using inductive content analysis, allowing insights about students’ perspectives and views to emerge from the data (Hsieh & Shannon, 2005). These insights were developed as codes and scrutinised for patterns of ideas and themes extending across interviews.
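For readers wishing to reproduce the kind of descriptive summary reported in Table 1, the following is a minimal sketch in Python/pandas. The authors used IBM SPSS, so this is only an illustrative equivalent; the item names and response data shown are hypothetical.

```python
# Minimal sketch of the descriptive summary reported in Table 1.
# The authors used IBM SPSS; this pandas equivalent uses hypothetical data
# and item names purely for illustration.
import pandas as pd

# Example Likert responses (SD, D, N, A, SA) for two hypothetical survey items.
responses = pd.DataFrame({
    "writing_mcqs_enhanced_learning": ["A", "SA", "A", "N", "A", "SA", "D", "A"],
    "quizzes_assisted_learning":      ["SA", "A", "A", "A", "N", "SA", "A", "A"],
})

# Percentage of respondents agreeing (A or SA) with each statement.
agreement = (responses.isin(["A", "SA"]).mean() * 100).round(1)
print(agreement)
```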

Results

Survey

In total, 84% (89 of 106) of students completed the survey; results are summarised in Table 1. Students found writing MCQs a positive experience: 86% agreed that writing MCQs enhanced their learning and 89% that it encouraged them to engage more with the learning resources. 75% agreed that writing MCQs increased their pre-exam confidence, 68% that it reduced their exam-related anxiety and 53% that it improved their skills in answering MCQs. 72% agreed that writing MCQs was a valuable use of their time, and 58% found it enjoyable.

91% of students agreed that the student-developed formative quizzes assisted their learning, 89% that the quizzes encouraged them to research what they did not know, and 69% that the quizzes helped benchmark their learning against that of their peers.

90% of students agreed that writing MCQs was a useful form of assessment, 83% that it provided them with ownership of their assessment and 63% that it encouraged collaborative learning.

Table 1. Student survey responses (SD = Strongly Disagree, D = Disagree, N = Neither agree nor disagree, A = Agree, SA = Strongly Agree)

Focus group discussions

Ten students participated in three focus groups (4, 3, and 4 respectively), with two overarching themes emerging:

  1. Students as designers of assessment
  2. Linking assessment to vocational needs

(A supplementary table of representative quotes is available for interest)

Theme 1: Students as designers of assessment

Students, as consumers of learning and assessment processes, felt that their extensive prior experience of answering MCQs, in exams and via commercial question banks, enabled them to critically appraise the learning value of MCQ items.

At this stage of our career, in theory we’ve done thousands of questions. Every day we do some kind of question (Quote1FG1S2)

Students questioned the purpose and value of some MCQ exams. They preferred questions where an answer was clearly correct rather than contentious, and criticised questions which seemed designed to ‘trick’ them. Obscure or ‘left-field’ questions were considered unhelpful; students believed assessment should focus on understanding important concepts rather than on test-taking (gaming) skills.

I don’t feel like [faculty MCQ exams] assesses my knowledge of the material; it assesses my test-taking skills and whether or not I can work around that MCQ to figure out what is most correct. (Quote6FG1S1)

Students described superficial approaches to learning when preparing for multiple-choice exams. They identified ‘deep learning’ as related to reasoning, connecting ideas and applying knowledge; and ‘superficial learning’ as memorising, accepting without thinking and reproducing facts. They described having to know ‘facts’ and scanning resources for likely multiple-choice-question fodder to prepare for MCQ exams. They spoke of preferring a deep learning approach and believed it was more valuable to their future career.

In the end, we are trying to be doctors. What‘s the point of memorising facts when we’re trying to help our patients? (Quote7FG2S1)

Students described how the opportunity to devise questions for themselves and their peers enhanced their engagement with learning materials, requiring higher cognitive skills and a more disciplined approach.

It was a good way to study in a very different way. Where you’re not just reading the material but you’re really synthesising it and trying to apply it to a different scenario (Quote17 FG2S3)

Students associated this agency over their assessment with a reduction in assessment-related anxiety.

It changed my whole outlook on MCQs. Normally I really stress out about school of medicine MCQs because sometimes they’re really out of left field or you don’t understand their wording (Quote31FG1S2)

They described synergistic learning benefits from engagement with writing MCQs and completing student-developed MCQ quizzes.

So, you learn from writing the question and learn from doing the formative quizzes (QuoteFG2S3)

The student-developed quizzes were identified as the most useful formative questions they had encountered.

I liked how when you reviewed the quiz you are given the right answer and feedback. (Quote25FG1S4).

Students described frustration with assessments where correct answers were not revealed or explanations were inadequate. The strategic nature of their learning was evident in descriptions of mindless navigation of formative quizzes without feedback. 

It is really frustrating when you do formatives and you don’t know the answer and you have to do the quiz 15 times to get it right (Quote10FG1S4)

Students acknowledged the time and cognitive effort required to write good quality questions. They approached the task as ‘academic designers’, engaged in practices normally ascribed to the academic. Some students chose to write questions about concepts that lecturers or clinical preceptors had signposted as difficult, with the intent of ensuring they understood that concept. Some wrote questions on topics that were new or interesting, and others gravitated to material they found challenging in order to promote a better understanding.

[I wrote questions about] mainly things that I just had no idea about. So, it’d be like I’m learning it for the first time and then I’d feel like it was applicable and beneficial (Quote14FG3S3)

Reflecting on their experience, students identified that a greater understanding of the material was required to write a multiple-choice question (mandating a deeper learning style), and that having to justify the answer was analogous to teaching someone: in order to teach, one needed to understand the material well. They described using a different approach to their learning.

It’s kind of like in a way teaching other people which I think is a good way to learn. By writing a question you have to know it enough to teach it to someone else (Quote16FG1S3)

Finally, students felt strongly that the approach to assessment in general should be improved. They argued that a 50% pass mark devalued the knowledge assessed.

Shouldn’t the aim of exams be that students get most questions right? Like, if there is a low pass mark, why aren’t people concerned by what students get wrong?(Quote9 FG1S2)

Students critically appraised the design and purpose of assessment through the lens of expert exam-takers and articulated clear preferences for assessment content. This contributed to the second theme centred on assessment for vocational purpose.

Theme 2: Linking assessment to vocational purpose

This theme emerged from students’ discussions of how their experience of preparing for typical MCQ exams can handicap the real learning required for future practice as doctors.

More than half the questions don’t actually help us in the clinical setting, but it’s something that we have to learn because we have to do the MCQ exam to be able to pass (Quote8FG2S2)

A tension was evident between the need to be strategic learners for MCQ exams and a focus on vocational learning needs. Students, however, wrote questions that matched their learning needs.

I felt like these [student developed MCQs] were all legitimate questions that everybody who finishes this course should be expected to know. That’s what I appreciated about it (Quote4FG1S2)

At the forefront of student deliberations was the priority of vocational learning. Students expressed a desire to engage deeply with knowledge relevant for future practice and to be provided the opportunity to explain their answers and weigh up the pros and cons of diagnosis and treatment. They believed MCQ exams did not allow them to make this reasoning visible.

MCQ exams make you want to rote learn all the different steps rather than maybe sit and think about what we are trying to achieve in treatment and work it that way (Quote2FG1S2)

Students identified that writing a clinical scenario stem helped them focus on the patient and many based the scenario on patients they had encountered on clinical placement. Students described this process as enhancing their clinical reasoning skills.

It made you really think about the patient ... I was like-oh this’ll be easy, I’ll just write this out and then I was like-wait, does that patient actually present like that or would they actually say that or will they feel this way. (Quote19FG3S4)

Students favoured questions that required them to make decisions about clinical problems and appreciated a well-written clinical stem (Quote 20).

More useful types of questions for learning were ’what would you do in this situation’ ‘what’s next in the management of this patient’ or ‘what should be your first step’ (Quote20FG1S2)

This model provided impetus for students to engage in a community of practice with the formative quiz acting as a portal. Students were scattered across hospital campuses and the formative quizzes connected them as learners. Observing questions written by their peers sparked a level of competitiveness and motivation amongst students. This created a peer-learning quality-improvement process where students attempted to improve their MCQ items to match the quality of items produced by peers.

Some of the questions were very in depth and I was like ’oh, I need to step it up’ like everyone’s really putting in effort (Quote32FG1S2)

Comparing content and understanding was helpful, with students keen to share in their peers’ learning from different clinical placements.

The fact that we were all on different clinical placements, it was helpful to see whether or not you understand the material but also what others found important (Quote34FG3S2)

On a negative note, one student indicated that she thought one of the student-developed questions was sourced from a commercial question bank, an idea met with disapproval from the group.

Students believed they had the capacity to contribute to item development and that this had learning benefit. They appreciated the opportunity to collaborate with peers to develop formative quizzes, and to be members of a community of assessment and learning practice. Students valued being given the responsibility to contribute to their own assessment and to engage in meaningful learning. In designing their MCQs, they particularly focused on ensuring the questions were relevant to their future clinical work.

When asked whether the intervention should continue, all students agreed it should, acknowledging that it fostered deeper learning.

Discussion

In this paper we describe and evaluate a model of integrating student-developed MCQs into a clinical course within a medical program.

A key finding was that students identified a shift in their learning approaches from superficial to deep when given the opportunity to devise MCQs. They also experienced reduced exam-related anxiety. Consistent with adult learning theory (Knowles et al., 2015), providing students with agency over their assessment facilitated an engaged peer-learning environment where vocational learning was privileged.

Writing assessment items provided students with a learning platform to engage with, test and then reflect on their knowledge, enhancing their assessment literacy (Deeley & Bovill, 2017; Smith et al., 2013). Students showed they could be effective and competent designers of assessment items and understood the limitations associated with MCQs. Students’ frustration with the nature of MCQ assessment was compounded by a lack of feedback.

By evaluating students’ responses, this research provides insight into how students understand, interpret and critique the learning and evaluative components of MCQs. In particular, students prioritised the vocational learning previously described by Mattick and Knight (2009) and expressed a desire for assessment to support this. Their recognition of the need for assessment to drive learning relevant to future work is well established in higher education assessment discourse (Foos & Fisher, 1988; McDaniel et al., 2007). Writing a clinical scenario question-stem and having to justify the correct option was identified by students as enhancing their clinical reasoning and encouraging them to be patient focused. This is supported by studies showing that questions incorporating rich descriptions of the clinical context require more complex cognitive processes to answer and are therefore more representative of clinical practice (Schuwirth et al., 2001; Schuwirth & Van Der Vleuten, 2004).

In this study, students not only created the context, developed the question and justified the answer for their own contributions; they also answered the same style of questions constructed by their peers. The formative quizzes connected their learning experiences and allowed them to learn from each other despite being geographically dispersed. Such advantages of peer learning have been described previously (Boud, Cohen & Sampson, 2001; Secomb, 2008).

The practical implications of these findings are summarised in Table 2.

Table 2 Tips for incorporating student-developed MCQs into the curriculum

A strength of this study was the use of a mixed-methods approach: the survey provided an overall measure of the experience of the cohort, and the focus groups allowed a deeper exploration of the student experience of learning and assessment. However, there are several limitations. Data collection relied on the participation of suitably motivated students, so the data may be skewed in a favourable direction. The number of students who participated in the focus-group discussions was small; however, the themes concerning the student experience of learning and assessment were expressed consistently across all groups. The survey and focus groups were conducted immediately after the written exam, which may have given students a better opportunity to reflect on the effectiveness of their learning and their experience of assessment; however, students who found the exam difficult may have been more negative about the process. Finally, the potential for plagiarism of questions has been noted in previous studies (Harris et al., 2015); this was not guarded against in the present study and, although not overt, may have occurred.

Several avenues for further investigation of student-developed MCQs emerged including: measures of retention of learning, standard setting methods for student-developed questions, and whether students engage clinical preceptors in the process.

This intervention invited students to be designers of, and collaborators in, their assessment. Each student had the opportunity to partner in assessment, and the process appeared to harness their creativity and perspective, demanding a high level of participation. The model weakened the power differential between academic and student and reframed the relationship as mutually beneficial, acknowledging that students are key protagonists in the learning process and sophisticated in understanding their own learning needs. Learning and assessment became a more democratic process as students became willing and proficient co-creators. Providing students with this agency enhanced the meaning and value of learning and, as a result, students privileged assessment targeted at their future vocational needs.

Take Home Messages

  • Involving students in MCQ writing promotes deeper learning and models the important lifelong learning skills of independence and agency over learning.
  • Given the opportunity and appropriate feedback, students have the expertise to develop high quality assessment questions for MCQ examinations.
  • Poorly designed MCQ exams may discourage or distract students from important learning.
  • Feedback on formative and summative MCQs enhances learning.
  • Students want assessment to have vocational relevance.

 

Notes On Contributors

Michaela Kelly, Academic coordinator-Medicine in Society (Vulnerable Populations), Senior Lecturer, Primary Care Clinical Unit, Faculty of Medicine, The University of Queensland

Anna Ryan, Senior lecturer & research fellow, Department of Medical Education, Faculty of Medicine, Dentistry & Health Sciences, The University of Melbourne

Margaret Henderson, Lecturer, Primary Care Clinical Unit, Faculty of Medicine, The University of Queensland

Hannah Hegerty, Medical Registrar, Queensland Health

Clare Delany, Associate Professor, Department of Medical Education, Faculty of Medicine, Dentistry & Health Sciences, The University of Melbourne

Acknowledgements

Bibliography/References

Albanese, M. A., & Dast, L. C. (2014). Problem-based learning. In Understanding Medical Education (2nd ed., pp. 63-80). West Sussex: Wiley Blackwell.

Baerheim, A., & Meland, E. (2003). Medical students proposing questions for their own written final examination: evaluation of an educational project. Medical Education, 37(8), 734-738.

https://doi.org/10.1046/j.1365-2923.2003.01578.x

Bloom, B. S., Engelhart, M. D., Furst, E. J., Hill, W. H., & Krathwohl, D. R. (Eds.). (1956). Taxonomy of educational objectives: Handbook 1, Cognitive domain. New York: Longmans, Green & Co.

Bottomley, S., & Denny, P. (2011). A Participatory Learning Approach to Biochemistry Using Student Authored and Evaluated Multiple-choice Questions. Biochemistry and Molecular Biology Education, 39(5), 352-361.

https://doi.org/10.1002/bmb.20526   

Boud, D., & Falchikov, N. (2007). Rethinking assessment in higher education: Learning for the longer term. New York. London: Routledge.   

Boud, D., Cohen, R., Sampson, J. (2001). Peer learning in higher education: learning from and with each other. London: Routledge.   

Case, S., & Swanson, D. (2000). Constructing written test questions for the basic and clinical sciences 3rd Edition. Philadelphia: National Board of Medical Examiners.   

Chamberlain, S., Freeman, A., Oldham, J., Sanders, D., Hudson, N., & Ricketts, C. (2009). Innovative learning: employing medical students to write formative assessments. Medical Teacher, 28(7), 656-659.

https://doi.org/10.1080/01421590600877822   

Combs, K. L., Gibson, J. M., Hays, J. M., Saly, J., & Wendt, J. T. (2007). Enhancing curriculum and delivery: linking assessment to learning objectives. Assessment and Evaluation in Higher Education, 33(1), 87- 102.

https://doi.org/10.1080/02602930601122985   

Craft, J.A., Christensen, M., Shaw N., & Bakon S. (2017). Nursing students collaborating to develop multiple-choice exam revision questions: A student engagement study. Nurse Education Today, 59, 6-11.

https://doi.org/10.1016/j.nedt.2017.08.009   

Deeley, S. J., & Bovill, C. (2017). Staff student partnership in assessment: enhancing assessment literacy through democratic practices. Assessment and Evaluation in Higher Education, 42(3), 463-477.

https://doi.org/10.1080/02602938.2015.1126551   

Epstein, R. M. (2007). Assessment in medical education. New England Journal of Medicine, 356(4), 387-396.

https://doi.org/10.1056/NEJMra054784

Fellenz, M. R. (2004). Using assessment to support higher level learning: the multiple choice item development assignment. Assessment and Evaluation in Higher Education, 29(6), 703-719.

https://doi.org/10.1080/0260293042000227245   

Galloway, K. W., & Burns, S. (2015). Doing it for themselves: students creating a high quality peer-learning environment. Chemistry Education Research and Practice, 16(1), 82-92.

https://doi.org/10.1039/C4RP00209A   

Gonzales-Cabezas, C., Anderson, O. S., Wright, M. C., & Fontana, M. (2015). Association between dental student-developed exam questions and learning at higher cognitive levels. Journal of Dental Education, 75(11), 1295-1303.   

Gooi, A. C. C., & Sommerfeld, C. S. (2015). Medical school 2.0: How we developed a student-generated question bank using small group learning. Medical Teacher, 37(10), 892-896.

https://doi.org/10.3109/0142159X.2014.970624   

Harris, B. H., Walsh, J. L., Tayyaba, S., Harris, D. A., Wilson, D. J., & Smith, P. E. (2015). A novel student-led approach to multiple-choice question generation and online database creation, with targeted clinician input. Teaching and Learning in Medicine, 27(2), 182-188.

https://doi.org/10.1080/10401334.2015.1011651   

Hoogerheide, V., Staal, J., Schaap, L., & van Gog, T. (2018). Effects of study intention and generating multiple choice questions on expository text retention. Learning and Instruction, in press.

https://doi.org/10.1016/j.learninstruc.2017.12.006   

Hsieh, H.-F., & Shannon, S. E. (2005). Three approaches to qualitative content analysis. Qualitative Health Research, 15(9), 1277-1288.

https://doi.org/10.1177/1049732305276687   

Jensen, J. L., McDaniel, M. A., Woodard, S. M., & Kummer, T. A. (2014). Teaching to the test...or testing to teach: Exams requiring higher order thinking skills encourage conceptual understanding. Educational Psychology Review, 26(2), 307-329.

https://doi.org/10.1007/s10648-013-9248-9   

Jobs, A., Twesten, C., Gobel, A., Bonnemeier, H., Lehnert, H., & Weitz, G. (2013). Question-writing as a learning tool for students - outcomes from curricular exams. BMC Medical Education, 13(1), 89.

https://doi.org/10.1186/1472-6920-13-89   

Khafagy, G., Ahmed, M., & Saad, N. (2016). Stepping up of MCQs' quality through a multi-stage reviewing process. Education in Primary Care, 27(4), 299-303.

https://doi.org/10.1080/14739879.2016.1194363   

Knowles, M. S., Holton, E. F., & Swanson, R. A. (2015). The Adult Learner 8th Edition. London: Routledge.

Larsen, D. P., Butler, A. C., & Roediger, H. L., 3rd. (2008). Test-enhanced learning in medical education. Medical Education, 42(10), 959-966.

https://doi.org/10.1111/j.1365-2923.2008.03124.x   

Palmer, E., & Devitt, P. (2006). Constructing multiple choice questions as a method for learning. Annals Academy of Medicine Singapore, 35(9), 604-608.   

Palmer, E. J., & Devitt, P. G. (2007). Assessment of higher order cognitive skills in undergraduate education: modified essay or multiple choice questions? BMC Medical Education, 7(1), 49.

https://doi.org/10.1186/1472-6920-7-49   

Roediger, H. L., 3rd, & Marsh, E. J. (2005). The positive and negative consequences of multiple-choice testing. Journal of Experimental Psychology: Learning, Memory, and Cognition, 31(5), 1155-1159.

https://doi.org/10.1037/0278-7393.31.5.1155   

Sadler, R. (2010). Beyond feedback: developing student capability in complex appraisal. Assessment and Evaluation in Higher Education, 35(5), 535-550.

https://doi.org/10.1080/02602930903541015   

Schuwirth, L.W.T., Van Der Vleuten, C.P.M., & Donkers, H.H.L.M. (1996). A closer look at cueing effects in multiple-choice questions. Medical Education, 30(1), 44-49.

https://doi.org/10.1111/j.1365-2923.1996.tb00716.x   

Schuwirth, L.W.T., & Van Der Vleuten, C.P.M. (2004). Different written assessment methods: what can be said about their strengths and weaknesses? Medical Education, 38(9), 974-979.

https://doi.org/10.1111/j.1365-2929.2004.01916.x   

Schuwirth, L.W.T., Verheggen, M.M., Van Der Vleuten, C.P.M., Boshuizen, H.P.A., & Dinant, G.J. (2001). Do short cases elicit different thinking processes than factual knowledge questions do? Medical Education, 35(4), 348-356.

https://doi.org/10.1046/j.1365-2923.2001.00771.x   

Schooler, J.W., Foster, R.A., & Loftus, E.F. (1988). Some deleterious consequences of the act of recollection. Memory and Cognition, 16(3), 243-251.

https://doi.org/10.3758/BF03197757   

Secomb, J. (2008). A systematic review of peer teaching and learning in clinical education. Journal of Clinical Nursing, 17(6), 703-716.

https://doi.org/10.1111/j.1365-2702.2007.01954.x   

Smith, C.D., Worsfold, K., Davies, L., Fisher, R., & McPhail, R. (2013). Assessment literacy and student learning: the case for explicitly developing students' 'assessment literacy'. Assessment and Evaluation in Higher Education, 38(1), 44-60.

Surry, L.T., Torre, D., & Durning, S.J. (2017). Exploring examinee behaviours as validity evidence for multiple-choice question examinations. Medical Education, 51(10), 1075-1085.

https://doi.org/10.1111/medu.13367   

Swanson, D. B., Holtzman, K. Z., & Allbee, K. (2008). Measurement characteristics of content-parallel single-best-answer and extended-matching questions in relation to number and source of options. Academic Medicine, 83(10 Suppl), S21-24.

https://doi.org/10.1097/ACM.0b013e318183e5bb   

Tan, L. T., McAleer, J. J., & Final, F. E. B. (2008). The introduction of single best answer questions as a test of knowledge in the final examination for the Fellowship of the Royal College of Radiologists in Clinical Oncology. Clinical Oncology, 20(8), 571-576.

https://doi.org/10.1016/j.clon.2008.05.010   

Taylor, D. C. M., & Hamdy, H. (2013). Adult learning theories: Implications for learning and teaching in medical education: AMEE Guide No. 83. Medical Teacher, 35(11), E1561-E1572.

https://doi.org/10.3109/0142159X.2013.828153   

Wormald, B. W., Schoeman, S., Somasunderam, A., & Penn, M. (2009). Assessment drives learning: an unavoidable truth? Anatomical Sciences Education, 2(5), 199-204.

https://doi.org/10.1002/ase.102   

Yu Tsao Pan, P., Wang Chia Wei, W., Loh Poey, L., Shen, L., & Yu Soo Hoon, V. (2016). MCQ-construction improves Quality of Essay Assessment among undergraduate dental students. Singapore Dental Journal, 37, 37-40.

https://doi.org/10.1016/j.sdj.2016.04.001

Appendices

Table 1

 

Table 2

 

Table 3

Declarations

There are no conflicts of interest.
This has been published under Creative Commons "CC BY-SA 4.0" (https://creativecommons.org/licenses/by-sa/4.0/)

Reviews


P Ravi Shankar - (26/06/2018)
The authors describe an interesting project of providing students with an opportunity to write MCQs and using the process to stimulate deep learning. They use the student developed MCQs in both formative and summative assessments stimulating student involvement in the process. I agree with the other reviewer that the authors could provide more description about the methodology of the focus group discussions. I found the tips for incorporating student generated MCQs in the curriculum interesting. The paper is well written and well structured. The appendices add to the information presented in the paper. The list of references is comprehensive. The paper will be of interest to a broad range of medical educators.
BALAJI ARUMUGAM - (10/06/2018)
It is a great attempt by the authors of this paper to involve students in preparing the assessment tools, with a survey and focus group discussion (FGD) at the end of the assessment. The authors have not explained the method of MCQ development and item analysis by the study participants (students), and the methodology of the focus group discussions is unclear (duration of the FGD for each theme, leadership, timing between the written exam and the FGD) in deriving the advantages of the survey results; apart from the sentence in the methodology stating that focus groups, facilitated by an independent medical educator, explored the associated learning and assessment experiences of students, nothing is said about the FGDs. As a medical educator and author of MCQ books, I know the difficulty of writing an MCQ and using it for student assessment. The Likert-scale survey results clearly indicate that learning was enhanced and confidence improved. Nevertheless, the article has good content, although the methodology is only fairly explained in detail.
Nandalal Gunaratne - (10/06/2018)
Thank you for a truly comprehensive attempt at getting students involved in assessment. The student feedback reported here is similar to what I received in my own country.
Some good students do badly at MCQs, and this is why they treat MCQs with suspicion; the discrimination index shows this. They are easy for question-writers to make and difficult to answer, say students. Did your students have such a view, and did it change afterwards? Was there any opposition from other examiners?
Regards.