Research article
Open Access

Retention in Enhanced Team Based Learning Course: retain or refrain?

Janil Puthucheary[1], Sok Hong Goh[2], Tam Cam Ha[3], Doyle G. Graham[4], Sandy Cook[5]

Institution: 1. Duke-NUS Medical School, 2. Duke-NUS Medical School, 3. Duke-NUS Medical School, 4. Duke-NUS Medical School, 5. Duke-NUS Medical School
Corresponding Author: Mrs Sok Hong Goh
Categories: Assessment, Medical Education (General)


Background: Students’ ability to retain content in medical school has always been a concern. At Duke-NUS Medical School, we modified our Team-Based Learning (TBL) classes, known as TeamLEAD, a learning strategy for first-year basic science content, to include an open/closed-book option in the readiness assurance phase to engage teams in deeper discussion. We hypothesize that the open-book option allows students to engage in deeper learning in their teams, which improves each individual student’s retention at the end of the first-year basic science curriculum.

Methods: A total of 115 MCQs used throughout the first-year basic science courses from 2011 to 2013 were repeated during a two-week end-of-year review. We analysed the extent of students’ retention by examining the influence of each team’s choice of open- or closed-book on initial and repeated individual performance scores.

Results: Students’ individual scores increased from 56.2% correct on first encountering the questions to 68.7% on the second encounter (p<0.01). For teams that chose the open-book option for the group readiness assurance (GRA) test, individual students’ second-encounter scores increased by 11.8% (p<0.01), compared with 10.3% (p<0.01) for teams that chose the closed-book option. Retention was higher for questions from the second half of the year than for questions from the first half (p<0.01).

Conclusion: Implementing the open-book component enables students to drive challenging discussions in their teams. This helps individual students consolidate, recall and retain more information over time, resulting in improved individual performance at the end-of-year review.

Keywords: Team Based Learning (TBL), Medical Education, Basic Science Curriculum, Deeper Learning


Duke-National University of Singapore (Duke-NUS) Medical School is a four-year graduate-entry medical school that matriculated its first class in 2007. Duke-NUS’s first-year curriculum is modelled after the Duke School of Medicine (DSOM), Durham, North Carolina, USA, which delivers its basic science curriculum in one year. We use recorded DSOM lectures as core content and chose to deliver and reinforce that content through Team-Based Learning (Kamei, Cook, Puthucheary, & Starmer, 2012; Michaelsen, 2008), which we call TeamLEAD (Learn, Engage, Apply, and Develop).

We chose this model for a number of reasons. Medical students face the challenge of remembering curricular material over time (Custers, 2008), a phenomenon Ebbinghaus described in 1885 as the “forgetting curve” (Ebbinghaus, 1964). It has also been suggested that TBL maps to a constructivist educational theory model (Hrynchak, & Batty, 2012). In particular, the four key elements of a social constructivist learning model, namely the teacher as a guide who facilitates learning, the centrality of problem solving in identifying learning gaps, the use of group and active learning, and the opportunity for reflection, are all present and important in our model of delivering TBL.

The TBL structure consists of an Individual Readiness Assessment (IRA), containing 25 questions on average, which is taken individually. The questions are directly mapped to the material assigned to the class in preparation for that session. The IRA is followed by a Group Readiness Assessment (GRA), which contains the same questions but is completed by the teams. Both the IRA and GRA are closed-book assessments. As they submit their GRA answers, the teams receive immediate feedback on their choices, and by the end of the assessment the whole class is aware of the correct answers and of their individual and team performance. The GRA is then followed by a faculty-led review of the questions and answers. This entire process takes a total of two hours.

At Duke-NUS, TeamLEAD has undergone some modifications designed to enhance students’ learning. For TeamLEAD, the pre-class preparation and individual assessments are unchanged. Using an online system for the GRA, teams are given the option of choosing zero, one or two of the questions to answer as open-book. After the open-book items are identified, students first submit the remaining questions in the typical manner as closed-book.

Upon submission, the teams work simultaneously on their pre-selected open-book questions. The whole class then receives feedback on their individual and team choices and learns the correct answers. Hence, in our analysis of the data, we classified questions into four categories based on GRA behaviour (the choice to answer the question as open-book or closed-book) and group performance (correct or incorrect) (Figure 1). After completing the GRA, teams identify their remaining queries. During the facilitated discussion, teams are asked to address the queries and issues brought up by other teams, followed by closure from a faculty member on each question highlighted. The choice of which questions to discuss, and in particular which facet or principle to focus on, is entirely in the hands of the class.


Fig. 1. Four categories based on GRA group behaviour (choice to do the question as open-book or closed-book) and group performance (correct or incorrect).


We believed that the open-book component forced students to confront their collective uncertainty to a much greater degree, requiring them to discuss how they know what they think they know, and to have some insight into where and how to look up material if they chose to do so. They must discuss the confidence with which they chose the perceived right answer, and must be able to convey this degree of confidence to their teammates.

We conducted a small pilot study to examine the impact on students’ retention of core concepts. After introducing the open-book GRA option, we took 13 questions from the immunology section that were not repeated at any point in the course and gave those questions to the students again at the end of the course (approximately 15 weeks later). These data suggested that students performed better: individual performance increased significantly from pre- to post-test (59.2% to 65.1%; p<0.01). From this pilot study, we decided to explore whether the learning and retention pattern was maintained with a larger sample of core curriculum content from the first-year basic science courses.

Several studies have reported that TBL enhances learning and long-term retention and improves critical thinking, problem-solving skills and attitudes (Custers, 2008; McInerney, & Fink, 2003; Nieder, Parmelee, Stolfi, & Hudes, 2004; Michaelsen, & Sweet, 2008). However, some studies did not show long-term retention (Emke, Butler, & Larsen, 2015; Farland, Franks, Barlow, Shaun Rowe, & Chisholm-Burns, 2015). Hence, by adding the open-book option, we chose to explore the added impact of this modification on students’ retention of core knowledge. We hypothesized that the open-book option would allow students to engage in deeper learning in their teams, leading to improved retention for each individual student at the end of the first-year basic science curriculum.


Repeated questions

One hundred and fifteen (out of 1,768) questions were chosen as verbatim repeat questions (RQs). The RQs comprised 4% of the questions from Course A, 5% from Course B, 5% from Course C, and 9% from Course D. The criteria for selecting RQs were that questions be well written, unambiguous, and directly relevant to critical concepts in the overall core basic science curriculum. Furthermore, questions for which there was variation in team decisions between open-book and closed-book responses were preferentially selected, specifically to allow exploration of the impact of those decisions on retention when questions were repeated at the end of the academic year.

Students were informed of the possibility of RQs at the start of the year and before the final two weeks of the academic year. However, students were told neither which questions would be repeated nor how many. The two faculty who chose the RQs were not involved in preparing any of the pre-class or teaching materials, and were therefore not biased in selecting the RQs. Their active involvement in the first-year curriculum and years of experience gave them a good understanding of the whole curriculum, so they were able to select RQs that related to the key concepts of each course. The initial set of RQs was chosen in 2011. These exact same RQs were repeated for three subsequent academic classes, in AY2011/12, AY2012/13 and AY2013/14.



All first-year students (n=164) from AY2011/12 (n=56), AY2012/13 (n=54) and AY2013/14 (n=54) participated in our TeamLEAD classes, taking both the initial and repeated questions.


Data Analysis

Students’ individual scores on the RQs at their first encounter with the questions were compared with their individual scores at the second encounter using a paired-sample t-test. This study further explored whether a team’s decision to answer a given question as open-book or closed-book at the first encounter affected students’ individual scores at the second encounter. Statistical analyses were performed using IBM SPSS Statistics for Windows, Version 21.0 (IBM Corp., Armonk, NY). This research was approved under exempted review by the National University of Singapore Institutional Review Board, reference code 12-478.
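As a concrete illustration of the paired-sample t-test used in this analysis, the short sketch below computes the test statistic for two matched sets of scores. The scores and the `paired_t` helper are hypothetical illustrations, not the study data or the SPSS procedure; any standard statistics package would give the same result.

```python
import math
import statistics

def paired_t(first, second):
    """Paired-sample t statistic for two matched lists of scores.

    Returns (mean difference, t statistic, degrees of freedom).
    """
    diffs = [b - a for a, b in zip(first, second)]
    n = len(diffs)
    mean_d = statistics.mean(diffs)
    sd_d = statistics.stdev(diffs)  # sample SD of the paired differences
    t = mean_d / (sd_d / math.sqrt(n))
    return mean_d, t, n - 1

# Hypothetical per-student percentage scores (first vs second encounter)
first_encounter = [56, 60, 52, 58, 55, 57]
second_encounter = [68, 70, 65, 69, 67, 73]
mean_d, t, df = paired_t(first_encounter, second_encounter)
```

With df = n - 1 = 5, the resulting t statistic is compared against the two-tailed critical value (2.571 at p = 0.05) to judge whether the mean gain is significant.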


All students (n=164) took both sets of RQs (first and second encounter). On average, across all 115 questions, students’ mean score increased from 56.2% to 68.7% (a 12.5% increase, p<0.01).

We found that when teams chose to answer questions as closed-book, students’ individual scores at the second encounter increased by 10.3% (from 60% to 70.3%; p<0.01). When teams chose to answer questions as open-book, students’ individual scores increased by 11.8% (from 40.5% to 52.3%; p<0.01). We further subdivided these two categories based on team performance at the first encounter (correct vs incorrect answer); the results are summarised in Figure 2. The increase in scores was larger when the team had answered incorrectly at the first encounter.


Fig. 2. Comparison of students’ individual performance according to the four categories based on group behaviour and performance obtained during first and second encounters.


We were also interested in how students’ individual performance changed between the first and second encounters. Clustering the questions by the four courses from which they were taken, we found that students’ individual scores increased slightly (not significantly) for courses A and B (first semester) and increased significantly (p<0.01) for courses C and D (second semester) (Figure 3).



Fig. 3. Comparison of students’ individual performance based on the timing of the first question in one of four courses during the first year.


From these results, we postulate that our TeamLEAD process is potentially useful for teaching basic sciences to medical students. By encouraging students to explain their understanding to each other, interrogate their own learning, and repeatedly recall key knowledge in order to apply it to different problems, it is likely to result in the retention of a higher proportion of core concepts and principles.

Enhanced retention was identified in our first pilot study and appears to be consistent across all subsets of the data. The optional open-book component in TBL causes no decline in performance; rather, it improves students’ individual scores (Tan et al., 2011) compared with conventionally described models of TBL. One study of repeated testing by Marsh et al. suggests that ease of recall can distract students into believing they understand a question, leading them to choose the incorrect answer upon retesting (Marsh, Roediger, Bjork, & Bjork, 2007). This was not the case in our study, where we observed improvement in students’ individual scores on RQs taken during revision IRA sessions at the end of the year.

It is possible that in TBL, the requirement for teams to reach consensus on a question reduces the chance that incorrectly answered questions will be forgotten. It is likely that at least one individual in the team initially chose the right answer but was unable to convince the other team members. After the team’s choice is revealed to be incorrect, there is further opportunity for reinforcement and peer teaching about the right answer. Furthermore, the grading system for the GRA, on both open- and closed-book questions, motivates teams to answer questions correctly at the first attempt. Team members must therefore work together, voice their views, and reason with supporting evidence from the course materials to achieve the best outcome for their team. This process drives an improvement in scores over time and suggests that students have the opportunity to identify gaps in their learning and to understand and retain correct information. Previous studies have also suggested that the inclusion of open-book learning in tests aids retention and understanding (Heijne-Penninga, Kuks, Hofman, Muijtjens, & Cohen-Schotanus, 2012). In addition, it has been shown that an initial MCQ test not only improves performance on items repeated in a final test, but also enhances retrieval of information associated with incorrect alternatives on the initial practice test (Little, Bjork, Bjork, & Angello, 2012).

Even though this model is a possible framework for understanding how TBL might be effective, we postulate that some consideration should be given to a framework that addresses the qualitative shifts in cognition and analysis that we see in our students as they progress through an iterative TBL course. In particular, the deeper understanding they develop of their own cognition and cognitive biases is an outcome we value, and we would hope to see it reflected in any model of learning chosen as a framework for understanding this andragogy.

Nelson's variant of Perry's framework for the development of critical thinking, particularly the idea of facilitating transitions from lower to higher levels of cognitive development, is reinforced by many aspects of TBL (Thoma, 1993). The need to move from a dualistic framework of absolute right or wrong to one of multiplicity requires that students recognize and grapple with uncertainty and ambiguity, important skills they will need as future physicians. This transition is reinforced by their need to decide which questions to answer as open-book, and by the intra-group discussion about reaching consensus on the questions they choose to attempt as closed-book. Students have to identify what they know confidently, where their knowledge is lacking, and the extent to which they can research a question adequately.

The next transition in Perry's framework, from multiplicity to contextual relativism, requires students to recognize that opinions alone are insufficient. To decide among a multiplicity of choices, students need to develop robust cognitive tools that withstand scrutiny from their peers. This transition is enhanced and reinforced by the iterative nature of the TeamLEAD process in our school. Students remain in the same group throughout the year and engage in TeamLEAD sessions twice a week on average. We noticed that the qualitative nature of the intra-group discussions changes as the year progresses: students move towards more robust explanations of the underlying principles and mechanisms rather than merely of the question at hand.

The final transition, moving to contextually appropriate decisions, requires students to apply the skills and knowledge they have acquired to unfamiliar situations and problems. This is essentially a description of the application exercise phase of TeamLEAD, in which our students receive no further preparatory material but are challenged with new clinical case scenarios that they are expected to handle using the skills and knowledge they have already acquired.

Our TeamLEAD, with the open-book option, is in line with the constructivist theory of learning, and we further postulate that the acquisition of critical thinking among our students can be understood by using the Nelson variant of the Perry framework to examine the TBL process. This may explain the improvement in students’ individual scores in the revision IRA sessions towards the end of the year.


The data suggest that TeamLEAD, as implemented in our first-year basic medical science courses, works well to drive retention of core material and improve understanding over time. However, the design of this study can neither determine the exact mechanism through which this happens nor identify the components of the TBL process that contribute to it. It is possible that the entire process is necessary (Koles, Stolfi, Borges, Nelson, & Parmelee, 2010; Huitt, Killins, & Brooks, 2014). Nor does the study demonstrate that TeamLEAD is superior to other forms of active learning or small-group teaching.

Questions chosen for verbatim repetition were drawn from various courses and modules across the first year, so the intervals between first and second encounters differed across courses. Most questions had an interval of less than 18 weeks; hence, this study may not adequately examine long-term retention and understanding. Ideally, a question would be repeated after a longer interval, such as one or two years. However, given the nature of medical education, with its varied learning experiences and the way clinical exposure can reinforce basic science learning, such a longer-term study might not be able to reach a conclusion about the impact of TBL itself.

As mentioned earlier, RQs were not randomly selected. Hence, the study cannot address specific issues of performance among students of differing ability, nor of questions of differing difficulty. The study was not designed to test hypotheses about which students benefited most from TeamLEAD. Generally, the questions identified were more challenging, as demonstrated by the students’ average correct IRA score of 56.2% at the first encounter, compared with an overall average correct IRA score of 75% across the year.

TeamLEAD was chosen as a learning strategy at our institution in order to move away from the typical teaching strategy of emphasizing retention of facts and instead to focus on applying the information, particularly in the application phase of TBL. In this domain, our study adds no information. A valid criticism of repeating questions after an interval is that we may be seeing an improvement resulting from improved memory, retention and recall of facts rather than from true understanding or problem solving. Indeed, there is good evidence that test taking itself improves retention (Toppino, & Cohen, 2009). Repetition and practice at test taking are intrinsic to our course structure, and even if these are the drivers of the improvement in scores, our students are well served.

A randomized trial would be the ideal study design to answer our research question. However, it was not possible to schedule such a design within our students’ curriculum. Furthermore, there is sufficient data to suggest that the addition of the open-book exercise is a positive enhancement to the TBL methodology.

Despite the limitations, our study is able to show that our implementation of the open-book modification to conventional TBL does not detract from the process and possibly leads to a deeper understanding of the content.


In conclusion, TeamLEAD with the open-book option enhances students’ retention over and above the pre-class preparatory work done by individual students. This holds regardless of a team’s behaviour in selecting to answer a question as open-book or closed-book, and of the team’s performance in getting the question correct. All data analysed showed improvement in students’ individual scores and performance. Our data add to our experience of delivering a one-year basic medical science course to first-year medical students with an andragogy almost exclusively centred on TBL. The open/closed-book component helps to enhance students’ learning and builds on the skills, such as communication, collaboration and leadership, that formed the rationale for implementing the TBL andragogy in the school in the first place. We are reassured that our students are well served by this approach, and will continue to refine it.

Take Home Messages

Notes On Contributors

Dr. Puthucheary is an Associate Professor, Duke-NUS Medical School, Singapore. He contributed to conceptualization, study design, data analysis and drafting of the manuscript.

Ms. Goh is an Associate Director, Medical Education, Research and Evaluation Department, Duke-NUS Medical School, Singapore. She contributed to conceptualization, study design, data analysis and drafting of the manuscript.

Dr. Ha is an Adjunct Assistant Professor, Medical Education, Research and Evaluation Department, Duke-NUS Medical School, Singapore. She contributed to conceptualization, study design, data analysis and substantial revision of the paper.

Prof. Graham is an Emeritus Professor, Duke-NUS Medical School, Singapore. He contributed to conceptualization, study design, data analysis and drafting of the manuscript.

Dr. Cook is the Interim Vice Dean of Education and a Professor, Medical Education, Research and Evaluation Department, Duke-NUS Medical School, Singapore, and Co-Director of the Academic Medicine Education Institute, Duke-NUS Medical School, Singapore. She contributed to conceptualization, study design, data analysis and drafting of the manuscript.

*All authors contributed significantly to the work, had full access to all of the data in the study, and take responsibility for the integrity of the data and the accuracy of the data analysis.




Custers, E. (2008). Long-term retention of basic science knowledge: A review study. Advances In Health Sciences Education, 15(1), 109-128.

Ebbinghaus, H. (1964). Memory: A contribution to experimental psychology (Henry A. Ruger & Clara E. Bussenius, Trans.). New York, NY: Teachers College. (Original work published as Das Gedächtnis, 1885).

Emke, A., Butler, A., & Larsen, D. (2015). Effects of team-based learning on short-term and long-term retention of factual knowledge. Medical Teacher, 1-6.

Farland, M., Franks, A., Barlow, P., Shaun Rowe, A., & Chisholm-Burns, M. (2015). Assessment of student learning patterns, performance, and long-term knowledge retention following use of didactic lecture compared to team-based learning. Currents In Pharmacy Teaching And Learning, 7(3), 317-323.

Heijne-Penninga, M., Kuks, J., Hofman, W., Muijtjens, A., & Cohen-Schotanus, J. (2012). Influence of PBL with open-book tests on knowledge retention measured with progress tests. Advances In Health Sciences Education, 18(3), 485-495.

Hrynchak, P., & Batty, H. (2012). The educational theory basis of team-based learning. Medical Teacher, 34(10), 796-801.

Huitt, T., Killins, A., & Brooks, W. (2014). Team-based learning in the gross anatomy laboratory improves academic performance and students' attitudes toward teamwork. Anatomical Sciences Education, 8(2), 95-103.

Kamei, R., Cook, S., Puthucheary, J., & Starmer, C. (2012). 21st century learning in medicine: Traditional teaching versus team-based learning. Medical Science Educator, 22(2), 57-64.

Koles, P., Stolfi, A., Borges, N., Nelson, S., & Parmelee, D. (2010). The Impact of team-based learning on medical studentsʼ academic performance. Academic Medicine, 85(11), 1739-1745.

Little, J., Bjork, E., Bjork, R., & Angello, G. (2012). Multiple-choice tests exonerated, at least of some charges. Psychological Science, 23(11), 1337-1344.

Marsh, E., Roediger, H., Bjork, R., & Bjork, E. (2007). The memorial consequences of multiple-choice testing. Psychonomic Bulletin & Review, 14(2), 194-199.

McInerney, M., & Fink, L. (2003). Team-based learning enhances long-term retention and critical thinking in an undergraduate microbial physiology course. Microbiology Education, 4(1). doi: 10.1128/jmbe.v4.68

Michaelsen, L. (2008). Team-based learning for health professions education: A guide to using small groups for improving learning (1st ed.). Stylus Publishing, LLC.

Michaelsen, L., & Sweet, M. (2008). The essential elements of team-based learning. New Directions For Teaching And Learning, 2008(116), 7-27.

Nieder, G., Parmelee, D., Stolfi, A., & Hudes, P. (2004). Team-based learning in a medical gross anatomy and embryology course. Clinical Anatomy, 18(1), 56-63.

Tan, N., Kandiah, N., Chan, Y., Umapathi, T., Lee, S., & Tan, K. (2011). A controlled study of team-based learning for undergraduate clinical neurology education. BMC Medical Education, 11(1).

Thoma, G. (1993). The Perry Framework and Tactics for Teaching Critical Thinking in Economics. The Journal of Economic Education, 24(2), 128-136.

Toppino, T., & Cohen, M. (2009). The testing effect and the retention interval. Experimental Psychology, 56(4), 252-257.


There are no conflicts of interest.



David Taylor - (15/05/2017)
I find myself wanting to agree with the authors, but being unconvinced that this study provides the necessary evidence.
I really liked the inclusion of Perry’s framework in one of its more recent iterations, but this, to my mind, emphasises what I see as the flaws in this as an evidence based approach.
As my colleagues have suggested, this sort of study is difficult to interpret, in part because of the relatively short time course of the intervention, and in part because of any number of unidentified confounding factors, including those highlighted by Perry, such as the shift from duality, through multiplicity, to contextual relativism. As to the outcome measure, MCQs can give a measure of recall, but only those that are really carefully written can get beyond Kirkpatrick level 2 to the places we need to reach.
Trevor Gibbs - (13/04/2017)
Like my co-reviewer, I also found this a difficult paper to understand and to relate to its final conclusions. Having seen TBL in action in various parts of the world, I don't feel that this is a unique innovation on a valid educational / learning strategy. One of the basic tenets of both PBL and TBL is the discussion phase - a phase frequently missed by those who value PBL less! - and so one would expect any activity that enhances it to lead to better learning. One of course needs to demonstrate this through research, which I think the authors aimed to do. Placing great emphasis upon a change in MCQ scores pre- and post-activity is not, though, I feel, an appropriate way of going about this research when so many other factors can affect student learning and final MCQ scores.
Richard Hays - (13/04/2017)
I found this paper difficult to review. It describes a rather complicated quasi-experimental design with so many interventions that it is difficult to ascribe any particular effect to any particular intervention. I think that all that can be stated clearly is that the use of an open book approach had little effect on learning within the complex environment of small group, team-based, peer-assisted learning, at least as it was measured in a narrow sample of assessment items over a relatively short period of time. Maybe the complexity just confused me? The study is a great example of how 'messy' education can be. Not detracting from learning is good, but caution is warranted in claiming benefits when so many strategies are used together. Having said that, I think that open-book learning and assessment are valuable strategies, but I do not think that this study provides much evidence.