Research article
Open Access

Effect of Reflective Exercises on Academic Performance and Course Evaluations in a Biomedical Sciences System Course

Tobias K. Boehm[1]

Institution: 1. Western University of Health Sciences
Corresponding Author: Dr Tobias K. Boehm ([email protected])
Categories: Educational Strategies, Students/Trainees, Teaching and Learning, Basic and Foundation Sciences, Research in Health Professions Education
Published Date: 15/01/2021

Abstract

Introduction: Reflection is an integral component of self-directed learning and may enhance empathy, self-awareness, professionalism, and attitudes about learning. We sought to determine whether the addition of reflective exercises also enhances academic performance in first-year dental students taking a systems-based, case-based biomedical science course.

Methods: This is a mixed-methods, retrospective study evaluating the academic performance and lived experience of students enrolled in the “Blood and Lymphatic System” course of 2015 and 2016. The courses were identical except for the addition of reflective exercises in 2016. We evaluated class characteristics, academic performance as determined by ungraded pre/posttest surveys and graded written assessments, and student perception of the course through Likert-type scale survey ratings and both manual and machine-coded textual analysis of written student comments.

Results: Addition of reflective exercises to this course did not produce any significant changes in academic performance in two statistically similar student cohorts. However, the reflection exercises themselves expressed themes of empathy, professionalism, and interdisciplinary collaboration, and the reflections provided insights into student learning.

Discussion and Conclusion: While reflective exercises seem to have little effect on academic performance, they can be used to gauge themes and sentiments of students enrolled in a biomedical system course.

Keywords: Professional Education; Learning; Academic Performance; Reflection

Introduction

Health sciences education has traditionally focused on knowledge acquisition and clinical apprenticeship. However, since the development of various learning theories, health science educators have sought to improve student learning and nurture the development of lifelong learners. Lifelong learners can determine their own learning needs and set learning goals through reflection on their learning progress (Pee et al., 2000).

Reflection is a key element underpinning learning theories used in healthcare education, such as experiential learning (Kolb, 1984; Schon, 1987) or deep learning (Biggs and Moore, 1993). For example, in the experiential learning model, learning occurs through a cycle of 1) identifying and describing a relevant experience, 2) analyzing and evaluating one’s personal response by reflecting on that experience, 3) considering and adapting one’s working hypotheses and assumptions to the experience and 4) responding to a new experience with the gained insights. As seen here, reflection is an essential piece in the process of learning.

Reflection is a metacognitive activity (Hewson, Jensen and Hewson, 1989) that is deliberative (Shulman, 1985) and involves evaluating existing knowledge, attitudes and skills for a given field of practice (Cruickshank, 1987). It involves revisiting a recent experience for the purpose of articulating what went well and what did not, and for making normally tacit knowledge explicit so that existing knowledge can be scrutinized (Collins, Brown and Newman, 1987). Reflection enables self-directed learning, a process in which individuals diagnose their own learning needs, formulate learning goals, choose resources and learning strategies, and evaluate learning outcomes (Knowles, 1975). Self-directed learning is a hallmark of lifelong learners, and this skill can be encouraged at the course design level with problem-based learning and self/peer evaluation (Towle and Cottrell, 1996).

On a smaller scale, reflection can also be encouraged with various assignments and classroom activities. One method is the use of “progress files”, “clinical blogs” or portfolios that contain student/doctor achievements, reflective diaries, self-assessments and encounter evaluations (Pee et al., 2000; Shaughnessy and Duggan, 2013; Wakeling et al., 2019). A commonly reported reflection exercise is reflective writing, where students write about their experiences after role playing or patient encounters (DasGupta and Charon, 2004; Beylefeld, Nena and Prinsloo, 2005; Shapiro, Kasman and Shafer, 2006; Levine, Kern and Wright, 2008; Westmoreland et al., 2009; Bradner et al., 2015; Borgstrom et al., 2016; Smith, 2018). A similar, but verbal, reflection exercise is debriefing with a preceptor or faculty member after a patient encounter (Lewin et al., 2014; Okubo et al., 2014). Last, a simple yet powerful reflective exercise involves sending single or periodic email surveys that ask students what they learned and how they felt about a particular class session or activity, or what they would do differently next time in a challenging patient encounter (Henderson and Johnson, 2002; Mori, Batty and Brooks, 2008).

These reflective exercises may have multiple benefits. Most of the reported benefits were gains in soft skills such as empathy and self-awareness (DasGupta and Charon, 2004; Levine, Kern and Wright, 2008), improved attitudes about self-directed learning (Mori, Batty and Brooks, 2008), increased awareness of professionalism and empathy (Dhaliwal, Singh and Singh, 2018), increased learning satisfaction (Westmoreland et al., 2009), and a deeper understanding of patients along with an improved talent for humanizing the health care experience (Bradner et al., 2015). Generally, these benefits were found after analyzing student/doctor responses qualitatively for recurrent themes. However, one report suggested that debriefing also leads to improved clinical reasoning, as evidenced through an objective structured clinical exam, and the authors suggested that reflection improved knowledge acquisition (Okubo et al., 2014).

Given the relative lack of studies that evaluate the effect of reflective exercises on knowledge acquisition, we aimed to assess whether the addition of repeated reflection exercises in the context of a case-based system course produced any quantifiable increase in academic performance. In addition, we explored qualitatively what effect reflective exercises may have on students' perception of the course and the course material itself.

Methods

This is a mixed-methods, retrospective analysis of student learning, academic performance and student feedback in a course given in two consecutive years starting in 2015. The study was approved as “exempt” by the Western University of Health Sciences Institutional Review Board (protocol number 16/IRB/045).

Subjects: This study evaluated two cohorts of first-year dental students taking a biomedical science system course, “DMD 5715 Blood and Lymphatics System Course”, in the spring semesters of 2015 and 2016 at Western University of Health Sciences.

Overall course design: This course is a systems-based course using a variety of teaching methods such as lectures, case-based learning, independent study and seminars (Figure 1). The course covers disciplines such as microbiology, immunology, histology, anatomy, biochemistry, physiology, pathology, medicine and dentistry as they pertain to blood and lymphatic conditions. Student knowledge is assessed with an ungraded pretest survey at the beginning of class on the first day of the course, a graded mid-term written exam, a final written exam, and an ungraded posttest survey at the very end of the course. Midterm and final exams have been newly created each year since the initial iteration of the course in 2011 to discourage students from building a test question bank. For the 2015 and 2016 courses, the pretest and posttest surveys featured identical questions and were identical between the two years. While the pre/posttest surveys were ungraded, completion of both surveys was incentivized with a small credit (5%) towards the course grade, resulting in near complete participation. Didactic presentations and cases were kept the same for both years, while the scientific literature assignments for each class were different to prevent plagiarism.

Figure 1: General course layout of the systems course studied

B&L Course Design showing timing of pretest, posttest, reflective exercises and assessments, lectures and group activities.

Intervention: The course design was kept the same in 2015 and 2016 except for the addition of reflective exercises in 2016. These reflective exercises were given at various time points of the 2016 course and were of the following types:

  • An email survey that was due at the end of the course, asking students to submit a short reflection paper on “What have you learned that you did not know before, and how would you apply it to your practice?”
  • A peer grading exercise that involved developing a management plan for a medically compromised patient and having other students providing feedback on it
  • Debriefing at the end of two case studies in the middle of the course that specifically asked reflective questions such as “Why do you think a medical consult was requested?", "Why was this treatment done?", "How would you go about telling the patient about this condition?" or "Would you have treated this patient differently?"

To ensure participation, each reflective exercise was incentivized with a small credit (5%) towards the course grade. To make up for these additions, the contribution of quizzes and the literature seminar towards the course grade was reduced from 25% to 20%. Contributions towards the course grade remained unchanged between 2015 and 2016 for attendance (5%), pre/posttest completion (5%), the midterm (15%) and the final exam (25%).

Pretest/Posttest survey: The survey consisted of multiple-choice items designed to test detailed knowledge recall across the various disciplines represented in this course (see supplemental file “Supplementary File 1.pdf”). Pre/posttests were scored individually against a key and recorded as the pre/posttest survey score for each student in both years.

Data collection: We obtained from the Western University of Health Sciences College of Dental Medicine’s Office of Academic Affairs the number of students in each class, along with the percentage of males, average age, ethnicity/race, incoming Grade Point Average (GPA) and incoming Dental Aptitude Test (DAT) scores. We also obtained pre and post-test survey results with the number of correct answers, along with exam grades and course grades.

Qualitative analysis: Where applicable, we followed the COREQ guidelines for qualitative research (Tong, Sainsbury and Craig, 2007) in the qualitative analysis of student course evaluations. For course evaluations, a female Western University of Health Sciences staff member (college-educated, with five years of experience collecting these surveys, but without a dental background and not otherwise engaged in teaching) sent an email instructing students to fill out an online form powered by Qualtrics XM (Provo, Utah and Seattle, Washington). Students received this survey in the same form for other courses and were accustomed to taking this type of survey. The students were not aware of being research participants but may have been aware of the addition of reflective exercises through conversations with more senior students. All students of both classes submitted these course evaluation surveys, as completion of these surveys was required for course completion.

The relevant questions for this study were “Q5-Please provide any additional constructive feedback regarding the instructor’s teaching” and “Q6-Comments”; the preceding questions solicited Likert-style ratings of course resources. The course evaluation survey content was manually coded by author TB by searching for specific keywords connected to expected benefits of reflection (e.g. “learning”, “reflection”, “understanding”), including spelling permutations, with each keyword tied to a specific category (e.g. “learning”) determined prior to analysis. Search results were verified against the context of the entire comment and tabulated as one instance per student for each category using Microsoft Excel® (Redmond, Washington). In addition, the total number of comments, comments relating generally to course content, and comments specific to the professor teaching the course were counted, and the general tone of these comments was recorded.
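For illustration, the keyword tabulation described above can be reproduced in a few lines of R (the package used elsewhere in this study for statistical analysis). The sketch below is not the actual analysis workflow: the comments vector and the keyword stems are illustrative placeholders, and in the study each match was additionally verified against the context of the whole comment before tabulation.

```r
# Minimal sketch of the manual keyword coding step (illustrative data only).
# 'comments' stands in for the free-text course evaluation responses.
comments <- c("I enjoyed the cases and learned a lot about bleeding disorders.",
              "Self-reflection helped solidify information.")

# Keyword stems tied to pre-defined categories; matching on the stem,
# case-insensitively, catches common spelling permutations.
categories <- c(Reflection      = "reflect",
                Enjoyment       = "enjoy",
                Learning        = "learn",
                Understanding   = "understand",
                Empathy         = "empathy",
                Collaboration   = "collabora",
                SelfAwareness   = "self",
                Professionalism = "profession")

# One instance per student comment and category: grepl() flags a match,
# and summing the logical vector counts matching comments.
counts <- sapply(categories, function(stem) sum(grepl(stem, comments, ignore.case = TRUE)))
data.frame(category = names(counts),
           n        = counts,
           percent  = round(100 * counts / length(comments)))
```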

To validate the manual coding, machine-based textual analysis was used to capture prominent themes and sentiments using NVivo 12 Plus (QSR International, Burlington, MA). For this, student responses were imported as PDF files and coded using the auto-code wizard functionality.

We also applied the same methodology to analyze the content of students' reflection assignments to characterize common themes. NVivo 12 Plus was used to generate a word cloud from the thousand most frequent stemmed words.

Statistical analysis: For statistical calculations and data representation, the R statistical package (R Foundation for Statistical Computing, Vienna, Austria) was used. Average student ages, incoming GPAs, incoming DAT scores and pre/posttest survey means were compared with the unpaired Student's t-test, while student gender percentages, ethnicity and course grades were compared with Fisher's exact test. Exam scores were not normally distributed and were therefore compared across groups using the Kruskal-Wallis rank sum test. Sentiment and keyword count proportions among student comments for the 2015 and 2016 years were compared using chi-square proportion testing.
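As a rough illustration of these comparisons, the following sketch shows the corresponding base R function calls. The data values are simulated placeholders, not the study data, and the object names are hypothetical.

```r
# Minimal sketch of the statistical comparisons described above (base R, stats package).
set.seed(1)

# Continuous class characteristics (age, GPA, DAT, pre/posttest means): unpaired t-test
gpa_2015 <- rnorm(67, mean = 3.31, sd = 0.2)   # hypothetical incoming GPA, 2015 cohort
gpa_2016 <- rnorm(69, mean = 3.29, sd = 0.2)   # hypothetical incoming GPA, 2016 cohort
t.test(gpa_2015, gpa_2016)

# Categorical characteristics (gender, ethnicity, course grades): Fisher's exact test
sex_table <- matrix(c(36, 31, 33, 36), nrow = 2,
                    dimnames = list(c("male", "female"), c("2015", "2016")))
fisher.test(sex_table)

# Exam scores (not normally distributed): Kruskal-Wallis rank sum test across groups
exam_scores <- list(mid_2015   = runif(67, 60, 95),
                    final_2015 = runif(67, 55, 90),
                    mid_2016   = runif(69, 40, 90),
                    final_2016 = runif(69, 60, 98))
kruskal.test(exam_scores)

# Keyword/sentiment count proportions between years: chi-square test of proportions
prop.test(x = c(33, 30), n = c(60, 47))
```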

Results/Analysis

The classes taking the 2015 and 2016 DMD 5715 Blood and lymphatics systems course were similar in class size, gender and ethnic/racial make-up, age, incoming GPA and DAT performance (Table 1). Given that no statistically significant differences were observed, we assumed that any differences in academic performance between the 2015 and 2016 courses would be attributable to the addition of reflective exercises.

Table 1: Class composition of students taking the 2015 and 2016 DMD 5715 Blood and lymphatics systems course

 

                     2015 (- reflection)   2016 (+ reflection)
Student number       67                    69
% Males              54                    48
Average Age          28                    27
Ethnicity / Race
  White              26                    21
  Asian              23                    25
  Other              18                    23
Incoming GPA
  Overall            3.31                  3.29
  BCP                3.19                  3.10
  Science            3.21                  3.20
  Non-Science        3.51                  3.49
Incoming DAT
  Acad. Avg.         19.19                 19.54
  Biology            19.13                 19.47
  G. Chem            19.75                 20.04
  O. Chem            19.95                 20.18
  P. Ability         20.11                 20.41
  Quant. Res.        18.23                 18.53
  Read. Comp.        19.61                 20.44
  Total Sci.         19.26                 19.49

Class statistics for both classes show no statistically significant differences. Abbreviations: “GPA”: Grade Point Average. “DAT”: Dental Admissions Test. “BCP”: Biology, chemistry, physics. “G. Chem”: General chemistry. “O. Chem”: Organic chemistry. “P. Ability”: Perceptual Ability. “Quant. Res.”: Quantitative Reasoning. “Read. Comp.”: Reading Comprehension. “Total Sci.”: Total Sciences.

We used identical pre/posttest surveys to measure factual knowledge recall during the course. In both the 2015 and 2016 courses, students answered significantly more items correctly on the posttest, suggesting that both courses were effective in teaching factual knowledge (see Figure 2 for the 2015 class performance). The pre/posttest only evaluates recall, whereas the written assessments in this course also test higher-order skills such as application and analysis of clinical scenarios. Nevertheless, performance on the posttest correlates weakly with exam performance (R²=0.15), and similarly with the course grade.
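For reference, an R² of this kind can be obtained from a simple linear regression of exam score on posttest score. The sketch below uses simulated placeholder data and a hypothetical data frame named scores, not the study data.

```r
# Minimal sketch of the posttest-vs-exam correlation check (simulated data).
set.seed(2)
scores <- data.frame(posttest = rnorm(67, mean = 14, sd = 3))
scores$exam <- 60 + 1.5 * scores$posttest + rnorm(67, sd = 8)  # hypothetical relationship

fit <- lm(exam ~ posttest, data = scores)   # ordinary least-squares fit
summary(fit)$r.squared                      # coefficient of determination, R^2
cor(scores$posttest, scores$exam)^2         # equivalent: squared Pearson correlation
```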

Figure 2: Boxplot of pre and post test scores in the 2015 course (without reflection exercises)

Representative increase of test scores between pre and post tests

There was a statistically significant increase in the number of correct answers between pre and posttest surveys in the 2015 course, suggesting effective learning (p<0.05). The boxplot for 2016 course pre and post test score data appears similar to this figure.

To determine whether reflective exercises aided academic performance on assessments evaluating knowledge acquisition, we compared pre/posttest surveys, exam grades and course grades. No statistically significant differences were found for the pre/posttest surveys (Figure 3A) or course grades (Figure 3B). Exam scores varied greatly and showed no apparent trend (2015 exam scores: mid-term average 80%, standard deviation 8%; final exam average 73%, standard deviation 8%; 2016: mid-term 65%, standard deviation 19%; final exam 82%, standard deviation 12%), with all scores statistically significantly different from each other (p<0.001, Kruskal-Wallis). This suggests that the addition of reflective exercises had no measurable effect on academic performance in this course.

Figure 3: Comparison of pretest-posttest score differences (A) and course grade distribution (B) in course without or with reflection exercises

Addition of reflection exercises had no effect on learning as evidenced by no statistical difference between pre-post test score increase or course grade distribution.

(A) Boxplot of pre- to posttest score increases for the 2015 course lacking reflective exercises and the 2016 course with the added reflective exercises. Addition of reflective exercises did not lead to any statistically significant differences in pre/posttest score gains. (B) No statistically significant differences were seen in the course grade distribution (shown) or the average course grade. This confirms the pre/posttest survey findings and suggests that addition of reflective exercises had no significant effect on academic performance in this course.

Since reflective exercises may improve professionalism, satisfaction, and student attitudes, we sought to determine whether this was reflected in the course evaluations. Therefore, we analyzed the free-text responses in the course evaluation for manually coded nodes such as “learn”, “understand”, “professionalism” or “self (directed learning)” that correspond to intangible benefits of reflection reported previously. Only the 2016 course, which included reflection exercises, elicited a student comment on the reflection exercises, albeit a single one: “(response no. 19) Self-reflection helped solidify information”. Surprisingly, the overall tone of the course evaluation comments felt more negative (Table 2), and a machine-based sentiment analysis confirmed this impression (Figure 4), with a significant reduction in mean sentiment (p<0.05, Mann-Whitney U test). This impression was also reflected in the more negative responses on the quantitative portion of the course evaluation survey (Table 3).

Table 2: Qualitative evaluation of student comments in course evaluation using manual coding

Keyword/Theme                                   Course evaluations
                                                - reflection       + reflection †
Total written comments excluding “n/a”          60 (90%)           47 (68%)‡
“reflect..” / Reflection-positive               0 (0%)             1 (2%)
“reflect..” / Reflection-negative               0 (0%)             0 (0%)
“enjoy..” / Enjoyment of course                 7 (12%)            1 (2%)
“learn..” / Learning-positive experience        10 (17%)           2 (4%)‡
“learn..” / Learning-negative experience        0 (0%)             0 (0%)
“learn” / about new specific condition          0 (0%)             0 (0%)
“learn” / about treatment                       0 (0%)             0 (0%)
“understand…” / deeper understanding            5 (8%)             5 (11%)
“empathy” / Empathy                             0 (0%)             0 (0%)
“collabora..” / Collaboration                   0 (0%)             0 (0%)
“self..” / Self-awareness                       0 (0%)             0 (0%)
“profession..” / Professionalism                0 (0%)             0 (0%)
Course mechanics*                               33 (55%)           30 (64%)
Professor-specific*                             28 (47%)           7 (15%)‡

Manually coded themes reported by students in the course evaluations for the 2015 course without reflection exercises (- reflection) and the 2016 course with reflection exercises (+ reflection). (*) General themes such as course mechanics (e.g. allotted hours, grading) and professor-specifics (e.g. liked/disliked professor). “n/a” indicates items that were not analyzed for the reflection assignment. † 2015 and 2016 count differences are statistically significant (p<0.001, ANOVA); ‡ individual differences (p<0.05) between 2015 and 2016

 

Figure 4: Sentiment scores by students for the course with or without reflection exercise

When reflection exercises were introduced to the course, there was a corresponding shift to more negative sentiments expressed in the course evaluations.

Sentiment scores range from very negative (-2) through neutral (0) to very positive (2). There was a significant shift towards negative ratings in the distribution of sentiment scores with the addition of reflection (p<0.05, Mann-Whitney U).

Table 3: Quantitative Ratings of the Course Experience as provided by the Course Evaluation Survey

Survey item with Likert-type rating (1 = strongly disagree to 5 = strongly agree)                    - reflection (N=67)   + reflection † (N=69)
“I reviewed the specific learning objectives for the course”                                         63 (93%) "YES"        67 (97%) "YES"
“The content was presented in a sequence that effectively supported my learning”                     68 (100%) "YES"       63 (91%) "YES"
“Overall, the course was well designed”                                                              4.5 ± 0.63            4.0 ± 0.92*
“The large group sessions were effective in summarizing the important facts”                         4.5 ± 0.55            4.3 ± 0.65
“Overall, the tests reflected the learning objectives and the material emphasized in this course”    4.3 ± 0.61            3.5 ± 1.16*

Likert scale rating averages for the quantitative section of the course evaluation. As with the written feedback, the course with added reflection exercises received more critical feedback. (* p<0.05, Student's t-test)

Given that the unexpectedly negative feedback was associated with no course changes other than the addition of the reflective exercises, we sought to determine whether the reflection exercises themselves provided clues for this shift in attitude. Using the same approach, we analyzed the individual responses given during the email survey. Surprisingly, manual coding showed generally positive responses, with students recognizing the value of reflection as reported in previous literature (Table 4).

Table 4: Themes expressed by students in their email survey

Keyword/Theme                                   Reflection assignments
Total written comments excluding “n/a”          69
“learn” / about new specific condition          26 (38%)
“learn” / about treatment                       32 (46%)
“understand…” / deeper understanding            11 (16%)
“empathy” / Empathy                             5 (7%)
“collabora..” / Collaboration                   6 (9%)
“self..” / Self-awareness                       1 (1%)
“profession..” / Professionalism                6 (9%)
Course mechanics*                               1 (1%)
Professor-specific*                             0 (0%)

Manually coded themes seen in the individual students' reflection exercises suggest that students see the value of reflective exercises, and that the more critical course evaluation feedback was likely triggered by a factor other than the reflective exercises. * Course mechanics and professor-specifics were coded manually with a variety of words (e.g. "allotted hours", "grading", and any wording referring to the professor)

Machine-based coding was not useful for validating the sentiment of the reflection exercises, as it wrongly coded statements about disease and medical treatment as "negative". Instead, machine coding of the responses demonstrated themes that reflected practical application of the knowledge acquired in this course (Figure 5). The themes expressed in the reflection assignment also provided valuable feedback on what the students remembered as important in this course.

Figure 5: Themes of the reflection assignment as shown by a word cloud

This shows synonyms for the 1000 most frequently mentioned word combinations of three in the student responses. While "patient" and "learned" were expected from the setup of the assignment, note that many terms directly tie to specifics of patient care for the various blood conditions mentioned here.

Discussion

In this study, we failed to see a significant difference in pre/posttest score gains or class performance between the two cohorts, suggesting that the reflective exercises had no immediate effect on academic performance. Yet, the reflective exercises themselves provide useful insights into possible benefits that are not readily measured by standard academic measures and course evaluation surveys.

Reflection as practiced in this course did not significantly improve academic performance, perhaps because the course may already have maximized student learning through other methods. As a systems-based course using multiple teaching methods, the 2015 course iteration already contained case-based learning, interaction with faculty in classroom discussions, group projects and assessments, all of which are known to enhance learning in some students. However, it is possible that the addition of reflective exercises may still produce measurable academic performance improvement in settings that favor passive learning (Wingfield and Black, 2010).

Another explanation may be that the reflective exercises were not specific enough to reinforce the learned material for better academic performance. We opted to provide a variety of reflection exercises ranging from very broad to very specific so that general reflection skills applicable to the remainder of the students' careers could be nurtured. It is conceivable that more specific, repetitive, guided reflection on key topics assessed on the exams would yield a measurable increase in exam performance. Yet, we rejected such a strategy as it risks rote memorization of facts, which we felt was inappropriate for the goals of this course.

A third alternative explanation may be that improvements in learning triggered by reflective exercises in a health care education setting were not captured by the assessment tools of this study. Analogous to this study, Miller Juve (2012) aimed to determine learning improvement in anesthesiology residents as a consequence of eight weekly, formalized reflection sessions using Gibbs' (1988) model of reflection. Instead of measuring specific attainment of knowledge, Miller Juve aimed to determine improvements in attitudes, abilities and characteristics of learning of these residents using a validated formal assessment tool, the Self-Directed Learning Readiness Scale/Learning Preference Assessment (SDLRS/LPA) (Guglielmino, 1978), and a follow-up survey on attained knowledge. Neither showed any statistical difference before and after the reflective sessions, despite the SDLRS/LPA usually being regarded as a reliable and valid measure of self-directed learning. Given the dearth of quantitative data on the effects of reflection, it is possible that measurement of the positive effects typically attributed to reflection awaits the development of a validated and reliable assessment tool.

A fourth alternative explanation may be that the improvements in learning caused by reflective learning are of a different nature than an increase in factual knowledge. Many studies suggest more intangible benefits that could not have been measured with the assessment tools used in this study. For example, George et al. (2013) report that self-reflection leads to the setting of more complex learning goals, and there are several studies suggesting an increase in self-awareness, professionalism and understanding, as described above. The reflection assignments seem to confirm these effects, and further studies may confirm this.

Last, it is possible that the tools used in this study to measure knowledge acquisition were not adequate for the task and therefore failed to show differences in knowledge acquisition. The pre/posttest survey tool may not have had enough questions to uncover subtle differences, and it does not correlate strongly with exam performance. A tool with more questions covering a broader area of knowledge or clinical applications might have uncovered some measurable effects of reflection, but it would have been too intrusive in teaching; the survey was designed as a compromise between low intrusiveness and high comprehensiveness. While a high degree of correlation between pre/posttest survey score increases and final exam performance would be an ideal validator of this survey tool, it was likely not achievable, as the exams and surveys are independent. Each exam was unique, with no questions being reused, and none of the survey questions appeared on the exams. In addition, the survey tool featured questions focused on simple recall of detailed knowledge, while a high proportion of exam questions tested clinical application and critical thinking skills.

Qualitative analysis of course evaluations revealed several significant trends associated with the introduction of reflection exercises. Compared to the 2015 course that lacked reflection exercises, the introduction of reflection exercises was associated with a significantly reduced number of comments, a lower count of “enjoyment” and “learning” keywords, and a reduction in professor-specific comments. The general tone of the comments changed from quite positive to more critical, and the length of the comments decreased. It is unknown whether the addition of reflection exercises led to this change, or whether the 2016 class simply had a more reluctant and critical character than the previous class. To explore this, we compared student responses between 2015 and 2016 for another course, the “DMD5130 Musculoskeletal system course”, which was taught immediately after the course described in this study, included the author of this study as an instructor, and was taught the same way in both years. The same pattern was seen and was confirmed with machine-based sentiment analysis: the 2015 class provided more comments (18 vs. 8) than the 2016 class, and the general tenor was somewhat more positive in 2015. Consequently, the shift in sentiment was unlikely to have been caused by the addition of reflective exercises.

Instead, the reflection exercises seem to have been positively received, judging by their content, which demonstrated notions of empathy, awareness, and clinical application. In addition, these exercises provide valuable insights into what the students remember from the course, which can help in future course design. As these reflective exercises are simple to create and take little classroom time, they likely provide intangible benefits that deserve further study.

Conclusion

In the context of a case-based system course, the addition of reflective exercises did not produce any significant effect on student learning as measured with pre/posttest surveys, exam scores and course grades. We observed a significant decrease in positive sentiment on course evaluations when we introduced reflective exercises, but a connection to the reflective exercises seems unlikely, as the same decrease was also observed in another course taken by the same student cohorts that lacked reflective exercises. Based on textual analysis, the effect of reflective exercises in early professional students is most likely seen in skills such as professionalism, interdisciplinary collaboration, empathy and learning attitudes.

Take Home Messages

  • Reflective exercises may not produce measurable gains in academic performance
  • Class sentiment may vary significantly between course iterations
  • Class sentiment changes and addition of reflection exercises may not be correlated
  • Reflective exercises may encourage collaboration, professionalism, and empathy in some students

Notes On Contributors

Dr. Boehm is an associate professor at Western University of Health Sciences, United States of America, where he teaches microbiology, the blood and lymphatic system course mentioned in this manuscript, and a variety of preclinical and clinical courses in periodontics. He also practices full-time as a periodontist at The Dental Center at Western University of Health Sciences. ORCID ID: https://orcid.org/0000-0002-5200-6178

Acknowledgements

This study was funded internally by Western University of Health Sciences. We thank the following individuals for their contributions to this work: Dr. Gary Pape for an initial review of the manuscript; Dr. Rod Hicks for guidance on mixed-methods research and review of the manuscript; Mr. Rudy Barreras for technical assistance with machine contextual analysis; and the Office of Academic Affairs staff for collection and retrieval of student data.

All figures were created by the author of this article.

Bibliography/References

Beylefeld, A. A., Nena, K. D. and Prinsloo, E. A. (2005) 'Influence of community experiences on first-year medical students' reflective writing', Med Teach, 27(2), pp. 150-4. https://doi.org/10.1080/01421590400029590

Biggs, J. B. and Moore, P. J. (1993) Process of Learning. 3rd edn. London: Prentice Hall.

Borgstrom, E., Morris, R., Wood, D., Cohn, S., et al. (2016) 'Learning to care: medical students' reported value and evaluation of palliative care teaching involving meeting patients and reflective writing', BMC Med Educ, 16(1), p. 306. https://doi.org/10.1186/s12909-016-0827-6

Bradner, M. K., Crossman, S. H., Gary, J., Vanderbilt, A. A., et al. (2015) 'Beyond diagnoses: family medicine core themes in student reflective writing', Fam Med, 47(3), pp. 182-6, https://www.ncbi.nlm.nih.gov/pubmed/25853528 (Accessed: 3 September 2020).

Collins, A., Brown, J. S. and Newman, S. E. (1987) Cognitive Apprenticeship: Teaching the Craft of Reading, Writing, and Mathematics (Technical Report No. 403). Hillsdale, NJ: Lawrence Erlbaum Associates.

Cruickshank, D. R. (1987) Reflective Teaching: The Preparation of Students of Teaching. Association of Teacher Educators.

DasGupta, S. and Charon, R. (2004) 'Personal illness narratives: using reflective writing to teach empathy', Acad Med, 79(4), pp. 351-6. https://doi.org/10.1097/00001888-200404000-00013

Dhaliwal, U., Singh, S. and Singh, N. (2018) 'Reflective student narratives: honing professionalism and empathy', Indian J Med Ethics, 3(1), pp. 9-15. https://doi.org/10.20529/IJME.2017.069

George, P., Reis, S., Dobson, M. and Nothnagle, M. (2013) 'Using a learning coach to develop family medicine residents' goal-setting and reflection skills', J Grad Med Educ, 5(2), pp. 289-93. https://doi.org/10.4300/JGME-D-12-00276.1

Gibbs, G. (1988) Learning by Doing: A Guide to Teaching and Learning Methods. Oxford: Oxford Further Education Unit.

Guglielmino, L. M. (1978) Development of the Self-Directed Learning Readiness Scale. Doctoral thesis. ProQuest Information & Learning.

Henderson, P. and Johnson, M. H. (2002) 'An innovative approach to developing the reflective skills of medical students', BMC Med Educ, 2, p. 4. https://doi.org/10.1186/1472-6920-2-4

Hewson, M. G., Jensen, N. M. and Hewson, P. W. (1989) 'Reflection in residency training in the general internal medicine clinic.', American Educational Research Association. San Francisco.

Knowles, M. S. (1975) Self-directed Learning: A Guide for Learners and Teachers. Cambridge Adult Education.

Kolb, D. A. (1984) Experiential learning: Experience as the Source of Learning and Development. Englewood Cliffs, New Jersey: Prentice Hall.

Levine, R. B., Kern, D. E. and Wright, S. M. (2008) 'The impact of prompted narrative writing during internship on reflective practice: a qualitative study', Adv Health Sci Educ Theory Pract, 13(5), pp. 723-33. https://doi.org/10.1007/s10459-007-9079-x

Lewin, L. O., Robert, N. J., Raczek, J., Carraccio, C., et al. (2014) 'An online evidence based medicine exercise prompts reflection in third year medical students', BMC Med Educ, 14, p. 164. https://doi.org/10.1186/1472-6920-14-164

Miller Juve, A. K. (2012) Reflective Practice and Readiness for Self-Directed Learning in Anesthesiology Residents Training in the United States. Doctor of Education in Educational Leadership: Postsecondary Education thesis. University of Oregon. Available at: https://pdxscholar.library.pdx.edu/open_access_etds/235/ (Accessed: 3 September 2020).

Mori, B., Batty, H. P. and Brooks, D. (2008) 'The feasibility of an electronic reflective practice exercise among physiotherapy students', Med Teach, 30(8), pp. e232-8. https://doi.org/10.1080/01421590802258870

Okubo, Y., Nomura, K., Saito, H., Saito, N., et al. (2014) 'Reflection and feedback in ambulatory education', Clin Teach, 11(5), pp. 355-60. https://doi.org/10.1111/tct.12164

Pee, B., Woodman, T., Fry, H. and Davenport, E. S. (2000) 'Practice-based learning: views on the development of a reflective learning tool', Med Educ, 34(9), pp. 754-61. https://doi.org/10.1046/j.1365-2923.2000.00670.x

Schon, D. A. (1987) Educating the Reflective Practitioner: Towards a New Design for Teaching and Learning in the Professions. San Francisco: Jossey-Bass.

Shapiro, J., Kasman, D. and Shafer, A. (2006) 'Words and wards: a model of reflective writing and its uses in medical education', J Med Humanit, 27(4), pp. 231-44. https://doi.org/10.1007/s10912-006-9020-y

Shaughnessy, A. F. and Duggan, A. P. (2013) 'Family medicine residents' reactions to introducing a reflective exercise into training', Educ Health (Abingdon), 26(3), pp. 141-6. https://doi.org/10.4103/1357-6283.125987

Shulman, L. S. (1985) 'A course of treatment: teaching and medicine. Third Annual Invited Review', Research in Medical Education Conference. Washington, D.C.

Smith, M. J. (2018) 'Please don't make us write an essay! Reflective writing as a tool for teaching health advocacy to medical students', Paediatr Child Health, 23(7), pp. 429-430. https://doi.org/10.1093/pch/pxy055

Tong, A., Sainsbury, P. and Craig, J. (2007) 'Consolidated criteria for reporting qualitative research (COREQ): a 32-item checklist for interviews and focus groups', Int J Qual Health Care, 19(6), pp. 349-57. https://doi.org/10.1093/intqhc/mzm042

Towle, A. and Cottrell, D. (1996) 'Self directed learning', Arch Dis Child, 74(4), pp. 357-9. https://doi.org/10.1136/adc.74.4.357

Wakeling, J., Holmes, S., Boyd, A., Tredinnick-Rowe, J., et al. (2019) 'Reflective Practice for Patient Benefit: An Analysis of Doctors' Appraisal Portfolios in Scotland', J Contin Educ Health Prof, 39(1), pp. 13-20. https://doi.org/10.1097/CEH.0000000000000236

Westmoreland, G. R., Counsell, S. R., Sennour, Y., Schubert, C. C., et al. (2009) 'Improving medical student attitudes toward older patients through a "council of elders" and reflective writing experience', J Am Geriatr Soc, 57(2), pp. 315-20. https://doi.org/10.1111/j.1532-5415.2008.02102.x

Wingfield, S. S. and Black, G. S. (2010) 'Active Versus Passive Course Designs: The Impact on Student Outcomes', Journal of Education for Business, 81(2), pp. 119-123. https://doi.org/10.3200/joeb.81.2.119-128

Appendices

None.

Declarations

There are no conflicts of interest.
This has been published under Creative Commons "CC BY-SA 4.0" (https://creativecommons.org/licenses/by-sa/4.0/)

Ethics Statement

The study was reviewed and approved as “exempt” by the Western University of Health Sciences Institutional Review Board (protocol number 16/IRB/045).

External Funding

This article has not had any External Funding

Reviews


Annwyne Houldsworth - (03/03/2021)
I wish to thank the authors for investigating this extremely important and interesting subject with regard to healthcare worker teaching and enjoyed reading the paper.
The paper addresses the effects of reflective practice on academic achievement in biomedical science students. The methods and results are clearly presented and introduced effectively.
The authors must have been disappointed that no significant difference was found in academic achievement and examination results when comparing groups exercising reflection and those not.
The paper does not describe, in detail, any of the reflective exercises that resulted in positive professionalism in the group practicing reflection. Some personal successes in encouraging reflection have involved using systems such as Boyatzis's, where students consider and develop strategies and plans to improve knowledge acquisition as a result of student reflection. These reflection-associated plans are intended to actively improve methods of study and learning by recognising areas for improvement and to encourage self-authorship of their studentship and professionalism.
However, more important outcomes of learning are in comprehension and deep understanding of knowledge, rather than shallow superficial memory learning, and I wonder about the nature of the tests that the students took in this research project.
Reflecting on strengths, weaknesses, opportunities and threats may also improve methods and attitudes to study.
There are many activators of motivation to study, as motivation for achievement is not a single construct; goals, concepts of self and the valuation of different tasks all contribute to achievement motivation. Examinations are generally considered to be external motivation, but achievement in examinations can, in itself, be an internal motivator.
I did not notice any extension to these investigations in the paper or future plans to build on these findings and wonder if longitudinal findings may observe changes in the student culture that improve academic achievement in future cohorts.
Reporting on individual student case studies qualitatively may be an interesting extension, and I hope that you continue investigating this important area of healthcare training.
There were some grammatical typos; commas etc. could do with a reread from the author.
Possible Conflict of Interest:

None

Emma Brennan-Wydra - (25/01/2021)
This article describes a mixed-methods investigation of the effect of reflective writing exercises on student learning, performance, and attitudes in a case-based biomedical science course for first-year dental students, which may be of interest to educators in the health sciences and beyond who are looking to increase student engagement and learning. The introduction of the paper effectively lays out the motivations for undertaking the study by establishing the importance of reflection in the learning process and highlighting an apparent gap in the literature. The author uses various metrics (performance on pretest/posttest quizzes, midterm and final exam scores, grades, course evaluations) to compare two iterations of a course, one with (2016) and one without (2015) periodic reflective exercises.

The use of multiple sources of information to obtain a holistic picture of student learning, performance, and attitudes is a strength of the article, but it also introduced some weaknesses. In particular, I was left with questions about the qualitative methods used. For instance, it would be nice to know more about how the list of keywords/themes (e.g., “learning,” “reflection,” “professionalism”) was obtained, particularly since the keywords of interest seem to have been selected a priori. Using a deductive approach to coding is an established qualitative method, but there needs to be some more explanation and justification. Further, the way the methods are described seems to suggest that only comments that contained an exact match for a particular keyword (or common variant of the word) were assigned the code, which makes me wonder if other comments that addressed the theme may have been inadvertently missed. For example, a comment reading “This course gave me a better sense of what it means to be a dentist” may be seen to address understanding (“gave me a better sense”) and professionalism (“what it means to be a dentist”) but does not include either word per se. I was also not convinced of the appropriateness of using machine-based textual analysis given the relatively small amount of text; more commonly, multiple human coders will independently code the data and compare their codes to establish reliability. Including more examples of the student comments themselves (both from course evaluations and from reflective exercises) may further strengthen the presentation of the results.

In the discussion section, the author offers a number of plausible explanations for the main findings (limited evidence supporting the hypothesis that reflective exercises increase student learning) with an effort to place the present study in the context of the literature. The conclusions drawn seem reasonable based on the presented evidence. An additional factor worth exploring in future work is that of survey fatigue or respondent burden: the addition of reflective exercises in the course may have unintentionally decreased student motivation to write (positive) comments in the end-of-course evaluation.
Daniel Xu - (15/01/2021)
Well written, with a review of the effect of reflection on academic performance; it will be interesting to know whether the conclusion applies to the senior or clinical years of undergraduate medical education.