Research article
Open Access

Metacognitive awareness and the link with undergraduate examination performance and clinical reasoning

Paul Welch[1], Louise Young[2], Peter Johnson[3], Daniel Lindsay[4]

Institution: 1. James Cook University, 2. James Cook University, 3. James Cook University, 4. James Cook University
Corresponding Author: Mr Paul Welch paul.welch@my.jcu.edu.au
Categories: Educational Theory, Research in Medical Education, Teaching and Learning

Abstract

Theory: Metacognitive awareness is a component of self-regulated learning and helps us to understand and control our thinking and learning. Thinking about thinking is also an important aspect of the clinical reasoning process for medical practitioners.

Hypotheses: This pilot study researched the link between metacognitive awareness and undergraduate examination performance. The Metacognitive Awareness Inventory (MAI) is a validated 52-item survey instrument for measuring metacognitive awareness. It has eight sub-scales grouped into two domains: Knowledge of Cognition and Regulation of Cognition. It was hypothesised that MAI scores would increase between first and fifth-year undergraduate medicine students and, secondly, that MAI scores would correlate with undergraduate examination results.

Method: Medical students at James Cook University, Australia were invited to complete the MAI and consented to give access to their examination scores.

Results: The results of this pilot study found that metacognitive awareness was not significantly different between first and fifth-year undergraduates in this sample. For first-year medical undergraduates there were correlations between the Knowledge of Cognition domain and their end of year examination results, but not with the Regulation of Cognition domain. For fifth-year students there were correlations between both the Knowledge and Regulation of Cognition domains and their end of year examination results.

Conclusion: This study identified that metacognitive awareness is not significantly different between first and fifth-year medical students. This may cause concern given that the study identified the importance of both MAI domains in undergraduate medical examinations. This study should be repeated on a larger scale; if the findings are confirmed, raising metacognitive awareness levels among students would be desirable. Increasing metacognitive awareness may raise examination performance and better prepare students for developing clinical reasoning skills.

Keywords: Clinical reasoning, Metacognition, Undergraduate examinations, Expertise, Learning

Introduction

Clinical reasoning failures are a noteworthy cause of patient mortality and morbidity, resulting in approximately 40,000 deaths per year and accounting for 28.6% of malpractice claims in the USA (Newman-Toker & Pronovost, 2009; Saber Tehrani et al., 2013). In Australia, 27.6% of adverse events were due to a failure to synthesise, decide or act upon available information, or to a failure to arrange an investigation, procedure or consultation (Wilson, Harrison, Gibberd, & Hamilton, 1999). Clinical teachers consider clinical reasoning problematic to conceptualise and teach, as well as difficult for students to grasp, but it is considered central to the medical education process (Charlin, 2012; Cutrer, Sullivan, & Fleming, 2013).

In medical practice, doctors gather clinical data about a patient and then make decisions based on the meaning they attribute to this information. These clinical reasoning skills can be defined as the “ability to sort through a cluster of features presented by a patient and then to accurately assign a diagnostic label, with the development of an appropriate treatment strategy being the end goal” (Eva, 2005, p. 98). Helping medical students and junior doctors develop clinical reasoning skills is a central aim of medical education and requires knowledge and experience (Cutrer et al., 2013).

Seeking to better understand self-regulated learning among medical students has attracted attention from medical educators in recent years for two reasons (Bruin, Dunlosky, & Cavalcanti, 2017; Song, Kalet, & Plass, 2011). Firstly, self-regulated factors have been identified as a source of achievement differences between students (Zimmerman & Pons, 1986). Secondly, self-regulated learning (SRL) has been demonstrated to be an effective means of raising student achievement (Schunk, 1981). In the 1980s, Schunk showed that students who are proactive self-regulators set goals, devise and implement effective learning strategies, create an effective learning environment, seek feedback and help when necessary, show tenacity, self-monitor, and assess their progress towards specific goals (Zimmerman & Schunk, 2011). Studies have shown that metacognitive skills, a component of SRL, increase during adolescence, plateau during early adulthood and then decline in older age (Palmer, David, & Fleming, 2014; Weil et al., 2013).

Metacognition, or thinking about one's thinking, is vital to the development of clinical reasoning skills due to its role in monitoring and regulating cognitive processes (Colbert et al., 2015; Croskerry, 2003a, 2003b; Dunphy et al., 2010; Marcum, 2012). Medical accreditation authorities recognise metacognitive skills as important and seek to mandate their development by practising clinicians (CPMEC, 2016; ACGME, 2015; GMC, 2016). Although metacognitive skills are widely regarded as important for the development of clinical reasoning, they are seldom directly taught or assessed at medical school (Colbert et al., 2015). Metacognitive skills are the component of SRL which manages the cognitive aspects of learning and thinking (Bruin et al., 2017; Gönüllü & Artar, 2014). Effective SRL skills are positively associated with superior levels of student achievement, as well as expertise in clinical reasoning (Zimmerman & Schunk, 2011).

Cognitive skills are needed to perform a task, whereas metacognition is required to manage these cognitive skills (Garner, 1987). The first component of metacognitive awareness is Knowledge of cognition, which enables a person to have knowledge about their own thinking and learning processes (Gönüllü & Artar, 2014; Schraw, 1998). The second component of metacognitive awareness is Regulation of cognition. Being able to regulate their cognition enables an individual to self-monitor the effectiveness of their learning and decision making, regarded as vital to all clinicians, whether students or experts (Bruin et al., 2017).

Knowledge of cognition comprises declarative, procedural and conditional knowledge. Declarative knowledge encompasses knowing about things and covers facts, information, events, rules and processes. It involves networks of facts, is public knowledge and originates from what teachers state or declare (Anderson, 1982). Knowledge is encoded declaratively and then interpreted for it to be understood. For knowledge to be converted into behaviour, it must first go through this interpretive stage. Declarative knowledge may be thought of as conceptual, propositional, descriptive or explicit knowledge.

When declarative knowledge is put to work and involves knowing how to do something, it becomes procedural knowledge, which undergoes continual refinement as processing speed increases. Procedural knowledge refers to the performance of knowledge, or implicit knowledge, and is a behaviour or skill. Knowing when and why to use a particular skill constitutes conditional knowledge, which involves the regulation of memory, thought and learning (Ackerman & Zalmanov, 2012).

Regulation of cognition helps the student control or manage their learning or decision making; for example, a medical student may ask themselves if they understand the significance of a specific piece of clinical data, or if they comprehend the rationale behind a clinical decision or management plan. Regulation of cognition comprises planning, information management, monitoring, debugging strategies and evaluation (Schraw & Dennison, 1994).

Planning involves choosing the best strategy alongside managing the required information to achieve the desired outcome. Monitoring refers to the ability to self-test the progress being made on a task. Debugging strategies involve intentionally looking for disconfirming evidence to reduce the risk of confirmation bias. Finally, evaluation refers to the ability to globally assess progress towards solving a clinical problem or gaining mastery in learning.

 

Figure 1 – How the components of metacognitive awareness may interact with four stages of the clinical reasoning cycle.

 

Figure 1 shows how the Regulation of cognition sub-domains interact with all stages of the clinical reasoning cycle as the individual plans, monitors and evaluates new information in the light of existing knowledge. As the doctor in training makes use of debugging strategies, they test their reasoning for its robustness (Croskerry, 2003a). The Knowledge of cognition sub-domains help the clinician to encode and interpret new information by linking it to their existing knowledge structures. This new knowledge may then be used in the clinical reasoning process. Knowledge already possessed by the doctor in training guides them to seek new patient data, as well as helping to make sense of the clinical findings. Clinical reasoning is an iterative process (Charlin, 2012). It may take place rapidly and with little metacognitive input, but this may increase the clinical reasoning error rate (Croskerry, 2003b; Graber, 2003; Kiesewetter et al., 2016).

Clinical reasoning errors appear not to be predominantly due to incompetence or a lack of knowledge, but arise from the complexity of the case, or under conditions of uncertainty or time pressure (Scott, 2009). The various errors and biases that impact on clinician judgement have been well documented, and commonly include premature closure, whereby a diagnosis or management plan is decided upon before all the possibilities have been adequately considered (Nendaz & Perrier, 2012; Scott, 2009).

Medical school examinations test student knowledge and clinical skills with the goal of producing intern doctors prepared for clinical practice. Part of this preparation includes developing the clinical reasoning capabilities of their graduates and, by inference, their metacognitive capacities, in readiness for clinical reasoning in the workplace. In this paper, the authors test the hypothesis that there is a correlation between the metacognitive awareness of medical students and their examination performance and, secondly, that there will be an increase in metacognitive awareness levels between first and fifth-year students of a medical course.

This research took place with volunteer undergraduate medical students at James Cook University (JCU), Australia. The university and hospitals used for clinical placements are in regional and rural northern Queensland, Australia. The first year of the medical program at JCU is regarded as the first of three pre-clinical years. Although there are several clinical placements, most of the time is spent in class or laboratory sessions. For the last three years of the program, students are primarily taught while on clinical placements. During their medical course, students are periodically required to write reflectively about their clinical experiences, but they are not explicitly taught metacognitive techniques.

Research questions

The research questions hypothesised, firstly, that metacognitive awareness levels would positively correlate with undergraduate examination scores and, secondly, that metacognitive awareness scores would increase from the first to the fifth year of the medical course.

Method

Ethical approval for this pilot study was obtained from the James Cook University Human Research Ethics Committee (reference H6008). This study used a between-subject design with data collected at one point in time.

Students were invited to participate in this study by email. The participating first-year medical undergraduates (43 of 197) had a mean age of 19.19 (SD = 2.7) years, with 19 male and 24 female students. The fifth-year participants (13 of 177) had a mean age of 25.69 (SD = 6.16) years, with seven male and six female students. This study was undertaken in September 2015. Two types of measurement instrument were used: medical school examination results and the Metacognitive Awareness Inventory (MAI) (Schraw & Dennison, 1994).

Measurement instruments

Medical School Examinations

A variety of examinations are used by the medical school to test knowledge and clinical skills, and these include the Objective Structured Clinical Examination (OSCE), Key Features Papers (KFP) and Multi-Station Assessment Task (MSAT). An OSCE is a focused examination consisting of approximately 20 stations where students may be asked, for example, to take a patient history, examine a part of the body or interpret a laboratory report (Harden, 1988). These tasks form part of the information gathering and analysis components of the clinical reasoning process (Welch et al., 2017). The OSCE is a commonly used and highly regarded method of assessment around the world, both in undergraduate and postgraduate examinations (Khan, Ramachandran, Gaunt, & Pushkar, 2013). 

Key Features Papers (KFP) were designed as a means of evaluating clinical problem solving and decision-making skills (Page & Bordage, 1995; Page, Bordage, & Allen, 1995). The examination is a written test consisting of 15 to 20 brief cases where the student is required to make qualified decisions based on the information presented. The Multi-Station Assessment Tasks (MSAT) have strong similarities to the OSCE examination but are adapted for medical students in years 1-3 at JCU. There is a debate in the literature about how best to measure clinical reasoning (Gruppen, 2017). The absence of a 'gold standard' test means that assessment relies on several different types of examination, including the OSCE and KFP (Ilgen et al., 2012; Rencic, Durning, Holmboe, & Gruppen, 2016).

Studies have shown that metacognitive awareness allows individuals to plan, sequence and monitor their learning in a way that improves their overall examination performance (Swanson, 1990). Metacognitive skills are essential for any complex learning process, but there is a lack of evidence connecting metacognitive awareness with performance in medical school examinations. This study investigates how metacognitive skills are associated with performance in undergraduate medical examinations.

The Metacognitive Awareness Inventory (MAI)

The Metacognitive Awareness Inventory (MAI) was developed to quantify the different components of metacognitive awareness (see Table 1) (Schraw & Dennison, 1994). The MAI has sub-scales within the Knowledge of Cognition domain: declarative knowledge, procedural knowledge and conditional knowledge. From the perspective of clinical reasoning, it is important for the clinician to understand how knowledge is added to what they already know about the signs and symptoms of a patient. This knowledge about a patient's medical data is declarative knowledge (Schraw & Moshman, 1995). Procedural knowledge is concerned with how practical procedures are sequenced and executed, for example how a patient's abdomen is palpated to feel for an enlarged liver. Conditional knowledge is knowing why and when to apply various cognitive actions, for example knowing which heart sounds to listen for, based on the result of a patient's ECG and history (Schraw & Moshman, 1995).

The second domain of the MAI is Regulation of Cognition and is associated with controlling one's thinking and learning. There are five sub-scales in this domain: Planning, Information Management, Monitoring, Debugging and Evaluation (Schraw & Dennison, 1994). Planning involves controlling one's thinking and actively developing a strategy for solving a clinical problem. Information Management means actively reflecting on whether enough information has been gathered to enable a decision to be made. If there is insufficient information, the clinician can develop a strategy to obtain the extra information he/she needs to make a decision. Monitoring means the real-time awareness of how the clinician is utilising the information they are receiving and checking how they are performing clinically. Debugging techniques may be used to correct understanding and performance errors; for example, a student may reflect on why they placed undue emphasis on a particular clinical detail which led their thinking to an incorrect conclusion. Finally, the Evaluation sub-scale is linked with analysing the whole decision-making process and reflecting on its effectiveness. This evaluation step is particularly important in enabling the learner to deliberately and consciously make better decisions next time (Ericsson, 2004).

 

Table 1 – Domains and sub-scales of the Metacognitive Awareness Inventory (Schraw & Dennison, 1994)

 

The Metacognitive Awareness Inventory (MAI) was deployed via the online tool Smart Sparrow and consisted of a series of 52 short statements, for example, "I consider several alternatives to a problem before I answer" and "I consciously focus my attention on important information" (Schraw & Dennison, 1994; Smartsparrow, Accessed Sept 2014).

The MAI Knowledge of cognition domain score is calculated as the mean of the declarative, procedural and conditional knowledge sub-scale scores, while the Regulation of cognition domain score is the mean of the planning, information management, monitoring, debugging strategies and evaluation sub-scale scores.

The inventory required participants to indicate their level of agreement with each statement by positioning a sliding point on a 10-point scale bar. The far right of the scale bar (10) indicated strong agreement while the far left (1) denoted strong disagreement. Individual statement item scores ranged from 1 to 10, and scores for each sub-scale were totalled and recorded as a percentage of the maximum available for that sub-scale. Following online completion and submission of the MAI, participants were given a breakdown of their percentage scores in each of the eight sub-scales and feedback about how they could improve skills in each domain.
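To make this scoring procedure concrete, a minimal sketch in Python is given below. It assumes the item-to-sub-scale mapping shown in Appendix 1; the function names and example responses are illustrative and do not represent the authors' actual scoring code.

```python
# Minimal sketch (illustrative only) of the MAI scoring described above.
# Items score 1-10; each sub-scale total is expressed as a percentage of
# its maximum, and each domain score is the mean of its sub-scale scores.

# Item-to-sub-scale mapping taken from Appendix 1 (item 41 has no
# denoted domain and is therefore omitted).
SUBSCALE_ITEMS = {
    "declarative_knowledge": [5, 10, 12, 16, 17, 20, 32, 46],
    "procedural_knowledge": [3, 14, 27, 33],
    "conditional_knowledge": [15, 18, 26, 29, 35],
    "planning": [4, 6, 8, 22, 23, 42, 45],
    "information_management": [9, 13, 30, 31, 37, 39, 43, 47, 48],
    "monitoring": [1, 2, 11, 21, 28, 34, 49],
    "debugging_strategies": [25, 40, 44, 51, 52],
    "evaluation": [7, 19, 24, 36, 38, 50],
}
KNOWLEDGE = ["declarative_knowledge", "procedural_knowledge",
             "conditional_knowledge"]
REGULATION = ["planning", "information_management", "monitoring",
              "debugging_strategies", "evaluation"]


def subscale_percentage(responses, items):
    """Total the 1-10 item scores and express them as % of the maximum."""
    return 100.0 * sum(responses[i] for i in items) / (10 * len(items))


def score_mai(responses):
    """Return all sub-scale percentages plus the two domain means."""
    scores = {name: subscale_percentage(responses, items)
              for name, items in SUBSCALE_ITEMS.items()}
    scores["knowledge_of_cognition"] = sum(
        scores[s] for s in KNOWLEDGE) / len(KNOWLEDGE)
    scores["regulation_of_cognition"] = sum(
        scores[s] for s in REGULATION) / len(REGULATION)
    return scores


# Invented example: a participant who answers 7 on every item.
example_responses = {item: 7 for item in range(1, 53)}
print(score_mai(example_responses))
```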

The following student examination scores were obtained from consenting participants, depending on their year of the undergraduate medical program.

  • First year: Overall year mark, end of year Multi-Station Assessment Task (MSAT) examination scores, and Key Features Paper (KFP) examination scores.
  • Fifth year: Overall year mark, end of year Objective Structured Clinical Examination (OSCE) scores, and Key Features Paper (KFP) scores.

All academic performance data were de-identified by the College of Medicine and Dentistry Assessment Unit before analysis by the researchers. The data were stored in password-protected electronic files.

Results

Results are summarised in Table 2 below.

Table 2 – Results showing the significant correlations between the domains and sub-scales of the Metacognitive Awareness Inventory and Year 1 and Year 5 student examination results.

Scale for interpreting the Spearman's correlation coefficient (Hinkle, Wiersma, & Jurs, 2003):

rs = 0.0-0.3 negligible

rs = 0.3-0.5 low positive

rs = 0.5-0.7 moderate positive

rs = 0.7-0.9 high positive

rs = 0.9-1.0 very high positive
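As a small worked illustration, the Hinkle, Wiersma and Jurs (2003) bands above can be expressed as a simple lookup on the magnitude of the coefficient. This is a sketch only; values falling exactly on a band boundary are assigned to the higher category here.

```python
def interpret_spearman(rs):
    """Label a Spearman coefficient using the Hinkle et al. (2003) bands."""
    magnitude = abs(rs)
    if magnitude < 0.3:
        return "negligible"
    if magnitude < 0.5:
        return "low"
    if magnitude < 0.7:
        return "moderate"
    if magnitude < 0.9:
        return "high"
    return "very high"


# e.g. the fifth-year KFP / Knowledge of Cognition correlation reported below:
print(interpret_spearman(0.82))  # -> "high"
```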

In comparing the first and fifth-year overall MAI results, no significant differences were found (all p's > .05).

Analyses

Descriptive statistics were used to characterise the sample. As the data were not normally distributed, Spearman's correlation was used for correlation calculations. The Spearman's coefficient (rs), confidence intervals (CI) and p values are indicated in Table 2, along with the scale used for interpreting the Spearman's correlation coefficient. The effect size may be determined by comparing the rs value with this interpretation scale. Mann-Whitney tests were performed to examine any differences in MAI scores between first and fifth-year students.
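For readers wishing to reproduce this style of analysis, a minimal sketch using SciPy is shown below. The arrays are invented placeholders rather than the study data, and the sketch illustrates the tests named above; it is not the authors' actual analysis code.

```python
# Illustrative sketch of the analyses described above, using SciPy.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical per-student scores for one cohort (placeholders only).
mai_knowledge = rng.uniform(40, 95, size=43)   # Knowledge of Cognition (%)
exam_overall = rng.uniform(50, 90, size=43)    # overall year mark (%)

# Spearman correlation (rank-based, so no normality assumption).
rs, p_value = stats.spearmanr(mai_knowledge, exam_overall)
print(f"rs = {rs:.2f}, p = {p_value:.3f}")

# Mann-Whitney U test comparing MAI scores between two cohorts.
mai_year1 = rng.uniform(40, 95, size=43)
mai_year5 = rng.uniform(40, 95, size=13)
u_stat, p_value = stats.mannwhitneyu(mai_year1, mai_year5,
                                     alternative="two-sided")
print(f"U = {u_stat:.1f}, p = {p_value:.3f}")
```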

The first research question hypothesised that MAI scores would correlate with examination scores. For the first-year students, their overall examination result for the year had a low but significant correlation with the Knowledge of Cognition domain score of the MAI (rs = 0.32, p < 0.04), and their Key Features Paper result also had a low correlation with the Knowledge of Cognition domain score (rs = 0.32, p < 0.03). There were no significant correlations with the Regulation of Cognition domain score for the first-year student examination results.

For fifth-year students, their overall end of year examination result was moderately correlated with the Knowledge of Cognition domain score (rs = 0.69, p < 0.04) and their Key Features Paper (KFP) result was highly correlated with the Knowledge of Cognition domain (rs = 0.82, p < 0.01). The Regulation of Cognition domain was moderately correlated with performance in the KFP examination (rs = 0.62, p < 0.02), and there was a low correlation between the Regulation of Cognition domain score and their OSCE examination result (rs = 0.48, p < 0.04). The fifth-year overall MAI score was moderately correlated with performance in the KFP examination (rs = 0.63, p < 0.01).

Discussion

For first-year students, the Knowledge of Cognition domain and some of its sub-scales were correlated with examination performance. However, the relative unimportance of the Regulation of Cognition domain suggests that performing well in first-year examinations is not significantly associated with the skills of self-monitoring, self-evaluation and information management (some of the sub-scales of the Regulation of Cognition domain).

When undergraduate students reach the fifth year, both the Knowledge and Regulation of Cognition domains show important correlations with examination performance. The most notable difference between the first and fifth-year students is the increased importance of the Regulation of Cognition domain. For the fifth-year students, the results of this research show an increasing need to regulate their cognition in clinical examinations like the KFP and OSCE.

The purpose of the OSCE examination is to assess clinical competence in a planned and structured manner, while Key Features Papers are designed to assess clinical reasoning (Harden, 1988; Hrynchak, Glover Takahashi, & Nayer, 2014). This study provides evidence that performance in both the KFP and OSCE examinations is correlated with performance in the Regulation of Cognition domain of the MAI.

The second important finding from this research is that there was no significant increase in metacognitive awareness from first to fifth-year students. From the results above, metacognitive awareness becomes increasingly correlated with performance in the fifth-year examinations. If a student's metacognitive awareness does not change significantly from the first to the fifth year, a lower scoring student may struggle in the later years of medical school, where metacognitive awareness is more important. Secondly, and of relevance to graduates, the literature indicates there is a strong link between high levels of metacognition and clinical reasoning expertise – a crucial factor in clinical practice (Croskerry, 2003a).

The limitations of this study include the small number of participating students. Due to the small number of participants in this pilot study and the data not being normally distributed, Spearman's correlation was used. Another limitation is that the MAI has not previously been used extensively with medical undergraduates. The overall examination performances of the first and fifth-year participants were not significantly different from the overall mean results for their cohorts, suggesting the participants in this study were representative of their year cohorts. The single-site location of this study means the results cannot be generalised.

Future research should aim to confirm that MAI scores do not vary significantly between first and fifth-year medical students and that there are correlations between components of the MAI and the OSCE and KFP examinations for fifth-year students. Future research may extend this study to include students from other medical schools in both Australia and internationally. Practising clinicians may also be studied to investigate the relationship between their performance in Script Concordance Tests or postgraduate clinical fellowship examinations and their MAI scores (Boulouffe, Doucet, Muschart, Charlin, & Vanpee, 2013).

Conclusion

This pilot study found there was no significant difference in metacognitive awareness between first and fifth-year medical students. This may be a cause for concern given the importance of metacognitive awareness in self-regulated learning and expertise in clinical reasoning. The positive correlations between the sub-scales of the Regulation of Cognition domain and the fifth-year KFP and OSCE examinations highlight the importance of metacognitive awareness for undergraduate clinical examination performance. Our evidence, along with the literature, supports efforts to raise the metacognitive awareness of medical students, which may benefit both their examination performance and their clinical reasoning skill acquisition (Berkhout et al., 2015; Bruin et al., 2017).

Take Home Messages

  • Clinical reasoning is core to medical practice.
  • Metacognitive awareness is a key trait of clinical reasoning experts.
  • Metacognitive awareness levels appear to be correlated with some undergraduate examinations.
  • Supporting the learning of metacognitive awareness skills may assist undergraduate examinees and help foster the development of clinical reasoning expertise.

 

Notes On Contributors

Paul Welch

Paul has over twenty years of experience in education in both the secondary and tertiary sectors. His qualifications include BSc. (Hons), PGCE, MA and he is currently a PhD candidate at James Cook University, Queensland in the College of Medicine and Dentistry. His particular interest is in the development of clinical reasoning skills amongst doctors-in-training. Paul has taught and presented both within Australia and internationally.

Louise Young

Louise has a PhD and over 30 years' experience in education. Louise's research interests involve all aspects of medical/health professional education including innovations in teaching and learning, development of clinical teacher skills, mentoring, at-risk students and marginalised populations including people with a disability.

Peter Johnson

Associate Professor Peter Johnson is the MBBS Course Coordinator and Director of Foundation Studies for the Medicine and Physician’s Assistant programs at James Cook University. He has over 20 years of experience as an educator and curriculum developer in over 30 biomedical and allied health degree programs across 5 universities. 

Daniel Lindsay

Daniel has expertise in quantitative research methodology and statistical analyses. He holds a Bachelor of Psychology with Honours, as well as a PhD in Psychology. His research interests include health behaviours, innovations in teaching and learning, and health economics.

Acknowledgements

We would like to thank Andrew Moore for his IT expertise and assistance in the use of the Smart Sparrow software, and the Assessment Unit of the College of Medicine and Dentistry at James Cook University for their work in de-identifying the student academic performance data for the researchers.

Bibliography/References

Ackerman, R., & Zalmanov, H. (2012). The persistence of the fluency-confidence association in problem solving. Psychonomic Bulletin and Review, 19(6), 1187-1192.

https://doi.org/10.3758/s13423-012-0305-z   

ACGME. (2015). Common Program Requirements. Accreditation Council for Graduate Medical Education. Retrieved March 2017 from http://www.acgme.org/Portals/0/PFAssets/ProgramRequirements/CPRs_2017-07-01.pdf

Anderson, J. R. (1982). Acquisition of cognitive skill. Psychological Review, 89(4), 369.

https://doi.org/10.1037/0033-295X.89.4.369   

Berkhout, J. J., Helmich, E., Teunissen, P. W., Berg, J. W., Vleuten, C. P., & Jaarsma, A. D. C. (2015). Exploring the factors influencing clinical students' self‐regulated learning. Medical Education, 49(6), 589-600.

https://doi.org/10.1111/medu.12671  

Boulouffe, C., Doucet, B., Muschart, X., Charlin, B., & Vanpee, D. (2013). Assessing clinical reasoning using a script concordance test with electrocardiogram in an emergency medicine clerkship rotation. Emergency Medicine Journal, 31(4), 313-316.

https://doi.org/10.1136/emermed-2012-201737  

Bruin, A. B., Dunlosky, J., & Cavalcanti, R. B. (2017). Monitoring and regulation of learning in medical education: the need for predictive cues. Medical Education, 51(6), 575-584.

https://doi.org/10.1111/medu.13267   

Charlin, B. (2012). Clinical reasoning processes: unravelling complexity through graphical representation. Medical Education, 46, 454-463.

https://doi.org/10.1111/j.1365-2923.2012.04242.x   

Colbert, C. Y., Graham, L., West, C., White, B. A., Arroliga, A. C., Myers, J. D., . . . Clark, J. (2015). Teaching Metacognitive Skills: Helping Your Physician Trainees in the Quest to "Know What They Don't Know". The American Journal of Medicine, 128(3), 318-324.

https://doi.org/10.1016/j.amjmed.2014.11.001   

CPMEC. (2016). Australian Curriculum Framework for Junior Doctors (March 2017). Retrieved from http://www.cpmec.org.au/

Croskerry, P. (2003a). Cognitive forcing strategies in clinical decision making. Annals of Emergency Medicine, 41(1), 110-120.

https://doi.org/10.1067/mem.2003.22   

Croskerry, P. (2003b). The importance of cognitive errors in diagnosis and strategies to minimize them. Academic Medicine, 78(8), 775-780.

https://doi.org/10.1097/00001888-200308000-00003   

Cutrer, W. B., Sullivan, W. M., & Fleming, A. E. (2013). Educational strategies for improving clinical reasoning. Current Problems in Pediatric and Adolescent Health Care, 43(9), 248-257.

https://doi.org/10.1016/j.cppeds.2013.07.005   

Dunphy, B. C., Cantwell, R., Bourke, S., Fleming, M., Smith, B., Joseph, K. S., & Dunphy, S. L. (2010). Cognitive elements in clinical decision-making: Toward a cognitive model for medical education and understanding clinical reasoning. Advances in Health Sciences Education, 15(2), 229-250.

https://doi.org/10.1007/s10459-009-9194-y   

Ericsson, K. A. (2004). Deliberate practice and the acquisition and maintenance of expert performance in medicine and related domains. Academic Medicine, 79(10 SUPPL.), S70-S81.

https://doi.org/10.1097/00001888-200410001-00022   

Eva, K. W. (2005). What every teacher needs to know about clinical reasoning. Medical Education, 39(1), 98-106.

https://doi.org/10.1111/j.1365-2929.2004.01972.x   

Garner, R. (1987). Metacognition and reading comprehension: Ablex Publishing.   

GMC. (2016). The New Doctor - recommendations on general clinical training. Annex C. (March 2017) Retrieved from http://www.gmc-uk.org/4a_Annex_C_Modernising_The_New_Doctor.pdf_25398444.pdf   

Gönüllü, İ., & Artar, M. (2014). The impact of metacognitive training on metacognitive awareness of medical students. Eğitimde Kuram ve Uygulama, 10(2), 594-612.

Graber, M. (2003). Metacognitive training to reduce diagnostic errors: Ready for prime time? Academic Medicine, 78(8), 781.

https://doi.org/10.1097/00001888-200308000-00004   

Gruppen, L. (2017). Clinical Reasoning: Defining It, Teaching It, Assessing It, Studying It. Western Journal of Emergency Medicine, 18(1), 4.

https://doi.org/10.5811/westjem.2016.11.33191   

Harden, R. (1988). What is an OSCE? Medical Teacher, 10(1), 19-22.

https://doi.org/10.3109/01421598809019321   

Hinkle, D. E., Wiersma, W., & Jurs, S. G. (2003). Applied statistics for the behavioral sciences. Boston: Houghton Mufflin Co.   

Hrynchak, P., Glover Takahashi, S., & Nayer, M. (2014). Key-feature questions for assessment of clinical reasoning: A literature review. Medical Education, 48(9), 870-883.

https://doi.org/10.1111/medu.12509   

Ilgen, J. S., Humbert, A. J., Kuhn, G., Hansen, M. L., Norman, G. R., Eva, K. W., Charlin, B., Sherbino, J. (2012). Assessing diagnostic reasoning: a consensus statement summarizing theory, practice, and future needs. Academic Emergency Medicine, 19(12), 1454-1461.

https://doi.org/10.1111/acem.12034   

Khan, K. Z., Ramachandran, S., Gaunt, K., & Pushkar, P. (2013). The Objective Structured Clinical Examination (OSCE): AMEE guide no. 81. Part I: an historical and theoretical perspective. Medical Teacher, 35(9).

https://doi.org/10.3109/0142159X.2013.818634

Kiesewetter, J., Ebersbach, R., Tsalas, N., Holzer, M., Schmidmaier, R., & Fischer, M. R. (2016). Knowledge is not enough to solve the problems - the role of diagnostic knowledge in clinical reasoning activities. BMC Medical Education, 16(1), 303.

https://doi.org/10.1186/s12909-016-0821-z   

Marcum, J. A. (2012). An integrated model of clinical reasoning: Dual-process theory of cognition and metacognition. Journal of Evaluation in Clinical Practice, 18(5), 954-961.

https://doi.org/10.1111/j.1365-2753.2012.01900.x   

Nendaz, M., & Perrier, A. (2012). Diagnostic errors and flaws in clinical reasoning: Mechanisms and prevention in practice. Swiss Medical Weekly, 142.

https://doi.org/10.4414/smw.2012.13706   

Newman-Toker, D. E., & Pronovost, P. J. (2009). Diagnostic errors—the next frontier for patient safety. JAMA, 301(10), 1060-1062.

https://doi.org/10.1001/jama.2009.249   

Page, G., & Bordage, G. (1995). The Medical Council of Canada's key features project: a more valid written examination of clinical decision-making skills. Academic Medicine, 70(2), 104-110.

https://doi.org/10.1097/00001888-199502000-00012   

Page, G., Bordage, G., & Allen, T. (1995). Developing key-feature problems and examinations to assess clinical decision-making skills. Academic Medicine, 70(3), 194-201.

https://doi.org/10.1097/00001888-199503000-00009   

Palmer, E. C., David, A. S., & Fleming, S. M. (2014). Effects of age on metacognitive efficiency. Consciousness and Cognition, 28(100), 151-160.

https://doi.org/10.1016/j.concog.2014.06.007   

Rencic, J., Durning, S., Holmboe, E., & Gruppen, L. D. (2016). Understanding the assessment of clinical reasoning. In Assessing Competence in Professional Performance across Disciplines and Professions (pp. 209-235). Springer International Publishing.

https://doi.org/10.1007/978-3-319-30064-1_11   

Saber Tehrani, A. S., Lee, H., Mathews, S. C., Shore, A., Makary, M. A., Pronovost, P. J., & Newman-Toker, D. E. (2013). 25-year summary of US malpractice claims for diagnostic errors 1986–2010: an analysis from the National Practitioner Data Bank. BMJ Quality & Safety, 22(8), 672-680.

Schraw, G. (1998). Promoting general metacognitive awareness. Instructional Science, 26(1), 113-125.

https://doi.org/10.1023/A:1003044231033   

Schraw, G., & Dennison, R. S. (1994). Assessing Metacognitive Awareness. Contemporary Educational Psychology, 19(4), 460-475.

https://doi.org/10.1006/ceps.1994.1033   

Schraw, G., & Moshman, D. (1995). Metacognitive theories. Educational Psychology Review, 7(4), 351-371.

https://doi.org/10.1007/BF02212307   

Schunk, D. H. (1981). Modeling and attributional effects on children's achievement: A self-efficacy analysis. Journal of Educational Psychology, 73(1), 93.

https://doi.org/10.1037/0022-0663.73.1.93   

Scott, I. A. (2009). Errors in clinical reasoning: causes and remedial strategies. BMJ, 339, 22-25.

https://doi.org/10.1136/bmj.b1860   

Smartsparrow. (Accessed Sept 2014). Available at www.smartsparrow.com.   

Song, H. S., Kalet, A. L., & Plass, J. L. (2011). Assessing medical students' self-regulation as aptitude in computer-based learning. Advances in Health Sciences Education, 16(1), 97-107.

https://doi.org/10.1007/s10459-010-9248-1   

Swanson, H. L. (1990). Influence of metacognitive knowledge and aptitude on problem solving. Journal of Educational Psychology, 82(2), 306.

https://doi.org/10.1037/0022-0663.82.2.306   

Weil, L. G., Fleming, S. M., Dumontheil, I., Kilford, E. J., Weil, R. S., Rees, G., . . . Blakemore, S.-J. (2013). The development of metacognitive ability in adolescence. Consciousness and Cognition, 22(1), 264-271.

https://doi.org/10.1016/j.concog.2013.01.004

Welch, P., Plummer, D., Young, L., Quirk, F., Larkins, S., Evans, R., & Sen Gupta, T. (2017). Grounded theory - a lens to understanding clinical reasoning. MedEdPublish, 6(1), 2.

https://doi.org/10.15694/mep.2017.000002   

Wilson, R. M., Harrison, B. T., Gibberd, R. W., & Hamilton, J. D. (1999). An analysis of the causes of adverse events from the Quality in Australian Health Care Study. Medical Journal of Australia, 170(9), 411-415.   

Zimmerman, B. J., & Pons, M. M. (1986). Development of a structured interview for assessing student use of self-regulated learning strategies. American Educational Research Journal, 23(4), 614-628.

https://doi.org/10.3102/00028312023004614  

Zimmerman, B. J., & Schunk, D. H. (2011). Self-regulated learning and performance. In Z. B.J. & S. D.H. (Eds.), Handbook of self-regulation of learning and performance (pp. 1-12): Routledge.

Appendices

Appendix 1. The Metacognitive Awareness Inventory (MAI) (Schraw and Dennison, 1994)

Question ID | Question | Domain ID | Domain Code | Domain
1 | I ask myself periodically if I am meeting my goals. | M1 | M | Monitoring
2 | I consider several alternatives to a problem before I answer. | M2 | M | Monitoring
3 | I try to use strategies that have worked in the past. | PK1 | PK | Procedural Knowledge
4 | I pace myself while learning in order to have enough time. | P1 | P | Planning
5 | I understand my intellectual strengths and weaknesses. | DK1 | DK | Declarative Knowledge
6 | I think about what I really need to learn before I begin a task. | P2 | P | Planning
7 | I know how well I did once I finish a test. | E1 | E | Evaluation
8 | I set specific goals before I begin a task. | P3 | P | Planning
9 | I slow down when I encounter important information. | IMS1 | IMS | Information Management Strategies
10 | I know what kind of information is most important to learn. | DK2 | DK | Declarative Knowledge
11 | I ask myself if I have considered all options when solving a problem. | M3 | M | Monitoring
12 | I am good at organising information. | DK3 | DK | Declarative Knowledge
13 | I consciously focus my attention on important information. | IMS2 | IMS | Information Management Strategies
14 | I have a specific purpose for each strategy I use. | PK2 | PK | Procedural Knowledge
15 | I learn best when I know something about the topic. | CK1 | CK | Conditional Knowledge
16 | I know what the teacher expects me to learn. | DK4 | DK | Declarative Knowledge
17 | I am good at remembering information. | DK5 | DK | Declarative Knowledge
18 | I use different learning strategies depending on the situation. | CK2 | CK | Conditional Knowledge
19 | I ask myself if there was an easier way to do things after I finish a task. | E2 | E | Evaluation
20 | I have control over how well I learn. | DK6 | DK | Declarative Knowledge
21 | I periodically review to help me understand important relationships. | M4 | M | Monitoring
22 | I ask myself questions about the material before I begin. | P4 | P | Planning
23 | I think of several ways to solve a problem and choose the best one. | P5 | P | Planning
24 | I summarize what I've learned after I finish. | E3 | E | Evaluation
25 | I ask others for help when I don't understand something. | DS1 | DS | Debugging Strategies
26 | I can motivate myself to learn when I need to. | CK3 | CK | Conditional Knowledge
27 | I am aware of what strategies I use when I study. | PK3 | PK | Procedural Knowledge
28 | I find myself analysing the usefulness of strategies while I study. | M5 | M | Monitoring
29 | I use my intellectual strengths to compensate for my weaknesses. | CK4 | CK | Conditional Knowledge
30 | I focus on the meaning and significance of new information. | IMS3 | IMS | Information Management Strategies
31 | I create my own examples to make information more meaningful. | IMS4 | IMS | Information Management Strategies
32 | I am a good judge of how well I understand something. | DK7 | DK | Declarative Knowledge
33 | I find myself using helpful learning strategies automatically. | PK4 | PK | Procedural Knowledge
34 | I find myself pausing regularly to check my comprehension. | M6 | M | Monitoring
35 | I know when each strategy I use will be most effective. | CK5 | CK | Conditional Knowledge
36 | I ask myself how well I accomplished my goals once I'm finished. | E4 | E | Evaluation
37 | I draw pictures or diagrams to help me understand while learning. | IMS5 | IMS | Information Management Strategies
38 | I ask myself if I have considered all options after I solve a problem. | E5 | E | Evaluation
39 | I try to translate new information into my own words. | IMS6 | IMS | Information Management Strategies
40 | I change strategies when I fail to understand. | DS2 | DS | Debugging Strategies
41 | I use the organisational structure of the text to help me learn. | Domain not denoted | – | Domain not denoted
42 | I read instructions carefully before I begin a task. | P6 | P | Planning
43 | I ask myself if what I'm reading is related to what I already know. | IMS7 | IMS | Information Management Strategies
44 | I re-evaluate my assumptions when I get confused. | DS3 | DS | Debugging Strategies
45 | I organise my time to best accomplish my goals. | P7 | P | Planning
46 | I learn more when I am interested in the topic. | DK8 | DK | Declarative Knowledge
47 | I try to break studying down into smaller steps. | IMS8 | IMS | Information Management Strategies
48 | I focus on overall meaning rather than specifics. | IMS9 | IMS | Information Management Strategies
49 | I ask myself questions about how well I am doing while I am learning something new. | M7 | M | Monitoring
50 | I ask myself if I learned as much as I could have once I finish a task. | E6 | E | Evaluation
51 | I stop and go back over new information that is not clear. | DS4 | DS | Debugging Strategies
52 | I stop and reread when I get confused. | DS5 | DS | Debugging Strategies

There are no conflicts of interest.
