Research article
Open Access

A pilot study to assess the effect of a deliberate teaching tool in clinical anaesthesiology

Navdeep Sidhu[1][a], Albert Chan[2][b], Emelyn Lee[3], Eleri Clissold[4][c], Sofia Huddart[5][d]

Institution: 1. North Shore Hospital, 2. Prince of Wales Hospital, 3. Sir Charles Gairdner Hospital, 4. Waitemata District Health Board, 5. Counties Manukau Health
Corresponding Author: Dr Navdeep Sidhu ([email protected])
Categories: Educational Strategies, Teachers/Trainers (including Faculty Development), Teaching and Learning, Postgraduate (including Speciality Training)
Published Date: 09/01/2019

Abstract

The traditional unstructured apprenticeship model of teaching is still prevalent despite evidence that a more deliberate approach to teaching is likely to enhance learning. No deliberate teaching tool has been designed for or evaluated in clinical anaesthesiology. This pilot study aims to measure the effect of a deliberate teaching tool on the operating theatre educational environment in clinical anaesthesiology. We used a literature review to identify desirable teaching practices in clinical anaesthesiology and to design the Concise Regular Assessment and Feedback for Teaching and learning (CRAFT) framework. The CRAFT framework was piloted as an ‘intention-to-treat’ pre-post interventional study over a four-week period in two sites. The operating theatre educational environment score improved from 73.8 ± 12.4 to 78.8 ± 15.0 (mean ± SD, p = 0.002), with statistically significant improvements observed in three of the four educational environment domains. Differences in the degree of improvement were observed between study sites (p = 0.271) and participant genders (p = 0.069). Female learners found the intervention more feasible (p = 0.073) and acceptable (p = 0.015) than their male peers. An adequately powered multi-centre study is planned to determine if these effects can be observed in different training regions and to investigate underlying factors in the educational experience of female and male learners.

 

Keywords: Deliberate teaching tool; Graduate medical education; Educational environment; Teaching; Learning; Feedback; Anaesthesiology

Introduction

Clinical training traditionally relies on an apprenticeship model that occurs through immersion in the clinical environment without a formal structure or prior planning (Bleakley, 2006), while newer apprenticeship education models describe a process of constructing new knowledge on the foundations of what is already known to the learner (Bleakley, 2006; Taylor and Hamdy, 2013; Morris and Blaney, 2014). Deliberate teaching tools are “frameworks that enable clinicians to have a purposeful and considered approach to teaching encounters by incorporating elements identified with good teaching practice” (Sidhu and Edwards, 2018). No deliberate teaching tool has been designed for or evaluated in clinical anaesthesiology (Sidhu and Edwards, 2018). The objectives of this research project were to: 

  1. Identify desirable teaching practices in clinical anaesthesiology for incorporation into a deliberate teaching tool
  2. Perform a pilot study to evaluate the effect of a deliberate teaching tool on the educational environment, when applied as an ‘intention-to-treat’ intervention

Methods

Design of deliberate teaching tool

We performed a literature review (Ovid MEDLINE®) on 2/12/15 to identify ideal teaching practices specific to clinical anaesthesiology, with no date or language limits, combining 1 AND (2 OR 3 OR 4) below:

  1. Teaching (MeSH) OR feedback (keyword) OR debrief* (keyword)
  2. Anesthesiology (MeSH) OR anaes* (keyword) OR anes* (keyword)
  3. Education, Medical, Graduate (MeSH)
  4. Operating Rooms (MeSH) OR operating room (keyword) OR operating theatre (keyword) OR operating theater (keyword)

Two authors (NS and EC) identified peer-reviewed articles that outlined desirable teaching practices in clinical anaesthesiology, with disagreements resolved by discussion. Items were categorised according to feasibility for incorporation into a deliberate teaching tool designed for clinical anaesthesiology.

 

Study protocol

We piloted a pre-post interventional study in two institutions, recruited after approval from the clinical directors and supervisors of training. A site investigator was appointed at each institution. Specialist anaesthesiologists and learners (defined as any trainee or resident/medical/house officer employed in the department) were allowed to opt out of the study, with remaining individuals designated as either participant specialists or participant learners. Participant specialists were provided with brief written material describing the deliberate teaching tool and advice on how to facilitate its implementation, including feedback delivery. Participant learners were provided with basic information on what to expect with the educational intervention and advice on how to facilitate specialist involvement. Reminder cards with a summary of relevant information were handed out to all participants, and A4-sized posters were placed in every operating theatre as cognitive aids (Figure 1). The deliberate teaching tool was applied during every in-theatre teaching encounter occurring in-hours (Monday to Friday, 0700-1800) that involved participant learners paired with participant specialists, over a four-week period. We defined a teaching encounter as either a half-day or full-day theatre list with the same specialist and learner. If a workplace-based assessment was required for training purposes, the study was suspended for that teaching encounter. At the end of each week, the site investigator contacted participant learners to ascertain compliance and reasons for non-compliance. Results were analysed with an ‘intention-to-treat’ approach, regardless of compliance levels. No new educational interventions were applied nor existing ones removed during the study period.

 

Change in the educational environment was determined by administering the Measure for the Anaesthesia Theatre Educational Environment (MATE) (Sidhu and Clissold, 2018), pre- and post-intervention. The MATE is a validated survey tool used to measure the anaesthesiology educational environment in the operating theatre, comprising 33 items derived sequentially from a systematic literature review, a modified Delphi approach, and exploratory factor analysis (Sidhu and Clissold, 2018). The items are categorised into four domains: Teaching Preparation and Practice, Assessment and Feedback, Procedures and Responsibility, and Overall Atmosphere. Respondents are required to rate each item on a 0-6 scale (0 = strongly disagree, 6 = strongly agree), with overall and domain scores presented as percentage scores. Pre-intervention demographic information was collected and a post-intervention survey was administered to provide information on compliance and participant experience. 
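As a concrete illustration of the scoring described above, the sketch below converts 0-6 Likert ratings into the percentage scores used for overall and domain results. The item responses and the item-to-domain grouping shown here are hypothetical examples for illustration, not the published MATE item key.

```python
# Sketch: converting MATE-style 0-6 Likert responses into percentage scores.
# The responses and domain grouping below are hypothetical, not the
# published MATE item key.

def percentage_score(ratings, max_rating=6):
    """Mean rating expressed as a percentage of the maximum (0-6 scale)."""
    if not ratings:
        raise ValueError("no ratings supplied")
    return sum(ratings) / (len(ratings) * max_rating) * 100

# One respondent's hypothetical ratings, grouped by the four MATE domains.
responses = {
    "Teaching Preparation and Practice": [4, 5, 3, 4],
    "Assessment and Feedback": [5, 4, 4],
    "Procedures and Responsibility": [6, 5, 6],
    "Overall Atmosphere": [5, 5, 4],
}

# Domain scores are computed per domain; the overall score pools all items.
domain_scores = {d: percentage_score(r) for d, r in responses.items()}
overall = percentage_score([x for r in responses.values() for x in r])
```

A score of 100% would correspond to ‘strongly agree’ on every item, and 50% to a mean rating of 3 (the scale midpoint).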

 

Data collection and statistical analysis

An online survey tool (Survey Monkey) was used to collect data anonymously. Data were analysed with IBM SPSS 24. A p value of 0.05 or less was considered statistically significant. We applied the Kolmogorov-Smirnov test to determine if data were normally distributed. Paired comparisons were carried out using either the paired sample t-test (for normally distributed data) or the Wilcoxon Signed-Rank test (non-parametric data). Unpaired comparisons were carried out using a two-tailed unpaired t-test. Group comparisons were carried out using one-way analysis of variance (parametric data) or the Kruskal-Wallis test (non-parametric data). If the Kruskal-Wallis test generated a p value of < 0.05, Dunn’s non-parametric pairwise comparisons with Bonferroni-adjusted significance were carried out.
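The paired-comparison logic above (normality check, then parametric or non-parametric test) can be sketched as follows. This is a minimal illustration using SciPy rather than SPSS; the pre/post scores are hypothetical sample data, and the 0.05 threshold mirrors the significance level stated in the text.

```python
# Sketch of the paired-comparison workflow: Kolmogorov-Smirnov normality
# check on the paired differences, then paired t-test (normal) or
# Wilcoxon signed-rank (non-parametric). Sample data are hypothetical.
from scipy import stats

ALPHA = 0.05  # significance threshold used in the study

def paired_comparison(pre, post):
    """Return (test_name, p_value) for paired pre/post scores."""
    diffs = [b - a for a, b in zip(pre, post)]
    mean = sum(diffs) / len(diffs)
    sd = (sum((d - mean) ** 2 for d in diffs) / (len(diffs) - 1)) ** 0.5
    # KS test of the differences against a normal with the sample moments.
    _, p_normal = stats.kstest(diffs, "norm", args=(mean, sd))
    if p_normal > ALPHA:  # differences look normally distributed
        _, p = stats.ttest_rel(pre, post)
        return "paired t-test", p
    _, p = stats.wilcoxon(pre, post)
    return "Wilcoxon signed-rank", p

# Hypothetical pre/post educational environment scores for eight learners.
pre = [70, 65, 80, 72, 68, 75, 71, 69]
post = [78, 70, 82, 77, 74, 80, 76, 73]
test_name, p = paired_comparison(pre, post)
```

Note that fitting the normal's parameters from the sample before the KS test is a simplification (strictly, this calls for a Lilliefors correction); it is shown here only to convey the branching logic.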

Results/Analysis

Design of the deliberate teaching tool

The search protocol yielded 6830 results. Sixty-three papers were identified for further scrutiny after review of abstracts, and a further 35 were identified after bibliographic review. Of these 98 papers, six were identified as containing items related to desirable teaching practices in clinical anaesthesiology (Table 1). These items were combined with a model for delivering feedback (Rudolph et al., 2006; Rudolph, Raemer and Shapiro, 2013) to develop the deliberate teaching tool, dubbed the Concise Regular Assessment and Feedback for Teaching and learning (CRAFT) framework (Figure 1). The three phases of the CRAFT framework are based on a previously described instructional structure (Irby, 1992), utilised by almost 75% of subsequently described deliberate teaching tools (Sidhu and Edwards, 2018). The design did not prohibit opportunistic teaching of other clinical teaching points as they emerged during a case. While the framework advocated using a specific feedback delivery model, supporting material stated that participant specialists were free to use any feedback model familiar to them in order not to discourage feedback provision.

 

Table 1: Desirable teaching practices in clinical anaesthesiology identified from a literature review

Feasible for incorporation into a deliberate teaching tool

Utilises and maintains learning goals (Lombarts, Bucx and Arah, 2009; Ortwein, Blaum and Spies, 2014)

Identifies learner’s base knowledge level (Cleave-Hogg and Benedict, 1997; Haydar et al., 2014)

Encourages active learner participation (Lombarts, Bucx and Arah, 2009)

Evaluates learner’s knowledge and abilities (Lombarts, Bucx and Arah, 2009)

Provides immediate, regular, honest, relevant feedback (Cleave-Hogg and Benedict, 1997; de Oliveira Filho et al., 2008; Schlecht, 2008; Lombarts, Bucx and Arah, 2009; Haydar et al., 2014; Ortwein, Blaum and Spies, 2014)

Provides feedback based on observable behaviour, witnessed first-hand (Schlecht, 2008)

Provides feedback that addresses specific performance issues (Schlecht, 2008)

Recognises the importance of positive feedback (Schlecht, 2008; Lombarts, Bucx and Arah, 2009)

Limits feedback to 1-2 issues that will help learner achieve their goals (Schlecht, 2008)

Facilitates further learning (Lombarts, Bucx and Arah, 2009)

Uses real clinical scenarios to aid teaching (de Oliveira Filho et al., 2008)

 

May be addressed indirectly when teaching is deliberate

Dedication / commitment / enthusiasm for teaching (Cleave-Hogg and Benedict, 1997; Haydar et al., 2014; Ortwein, Blaum and Spies, 2014)

Maintains pleasant / respectful / trusting learning climate (de Oliveira Filho et al., 2008; Lombarts, Bucx and Arah, 2009; Haydar et al., 2014; Ortwein, Blaum and Spies, 2014)

Understands self as role-model (Cleave-Hogg and Benedict, 1997; Lombarts, Bucx and Arah, 2009)

 

 

Figure 1: The Concise Regular Assessment and Feedback for Teaching and learning (CRAFT) framework, version 1

 

Participant demography and participation rates

Of the two study sites, Site A delivered anaesthesia in 20 locations (operating theatres, interventional suites, etc.) and Site B in 22 locations. Sixty-six of 84 specialists (79%) responded to a brief survey on baseline teaching practice (Table 2). Fifty of 84 specialists consented to participate in the study, with the participation rate of 60% being equal at both sites. Thirty-eight of 50 (76%) participant specialists completed the post-study survey. There was a participant specialist gender imbalance between the sites (female specialist participants 37% for Site A and 65% for Site B). Levels of prior training on teaching and feedback delivery were similar for specialists across both sites, with 54% having had no formal instruction on teaching, 42% having attended a course/workshop without any formal qualification, and 4% possessing a formal qualification in medical education. Fifty-three of 75 eligible learners (71%) consented to participate, with 45/53 (85%) completing the post-intervention measure. A participant learner gender imbalance between sites was noted (female participant learners 41% for Site A and 72% for Site B; 53% overall). Study sites were allowed to determine for themselves whether fellows were classified as ‘learners’ or ‘specialists’ for the purposes of the study, depending on their role and usual level of supervision in the department.

 

Table 2: Baseline teaching practice at sites participating in the study

Values are n (%); Site A n = 37, Site B n = 29, Total n = 66.

Discuss learning goals prior to start of each list
  All the time:        Site A 2 (5%),  Site B 2 (7%),  Total 4 (6%)
  Most of the time:    Site A 3 (8%),  Site B 4 (14%), Total 7 (11%)
  About half the time: Site A 9 (24%), Site B 6 (21%), Total 15 (23%)
  Infrequently:        Site A 19 (51%), Site B 14 (48%), Total 33 (50%)
  Never:               Site A 4 (11%), Site B 3 (10%), Total 7 (11%)

Provide feedback at end of teaching encounter
  All the time:        Site A 3 (8%),  Site B 2 (7%),  Total 5 (8%)
  Most of the time:    Site A 5 (14%), Site B 8 (28%), Total 13 (20%)
  About half the time: Site A 7 (19%), Site B 10 (34%), Total 17 (26%)
  Infrequently:        Site A 21 (57%), Site B 8 (28%), Total 29 (44%)
  Never:               Site A 1 (3%),  Site B 1 (3%),  Total 2 (3%)

 

Change in educational environment

We observed a statistically significant improvement in the overall educational environment and in three of four domains (Table 3). Differences in the magnitude of improvement were noted between study sites (p = 0.271) and participant learner genders (p = 0.069), though these were not statistically significant in this pilot study (detailed in Table 4). There was no apparent difference when comparing age (p = 0.518), clinical experience (p = 0.806), or time in the current department (p = 0.888). Results for individual MATE items are presented in the appendices.

 

Table 3: The effect of a deliberate teaching tool on the educational environment and component domains

Values are mean ± SD percentage scores.

MATE* (overall):                     Pre 73.8 ± 12.4, Post 78.8 ± 15.0, Change +5.0, p = 0.002
  Teaching Preparation and Practice: Pre 66.1 ± 14.7, Post 75.3 ± 16.7, Change +9.2, p < 0.001
  Assessment and Feedback:           Pre 71.4 ± 14.7, Post 76.3 ± 17.3, Change +4.9, p = 0.010
  Procedures and Responsibility:     Pre 84.6 ± 10.6, Post 83.5 ± 15.5, Change -1.1, p = 0.866
  Overall Atmosphere:                Pre 80.3 ± 13.5, Post 84.3 ± 14.2, Change +4.1, p = 0.013

*Measure for the Anaesthesia Theatre Educational Environment

 

Table 4: Comparison of gender and site effects on change in educational environment

Values are mean MATE percentage scores, pre → post (change). Women: n = 11 in Site A, 13 in Site B. Men: n = 16 in Site A, 5 in Site B.

Site A     — Women: 74.0 → 79.5 (+5.5); Men: 71.7 → 72.5 (+0.8); Both genders: 72.6 → 75.3 (+2.7)
Site B     — Women: 73.7 → 83.1 (+9.4); Men: 80.6 → 86.6 (+6.0); Both genders: 75.6 → 84.0 (+8.4)
Both sites — Women: 73.8 → 81.4 (+7.6); Men: 73.8 → 75.8 (+2.0); Both genders: 73.8 → 78.8 (+5.0)

 

Compliance with study protocol

There was a 15% learner drop-out rate due to non-completion of the post-intervention survey. Compliance figures for Site A were not available and were described as ‘variable’. Site B reported 100 eligible teaching encounters with 75% compliance for use of the CRAFT framework. The reasons cited for non-compliance were as follows: ‘specialist forgot and was not reminded by trainee,’ ‘trainee seeing patients preoperatively therefore did not have time,’ ‘senior trainee under level 2 supervision,’ ‘busy list,’ and ‘list cancellation.’ Almost 60% of participant specialists stated that they provided feedback using the model described in the CRAFT framework, similar across both sites. Eight percent used an alternative model, while just under 30% stated they did not utilise any particular feedback model. Five percent of specialists stated they did not provide feedback (11% in Site A; none in Site B).

Feasibility and Acceptability of CRAFT framework

Participants were surveyed on their opinion of the CRAFT framework to inform feasibility of implementation (mean ± SD on a 0-6 perception scale; 0 = strongly disagree, 6 = strongly agree). Learners were significantly more positive towards the use of a deliberate teaching tool than specialists (‘CRAFT framework is a feasible method of enhancing teaching and learning,’ Learners 4.6 ± 1.3 vs Specialists 4.1 ± 0.8, p = 0.043; ‘CRAFT framework should be standard practice for every teaching list,’ Learners 4.3 ± 1.5 vs Specialists 3.0 ± 1.3, p < 0.001). These results are detailed by study site and participant gender in Table 5. Specialists did not consider the CRAFT framework to significantly impact levels of vigilance or productivity (‘CRAFT framework affected level of vigilance’ = 1.6 ± 1.4; ‘CRAFT framework did not impact theatre productivity’ = 4.1 ± 1.3). Learners held positive views towards feedback provision (‘When given, positive feedback reinforced my current performance’ = 4.9 ± 1.0; ‘When given, corrective feedback helped me identify deficient areas of practice’ = 4.9 ± 0.9).

Table 5: Feasibility and acceptability of using the deliberate teaching tool, by participant groups

(Seven-point perception scale; 0 = strongly disagree, 6 = strongly agree; values are mean ± SD)

Feasibility*
  Learners:    Site A 4.4 ± 1.4 vs Site B 4.8 ± 1.0 (p = 0.319); Female 4.9 ± 1.1 vs Male 4.2 ± 1.4 (p = 0.073)
  Specialists: Site A 4.2 ± 0.9 vs Site B 4.1 ± 0.8 (p = 0.672); Female 4.0 ± 0.8 vs Male 4.2 ± 0.9 (p = 0.443)

Acceptability**
  Learners:    Site A 4.2 ± 1.7 vs Site B 4.4 ± 1.0 (p = 0.509); Female 4.8 ± 1.1 vs Male 3.7 ± 1.6 (p = 0.015)
  Specialists: Site A 2.2 ± 1.3 vs Site B 3.6 ± 1.0 (p < 0.001); Female 3.2 ± 1.2 vs Male 2.7 ± 1.4 (p = 0.229)

*The CRAFT framework is a feasible method of enhancing teaching and learning

**The CRAFT framework should be standard practice for every teaching list

Discussion

This was a pilot study to evaluate the effect of a deliberate teaching tool on the educational environment in clinical anaesthesiology, with results revealing a statistically significant 5% positive effect. We consider this effect to be educationally significant, as 64.2% (one standard deviation) of all educational environment scores with this measure fall within a relatively narrow range of ± 15.6% (Sidhu and Clissold, 2018). Significant positive effects were observed in three of the four educational environment domains, with the ‘Procedures and Responsibility’ domain remaining unchanged (p = 0.866). The CRAFT framework design did not specifically address this domain, as related items were not identified as a major theme in the literature review. This domain is consistently rated highest by learners (Sidhu and Clissold, 2018) and is therefore least in need of an intervention. That this domain did not exhibit an improvement while the others did is an indication of the validity of the MATE tool in measuring the effect of the intervention.

 

There was a clear signal of female learners favouring the intervention in both study sites, compared to their male peers, in terms of efficacy, feasibility, and acceptability. This difference was not anticipated and the pilot study was not powered to detect this difference nor designed to investigate the possible underlying causes. There is recent evidence from the surgical specialty of unconscious/implicit bias against female learners (Meyerson et al., 2017; Nebeker et al., 2017), and it is possible that the implementation of a deliberate teaching tool resulted in more equal treatment of learners by removing some elements of bias. Another possibility may be that certain personality traits, perhaps those associated with men, are less responsive to a structured teaching encounter or any change from current teaching practice. An adequately powered larger study with multiple sites could determine if this trend is observed universally, and a qualitative study may be able to investigate the possible underlying causes.

 

There were a number of differences between the study sites. Site A had a larger number of participants – both learners and specialists – who took part in the study. Pre-intervention, specialists in Site A were twice as likely to not provide feedback (‘infrequently’ or ‘never’) to learners, compared to Site B. The magnitude of positive change was higher in Site B, compared to Site A, despite the former having a higher baseline educational environment score. There was a gender difference between sites, with Site B having more female specialists and learners. The CRAFT framework was significantly more acceptable to specialists in Site B, who were also more likely to adhere to the study protocol. Both sites were located in geographically distinct locations. A multi-site study design with a larger number of sites would be required to investigate the effects of these confounders.

 

We selected a pre-post interventional study design because randomisation and separation of participants in the same institution would be unfeasible, as there would be no guarantee that individuals in the control group would not implement aspects of the deliberate teaching tool when it was being used by their colleagues. We did not randomise whole institutions as controls as we were uncertain of our ability to recruit a sufficient number of sites. Recruiting other institutions as matched comparisons would be challenging due to complexities in matching specialist and learner backgrounds. We employed an ‘intention-to-treat’ analysis to factor in the variable application of new teaching initiatives in a clinical environment. We measured the effect of the intervention on the educational environment instead of directly measuring teaching or learning. The educational environment is defined as “a social system that includes the learner (including the external relationships and other factors affecting the learner), the individuals with whom the learner interacts, the setting(s) and purpose(s) of the interaction, and the formal and informal rules/policies/norms governing the interaction” (American Medical Association, 2009). Measuring teaching requires direct observation (or video recording) by trained observers of every teaching encounter, and scoring using a validated tool. This would only be feasible with a limited number of participants and teaching encounters, and observed teaching practice would be subject to the Hawthorne effect. The current accepted gold standard of measuring learning is performance in a high-stakes examination, for which there would be too many confounders.

 

In our study, approximately 60% of specialists ‘infrequently’ or ‘never’ discuss learning goals with learners prior to the start of each list and almost half ‘infrequently’ or ‘never’ provide feedback at the end of each teaching encounter. The use of learning goals and routine feedback delivery were key components of this deliberate teaching tool, supporting its implementation at these sites. It is possible that this level of baseline practice varies across different countries and training regions but there is no evidence that these desirable teaching practices are prevalent in clinical anaesthesiology.

 

Limitations

This was a pilot study conducted in two training sites, and results may not be applicable to other sites due to the limited nature of the study. This is compounded by the difference in the participant demographics and effect size between the two sites. The pilot was not designed to evaluate the underlying factors for the differences between sites and participant genders. Although baseline teaching practice (Table 2) and specialist teacher training were similar between the sites, inherent differences in the teaching culture of the two departments may exist. Use of other positive teaching practices already in place was not measured or taken into account by this study. Learner expectations may have been different between the sites and possibly influenced by gender effects towards teaching and learning.

 

The study relied on participant self-reporting for data on compliance and feedback delivery. We cannot be certain that specialists delivered feedback as described by the feedback model. Data collected on feedback delivery were confined to incidence and type of feedback model, but not effectiveness or quality of feedback. Direct observation (or video recording) of all teaching encounters by a trained external observer would supply more accurate data, but requires significant additional resources and results would be exposed to the Hawthorne effect. However, any positive or negative effect of feedback delivery should have been (and was) detected by a change in the ‘Assessment and Feedback’ domain of the educational environment measure, with an ‘intention-to-treat’ analysis.

 

With pre-post interventional studies, it is possible that any measured change may have occurred even without the intervention but a number of factors make this unlikely for our study. Prior to the pilot study, we applied the educational environment measure on a small sample of 10 trainees, revealing no difference in scores after a four-week period without any intervention. In addition, study protocol required that no other educational interventions be applied or removed during the study period. Finally, the one educational environment domain that was not specifically addressed by the intervention did not show any change over the intervention period, supporting the notion that no change is likely to have occurred over the four-week period without the intervention.

 

Future research

Results from this pilot indicate that a number of underlying factors may influence the effect of the intervention, including geographic location (as an approximate surrogate for teaching culture) and gender effects. Our research group aims to evaluate the effect of this deliberate teaching tool in different training regions, in order to account for differences in teaching culture and practice. A qualitative arm will investigate gender effects in more detail. The CRAFT framework may require minor modifications to improve uptake and compliance. Additional resources for teaching feedback delivery will be required, possibly in the form of an online video.

Conclusion

The use of a deliberate teaching tool significantly improved the educational environment in clinical anaesthesiology for learners in two study sites. Differences in the effect were observed between study sites and participant genders. Possible reasons for this include gender effects in teaching and learning, inherent differences in teaching culture (possibly linked to geographical location), pre-existing use of unquantified teaching elements, ability to embrace change, learner expectations, and level of compliance. We plan to launch a definitive study with a larger number of sites in multiple training regions, to further evaluate the effect of the CRAFT framework on the educational environment in clinical anaesthesia.

Take Home Messages

  • Desirable teaching practices in clinical anaesthesiology can be feasibly incorporated into a deliberate teaching tool
  • An appropriately-designed deliberate teaching tool improves the educational environment in clinical anaesthesiology
  • Gender effects may influence the efficacy and acceptability of deliberate teaching tools

Notes On Contributors

Navdeep Sidhu MBChB, FANZCA, MClinEd, FAcadMEd

Dr Nav Sidhu is staff anaesthesiologist, education lead, and fellowship programme director in the Department of Anaesthesia and Perioperative Medicine, North Shore Hospital, Waitemata District Health Board, Auckland, New Zealand, and a senior clinical lecturer with the Department of Anaesthesiology, University of Auckland. He is deputy chair of the Australian and New Zealand College of Anaesthetists (ANZCA) Educators Subcommittee. https://orcid.org/0000-0002-5135-3717

 

Albert Chan MBBS, FHKCA, FHKAM, FANZCA

Dr Albert Chan is a staff anaesthesiologist in the Department of Anaesthesia and Intensive Care in Prince of Wales Hospital in Hong Kong and Honorary Clinical Assistant Professor in the Chinese University of Hong Kong. He is currently the supervisor of training for his department, and acts as the Deputy Training Officer on the Board of Education of the Hong Kong College of Anaesthesiologists.

 

Emelyn Lee MBBS, FANZCA, GradCertHPEd

Dr Emelyn Lee is a staff specialist at King Edward Memorial Hospital and Sir Charles Gairdner Hospital, Perth, Australia. She is involved in education and simulation teaching throughout Western Australia.

 

Eleri Clissold BSc, MBBS

Dr Eleri Clissold is the Strategic Clinical Lead for Safety in Practice at the Institute for Innovation and Improvement, Waitemata District Health Board. Prior to that, she undertook fellowships in Quality Improvement and Medical Education. She is undertaking specialty training in urgent care and is on the executive committee of the Royal New Zealand College of Urgent Care.

 

Sofia Huddart MBChB, FRCA, FANZCA

Dr Sofia Huddart is a senior medical officer at Counties Manukau Health, Auckland, New Zealand. She was previously the Medical Education in Anaesthesia Fellow at North Shore Hospital, Auckland.

Acknowledgements

We would like to acknowledge the contribution of all the individuals who participated in this pilot study.

Bibliography/References

American Medical Association (2009) Report of the Council on Medical Education 7-A-09. Transforming the medical education learning environment. Available at: https://www.ama-assn.org/sites/default/files/media-browser/public/about-ama/councils/Council%20Reports/council-on-medical-education/a09-cme-transforming-medical-education-learning-environment.pdf (Accessed: 20 December 2017).

Bleakley, A. (2006) 'Broadening conceptions of learning in medical education: the message from teamworking', Medical Education, 40, pp. 150-157, https://doi.org/10.1111/j.1365-2929.2005.02371.x

Cleave-Hogg, D. and Benedict, C. (1997) 'Characteristics of good anaesthesia teachers', Canadian Journal of Anaesthesia, 44, pp. 587-591, https://doi.org/10.1007/BF03015440

de Oliveira Filho, G., Dal Mago, A., Garcia, J. and Goldschmidt, R. (2008) 'An instrument designed for faculty supervision evaluation by anesthesia residents and its psychometric properties', Anesthesia & Analgesia, 107, pp. 1316-1322, https://doi.org/10.1213/ane.0b013e318182fbdd

Haydar, B., Charnin, J., Voepel-Lewis, T. and Baker, K. (2014) 'Resident characterization of better-than- and worse-than-average clinical teaching', Anesthesiology, 120, pp. 120-128, https://doi.org/10.1097/ALN.0b013e31829b34bd

Irby, D. M. (1992) 'How attending physicians make instructional decisions when conducting teaching rounds', Academic Medicine, 67, pp. 630-638, https://doi.org/10.1097/00001888-199210000-00002

Lombarts, K. M. J. M. H., Bucx, M. J. L. and Arah, O. A. (2009) 'Development of a system for the evaluation of the teaching qualities of anesthesiology faculty', Anesthesiology, 111, pp. 709-716, https://doi.org/10.1097/ALN.0b013e3181b76516

Meyerson, S. L., Sternbach, J. M., Zwischenberger, J. B. and Bender, E. M. (2017) 'The effect of gender on resident autonomy in the operating room', Journal of Surgical Education, 74(6), pp. e111-e118, https://doi.org/10.1016/j.jsurg.2017.06.014

Morris, C. and Blaney, D. (2014) 'Work-based learning', in Swanwick, T. (ed.) Understanding Medical Education: Evidence, Theory and Practice. 2nd edn. Oxford, UK: John Wiley & Sons.

Nebeker, C. A., Basson, M. D., Haan, P. S., Davis, A. T., et al. (2017) 'Do female surgeons learn or teach differently?', American Journal of Surgery, 213(2), pp. 282-287, https://doi.org/10.1016/j.amjsurg.2016.10.010

Ortwein, H., Blaum, W. E. and Spies, C. D. (2014) 'Anesthesiology residents’ perspective about good teaching - a qualitative needs assessment', German Medical Science, 12, p. Doc05.

Rudolph, J. W., Raemer, D. B. and Shapiro, J. (2013) 'We know what they did wrong, but not why: the case for ‘frame-based’ feedback', Clinical Teacher, 10, pp. 186-189, https://doi.org/10.1111/j.1743-498X.2012.00636.x

Rudolph, J. W., Simon, R., Dufresne, R. L. and Raemer, D. B. (2006) 'There’s no such thing as “nonjudgmental” debriefing: a theory and method for debriefing with good judgment', Simulation in Healthcare, 1, pp. 49-55, https://doi.org/10.1097/01266021-200600110-00006

Schlecht, K. D. (2008) 'Feedback', International Anesthesiology Clinics, 46, pp. 67-84, https://doi.org/10.1097/AIA.0b013e31818623f3

Sidhu, N. S. and Clissold, E. (2018) 'Developing and validating a tool for measuring the educational environment in clinical anesthesia', Canadian Journal of Anesthesia, 65(11), pp. 1228-1239, https://doi.org/10.1007/s12630-018-1185-0

Sidhu, N. S. and Edwards, M. (2018) 'Deliberate teaching tools for clinical teaching encounters: a critical scoping review and thematic analysis to establish definitional clarity', Medical Teacher, (early online access), https://doi.org/10.1080/0142159X.2018.1463087

Taylor, D. C. M. and Hamdy, H. (2013) 'Adult learning theories: Implications for learning and teaching in medical education: AMEE Guide No. 83', Medical Teacher, 35, pp. e1561-e1572, https://doi.org/10.3109/0142159X.2013.828153

Appendices

Appendix I: Participant learner demographics

 

|                             | Site A (n = 27) | Site B (n = 18) | Total (n = 45) |
|-----------------------------|-----------------|-----------------|----------------|
| Gender                      |                 |                 |                |
| Female                      | 11 (41%)        | 13 (72%)        | 24 (53%)       |
| Male                        | 16 (59%)        | 5 (28%)         | 21 (47%)       |
| Age (years)                 |                 |                 |                |
| 21-30                       | 9 (33%)         | 13 (72%)        | 22 (49%)       |
| 31 and above                | 18 (67%)        | 5 (28%)         | 23 (51%)       |
| Time in department          |                 |                 |                |
| 8 weeks – 3 months          | 13 (48%)        | 1 (6%)          | 14 (31%)       |
| 3-6 months                  | 2 (7%)          | 1 (6%)          | 3 (7%)         |
| 6-12 months                 | 6 (22%)         | 4 (22%)         | 10 (22%)       |
| Over 12 months              | 6 (22%)         | 12 (67%)        | 18 (40%)       |
| Anaesthesiology experience  |                 |                 |                |
| Under 6 months              | 4 (15%)         | 2 (11%)         | 6 (13%)        |
| 6-12 months                 | 5 (19%)         | 4 (22%)         | 9 (20%)        |
| 1-3 years                   | 3 (11%)         | 4 (22%)         | 7 (16%)        |
| 3-5 years                   | 5 (19%)         | 6 (33%)         | 11 (24%)       |
| Over 5 years                | 10 (37%)        | 2 (11%)         | 12 (27%)       |

 

Appendix II: Measure for the Anaesthesia Theatre Educational Environment (MATE)

Values are mean ± SD

| Item (0 = strongly disagree, 6 = strongly agree) | Pre | Post | Change* |
|---|---|---|---|
| Teaching Preparation and Practice | | | |
| TP1. I have clear learning goals for theatre teaching sessions | 2.9 ± 1.3 | 4.0 ± 1.5 | +1.1 |
| TP2. The learning goals formulated for a theatre session are relevant | 3.8 ± 0.9 | 4.5 ± 1.3 | +0.7 |
| TP3. My clinical teachers engage with me when determining learning goals for the theatre session | 3.4 ± 1.4 | 4.3 ± 1.4 | +0.9 |
| TP4. My clinical teachers demonstrate an active effort to teach in the operating theatre | 4.0 ± 1.1 | 4.5 ± 1.0 | +0.4 |
| TP5. The teaching is appropriate for my level of training | 4.6 ± 0.9 | 4.9 ± 1.0 | +0.3 |
| TP6. Teaching is delivered in a clear manner | 4.2 ± 1.2 | 4.6 ± 1.0 | +0.4 |
| TP7. I am able to achieve my learning goals in the operating theatre | 4.3 ± 1.0 | 4.8 ± 1.0 | +0.5 |
| TP8. I have opportunities to learn about appropriate non-technical skills in the operating theatre | 4.3 ± 1.1 | 4.6 ± 1.2 | +0.3 |
| TP9. My clinical teachers seek to identify my current level of knowledge, if it is not already known to them | 4.2 ± 1.0 | 4.5 ± 1.4 | +0.3 |
| Assessment and Feedback | | | |
| AF1. Feedback is delivered soon after my work is observed | 4.0 ± 1.3 | 4.6 ± 1.4 | +0.6 |
| AF2. I receive feedback that provides me with an opportunity to improve | 4.5 ± 1.1 | 4.7 ± 1.1 | +0.2 |
| AF3. Feedback is provided based on direct observation of my work | 4.4 ± 0.9 | 4.5 ± 1.2 | +0.1 |
| AF4. I receive honest feedback | 4.4 ± 0.9 | 4.7 ± 1.1 | +0.2 |
| AF5. Corrective feedback is provided when indicated | 4.4 ± 1.0 | 4.9 ± 1.1 | +0.5 |
| AF6. I receive feedback on specific performance issues | 4.2 ± 1.1 | 4.3 ± 1.4 | +0.2 |
| AF7. Feedback is provided on tasks that I perform under direct supervision | 4.3 ± 1.1 | 4.7 ± 1.0 | +0.3 |
| AF8. I receive feedback that is appropriate for my level of training | 4.5 ± 1.0 | 4.8 ± 1.0 | +0.2 |
| AF9. Positive feedback is readily provided when indicated | 4.1 ± 1.3 | 4.8 ± 1.2 | +0.6 |
| AF10. Assessment of my performance in the operating theatre occurs regularly | 3.8 ± 1.4 | 4.0 ± 1.3 | +0.2 |
| AF11. I have sufficient opportunities to reflect on my learning | 4.4 ± 1.2 | 4.6 ± 1.1 | +0.2 |
| AF12. My clinical teachers are fair in their assessment of my performance | 4.4 ± 1.1 | 4.5 ± 1.2 | +0.1 |
| Procedures and Responsibility | | | |
| PR1. The clinical training programme allows me to get first-hand experience in a range of procedures | 5.0 ± 0.9 | 5.0 ± 1.1 | 0.0 |
| PR2. I have an appropriate level of clinical responsibility | 5.1 ± 0.8 | 5.1 ± 0.9 | 0.0 |
| PR3. I am aware of my duties and responsibilities in theatre | 5.1 ± 0.9 | 5.0 ± 1.1 | -0.2 |
| PR4. I have the opportunity to acquire the practical skills appropriate to my level of training | 4.9 ± 0.8 | 4.9 ± 1.2 | 0.0 |
| PR5. My clinical teachers provide appropriate support when I perform a procedure for the first time | 5.3 ± 0.8 | 5.2 ± 1.1 | -0.2 |
| Overall Atmosphere | | | |
| OA1. My clinical teachers are accessible for advice | 4.9 ± 0.8 | 5.0 ± 1.0 | +0.2 |
| OA2. My clinical teachers promote an atmosphere of mutual respect | 4.7 ± 1.1 | 4.9 ± 0.9 | +0.2 |
| OA3. My clinical teachers create a trusting and open learning climate | 4.5 ± 1.2 | 5.0 ± 1.0 | +0.6 |
| OA4. I feel able to ask the questions I want to | 5.0 ± 1.0 | 5.1 ± 1.0 | +0.1 |
| OA5. I view the clinical teachers in this department as positive role models | 4.8 ± 0.9 | 5.1 ± 0.8 | +0.2 |
| OA6. I have a good sense of rapport with my clinical teachers | 4.8 ± 1.1 | 5.0 ± 0.9 | +0.2 |
| OA7. I am aware to whom I should report, in a variety of circumstances | 5.0 ± 0.9 | 5.2 ± 1.1 | +0.2 |

*Reflects differences in unrounded pre- and post-intervention scores
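The footnote explains the table's apparent inconsistencies (for example, AF6 shows rounded scores of 4.2 and 4.3 but a change of +0.2): the change column is computed from the unrounded means, and each value is only rounded afterwards for display. A minimal sketch of that arithmetic, using invented unrounded values purely for illustration:

```python
# Hypothetical unrounded means (invented values, not from the study)
# showing how displayed scores of 4.2 and 4.3 can accompany a
# displayed change of +0.2.
pre_unrounded = 4.16
post_unrounded = 4.34

# Change is computed on the unrounded values, then rounded for display.
change = round(post_unrounded - pre_unrounded, 1)
pre_display = round(pre_unrounded, 1)
post_display = round(post_unrounded, 1)

print(pre_display, post_display, change)  # prints: 4.2 4.3 0.2
```

Subtracting the displayed values instead (4.3 − 4.2 = +0.1) would understate the change, which is why the footnote flags the distinction.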

Declarations

There are no conflicts of interest.
This has been published under Creative Commons "CC BY-SA 4.0" (https://creativecommons.org/licenses/by-sa/4.0/)

Ethics Statement

We obtained approval from our institutional review board (Awhina Research & Knowledge Centre, RM13266). All participants provided written consent to participate in the study.

External Funding

This paper has not received any external funding.

Reviews


Ken Masters - (16/06/2019)

An interesting paper presenting the results of a pilot study to assess the effect of a deliberate teaching tool in clinical anaesthesiology. Given the evidence that structured interventions may be more effective than more traditional apprentice-type training, the authors set out to pilot a clinical anaesthesiology intervention. Overall, the study was well conducted, the teaching processes were comprehensive enough to cover the main items without being overly complex, and the results were evaluated and analysed in some detail.

It would have been useful if the authors had given some more detail about the six studies that they used as the basis for the items (e.g. a simple table), rather than only the bibliographic data that one currently has to draw from Table 1.

While the impact was probably less than the researchers had hoped for, it would have been useful if they had compared the resources required, and the results achieved, against those of the apprentice-type approach. Such a comparison (even if not as an RCT) would have given a clearer picture of the relative value of the intervention, and whether broadening it would be worthwhile.

If the authors are going to conduct a much larger study (as they plan to), then I would strongly recommend that part of that research take into account the resources needed to implement this intervention.

So, a useful pilot, and I look forward to seeing the full-blown, broader research article.
Possible Conflict of Interest:

For Transparency: I am an Associate Editor of MedEdPublish

Mildred Vanessa López Cabrera - (09/01/2019)

The authors present a novel design for teaching and learning anaesthesiology. Faculty members familiar with clinical simulation and debriefing will recognise a similarity with debriefing with good judgment. I think the CRAFT framework would be more easily adopted in teaching because it is more concise.
The design of the study protocol is appropriate for a pilot implementation, and several questions about it were addressed in the limitations section. The forthcoming research on the gender effect using a qualitative approach is also exciting.
I would suggest changing the title to include the name of the new teaching tool and to match the results of the study. A more accurate title might be: Designing the Concise Regular Assessment and Feedback for Teaching and learning (CRAFT) tool for anesthesiology, or Effect of CRAFT as a teaching tool on the anaesthesiology educational environment.
I wish the introduction section provided a broader background on different deliberate teaching tools and on the traditional ways in which anesthesiology is taught.
The methods section could include more information about how many studies were found, perhaps even citing them in that section. Information about how the two authors analysed the feasibility of implementation, the criteria used, or a ranking of the desirable practices they identified would also benefit readers.
The results/analysis section would benefit from a description of how the CRAFT methodology works. For example, how did participants determine a trainee's previous experience? What kinds of goals were agreed? It would also be interesting in this section to compare the results of the MATE survey with previous results from its application in other educational environments.
Possible Conflict of Interest:

I have no conflicts of interest.