Case study
Open Access

Are we any good? A case study in developing an audit tool for educational supervisors

Jane Beenstock[1], Lynn Donkin[2]

Institution: 1. Lancashire and South Cumbria Foundation Trust, 2. Bolton Council
Corresponding Author: Ms Jane Beenstock ([email protected])
Categories: Teachers/Trainers (including Faculty Development), Postgraduate (including Speciality Training), Continuing Professional Development
Published Date: 09/02/2021

Abstract

Rationale for the audit: Educational supervision is a component of postgraduate medical education in the UK. The quality of educational supervision is variable with limited ways to examine it. We applied the principles and process of audit to our work as educational supervisors (ESs).

Methods: An audit tool with 11 standards was developed to assess aspects of practice across the seven domains of Health Education England’s Professional Development Framework for Educators. How supervisors and trainees evaluate supervisory practice varies, so we gathered data from trainees and educators. Following a pilot study and a face validity exercise, we refined the tool. Thresholds for standards were defined by an expert panel. The audit was conducted in a training area in the North West of England.

Outcomes: Response rates from public health ESs and trainees were 78% (14/18) and 88% (15/17) respectively. Two of the standards were met by all ESs. Four standards were met by between 70% and 90% of ESs. Three standards were met by between 50% and 69% of ESs. One standard was met by 29% of respondents and one standard was not met by any respondents.

Discussion: An audit approach can facilitate personal reflection on the quality of supervision for individual ESs, and can be used to assess the training and development needs of a group of ESs.

Conclusion: An audit is a useful tool, alongside other approaches, for consultants to reflect on the effectiveness of their work as educational supervisors. It can be used for both individual and collective continuous professional development and has the potential to be adapted to other specialties. Reflecting on the subjectivity and variation that exists in how the role of educational supervisors is enacted could aid improvements in the quality of training and support provided to trainees.

Keywords: Educational supervision; audit; public health

Rationale for the audit

Postgraduate medical specialty training in England relies on educational supervisors to have oversight of a trainee doctor’s progress in meeting their learning outcomes and to assess the evidence of how well the trainee is progressing in their training (Health Education England, 2019a). This is, therefore, an important role in ensuring the quality of medical care provided to all citizens. In England, the specialty of public health draws trainees from a range of clinical and non-clinical backgrounds who follow a five-year specialty training programme in a similar approach to other medical specialties.

Kilminster et al., (2007, p.3) define educational supervision as “the provision of guidance and feedback on matters of personal, professional and educational development in the context of providing safe and appropriate patient care” and the UK General Medical Council (Conference of Postgraduate Medical Deans (UK), 2018, p.17) defines an educational supervisor (ES) as “a named trainer who is selected and appropriately trained to be responsible for the overall supervision and management of a specified trainee’s educational progress during a training placement or series of placements”. It is recognised that quality of educational supervision is “highly variable” (Patel, 2016, p. 1) but there are few ways for the practising ES to benchmark their performance against agreed standards or against peers. This suggests a need for a tool to assist educational supervisors identify areas for development that would improve the quality of educational supervision they provide.

Consultants in the UK are required to undertake annual professional appraisal. A key part of this is to ensure that they reflect on all activities within their scope of practice; those with an educational role are required to demonstrate evidence of their professional development as educators. Health Education England’s Professional Development Framework for Educators (HEE, 2019b, p.2) identifies seven domains of educational practice to ‘guide practitioners in their development as educators, supervisors and preceptors of healthcare learners’. However, these guidelines lack an indication of how individual supervisors can gauge their proficiency in these areas of practice. We hypothesised that this lack of benchmarking may contribute to the observed variation in quality of supervision. In response to this uncertainty, we decided to explore the development of tools for ESs in public health with the aims of:

  1. Supporting structured discussion of educational activity at appraisal, and
  2. Identifying training needs for ESs in our local training area.

Audit is a tool commonly used to assess clinical, public health or systems practice against defined standards and prompt action to improve quality (Healthcare Quality Improvement Partnership, 2016). We have applied the principles and process of audit to our work as educational supervisors; to our knowledge this has not been done before.

Methods

Literature search

To check whether an existing audit for educational supervisors was available, the following databases were searched: EMBASE, Medline, PsycINFO, CINAHL, HBE and HMIC, using the following terms: education*, supervisor*, supervision, standard*, audit*, guidance, guideline*, medical education. No tool for assessing whether educational supervision standards are met was found.

Development of the audit tool

We used Health Education England’s Professional Development Framework for Educators (Health Education England, 2019b) to identify aspects of educational supervision to include in the audit. We then developed standards for each of the framework’s seven domains. We wanted to gain insight into the perspective of trainees on these standards, so data collection tools were prepared to gather feedback from trainees. 

A pilot study was conducted to gain feedback on the standards and to test both data collection tools (one for ESs, one for trainees). Testing of face validity with participants from other public health training areas was undertaken to ensure questions were relevant, reasonable, unambiguous and clear (Bowling, 2002). Both of these tool development activities were undertaken outside of the local training area to minimise sample contamination. The questions developed for each of the standards are shown in Supplementary File 1.

Threshold setting for the standards was undertaken using the modified Angoff method, a common method for standard setting that uses a panel of experts to determine acceptable levels of competency (Wheaton and Parry, 2012). The national Training Programme Directors group acted as the expert panel for this exercise. The steps we took in the audit development are shown in Figure 1.
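As a purely illustrative sketch of this step (not the calculation the panel actually performed, and with entirely hypothetical figures), the core of the modified Angoff logic can be expressed in a few lines: each panelist estimates how many answer options a minimally competent ES would complete, and the threshold is the rounded mean of those estimates.

```python
# Sketch of modified Angoff threshold setting (hypothetical data, for
# illustration only; the actual panel process was a facilitated discussion
# among Training Programme Directors).

def angoff_threshold(panel_estimates):
    """Cut score: the mean of the panel's estimates, rounded to a whole
    number of answer options."""
    return round(sum(panel_estimates) / len(panel_estimates))

# Five hypothetical panelists estimate how many of a question's 8 answer
# options a minimally competent educational supervisor would complete.
estimates = [6, 5, 7, 6, 6]
print(angoff_threshold(estimates))  # -> 6, reported in the style "6 (8)" in Table 1
```

Averaging independent judgements question by question, rather than applying a single blanket pass mark, is what allows each standard's threshold to reflect its own content.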

 

 Figure 1: Flow chart demonstrating stages of the audit approach

 

Sample

Our samples for the audit were: accredited educational supervisors in a training area in the north of England (n=18), and trainees in the same area (n=17). The data collection tools were circulated using an online survey tool, SmartSurvey (SmartSurvey, 2020). Working with a trainee in the previous 12 months was an inclusion criterion for ESs. A filter question at the start of the questionnaire identified those who were eligible.

Consent

The survey tool began by explaining that the data collected were anonymous and would be pooled to provide collective results. Consent to this was required before participants could proceed to the questions. Permission to submit this manuscript has been given by the Public Health Head of School for this training area.

Outcomes

Response

The survey was sent to the 18 accredited educational supervisors (ESs) registered in the training area on 1st July 2019. Fourteen replied, a response rate of 78%. Nine met the eligibility criteria and were therefore included in the audit. There were 18 trainees registered in the training area on 1st July 2019. The survey was sent to 17 of them as one was omitted in error. Fifteen replied, a response rate of 88%.

Demographic characteristics of educational supervisors

Of the nine ESs who had been active supervisors during the past year, three had supervised one trainee, two had supervised two trainees and four had supervised three or more trainees. ESs who had more than one trainee placed with them were asked to complete the survey for the trainee they had supervised for the longest time. The ESs had supervised trainees at various stages of training, including full-time and less than full-time trainees. 

Five of the nine respondents (56%) had not completed any formal course for supervision, and four respondents (44%) had completed one or more modules on a local university supervision course.

Demographic characteristics of trainees

Of the 15 trainees who responded, the highest percentage (40%) were in the third year of training, followed by years four (27%) and two (27%), with just 7% in year one and none in year five. Sixty-seven percent had been with their current educational supervisor for over six months and 40% were based in a local authority. Three fifths of respondents (60%) were full time.

Meeting the standards

We analysed the pooled responses from the ESs and trainees separately. Across the group of ESs, of the 11 standards:

  • two standards were met by all respondents,
  • four standards were met by between 70% and 90% of respondents,
  • three standards were met by between 50% and 69% of respondents,
  • one standard was met by 29% of respondents, and
  • one standard was not met by any respondents.

Table 1: Threshold set for each standard and extent they were achieved

Standard | Threshold set* (number of answer options) | ESs who achieved the standard, n (%)
1.1 Educational supervisors ensured that trainees completed all aspects of induction to the placement. | 2 (2) | 9 (100)
2.1 Educational supervisors facilitated three or more opportunities to involve the wider team in the delivery of trainees’ learning in the last 12 months. | 3 (4) | 7 (78)
3.1 Educational supervisors used six or more appropriate teaching interventions in the last 12 months. | 6 (8) | 7 (78)
3.2 Educational supervisors used four or more methods to develop trainees’ self-directed learning in the previous 12 months. | 4 (5) | 6 (67)
4.1 Educational supervisors used three or more methods to observe trainee performance and provide feedback in the last 12 months. | 3 (8) | 8 (89)
4.2 Educational supervisors facilitated five or more opportunities for trainees to prepare for professional examinations in the previous 12 months. | 5 (5) | 2 (29)
5.1 Educational supervisors supported and monitored educational progress through five or more activities in the last 12 months. | 5 (5) | 7 (78)
6.1 Educational supervisors used six or more supervisory conversational skills in the last 12 months. | 6 (7) | 5 (56)
6.2 Educational supervisors provided two or more approaches to career support in the last 12 months. | 2 (5) | 9 (100)
7.1 Educational supervisors sought feedback on their own supervisory practice through three or more routes. | 3 (4) | 0 (0)
7.2 Educational supervisors reflected or acted on feedback using four or more methods in the last 12 months. | 4 (7) | 5 (56)

* The number of answer options a minimally competent ES would complete

As a group, ESs reported having carried out activities more frequently than the trainees reported experiencing the same activities.
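To make the scoring rule behind Table 1 concrete, a minimal sketch is shown below, using hypothetical data rather than the actual survey exports: a respondent meets a standard when they selected at least the threshold number of answer options, and the group result is reported as n (%).

```python
# Sketch of scoring responses against a standard's threshold
# (hypothetical counts; not the authors' actual analysis code).

def meets_standard(options_selected, threshold):
    """A standard is met when at least `threshold` answer options were selected."""
    return options_selected >= threshold

# Hypothetical counts of answer options selected by each of 9 ESs, for a
# standard whose threshold was set at 3 options.
selected_counts = [4, 2, 3, 5, 3, 1, 3, 6, 3]
met = sum(meets_standard(n, 3) for n in selected_counts)
pct = round(100 * met / len(selected_counts))
print(f"{met} ({pct})")  # -> "7 (78)", the n (%) format used in Table 1
```

The same tally can be run separately on ES and trainee responses, which is how the group-level comparison between the two perspectives was made.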

Dissemination

The results were discussed with educational supervisors and trainees from the training area and recommendations for acting on the results were agreed. The results and recommendations have been disseminated to local Training Programme Directors (TPDs) and shared with the tool’s contributors.

Discussion

Limitations

This tool was developed within one specialty, public health, and used in one training area and may require adapting for other contexts. ESs and trainees were not identifiable so could not be matched to their training pair and analysis was carried out at a group, not individual, level.

The questions were developed in accordance with standard guidance (McColl et al., 2001) and were piloted and subjected to a face validity exercise. However, the development team was a small group of ESs, and questions were not subjected to any statistical tests of reliability and validity (Coolican, 2009).

It is important to emphasise that this audit is just one approach to reviewing how we perform as ESs. This approach is intended to support individual and peer group reflection. Qualitative methods would have given a depth of insight not possible with this type of tool. However, they do not offer the anonymity of a survey and can be time-consuming and expensive.

Reflections

There was a high response rate (78%) to this survey. Colleagues who took part in the pilot asked to see the final tool, indicating interest in supplementing existing approaches to reflecting on, and improving, the quality of educational supervision.

The process of devising the audit illustrated the variation in quality already mentioned (Patel, 2016), revealing that educational supervisors have different perceptions of what constitutes an acceptable level of supervision. For example, having to reach consensus on the question wording for each standard and agree the answer options revealed different perspectives on what the HEE guidance (Health Education England, 2019b) means in practice. The debates challenged our thinking about how we each discharge our ES role and provided valuable learning for the ESs involved in this process, giving insight into each other’s approaches.

A second example of the variation in what is considered acceptable supervision was the subjectivity apparent when the thresholds for each standard were set. The debates in the development group and expert panel illustrated differing expectations among ESs of the level of support that should be provided for trainees. The non-achievement of some standards may reflect this, with thresholds being set too high for some questions. Alternatively, the thresholds may have been set too low, giving respondents a false sense of achievement about the quality of their supervision.

As Table 1 shows, the proportion of answer options that needed to be selected to achieve each standard’s threshold varies between standards. This is because the threshold setting process was completed question by question: a blanket approach does not allow for the specific factors intrinsic to each topic, which is why the Angoff method (Wheaton and Parry, 2012) assesses each question independently.

This type of tool provides a structured approach. Audit is also a widely recognised tool for improving the quality of care in many medical specialities (e.g. Esposito and Dal Canton, 2014; Yorston and Wormald, 2010; Mercer et al., 2006). This gives it the advantage of familiarity: most consultants are regularly involved in audit of their practice and can therefore readily appreciate how to apply this approach to their role as an educational supervisor.

The testing phases were conducted in groups outside of our training area to avoid contaminating the sample. This was extremely helpful, giving insights into how the questions were interpreted, and it revealed variation in approaches to educational supervision across different training areas. A significant learning point, and an area for future development, is the opportunity to agree as a group of ESs what we consider the most appropriate ways to support and monitor educational progress. Generating shared expectations and common understandings of what we as ESs provide for trainees should bring greater consistency and improved quality to the training experienced by trainees.

Re-audit is a key part of the audit cycle and should be conducted to see whether the recommendations have been implemented and have resulted in improvements. We have recommended re-audit be carried out after a period of two years. As there is a low turnover of ESs a re-audit would highlight improvements in the quality of educational supervision in this area.

This tool has the potential to be adapted to other contexts, both by specialty and by geography. Terms used in our training area, such as ‘the annual event’, were not recognised by colleagues in pilot areas. In addition to differences in terminology between areas, there may also be differences in activities since, unlike many other specialties, public health supervisors often fulfil both the clinical and educational supervisor roles. Although developed within the specialty of public health in England, we believe the tool can be adapted to other specialties and training areas.

Another strength of this approach is that the results can be used by both individuals and groups of ESs. Each ES can compare their responses with the collective response, facilitating reflection on how they may wish to change or enhance their ES activities. It also enables groups of ESs to identify themes in their training needs. As medical training takes place in a series of groups (hospitals, regions and countries), there is scope to adjust the scale at which the audit is conducted, allowing individuals to reflect on their own responses and anonymously compare them with different groups of peers. Consideration of collective audit responses facilitates an understanding of the development and training needs of ESs both within and between specialities.

Conclusion

An audit is a useful tool, alongside other approaches, for consultants to reflect on the effectiveness of their work as educational supervisors. It can be used for both individual and collective continuous professional development and has the potential to be adapted to other specialties. This approach facilitates peer support and peer development. Reflecting on the subjectivity and variation that exists in how the role of educational supervisors is enacted can only be helpful in improving the quality of training and support we provide to trainees.

Take Home Messages

  • Educational supervisors have few ways to assess how well they are meeting standards of educational supervision.
  • Audit principles and processes are familiar to educational supervisors and trainees.
  • An audit can be conducted at both individual and group level.
  • Audits facilitate personal reflection on standards of educational supervision.
  • Audits in training zones identify collective learning needs to improve training quality for trainees.

Notes On Contributors

Jane Beenstock is Consultant in Public Health, at Lancashire & South Cumbria NHS Foundation Trust a mental health and community trust in England. She is an honorary researcher at Lancaster University. ORCID ID: https://orcid.org/0000-0001-7533-6279

Lynn Donkin is Assistant Director/Consultant in Public Health, for Bolton Council, a local authority and part of local government in England. She is a training programme director for one of the training zones in the North West of England. ORCID ID: https://orcid.org/0000-0002-4716-0883

Acknowledgements

The survey tool was designed and developed by an educational supervision audit group which included the authors and Zakyeya Atcha, Gifford Kerr, Nicola Schinaia. Thanks to Cath Harris, Information Services Librarian, Lancashire and South Cumbria NHS Foundation Trust, for conducting the literature searches; Sarah Blagdon and Anne Swift for their helpful comments on an earlier draft of this paper. Thanks also to all those who took part in the pilot, face validity exercise and threshold setting process.

Figure 1. Source: the authors.

Bibliography/References

Bowling, A. (2002) Research methods in health: Investigating health and health services. 2nd edn. Buckingham: Open University Press.

Conference of Postgraduate Medical Deans (COPMeD). (2018)  A reference guide for postgraduate speciality training in the UK: The Gold Guide, 7th edition. Available at: https://www.copmed.org.uk/images/docs/gold_guide_7th_edition/The_Gold_Guide_7th_Edition_January__2018.pdf (Accessed: 20 May 2020).

Coolican, H. (2009) Research methods and statistics in psychology. 5th edn. London: Hodder Education.

Esposito, P. and Dal Canton, A. (2014) ‘Clinical audit, a valuable tool to improve quality of care: General methodology and applications in nephrology’, World Journal of Nephrology, 3(4) pp. 249-255. https://dx.doi.org/10.5527/wjn.v3.i4.249

Health Education England. (2019a) Educational supervisor. Available at: https://www.nwpgmd.nhs.uk/educator-development/standards-guidance/educational-supervisor (Accessed: 21 May 2020).

Health Education England. (2019b) Professional Development Framework for Educators. Available at: http://admin.faculty.londondeanery.ac.uk/files/professional-development-framework-for-educators-2019 (Accessed: 2 December 2020).

Healthcare Quality Improvement Partnership. (2016) Best Practice in Clinical Audit. Available at: https://www.hqip.org.uk/wp-content/uploads/2016/10/HQIP-Guide-for-Best-Practice-in-Clinical-Audit.pdf (Accessed: 20 May 2020).   

Kilminster, S., Cottrell, D., Grant, J. and Jolly, B. (2007) ‘AMEE Guide No. 27: Effective educational and clinical supervision’, Medical Teacher, 29(1), pp. 2–19. https://doi.org/10.1080/01421590701210907

Mercer, S. W., Sevar, K. and Sadutshan, T. D. (2006) ‘Using clinical audit to improve the quality of obstetric care at the Tibetan Delek Hospital in North India: a longitudinal study’, Reproductive Health, 3(1). https://doi.org/10.1186/1742-4755-3-4

McColl, E., Jacoby, A., Thomas, L., Soutter, J., et al. (2001) ‘Design and use of questionnaires: a review of best practice applicable to surveys of health service staff and patients’, Health Technology Assessment, 5(31). https://doi.org/10.3310/hta5310

Patel, P. (2016) ‘An evaluation of the current patterns and practices of educational supervision in postgraduate medical education in the UK’, Perspectives on Medical Education, 5(4), pp. 205–214. https://doi.org/10.1007/s40037-016-0280-6

SmartSurvey Ltd. (2020) SmartSurvey. Available at: https://www.smartsurvey.co.uk/ (Accessed: 23 January 2020).

Wheaton, A. and Parry, J. (2012) ‘Using the Angoff Method to set cut scores’, Proceedings of QuestionMark 2012 Users Conference. Available at: https://www.pmcommunity.ro/download/examen_pmp/Metoda%20Angoff%20scoring%20examen%20PMP%20-%20detalii.pdf (Accessed: 12 June 2020). 

Yorston, D. and Wormald, D. (2010) ‘Clinical auditing to improve patient outcomes’, Community Eye Health Journal, 23(74), pp. 48-49. Available at: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3033614/pdf/jceh_23_74_048.pdf (Accessed: 12 June 2020).    

Appendices

None.

Declarations

There are no conflicts of interest.
This has been published under Creative Commons "CC BY-SA 4.0" (https://creativecommons.org/licenses/by-sa/4.0/)

Ethics Statement

This project was exempt from formal ethics approval by LCFT because the project was considered to be clinical/non-financial audit as described in the national policy document published by the health research authority in England. http://www.hra-decisiontools.org.uk/research/docs/DefiningResearchTable_Oct2017-1.pdf. A letter was provided from the Trust’s Associate Director of Research and Development confirming that this is the review process that was followed, leading to the decision that ethics approval was not required for this audit.

External Funding

This article has not had any external funding.
