Research article
Open Access

What Factors Influence General Practice Specialist Trainees’ Engagement with their E-portfolio?

Jonathan Rouse[1], Christopher Green[2]

Institution: 1. Southend General Practice Specialist Training Programme, 2. University of Essex
Corresponding Author: Dr Jonathan Rouse ([email protected])
Categories: Assessment, Educational Strategies, Teaching and Learning, Postgraduate (including Speciality Training), Research in Health Professions Education
Published Date: 17/08/2018

Abstract

Background: E-portfolio is the primary strategy for the development and assessment of general practice trainees. Literature indicates that trainees often resist engaging fully, undermining its validity in assessing trainee competence and performance.

Aim: To identify and explore conditions influencing engagement with e-portfolio within a local training programme.

Design: A constructivist grounded theory design was used to conceptualise the factors that influence trainee engagement.

Method: Twelve semi-structured trainee interviews were audio-recorded, transcribed and coded. Aided by NVivo Pro version 11, initial codes were developed, revised and raised to focused codes. Through constant comparison, diagramming and memo-writing, theoretical categories were generated.

Results: Data analysis conceptualised a theory of engagement incorporating three conceptual categories: “Conceptualising the e-portfolio”, “Developing and maintaining trust” in e-portfolio and its processes, and “Deciding upon investment worth”. Decisions to invest personal resources in e-portfolio engagement depended upon trainees’ appraisal of its personal worth and value.

Conclusions: E-portfolio valuation was contingent upon trainees’ conceptualisation of its purpose, and the trustworthiness of the learning and assessment processes prescribed by its structure. This has implications for trainees, supervisors, and training programmes related to implementation and ownership of e-portfolio, and the credibility and transparency of its role in the assessment of professional performance.

Keywords: General Practice Training; E-portfolio; Medical Education

Introduction

The Royal College of General Practitioners (RCGP) (RCGP, 2016a) defines e-portfolio as the modality for recording all workplace-based assessment (WPBA) and the “glue which binds the curriculum, learning and assessment”. E-portfolio forms one part of the “tripos” assessment allowing general practice (GP) trainees to achieve a certificate of completion of training, the other two components being national, standardised examinations (RCGP, 2007). Some question the need for e-portfolio, citing the greater credibility of standardised examinations (Lakasing, 2013), whereas others suggest that objective measurement cannot adjudicate reliability (Sadler, 1989; Rughani, 2008) and that portfolios are well placed to authentically and reliably assess performance at the highest level (“does”) of “Miller’s pyramid” (Rethans et al., 2002). WPBA can reliably assess trainee performance; however, reliability decreases as assessment numbers diminish (Wilkinson et al., 2008). Similarly, reliability depends more upon regular, broad sampling than upon the standardisation of any single test (Van der Vleuten and Schuwirth, 2005). E-portfolio may therefore have greater reliability for assessing clinical performance, but only if engagement remains consistently high. Thus, understanding the factors influencing engagement is important, both to assist individuals to excel and to quality-assure assessment processes for external stakeholders.
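
The relationship between sampling and reliability can be illustrated, by way of analogy rather than as part of the cited studies’ own analyses, with the Spearman–Brown prophecy formula from classical test theory, where R1 denotes the reliability of a single assessment and Rk the reliability of an aggregate of k comparable assessments:

\[ R_k = \frac{k\,R_1}{1 + (k - 1)\,R_1} \]

On this idealised model, a single observation with reliability 0.3 would yield an aggregate reliability of roughly 0.72 across six observations; conversely, reliability falls away quickly as the number of assessments diminishes.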

Assessment is a key driver for learning (Van der Vleuten and Schuwirth, 2005) and portfolios provide opportunities for learner feedback. However, as an assessment tool, portfolios reduce trainee ownership and self-confidence (McMullan et al., 2003), and transparent reflective log entries require sensitive handling, as judgements of those reflections may be viewed as judgements of character (Tomlinson, 2015). Furthermore, trainees are more likely to present their best evidence, rather than display weaknesses, when being assessed (Norcini, 2003; Driessen et al., 2007), particularly in light of the case of Dr Bawa-Garba (Dyer and Cohen, 2018). Such normative reflection rarely produces transformative learning (Cotton, 2001).

Portfolio structure also influences its use in summative assessment. Less rigid structures have greater face and content validity (Pitts, Coles and Thomas, 1999) and learners tend to favour portfolios which may be adapted to learning preferences (Gray, 2008); however, summative assessment requires standardisation (Snadden, 1999; Roberts, Newble and O’Rourke, 2002; Norcini, 2003). The rigidity of e-portfolio necessitated by its summative component may have paradoxically created barriers to engagement, negating its formative benefits, and is worthy of further exploration.

Whilst early e-portfolio pilots were popular amongst self-selected cohorts, resistance is increasingly reported (King, 2013), particularly regarding reflective writing (Ruiz et al., 2009). For trainees this issue is contentious given the varying numbers of log entries required for each review period by different Local Education and Training Boards (LETBs) (Osborne and Bal, 2013; Health Education East of England, 2015), despite the RCGP stating there is no minimum, just “…sufficient evidence of balanced curriculum and competence coverage” (RCGP, 2016b).

Previous studies have assessed factors affecting e-portfolio use in medical undergraduates (Ross, Maclachlan and Cleland, 2009; Belcher et al., 2014) and across different specialties (Kjaer, Maagaard and Wied, 2006; Tochel et al., 2009; Goodyear, Bindal and Wall, 2013); however, good-quality studies examining e-portfolio in GP training are lacking. There is also scant evidence on how peers, the formative and summative purposes of the portfolio, its structure and socio-political factors influence engagement. This study attempts to address this gap. Furthermore, it aims to generate understanding of user perspectives, make recommendations to enhance the quality of educational support for trainees locally and nationally, and provide feedback to the RCGP as to how e-portfolio might be developed to optimise engagement.

Methods

This study employed a constructivist grounded theory design (Charmaz, 2006) to identify and explore the factors influencing engagement with the RCGP e-portfolio within a local training programme.  The study was conducted between October 2016 and September 2017 within a single general practice training programme. Trainees were invited to contribute to the design and subsequent reporting of the research, although uptake was limited.

In the interests of transparency, several assumptions need to be made explicit. Firstly, the researcher subscribed to the educational value of portfolios. Secondly, the perception was that e-portfolio resistance was endemic amongst local trainees. These assumptions have important implications for reflexivity. Thirdly, e-portfolio has arisen for its educational and regulatory value and is unlikely to be abandoned within medical education in the foreseeable future. Fourthly, engagement with e-portfolio was considered only in quantitative terms.

Consistent with “criterion-I sampling” (Palinkas et al., 2015), purposeful sampling was employed. Trainees with lower levels of portfolio engagement were sampled to provide insights into factors negatively influencing engagement. Low engagement was arbitrarily defined as any of: fewer than half of the locally expected numbers of WPBA and learning logs completed for stage of training; an inactive personal development plan; concerns regarding engagement stated by the educational supervisor; or unsatisfactory educational supervisor reviews. Trainees were contacted by email with an explanation of the purpose of the research and what participation involved. Subsequent emails and verbal invitations helped to increase participation.
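
As an illustration only, the sketch below operationalises the stated low-engagement criterion as a simple decision rule. It formed no part of the study’s actual procedure: the record fields, the reading of the criterion as “any indicator triggers sampling” and the example values are assumptions made for the purposes of the example.

```python
from dataclasses import dataclass

@dataclass
class TraineeRecord:
    """Hypothetical engagement summary for one trainee at a given stage of training."""
    wpba_completed: int        # workplace-based assessments completed
    wpba_expected: int         # locally expected number for this stage
    logs_completed: int        # reflective learning log entries completed
    logs_expected: int         # locally expected number for this stage
    pdp_active: bool           # personal development plan has active entries
    supervisor_concern: bool   # educational supervisor has raised engagement concerns
    unsatisfactory_esr: bool   # unsatisfactory educational supervisor review

def is_low_engagement(t: TraineeRecord) -> bool:
    """Flag a trainee for criterion-based sampling if any stated indicator is met."""
    below_half = (t.wpba_completed < 0.5 * t.wpba_expected
                  or t.logs_completed < 0.5 * t.logs_expected)
    return below_half or not t.pdp_active or t.supervisor_concern or t.unsatisfactory_esr

# Example: 3 of 12 expected learning logs completed -> sampled as low engagement.
print(is_low_engagement(TraineeRecord(6, 10, 3, 12, True, False, False)))  # True
```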

Data were collected through semi-structured interviews with twelve general practice trainees; interviews were audio-recorded and transcribed onto a secure computer. Focus groups were not deemed appropriate because emergent hierarchies within a group risked suppressing some trainees from disclosing useful data (Green, 2007). The initial interview schedule (Box 1) was designed to explore gaps identified within a preceding literature review. All interviews were conducted by Jonathan Rouse. Consistent with grounded theory methods, subsequent interviews were adapted based on initial analysis of earlier interviews. This provided richness and depth to the dataset.

Box 1 Initial Interview Schedule

Previous experience of portfolio prior to GP training

Support offered

  • Supervisor, training programme, deanery

Attitudes and behaviours of peers

Attitudes and behaviours of supervisors

  • Quantity over quality of learning logs

Understanding the purpose of the e-portfolio

  • Sociopolitical factors

Impact of the portfolio on personal development

Place of portfolio within training

  • Compare with AKT and CSA, Summative assessment

Factors affecting engagement

  • Time, rota, personal circumstances, IT

Encouraging use/Improving motivation

 

Data collection and analysis (aided by NVivo Pro v11 data management software) were performed simultaneously. Initial coding commenced following transcription of the first interview and continued throughout the interview process. Transcripts were coded initially using predominantly gerunds, to retain actions and decision-making processes, and in-vivo codes, to retain the trainees’ ‘voice’ throughout the analytical process. Initial coding remained open so as not to prematurely limit the analysis. The first six interviews produced over 1000 codes. Initial codes were compared and revised by amalgamating duplicates and modifying similar descriptors to capture ‘what was going on’ across data from different participants.

Focused coding was used to provide abstraction and conceptualisation (Alemu et al., 2015). This involves selecting the initial codes which best represent the data’s core characteristics (Sbaraini et al., 2011). Whilst the a priori literature review inevitably influenced this, merging certain codes and discarding others considered less relevant to the study’s aim provided a legitimate strategy for producing focused codes (Saldana, 2016). In this study, focused codes incorporated the highly prevalent revised initial codes and described virtually the entire breadth of the data, barring a few codes that appeared inconsequential to the study’s aims. Analytic memo-writing provided comparative syntheses of coded data and shaped the analytic lines of enquiry. Memos supported the revision of initial codes and helped to develop focused codes and theoretical categories. Diagramming and mapping techniques were used to delineate interactions between categories.
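
The bookkeeping behind revising initial codes and promoting the most prevalent of them to candidate focused codes can be sketched programmatically. The sketch below is illustrative only: the study relied on NVivo and analyst judgement rather than a script, most of the code labels are taken from this article, and the merge rule, the invented label “balancing exam revision” and the frequency threshold are assumptions made for the example.

```python
from collections import Counter

# Hypothetical gerund-based initial codes attached to transcript excerpts.
initial_codes = [
    "finding time", "juggling demands", "finding time",
    "assessing personal gain", "balancing exam revision", "finding time",
    "juggling demands", "getting away with doing less",
]

# Analyst-defined revision: amalgamate duplicates and similar descriptors.
merge_map = {"balancing exam revision": "juggling demands"}

revised = Counter(merge_map.get(code, code) for code in initial_codes)

# Promote the most prevalent revised codes to candidate focused codes.
focused_candidates = [code for code, n in revised.most_common() if n >= 2]
print(focused_candidates)  # ['finding time', 'juggling demands']
```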

By the eighth interview the data were becoming repetitive, so interviewing was paused to develop seven initial theoretical categories. A theoretical interview schedule was constructed (Box 2) to uncover any disconfirming data, saturate categories and further examine how these were related. Data collected within four subsequent interviews did not produce any new categories but consolidated the initial categorisation into three conceptual categories whilst also illuminating the links between them.

Box 2 Theoretical Interview Schedule

Assessing level of engagement

Understanding of the primary purpose of the e-portfolio

Who the portfolio is for

Effect of trust upon engagement

  • How trust in the eportfolio and its systems is measured
  • How trust relates to understanding of the eportfolio’s purpose

Relationship of trust and deciding upon investment worth

Worth of the portfolio

  • Impact on engagement
  • Factors influencing perception of worth

Results/Analysis

The analysis illuminated the conditions and trainee decision-making processes that influenced engagement with the e-portfolio. Trainees made engagement decisions based upon their conceptualisation of the e-portfolio and the trust they placed in it as a learning tool. This determined the extent to which they were prepared to invest time, energy and resource into its development and completion. The category “Conceptualising the E-portfolio” pertained to socio-culturally influenced constructions of knowledge of how to use e-portfolio and personal understanding of its raison d’être, which underpinned engagement strategies and their justification. “Developing and maintaining trust” in e-portfolio and its processes illustrated the influence of the credibility of the learning and assessment methods employed, and of variance between trainees’ understanding of ownership and purpose and their lived experience. “Deciding upon investment worth” described how e-portfolio engagement was influenced by socio-culturally dependent cost-benefit decisions, whereby cost was measured in time and personal sacrifice, and benefit in relation to its perceived role and educational value. Together, these factors ultimately shaped e-portfolio engagement.

Deciding upon Investment Worth

I think it’s time consuming. Time can be spent better elsewhere…ST3(4)

E-portfolio engagement depended upon investment and worth, whereby investment comprised time, effort and personal sacrifice, and worth, the anticipated benefit. Trainees balanced the pay-off between the time spent on reflection and the outcomes that this achieved:

If you put the time in and you feel that you’ve really learnt something you don’t mind…ST3(3)

Focused codes such as “getting away with doing less” and “assessing personal gain” alluded to this cost-benefit decision-making. The focused code “juggling demands” highlighted that e-portfolio does not exist in isolation; the worth of investment was relative to other commitments, such as family, work and professional examinations:

…to find the time is quite difficult because…you need to balance exam revision and also family life. ST3(7)

Deadlines and summative assessment appeared to promote engagement; however, whilst this enhanced the perceived worth of investment in relation to qualification, a focus on summative assessment may have been detrimental to reflective learning and development.

…it was becoming more of a learning tool for me, but now it’s becoming more of…an exercise in completing and getting the numbers... ST3(8)

Time was identified as an important investment commodity needing to be found and spent. “Finding time” implied e-portfolio was of low priority and, for some, perceived worth was so low that even protected time would be considered wasteful.

…do you think it would help if we had more time in a session. No, I think some people wonder that. ST3(4)

Conversely, “spending time” suggested higher implicit value and greater personal investment. Participants expressed the view that time spent on e-portfolio should produce a return on investment. Reflective log entries required greater investment than WPBA, and some participants regarded reflective learning logs as less reliable measures of their ability or performance, and less able to capture a trainee’s learning:

I’d rather do more [WPBA], rather than learning logs… I find it difficult to expand and write sentences about how I feel about that particular situation ST1(3)

…if it’s just clerking in a patient…then what is there to reflect on? ST1(2)

Differences in the implementation of reflective learning logs by supervisors and trainers further influenced how trainees perceived their worth and the extent to which they subsequently engaged:

…if [others] can get away with 2 a month and we’re doing 8 a month…what’s that say about how that learning skill is viewed? ST3(3)

Developing and Maintaining Trust

…there’s a lot that you can’t really see from just the portfolio. ST1(3)

Developing trust in e-portfolio processes, and in the ways in which these processes were implemented, influenced the extent to which trainees committed to engaging with the portfolio. Trust was contingent upon trainees’ experiences: where there was clarity of expectation and consistent implementation, trust developed. This trust was fragile, though, and when trainees’ experiences contrasted with their expectations, trust in e-portfolio processes risked being lost and engagement diminished.

Inconsistent supervisor expectations and behaviours, knowledge of differing requirements in other regions and perceived lack of punitive measures for underperforming colleagues locally devalued engagement.

…you do want people who don’t bother to have a bit more of a slap on the wrist…I don’t feel like I get a huge amount out of it…if I hadn’t of bothered it wouldn’t have made a huge difference. ST3(3)

Differing standards promulgated subjectivity, reducing faith in e-portfolio as a summative assessment tool, and inculcated feelings of inequality, resulting in distrust and disengagement. 

…there’s also unfair elements to it…the expectations vary so massively…and that definitely brings about a bit of disengagement. ST3(6)

Trust was eroded through dissonance between the espoused value of the e-portfolio as a reflective, self-directed learning object based on adult learning principles and trainees’ experiences of it as a reductionist performance monitoring tool. Quantitatively measuring performance was often viewed as disempowering, which, through loss of a sense of ownership, undermined trust and engagement.

…when you say there’s a set of numbers that you need to do then I think the intention changes from being purely reflective and learning something…ST1(2)

…it’s part of our assessment, you need to still have…the minimum number of learning logs, the minimum number of [WPBA], so I don’t really feel it’s like for me anymore…ST3(2)

Mutual trust was important, and trainees often expressed that the espoused values of adult learning were lacking due to factors within e-portfolio and beyond. Platform inflexibility and seemingly irrelevant boxes reduced the sense of ownership and stifled creativity.

…what puts me off writing entry logs…is the fact that we have to chase our supervisors to read them…you don’t want to annoy the person who’s your clinical supervisor…ST1(2)

…I just draw mind maps and I just scan it on as my reading, but they still want me to write stuff in the bits like “why did you do this” … ST3(2)

Supervisor behaviours devaluing trainees’ learning and constant reminders regarding targets, without constructive qualitative feedback, undermined adult learning values and this perceived lack of mutual trust negatively influenced engagement.

…there is lots of encouragement from supervisors telling us constantly do this, do that many learning logs, but that’s even more off-putting. ST3(3)

Trust in supervisors was also influenced by the perceived ability of supervisors to assign relevant competences to log entries.

I find it easy just to make sure it’s short and not too over-analysing stuff as people often miss a lot of stuff in there and just tick one or two boxes. ST3(5)

For trainees to engage they needed to trust that e-portfolio accurately represented their capabilities. Trainees were more trusting of judgements made by those familiar with their daily practice but raised concerns regarding judgements made by external assessors.

…they can use the logs…they can use the mini-CEXs and CBDs, but I still don’t think that it’s an accurate reflection. ST1(1)

… I think the person that ultimately should have the say in who passes and who fails, based on your portfolio, should be your supervisor and not someone else who’s never met you... ST3(6)

Trainees’ trust in the methods used by e-portfolio to reflect their capabilities varied. There was a leap of faith that the situations they selected to form the basis of reflections would be interpreted as intended: as instances of experiential and transformative learning that facilitated professional development. Some trainees were anxious that such reflections could be misused by assessors as evidence of poor practice and therefore poor performance.

 …you could come across as a bad trainee when you’re not a bad trainee because you’re reflecting on forced things that you’ve done wrong…whereas that’s not a true reflection of you, but you’re using it as a self-critical way of learning… ST3(3)

Trainees indicated that they were unsure of the confidentiality of records within the e-portfolio and who had access to their reflections.  This directly impacted upon the kinds of situations trainees chose to include in their reflections, limiting potentially transformative learning opportunities.

…there was a case gone to court about this…so, you don’t want to write, you know, too sort of important stuff ST3(5)

Conceptualising the E-portfolio

It’s a place for us to be monitored, to see for supervisors and the deanery and, of course, our tutors to monitor what we’re doing. ST1(2)

The ways that peers and supervisors conceptualised and implemented the e-portfolio were influential in constructing trainees’ knowledge and understanding. Where supervisors lacked awareness of the e-portfolio’s values and processes, trainees’ motivation to engage reduced.

… If the supervisor doesn’t know a huge amount about it, then [you’re] possibly less kind of motivated to do it…ST3(1)

Peer interaction influenced conceptualisation and, despite a culture of negativity towards e-portfolio, shared knowledge and understanding appeared to enhance engagement.

I don’t think that I’ve spoken to a single person who turns round and says this is an excellent thing…everyone hates it…ST3(2)

We try to sit and do it together with my other colleagues who are in GPST1 as well… that’s been useful…ST1(3)

Knowledge was conceptualised within procedural, systematic and contextual domains. Procedural knowledge, dependent upon induction and support from supervisors and the training programme, encompassed knowing how to use e-portfolio and how and when to write reflections. Overcoming engagement inertia was difficult without it.

…I didn’t really know how to use it and then I didn’t know what I should be reflecting on, so just really basic things I didn’t know. ST2(1)

Trainees’ expectations were shaped by the cues they picked up from others.

…one supervisor says, “yes I think you’re meeting that” and the other supervisor says “mmm, maybe, I see what you mean”. ST3(4)

Contextual knowledge comprised awareness of what trainees were doing locally in comparison with implementation processes reported in other regions. This sometimes led to a sense of injustice where trainees perceived differences in expectations between one context and another.

…it’s unfair…whilst certain people in a different deanery will do a few logs we’re expected to do a lot more. ST3(6)

The e-portfolio was also conceptualised in terms of purpose and ownership. E-portfolio was recognised as having developmental, pastoral and managerial purposes. Trainees developed an understanding of its primary purpose, which influenced the investment worth attributed to engagement.

…something had to be done…to show that the colleges are monitoring their trainees, and so it’s just created more paperwork to prove that they’re doing that…ST3(3)

…that’s why we have mini-CEX and CEPS and CBDs and these things, to really show that we’re not just working, but also learning…ST1(2)

…it’s also to kind of identify people who are struggling, or, um, possibly need a bit of extra support…ST3(1)

Ownership was conceptualised at two levels. At one level it concerned physical ownership of the platform and its relation to subscription fees; at another, it concerned the actual content. An inflexible structure, quantitative measures of engagement and loss of control over access and disclosure tended to reinforce the understanding of e-portfolio as being externally owned.

I would be…happy just to write those 2 boxes and I could decide how much I wanted to write in each box... ST3(2)

A Theory of GPST Engagement with the E-portfolio

Figure 1: A Theory of GP Trainee Engagement with E-portfolio

This theory conceptualises e-portfolio engagement as ultimately being influenced by the perceived worth of investing in it. Several memos highlighted that value-based decisions regarding e-portfolio’s overall relevance and importance to training and development were made by trainees at micro and macro levels. At a micro level, the investment worth of specific components of e-portfolio was evaluated. At a macro level, investment was dependent upon how the portfolio was conceptualised and the trust trainees placed in the portfolio and its implementation. Less energy was invested where mistrust of e-portfolio’s motives and processes existed, or where trainees felt that they were undervalued or treated unfairly.

You go through the motions because you’re satisfying…this element that you can’t really control…if I thought my supervisor was the person that was going to be assessing me at the end that would be a far more useful assessment tool because then I’d give it more value. ST3(6)

Understanding the purposes and beneficiaries of e-portfolio also informed the evaluation of worth. Perceiving e-portfolio as a performance management tool decreased investment relative to perceiving it as a personal development tool.

…it’s…evidencing that we’ve trained you…if you’re actually that good, you can actually use it as a learning tool as well…ST3(5)

Conceptualising the e-portfolio also informed the development and maintenance of trust in it and its processes. Trust was undermined by knowledge of differing standards locally and in other regions, and by e-portfolio’s perceived reductionism.

…you can feel like you’ve created a good doctor...Look at all these doctors, they’re ticking off all these boxes…Does it actually mean that? My feeling is no, I don’t think it does... ST3(4)

Furthermore, where an understanding of purpose was subsequently challenged by experience, the maintenance of trust in e-portfolio and its administration was negatively influenced.

It’s supposed to be a way for us to enhance our reflection, stimulate learning…but then if it’s for us then why is it so didactic? Why is it so, enforced upon us...? ST3(6)

Discussion

Summary

The findings of this study posit e-portfolio engagement as an investment decision, underpinned by the trust that trainees place in its learning and assessment methods and shaped by perceived inconsistencies in educational expectations. The ways in which trainees conceptualise the e-portfolio and its implementation influence trust and investment decisions. These provide the basis for evaluating congruence between expectations and experience and for informing the cost-benefit decisions that mould engagement.

The researchers’ initial preconceptions were challenged, including the conception of engagement itself, highlighting its qualitative dimensions and prompting reflection on how trainee engagement should be evaluated. The negative effects of quantitative reductionism that trainees characterise in this study appear to reduce personal investment and encourage superficial engagement with e-portfolio. This resonates with several extant psychological theories, which are explored below. It may be time to re-evaluate whether the RCGP e-portfolio fulfils its intended purposes for all stakeholders.

Strengths and Limitations

This study addresses the gap within the literature pertaining to e-portfolio engagement since its widespread implementation in GP training. Significant overlap with prior work on portfolios and other extant psychological theories limits originality. However, the conceptual category “developing and maintaining trust”, which incorporates democratic issues, adds a further dimension to discourse on value and worth within medical education.

Although taken-for-granted meanings were not always fully explored within interviews, retaining participant voices throughout the analysis increased resonance, as indicated by participant feedback. The conceptual categories explained nearly all the data; however, revised coding may have narrowed the data analysis earlier and limited its potential depth.

Focusing on trainees with lower engagement levels means that these findings may have less relevance to those with higher levels of engagement. However, several practical recommendations were generated, and the abstraction of the conceptual categories appears to extend beyond this specific context. Furthermore, e-portfolio informs the trainee Annual Review of Competence Progression, and the current review of this process (Rimmer, 2017) may be informed by this research.

Comparison with Existing Literature

Resonating with prior studies highlighting a positive correlation between perceived value and portfolio engagement (Snadden and Thomas, 1998; Hrisos, Illing and Burford, 2008), this study identifies “deciding upon investment worth” as the most influential factor for engagement amongst this study population. This may be explained by the neoliberal commodification of education and participant assumptions regarding learning and assessment as consumerist transactions (Ingleby, 2015).

This study finds incongruent pedagogies in action that may adversely influence engagement. The contradiction inherent in reflective writing that does not allow disclosure of experiences of suboptimal practice without risk of sanction is one example, previously observed amongst medical students (Grant et al., 2006). Furthermore, lack of trust and loss of ownership and control appear to undermine autonomy, adversely affecting engagement and echoing findings amongst Danish GP trainees (Kjaer, Maagaard and Wied, 2006).

Amongst medical students, supervisor understanding of portfolio requirements has been identified as important for engagement (Austin and Braidman, 2008), whilst a perceived lack of supervisor clinical knowledge has been identified as a barrier to completing WPBA (Hrisos, Illing and Burford, 2008). Similarly, this study suggests that supervisors whose knowledge and understanding of e-portfolio enable them to identify opportunities for WPBA and to assign relevant professional competencies to learning logs positively influence engagement. Linking quantitative measures of engagement to supervisor ratings was apparent in this study; however, in contrast with earlier observations in which this was not directly linked to portfolio engagement (Goodyear, Bindal and Wall, 2013), the findings indicate a negative impact when supervisors focus on quantity rather than quality. Supervisors who appear to value trainee learning and offer formative rather than “evaluative” feedback (Dickinson, 1995) positively influence engagement. Sometimes supervisors may project a negative perception of the e-portfolio, with one trainee suggesting that the burden of e-portfolio upon supervisors is often transferred to trainees. This echoes previous findings (Hrisos, Illing and Burford, 2008).

The findings of this study mirror those of others (Grant et al., 2006; Kjaer, Maagaard and Wied, 2006) in that time, competing demands and the proximity of summative examinations all influence engagement. However, this study places these conditions within the context of an investment decision and therefore provides insight into the value-based judgements that trainees make regarding e-portfolio engagement. Previous observations that engagement correlates with the perceived interest of individual cases (Driessen et al., 2005) are also evident here, with many trainees suggesting that routine clinical contact provided minimal impetus for reflective log writing.

Trainees who value experiential learning and view e-portfolio primarily as a tool for learning and development suggest that engagement would continue in the absence of summative assessment. Paradoxically, those perceiving it as a performance management tool express the opinion that, if unassessed, engagement would cease. However, summative assessment also appears demoralising, owing to a sense of autocracy. This, coupled with mistrust in how input is interpreted, produces superficial engagement. Process curricular models in medical education, whereby learning is assessment-driven (Fish and Coles, 2005), should align with objectives and instructional activities to avoid under-recognition of teaching and learning (Anderson, 2002). This, combined with the ramifications of summative assessment for qualification (Thompson et al., 2017), may explain this phenomenon.

“Deciding upon investment worth” is identified in this study as the main influence upon e-portfolio engagement. This has previously been recognised within the context of interprofessional learning, where investment of personal resources appeared dependent upon “…perceived value of specific learning processes and outcomes” (Green, 2013), and in online learning, where the “propensity to study” was influenced by its value and associated costs (Scott, 2007). Braskamp (1986) described “Personal Investment Theory” amongst university students, suggesting that “personal investment”, defined as behaviour and choice to engage in specific activities, was influenced by “personal meaning” constructed through pre-existing understanding and context (Burton, 2012). This resonates with the socially constructed understanding of e-portfolio identified in this study.

The “Theory of Planned Behaviour” (Ajzen, 1991) offers an explanation of observable behaviours. The main determinants of an intended behaviour are thought to be “attitudes towards that behaviour”, “subjective norms” and “perceived behavioural control” (the ease or difficulty of executing the behaviour) (Archer et al., 2008). The “Theory of Planned Behaviour” has been used to explain e-portfolio acceptance using a devolved model (Ahmed and Ward, 2016) in which “attitudes and behaviour” related to perceived ease of use, usefulness and compatibility with learning preferences, “subjective norms” to superior and peer influences, and “perceived behavioural control” to facilitating conditions and self-efficacy. Many of these findings echo those underpinning this theory of e-portfolio engagement.
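
Schematically, and as commonly rendered in the wider literature rather than quoted from Ajzen (1991), the behavioural intention BI can be written as a weighted combination of these three determinants:

\[ BI = w_1\,A_B + w_2\,SN + w_3\,PBC \]

where A_B is the attitude towards the behaviour, SN the subjective norm and PBC perceived behavioural control, with empirically derived weights w1–w3 that vary between behaviours and populations. In the devolved model of Ahmed and Ward (2016), each term is populated as described above.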

Implications for Practice

A summary of recommendations is shown below.

Table 1 Summary of Recommendations

Conceptualising the E-portfolio

  • Hospital clinical supervisors: provide e-portfolio training at faculty group meetings to develop a shared consensus on purpose and implementation
  • Training programme: increase the time spent on e-portfolio at induction and throughout the programme; harness peer support by providing formal sessions and encouraging social networking

Developing and Maintaining Trust

  • Hospital clinical supervisors: highlight the importance of positive attitudes towards e-portfolio
  • Educational supervisors: calibration and capacity building during trainers’ workshops
  • LETB: uncover the conditions that have led to differing regional expectations

Deciding Upon Investment Worth

  • Hospital clinical supervisors: practise assigning professional competencies at faculty group meetings
  • Educational supervisors: encourage constructive formative rather than evaluative feedback
  • RCGP: discuss making the components of the e-portfolio more flexible

Hospital Clinical Supervisors

Clinical supervisors should receive regular training on implementing e-portfolio effectively, assigning professional competencies to log entries, and identifying when and how to undertake WPBA, and should be encouraged to be pro-active in initiating WPBA for trainees. The importance of supervisors espousing the values of e-portfolio should be highlighted.

Educational Supervisors

Issues regarding evaluative feedback and varying expectations should be fed back to GP trainers at local trainers’ workshops. To avoid paradoxical disengagement, educators’ notes should be qualitative rather than quantitative. During workshops, activities could be undertaken to calibrate individual trainers’ standards, for example, analysing portions of anonymised e-portfolios.

General Practice Specialty Training Programme

Shared understandings around implementation and expectations are needed amongst stakeholders. This may be achieved through consolidated induction, supplemented with written guidance and facilitated support sessions within the half-day release programme to help those unable to attend. Peer support may be encouraged by providing formalised opportunities during protected half-day educational sessions to draw on shared reflective practices that directly deliver on one element of e-portfolio, and by encouraging the use of social networking outside these sessions to enhance knowledge and understanding.

LETB

These findings suggest that differing e-portfolio expectations between LETBs negatively influence engagement. It is unclear why these discrepancies exist, and this should be discussed between LETBs to identify potential policy changes or whether greater transparency is needed.

RCGP

The RCGP should discuss the (in)flexibility of e-portfolio so that potential changes, specifically to the learning log section of the platform, can be considered. A less restrictive reflective space with prompts, rather than mandatory fields, may enhance engagement.

Implications for Future Research

This study finds that supervisor behaviours influence engagement. Understanding the reasons for varying supervisor expectations may produce strategies for decreasing variance. Exploration of the conditions influencing supervisors’ expertise in, and attitudes towards, e-portfolio may identify systematic and cultural barriers to be addressed. Identifying the factors which promote quantitative over qualitative supervisor feedback is important if the latter is to be encouraged.

A large area of contention identified is the differing e-portfolio requirements between regions. Several trainees discovered this information informally, and speculation abounds as to the reasons for it. In the interests of transparency and democracy, a full realist evaluation of e-portfolio implementation and outcomes is suggested.

Conclusion

The value attributed to the e-portfolio was contingent upon trainees’ conceptualisation of its purpose, and the trustworthiness of the learning and assessment processes prescribed by its structure. This has implications for trainees, supervisors, training programmes, LETBs and the RCGP. These implications concern implementation and ownership of the e-portfolio, and the credibility and transparency of its role in the assessment of professional performance.

Take Home Messages

  • Portfolios are described within the medical literature as having formative and summative purposes.
  • Engagement is highly variable with limited evidence of why this happens.
  • This study provides empirical evidence of barriers to engagement, conceptualised as an investment decision.
  • Investing personal resources depends upon individuals’ conceptualisation of portfolios and the extent to which they trust the learning and assessment processes embedded within them.

Notes On Contributors

Dr Jonathan Rouse has been a general practitioner for the past 14 years in England. During this time, he has helped to educate medical students, local foundation doctors and latterly general practice trainees. For the past 5 years he has worked as a general practice training programme director.

Dr Christopher Green is programme lead for the MSc in Medical and Clinical Education (MaCE) in the School of Health and Social Care at the University of Essex. He sits on the Best Evidence in Medical Education (BEME) Board and is associate editor for the Journal of Interprofessional Care.

Acknowledgements

The authors would like to extend thanks to the trainees on the local training programme who agreed to participate in this research.

Bibliography/References

Ahmed, E. and Ward, R. (2016) ‘Analysis of factors influencing acceptance of personal, academic and professional development e-portfolios.’ Computers in Human Behavior, 63, pp. 152–161. https://doi.org/10.1016/j.chb.2016.05.043

Ajzen, I. (1991) ‘The theory of planned behavior,’ Organizational Behavior and Human Decision Processes, 50(2), pp. 179–211. https://doi.org/10.1016/0749-5978(91)90020-T

Alemu, G., Stevens, B., Ross, P. and Chandler, J. (2015) The Use of a Constructivist Grounded Theory Method to Explore the Role of Socially-Constructed Metadata (Web 2.0) Approaches, Qualitative and Quantitative Methods in Libraries. Available at: http://www.qqml.net/papers/September_2015_Issue/433QQML_Journal_2015_Alemu_517-540.pdf (Accessed: January 3, 2017).

Anderson, L. W. (2002) ‘Curricular Alignment: A Re-Examination,’ Theory Into Practice, 41(4), pp. 255–260. https://doi.org/10.1207/s15430421tip4104_9

Archer, R., Elder, W., Hustedde, C., Milam., et al. (2008) ‘The theory of planned behaviour in medical education: a model for integrating professionalism training,’ Medical Education, 42(8), pp. 771–777. https://doi.org/10.1111/j.1365-2923.2008.03130.x

Austin, C. and Braidman, I. (2008) ‘Support for portfolio in the initial years of the undergraduate medical school curriculum: what do the tutors think?,’ Medical Teacher, 30(3), pp. 265–271. https://doi.org/10.1080/01421590701758673

Belcher, R., Jones, A., Smith, L-J., Vincent, T., et al. (2014) ‘Qualitative study of the impact of an authentic electronic portfolio in undergraduate medical education,’ BMC Medical Education, 14(1). https://doi.org/10.1186/s12909-014-0265-2

Braskamp, L. A. (1986) ‘Chapter 2: Applying Personal Investment Theory to Better Understand Student Development’, in Maehr, M. L. (ed.) The Motivation Factor: A Theory of Personal Investment. 1st edn. Lexington: Lexington Books, pp. 21–37.

Burton, D. (2012) Personal Investment Theory, SlideServe. Available at: https://www.slideserve.com/erelah/personal-investment-theory (Accessed: May 7, 2017).

Charmaz, K. (2006) Constructing Grounded Theory: A Practical Guide Through Qualitative Analysis. Los Angeles: Sage Publications.

Cotton, A. H. (2001) ‘Private thoughts in public spheres: issues in reflection and reflective practices in nursing,’ Journal of Advanced Nursing, 36(4), pp. 512–519. https://doi.org/10.1046/j.1365-2648.2001.02003.x

Dickinson, L. (1995) ‘Autonomy and motivation a literature review,’ System, 23(2), pp. 165–174. https://doi.org/10.1016/0346-251X(95)00005-5

Driessen, E. W., Tartwijk, J. V., Overeem, K., Vermunt, J. D., et al. (2005) ‘Conditions for successful reflective use of portfolios in undergraduate medical education,’ Medical Education, 39(12), pp. 1230–1235. https://doi.org/10.1111/j.1365-2929.2005.02337.x

Driessen, E., Tartwijk, J. V., Vleuten, C. V. D. and Wass, V. (2007) ‘Portfolios in medical education: why do they meet with mixed success? A systematic review,’ Medical Education, 41(12), pp. 1224–1233. https://doi.org/10.1111/j.1365-2923.2007.02944.x

Dyer, C. and Cohen, D. (2018) ‘How should doctors use e-portfolios in the wake of the Bawa-Garba case?,’ British Medical Journal. https://doi.org/10.1136/bmj.k572

Fish, D. and Coles, C. (2005) ‘Chapter 2: The Practice of Curriculum Design: its Principles, Processes, Components and Logic’, in Fish, D. and Coles, C. Medical Education: Developing a Curriculum for Practice. 1st edn. Maidenhead, England: Open University Press, pp. 29–55.

Goodyear, H., Wall, D. and Bindal, T. (2013) ‘Annual review of competence: trainees’ perspective,’ The Clinical Teacher, 10(6), pp. 394–398. https://doi.org/10.1111/tct.12040

Grant, A., Kinnersley, P., Metcalf, E., Pill, R., et al. (2006) ‘Students’ views of reflective learning techniques: an efficacy study at a UK medical school,’ Medical Education, 40(4), pp. 379–388. https://doi.org/10.1111/j.1365-2929.2006.02415.x

Gray, L. (2008) Effective Practice with ePortfolios: Supporting 21st Century Learning. Available at: https://www.webarchive.org.uk/wayback/archive/20140615090512/http://www.jisc.ac.uk/media/documents/publications/effectivepracticeeportfolios.pdf (Accessed: January 29, 2016).

Green, C. (2013) ‘Relative distancing: A grounded theory of how learners negotiate the interprofessional,’ Journal of Interprofessional Care, 27(1), pp. 34–42. https://doi.org/10.3109/13561820.2012.720313

Green, J. (2007) ‘Chapter 7. The Use of Focus Groups in Research into Health,’ in Saks, M. and Allsop, J. (eds) Researching Health: Qualitative, Quantitative and Mixed Methods. 1st edn. London: Sage, pp. 112–132.

Health Education East of England (2015) Your Learning Log. Available at: https://heeoe.hee.nhs.uk/welwyn_learning_log (Accessed: November 1, 2016).

Hrisos, S., Illing, J. C. and Burford, B. C. (2008) ‘Portfolio learning for foundation doctors: early feedback on its use in the clinical workplace,’ Medical Education, 42(2), pp. 214–223. https://doi.org/10.1111/j.1365-2923.2007.02960.x

Ingleby, E. (2015) ‘The house that Jack built: neoliberalism, teaching in higher education and the moral objections,’ Teaching in Higher Education, 20(5), pp. 518–529. https://doi.org/10.1080/13562517.2015.1036729

King, A. (2013) ‘A trainees guide to surviving ePortfolio,’ Clinical Medicine, 13(4), pp. 367–369. https://doi.org/10.7861/clinmedicine.13-4-367

Kjaer, N. K., Maagaard, R. and Wied, S. (2006) ‘Using an online portfolio in postgraduate training,’ Medical Teacher, 28(8), pp. 708–712. https://doi.org/10.1080/01421590601047672

Lakasing, E. (2013) ‘Formative assessments in medical education: are excessive, and erode the learning and teaching experience,’ British Journal of General Practice, 63(608), p. 145. https://doi.org/10.3399/bjgp13X664298

McMullan, M., Endacott, R., Gray, M. A., Jasper, M., et al. (2003) ‘Portfolios and Assessment of Competence: A Review of the Literature,’ Journal of Advanced Nursing, 41(3), pp. 283–294. https://doi.org/10.1046/j.1365-2648.2003.02528.x

Norcini, J. J. (2003) ‘ABC of learning and teaching in medicine: Work based assessment,’ British Medical Journal, 326(7392), pp. 753–755. https://doi.org/10.1136/bmj.326.7392.753

Osborne, P. and Bal, B. (2013) ‘Registrar feedback on ‘Formative assessments in medical education,’ British Journal of General Practice, 63(612). https://doi.org/10.3399/bjgp13X669121

Palinkas, L. A., Horwitz, S. M., Green, C. A., Wisdom, J. P., et al. (2015) ‘Purposeful Sampling for Qualitative Data Collection and Analysis in Mixed Method Implementation Research,’ Administration and Policy in Mental Health and Mental Health Services Research, 42(5), pp. 533–544. https://doi.org/10.1007/s10488-013-0528-y

Pitts, J., Coles, C. and Thomas, P. (1999) ‘Educational portfolios in the assessment of general practice trainers: reliability of assessors,’ Medical Education, 33(7), pp. 515–520. https://doi.org/10.1046/j.1365-2923.1999.00445.x

Rethans, J.-J., Norcini, J. J., Baron-Maldonado, M., Blackmore, D., et al. (2002) ‘The relationship between competence and performance: implications for assessing practice performance,’ Medical Education, 36(10), pp. 901–909. https://doi.org/10.1046/j.1365-2923.2002.01316.x

Rimmer, A. (2017) ‘Health Education England asks doctors for input on ARCPs,’ British Medical Journal. https://doi.org/10.1136/bmj.j3868

Roberts, C., Newble, D. I. and O’Rourke, A. J. (2002) ‘Portfolio-based assessments in medical education: are they valid and reliable for summative purposes?,’ Medical Education, 36(10), pp. 899–900. https://doi.org/10.1046/j.1365-2923.2002.01288.x

Ross, S., Maclachlan, A. and Cleland, J. (2009) ‘Students’ attitudes towards the introduction of a Personal and Professional Development portfolio: potential barriers and facilitators,’ BMC Medical Education, 9(1). https://doi.org/10.1186/1472-6920-9-69

Royal College of General Practitioners (2007) A Brief Guide to Workplace Based Assessment in the nMRCGP. Available at: http://www.rcgp.org.uk/training-exams/training/mrcgp-workplace-based-assessment-wpba.aspx (Accessed: November 12, 2016).

Royal College of General Practitioners (2016a) Trainee ePortfolio. Available at: http://www.rcgp.org.uk/training-exams/training/mrcgp-trainee-eportfolio.aspx (Accessed: November 1, 2016).

Royal College of General Practitioners (2016b) The ePortfolio for GP Specialty Training (Including WPBA Guidance): A Guide for Trainees. Available at: http://www.rcgp.org.uk/training-exams/training/mrcgp-trainee-eportfolio.aspx (Accessed: November 1, 2016).

Rughani, A. (2008) ‘Workplace-based assessment and the art of performance,’ British Journal of General Practice, 58(553), pp. 582–584. https://doi.org/10.3399/bjgp08X319783

Ruiz, J. G., Qadri, S. S., Karides, M., Castillo, C., et al. (2009) ‘Fellows Perceptions of a Mandatory Reflective Electronic Portfolio in a Geriatric Medicine Fellowship Program,’ Educational Gerontology, 35(7), pp. 634–652. https://doi.org/10.1080/03601270902877360

Sadler, D. R. (1989) ‘Formative assessment and the design of instructional systems,’ Instructional Science, 18(2), pp. 119–144. https://doi.org/10.1007/BF00117714

Saldana, J. (2016) ‘Chapter 5: Second Cycle Coding Methods,’ in Saldana, J. The Coding Manual for Qualitative Researchers. 3rd edn. London: Sage, pp. 233–268.

Sbaraini, A., Carter, S. M., Evans, R. W. and Blinkhorn, A. (2011) ‘How to do a grounded theory study: a worked example of a study of dental practices,’ BMC Medical Research Methodology, 11(1). https://doi.org/10.1186/1471-2288-11-128

Scott, H. (2007) The Temporal Integration of Connected Study into a Structured Life, Grounded Theory Review RSS. Available at: http://groundedtheoryreview.com/2007/03/30/1135/ (Accessed: July 13, 2017).

Snadden, D. (1999) ‘Portfolios - attempting to measure the unmeasurable?,’ Medical Education, 33(7), pp. 478–479. https://doi.org/10.1046/j.1365-2923.1999.00446.x

Snadden, D. and Thomas, M. L. (1998) ‘Portfolio learning in general practice vocational training - does it work?,’ Medical Education, 32(4), pp. 401–406. https://doi.org/10.1046/j.1365-2923.1998.00245.x

Thompson, J., Houston, D., Dansie, K., Rayner, T., et al. (2017) ‘Student and Tutor Consensus: A Partnership in Assessment for Learning,’ Assessment and Evaluation in Higher Education, 42(6), pp. 942–952. https://doi.org/10.1080/02602938.2016.1211988

Tochel, C., Haig, A., Hesketh, A., Cadzow, A., et al. (2009) ‘The effectiveness of portfolios for post-graduate assessment and education: BEME Guide No 12,’ Medical Teacher, 31(4), pp. 299–318. https://doi.org/10.1080/01421590902883056

Tomlinson, J. (2015) Don't judge me! Reflections on reflection, A Better NHS. Available at: https://abetternhs.net/2015/07/01/dont-judge-me-reflections-on-reflection (Accessed: November 22, 2016).

Van Der Vleuten, C. P. M. and Schuwirth, L. W. T. (2005) ‘Assessing professional competence: from methods to programmes,’ Medical Education, 39(3), pp. 309–317. https://doi.org/10.1111/j.1365-2929.2005.02094.x

Wilkinson, J. R., Crossley, J. G. M., Wragg, A., Mills, P., et al. (2008) ‘Implementing workplace-based assessment across the medical specialties in the United Kingdom,’ Medical Education, 42(4), pp. 364–373. https://doi.org/10.1111/j.1365-2923.2008.03010.x

Appendices

None.

Declarations

There are no conflicts of interest.
This has been published under Creative Commons "CC BY-SA 4.0" (https://creativecommons.org/licenses/by-sa/4.0/)

Ethics Statement

Ethical approval was given by the University of Essex (Ref. 16021) with permissions to involve GP specialist trainees from Southend University Hospital Trust and Health Education East of England.

External Funding

This paper has not had any External Funding

Reviews


Ian Wilson - (02/09/2018)
I agree with Masters and Gibbs in their reviews. I would particularly like to emphasise the problems associated with the transfer of results in one region to the entirety of the program. This is problematic in this paper where it is evident that the manner of implementation varies between regions.
As I work in Australia I had particular difficulty with the unexplained abbreviations. I presume the S in ST1 etc stands for Speciality. And I have no idea what LETB means.
In summary an interesting article of limited impact due to its focus on one region.
Ken Masters - (25/08/2018)
Good detailed description of the Grounded Theory process applied.

The major problem I have with the paper is trying to grasp the results, as the relationship between the quotations and the descriptions is really difficult to follow.

They begin by being laid out with quotation and then description.
• Firstly, it might be easier for the reader if this were reversed (as is fairly standard in most GT studies I have read). Otherwise, the quotation is read with almost no context, and then makes sense only when the description is read. In order to better understand the paper, I found myself skipping the quote, going down to the description, and then back up to the quote. And sometimes there are two quotes, and it is unclear how these relate to the descriptions.
• Secondly, the order then appears to shift, as a description is given, followed by a colon and then an illustrating quotation. As an example, the sub-section headed “Conceptualising the E-portfolio” begins with quote followed by description, but then ends with a final quote, so the reader does not know if there is a description missing, or if that quote relates to the description above it. It might have helped if the authors had put in extra spaces, to group description and quote/s, so that these could guide the reader.
• And sometimes, finding the connections between the descriptions and the quotations appeared very obscure. This may be because the authors are deeply familiar with the original material, and the connections are obvious to them. The connections, however, are not always obvious to the reader. For example, in this set, it is almost impossible to determine the relationship between description and quotation.

---

[Description]
Trust in supervisors was also influenced by the perceived ability of supervisors to assign relevant competences to log entries.

[Quotation]
I find it easy just to make sure it’s short and not too over-analysing stuff as people often miss a lot of stuff in there and just tick one or two boxes. ST3(5)

[Description]
For trainees to engage they needed to trust that e-portfolio accurately represented their capabilities. Trainees were more trusting of judgements made by those familiar with their daily practice but raised concerns regarding judgements made by external assessors.

[Quotation]
…they can use the logs…they can use the mini-CEXs and CBDs, but I still don’t think that it’s an accurate reflection. ST1(1)

----

In cases like this, it might have been useful if the authors had edited the quotation with square brackets, making the designations of the pronouns clearer (e.g. they [the assessors/staff/students]).

So, I found myself often having to gloss over much of the Results section, trying to form a general impression only. This meant, in turn, that I had difficulty evaluating the Discussion, because I could not reasonably form an opinion on whether or not the Discussion was a true reflection of what was shown in the Results.

I would strongly recommend that, in the revised version of their paper, the authors:
• Clearly indicate the grouping of quotation and description.
• Preferably give description before quotation.
• Have their Results read by a person unfamiliar with the study and the original material, to ensure that the descriptions are accurately reflected in the quotations.


Small correction:
“eluded to this cost-benefit” should be “alluded to this cost-benefit”

Antony Willman - (18/08/2018)
There is so much that is good about this paper. As a GP Trainer myself, I recognise a lot of what is reported. The study is well constructed and raises some excellent points.

As a Trainer using the old structured trainers report, then transitioning to the e-Portfolio after a period not training, it is interesting to compare the formative vs summative role. This is brought out nicely. There is a certain amount of hoop jumping I think we all agree but this correlates with engagement.

A semi-structured interview was I think the correct way to go as opposed to focus group - it is a very personal thing and even in small groups, this may bar GPSTs telling the whole story.

It would be interesting to run a similar study on over-performing GPSTs or those who were in the top tranche of their MSRA to see if different themes emerge. Another group to possibly look at would be 1st or 2nd year First5 GPs. This might give some triangulation.

How many entries are enough is a perennial question which does seem to vary. The variation would directly impact 'investment worth' especially as it is a question often asked of me as an ESR. If I don't know the exact number, what does that say? Again this is brought out nicely.

Limiting it to one training programme may skew the results but somehow I doubt this based on the anecdotal experience of my own and my colleagues during discussions at Trainers Groups.

Lastly, some of the comments about CSR and ESR engagement are revealing. We attempt to calibrate on our own annual trainers' conference but I think this is brought out as an extremely important point. Methods to enable this as an ongoing formative process (possibly peer review) would be worth looking at. Local Trainers groups can help but may not provide enough rigor in calibration.
Trevor Gibbs - (17/08/2018)
Despite the wide acceptance of portfolios and e-portfolios, there still seems to be many questions asked of their value and effect. This paper digs into the e-portfolio story and provides some interesting findings.
I thought that the paper was well structured, very easy to read and with a very useful introductory section with very useful references.
I am not sure as to why the authors avoided a focus group approach, given that in my view focus groups allow for much more open and informative discussion, especially if run over time. Also, the low numbers of participants and the fact that they appear to come from the same training group may actually skew the results, dependent upon how the trainees were "taught" about e-portfolios and their uses. I wonder how many of the participants had used e-portfolios in their undergraduate years and how much "training / teaching" they had experienced during previous teaching activities.
I would very much agree with the authors' final bullet point, which suggests to me that for many situations the personal element of portfolios - exploring the inners of the student - is often not explored, with the faculty concentrating on both the summative element and the clinical portion.
Despite my reservations, I would recommend this paper to all those involved in developing portfolios at whatever level and in whatever form.