Research article
Open Access

Moving the Needle on our Med-Peds Residency Board Pass Rate

Siobhán O'Keefe[1], Mary Catherine Turner[2], Nathan Andrew Brinn[3], Dale Newton[2]

Institution: 1. Children's Hospital Colorado, University of Colorado, 2. East Carolina University & Vidant Health, 3. University of South Florida
Corresponding Author: Dr Siobhán O'Keefe ([email protected])
Categories: Assessment, Students/Trainees, Teaching and Learning
Published Date: 04/04/2018

Abstract

Introduction

The national pass rates for the American Board of Internal Medicine (ABIM) and the American Board of Pediatrics (ABP) Certification Exams (CE) have declined for combined internal medicine-pediatrics (Med-Peds) residents. The lower board pass rate is especially evident in smaller institutions like our own.

Methods

We instituted a board preparation assistance program in which each resident is stratified into a color zone according to their in-training exam (ITE) scores.

Results

The ITE results of 67 residents from 12 classes were analyzed. 65.7% were graduates of medical schools within the United States (US) and 85.1% were allopathic graduates. Mean scores for USMLE step 1 and step 2 were 213 and 217 respectively. Mean ITE scores increased in each post-graduate year.

Our first-time ABIMCE pass rate remained essentially constant at 73% before the intervention and 75% after, with 84% of residents ever passing the ABIMCE prior to the intervention and 81% after. Our first-time ABPCE pass rate improved markedly from 46% to 75%, and our overall ABPCE pass rate improved from 57% to 93% (p=0.0121). On multivariate analysis there was no association between passing either the ABIMCE or ABPCE and any of our measured variables.

Conclusions

It is encouraging that our ITE scores and ABPCE and ABIMCE pass rates are improving, although this seems to be due to factors other than our intervention. Other institutions that struggle with poor board pass rates may be interested in our approach.

Keywords: MedPeds; Boards

Introduction

In 2002 it was reported that 88% of Med-Peds residents who graduated from Med-Peds residency programs between 1994 and 1998 had taken the American Board of Internal Medicine certifying exam (ABIMCE) and 86% had taken the American Board of Pediatrics certifying exam (ABPCE). Of these residents, the pass rate for the ABIMCE was 97% and the pass rate for the ABPCE was 96%. A total of 79% of all Med-Peds residents from the programs in the survey had passed both examinations and were dual certified 1. Over the past decade or so there has been a gradual decline in the national pass rates of both the ABIMCE and the ABPCE for all residents. Unfortunately, the national pass rate of combined medicine-pediatrics residents on both exams is lower than that of categorical residents nationwide. This disparity seems to have disproportionately affected smaller underserved institutions like our own 2-3.

Prior to this study the combined medicine-pediatrics program at our institution had a worrisomely low pass rate for both boards. Between 2007 and 2011 the national first-time pass rate for the ABIMCE was 88% 2. Our categorical IM residency program had a pass rate similar to this national standard at 87%, whereas over the same period our combined Med-Peds residency had an ABIMCE pass rate of 69% 2. The national pass rate for the ABPCE averaged 77% between 2007 and 2011 4. The overall first-time pass rate for the ABPCE at our institution was 66% 5. The 5-year pass rate for our combined medicine-pediatrics graduates seemed higher at 80%, but this figure needs adjustment to reflect a very low "sit" rate: only 60% of our eligible combined medicine-pediatrics residents actually took the ABPCE. Therefore, overall only 48% of all of our graduating board-eligible combined Med-Peds residents actually passed the ABPCE.

In 2007 Freed et al reported that approximately 11% of physicians who identify themselves as pediatricians in the US have never achieved board certification with the American Board of Pediatrics (ABP) 6. As more institutions and practices, and ultimately patients and payers, use board certification as a benchmark of qualification in a given specialty, our graduates would be at a significant disadvantage.

The faculty and program directors of our combined medicine-pediatrics program felt strongly that, in order to consider ourselves an educationally sound and successful residency program, we needed to attain a board pass rate at or above the national average and preferably as close to 100% as possible. Therefore, we decided to create a new approach to board preparation for our combined Med-Peds residents.

Several factors are associated with success on national board exams. Ranking within the top one third of one's medical school class, Alpha Omega Alpha status and a mandatory research year in a surgical residency were associated with an increased likelihood of passing the American Board of Surgery certification exam, whereas a delay in taking the exam was associated with failure 7-9. Self-directed reading of an electronic knowledge resource, younger age, and international medical school graduation were significantly associated with Internal Medicine In-Training Examination (IM-ITE) performance in some studies, but their effect on success in the ABIMCE was not evaluated 10-11. Attendance at teaching conferences was associated with IM-ITE performance in some studies, but this was not corroborated by others 11-12.

United States Medical Licensing Examination (USMLE) scores have been shown to be predictive of passing national board exams in multiple specialties including emergency medicine, surgery, orthopedics, and internal medicine 7, 8, 13-16. USMLE scores <200 may be the most predictive of failure on national board exams, and in some cases the Step 2 score seemed more predictive than the Step 1 score 7, 8, 12, 16.

Residency ITE scores have also been found to be highly correlated with passing national board examinations in multiple specialties including surgery, ophthalmology, orthopedics, internal medicine and pediatrics 8, 14, 17-21. One study found that residents who scored below the 35th percentile had an 83% probability of failing the ABIMCE 20. Another study showed that a raw score of 49 or below predicted failure on the ABIMCE 21. In pediatrics, one study showed that residents scoring 410 or above on their third-year pediatrics ITE had a 96% chance of passing the boards on the first attempt, whereas those scoring 250 or less during the PGY-3 year had less than a 50% chance 19.

Several residency programs have instituted formal programs to help increase their board pass rates. One surgical residency instituted weekly assigned reading followed by weekly examinations, and another administered brief practice exams periodically throughout the year; both interventions had positive effects on their national board exam pass rates 22-23. An internal medicine residency program instituted a 12-month intervention of pre- and post-tests for each clinical rotation, coupled with closed-book board review testing. After exposure to this intervention the study group had a statistically significant increase in their percentile scores on the ITE. That residency program was participating in the internal medicine Educational Innovation Project (EIP), which gave it the flexibility to conduct the study during a 12-month block of ambulatory months and inpatient and outpatient electives without night call 24. A pediatrics residency program in Brooklyn instituted a weekly email-based board review course and found that the total number of questions answered had a significant positive correlation with standardized board scores 25. In the same residency program, a board review course was found to increase standardized board examination scores and slightly increase the passing rate among at-risk residents (i.e. residents with ITE scores <200) 26.

Having reviewed the approaches tried at other institutions, we created a board preparation assistance program specific to our Med-Peds residents.

Methods

We instituted a board preparation assistance program in 2012.  Each senior resident was stratified into a color zone depending on their Internal Medicine ITE (IM-ITE) and Pediatrics ITE (Ped-ITE) scores.

For internal medicine, each resident receives a score with a percentile that compares them to other combined medicine-pediatrics residents nationally, so the internal medicine plans assigned to each resident were decided by their percentile scores. For pediatrics no percentile scores are assigned, and the research describing the predictive value of ITE scores is based on raw scores; the pediatrics color zone was therefore designated according to the resident's raw Ped-ITE score. The ABP changed the ITE reporting scale in 2012 from a 0-800 standardized score to a 0-200 standardized score. In order to compare the performance of our current residents to residents in years past, we converted all of the old standardized scores to the new standardized scores according to a conversion supplied to us by the ABP.
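As a minimal sketch, the stratification logic can be expressed as follows. The cutoffs shown are hypothetical placeholders: the actual zone thresholds are defined in Figures 1 and 2 and are not reproduced in this text.

```python
# Illustrative sketch only. The IM zones use national Med-Peds percentiles
# and the pediatrics zones use raw Ped-ITE scores (0-200 scale), as in the
# text; every numeric cutoff below is a hypothetical placeholder, NOT the
# program's actual threshold.

def im_zone(ite_percentile):
    """Assign a color zone from the IM-ITE national Med-Peds percentile."""
    if ite_percentile >= 70:   # hypothetical cutoff
        return "green"
    if ite_percentile >= 50:   # hypothetical cutoff
        return "yellow"
    if ite_percentile >= 35:   # hypothetical cutoff
        return "orange"
    return "red"

def peds_zone(ite_raw):
    """Assign a color zone from the raw Ped-ITE score (0-200 scale)."""
    if ite_raw >= 150:         # hypothetical cutoff
        return "green"
    if ite_raw >= 125:         # hypothetical cutoff
        return "yellow"
    if ite_raw >= 100:         # hypothetical cutoff
        return "orange"
    return "red"

print(im_zone(40), peds_zone(160))  # prints: orange green
```

The point of the separation is simply that the two exams report on different scales, so each gets its own zone function.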

Figures 1 and 2 summarize the initial remediation activities for our residents depending on their color zone. Our departments buy each resident Medical Knowledge Self-Assessment Program (MKSAP) books or online access, according to the resident's preference, as well as annual online access to the Pediatrics Review and Education Program (PREP). Residents were encouraged to read the review-book chapter relevant to their current rotation and to practice questions during the rotation. We organized some of the questions from the MKSAP and PREP resources into mini-tests tied to each clinical rotation that our combined medicine-pediatrics residents complete each year. Residents were required to score 75% on the mini-test to satisfactorily complete the rotation; each test was open book, had a 30-minute time limit, and could be taken up to 5 times if needed. Residents in the yellow, orange and red zones were emailed once every 1-2 months with a reminder of their color zone and their results on the rotation-specific mini-tests.

Residents at high risk of failing their boards based on their ITE scores were also strongly encouraged to follow a board remediation plan. This plan involved developing and maintaining an active study plan supervised by the program directors and participating in all categorical board preparation events available. As part of the study plan, we requested that these residents email us an update on their progress every 1-2 months.

Analyses

We compared the ITE scores and the final ABIMCE and ABPCE sit and pass rates in the years prior to implementation of our board assistance program with those after implementation. ITE scores were compared using two-sample t-tests and pass rates were compared using Chi-square tests. Logistic regression models were used to identify factors that may affect the likelihood of passing boards. We also used logistic regression models to compare the likelihood of passing boards before and after the intervention while controlling for possible confounding factors such as USMLE Step 1 score, USMLE Step 2 score, US or international medical school graduation, and first-year ITE score.
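Table 3 reports its pass-rate comparisons with Fisher's exact test. As an illustrative sketch (not the authors' actual analysis code), a two-sided Fisher's exact test on a 2x2 table can be computed with the Python standard library alone; the counts below are the ever-pass ABPCE figures from Table 3.

```python
# Sketch of a two-sided Fisher's exact test for a 2x2 table, using only
# the standard library. This is an illustration, not the authors' code.
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher's exact p-value for the table [[a, b], [c, d]]:
    sum the probabilities of all tables with the same margins whose
    probability does not exceed that of the observed table."""
    row1 = a + b                 # first-row total (e.g. pre-intervention n)
    col1 = a + c                 # first-column total (e.g. total who passed)
    n = a + b + c + d            # grand total

    def p_table(x):
        # Hypergeometric probability that cell (1,1) equals x
        return comb(col1, x) * comb(n - col1, row1 - x) / comb(n, row1)

    p_obs = p_table(a)
    lo = max(0, row1 - (n - col1))   # smallest feasible value of cell (1,1)
    hi = min(row1, col1)             # largest feasible value of cell (1,1)
    return sum(p_table(x) for x in range(lo, hi + 1)
               if p_table(x) <= p_obs * (1 + 1e-9))

# Ever-pass ABPCE counts from Table 3: 15/26 before vs 15/16 after
p = fisher_exact_two_sided(15, 11, 15, 1)
print(round(p, 4))
```

For these counts the sketch yields p of roughly 0.015, significant at the 0.05 level; exact values can differ slightly across implementations depending on how the two-sided p-value is defined.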

 

Figure 1. Initial Pathway for resident assignment to Internal Medicine board preparation assistance Program

 

 

Figure 2. Pathway for resident assignment to Pediatrics board preparation assistance Program

 

Ethics statement: This study was eligible for Exempt Certification by the East Carolina University and Medical Center Institutional Review Board under reference number UMCIRB 12-001675.

Results

The ITE results of 67 residents from 12 classes were analyzed. 65.7% were graduates of medical schools within the United States (US) and 85.1% were allopathic graduates. Mean scores for USMLE Step 1 and Step 2 were 213 and 217 respectively. Mean ITE scores for each post-graduate year were higher after the intervention, and this difference was statistically significant except for the PGY-1 IM-ITE and the PGY-4 Ped-ITE (Table 1). However, there was no significant difference in the change in scores from one post-graduate year to the next, including from the 1st year to the 4th year (Table 2).

 

Table 1. Mean ITE scores by post-graduate year, before and after the intervention

Year   Exam      Mean Score Before Intervention   Mean Score After Intervention   P-Value
1st    IM-ITE    48.79                            53.06                           0.0515
2nd    IM-ITE    54.64                            58.71                           0.0344
3rd    IM-ITE    58.21                            64.06                           0.0300
4th    IM-ITE    59.47                            69.24                           <0.0001
1st    Ped-ITE   114.91                           134.94                          0.0013
2nd    Ped-ITE   137.97                           148.65                          0.0499
3rd    Ped-ITE   150.61                           165.12                          0.0146
4th    Ped-ITE   166.42                           175.53                          0.2686

 

Table 2. Mean change in ITE score between post-graduate years

Between Postgraduate Years   Exam      Mean Change Before Intervention   Mean Change After Intervention   P-Value
1st Year to 2nd Year         IM-ITE    6.13                              7.59                             0.3743
2nd Year to 3rd Year         IM-ITE    4.53                              4.18                             0.8445
3rd Year to 4th Year         IM-ITE    2.91                              4.53                             0.3183
1st Year to 4th Year         IM-ITE    14.05                             16.76                            0.1744
1st Year to 2nd Year         Ped-ITE   25.41                             15.59                            0.0988
2nd Year to 3rd Year         Ped-ITE   15.01                             11.50                            0.4601
3rd Year to 4th Year         Ped-ITE   20.27                             6.46                             0.0618
1st Year to 4th Year         Ped-ITE   58.92                             44.93                            0.1968

 

Our first-time ABIMCE pass rate remained relatively constant at 73% before the intervention and 75% after, and similarly our ever-pass ABIMCE rate remained at best stagnant, with 84% passing prior to the intervention and 81% after. Our first-time ABPCE pass rate improved markedly from 46% to 75%, and our overall ABPCE pass rate improved from 57% to 93%; this improvement was statistically significant (p=0.0121).

 

Table 3. Board pass rates before and after the intervention

                       Before Intervention   After Intervention   P-Value (Fisher's exact test)
1st time pass ABIMCE   19/26 (73.0%)         12/16 (75.0%)        1.000
Ever pass ABIMCE       22/26 (84.6%)         13/16 (81.25%)       1.000
1st time pass ABPCE    12/26 (46.1%)         12/16 (75.0%)        0.1087
Ever pass ABPCE        15/26 (57.7%)         15/16 (93.7%)        0.0121

 

On univariate analysis, first-year ITE scores and USMLE Step 1 and Step 2 scores were associated with the first-time pass rates of the ABIMCE and ABPCE and the overall pass rate of the ABPCE. Only ITE-1 and USMLE Step 2 scores were associated with the overall pass rate of the ABIMCE. None of these associations remained when variables were controlled for on multivariate analysis. There was no association between being a US graduate and board pass rates. Our intervention appeared to be associated with the likelihood of passing the ABPCE on univariate analysis, but again this association disappeared once other variables were controlled for on multivariate analysis.

 

Table 4. Univariate analysis: likelihood of passing boards

Passing ABIMCE First Time
Variable       Estimate   P-Value   Odds Ratio   95% CI
ITE-1          0.2960     0.0081    1.344        1.080-1.674
USMLE-1        0.1013     0.0092    1.107        1.025-1.194
USMLE-2        0.0991     0.0143    1.104        1.020-1.195
US Grad        0.1386     0.6958    1.319        0.329-5.295
Intervention   0.0500     0.8905    1.105        0.266-4.597

Ever Passing ABIMCE
Variable       Estimate   P-Value   Odds Ratio   95% CI
ITE-1          0.3270     0.0152    1.387        1.065-1.806
USMLE-1        0.0511     0.1312    1.052        0.985-1.125
USMLE-2        0.1001     0.0458    1.105        1.002-1.219
US Grad        0.4069     0.3327    2.256        0.435-11.708
Intervention   -0.1192    0.7766    0.788        0.152-4.088

Passing ABPCE First Time
Variable       Estimate   P-Value   Odds Ratio   95% CI
ITE-1          0.0354     0.0234    1.036        1.005-1.068
USMLE-1        0.0952     0.0035    1.100        1.032-1.173
USMLE-2        0.0630     0.0102    1.065        1.015-1.118
US Grad        -0.0578    0.8560    0.891        0.256-3.102
Intervention   0.6264     0.0729    3.500        0.890-13.764

Ever Passing ABPCE
Variable       Estimate   P-Value   Odds Ratio   95% CI
ITE-1          0.0385     0.0273    1.039        1.004-1.075
USMLE-1        0.0733     0.0165    1.076        1.013-1.143
USMLE-2        0.0744     0.0133    1.077        1.016-1.143
US Grad        0.0345     0.9208    1.071        0.275-4.176
Intervention   1.1989     0.0302    10.999       1.258-96.198

 

Table 5. Multiple logistic regression model: likelihood of passing boards

Passing ABIMCE First Time
Variable       Estimate   P-Value   Odds Ratio   95% CI
ITE-1          0.2454     0.0261    1.278        0.990-1.650
USMLE-1        0.0730     0.3072    1.076        0.935-1.237
USMLE-2        0.0275     0.6834    1.028        0.901-1.173
US Grad        -0.2578    0.7284    0.597        0.033-10.967
Intervention   -0.2743    0.6814    0.578        0.042-7.929

Ever Passing ABIMCE
Variable       Estimate   P-Value   Odds Ratio   95% CI
ITE-1          0.5222     0.0650    1.686        0.968-2.935
USMLE-1        -0.1636    0.2363    0.849        0.648-1.113
USMLE-2        0.3330     0.2007    1.395        0.838-2.324
US Grad        -1.6823    0.1940    0.035        0.001-5.541
Intervention   -1.0107    0.4110    0.132        0.001-16.408

Passing ABPCE First Time
Variable       Estimate   P-Value   Odds Ratio   95% CI
ITE-1          0.0122     0.6332    1.012        0.963-1.064
USMLE-1        0.0901     0.0808    1.094        0.989-1.211
USMLE-2        0.0574     0.2335    1.059        0.964-1.164
US Grad        -0.0673    0.9186    0.874        0.066-11.525
Intervention   0.4674     0.4043    2.547        0.283-22.915

Ever Passing ABPCE
Variable       Estimate   P-Value   Odds Ratio   95% CI
ITE-1          0.0154     0.5985    1.015        0.959-1.075
USMLE-1        0.0424     0.5241    1.043        0.916-1.189
USMLE-2        0.1396     0.1092    1.150        0.969-1.364
US Grad        -0.6805    0.4583    0.256        0.007-9.357
Intervention   1.3187     0.1331    13.977       0.448-436.545

Discussion

It is encouraging that our ITE scores and board pass rates have improved compared with the pre-intervention period, although this seems to be due to factors not related to our intervention, such as the prior academic success of the residents in the intervention group. Interventions at other residency programs have also demonstrated improved board pass rates 22-26. However, to our knowledge this is the only board preparation assistance program described in the literature that specifically targets medicine-pediatrics residents. Our intervention is also somewhat unusual in that we focused its energy on "at risk" residents. Given the limited resources at underserved institutions such as our own, and limited available faculty time, we feel that focusing on "at risk" residents is the most sensible use of time and energy.

While other authors have shown a strong association between ITE scores and USMLE results and board pass rates 7, 8, 14, 17-21, the associations found in our study did not remain when controlled for other variables. We did not demonstrate any relationship between board pass rate and US versus international graduate status or allopathic versus osteopathic status, which may be reassuring to international and osteopathic graduates and to programs that match residents from these groups.

Limitations to our study included the following:

  • Our sample size was small; this could be improved by many more years of implementation and study.
  • The ABP changed the standardized scoring system during our study. The ABP provided a chart so that old standardized scores could be converted to the new scale. The old standardized score of zero converted to a wide range on the new scale, so we consistently used the upper quartile of that range; we felt this was less likely to skew the data than choosing the median or the upper limit of the range.
  • For the analysis of the effect of USMLE scores, we had to exclude osteopathic residents who had only COMLEX scores.
  • In the pre-intervention group, we assumed that a resident did not pass if they were not listed as certified on the ABP or ABIM website. Some residents may have chosen not to sit a particular board because of their area of subspecialty. Some residents in the post-intervention group are known not to have attempted one of the boards because they were pursuing subspecialty fellowship, but to be consistent in our data recording we recorded these results as failures. This may have negatively skewed the data, but we wanted to ensure consistency between the pre- and post-intervention groups. We also felt this approach was appropriate because our goal as a program is to have every resident sit and pass both the ABIMCE and ABPCE after graduation from our combined residency, regardless of their career choice.
  • We excluded one recent graduate from the analyses because of a one-year board exam deferral due to extenuating family circumstances.

Over the course of the intervention we observed that those who were ultimately successful on their boards were typically those who applied themselves to their studies early in residency. Our board preparation assistance program seemed to help those who would typically be self-motivated but perhaps needed a reminder of the urgency of the need to study; many residents came to us with study plans and began emailing us monthly with their progress. Those who did not benefit were often residents who under-performed in other areas and could not seem to self-motivate. We had hoped that drawing attention to their position in the red or yellow zone would be enough on its own, but residents who procrastinated over their studying tended to continue doing so regardless. Although considerable passive motivation was applied by the program directors, in the form of regular emails reminding residents of their scores on the online tests and one-on-one meetings emphasizing the importance of studying, ultimately there was no hard stop for those who opted out of meaningful participation.

After considering this obstacle, in autumn 2015 we revised our board preparation policy. We met with our categorical pediatric colleagues, tried to standardize the approach between the two programs, and updated the policy to simplify it. The updated board policy can be seen in Appendix 1 and includes items such as mandating an academic assessment for those whose ITE scores are stagnating, mandating that every resident in the red zone email a study plan to the program directors, and mandating a minimum number of PREP and MKSAP questions completed on their online accounts by the end of the academic year. Tied to this, we have now decided that any resident who is in the red zone and not meaningfully engaged in the board preparation assistance program will receive a formal letter of notification; a sample of this letter is in the Appendices.
The letter notes that if they do not demonstrate meaningful engagement in the program by their next semi-annual evaluation they will be placed on academic remediation without extension of the length of residency. If they continue to fail to participate meaningfully in the board preparation assistance program, academic remediation with extension of residency would be considered. Of note, the decision to issue a letter of notification is based solely on the resident's participation or meaningful engagement in the board preparation assistance program, not on their ITE scores; a resident who has been actively participating in the preparation program but remains in the red zone would not receive a letter. Since introducing this policy we have noticed an increase in participation among the few residents who were having difficulty motivating themselves to improve their study habits.

Future steps include continuing to collect and analyze the data annually, especially after the implementation of the new policy in 2015. During the course of the study it also became apparent that personal and family circumstances anecdotally seemed to influence board success rates. Further areas of study could include analyses of the effect of major life events during residency, such as pregnancy, on the likelihood of passing boards.

Conclusion

We successfully implemented a board preparation assistance program specifically tailored for medicine-pediatrics residents. The ITE results and the likelihood of passing the boards are improving. As we continue to implement the more robust board preparation assistance program we hope to see continued improvement every year. Other institutions that struggle with poor board pass rates may be interested in our approach.

Take Home Messages

Notes On Contributors

Siobhán O'Keefe was a clinical assistant professor and the associate MedPeds program director at East Carolina University during the time of the study. She is currently completing a pediatric critical care fellowship at Children's Hospital Colorado.

Mary Catherine Turner was a faculty member during the time of the study and is the current MedPeds program director at East Carolina University.

Nathan Andrew Brinn was the MedPeds program director at East Carolina University at the time of the study and is currently a clinical associate professor at the University of South Florida.

Dale Newton was a clinical professor and an associate program director of the MedPeds residency program at East Carolina University at the time of the study. He is currently retired and enjoying time with his family.

Acknowledgements

The authors would like to thank all of the MedPeds residents we have had the pleasure of working with at East Carolina University over the years and during the period of the study. Your enthusiasm, work ethic, warmth and camaraderie made it a pleasure to work with you.

Bibliography/References

1. Frohna JG, Melgar T, Mueller C, Borden S. Internal medicine-pediatrics residency training: current program trends and outcomes. Acad Med 2004 Jun; 79(6):591-6

https://doi.org/10.1097/00001888-200406000-00018   

2. American Board of Internal Medicine Website. http://www.abim.org/pdf/pass-rates/residency-program-pass-rates.pdf. Accessed June 29th 2016   

3. The Medicine Pediatrics Program Director Association Website. http://www.im.org/Meetings/Past/2012/2012APDIMSpringConference/Presentations/Documents/MPPDA%20Meeting/Distinguished%20Lecture_Frohna.pdf Accessed June 29th 2016.   

4. American Board of Pediatrics Website. https://www.abp.org/sites/abp/files/pdf/gp_training_program_pass_rates.pdf. Accessed June 29th 2016.  

5. American Board of Pediatrics Website. https://www.abp.org/content/general-pediatric-training-program-pass-rates

6. Freed GL, Uren RL, Hudson EJ, Lakhani I; Research Advisory Committee of the American Board of Pediatrics. J Pediatr 2007 Jun; 150(6):645-8

https://doi.org/10.1016/j.jpeds.2006.12.053   

7. Shellito JL, Osland JS, Helmer SD, Chang FC. American Board of Surgery examinations: can we identify surgery residency applicants and residents who will pass the examinations on the first attempt? Am J Surg 2010 Feb: 199(2):216-22

https://doi.org/10.1016/j.amjsurg.2009.03.006   

8. De Virgilio C, Yaghoubian A, Kaji A et al. Predicting performance on the American Board of Surgery qualifying and certifying examinations: a multi-institutional study. Arch Surg 2010 Sep; 145(9):952-6

https://doi.org/10.1001/archsurg.2010.177   

9. Malagoni MA, Jones AT, Rubright J, Biester TW, Buyske J, Lewis FR Jr. Delay in taking the American Board of Surgery qualifying examination affects examination performance. Surgery 2012   

10. McDonald FS, Zeger SL, Kolars JC. Factors associated with medical knowledge acquisition during internal medicine residency. J Gen Intern Med 2007 Jul; 22(7):962-8.

https://doi.org/10.1007/s11606-007-0206-4   

11. McDonald FS, Zeger SL, Kolars JC. Associations of conference attendance with internal medicine in-training examination scores. Mayo Clin Proc 2008 Apr; 83(4):449-53

https://doi.org/10.4065/83.4.449   

12. Cacamese SM, Eubank KJ, Hebert RS, Wright SM. Conference attendance and performance on the in-training examination in internal medicine. Med Teach 2004 Nov; 26(7):640-4

https://doi.org/10.1080/09563070400005446   

13. Thundiyil JG, Modica RF, Silvertri S, Papa L. Do United States Medical Licensing Examinations (USMLE) scores predict in-training test performance for emergency medicine residents? J Emerg Med 2010 Jan; 38(1):65-9.

https://doi.org/10.1016/j.jemermed.2008.04.010   

14. Swanson DB, Sawhil A, Holtzman KZ, Bucak SD, Morrison C, Hurwitz S, DeRosa GP. Relationship between performance on part 1 of the American Board of Orthopedic Surgery Certifying examination and Scores on USMLE Steps 1 and 2. Acad Med 2009 Oct; 84 (10 Suppl): S21-4

https://doi.org/10.1097/ACM.0b013e3181b37fd2   

15. Dougherty PJ, Walter N, Schilling P, Najibi S, Herkowitz H. Do scores of the USMLE Step 1 and OITE correlate with the ABOS Part 1 certifying examination?: A multicenter study. Clin Orthop Relat Res 2010 Oct; 468(10):2797-802.

https://doi.org/10.1007/s11999-010-1327-3   

16. Perez JA jr, Greer S. Correlation of United States Medical Licensing Examination and Internal Medicine In-Training Examination performance. Adv Health Sci Educ theory Pract 2009 Dec; 14(5):753-8.

https://doi.org/10.1007/s10459-009-9158-2   

17. Johnson GA, Bloom JN, Szczotka-Flynn L, Zauner D, Tomsak RL. A comparative study of resident performance on standardized training examinations and the American board of ophthalmology written examination. Ophthalmology 2010 Dec; 117(12):2435-9.

https://doi.org/10.1016/j.ophtha.2010.03.056   

18. Garibaldi RA, Subhiyah R, Moore ME, Waxman H. The In-training examination in Internal Medicine: an analysis of resident performance over time. Ann Intern Med 2002 Sep 17; 137(6):505-10

https://doi.org/10.7326/0003-4819-137-6-200209170-00011   

19. Althouse LA, McGinness GA. The in-training examination; an analysis of its predictive value on performance on the general pediatrics certification examination. J Pediatr 2008 Sep: 153(3):425-8.

https://doi.org/10.1016/j.jpeds.2008.03.012   

20. Grossman RS, Fincher RM, Layne RD, Seelig CB, Berkowitz LR, Levine MA. Validity of the in-training examination for predicting American Board of Internal Medicine certifying examination scores. J Gen Intern Med 1992 Jan-Feb: 7(1):63-7

https://doi.org/10.1007/BF02599105   

21. Rollins KR, Martindale JR, Edmond M, Manser T, Scheld M. Predicting pass rates on the American Board of Internal Medicine Certifying Examination. J Gen Intern Med 1998 Jun; 13(6):414-6.

https://doi.org/10.1046/j.1525-1497.1998.00122.x   

22. De Virgilio C, Chan T, Kaji A, Miller K. Weekly assigned reading and examinations during residency, ABSITE performance, and improved pass rates on the American Board of Surgery Examinations. J Surg Educ 2008 Nov-Dec; 65(6):499-503

https://doi.org/10.1016/j.jsurg.2008.05.007   

23. Corneille MG, Willis R, Stewart RM, Dent DL. Performance on brief practice examination identifies residents at risk for poor ABSITE and ABS qualifying examination performance. J Surg Educ 2011 May-Jun; 68(3):246-9.

https://doi.org/10.1016/j.jsurg.2010.12.009   

24. Mathis B, Warm E, Schauer DP, Holmboe E, Rouan GW. A Multiple Choice Testing Program Coupled with a Year-long Elective Experience is Associated with Improved Performance on the Internal Medicine In-Training Examination. J Gen Intern Med 26(11):1253-7

https://doi.org/10.1007/s11606-011-1696-7   

25. Langenau EE, Fogel J, Schaeffer HA. Correlation between an email based board review program and American board of pediatrics general pediatrics certifying examination scores. Med Educ Online 2009; 14:18

https://doi.org/10.3402/meo.v14i.4511   

26. Aeder L, Fogel J, Schaeffer H. Pediatric board review course for residents 'at risk'. Clin Pediatr 2010 May; 49(5):450-6

https://doi.org/10.1177/0009922809352679

Appendices

Declarations

There are no conflicts of interest.
This has been published under Creative Commons "CC BY-SA 4.0" (https://creativecommons.org/licenses/by-sa/4.0/)

Reviews


Ken Masters - (11/05/2019)
The paper deals with attempts to increase the pass rates on the Med-Peds Residency Board Exam in the USA. The authors begin by showing their pass rates are lower than average, examine some of the correlating factors affecting pass rates, and then describe an intervention (“board preparation assistance program”) that attempts to improve their pass rates.

The results are somewhat encouraging, but mixed, although that also may be due to the small sample size, where statistical significances are difficult to measure. The researchers may need to run their program for at least three years before they receive more concrete data.


Some issues:
• The authors begin with an explanation of the various pass rates on various exams, nationally and from their own institution. Given that these examinations apply to the USA only, and that most readers of the journal would be unfamiliar with them, there is a strong risk that readers will get lost in a mixture of alphabet soup and percentages. Although it is unusual, it probably would have been better if the authors had laid these out in the form of a table, much as they would do with statistics from the Results, so that the perspective is clear, because the message appears to be simple: the institution's pass rates are lower than the pass rates of the country. But the mix of numbers and labels somewhat hides that message.
• I understand that students are exam-driven, and an institution is often judged by its pass-rate, but the intervention does appear to be focused less on medical education, and more on getting good scores in the exams – read the chapter, pre- and post-tests, preparation for the Board exam. I see little in there that talks about assessing underlying issues, about developing skills for a qualified practitioner (critical thinking skills, etc.)
• The Discussion is strong in its analysis on the information and reflection with a view to refinement of the intervention, but is weakened by its not relating the findings back to the literature, especially the literature cited in the Introduction.

Overall, it appears that the researchers have embarked on a long-term project, and may have to expend a great deal of effort before they see the fruits of their labour, or, at the very least, have greater clarity on the reasons behind their students’ poor results. I would recommend, however, that they focus a little less on preparation for the exam, and a little more on developing practitioners that will serve their patients better. I look forward to seeing another paper on the topic in a few years’ time that shows the results across student cohorts.
Trevor Gibbs - (22/06/2018)
Although this is a very US-centric paper, with many of the forms of assessment not applicable to other countries, I do think that there are some things in the paper that are transferable. It seems that everything in the paper was very exam-centred and teacher-centred; I would have liked to see less of that and more student-centred investigation as to why the students were not as motivated as they should have been, and whether there were other external factors affecting exam performance. I like the individualisation shown by stratifying the students, although I again wondered how this affected the performance of those in the lower groups.
I do think that this paper is a useful paper to be read, specifically by our US colleagues and those taking these external assessments, but I do think it is lacking somewhat in its qualitative measures.