Research article
Open Access

An Exploratory Study of a Novel Adaptive e-Learning Board Review Product Helping Candidates Prepare for Certification Examinations

Michael Healy[1], Emil Petrusa[2], Carl Gustaf Axelsson[2], Praelada Wongsirimeteekul[2], Ole-Petter Hamnvik[3], Matthew O’Rourke[4], Roger Feinstein[4], Ruth Steeves[4], Roy Phitayakorn[1]

Institution: 1. Department of Surgery, Massachusetts General Hospital, Harvard Medical School and NEJM Group, 2. Department of Surgery, Massachusetts General Hospital, Harvard Medical School, 3. Department of Medicine, Brigham and Women’s Hospital, Harvard Medical School and NEJM Group, 4. NEJM Group
Corresponding Author: Dr Michael Healy (healy.michael.g@gmail.com)
Categories: Technology, Research in Health Professions Education
Published Date: 06/08/2018

Abstract

Objectives: Online board-review products are widely used to prepare for high-stakes examinations; however, it is unclear whether these products are truly helpful and which of their features are most useful. This study examined user ratings of an e-learning examination-preparation product and its impact on first-time American Board of Internal Medicine Certifying Examination (ABIM-CE) test takers.

 

Methods: A total of 4,400 users were surveyed anonymously between 2015 and 2017 to determine their level of satisfaction with the NEJM Knowledge+ Internal Medicine Board Review (NEJM Knowledge+) product and their board-exam pass rate.

 

Results: The responding users’ (n=836) average ratings of NEJM Knowledge+’s content, relevance, components, and features were high. Similarly, a subgroup of responding users from the United States who primarily used NEJM Knowledge+ to prepare for an initial ABIM-CE between 2014 and 2016 (n=153) rated the content, relevance, components, and features highly, with practice exams, adaptive delivery of questions, and reports on progress, performance, and metacognition as the most valuable features. Importantly, this subgroup of users passed the ABIM-CE on their first attempt at a significantly higher rate than the national average (95% vs 89%, z=2.6397, p=0.0083).

 

Conclusion: An e-learning examination-preparation product that features adaptive delivery of questions, offers detailed feedback, and analyzes overall metacognition may be an effective tool for first-time test takers preparing for initial certifying examinations.

 

Keywords: Adaptive e-learning; Certification examination; Examination preparation; High stakes examinations; Resident education; Physician education

Introduction

Individuals on the path toward becoming a physician in the United States spend a great deal of time in educational settings and must pass many standardized high-stakes examinations. These examinations may have tremendous implications for their futures, including admission to college (SAT or ACT), medical school (MCAT), and residency training programs (USMLE Steps 1 and 2). After completing residency and/or fellowship training, most physicians seek board certification: hospitals have increasingly made board certification a required element of the credentialing process (Cassel and Reuben, 2011; Freed, Dunham and Gebremariam, 2013), and physicians without board certification may have slightly lower earning potential (Gray et al., 2012). Board certification has also been linked to improved knowledge of current medical standards through continuous learning and to better patient care (Lipner, Hess and Phillips, 2013). Additionally, failing and needing to retake board certification examinations can affect a physician’s career (Atsawarungruangkit, 2015). As such, board certification examinations are high-stakes examinations.

 

The American Board of Internal Medicine (ABIM) has the largest number of board-certified physicians in the United States, with over 250,000 (American Board of Medical Specialties, 2017); over 7,000 physicians took the American Board of Internal Medicine Certifying Examination (ABIM-CE) for the first time in 2016 (American Board of Internal Medicine, 2017). Internal medicine residency programs are assessed on the board-exam success of their graduates, both by applicants and by the Accreditation Council for Graduate Medical Education, which requires programs to maintain at least an 80% first-time pass rate on the certification examination over a three-year period (Accreditation Council for Graduate Medical Education, 2017); program directors therefore have a strong stake in their graduates’ results. The best methods of preparing for the ABIM-CE are unclear but have included taking advantage of residency program educational opportunities, using internal and external board-review courses (Flannery, 2014), and, most recently, using online board-review question banks.

 

In 2014, NEJM Group partnered with Area9 Learning (now Area9 Lyceum) to develop and launch the NEJM Knowledge+ Internal Medicine Board Review (NEJM Knowledge+) product, which supports board-certification preparation and lifelong learning for internal medicine physicians. This product features a comprehensive question bank that NEJM editors developed on the basis of an augmented ABIM blueprint. The editors wrote clinical scenarios and learning objectives as the foundation of three types of questions: (a) board-style multiple-choice questions, (b) related, shorter multiple-choice questions, and (c) related fill-in-the-blank questions (see Figure 1). Both types of multiple-choice questions include detailed feedback and citations/further resources. The question bank uniquely incorporates adaptive-learning technology, which uses an algorithm “to observe how an individual learns and then delivers questions that are targeted to address any gaps in knowledge” (McMahon and Drazen, 2014, p. 1648). Specifically, to determine the optimal sequencing of additional questions, the algorithm takes into account the learner’s answer to a question, self-reported confidence on a four-point scale (1 – “I know it”, 2 – “I think so”, 3 – “not sure”, 4 – “no idea”), the learner’s overall confidence in the topic, the time the learner takes to answer, and how others have answered. The algorithm also uses spaced repetition of related items, mixed in with new questions, until it determines that the learner has likely mastered all of the underlying learning objectives in each specialty module, and a separate “recharge” function delivers variations of questions that the learner has already seen and may have forgotten, timed according to Ebbinghaus forgetting curves and the algorithm described above.

At launch, NEJM Knowledge+ included 2,477 learning objectives, 1,577 case-based questions, and 3,147 total items, including all variants. The content has been maintained through both proactive review and a “challenge” function that allows learners to suggest improvements, and new questions have been added for a current total of 2,787 learning objectives, 1,771 case-based questions, and 4,935 total items. These questions also form the basis of the two full-length practice exams that learners can access, complementing other features such as progress reports for evaluating and measuring performance.
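The Area9 algorithm itself is proprietary, so, purely as an illustration of how inputs like these might be combined, the following minimal Python sketch scores items for re-delivery; every function name, weight, and threshold below is an invented assumption, not the actual NEJM Knowledge+ implementation.

```python
import math

# Illustrative sketch only: the adaptive algorithm behind NEJM Knowledge+ is
# proprietary, so every weight, threshold, and name below is an invented
# assumption meant to show how such inputs could be combined.

def predicted_retention(days_since_review: float, stability: float) -> float:
    """Ebbinghaus-style exponential forgetting curve: R = exp(-t / S)."""
    return math.exp(-days_since_review / max(stability, 1e-6))

def review_priority(correct: bool,
                    confidence: int,            # 1 "I know it" ... 4 "no idea"
                    days_since_review: float,   # days since the item was last seen
                    stability: float,           # assumed durability of the memory
                    peer_error_rate: float) -> float:  # fraction of peers answering wrong
    """Return a score; higher means the item should be re-delivered sooner."""
    if not correct and confidence <= 2:
        gap = 1.0   # confidently wrong: the most urgent kind of knowledge gap
    elif not correct:
        gap = 0.7   # wrong, but aware of it
    elif confidence >= 3:
        gap = 0.4   # right, but unsure: fragile knowledge
    else:
        gap = 0.1   # right and confident: likely mastered
    forgetting = 1.0 - predicted_retention(days_since_review, stability)
    return 0.6 * gap + 0.3 * forgetting + 0.1 * peer_error_rate

# Example: a learner who was confidently wrong three days ago on an item that
# peers also find hard receives a high priority score (about 0.78).
print(review_priority(correct=False, confidence=1, days_since_review=3,
                      stability=5, peer_error_rate=0.4))
```

In a production system, the stability parameter would itself be updated after each successful review, which is what produces the expanding review intervals characteristic of spaced repetition.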

 

The aims of this exploratory study were to measure users’ ratings of NEJM Knowledge+ features and to examine the relationship between use of NEJM Knowledge+ and passing the ABIM-CE. Our primary hypotheses were that user ratings of the features would be high and that use of NEJM Knowledge+ would have a beneficial impact on passing the ABIM-CE.

Methods

NEJM Group developed a series of five cross-sectional surveys (two in 2015, one in 2016, and two in 2017) intended to measure the satisfaction and board-exam pass rate of NEJM Knowledge+ users. Each survey included over 30 items, consisting of multiple-choice, rank-order, rating, scaled, and open-ended questions. The surveys were distributed electronically and anonymously to a total of 4,400 users; responses were collected for three to four weeks after each distribution date. Respondents who completed a survey were offered a small incentive in the form of an Amazon gift card ($10 in 2015; $25 in 2016 and 2017).

 

Twelve questions that were identical across the five surveys were analyzed for this study (see Table 1). Ten questions (questions 1-7 and 9-11) were used, via descriptive statistics, to develop a basic profile of the resident, fellow, and physician users who responded. Seven questions (questions 2, 5, and 8-12) were used for an advanced analysis of a subgroup of responding users from the United States who primarily used NEJM Knowledge+ to prepare for an initial ABIM-CE between 2014 and 2016. The advanced analysis first examined the ratings and rankings of NEJM Knowledge+, comparing respondents who passed the ABIM-CE on their first attempt with those who did not, using independent-samples t-tests in IBM SPSS 24 (Armonk, New York) and a Mann–Whitney U test performed with an online calculator (http://www.socscistatistics.com/tests/mannwhitney/default2.aspx). We then compared the ABIM-CE first-attempt pass rate of this subgroup with the average published national pass rate for first-time test takers between 2014 and 2016 (https://www.abim.org/~/media/ABIM%20Public/Files/pdf/statistics-data/certification-pass-rates.pdf), using a z-test for proportions performed with an online calculator (http://www.socscistatistics.com/tests/ztest/Default2.aspx).
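For readers who prefer scripted analyses to the web calculators cited above, here is a minimal sketch of the same three tests in Python, using SciPy and statsmodels; all data values are placeholders invented for the example, not the study data.

```python
# Sketch of the analyses described above; every input is a hypothetical
# placeholder, not the study's actual survey responses.
import numpy as np
from scipy.stats import ttest_ind, mannwhitneyu
from statsmodels.stats.proportion import proportions_ztest

passed = np.array([4, 5, 5, 4, 5, 4])   # placeholder ratings, passed group
failed = np.array([3, 4, 4])            # placeholder ratings, did-not-pass group

t_stat, t_p = ttest_ind(passed, failed)              # independent-samples t-test
u_stat, u_p = mannwhitneyu(passed, failed,
                           alternative="two-sided")  # Mann-Whitney U test

# z-test for two proportions: subgroup first-attempt passes vs. the national
# first-time cohort. The national counts are invented placeholders, since the
# national cohort size is not restated in this paper.
z_stat, z_p = proportions_ztest(count=[146, 19580], nobs=[153, 22000])

print(f"t-test p={t_p:.3f}; Mann-Whitney U p={u_p:.3f}; z={z_stat:.2f}, p={z_p:.4f}")
```

With real data, the Mann–Whitney U test is the safer choice for small samples of ordinal rating-scale responses, which is presumably why it accompanies the t-tests here.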

Results/Analysis

Basic Profile of the Resident, Fellow, and Physician Users

 

A total of 836 of 4,400 (19%) users responded (2015 surveys: 267 responses; 2016 survey: 180; 2017 surveys: 389). Respondents were 31.2% residents/fellows (n=261) and 68.8% physicians in practice (n=575). They represented over 40 countries, most frequently the United States (n=651, 77.9%), Canada (n=34, 4.1%), and Australia (n=27, 3.2%). The factor most frequently cited as influencing respondents’ use of NEJM Knowledge+ was the NEJM Group brand (n=224, 26.8%). Additionally, respondents had most frequently completed or almost completed the program (n=293, 35%) or were using it regularly (once or twice per week) (n=217, 26%).

 

The most frequent reasons for using the product were preparing for an initial ABIM certification exam (n=256, 30.6%), lifelong learning (n=219, 26.2%), and preparing for an ABIM recertification exam (n=179, 21.4%). The difficulty of the questions in NEJM Knowledge+ relative to the questions on the exam was rated as a lot harder (n=12, 1.4%), somewhat harder (n=88, 10.5%), just about as difficult (n=250, 29.9%), somewhat easier (n=100, 12.0%), or a lot easier (n=20, 2.4%). Compared with the best of the other resources they had used, respondents most frequently rated NEJM Knowledge+ as either about as helpful (n=146, 17.5%) or substantially more helpful (n=139, 16.6%) in preparing them for the exam.

 

Respondents most frequently rated NEJM Knowledge+’s quality of content (n=776, 92.8%) and its relevance to practice (n=750, 89.7%) as either excellent or good, were most frequently very satisfied with the five product components (average 74.4% very satisfied), and were most frequently very satisfied with the ten product features (average 55.5% very satisfied).

 

Advanced Analysis – Identified Subgroup of Responding Users

 

Of the 836 respondents, 153 (18.3%) were from the United States and had used NEJM Knowledge+ primarily to prepare for an initial ABIM-CE between 2014 and 2016.

 

As illustrated in Table 2, the average ratings of the quality of NEJM Knowledge+’s content and its relevance to practice among users in this subgroup were high. Although respondents who passed the ABIM-CE on their first attempt rated the quality and relevance of the content higher than those who did not pass, these differences did not reach statistical significance.

 

As illustrated in Table 3, users in this subgroup of respondents rated the five components of NEJM Knowledge+ highly on a scale of 1 (very dissatisfied) to 4 (very satisfied). Those who passed on their first attempt rated the components higher than those who did not pass; however, the only difference that reached statistical significance was for the component “detailed feedback” (p=0.002), which those who passed rated 3.68 on average, compared with 3.00 among those who did not pass.

 

As illustrated in Table 4, users in this subgroup of respondents also rated the ten features of NEJM Knowledge+ highly. Though there were differences between the ratings for those who passed and those who did not pass on their first attempt, the differences were not statistically significant.

 

When users in this subgroup ranked the ten features of NEJM Knowledge+ by value from highest (1) to lowest (10), the top three features were practice exams (mean rank 3.57), adaptive delivery of questions (mean rank 3.75), and reports on progress, performance, and metacognition (mean rank 4.36) (see Figure 2). There were no significant differences between the rankings of those who passed on their first attempt and those who did not pass.

 

Finally, between 2014 and 2016, a significantly higher proportion of users in this subgroup reported passing the ABIM-CE on their first attempt than the national average pass rate (95% vs 89%, z=2.6397, p=0.0083).
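For reference, the cited calculator computes a standard pooled two-proportion z-statistic; assuming that is the form used here, the comparison can be written as

\[
z = \frac{\hat{p}_1 - \hat{p}_2}{\sqrt{\hat{p}\,(1-\hat{p})\left(\frac{1}{n_1} + \frac{1}{n_2}\right)}},
\qquad
\hat{p} = \frac{x_1 + x_2}{n_1 + n_2},
\]

where \(\hat{p}_1 = 146/153 \approx 0.95\) is the subgroup first-attempt pass rate, \(\hat{p}_2 = 0.89\) is the national average, and \(x_2\) and \(n_2\) are the national pass count and first-time cohort size from the published ABIM data.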

Discussion

A physician’s time is divided among several areas, including clinical care, documentation in electronic medical records, and staying up to date with medical knowledge and procedures. Leafloor et al. (2015) highlighted that internal medicine residents spend an average of 41.8% of their time on patient care activities, compared with an average of 13.8% on educational activities. Methods that prepare physicians for high-stakes examinations comprehensively, effectively, and efficiently are therefore essential. Advances in technology have led to products, such as online question banks and e-learning examination-preparation products, that can potentially give physicians an advantage in their exam preparation.

This study suggests that the NEJM Knowledge+ product can be useful for learners and that specific features, including adaptive delivery of questions, practice exams, and reports on progress, performance, and metacognition, could enhance learning efficiency and knowledge retention. These results are consistent with other findings in the literature. Although the effectiveness of adaptive e-learning environments for health professionals and health professions students remains unclear (Fontaine et al., 2017), adaptive learning technologies are being explored in higher education, where there is broad agreement about their value when they are applied thoughtfully and in a manner consistent with learning needs (Liu et al., 2017). Additionally, retrieval practice, such as taking periodic practice exams or quizzes and reviewing the results, appears to enhance overall medical knowledge retention and may provide effective direction for continued study and preparation (Wagner et al., 2016). Further, the application of metacognition allows learners to effectively think critically, direct their own learning, and self-assess, which are important skills in lifelong learning and are increasingly emphasized by health-professional organizations and accreditation bodies (Medina, Castleberry and Persky, 2017).

In addition, this study suggests that use of NEJM Knowledge+ may have contributed to users in the identified subgroup passing the ABIM-CE. Program directors, faculty members, and educators should therefore be aware of the potential educational value of e-learning examination-preparation products as another avenue of support for residents and junior colleagues seeking initial board certification.

Finally, the design of this exploratory study had several strengths, including the anonymity of survey respondents, the use of numerical data for statistical analysis, which supported researcher objectivity, and the ability to identify a subgroup of respondents for advanced analysis. There were also limitations, including a low response rate, a common problem in surveys of health care professionals that limits the generalizability of these results. Additional limitations were that other factors related to users’ preparation for the ABIM-CE were not examined in this study and that users self-reported their ABIM-CE results. Future research should examine which specific adaptive learning features and user metrics (e.g., time of login, duration of session use, duration of product use) related to NEJM Knowledge+ ultimately increase overall knowledge retention.

Conclusion

This exploratory study suggested that an e-learning examination-preparation product featuring adaptive delivery of questions, effective feedback, and analysis of overall metacognition can potentially be an effective tool for first-time test takers preparing for initial certifying examinations. Products such as NEJM Knowledge+ may therefore be of high educational value as learning tools for residents, fellows, and physicians in this era of high-stakes examinations. Finally, further research should be undertaken on e-learning examination-preparation products in other medical and allied-health specialties.

Take Home Messages

  • First-time test takers who used the NEJM Knowledge+ e-learning examination-preparation product passed the ABIM-CE at a significantly higher rate than the national average; use of the product may have been a contributing factor.
  • E-learning examination-preparation products that have features such as practice exams, adaptive delivery of questions, and reports on progress, performance, and metacognition can potentially be effective in the preparation for initial certifying examinations.

Notes On Contributors

Michael G. Healy, Ed.D., is a medical education researcher at Massachusetts General Hospital’s Department of Surgery. 

 

Emil R. Petrusa, Ph.D., is an Educational Research Specialist at Massachusetts General Hospital’s Department of Surgery and Learning Laboratory, and an Associate Professor of Surgery at Harvard Medical School.

 

Carl Gustaf Axelsson, M.D., M.Phil., M.M.Sc., is a neurosurgery resident at Imperial College in London and a medical education researcher at Massachusetts General Hospital’s Department of Surgery.

 

Praelada Wongsirimeteekul, M.D., is an ophthalmology resident from Thailand and former Surgical Education Research Fellow at Massachusetts General Hospital’s Department of Surgery.

 

Ole-Petter R. Hamnvik, M.B.B.Ch., B.A.O., M.M.Sc., is an Associate Physician at Brigham and Women’s Hospital’s Department of Medicine and an Assistant Professor of Medicine at Harvard Medical School.

 

Matthew O’Rourke is the General Manager of Education Content and Product Development at NEJM Group.

 

Roger Feinstein, M.H.A., is the Director of Market Intelligence at NEJM Group.

 

Ruth Steeves, M.B.A., is the Market Research Manager at NEJM Group.

 

Roy Phitayakorn, M.D., M.H.P.E. (M.Ed.), FACS, is an Assistant Surgeon at Massachusetts General Hospital’s Department of Surgery, the Department’s Director of Medical Student Education and Surgery Education Research, and is an Associate Professor of Surgery at Harvard Medical School.

Acknowledgements

The authors of this study would like to acknowledge Mrs. Josette Akresh-Gonzales for her thoughtful review of this manuscript.

Bibliography/References

Accreditation Council for Graduate Medical Education. (2017) ACGME Program Requirements for Graduate Medical Education in Internal Medicine. Chicago, Illinois. Available at: https://www.acgme.org/Portals/0/PFAssets/ProgramRequirements/140_internal_medicine_2017-07-01.pdf

American Board of Internal Medicine. (2017) First-Time Taker Pass Rates - Initial Certification. Philadelphia, Pennsylvania. Available at: https://www.abim.org/~/media/ABIM%20Public/Files/pdf/statistics-data/certification-pass-rates.pdf (Accessed: 3 November 2017).

American Board of Medical Specialties. (2017) ABMS Board Certification Report: 2016–2017. Chicago, Illinois. Available at: https://www.abms.org/media/139572/abms_board_certification_report_2016-17.pdf (Accessed: 3 November 2017).

Atsawarungruangkit, A. (2015) ‘Relationship of residency program characteristics with pass rate of the American Board of Internal Medicine certifying exam’, Medical Education Online, 20(1). https://doi.org/10.3402/meo.v20.28631

Cassel, C. K. and Reuben, D. B. (2011) ‘Specialization, Subspecialization, and Subsubspecialization in Internal Medicine’, New England Journal of Medicine, 364(12), pp. 1169-1173. https://doi.org/10.1056/NEJMsb1012647

Flannery, M. T. (2014) ‘Trends in the History of Certification and Recertification of the American Board of Internal Medicine’, The American Journal of the Medical Sciences, 347(1), pp. 74-77. https://doi.org/10.1097/MAJ.0b013e31829ce04c

Fontaine, G., Cossette, S., Maheu-Cadotte, M. A., Mailhot, T., et al. (2017) ‘Effectiveness of Adaptive E-Learning Environments on Knowledge, Competence, and Behavior in Health Professionals and Students: Protocol for a Systematic Review and Meta-Analysis’, JMIR Research Protocols, 6(7). https://doi.org/10.2196/resprot.8085

Freed, G. L., Dunham, K. M. and Gebremariam, A. (2013) ‘Changes in hospitals' credentialing requirements for board certification from 2005 to 2010’, Journal of Hospital Medicine, 8(6), pp. 298-303. https://doi.org/10.1002/jhm.2033

Gray, B., Reschovsky, J., Holmboe, E. and Lipner, R. (2012) ‘Do Early Career Indicators of Clinical Skill Predict Subsequent Career Outcomes and Practice Characteristics for General Internists?’, Health Services Research, 48(3), pp. 1096-1115. https://doi.org/10.1111/1475-6773.12011

Leafloor, C. W., Lochnan, H. A., Code, C., Keely, E. J., et al. (2015) ‘Time-motion studies of internal medicine residents' duty hours: a systematic review and meta-analysis’, Advances in Medical Education and Practice, 6, pp. 621-629. https://doi.org/10.2147/AMEP.S90568

Lipner, R. S., Hess, B. J. and Phillips, R. L. (2013) ‘Specialty Board Certification in the United States: Issues and Evidence’, Journal of Continuing Education in the Health Professions, 33 (Suppl. 1), pp. S20-S35. https://doi.org/10.1002/chp.21203

Liu, M., McKelroy, E., Corliss, S. B. and Carrigan, J. (2017) ‘Investigating the effect of an adaptive learning intervention on students’ learning’, Educational Technology Research and Development, 65(6), pp. 1605-1625. https://doi.org/10.1007/s11423-017-9542-1

McMahon, G. T. and Drazen, J. M. (2014) ‘Introducing NEJM Knowledge+ and Its Adaptive Personalized Learning’, New England Journal of Medicine, 370(17), pp. 1648-1649. https://doi.org/10.1056/NEJMe1401777

Medina, M. S., Castleberry, A. N. and Persky, A. M. (2017) ‘Strategies for Improving Learner Metacognition in Health Professional Education’, American Journal of Pharmaceutical Education, 81(4). https://doi.org/10.5688/ajpe81478

Wagner, B. J., Ashurst, J. V., Simunich, T. and Cooney, R. (2016) ‘Correlation Between Scores on Weekly Quizzes and Performance on the Annual Resident In-Service Examination’, The Journal of the American Osteopathic Association, 116(8), pp. 530-534. https://doi.org/10.7556/jaoa.2016.106

Appendices

Figure 1

Figure 2

Table 1

 

Twelve Questions Analyzed for this Study

1. What is your professional status?

2. In which country do you reside?

3. What factor most influenced your decision to purchase NEJM Knowledge+?

4. Tell us about your use of NEJM Knowledge+ Internal Medicine Board Review.

5. What is your primary reason for using NEJM Knowledge+?

6. How did the difficulty of the questions in NEJM Knowledge+ compare with the difficulty of questions on the exam? NEJM Knowledge+ questions were...

7. How did NEJM Knowledge+ compare with the best of the other resources you used? NEJM Knowledge+ was...

8. Did you pass your exam?

9. Please rate the quality of the content and its relevance to practice:

- Quality

- Relevance

10. Please rate your satisfaction with these components of the content:

- Case Descriptions

- Short Questions

- Key Learning Points

- Detailed Feedback

- Citations

11. Please indicate your satisfaction with each of these features in NEJM Knowledge+:

- Dashboard and design

- Practice Exams

- Ability to earn CME credit

- Ability to earn MOC points

- Automatic delivery of MOC points to the ABIM

- Recharge

- Adaptive delivery of questions

- Fill-in-the-blank questions in Recharge

- Ability to challenge questions

- Reports on progress, performance, and metacognition

12. Please rank the following features according to their value to you in your experience of NEJM Knowledge+:

- Dashboard and design

- Practice Exams

- Ability to earn CME credit

- Ability to earn MOC points

- Automatic delivery of MOC points to the ABIM

- Recharge

- Adaptive delivery of questions

- Fill-in-the-blank questions in Recharge

- Ability to challenge questions

- Reports on progress, performance, and metacognition

 

Table 2

Rating of Quality of the Content and its Relevance to Practice

Attribute    Passed (mean ± SD)    Not Passed (mean ± SD)
Quality      n=146, 4.42 ± 0.7     n=7, 4.00 ± 1.0
Relevance    n=146, 4.30 ± 0.7     n=7, 3.86 ± 1.1

Scale: 1 (Poor) – 5 (Excellent)
SD = standard deviation

 

Table 3

Rating of Satisfaction with these Components of the Content

Attribute             Passed (mean ± SD)    Not Passed (mean ± SD)
Case descriptions     n=145, 3.72 ± 0.5     n=7, 3.43 ± 0.5
Short questions       n=144, 3.65 ± 0.6     n=7, 3.43 ± 0.5
Key learning points   n=145, 3.77 ± 0.5     n=7, 3.43 ± 0.5
Detailed feedback*    n=145, 3.68 ± 0.5     n=7, 3.00 ± 0.8
Citations             n=118, 3.74 ± 0.5     n=4, 3.25 ± 0.5

*p=0.002
Scale: 1 (Very Dissatisfied) – 4 (Very Satisfied)
Respondents who selected “N/A – Did not use” were removed from the analysis for this question.

 

Table 4

Rating of Satisfaction with these Features

Attribute                                             Passed (mean ± SD)    Not Passed (mean ± SD)
Dashboard and Design                                  n=145, 3.54 ± 0.8     n=7, 3.71 ± 0.5
Practice Exams                                        n=134, 3.64 ± 0.6     n=6, 3.50 ± 0.8
Ability to Earn CME Credit                            n=84, 3.56 ± 0.7      n=6, 3.67 ± 0.5
Ability to Earn MOC Points                            n=82, 3.61 ± 0.6      n=4, 3.25 ± 0.5
Automatic Delivery of MOC Points to the ABIM          n=80, 3.54 ± 0.7      n=6, 3.50 ± 0.5
Recharge                                              n=133, 3.61 ± 0.6     n=5, 3.60 ± 0.9
Adaptive Delivery of Questions                        n=145, 3.66 ± 0.6     n=7, 3.86 ± 0.4
Fill-in-the-Blank Questions in Recharge               n=131, 3.38 ± 0.7     n=5, 3.60 ± 0.5
Ability to Challenge Questions                        n=99, 3.52 ± 0.7      n=4, 3.50 ± 0.6
Reports on Progress, Performance, and Metacognition   n=144, 3.65 ± 0.6     n=7, 3.71 ± 0.5

Scale: 1 (Very Dissatisfied) – 4 (Very Satisfied)
Respondents who selected “N/A – Did not use” were removed from the analysis for this question.

Declarations

There are some conflicts of interest:
Drs. Healy, Hamnvik, and Phitayakorn each receive partial stipends from NEJM Group. Mr. O’Rourke, Mr. Feinstein, and Ms. Steeves are employed by NEJM Group. Drs. Petrusa, Axelsson, and Wongsirimeteekul have no disclosures to report.
This has been published under Creative Commons "CC BY-SA 4.0" (https://creativecommons.org/licenses/by-sa/4.0/)

Ethics Statement

IRB determination by Partners Human Research Committee (Protocol #: 2018P001598/PHS).

External Funding

This paper has not had any External Funding

Reviews


Sateesh Babu Arja - (12/08/2018) Panel Member
This paper sounded interesting when I read the title. It is a well-written paper and easy to understand, but I am not sure that it answered the research hypotheses in question. Although most of the respondents are from the USA, it is not clear whether the users of the NEJM materials also used other resources for the board examinations or only the NEJM materials. As the authors mentioned, there is no clear control group; they compared NEJM users with the national average. The response rate is very low even though an incentive was given. The use of e-learning resources for the MCAT, GRE, TOEFL, and USMLE dates back many years; it is interesting to see e-learning resources for board certification exams too. This paper might be useful for residents, faculty involved in residency training programs, and residency program directors.
Wolf Hautz - (07/08/2018) Panel Member
In this article, the authors describe the results of a survey that aimed to assess user satisfaction with a commercial e-learning platform. The extent of the platform seems quite large, and apparently a lot of consideration was put into the product’s design.
The survey finds that respondents are generally satisfied with the tool. The first-attempt examination pass rate of a subsample (3.5% of surveyed users) is compared with that of all test takers; respondents were more often successful on their first attempt. This result cannot be generalized to the population of all product users due to potential self-selection bias and the low response rate. A non-responder analysis is lacking.
The findings of the study correspond well with previous research on e-learning (see Cook DA or Norman G 2018): users like it, when compared with nothing. There is no comparison with something in the paper, though.
Of minor note, the discussion states that “Further, the application of metacognition allows learners to effectively think critically, direct their own learning, and self-assess, which are important skills in lifelong learning”. The literature on both generic critical-thinking skills and a generic self-assessment ability is extensive and can be summarized as follows: both generic skills most likely do not exist. For reviews of the literature, see Eva KW or Norman G.