Research article
Open Access

Accelerating the Noticing Skills of Nursing and Medical Students Using Video Simulation

Abby Razer[1], Christie McIntyre[1], Peter Fadde[1]

Institution: 1. Southern Illinois University Carbondale
Corresponding Author: Dr Abby Razer ([email protected])
Categories: Educational Strategies, Students/Trainees, Teaching and Learning, Simulation and Virtual Reality
Published Date: 09/04/2019



This study examines the acceleration of expertise by training novices to attend to the situational cues that experts notice when solving problems.  The purpose of this qualitative study was to examine and expand the training of expertise, using the expertise-based training approach as a guide.


Medical and nursing students were asked to interactively observe videotaped simulated nursing scenarios for the presence of errors, with experts’ observations of the same clips provided as feedback.  The study qualitatively analyzed the scenario observations and focus group responses to determine whether the students expanded their basic level of clinical judgment.


The study found that the students enjoyed the scenarios even as the scenarios promoted cognitive dissonance, with the errors generating substantive discussion.  In addition, this study will guide the use of expertise-based training (XBT) by suggesting that the XBT design include debriefing those who disagree with the expert, so that the cognitive dissonance can be captured and used to promote learning.


The potential benefits of the study to nursing education, along with other fields employing simulator training, will be new ways to employ simulators for maximum efficiency and effectiveness.

Keywords: Expertise based training (XBT); human patient simulators; simulation; nursing education; medical education; decision making


Introduction

Imagine becoming a head nurse, doctor, or firefighter involved in solving a critical, never-before-seen crisis scenario.  What thought processes would individuals use, and what training methods could help them become prepared in an efficient manner?  The key question then becomes: what can educators do to improve their students’ learning so as to accelerate the development of their expertise?  The purpose of this qualitative study was to examine the training of expertise using the XBT approach, which involves re-purposing the types of tasks used by researchers to measure expertise in order to accelerate its training.  Specifically, the study investigated a strategy for systematically scaffolding the development of expert-like decision making and problem solving by targeting the early noticing stage of situational awareness and pattern recognition.  By incorporating videotaped simulation scenarios, the benefits of the simulations were amplified beyond the typical focus of simulator-based training on performing procedures.  The goal was to see whether professional education programs, such as nursing and medical programs, could get more instructional value by using simulators to produce scenarios that scaffold novices’ noticing abilities early in their professional education.  In this context, interactive reflective observation was designed to encourage the participants to interact with videos of simulation scenarios by making comments, comparing their observations to those of experts, and then reflecting on those observations.

Research Questions

  1. How do participants respond when their scenario observations agree or disagree with the expert opinion?
  2. What perceptions do nursing and medical students have regarding the effectiveness of interactively observing staged simulation scenarios?

Literature review

Expertise and Decision Making

The research and theories behind expertise-based training also include Jonassen’s problem-solving theories, which connect to Klein’s decision-making process.  David Jonassen describes strategic performance problems as those that occur in dynamic environments, causing the problems to be ill-structured.  In many cases, strategic performance problems are high-stakes situations that can involve life and death (Jonassen, 2010).  Jonassen presented Klein’s Recognition-Primed Decision (RPD) model as the process by which strategic problems are often solved.  In the medical field, this process is termed illness scripts: expert physicians recognize and use previously known cues and patterns to diagnose a patient.  The ability of the problem solver to automate solving the problem depends on the familiarity of the recognized cues or patterns (Jonassen, 2010; Klein, 1998).

Expertise Based Training

Expertise based training (XBT) is an instructional theory built on several theoretical foundations, including Gary Klein’s work on naturalistic decision making and his Recognition-Primed Decision (RPD) model.  While RPD is intended to model the cognitive processes of expert practitioners, such as expert firefighters or expert ICU nurses, XBT opens the door to training individuals to think more like experts by recognizing specific situational cues (Fadde, 2013).  Although responses to these cues ultimately matter most, responses vary with the performer’s experience and training even when the cues are the same.  Therefore, XBT first focuses on the recognition of cues in order to advance the individual to a higher level of expertise.

Transitioning from novice to higher levels of expertise.  One example of the XBT approach was demonstrated by a study involving preservice teachers.  The purpose of the study was to examine whether the expertise levels of novice preservice teachers could be enhanced by having them watch short video clips of other student-teachers’ classroom teaching. The clips were developed by editing video of student-teachers teaching a lesson in an authentic classroom. Three experienced teacher-supervisors (experts) focused on issues of classroom management and student questioning while viewing the video clips.  The experts essentially coded the video clips, much as a qualitative researcher might, for instances of the target behaviors (Fadde & Sullivan, 2013).  The goal of the XBT instructional activity was for students to compare their observations of key classroom activities (classroom management and student questioning) to those of the teacher-educators, with the intent that the novice preservice teachers would increasingly align their observations with those of the expert teacher-educators.


Educational Innovation Instructional Method

This is a qualitative cross-case analysis study involving one case of medical students and one case of nursing students.  Each case was first examined in a within-case analysis, in which the data are analyzed comprehensively and independently.  This was followed by a cross-case analysis that brings elements from both cases together to build a general explanation for all cases (Merriam & Tisdell, 2016).  The purpose of this study was to examine nursing and medical students’ expertise development and their perceptions of staged simulation scenarios.

The instructional method of interactive video observation with expert model feedback uses a task that involves identifying critical incidents in video clips of performance situations and noting the time code where each incident occurs.  The performance situations used as trigger videos are intended for students to view while noticing aspects of the situation or the performers.  Trigger videos can be real performances, simulation runs, or staged simulation runs recorded for later use.  The expert model feedback component consists of the recorded observations made by experts when they watched the same video clips.  Students view and comment on the trigger videos, then compare their observations to those of the experts.  These trigger videos are referred to as staged scenario videos, or scenarios, in this project.  This process allows for reflection-in-action, operationalized through the video clips.  Eventually, the novice learners’ noticing skills begin to match the experts’ feedback.  The instructional method of video observation with expert model feedback was implemented in a teacher-noticing study conducted by Fadde and Sullivan (2013).  The current study, involving medical and nursing students, extends that previous work on XBT and represents a new direction in its use.
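The comparison task at the core of this method, matching a learner's timecoded observations against an expert's, can be sketched as a simple tolerance-based alignment. This is an illustrative sketch only; the function name, data shapes, and the five-second tolerance are assumptions, not materials from the study.

```python
# Illustrative sketch (not the study's actual tooling): align a learner's
# timecoded observations with an expert's, within a tolerance window.
def compare_observations(learner_times, expert_times, tolerance=5.0):
    """Return matched, missed, and false-alarm observations (times in seconds)."""
    unmatched_expert = list(expert_times)
    matched, false_alarms = [], []
    for t in learner_times:
        hit = next((e for e in unmatched_expert if abs(e - t) <= tolerance), None)
        if hit is not None:
            unmatched_expert.remove(hit)
            matched.append((t, hit))   # learner noticed an expert-noted incident
        else:
            false_alarms.append(t)     # learner flagged something the expert did not
    return {"matched": matched, "missed": unmatched_expert, "false_alarms": false_alarms}

result = compare_observations([12.0, 48.5, 90.0], [10.0, 50.0, 120.0])
```

As a learner's noticing develops, the "missed" and "false_alarms" lists should shrink, mirroring the convergence toward expert observations described above.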


Participants

Case 1: Medical Students included five volunteer first-year medical students from a Midwestern school of medicine.  Case 2: Nursing Students included three volunteer undergraduate nursing students at a community college.  All participants were asked to choose a pseudonym for anonymity.  X institutional review board approved the protocol.

Data Collection

Each case occurred in a monitored computer lab setting where participants completed two simulation scenarios, followed by the Health Sciences Reasoning Test (HSRT) as a posttest.  The HSRT allowed the researcher to collect data to further understand the participants as a demographic element.  Each staged scenario video took approximately thirty minutes, depending on the participant’s self-pacing, and participants were given a maximum of sixty minutes to complete both scenarios.  Participants were allowed sixty minutes to take the HSRT.  Participants were then asked to take part in a one-hour focus group, which was audio and video recorded for later transcription.  This allowed participants’ responses to be matched with their scenario performance and HSRT responses during the analysis.  A member check was offered and a reflexive journal was maintained.

Data Analysis

The two cases provided the cases for the cross-case analysis.  The qualitative results from the focus groups and the written observations in the scenario activities were analyzed by coding the data to discover patterns and overall themes (Patton, 2002).  For example, in Case: Medical Students, agreement with the expert was defined as participant-written notes indicating that they agreed with the expert opinions.  A statement such as “my answer matches the expert opinion” would have been coded as agreement with the expert (SIUC scenarios, participant notes, March 20, 2015).  Together, the codes formed a theme, in this instance agreement with the expert, supported with examples such as: “My answer did not mention checking the bracelet specifically, but overall agreed with the expert opinion” (SIUC scenarios, participant notes, March 20, 2015).  The same approach was applied to Case: Nursing Students.  Further examples of these themes appear in Table 3 and Table 4.  After the analysis of both cases, the same five themes emerged:

  • Agreement with the expert
  • Suggesting a different course of action or disagreement with the expert
  • Connecting the scenarios to past experiences
  • Reflection on lack of error recognition
  • Reflection on correct error recognition
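The within-case and cross-case tallying described above can be illustrated with a small sketch. The coded excerpts below are hypothetical, invented for illustration; they are not the study's data.

```python
from collections import Counter

# Hypothetical coded excerpts: each entry is a theme label assigned to a
# participant note during coding (data invented for illustration).
case_codes = {
    "medical": ["agreement", "disagreement", "agreement", "past_experience"],
    "nursing": ["agreement", "disagreement", "disagreement"],
}

# Within-case analysis: tally themes independently for each case.
within_case = {case: Counter(codes) for case, codes in case_codes.items()}

# Cross-case analysis: merge the per-case tallies into one combined view.
cross_case = sum(within_case.values(), Counter())

print(within_case["medical"]["agreement"])  # 2
print(cross_case["disagreement"])           # 3
```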

The HSRT scores, along with demographic data, were used to form the participant profiles shown in Table 1 and Table 2.

Table 1. Medical student profiles



Pseudonym | Background | SimMan Usage | HSRT Score Overall (33 possible)
n/a | Masters, no clinical experience | Moderate (3) | Strong (25)
n/a | Bachelors, no clinical experience | Strong (5) | Strong (24)
Bruce Wayne | Bachelors, no clinical experience | Strong (5) | Strong (21)
Denny Crane | Bachelors, EMT, clinical experience 2 years | Strong (6) | Superior (29)
n/a | Bachelors, EMT, clinical experience 4 years | Strong (6) | Superior (28)

Table 2. Nursing student profiles



Pseudonym | Background | SimMan Usage | Age Range | HSRT Score Overall (33 possible)
n/a | 2nd year (current level), CNA, clinical experience 5 years | n/a | n/a | Strong (25)
Betty White | 2nd year (current level), Bachelors, clinical experience 2 years, non-traditional student | n/a | n/a | Strong (22)
Amelia Bedelia | 2nd year (current level), clinical experience 1 year | n/a | n/a | Not manifested (13)


Table 3. Qualitative themes for Case: Medical Students





1. Agreement with the expert

“My answer did not mention checking the bracelet specifically, but overall agreed with the expert opinion.” (Tess)

“My answer was the same as the expert observation…” (Tess)

“My analysis agrees with the expert's opinion.” (Charlie)

2. Suggesting a different course of action or disagreement with the expert

“I think I disagreed more when it was no errors from the experts and I had written down an error. Um, like I-I got it maybe because like I was searching for an error too much. Um, but-- and just little things as well…” (Tess)

“I think explaining the basics of medication (dosage, side effects, why it is given) when a patient is unfamiliar with the medication is appropriate in most circumstances.” (Charlie)

“…I am surprised that airway was not assessed for a supine vomiting patient.” (Denny Crane)


3. Connecting the scenarios to past experiences

In reference to prior EMT position, “it was nice to kinda look thru those glasses again.” (Denny Crane)

“Yeah, I worked in a path lab for almost a year, and I've um never really seen a nurse work on a day-to-day basis so it was kinda cool to see that and kind of guess what was going on…” (Charlie)

“…besides I do have path-- path lab, and I could actually pick up on a few errors, and I was-- there was a few errors I missed. I was like, "Oh, I should've got that!"” (Charlie)

4. Reflection on lack of error recognition

“I realized that something should be done about patient positioning, but was not correct on what exactly should have been done.” (Mags)

“Yeah, I don't really have any clinical experience, so a lot of the time I was like, "Well I don't really know what would be the right way to go about things, so I can't really say…” (Mags)

“The errors identified by the experts are not practices that I have yet been exposed to, perhaps accounting for my failure to notice them.” (Tess)

5. Reflection on correct error recognition

“pretty consistent. I may have gone into more specifics concerning the nurse's overall inappropriate tone.” (Denny Crane)

“In this answer, I felt more confident identifying the errors associated with IV medication administration, having seen them in previous scenarios. Overall, I think that I matched up well with the expert observation.” (Tess)

“I understand the error not verifying the medication and the medication, I just missed that in the video.” (Charlie)


Table 4. Qualitative themes for Case: Nursing Students





1. Agreement with the expert

“I agree with the expert. I was wrong” (Betty White)

“I caught all the errors--minus the flush” (Amelia Bedelia)

“The second nurse should have introduced herself. I agree with the expert.” (Danyelle)

2. Suggesting a different course of action or disagreement with the expert

“I just feel it’s necessary for the nurse to let the patient know what the nurse is doing to their body. What they're listening to and maybe even why.” (Betty White)

“I feel that there were errors in the way the nurse handled herself with the patient--asked questions but did not seem to care about the patient's response to them.” (Amelia Bedelia)

“I feel like the experts have let a lot of things slide that I feel are fairly significant errors” (Amelia Bedelia)

3. Connecting the scenarios to past experiences

Did not occur with this group



4. Reflection on lack of error recognition

“I didn’t think about the 2nd nurse introducing herself, but it is important.” (Betty White)

“I was more focused on method of assessment than listening to the patient, didn’t catch the error.” (Amelia Bedelia)

“I wondered about the replacing emesis basin with a clean one. She did flush the medication- I miss that one.” (Danyelle)

5. Reflection on correct error recognition

Not present for Betty White.

“While worded differently, my observations matched the experts.” (Amelia Bedelia)

Not present for Danyelle.


To assist in this analysis, the researcher used the qualitative analysis software Dedoose.  The researcher anticipated that the majority of the themes would be emergent in nature.  However, the researcher realized that she was predisposed to notice the focus groups’ mentions of anxiety caused by the simulation scenarios, owing to the numerous studies that discuss this phenomenon.  To avoid this bias and to provide reliability, the researcher directed the research monitor to code a sample of the data; the research monitor concurred with the analysis after conducting an audit trail.


This research project employed the Health Sciences Reasoning Test (HSRT) designed by Insight Assessment.  The test is administered online and is composed of multiple-choice items completed in 50 minutes.  The scales reported include overall critical thinking, along with individual scales for analysis, inference, evaluation, induction, and deduction (Facione, 2013).  The HSRT was used as a demographic element.



Results

Based on the themes and insights, the researcher was able to address the research questions from the medical and nursing students’ responses.

RQ#1 Case Medical Students.  When the medical students agreed with the expert, they were able to state their agreement and reflect further on the video clip.  For example, Charlie reflected on video clip 3 in scenario 1: “While I agree with the expert opinion, but I would like to think that nurses check the 4 major areas of heart sounds on the chest and that they would announce what they are doing…” (SIUC scenario 1, participant notes, March 20, 2015).  Stating their agreement and then reflecting related back to the theme of medical students suggesting another course of action, and it also allowed them to develop more complex reflections.  Similarly, when disagreeing with the experts, these students were realistic in their expectations.  Bruce Wayne understood that the compiled expert answer was a preference.  The group admitted that they disagreed more with the “no error” clips, but they noted that they were unconcerned about their performance, since they were early in their medical careers and outside of the nursing realm.  The medical students were able to go “outside the box” and were curious about their progress.  At the time of the scenarios, the students were unaware of the scoring process.

RQ#1 Case Nursing Students.  The nursing students disagreed and explained why after completing the scenarios—in this case, during the focus group.

RQ#2 Case Medical Students.  The medical students found the staged video scenarios an effective way to experience new perspectives and reported that they were able to use the scenarios as a learning experience.  All of the medical students reported that they enjoyed the scenarios, and even months later, during the member check, one individual commented on that aspect again.  As Tess commented during the focus group, “I think it would be really useful because they [sic] are errors that you might not think about all the time…” (SIUC focus group, personal communication, March 20, 2015).  After participating in the scenarios, the medical students desired to understand the reasoning behind the compiled expert answer.  They wanted to use and learn from scenarios like these and suggested that they would like to analyze videos featuring themselves.

RQ#2 Case Nursing Students.  The nursing students said that they found the staged video scenarios effective because the scenarios offered an alternative to their lengthy and primarily paper-based labs; the scenarios allowed them to experience a live patient and took less time to complete.  Amelia Bedelia said, “[w]e do a lot of time like defining terms and, um, answering questions.  And we have a case study or a care plan this week… I've definitely learned a lot thru the care plan, but I feel like this is more of, uh, something that I could, I could spend an hour watching scenario videos every week versus probably not spend ten hours a week doing a care plan.”  Danyelle commented, “I really liked the videos, I thought they were kind of fun.  And like really watching them…” (JALC focus group, personal communication, September 12, 2015).  This sentiment was shared by the group; several said they would like to see the staged video scenarios as part of their education.  In addition, the group focused on their development of empathy, since they thought the nurse in the scenario was rude.  The entire group agreed that they would be more aware of how they treated patients after having experienced the scenarios.  The group also spent time discussing their disagreements and their frustration with the experts’ answers.  In addition, the nursing students expressed a desire to understand the reasoning behind the compiled expert answer and wished to use and learn from the videos as part of their educational process.  Lasater (2007, p. 273) also found that her participants expressed an “intense desire for more direct feedback” during a live simulation scenario debrief.

Cross-case Insights (Case: Medical Students & Case: Nursing Students)

In analyzing the two cases, the researcher used the research questions while allowing for any others that might emerge during the analysis.  For example, the researcher examined the participants’ engagement and anxiety.  The only time the students indicated in the focus groups that they felt anxiety was during the HSRT.  All of the students appeared to be highly engaged during the study.

For RQ#1, when the medical students agreed with the expert, they agreed and reflected, while the nursing students agreed and moved on.  The medical students disagreed but used the opportunity as a learning experience.  The nursing students disagreed and explained why after completing the scenarios, in this case during the focus group.  The nursing students made more false-positive identifications, flagging errors not noted by the experts, including hand hygiene, the nurse’s tone, and the urinary catheter procedure.  The medical students were able to self-judge their agreement with the expert, but the nursing students were less accurate and needed time to develop this skill.  Both groups used the compiled expert answer as guidance for better matching it in future scenario responses.

For RQ#2, both medical and nursing students found the activity effective, and it promoted cognitive dissonance, discussion, and further learning.  The medical students tended to report more of a learning experience and provided longer, more detailed reflections on the staged video scenarios.  The nursing students had shorter, less detailed reflections and raised more discussion about why they disagreed with the expert.

Both the medical and nursing students reported that they enjoyed the scenarios.  The medical students enjoyed seeing a perspective they had not seen before and were especially interested in the role of the family member.  The medical students were able to decrease their frustration by learning from the scenarios while acknowledging that they were outside the nursing field; they recognized that they were early in their learning.  By contrast, the nursing students were able to learn despite being frustrated and disagreeing with the compiled expert answer.  The nursing students also developed a sense of empathy through participating in the scenarios, a response mirrored by the medical students to a lesser degree.  Medical and nursing students both suggested that a better compiled expert answer might include a rationale.  Both groups desired to use scenarios like these in their education.  Medical students wanted to see video scenarios that involved themselves as the actors.

The researcher became interested in how the medical and nursing students’ professional focus seemed to differ in how they approached the interactive video activity.  The medical students appeared predisposed to taking risks and “thinking outside the box” so they could learn from the mistakes, as demonstrated in their longer scenario reflections and focus group discussion.  The nursing students instead focused more on the exact actions and tended to be brief in their reflections.  The researcher suspected that this difference was due to their distinct professional focus, as well as to the training they received within their programs.


Discussion

In instructional design, we often ask ourselves whether the instruction is efficient, effective, and engaging (E^3), based on the principles proposed by Merrill (Carliner & Shank, 2008).

Efficiency is often related to the time and money that an instructional activity requires to create and to run. This research project proposed that since SimMan is expensive, its value should be maximized by recording the SimMan scenarios so they could be used again. The researcher believed the project was financially efficient, but it may not be as efficient as it first appears, since creating the videos and recording the experts’ analyses required time and money. Using the videos repeatedly should, however, reduce the per-use cost over time. Since many nursing and medical education programs already record SimMan labs performed by their students, the only added cost might be the expert analysis. A more interesting issue was that when the researcher started the project, she hoped the students would find the scenarios effective and engaging; she did not expect the students to find the scenarios efficient, as the nursing students did when they compared one hour of scenario work to several hours of work in their current program. Perhaps programs can have all three, since the students went on to discuss effectiveness and engagement.
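The per-use cost argument can be made concrete with a back-of-the-envelope calculation. All figures below are hypothetical assumptions for illustration, not costs reported in this study.

```python
# Hypothetical amortization sketch: fixed production costs (recording the
# scenarios and capturing the experts' analyses) spread over repeated uses.
def cost_per_use(production_cost, expert_analysis_cost, uses):
    return (production_cost + expert_analysis_cost) / uses

# Illustrative figures only (not from the study):
for uses in (1, 10, 100):
    print(uses, cost_per_use(2000.0, 500.0, uses))
```

If labs are already being recorded, the production cost effectively drops out and only the expert analysis remains to amortize, as noted above.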

The medical and nursing students who participated indicated that they found the staged video scenarios engaging and effective, but they desired to further understand the compiled expert answers. The desire to understand the answer is echoed in Lasater’s (2007) study, where the students wanted to know how to improve.

In relation to the clinical judgment and critical thinking aspects of the project, the HSRT was able to show the similarity between the groups, but looking back at Tanner’s (2006) ideas about thinking like a nurse provided fascinating insight into some of the results of this study. Tanner proposed that clinical judgments are influenced more by what individuals bring with them than by the objective data. Both groups expressed this idea. The medical students drew on prior knowledge from classes and EMT experiences to respond to many of the video clips. The nursing students did not discuss this aspect as much, but Betty White, a non-traditional student, did approach the scenarios in a different manner: during the focus group she commented that she ignored certain aspects because she realized they were meant to be excluded from her possible answers. The researcher believed this is one way that using XBT truly can accelerate expertise. The staged video scenarios can be used as a teaching tool to give medical and nursing students ideas that help them understand medical errors, all of which they can bring with them as they go into practice.

Tanner (2006) also points out that clinical judgments are influenced by the context of the situation. During the focus groups for both cases, all of the participants commented on the context of the clips and expressed that they were unsure how to react to certain aspects. Finally, Tanner (2006, p. 207) noted that “reflection on practice is often triggered by a breakdown in clinical judgment and is critical for the development of clinical knowledge and improvement in clinical reasoning.” When the researcher made the clips for this study, she did not intend to promote cognitive dissonance to the degree observed in the nursing students. However, as she worked through the process, she realized that there were benefits to being incorrect, especially for the medical students, who wrote lengthier reflections about why they were wrong and were able to grow from those experiences. The researcher wondered whether a similar phenomenon might have occurred for the nursing students had they been less frustrated by not knowing the rationale behind the compiled expert answers.

According to the clinical judgment model, reflection on action is needed for noticing. The scenarios provided a reflection-in-action opportunity that was more granular than typical reflection experiences, since the students were asked to reflect clip-by-clip after viewing the compiled expert observations. In addition, while the focus group was not intended to address the “correctness” of the compiled expert answers, it could be used as a reflection-on-action opportunity with a medical instructor moderating the discussion. Another implication for XBT design, then, is that a learning activity should include a debrief, similar to the research focus group, in which learners are able to explore their differences with experts. The XBT method can also be enhanced by recording, either in text or audio, the experts’ rationale for what they observe or do not observe as problems in staged video scenarios. Learners could then be provided with a mechanism for checking the rationale of multiple experts, taking advantage of the learning moment generated by the cognitive dissonance of being “wrong.”


Conclusion

The researcher believes that this research will help medical educators utilize their simulation labs to their fullest potential.  The researcher determined that the medical students enjoyed the scenarios and were able to learn from and reflect on their scenario answers, even when they were incorrect.  In contrast, while the nursing students also enjoyed the scenarios, they were frustrated by the compiled expert answers and therefore spent some of their time debating them.  Although the researcher did not directly ask the students whether they found the scenarios effective, she can infer that they did, since they appeared to learn from the cognitive dissonance produced by disagreeing with the experts.  This is similar to the idea that Christine Tanner (2006) proposed in her Clinical Judgment Model, which incorporates the idea that noticing comprises reflection-in-action and reflection-on-action.  Reflection-in-action is often described as “thinking on your feet” (Schön, 1987; Rainer, 2002).  Tanner (2006) equates this to noticing how the patient responds to the intervention and adjusting appropriately as needed.  Reflection that occurs before or directly after the activity is considered reflection-on-action and subsequently leads to clinical learning (Schön, 1987; Rainer, 2002; Tanner, 2006).

The students also wanted scenarios like these in their programs, as the scenarios were useful and took less time than some assignments.  The students also wanted to see themselves in the scenarios and to get more detail from the compiled expert answer, so they could understand why the experts critiqued the scenario as they did.

These scenarios can provide a repeatable learning activity that extends the use and value of the simulation lab.  As discussed above, Tanner included reflection as a key aspect of noticing.  The XBT approach encourages students to reflect on their answers as they proceed through the video clips, but the researcher would also propose that the reflection component could be strengthened with a debriefing session similar to the focus groups used in this research.  As Lasater (2007) proposed, noticing alone is not engaging enough, and students need to interactively observe.  The focus group component of this research revealed that students are willing to discuss areas where they did well and where they would like to improve.  Therefore, modifying the focus group experience into a debriefing would allow an instructor to give direct feedback and understand what portions of the scenario could be improved for future use.  XBT will also promote interactive observation, which could improve communication skills and teamwork since noticing and awareness are increased.

Take Home Messages

  • Participants enjoyed and learned from the video simulation scenarios, even when their answers were incorrect
  • Expertise-based training (XBT) encourages interaction, reflection and discussion, including productive discussion of disagreements with expert judgments
  • Expertise-based training offers a new, repeatable way to extend the use and value of simulation labs

Notes On Contributors

Dr. Abby Razer is an instructional designer and IT consultant. She has a special interest in the benefits of expertise-based training (XBT) in the medical and veterinary science fields. Currently, she is working in the technology field to gather insights on new technology that may impact the future of XBT.

Dr. Christie McIntyre studies teacher development, assessment for student learning (Pre-K to College), and applications of constructivism. As Associate Director for Program Review and Assessment, she supports curricular and co-curricular faculty in their development of program and course assessments to measure student learning and engagement across campus.

Dr. Peter Fadde studies expert learning and performance and has developed patent-pending computer applications for training expert perceptual skills in sports. His expertise-based training (XBT) approach, which adapts the laboratory methods of expertise researchers for training purposes, has been applied in fields ranging from sports to truck driving to classroom teaching. Dr. Fadde also conducts research on video in teacher education and blended online learning.


Acknowledgements

The authors would like to acknowledge and thank everyone who contributed to this project.


References

Carliner, S. and Shank, P. (2008) The e-learning handbook: Past promises, present challenges. San Francisco: Pfeiffer.

Facione, N. (ed.) (2013) HSRT test manual. San Jose, CA: California Academic Press.

Fadde, P. and Sullivan, P. (2013) ‘Using interactive video to develop preservice teachers' classroom awareness’, Contemporary Issues in Technology and Teacher Education, 13(2), pp. 156–174.

Jonassen, D. H. (2010) Learning to solve problems: A handbook for designing problem-solving learning environments. London: Routledge.

Klein, G. (1998) Sources of power: How people make decisions. Cambridge, MA: MIT Press.

Lasater, K. (2007) ‘High-fidelity simulation and the development of clinical judgment: Students' experiences’, Journal of Nursing Education, 46(6), pp. 269–276.

Merriam, S. B. and Tisdell, E. J. (2016) Qualitative research: A guide to design and implementation. 4th edn. San Francisco, CA: John Wiley & Sons.

Patton, M. (2002) Qualitative research & evaluation methods. 3rd edn. Thousand Oaks, CA: Sage.

Rainer, J. D. (2002) Reframing teacher education: Dimensions of a constructivist approach. Dubuque, IA: Kendall/Hunt.

Schön, D. A. (1987) Educating the reflective practitioner. San Francisco: Jossey-Bass.

Tanner, C. A. (2006) ‘Thinking like a nurse: A research-based model of clinical judgment in nursing’, Journal of Nursing Education, 45(6), pp. 204–211.




There are no conflicts of interest.
This has been published under Creative Commons "CC BY-SA 4.0".

Ethics Statement

This study has been reviewed and approved by the Southern Illinois University Carbondale Human Subjects Committee. The protocol numbers are 14402 and 13433.

External Funding

This article has not received any external funding.



Ken Masters - (25/07/2019)
An interesting paper on analysing the error-noticing skills of nursing and medical students using video simulation.

The study itself is very useful, but the authors do need to address several issues in the paper, as these severely undermine its value:

• The paper refers to the interesting example of the preservice teachers. While the project and its aims are described, there is no indication of the value or outcomes of that project. It would be useful (even in a sentence or two) if this could be given, as it is surely part of the motivation for using this method on the nursing and medical students.
• Given the origin of the authors, one might assume that the study was conducted in the USA, but this needs to be stated explicitly. The authors do describe it as “Midwestern School of Medicine”, but the term “Midwestern” is also used in other countries.
• It is probably not necessary to anonymise the school (unless that was a condition of the ethics’ approval), and writing “X institutional review board approved the protocol” in the text is probably of little value, given that the ethics approval below the paper indicates approval by the Southern Illinois University Carbondale Human Subjects Committee.
• There are structural problems in reporting on the data. It is not clear if the themes emerged only through the coding, or if they were established beforehand, based upon the literature. If based upon the literature, then they can be identified in the Methods. If emerging from the data, then the authors should state explicitly the methodology used in the coding and theming (e.g. constant comparison (from the description given, this appears to be the case)). Then, from somewhere within the paragraph headed “Data Analysis”, Table 1 and Table 2 need to go into the Results section. (I say “somewhere” because the exact place is determined by the method used to determine the themes; whether or not they would go into the Results would depend on whether or not they were pre-established or emerged from the data).
• The first time the term “SimMan” is used appears to be in Table 1, with no prior discussion or explanation. This needs to be addressed.
• The data in Table 1 do not appear to match the headings.
• The paper refers to “the researcher”, but there are three authors of this paper. Either it should refer to “the researchers” or the specific researcher should be identified.
• The Conclusion appears to introduce new material. This should be moved to the Discussion or deleted.
• The Take-home messages sell the study short, and the authors may wish to expand on these a little.

Smaller issues
Overall, the language of the paper is fine, but there is some careless writing, and the authors might wish to give a little more attention to detail.
• Small basic punctuation and language issues (e.g. their students learning -> their students’ learning; “Illness scripts provides” either “The use of Illness scripts provides…” or “Illness scripts provide”; “noting the time code where the incidents occur” should be “noting the time code when the incidents occur”)
• Writing out of XBT before using the abbreviation (in the Introduction, where it is first used). And then use only XBT. (Sometimes, if the term is used again only much later in the paper, the author might wish to write it out, but twice in one paragraph is unnecessary).
• “training also include” perhaps remove the “also”.
• “experts essentially coded” remove “essentially” (they either coded or they did not).
• Other similar instances occur, especially the omission of commas and hyphens, and it would be a good idea for the authors to go through their manuscript carefully to correct them.

So, overall a good study, but the paper needs some important work.
Possible Conflict of Interest:

For Transparency: I am an Associate Editor of MedEdPublish

Barbara Jennings - (10/06/2019)
This is an interesting report of a qualitative cross-case study. The authors designed the study to build on a method described as expertise-based training (XBT). It is a cross-disciplinary study of undergraduates in medicine and nursing, and it analysed the students’ situational cues when they observed clinical scenarios. The authors documented the participants’ reflections on any dissonance between their interpretation and the interpretation of experts.
One of the strengths of the study is the presentation of a method that can be widely applied across professions where new and challenging scenarios are regularly encountered and that depend on “noticing skills” and rapid evaluation.
However, the study of illness scripts & how clinical reasoning skills are developed (and can be tested for) are quite well established in medicine. The authors did not elaborate on this in their background literature review. Interested readers may therefore find this overview of scripts and clinical reasoning useful in the following AMEE guide:
Script concordance testing: From theory to practice: AMEE Guide No. 75. Medical Teacher, March 2013, 35(3), pp. 184–193.
Other suggestions that I would make for a revised paper are
1. The addition of more explanation of how / if HSRT data was integrated with other findings.
2. For other researchers interested in taking a similar approach, a lot more detail of (if not access to) the simulated cases is necessary.
3. Obvious errors in table 1 should be corrected.
4. Dedoose software should be cited appropriately.
5. The Results section would be clearer if the authors use descriptive sub-headings to introduce #RQ1, #RQ2.
I will recommend this paper to colleagues interested in problem based learning and the assessment of clinical reasoning.
Possible Conflict of Interest:

For Transparency: I am an Associate Editor of MedEdPublish.

Lee HangFu - (11/04/2019)
The paper "Accelerating the Noticing Skills of nursing and medical students Using Video Simulation" offers a nice perspective on sharing expert opinion on scenarios that can be used in class and in distance learning. The study compiled responses from diverse healthcare learners (nursing and medical students); the hypothesized and intended responses may vary depending on the nature and personality of the healthcare student. A nicely presented paper with a wealth of comparative data from diverse healthcare learners.