New education method or tool
Open Access

Virtual Spinal Tap: using haptic data to learn procedures with feel

David Topps[1], Mike Korolenko[1], Jamie de Domenico[2], Donna Newhouse[2]

Institution: 1. Northern Ontario School of Medicine, 2. Lakehead University
Corresponding Author: Prof David Topps (topps@ucalgary.ca)
Categories: Assessment, Research in Medical Education, Technology
Published Date: 04/04/2018

Abstract

Distributed medical education programs have trouble providing suitable resources for certain forms of clinical training. Some procedures are harder to learn because watching a skilled practitioner tells the learner little. Using a dual-control haptic device interface and anatomically accurate virtual spinal models, we created a Virtual Spinal Tap to improve the verisimilitude of an online learning experience. The devices and interfaces provided the means to measure and reproduce the subtle forces involved in a procedure that is learned by feel. Integration of this model with other simulation modalities would enhance its applicability.

Keywords: haptics-based simulation, virtual scenarios

Introduction

The Northern Ontario School of Medicine, the first new medical school in Canada in 35 years, was established with a focus on distributed medical education (DME) for remote northern and rural communities. DME poses many challenges, both pedagogically and logistically. This is especially true for teaching surgical procedures where there is a clear need for remote teaching and evaluation of skills and praxis. Many important clinical procedures that are urgently needed in rural communities, such as lumbar puncture, joint aspirations, ocular procedures, or even burr holes, are seen too infrequently for learners to acquire or practitioners to retain a good practical feel.(Carr, 2005; Cohen et al., 2006; Luft, Bunker, & Enthoven, 1979; Park, Witzke, & Donnelly, 2002)

Distances in the North are great, and it is not always feasible to ask much-needed clinicians to leave their communities for additional training. Some groups have succeeded in taking simulation training on the road to small communities, but poor roads and inclement weather pose additional barriers. It is particularly difficult to ask surgical teachers whose skills are already scarce to spend time touring, and travelling road-show crews have seen high rates of burnout.(King, Moseley, Hindenlang, & Kuritz, 2008) To address some of these problems, our research team set out to explore the use of haptic simulators for teaching surgical procedures and skills remotely. In particular, we aimed to design and implement a system that was modular, mobile and hardened for travel.

There is a long history of exploratory and pilot projects on the use of haptic simulators for teaching procedures such as lumbar puncture; some of the articles date back to 1985. There is reasonable evidence to support the use of haptics for procedural training.(van der Meijden & Schijven, 2009; Westebring-van der Putten, Goossens, Jakimowicz, & Dankelman, 2008) Haptic feedback early in training has been shown to make a significant difference in skill acquisition.(Strom et al., 2006) Cao et al. have shown that haptic feedback improves skill acquisition when learners are tested under additional cognitive loading,(Cao, Zhou, Jones, & Schwaitzberg, 2007) and many studies have shown high degrees of learner and teacher satisfaction when haptic simulators are employed.

Methodology

Because this is a newly developing field, where much of the work is exploratory, we felt it was premature to use traditional techniques such as a randomized controlled trial. Instead, we pursued a design-based research approach (Design-Based Research Collective, 2003), examining challenges in the development and implementation of remote haptic simulation, with regard to both technical and pedagogical aspects. We received funding from the Canada Foundation for Innovation (CFI), which supports infrastructure for research in virtual reality. We engaged various groups of undergraduate medical students, through convenience sampling, to provide program evaluation at different stages of the project. We employed a SharePoint Team Site for the collaborative aspects of the project, particularly for ongoing field notes through several iterations of the learning designs, project documentation, participant surveys and a wiki on technical issues.

Results

Our first concern was simple distance. Previous research has shown that haptic feedback signals need to be cycled at around 1000 Hz in order to establish realistic sensation.(Marshall, Yap, & Yu, 2008) At this frequency, the speed of light becomes an issue, limiting a direct feedback loop to distances of less than about 100 km, which is too short for our needs. This is a well-known problem, and we were pleased to find tools that used local caching and algorithmic interpolation of the signals to address it. Initially, we selected a package from Handshake VR (http://olab.ca/handshakevr-haptic-toolkit/), which provided us with smooth signal interpolation, together with an attractive rapid application design (RAD) interface. See Figure 1.
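As a rough illustration of where that distance limit comes from, the following back-of-envelope calculation (our own sketch, assuming propagation at roughly two-thirds of the speed of light, as in optical fibre, and ignoring switching and processing delays) compares the 1 ms cycle budget against round-trip propagation time:

    # Back-of-envelope check of the ~100 km limit for a direct 1000 Hz haptic loop.
    # Assumes signal propagation at ~200,000 km/s (typical of fibre) and ignores
    # all switching and processing delays, so real-world limits are tighter still.

    HAPTIC_RATE_HZ = 1000
    CYCLE_BUDGET_S = 1.0 / HAPTIC_RATE_HZ        # 1 ms available per update cycle
    PROPAGATION_KM_PER_S = 200_000               # approximate speed of light in fibre

    max_round_trip_km = CYCLE_BUDGET_S * PROPAGATION_KM_PER_S   # about 200 km out and back
    max_one_way_km = max_round_trip_km / 2                       # about 100 km between sites

    print(f"One-way distance limit for a direct {HAPTIC_RATE_HZ} Hz loop: "
          f"about {max_one_way_km:.0f} km")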

 

Figure 1: Handshake VR interface, showing spinal model and RAD interface

 

By using a shared set of Handshake VR packages, together with Phantom Omni haptic devices, we were able to set up interacting dual controls with haptic feedback over a distance. This enabled a powerful learning tool in which the teacher can feel what the learner is doing and provide guidance. This is a crucial asset in learning some procedures: in a real lumbar puncture, you cannot lay a hand on someone else's hand and expect to feel what they are doing without completely spoiling their ability to sense the underlying tissue structures. The Handshake VR dual control allowed us to selectively sense what the learner was feeling without interfering with their sensation.

The dual-channel interactivity between the paired devices can be selectively controlled: in Teacher Mode, the master device could be used to apply guidance and subtle feedback, helping the learner to find the optimum path through the tissues; in Examiner Mode, the master device sensed all that the learner was feeling but provided no haptic feedback to the learner, intentional or otherwise. This dual-control capability was tested with teachers and learners in a variety of situations, sometimes in the same room, but also between our two main teaching sites, separated by 1200 km. There was no appreciable lag in feedback, despite the increased distance. Both teachers and learners rated this capability very highly. "This is a game changer", remarked one highly impressed teacher.
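The sketch below illustrates the Teacher/Examiner coupling logic in simplified form. It is our own illustration in Python, not the Handshake VR interface; the class, function and parameter names are hypothetical.

    # Simplified illustration of the selective coupling between paired devices.
    # NOT the Handshake VR API; names and the guidance gain are hypothetical.

    from dataclasses import dataclass

    @dataclass
    class HapticSample:
        position: tuple   # stylus tip position (x, y, z), in mm
        force: tuple      # force currently rendered at that device, in N

    def couple(teacher: HapticSample, learner: HapticSample, mode: str, gain: float = 0.3):
        """Return (force at teacher device, force at learner device) for one cycle."""
        if mode == "examiner":
            # Teacher mirrors the learner's forces; nothing is fed back, so the
            # learner's sensation of the tissues is untouched.
            return learner.force, (0.0, 0.0, 0.0)
        if mode == "teacher":
            # Teacher still feels the learner's forces, while a gentle corrective
            # force nudges the learner's stylus towards the teacher's position.
            guidance = tuple(gain * (t - l) for t, l in zip(teacher.position, learner.position))
            return learner.force, guidance
        raise ValueError(f"unknown mode: {mode}")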

We created the 3-D volumetric model of the lumbar spine using a DICOM image set from the CT scan of a real patient, modeled with a toolset from Akamai (www.akamai.com). This approach produces a much more realistic model than a classic artist's impression: it has real anatomical variability, can more easily incorporate real anatomical pathology, and can more easily generate additional anatomical features such as osteophytes, osteoporosis or old fractures. Using CT data gave us very fine resolution of the bony detail, which was ideal for our purposes, but MRI data can be used as well and provides better definition of soft tissue structures.
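For readers unfamiliar with this kind of pipeline, the minimal sketch below shows one way a bony volume can be extracted from a CT DICOM series. It assumes the pydicom and numpy packages and a simple Hounsfield-unit threshold; it is an illustration only, not the Akamai toolset we actually used.

    # Minimal sketch: stack a CT DICOM series into a volume and threshold for bone.
    # Assumes pydicom and numpy; the 300 HU threshold is an illustrative assumption.

    import numpy as np
    import pydicom
    from pathlib import Path

    def load_ct_volume(dicom_dir: str) -> np.ndarray:
        slices = [pydicom.dcmread(p) for p in Path(dicom_dir).glob("*.dcm")]
        # Order slices along the patient axis before stacking them into a volume.
        slices.sort(key=lambda s: float(s.ImagePositionPatient[2]))
        volume = np.stack([s.pixel_array for s in slices]).astype(np.int16)
        # Convert raw pixel values to Hounsfield units using the stored rescale tags.
        return volume * int(slices[0].RescaleSlope) + int(slices[0].RescaleIntercept)

    def bone_mask(volume_hu: np.ndarray, threshold: int = 300) -> np.ndarray:
        # Bone sits well above ~300 HU on CT, so a simple threshold recovers the
        # bony spine for surface or volume rendering.
        return volume_hu > threshold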

 

Figure 2: Phantom OMNI haptic device

 

The Phantom Omni haptic device (Figure 2) that we used has six degrees of freedom, but provides active force feedback in only three of them. As other similar projects have found, this produces the disconcerting effect that you can freely wobble the stylus around, no matter how deeply it is inserted into the virtual tissues. To eliminate this effect, one must either produce motion resistance in the remaining three degrees of freedom or physically anchor the entry point of the needle. We explored the use of paired haptic devices to provide this additional motion restriction in the other three degrees of freedom but found that this was neither practical nor cost-effective.
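One software approach often described for this problem is a "virtual fixture" that penalizes lateral deviation from the entry axis once the needle has penetrated the skin. The sketch below is our own illustration of that idea, assuming numpy, and is not the solution used in the project.

    # Illustrative "virtual fixture": once the virtual needle is through the skin,
    # lateral deviation from the entry axis is penalised with a restoring force.
    # The stiffness value is an assumption for illustration only.

    import numpy as np

    def constraint_force(tip, entry_point, entry_axis, depth, stiffness=80.0):
        """Force (N) pushing the needle tip back towards the line defined at entry."""
        if depth <= 0.0:                       # needle not yet through the skin
            return np.zeros(3)
        axis = np.asarray(entry_axis, dtype=float)
        axis /= np.linalg.norm(axis)
        offset = np.asarray(tip, dtype=float) - np.asarray(entry_point, dtype=float)
        lateral = offset - np.dot(offset, axis) * axis   # component off the entry axis
        # Stiffen the constraint with depth so a deeply inserted needle cannot wobble.
        return -stiffness * min(depth, 1.0) * lateral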

We experimented with a number of rubber membranes to represent the skin of the patient's back and to provide the physical fixation point at the entry site of the virtual needle. We were unable to find a material that provided good stabilization while still allowing user choice of entry point. We note that others, such as Haluck et al., have had some success with this approach.(Haluck, Webster, Mohler, & Melkonian, 2000)

We experimented with a variety of user visualizations, that is, how the model was displayed to the user. Initially we felt that stereoscopic viewing would be an essential feature, but user feedback quickly showed that users could quite easily interpret a rotatable 3-D model displayed on a 2-D screen, and that stereoscopic viewing produced little more than a novelty effect. As several users pointed out, one does not have the luxury of seeing needle placement in real life, so it was more important to learn needle guidance by feel alone.

We had more success with the simple option of making the virtual skin translucent or opaque. Early learners appreciated being able to see roughly what the underlying bony model looked like. Once they were comfortable finding a workable needle path for the lumbar puncture model, they appreciated the skin being made opaque on-screen so that they had to reproduce that needle path by feel alone. We also provided learners with the ability to select an orthogonal view of the model, in effect looking at it laterally, which they sometimes found helpful in assessing needle depth. However, this feature was used much less and occasionally confused some learners.

A few additional operational issues showed up in testing and implementation. For example, it is crucial that the visual display and the haptic angle of attack of the needle device are exactly aligned for the user; such calibration was typically needed only once, at the start of a learning session. Operating paired devices over a distance required specific ports to be opened on the university firewalls. The haptic devices themselves are now reasonably robust but still require supervision during the setup phase (see Appendix 1). We highlight these particular issues because, while they are common to most laboratory setups and easily solved there, they pose far greater challenges in the educational environment that is our main objective: providing remote simulation opportunities to northern and rural communities.
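As a small practical aid, a pre-session connectivity check of this kind can confirm that the remote haptic host is reachable on the required port before devices are paired. This is our own illustration; the host name and port number below are placeholders, not the project's actual values.

    # Pre-session connectivity check (illustrative): is the remote haptic host
    # reachable on the required TCP port? Host and port here are placeholders.

    import socket

    def port_reachable(host: str, port: int, timeout_s: float = 3.0) -> bool:
        try:
            with socket.create_connection((host, port), timeout=timeout_s):
                return True
        except OSError:
            return False

    if __name__ == "__main__":
        print(port_reachable("remote-lab.example.org", 5555))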

Metrics

One advantage of working directly with a haptic device such as the Phantom Omni is that we had access to sophisticated metrics on user performance, including positional and force data in real time. At first, we used this data stream with Hidden Markov Models to measure the differences between optimal paths derived from expert users and the paths generated by neophyte learners. While this approach looked promising, we quickly found that our experts preferred working directly with their paired learners and trusted their own subjective analysis more. Given that teachers are a scarce resource in such a specialized area, we briefly explored digitally recording user-generated paths so that teachers could review a learner's progress on a time-shifted basis. For our particular application in remote teaching, this approach looked promising, but we were unable to complete the analysis within the funding period allowed.
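A minimal sketch of the path-recording idea is shown below: each update cycle's position and force sample is appended to a CSV file that a teacher could replay later. This is our own illustration; the file name and field names are assumptions, not the project's actual logging format.

    # Illustrative recorder for time-shifted review of needle paths.
    # File name and column names are assumptions for the sketch only.

    import csv
    import time

    class PathRecorder:
        def __init__(self, filename="needle_path.csv"):
            self._file = open(filename, "w", newline="")
            self._writer = csv.writer(self._file)
            self._writer.writerow(["t_s", "x_mm", "y_mm", "z_mm", "fx_N", "fy_N", "fz_N"])
            self._start = time.monotonic()

        def record(self, position, force):
            # Append one timestamped sample of stylus position and rendered force.
            self._writer.writerow([round(time.monotonic() - self._start, 4), *position, *force])

        def close(self):
            self._file.close()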

As part of a related project connected to the Health Services Virtual Organisation (HSVO, http://olab.ca/hsvo-health-services-virtual-organization/), we also explored the utility of connecting these haptic devices to the HSVO Savoir bus interface.(R. Ellaway et al., 2010; R. H. Ellaway, Cooperstock, & Spencer, 2010; Spencer, Copeland, Harzallah, Hickey, & Liu, 2009) Although the user-controlled lightpaths available through Savoir provide high bandwidth and data throughput, the cycle turnaround delays were sufficient to limit the usefulness of this approach. However, we did have some success in connecting our haptic devices to OpenLabyrinth virtual patients,(R. Ellaway, Joy, Topps, & Lachapelle, 2013) such that starting parameters, such as device position, initial resistances and the relevant anatomical model to load, could all be made available at the time of device launch. This would make it easier for teachers to set up models in remote labs, ready for learners to use when needed, and the virtual patients would provide clinical context so that learners were familiar with the clinical reasoning behind the procedure to be learned (http://olab.ca/virtual-spinal-tap-scenario).
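To illustrate the kind of starting parameters involved, the hypothetical payload below shows what a virtual patient case might hand to the haptic station at launch. The field names are our own assumptions, not an OpenLabyrinth or Savoir specification.

    # Hypothetical launch-parameter payload; field names are illustrative only.

    import json

    launch_parameters = {
        "anatomical_model": "lumbar_spine_ct_case03",
        "skin_opacity": 1.0,                        # start with the skin opaque
        "device_start_position_mm": [0.0, 40.0, 120.0],
        "tissue_resistance_scale": 1.0,             # 1.0 = default tissue stiffness
        "mode": "examiner",                         # teacher senses only, no guidance force
    }

    print(json.dumps(launch_parameters, indent=2))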

User satisfaction, collected from both teachers and learners via surveys and focus groups, was uniformly high. With only 16 surveys, none of the results achieved statistical significance, but it was clear that both participant groups were enthusiastic about and appreciative of the capabilities of haptic devices in the remote teaching of surgical procedures. Many users, especially teachers, particularly liked the dual-control aspect of the paired haptic devices, noting that it presented an entirely new modality for teaching procedures where students have to learn the feel of the procedure.

A number of other procedures were considered and evaluated as potential topics for haptics-assisted procedural training. See Table 1.

 

Procedure | Feasibility | Fidelity | Notes
Spinal tap | +++ | + | Reduction in mobility as needle penetrates skin is hard to reproduce
Dental caries | +++ | +++ | Single point of drill easy to reproduce
Corneal rust ring | +++ | +++ | Corneal surface can be made more or less malleable
Joint injections | ++ | ++ | Reduction in mobility as needle penetrates skin is less important
IUCD insertion or endometrial Bx | ++ | + | Reduction in mobility as sound penetrates uterus is hard to reproduce

Table 1: potential clinical procedures evaluated for haptic modeling

Economics

One element of this project was to explore the possibilities of implementing these devices in remote teaching situations. Several tiers of haptic device were tested, including newer consumer-level devices such as the Novint Falcon (https://cs.stanford.edu/people/conti/falcon.html). Although these $300 consumer devices were much more attractive from a cost perspective, their sensitivity and utility were insufficient for our needs. The Phantom Omni device was a more reasonable compromise in this respect.

Software costs were also an issue. Although the Handshake VR rapid application development platform was very effective for prototyping, the cost of a more widespread multiple-site license quickly became prohibitive. Part of this cost was the MATLAB licence (https://www.mathworks.com/products/matlab.html), which is now often available as a site licence for universities. We extensively explored alternative languages and development tools, such as CHAI from Stanford.(Conti et al., 2003) While such tools are significantly cheaper, or even free as open source, the development time they required quickly became prohibitive. Newer development tools such as X3D or H3D (http://www.h3dapi.org/modules/mediawiki/index.php/Beginner_X3D_and_H3DAPI) looked promising, but we were unable to assess these within the time constraints of the project.

Conclusions

Overall, our project uncovered some exciting areas of development and demonstrated considerable potential for the use of haptic devices, especially in paired mode, for remote procedural teaching. However, it was also clear that a number of areas need to be strengthened before this becomes practically viable. First among these is the availability of haptic application development software with runtime licensing that remains affordable when scaled across multiple sites. Connecting these devices into other projects through network-enabled platforms, such as the one being developed by HSVO, looks very promising but is still in its infancy.

Take Home Messages

Some procedures cannot be learned by watching. Haptics-based simulation is useful for some procedures, such as the spinal tap. 

Paired haptic controls allow the teacher to feel what the learner is feeling. 

Integrating haptics-based simulation with other simulation modalities in blended scenarios affords a more holistic approach.

Notes On Contributors

This work was carried out over a decade ago but the data files were lost. This report is based on resurrected data. 

David Topps designed the project and wrote the manuscript. 

Mike Korolenko managed the project and contributed to the manuscript. 

Jamie de Domenico provided data and coding assistance in the project. 

Donna Newhouse provided subject matter expertise on the virtual anatomy aspects of the project.

Acknowledgements

This project was partially funded by the Canada Foundation for Innovation.

In accordance with the Tri-Council Policy Statement (Canadian Institutes of Health Research, 2014) and the Helsinki Declaration (WMA, 1964), an ethics certificate was not required, as this was regarded as program evaluation.

Bibliography/References

Cao, C. G., Zhou, M., Jones, D. B., & Schwaitzberg, S. D. (2007). Can surgeons think and operate with haptics at the same time? J Gastrointest Surg, 11(11), 1564–1569.

https://doi.org/10.1007/s11605-007-0279-8   

Carr, M. M. (2005). Program Directors Opinions about Surgical Competency in Otolaryngology Residents. The Laryngoscope, 115(7), 1208–1211.

https://doi.org/10.1097/01.MLG.0000163101.12933.74   

Cohen, J., Cohen, S. A., Vora, K. C., Xue, X., Burdick, J. S., Bank, S., … Villanueva, G. (2006). Multicenter, randomized, controlled trial of virtual-reality simulator training in acquisition of competency in colonoscopy. Gastrointestinal Endoscopy, 64(3), 361–368.

https://doi.org/10.1016/j.gie.2005.11.062   

Conti, F., Barbagli, F., Balaniuk, R., Halg, M., Lu, C., Morris, D., … Salisbury, K. (2003). The CHAI Libraries. Stanford, CA. Retrieved from http://dmorris.net/publications/chai.eurohaptics.2003.abstract.pdf   

Design-Based Research Collective. (2003). Design-based research: An emerging paradigm for educational inquiry. Educational Researcher, 32(1), 5–8.

https://doi.org/10.3102/0013189X032001005   

Ellaway, R. H., Cooperstock, J. R., & Spencer, B. (2010). Simulation integration for healthcare education, training and assessment. In 2010 Fifth International Conference on Digital Information Management (ICDIM) (pp. 484–489). IEEE.

https://doi.org/10.1109/ICDIM.2010.5664229

Ellaway, R., Joy, A., Topps, D., & Lachapelle, K. (2013). Hybrid simulation with virtual patients. Simulation in Healthcare: Journal of the Society for Simulation in Healthcare, 7(6), 392–468. Retrieved from http://www.dtic.mil/docs/citations/ADA560817   

Ellaway, R., Topps, D., MacDonald, J., Copeland, B., Olmos, A., & Spencer, B. (2010). HSVO: A functional XML specification for integrating simulation devices. Bio-Algorithms and Med-Systems, 6(11), 47–51. Retrieved from https://www.infona.pl/resource/bwmeta1.element.baztech-3f13e5e8-c7e1-4daa-84e9-486f54f07470   

Haluck, R., Webster, R., Mohler, B., & Melkonian, M. (2000). A Prototype Haptic Lumbar Puncture Simulator. Hershey, PA. Retrieved from http://cs.millersville.edu/~rwebster/haptics/lumbar/index.html     

King, C. J., Moseley, S., Hindenlang, B., & Kuritz, P. (2008). Limited Use of the Human Patient Simulator by Nurse Faculty: An Intervention Program Designed to Increase Use. International Journal of Nursing Education Scholarship, 5(1), 1–17.

https://doi.org/10.2202/1548-923X.1546   

Luft, H. S., Bunker, J. P., & Enthoven, A. C. (1979). Should Operations Be Regionalized? New England Journal of Medicine, 301(25), 1364–1369.

https://doi.org/10.1056/NEJM197912203012503   

Marshall, A., Yap, K. M., & Yu, W. (2008). Providing QoS for Networked Peers in Distributed Haptic Virtual Environments. Advances in Multimedia, 2008, 1–14.

https://doi.org/10.1155/2008/841590   

Park, A., Witzke, D., & Donnelly, M. (2002). Ongoing Deficits in Resident Training for Minimally Invasive Surgery. Journal of Gastrointestinal Surgery, 6(3), 501–509.

https://doi.org/10.1016/S1091-255X(02)00021-5   

Canadian Institutes of Health Research. (2014). Tri-Council Policy Statement: Ethical conduct for research involving humans. Ottawa: Interagency Secretariat on Research Ethics.   

Spencer, B., Copeland, B., Harzallah, Y., Hickey, J., & Liu, S. (2009). Health Training with SAVOIR and the RSM. In 2009 Fifth International Conference on Semantics, Knowledge and Grid (pp. 262–265). IEEE.

https://doi.org/10.1109/SKG.2009.93   

Strom, P., Hedman, L., Sarna, L., Kjellin, A., Wredmark, T., & Fellander-Tsai, L. (2006). Early exposure to haptic feedback enhances performance in surgical simulator training: a prospective randomized crossover study in surgical residents. Surg Endosc, 20(9), 1383–1388.

https://doi.org/10.1007/s00464-005-0545-3   

van der Meijden, O., & Schijven, M. P. (2009). The value of haptic feedback in conventional and robot-assisted minimal invasive surgery and virtual reality training: a current review. Surg Endosc, 23(6), 1180–1190. Retrieved from https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2686803/

Westebring-van der Putten, E. P., Goossens, R. H., Jakimowicz, J. J., & Dankelman, J. (2008). Haptics in minimally invasive surgery--a review. Minim Invasive Ther Allied Technol, 17(1), 3–16.

https://doi.org/10.1080/13645700701820242   

WMA. (1964). WMA Declaration of Helsinki – Ethical Principles for Medical Research Involving Human Subjects – WMA – The World Medical Association. Retrieved June 12, 2017, from https://www.wma.net/policies-post/wma-declaration-of-helsinki-ethical-principles-for-medical-research-involving-human-subjects/

Appendices

Appendix 1: Setup instructions for Phantom OMNI haptic device

Instructor

  1. Navigate to C:\MATLAB701\work\Spine\NorMedInstructor
  2. Execute NormedSpineInstructor.exe
  3. While that command window is still up, open MATLAB from the desktop
  4. In the MATLAB explorer, navigate to C:\MATLAB701\work\Spine\NorMedInstructor
  5. Double click on NormedSpineInstructor.mdl
  6. To the left of the text window with ‘inf’ in it, there is a Connect to Target button. Click that.
  7. There will be a 5 second pause and the button to the left of that will turn into a little black arrow. Click that.

The simulation should now begin for the instructor.

---------------------------------------

Student

  1. Navigate to C:\MATLAB701\work\Spine\NorMedStudent
  2. Execute NormedSpineStudent.exe
  3. While that command window is still up, open MATLAB from the desktop
  4. In the MATLAB explorer, navigate to C:\MATLAB701\work\Spine\NorMedStudent
  5. Double click on NormedSpineStudent.mdl
  6. To the left of the text window with ‘inf’ in it, there is a Connect to Target button. Click that.
  7. There will be a 5 second pause and the button to the left of that will turn into a little black arrow. Click that.

Once both are up and running, either can click on the white button on the wand to remove the skin, and click on the blue button to give rights to the other person.

Declarations

There are no conflicts of interest.
This has been published under Creative Commons "CC BY-SA 4.0" (https://creativecommons.org/licenses/by-sa/4.0/)

Reviews


Meghana Sudhir - (30/05/2018)
This is a study worth exploring. The authors set out to explore the use of haptic simulators for teaching surgical procedures and skills remotely. They designed and implemented a system that was modular and mobile, which allowed it to be used remotely.

Haptics has always been a concern in manikin and software-based simulation. This study tried to design procedures with feel. This mode of simulation is useful when there are multiple centres to manage, as well as centres that are geographically dispersed, without compromising the learning outcome.

The cost of development, as well as the software licence, seems to be expensive. The expertise of the staff managing this system is not mentioned; there needs to be trained staff to manage the simulation remotely. It is mentioned that "distances in the North are great". What does that mean?

The authors shared their experience which can help shape other areas in simulation.

James Fraser - (13/04/2018)
This is an interesting paper which introduces a new method for learning minimally invasive procedures that require the clinician to interpret sensory feedback to adjust their performance of the skill. This could have application to a range of skills, and it would be interesting to hear of any long-term evaluations or adoption for other skills. Providing adequate opportunities for skill development or reassessment of competency to distributed students and clinicians is a challenge for many countries, and this paper will be of interest to those involved in undergraduate and postgraduate training.
Nandalal Gunaratne - (05/04/2018)
Very interesting. Since this was done 10 years ago, I would presume things have improved and become less costly. Perhaps AI will replace the need of a human teacher and would give tireless repeated feedback. GitHub, H3D and Open haptics are open source software tools that have become available that I just found on a search.